The report from the Department of Energy’s Lawrence Berkeley National Laboratory figures that those data centers use an enormous amount of energy: some 70 billion kilowatt-hours per year. That amounts to 1.8% of total American electricity consumption. At an average cost of 10 cents per kWh, the annual cost of all that juice is on the order of $7 billion.
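For readers who like to check the math, here’s a quick back-of-envelope sketch in Python. (The total U.S. consumption figure is our assumed round number, roughly 3.9 trillion kWh a year; it isn’t from the report.)

```python
# Back-of-envelope check on the report's headline numbers.
data_center_kwh = 70e9     # kWh per year, per the Berkeley Lab report
us_total_kwh = 3.9e12      # kWh per year: assumed total U.S. consumption
price_per_kwh = 0.10       # dollars per kWh, the article's average rate

share = data_center_kwh / us_total_kwh
annual_cost = data_center_kwh * price_per_kwh

print(f"Share of U.S. consumption: {share:.1%}")                     # ~1.8%
print(f"Annual electricity bill: ${annual_cost / 1e9:.0f} billion")  # ~$7 billion
```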
Seventy billion kilowatt-hours is such a giant number that it’s helpful to put it into some other terms. For comparison purposes, 1 kWh is enough energy to keep ten 100-watt lightbulbs illuminated for one hour, or to keep your smartphone charged for an entire year.
To generate 70 billion kWh you’d need power plants with a baseload capacity of 8,000 megawatts, equivalent to about 8 big nuclear reactors, or twice the output of all the nation’s solar panels.
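The capacity figure is just division: a year holds 8,760 hours, so delivering 70 billion kWh evenly over a year means about 8,000 megawatts running around the clock. Here’s that arithmetic, with an assumed 1,000 megawatts per large reactor:

```python
# How 70 billion kWh per year translates into generating capacity.
annual_kwh = 70e9
hours_per_year = 24 * 365   # 8,760

# Average (baseload) power needed: kWh/h gives kW, then /1000 gives MW.
capacity_mw = annual_kwh / hours_per_year / 1000
print(f"Baseload capacity: {capacity_mw:,.0f} MW")  # ~8,000 MW

# At an assumed ~1,000 MW per large nuclear reactor:
reactor_mw = 1000
print(f"Equivalent reactors: {capacity_mw / reactor_mw:.0f}")  # ~8
```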
Sliced up per capita, the average American uses about 200 kWh a year for his or her internet use, costing about $20. For those of you obsessed with carbon footprints, your internet use is responsible for the emission of about 300 pounds of carbon dioxide per year.
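Here’s how those per-capita figures shake out. The population and carbon-intensity numbers below are our assumptions (roughly 325 million people and about 1.4 pounds of CO2 per kWh), not figures from the article:

```python
# Slicing the national total per capita.
annual_kwh = 70e9
population = 325e6        # assumed U.S. population, ~325 million
price_per_kwh = 0.10      # dollars, the article's average rate
lb_co2_per_kwh = 1.4      # assumed grid carbon intensity, lb CO2 per kWh

per_person_kwh = annual_kwh / population
print(f"Energy: {per_person_kwh:.0f} kWh per person")              # ~215 ("about 200")
print(f"Cost: ${per_person_kwh * price_per_kwh:.0f} per person")   # ~$22 ("about $20")
print(f"Carbon: {per_person_kwh * lb_co2_per_kwh:.0f} lb of CO2")  # ~300 lb
```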
But our internet addiction is only growing. According to Nielsen, the average adult in the United States spends 10 hours and 39 minutes a day consuming digital media. That’s up an hour a day in the past year. And we’re spending most of that additional time peering at our smartphones, which now occupy us for an hour and a half each day.
Forbes contributor Michael Kanellos notes that research firm IDC estimates that by 2025, 152,000 new devices will be jumping, directly or indirectly, onto the Internet every minute, bringing the total number of connected devices to 80 billion worldwide.
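Those two IDC numbers line up neatly: at 152,000 devices a minute, a single year adds roughly 80 billion devices. A quick check (our arithmetic, not IDC’s):

```python
# Does 152,000 new devices per minute square with 80 billion total?
devices_per_minute = 152_000
minutes_per_year = 60 * 24 * 365   # 525,600

new_per_year = devices_per_minute * minutes_per_year
print(f"New devices per year at that rate: {new_per_year / 1e9:.0f} billion")  # ~80
```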
Can our data centers handle the growth?
The Berkeley Lab report says that data centers are on track to use 73 billion kWh by 2020.
That’s a much slower pace of power demand growth than in the early days of the interwebs, in large part because most new servers are being deployed in very large data centers (think 400,000 square feet) that operate at high utilization rates, with advanced cooling systems and redundant power supplies. If not for the efficiency gains made since 2010, data centers would likely be using 200 billion kWh by 2020. More efficiency gains can be achieved by getting rid of “zombie” servers: machines that are obsolete or unused but still plugged in.
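At the article’s 10-cents-per-kWh average rate, those efficiency gains are worth real money. A rough sketch of the implied savings (the dollar figure is our extrapolation, not the report’s):

```python
# What the post-2010 efficiency gains are worth per year by 2020.
projected_kwh = 73e9         # with efficiency gains, per the report
counterfactual_kwh = 200e9   # without them, per the report
price_per_kwh = 0.10         # dollars, the article's average rate

saved_kwh = counterfactual_kwh - projected_kwh
print(f"Energy avoided: {saved_kwh / 1e9:.0f} billion kWh per year")              # ~127
print(f"Money saved: ~${saved_kwh * price_per_kwh / 1e9:.0f} billion per year")   # ~$13
```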
And there’s something else to think about: for all the electricity we use updating Facebook or streaming Game of Thrones, consider all the energy we’re not using on the stuff we used to have before the internet, like paper phone books and encyclopedias, snail mail, fax machines, bookstores, Tower Records, Blockbuster Video, and nudity in Playboy magazine.
In time the energy we use to power the internet (and the internet of things) will save us even more energy in what used to be called the real world. As the Berkeley Lab researchers wrote in their study, “seamless telework results in less commuting, and driverless vehicles allow for more productive use of commuting time.” [forbes.com]