There is a lot of data out there, and more is being created every day. It takes a lot of resources to keep it around and make sure everyone can access what they want, when they want, with minimal downtime. Naturally this takes a lot of energy, and the New York Times looked into exactly how much. It's a ridiculous amount.
Most data centers, by design, consume vast amounts of energy in an incongruously wasteful manner, interviews and documents show. Online companies typically run their facilities at maximum capacity around the clock, whatever the demand. As a result, data centers can waste 90 percent or more of the electricity they pull off the grid, The Times found.
Wasting 90 percent of anything is already incredibly inefficient, but it only gets worse when you consider how much electricity these data centers are actually pulling off the grid. In many cases, it's more than a medium-sized town. That's per data center, and we have quite a few of those.
There are ways to make it better, but there are also obstacles to implementing them. For one, data centers are notoriously secretive; not only are their locations often secret, but their hardware can be proprietary and hush-hush too. On top of that, when uptime is priority number one, taking risks on anything new is a hard sell. Sure, maybe this new thing could increase energy efficiency, but it also might break.
As the world generates more and more data, and that data continues to migrate to the cloud, data center efficiency is going to become a bigger and bigger problem. You can read more about the harrowing details over at the New York Times. [The New York Times]