In 2012, America consumed 3.65 trillion kilowatt-hours (kWh) of electricity, up from roughly 720 billion kWh in 1950. Per household, that's more than double what our British counterparts use; in total, we're second only to China. How did the American home become such an energy hog, despite so many advances in efficient appliances and construction? Blame the American Dream.

Electricity Consumption by the Numbers

That 3.65 trillion kWh is just part of America's total annual energy consumption, but it is overwhelmingly employed by the residential and commercial sectors. According to the US Department of Energy, the residential sector alone consumes 37 percent of total electricity production: an average of 10,837 kWh annually, or 903 kWh monthly, per household. Louisiana residents consumed the most electricity in 2012, at 15,046 kWh per household, while Maine residents consumed the least, just 6,367 kWh. The gap is due partly to Louisiana's ubiquitous electrically driven A/C units and Maine's reliance on heating oil for warmth.
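
Those two per-household figures are internally consistent, which is worth a quick sanity check (a sketch using the article's numbers, not a fresh EIA data pull):

```python
# Sanity-check the DOE's average-consumption figures cited above.
ANNUAL_KWH = 10_837        # average US household, 2012
monthly_kwh = ANNUAL_KWH / 12

print(round(monthly_kwh))  # -> 903, matching the monthly figure
```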


In the residential sector, most of the electricity is used for air conditioning, refrigerators, space and water heating, lighting, and powering appliances and equipment. You know, all of the conveniences of the modern home that the 1950s promised us. Per the EIA, the electricity usage of an average American household breaks down like this:

End use | Quadrillion Btu | Billion kWh | Share of total
Space cooling | 0.93 | 273 | 19%
Water heating | 0.45 | 131 | 9%
Color televisions and set-top boxes | 0.32 | 93 | 7%
Space heating | 0.27 | 79 | 6%
Clothes dryers | 0.20 | 57 | 4%
Personal computers and related equipment | 0.16 | 46 | 3%
Furnace fans and boiler circulation pumps | 0.13 | 39 | 3%
Clothes washers | 0.03 | 10 | 1%
Other uses (small electric devices, heating elements, and motors not listed above) | 1.07 | 313 | 22%
Total consumption | 4.86 | 1,424 | 100%
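
The percentages in the table can be recomputed from the billion-kWh column; a quick sketch using the figures above (the rounding, and the note about unlisted end uses, are mine):

```python
# Recompute each end use's share of the 1,424 billion kWh total.
usage_bkwh = {
    "Space cooling": 273,
    "Water heating": 131,
    "TVs and set-top boxes": 93,
    "Space heating": 79,
    "Clothes dryers": 57,
    "Computers": 46,
    "Furnace fans / boiler pumps": 39,
    "Clothes washers": 10,
    "Other uses": 313,
}
TOTAL_BKWH = 1_424

for use, bkwh in usage_bkwh.items():
    print(f"{use}: {round(100 * bkwh / TOTAL_BKWH)}%")

# The listed rows cover only ~73% of the total; the remainder belongs
# to end uses not broken out in this excerpt of the EIA table.
print(round(100 * sum(usage_bkwh.values()) / TOTAL_BKWH))  # -> 73
```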


There are some interesting numbers to note in the table above. For one, a large part of our consumption goes towards keeping our homes lit and cool. That's directly impacted by the slowly growing floor plans of American homes (which we'll discuss more shortly). Second, televisions and set-top cable boxes alone suck down 93 billion kWh annually, more than double what we spend running our home computers (we'll discuss this more in a second as well). But most importantly, nearly a quarter of our electricity use goes towards charging portable electronics and other miscellaneous gadgets. The mobile revolution may have freed us from power cords, but only temporarily.

And while US electricity consumption has exploded since the middle of the last century, we appear to be reaching market saturation. Consumption growth has slowed from 9 percent year over year in the 1950s to just under 2.5 percent at the tail end of the 1990s. Analysts expect the rate to continue falling, to around 0.8 percent by 2035.

However, even as American consumption growth slows, the rate of electricity consumption abroad is rapidly expanding. In 2008, non-OECD nations (developing countries that are not members of the Organisation for Economic Co-operation and Development, established in 1960) consumed 47 percent of the world's total electricity supply. By 2035, these nations are expected to consume 60 percent of the world's supply as they develop modern industries and infrastructure.


On the other hand, the average British home consumed just 4,227 kWh in 2012—less than half of the American average. According to the UK Department of Energy & Climate Change:

Between 1970 and 2012, electricity consumption from consumer electronics increased by 376 percent, wet appliances by 151 percent and cold appliances by 96 percent. Home computing, which had no recorded energy use in 1970, rose to 587 ktoe (about 6.8 billion kWh) in 2012.

Since 1990 electricity consumption from consumer electronics increased by 77 percent and wet appliances by 26 percent, whilst electricity consumption from lighting appliances and cold appliances fell by 14 percent and 21 percent respectively, reflecting improved efficiency. Home computing rose by 409 percent between 1990 and 2012.


Overall, the average electricity consumption of British homes grew just 1.7 percent per year between 1970 and 2012. So how does the average American household still manage to use roughly 2.6 times as much electricity as the British home, despite a downward trend in total and per-capita energy use in recent years?
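
That multiple falls straight out of the two national averages quoted above; recomputing it (a quick sketch using the article's figures):

```python
# Recompute the US/UK household-consumption ratio from the cited averages.
US_ANNUAL_KWH = 10_837  # average American household, 2012
UK_ANNUAL_KWH = 4_227   # average British household, 2012

print(round(US_ANNUAL_KWH / UK_ANNUAL_KWH, 1))  # -> 2.6
```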

How We Got Here


image: Alex Yeung

There are a number of contributing factors to America's bloated home energy use compared to the UK. For one, the British climate is much milder than that of the United States; the UK has neither the intolerably muggy weather of Florida nor the bitterly cold winters of the Northeast and, as such, UK homes typically do not employ home A/C units (which account for 19 percent of American home consumption). Far fewer British homes own tumble dryers compared to the US, too. According to a recent survey by Opower:

Whereas 85% of US homes own energy-gulping dryers, the UK gets by on a 57% dryer ownership rate. And survey data from Opower's UK-based customers reinforces their relative comfort with surviving sans tumble dryer: in our data sample, UK customers who embrace hang drying far outnumber those who don't — by a ratio of 14:1.


And it's not just clothes dryers and A/C units. British appliances are also quickly outpacing their US counterparts in terms of efficiency. In the UK, chest freezers now consume 66 percent less energy than they did in 1990, upright freezers use 59 percent less, and fridge-freezers use 55 percent less. Similarly, wet appliances like dishwashers and washing machines consume 39 percent and 32 percent less power, respectively, than they did in the 1990s.

US appliances, while certainly more efficient than they were in the 1990s, cannot match these gains. Energy Star certified refrigerators, for example, only have to be 15 percent more efficient than non-qualified models or 20 percent more efficient than models that simply meet the federal minimum energy efficiency standard. As the DoE's National Renewable Energy Lab points out:

Extensive technological analyses have demonstrated the large energy savings that result from the standards, especially in later years as market penetration increases. An American Council for an Energy-Efficient Economy (ACEEE) study estimates similar savings. Over the 1990-2000 period, net present benefits exceed costs by an over three to one ratio. In 2010, existing standards will save 250 billion kWh (6.5% of projected electricity use) and reduce peak demand by 7.6% (ACEEE 2009).

A 2003 study (Meyers et al.) suggests that standards taking effect from 1988-2003 will capture cumulative reductions in energy use from 1988-2050 of 8%-9% relative to a no-standards baseline. The corresponding cumulative costs of these standards have been estimated to be $200-$250 million (2002 dollars), with a cumulative (through 2050) benefit/cost ratio of 2.75:1 (Meyers et al. 2003).

A more recent study corroborates these findings, estimating 4% and 8% energy reductions resulting from standards in the commercial and residential sectors, respectively, for standards in place from 1987-2006 (Meyers et al. 2008).


That said, Energy Star appliances can still lead to increased electricity usage according to the NREL:

One criticism of these types of labeling programs, both for the comparative information and ENERGY STAR endorsement, is their focus on efficiency rather than total consumption. Products are segregated by class, allowing large, upscale, energy-consuming products to be a separate category from similar products with much smaller energy footprints. This enables products, such as refrigerators, to grow much larger and offer more energy-consuming features without losing energy-efficiency endorsements. A smaller, less energy-consuming product can be labeled inefficient using the same energy consumption per cubic volume metric. Some evaluations of labeling programs have cited the need for a cap on total energy consumption of products labeled as energy efficient (Deumling 2008).

The UK's rapid strides in energy efficiency are largely due to a concerted nationwide push. While US efforts are primarily the result of state legislation and local utility programs, the UK's ambitious national Energy Efficiency Strategy has helped the country reduce its primary energy consumption for seven consecutive years. What's more, the UK government is also participating in a regional EU effort to reduce energy use by 20 percent by 2020.


These aggressive steps have no American equivalent. And in one particular case, it's simply because nobody bothered to ask.

Look No Further Than Your TV


image: Glowonconcept

Half of all Americans—roughly 160 million of us—use a set-top box from either the cable company or a unified TiVo equivalent. And for those of us that also use a standalone DVR, which can consume up to 40 percent more power than the cable box they're connected to, the pair can suck down as much as 446 kilowatt hours a year. That's 10 percent more than a 21-cubic-foot energy-efficient refrigerator, or about $3 billion in electricity per year according to the Natural Resources Defense Council.

These devices are such energy hogs because, unlike their European counterparts, which drop into sleep mode (cutting consumption by 50 percent) or hibernation mode (reducing their draw by 95 percent), American set-top boxes are always on, running at full power 24/7. And as the NRDC study found, these devices are only active, either broadcasting feeds or recording shows, some 33 percent of the day. The other 67 percent of the time, they're just sitting there, slowly hiking your electricity bill. Nationwide, these devices waste more energy than it takes to power the entire state of Maryland for a year.
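
A back-of-the-envelope estimate of the waste, assuming the NRDC's 446 kWh/year figure for a DVR-plus-box pair, a constant draw around the clock, and a European-style hibernation mode during the idle two-thirds of the day (the specific calculation is illustrative, not NRDC's own):

```python
# How much of a DVR + set-top pair's annual draw could hibernation save?
PAIR_KWH_PER_YEAR = 446  # NRDC figure for a DVR plus connected box
IDLE_FRACTION = 2 / 3    # boxes are active only ~33% of the day
HIBERNATE_CUT = 0.95     # hibernation mode reduces draw by ~95%

# Energy burned while idle, assuming draw is constant around the clock
idle_kwh = PAIR_KWH_PER_YEAR * IDLE_FRACTION
savings_kwh = idle_kwh * HIBERNATE_CUT

print(round(idle_kwh))     # -> 297 kWh/yr spent doing nothing
print(round(savings_kwh))  # -> 282 kWh/yr recoverable per household
```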


This always-on state is actually marketed as a feature by many cable companies, a means of ensuring that viewers can start watching the moment they turn on the TV rather than waiting 45 seconds for the programming guide to load.

"People in the energy efficiency community worry a lot about these boxes, since they will make it more difficult to lower home energy use," John Wilson, a former member of the California Energy Commission who is now with the San Francisco-based Energy Foundation, told the NY Times. "Companies say it can't be done or it's too expensive. But in my experience, neither one is true. It can be done, and it often doesn't cost much, if anything."

However, to date, only five cable/satellite providers provide Energy Star compliant set-top boxes in the US, and only in specific markets: AT&T in Birmingham AL, DirecTV in El Segundo CA, Dish Network in Englewood CO, EPB in Chattanooga TN, and Suddenlink Communications in St Louis.


Of course, there is very little urgency from government regulators like the EPA to coerce cable companies and internet providers into employing these energy-saving functions. Instead, the best we get is a voluntary program wherein cable companies can become Energy Star "partners" if they upgrade between 25 and 50 percent of their boxes to Energy Star certified equipment. Granted, an Energy Star cable box will consume, on average, 45 percent less energy than non-qualified alternatives, but their limited availability severely curtails the program's effectiveness.

What's more, these companies have been slow to embrace the increasingly stringent Energy Star requirements (which dropped from 138 kWh to 97 kWh in September 2012, and again to 29 kWh in 2013), or even to include the Energy Star label on qualified boxes.

"The issue of having more efficient equipment is of interest to us," Justin Venech, a spokesman for Time Warner Cable told the NYT. But, "when we purchase the equipment, functionality and cost are the primary considerations."


Apparently, even mentioning energy efficiency is taboo if it means delaying the instant gratification that we've been conditioned to expect from televisions. In fact, as Wilson told the Times, when he was still on the CEC he would often ask manufacturers why the devices never powered down. Their response: "Nobody asked us to use less." Which is exactly the level of initiative we have come to expect from such monopolies.

The Rebound Effect

It's not just the efficiency of our appliances and climate control systems that determines American home energy use. The size of our homes and the sheer number of devices we pack into them have an effect as well. It's known as the rebound effect.


Energy efficiency gains only reduce energy consumption and carbon emissions if demand for energy services remains static. The problem is that this rarely happens, with economic growth routinely outpacing the gains derived from efficiency improvements. By reducing the amount of energy these services need to run, efficiency makes them relatively less expensive, which increases demand for them, thereby negating the savings from the added efficiency.

Image: U.S. Energy Information Administration, Residential Energy Consumption Survey


For example, while Energy Star-compliant A/C units are now commonplace, their energy savings can't offset the fact that A/C usage has doubled since 1980. And while stoves and refrigerators are now more efficient, they're often accompanied by a large influx of additional microwaves, dishwashers, clothes washers and dryers, multiple computers and TV sets, gaming systems, and DVRs, not to mention the menagerie of mobile devices competing for space on your power strips. It's little wonder that these appliances now constitute more than 33 percent of our home electricity use.

Even the energy savings derived from new construction methods aren't enough to keep pace with economic growth. Homes built after the year 2000 account for 14 percent of the US housing market, according to a recent Residential Energy Consumption Survey conducted by the EIA. Despite being 30 percent larger than older homes, they consume only about 2 percent more energy than their predecessors. That's some impressive savings, except that most of the efficiency gain comes from reduced space heating (21 percent less energy than older homes), due to a majority of these new homes being built in the warmer South rather than our northern tundras.
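
Normalizing those survey numbers by floor area shows why a 2 percent increase counts as impressive; a rough sketch, with the ratios taken from the figures above:

```python
# Post-2000 homes are ~30% larger yet use only ~2% more energy overall,
# so their energy use per square foot is substantially lower.
SIZE_RATIO = 1.30    # new homes vs. older homes, floor area
ENERGY_RATIO = 1.02  # new homes vs. older homes, total energy use

intensity_ratio = ENERGY_RATIO / SIZE_RATIO
print(f"{100 * (1 - intensity_ratio):.0f}% less energy per square foot")
# -> 22% less energy per square foot
```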

What's more, these new homes often come outfitted with all the trappings of modern comfort: per the 2009 survey, they consume 18 percent more energy on appliances, electronics, and lighting than older homes do. The rebound effect is therefore estimated at around 30 percent for new homes; that is, the added usage of a more efficient technology means it will only deliver 70 percent of the savings estimated by engineering models, though the effect can range from 5 to 80 percent in specific situations.


Image: U.S. Energy Information Administration, Residential Energy Consumption Survey

America is particularly susceptible; moving into bigger and better houses has been held up as the American Dream since at least the Hoover administration. As Americans gain wealth, we move into bigger houses and fill them with more stuff. And nobody is going to let the new neighbors see them hanging their clothes on a line like a damn heathen when there's a brand new clothes dryer to do the job. And if that fancy-pants A/C system is so much more efficient, why wouldn't you run it 24/7?


So yes, while increased energy efficiency is essential to reducing America's carbon footprint and our contributions to global warming, it can't be the only answer. Because, as the American people have shown, making services relatively less expensive to run only encourages us to run them more. [WaPo - Wiki - OPower - UK Gov 1, 2 - LLNS - EIA 1, 2, 3 - Europa - Carbon Brief - Science Daily - Energy Star]

Top image: winui