You're not the only one who sweats during a heat wave. The power company feels the burn too—in the form of skyrocketing demand from those air conditioners, fans, and stressed-out refrigerators. Not to mention all the TVs clicking on as we cower in front of our media stacks to escape the cruel, cruel sun.
What you may not know, though, is that, because of the way our grid is designed, utilities sometimes have to throttle our power allowances to keep all of our gadgets a-humming. And though it rarely happens, that very act could put some of them in danger.
Curiously enough, the rise of electric utilities and centralized electricity production is probably the fault of geek genius Nikola Tesla. Back in the early days of commercial electricity production, Thomas Edison pushed for direct current distribution of electricity. Direct current was safer, Edison felt, and was compatible with already existing electricity infrastructure. At that time, though, direct current could only move about a mile from a generator to an individual home or factory; beyond that distance, the system was so inefficient that most of the electricity was wasted as heat. Under Edison's plan, cities would be dotted with small neighborhood-scale electrical power plants, and it's not out of the realm of possibility that when it came time to electrify rural areas, Edison would have come up with cheap, low-power single-family or single-farm generating stations.
Tesla (and others) developed alternating current, which could travel much more efficiently along high-voltage wires. This negated the need for small local power generation, giving utilities the option to create centralized power plants to serve large areas. If Edison's DC power had won the battle, the nation's energy infrastructure might look very different today, and it's possible that neighborhood-level generation of electricity would have led to a very different way to handle heat waves.
The flow of electricity through wires is a lot like the flow of water in pipes: voltage is like water pressure, amps are like the flow rate, and watts are like the rate at which water is actually delivered (the formula is volts * amps = watts). The more "pressure" in your pipes (whether water pipes or electrical ones), the more work can be done, at the price of more wear and tear on the fixtures. If the utility lowers the voltage to your house while the amps stay the same, the total wattage (or amount of electricity) you receive drops. A common voltage reduction is about 5 percent.
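The arithmetic behind that formula can be sketched in a few lines of Python. The function name and the appliance figures here are illustrative, not from the article:

```python
# Sketch of the water-pipe analogy: power (watts) is "pressure" (volts)
# times "flow" (amps).

def power_watts(volts: float, amps: float) -> float:
    """Electrical power delivered: volts * amps = watts."""
    return volts * amps

# A hypothetical window air conditioner drawing 10 amps at a nominal 120 volts:
normal = power_watts(120, 10)          # 1200 watts

# Under a typical 5-percent voltage reduction (120 -> 114 volts), with the
# current held the same, the delivered power drops by the same 5 percent:
reduced = power_watts(120 * 0.95, 10)  # 1140 watts
```

Since current is held constant in this sketch, the percentage cut in voltage translates directly into the same percentage cut in power.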
According to Michael Clendenin, a spokesperson for New York's Consolidated Edison, at that level "most people won't notice a thing; 5 percent is not a dramatic drop." If more savings are needed, a utility can call for a "brownout," a voltage reduction of about 8 percent. At this point, Clendenin says, "You might start to notice dimming lights, and air conditioners not working right... You don't have an outage, but your equipment isn't running as well."
And here's where Tesla's long-distance distribution scheme comes in: electrical wires in use today are not 100 percent efficient; a small amount of the electricity they carry gets converted to heat. Overhead wires exposed to sunlight get even hotter. Underground cables, buried to protect them from the elements, bake inside their conduits, which can be 20 degrees hotter than the street above them. All of this heat can cause component failures and lower the efficiency of the wires that remain in use. Lower efficiency means more waste heat (and a greater chance of component failure), which leads to greater inefficiency, which gives off still more waste heat, and so on. By lowering voltage, a utility does what it can to cool its wires and distribute the electricity it has more efficiently.
"It's not about the supply of electricity," Clendenin says. "In New York City we have plenty of power. But if the distribution networks break down, it really doesn't matter how much supply you have." The dirty little secret is that much of this country's electrical infrastructure dates back to the 1960s at best. If an electrical feeder—a blocks-long or miles-long system of cables, transformers, and network protectors running to a specific neighborhood — develops a heat-related fault, other feeders can usually take up the slack while the broken feeder is being repaired. But overloading these feeders comes with a risk—a cascade failure caused by overloaded lines contributed to the 1965, 1977, and 2003 blackouts. By lowering the voltage to particular neighborhoods, the utility takes some of the stress off the feeders that still remain, minimizing the chance that they too will fail and bring down the whole system.
Since most electrical appliances in North America are designed to work between 110 and 120 volts, an 8 percent voltage drop from 120 volts to 110 usually presents no problem. According to Brian Markwalter, senior vice president of research and standards at the Consumer Electronics Association, "Products have always been designed to tolerate a range of power voltages and frequencies. This is true of appliances and electronics. Although I do not have exact figures, my guess is that products are designed to handle at least a 10% reduction in nominal voltage." Voltage drops below 105 volts, however, have the potential to damage motors and compressors in appliances like refrigerators and air conditioners. And that's where you need to watch out.
Of course, if your power starts dipping below 105 volts, you're probably headed for a blackout. But it can't hurt to take measures to protect that dedicated ice cream freezer or that expensive new window unit, just in case. There are numerous gadgets on the market that display how much juice you are currently sipping—the Kill A Watt and Belkin's Conserve Insight are two examples. Keep an eye on your power; if you see your supply go below 105 volts, you might consider turning stuff off for a while to keep the motors and compressors from getting damaged.
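If you do log readings from one of those meters, the decision rule described above boils down to a few comparisons. This is a rough sketch, not a product feature; the cutoffs are approximations of the figures in this article (a 5 percent reduction is roughly 114 volts, an 8 percent brownout roughly 110, and sustained readings below 105 are the danger zone):

```python
# Classify a voltage reading from a plug-in power meter using the
# approximate thresholds discussed in the article.

def classify_voltage(volts: float) -> str:
    if volts < 105:
        return "risk: consider unplugging motor-driven appliances"
    if volts < 110:
        return "brownout range"
    if volts < 114:
        return "voltage reduction in effect"
    return "normal"
```

A single low reading isn't cause for alarm; it's a sustained sag below 105 volts that threatens motors and compressors.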
Patrick Di Justo is a freelance writer sweating it out in Brooklyn. Check him out on Twitter.