Since 1995, electrical grid power outages have steadily increased while R&D spending on new grid technologies has steadily decreased.
Author’s Note: Recently, I wrote an article for IEEE Spectrum on the U.S. electrical grid. The original post can be found in the IEEE Energy Policy Department. I also spoke on this subject to NPR and TMCnet.
The U.S. electrical grid has been plagued by ever more frequent and ever worse blackouts over the past 15 years. In an average year, outages total 92 minutes in the Midwest and 214 minutes in the Northeast. Japan, by contrast, averages only 4 minutes of interrupted service each year. The outage data excludes interruptions caused by extraordinary events such as fires or extreme weather.
I analyzed two sets of data, one from the U.S. Department of Energy’s Energy Information Administration (EIA) and the other from the North American Electric Reliability Corp. (NERC). Generally, the EIA database contains more events, and the NERC database gives more information about each event. In both sets, each five-year period was worse than the preceding one. According to the EIA data, there were 156 outages of 100 megawatts or more during 2000–2004; such outages increased to 264 during 2005–2009. The number of U.S. power outages affecting 50,000 or more consumers increased from 149 during 2000–2004 to 349 during 2005–2009.
The trend holds even after normalizing the data to 2000 load levels, assuming demand growth of two percent per year. First, there were 147 outages of 100 megawatts or more during 2000–2004, increasing to 230 during 2005–2009. Second, the number of U.S. power outages affecting 50,000 or more consumers increased from 140 during 2000–2004 to 303 during 2005–2009.
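One way such a normalization can work is to scale the size threshold itself by 2 percent per year relative to 2000, so that a later outage must be proportionally larger to count. The sketch below illustrates that idea; the exact method behind the figures above is not specified here, and the event data in the example is made up, not actual EIA records.

```python
# Illustrative sketch: count outages against a threshold that grows
# with assumed 2%/year load growth relative to a 2000 baseline.
# The growth rate, base year, and event list are assumptions for
# illustration only.
GROWTH = 1.02      # assumed annual load growth
BASE_YEAR = 2000   # baseline year for normalization

def adjusted_threshold(nominal_mw, year):
    """Scale the nominal threshold up 2% per year after the base year,
    so later events must be larger to qualify at 2000-equivalent size."""
    return nominal_mw * GROWTH ** (year - BASE_YEAR)

def count_qualifying(events, nominal_mw):
    """events: iterable of (year, size_mw) pairs.
    Count events at or above the load-adjusted threshold."""
    return sum(1 for year, mw in events
               if mw >= adjusted_threshold(nominal_mw, year))

# Hypothetical events: (year, megawatts lost)
events = [(2001, 105), (2004, 100), (2007, 110), (2009, 118)]
print(count_qualifying(events, 100))
```

Under this scheme, the 2004 event of exactly 100 MW no longer qualifies, because the adjusted threshold for 2004 is about 108 MW; this is how normalization can shrink the raw counts (156 to 147, for instance) while leaving the upward trend intact.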
What happened? Starting in 1995, amortization and depreciation have exceeded utility construction expenditures. In other words, for the past 15 years, utilities have harvested more than they have planted. The result is an increasingly stressed grid. Indeed, grid operators should be praised for keeping the lights on while managing a system with diminished shock absorbers.
R&D spending for the electric power sector dropped 74 percent, from a 1993 high of US $741 million to $193 million in 2000. R&D represented a meager 0.3 percent of revenue in the six-year period from 1995 to 2000, before declining even further, to 0.17 percent, from 2001 to 2006. Even the hotel industry put more into R&D.
Our first strategy for greater reliability should be to expand and strengthen the transmission backbone (at a total cost of about $82 billion), augmented with highly efficient local microgrids that combine heat, power, and storage systems. In the long run, we need a smart grid with self-healing capabilities (total cost, $165 to $170 billion).
Investing in the grid would pay for itself, to a great extent. You’d avoid stupendous outage costs, about $49 billion per year, and get 12 to 18 percent annual reductions in emissions. Improvements in efficiency would cut energy usage, saving an additional $20.4 billion annually.
Dr. Massoud Amin is the Director of the Technological Leadership Institute (TLI) at the University of Minnesota – Twin Cities. TLI offers Master of Science programs in security technologies, management of technology, and infrastructure systems engineering.
[EDITOR'S NOTE]: This post was updated from its original version to include additional data.