Behind the Buzzword: A Brief History of Energy ‘Resiliency’

The energy industry has overcome many “resiliency” challenges. Climate change is the biggest one yet, the author writes.

There’s no shortage of “resilience” talk in the energy industry, but it hasn’t always been that way.

Over the last 30 years, the transition away from centralized energy infrastructure, global climate events like the devastating hurricanes of 2017, extreme volatility in fossil fuel costs, and growing demand for renewable energy have turned an unfamiliar theoretical term into not only a buzzword but also a cornerstone of infrastructure planning.

Resilience 101: Where we’ve been

Thirty years ago, the concept of resilience carried a very different and less pressing connotation than it does today. Energy generation itself looked different: Power was generated at centralized power plants by heavy machinery, protected from the elements inside large buildings.

Following the Northeast blackout of 1965, utilities and regulators realized that the growing complexity of the nation’s electric system required a more stringent set of standards. In response, the North American Electric Reliability Council was formed to develop standards requiring energy infrastructure to withstand extreme weather and to include additional redundancy and controls that ensure reliability and responsiveness. The concept of resilience was baked into utility regulations and was therefore seldom discussed.

The definition of resiliency changed with the oil embargo of the 1970s, when oil supplies were limited and prices spiked, forcing the industry to recognize that our electricity supply depended on fuel sources and costs we couldn’t control. In search of alternatives, utilities turned to domestic coal, nuclear power and natural gas. Within a decade, the electricity supply’s dependence on imported oil was sharply reduced, making it more resilient to fluctuating fuel costs.
