Energy storage valuation studies walk cautiously around questions about the costs of power disruptions, tending to focus largely, if not entirely, on reliability rather than on the value of resilience.
A number of techniques are used to monetize the value of power reliability and grid resilience, each with its own benefits and drawbacks. This section summarizes several of the most common approaches and presents results from studies that apply them.
Customer Damage Functions and Value of Lost Load
The most common method for estimating interruption costs is the customer damage function (CDF), which establishes a relationship between costs to a customer and interruption duration. Every study yielding a power reliability value presented in Fig. 1 relied on the CDF as expressed in terms of value of lost load (VOLL).
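As a minimal sketch of how a CDF translates into a VOLL figure, the example below interpolates interruption cost from a piecewise-linear damage function and divides by unserved energy. The dollar figures, breakpoints, and customer classes are illustrative assumptions, not values drawn from any of the cited studies.

```python
# Hypothetical customer damage function (CDF): interruption cost as a
# function of outage duration, piecewise-linear per customer class.
# All dollar figures below are illustrative assumptions.

def interruption_cost(duration_h, customer_class):
    # (duration in hours, cumulative cost in $) breakpoints per class
    cdf = {
        "residential": [(0, 0.0), (1, 4.0), (4, 10.0), (8, 15.0)],
        "commercial":  [(0, 0.0), (1, 600.0), (4, 2000.0), (8, 3500.0)],
    }
    points = cdf[customer_class]
    if duration_h >= points[-1][0]:
        # clamp beyond the last breakpoint
        return points[-1][1]
    for (d0, c0), (d1, c1) in zip(points, points[1:]):
        if d0 <= duration_h <= d1:
            # linear interpolation between breakpoints
            return c0 + (c1 - c0) * (duration_h - d0) / (d1 - d0)

def value_of_lost_load(duration_h, avg_load_kw, customer_class):
    """Express the CDF as $/kWh of unserved energy (VOLL)."""
    unserved_kwh = duration_h * avg_load_kw
    return interruption_cost(duration_h, customer_class) / unserved_kwh
```

For a 4-h outage at a 10-kW commercial load, the sketch yields $2000 of damage over 40 kWh of unserved energy, i.e., a VOLL of $50/kWh.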
In , the BESS demonstrated the capability to operate effectively in islanded mode for customers in the downtown area of Glacier, Washington. The resulting benefits accrued to Puget Sound Energy customers located in the islanded area and were monetized in terms of VOLL.
Using historic outage data, the research team constructed a statistically average outage year. Outages were randomly selected and scaled so that their combined duration matched the average annual outage duration. These outages were then built into an optimization tool that assumed no event foreknowledge, so only the energy on hand at the moment each outage struck could be used to mitigate it. The with- and without-BESS conditions were then compared to determine the change in the number and duration of outages. The savings to customers were estimated at $310,000, or $310/kW-year, for the 1 MW/2.2 MWh Glacier BESS. The basis of the estimated VOLL was the interruption cost estimates presented in [13••].
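The sampling-and-comparison logic described above can be sketched as follows. The outage durations, load, and VOLL inputs are placeholders, and the study's actual optimization tool was far more detailed; this only illustrates the structure of building an average outage year and crediting the energy on hand against unserved load.

```python
import random

# Sketch of the outage-sampling approach (assumed details): randomly
# sample historical outage durations until their total matches a
# statistically average outage year, then estimate avoided VOLL when a
# BESS with limited on-hand energy rides through part of each outage.

def build_average_year(historical_outages_h, target_total_h, seed=0):
    rng = random.Random(seed)
    sampled = []
    while sum(sampled) < target_total_h:
        sampled.append(rng.choice(historical_outages_h))
    # scale so the sampled year matches the target total duration
    scale = target_total_h / sum(sampled)
    return [d * scale for d in sampled]

def avoided_cost(outages_h, bess_energy_kwh, load_kw, voll_per_kwh):
    saved = 0.0
    for duration in outages_h:
        unserved_kwh = duration * load_kw
        # no foreknowledge: only the energy on hand can be discharged
        saved += min(unserved_kwh, bess_energy_kwh) * voll_per_kwh
    return saved
```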
The economic benefits of a combustion turbine generator (CTG) with a temperature-dependent maximum capacity of 16 MW and a 6 MW/48 MWh Tesla lithium-ion BESS deployed on Nantucket Island, Massachusetts, were evaluated in [14•]. The study examined 704 outages that occurred over 11 years. The Nantucket Island analysis was more comprehensive than  because the research team had developed a distribution system model for the island in GridLAB-D and OpenDSS. Outages listed as secondary/service, transformer, or fused branch were excluded because the BESS + CTG could not address them. The remaining outages were simulated in the distribution system model, and the change in VOLL to customers was estimated by comparing the with and without BESS + CTG conditions. The model simulated multiple scenarios with and without distribution system investments (i.e., reconductoring and automated feeder switching), yielding $780,000 to $1.0 million in annual avoided costs [14•].
The Nantucket Island study also illustrated a common practice in energy storage valuation assessments: monetizing reliability indirectly, in this case as a transmission deferral benefit. Based on an extensive analysis of load growth, island load is expected to exceed the capacity of the island grid on certain summer days if an N-1 contingency is triggered by the failure of one of the submarine transmission cables connecting the island to mainland Massachusetts. Rather than investing in a third transmission cable at a cost that could exceed $200 million, National Grid elected to invest in the CTG and BESS. Transmission deferral in this case effectively serves as a proxy for reliability, and the benefit was estimated at nearly $110 million [14•].
Individual CDFs often serve as the basis for monetizing outage mitigation. Avista Utilities in Washington State deployed a 1 MW/3.2 MWh vanadium flow battery system developed by UniEnergy Technologies . Avista’s battery system was located on the campus of Schweitzer Engineering Laboratories (SEL), which is powered by two redundant feeders (regular feeder TUR117 and alternate feeder TUR116) from Avista’s nearby Turner substation. The SEL facility contains sensitive manufacturing processes that are prone to power quality disturbances, such as voltage sags.
The research team analyzed voltage sag data from 2014 to 2017 provided by SEL. Applying the power quality curve defined by the Computer and Business Equipment Manufacturers Association (CBEMA), over 40 voltage sag events (<70% of nominal voltage magnitude, >20 ms in duration) were identified. The results matched the findings of SEL’s power quality monitoring system. SEL indicated that each voltage sag event of sufficient magnitude and duration caused equipment to shut down for a minimum of 3 h, at a cost of $150,000 per hour in lost productivity . With an energy storage system on-site, a solution was devised to engage the fast real and reactive power control capability of its power electronic converters to mitigate the voltage sags and avoid interruptions.
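The sag-screening criteria can be illustrated with a simple filter over (magnitude, duration) event records. The thresholds are the ones cited in the study (<70% of nominal, >20 ms); the event tuples and the cost roll-up are placeholders rather than SEL data.

```python
# Illustrative screen for disruptive voltage sag events using the
# magnitude/duration thresholds cited in the study. Event data and
# cost figures below are placeholders, not SEL measurements.

NOMINAL_V_PU = 1.0  # nominal voltage, per unit

def is_disruptive_sag(magnitude_pu, duration_ms):
    """True if the event violates both the depth and duration criteria."""
    return magnitude_pu < 0.70 * NOMINAL_V_PU and duration_ms > 20.0

def annual_sag_cost(events, hours_down_per_event=3, cost_per_hour=150_000):
    """Sum lost-productivity cost over all disruptive events in a year."""
    disruptive = [e for e in events if is_disruptive_sag(*e)]
    return len(disruptive) * hours_down_per_event * cost_per_hour
```

A 0.65 p.u. sag lasting 30 ms would be flagged, while a shallow 0.80 p.u. sag or a 10-ms transient would not, regardless of the other dimension.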
A review of the BESS inverter reactive current profiles during voltage sag events determined that the reactive current response time would be shorter than the duration criterion (>20 ms) used to define a voltage sag on the CBEMA curve. However, the magnitude of improvement from reactive power would depend on the voltage sensitivity. An additional analysis performed by Washington State University showed voltage improvement for scenarios where the base case voltage (i.e., without reactive power support) did not sag below 95%, which is well above the voltage sag definition criterion (<70% of nominal voltage).
Overall, because each qualifying event causes a minimum of 3 h of downtime, the benefits of voltage sag mitigation are substantial. Over the course of the 20-year battery life, the present value benefit of this use case was estimated at nearly $10 million . It is important to note that the battery system became non-operational and was removed from the facility before the end of its anticipated life. The results presented in  therefore represent the potential benefits that could have been realized had the battery operated as tested and remained in place for its entire usable life.
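A present-value calculation of this kind can be sketched as below. The annual benefit and the 7% discount rate are assumptions chosen for illustration, not the study's inputs.

```python
# Discount a constant stream of annual avoided-outage benefits over the
# battery's life to a present value. The discount rate is an assumption.

def present_value(annual_benefit, years=20, discount_rate=0.07):
    """PV of a level end-of-year benefit stream over the given horizon."""
    return sum(annual_benefit / (1 + discount_rate) ** t
               for t in range(1, years + 1))
```

At a positive discount rate, the present value is always less than the undiscounted sum of the annual benefits, which is why long-lived assets are sensitive to the rate chosen.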
The CDF method, while perhaps the most direct and extensively applied, does have drawbacks. Residential customers, for example, incur few direct economic costs from short-duration outages, yet may be willing to pay more to avoid the discomfort or uncertainty associated with unplanned service interruptions. CDF studies also often fail to statistically control for cognitive biases in respondents, and CDF or VOLL-based estimates generally do not extend to outages longer than 24 h [15••].
Stated Preference Techniques
To address some of the shortcomings associated with CDF studies, contingent valuation (CV) studies use interview and bidding techniques to elicit a customer’s willingness to pay (WTP) to avoid service interruptions. In [15••], a research team defined an approach for estimating residential customers’ WTP to avoid a 24-h outage. The study found that the expressed WTP, in terms of cost per kWh, was much higher for low-amperage backup service than for full service, and that WTP grew as respondents’ knowledge of the outage characteristics and consequences improved [15••].
This technique was employed in a survey of electric service customers in Allegheny County, Pennsylvania. Face-to-face interviews taking 1 h on average were conducted with 73 customers in 2015. The findings of the study indicate that higher priority backup services were more highly valued than lower priority ones ($0.75/kWh vs. $0.51/kWh), and when given more information regarding the ability to provide partial (20 Amps) services, the difference between the two values grew to $1.2/kWh vs. $0.35/kWh [15••].
There are several challenges associated with implementing the CV method, not least of which are several biases that could be introduced during the interview process. Anchoring bias occurs when the value provided by the respondent depends on the first bid presented. Information bias is evident when the information presented by the interviewer influences the response. Hypothetical bias exists if the context is not viewed as realistic by the respondent. The CV method also fails to address broader macroeconomic effects of long-duration outages, and societal or community effects associated with personal injury, loss of life, or property damage. Like the CDF, nearly all CV studies elicit WTP values for outages defined as under one day [15••]. Finally, extending WTP surveys to other regions and customer types can yield implausible results.
Discrete choice experiments (DCEs) are an approach to overcoming some of the theoretical problems associated with the CV method. DCEs attempt to elicit more precise estimates of WTP by offering respondents choices between two or more discrete alternatives. The alternatives are varied in a manner that allows researchers to extract information from respondents that leads to an indirect value estimate for resilience [15••].
To address the broader economic impacts of service disruptions, some studies employ input-output (I-O) and computable general equilibrium (CGE) models. An I-O model captures inter-industry relationships within an economic system in order to determine how an impact on one industry cascades throughout an economy. In addition to direct economic impacts, I-O models also measure indirect and induced effects. Indirect effects include those relating to purchases of goods and services from companies in the supply chain, while induced effects are those tied to household spending of income earned through business operations. CGE models expand on the theoretical underpinnings of I-O analysis by eliminating the fixed-coefficient character of I-O models, enabling sectors and households to respond to relative factor prices. The CGE model is more dynamic than the I-O model and capable of evaluating substitution effects and changes in consumer spending in response to rising prices caused by scarcity.
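A minimal Leontief I-O calculation shows how a direct final-demand shock propagates into indirect output requirements across sectors. The two-sector technical coefficient matrix below is made up for illustration; real I-O tables cover dozens to hundreds of sectors.

```python
# Minimal Leontief input-output calculation: total (direct + indirect)
# output x needed to satisfy final demand d, solving (I - A) x = d.
# The 2-sector coefficient matrix A used in the test is hypothetical.

def leontief_total_output(A, final_demand):
    n = len(A)
    # form (I - A)
    M = [[(1.0 if i == j else 0.0) - A[i][j] for j in range(n)]
         for i in range(n)]
    x = final_demand[:]
    # Gauss-Jordan elimination (no pivoting; adequate for a small,
    # well-conditioned illustrative system)
    for col in range(n):
        pivot = M[col][col]
        for row in range(n):
            if row != col:
                f = M[row][col] / pivot
                for k in range(n):
                    M[row][k] -= f * M[col][k]
                x[row] -= f * x[col]
    return [x[i] / M[i][i] for i in range(n)]
```

Because each unit of output requires inputs from other sectors, total required output exceeds the final-demand shock itself; the gap is the indirect effect that I-O models capture.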
A simplified CGE model was developed to evaluate the economic impact of a 2-week outage on the San Francisco Bay Area. The study found that in the absence of substitution responses and mitigation efforts tied to investments in backup energy infrastructure, economy-wide net costs reach $1 billion. Substitution responses reduce the impact to $123–$644 million, while investments in backup infrastructure further reduce economic costs to $19–$30 million. Note that this estimate excludes costs to residential customers, which, based on [13••], would total nearly $1.5 billion [15••].
While macroeconomic models capture a more complete picture of resilience, they do not include power flow simulations or model the physical characteristics of an electrical grid, so applying them to specific energy storage investments is very challenging. CGE models are also extremely data-intensive and require significant skill and knowledge to build. Finally, these models do not capture societal or community effects associated with loss of life, injuries, and infrastructure or property damage.