Using information to close the energy efficiency gap: a review of benchmarking and disclosure ordinances

Abstract

Building energy use accounted for 38 % of total US carbon dioxide (CO2) emissions in 2012, and roughly half of those emissions were attributable to the commercial building sector. A new policy that has been adopted in 15 US cities and one US county is a requirement that commercial and sometimes also multifamily residential building owners disclose their annual energy use and benchmark it relative to other buildings. We discuss these nascent policies, summaries of the data that have been collected so far, and how to evaluate whether they are having an effect on energy use and CO2 emissions. Missing or imperfect information is a contributor to the energy efficiency gap, the finding that many low-cost options for improving energy efficiency fail to be adopted. These new laws may be an important step in closing the gap in the commercial and multifamily building sectors, but careful evaluation of the programs will be essential.

Introduction

Building energy use accounted for 38 % of total US carbon dioxide (CO2) emissions in 2012, and roughly half of these emissions were attributable to the commercial building sector (EPA 2014). In many jurisdictions, building codes require minimum levels of energy efficiency in new buildings, but few policies are directed at older buildings. The average age of commercial buildings in the USA in 2011 was 50 years, and apartment buildings were older still (CBI 2012). In many cities, especially in the midwestern and eastern USA, older buildings make up a significant portion of the building stock. In Washington, DC, for example, recent information suggests that over 45 % of the largest commercial buildings are more than 35 years old.Footnote 1

Designing policies to spur retrofits and improvements to existing buildings is difficult. An increasingly popular policy that has been adopted in 15 US cities and one county is a requirement that building owners disclose their annual energy use and benchmark it relative to other buildings.Footnote 2 As of April 2016, the cities of Washington; Austin, Texas; New York; Seattle; San Francisco; Philadelphia; Minneapolis; Boston; Chicago; Cambridge, Massachusetts; Berkeley, California; Atlanta; Portland, Oregon; Kansas City, Missouri; and Boulder, Colorado all had passed local benchmarking and disclosure ordinances, as had Montgomery County, Maryland. This approach is also popular in Europe where it is driven largely by the European Union’s Energy Performance of Buildings Directive, first issued in 2002 and updated in 2010.Footnote 3

The rationale typically given for such laws is that publicizing building energy efficiency will provide valuable information to potential renters, buyers, and financiers, information that is otherwise not widely available. This will make it easier for these market participants to take into account the energy characteristics of buildings, in particular the likely energy costs of building operation, when making purchase, lease, and financing decisions. Gradually, the information is expected to move the commercial and multifamily residential building markets toward greater efficiency as building owners invest in energy improvements in order to compete for tenants and buyers.

In this paper, we review these new laws, assess their potential for closing the “energy efficiency gap,” describe outcomes in some of the early-adopting cities, and discuss the data and analysis necessary to evaluate whether the laws are having an impact on energy use and CO2 emissions. The energy efficiency gap—sometimes called the energy paradox—is the term used for the observation that consumers and firms fail to make investments that appear to more than pay for themselves through the subsequent stream of energy savings (Gillingham and Palmer, 2014; Jaffe and Stavins 1994). Missing and asymmetric information is held up as one driver of the efficiency gap. We describe the nature of these problems in the commercial building market and assess the potential for benchmarking and disclosure laws to solve these information problems.

Cities are beginning to report the outcomes from their benchmarking and disclosure laws—reported energy use numbers and a variety of other statistics—and a handful of academic studies have analyzed the disclosed data. We discuss these findings; however, while the summary statistics and trends are useful for providing some insights into commercial building energy use, they are not sufficient for evaluating whether these new policies are having an impact. We describe the data and methodologies that would be necessary for such an evaluation, summarize one recent study, and provide suggestions for future research. Benchmarking and disclosure laws are growing in popularity. As more cities consider such laws, it is important to understand their impacts on energy use and CO2 emissions.

The energy efficiency gap and buildings

Much has been written about the energy efficiency gap in general, what phenomena might explain it, and the role of different types of policies to address it (Jaffe and Stavins 1994; Gillingham et al., 2006, 2009; Gillingham and Palmer, 2014). Reasons for the gap tend to fall into three categories: market failures, behavioral anomalies, and hidden costs. Within these categories, commercial and multifamily residential buildings exhibit four specific problems that could be relevant for policy design, and each is related in some way to information: missing or imperfect information, principal-agent problems, credit constraints, and “inattentiveness” to energy metrics. We discuss each of these issues in turn and how benchmarking and disclosure laws might address them.

Missing or imperfect information

A building is inherently a bundled good consisting of many attributes, some of which are more readily observable than others. In a commercial or large apartment building, energy efficiency is a function of how a building is constructed and how equipment is operated. Observing features such as the amount of insulation in the walls and the performance of boilers, chillers and air handling systems, and elevators can be difficult or costly. With energy use representing about one third of building operating costs, building owners would be well served by cost-effective energy efficiency investments, but with multiyear paybacks and uncertainty about energy savings, many building owners may not have enough information to make these risky investments.Footnote 4 Compounding the problem are the difficulties owners face in conveying energy efficiency information to potential future buyers.

Principal-agent problems

Information problems may be particularly acute in the face of potential principal-agent problems in real estate markets. The principal-agent problem, also known in this context as the landlord-tenant problem, occurs when one party makes an investment and another party reaps the benefits or pays the costs that result from that investment (Gillingham et al. 2012; Myers 2014). A manifestation of the landlord-tenant problem arises when a landlord pays for the key energy investments, such as insulation and equipment, but the tenant pays the energy bills. The landlord has little incentive to invest in efficiency improvements because he does not directly reap the benefit, nor can he typically recoup the cost through higher rents, since he cannot credibly convey the building’s energy efficiency properties to prospective tenants; these features are difficult to observe.Footnote 5

Credit market failures

Most building owners, especially owners of large commercial buildings, will need to finance any investments they make in energy improvements and retrofits. They may choose to finance internally through their capital or operating budgets, but for some companies, internal competition for capital may favor alternative investments (Palmer et al. 2012). For commercial property owners who are mainly in the real estate business, commercial mortgage underwriting practices present a hurdle. According to Jaffee et al. (2011a, b), energy costs are essentially a “wash” in the net operating income calculations that lenders make and use for mortgage approval: they are a component of operating costs but in most cases are assumed to be offset by tenant lease payments. Lenders evaluate overall risks rather than energy risks, typically setting maximum loan-to-value ratios and minimum debt service coverage ratios (Palmer et al. 2012). Even though owners of buildings with lower energy costs may be at lower risk for default on a loan, it is not common practice for this to be reflected in these ratios.Footnote 6 Again, if better information on likely energy costs going forward were readily available to lenders, it is possible that this problem in credit markets could be resolved.

Rational inattention

A fourth potential problem recently discussed in the literature is inattention to energy efficiency attributes when purchasing an energy-using durable, such as a car or new appliance (Hausman 1979; Sallee 2014). If it takes time and effort to figure out the energy costs associated with a product, it may be rational for a consumer to ignore this attribute when making a purchase decision.Footnote 7 Houde (2014) finds evidence that inattention plays a role in refrigerator purchases, Sallee (2014) in car choices, and Palmer and Walls (2015b) in follow-up by homeowners on energy audits. To our knowledge, there is no empirical study of inattention in commercial buildings. However, real estate transactions and the contracts involved can be complex; thus, less attention may be paid to energy costs than to other variables in those transactions. Inattentiveness often results in choices that are ex post suboptimal, which suggests a potential role for policy (Allcott et al. 2014).

Key features of benchmarking and disclosure laws

Information plays an important role in each of the above four problems in the commercial and multifamily building sector, and benchmarking and disclosure requirements are one form of information provision. But could they alleviate some of the energy efficiency gap problems we described? This section provides more detail about how the requirements work, including implementation, reporting, benchmarking tools, and ancillary requirements.

The first municipal benchmarking and disclosure law was enacted in Washington, DC, in August 2008, followed by Austin, Texas, 3 months later and then New York City a year later. The West Coast cities of Seattle and San Francisco were the next to adopt, in February 2010 and February 2011, respectively. Between May 2012 and June 2013, four additional cities—Philadelphia, Minneapolis, Boston, and Chicago—adopted policies of their own. In late April 2014, Montgomery County, Maryland, a suburb of Washington, DC, became the first county in the country to adopt a benchmarking ordinance. The city of Cambridge, Massachusetts passed an ordinance soon after Montgomery County in July 2014. Since the beginning of 2015, Berkeley, California; Atlanta; Portland, Oregon; Kansas City, Missouri; and Boulder, Colorado have enacted benchmarking and disclosure ordinances.Footnote 8

The benchmarking and disclosure laws adopted in these various cities all bring a building’s energy use to the attention of its owners and occupants, as well as potential tenants or new owners and those who might finance any real estate transactions or property investments. In many cases, the information is also disclosed to the public at large via a government website and published government reports. We summarize several of the key parameters of the policies in each city in Table 1. All of the laws cover commercial buildings, although the minimum building size varies across the cities. Several of the laws also cover multifamily residential buildings. In most of the cities, buildings have been or are being phased in over time by size, with the largest buildings required to report first.

Table 1 Municipal benchmarking and disclosure ordinance provisions for privately owned buildings

In addition to minimum size thresholds, each benchmarking law specifies a set of additional provisions regarding the reporting and disclosure of building energy use information. Of the 16 localities, 13 require disclosure by municipal government buildings (not shown in Table 1), and 11 cities include multifamily residential buildings. All of the laws require that energy use be reported to the government, and most require disclosure on a public website of some subset of that information, sometimes with a delay or exempting the first year of data from public disclosure. Austin, Berkeley, and Seattle do not require public disclosure of building-level data, but instead require disclosure as a part of certain real estate transactions or to current building tenants.

All of the cities have very similar reporting requirements. Building owners or their energy providers are required to submit 12 months of electric and natural gas bills (as well as other energy purchases and purchases of district steam) and certain building characteristics, including gross square footage, year built, and operating hours, to the administering agency in the city. (Many of the localities, with the exception of Austin, Boulder, Chicago, Portland, San Francisco, Seattle, and Montgomery County, require reporting of water usage as well.) Additionally, Atlanta, Austin, Berkeley, Boston, Boulder, Cambridge, New York, and San Francisco all require buildings to submit engineering audit data. For example, Local Law 87, which covers some of the additional requirements under New York’s benchmarking program, stipulates that covered buildings of 50,000 gross square feet or more must undergo an energy audit every 10 years. The program ordinances for Boulder, Cambridge, New York, and San Francisco also contain mandatory retrocommissioning provisions for buildings that do not meet a minimum level of performance, while Atlanta has an optional retrocommissioning provision. Retrocommissioning involves a systematic process for identifying inefficiencies in a building’s equipment, lighting, and control systems and making changes to improve their functioning without system replacements. For benchmarking energy use against other buildings, most of the laws require (and all allow) the use of EPA’s Energy Star Portfolio Manager (PM) software program. We describe how the PM program works in Box 1.

Because public disclosure has yet to happen in several of the cities, exactly what building-level information will be reported is not yet known, but several of the ordinances list only energy use intensities and Energy Star scores. Energy Star scores are based on measures of source energy use intensity, which captures the energy inputs used to create electricity and the effects of losses in the production and transmission processes on the total energy requirements for delivering electricity-based energy services to the building.Footnote 9 A small adjustment also applies to natural gas to capture losses in distribution. Source energy use allows for more relevant comparisons across buildings than site energy use. However, disclosure of the actual fuel and electricity consumption would provide a richer source of information. As the first city to report building-level benchmarking results, New York is providing both source and site energy use intensity measures and the building’s Energy Star score. The city also reports greenhouse gas emissions and water usage. In its data release for 2011 and 2012, Washington went one step further, reporting annual energy use by type (electricity, natural gas, etc.), building owner, and year built, as well as GIS coordinates. In the Appendix, we provide a table with more details about program requirements in each city.
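The site/source distinction can be made concrete with a short sketch. Site EUI divides metered energy use by floor area; source EUI first scales each fuel by a source-energy conversion factor. The factors and function names below are illustrative placeholders, not EPA's official Portfolio Manager values:

```python
# Illustrative sketch of site vs. source energy use intensity (EUI).
# The conversion factors are placeholders: grid electricity carries a
# large generation/transmission-loss multiplier, natural gas a small
# distribution-loss adjustment, roughly in the spirit of Portfolio Manager.
SOURCE_FACTORS = {"electricity_kbtu": 2.8, "natural_gas_kbtu": 1.05}

def site_and_source_eui(fuel_use_kbtu, floor_area_sqft):
    """Return (site EUI, source EUI), both in kBtu per square foot.

    fuel_use_kbtu: annual metered use by fuel, keyed as in SOURCE_FACTORS.
    """
    site = sum(fuel_use_kbtu.values())
    source = sum(SOURCE_FACTORS[f] * use for f, use in fuel_use_kbtu.items())
    return site / floor_area_sqft, source / floor_area_sqft
```

Because the electricity multiplier is much larger than the gas multiplier, two buildings with identical site EUIs can have quite different source EUIs depending on their fuel mix, which is why source energy supports more meaningful cross-building comparisons.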

Box 1 Benchmarking building energy use with EPA’s Energy Star Portfolio Manager

Initial outcomes in eight cities

Benchmarking and disclosure laws are still in their infancy. New York City has the most data, with 4 years of data for privately owned buildings available as of April 2016. The city has released several summary reports, and academics have published independent analyses of some of the early data (Kontokosta 2012, 2014; Hsu 2012). Washington, DC, has 4 years of data available in spreadsheet format but a summary report for only 1 year. A few of the other cities have released some initial summary information. In this section, we briefly summarize the main findings of these studies, focusing on average energy use intensities, median Energy Star scores, and relationships among building size, age, and energy use, as well as issues related to data quality and rates of compliance with the law.

Table 2 shows the number and square footage of benchmarked buildings, along with compliance rates (i.e., the percentage of buildings required to report that actually reported), average energy use intensities, and median Energy Star scores. All of the cities report very high compliance rates, from 83 % in Washington, DC, to as high as 99 % in Seattle. In most of the cities, building owners did not meet the requirements by the original reporting dates set in the laws, but after city prompting and the threat of fines, compliance rates increased. Table 2 gives the final rates reported by the cities for the year listed. In Washington, DC, compliance rates vary substantially by building type, with office buildings at 88 % and multifamily buildings at 73 %, whereas only 52 % of retail buildings and 42 % of hospitals are in compliance (DDOE 2014).

Table 2 Benchmarking and disclosure results in selected cities

The table makes clear that New York City dwarfs other cities in terms of the number and square footage of the buildings required to disclose. In fact, the city reported in 2014 that the square footage covered by its law accounted for 52 % of the square footage covered by all disclosure and benchmarking laws in that year (City of New York 2014).

The median Energy Star score for commercial buildings in each of the seven cities for which we have this information is above the nationwide average of 50. A score of 75 is needed for a building to be Energy Star certified, so in Washington, DC and San Francisco, the median score is high enough for certification. In San Francisco, Hooper (2013) reports that buildings reporting an Energy Star score of 75 or above accounted for 93 % of the total floor area in benchmarked buildings. In Washington, which has the largest number of Energy Star-certified buildings in the USA, 76 % of the buildings have a score of 75 or above.Footnote 10 In Seattle, 43 % of the buildings have Energy Star scores of 75 or above, and 17 % received a score of 91 or above (City of Seattle 2015).

Looking into the data and published analyses in more detail reveals some consistent and noteworthy findings across cities. First, all cities that present relevant data on ranges of EUIs show a significant amount of variability in energy use across buildings, even within the same building type category. Among office buildings, for example, the source EUI at the 95th percentile in New York is seven times that at the 5th percentile; in Washington, the 95th percentile is 2.4 times the 5th (DDOE 2014; City of New York 2014). Similar findings show up in the data for the other cities. The Seattle and New York reports provide calculations of the energy savings potential that could be reached if the poorer-performing buildings improved. If buildings with above-median energy use intensity improved just enough to reach the median, Seattle estimates that total energy use in all buildings would decrease by 3.46 billion kBtu, roughly 22 %; in New York, based on 2011 disclosed data, the figure was 18 % (City of Seattle 2015; City of New York 2012).Footnote 11
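The savings-potential calculation behind these figures can be sketched as follows, assuming (as we cannot verify from the city reports) that it amounts to capping each above-median building's EUI at the portfolio median and recomputing total energy use; the function name is our own:

```python
import statistics

def savings_if_worst_reach_median(euis, floor_areas):
    """Fraction of total energy saved if every building with an EUI above
    the median reduced its EUI to the median (higher EUI = worse performer).

    euis: energy use intensities (kBtu per square foot) by building.
    floor_areas: gross square footage by building, same order.
    """
    med = statistics.median(euis)
    total = sum(e * a for e, a in zip(euis, floor_areas))
    capped = sum(min(e, med) * a for e, a in zip(euis, floor_areas))
    return 1 - capped / total
```

For example, three equal-sized buildings with EUIs of 10, 20, and 30 would save one sixth of total energy if the worst building fell to the median of 20. Note that weighting by floor area matters: a few large, inefficient buildings can dominate the aggregate savings estimate.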

The second consistent finding across several of the cities is the relationship between energy use and building age. In all cities, older office buildings use either less energy than newer ones or roughly the same amount. Comparing averages across building age categories, older buildings in New York appear to use decidedly less energy, and the city speculates that this may be due to less extensive ventilation, better insulation, and a lower intensity of use in older buildings (City of New York 2014). Looking at earlier data reported by buildings in New York City, Kontokosta (2014) also finds that newer buildings consume approximately 40 % more energy per square foot, on average, than buildings built before 1930. New York and Seattle find little difference in energy use in multifamily buildings across age categories, though each city shows a peak for buildings constructed in the 1970s. Kontokosta (2013) estimates a regression model with the New York data, regressing source EUI on many building characteristics and dummy variables for various age categories. The only statistically significant coefficient on the age dummies is for buildings 81 or more years old, which have lower energy use, all else equal. Analysis of EUI data for office buildings in Boston shows that those built before 1950 use less energy per square foot than those built later. DDOE (2014) also estimates a regression with the Washington data and finds no statistically significant effect from building age. Similarly, the building-level data reported in Minneapolis, Chicago, and Philadelphia show no relationship between EUI and building age (City of Minneapolis 2016; Mayor’s Office of Sustainability 2014).
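The general form of these age-category regressions can be sketched on synthetic data. The specification below (source EUI on log floor area plus age-category dummies, with the oldest category omitted as the base group) is a simplified stand-in for the much richer set of building controls used in the actual studies; all names and data are our own:

```python
import numpy as np

# Synthetic data loosely mimicking a benchmarking dataset.
rng = np.random.default_rng(0)
n = 200
log_sqft = rng.normal(11.0, 1.0, n)        # log gross square footage
age_cat = rng.integers(0, 3, n)            # 0: pre-1930, 1: 1930-80, 2: post-1980
true_age_effects = np.array([-10.0, 0.0, 5.0])  # assumed EUI shifts by age group
eui = 80.0 + 2.0 * log_sqft + true_age_effects[age_cat] + rng.normal(0.0, 3.0, n)

# OLS design matrix: intercept, log sqft, and dummies for the two newer
# age categories (pre-1930 buildings are the omitted base group, so the
# dummy coefficients measure EUI relative to the oldest buildings).
X = np.column_stack([np.ones(n), log_sqft, age_cat == 1, age_cat == 2])
beta, *_ = np.linalg.lstsq(X, eui, rcond=None)
# beta[2], beta[3]: estimated EUI difference vs. pre-1930 buildings.
```

In this setup the dummy coefficients recover (up to sampling noise) the assumed gaps of 10 and 15 kBtu per square foot relative to the oldest category, which is the sense in which such a regression tests whether older buildings use less energy "all else equal."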

Finally, the cities all show wide variation in median EUIs by building category. While the cities do not report exactly the same categories, the patterns show some consistencies. In Minneapolis, Seattle, Philadelphia, Chicago, and Washington, hospitals have the highest or second-highest median EUIs, and K-12 schools have relatively low median EUIs in comparison with other building categories.Footnote 12 A graphic for Seattle reported in Baker (2013) highlights the differences: the two highest-use categories, supermarkets and hospitals, have median site EUIs that are 5.4 to 7 times greater than those of the lowest two categories, warehouses and multifamily buildings.Footnote 13 In Boston, laboratories have the highest median EUI while office buildings have the lowest.

The data collected so far provide a window into commercial building energy use in a limited set of cities and a general sense of some interesting relationships and patterns. They do not, however, tell us anything about the effects that the benchmarking and disclosure laws might be having on energy use. Even analyzing trends over time in Energy Star scores and EUIs for buildings covered by the laws in a particular city does not tell us whether the laws are responsible for those trends.

Evaluating energy savings: mechanisms, limitations, and data needs

In this section, we describe the mechanisms by which the benchmarking and disclosure laws might have an effect on energy use, the limitations of the laws, and the kinds of data and statistical techniques needed to conduct an evaluation of their effect on energy use.

Mechanisms

We see three ways in which the laws may directly lead to reductions in energy use and emissions. First, if market participants are currently inattentive to energy costs, the simple act of entering energy use and building characteristics into Portfolio Manager may bring energy issues into focus for building owners and lead to some changes in building operations to lower energy costs and changes in contract structures to address those costs. As we explained above, the extent to which inattention exists for commercial buildings is an open question, but some studies have identified it as a problem in other settings. Moreover, peer effects have also been shown to influence energy consumption (Allcott 2011; Costa and Kahn 2013; Ayres et al. 2013); thus, if building owners see their energy use benchmarked against other buildings, this may reinforce the attentiveness effect.

Second, if tenants prefer to lease space in more efficient buildings and the disclosure laws provide new energy information to the marketplace, this could lead to improvements in efficiency. Prospective tenants may get value from both private and public good aspects of energy efficiency (Kotchen 2006). In terms of private benefits, tenants may prefer to rent in efficient buildings in order to lower their energy bills or because they are more comfortable. But prospective tenants may also have “green” preferences. Such preferences have been found by Kahn (2007) to exist in the market for hybrid cars and by Kotchen and Moore (2008) in the market for green electricity. Building owners would respond to these market pressures by making improvements and retrofits as a means of competing for tenants.Footnote 14 Research on commercial building certifications and real estate values suggests that both Energy Star and LEED certified buildings have higher rental values and higher sales prices than non-certified buildings (Eichholtz et al. 2010, 2013). Studies of commercial buildings in Europe have found similar results (Kok and Jennen 2012), and an analysis of apartment building sales in Singapore finds that green-certified properties sell at a premium (Deng and Wu 2014).

A third way that the requirements may have an effect is through investor behavior. Many commercial buildings are owned by real estate investment fiduciaries or real estate investment trusts (REITs). REITs are similar to mutual funds and are traded on public stock exchanges. Investors could prefer more efficient buildings because the lower energy costs increase net income, because of “green” preferences, or as a quality signal to prospective tenants. This increased demand by investors could drive up the value of more efficient buildings.Footnote 15 The market for REITs may already be moving in this direction. In late 2012, the National Association of Real Estate Investment Trusts (NAREIT), the US Green Building Council, and FTSE Group, a British provider of stock market indices and related services, announced a new green property index (Thomas 2012). While the index will be based on LEED and Energy Star certification, it is possible that the next step could be an index based on data from disclosure requirements.

Potential limitations

There are several reasons to be cautious about the ability of these requirements to provide significant reductions in energy use. In some cities (Austin, Berkeley, and Seattle), energy use information is not being made available to the public, but only to tenants, prospective tenants, and others involved in real estate transactions. Having the information readily available to the public, such as on a government website, is preferable. Even where the information is public, though, it is not clear how helpful it is to prospective tenants trying to choose space to lease based on expected energy costs. In New York, data for each building are available both in Excel spreadsheet form and on the NYC Open Data platform and include annual average source and site energy use intensity, along with an Energy Star score. The Energy Star score is an index useful only for comparison among similar types of buildings. Moreover, in a large building, the average energy use intensity also may not be that helpful, as it may not be representative of the particular space a prospective tenant is considering leasing. The EUI provides only a rough indicator of expected energy costs, which is the information the tenant needs for decision making. In cities that use Portfolio Manager, building owners must report energy use separately for natural gas, electricity, and other fuels, and in Washington and Philadelphia, this detailed information is included in the public disclosure. In our view, this is an improvement, as prospective tenants can use local prices to estimate costs and compare the numbers with those on their current utility bills.
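The kind of calculation a prospective tenant could make with fuel-level disclosures can be sketched as follows. The prices and the simple floor-area proration are illustrative assumptions, not actual tariffs or lease terms, and the function name is our own:

```python
# Illustrative $/unit energy prices; a real tenant would substitute
# current local utility rates for these assumed values.
PRICES = {"electricity_kwh": 0.12, "natural_gas_therm": 1.10}

def expected_annual_cost(disclosed_use, tenant_share):
    """Rough expected annual energy cost for a prospective tenant.

    disclosed_use: whole-building annual use by fuel, in the units keyed
        in PRICES (as disclosed under, e.g., the Washington ordinance).
    tenant_share: tenant's leased fraction of gross floor area, used here
        as a crude proxy for the tenant's share of energy use.
    """
    whole_building_cost = sum(PRICES[f] * qty for f, qty in disclosed_use.items())
    return tenant_share * whole_building_cost
```

Even this simple estimate illustrates why fuel-level disclosure is more useful than an EUI or Energy Star score alone: it can be priced out and compared directly against the tenant's current utility bills, though the proration step inherits the problem noted above, that whole-building averages may not represent any particular leased space.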

In most cities, building owners are required to report whole-building energy use, and in Atlanta, Berkeley, Boston, Cambridge, Chicago, Kansas City, Minneapolis, Montgomery County, New York, Philadelphia, Portland, Seattle, San Francisco, and Washington, nonresidential tenants are required to provide the data to their landlords. Obtaining information from tenants can be difficult, however, and this is another reason to be concerned about the quality of the information disclosed.Footnote 16 The general issue of energy billing data access in benchmarking and disclosure requirements has been identified as a key issue for utilities and their regulators (SEE Action 2013). Washington may be ahead of some other cities in this regard. The District worked out an agreement with the local electric utility, Pepco, under which Pepco will provide building-level billing data to authorized requestors—namely, building owners and their agents—when five or more accounts are present in a building and a single account does not represent more than 80 % of total energy consumption for the building (DDOE 2014).Footnote 17 Use of this service was optional for the 2012 reporting year but is required for 2013 and beyond. Pepco is also the service provider for most of Montgomery County, Maryland, and is providing the County with building-level data in the same way as in Washington. Seattle has also facilitated and now requires automatic upload of energy use data by utilities into the PM software. In Boulder, Xcel Energy is providing automatic uploading of whole-building energy consumption data into Portfolio Manager.

Another concern, pointed out by Stavins et al. (2013), is the veracity of the information disclosed. One problem in this regard is the estimate of building size that is used to calculate the EUI. In some cities, such as Minneapolis, the ordinances provide no guidance on what to use for size. In others, such as Chicago, the ordinance is very specific, listing exactly which areas to include.Footnote 18 Ordinances in Montgomery County, Berkeley, Boulder and Seattle are also very specific about what to include in the floor space calculation. However, it still is not clear that all building owners will calculate square footage in the same way, and periodic independent verification may not be enough to adequately maintain a consistent standard for this measurement. In Washington, only 12 % of buildings reported exactly the same square footage as what is recorded in the tax records (DDOE 2014). The numbers reported in the disclosure requirements are generally larger than those in the tax records. Without further information, it is not clear which numbers are more accurate.

Kontokosta (2013), who has carefully studied the New York program, also argues that manual input of the energy disclosure data leads to significant errors. His analysis using PM data from the New York City benchmarking program identifies some common data entry problems, such as a frequent misallocation of energy consumption data when two buildings on separate parcels share the same meter (Kontokosta 2014).

Despite these concerns, the laws provide an important source of information on building energy use that was previously unavailable. This is particularly true for the confidential data on building characteristics and use that feed into Portfolio Manager and that will make possible more detailed analysis of how building features and use affect energy use intensity. In some cities, the data that these laws provide are also being used by utilities and other entities that operate energy efficiency programs, such as the DC Sustainable Energy Utility, to target investment of rate payer and public dollars into buildings where the data suggest there are large unrealized opportunities for energy savings.Footnote 19

From a broader policy perspective, these new laws may be serving as useful real-world experiments. Information provision is widely touted as something that will be necessary to overcome the energy efficiency gap. A careful evaluation should be able to shed light on whether benchmarking and disclosure laws are serving that role.

Data needs for evaluation

The data that building owners are required to report are useful for understanding the components and drivers of building energy use, but they are not sufficient for assessing the full effects of the policy. Performing such an assessment requires a representation of what energy demand would have been in the absence of the program, which by definition is unobservable. As a substitute, analysts need data for a control or comparison group that approximates energy use under baseline conditions in buildings that are subject to the policy (SEE Action 2012). Energy use in affected buildings before the policy takes effect (which is required for reporting in some cities, including Washington) is a potential baseline. However, because other factors that affect energy use, such as weather or economic conditions, also change over time, the pre-policy data are generally insufficient, and an analysis that compares the use of energy in affected buildings before and after the policy takes effect could confound the effects of the policy with other factors, thereby producing a biased estimate of the program effects (Angrist and Pischke 2009, 2010). A better comparison group is one that allows the analyst to capture the effects of other factors that change over time and distinguish those effects from the effects of the policy.

The inclusion of building size thresholds in the design of benchmarking and disclosure laws creates a natural experiment that provides a well-defined control group for assessing program effects. Buildings that fall just short of the minimum size threshold are similar to those just above the threshold. Thus, one could compare energy use before and after the policy takes effect between these two groups of buildings, controlling for other factors such as weather. This should provide an unbiased estimate of the energy savings resulting from the policy. A regression discontinuity approach enables such an evaluation (Imbens and Lemieux 2008). Another possibility is to compare buildings in cities with benchmarking and disclosure laws before and after adoption of the laws with buildings in other cities. A difference-in-differences regression approach could be employed (Meyer 1995; Angrist and Pischke 2009).
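The difference-in-differences logic, in its simplest 2 × 2 form, can be sketched as follows. The buildings and energy figures are entirely synthetic, and a real evaluation would use a regression with weather controls and property fixed effects rather than raw cell means:

```python
# Minimal sketch of a 2x2 difference-in-differences comparison on
# synthetic data: "treated" buildings are covered by a disclosure law,
# "control" buildings are comparable uncovered ones. All numbers invented.
from statistics import mean

# (group, period, energy use per square foot); "pre"/"post" refer to
# before/after the law's first reporting deadline.
observations = [
    ("treated", "pre", 80.0), ("treated", "pre", 84.0),
    ("treated", "post", 76.0), ("treated", "post", 78.0),
    ("control", "pre", 79.0), ("control", "pre", 81.0),
    ("control", "post", 78.5), ("control", "post", 80.5),
]

def cell_mean(group: str, period: str) -> float:
    return mean(v for g, p, v in observations if g == group and p == period)

# Change over time in each group, then the difference of those changes;
# the control group's change nets out shocks common to both groups.
treated_change = cell_mean("treated", "post") - cell_mean("treated", "pre")
control_change = cell_mean("control", "post") - cell_mean("control", "pre")
did_estimate = treated_change - control_change  # effect attributed to the law

print(f"DiD estimate: {did_estimate:+.1f} per sq ft")
```

The same skeleton underlies a regression discontinuity design, except that treatment status is assigned by the size threshold and the comparison is confined to buildings near it.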

Conducting either of these analyses requires energy consumption data beyond that collected under the policy. In a regression discontinuity approach, data would be needed for buildings that lie below the minimum size threshold in cities that have benchmarking and disclosure laws. In a difference-in-differences model, data from buildings in cities that have not passed these laws would be needed. These data are typically in the possession of utilities and subject to strict confidentiality requirements. However, overcoming this hurdle is paramount to a full evaluation of how well the policy is working.Footnote 20

Independent sources of data may be available in some cases to carry out one of these approaches. In recent research, Palmer and Walls (2015a) use a national dataset on investor-owned commercial office buildings to assess the impact of disclosure and benchmarking requirements.Footnote 21 The study employs a difference-in-differences regression model to compare utility expenditures per square foot in office buildings in cities with and without benchmarking policies, before and after the initial reporting deadlines in each of four early adopter cities: Austin, New York City, San Francisco, and Seattle.Footnote 22 The results indicate that disclosure laws have a statistically significant negative effect on utility expenditures after the first reporting deadline. In the central specification, which includes property-level fixed effects and thus controls for many unobserved building-level characteristics, the results show that, all else equal, utility expenditures per square foot are approximately 3 % lower after the laws’ reporting requirements take effect in office buildings covered by the laws.Footnote 23

Other performance metrics: emissions and costs

Energy reductions matter for purposes of reducing CO2 emissions and slowing global warming, and thus it is important to consider the ultimate effects of disclosure policies—and all energy policies—on emissions. In addition, the best policies are the ones that have the largest impact on emissions at the least cost, so assessing the policies’ cost is also important.

The relationship between energy savings and CO2 emissions reductions is not a matter of simple multiplication by a single emissions factor, although this approach is a typical one in many evaluations. Emissions reductions from benchmarking and disclosure requirements are likely to differ across cities, as the mix of fuels used for heating and the demand for heating vary across regions of the country, as does the mix of fuels used to produce electricity, which can even vary by time of day and year within a particular region. This suggests that the effectiveness as well as the cost-effectiveness of energy efficiency as an emissions reduction strategy—whether through benchmarking and disclosure requirements or a host of other efficiency policies—will vary across cities and states that rely on these policies for emissions reductions.
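A back-of-the-envelope sketch shows why identical energy savings map to different emissions reductions across grids. The regional emissions factors here are illustrative round numbers, not official grid data:

```python
# Illustrative only: identical electricity savings imply different CO2
# reductions under different regional emissions factors. The factors
# below are hypothetical round numbers, not official grid data.

illustrative_factors_kg_per_kwh = {
    "coal-heavy region": 0.90,
    "gas-heavy region": 0.45,
    "hydro-heavy region": 0.10,
}

savings_kwh = 1_000_000  # hypothetical annual savings from one building

avoided_tonnes = {
    region: savings_kwh * factor / 1000  # kg CO2 -> metric tons
    for region, factor in illustrative_factors_kg_per_kwh.items()
}

for region, tonnes in avoided_tonnes.items():
    print(f"{region}: {tonnes:,.0f} t CO2 avoided per year")
```

Under these assumptions, the same building retrofit avoids nine times more CO2 on a coal-heavy grid than on a hydro-heavy one, which is why a single national emissions factor can badly misstate a program's effect.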

The cities adopting the ordinances and many analysts suggest that benchmarking programs are relatively low cost in comparison with other policies, especially policies targeting energy use and emissions in older buildings. Cox et al. (2013) estimate compliance costs in New York City at approximately $200 per building. Hsu (2014b) cites personal communication with experts who put the cost at between $500 and $1500 per building. Over time, as more (and typically smaller) buildings are brought into the programs, it will be important for the cities implementing the laws to assess their costs more carefully. The appropriate measure of costs is the full welfare cost—that is, an estimate of the value of the resources diverted from other uses. It is also important to incorporate the costs of monitoring and enforcing compliance with the laws.Footnote 24

Conclusions

Many energy efficiency improvements have been identified as “low-hanging fruit” to reduce US energy use and CO2 emissions (McKinsey and Company 2009). Several of these options have to do with improvements and retrofits to buildings, which account for approximately 40 % of US energy use. Finding effective and low-cost ways to spur building owners to make these improvements, however, is an ongoing challenge for policymakers. Thus far, 16 local jurisdictions have stepped up to this challenge by passing new energy benchmarking and disclosure ordinances, and several other localities are considering following their lead. In this paper, we have described how these policies work and how they might move the commercial and multifamily building markets toward improved efficiency.

The laws require building owners to provide energy information that may otherwise be obscured from the marketplace. Buildings are complex; prospective tenants and buyers consider a variety of attributes when making lease and purchase decisions, and energy attributes may be low on the list simply because the relevant information is difficult to obtain. Disclosure laws should make at least some of this information easier to get. The laws could also ease some of the problems building owners face in making retrofit and improvement decisions. Owners may want to reduce energy use so as to lower their buildings' operating costs, but they may believe they cannot recoup the investment costs through rents, or they may be unable to persuade creditors to lend to cover the costs. Because benchmarking and disclosure make energy costs more transparent, they may help overcome these problems.

As currently designed, the laws do have some potential shortcomings. The heavy reliance on Portfolio Manager and the limitations of that software—namely, calibration to the outdated 2003 CBECS data, which include limited numbers of some types of buildings—is a concern. More problematic, perhaps, than Portfolio Manager itself is the reporting, in most cities, of only EUIs and Energy Star scores. These pieces of information might not be all that useful for prospective tenants and buyers. A survey of real estate agents could shed some light on the extent of this problem and reveal if prospective tenants and buyers are using the EUI and Energy Star scores in making their decisions. Our concern is that the actual energy costs for leased space that a prospective tenant is considering may be only weakly correlated with the building’s EUI and Energy Star score.

Perhaps our greatest concern is in the evaluation of these laws as time moves forward. Measurement and verification of energy savings from energy efficiency programs in general are fraught with problems. Access to the requisite utility billing data is difficult to obtain, and simple comparisons of energy use by program participants before and after the program intervention generally are not sufficient to identify the effects of a policy for a variety of reasons. In the case of disclosure requirements, we are concerned that cities will report average Energy Star scores or summary EUI statistics over time and draw conclusions about the efficacy of the requirements. It is essential that independent researchers conduct careful and systematic evaluations that rely on data for both affected and nonaffected buildings from time periods before and after the program takes effect. This type of evaluation will be necessary to understand how effective benchmarking and disclosure requirements are in narrowing the energy efficiency gap.

Notes

1.

    Percentage calculated from disclosed energy use data from the District Department of the Environment (DDOE), http://ddoe.dc.gov/node/784702.

2.

    Two states, California and Washington, have also adopted benchmarking and disclosure laws, but this paper focuses on local ordinances.

3.

    The Energy Performance of Buildings Directive called for large buildings to be certified at regular intervals and for energy performance certificates to be included in all advertisements for rentals or sales of buildings. According to Leipziger (2013), 31 European countries have established energy rating systems in response to the EU Directive. Leipziger compares and contrasts the rating programs in six European countries (Germany, UK, Ireland, France, Denmark and Portugal) as well as programs from China, the USA and Canada. Consistent with the EU Directive, the European programs typically cover all types of buildings, including single family homes, although separate rating schemes apply to different building types.

4.

    See https://www.energystar.gov/ia/business/challenge/learn_more/CommercialRealEstate.pdf (accessed April 11, 2014).

5.

    The principal-agent problem in rental properties does not go away if the landlord pays the bills; in this case, tenants have no incentive to economize on their energy use.

6.

    Well-known studies by Akerlof (1970) and Stiglitz and Weiss (1981) show that credit rationing can be an equilibrium outcome in situations in which lenders cannot distinguish ex ante between high-risk and low-risk borrowers. This result may apply to energy investments: if lower energy costs make a borrower less likely to default but this is difficult for the lender to observe, many low-risk borrowers may not get loans.

7.

    Gabaix (2014) develops a general model of bounded rationality and limited attention and shows that it is optimal for an agent to pay more attention to choices that have greater variability, that matter more for the ultimate outcomes, that have lower information costs, and that lead to relatively large losses if an imperfect choice is made.

8.

    See http://www.buildingrating.org/content/us-policy-briefs. Delays in the development of enabling regulation in Washington, DC, postponed the initial reporting deadline there by several years, until April 2013.

9.

    To make the comparisons robust to fluctuations in weather, the energy use intensity measures are adjusted for deviations in weather around a typical year for the relevant location.

10.

    Statistics calculated from disclosed Washington data; 76 % of the square footage also has a score of 75 or above. Interestingly, many of these buildings have not gone through the process of becoming Energy Star certified. DDOE (2014) reports that only 55 % were certified in 2012 or 2013, and more than one-third have never been certified in any year.

11.

    The New York calculation is the percentage reduction in large buildings—those over 50,000 square feet, which account for 45 % of the city’s total energy consumption across all sources (including transportation).

12.

    In Chicago, the building category with the highest median EUI is actually health care, which includes hospitals but also medical offices, urgent care facilities and outpatient surgical centers among others.

13.

    Supermarkets are the highest energy use category in Seattle, but this category is not reported separately by Washington or New York.

14.

    This market pressure argument is the main rationale that the cities usually give for adoption of the programs. The Washington, DC, Green Building Report states that “transparent building performance information is expected to drive the real estate market toward greater energy efficiency, without explicitly requiring that retrofit improvements be made” (DDOE 2014).

15.

    Studies by Hamilton (1995) and Khanna et al. (1998) have found stock market effects on firms that disclose their toxic chemical releases in the Toxics Release Inventory.

16.

    While separate metering of tenant energy use poses challenges for whole building disclosure, separate billing of tenants does provide a way to give individual tenants a full sense of the costs of their own energy use and a direct incentive to reduce energy consumption in order to reduce those costs.

17.

    The utility and the government feel that there are privacy concerns when the number is below five.

18.

    City of Chicago, amendment of Title 18 of Municipal Code by adding new Chapter 18–14 regarding building energy use benchmarking (June 26, 2013).

19.

    See http://green.dc.gov/release/district-releases-benchmarking-performance-large-privately-owned-buildings.

20.

    Challenges remain even if utility-level data are available. While cities with benchmarking laws will collect information on building features and use that would affect energy consumption, such information typically will not be available to the utilities for buildings that are not covered by the reporting requirements. Analysts would need to use property fixed effects or find a way to match with other available data, such as tax records, which poses its own set of challenges. Furthermore, many of the cities other than New York City do not have a large number of buildings covered by the laws; further restricting the data to buildings just above and below the threshold, in order to estimate a regression discontinuity model, may leave very small samples.

21.

    The data come from the National Council of Real Estate Investment Fiduciaries (NCREIF), a member-based organization that represents the institutional real estate investment community. NCREIF has maintained a property database from its members since 1979 that includes quarterly information on income and cash flow, property valuation, capital improvement expenditures, operating expenditures, and other information; since 2000, the database has also included quarterly utility expenditures.

22.

    Buildings in Washington are excluded because of the long delay between passage of the DC law and the initial reporting date.

23.

    A limitation of the analysis is that there is no way to separate energy from water in the utility expenditures. Energy accounts for 90 % of average utility bills, but the share can vary by location (Romani et al. 2009).

24.

    Based on his evaluation of the New York program, Kontokosta (2013) claims that it is costly and time-consuming for building owners to assemble the correct data, enter them into the Portfolio Manager program, and report the required information to the government. Collecting energy data from tenants can be particularly time-consuming and difficult. Third-party companies are entering the market to collect and manage data, however, and even file benchmarking reports on behalf of property owners. These services come at a cost but as the market matures, costs may come down.

References

  1. Akerlof, G. A. (1970). The market for “lemons”: quality uncertainty and the market mechanism. Quarterly Journal of Economics, 84, 488–500.

  2. Allcott, H. (2011). Social norms and energy conservation. Journal of Public Economics, 95(9–10), 1082–1095.

  3. Allcott, H., Mullainathan, S., & Taubinsky, D. (2014). Energy policy with externalities and internalities. Journal of Public Economics, 112, 72–88.

  4. Angrist, J. D., & Pischke, J.-S. (2009). Mostly harmless econometrics: an Empiricist’s companion. Princeton, NJ: Princeton University Press.

  5. Angrist, J. D., & Pischke, J.-S. (2010). The credibility revolution in empirical economics: how better research design is taking the con out of econometrics. Journal of Economic Perspectives, 24(2), 3–30.

  6. Ayres, I., Raseman, S., & Shih, A. (2013). Evidence from two large field experiments that peer comparison feedback can reduce residential energy usage. Journal of Law Economics and Organization, 29(5), 992–1022.

  7. Baker, Rebecca. 2013. Seattle’s Benchmarking and Reporting Program: 2012 Analysis. Presentation to Institute for Market Transformation, The Energy Data Revolution, November 14.

  8. Burr, Andrew. (2013). Building Energy Benchmarking and Disclosure: US Policy Overview. Presentation to US Department of Energy Better Buildings Summit, Washington, DC, May 30–31.

  9. CBI (Commercial Building Inventory). (2012). The Age of US Commercial Buildings. Available at http://www.commbuildings.com/Research_Reports_Main.htm (accessed April 21, 2014).

  10. City of Boston. (2015). Energy and Water Use in Boston’s Large Buildings, 2014. Available at http://www.cityofboston.gov/images_documents/BERDO_rprt_webfinal_tcm3-52025.pdf (accessed May 15, 2016).

  11. City of Chicago. (2015). 2015 Chicago Energy Benchmarking Report. Available at http://www.cityofchicago.org/content/dam/city/progs/env/EnergyBenchmark/2015_Chicago_Benchmarking_Report_Web_16DEC2015.pdf (accessed May 15, 2016).

  12. City of Minneapolis. (2016). 2014 Energy Benchmarking Report. Available at http://www.minneapolismn.gov/www/groups/public/@health/documents/images/wcmsp-176597.pdf (accessed May 15, 2016).

  13. City of New York. (2012). New York City Local Law 84 Benchmarking Report. New York: Mayor’s Office of Long-Term Planning and Sustainability (August).

  14. City of New York (2014). New York City local law 84 benchmarking report. New York: Mayor’s Office of Long-Term Planning and Sustainability.

  15. City of Seattle. (2015). Seattle Building Energy Benchmarking Analysis Report 2013 Data. January. Seattle: City of Seattle Office of Sustainability and Environment (OSE).

  16. Costa, D., & Kahn, M. (2013). Energy conservation “nudges” and environmental ideology: evidence from a randomized residential electricity field experiment. Journal of the European Economic Association, 11(3), 680–702.

  17. Cox, M., Brown, M., & Sun, X. (2013). Energy benchmarking of commercial buildings: a low-cost pathway toward urban sustainability. Environmental Research Letters, 8(3).

  18. DDOE (District Department of the Environment) (2014). Green building report 2012. Washington, DC: DDOE.

  19. Deng, Y., & Wu, J. (2014). Economic returns to residential green building investment: the developers’ perspective. Regional Science and Urban Economics, 47, 35–44.

  20. Eichholtz, P., Kok, N., & Quigley, J. M. (2010). Doing well by doing good? Green office buildings. American Economic Review, 100(5), 2492–2509.

  21. Eichholtz, P., Kok, N., & Quigley, J. M. (2013). The economics of green building. Review of Economics and Statistics, 95(1), 50–63.

  22. EPA (US Environmental Protection Agency). 2014. Draft Inventory of US Greenhouse Gas Emissions and Sinks 1990–2013. http://www.epa.gov/climatechange/ghgemissions/usinventoryreport.html.

  23. Gabaix, X. (2014). A sparsity-based model of bounded rationality. Quarterly Journal of Economics, 129(4), 1661–1710.

  24. Gillingham, K., & Palmer, K. (2014). Bridging the energy efficiency gap: policy insights from economic theory and empirical evidence. Review of Environmental Economics and Policy, 8(1), 18–38.

  25. Gillingham, K., Newell, R. G., & Palmer, K. (2006). Energy efficiency policies: a retrospective examination. Annual Review of Environment and Resources, 31, 161–192.

  26. Gillingham, K., Newell, R. G., & Palmer, K. (2009). Energy efficiency economics and policy. Annual Review of Resource Economics, 1, 597–620.

  27. Gillingham, K., Harding, M., & Rapson, D. (2012). Split incentives and household energy consumption. Energy Journal, 33(2), 37–62.

  28. Hamilton, J. T. (1995). Pollution as news: media and stock market reactions to the toxic release inventory data. Journal of Environmental Economics and Management, 28(1), 98–113.

  29. Hausman, J. A. (1979). Individual discount rates and the purchase and utilization of energy-using durables. Bell Journal of Economics, 10, 33–54.

  30. Hooper, Barry. 2013. Energy Data and Efficiency of Existing Buildings. Presentation to Institute for Market Transformation webinar, The Energy Data Revolution, November 14.

  31. Houde, Sébastien. 2014. How Consumers Respond to Environmental Certification and the Value of Energy Information. Working Paper 20019. National Bureau of Economic Research (NBER).

  32. Hsu, David. 2012. City of New York LL84, Data Analysis & Quality Assessment. Report to New York City, March 14. http://www.nyc.gov/html/gbee/html/plan/ll84_scores.shtml.

  33. Hsu, D. (2014a). Improving energy benchmarking with self-reported data. Building Research and Information, 42(5), 641–656.

  34. Hsu, D. (2014b). How much information disclosure of building energy performance is necessary? Energy Policy, 64, 263–272.

  35. Jaffe, A. B., & Stavins, R. N. (1994). The energy paradox and the diffusion of conservation technology. Resource and Energy Economics, 16, 91–122.

  36. Jaffee, Dwight, Richard Stanton, and Nancy Wallace. 2011a. Energy Efficiency and Commercial-Mortgage Valuation. Working paper. UC-Berkeley Haas School of Business.

  37. Jaffee, Dwight, Richard Stanton, and Nancy Wallace. 2011b. Energy Factors, Leasing Structure and the Market Price of Office Buildings in the US. Working paper. UC-Berkeley Haas School of Business.

  38. Kahn, M. (2007). Do greens drive hummers? Environmental ideology as a determinant of consumer choice. Journal of Environmental Economics and Management, 54(2), 129–145.

  39. Khanna, M., Quimio, W. R. H., & Bojilova, D. (1998). Toxic release information: a policy tool for environmental protection. Journal of Environmental Economics and Management, 36(3), 243–266.

  40. Kok, N., & Jennen, M. (2012). The impact of energy labels and accessibility on office rents. Energy Policy, 46(1), 489–497.

  41. Kontokosta, Constantine. 2012. Local Law 84 Energy Benchmarking Data: Report to the New York City Mayor’s Office of Long-Term Planning and Sustainability, April. http://www.nyc.gov/html/gbee/html/plan/ll84_scores.shtml.

  42. Kontokosta, C. (2013). Energy disclosure, market behavior, and the building data ecosystem. Annals of the New York Academy of Sciences, 1295, 34–43.

  43. Kontokosta, Constantine. (2014). A Market-Specific Methodology for a Commercial Building Performance Index. Journal of Real Estate Finance and Economics (August): 1–29.

  44. Kotchen, M. (2006). Green markets and private provision of public goods. Journal of Political Economy, 114, 816–834.

  45. Kotchen, M., & Moore, M. R. (2008). Conservation: from voluntary restraint to a voluntary price premium. Environmental and Resource Economics, 20, 195–215.

  46. Leipziger, David. 2013. Comparing Building Energy Efficiency Measurement: A Framework for International Energy Efficiency Assessment Systems, Institute for Market Transformation. http://www.imt.org/uploads/resources/files/ComparingBuildingEnergyPerformanceMeasurementFINAL.pdf.

  47. Fannie Mae. 2014. Transforming Multifamily Housing: Fannie Mae’s Green Initiative and Energy Star for Multifamily. Washington, DC: Fannie Mae. https://www.fanniemae.com/content/fact_sheet/energy-star-for-multifamily.pdf.

  48. Mayor’s Office of Sustainability (2014). City of Philadelphia energy benchmarking report: 2014. Philadelphia: Mayor’s Office of Sustainability.

  49. McKinsey & Company (2009). Unlocking energy efficiency in the US economy. New York: McKinsey & Company.

  50. Meyer, B. (1995). Natural and quasi-experiments in economics. Journal of Business & Economic Statistics, 13(2), 151–161.

  51. Myers, Erica. 2014. Asymmetric Information in Residential Rental Markets: Implications for the Energy Efficiency Gap. Working paper. Berkeley: University of California.

  52. Palmer, Karen, Margaret Walls, and Todd Gerarden. 2012. Borrowing to Save Energy: An Assessment of Energy Efficiency Financing Programs. Report. Washington, DC: Resources for the Future.

  53. Palmer, Karen and Margaret Walls. 2015a. Does Information Provision Shrink the Energy Efficiency Gap? A Cross-City Comparison of Energy Benchmarking and Disclosure Laws. Discussion paper 15–12. Washington, DC: Resources for the Future.

  54. Palmer, Karen and Margaret Walls. 2015b. Limited Attention and the Residential Energy Efficiency Gap. American Economic Review 105(5): 192–195.

  55. Romani, L., Towers, M., Abate, D., & Dotz, R. (2009). The Whitestone facility operations cost reference: third (Annual ed.). Washington, DC: Whitestone Research Corporation.

  56. Sallee, J. (2014). Rational inattention and energy efficiency. Journal of Law and Economics, 57(3), 781–820.

  57. SEE Action (State and Local Energy Efficiency Action Network). (2012). Evaluation, Measurement, and Verification (EM&V) of Residential Behavior-Based Energy Efficiency Programs: Issues and Recommendations. Prepared by A. Todd, E. Stuart, S. Schiller, and C. Goldman, Lawrence Berkeley National Laboratory. http://behavioranalytics.lbl.gov.

  58. SEE Action (State and Local Energy Efficiency Action Network). 2013. A Utility Regulator’s Guide to Data Access for Commercial Building Energy Performance Benchmarking. Prepared by Andrew Schulte, ICF International.

  59. Stavins, Robert, Todd Schatzki, and Jonathan Borck. 2013. An Economic Perspective on Building Labeling Policies. Boston: Analysis Group. Report for the Building Owners and Managers Association (BOMA) International and the Greater Boston Real Estate Board (March 28).

  60. Stiglitz, J. E., & Weiss, A. (1981). Credit rationing in markets with imperfect information. American Economic Review, 71(3), 393–410.

  61. Thomas, Brad. 2012. Benchmarking Green: The First Investable US Green Property Indexes for REITs. Forbes, November 19.

Acknowledgments

The authors thank Lucy O’Keeffe and Shefali Khanna for research assistance and Paige Gance for editorial assistance. They also gratefully acknowledge funding support from the Alfred P. Sloan Foundation. An earlier version of this paper was presented at the MIT Energy Initiative Symposium, Large Opportunities, Complex Challenges: Seizing the Energy Efficiency Opportunity in the Commercial Buildings Sector, Cambridge, MA, May 12, 2014.

Author information

Corresponding author

Correspondence to Karen Palmer.

Appendix. Other local benchmarking and disclosure provisions

Table 3 below provides additional information on benchmarking and disclosure requirements in the 16 jurisdictions, beyond the basic information provided in Table 1. It also lists the benchmarking tool each city requires or allows and the precise information disclosed.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

Cite this article

Palmer, K., Walls, M. Using information to close the energy efficiency gap: a review of benchmarking and disclosure ordinances. Energy Efficiency 10, 673–691 (2017). https://doi.org/10.1007/s12053-016-9480-5

Keywords

  • Energy efficiency
  • Commercial buildings
  • Disclosure
  • Benchmarking
  • Energy Star
  • LEED