Summary: In a randomized controlled trial of over 30,000 households in Michigan, in which about 6% received energy efficiency improvements, the cost of the upgrades was about double the energy savings, for an annual rate of return of -9.5%.
[But it’s still a good idea to insulate your home! What price can you put on not suffering or dying from excess heat or cold?
I have questions about the study. Energy prices are temporarily cheap because of fracked gas and oil, which are likely to peak in 2015-2019, and because high unemployment is lowering demand. If prices double or triple, then weatherization will pay for itself.
I didn’t see a breakdown of labor and materials. If the purpose of this program is also to create jobs, what percent went to labor rather than insulation and other materials? And what was done exactly: insulation, caulking, double-pane windows?
Monetary studies are always troubling when it comes to energy. What’s the EROI of the energy saved versus the energy embodied in the materials and labor?
Alice Friedemann www.energyskeptic.com]
New evidence supports the need for additional policy solutions to confront climate change while more field evidence is gathered to identify the most beneficial energy efficiency investments.
Berkeley, CA, June 23, 2015 – Energy efficiency investments are widely popular because they are believed to deliver a double win: saving consumers money by reducing the amount of energy they use, while cutting climate-forcing greenhouse gas emissions and other pollutants harmful to human health. But a new study by a team of economists finds residential energy efficiency investments may not deliver on all that they promise.
Through a randomized controlled trial of more than 30,000 households in Michigan – where one-quarter of the households were encouraged to make residential energy efficiency investments and received assistance – the economists find that the costs to deploy the efficiency upgrades were about double the energy savings.
“In the case of residential energy efficiency investments, the projected savings overestimate the reality on the ground,” says Michael Greenstone, the Milton Friedman professor of economics and director of the Energy Policy Institute at the University of Chicago (EPIC). “A problem as urgent as climate change must be addressed using policies that deliver the greatest bang for their buck….In the meantime, it is critical that we field test energy efficiency programs to determine which investments offer the greatest potential.”
The study – a part of The E2e Project and led by Greenstone, as well as Meredith Fowlie and Catherine Wolfram of UC Berkeley – assessed the nation’s largest residential energy efficiency program, the Federal Weatherization Assistance Program (WAP). Participating low-income households were provided with about $5,000 worth of weatherization upgrades (e.g., furnace replacement, attic and wall insulation, and weather stripping) per home at zero out-of-pocket cost. The upgrades did reduce the households’ energy consumption by about 10 to 20% each month. But that translated into only $2,400 in savings over the lifetime of the upgrades – half of what was originally spent to make the upgrades, and less than half of the projected energy savings.
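As a rough check on these magnitudes, the implied return can be sketched with a simple net-present-value calculation. The $5,000 upfront cost and $2,400 lifetime savings come from the text; the 15-year measure lifetime and flat annual savings stream are illustrative assumptions, so the result only approximates the -9.5% figure reported for the study:

```python
# Back-of-envelope sketch of the implied rate of return.
# Figures from the text: ~$5,000 cost, ~$2,400 lifetime savings.
# Assumptions (NOT from the paper): 15-year lifetime, flat yearly savings.
def npv(rate, upfront_cost, annual_saving, years):
    """Net present value: pay upfront, then save a flat amount each year."""
    return -upfront_cost + sum(annual_saving / (1 + rate) ** t
                               for t in range(1, years + 1))

def irr(upfront_cost, annual_saving, years, lo=-0.5, hi=0.5, tol=1e-6):
    """Internal rate of return by bisection (NPV is decreasing in the rate)."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, upfront_cost, annual_saving, years) > 0:
            lo = mid   # NPV still positive: the breakeven rate is higher
        else:
            hi = mid
    return (lo + hi) / 2

rate = irr(upfront_cost=5000, annual_saving=2400 / 15, years=15)
print(f"implied annual rate of return: {rate:.1%}")
```

Under these assumed inputs the return comes out around -8% per year; the paper’s exact -9.5% depends on its measure lifetimes and discounting, which this excerpt does not specify.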
“Energy efficiency programs are generally viewed as cost effective. This view is often based on engineering calculations and associated savings projections,” says Fowlie, an associate professor of agriculture and resource economics and Class of 1935 Endowed Chair in Energy at UC Berkeley. “Our data-driven analysis that measures the actual returns on energy efficiency investments shows how these projections can be quite flawed. In actuality, the energy efficiency investments we evaluated delivered significantly lower savings than the models predict.”
Past studies have claimed that energy efficiency investments don’t deliver the expected energy savings because of a ‘rebound effect’: households adjust their behaviors and consume more energy services than they had before the investments were made. However, the economists could find no evidence of this ‘rebound effect’ in the households they studied.
Further, some say that the broader societal benefits – savings as a result of reductions in pollution from energy production – justify the investments. Again, the findings did not support this.
Another claim is that energy efficiency programs have a low take-up rate because consumers don’t know about the programs or how to participate, driving down the expected benefits. To investigate this, the authors studied whether extensive outreach and assistance would boost the take-up rate of the program. Using a firm with extensive experience in managing outreach campaigns, the research team made almost 7,000 home visits, more than 32,000 phone calls, and 2,700 follow-up appointments. Yet, despite this aggressive outreach and personal assistance, only 6% of households in the treatment group participated in the program, compared to 1% in the control group. In the end, it cost more than $1,000 for each additional household encouraged to undertake these free energy efficiency investments.
“At the end of the day, the models don’t capture some of the hard-to-quantify costs involved in making energy efficient choices, which could help explain why people aren’t taking advantage of the opportunities as much as the models predict,” says Wolfram, the Cora Jane Flood professor of business administration at UC Berkeley’s Haas School of Business and faculty director at the Energy Institute at Haas. “This is another reason why potential energy efficiency investments need to be rigorously tested in real-world conditions before relying too heavily on them to solve climate change.”
Read the full working paper
This research was made possible thanks to generous support from the Alfred P. Sloan Foundation, the MacArthur Foundation, the Rockefeller Foundation, and the UC Berkeley Energy and Climate Institute.
[Some excerpts from the 28-page paper]:
The Weatherization Assistance Program (WAP) is the nation’s largest residential energy-efficiency program. WAP supports improvements in the energy efficiency of dwellings occupied by low-income families. Since its inception in 1976, over 7 million low-income households have received weatherization assistance through the program. Proponents credit the program with saving energy, creating jobs, reducing emissions, and assisting low-income households. The American Recovery and Reinvestment Act PL111-5 (ARRA) dramatically increased the scale and scope of WAP. Our analysis seeks to estimate the impacts of weatherization assistance over the ARRA-funded time period. WAP funds are distributed to states based on a formula tied to a state’s climate, the number of low-income residents, and their typical energy bills. The states distribute WAP money to over 1,000 local sub-grantees, which are typically community action agencies (CAAs) or similar nonprofit groups. These sub-grantees are then tasked with identifying and serving eligible households. Participating WAP households receive free energy audits and a home retrofit.
The average participating household in our data received $4,143 of energy efficiency investments and over $1,000 worth of additional house improvements at zero out-of-pocket cost. Before implementing a weatherization retrofit, CAA program staff conduct an energy audit of the home. The purpose of the audit is to make recommendations regarding which efficiency improvements should be implemented at the home. During the visit, program auditors collect detailed information about the building structure and other construction details, heating and cooling systems, appliances, ventilation, etc. This information is combined with local climate conditions and retrofit measure costs, then fed into a computer-based audit tool: the National Energy Audit Tool (NEAT). This tool uses engineering algorithms to model the energy use of single-family and small multi-family residential units. NEAT is the most widely used tool for weatherization audits; it is used by state and local WAP sub-grantees, utility companies, and home energy auditors (EERE, 2010).
Michigan is one of the largest recipients of WAP program funding on account of its cold winters and large low-income population, and received over $200 million in ARRA funding for weatherization assistance. All stimulus funds had to be spent by March 2012. After that point, the pace of weatherization activity dropped precipitously.
A critical issue for the validity of the estimates from this design is how households in this sample were chosen for weatherization. The road from application to energy efficiency investments is long and there are many potential off-ramps. Applicant households may fail to complete the necessary – and involved – paperwork or may be deemed ineligible based on the information they provide. Once paperwork is completed successfully, households are put on a list where the waiting times can exceed one year. After rising to the top of the list, homeowners must accommodate scheduling of energy audits. Households may fail to receive weatherization if they miss an audit appointment, or if the auditors discover risks to WAP contractors (e.g., asbestos in the home). Because of significant delays in ramping up weatherization activities under ARRA, the agencies were unable to complete the weatherizations they anticipated prior to the March 2012 ARRA deadline, which helps to explain why fewer than half of the applicants in our sample were weatherized by mid-2014.
We spent around $475,000 on the encouragement, or a little more than $55 per household in the treatment group. The low take-up rates in the encouraged group are quite striking. Program participants receive substantial home improvements, yet incur no out-of-pocket expenses. All households in the encouraged group received some information about the program via a phone call or door hanger. It may seem straightforward to encourage households to participate in a program that provides free efficiency retrofits worth an average of approximately $5,000 that are designed to significantly reduce energy expenditures. In our experience, that was hardly the case. The impact of reducing barriers to participation (e.g., information and process costs) on program uptake is of independent interest to both policymakers and researchers.
In the end, the average cost of encouragement per completed weatherization was about $1,050, which is more than 20% of the average costs of weatherization improvements.
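The roughly $1,050 figure can be reconstructed from the numbers reported above. The treatment-group size is inferred here from the total spend and per-household cost, so this is a consistency check rather than the paper’s own calculation:

```python
# Back-of-envelope check of the encouragement-cost arithmetic.
# Inputs are rounded figures from the text; the treatment-group size
# is inferred, not stated exactly in this excerpt.
total_encouragement_cost = 475_000   # dollars spent on outreach
cost_per_household = 55              # dollars per treatment-group household
treatment_households = total_encouragement_cost / cost_per_household  # ~8,600

takeup_treatment, takeup_control = 0.06, 0.01
extra_weatherizations = treatment_households * (takeup_treatment - takeup_control)

cost_per_extra = total_encouragement_cost / extra_weatherizations
print(f"{extra_weatherizations:.0f} extra weatherizations, "
      f"${cost_per_extra:,.0f} per completed weatherization")
```

The rounded inputs imply roughly $1,100 per additional completed weatherization, in line with the ~$1,050 the authors report.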
We obtained monthly natural gas and electricity consumption data over the period June 2008 to May 2014. This period includes at least two years of pre-retrofit data for all weatherized households in our sample.
What Explains the Low Rate of Return on These Efficiency Investments?
It is natural to ask why the returns to residential energy efficiency investments are so low. After all, WAP is designed so that the only measures implemented are ones with projected savings-to-cost ratios greater than one. An important factor leading to negative returns on investment is the incomplete realization of projected energy savings. The projected savings are about 2.5 times the preferred experimental savings estimate. Further, the projected savings are roughly 4 times the quasi-experimental estimates of energy savings. There are relatively few ex-post estimates in the academic peer-reviewed literature, with Davis et al. (2014) and Dubin et al. (1986) serving as notable exceptions. Both of those papers similarly find low realization rates, although they largely attribute the gap to behavioral responses (i.e., the rebound effect), which we have shown plays at most a minor role in this paper’s setting.
Because energy efficiency programs are implemented by regulated utilities, there are a number of regulatory filings that estimate ex post program savings, and it is not unusual for them to find that the programs deliver estimated savings considerably lower than projected.
Having ruled out the rebound effect as the primary explanation for the gap between projected and realized energy savings, we conclude that the efficiency audit tool must systematically overstate the real returns to these investments.
Along these lines, we explore some alternative sources of this bias. First, we compare the distribution of temperatures observed during our study period against the typical weather patterns on which engineering calculations are based. Although we do observe some moderate spells in our time-frame, on average we observe colder than average temperatures and higher than average degree day measures in our sample; these colder temperatures should lead to greater than average savings. A second potential source of bias concerns the over-statement of baseline energy use. Several studies and utility reports have documented how software-based energy analysis of existing homes tends to over-predict pre-retrofit energy use and retrofit energy savings. Indeed, we found in our data that the NEAT program predicts baseline natural gas consumption that exceeds actual consumption by more than 25% prior to weatherization. This suggests that the auditing tool could be under-estimating the efficiency properties of the average home prior to weatherization, which may partly explain the over-statement of the benefits of upgrading to a given efficiency standard. Overall, our findings suggest that the NEAT audit tool over-estimates returns by a significant margin. Further, this overestimation of savings does not appear to be due to behavioral responses. This is an important finding in its own right; NEAT is widely used by state and local WAP sub-grantees, utility companies, and home energy audit firms.
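One way to see how far baseline bias goes toward explaining the shortfall: if projected savings scaled one-for-one with the modeled baseline (a simplifying assumption, not something the paper states), correcting the 25% baseline overstatement would shrink the 2.5x projection gap only partway:

```python
# Illustrative arithmetic, assuming projected savings scale
# proportionally with the modeled baseline (an assumption).
baseline_bias = 1.25   # NEAT baseline ~25% above actual use (from text)
projection_gap = 2.5   # projected savings ~2.5x experimental estimate (from text)

residual_gap = projection_gap / baseline_bias
print(f"gap remaining after baseline correction: {residual_gap:.1f}x")  # 2.0x
```

Under this proportionality assumption, baseline over-prediction accounts for only part of the projection gap; the rest must come from other modeling errors.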
The results are striking because Michigan’s cold winters and the likelihood that the weatherized homes were not in perfect condition suggest that it may have been reasonable to expect high returns in this setting. Regardless of one’s priors, this paper underscores that it is critical to develop a body of credible evidence on the true, rather than projected, returns to energy efficiency investments in the residential and other sectors. The findings also suggest that the last several decades may have seen too much investigation into the why of the energy efficiency gap and not enough into whether there really was one.
Future research should examine whether the real world returns to energy efficiency investments differ so starkly from engineering projections in other settings.
Allcott, H. and Greenstone, M. (2012). Is there an energy efficiency gap? The Journal of Economic Perspectives, 26(1):3–28.
Allcott, H. and Greenstone, M. (2015). Maximizing money vs. utility: Measuring the welfare effects of energy efficiency programs. Mimeograph.
Barbose, G. L., Goldman, C. A., Hoffman, I. M., and Billingsley, M. A. (2013). The future of utility customer-funded energy efficiency programs in the United States: projected spending and savings to 2025. Energy Efficiency, 6(3):475–493.
Callaway, D., Fowlie, M., and McCormick, G. (2015). Location, location, location: the variable value of renewable energy and demand-side efficiency resources. Working paper.
CPUC (2015). 2010 to 2012 energy efficiency annual progress evaluation report. Technical report, California Public Utilities Commission.
Davis, L. (2008). Durable goods and residential demand for energy and water: evidence from a field trial. RAND Journal of Economics, 39(2):530–546.
Davis, L., Fuchs, A., and Gertler, P. (2014). Cash for coolers: evaluating a large-scale appliance replacement program in Mexico. American Economic Journal: Economic Policy, 6(4):207–238.
Davis, L. and Muehlegger, E. (2010). Do Americans consume too little natural gas? An empirical test of marginal cost pricing. The RAND Journal of Economics, 41(4):791–810.
Dubin, J. A., Miedema, A. K., and Chandran, R. V. (1986). Price effects of energy-efficient technologies: a study of residential demand for heating and cooling. The RAND Journal of Economics, 17(3):310–325.
Dyson, M., Borgeson, S., Michaelangelo, T., and Callaway, D. (2014). Using smart meter data to estimate demand response potential, with application to solar energy integration. Energy Policy, 73:607–619.
EERE (2010). Review of selected home energy auditing tools: in support of the development of a national building performance assessment and rating program. Technical report, U.S. Department of Energy, Energy Efficiency & Renewable Energy.
EIA (2015). Natural gas monthly. Technical report, Energy Information Administration.
Fowlie, M., Greenstone, M., and Wolfram, C. (2015). Are the non-monetary costs of energy efficiency investments large? Understanding low take-up of a free energy efficiency program. American Economic Review Papers and Proceedings.
Friedman, D. (1987). Cold houses in warm climates and vice versa: A paradox of rational heating. Journal of Political Economy, 95(5):1089–97.
Gerarden, T. D., Newell, R. G., and Stavins, R. N. (2015). Assessing the energy-efficiency gap. Technical report, Harvard Environmental Economics Program.
Gillingham, K., Kotchen, M. J., Rapson, D. S., and Wagner, G. (2013). Energy policy: The rebound effect is overplayed. Nature, 493(7433):475–476.
Gillingham, K. and Palmer, K. (2014). Bridging the energy efficiency gap: policy insights from economic theory and empirical evidence. Review of Environmental Economics and Policy, 8(1):18–38.
Graff Zivin, J. and Novan, K. (2015). Upgrading efficiency and behavior: electricity savings from residential weatherization programs. arefiles.ucdavis.edu.
Greenstone, M., Kopits, E., and Wolverton, A. (2013). Developing a social cost of carbon for US regulatory analysis: a methodology and interpretation. Review of Environmental Economics and Policy, 7(1):23–46.
Hirst, E. (1987). Cooperation and community conservation comprehensive report. Technical report, Oak Ridge National Laboratory.
ICF (2014). EPA’s 111(d) clean power plan could increase energy efficiency impacts, net benefits, and total value. White paper and webinar. Technical report, ICF International.
IEA (2013). Redrawing the energy-climate map. Technical report, International Energy Agency.
Jacobsen, G. D. and Kotchen, M. J. (2013). Are building codes effective at saving energy? Evidence from residential billing data in Florida. Review of Economics and Statistics, 95(1):34–49.
Joskow, P. L. and Marron, D. B. (1992). What does a negawatt really cost? Evidence from utility conservation programs. The Energy Journal, 13(4):41–74.
Loftus, P. J., Cohen, A. M., Long, J. C. S., and Jenkins, J. D. (2015). A critical review of global decarbonization scenarios: what do they tell us about feasibility? Wiley Interdisciplinary Reviews: Climate Change, 6(1):93–112.
McKinsey & Company (2009). Unlocking energy efficiency in the U.S. economy. Technical report, McKinsey Global Energy and Materials.
Metcalf, G. E. and Hassett, K. A. (1999). Measuring the energy savings from home improvement investments: evidence from monthly billing data. The Review of Economics and Statistics, 81(3):516–528.
Muller, N. and Mendelsohn, R. (2009). Efficient pollution regulation: getting the prices right. The American Economic Review, 99(5):1714–1739.
Radnofsky, L. (2010). A stimulus project gets all caulked up. Wall Street Journal.
SBW (2012). 2010–2012 PG&E and SCE whole house retrofit program process evaluation study. Technical report.
Schwarz, P. M. and Taylor, T. N. (1995). Cold hands, warm hearth? Climate, net takeback, and household comfort. The Energy Journal, 16(1):41–54.
Thorpe, D. (2013). Energy Management in Buildings: The Earthscan Expert Guide. Routledge.