Hourly kilowatt data were analysed at the individual customer level using mixed-effects models, also referred to as hierarchical or multilevel models. These models provide a well-established statistical framework for problems involving nested data and are now routinely applied across a wide variety of disciplines, supported by an extensive methodological literature (see, for example, Raudenbush and Bryk 2002). For the current study, a sample of customers was obtained, and from those individuals, hourly data were collected. These data have a hierarchical structure: hourly kilowatt observations are nested within days, and days are nested within customers.
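As an illustrative sketch only (the paper's exact model specification is not reproduced here, and the symbols below are ours, not the authors'), a three-level mixed-effects model for such data might be written as:

```latex
% Illustrative three-level specification (notation assumed, not from the source):
% y_{hdc} = hourly kW in hour h of day d for customer c
y_{hdc} = \beta_0 + \beta_1 x_{hdc} + u_c + v_{dc} + \varepsilon_{hdc},
\qquad u_c \sim N(0,\sigma_u^2),\quad
v_{dc} \sim N(0,\sigma_v^2),\quad
\varepsilon_{hdc} \sim N(0,\sigma_\varepsilon^2)
```

where \(x_{hdc}\) stands for fixed-effect covariates (e.g. treatment and weather terms), \(u_c\) is a customer-level random effect, and \(v_{dc}\) is a day-within-customer random effect, matching the nesting of hours within days within customers described above.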
For each analysis described below, three types of impacts were calculated: Overall Energy impacts, Non-event Peak impacts, and Event Peak impacts. These three values represent different time periods, as follows.
Overall Energy impacts represent the average change in hourly energy use across all 24 h of the day and all 7 days of the week, including weekends and holidays;
Non-event Peak impacts represent the average hourly change across the three peak hours, from 4 to 7 pm on non-event weekdays (weekends and holidays are excluded);
Event Peak impacts represent the average hourly change across the three event hours, from 4 to 7 pm on event weekdays.
The analysis measures all impacts relative to baselines modeled using summer 2010 participant loads corrected to reflect 2011 temperatures. The same analysis applied to the wait list, used as a control group, showed no statistically significant difference between 2010 and 2011 energy use; thus, a correction for exogenous effects was not required. For consistency and ease of comparison, all impacts are presented in units of average kilowatt-hours per hour, abbreviated in most cases to kilowatts. Positive impact values indicate an increase in energy use relative to the baseline, whereas negative values indicate energy savings.
Load impacts: effect of real-time information
Although this study touches on many aspects of new residential programs and technologies, its main objective was to investigate the usefulness of real-time energy data at the home and appliance levels in reducing energy use and peak loads. This objective is met through a three-way comparison between the Baseline information group, the Home information group, and the Appliance information group.
Table 3 provides the load impacts for each of the three information groups. In each case, the negative kilowatt values indicate the average hourly savings for the summer of 2011, where each group is compared to its own 2010 baseline corrected for weather effects.
Keep in mind that the Baseline group received the same non-treatment interventions as the Home and Appliance groups: invitation to participate, website, customer support, home energy assessments, and the like. A direct comparison of the information treatment groups to the Baseline group therefore nets out the effects of these shared interventions, isolating the effect of the information treatment alone.
Compared to the Baseline group, real-time Home information lowered overall energy use by about 4 %. This translates to average savings of about 50 W in every hour of the summer, or roughly 150 kWh over the course of a 4-month summer. No comparable energy savings effect was observed for the Appliance information group.
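The quoted figures are internally consistent, as a back-of-the-envelope check shows (the 122-day summer length is our assumption for a 4-month summer; the 50 W figure is from the text):

```python
# Back-of-the-envelope check of the Home information savings figures.
avg_savings_kw = 0.050          # 50 W average hourly savings (from the text)
hours_per_day = 24
summer_days = 122               # assumed length of a 4-month summer

summer_kwh = avg_savings_kw * hours_per_day * summer_days
print(f"{summer_kwh:.0f} kWh")  # ~146 kWh, i.e. "roughly 150 kWh"
```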
Real-time information at both the Home and Appliance levels had a significant effect on non-event peak loads relative to the Baseline group, with real-time Home information increasing savings by 4 %, and real-time Appliance information increasing savings by 7 %.
In contrast, real-time information had no significant effect on event peak impacts: all three information groups shed roughly 1 kW during events. Figure 3 illustrates the variation in impacts by treatment group.
Load impacts: effect of dynamic rate and ATC
Unlike the information treatments, which were randomly assigned to customers prior to recruitment, the Summer Solutions rate and ATC option were offered in the Participation Agreement: participants could freely choose to sign up for the Summer Solutions rate only (SS Rate), the ATC option only, both, or neither. As a result, results for any individual program option should not be extrapolated to the general population; they are, however, representative of a voluntary program in which the same options are offered. Load impacts by program option are shown in Table 4.
In every case, the “Neither option” group—participants who chose to have the equipment installed, but chose to not sign up for either the Summer Solutions Rate or the ATC option—was the least responsive of the four groups. Overall energy savings were similar for the three groups that chose at least one program option; however, peak impacts varied significantly. On both event and non-event weekdays, those on the Summer Solutions rate saved significantly more than did those on the ATC program only. Figure 4 illustrates the variation in impacts by treatment group.
On average, Summer Solutions participants saved about $10 (7.6 %) per month on their electricity bills due to energy savings alone. For those on the standard rate, bill savings ended there. Those on the Summer Solutions rate saved an additional $9.45 per month on average, for a total savings of roughly 15 % compared to what they would have paid had they not signed up for the study.
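The reported percentages can be cross-checked against each other; the implied average monthly bill below is our inference from the $10 = 7.6 % figure, not a number stated in the text:

```python
# Consistency check of the reported bill savings percentages.
energy_savings = 10.00            # $/month saved from energy savings alone
energy_savings_pct = 0.076        # reported as 7.6 % of the bill
rate_savings = 9.45               # additional $/month from the SS rate

implied_bill = energy_savings / energy_savings_pct   # inferred baseline bill
total_pct = (energy_savings + rate_savings) / implied_bill
print(f"{total_pct:.1%}")         # ~14.8 %, i.e. "roughly 15 %"
```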
Figure 5 shows a scatterplot of rate-dependent bill impacts for those on the Summer Solutions rate. Nearly three quarters (72 %) of the participants who signed up for the Summer Solutions rate saved money on their 2011 summer bills relative to the standard rate. At the extremes, about 4 % of participants saw their summer bills increase by more than 10 % as a result of the rate, while one third (33 %) saw their summer bills decrease by more than 10 % as a result of the rate.
One of the secondary objectives of this study was to estimate participation rates for the Summer Solutions rate and ATC programs. Of the 237 customers who responded to the first 4,000 invitations, 117 (49 %) chose both the Summer Solutions rate and the ATC option, 60 (25 %) chose the Summer Solutions rate only, 31 (13 %) wanted only the ATC option, and 30 (13 %) wanted neither. The overall 75 % signup rate for the residential Summer Solutions rate is comparable to the 65 % signup rate found for the TOU-CPP rate offered in the Small Business Summer Solutions Study (Herter et al. 2009).
Correlating behaviour with load impacts
This section reports the Pearson product–moment correlation coefficients (r) for correlations between customer-specific energy use, impact values, and survey answers collected after the summer 2011 test period. Customer-specific impact values for these correlations were calculated using a difference-in-differences between treatment and control group 2010 and 2011 values.
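In symbols, a customer-specific difference-in-differences impact of this kind (notation ours, not the authors') takes the form:

```latex
% Customer-specific difference-in-differences impact (notation assumed):
\Delta_i \;=\; \bigl(\bar{y}_{i,\,2011} - \bar{y}_{i,\,2010}\bigr)
\;-\; \bigl(\bar{y}_{\mathrm{ctrl},\,2011} - \bar{y}_{\mathrm{ctrl},\,2010}\bigr)
```

where \(\bar{y}\) denotes mean hourly kW over the relevant period for customer \(i\) or for the control group, so the control group's year-over-year change is subtracted from each customer's own change.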
Table 5 provides correlation coefficients for customer-specific impact values and summer 2011 behaviours collected via the post-summer survey. Values that are statistically significant at the α = 0.05 level (|r| ≥ 0.14) are presented in bold for ease of review. Notable results include significant savings across the board for participants who set their thermostats at 78 °F (25.6 °C) or higher in summer, or increased their thermostat setpoint during the peak period. Participants who precooled before the peak period saved on both non-event and event peak periods. Overall energy savings were improved for participants who replaced older AC units. Non-event peak savings were significant for those who removed a refrigerator from the garage or had a Home Energy Assessment completed. Load sheds during events were also improved for those who spent event periods outside the home and for those who avoided showers because they had electric water heaters.
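The |r| ≥ 0.14 threshold is consistent with a two-tailed test on a sample of roughly 200 respondents, as a large-sample sketch shows (the sample size of 200 and the use of the normal critical value z ≈ 1.96 in place of the t-distribution are our assumptions):

```python
import math

def critical_r(n, z=1.96):
    """Approximate two-tailed critical value of Pearson's r at alpha = 0.05,
    using the large-sample normal approximation (t ~ z)."""
    # From t = r * sqrt(n - 2) / sqrt(1 - r^2), solved for r at t = z.
    return z / math.sqrt(n - 2 + z**2)

# With ~200 respondents (assumed), the threshold is about 0.14:
print(round(critical_r(200), 2))
```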
Thermostat offsets and overrides during events
At the end of the summer, the Summer Solutions thermostats were polled for user-programmed offsets and event overrides. Recall that customers in the ATC group had an obligatory 4 °F (2.2 °C) event offset, while all non-ATC participants had the option to change this offset to any value from 0 to 10 °F (0–5.6 °C). Among those on the Summer Solutions Rate, very few changed the default setting, resulting in an average programmed offset of 3.9 °F (2.2 °C). The non-ATC participants on the standard rate were most likely to reduce their event offset, but even this group maintained an average event offset of 3.3 °F (1.8 °C; Fig. 6).
An event override is an intervention by an occupant to change the preset automated response during an event period. Participants on the ATC program were allowed to override 8.3 % of summer events (1 out of 12), while non-ATC customers on the Summer Solutions or Standard rates were allowed to override 100 % of events. Figure 7 shows that overrides were most prevalent for those who had no monetary or obligatory incentive to respond to events, i.e. the Non-ATC participants on the Standard rate. On average, those on the Summer Solutions rate overrode less than one of the 12 events, while those on the ATC program were least likely to override.
Participant satisfaction and comments
Of the 236 participants who rated the Summer Solutions Study in the End-of-Summer Survey, 83 % said their satisfaction with the program was Excellent or Good, and more than 90 % of participants chose to remain in the Summer Solutions Study for 2012. The remainder either explicitly chose not to participate beyond 2011 (5 %) or were unreachable (5 %).