According to the US Energy Information Administration, the average annual electricity consumption for a U.S. residential utility customer was 10,766 kilowatt-hours (kWh) in 2016. That works out to an average of about 897 kWh per month, roughly 30 kWh per day, or a continuous draw of around 1,250 watts.
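As a quick sanity check, the conversions above follow directly from the annual figure (small differences versus the rounded numbers in the text are just rounding):

```python
# Sanity check of the unit conversions, starting from the EIA annual figure.
annual_kwh = 10_766  # average annual consumption per residential customer, 2016

monthly_kwh = annual_kwh / 12            # kWh per month
daily_kwh = annual_kwh / 365             # kWh per day
avg_watts = annual_kwh / 8_760 * 1_000   # continuous average draw in watts
                                         # (8,760 hours in a year)

print(f"{monthly_kwh:.0f} kWh/month, {daily_kwh:.1f} kWh/day, {avg_watts:.0f} W")
```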
There is quite a wide variation in consumption by state, with Louisiana homes using over 40 kWh per day and homes in Hawaii using just over 16 kWh. Additionally, homes show pronounced diurnal, weekly and seasonal variation, meaning electricity consumption is not spread evenly over hours of the day, days of the week or months of the year.
The majority of power is used during the daytime (referred to as “on-peak”, usually 7am to 10pm on weekdays). Demand tends to be lowest between 10pm and 7am and on weekends (referred to as “off-peak”). Demand also tends to be highest in winter and summer, when the need for space conditioning (heating or cooling) is high.
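The on-peak/off-peak rule described above can be sketched as a simple classifier. This is an illustration of the definition in the text, not a utility's actual tariff logic:

```python
from datetime import datetime

def is_on_peak(ts: datetime) -> bool:
    """On-peak per the rule above: weekdays, 7am up to (but not including) 10pm.

    Everything else — weeknights and all of the weekend — is off-peak.
    """
    return ts.weekday() < 5 and 7 <= ts.hour < 22

print(is_on_peak(datetime(2023, 6, 14, 9)))   # Wednesday 9am  -> True
print(is_on_peak(datetime(2023, 6, 17, 9)))   # Saturday 9am   -> False
print(is_on_peak(datetime(2023, 6, 14, 23)))  # Wednesday 11pm -> False
```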
All of this variation means that average figures alone aren't sufficient when modelling different types of electricity supply. In order to prevent power shortages, it is necessary to consider peak demand, i.e. the maximum electricity demanded by homes. In an ideal world this would involve a detailed analysis of a specific home in a specific location, but for large-scale modelling purposes this is clearly impractical. At Upstart Power, we use the peak-to-average demand ratio in our modelling. We track the ratio by region and by year, but as a rule of thumb we use New England as our reference point. The peak-to-average demand ratio for New England has increased steadily over the last 20 years, from around 1.5 to 1.9. This means that the highest peak-hour demand for electricity is nearly twice the average hourly level.
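The peak-to-average ratio itself is a simple calculation over a series of hourly demand readings. Here is a minimal sketch; the hourly figures below are invented for illustration, not real New England data:

```python
# Peak-to-average demand ratio over a sample of hourly demand readings (MW).
# These values are hypothetical, chosen only to illustrate the calculation.
hourly_demand = [10_000, 9_500, 9_200, 11_800, 13_000, 24_500, 12_300, 10_600]

peak = max(hourly_demand)                          # highest single hour
average = sum(hourly_demand) / len(hourly_demand)  # mean hourly demand
ratio = peak / average

print(f"peak-to-average ratio: {ratio:.2f}")
```

A ratio near 1.9, as in New England today, tells a system designer that capacity must be sized for nearly double the average hourly load.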
We will return to these figures in a future blog about how we designed our power systems.