But this simple concept conceals a great deal of complexity. And given the stakes of solar grid parity, it’s worth exploring the details.
The Cost of Solar Energy
For starters, what’s the right metric for the cost of solar? The installed cost for residential solar ($6.40 per Watt in 2011), for commercial solar ($5.20 per Watt), or for utility-scale solar ($3.75 per Watt)? Even if we pick one of these, it’s difficult to compare apples to apples, because grid electricity is priced in dollars per kilowatt-hour of electricity, not dollars per Watt.
Enter “levelized cost,” or the cost of a solar PV array averaged over a number of years of production. For example, a 1 kilowatt (kW) solar array installed in Minneapolis for $6.40 per Watt costs $6,400. Over 25 years, we can expect that system to produce about 30,000 kilowatt-hours (kWh), so the “simple levelized cost” is $6,400 divided by 30,000, or about $0.21 per kWh.
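The arithmetic is simple enough to spell out in a few lines of Python (my own illustrative sketch, using the figures above):

```python
# Simple levelized cost: total installed cost divided by lifetime output.
installed_cost_per_watt = 6.40   # dollars per Watt (2011 residential average)
system_size_watts = 1_000        # a 1 kW array
lifetime_kwh = 30_000            # expected 25-year output in Minneapolis

installed_cost = installed_cost_per_watt * system_size_watts  # $6,400
simple_levelized_cost = installed_cost / lifetime_kwh

print(f"${simple_levelized_cost:.2f} per kWh")  # ~$0.21 per kWh
```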
But people usually borrow money, and pay interest, to install solar power. There are also maintenance costs over those 25 years. And we use a “discount rate” that puts heavier weight on dollars spent or earned today than on those earned 20 years from now. A 1 kW solar array that is 80% financed by borrowing at 5% interest, with maintenance costs of about $65 per year, and discounted at 5% per year will have a levelized cost of around $0.37 per kWh.
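The article doesn’t spell out its financing details, so the sketch below assumes a 20-year loan term and uses one standard discounted levelized-cost formulation (present value of all costs divided by present value of all output). With these guesses it lands somewhat above the article’s $0.37, but the mechanics are the same:

```python
def levelized_cost(installed_cost, annual_kwh, years=25,
                   loan_fraction=0.80, interest=0.05,
                   loan_years=20, maintenance=65.0, discount=0.05):
    """Discounted levelized cost of energy: present value of costs
    divided by present value of output. One common formulation; the
    loan term and exact method here are assumptions, not the article's."""
    principal = installed_cost * loan_fraction
    down_payment = installed_cost - principal
    # Fixed annual payment on an amortizing loan.
    payment = principal * interest / (1 - (1 + interest) ** -loan_years)

    pv_costs = down_payment
    pv_energy = 0.0
    for t in range(1, years + 1):
        df = (1 + discount) ** -t
        cost_t = maintenance + (payment if t <= loan_years else 0.0)
        pv_costs += cost_t * df
        pv_energy += annual_kwh * df
    return pv_costs / pv_energy

# ~$0.43/kWh with these guesses; the article's $0.37 implies
# somewhat different (unstated) financing assumptions.
print(levelized_cost(6_400, 30_000 / 25))
```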
That means that “solar grid parity” for this 1 kW solar array happens if the grid electricity price is $0.37 per kWh. But this calculation is location specific.
In Los Angeles, that same 1 kW system produces 35,000 kWh over 25 years, lowering the levelized cost to $0.31. The timeframe also matters.
Take the Minneapolis project with its levelized cost of $0.37: counting output over 20 years instead of 25 increases the levelized cost to $0.43, because we have fewer kWh of electricity over which to divide our costs.
We choose 25 years because solar PV panels have a good chance of producing for that long.
We also use a lower installed cost than the U.S. average. Residential solar projects may average $6.40 per Watt, but there are good examples of aggregate-purchase residential solar projects costing $4.40 per Watt. The levelized cost of solar at $4.40 per Watt in Minneapolis is $0.25; in Los Angeles it is $0.21.
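Reusing the hypothetical levelized_cost() sketch above, each of these sensitivities is a one-argument change. The absolute values run above the article’s figures because of the financing guesses, but the direction and rough size of each effect track the article:

```python
# Sensitivity checks with the levelized_cost() sketch above.
minneapolis = levelized_cost(6_400, 30_000 / 25)            # ~$0.43 baseline
los_angeles = levelized_cost(6_400, 35_000 / 25)            # ~$0.37, sunnier site
short_life  = levelized_cost(6_400, 30_000 / 25, years=20)  # ~$0.48, 20-year horizon
bulk_buy    = levelized_cost(4_400, 30_000 / 25)            # ~$0.31, $4.40/Watt install
```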
Utilities like to compare new electricity production to their existing fleet, which means comparing new solar power projects to long-ago-paid-off (amortized) coal and nuclear power plants that can produce electricity for 3-4 cents per kWh. But this is apples to oranges, because utilities can’t get any new electricity for that price, from any source.
A more appropriate measure of the grid price is the marginal cost for a utility of getting wholesale power from a new power plant. In California, this is called the “market price referent” and it’s around 12 cents per kWh. The figure varies from state to state.
But while the market price referent provides a reasonable comparison for the cost of utility-scale solar, it’s not the number that matters for solar installed on rooftops or near buildings. In those cases, the power is used “behind the meter,” and depending on the state’s net metering policy, the customer can essentially spin their electric meter backward when their solar panels produce electricity. That means solar power is really competing against the energy cost on a utility bill, known as the “retail price.”
The residential retail electricity price is the generally accepted grid parity price. With this price and our previous map of the levelized cost of solar, we can assess the state of solar grid parity. The following map shows the ratio of the levelized cost of solar to the grid parity price in each state. Only Hawaii has reached solar grid parity without incentives.
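The arithmetic behind the map is just a ratio. In the sketch below, the retail prices are made-up placeholders for illustration, not the article’s data:

```python
def parity_ratio(levelized_cost_kwh, retail_price_kwh):
    """Ratio of solar's levelized cost to the local retail price.
    At or below 1.0 means solar has reached grid parity."""
    return levelized_cost_kwh / retail_price_kwh

# Hypothetical retail prices, for illustration only:
print(parity_ratio(0.25, 0.30))  # ~0.83 -> below 1.0, at parity
print(parity_ratio(0.25, 0.11))  # ~2.3  -> well above parity
```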
As time rolls ahead, and grid prices rise while solar costs fall, the picture changes. In five years (2016), three states representing 57 million Americans will be at solar grid parity: Hawaii, New York, and California.
There are other considerations in the grid parity calculation.
Time-of-Use Rates
Some utility customers pay “time-of-use” rates that charge more for electricity consumed during times of peak demand, such as when a hot, sunny day has everyone running their air conditioners. Under these rates, a solar project can be displacing electricity that costs upwards of $0.30 per kWh. Averaged over a year, time-of-use rates can boost the cost of electricity at peak times, when solar panels produce a lot of power, by about 30 percent. If every state implemented time-of-use pricing (and it was equivalent to a 30 percent increase in grid prices during peak times), solar grid parity would be a reality in 14 states in 2016, instead of just 3.
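Under that simplification, the parity test shifts by the time-of-use uplift. Here is a minimal sketch; the 30 percent uplift follows the article, while the prices are illustrative:

```python
def parity_with_tou(levelized_cost_kwh, retail_price_kwh, tou_uplift=0.30):
    """Grid parity test, assuming time-of-use pricing raises the
    effective price solar offsets by ~30% during peak, sun-heavy hours."""
    effective_price = retail_price_kwh * (1 + tou_uplift)
    return levelized_cost_kwh <= effective_price

# Hypothetical state: $0.25/kWh solar vs. a $0.20/kWh flat retail rate.
# Solar misses parity on the flat rate (0.25 > 0.20) but reaches it
# once peak pricing is counted (0.25 <= 0.26).
print(parity_with_tou(0.25, 0.20))  # True
```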
Solar v. Grid Over Time
There’s one other calculation. Let’s say that in 2011 solar still costs a bit more than the grid electricity price, but that the grid price rises at a modest rate each year. In this case, solar may still be the right choice, because the lifetime cost of solar (at a fixed price) will be less than the rising cost of grid electricity. We can use an accounting tool called net present value to estimate the savings from solar compared to grid power over 25 years. For every percentage point of annual increase in electricity prices, solar can be roughly 10% more expensive than grid power today and still be at “parity.” With electricity price inflation of 2% per year, this method brings solar grid parity about two years closer.
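A short sketch of that net-present-value comparison, assuming the same 5 percent discount rate and 25-year horizon as above, reproduces the rough rule of thumb:

```python
def breakeven_premium(escalation, discount=0.05, years=25):
    """How much more expensive (as a fraction) solar can be today than
    grid power and still break even in net present value, if grid prices
    rise at `escalation` per year while solar's cost stays fixed."""
    pv_rising = sum((1 + escalation) ** t / (1 + discount) ** t
                    for t in range(1, years + 1))
    pv_flat = sum(1 / (1 + discount) ** t for t in range(1, years + 1))
    return pv_rising / pv_flat - 1

print(f"{breakeven_premium(0.01):.0%}")  # ~11% per point of escalation
print(f"{breakeven_premium(0.02):.0%}")  # ~24% with 2%/yr price inflation
```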
John Farrell is a senior researcher at the Institute for Local Self Reliance. This piece was originally published at Energy Self Reliant States.