
Wind and solar are said to be cheaper, but utilities keep asking for higher rates from consumers

"Green" energy proponents and the media have claimed that solar and wind generation are "now cheaper than gas, oil or coal, and renewables now account for 15% of power generation, a three-fold increase over 7 years. Yet, utilities have this month filed rate increase requests across the country.

Published: January 12, 2024 11:00pm

Updated: January 13, 2024 10:53am

It’s often reported that wind and solar are the cheapest energy sources available. In the U.S., between 2015 and 2022, wind and solar increased from 5.6% of the electricity produced to nearly 15%. If wind and solar are so cheap, this three-fold increase should have produced a wave of reports of falling electricity rates across the country. Instead, we’ve seen the exact opposite.

Rising rates

In the past thirty days, the media reported on rate increase requests in Illinois, California, Wyoming, Maryland, Florida, Oregon, North Carolina, Georgia, Nevada, Louisiana, Alaska, West Virginia, and many other states. In some cases, public utility commissions denied the requests.

Despite widespread reports of rate increases, the media continue to claim that wind and solar are the cheapest energy to be had. “The cost of generating electricity from the sun and wind is falling fast and in many areas is now cheaper than gas, oil or coal,” the New York Times reported last year. “Clean energy is often now the least expensive,” the Associated Press reported last month.

In its 2023 World Energy Outlook, the International Energy Agency (IEA) proclaimed that solar additions were increasing rapidly because “they are now the cheapest new sources of electricity in most markets.”

Many of these claims, including the IEA outlook, are based on a metric called the Levelized Cost of Energy (LCOE), popularized by annual analyses from the financial services firm Lazard. The metric compares the cost of building and maintaining a power plant to the electricity it is expected to generate over its lifetime.
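In rough terms (the exact treatment of discounting varies from one analysis to another), the metric divides lifetime costs by lifetime output:

$$\mathrm{LCOE} = \frac{\sum_{t=1}^{n} \dfrac{C_t + O_t + F_t}{(1+r)^t}}{\sum_{t=1}^{n} \dfrac{E_t}{(1+r)^t}}$$

where $C_t$, $O_t$ and $F_t$ are capital, operating and fuel costs in year $t$, $E_t$ is the electricity generated in year $t$, $r$ is the discount rate, and $n$ is the plant’s expected lifetime in years.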

The problem with this equation, experts say, is that it assumes people will consume electricity as it's available. With natural gas, coal, hydroelectric, and nuclear energy, it’s available most of the time. With wind and solar, it’s only available under the right, unpredictable weather conditions. Lee Cordner, a power-grid consultant with 50 years of experience, told Just the News that LCOE doesn’t account for what it costs to support the intermittent nature of renewables. “I’m fond of saying that wind is cheaper 30% of the time. The other 70% gets really expensive,” Cordner said.

Renewable data center

Google is planning to run its entire operations 100% on renewable energy by 2030.

Using simple math, Cordner illustrated a cost comparison for a single, 100-megawatt data center running on natural gas-fired power versus the same facility running entirely on solar and batteries. The data center draws an average of 100 megawatts and runs 24 hours per day, 365 days per year, which works out to just under 900,000 megawatt hours per year.
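The consumption figure is easy to check. Here is the arithmetic in a few lines of Python, using only the scenario's stated assumptions:

```python
# Back-of-the-envelope check of the data center's annual consumption.
# The constant 100 MW load and year-round operation are the scenario's assumptions.
load_mw = 100
hours_per_year = 24 * 365

annual_mwh = load_mw * hours_per_year
print(f"Annual consumption: {annual_mwh:,} MWh")  # 876,000 MWh -- "just under 900,000"
```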

A 100 megawatt gas turbine, Cordner explained, costs $125 million and uses about $10 million in fuel per year. The plant runs 96% of the time. The average solar farm costs $1 million per megawatt, and a battery facility costs between $300,000 and $600,000 per megawatt hour.

On sunny summer days, a 100 megawatt solar farm will produce power for six to seven hours, Cordner said, or about 600 megawatt hours. Keeping the data center running through dawn, dusk and night would require 1,800 megawatt hours of battery storage. Batteries, Cordner explained, have to be cooled, and these cooling systems can eat up a substantial share of the power that’s put into them. For this hypothetical scenario, Cordner assumed 30% of the power going into the battery facility is used to cool it. “So you put in 100 megawatt hours, you run the cooling system, and you can only get 70 megawatt hours coming back,” Cordner said.
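A minimal sketch of that storage arithmetic, using the scenario's assumptions (100 megawatt load, roughly six hours of solar production, 30% cooling loss):

```python
# Storage needed to carry the data center through non-daylight hours,
# using the scenario's assumptions (100 MW load, ~6 hours of solar per day).
load_mw = 100
non_solar_hours = 24 - 6                 # dawn, dusk and night hours to cover

delivered_mwh = load_mw * non_solar_hours
print(f"Energy the batteries must deliver overnight: {delivered_mwh:,} MWh")  # 1,800 MWh

# Cooling losses: the scenario assumes ~30% of the energy sent into the battery
# facility is consumed by its cooling systems, so only ~70% comes back out.
cooling_loss = 0.30
recovered = 100 * (1 - cooling_loss)
print(f"Energy recovered per 100 MWh put into storage: {recovered:.0f} MWh")  # 70 MWh
```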

To deliver the 1,800 megawatt hours of storage needed, the battery facility will therefore need 2,400 megawatt hours of capacity to account for the energy lost in running the facility. The data center consumes about 2,400 megawatt hours per day, and with roughly six hours of daily solar production, that implies a solar farm of about 400 megawatts, a $400 million investment at $1 million per megawatt. For the battery facility, Cordner used $500,000 per megawatt hour, which comes to $1.2 billion.
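The capital figures follow directly from the unit costs above. A quick sketch, treating those unit costs as the scenario's assumptions:

```python
# Capital cost of the solar farm and battery facility, using the article's unit costs.
solar_capacity_mw = 400                  # farm sized to cover the data center's daily energy
solar_cost_per_mw = 1_000_000            # $1 million per megawatt
battery_capacity_mwh = 2_400             # storage capacity after allowing for cooling losses
battery_cost_per_mwh = 500_000           # midpoint of the $300,000-$600,000 range

solar_cost = solar_capacity_mw * solar_cost_per_mw          # $400 million
battery_cost = battery_capacity_mwh * battery_cost_per_mwh  # $1.2 billion
print(f"Solar farm:       ${solar_cost:,}")
print(f"Battery facility: ${battery_cost:,}")
```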

But the expenses don’t end there. Solar output falls by half on cloudy winter days. To be sure the data center has enough power during the shorter winter days and enough stored to get through the longer nights, the solar farm needs to be doubled in size. Now it’s an $800 million investment.
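Adding it up (a sketch using the figures above):

```python
# Grand total after doubling the solar farm for shorter, cloudier winter days.
solar_cost = 2 * 400 * 1_000_000      # 800 MW of solar at $1 million/MW
battery_cost = 2_400 * 500_000        # 2,400 MWh of batteries at $500,000/MWh
print(f"Total capital cost: ${solar_cost + battery_cost:,}")  # $2.0 billion
```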

That brings the grand total to $2 billion.

Wait, there’s more

Even this system won’t guarantee the data center avoids blackouts. On a snowy day, the solar panels’ output falls to zero. So the data center would need to keep a gas plant maintained and ready to fire up in such conditions, which means more costs. And Google wouldn’t meet its 2030 goal of 100% renewable energy with a system backed up by natural gas.

Cordner also points out that during the summer, this solar facility would produce more power than it can store, which means a lot of electricity with nowhere to go. That's a problem that’s arisen in California, where wind and solar curtailments, as they’re called, are happening more frequently when the overbuilt renewable energy sector is producing more power than the grid can use.

When the data center’s solar farm is curtailed on long sunny summer days, those panels would sit exposed to the elements, soaking up rays while producing nothing that can be used. “It’s a tremendously inefficient use of resources,” Cordner said.

The bill

At current natural gas prices, he said, the gas-fired power plant would cost about $0.05 per kilowatt hour.

A battery-solar system is currently estimated to last 20 years, though Cordner said such systems are wearing out more quickly than projected. Based on a 20-year amortization, the timeframe LCOE analyses typically use, at 6% interest, the price comes to $0.17 per kilowatt hour.

Cordner said a 10-year amortization is probably more realistic, which would put the price at $0.34 per kilowatt hour.
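For readers who want to reproduce the per-kilowatt-hour arithmetic, here is one common approach, using a standard annuity (capital-recovery) formula. Cordner's exact method isn't spelled out, so treat the approach and inputs as illustrative assumptions; the results land in the same range as his quoted figures but won't match them exactly.

```python
# Rough levelized cost of the ~$2 billion solar-plus-battery investment.
# The annuity (capital-recovery) formula and inputs are illustrative assumptions;
# the article doesn't spell out Cordner's method, so results differ slightly
# from his quoted $0.17 and $0.34 figures.
def cost_per_kwh(capital, rate, years, annual_kwh):
    """Levelized capital cost per kWh using a standard capital-recovery factor."""
    crf = rate * (1 + rate) ** years / ((1 + rate) ** years - 1)
    return capital * crf / annual_kwh

capital = 2_000_000_000            # total system cost in dollars
annual_kwh = 876_000 * 1_000       # ~876,000 MWh per year, in kWh
for years in (20, 10):
    print(f"{years}-year amortization at 6%: ${cost_per_kwh(capital, 0.06, years, annual_kwh):.2f}/kWh")
```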

The average U.S. electricity rate for all sectors in October 2023 was 12.68 cents per kilowatt hour.

This estimation doesn’t factor in the cost of keeping the natural gas plant on standby for those snowy days.

There is also a difference in land use between the fossil-fueled and solar-battery systems. The 100-megawatt natural gas plant, according to a study by Strata, would require 12.41 acres. The solar farm would require 4,000 acres, or 6.25 square miles, and the battery facility would take up about 60 more acres.
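The solar acreage is consistent with a common rule of thumb of roughly 5 acres per megawatt of solar capacity; that figure is an assumption here, not from the article:

```python
# Approximate land-use comparison. The ~5 acres per megawatt figure for solar is a
# rough rule of thumb assumed here; the 12.41-acre gas-plant figure is from the Strata study.
gas_plant_acres = 12.41
solar_capacity_mw = 800                 # the doubled solar farm
acres_per_mw = 5
solar_acres = solar_capacity_mw * acres_per_mw
print(f"Solar farm: {solar_acres:,} acres ({solar_acres / 640:.2f} square miles)")  # 4,000 acres
print(f"Gas plant:  {gas_plant_acres} acres")
```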

This is the cost and size of a 100% renewable energy system for one data center requiring 900 gigawatt hours per year. By comparison, in 2022 the U.S. consumed 4.07 trillion kilowatt hours of electricity, more than 4,500 times the hypothetical data center’s consumption.
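A quick check of that scale comparison:

```python
# Scale comparison: U.S. electricity consumption in 2022 vs. the hypothetical data center.
us_consumption_kwh = 4.07e12            # 4.07 trillion kWh
data_center_kwh = 900_000 * 1_000       # ~900 gigawatt hours per year, in kWh
ratio = us_consumption_kwh / data_center_kwh
print(f"U.S. consumption is roughly {ratio:,.0f} times the data center's annual use")  # ~4,500
```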

A transition to 100% renewable energy for this one data center requires an investment of over $2 billion every 10 to 20 years.

The Biden administration plans to reach net zero by 2050, which includes 100% “clean” electricity by 2035.

 
