> No one is building a natural gas plant to staff it and let it sit idle for 95% of the time. The natural gas burned is only a fraction of its input costs.
Serious question: why do you think that’s true? If it costs X per year to run it 5% of the year, and you save more than X with this strategy, then the maths is simple and someone will build it. Several energy companies could probably be convinced to each pay a share, so no one is left footing the whole bill but everyone benefits from the existence of the facility. If the maths works, some of the cost could potentially even be passed on to the taxpayer.
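To make that concrete, here’s a minimal sketch of the break-even logic; every number below is a made-up placeholder, not real plant economics:

```python
# Back-of-envelope break-even test for a mostly idle backup plant.
# All figures are hypothetical assumptions for illustration only.
annual_fixed_cost = 50e6        # staffing, maintenance, financing per year (assumed)
hours_run = 0.05 * 8760         # runs ~5% of the year (~438 hours)
fuel_cost_per_hour = 30e3       # marginal cost while actually running (assumed)
avoided_cost_per_hour = 200e3   # value of each hour of avoided shortfall (assumed)

total_cost = annual_fixed_cost + hours_run * fuel_cost_per_hour
total_benefit = hours_run * avoided_cost_per_hour
print(f"cost ~{total_cost/1e6:.0f}M, benefit ~{total_benefit/1e6:.0f}M:",
      "build it" if total_benefit > total_cost else "don't build it")
```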
In the UK we already have a couple of facilities that operate exactly like this, Cruachan for example (it’s not gas, it’s pumped-storage hydro). Over the years, ways to improve its utilisation have been found, but it still sits at a relatively low portion of its capacity so that it can black start the grid if it’s ever needed.
Because it has been true, at least thus far. Perhaps in the hazy future this will change, and some regulatory/capacity/energy market will evolve to make such things profitable by paying someone to build underutilized power plants. I know of no such market currently.
I'm only somewhat familiar with the US market, not the UK. But a single plant is really not interesting for the discussion at hand. It can be considered a cost of doing business to have such a plant be useful for "black starts", but that's all a single plant will ever be useful for. If it's ever being used for such a purpose, you've already lost the game.
The scale is what matters. A single power plant that is 1% of your grid capacity being utilized 5% of the time is an expense that can probably be justified. Hundreds of power plants that match 100% (or close to it) of your grid capacity used 5% of the time would be an economically unjustifiable expense, as you've effectively built your entire generation capacity twice.
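To put rough numbers on that scaling (a sketch; the grid size and cost-per-GW below are assumed round figures, not sourced):

```python
# How standby capital cost scales with the fraction of the grid you duplicate.
grid_capacity_gw = 100                  # assumed total grid capacity
capex_per_gw = 1e9                      # assumed build cost per GW of gas capacity

one_plant = 0.01 * grid_capacity_gw * capex_per_gw    # a single 1% plant
full_backup = 1.00 * grid_capacity_gw * capex_per_gw  # duplicating the whole grid

print(f"single 1% plant: ${one_plant/1e9:.0f}B of idle capital")
print(f"full duplicate:  ${full_backup/1e9:.0f}B of idle capital")
# Same ~5% utilization either way; the second case is 100x the sunk cost.
```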
Right now that's what we would be talking about building, since every regional grid seems to experience week-long (or longer) periods where intermittent power generation is extremely unreliable due to weather events. It's not 100%, but it's close to it. You need to plan for the 1000-year event for something as critical as a national grid, or folks literally start dying and the economic impact is astronomical.
I don't know what exact capacity factor you'd need for a reasonable intermittent:dispatchable ratio, but it's certainly quite a lot higher than most would seemingly believe. Once batteries get to the point of backing the entire grid for a single night while the wind doesn't blow, there might be signs of change. In most US markets where batteries are considered huge successes, they have only recently (in the past year or two) transitioned from providing ancillary services to actual energy production for regular daily usage during the duck curve.
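For a sense of scale, "backing the entire grid for a single night" implies something like the following sketch, where both the demand figure and the night length are assumptions:

```python
# Energy needed for batteries to carry a grid through one windless night.
average_demand_gw = 30     # assumed average overnight demand (roughly UK-scale)
night_hours = 12           # assumed dusk-to-dawn duration

print(f"~{average_demand_gw * night_hours} GWh of battery needed")   # ~360 GWh
```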
This can all be solved in time and in theory with a number of technologies and additional grid interconnection. But the trends simply are not as positive as one would like to see once you start delving into primary sources.
> But a single plant is really not interesting for the discussion at hand.
Maybe, but the UK has 4, and is building another 5 in the next 5 years.
Average grid consumption is somewhere around 30GW, and the existing facilities have around 30GWh of storage. The additional 5 should bring around another 100GWh.
So we’re already at 1 hour’s worth of average grid demand stored, and by 2030 we’ll be at around 4 hours. That’s assuming absolutely zero energy from other sources (wind, solar, nuclear, fossil fuel, biomass, other countries), although to be fair the existing facilities can only deliver at around 3GW, and the new facilities will only bring that up to 6GW.
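A quick check of that arithmetic, as a sketch using only the figures quoted in this thread:

```python
# Hours of storage relative to average demand, from the numbers above.
avg_demand_gw = 30                 # average UK grid consumption
storage_now_gwh = 30               # existing pumped-storage energy
storage_added_gwh = 100            # additional storage expected by ~2030

print(f"now:  ~{storage_now_gwh / avg_demand_gw:.0f} hour of average demand")
print(f"2030: ~{(storage_now_gwh + storage_added_gwh) / avg_demand_gw:.1f} hours")

# The caveat: discharge power, not stored energy, is the binding constraint.
power_now_gw, power_2030_gw = 3, 6
print(f"deliverable: {power_now_gw} GW now, {power_2030_gw} GW by 2030, "
      f"vs ~{avg_demand_gw} GW average demand")
```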
I’m not sure if you’ve ever been to the UK, but a whole week without either sun or wind seems a bit unlikely, especially when half of the UK’s wind comes from offshore wind farms.
Stick in a few more of these, and keep a couple of the existing fossil fuel plants around in case of emergencies, and I can definitely see how this continues to be just a “cost of doing business”.
I appreciate the situation in the US may be worse.