
We're at a point where even "free hot water" is not competitive with solar for power generation. It costs more to build a 1GW coal power plant than it does to build a 3GW solar power plant (the 3X is capacity factor compensation). And most of the cost of that coal power plant is the steam turbine and its infrastructure.

We're not at that point yet with natural gas because a combined cycle turbine is more efficient than a steam turbine.




People really don’t understand how huge that is. There is no way to make the math on nuclear or fusion work when the power-extraction portion of the plant costs more than solar, even if you zero out the generation costs.


I see this as a fallacy: there are a ton of industrial processes that use enormous amounts of power just to produce heat. A great early use case for fusion will be using the heat directly for these industrial processes. For example, aluminum requires ~14-17MWh to produce 1 ton... If you use the heat directly, you reduce the process's inefficiency by removing the conversions: heat to steam to electricity to heat.

Yeah, in the next 50 years you might not see coal/natural gas being replaced by fusion. But you will see fusion displacing chunks of what those power plants would have been powering.


> I see this as a fallacy: there are a ton of industrial processes that use enormous amounts of power just to produce heat. A great early use case for fusion will be using the heat directly for these industrial processes. For example, aluminum requires ~14-17MWh to produce 1 ton... If you use the heat directly, you reduce the process's inefficiency by removing the conversions: heat to steam to electricity to heat.

The other guy was correct while you are the one who posted the fallacy. If using heat from nuclear sources to drive aluminum production were feasible, people would already be doing it using heat from HTGR reactors rather than waiting for nuclear fusion reactors to be made. The reason it is not feasible is because the heat is an output, not an input. The actual input is electricity, which is what drives the reaction. The 940–980°C temperatures reached during the reaction are from the electricity being converted into heat from resistive losses.

It should be noted that production nuclear fusion reactors would generate even more radioactive waste by weight than nuclear fission reactors. The only reason people think otherwise is that the hypothetical use of helium-3 fuel would avoid it, but getting enough helium-3 to power even a test reactor is effectively impossible. There are many things that are hypothetically attainable if everyone in the world decides to pursue them; the permanent elimination of war, crime and poverty are such things. Obtaining helium-3 in the quantity needed for a single reactor is not.

However, the goal of powering the Hall–Héroult process from a nuclear fusion reactor is doable. Just use solar panels. Then it will be powered by the giant fusion reactor we have in the sky. You would want to add batteries to handle energy needs when the sun is not shining or do a grid tie connection and let the grid operator handle the battery needs.

Finally, industrial processes that actually need heat at high temperatures (up to around 950°C if my searches are accurate) as input could be served by HTGR reactors. If they are not already using them, then future fusion reactors will be useless for them, since there is no future in sight where a man made fusion reactor is a cheaper energy source than a man made fission reactor. Honestly, I suspect using solar panels to harness energy from the giant fusion reactor in the sky is a more cost effective solution than the use of any man-made reactor.


> The other guy was correct while you are the one who posted the fallacy. If using heat from nuclear sources to drive aluminum production were feasible,

Aluminum reduction is electrochemical, not thermochemical. Yes, the pots are hot, but they are kept hot by resistive dissipation as the alumina is electrolysed.

(There is some chemical energy contributed from oxidation of the carbon electrode.)


That is why using heat as an input to drive the reaction is not feasible. I explained this in the sentences that followed.


Somewhere there was an excellent blog post (I lost the link) explaining that fusion and fission plants have basically the same requirements for energy extraction. You can therefore estimate the ideal cost of a perfect fusion reactor by zeroing out the cost of the generation side, which gives a rough lower bound on the cost of fusion with current technology. I think that put you somewhere near the 20c mark. Solar is 4-5c and going down; that's hard to beat. Also, you have to be careful when comparing solar to fusion because there are significant lifecycle costs on fusion that are not present in solar, so you have to take that into account when calculating total cost.
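The cost-floor argument can be sketched as back-of-envelope arithmetic. Note the 30 c/kWh fission LCOE and the one-third "nuclear island" cost share below are illustrative assumptions chosen to reproduce the ~20c figure, not numbers from the lost blog post:

```python
# Sketch of the cost-floor argument: a fusion plant needs the same steam-cycle
# "generation side" as a fission plant, so even a free fusion core is bounded
# below by that balance-of-plant cost. Both figures here are assumptions.
fission_lcoe_cents_kwh = 30     # assumed all-in fission LCOE, c/kWh
nuclear_island_share = 1 / 3    # assumed cost fraction a free fusion core zeroes out

fusion_floor = fission_lcoe_cents_kwh * (1 - nuclear_island_share)
print(fusion_floor)             # ~20 c/kWh, vs 4-5 c/kWh for solar
```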


This is why I think the only fusion approach that possibly has a chance is Helion's, since it avoids the turbines of the heat -> electricity approach that fission and DT fusion use.


> A great early use case for fusion will directly use the heat for these industrial processes.

There is no chance that early fusion plants will be small enough to justify building them in the same building as a factory. They will start large.

> For example, aluminum requires ~14-17MWh to produce 1 ton

The Hall–Héroult process runs at 950°C, not far below the melting point of copper (1085°C). It is close to twice the temperature of steam entering a turbine. It is not something that can be piped around casually: as a gas, it would always be at very high pressure, because lowering the pressure cools it down. Molten salt or similar is required to transport that much heat as a liquid. Every pipe glows orange. Any industrial process will effectively be a part of the power plant because of how difficult it is to transport that heat away.

Also NB that the Hall–Héroult process is for creating aluminum from ore, and recycling aluminum is the primary way we make aluminum.


> Every pipe glows orange. Any industrial process will effectively be a part of the power plant because of how difficult it is to transport that heat away.

Industrial parks centered around power plants might become a thing in the future, being looked at as essential infrastructure investment.

Heat transport could be seen as an entire sub-industry unto itself, adding efficiency and cost savings for conglomerates that choose to partner with companies that invest in and build power plants.


To take advantage of this you would need to build an integrated power/manufacturing hub. The project would be extremely expensive and difficult to finance in places that don’t have strong central planning.


Exactly! Fusion hype fades when solar already outcompetes it on cost.

Agreed, fusion is a cool physics problem for now. In the far future, if it can scale down, it may have applications in shipping or space.


1GW worth of power during a period of high demand and low supply will cost you more than 3GW worth of power during a period of high supply and low demand. In some situations, even 100GW worth of power during certain time periods will be cheaper to buy than 1GW.

For a typical consumer in Sweden, most electricity is bought when the price is at its highest point, around four times what energy costs during the cheapest months. Electricity consumption is at its lowest point when prices are at their lowest, which is the same period when solar production peaks. Conversely, consumption is at its highest when solar production is at its lowest.


> It costs more to build a 1GW coal power plant than it does to build a 3GW solar power plant (the 3X is capacity factor compensation)

That “3X” figure assumes a high-insolation region (CF ~25%). In Central Europe, where solar CF is only ~12%, you’d need about 5x the PV capacity to equal a 1 GW coal plant’s annual generation. How does scaling up to 5 GW of PV change the cost comparison vs a coal plant?
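A minimal sketch of that capacity-factor scaling. The ~75% coal capacity factor is an assumption back-solved from the parent's "3X at CF ~25%", and `pv_capacity_needed` is a hypothetical helper, not anything from the thread:

```python
# Sketch: PV nameplate capacity needed to match a baseload plant's annual
# energy output. All capacity factors (CF) are assumptions from this thread.
def pv_capacity_needed(baseload_gw: float, baseload_cf: float, pv_cf: float) -> float:
    """Solve baseload_gw * baseload_cf = pv_gw * pv_cf for pv_gw."""
    return baseload_gw * baseload_cf / pv_cf

# Coal at ~75% CF vs sunny-region PV at ~25% CF:
print(pv_capacity_needed(1.0, 0.75, 0.25))  # 3.0 GW, the parent's 3X
# Same coal plant vs Central European PV at ~12% CF:
print(pv_capacity_needed(1.0, 0.75, 0.12))  # ~6.3 GW, roughly the 5x claim
```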


Comparing solar power generation to solar hot water seems wrong to me because there is solar hot water:

https://www.energy.gov/energysaver/solar-water-heaters

I recall hearing that they are 80% efficient while photovoltaics tend to be around 20% efficient.


We're talking about electricity generation here, not heat generation. People have tried generating electricity using solar heat, but we've stopped doing that because it's too expensive.

https://en.wikipedia.org/wiki/Solar_power_tower


> We're talking about electricity generation here, not heat generation

As a peer post noted (without backing it up, but it seems reasonable):

> Only 20% of our energy needs are supplied by electricity.

It is a fair viewpoint to talk about energy instead of only electricity. For example, current EVs are built using coal (steel and cement for the infrastructure), and parts and final products are moved between continents with oil (ships). The same goes for solar panels and their underlying steel structures, and for the roads those EVs use, etc. There are technical solutions for those, but they haven't proven economically competitive yet. So I'll happily take that 80% efficiency where we need relatively low-grade heat: domestic and commercial AC and water heating. Those are by far the most energy-intensive uses in the residential sector when there isn't an electric vehicle, and they are most needed at peak times (mornings and evenings in winter). We'd better take that +60%.


Any low heat solution is going to have a very difficult time competing economically with heat pumps, which often have an efficiency > 300%.

The most economical solution for reducing our carbon emissions by 95% is doing these two steps in parallel:

1. Use electricity instead of fossil fuel
2. Generate electricity in a carbon-free manner

Yes, there are some use cases this doesn't work well for yet: steel and ocean transport are two you listed. But it does cover the four biggest sources of carbon emissions: ground transport, heating, electricity generation and agriculture. The big four are 95% of our carbon emissions.


The Rheem heat pump for domestic hot water that I have in my home claims a maximum energy savings of 75%. That implies that at 20% efficiency out of my solar panels, the combined efficiency of photovoltaic panels plus the heat pump equals the 80% efficiency of solar hot water. However, this ignores DC-to-AC conversion and line losses.

The photovoltaic panels have the added bonus that the energy can be used for other purposes (e.g. transport, HVAC, computers, cooking, laundry, A/V equipment) should my hot water needs be low compared to what the system is designed to produce. However, from a pure efficiency standpoint, it is unclear to me which approach is better. They seem to be a rough tie, with losses for both approaches making the real world worse than ideal conditions. I am not sure if one is better than the other in the actual real world and if anyone who knows the answer is kind enough to share it, I would find the answer enlightening.
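One way to see the rough tie, using the figures above. The COP of 4 is inferred from the claimed 75% savings versus resistive heating; all numbers are ideal-case assumptions from this thread, before inverter and line losses:

```python
# Sunlight -> hot water via two paths; all figures are thread assumptions.
pv_efficiency = 0.20             # sunlight -> electricity
heat_pump_cop = 4.0              # "75% savings" vs resistive implies COP = 1 / (1 - 0.75)
solar_thermal_efficiency = 0.80  # sunlight -> heat, direct collector

pv_plus_heat_pump = pv_efficiency * heat_pump_cop  # heat delivered per unit of sunlight
print(pv_plus_heat_pump)         # 0.8 -> a tie with solar thermal, in the ideal case
print(solar_thermal_efficiency)  # 0.8
```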


I mean, from a distribution standpoint, electricity is way easier to distribute than heat (pressurized steam? Hot water?) and has less loss over longer distances.


It doesn't matter that much if you have excess solar available. Beyond that, many who install solar also tend to go with a heat pump water heater, which is ~400% efficient, bringing photovoltaics in line with solar hot water without running plumbing up to the roof. And now that roof space can be used to power many things rather than just hot water.

https://www.energy.gov/energysaver/heat-pump-water-heaters


The two being equal in efficiency is true in a best case scenario, but that ignores real world effects such as inverter losses. I wonder which would be superior in a real world test.

That said, in my home, I use net metered photovoltaic panels with a Rheem heat pump for domestic hot water. This was not done because I considered it to be a better solution, but because it was the only solution available to me from local installers.


Solar hot water has to account for pumping losses as well. It's going to be in the same ballpark, but the electric heat pump hot water system is much more flexible in how the power is used and decouples production from use. Electrical wiring rather than plumbing on the roof is also simpler and, dare I say, less prone to issues.

Solar thermal heating used to make more sense, but the cost of photovoltaics has come down so much, along with relatively cheap heat pump systems, that nobody seems to be doing the former anymore.

I just got a large solar system installed, and next up is a heat pump water heater, as that's the second-largest user of power next to the HVAC. Plus, it will cool and dehumidify my garage somewhat, which is where the solar inverter and batteries are located, converting some of the waste heat from the inverter into hot water at the same time.


This is getting away from the topic, but the capital cost of most solar hot water heaters and their inflexibility with regard to clouds, solar angle, and outside temperatures has made using photovoltaics to resistively heat water a better deal even at the residential level for the past ten years.


Even so, it's cheaper these days to drive a water heater with PV electricity than it is to directly heat the water in thermal collectors.


You need a lot more than 3X solar capacity to deal with nighttime, which coal has no issue with. You need some kind of storage (batteries? pumped hydro?) and that is expensive.


Storage is expensive for now. Just a few years ago, solar was mocked as being ridiculous and having no future due to terrible efficiency and high costs. Now that manufacturing is well established and people know how to set it all up, it's the cheapest source of energy.

Batteries beyond the scale of a handheld device only started getting massively manufactured and invested in fairly recently as well. Once it's obvious that it's possible to build megabatteries that can power towns, everyone will want in on the market and prices will go down.


Do you expect batteries to be cheaper than solar panels?

Or inverters? (Also not included in your calc I think?)


Eventually. Right now there's the idea that batteries=lithium, and most manufacturers aren't even considering other materials. There are much cheaper materials out there that can make batteries, especially when looking at devices of massive scale.


> 3GW solar power plant

Except that it needs to be around a 30GW plant to compete with 1GW of coal. And it needs storage for several days of energy.


Sure, if it's by itself and not connected to a continental grid.


However, solar caused problems in Spain recently: its lack of mechanical inertia brought their grid down through frequency instability.

Fusion would use a conventional turbine with boiling water. Is this a better source of mechanical inertia than hydropower or fission?

Is there a better way to solve the problem of frequency instability?

Why is this fact downvoted? This article mentions "synthetic inertia;" what are its drawbacks?

https://www.bloomberg.com/news/articles/2025-05-09/spain-bla...

https://archive.ph/VI32e


Solar caused problems in Spain because it was misconfigured. AC inverters are a fabulous source of power stabilization; many grids choose to install batteries and inverters for grid stabilization.


The article mentions that largish batteries are needed for synthetic inertia, which I am guessing use A/C inverters. Spain appeared to lack sufficient batteries.

Obviously, this configuration of solar and battery banks will work best closer to the equator.

Will different types of power grids be required for areas further away, or is it practical to ship power long distances to far Northern/Southern areas?


Synthetic inertia needs a large DC source. At the time of the outage, solar power was a large DC source.


The power source needs to be able to momentarily supply a large portion of the grid's energy demand, which is something batteries are typically well suited for.

Mechanical inertia in generators also tends to do well in these situations.

PV panel supply was just not nearly large enough, and if you look at overall PV capacity as a percentage of their grid capacity, it’s pretty obvious it was never going to be enough to stabilize any serious issues.


Nobody knows the cause of the energy outage in Spain, Portugal and France... except U.S. Energy Secretary Chris Wright, a shill for the oil and fracking industry.

Could you point to the outage conclusion report?


> We're at a point where even "free hot water" is not competitive with solar for power generation.

You're making the obvious mistake here of equating 1 GW of solar with 1 GW of any other source that has a 95-99% baseload capacity factor. To achieve the equivalent result, you'll need at least >2 GW of actual solar capacity to compare the two fairly.

Granted, in most developed places, solar still beats coal, but this is why in many developing economies with ample coal resources, it makes more sense economically to go with the coal plants.

Take any other resource, say hydel or geothermal - solar and wind quickly go down in economic efficiency terms compared to these, in most cases almost doubling or tripling in costs.


> To achieve the equivalent result, you'll need to have at least >2 GW actual solar power to equally compare the two.

Which is why I compared 1GW of coal power to 3GW of solar power.


I can’t really imagine how the person who responded to you managed to miss that; it was right in the middle of your post. Oh well, I guess it is impossible to write a post well enough that somebody won’t jump in with a correction... right or wrong!


A 3GW solar power plant takes up a lot of land. Around 360km² of land according to my AI, FWIW.

We can live with huge land areas converted to power generation, but more space efficient alternatives will be a big improvement.


As a rule of thumb, 1 square meter receives about 1 kW of peak raw solar power when the sun is perpendicular. This should give you at least a rough magnitude of the problem instead of trusting the hallucinations of your AI.

Since you want to produce power all day, you would take about 20% of that to account for tilt variations and day night cycles, and another 20% to factor in cell efficiency.

So with adequate storage, one square meter of solar can generate an average of 40W of continuous electrical power, 24h per day. Let's round that down to 25W to take into consideration outages and maintenance, bad weather, spacing between panels for personnel access etc.

And there you have it: 1GW at 25W/m² is about 40 square km, with quite generous safety factors; an order of magnitude less than your AI's figure. This is still a lot of land if you replace farmland with it, but totally negligible compared with the millions of square km of hot desert the world has available for this use.

For example, scaling this 400x to cover the entire US electrical consumption is still "only" 16,000 sq. km, or 3% of the area of the Great Basin Desert in the US, which is one of the smaller deserts of the world compared with the Sahara, Gobi, Kalahari, Australian and Arabian deserts. Of course, there is little economic sense in building such a mega solar farm and paying the cost of energy transport. In practice, we are seeing distributed production taking the cheapest available land nearby.
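The arithmetic above, spelled out step by step (the insolation, tilt/cycle factor, cell efficiency, and 25 W/m² derating are all the parent's stated assumptions):

```python
# Land area for 1 GW of continuous solar output, per the parent's assumptions.
peak_insolation_w_per_m2 = 1000   # ~1 kW/m2 with the sun perpendicular
geometry_factor = 0.20            # tilt variation and day/night cycle
cell_efficiency = 0.20
avg_w_per_m2 = peak_insolation_w_per_m2 * geometry_factor * cell_efficiency
print(avg_w_per_m2)               # 40 W/m2 before derating

derated_w_per_m2 = 25             # weather, outages, panel spacing, maintenance
area_km2 = (1e9 / derated_w_per_m2) / 1e6
print(area_km2)                   # 40 km2 per continuous GW
print(400 * area_km2)             # 16000 km2 at ~400x, i.e. US electrical consumption
```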


40% of US corn acreage is used to produce something like 10% of gasoline (as ethanol). That is an unfathomable amount of land. Solar yields 20x the energy per acre. On top of that, many are finding efficiencies in colocating solar with agricultural activities (agrivoltaics). And there's also the option of replacing agricultural activity on marginal or water-stressed land.

Conclusion, land isn't really a constraint in the US.


Yeah, I'm not saying solar power is impossible.

Just pointing out that there are real downsides to this energy source, like all the others.

Now is not the time to stop developing energy sources.


The space issue is obviously a bullshit red herring.

PV provides massively more value per acre than agriculture does. If PV were seriously constrained by land costs, agriculture would be impossible.

But society is perfectly fine with having land producing $500/acre/year of hay, instead of $25,000/acre/year of PV output.


Obviously there are downsides but space is something the US at least has basically unlimited amounts of.


Your AI is messing with you. 1MW requires ~6 acres, so a GW requires 6000. A square mile is 640 acres. Being generous, let's round up to 10 square miles. Times 3 and converted to square kilometers, that gives 78.
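Checking that arithmetic (the ~6 acres per MW rule of thumb is the parent's assumption):

```python
# The parent's land-use estimate, step by step.
acres_per_mw = 6                   # parent's rule of thumb
acres_per_sq_mile = 640
km2_per_sq_mile = 2.59

acres_1gw = 1000 * acres_per_mw               # 6000 acres per GW
sq_miles_1gw = acres_1gw / acres_per_sq_mile  # ~9.4, generously rounded up to 10
km2_3gw = 3 * 10 * km2_per_sq_mile            # ~78 km2 for the 3 GW plant
print(km2_3gw)
```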


I don’t have any reason to doubt it, but it seems like an easy computation to verify, or for the AI to show its work.

Anyway, the area issue seems not too bad. In the US at least, we have places like the Dakotas, where we could turn something like 70% of the land into a solar farm and nobody would really notice.


What if you include all the parking lots and warehouses and large commercial facilities in the world too?



