I remain skeptical that fusion will ever be a commercially viable energy source. I'd love to be wrong.
The engineering challenges are so massive that even if they can be solved, which is far from certain, at what cost? With a dense high-energy plasma, you're dealing with a turbulent fluid where any imperfection in your magnetic confinement will likely damage the container.
People get caught up on cheap or free fuel and the fact that stars do this. The fuel cost is irrelevant if the capital cost of a plant is billions and billions of dollars. That has to be amortized over the life of the plant. Producing 1GW of power for $100 billion (made up numbers) is not commercially viable.
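To put rough numbers on that, here's a back-of-the-envelope sketch in Python; the $100B, 1 GW, 90% capacity factor, 40-year life, and 7% cost of capital are all made-up illustrative assumptions, like the figures above:

    # Levelized capital cost of a hypothetical fusion plant.
    # Every number here is an illustrative assumption, not real project data.
    capex = 100e9          # $100B capital cost (made up, as above)
    power_w = 1e9          # 1 GW net electrical output
    capacity_factor = 0.9  # assume it runs 90% of the time
    life_years = 40        # assumed plant lifetime
    rate = 0.07            # assumed cost of capital

    # Standard annuity formula to spread the capex over the plant's life
    annual_payment = capex * rate / (1 - (1 + rate) ** -life_years)
    kwh_per_year = (power_w / 1000) * capacity_factor * 8760
    print(f"${annual_payment / kwh_per_year:.2f}/kWh from capital alone")
    # -> ~$0.95/kWh before fuel, staffing, or maintenance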
And stars solve the confinement problem with gravity and by being really, really large.
Neutron loss remains one of the biggest problems. Not only does this damage the container (i.e. "neutron embrittlement"), it's a significant energy loss for the system, and so-called aneutronic fusion tends to rely on rare fuels like Helium-3.
And all of this to heat water to create steam and turn a turbine.
I see solar as the future. No moving parts. The only form of direct power generation. Cheap and getting cheaper, and there are solutions to the lack of generation at night (eg batteries, long-distance power transmission).
We're at a point where even "free hot water" is not competitive with solar for power generation. It costs more to build a 1GW coal power plant than it does to build a 3GW solar power plant (the 3X is capacity factor compensation). And most of the cost of that coal power plant is the steam turbine and its infrastructure.
We're not at that point yet with natural gas because a combined cycle turbine is more efficient than a steam turbine.
People really don’t understand how huge that is. There is no way to make the math on nuclear or fusion work when the power extraction portion of the plant costs more than solar, even if you zero out the generation costs.
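A minimal sketch of that comparison; the $/W and capacity-factor numbers below are rough assumptions for the sake of the argument, not sourced figures:

    # Capital cost per *average* watt delivered, after capacity factor.
    # Both the $/W capex and the capacity factors are assumptions.
    plants = {
        "coal, 1 GW nameplate":  (3.50, 0.85),  # turbine-heavy capex
        "solar, 3 GW nameplate": (1.00, 0.25),  # utility-scale PV
    }
    for name, (capex_per_w, cf) in plants.items():
        print(f"{name}: ${capex_per_w / cf:.2f} per average watt")
    # Both land around $4 per average watt: 3x the PV nameplate buys
    # roughly the same delivered energy as 1x coal, as argued above.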
I see this as a fallacy; there are a ton of industrial processes that use a ton of power just to produce heat. A great early use case for fusion will directly use the heat for these industrial processes. For example, aluminum requires ~14-17MWh to produce 1 ton... If you use the heat directly you reduce your process's inefficiency by removing the conversions: heat to steam to electric to heat.
Yeah, in the next 50 years you might not see coal/nat gas being replaced by fusion. But you will see fusion displacing chunks of what those power plants would have been powering.
> I see this as a fallacy; there are a ton of industrial processes that use a ton of power just to produce heat. A great early use case for fusion will directly use the heat for these industrial processes. For example, aluminum requires ~14-17MWh to produce 1 ton... If you use the heat directly you reduce your process's inefficiency by removing the conversions: heat to steam to electric to heat.
The other guy was correct while you are the one who posted the fallacy. If using heat from nuclear sources to drive aluminum production were feasible, people would already be doing it using heat from HTGR reactors rather than waiting for nuclear fusion reactors to be made. The reason it is not feasible is that the heat is an output, not an input. The actual input is electricity, which is what drives the reaction. The 940–980°C temperatures reached during the reaction are from the electricity being converted into heat from resistive losses.
It should be noted that production nuclear fusion reactors would be even more radioactive than nuclear fission reactors in terms of total nuclear waste production by weight. The only reason people think otherwise is that the hypothetical use of helium-3 fuel would avoid it, but getting enough helium-3 fuel to power even a test reactor is effectively an impossibility. There are many things that are hypothetically attainable if all people in the world decide to do it. The permanent elimination of war, crime and poverty are such things. Obtaining helium-3 in the quantity needed for a single reactor is not.
However, the goal of powering the Hall–Héroult process from a nuclear fusion reactor is doable. Just use solar panels. Then it will be powered by the giant fusion reactor we have in the sky. You would want to add batteries to handle energy needs when the sun is not shining or do a grid tie connection and let the grid operator handle the battery needs.
Finally, industrial processes that actually need heat at high temperatures (up to around 950°C if my searches are accurate) as input could be served by HTGR reactors. If they are not already using them, then future fusion reactors will be useless for them, since there is no future in sight where a man made fusion reactor is a cheaper energy source than a man made fission reactor. Honestly, I suspect using solar panels to harness energy from the giant fusion reactor in the sky is a more cost effective solution than the use of any man-made reactor.
> The other guy was correct while you are the one who posted the fallacy. If using heat from nuclear sources to drive aluminum production were feasible,
Aluminum reduction is electrochemical, not thermochemical. Yes, the pots are hot, but they are kept hot by resistive dissipation as the alumina is electrolysed.
(There is some chemical energy contributed from oxidation of the carbon electrode.)
Somewhere there was an excellent blog post, which I've lost the link to, explaining that fusion and fission have basically the same requirements for energy extraction. You can therefore estimate the ideal cost of a perfect fusion reactor by zeroing out the cost of the generation side, and get a rough estimate of the lowest possible cost for fusion with current technology. I think that put you somewhere near the 20c/kWh mark. Solar is 4-5c and going down; it's hard to beat that.
Also you have to be careful when comparing solar to fusion because there are significant lifecycle costs on fusion that are not present in solar. So you have to take that into account when calculating total cost
This is why I think the only fusion approach that possibly has a chance is Helion's, since it avoids the turbines of the heat -> electricity approach that fission and DT fusion use.
> A great early use case for fusion will directly use the heat for these industrial processes.
There is no chance that early fusion plants will be small enough to justify building them in the same building as a factory. They will start large.
> For example, aluminum requires ~14-17MWh to produce 1 ton
The Hall–Héroult process runs at 950 C, just below the melting point of copper. It is close to twice the temperature of steam entering the turbines. It is not something that can be piped around casually: as a gas it will always be at very high pressure, because lowering the pressure cools it down. Molten salt or similar is required to transport that much heat as a liquid. Every pipe glows orange. Any industrial process will effectively be a part of the power plant because of how difficult it is to transport that heat away.
Also NB that the Hall–Héroult process is for creating aluminum from ore, and recycling aluminum is the primary way we make aluminum.
> Every pipe glows orange. Any industrial process will effectively be a part of the power plant because of how difficult it is to transport that heat away.
Industrial parks centered around power plants might become a thing in the future, looked at as essential infrastructure investment. Heat transport could be seen as an entire sub-industry unto itself, adding efficiency and cost savings for conglomerates that choose to partner with companies that invest in and build power plants.
To take advantage of this you would need to build an integrated power/manufacturing hub. The project would be extremely expensive and difficult to finance in places that don’t have strong central planning.
1GW worth of power during a period of high demand and low supply will cost you more than 3 GW worth of power during a period of high supply and low demand. In some situations, even 100 GW worth of power during some time periods will be cheaper to buy than 1 GW.
For a typical consumer in Sweden, most electricity is bought when prices are at their highest point, around 4 times what energy costs during the cheapest months. Electricity consumption is at its lowest point when prices are at their lowest, which is the same period when solar production peaks. Inversely, consumption is at its highest when solar production is at its lowest.
> It costs more to build a 1GW coal power plant than it does to build a 3GW solar power plant (the 3X is capacity factor compensation)
That “3X” figure assumes a high-insolation region (CF ~25%). In Central Europe, where solar CF is only ~12%, you’d need about 5x the PV capacity to equal a 1 GW coal plant’s annual generation. How does scaling up to 5 GW of PV change the cost comparison vs a coal plant?
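The scaling is just a ratio of capacity factors. A minimal sketch; the coal capacity factor here is an assumption you should tweak for your grid:

    # PV nameplate needed to match 1 GW of coal = coal CF / solar CF.
    # Capacity factors are assumptions; adjust them for your region.
    coal_cf = 0.60  # assumed fleet-average coal capacity factor
    for region, solar_cf in [("high insolation", 0.25), ("Central Europe", 0.12)]:
        print(f"{region}: {coal_cf / solar_cf:.1f}x PV per GW of coal")
    # -> 2.4x and 5.0x, in the ballpark of the ~3x and ~5x figures above,
    # within the uncertainty of the assumed capacity factors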
We're talking about electricity generation here, not heat generation. People have tried generating electricity using solar heat, but we've stopped doing that because it's too expensive.
> We're talking about electricity generation here, not heat generation
As a peer post noted (without backing it up, but it seems reasonable):
> Only 20% of our energy needs are supplied by electricity.
It is a fair viewpoint to talk about energy instead of only electricity. For example, current EVs are built using coal (steel and cement for the infrastructure), and parts and final products are moved between continents with oil (ships). Same for solar panels and their underlying steel structure. Same for the roads those EVs drive on, etc… There are technical solutions for those, but they haven't proven economically competitive yet. So I'll happily take that 80% efficiency when we need relatively low heat: domestic and commercial AC and water heating. Those are by far the most energy-intensive uses in the residential sector when there isn't an electric vehicle, and they are needed most at peak times (mornings, and evenings in winter). We'd better take that +60%.
Any low heat solution is going to have a very difficult time competing economically with heat pumps, which often have an efficiency > 300%.
The most economical solution for reducing our carbon emissions by 95% is doing these two steps in parallel:
1. Use electricity instead of fossil fuel
2. Generate electricity in carbon free manner
Yes, there are some use cases this doesn't work well for yet: steel and ocean transport are two you listed. But it does cover the 4 biggest sources of carbon emissions: ground transport, heating, electricity generation and agriculture. The big 4 are 95% of our carbon emissions.
The Rheem heat pump for domestic hot water that I have in my home claims a maximum energy savings of 75%. That implies that at 20% efficiency out of my solar panels, the efficiency of photovoltaic panels + the heat pump is equal to the 80% efficiency of solar hot water. However, this ignores losses from DC to AC and the lines.
The photovoltaic panels have the added bonus that the energy can be used for other purposes (e.g. transport, HVAC, computers, cooking, laundry, A/V equipment) should my hot water needs be low compared to what the system is designed to produce. However, from a pure efficiency standpoint, it is unclear to me which approach is better. They seem to be a rough tie, with losses for both approaches making the real world worse than ideal conditions. I am not sure if one is better than the other in the actual real world and if anyone who knows the answer is kind enough to share it, I would find the answer enlightening.
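For what it's worth, here's the ideal-case arithmetic from the post above as a sketch; the panel efficiency, the COP implied by "75% savings", and the 80% solar-thermal figure are the assumed numbers already stated, and all inverter/line/pumping losses are ignored:

    # Ideal-case sunlight-to-hot-water comparison (no inverter, line,
    # or pumping losses). Efficiencies are the assumed figures above.
    pv_efficiency = 0.20      # assumed panel efficiency
    heat_pump_cop = 4.0       # "75% savings" vs resistive implies COP ~4
    solar_thermal_eff = 0.80  # claimed solar hot water efficiency

    print(f"PV + heat pump: {pv_efficiency * heat_pump_cop:.0%}")
    print(f"Solar thermal:  {solar_thermal_eff:.0%}")
    # Both ~80% under ideal assumptions; the real-world losses discussed
    # below are what decide the winner.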
I mean, from a distribution standpoint, electricity is way easier to distribute than heat (pressurized steam? Hot water?) and has less loss over longer distances.
Doesn't matter that much if you have excess solar available. Beyond that, many who do solar also tend to go with a heat pump water heater, which is ~400% efficient, bringing photovoltaics in line with solar hot water without running plumbing up to the roof; that roof space can now be used to power many things rather than just hot water.
The two being equal in efficiency is true in a best case scenario, but that ignores real world effects such as inverter losses. I wonder which would be superior in a real world test.
That said, in my home, I use net metered photovoltaic panels with a Rheem heat pump for domestic hot water. This was not done because I considered it to be a better solution, but because it was the only solution available to me from local installers.
Solar hot water has to account for pumping losses as well; it's going to be in the same ballpark, but the electric heat pump hot water system is much more flexible in how the power is used and decouples production from use. Electrical wiring on the roof, versus plumbing, is also simpler and dare I say less prone to issues.
Solar thermal heating used to make more sense, but the cost of photovoltaics has come down so much, along with relatively cheap heat pump systems, that nobody seems to be doing the former anymore.
I just got a large solar system installed and next up is a heat pump water heater, as that's the second-largest user of power next to the HVAC. Plus, it will cool and dehumidify my garage somewhat, where the solar inverter and batteries are located, converting some of the waste heat from the inverter into hot water at the same time.
This is getting away from the topic, but the capital cost of most solar hot water heaters and their inflexibility with regard to clouds, solar angle, and outside temperatures has made using photovoltaics to resistively heat water a better deal even at the residential level for the past ten years.
You need a lot more than 3X solar capacity to deal with night time, which coal has no issue with. You need some kind of storage (batteries? pumped hydro?) and that is expensive.
Storage is expensive for now. Just a few years ago, solar was mocked as being ridiculous and having no future due to terrible efficiency and high costs. Now that manufacturing is well established and people know how to set it all up, it's the cheapest source of energy.
Batteries beyond the scale of a handheld device only started getting massively manufactured and invested in fairly recently as well. Once it's obvious that it's possible to build megabatteries that can power towns, everyone will want in on the market and prices will go down.
Eventually. Right now there's the idea that batteries=lithium, and most manufacturers aren't even considering other materials. There are much cheaper materials out there that can make batteries, especially when looking at devices of massive scale.
Solar caused problems in Spain because it was misconfigured. AC inverters are a fabulous source of power stabilization; many grids choose to install batteries and inverters for grid stabilization.
The article mentions that largish batteries are needed for synthetic inertia, which I am guessing use AC inverters. Spain appeared to lack sufficient batteries.
Obviously, this configuration of solar and battery banks will work more optimally when they are closer to the equator.
Will different types of power grids be required for areas further away, or is it practical to ship power long distances to far Northern/Southern areas?
The power source needs to be able to temporarily/momentarily provide large portions of the grid's energy demands, something batteries are typically well suited for.
Mechanical inertia in generators also tends to do well in these situations.
PV panel supply was just not nearly large enough, and if you look at overall PV capacity as a percentage of their grid capacity, it’s pretty obvious it was never going to be enough to stabilize any serious issues.
Nobody knows the cause of the energy outage in Spain, Portugal and France... except the U.S. Energy Secretary Chris Wright, a shill for the oil and fracking industry.
> We're at a point where even "free hot water" is not competitive with solar for power generation.
You're making the obvious mistake here of equating 1 GW of solar with 1 GW of any other source running at a 95-99% baseload capacity factor. To achieve the equivalent result, you'll need at least >2 GW of actual solar capacity to compare the two equally.
Granted, in most developed places, solar still beats coal, but this is why in many developing economies with ample coal resources, it makes more sense economically to go with the coal plants.
Take any other resource, say hydel or geothermal - solar and wind quickly go down in economic efficiency terms compared to these, in most cases almost doubling or tripling in costs.
I can’t really imagine how the person who responded to you managed to miss that; it was like the middle fifth of your post. Oh well, I guess it is impossible to write a post well enough that somebody won’t jump in with a correction… right or wrong!
As a rule of thumb, 1 square meter receives about 1 kW of peak raw solar power when the sun is perpendicular. This should give you at least a rough magnitude of the problem instead of trusting the hallucinations of your AI.
Since you want to produce power all day, you would take about 20% of that to account for tilt variations and day-night cycles, and another 20% to factor in cell efficiency.
So with adequate storage, one square meter of solar can generate an average of 40W of continuous electrical power, 24h per day. Let's round that down to 25W to take into consideration outages and maintenance, bad weather, spacing between panels for personnel access etc.
And there you have it: 1 GW / 25 W per square meter is about 40 square km, with quite generous safety factors, an order of magnitude less than your AI's figures. This is still a lot of land if you replace farmland with it, but still totally negligible compared with the millions of square km of hot desert the world has available for this use.
For example, scaling this 400x, to cover the entire US electrical consumption, is still "only" 16,000 sq. km, or 3% of the area of the Great Basin Desert in the US, which is one of the smallish deserts of the world compared with the Sahara, Gobi, Kalahari, Australia, Arabia etc. Of course, there is little economic sense in building such a mega-solar farm and paying the cost of energy transport. In practice, we are seeing distributed production taking the cheapest available land nearby.
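A quick sketch reproducing the arithmetic above, using the same assumed factors:

    # Reproducing the land-area arithmetic above (same assumed factors).
    peak_w_m2 = 1000       # raw solar power with the sun perpendicular
    tilt_daynight = 0.20   # assumed factor for tilt + day/night cycles
    cell_eff = 0.20        # assumed cell efficiency
    derate = 25 / 40       # round 40 W down to 25 W for weather, spacing, etc.

    w_per_m2 = peak_w_m2 * tilt_daynight * cell_eff * derate   # -> 25 W/m^2
    km2_per_gw = 1e9 / w_per_m2 / 1e6
    print(f"{km2_per_gw:.0f} km^2 per GW; {400 * km2_per_gw:,.0f} km^2 for 400 GW")
    # -> 40 km^2 per GW and 16,000 km^2 for the ~400 GW US scale above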
40% of US corn acreage is used for something like 10% of gasoline. This is an unfathomable amount of land. Solar yields 20x the amount of energy per acre. On top of that many are finding efficiencies of colocating solar with agricultural activities (agrivoltaics). And there's also replacing agricultural activities on marginal or water stressed land.
Conclusion, land isn't really a constraint in the US.
Your AI is messing with you. 1MW requires ~6 acres, so a GW requires 6000. A square mile is 640 acres. Being generous, let's round up to 10 square miles. Times 3 and convert to square kilometers gives 78.
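Same figure via the acres rule of thumb; the 6 acres/MW and the 3x capacity-factor adjustment are the assumptions stated above:

    # Same check via the acres rule of thumb (6 acres/MW assumed above).
    acres_per_gw = 6 * 1000
    sq_mi = acres_per_gw / 640          # -> ~9.4, rounded up to 10 above
    km2 = 10 * 3 * 2.59                 # x3 capacity factor, mi^2 -> km^2
    print(f"{sq_mi:.1f} sq mi nameplate, ~{km2:.0f} km^2 CF-adjusted")
    # -> ~78 km^2 per capacity-factor-adjusted GW, as stated above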
I don’t have any reason to doubt it, but it seems like a basically easy computation to verify or for the AI to show its work.
Anyway, the area issue seems not too bad. In the US at least, we have places like the Dakotas which we could turn like 70% of into a solar farm and nobody would really notice.
No one wants to acknowledge that the economics will likely never work out for the reasons you mentioned. Too much maintenance -- and very expensive maintenance at that. It's far cheaper cost per watt to build a traditional fission reactor and run/maintain that.
Another reason is that ̶t̶r̶a̶n̶s̶m̶i̶s̶s̶i̶o̶n̶ distribution costs are half of your energy bill... so even if you could theoretically get fusion energy generation for "free" (which is impossible) you've still only cut your power bill in half.
Edit: I meant to say distribution costs, not transmission. Looking at last month's bill I paid $66.60 to deliver $51.76 of energy (about 56% of my total bill was delivery). The raw distribution charge was $49.32 or 42% of the bill. I'm not alone in these numbers, but your mileage may vary.
Transmission is a really interesting problem that creates all kinds of distortions.
Say a house uses 10,000kWh per year at $0.10/kWh, so a $1000/year electricity bill. Now say you get a solar system that produces 5,000kWh per year, focused in the summer months (where your power bill tends to be higher anyway). You may even export some of that power back to the grid. Have you cut your power bill in half? No. It's probably down ~20-25%.
Why? Because regardless of how much power you use (within limits) you still need a connection to the power grid and that needs to be maintained. You'll often even see this on the electricity bill: fixed charges like "access charge" per month.
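An illustrative sketch of that effect; the fixed/variable split below is an assumption, not real tariff data:

    # Why halving consumption doesn't halve the bill (illustrative split).
    fixed = 500.0      # assumed $/yr of access/connection charges
    variable = 0.05    # assumed $/kWh energy portion ($0.10/kWh blended)
    usage = 10_000     # kWh/yr

    before = fixed + usage * variable
    after = fixed + (usage - 5_000) * variable
    print(f"bill falls {1 - after / before:.0%}")   # -> 25%, not 50%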
We benefit from being on a connected grid. Your own power generation might be insufficient or need maintenance. It's inefficient if everyone is storing their own power. So it's unclear what the future of the power grid is. Should there be large grids, small grids or no grid?
There's also resilience. Having small to medium local storage increases the stability of the grid.
Renewables plus something like iron-salt battery containers would be pretty efficient overall. Easy to roll out, very safe.
We'll still need some sort of base load somewhere and backup to restart everything obviously. But the big giant power plants (with the huge capital costs, delays and NIMBY headaches) might become less necessary.
And the transmission costs argument is precisely why we'd likely be better off solving the problem of distributing power production across a more decentralized grid with a lot of wind and solar and battery all over the place.
Problem: the capital & maintenance costs of the grid vary very little with its utilization %.
So if you build loads of wind & solar & battery all over - either (1) you've got to build so much battery capacity, all over, that you'll never need the grid, or (2) you've still got to build the grid to get you through occasional "calm & dark" periods.
Either way, you're looking at vastly higher capital expenses.
> Long-distance transmission (hundreds of kilometers) is cheap and efficient, with costs of US$0.005–0.02 per kWh, compared to annual averaged large producer costs of US$0.01–0.025 per kWh
Do you maybe mean that half the electrical energy dissipates between production plant and consumer? But that figure seems quite large compared to what I can find online, and this would not be a problem with "free fusion".
I meant to say distribution costs, not transmission. Looking at last month's bill I paid $66.60 to deliver $51.76 of energy (about 56% of my total bill was delivery). The raw distribution charge alone was $49.32 or 42% of the bill. I'm not alone in these numbers, but your mileage may vary.
My point is that the infrastructure related to the delivery of energy to a physical location is a non trivial part of an energy bill, and that this part doesn't go away magically because "fusion".
Long-distance transmission, of huge quantities of electrical energy, IS very efficient.
Distributing tiny fractions of all that energy to each of millions of individual residences, then maintaining all the short/complex/low-capacity wiring needed to do that - that part ain't the least bit efficient.
Where I live I pay about $0.09 per kWh for generation and about that much for transmission as well. I think that's what they're referring to, the literal bill they get from their current provider.
First, actually getting fusion to positive energy ROI. That's step zero and we're not even close.
Second, scaling the production of fusion in a safe and economical way. Given the utter economic failure of fission nuclear power (there has never been a profitable plant), my priors are that the fusion advocates are vastly underestimating, if not willfully ignoring, this part.
Finally, even if we do get to "too cheap to meter" energy, what then? Limitless electricity is not the same thing as limitless stored energy. Only 20% of our energy needs are supplied by electricity. To wit, the crucial industrial processes required to build the nuclear power plant in the first place can only be accomplished with combustible carbon. A power plant cannot generate the energy to build another power plant. Please let that sink in.
We're already seeing countries with photovoltaics and wind hitting $0/kWh on sunny, windy days - the grid is nearly saturated for daytime load. There isn't enough demand! This makes the economic feasibility of fusion even less attractive. No one is going to make money from it.
Where did you get the data that there has never been a profitable one? Not calling you out, just curious where you are getting this data.
I would expect that there have been multiple nuclear power plants that provide a net positive return, especially in countries like France where 70% of their energy is nuclear.
France lost an incredible amount of money on nuclear through capacity factor issues. The numbers are so bad they don’t want to admit what they are.
However a reasonable argument can be made the public benefited from externalities like lower pollution and subsidized electricity prices even if it was a money pit and much of the benefit was exported to other countries via cheap off peak prices while France was forced to import at peak rates.
Regulatory burdens on fission account for negative externalities to an arguably overzealous degree, whereas fossil fuel energy has been until recently allowed to completely ignore them. Doesn't seem like a fair comparison.
Regulatory burdens on fission result from the inherent risks and negative externalities. You’re never going to see huge long term exclusion zones with coal, but nuclear has two of them right now (Ed: Overkill though the current size may be) which also have massive government funded cleanup efforts.
So while regulations may be overkill, they're not arbitrary; only hydro is really comparable, and hydro also stores water and reduces flood risks most years. Fusion still has real risks, but there's no concern around $500+ billion cleanup efforts.
Depends on whether anyone uses wells in the area. Probably not that much outside of some sci-fi movie extreme, as piping water into contaminated regions wouldn't be particularly expensive, but the question is assuming something I'm not sure is possible.
How exactly would you get meaningful widespread tritium contamination of groundwater? IE not just the trace amounts you see from existing nuclear reactors.
Groundwater doesn’t flow quickly from a point source and tritium has a fairly short half-life. 60 years later you might be looking at a larger though still small area, but 97% of the stuff will have decayed and what remains is now diluted and doesn’t bioaccumulate.
It’s not going to concentrate around some site after entering the atmosphere the way heavier than air particulate pollution would.
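The 97% figure above checks out against tritium's half-life of roughly 12.3 years:

    # Tritium decay over 60 years, half-life ~12.3 years.
    remaining = 0.5 ** (60 / 12.3)
    print(f"{remaining:.1%} left, {1 - remaining:.0%} decayed")  # -> 3.4%, 97%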
The flow of tritium through a DT reactor is five orders of magnitude higher than tritium production in a fission reactor of the same thermal output. To put a number on it: the tritium produced and consumed in one year by a 1 GW(e) DT power plant would, if released as tritium oxide, contaminate two months of the average flow of the Mississippi River above the legal limit for drinking water.
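For scale, here's a rough physics-only estimate of that tritium throughput; the 40% thermal-to-electric efficiency is an assumption, the 17.6 MeV per D-T reaction is standard:

    # Order-of-magnitude tritium burn of a hypothetical 1 GW(e) DT plant.
    MEV_TO_J = 1.602e-13
    e_per_fusion = 17.6 * MEV_TO_J     # D-T releases 17.6 MeV per reaction
    t_mass = 3.016 * 1.661e-27         # kg per triton
    thermal_eff = 0.40                 # assumed heat-to-electric efficiency

    reactions_per_s = (1e9 / thermal_eff) / e_per_fusion
    kg_per_year = reactions_per_s * t_mass * 3.156e7
    print(f"~{kg_per_year:.0f} kg tritium per year")   # -> ~140 kg/yr
    # vs roughly grams per year of tritium made in a fission plant,
    # which is where the "five orders of magnitude" above comes from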
Fusion power plants aren’t like fission reactors where you keep years' worth of fuel in the reactor. DT only works if you’re actively recycling a breeder blanket. They are also going to be working with gases; unlike fission reactors, the tritium isn't produced in fuel submerged in water, contaminating that water.
Nearly pure tritium is extremely valuable so we aren’t going to be dealing with some long term leak. You hypothetically might have a large tank with say 1 month of T2 fuel but that would be really expensive directly and waste quite a bit of fuel through nuclear decay over time. Having that much fuel across multiple different systems is more plausible but then requires a wide range of different failures. But let’s assume such an improbable tank catastrophically fails, outside of containment, and then completely burns so the tritium will eventually fall back to earth.
It then has to rain over land, though even then storms don’t release all the moisture in the air, that water must be absorbed into the soil rather than running off or evaporating, where it’s further mixed with groundwater as it slowly seeps deep enough to be collected in some well. Thus even if conditions are perfect you’d have trouble reaching above the legal limit for drinking water.
I mean maybe if you intentionally selected the perfect moment with the perfect weather pattern and the perfect local geography and geology perhaps you’d be over the legal limits for a few wells for a little while until it rapidly decays.
My point is: even small leaks (in percentage terms) will result in much larger releases from LWRs. And tritium promises to be very difficult to contain, as it diffuses through a wide variety of materials. For example, it diffuses through polymer seals. Materials of a reactor will become saturated with it, providing a substantial source term in accidents.
Lurking over all this is the issue that loss of property value doesn't require anyone to actually prove tangible harm. The mere fact that property values were affected is enough for a tort.
The kind of releases you’re talking about is 8+ orders of magnitude smaller than in your prior example, and again, without burning it’s lighter than air and just goes straight up. Right now T2 is ~$30,000/gram; hell, even D costs ~13,000x as much as hydrogen. This just isn’t the kind of thing you’d let escape in meaningful quantities in day-to-day operations.
When people talk about how safe fusion is they aren’t kidding; even breathing in a significant amount of T2 isn’t particularly dangerous radiologically, as density is really low and you will quickly exhale it. Huge quantities would be a larger suffocation risk, but then you’re talking multi-million-dollar accidents simply from lost fuel.
Not a single one of the ~700 nuclear power plants has been built without significant government subsidies [1][2].
Additionally, the industry as a whole is shielded from the liability that would otherwise have bankrupted it multiple times. Notably, the clean up from Fukushima will likely take over 100 years, requires tech not yet invented and will likely cost as much as a trillion dollars [3]. In the US, there is a self-insurance fund paid into by the industry, which would've been exhausted 10-20 times over from a Fukushima level disaster. Plus, Congress severely limits liability from nuclear accidents, both on a per-plant and total basis ie the Price-Anderson Act [4].
Next, it seems like it's the taxpayer who is paying to process and store spent nuclear waste, a problem that will persist for centuries.
Even with all this, the levelized cost of energy ("LCOE") of fission power is incredibly expensive and seemingly going up [5].
Some want to reduce costs by using more off-the-shelf tech and replicating it for scale, most notably with small modular reactors ("SMRs") but this actually makes no sense because larger fission reactors are simply more efficient.
Not really, in the sense that no owning company has managed to survive without the state stepping in and giving them money.
Most reactors are old and in need of repair, most of them earlier than planned, afaik.
There is also the bigger issue that some reactors are shut down in the summer because cooling water would leave the reactor so hot that it would be a danger to the animals living in the river.
I won't dispute that fission power has enormous capital costs. But how much of its alleged "failure" has been the utter FUD that's been pushed for the past 50+ years about how we'd all be glowing if nuclear power was widespread?
I mean sure, waste disposal is a serious issue that deserves serious consideration. But fission waste contaminates a discrete area. Fossil fuels at scale cause climate change that contaminates the entire freaking planet. It's a travesty we haven't had a nuclearized grid for 20-30 years at this point.
That's not quite right: 20% of our energy supply comes from electricity. But over 50% of the energy we consume gets wasted as heat, and fossil fuel energy wastes a lot more of its energy as heat.
So the real number is closer to 40%. If we switch ground transport to EV's and heating to heat pumps we can get up to ~75%.
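A sketch of that adjustment; the shares and end-use efficiencies below are illustrative assumptions, not sourced statistics:

    # Electricity's share of *useful* energy, not primary energy.
    # Shares and end-use efficiencies are illustrative assumptions.
    electric_share, electric_eff = 0.20, 0.90
    fossil_share, fossil_eff = 0.80, 0.35  # most fossil end-use rejects heat

    useful_e = electric_share * electric_eff
    useful_f = fossil_share * fossil_eff
    print(f"{useful_e / (useful_e + useful_f):.0%}")  # -> ~39%, i.e. ~40%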
> I remain skeptical that fusion will ever be a commercially viable energy source. I'd love to be wrong.
I’m also skeptical, but I think the emphasis of my skepticism is on “commercially viable” as opposed to an available energy source. That is, I think fusion development will (and should) proceed anyway.
There’s a good argument that nuclear fission is not really commercially viable in its current form. Yet it provides quite a lot of commercially available electricity. And it also powers aircraft carriers and submarines. And similar technology produces plutonium for weapons. In other words, I don’t think fission’s continued availability as a power source is a strictly commercial decision.
I think there’s a quite a lot of technology that is not directly commercially viable, like high energy physics, or the space program. But they remain popular and funded. And they throw off a lot of commercial side benefits.
The growth of solar for domestic consumer power will certainly continue and that is a good thing. But I bet we’ll have fusion too in the long run. There’s no lack of ideas for interesting things to do with extreme amounts of heat and power. For example I’m hopeful that humanity eventually figures out space propulsion powered by fusion.
> With a dense high-energy plasma, you're dealing with a turbulent fluid where any imperfection in your magnetic confinement will likely damage the container.
This is true of Tokamak type designs based around continuous confinement, but perhaps less so with something like Helion's design which is based on magnetically firing plasma blobs at each other and achieving fusion through inertial confinement (cf NIF laser-based fusion), with repeated/pulsed operation rather than continuous confinement.
No doubt the containment vessel will still suffer damage, but I guess it's a matter of degree: is it still economically viable to operate or not? That presumably needs to be verified experimentally by scaling up and operating for a sufficiently long period of time. Presumably they at least believe the approach is viable or they'd not be pursuing it (and they have an agreement in place with Microsoft to power one of their data centers with one of the early units).
There are serious theoretical objections to Helion's approach, so I am very skeptical of it. Stellarators, on the other hand, do not have any known theoretical obstacles and avoid the problem of plasma instabilities.
What are the theoretical problems? Aren't they already achieving fusion with their test reactors, so what's the problem with scaling up and producing net energy?
OK, and hobby rocketeers have nailed a SpaceX-style landing too, but so what?
Have you seen the videos of Helion's reactor - hardly a basement project. Sam Altman (OpenAI) also has personally invested hundreds of millions of dollars into Helion, presumably after some due diligence!
And also an r/fusion post documenting prior claims:
> “The Helion Fusion Engine will enable profitable fusion energy in 2019,” - NBF 7/18/2014.
> “If our physics holds, we hope to reach that goal (net energy gain) in the next three years,” - D. Kirtley, CEO of Helion in the Wall Street Journal 2014.
> “Helion will demonstrate net energy gain within 24 months, and 50-MWe pilot plant by 2019,” - NBF 8/18/2015.
> “Helion will attain net energy output within a couple of years and commercial power in 6 years,” - Science News 1/27/2016.
> “Helion plans to reach breakeven energy generation in less than three years, nearly ten times faster than ITER,” - NBF 10/1/2018.
> Their newest claim on their website is: "We expect that Polaris will be able to demonstrate the production of a small amount of net electricity by 2024."
I'm sure all this came up in any due diligence as well. They are on Series E after all.
A company with more than a decade of missed milestones is not the type that gets this many rounds of investment.
A lot of people really want fusion to happen, and happen sooner. I think that leads to people taking far higher risks with the capital. This sort of investment is always risky, but donating to a grander cause of technology advancement can be a reason for the investment, in addition to expected future value of the investment.
High-profile investors are not a signal that something will be successful, no matter how smart they may be in some other domain. Lots of people who should have known better invested in Theranos, too.
Helion's device is a toy. They have nothing that would let them scale past designs of the 70s and say a lot of very suspect things, like that they want to use worse fuel mixes and calling one of the oldest and simplest designs "new" and "unique".
This is simply wrong. They have a very clever combination of ideas that is truly new, so new that people are having a hard time understanding just how clever it is.
The IM video you posted, btw, is not to be taken seriously. It appears to be based solely on the Real Engineering video, not on Helion itself.
Nobody is building commercial plants any time soon; it's still in the experimental phase, with new discoveries happening almost every month.
I see it similarly to the difference between a car with a combustion engine and an electric one. Combustion engines are fully developed. We're reaching the maximum possible performance and utilisation. It's a dead end. However, with electric cars, for example, new battery development is far from over, e.g. sodium batteries.
And just off the top of my head, in fusion, the discovery of better electromagnets, as happened a while back, can quadruple energy output. It's not a dead end, and writing it off would be short-sighted.
You realize this is what people said about solar energy and nuclear energy at one point, right?
And before someone chimes in and says Nuclear doesn't make sense - it made sense at plenty of times and in different places.
It doesn't make sense in Western countries that are hell bent on making it as expensive as possible, strictly to ensure it doesn't get built, so we stick on fossil fuels as long as possible.
This is a meaningless argument people trot out all the time for things they just don't understand. Sometimes it applies but often it doesn't.
For example, people will dismiss arguments saying FTL is likely impossible because people once said that about going to the Moon. To be fair, there was some logic to the anti-Moon argument based on physics. The big change came with multi-stage rockets that solved the weight and thrust problems. And even then it's close [1].
There are good, physical reasons why FTL is highly likely impossible. You know, based on physics.
Likewise, the challenges to commercial fusion are also based on physics. Fusion reactions produce neutrons. Neutrons can't be magnetically contained. Neutrons destroy the container and, more importantly, lose energy from the system.
> I remain skeptical that fusion will ever be a commercially viable energy source. I'd love to be wrong.
It can be for deep space propulsion. The Orion project [1] demonstrated that you can power a spaceship so that it has both huge thrust and huge specific impulse with hydrogen bombs. The main issue with this project is the proliferation concerns. However, if you replace the bombs with pellets that are imploded by lasers, like the NIF experiment did [2], then you could get to the point where you can drive a rocket with non-weaponizable fusion explosions.
It feels like we forget quickly that we already have nuclear power around. And yet fission constantly suffers due to economics issues, stemming from needing to build massive plants full of steel and concrete, each facility being effectively bespoke, requiring fancy equipment and electronics, needing constant monitoring, having to plan to decommission an irradiated plant and dispose of radioactive waste, and... needing to mine and refine uranium.
Fusion would (maybe, unless we need helium-3) solve that last one, and only sorta solve the radioactive waste one. Everything else remains, perhaps even gets worse.
Still good to see people working on it, maybe it'd be useful for far-future spaceships or areas where solar/wind aren't feasible. But I don't see how it wins economically at large.
There are multiple potential fusion reactions; deuterium and tritium, like in our home star The Sun, is the most researched. There is also research into ones with lithium and other left-side elements. Finally, the one I think has the best future is aneutronic fusion with boron-11 plus hydrogen; it gives off three alpha particles, which can be converted directly to electricity. The leading model is field-reversed fusion. https://spectrum.ieee.org/aneutronic-fusion
True, I didn't state that clearly. What I meant was our sun fuses Hydrogen, whereas I feel Aneutronic fusion using Boron-11 will be better here on earth.
The overall process of proton-proton fusion within the Sun can be broken down into several simple steps:[4]
Two protons within the Sun fuse. Most of the time the pair breaks apart again, but sometimes one of the protons transforms into a neutron via the weak nuclear force, forming a positron and a neutrino along the way. The resulting proton-neutron pair is known as deuterium.
A third proton collides with the formed deuterium. This collision results in the formation of a helium-3 nucleus and a gamma ray. These gamma rays work their way out from the core of the Sun and are released as sunlight.
Two helium-3 nuclei collide, creating a helium-4 nucleus plus two extra protons that escape as hydrogen. Technically, a beryllium-6 nucleus forms first, but it is unstable and disintegrates into the helium-4 nucleus.
The final helium-4 atom has less mass than the original 4 protons that came together (see E=mc^2). Because of this, their combination results in an excess of energy being released in the form of heat and light that exits the Sun, given by the mass-energy equivalence. To exit the Sun, this energy must travel through many layers to the photosphere before it can actually emerge into space as sunlight. Since this proton-proton chain happens frequently - 9.2 x 10^37 times per second - there is a significant release of energy.[3] Of all of the mass that undergoes this fusion process, only about 0.7% of it is turned into energy. Although this seems like a small amount of mass, this is equal to 4.26 million metric tonnes of matter being converted to energy per second.[3] Using the mass-energy equivalence, we find that this 4.26 million metric tonnes of matter is equal to about 3.8 x 10^26 joules of energy released per second!
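Those last figures are easy to check with E = mc^2:

    # Checking the solar output figure above with E = m * c^2.
    c = 2.998e8            # speed of light, m/s
    mass_per_s = 4.26e9    # kg of matter converted each second
    print(f"{mass_per_s * c**2:.2e} J/s")   # -> ~3.8e26 J/s, as stated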
- The Lawson criterion is derived under the assumption of an equilibrium plasma, which doesn't hold true in any real tokamak/stellarator
- At the required temperatures, most of the energy would be in photons of thermal radiation, which don't get confined by the magnetic field, so when the plasma relaxes from high-energy beams to thermal equilibrium, it loses all the pumped energy through radiation
- With high-energy beams, a tokamak is essentially a particle accelerator, where electrons get in the way of collision
I'm thinking perhaps the best place for a fusion reactor is 93 million miles away. It's already up and running, and we're making huge strides in energy collection and storage...
But so long as there is a boatload of prestige and funding to be harnessed via fusion research, it'll be a Really Big Thing.
Centuries ago, an ambitious and clever alchemist could harness a fair quantity of those things via transmutation research. Vs. these days, we have repeatedly demonstrated the ability to transmute lead into gold. But somehow, there's no big talk about, or prestige in, or funding for scaling that process up to commercial viability.
There are a couple of factors in play with any research, including fusion. If there's money to be had for funding then somebody will research it.
But another more nefarious factor is the nexus of fusion energy research and nuclear weapons research [1]. To build and maintain a stockpile of nuclear weapons (specifically thermonuclear weapons) you need appropriately trained nuclear energy physicists.
Nuclear fusion as an energy source has major unsolved problems. Off the top of my head:
* The superconducting magnets required for confinement randomly stop superconducting (quenching).
* The fuels produce absurd amounts of radiation, and the helium-3 solution for that might as well be fairy dust: even if we converted the entire global economy to helium-3 production, we would not have enough by orders of magnitude to power hypothetical fusion reactors that would handle our needs. Strip-mining the moon for it is supposedly a way to get it, but defacing the surface of the moon for minuscule amounts of helium-3 per acre is unlikely to ever be profitable.
* The amount of radioactive materials produced from the experiments are many times those produced in fission reactors.
This is just off the top of my head. Until recently, I would have included the inability to produce more energy than we put into it on this list, but LLNL’s breakthrough a few years ago seems to have solved that. I suspect that someone with time to look into the practical issues involved in building a fusion reactor would find other issues (such as the design not being practical to use in a production power plant and thus further research being needed to make one that is).
I wonder if the only reason countries fund nuclear fusion research is to keep nuclear scientists from finding employment in the production of nuclear weapons.
As for the amount of radioactive material, the experimental reactors are several times the size of fission reactors. It is obvious that they irradiate far more material.
There is a very well researched youtube video that goes over these things:
Quenching isn't random. The waste has a half-life of 10 years. You can use an idiotic unit like total mass of material without taking into account the level of radioactivity or half-life, but that's not even worth discussing.
"Quenching isn't random" is a big change from "it doesn't happen". Quenching is random for all intents and purpose, which is why test fusion reactors have systems in place to try to detect and respond to it before very bad things happen. If they knew when it would happen, they would not need such detectors.
As for the waste, you are still going to have to pay per unit to clean it up, just like you would with waste from a nuclear fission reactor. You have far more of it since the volume being irradiated is far higher. Although it is measured by volume rather than by weight, a decommissioned MH-1A PWR power plant produced 89 m^3 of solid radioactive waste and 363 m^3 of liquid waste:
The youtube video stated that the cleanup effort for the Joint European Torus was projected to produce 3000 m^3 of waste, and the ITER reactor will be 10 times its size. Neither of them produced or will produce useful energy, yet they produced / will produce orders of magnitude more waste when it is time to decommission them than a decommissioned fission reactor did.
It also does not matter if the half lives are lower. It is still not going to be safe to be around that stuff long after both of us are dead and buried. The costs are effectively the same, since they will both be disposed in the same way.
I do not recall where I heard about the helium-3 situation, but I recall hearing some figures, and the prospect of having enough to run fusion reactors was not good. Doing a search suggests that I had been only slightly misled about the scarcity of helium-3. It is still extremely rare, but the US reportedly has used up to 70,000 liters of it per year:
The density of Helium-3 at STP is presumably 0.1338 g/L, based on Helium-4's 0.1784 g/L. That suggests that the total annual US industrial demand is ~9 kg. This is definitely not as bad as I had thought, but it is still fairly dire.
As per Wikipedia, a single 1GW nuclear fusion reactor that is 100% efficient at converting helium-3 into electricity would need 52.5 kilograms per year. 6.7 metric tons per year would be needed to power the entire US.
That assumes 100% efficiency, and if the conversion efficiency is anything like a nuclear fission plant, we would be lucky to see 3% efficiency, but we could be optimistic and assume something higher. Either way, it does not change the conclusion.
With an annual supply of ~9 kg, practical helium-3 fueled nuclear fusion reactors are not happening. Maybe the supply would be higher if you include other countries, and maybe we could get it somewhat higher if we try, but the reality is that helium-3 is an extremely rare isotope and the idea of using it as a practical fuel for a nuclear fusion reactor is a pipe dream unless the supply problem is solved and people figure out how to actually build a reactor that generates power using it, without encountering any of the other known problems that make this unlikely.
The difficulty in scaling supply is why people are discussing wild ideas like mining the moon, or even mining Jupiter. The supply situation is so constrained that the US government is reportedly buying 3 liters of it from a company that promises it will strip-mine the surface of the moon, with delivery by April 2029:
That said, the fairy dust remark was probably inaccurate, but the idea that our supply is short by orders of magnitude is correct according to mathematics.
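The reactor-side number above is straightforward to rederive from the D-He3 reaction energy (~18.3 MeV per fusion), assuming the same idealized 100% conversion:

    # He-3 consumption of an ideal 1 GW reactor (D + He3 -> He4 + p).
    MEV_TO_J = 1.602e-13
    e_per_reaction = 18.3 * MEV_TO_J   # ~18.3 MeV per D-He3 fusion
    he3_mass = 3.016 * 1.661e-27       # kg of He-3 per reaction

    kg_per_year = (1e9 / e_per_reaction) * he3_mass * 3.156e7
    print(f"{kg_per_year:.0f} kg He-3/year")   # -> ~54 kg at 100% efficiency,
    # in line with the ~52.5 kg figure above, vs ~9 kg/yr of US supply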
The problem(s) of scale are not only those of scaling up, but also scaling down.
One of the best and most unsung benefits of solar is that it is profoundly easy and intuitive to build a very small (ie, vehicle- or house-sized) grid.
In an increasingly decentralized and stateless world, it makes sense to look for these qualities in an energy source.
Yeah. It will be basically impossible for fusion to come anywhere near the cost of fission. However, calling solar energy cheap ignores the necessary battery storage cost, which is substantial.
The steam reactor I guess you might be describing is the tokamak, which I agree will be a dead-end technology.
There are interesting small fusion reactors that skip the steam step. They compress plasma magnetically, and when the fusion happens, the expanding plasma in turn expands the magnetic field, and the energy is harvested directly from the field. No steam and turbines.
I have no idea why you are being downvoted. The chances of a power source that _doesn't even work yet_ will out-compete one that is currently on both an exponential price decline curve and exponential capacity growth curve are pretty close to 0.
Quantum tunneling does not work differently in the core of the Sun than it does on the surface of the Earth.
So what is the difference between those two places? Temperature and pressure. In the Sun those arise from gravity. On the Earth, we need to create them mechanically.