
I do enjoy sharing this kind of news with all the fusion haters online. Fusion tech is legitimately cracking away at its "perpetually X years away" stigma. That perpetual barrier can very reasonably be viewed as a normal technology barrier now.


CEA themselves are saying fusion is not going to be ready by 2050.

Don't mistake skepticism for hate. I will be the first one to applaud a commercial fusion reactor. But fusion proponents often use its pending development as an argument against fission - a technology we already have and desperately need to adopt now.


As a big proponent of fusion: we should be spending more money and effort on it. We should be spending more money and effort on fission too. Sustainable energy sources shouldn't be fighting for scraps.


Yes, there are significant issues. None that we don't anticipate solving, but still. It will take time, and solving these issues in a resource-effective way so that it can actually work as a power plant will be a challenge.

> But fusion proponents often use its pending development as an argument against fission - a technology we already have and desperately need to adopt now.

If it helps, CEA is also doing a ton of R&D on fission (and batteries, among others). But there, the real issues are mostly political.


Now that we've made it to 2025, 2050 doesn't feel nearly as far away to me.


20 years ago I would have agreed with you. However today we have proof that wind and solar work, are cheap, and are useful. The world doesn't need fusion or fission, other technology is plenty good.

Unless you can do the science-fiction thing of turning off the sun and harvesting its hydrogen to power local reactors in Earth orbit -- providing the energy (light) we need without letting the vast majority escape our solar system unused -- that big fusion reactor in the sky provides all the energy we need.


Wind and solar power are proving very cheap and good at the margin, but they don't solve for the massive needs of a modern grid. Unlike plants, we do not necessarily have the option of turning society off when it's not sunny or windy.

Energy storage is far from a solved problem. Tesla produces ~40 gigawatt-hours of storage capacity in an entire year. California alone consumes ~800 gigawatt-hours of electricity in a day. Even if Tesla dedicated every bit of lithium it had to building storage capacity for just one state, and demand didn't increase, it would realistically still take over a decade before the lights could be kept on purely with renewables for a 24-hour period. At which point the first battery packs would be nearing the end of their service life.
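
A quick sanity check on that claim (a minimal back-of-envelope sketch in Python; both figures are this comment's own estimates, not verified data):

  # Comment's figures: ~40 GWh of storage built per year vs. ~800 GWh
  # for a single day of California's electricity consumption.
  production_gwh_per_year = 40
  target_gwh = 800

  years_needed = target_gwh / production_gwh_per_year
  print(years_needed)  # 20.0 years at a constant build rate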


This is an incredibly misinformed and outdated opinion. Tesla produces 40 gigawatt-hours of storage capacity per year only because demand for storage is currently quite low, and because China produces storage more cost-effectively. As demand increases, production will increase to match demand.

Most states currently only care about installing solar and wind -- not storage -- because they are still majority fossil fuels, and at the current moment it makes no sense to install storage if you still have fossil fuel to dislodge. The only real exception is California, which is installing storage, but its bottleneck is not the market's ability to deliver enough supply.

There are also many storage options beyond lithium ion if you only spent a moment to look.


Grid scale storage holds potential but for now it isn't economically viable for industrial base load. Residential customers are probably manageable but factories and data centers have to run 24×7: they can't shut down just because the sun isn't shining and wind isn't blowing. It's clear that the USA has to rapidly reindustrialize if we want to keep having stuff. For political and demographic reasons we won't be able to count on China as a reliable supplier much longer. Domestic electricity demand is going to grow much faster than the storage supply can keep up. The only realistic current options for that base load are a mix of fission and natural gas.

Maybe fusion will be an alternative someday but for now it's just a fantasy. We need to act based on what's proven to work today.


"Base load" can be achieved with renewables, batteries and natural gas. There have been lots of simulation studies demonstrating this. Not only is it achievable, it's also significantly cheaper and faster than fission with natural gas, even after accounting for all costs related to renewables such as the need for more transmission lines. This is especially true in the United States, which is uniquely blessed with abundant solar resources and well diversified wind resources.

Fission as a solution is something that is popular on social media, for reasons that are utterly mystifying to me. The arguments are invariably a few words that reach sweeping conclusions with no actual data backing them up, and lots of data contradicting them that the individual appears oblivious to.


I suspect the main reason fission is having a resurgence of popularity is that it maintains the current power structure of rare, large facilities controlled by a handful of actors. This has obvious advantages if you are a rich person concerned with keeping wealth concentrated in a few sets of hands.

Renewables are by their nature much more distributed in space, which makes them much harder to enclose and control in the way required to reproduce the current structure, especially as they are mainly being built by challengers who aren't really interested in forming monopolies with the fossil industry.


Well, it's theoretically possible to supply industrial base load with battery storage, but with the rate that demand is growing and the constraints on battery manufacturing, that just won't be realistic for many years to come. How many battery cells does it take to keep a steel mill running through the night, and how will that impact power prices for large customers? As for natural gas, we're going to increasingly need that as a chemical feedstock to sustain the reindustrialization. So that leaves fission as the only known long-term option for sustainably meeting a large increase in base load demand.


Your one and only argument is that supply of batteries cannot keep up with demand. This is not only false, it's actually the inverse of the truth, due to Wright’s Law.

Current supply of storage matches current demand. Supply is low only because demand is low. However, as demand increases, supply will continue to match demand, and moreover the price will actually decrease, because the learning curve is a function of cumulative production volume.

This has been a steady empirical phenomenon for 30+ years, and it's predicted by basic economics principles. It's not going to change now!

This is true for all battery types, but especially for sodium ion and iron air, which are made of abundant materials. Sodium ion in particular has very similar behavior and cost to lithium ion.

The confusion here is that you seem to be conflating manufactured goods (like batteries) with scarce goods like land or services, where there's a fixed supply that can't be increased and Wright's Law doesn't apply. This is not correct.

Storage is more like televisions or light bulbs, where you can basically make as much of it as you want, and the price will keep declining as more is made. And supply will always be there for demand, whatever the level of demand happens to be (in this case, a lot).


> How many battery cells does it take to keep a steel mill running through the night, and how will that impact power prices for large customers?

Steel mills run when power is cheap. They historically have run at night (and only minimum power during the day) because cheap power is available at night. Of course there are lots of different steel mills, older ones can't shut down - but modern ones don't run 24x7, they run when power is cheap. Even the old 24x7 ones did their yearly maintenance in December - when power demand is highest (Christmas lights).

Wind and solar are easily predicted a few days in advance with high accuracy, and thus the mills change their shifts/output to follow the cheap power. If it is cloudy with no wind, they will send their employees home (with pay) or do maintenance for that week while waiting for cheaper energy. It takes a tremendous amount of energy to melt iron, and so they manage this carefully because it makes them money. They can't deal with months of no production, but they can manage a week here and there.


Battery manufacturing has to be at massive scale even in a nuclear powered world, just to supply battery electric vehicles.

Converting every passenger car and light truck in the US to a BEV would involve enough batteries to store something like two days of the average grid output, which is more than would be needed for a cost optimal wind/solar/battery/hydrogen system for a 100% renewable grid.


> Converting every passenger car and light truck in the US to a BEV would involve enough batteries to store something like two days of the average grid output, which is more than would be needed for a cost optimal wind/solar/battery/hydrogen system for a 100% renewable grid.

Assuming the power stored in these vehicles can be reclaimed by the grid anytime they want?


No, I was just pointing out the scale of the required battery manufacturing.

It's an argument I like to use. When someone claims "we can't use X because of reason Y, we have to do Z instead" I look to see if Z also is hit by objection Y.

Another example of this is "renewables require too much material that we can't recycle", at which point I observe that the quantity of materials produced by society as a whole greatly exceeds what renewables would involve, even if the society is powered by nuclear. The US produces 600 megatons of construction and demolition waste a year, for example. Renewable waste would just be a minor blip on this existing waste stream. So, either recycling this waste isn't actually needed, or a putative sustainable nuclear-powered society has discovered how to recycle it, so just toss the renewable waste (which is almost entirely things like steel, aluminum, and glass) into that same recycling infrastructure.


"we can get clean energy by continuing to burn fossil fuels".

If this isn't about ceasing carbon emissions then none of this is necessary. Fire up the coal plants!


Calculate the area-under-the-curve (AUC) of two time series over, say, the next 50 years:

(1) the emissions of a 98% renewable + 2% natural gas grid that comes online in 6 years, assuming fossil-fuel generation during [t, t+6 years].

(2) the emissions of a 100% fission grid that comes online in 16 years, assuming fossil-fuel generation during [t, t+16 years].

If you insist on ignoring the temporal nature of cumulative emissions, then sure, you can arrive at a convenient but false conclusion. But any honest analysis will consider the emissions in that [t+6 years, t+16 years] interval.

(... it would also consider things like social licensing risks leading to early plant closures like what's happening in Germany, or the fact that nuclear will likely be paired with natural gas too because demand itself is variable, and overbuilding nuclear is expensive.)
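
A minimal sketch of that AUC comparison in Python (the emission rates are illustrative placeholders, not data from this thread; only the 6/16-year build times and the 2% gas share come from the comment):

  # Cumulative emissions over a 50-year horizon for the two scenarios.
  # All rates are made-up illustrative units per year.
  HORIZON = 50
  FOSSIL_RATE = 1.0  # emissions while waiting on fossil fuels

  def cumulative_emissions(build_years, residual_rate):
      # Fossil emissions during build-out, then the new grid's residual rate.
      return build_years * FOSSIL_RATE + (HORIZON - build_years) * residual_rate

  renewables = cumulative_emissions(build_years=6, residual_rate=0.02)   # 2% gas
  fission = cumulative_emissions(build_years=16, residual_rate=0.0)
  print(renewables, fission)  # 6.88 vs 16.0: the 10 extra fossil years dominate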


Ehm, you should be fair and not fudge the numbers in your favor. :-)

Start both with the same (current) percentage of renewables, and (1) have some realistic ramp-up of renewables to reach 98%, and (2) keep renewables rising more modestly in the fission version, while fading out fossil fuels in favor of fission.

You should also account for the carbon footprint of grid-level energy storage (yes, it will be needed, even with the natural gas plants), vs. the footprint of fission plants (undoubtedly quite bad).


There are so many more cost-effective grid-scale options like pumped storage. I think it's daft to "waste" the energy density of lithium batteries on stationary applications.


Battery storage has become cheaper than pumped hydro, I believe, at least for diurnal storage. The price declines in Li-ion cells have been remarkable, particularly recent decline in LFP cell prices.


Battery production/year is following an exponential curve right now. Tons of new research on promising new directions is continually being produced and incorporated into batteries. Projecting only continued production at the current rate isn't "realistic", it's wildly pessimistic.


Total electricity grid requirements are also growing -- it's at 30TWh annually and before the AI explosion was growing ~2-3% per year (let's conservatively estimate 2030 at 40TWh). Let's say 20% of that is satisfied directly from renewables without storage, leaving 32TWh.

Aggressive predictions have us producing ~6-10TWh of batteries by 2030, meaning we're still going to need about another 3-6 years to actually satisfy demand (ignoring the complexity of hooking up the batteries). On top of that, the batteries require rare earth metals that companies are gearing up to supply by strip-mining the ocean floor for polymetallic nodules, operations which have a very real risk of completely destroying deep ocean life. It seems to me like it's slow and potentially more ecologically destructive than even global warming. Is it really wise to be betting on batteries at this scale vs. tried-and-true nuclear fission, which doesn't carry any of these risks?


We can make an effectively unlimited amount of battery storage, especially sodium ion or iron air (which don't need ocean floor mining...). There are no practical limits on the timescales of ~10-20 years.

What people forget is batteries are a manufactured good, which follows Wright's Law. Manufactured goods (like energy storage, TVs, lightbulbs) obey different economic principles to scarce goods (like land, services, or goods with scarce inputs), and they have effectively unlimited supply. The supply is strictly set by demand.

Aggressive predictions of ~6-10TWh/year of batteries in 2030 are more predictions of demand, not so much predictions of supply. If market demand in 2030 is 30TWh/year, then that's what the market will produce. But don't blame manufacturers for the fact that demand in 2030 will only be 6-10TWh/year! And don't confuse this for a sector's inability to increase supply!

The response when seeing a "6-10TWh/year" prediction should be "how can we incentivize demand so that this number is 30TWh/year instead".


You didn't address the need for rare earth metals. Can you link sources talking about the 'unlimited amounts of battery storage'? I was also under the impression (albeit uninformed) that battery storage was not a solved problem, either technically or ecologically.


There is no need for rare earth metals for stationary storage. See sodium ion, which performs similarly to lithium ion and is only slightly more expensive (but that cost differential has nothing to do with the product itself; it's because of economies of scale, and Wright's Law has been operating on lithium ion for longer).

Lithium ion is preferred for vehicles because it's lighter, but again we are talking about stationary storage, so the extra weight of sodium ion isn't a problem.

The technology is solved, and the materials needed to make it are abundant. It's all about demand. If the demand is there, the industrial capacity will follow. But right now, the market is only demanding about 3TWh/year of storage, and so that's how much industry is producing.


> The technology is solved, and the materials needed to make it are abundant. It's all about demand. If the demand is there, the industrial capacity will follow

It takes a lot of time for new battery technologies to scale and disrupt existing ones, and entrenched players have an incentive to keep competing. Sodium ion, iron air, etc. might replace lithium ion on a 30-year time scale, but lithium ion will continue to drive down costs and increase capacity to compete, and it has significantly more revenue to fund this by being the only player in the market. So it's not clear when alternative batteries will start to replace lithium ion, but at scale it's unlikely to be a quick process. And please don't pretend it's all a demand-side problem. It takes time to build out new factories, from manufacturing all the equipment needed to acquiring and training employees. There's plenty of demand for cheap batteries, and the ability to manufacture simply isn't there yet; it's being brought online. Oh, and that capacity being added? It's all lithium ion and requires a long payoff for that investment. Lithium ion is potentially going to be a significant ecological debt, worse than fossil fuels, if the ocean floor strip mining gets going.


Well yes that is what happens when the market is left to its own devices and external costs aren't accounted for. The solution isn't to abandon storage it's to embrace the many options out there that don't require rare earths.

And it is all a demand side problem. If the world wanted to buy 10 or 20 TWh a year at current market prices, that's how much would be produced. But the world doesn't want to do that and hence that much isn't produced. This is Econ 101 for goods with non scarce inputs. It doesn't take ten years to scale up production for commodity goods.


Since we're teaching each other first principles, bringing a new mass-scale battery manufacturing facility online to full production typically takes anywhere from 2 to 5 years. Planning takes ~1-2 years, construction & setup takes ~1-2 years and ramp to full output takes 6 months to 1 year. And since we're on Econ 101, investments require payoff so all of this is done carefully to not tank the price of batteries by overproducing supply. In control systems terms, supply side investments always aim for undershooting demand as overshooting hurts how much money you make and risks destabilizing your market.

As for scarcity, inputs to lithium ion ARE scarce, which negates your entire model. Pretending they aren't is where you're making a mistake. Lithium, cobalt & nickel are relatively scarce and the mines for them have to scale up to meet demand as well. You've also got a workforce to train to do the work, which takes time & is also input-constrained. That's why there are massive NEW lithium mines being opened in the US & elsewhere to extract existing reserves to meet the growth in lithium ion batteries. If the world thought that sodium ion or iron air was the immediate future, you wouldn't see these massive large-scale investments into lithium. Lithium-ion batteries are going to be a large and growing market for decades, which brings me back to the strip mining of the ocean floor that's coming to support that.

Wright's law, by the way, isn't an inevitable effect that goes on forever. At some point your exponential plateaus and you no longer see such an exponential decrease in pricing. That's why processors aren't getting cheaper and compute isn't scaling up quite the same way as in the early days. There's only so efficient you can make something.


Hang on. So the reaction to companies intending to perform actions that will destroy the ecosystem of the oceans is to prevent more demand? Why this instead of just forbidding that mining?


Well, without that mining, batteries will run out of cost-effective materials quite quickly, hampering the ability to meet the demand necessary to decarbonize the grid. There are also complicated legal matters, since a lot of this happens in international waters. Oh, and the body that nominally regulates the matter has clearly been subject to regulatory capture and is handing out mining licenses left and right. So while it might be nice to hypothesize about what a sensible regulatory framework might look like, what's actually happening is "full steam ahead" mining. It's really bleak.


Also, if we're planning for the long term: wind and solar sound like bad options going into major global catastrophes like large asteroid hits or a nuclear war. It'd be better as a matter of principle to be using systems that can cope with massive climate disruptions. I like to bring up https://en.wikipedia.org/wiki/Year_Without_a_Summer - an event like that will happen sooner or later, and it'd be pretty rough if we'd all gone too heavy on solar.

One of those hopefully-you-don't-need-it concerns, but it is starting to become more pressing with the uptick in wars and unrest that seems to be going on.


More[1] reading material. The volcanic winter of 536 made that year one of the worst in history for humanity.

[1]: https://en.wikipedia.org/wiki/Volcanic_winter


Sorry, how is having a few, very delicate, power sources more resilient than an abundance of mechanically simple and widely distributed power sources?


They work if light is massively cut down for 12 months. And can be fortified to the nth degree.


This has to be the most hilariously desperate anti-renewable argument yet.


In what way is it anti-renewable?


"Renewables (or, at least, PV) are bad because they won't work after a K/Pg-level asteroid impact."


Why would that make them bad? They're still good. It just isn't clever to rely only on one source of power. It is like chicken being a good thing to have in the fridge. That doesn't make a sack of potatoes in the cupboard bad.

On a 1:500 year time horizon we know there are threats that dim the sun (possibly quite a bit shorter now that nuclear weapons are on the table and we seem to be incapable of dealing with that threat productively - the number of actors with nukes is growing). Planning for that isn't anti-renewable, it is just cautious.

And nobody was talking about K-Pg events. You'll notice the years quoted were all after the Roman Empire was founded.


> The world doesn't need fusion or fission, other technology is plenty good.

It's not. If it was the world wouldn't be using 140k TWh of fossil-fuel-produced energy[1], and would be using a lot more than 9k TWh of renewable energy[2]

[1] https://ourworldindata.org/grapher/global-fossil-fuel-consum... [2] https://ourworldindata.org/grapher/modern-renewable-energy-c...


> wind and solar work, are cheap, and are useful. The world doesn't need fusion or fission, other technology is plenty good

Which is why we aren't building record-setting amounts of natural gas infrastructure, oh wait...


We (the US) aren't building record-setting amounts of natural gas capacity. That was 20 years ago.

https://headwaterseconomics.org/wp-content/uploads/HE_electr...


At the risk of coming off as a nay-sayer, let's say the engineering hurdles related to fusion power generation are overcome. How are the presumably high upfront capital costs going to compare with the ROI?

That is, it would seem likely that fusion power would be costly to build. It would also seem apparent that if it were to fulfil its promise, then the power it generates would be sold at or below current prices. That would then seem to imply a lengthy time to make a return on the initial investment. Or am I missing something else with this equation?


> return on the initial investment.

It's not only the initial investment. Half of the fusion fuel is tritium, which is one of the most expensive substances on Earth (a Google search finds that the price of tritium is about $30k per gram [1]). For comparison, fission reactors need enriched uranium, and that costs only about $4000 per kilogram [2]. People have the idea that fusion produces many times more energy than fission, probably because fusion bombs have a higher yield than fission bombs. This is not true. The most typical fusion reaction involves one deuterium and one tritium and yields 17.5 MeV from a total of 5 nucleons. A fission reaction involves one neutron and one atom of U-235 and yields 190 MeV from 236 nucleons. So fusion yields about 4.3 times more energy per nucleon. That's respectable, but in the popular imagination fusion yields 100 or 1000 times more energy than fission, so the fuel cost can be neglected. Nothing could be further from the truth.

[1] https://www.google.com/search?q=tritium+price

[2] https://www.uxc.com/p/tools/FuelCalculator.aspx
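
The per-nucleon comparison above, written out (a worked check of the comment's own figures):

  # Energy per nucleon for each reaction, figures as quoted above.
  dt_fusion = 17.5 / 5        # MeV/nucleon: D + T -> He-4 + n, 5 nucleons
  u235_fission = 190 / 236    # MeV/nucleon: n + U-235, 236 nucleons
  print(dt_fusion / u235_fission)  # ~4.3x, nowhere near 100-1000x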


The myth of unbounded / free energy from fusion comes from being able to use any old hydrogen atoms, rather than the much rarer deuterium and tritium.

Perhaps one day we'll get there, but I worry that the current advancements using the rarer isotopes will end up proving to be a dead end on that road, much like so many attempts at AGI. In the short term I suspect we'd have better odds with getting thorium reactors to be economical.


Deuterium is not rare at all. There's enough in your morning shower to provide all your energy needs for a year.

https://dothemath.ucsd.edu/2012/01/nuclear-fusion/

Tritium is rare but lithium isn't, and we can make tritium from lithium using the neutrons from fusion. (We also get tritium from fission plants, which is how we'd build the first fusion reactors.)


> we can make tritium from lithium using the neutrons from fusion

Each fusion reaction consumes one tritium atom and produces one neutron. If that neutron hits a lithium atom, it can split that and produce a tritium atom. If everything goes perfectly and there are no losses, then you get a 100% replacement of all the tritium that you consume. If you have a 90% replacement ratio (highly optimistic), you essentially lower the cost of your tritium fuel by a factor of 10, so from $30000 per gram to $3000 per gram, or $3 MM per kilogram.

> We also get tritium from fission plants

Yes we do. Mainly from CANDU reactors. There are 49 CANDU and CANDU-like reactors in the world, and each produces less than 1 kg of tritium per year. According to [1], a 1 GW fusion power plant would consume about 55 kg of tritium per year. So you'd need to run more than 50 fission power plants to operate one fusion power plant. Most people who dream of fusion think that fission will become irrelevant, not that you'll need 50 fission power plants for each fusion power plant.

[1] https://www.sciencedirect.com/science/article/abs/pii/S09203...
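
Putting those numbers together (a sketch using only the figures quoted in this subthread; the 90% replacement ratio is the hypothetical from the comment above):

  # Effective tritium fuel cost with imperfect breeding.
  tritium_cost_per_g = 30_000        # USD, market price quoted upthread
  replacement_ratio = 0.90           # hypothetical, "highly optimistic"
  print(tritium_cost_per_g * (1 - replacement_ratio))  # 3000 USD/g, ~$3M/kg

  # Supply side: ~55 kg/yr consumed per 1 GW fusion plant, <1 kg/yr per CANDU.
  print(55 / 1)  # >55 fission reactors feeding one fusion plant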


That's why fusion blankets for D-T reactors use lead or beryllium as neutron multipliers. CFS for example uses FLiBe molten salt. Doing it this way a tokamak can not only sustain its own tritium supply, but periodically provide startup fuel for additional reactors.

Initial tritium load for a small, high-field reactor like CFS is much smaller than for ITER. And I'll note that the paper you linked has this conclusion:

> The preliminary results suggest that initial operation in D–D with continual feedback into the plasma of the tritium produced enables a fusion reactor designed solely for D–T operation to start-up in an acceptably short time-scale without the need for any external tritium source.


> CFS for example uses FLiBe molten salt

Ok, let's talk about that. For those who are not familiar, CFS stands for Commonwealth Fusion Systems, a startup with links to MIT. CFS aims to build a fusion reactor similar to ITER, but many times smaller, the secret sauce being that they use superconductors to achieve high magnetic fields. Back in 2022 some of the MIT guys got an ARPA-E grant to investigate the use of FLiBe to achieve a tritium breeding ratio higher than 1 [1]. The results are in [2], published in January 2025. Here are some quotes:

> The long-term goal of LIBRA is to demonstrate a TBR ⩾ 1 in a large volume (1000 kg ∼ 500 l) of FLiBe molten salt using D–T neutron generators. Note that a full-scale LIB in an ARC-class FPP will require ∼250 000 l of FLiBe, hence the importance of understanding tritium behavior in large salt volumes.

ARC is the fusion reactor designed by CFS. This paper states that it will need 250,000 liters of FLiBe. This is an insane amount. To understand how large it is, consider this: the ARPA-E project, which took 3 years, used a quantity of 100 ml, so 0.1 liters.

Anyway, what breeding ratio was achieved? 3.57 x 10^(-4), or 0.0357%. It's a long way to go from here to 1.

I'm not saying it's impossible, but too many things related to fusion are just "engineering details".

[1] https://arpa-e.energy.gov/programs-and-initiatives/search-al...

[2] https://iopscience.iop.org/article/10.1088/1741-4326/ada2ab/...


250000 liters of FLiBe contains 44 tons of beryllium, and the current annual production is 220 tons, so it's possible but not cheap.
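
Rough check of that beryllium figure (a sketch; the molten-salt density and molar masses are standard values, while the 250,000 l volume is from the paper quoted above):

  # Beryllium content of 250,000 l of FLiBe (Li2BeF4).
  volume_l = 250_000
  flibe_tonnes = volume_l * 1.94 / 1000          # ~485 t at ~1.94 g/cm^3
  be_fraction = 9.012 / (2 * 25.939 + 47.008)    # Be over Li2BeF4 molar mass
  print(flibe_tonnes * be_fraction)              # ~44 t of beryllium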


Wow. The ARC reactor is supposed to deliver 270 MWe (if and when it gets built) [1]. So 20% of the world's annual production of beryllium for one power plant that would deliver about one quarter of the power of an AP-1000 fission reactor.

[1] https://en.wikipedia.org/wiki/ARC_fusion_reactor


And the original ARC design used 958 tons of FLiBe.

https://arxiv.org/pdf/1409.3540


Beryllium is very efficient as a neutron multiplier, but it is also extremely rare. It would not be acceptable as a consumable for energy production, as it is much more useful for other purposes.

In the Solar System, the abundance of beryllium is similar to that of gold and of the platinum-group metals. On Earth, the scarcity of beryllium is less obvious only because it is concentrated in the continental crust, where it is relatively easily accessible, even if its amount in the entire Earth is much smaller.

Lead neutron multipliers would be preferable, because they only inter-convert isotopes of lead, so the lead is not destroyed the way beryllium is.

However, lead used for this purpose becomes radioactive, with a very long lifetime, unless expensive isotope separation were used for it.


I mean, it has to destroy the lead eventually, since lead is being used as a source of the extra neutrons. An individual lead nucleus will be converted to lighter lead isotopes by (n,2n) reactions, but eventually it will reach Pb-203 which decays to Tl-203. Presumably the thallium (and then mercury) will also be subject to (n,2n) reactions.


No, it comes from foolishly thinking that the cost of fuel will dominate cost of energy. That doesn't require fusion of protons; deuterium and lithium are cheap.


I don't know much about this, but I assume that the tritium will be created somehow while fusion is running [1]

[1] https://en.m.wikipedia.org/wiki/Deuterium%E2%80%93tritium_fu...


Agreed. I think fusion power would be great, but the sales pitch of 'limitless free power' just isn't true. The thought experiment I use is this: Let's imagine coal is magically free in every way. How does my power bill change? The answer is "barely at all" because the cost of utility electric power is mostly in distribution. We pay around 30c/kWh while the wholesale energy price is more like 2c/kWh.

It'll still make a difference in large scale energy intensive stuff, like desalination, aluminium refining, etc. but the average punter is going to save a lot more by installing solar panels.
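
That thought experiment as bare arithmetic (using only the comment's own retail/wholesale figures):

  # If the wholesale energy component became free, what happens to the bill?
  retail = 0.30     # $/kWh paid by the customer
  wholesale = 0.02  # $/kWh wholesale energy component
  print(wholesale / retail)  # ~0.07: free generation cuts the bill only ~7%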


We'll never know until it comes (if it ever does), but there's reason to believe fusion could be >50% cheaper than fission.

That would still be more expensive than solar and wind (by 100% or more) -- but I am skeptical that in the same time frame those sources will be able to take over baseload generation.

It's really comparing apples to oranges.

Plus, it's a very hypothetical future. Anything could happen between now and then.


What is your exact scenario for cheap fusion?

Because IMO the only approach that is even capable of delivering here is the Helion one (=> direct conversion). And that design is incredibly far from ready; the whole approach is completely unproven and their roadmap is mainly wishful self-delusion (judging by past milestones, like "first 50MW reactor finished by 2021" -- there is no 50MW reactor even now).

From my PoV, ITER-style tokamaks are the most conservative/certain design, and also the furthest along by far. That would imply:

=> Cryogenics for the magnets

=> big high-temperature vacuum chamber for the plasma

=> all the thermal/turbogenerator infrastructure needed in conventional plants

=> super high neutron radiation flux (this is a problem)

I just don't see where you save anything. This is basically just a fission reactor, only an order of magnitude more complicated and demanding. I absolutely don't see how it could ever get significantly cheaper than conventional nuclear power plants.


A fission reactor has to be big, has to deal with storage of a lot of nuclear waste, and must implement a lot of expensive measures to stop a runaway reaction in case of unexpected events.

Fusion has none of this. Assuming Q >> 1 is demonstrated in a design that can be commercialized, the next biggest problem is dealing with high-energy neutrons on a scale never experienced before, with potentially much faster degradation of materials than anticipated, leading to prohibitive operational costs.


That's a problem, but it's not necessarily even the biggest problem. Other huge problems include the sheer size of the machines per MW of output (and hence cost per MW), coupled with their dreadful complexity and the difficulty of keeping them operating once they become too radioactive for hands-on maintenance. Designs typically just assume the reactors will be reliable enough, when there's no empirical evidence to support that (and the one study that tried to estimate uptime based on analogies with other technologies found the reactor would have an uptime percentage of just 4%!)


Helion's promised dates were conditioned on funding, which they didn't actually get for several years. Adjusting for when they did get funding, they're pretty much on track.


Even if fusion is an expensive power source, it may still be desirable in areas which aren’t well suited to wind or solar.


If we figure it out, it might end up being cheaper than fission eventually.


Compared to fission? It's still quite unclear that fusion will provide improvements over fission.


Without any of the meltdown concerns a fusion powerplant is a lot simpler to actually build than a fission plant. It has a small fraction of the security, reliability, regulatory, etc concerns (not none, just way way less). Unless it's so marginal that it's barely producing electricity I'd be pretty surprised to find out we had Q>1 fusion and yet it couldn't out compete fission anywhere fission is practical.


Modern fission designs mitigate meltdown concerns well enough that I'm not sure the safety & security around a fusion plant would actually be any better/cheaper, although public sentiment may be enough of an advantage. Tritium & neutron activated metals are dangerous enough to require keeping the traditional nuclear plant safeguards IMO. As far as proliferation concerns go, I don't see any reason you couldn't breed plutonium in the neutron flux of a fusion reactor, & the tritium is clearly viable for boosted warheads.


Modern fission designs plausibly mitigate meltdown concerns well enough...

To move that "plausibly" into "actually" you have to have very careful design review by regulators. Very careful review of construction to make sure what is constructed is what was designed. And so on and so forth. It's a lot of friction that skyrockets costs. Legitimately. People inevitably attempt to cut corners, and there's no way to make sure they aren't on the safety parts without checking. Actual currently regulatory costs seem to bear out the difference between these, with SMR people spending large amounts of money to convince regulators they didn't screw up, vs Helion fusion being "regulated like a hospital".

I'm not saying fusion has no proliferation concerns. But it's the difference between "low grade nuclear waste, or a very high tech very advanced program to weaponize a working reactor" and "even a broken reactor can be strapped to some explosives to make a dirty bomb". I can't say I'm very aware of how much proliferation concerns drive costs.

Public sentiment also helps.


A lot depends on the actual reactor design.

I was thinking more of large scale D-T fusion, e.g. the tokamak design, which requires breeding tritium & is expected to create a lot of neutron activated waste. The tritium is especially concerning, as it's roughly as deadly as polonium-210 & highly bioavailable in the form of super heavy water.

You're probably right for smaller aneutronic designs like Helion's. If they can actually be made to work, they'll be much safer.


That's astounding, I've never heard anybody claim that the reactors would be simpler before! Do you have any estimates of anybody working on the problem that thinks that?

Every scheme I have ever seen is quite a bit more complex than a fission reactor. Often, designs will depend on materials that do not yet exist.

That said there is a tremendous variety of techniques that fit under the umbrella term of "fusion," so I'm hoping to learn something more.


Not simpler in terms of technology, but simpler in terms of deployment, regulation, and security. Those are the majority of costs in fission power plants.


The majority of the cost in fission is in the massive construction build, change orders, logistics, massive concrete pours, welding, etc.

I've looked a lot into this in terms of how to get a project like Georgia's Vogtle to have cost less, or Olkiluoto in Finland, or Flamanville 3 in France. Big complex construction projects are expensive, and it's not clear at all to me that fusion would be simpler or smaller, or escape the rest of Baumol's cost disease that has been plaguing fission in highly developed economies.


The more plausible looking modern fusion companies tend to be designing very small reactors compared to those projects. Vogtle is 5000 MW. Olkiluoto is 1600. Helion is promising reactors that are 50 and can be shipped via trains by 2028 (or 2030, depending on how you read some statements/interpret what I just said). They still need some neutron shielding to actually operate them safely (boronated concrete, probably not shipped by train), but nothing on the scale of what you need for a fission plant.

(and other than that I echo elcritch's comments)


Helion also promised 50MW prototypes by 2021. It's 2025 and they still have no 50MW prototype. Fusion power roadmaps are generally exercises in wishful thinking, while fusion power startup roadmaps are basically gaslighting-as-a-service.

I still think it's worth researching and we'll get there at some point, but I'm not holding my breath -- the whole industry has overpromised in the past, continues to overpromise now, and will probably be irrelevant for de-carbonizing the grid by the time the technology is actually ready at industrial scale.

Mass media reporting on the whole sector is admittedly even worse; especially for uninformed readers without an engineering background.


I believe the promise you are referring to was contingent upon Helion getting funding that they did not get. It's not gaslighting to publish an ambitious timeline for a startup and be wrong, but that's not even what happened here. They said "if X (funding), then Y". X didn't happen.

I believe the current timetable is no longer contingent upon funding (since they've got the funding they think they need). It's no doubt still an optimistic startup timeline, a target, that they might well fail to achieve (even without the startup failing, just being late).


Meh. They promised 50MW by 2021 in 2018. In 2021, they got half a billion dollars, but the 50MW plant is still not running. But the bigger problem is that those are all just pretty prototypes; according to a 2018 ARPA report, magnetic field compression in the 40T range is needed for commercial viability. The various prototypes have pushed this from 4T to 10T (and presumably soonish 15T) since 2014. Extrapolating that trend is much less promising than Helion's roadmaps, and doing linear extrapolation there is probably doing them a favor...

It's a cool concept, but probably not gonna be viable anytime soon (if ever!).
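
For what it's worth, a naive linear extrapolation of those figures (the field values and years are from the comment above; the fit itself is purely illustrative):

  # ~4 T in 2014, ~10 T in 2025; target ~40 T per the cited 2018 ARPA report.
  rate = (10 - 4) / (2025 - 2014)   # ~0.55 T/year
  print(2025 + (40 - 10) / rate)    # ~2080 at this pace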


With regards to scaling, I think there are two components:

Does the physics change as they scale up the field strength? No one is really going to know until they try (unless we get a lot better at simulating plasma real fast). If it does, they lost a bet, but they lost it honestly, and as far as I can tell (not a physicist) it was a reasonably good bet to make.

Can they physically build the bigger magnets they need fast enough to meet their timelines (and everything else -- I understand they are currently bottlenecked on capacitors)? Apart from normal "startups are overly optimistic" issues, I don't see any reason to think they shouldn't be able to reliably predict how fast they can scale magnet size, or be limited to a linear rate. While they are big magnets, it's not exactly new physics.

I'm not sure I'd say they are "probably going to be viable" anytime soon either. I think they have a good chance, but "probably" as in ">50%" is probably pushing it. (Also depends on where you put the goalposts of course)

FWIW I believe that 2018 report was for a high gain low pulse rate plan that Helion rejected, and they are aiming for substantially lower strength magnets as a result. I can't find anything more than rumors to confirm that though.


Just to be clear: I'm not accusing Helion of being dishonest or even fraudulent.

It's just that from everything I know about the project, they still have a long way to go, and there are a lot of milestones to hit that are just pipe dreams for now (actually fusing He3, breeding it, net-gain energy extraction, ...).

I would expect progress to slow down significantly as the scale of prototypes and their complexity increases (like what happens for basically every engineering project ever)-- but progress is already slow/behind schedule to begin with...


That’d be interesting to learn more about. What I’ve seen always leans toward regulation driving costs.

Though I guess some of that infrastructure could be overbuilt due to excessive regulation.

Also much of the concrete and steel is needed for the containment domes. Fusion power likely wouldn’t require nearly as much protection. Perhaps just a fairly standard industrial building.


Regulation generally drives costs by making us build more. Generally safety systems and other redundancy.


I would guess the preventative maintenance over the lifetime of a fission reactor exceeds the initial build costs.


I think that it will depend on economies of scale.


People won't be afraid of fusion: fusion plants can't be used to make bombs, and while they could maybe explode, they won't poison the nearby land (or the whole planet) for decades or eons.


> fusion plants can't be used to make bombs

Helion's reactor, if it works, could become a source of the cheapest neutrons on the planet. It would greatly enable nuclear proliferation by providing neutrons for breeding of fissionable material for bombs.

A 50 MW DD reactor would produce enough neutrons to make half a ton of plutonium per year. Remember, none of these neutrons have to be turned around to make tritium, as they would have to be in a DT reactor.


I wouldn’t bet on a sane response to it. People are afraid of 5G, vaccines, and even masks.


IMHO, dislike of masks is built into us as a social species that place significant value on facial expressions. Makes sense from an evolutionary game theory perspective for societies to discourage them.

Easy to find research showing the detrimental effects of masks on communication, etc: https://pmc.ncbi.nlm.nih.gov/articles/PMC10321351/


Man I was doing ok this afternoon, why did you have to go poke a stick in people's totally rational responses to respiratory PPE?


There is a certain amount of "who cares about the cost" when it comes to fusion power. Nations will want to build them to lower or eliminate reliance on foreign energy, to address climate change concerns, and as a backup for renewables, and for other non-economic reasons. Many things that governments will want to fund that have nothing to do with directly "how much does the electricity cost?" or "when can we expect a return on investment?"

And the first generation will be expensive. That's how all new technology is.


Non-nation-state investors care about the cost and ROI.


And they’ll be subsidized such that they have a positive ROI


There's definitely an existential question around whether fusion will ever be able to beat renewables plus batteries, but who knows: with our energy demands ever increasing, at some point renewables may hit a breaking point in land cost.

I'm generally pro-publicly funded research. There is not any direct ROI on say the LHC, but it does fund advanced manufacturing and engineering work that might enable other more practical industrial applications. The ROI might be a century away.


> At the risk of coming off as a nay-sayer, let's say the engineering hurdles related to fusion power generation are overcome. How are the presumably high upfront capital costs going to compare with the ROI?

Does money even matter once fusion is attainable?


I'm not sure if you're being serious, but I'm going to assume you are. Let's say energy costs 1/10th it does today. That's far cheaper than I see anybody predicting fusion will be, but I think renewables will get there. How much does cheap energy change in the economy? What is bottlenecked by expensive energy at the moment? It turns out that matter, people, people's wants, still have a huge impact.

Make all energy free. What does that change? It lowers operating costs for many things, but up front capital costs are still there. Land still matters. Food still matters.

Money will still matter. Allocation of time, of resources, all that still matters a lot. Energy is big for the economy, but if it's free we shift our focus to other matters of logistics.


If energy was truly free, it would revolutionize the economy and would fundamentally change how money matters.


I definitely prefer spending the money on fusion over rushing a Mars mission. Fusion is probably cheaper than Mars and will actually benefit humanity. Which is not something I can say about going to Mars (or even the moon).


A Mars mission would benefit humanity, but less directly. The past lunar missions and space program benefited humanity in many ways.

For pure return on investment, I agree with your take.

Provided of course that any future threats to humanity as a single planet civilization don’t materialize. There’s a low and uncertain tail risk ignored in our calculation.


Are you saying that the benefit to humanity of a Mars mission is that if the Earth explodes, we have an uninhabitable planet (under any realistic expectations) to stay on?


No, he clearly said that a "second home for humanity" is of dubious (but potentially nonzero) value.

Rather, the main benefit would lie in the technological advances made in order to enable such a Mars mission in the first place (similar to advances during Apollo).


>Rather, the main benefit would lie in the technological advances made in order to enable such a Mars mission in the first place

I agree with this view, but the comment I was replying to only mentioned as a benefit that Mars could be a second home (which I find rather ridiculous).


> the comment I was replying to only mentioned as a benefit that Mars could be a second home

The first and second sentences of that comment literally say

> A Mars mission would benefit humanity, but less directly. The past lunar missions and space program benefited humanity in many ways.

And then it goes on to acknowledge the "second home" element, but only as a small consideration.


"that comment literally say" even thought it doesn't say that one of the benefit would be "technological advances" so in reality "that comment literally doesn't say it" and that's why I was asking.


No, for a successful Mars mission certain scientific progress has to be made. Unlike in the economy, in science such things trickle down to us mortals.


Economic advances don't trickle down to "us mortals"?

Dude, the relentless decrease in cost of manufactured items, this decrease that makes your current way of life possible, is driven by exactly that. Manufacturers are in life-or-death competition and we consumers reap the benefit as prices are driven ever downward.


No, I was making a joke about trickledown economics.


No, that’s not what I’m saying. That seems of questionable value. Unless some crazy tail event happens that makes it valuable.

The benefit to humanity is the technological advancement.


The planet Mars is a gift from God for humanity


That's what they said about lead!

A 'gift of God'?: The public health controversy over leaded gasoline during the 1920s: https://pmc.ncbi.nlm.nih.gov/articles/PMC1646253/

https://hal.science/hal-03924698/document


> The planet Mars is a gift from God for humanity

I bet I can guess the name of the god too!


God really lowered his standards after he created Earth.


“Well, created one nice one, and one desolate, irradiated, lifeless shit-hole! My work here is done!” Haha


There's just no economic case for fusion. It's useful research, but current fission does the job better, and we already have decades of proven reserves, centuries likely if we kept looking for new reserves... and then thousands of years from seawater extraction.

There are also many paths to improved fission: fast neutron reactors, thorium reactors, small fast neutron reactors for industrial heat, accelerator-driven subcritical reactors... Millions of years of fuel available, and new ways to use the output beyond boiling water for electricity.

Note that I'm not mentioning slow neutron SMR, they're mostly pointless and just an excuse not to build current and perfectly fine PWR/BWR/heavy water reactors.


I like the idea of the passively-safe, waste-reducing LFTR but it's still a materials science issue at this point, and there's no real solution in sight.

Fission still has this huge stigma about "nuclear=dangerous and bad" which clearly isn't true with the growing number of passively-safe designs... but nobody wants to fund development of those into proper commercial reactors.

Meanwhile, fusion is still different and futuristic enough to have support from governments and the general public.


> I like the idea of the passively-safe, waste-reducing LFTR but it's still a materials science issue at this point, and there's no real solution in sight.

Seems ironic that in a thread about fusion with loads of difficult technical challenges that will still require decades of research after 60 years of investment and research have already been poured into it, a minor issue of slight corrosion in LFTR requiring maybe a few years of research is seen as an insurmountable obstacle with "no real solution in sight".


The solution may already be here, in the form of ceramics already used in aeronautics. A French startup is working on a small reactor for industrial heat with them.


Yeah but I still think it would be a great scientific achievement and should be pursued.

Fusion has better security properties than fission, so perhaps it will find some use case in the far future.


I think the funding has had a modest stimulus, and funding was always the locus of causation for "perpetually X years away." Private fusion especially (though I do think their claims are somewhat overstated).


It's insane how many people like that are out there. "Fission is bad, fusion is bad, we should only do renewables." C'mon, fission brought us where we are and fusion might be the future. I believe they both deserve further research and improvements.


It’s a common fallacy: “$thing is good, new and exciting, therefore everything else is old and rubbish”. The pattern is very easy to see if we pay attention. It’s very common in tech circles, where people tend to be easily excited about new things.


This has always seemed wild to me. New tech always, always sucks. In complex problem spaces it takes years to effectively identify use cases, edge cases, and bugs and get all that shit ironed out, and yet the enthusiasm you speak of is pervasive.


That's because tech fetishists are buying hope and optimism. If they actually cared about how to build the future they would be into engineering:

Paperwork, standards, logistics, non-destructive tests, monitoring, certification, other "boring" stuff.

Tech people LOVED bitching about how complicated the USB-C standard is, how it does too much, etc.

Guess what? As a consumer, I can plug pretty much anything into anything else, use literally any brick to charge nearly any device, deliver outstanding amounts of wattage over cables the size of headphone wires, for pretty cheap, and USB-C docks that you just plug into whatever and things just hook up and function.

It does that because of the millions spent on human beings spending time to work out bugs, work around edge cases, discover what people tolerate and care about with the standard, etc.

Consumers ignored all the complaints about it being complicated and just fucking used it and it's ubiquitous and works for pretty much everyone and the only people who have bad experiences are the ones buying exclusively fraudulent cables off amazon and only some of those people are hitting those problems!

I can just plug a cable into the power port and get HDMI out of my steam deck. Holy shit.

THAT'S the future.


I do enjoy how mindless some of the fusion advocacy is.

Why do you think a result like this would make anyone less skeptical of fusion? Ability to run a device for this long is not the obstacle to success for nuclear fusion. This is just another vastly overhyped "breakthrough", which we seem to have every week.

I've followed fusion for probably longer than you've been alive, and there are fundamental showstoppers for the common approaches, particularly tokamaks and stellarators. Fusion may have a chance with unconventional approaches, like Helion's, but the consensus approach looks like an exercise in groupthink that won't lead anywhere.


>Why do you think a result like this would make anyone less skeptical of fusion?

Just 9 days ago: https://news.ycombinator.com/item?id=43000301

>>Ability to run a device for this long is not the obstacle to success for nuclear fusion.

What an odd take. Do you also consider the list of flight endurance records to be immaterial to aircraft evolution?

https://en.wikipedia.org/wiki/Flight_endurance_record


Focus on relatively unimportant subtasks has a name. It's called "bikeshedding".

This achievement is relatively unimportant. It's not the major issue that would block a DT fusion reactor. As such, achieving it doesn't move the needle much on the plausibility of DT fusion in tokamaks.


This is not the definition of bikeshedding.


Bikeshedding is focusing on triviality at the expense of important issues. That is just what is happening here. The showstopper issues with tokamaks are size, cost, reliability, materials. This has nothing to do with any of those.


>The showstopper issues with tokamaks are size, cost, reliability, materials.

Plasma runtime in not a showstopper then?


It's an issue that wasn't high risk. I mean, sure, check off the box, but don't pretend this moves the needle much. This was minor compared to those issues that have no clear solutions, or reasons to think there won't be solutions.


Agreed with all of this. And, there’s an implicit criticism of science journalism here. Any article that suggests useful fusion reactors are X years away should address the massive unsolved engineering problems that have to be surmounted. But no news source is going to spend five or six extra paragraphs explaining neutron metal activation or hydrogen embrittlement.


I don't hate it, but I'm not a fanboy either. Imagine you could have nuclear fission where uranium is found in nature ready to go straight into the reactor. Even in that case, nuclear fission could not beat solar or wind ROI.

Even if nuclear fusion had the advantage of free fuel, the costs of building and maintenance alone could make it impractical. As of today it's not enough to have a positive net return; you need an LCOE of maybe $60/MWh (and falling). Current estimates put fusion at $120/MWh.

If it can't keep up with solar and wind's rate of falling prices, it might only be suitable to replace fission power (whose cost is not falling), about 10% of the grid. And there have been literally billions spent on research.


> Even in that case, nuclear fission could not beat solar or wind ROI.

Neither solar nor wind is free. There are costs associated with, e.g., building, shipping, maintaining, and decommissioning these things (and hopefully at some point recycling, but that's not solved). Looking at the whole picture, these costs are not that different. These technologies are complementary; they have very different characteristics.

> Current estimates put fussion at $120/MWh.

Current estimates are completely unreliable, because no industrial-scale demonstrator has been built. They are a useful tool for planning and modeling, but not solid enough to build an industrial strategy on.


Did anybody say they are free? But the costs of running solar or wind are way lower than the costs of running fission, or the costs that would likely come with running a fusion plant. In case you don't know what ROI means, it is return on investment (i.e. building, shipping, maintaining, decommissioning...).

As of today, we are closer to mass batteries as the renewable companion than to fusion, at least in terms of ROI. If both end up competing for lithium, it would go to batteries unless fusion becomes dirt cheap.

Current estimates are useful because they mark the starting point for fusion: it's at around $120/MWh. It needs to reach $80 to replace fission, and $60 to replace batteries, assuming batteries don't get a better ROI.

The same numbers were useful 30 years ago for solar: it was fully functional, but not yet economically sound. It was not much more than a toy and a promise (as fusion is today). Only when prices made sense did it turn into a serious energy source.


About lithium: DT fusion needs mostly Li-6. If it were separated, batteries would work just fine with Li-7.

I recall a story of some lab that was trying to make a lithium-based neutron detector. It wouldn't work, and when they investigated they discovered the lithium they had bought was almost pure Li-7. It was surplus sold back into the chemicals market from the US hydrogen bomb program (which needed Li-6).


I don't think current costs for fusion are useful for modeling, or really anything, because there's nothing there yet. We don't even have prototypes.

But if there is not a clear and speedy path to get fusion to $30/MWh, it's not going to make it. Batteries, solar, wind, and geothermal are all busy deploying and getting cheaper every month, year, and decade. The grid system possible with 2035's solar and battery tech is going to be completely unimaginable to today's grid ops.


> As of today it's not enough to have positive net return, but to have a LCOE of maybe $60/MWh

If you don't count externalities (see the cost of firming intermittency [1]).

> (and going down).

Not in the last two years, according to LCOE+ 2024. The main culprit is inflation, but the curve was nearing flat anyway.

[1]: https://www.lazard.com/media/gjyffoqd/lazards-lcoeplus-june-...


When I go to https://model.energy/ and solve for the cost of energy from renewables + storage in the US, using 2030 cost assumptions, the cost is less than $0.05/kWh. This is providing synthetic 24/7/365 baseload power, so all intermittency has been taken care of.


Problem solved then?

We should give the folks at model.energy the next peace prize for their effort.


You don't have to go that far, but you could at least listen honestly to what that model, and what more complex models, are telling you.


You're replying to a post I made saying that the prices quoted from the Lazard report don't reflect all that's written in the report.

Your answer gives a model unrelated to the figures I was discussing, with extremely aggressive prices [1] set as hypotheses and zero network costs factored in. Sure, I can accept it as a lower limit for the cost of a system, but that's not very useful information, and you're not quoting this price as a lower limit either.

I don't see an honesty issue here, just someone believing that spherical cows are going to produce milk tomorrow.

[1]: e.g. the Lazard report quotes utility PV at $29-92/MWh, while your tool quotes it at 21.7€/MWh.


Solar is cheap, but it's only a supplementary power source. If you add in firming costs it becomes much, much more expensive than fission.

The elephant in the room is natural gas which is the true competitor to fission and is still dirt cheap in the US.


No, with proper system design solar + wind + storage is cheaper than new construction nuclear.

There's a reason China is installing two orders of magnitude more solar than nuclear these days (nameplate capacity basis).


China is also the top consumer in the world of coal and they continue to break their record every year.

On the margin I don't argue that renewables are cheaper, but you still need a way to generate base load power on demand.


You can generate baseload power from non-baseload sources. Renewables + storage can do the whole job. There is no need for anything called a "baseload source". Indeed, that label indicates a deficiency, not a capability: it's a source that has to run (almost) all the time to make economic sense.


China needs every bit of power generation it can build.


I've seen cost estimates around there for tokamaks. If Helion actually works, their estimate is more like $20/MWh, and it looks pretty plausible given their reactor design. They would have relatively low neutron radiation, direct electricity extraction without a turbine, factory-built reactors transportable by rail, and no particularly expensive components like superconductors or fancy lasers.

Some of the other designs also look relatively cheap. Tokamaks are just the one we understand the best, so we have the highest confidence that they'll work.


We have highest confidence that tokamaks will "work" in the sense of reaching a physics goal. We have very little confidence tokamaks will "work" in the sense of reaching an engineering/economic goal. Too often the former is confused with the latter in these discussions.


No argument there, I just didn't spell it out since we were already throwing around specific levelized costs anyway.


Actually, this only reinforces "fusion is only 10 years away".

I blame journalists for not being able to properly report on this subject.



