OpenAI and Nvidia announce partnership to deploy 10GW of Nvidia systems (openai.com)
458 points by meetpateltech 1 day ago | 602 comments




Framing it in gigawatts is very interesting given the controversy about skyrocketing electric prices for residential and small-business users over the past three years as a result of datacenters, primarily driven by AI growth. If, as another commenter notes, this 10GW is how much Chicago and NYC use combined, then we need to have a serious discussion about where this power is going to come from, given the dismal state of the USA's power grid and related infrastructure. Costs have already been shifted onto residential users in order to guarantee electric supply to the biggest datacenters, so those datacenters can keep paying peanuts for electricity and avoid shouldering any of the infrastructural burden of maintaining or improving the grid and plants their massive power needs require.

I'm not even anti-datacenter (wouldn't be here if I were), I just think there needs to be serious rebalancing of these costs because this increase in US residential electric prices in just five years (from 13¢ to 19¢, a ridiculous 46% increase) is neither fair nor sustainable.

So where is this 10GW electric supply going to come from and who is going to pay for it?

Source: https://fred.stlouisfed.org/series/APU000072610

EDIT:

To everyone arguing this is how DCs are normally sized: yes, but normally it's not the company providing the compute for the DC owner that is giving these numbers. Nvidia doesn't sell empty datacenters with power distribution networks, cooling, and little else; Nvidia sells the GPUs that will stock that DC. This isn't a typical PR newswire bulletin along the lines of "OpenAI announces new 10GW datacenter"; this is "Nvidia is providing xx compute for OpenAI". Anyway, all this is a digression from the question of power supply, consumption, grid expansion/stability, and who is paying for all that.


I work in the datacenter space. The power consumption of a data center is the "canonical" way to describe its size.

Almost every component in a datacenter is upgradeable—in fact, the compute itself only has a lifespan of ~5 years—but the power requirements are basically locked-in. A 200MW data center will always be a 200MW data center, even though the flops it computes will increase.

The fact that we use this unit really nails the fact that AI is basically refining energy.


  A 200MW data center will always be a 200MW data center, even though the flops it computes will increase.
This here underscores how important TSMC's upcoming N2 node is. It only increases chip density by ~1.15x (very small relative to previous node advancements), but it uses 36% less energy at the same speed as N3, or runs 18% faster than N3 at the same energy. It's coming at the right time for AI chips used by consumers and for energy-starved data centers.

N2 is shaping up to be TSMC's most important node since N7.


> N2 is shaping up to be TSMC's most important node since N7

Is it?

N2, from an energy & perf improvement standpoint, seems on par with any generational node update.

          N2:N3   N3:N5  N5:N7
  Power   ~30%    ~30%    ~30%
  Perf    ~15%    ~15%    ~15%
https://www.tomshardware.com/news/tsmc-reveals-2nm-fabricati...

Yes. It has more tape-outs at this stage of development than either N5 or N3 did. It seems wildly popular with chip designers.

I thought Apple gets exclusive access to the latest node for the first 1-2 years. Is that not the case?

No. That's not the case. Maybe for a few months only.

I love that term "refining energy". We need to plan for massive growth in electricity production to have the supply to refine.

Sounds smart but it’s abusing the semantics of “refine” and is therefore ultimately vacuous.

I think it is really just the difference between chemically refining something and electrically refining something.

Raw AC comes in, then gets stepped down, filtered, converted into DC rails, gated, timed, and pulsed. That’s already an industrial refinement process. The "crude" incoming power is shaped into the precise, stable forms that CPUs, GPUs, RAM, storage, and networking can actually use.

Then those stable voltages get flipped billions of times per second into ordered states, which become instructions, models, inferences, and other high-value "product."

It sure seems like a series of processes for refining something.


It is the opposite of refining energy. Electrical energy is steak, what leaves the datacenter is heat, the lowest form of energy that we might still have a use for in that concentration (but most likely we are just dumping it in the atmosphere).

Refining is taking a lower quality energy source and turning it into a higher quality one.

What you could argue is that it adds value to bits. But the bits themselves, their state is what matters, not the energy that transports them.


I think you're pushing the metaphor a bit far, but the parallel was to something like ore.

A power plant "mines" electron, which the data center then refines into words. or whatever. The point is that energy is the raw material that flows into data centers.


Maybe more like converting energy to data, as a more specific type of refinement.

Using energy to decrease the entropy of data. Or to organize and structure data.

This is OpenAI, they are not decreasing the entropy. This is refining coal into waste heat and CO2.

I like that. Take random wild electrons and put them neatly into rows & columns where they can sit a spell.

All life is basically refining energy - standing up to entropy and temporarily winning the fight.

It's all about putting the entropy somewhere else and keeping your own little area organised.

People of the earth, remember: unnecessary arm and leg movements increase the entropy! Fear of the heat death of the universe! Lie down when possible!

Yes, in a very local context it appears so, but net entropy across the system from life's activities is increased

"the purpose of life is to hydrogenate carbon dioxide"

-- Michael Russel


> the power requirements are basically locked-in

Why is that? To do with the incoming power feed or something else?


Cooling too. A datacenter that takes 200MW in has to dissipate 200MW of heat to somewhere.

guessing massive capital outlays and maybe irreversible site selection/preparation concerns.

Basically, yes. When you stand up something that big, you need to work with the local utilities to ensure they have the capacity for what you're doing. While you can ask for more power later on, if the utilities can't supply it or the grid can't transport it, you're SOL.

You could in theory supplement it with rooftop solar and batteries, especially if you can get customers who can curtail their energy use easily. Datacentres have a lot of roof space, they could at least reduce their daytime energy costs a bit. I wonder why you don't see many doing solar, do the economics not work out yet?

I'd have to do the math, but I doubt that makes sense given the amount of power these things are drawing. I've heard of DCs having on-site power generation, but it's usually in the form of diesel generators used for supplemental or emergency power. In one weird case, I heard about a DC that used on-site diesel as primary power and used the grid as backup.

Relative to their power draw, they absolutely do not: you get about ~1kW / m^2 of solar. Some quick googling suggests a typical DC workload would be about 50 kW / m^2, rising to 100 for AI workloads.
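
A rough sketch of that areal comparison in Python, using the comment's own figures (the ~1 kW/m^2 is peak insolation; real panel output per square metre is lower, which only widens the gap):

  # Roof solar vs. datacenter floor power density, figures from the comment above.
  solar_w_per_m2 = 1_000       # optimistic peak solar, W per m^2 of roof
  dc_w_per_m2 = 50_000         # typical DC floor load, W per m^2
  ai_dc_w_per_m2 = 100_000     # AI-dense DC floor load, W per m^2

  print(dc_w_per_m2 / solar_w_per_m2)      # ~50 m^2 of roof per m^2 of floor
  print(ai_dc_w_per_m2 / solar_w_per_m2)   # ~100 m^2 of roof per m^2 of floor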

Refining it into what? Stock prices?

That's pretty interesting. Is it just because the power channels are the most fundamental aspect of the building? I'm sorta surprised you can't rip out old cables and drop in new ones, or something to that effect, but I also know NOTHING about electricity.

Not an expert, but it’s probably related to cooling. Every joule of that electricity that goes in must also leave the datacenter as heat. And the whole design of a datacenter is centered around cooling requirements.

Exactly. To add to that, I'd like to point out that when this person says every joule, he is not exaggerating (only a teeny tiny bit). The actual computation itself barely uses any energy at all.

This.

A local to me ~40W datacenter used to be in really high demand, and despite having excess rack space, had no excess power. It was crazy.


40W - is that ant datacenter? :)

Yeah, it was the company's pilot site, and everything about it is tiny.

But it very quickly became the best place in town for carrier interconnection. So every carrier wanted in.

Even when bigger local DC's went in, a lot of what they were doing was just landing virtual cross connects to the tiny one, because that's where everyone was.


You lost an M or K next to your W.

I still have an Edison bulb that consumes more power.


Yep I see that haha.

Where do the cards go after 5 years? I don't see a large surplus of mid-sized cloud providers coming to buy them (because AI isn't profitable). Maybe other countries (possibly illegally)? Flood the consumer market with cards they can't use? TSMC has more than doubled its packaging capacity and is planning on doubling it again.

DC infra is always allocated in terms of watts. From this number, everything else is extrapolated (e.g. rough IT load, cooling needed, etc).

> is neither fair nor sustainable

That's half what I pay in Italy, I'm sure the richest country in the world will do fine.


Here in Belgium a stupid amount of that bill is hidden taxes. I kind of assume it's similar in Italy.

We import most of our energy, that's really it.

And the substantial increase in profits for all providers, which isn't comparable to that of our neighbours. Our disposable income in Belgium really exists to subsidise energy companies, supermarkets, and a pathetic housing market.

>I'm sure the richest country in the world will do fine.

You underestimate how addicted the US is to cheap energy and how wasteful it is at the same time.

Remember how your lifestyle always expands to fill the available resources no matter how good you have it? Well, if tomorrow they had to pay EU prices, the country would have a war.

When you have lived your entire life not caring about the energy bill or about saving energy, it's crippling to suddenly have to scale back and be frugal, even if that price would still be less than what other countries pay.


I had my highest power bill last month in 4 years, in a month that was unseasonably cool so no AC for most of the month. Why are we as citizens without equity in these businesses subsidizing the capital class?

An 8% increase year-over-year is quite substantial; however, keep in mind that globally we experienced the 2022 fuel shock. In Australia, for example, we saw energy prices double that year.

Although wholesale electricity prices show double-digit average year-on-year swings, their true long-run growth is closer to ~6% per year, slightly above wages at ~4% during the same period.

So power has become somewhat less affordable, but still remains a small share of household income. In other words, wage growth has absorbed much of the real impact, and power prices are still a fraction of household income.

You can make it sound shocking with statements like “In 1999, a household’s wholesale power cost was about $150 a year, in 2022, that same household would be charged more than $1,000, even as wages only grew 2.5x”, but the real impact (on average, obviously there are outliers and low income households are disproportionately impacted in areas where gov doesn’t subsidise) isn’t major.

https://www.aer.gov.au/industry/registers/charts/annual-volu...


I wouldn’t call a $100-270 electric bill a “fraction” when it’s about 5% of post-tax income. I use a single light on a timer and have a small apartment.

Especially since these sorts of corporations can get tax breaks or have other means of getting regulators to allow spreading the cost. Residential customers shouldn’t see any increase due to data centers, but they do, and will, subsidize them while seeing minimal changes to infrastructure.

When people are being told to minimize air conditioning, but these big datacenters get built and aren’t told to “reduce your consumption,” then it doesn’t matter how big or small the electric bill is: it’s subsidizing a multi-billion-dollar corporation’s toy.


6% YoY is much higher than the 2-3% inflation target

So a 6.6x increase in power bill, offset by a 2.5x wage increase has no major impact?

I'm sure none of the other outgoings for a household saw similar increases. /s


> So where is this 10GW electric supply going to come from and who is going to pay for it?

I would also like to know. It's a LOT of power to supply. Nvidia does have a ~3% stake in Applied Digital, a bitcoin miner that pivoted to AI (also a "Preferred NVIDIA Cloud Partner") with facilities in North Dakota. So they might be involved for a fraction of those 10GW, but it seems like it will be a small fraction even with all the planned expansions.

https://www.investopedia.com/applied-digital-stock-soars-on-...

https://ir.applieddigital.com/news-events/press-releases/det...


> Framing it in gigawatts is very interesting given the controversy

Exactly. When I saw the headline I assumed it would contain some sort of ambitious green energy build-out, or at least a commitment to acquire X% of the energy from renewable sources. That's the only reason I can think of to brag about energy consumption.


Or this brings power and prestige to the country that hosts it. And it gives clout precisely because it is seemingly wasteful. Finding the energy is a problem for the civilian government who either go "drill baby drill" or throw wind/solar/nuclear at the problem.

Watt is the hottest new currency in big tech. Want to launch something big? You don't have to ask for dollars or headcount or servers or whatever else used to be the bottleneck in the past. There's plenty of all this to go around (and if not it can be easily bought). Success or failure now depends on whether you can beg and plead your way to getting a large enough kilowatt/megawatt allocation over every other team that's fighting for it. Everything is measured this way.

Explains why Meta is entering the power trading space

https://subscriber.politicopro.com/article/eenews/2025/09/22...


That gives me Enron vibes, even though these are vastly different situations. But the idea of a social media company trading in this space is nuts.

To me, the question is less about “how do we make more energy” and more about “how do we make LLMs 100x more energy efficient.” Not saying this is an easy problem to solve, but it all seems like a stinky code smell.

I'm pretty confident that if LLMs were made 100x more energy efficient, we would just build bigger LLMs or run more parallel inference. OpenAI's GPT-5 Pro could become the baseline, and their crazy expensive IMO model could become the Pro offering. Especially if that energy efficiency came with speedups as well (I would be surprised if it didn't). The demand for smarter models seems very strong.

0,19 per kWh. Damn man, here it is like 0,97 per kWh (Western Europe) … stop complaining

Regulated price in France:

- 0,1952 per kWh for uniform price.

- 0,1635 / 0,2081 for day/night pricing

- 0,1232 /... / 0,6468 for variable pricing

https://particulier.edf.fr/content/dam/2-Actifs/Documents/Of...

You have a very bad deal if you pay 0.97€ per kWh.


This is not true. The average in the EU is 0,287 €/kWh. I pay 0,34 €/kWh in Berlin.

And in Germany the price includes transmission and taxes, it's the consumer end price. You have to remember that some countries report electricity price without transmission or taxes, also in consumer context, so you need to be careful with comparisons.

DCs need to align their training cycles with the peak of renewable power generation

They are starting to include batteries so they don't have to adjust to external factors

Utilities always need to justify rate increases with the regulator.

The bulk of cost increases come from the transition to renewable energy. You can check your local utility and see.

It’s very easy to make a huge customer like a data center directly pay the cost needed to serve them from the grid.

Generation of electricity is more complicated; the data centers pulling cheap power from Columbia River hydro are starting to compete with residential users.

Generation is a tiny fraction of electricity charges though.


Datacenters need to provide their own power/storage, and connect to the grid just to trade excess energy or provide grid stability. Given the 5-7 year backlog of photovoltaic projects waiting for interconnect, the grid is kind of a dinosaur that needs to be routed around

“ skyrocketing electric prices for residential and small business users as a result of datacenters over the past three years”

This is probably naïve. Prices skyrocketed in Germany for similar reasons before AI data centers were a thing.


Prices of _everything_ went up over the past five years. Datacenter expansion was far from the main driver. Dollars and cents aren't worth what they used to be.

Elsewhere it was mentioned that DCs pay less for electricity per Wh than residential customers. If that is the case, then it's not just about inflation, but also unfair pricing putting more of the infrastructure costs on residential customers whereas the demand increase is coming from commercial ones.

Industrial electricity consumers pay lower unit rates per kWh, but they also pay for any reactive power that they consume and then return -- residential consumers do not. As in, what industrial consumers actually pay is a unit cost per kVAh, not kWh.

This means loads with pretty abysmal power factors (like induction motors) actually end up costing the business more money than if they ran them at home (assuming the home had a sufficient supply of power).
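
A minimal sketch of how kVAh billing penalises a poor power factor, with entirely hypothetical numbers (a 100 kW load, 0.7 power factor, one month, a flat unit rate):

  # Hypothetical comparison of kWh-based vs. kVAh-based billing.
  real_power_kw = 100        # what the load actually does (kW)
  power_factor = 0.7         # abysmal PF, e.g. lightly loaded induction motors
  hours = 720                # roughly one month
  rate_per_unit = 0.15       # flat illustrative unit rate

  apparent_power_kva = real_power_kw / power_factor          # ~143 kVA
  bill_on_kwh = real_power_kw * hours * rate_per_unit        # 10,800
  bill_on_kvah = apparent_power_kva * hours * rate_per_unit  # ~15,429

  print(bill_on_kwh, bill_on_kvah)   # the poor power factor costs ~43% more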

Further, they get these lower rates in exchange for being deprioritised -- in grid instability (e.g. an ongoing frequency decline because demand outstrips available supply), they will be the first consumers to be disconnected from the grid. Rolling blackouts affecting residential consumers are the last resort.

There are two sides to this coin.

Note that I am in no way siding with this whole AI electricity consumption disaster. I can't wait for this bubble to pop so we can get back to normality. 10GW is a third of the entire daily peak demand of my country (the United Kingdom). It's ridiculous.

Edit: Practical Engineering (YouTube channel) has a pretty decent video on the subject. https://www.youtube.com/watch?v=ZwkNTwWJP5k


This feels like a return to the moment just before Deepseek when the market was feeling all fat and confident that "more GPUs == MOAR AI". They don't understand the science, so they really want a simple figure to point to that means "this is the winner".

Framing it in GW is just giving them what they want, even if it makes no sense.


I mean, gigawatts is a concise metric to get a grasp of the amount of GPU compute they install, but honestly it seems a bit strange to me.

Total gigawatts is the maximum amount of power that can be supplied from the power generating station and consumed at the DC through the infrastructure and hardware as it was built.

Whether they use all those gigawatts and what they use them for would be considered optional and variable from time to time.


> So where is this 10GW electric supply going to come from

If the US petro-regime wasn't fighting against cheap energy sources this would be a rounding error in the country's solar deployment.

China deployed 277GW of solar in 2024 and is accelerating, having deployed 212GW in the first half of 2025. 10 GW could be a pebble in the road, but instead it will be a boulder.

Voters should be livid that their power bills are going up instead of plummeting.


Fyi capacity announced is very far from the real capacity when dealing with renewables. It's like saying that you bought a Ferrari so now you can drive at 300km/h on the road all of the time.

In mid latitudes, 1 GW of solar power produces around 5.5 GWh/day. So the "real" equivalent is a 0.23 GW gas or nuclear plant (even lower when accounting for storage losses).

But "China installed 63 GW-equivalent" of solar power is a bit less interesting, so we go for the fake figures ;-)


You think they don't know that too? You can bet they're investing heavily in grid-level storage too.

I was commenting on the initial number announcement. And storage at this scale doesn't exist right now. The most common way, water reservoirs, requires hard-to-find sites that are typically in the Himalayas, far away from where the power is produced. And the environmental cost isn't pretty either.

How would that solar power a DC at night or on a cloudy day? Energy storage isn’t cheap.

In 2025 it’s cheaper to demolish an operating coal plant and replace it with solar and battery, and prices are still dropping.

Why aren't all these businesses doing that then?

I'm living in one of the most expensive electricity markets in the US. It has a lot more to do with the state shutting down cheap petro energy (natural gas) and nuclear then replacing it with... tbd.

I actually don’t like this measurement, as it’s vague and dilutes the announcement. Each product has a different efficiency per watt.

Imagine Ford announced “a strategic partnership with FedEx to deploy 10 giga-gallons of ICE vehicles”


It’s a sticky metric though, because Moore's law per unit of power consumption died years ago.

Theoretically couldn't you use all the waste heat from the data center to generate electricity again, making the "actual" consumption of the data center much lower?

Given that steam turbine efficiency depends on the temperature delta between steam input and condenser, unlikely unless you're somehow going to adapt Nvidia GPUs to run with cooling loop water at 250C+.

Thermodynamics says no. In fact you have to spend energy to remove that heat from the cores.

(Things might be different if you had some sort of SiC process that let you run a GPU at 500C core temperatures, then you could start thinking of meaningful uses for that, but you'd still need a river or sea for the cool side just as you do for nuclear plants)


In the Nordics the waste heat is used for district heating. This practical heat sink really favors northern countries for datacenter builds. In addition you usually get abundant water and lower population density (meaning easier to build renewables that have excess capacity).


> letter of intent for a landmark strategic partnership

> intends to invest up to xxx progressively

> preferred strategic compute and networking partner

> work together to co-optimize their roadmaps

> look forward to finalizing the details of this new phase of strategic partnership

I don't think I have seen so much meaningless corporate speak and so many outs in a public statement. "Yeah we'll maybe eventually do something cool".


NVDA's share price enjoyed a nice $6 bump today, so the announcement did what it was supposed to do.

In a sense, it's just an ask to public investors for added capital to do a thing, and evidently a number of investors found the pitch compelling enough.


Increase in share price doesn't provide extra cash to a company. They'd have to issue new shares for that.

It doesn't directly, but it helps because they can do deals where they buy things with stock, like people's labor or small companies, and now that "money" is more valuable.

It does help with employee stock compensation. If your stock doubled in the past year, then you only need to dole out half as many shares as last year in equity refreshers to retain talent.

Nvidia probably has the opposite problem - employee stock has appreciated so much that you have to convince them not to retire.

Maybe but people's spending also dramatically goes up as they start making more money. You buy that $5m vacation home at Tahoe, you buy fully-loaded Rivian SUVs, you send your kids to expensive private schools, you fly only first-class on family vacations, and you are back to needing to work more to sustain this lifestyle.

This assumes your staff are not a bunch of boglehead freaks constantly on blind and crunching spreadsheets and grinding their leetcode for that perfectly timed leap.

RSU vesting is a bit like options. You have the option but not the obligation to stay in the job!


But the company owns stock, right? So they can sell those shares, no?

It can, but investors don't like that since it dilutes the value of their own shares. Which is why large companies usually do the opposite - share buybacks. Nvidia in fact bought $24 billion worth of its own shares in the first half of 2025, and plans to spend $60 billion more in buybacks in upcoming months.

Which investors also usually don't like. It says "we have all this cash, but we have no idea what to do with it, so we are buying our own stock." I'd expect a company to actually invest its excess cash (into research, tech, growth, etc.) to make more money in the future.

Preferred by some to dividends.

If stock buybacks cause the price to go up like it should in theory, that's less of a tax hit than dividends! I'll take it

That has to be compared with how much stock the company is “selling”, via equity compensation to employees.

The "meaning" is clear, create FOMO among suckers.

For someone who doesn't know what a gigawatt worth of Nvidia systems is, how many high-end H100s or whatever does this get you? My estimates, along with some poor-grade GPT research, lead me to think it could be nearly 10 million? That does seem insane.

It's a ridiculous amount claimed, for sure. If it's 2 kW per GPU, it's around 5 million, and 1 to 2 kW is definitely the right ballpark at a system level.

The NVL72 is 72 chips at 120 kW total for the rack. If you throw in ~25 kW for cooling, it's pretty much exactly 2 kW each.
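
Spelling out that estimate (the ~25 kW cooling overhead is an assumption, and real per-site overheads vary):

  # Per-GPU power from the NVL72 rack figures quoted above, then GPUs per 10 GW.
  rack_it_kw = 120
  rack_cooling_kw = 25        # assumed cooling/overhead share
  gpus_per_rack = 72

  kw_per_gpu = (rack_it_kw + rack_cooling_kw) / gpus_per_rack   # ~2.01 kW
  gpus = 10 * 1_000_000 / kw_per_gpu                            # 10 GW expressed in kW

  print(kw_per_gpu, gpus)   # ~2.01 kW, ~4.97 million GPUs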


So each 2KW component is like a top-shelf space heater which the smart money never did want to run unless it was quite cold outside.

It will be the world's most advanced resistor.

Safely in "millions of devices." The exact number depends on assumptions you make regarding all the supporting stuff, because typically the accelerators consume only a fraction of total power requirement. Even so, millions.

"GPUs per user" would be an interesting metric.

(Quick, inaccurate googling) says there will be "well over 1 million GPUs" by end of the year. With ~800 million users, that's 1 NVIDIA GPU per 800 people. If you estimate people are actively using ChatGPT 5% of the day (1.2 hours a day), you could say there's 1 GPU per 40 people in active use. Assuming consistent and even usage patterns.

That back of the envelope math isn't accurate, but interesting in the context of understanding just how much compute ChatGPT requires to operate.

Edit: I asked ChatGPT how many GPUs per user, and it spit out a bunch of calculations that estimates 1 GPU per ~3 concurrent users. Would love to see a more thorough/accurate break down.


I'm kinda scared of "1.2 hours a day of ai use"...

Sorry, those figures are skewed by Timelord Georg, who has been using AI for 100 million hours a day, is an outlier, and should have been removed.

Roger, but I still think with that much energy at its disposal, if AI performs as desired it will work its way up to using each person more than 1.2 hours per day, without them even knowing about it :\

When GPUs share people concurrently, they collectively get much more than 24 hours of person per day.

You're right!

With that kind of singularity the man-month will no longer be mythical ;)


It will be epic!

A lot of GPUs are allocated for training and research, so dividing the total number by the number of users isn’t particularly useful. Doubly so if you’re trying to account for concurrency.

Before reading your comment I did some napkin math using 600W per GPU: 10,000,000,000 / 600 = 16,666,666.66...

With varying consumption/TDP, could be significantly more, could be significantly less, but at least it gives a starting figure. This doesn't account for overhead like energy losses, burst/nominal/sustained, system overhead, and heat removal.


It's easy to see what it could be by looking at Green500.

B200 is 1kW+ TDP ;)

And consists of 8 GPUs

Account for around 3 MW for every 1,000 GPUs. So 10 GW is around 3,333 × 3 MW, i.e. 3,333 × 1,000 GPUs, so around 3.33M GPUs.

Vera Rubin will be about 2.5kw and Feynman will be about 4kw.

All-in, you’re looking at a higher footprint, maybe 4-5 kW per GPU blended.

So about 2 million GPUs.


How much cable (and what kind) would it take to connect them all? That number would be 100x the number of GPUs. I would have thought they just clip onto metal racks with no cables, but then I saw the xAI data center with blue wire cables everywhere.

It was announced last week that Nvidia acqui-hired a company that can connect more than 100,000 GPUs together as a cluster that can effectively serve as a single integrated system.

Do you have a link or info?


Thank you

I think it's called enfrabica or something similar

And how much is that as a percentage of Bitcoin network capacity?

Bitcoin mining consumes about 25 GW: https://ccaf.io/cbnsi/cbeci so this single deal amounts to about 40% of that.

To be clear, I am comparing power consumption only. In terms of mining power, all these GPUs could only mine a negligible fraction of what all specialized Bitcoin ASIC mine.

Edit: some math I did out of sheer curiosity: a modern top-of-the-line GPU would mine BTC at about 10 Ghash/s (I don't think anyone tried but I wrote GPU mining software back in the day, and that is my estimate). Nvidia is on track to sell 50 million GPUs in 2025. If they were all mining, their combined compute power would be 500 Phash/s, which is 0.05% of Bitcoin's global mining capacity.
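
The same estimate written out; the per-GPU hashrate is the commenter's guess, and the network hashrate is assumed to be roughly 1 ZH/s (1,000,000 PH/s), which is what the 0.05% figure implies:

  # GPU fleet hashrate vs. the Bitcoin network, using the comment's own figures.
  gpu_hashrate_ghs = 10              # rough guess for a modern GPU on SHA-256, GH/s
  gpus_sold = 50_000_000             # Nvidia's ~2025 unit volume, per the comment
  combined_phs = gpus_sold * gpu_hashrate_ghs / 1_000_000   # GH/s -> PH/s
  network_phs = 1_000_000            # ~1 ZH/s assumed for the whole network

  print(combined_phs)                        # 500 PH/s
  print(100 * combined_phs / network_phs)    # ~0.05% of network hashrate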


I'm also wondering what kind of threat this could be to PoW blockchains.

Literally none at all because asic

Some chains are designed to be ASIC resistant.

What happens if AI doesn't pay off before the GPUs wear out or are in need of replacement?

So at that point a DC replaces them all with ASICs instead?

Or if they just feel like doing that any time.


At this scale, I would suggest that these numbers are for the entire data center rather than a sum of the processor demands. Also, the "infrastructure partnership" language suggests more than just compute. So I would add cooling into the equation, which could be as much as half the power load, or more, depending on where they intend to locate these datacenters.

You could think of it as "as much power as is used by NYC and Chicago combined". Which is fucking insanely wasteful.

I dunno.

Google is pretty useful.

It uses >15 TWh per year.

Theoretically, AI could be more useful than that.

Theoretically, in the future, it could be the same amount of useful (or much more) with substantially less power usage.

It could be a short-term crunch to pull-forward (slightly) AI advancements.

Additionally, I'm extremely skeptical they'll actually turn on this many chips using that much energy globally in a reasonable time-frame.

Saying that you're going to make that kind of investment is one thing. Actually getting the power for it is easier said than done.

VC "valuations" are already a joke. They're more like minimum valuations. If OpenAI is worth anywhere near it's current "valuations", Nvidia would be criminally negligent NOT to invest at a 90% discount (the marginal profit on their chips).


According to Google's latest environmental report[1] that number was 30 TWh per year in 2024, but as far as I can tell that's their total consumption of their datacenters, which would include everything from Google Search, to Gmail, Youtube, to every Google Cloud customer. Is it broken down by product somewhere?

30 TWh per year is equivalent to an average power consumption of 3.4 GW for everything Google does. This partnership is 3x more energy intensive.

Ultimately the difference in `real value/MWh` between these two must be many orders of magnitude.

[1] https://sustainability.google/reports/google-2025-environmen...


Data centers typically use 60% (or less) on average of their max rating.

You over-provision so that you (almost) always have enough compute to meet your customers needs (even at planet scale, your demand is bursty), you're always doing maintenance on some section, spinning up new hardware and turning down old hardware.

So, apples to apples, this would likely not even be 2x at 30TWh for Google.


For other readers: "15 Twh per year" is equivalent to 1.71 GW, 17.1% of the "10GW" number used to describe the deal.

This is ignoring the utilization factor though. Both Google and OpenAI have to overprovision servers for the worst-case simultaneous users. So 1.71 GW average doesn't tell us the maximum instantaneous GW capacity of Google -- if we pull a 4x out of the hat (i.e. peak usage is 4x above average), it becomes ~7 GW of available compute.

More than a "Google" of new compute is of course still a lot, but it's not many Googles' worth.


Does Google not include AI?

I mean, if 10GW of GPUs gets us AGI and we cure cancer, then that's cool, but I do get the feeling we're just getting uncannier chatbots and fully automated tiktok influencers

Current LLMs are just like farms. Instead of tomatoes by the pound, you buy tokens by the pound. So it depends on the customers.

This is also my take. I think a lot of people miss the trees for the forest (intentionally backward).

AI that could find a cure for cancer isn't the driving economic factor in LLM expansion, I don't think. I doubt cancer researchers are holding their breath on this.


And when it’s built, Sam Altman will say: We are so close, if we get 10TW, AGI will be here next year!

Do you think the existence of NYC and Chicago is insanely wasteful?

We are way past peak LLM and it shows. They are basically advertising space heating as if it's some sort of advancement, while the tech seems to have stagnated and they're just making the horses faster. The market should have punished this.

It's 100% plausible and believable that there's going to be a spectacular bubble popping, but saying we are way past peak LLM would be like saying we were way past peak internet in 1999-2001 -- in reality, we weren't even remotely close to peak internet (and possibly still aren't). In fact, we were so far from the peak in 2001 that entire technological revolutions occurred many years later (e.g., smartphones) that just accelerated the pace even further in ways that would've been hard to imagine at the time. It's also important to note that AI is more than text-based LLMs -- self-driving cars and other forms of physical "embodied" AI are progressing at exponential pace, while entirely new compute form factors are only just now starting to emerge yet are almost certainly guaranteed to become pervasive as soon as possible (e.g., real AR glasses). Meanwhile, even plain-old text-based LLMs have not actually stagnated.

Sam Altman has called it a bubble already: https://www.cnbc.com/2025/08/18/openai-sam-altman-warns-ai-m... (Even a liar sometimes speaks the truth? I don’t know.)

  “You should expect OpenAI to spend trillions of dollars on data center construction in the not very distant future,” he told the room, according to a Verge reporter.

  “We have better models, and we just can’t offer them, because we don’t have the capacity,” he said. GPUs remain in short supply, limiting the company’s ability to scale.
https://finance.yahoo.com/news/sam-altman-admits-openai-tota...

So why would Altman say AI is in a bubble but OpenAI wants to invest trillions? Here's my speculation:

1. OpenAI is a private company. They don't care about their own stock price.

2. OpenAI just raised $8.3b 3 weeks ago on $300b valuation ($500b valuation today). He doesn't care if the market drops until he needs to raise again.

3. OpenAI wants to buy some AI companies but they're too expensive so he's incentivized to knock the price of those companies down. For example, OpenAI's $3b deal for Windsurf fell apart when Google stepped in and hired away the co-founder.

4. He wants to retain OpenAI's talent because Meta is spending billions hiring away the top AI talent, including talent from OpenAI. By saying it's in a bubble and dropping public sentiment, the war for AI talent could cool down.

5. He wants other companies to get scared and not invest as much while OpenAI continues to invest a lot so it can stay ahead. For example, maybe investors looking to invest in Anthropic, xAI, and other private companies are more shaky after his comments and invest less. This benefits OpenAI since they just raised.

6. You should all know that Sam Altman is manipulative. This is how he operates. Just google "Sam Altman manipulative" and you'll see plenty of examples where former employees said he lies and manipulates.


Altman wants OTHERS to spend trillions on GPUs. He needs the scaling hype to continue so he can keep getting investors to put money in, in hopes of an AGI breakthrough. If there is no funding, OpenAI is immediately bankrupt.

Though "Compute infrastructure will be the basis for the economy of the future" doesn't sound that off. LLMs may go but compute will live on. Bit like web portals and optical fiber.

>We are way past peak LLM and it shows

The dot com bubble saw crazy deals and valuations, followed by a crash.

Some companies emerged from it and went on to become giants, like Amazon. Let's hope this AI boom has some similar outcomes.


In hindsight, the dot com bubble was really the dot com dip.

There will be a great market correction soon. Long term though it’ll still have some value, much like after the dot com crash the internet still remained useful. I hope.

> We are way past peak LLM

in the sense that all of the positive narrative is getting priced in.


Water is a critical resource in dwindling supply in many water-stressed regions. These data centers have been known to suck up water supplies during active droughts. Is there anyone left at the EPA who gets a say in how we manage water for projects like this?

We deplete our midwestern aquifers to make ethanol which we burn, and we grow almonds in California.

Both of those have significantly more water impact. Both of those are significantly less useful.

Why not focus on issues that matter.


Food and fuel are significantly more useful than chatbots.

Why either/or? This is largely a tech forum so almond crops don't need to be the big area of focus or where we as a community can offer our best knowledge/coordination.

Water is much less of an issue than the media makes it out to be. It's a problem in some specific areas, yes, but power is a much bigger concern.

And even where water scarcity is a problem, heat exchangers can be configured to use wastewater. The Palo Verde plant does this.

Correct. There are a variety of solutions. Each DC is somewhat unique, but in general water isn't a huge concern. Cities make a big deal about it b/c they want the hyper scalers to give concessions such as processing gray water for the local muni.

Where is this water meme coming from? Surely the water is just pumped around, not actually used up?

Evaporative cooling effectively "uses up" the water. It's possible to run chillers instead, but that consumes more electricity, and some power plants also use evaporative cooling.

Some water usage has highly questionable counting methodologies.

Like: if a datacenter is using hydroelectric power, you count the evaporation from the dam reservoir as "used water".

I'm not an expert but imo correct accounting should really only consider direct consumption. It's very silly when we play games like having petro states have very high carbon footprints even if they don't actually burn the fuel.


Some 1400 cubic kilometers of water evaporate every day on our blue planet here. The water isn't deleted, really.

Water pollution bigger danger than water usage. Look up videos of people whose water changes color after a data center was built nearby.

They cause all kinds of problems. We could even include all of the new methane power plants that will likely need to be built.

the e p what?

The entire premise of The Simpsons Movie is an artifact of another time. Sigh.

Am I correct that your argument is something like, "AI endangers our water supply"? If so, what evidence would it take for you to change your mind? Maybe someone here can provide it.

That's a lot. I always had this idea in the back of my mind that British Columbia should get in on the AI game and try and get data centers located in BC because we generally have a lot of "excess" hydro generation capacity. There's a new mega dam recently opened that had lots of criticism about it being "unneeded".

That mega dam (Site C) produces 1.1GW of power.


onlyrealcuzzo wrote:

> Google is pretty useful. It uses 15 TWh per year.

15TWh per year is about 1.7GW.

Assuming the above figures, that means OpenAI and Nvidia new plan will consume about 5.8 Googles worth of power, by itself.

At that scale, there's a huge opportunity for ultra-low-power AI compute chips (compared with current GPUs), and right now there are several very promising technology pathways to it.


" there's a huge opportunity for ultra-low-power AI compute chips (compared with current GPUs), and right now there are several very promising technology pathways to it"

Sharing an example would be nice. How much power reduction are we talking about here?


This one datacenter should be able to perform a 51% attack on any of the big cryptocurrencies with that much compute.

An interesting hedge in case the AI bubble pops.


Someone did the math above and said all of it would only be about 0.05 percent for Bitcoin.

I'm not sure about the GPU pow coins though


Nope, anything besides ASICs are useless for crypto mining.

Kansas City shuffle?

You're downvoted but it's a real threat. Imagine hackers or state sponsored entities use one of these mega data centers to destroy a few cryptocurrencies.

They are nothing compared to BTC ASICs

Comparing 100 duck sized horses to 1 horse sized duck. Or perhaps the amount of GPUs is in the ratio of 1,000:1.

No one ever talks about the electricity demands for powering these things. Electric bills here in NJ via PSEG have spiked over 50% and they are blaming increased demand from datacenters, yet they don't seem to charge datacenters more?

https://www.datacenterdynamics.com/en/news/new-jersey-utilit...


A classic political games move, and it says more about how much anti-consumer nonsense is tolerated in New Jersey than it does about power generation and distribution pricing realities.

The data centers will naturally consolidate in areas with competitive electricity pricing.


Define, "competitive electricity pricing". Because surely this will make the electricity prices less competitive for the you and me...

This is called marginal pricing. Everyone pays the price of the marginal producer.

In some cases they try to get the data centres to pay for their infrastructure costs but the argument is that customers don't pay this normally but do so through usage fees over time.


That is on NJ government for allowing the price increases. They can easily say no.

In accounting terms, this is a shady business practice known as "round tripping" where you invest in a company for the sole purpose of them buying your product. It allows you to count your revenue multiple times.

Nvidia has consistently done this with Coreweave and Nscale; really, most of its balance sheet investments are like this. On the one hand there's a vaguely cogent rationale that they're a strategic investor, and it sort of makes sense as a hardware-for-equity swap; on the other, it's obviously goosing revenue numbers. This is a bigger issue when it's $100B than with previous investments.

It's a good time to gently remind everyone that there are a whole pile of legal things one can do to change how a security looks "by the numbers" and this isn't even close to the shadiest. Heck some sell-side research makes what companies themselves do look benign.


A relevant joke, paraphrased from the internet:

Two economists are walking in a forest when they come across a pile of shit.

The first economist says to the other “I’ll pay you $100 to eat that pile of shit.” The second economist takes the $100 and eats the pile of shit.

They continue walking until they come across a second pile of shit. The second economist turns to the first and says “I’ll pay you $100 to eat that pile of shit.” The first economist takes the $100 and eats a pile of shit.

Walking a little more, the first economist looks at the second and says, "You know, I gave you $100 to eat shit, then you gave me back the same $100 to eat shit. I can't help but feel like we both just ate shit for nothing."

"That's not true", responded the second economist. "We increased total revenue by $200!"


The punchline is supposed to be GDP, but yeah, same concept.

This should go without saying but unfortunately it really doesn't these days:

This kind of corporate behavior is bad and will end up hurting somebody. If we're lucky the fallout will only hurt Nvidia. More likely it will end up hurting most taxpayers.


It's the same loop-de-loop NVIDIA is doing with Coreweave, as I understand it: 'investing' in Coreweave, which then 'buys' NVIDIA merch for cloud rental, resulting in Coreweave being one of the top 4 customers of NVIDIA chips.

Wait, why the quotes? NVDA sends cash, and the Coreweave spends it, no? I don’t think quotes are accurate, if they imply these transactions aren’t real, and material. At the end of the day, NVDA owns Coreweave stock, and actual, you know, physical hardware is put into data centers, and cash is wired.

I don't really understand how it is round tripping.

In the end, Nvidia will have OpenAI shares, which are valuable, and OpenAI will have GPUs, which are also valuable. It is not fake revenue, the GPUs will be made, sold at market price, and used, they are not intended to be bought back and sold to another customer. And hopefully, these GPUs will be put to good use by OpenAI so that they can make a profit, which will give Nvidia some return on investment.

It doesn't look so different from a car loan, where the dealer lends you the money so that you can buy their car.


A dollar is always a dollar, so it's hard to claim that $1 million in revenue is actually worth $10 million. OpenAI shares, on the other hand, aren't publicly traded, so it's much easier to claim they're worth $10 million when no one would actually be willing to buy for more than $1 million.

It's not necessarily manipulative but it's also not exactly an arms-length purchase of GPUs on the open market.


It looks like NVIDIA is looking to move up the value chain to get a stake in the even higher-margin, larger addressable market instead of simply selling the tools.

If OpenAI doesn't pan out, then Nvidia has worthless OpenAI stock and OpenAI has a pile of mostly useless GPUs.

That’s still not round tripping?

They for example did a similar deal with Nscale just last week.

https://www.cnbc.com/2025/09/17/ai-startup-nscale-from-uk-is...


This is being done out in the open (we’re reading the press announcement) and will be factored into valuations.

Also, investing in OpenAI means they get equity in return, which is not a worthless asset. There is actual mutually beneficial trade occurring.


Is it counting revenue multiple times? It's buying your own products really, but not sure how that counts as double counting revenue

Customer A pays you $100 for goods that cost you $10. You invest $100-$10=$90 in customer B so that they'll pay you $90 for goods that cost you $9. Your reported revenue is now $100+$90=$190, but the only money that entered the system is the original $100.
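
A tiny sketch of that bookkeeping with the same made-up numbers:

  # Round-tripping the margin from customer A through customer B.
  cash = 0
  cash += 100 - 10          # customer A buys: revenue 100, cost of goods 10
  cash -= 90                # invest the margin in customer B
  cash += 90 - 9            # B spends the investment back on your product: revenue 90, cogs 9

  reported_revenue = 100 + 90
  print(reported_revenue, cash)   # 190 of reported revenue, 81 of cash, from 100 outside dollars
  # The equity stake in B also sits on the balance sheet at whatever it is "worth".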

Yes, but you’ve also incurred a $90 expense in purchasing the stock of Company B and that stock is on the balance sheet.

In the actual shady version of this, Company B isn’t the hottest AI investment around, it’s a shell company created by your brother’s cousin that isn’t actually worth what you’re claiming on the balance sheet because it was only created for the round tripping shell game.


Except that this isn't round-tripping at all. Round-tripping doesn't result in a company actually incurring expenses to create more product. Round-tripping is the term for schemes that enable you to double count assets/revenue without any economic effects taking place.

Every time HackerNews talks about anything in the legal or finance realm, people trip over themselves to make arguments for why something a big tech company is doing is illegal. This is definitively neither illegal nor shady. If Nvidia believes, for example, that OpenAI can use their GPUs to turn a profit, then this is inherently positive sum economically for both sides: OpenAI gets capital in the form of GPUs, uses them to generate tokens which they sell above the cost of that capital, and then they return some of the excess value to Nvidia. This is done via equity. It's a way for Nvidia to get access to some of the excess value of their product.


At some point one might simply argue that the nature and timing of these wildly fantastical press releases is tantamount to a "scheme to defraud".

“ Every time HackerNews talks about anything in the legal or finance realm, people trip over themselves to make arguments for why something a big tech is doing is illegal.”

And your valuation also rises as a consequence of your increased revenue.

It's real revenue, but you are operating a fractional-reserve revenue operation. If the company you're investing in has trouble, or you have trouble, the whole thing falls over very fast.

The "investment" came from their revenue, and will be immediately counted in their revenue again.

In this case it seems that, if we're being strict, the investment could then also show up as fixed assets on the same balance sheet.

Oracle also announced a lot of future revenue from AI, while they're part of Stargate Partners that is investing in OpenAI. Similar deal...

> this is a shady business practice known as "round tripping" where you invest in a company for the sole purpose of them buying your product.

Microsoft and Google have been doing it for decades. Probably, MS started that practice.


"In accounting terms, this is a shady business practice known as "round tripping" where you invest in a company for the sole purpose of them buying your product. It allows you to count your revenue multiple times."

... and we've seen this before in previous bubbles ...


Isn’t our stock market basically propped up on this AI credits etc. house of cards right now?

This is some Enron shit. Let's see NVDA mark-to-market these profits. Keep the spice flowing.

Waiting patiently for the Ed Zitron article on this...

When executives can't measure success by output, they measure success by input, a perverse incentive that rewards inefficiency.

Execs ask their employees to return to office, because they don't know how to measure good employee output.

Now OpenAI and Nvidia measure success by gigawatt input into AI instead of successful business outcomes from AI.



He single-handedly cost people more than anyone with his bearish takes lol

Or he saved them more than anyone by limiting their losses when it does finally crash

except he called the top in 2023

It's not a good argument against him. I read his articles and he is absolutely correct about the state of things. Predicting the crash is a fool's errand. I don't use that as an argument to discredit what he actually writes regarding the raw economics of the AI industry.

I say this as someone who has been holding NVDA stock since 2016 and can cash out for a large sum of money. To me its all theoretical money until I actually sell. I don't factor it into financial planning.

You don't see me being a cheerleader for NVDA. Even though I stand to gain a lot. I will still tell you that the current price is way too high and Jensen Huang has gotten high off his own supply and "celebrity status".

After all, we can't all buy NVDA stock and get rich off it. Is it truly possible for all 30,000+ NVDA employees to become multi-millionaires overnight? That's not how capitalism works.


I am all against bubbles and irrational valuations etc. but I think in this case the prospect of future growth was fully justified. There are never guarantees, but Nvidia's price went up 10x or more in three years and e.g. their PER stayed mostly flat. But their PER of 50 three years ago would be 5 today, which would be extremely undervalued. I would say the "market" got it correctly this time.

He’s been absolutely wrong on most things but spreading FUD is how he makes money, like Gary Marcus

I don't care for personalities. You want to mark him as a grifter but is that just an emotional response? I have not bought anything from Ed, I don't subscribe to his newsletter, I don't know much about him beyond visiting his website every few weeks and reading the free articles. He does not sell me vitality pills or coffee mugs. The only soliciting he does is his paid sub stack.

But it goes both ways? Because AI promoters are also spreading FUD. That's how they make money. Because their livelihoods are tied to this technology and all the valuations. So is spreading FUD for you just a condition on whether or not you agree with the person?


If there is any FUD, it is from the other side. No one is scared after they read a Zitron article; most are bored, because they are dense reads.

But people are literally scared AI will destroy all the jobs after reading articles about how it will. Companies being scared not to use AI, whether it makes sense or not, just so they don't miss out: that is where the FUD is.


To put this into perspective, this datacenter would have the land area of Monaco (740 acres), assuming 80kW per rack.

Monaco is so tiny it fits into Berlin's Tempelhofer Feld (a circular park inside the city).

I mean, you're not wrong, but Tempelhofer is also a former airport, so had to be quite big. And since Brexit, Berlin is the biggest city in the EU.

Paris? I mean, if we are considering the wide area. Otherwise London wouldn’t have been considered the largest.

So, basically a single BYD factory

Monaco is 2 km^2 [1].

I'm confused because if I assume each rack takes up 1 square meter I get a much smaller footprint: around 12 hectares or 17 football fields.

And that assumes that the installation is one floor. I don't know much about data centers but I would have thought they'd stack them a bit.

Am I the only person who had to look up how big Monaco was?

[1]: https://en.wikipedia.org/wiki/Monaco

[2]: https://www.wolframalpha.com/input?i=10+GW+%2F+%2880kw+%2F+m...
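
For reference, the raw rack-footprint arithmetic (80 kW/rack and 1 m^2/rack are assumptions carried over from the comments above; real campuses add aisles, cooling, power gear, and offices, which is how you get to hundreds of acres):

  # Rack count and raw footprint for 10 GW at the assumed densities.
  kw_per_rack = 80
  racks = 10 * 1_000_000 / kw_per_rack     # 10 GW in kW -> 125,000 racks
  floor_m2 = racks * 1                     # 1 m^2 per rack, racks only

  print(racks)                  # 125,000
  print(floor_m2 / 10_000)      # 12.5 hectares of raw rack footprint
  print(2_000_000 / floor_m2)   # Monaco (~2 km^2) is ~16x larger than that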


My ChatGPT calculation is below. ChatGPT is factoring in the actual size of the buildings and campus, and it gave a range of 340 to 740 acres. https://chatgpt.com/share/68d195ff-3528-8004-8418-ec462b1433...

To put Monaco in perspective, the US could fit 4.8M Monacos

To put Rhode Island in perspective, it would hold 1,340 Monacos.

https://www.comparea.org/MCO+US_RI


> Strategic partnership enables OpenAI to build and deploy at least 10 gigawatts of AI datacenters with NVIDIA systems representing millions of GPUs

I know watts but I really can’t quantify this. How much of Nvidia is there in the amount of servers that consume 10GW? Do they all use the same chip? What if there is a newer chip that consumes less; does the deal imply more servers? Did GPT write this post?


You don’t need AI to write vague waffly press releases. But to put this in perspective an H100 has a TDP of 700 watts, the newer B100s are 1000 watts I think?

Also, the idea of a newer Nvidia card using less power is très amusant.


A 72-GPU NVL72 rack consumes up to 130 kW, so it's a little more than 5,500,000 GPUs

$150-200B worth of hardware. About 2 million GPUs.

So this investment is somewhat structured like the Microsoft investment where equity was traded for Azure compute.


For scale: The 1960's era US Navy submarine I served on had a 78MW reactor, so 10GW is 128 nuclear submarines

For a better sense of scale: it's about 2% of the average US electricity consumption, and about the same as the average electricity consumption of the Netherlands (18 million people)
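
Rough arithmetic behind both scale comparisons, assuming US electricity consumption of roughly 4,200 TWh/year:

  # 10 GW expressed in submarine reactors and as a share of average US draw.
  deal_gw = 10
  sub_reactor_mw = 78                           # the 1960s boat upthread
  us_twh_per_year = 4_200                       # rough US annual consumption (assumption)
  us_avg_gw = us_twh_per_year * 1_000 / 8_760   # ~479 GW average draw

  print(deal_gw * 1_000 / sub_reactor_mw)   # ~128 submarine reactors
  print(100 * deal_gw / us_avg_gw)          # ~2% of average US consumption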

For another sense of scale: A 500MW AI-centric datacenter could cost $10 billion or more to build. So 10GW is $200 billion!

Wtf, and this is from 1 company

How many atrophic,xAi,google,Microsoft would be????

Having around 5% of the entire country's electricity supply going to AI hardware seems excessive, no?


No this is from just one partnership, my sense is that OpenAI alone wants more than that.

5% to 10% of US electricity going to AI in 10 years is consistent with the current valuations of AI companies.


Around 5% in the next 5 years for AI alone sounds pretty in-line with projections I've seen.

Isn't this pretty bad for the climate? I don't dare to ask ChatGPT now /S

Some more context: nuclear power stations can be up to 2GW, offshore windfarms seem to be hitting a plateau at ~1.5GW, and individual turbines in operation now are 15MW. Grids are already strained, and 525kV DC systems can transmit ~2GW of power per cable bundle…

Adding 10GW of offtake to any grid is going to cause significant problems and likely require CAPEX-intensive upgrades (try to buy 525kV DC cable from an established player and you are waiting until 2030+), as well as new generation for the power!


But that's assuming they actually have to transport power over long distances right? If they colocate these massive AI datacenters right next to the power generation plants, it should be cheap to transport the power. You don't need to upgrade massive sections of the grid and build long-distance power lines.

The xAI Colossus 2 1GW data centers seem to be located about ~20 miles from the power generation utility (https://semianalysis.com/2025/09/16/xais-colossus-2-first-gi...)


20 miles is a long way to move power. On land you have huge issues getting permits for construction because it's so disruptive; offshore you need specialist vessels that serve an already stretched global supply chain.

Yeah the path forward here is going to be Apple-like vertical supply chain integration. There is absolutely no spare capacity in the infra side of electrical right now, at least in the US.

And there is great cost saving potential in vertical integration. Distribution and transmission are huge costs. If you can build a data center right next to a power plant and just take all their power you get much better prices. Not trivial to do with the kinds of bursty loads that seem typical of AI data centers, but if you can engineer your way to a steady load (or at least steady enough that traditional grid smoothing techniques work) you can get a substantial advantage

> bursty loads that seem typical of AI data centers

Don’t datacenters want to run at their rated capacity 24/7?


I don't think that's possible with large-scale power infrastructure, specifically because grid infrastructure is so tightly regulated. The closest I'm aware of was TSMC buying the output of an entire offshore windfarm for 25 years (the largest power purchase contract ever - TSMC / Ørsted)… maybe Microsoft restarting nuclear power plants, or Google reportedly picking up offshore wind sites as they come out of contract (but nothing at the 10GW scale).

In the long run, perhaps this will give us a better power grid, just like the dotcom bubble gave rise to broadband?

This still blows my mind.

If each human brain consumes ~20W then 10 GW is like 500 M people, that sounds like a lot of thinking. Maybe LLMs are moving in the complete opposite direction and at some point something else will appear that vaporizes this inefficiency making all of this worthless.

I don't know; just looking at insects like flies, and all the information they manage to process with what I assume is a ridiculously small amount of energy, suggests to me there must be a more efficient way to 'think', lol.
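
The brain comparison is a one-line division, treating ~20 W as the commonly cited estimate rather than a precise figure:

  # "Brains' worth" of power: 10 GW over the commonly cited ~20 W per brain.
  total_power_w = 10e9
  brain_power_w = 20           # rough, commonly cited estimate
  print(f"~{total_power_w / brain_power_w / 1e6:.0f} million brains' worth")   # ~500 million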


We know for a fact that current LLMs are massively inefficient; this is not a new thing. But every optimization you make will allow you to run more inference on this hardware, and there's no more reason for that to make it meaningless than there was for more efficient cars to make roads obsolete.

> But every optimization you make will allow you to run more inference with this hardware

Unless the optimization relies in part on a different hardware architecture, and is no more efficient than current techniques on existing hardware.

> there's not a reason for it to make it meaningless any more than more efficient cars didn't obsolete roads

Rail cars are pretty darned efficient, but they don’t really work on roads made for the other kind.


Or just ten very safe RBMK reactors rated 1GW each (they can't explode).

You almost got me. RBMKs had this problem with large positive void coefficients that was buried by the Soviet Union, which led to Chernobyl.

The control rods with the graphite tips were the cherry on top...

A big power station of any type is ~1GW. Nuclear is slow to build, so I'd have to guess natural gas.

The US is adding significantly more solar, and slightly more wind, than natural gas every year. This doesn't have to be placed where people already are, but can be placed where energy is the cheapest, which favours solar and wind substantially more than gas (or nuclear).

The reasonable (cost effective, can be done quickly) thing to do is put this wherever you can generate solar + wind the most reliably, build out a giant battery bank, and use the grid as a backup generator. Over time build a better and better connection to the grid to sell excess energy.


Trump is personally and vindictively against green energy.

He wants coal and gas.


When the price of gas is so much higher than solar, that hardly matters. No reason the data centre has to be in the US.

It should be illegal to build that many fossil-fuel power plants just to train LLMs.

The blatant disregard of global warming by AI investors is truly repulsive.


Keep in mind that the industrial processes that consume fossil fuel also contribute to quality of life in various ways. Improvements in emergency response and early detection infrastructure alone have resulted in deaths from extreme weather events reaching record low levels. Poverty as a whole has seen record-breaking decreases over the last 30 years.

So there are other factors to weigh besides how much contributes to CO2 emissions.


A typical reactor core is 1 GW, so it's also one rather big nuclear power plant.

More like two (and a half )

Strange unit of measurement. Who would find that more useful than expected compute, or even just the number of chips?

I wouldn't be surprised if power consumption is a starting point due to things like permitting and initial load planning.

I imagine this as a subtractive process starting with the maximum energy window.


It's a very useful reference point actually because once you hit 1.21 GW the AI model begins to learn at a geometric rate and we finally get to real AGI. Last I've heard this was rumored as a prediction for AI 2027, so we're almost there already.

Is this a crafty reference to Back to the Future? If so I applaud you.

1.21GW is an absurd level of precision for this kind of prediction.

It's from the movie "Back to the Future"

If a card costs x money, and operating it every year/whatever costs y money in electricity, and y >> x, it makes sense to mostly talk about the amount of electricity you are burning.

Because if some card with more FLOPS comes available, and the market will buy all your FLOPS regardless, you just swap it in at constant y / for no appreciable change in how much you're spending to operate.

(I have no idea if y is actually much larger than x)
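
Since the parent leaves the x-versus-y question open, here is a parametric sketch for checking it; the card price, utilization, PUE, and electricity rate are all assumptions to be swapped for your own numbers:

  # Annual electricity cost for one accelerator vs its purchase price.
  # Every input is an assumption; this ignores networking, cooling capex,
  # and depreciation.
  card_price_usd = 30_000          # assumed H100-class purchase price
  tdp_w = 700                      # H100 TDP, per the comment upthread
  utilization = 0.7                # assumed average draw vs TDP
  pue = 1.3                        # assumed facility overhead multiplier
  usd_per_kwh = 0.08               # assumed industrial electricity rate

  annual_kwh = tdp_w * utilization * pue * 8760 / 1000
  annual_cost = annual_kwh * usd_per_kwh
  print(f"~{annual_kwh:,.0f} kWh/yr -> ~${annual_cost:,.0f}/yr vs ${card_price_usd:,} card")

At these assumed numbers the card price dominates the direct electricity bill by a wide margin; the thread's point that power is the binding constraint is more about grid connections and facility buildout than about the per-kWh charge.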


A point of reference is that the recently announced OpenAI-Oracle deal mentioned 4.5 GW. So this deal is more than twice as big.

At large scales a lot of it is measured on power instead of compute, as power is the limitation

For a while, it's become increasingly clear that the current AI boom's growth curve rapidly hits the limits of the existing electricity supply.

Therefore, they are listing in terms of the critical limit: power.

Personally, I expect this to blow up first in the faces of normal people who find they can no longer keep their phones charged or their apartments lit at night, and only then will the current AI investment bubble pop.


Probably because you can't reliably predict how much compute this will lead to. Power generation is probably the limiting factor in intelligence explosion.

That, and compute always goes up.

These $ figures based on compute credits or the investor's own hardware seem pretty sketchy.

10 gigawatts sounds ridiculously high, how can you estimate the actual usage? I guess they are not running at capacity 24/7 right? Because that would be more than the consumption of several European countries, like Finland and Belgium:

https://en.m.wikipedia.org/wiki/List_of_countries_by_electri...


What does this mean? "To support the partnership, NVIDIA intends to invest up to $100 billion in OpenAI progressively as each gigawatt is deployed."

Nvidia is buying their own chips and counting it as a sale. In exchange they’re maybe getting OpenAI stock that will be worth more in the future. Normally this would count as illegally cooking the books I think but if the OpenAI investment pays off no one will care.

What if it doesn't?

Still unlikely they’d get prosecuted because they’re not trying to hide how they’re doing this and there’s no reasonable expectation that OpenAI is likely to fold. I doubt they’d improperly record this in their accounting ledger either.

It's a good question since it's probably the 99% case.

Perhaps it means OpenAI will pay for the graphics cards in stock? Nvidia would become an investor in OpenAI, thereby moving up the AI value chain as well as ensuring demand for GPUs, while OpenAI would get millions of GPUs to scale their infrastructure.

They're investing in kind. They're paying with chips instead of money

They will transfer the money to buy their own chips right before each chip is purchased

That they will invest $10 in OpenAI for each W of NVIDIA chips that is deployed? EDIT: In steps of 1GW, it seems.
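
Spelled out, assuming the full "up to $100 billion" lands against the full 10 GW (the "up to" leaves room for less):

  # Implied investment per watt from "$100B for 10 GW", deployed in 1 GW steps.
  investment_usd = 100e9
  deployment_w = 10e9
  print(f"${investment_usd / deployment_w:.0f} per watt, "
        f"i.e. ${investment_usd / 10 / 1e9:.0f}B per 1 GW tranche")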

I am confused as to what the question is.

so nvidia's value supported by the value of AI companies, which nvidia then supports?

It means this is a bubble, and Nvidia is hoping that their friends in the White House will keep them from being prosecuted, or at least from substantial penalties.

> What does this mean?

> to invest up to

i.e. 0 to something something


So OpenAI is breaking up with Microsoft and Azure?

They've been sleeping with Oracle too recently, so I don't think they're breaking up, just dipping a toe in the poly pool

It's more resembling a Habsburg family tree at this point

https://bsky.app/profile/anthonycr.bsky.social/post/3lz7qtjy...

(pencil in another loop between Nvidia and OpenAI now)


In true Bay Area fashion?

I would say Microsoft cheated on OpenAI first ;)

https://www.reuters.com/business/microsoft-use-some-ai-anthr...


Are Anthropic and Google breaking up with Nvidia?

It was more like Microsoft refused to build the capacity OpenAI was asking for, so they gave them their blessing to buy additional compute from others.

It does seem like Satya believes models will get commoditized, so no need to hitch themselves with OpenAI that strongly.


Most salient point of all of this: gigawatts is pronounced "jigga watts", like Back to the Future's Doc does correctly.

Did I miss the part where they mention the 10 large nuclear plants needed to power this new operation? Where's all the power coming from for this?

Also water. You will be rationed, OpenAI will not.

https://www.newstarget.com/2025-08-02-texas-ai-data-centers-...


I assumed this headline was not aimed at the public, but at some utility they want to convince to expand capacity. Otherwise, bragging about future power consumption seems a bit perplexing.

Or to assuage investors participating in the OpenAI secondary on the issue of cash burn.

Also, the fact they they announce not how much computing power they are going to deploy but rather how much electricity it's going to use (as if power usage is a useful measurement of processing power) is kind of gross.

"Good news everybody, your power bills are going up and your creaking, chronically underfunded infrastructure is even closer to collapse!"


Build this thing in the middle of the desert and you would need around 100 sq miles of solar panels + a fuckload of batteries for it to be energy independent. The solar farm would be around $10 billion, which is probably far less than the GPUs cost
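
A parametric version of that sizing, with the capacity factor, land use, cost, and storage assumptions spelled out so they can be argued with; treat the output as an order-of-magnitude check, not a design:

  # Order-of-magnitude solar + storage sizing for a 10 GW continuous load.
  # All inputs below are rough assumptions.
  load_gw = 10
  capacity_factor = 0.25          # assumed for desert single-axis tracking
  acres_per_mw = 5                # assumed land use including spacing
  usd_per_w = 1.0                 # assumed utility-scale installed cost
  storage_hours = 14              # assumed overnight coverage
  usd_per_kwh_storage = 200       # assumed installed battery cost

  panel_gw = load_gw / capacity_factor
  acres = panel_gw * 1000 * acres_per_mw
  sq_miles = acres / 640
  panel_cost = panel_gw * 1e9 * usd_per_w
  battery_cost = load_gw * storage_hours * 1e6 * usd_per_kwh_storage   # GWh -> kWh via 1e6

  print(f"~{panel_gw:.0f} GW of panels, ~{sq_miles:.0f} sq mi, "
        f"~${panel_cost / 1e9:.0f}B panels + ~${battery_cost / 1e9:.0f}B batteries")

Under these particular assumptions the footprint and cost come out several times larger than the parent's figures, but the broader point can still hold: even then, the solar-plus-storage bill is small next to $150-200B of GPUs.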

Dissipating 10GW of heat is also a challenge in a sunny, hot, dry environment.

$10 billion is small change compared to the estimated all-inclusive cost of $10 billion for EACH 500MW data center ... $200 billion for 10GW.

Won't get you the necessary four nines of uptime and energy, sadly. I'm still 100% for this -- but we need another model for energy delivery.

100 sq km should suffice

[flagged]


100 square miles is small in the American southwest. And a solar farm would disrupt the ecosystem much less than many other land uses. Adding shade and cover will benefit many species.

A few nuclear power plants have a much smaller environmental impact.

Burning 10GW of fossil fuels for 20 years while waiting for the nuclear plants to finish building will do far more damage to the environment than 100 square miles of shade in the desert.

This is a man's world

On sand?

Environmentalists are just against progress. A few desert species going extinct is not a big deal. It's an arid wasteland. When we eventually terraform it (with desalinated water from solar / fusion) those species are going to die out anyway.

> Environmentalists are just against progress

Just pure anti-life on earth talk


Consumer electric grids.

Exactly this. This is essentially a new consumer tax in your electric bill. The buildout of the grid is being put on consumers as a monthly surcharge through rising electricity costs. Everyone in the country is paying for the grid infrastructure to power these data centers, owned by trillion-dollar companies who aren't paying for their own needs.

Yep. Consumers are screwed and $500/month electric bills are coming for the average consumer within a year or two. We do not have the electricity available for this.

I'm pretty average, living in a small home, and my electric bill is already >$500/mo in the summer, and that's with the A/C set at 76F during the day.

Where do you live? How old is your house?

500 is insane.


I don't expect him to tell you where he lives but my bill EXPLODED recently due to what I now know is data center demand.

How many kWh is that? At those amounts solar panels seem like a no-brainer business case?

How will this actually be powered? Just seems like we’re powering an ecological disaster.

So Nvidia is giving OpenAI money so OpenAI can buy more Nvidia GPUs?

Telecom vendors were doing exactly this before the dotcom crash of 2000

The infrastructure and energy required to power these systems at scale are critical. I hope we carefully consider the environmental impact of building and operating data centers. I’m optimistic that we will develop efficient and sustainable solutions to power the data centers of today and the future

I'm old enough to remember when vendor financing was both de rigueur and also frowned upon... (1990s: telecom sector, with all big players like Lucent, Nortel, Cisco, indulging in it, ending with the bust of 2001/2002, of course)

This absolutely feels like the Telco Bubble 2.0, and I've mentioned this on HN as well a couple times [0]

[0] - https://news.ycombinator.com/item?id=44069086


For sure a great infrastructure buildout -- let's hope the leftovers are better energy infrastructure, so that whatever comes next in 7 years after the flameout has some great stuff to build on (similar to telco bubble 1.0), and is less damaging to planet earth in the long arc.

Yep. The Telco Bust 1.0 along with the Dotcom Bust is what enabled the cloud computing boom, the SaaS boom, and the e-commerce boom by the early-mid 2010s.

I think the eventual AI bust will lead to the same thing, as the costs for developing a domain-specific model have cratered over the past couple years.

AI/ML (and the infra around it) is overvalued at its current multiples, but the value created is real, and as the market grows to understand the limitations but also the opportunities, a more realistic and permanent boom will occur.


Yeah - no doubt on the eventual productivity gains due to AI/ML (which are real, of course, just like the real gains due to telecom infra buildup), but must an economy go through a bubble first to realize these productivity gains??

It appears that the answer is "more likely yes than not".

Counting some examples:

- self driving / autonomous vehicles (seeing real deployments now with Waymo, but 99% deployment still ahead; meanwhile, $$$ billions of value destroyed in the last 10-15 years with so many startups running out of money, getting acquihired, etc)

- Humanoid robots... (potential bubble?? I don't know of a single commercial deployment today that is worth any solid revenues, but companies keep getting funded left / right)


Happened with the electrical grid too.

I think you make a very interesting observation about these bubbles potentially being an inherent part of new technology expansion.

It makes sense too from a human behavior perspective. Whenever there are massive wins to be had, speculation will run rampant. Everyone wants to be the winner, but only a small fraction will actually win.


There are venture capital firms.

Nvidia is transforming into a venture GPU company. X GPUs for Y percent.


I’m really curious how this affects the consumer GPU market over the next few years. Sure, there has been a GPU shortage for a few years now but if this continues, there should be an absolute surplus of obsolete-gen enterprise GPUs flooding the market, right? Any ideas what limitations and benefits these cards might have for an enthusiast?

These systems aren't easily converted into desktop-style GPUs, so it may not trickle down the way we hope

I assume surplus DGX A100 are already out there but they consume kilowatts so enthusiasts can't even plug them in.

I feel like we're lucky Nvidia even sells consumer GPUs any more. At this point it's just a distraction to them and takes away resources they could be devoting to higher value hardware.

And the data center-class hardware doesn't do well in a home environment. It's not good for gaming. It runs hot and uses a ton of energy. Not to mention, silicon that is running hot 24/7 for years probably isn't the best thing to own second hand.


probably explains why we have "just rent H100 in the cloud, duh" influencers under every hardware building post on hn now

Folks old enough to have been around in 2000 have seen this movie before.

If this was such a great business, money would be coming from outside and Nvidia would be using its profits to scale production. But they know it's not, and once the bubble pops, their profit margin evaporates in months. So they keep the ball rolling - this is pretty much equivalent to buying the cards from ... themselves.


Yeah, who cares about the environment... who needs water and energy, if your AI agent can give you a better pep talk

What's the point of having access to smart assistants if it doesn't improve your basic needs or your quality of life? Who is spending now? Only high-income households, while the majority is struggling with high utility bills and grocery prices - very basic needs.

Don't forget about better filters for influencers talking about the climate crisis!

When fucking up the human mind isn't enough. This is really villainous.

And before you think that's nonsense, let's not forget these people are accelerationists. Destroying the fabric of society is their goal.


Wouldn't Nvidia be better served investing the $100B in expanding GPU manufacturing capacity?

They’re already spending as much money as they possibly can on growth, and have no further use for cash currently - they’ve been doing share buybacks this year.

By investing in TSMC? By buying TSMC? I don't think $100B would buy them enough current generation capacity to make a difference from scratch.

They don't have to pick just one.

This is throwing more cards on the house of cards. Nvidia is “investing” in OpenAI so OpenAI can buy GPUs from NVidia. Textbook “round tripping.”

I generally like what’s been happening with AI but man this is gonna crash hard when reality sets in. We’re reaching the scary stage of a bubble where folks are forced to throw more and more cash on the fire to keep it going with no clear path to ever get that cash back. If anyone slows down, even just a bit, the whole thing goes critical and implodes.


It seems similar to how GE under Jack Welch would use their rock solid financials to take on low cost debt that they could lend out to suppliers who needed finance to purchase their products.

The biggest difference here though is that most of these moves seem to involve direct investment and the movement of equity, not debt. I think this is an important distinction, because if things take a downturn, debt is highly explosive (see GE during the GFC) whereas equity is not.

Not to say anyone wants to take a huge markdown on their equity, and there are real costs associated with designing, building, and powering GPUs which needs to be paid for, but Nvidia is also generating real revenue which likely covers that, I don't think they're funding much through debt? Tech tends to be very high margin so there's a lot of room to play if you're willing to just reduce your revenue (as opposed to taking on debt) in the short term.

Of course this means asset prices in the industry are going to get really tightly coupled, so if one starts to deflate it's likely that the market is going to wipe out a lot of value quickly and while there isn't an obvious debt bomb that will explode, I'm sure there's a landmine lying around somewhere...


GE also created their "rock-solid financials" by moving money around as necessary to make earnings projections.

Except I'm guessing they are not selling their equity, they are making debt backed by their equity?

Yea this is speculation, you might be right. I'm not sure how exactly they're doing this but my thinking would be.

1. Selling equity (probably good).

2. Financed with actual profits over time showing up as lower margins on the income statement (probably good).

3. Issuing debt backed by their equity (possibly a dumpster fire).


> Financed with actual profits over time showing up as lower margins on the income statement (probably good)

would these equity investments only impact the balance-sheet as financial investments - why would they show up as lower margins on income statement ?


> debt is highly explosive (see GE during the GFC) whereas equity is not.

Not as explosive as debt, but I'd venture to say that nowadays equity is a lot more "flammable" compared to 2008-2010, as in a lot more debt-like (which I think partly explains the current equity bubble in the US).

As in, there are lots and lots of investment funds/pension funds/other such like financial entities which are very heavily tied to the "performance" of equity, and I'm talking about trillions (at this point) of dollars, and if that equity were to get a, let's say, 20 or 30% hair-cut in a matter of two-three months (at most), then we'll for sure be back in October 2008 mode.


Totally agree, it might not blow up in Nvidia's face, but there's a margin call sitting around somewhere in the pile.

> As in, there are lots and lots of investment funds/pension funds/other such like financial entities which are very heavily tied to the "performance" of equity, and I'm talking about trillions (at this point) of dollars, and if that equity were to get a, let's say, 20 or 30% hair-cut in a matter of two-three months (at most), then we'll for sure be back in October 2008 mode.

Just curious, can you detail how it would fail exactly?


Anytime there's a massive drawdown in equities, an asset-liability mismatch shows up (margin calls), because someone was borrowing money to spend in the short term against the value of assets that have now disappeared.

It might not be the catastrophic cascading failure of the GFC, but someone somewhere in the pile will get exposed.


Ah yes I see. It's the idea that somewhere, somehow, there is debt that's funding all of this, even if it's very indirect.

But real GPUs are being built, installed and used. It's not paper money, it's just buying goods and services partly with stock. Which is a very solid and time honored tradition which happens to align incentives very well.

What revenues do these GPUs generate for OpenAI? OpenAI is not currently profitable, and it is unclear if its business model will ever becomes profitable -- let alone profitable enough to justify this investment. Currently, this only works because the markets are willing to lend and let NVIDIA issue stock to cover the costs to manufacture the GPUs.

That's where the belief that we are in a bubble comes from.


OpenAI is profitable if they stop training their next generation models. Their unit economics are extremely favorable.

I do buy that they are extremely over-valued if they have to slow down on model training.

For cloud providers, the analysis is a bit more complex; presumably if training demand craters then the existing inference demand would be met at a lower price, and maybe you’d see some consolidation as margins got compressed.


> OpenAI is profitable if they stop training their next generation models. Their unit economics are extremely favorable.

But OpenAI can't stop training their next generation models. OpenAI already spends over 50% of their revenue on inference cost [1] with some vendors spending over 100% of their revenue on inference.

The real cash cow for them is in the business segment. The problem here is models are rapidly cloned, and the companies adjacent to model providers actively seek to provide consumers the ability to rapidly and seamlessly switch between model providers [2][3].

Model providers are in the situation you imagine cloud providers to be in; a non-differentiated, commodity product with high fixed costs, and poor margins.

[1] https://www.wheresyoured.at/why-everybody-is-losing-money-on...

[2] https://www.jetbrains.com/help/ai-assistant/use-custom-model...

[3] https://code.visualstudio.com/docs/copilot/customization/lan...


I agree the market dynamics are weird now, I disagree that says much about the existence of other equilibria.

For example, inference on older GPUs is actually more profitable than bleeding-edge right now; the shops that are selling hosted inference have options to broaden their portfolio as the advancement of the frontier slows.

Cloud providers are currently “un-differentiated”, but there are three huge ones making profits and some small ones too. Hosting is an economy-of-scale business and so is inference.

And all of these startups you quote like Cursor that are not free-cash-flow positive are simply playing the VC land grab game. Costs will rise for consumers if VCs stop funding, sure. That says nothing about how much TAM there is at the new higher price point.

The idea that OAI is un-differentiated is just weird. They have a massively popular consumer offering, a huge bankroll, and can continue to innovate on features. Their consumer offering has remained sticky even though Claude and Gemini have both had periods of being the best model to those in the know.

And generally speaking there are huge opportunities to do enterprise integrations and build out the retooling of $10T of economic activities, just with the models we have now; a Salesforce play would be a natural pivot for them.


That's why we are seeing these insane numbers. The competition is "do or die" right now.

Zuckerberg said in an interview last week he doesn't mind spending $100B on AI, because not investing carries more risk.


This only applies if you think one of two things: first, that it is guaranteed that this specific line of inquiry will lead to a form of superintelligence or some other broadly applicable breakthrough; or second, that this form of machine learning unlocks or otherwise enables a market, otherwise inaccessible, that justifies this investment.

To date, no evidence of either exists. See Zuckerberg's recent live demo of Facebook's Ray-Ban technology, for example.


OpenAI generates plenty of revenues from their services. Don't conflate revenues with profits

I don't believe I am. Investors (value investors, not pump and dump investors) provide capital to companies on the expectation of profit, not revenue.

Sure, and as long the expected profit keeps increasing investors are happy. They don't need to make an actual profit yet.

The counterpoint to this is that while they are not profitable, the cashflow is real, and inference is marginally ROI positive. If you can scale inference with more GPUs, then eventually that marginal ROI grows large enough to cover the R&D and other expenses and you become profitable.

"Marginally ROI positive" works in a ZIRP environment. These are huge capital investments; they need to at least clear treasury return hurdles and importantly provide attractive returns.

I am fundamentally skeptical of "scaling inference". Margins are not defensible in the market segment OpenAI is in.


For some of these tech companies, their valuations let them go to the market with their equity in a way that is basically a ZIRP environment. In a way you could say this is a competitive advantage someone like Nvidia has at the moment, and so they are trying to push it.

I'm also pretty skeptical, and could imagine this whole thing blowing up, but it's not like this a big grift that's going to end up like the GFC either.


I think it's possible we are in datacenter GPU overcapacity already, and NVIDIA is burning its stock to avoid the music stopping.

It's already happening in China that datacenters are at GPU overcapacity. I wouldn't be surprised if it occurs here.


Wow, diluting stock during a bull run is incredibly short-sighted. NVIDIA is betting there will never be a downturn. If there is, the dilution causes late investors to either be left holding the bag or be forced to sell (potentially at a loss), meaning the stock has the potential to drop like a stone at the first sign of trouble.

I guess that’s why they would be gaming their numbers: to convince the next greater fools.


They're doing about a billion per month in revenue by running proprietary models on GPUs like these. Unless they're selling inference with zero/negative margin, it seems like a business model that could be made profitable very easily.

Revenue != profit, and you don't need to become net negative margin to be net unprofitable. Expensive researchers, expensive engineers, expensive capex, etc.

Inference has extremely different unit economics from a typical SaaS like Salesforce or adtech like google or facebook.


All of those expenses could be trimmed in a scenario where OpenAI or other big labs pivot to focus primarily on profitability via selling inference.

Currently, selling LLM inference is a red queen race: the moment you release a model, others begin distilling and attempting to sell your model cheaper, avoiding the expensive capitalized costs associated with R&D. This can occur because the LLM market is fundamentally -- at best -- minimally differentiated; consumers are willing to switch between vendors ("big labs", as you call them, but they aren't really research labs) to whomever offers the best model at the lowest price. This is emphasized by the distributors of many LLMs, developer tools, offering ways to switch the LLM at runtime (see https://www.jetbrains.com/help/ai-assistant/use-custom-model... or https://code.visualstudio.com/docs/copilot/customization/lan... for an example of this). The distributors of LLMs actively working against LLM providers margin provides an exceptionally strong headwind.

This market dynamic begets a low margin race to the bottom, where no party appears able to secure the highly attractive (think the >70% service margin we see in typical tech) unit economics typical of tech.

Inference is a very tough business. It is my opinion (and likely the opinion of many others) that the margins will not sustain a typical "tech" business without continual investment to attempt to develop increasingly complex and expensive models, which itself is unprofitable.


I don't disagree but you're moving the goalposts. I never said that they could achieve the profits of a typical tech business, just that they could be profitable. Also, the whole distilling problem doesn't happen if the model is proprietary.

> I don't disagree but you're moving the goalposts. I never said that they could achieve the profits of a typical tech business, just that they could be profitable. Also, the whole distilling problem doesn't happen if the model is proprietary.

In the absence of typical software margins, they will be eroded by providers of "good enough" margins (AWS, Azure, GCP, etc.) who gain more profit from the bundled services than OpenAI does from the primary services. This has happened multiple times in history, either resulting in smaller businesses below IPO price (such as Elastic, Hashicorp, etc.) or outright bankruptcy.

Second, the distilling happens on the outputs of the model. Model distillation refers to using a model's outputs to train a secondary, smaller model. Do not mistake distillation for training (or retraining) sparse models. You can absolutely distill proprietary models; in fact, that is how the DeepSeek-R1-Distill-Qwen and DeepSeek-R1-Distill-Llama models were trained. This also happens with Chinese startups distilling OpenAI models to resell [2].

The worst part is OpenAI is already having to provide APIs to do this [1]. This is not ideal, as OpenAI wants to lock people into (as much as possible) a single platform.

I really don't like OpenAIs market position here. I don't think it's long term profitable.

[1] https://openai.com/index/api-model-distillation/

[2] https://www.theguardian.com/technology/2025/jan/29/openai-ch...


> Revenue != profit

Indeed. And even if that revenue is net profitable right now (and analysts differ sharply on whether it really is), is there a sustainable moat that'll keep fast-followers from replicating most of OpenAI's product value at lower cost? History is littered with first-movers who planted the crop only to see new competitors feast on the fruit.


And even if they are selling inference at negative margin, they'll make it up in scale!

These kinds of phrases are...eerily similar to the phrases heard right before...the .com bust. If you were old enough at the time, that's exactly what the mindset was back then.

The classic story of the shoeshine boy giving out stock tips...and all that.

We all know how that turned out.


> OpenAI is not currently profitable, and it is unclear if its business model will ever becomes profitable -- let alone profitable enough to justify this investment.

Well, yes. Which again is how venture capitalism has worked for ... is it decades or centuries? There is always an element of risk. With pretty solidly established ways to handle: expected value, risk mitigation etc.

I haven't lived through the dot com bubble (too young) but i've read about it. The absolutely insane ways they were throwing money at startups were... just insane. The potential of the technology is the same now and then: AI vs Internet. It wasn't the tech that failed the last time, it was the way the money was allocated.

The math is actually quite mathing this time around. Most AI companies have solid revenues and business models. They aren't turning a profit because (like any tech startup) they chose to invest all their revenue plus investments into growth, which in this case is research and training new models. They aren't pivoting every 6 months, aren't burning through cash reserves just to pay salaries, and they've already gone through train/deploy cycles several times each, successfully.

Are they overvalued? shrug that's between them and their investors, and we'll find that out eventually. But this is not a bubble that can burst as easily as last time, because we're all actually using and paying for their products.


Amazon lost money every year for the first 9 years of its existence, and people said it was a bubble the entire time.

Amazon was gross margin profitable -- and significantly so -- the entire time.

It just turns out they were a server farm subsidizing a gift shop.


Yeah this.

Ultimately the marketplace was just an investment that had embedded within it a real option for AWS. Magical really.


But GPUs are a depreciating asset - if there's a bubble burst and your 5 million GPUs are idle for the next few years before demand picks up again, they'll be pretty outdated and of limited use.

Infrastructure tends to have much longer lifetimes. A lot of the telco infrastructure "overbuilt" during that boom is still used today - you can always blow new fibre, replace endpoints and all that without digging everything up again, which was the largest cost in the first place. Sure, in the above example you'll still have the datacentre itself (and things like electricity connections and cooling) that can be reused, but that's a relatively small fraction of the total cost comparatively.


No one's implying it's fake money or resources, only that with no clear path to profit, eventually the money will stop flowing and valuations will implode.

It's a global AI race; there is more at stake than profit.

There was also a global AI race in the 80's...

> But real GPUs are being built, installed and used.

At this moment they could as well be called bitcoin or tulips....No different from Chinese ghost towns. Real houses being planned and built... And let's not talk to accountants about the depreciation rates on GPU Hardware that is out in 8 to 12 months...



Same thing with the 1.3 billion EUR investment of ASML into Mistral. ASML -> Mistral -> NVIDIA -> TSMC -> ASML -> ...

It would be amusing if it also wasn't so accurate.

I didn’t see the step where Larry has to sell any stock, and hence puts downward price pressure on Oracle share prices.

What is the source of the cash in steps 3, 4, and 7?


He doesn't have to sell. He can finance the deal with debt backed by his newly risen stock as collateral. Then the debt is used to further inflate the price of the stock.

The flywheel metaphor is pretty apt.


It is us, index fund owners :clown:

Disclaimer: I also have a small amount of money in vanguard IRA


According to the image of the steps, Oracle’s share price is going up, presumably more than it would have without engaging in these steps. How can that cost index fund owners? They would be benefiting from the share price increase.

Ultimately, debt will fuel this. Oracle can't pay with cashflow.

Credit.

Going to leave this link here: https://www.hussmanfunds.com/comment/mc250814/

By many different measures, we are at record valuations (though it must be said, not by P/E). That tends not to end well. And housing prices are based on when mortgages were at 3% and have not reset accordingly. We are in everything-bubble territory and have been.


> Tends not to end well.

I'm no financial guru but this time around the boom/bust cycle, there's a new, additional factor that's concerning. Even though I sold my individual tech company shares a few years ago and diversified all my equity holdings in broad market ETFs like VTI, the so-called "Magnificent 7" tech companies have inflated so much, they now occupy a disproportionate percentage of even broad market ETFs which hold ~5,000 stocks based on their market caps. The obvious issue being their share prices all having a significant component elevated by the same thing - unrealistic AI growth expectations.


Two words. Passive flows.

Where do you think your 401K money is going...right into the S&P 500...and who gets the lion's share of allocation out of that? The Mag7 et al.

If you chart the last 25 years, Gold (yes, that one...the useless metal) has outperformed the S&P (and it's making new highs even today). What does that say about hard assets vs these companies?


Housing prices have not reset because of supply and demand. People are sitting on those 3 percent mortgages and not selling.

People are quite bearish and the stock market is making all time highs. This is actually a very good sign, because we are far from any euphoria.

Always keep in mind the old saying: pessimists get to be right and optimists get to be rich.


Only if they're optimistic at the right time and not the wrong one.

Timing the market is a fallacy. Time in the market is what builds wealth.

You're treating a statistical tendency as immutable law. It's true that attempting to time the market is not generally a good investment strategy, but every investment is made at some time, and some of those are very bad times to put money in the market. That it'll probably recover eventually doesn't much help if you've lost everything in the interim.

Which is why you invest a percentage of your paycheque rather than waiting to time the market.

That quote definitely has some insane survivor bias in it.

Optimists go bankrupt, and you blame it on their work ethic or something; you discard the optimists who didn't really succeed and cherry-pick the ones who happened to be right...

It's classic survivorship bias.

I am pessimistic on US stocks because they are so concentrated on AI for returns, and it's definitely a bubble, or approaching that territory; there is really no denying it from what I observe.

Your comment really is just off-putting to me, because I feel like it's copium that is going to be inhaled by the new generation, and then when we fail (and let's be honest, failure is a natural part of life) we are going to blame ourselves, and that's just really depressing.

I'd rather be right than rich. Maybe my definition of rich is something I can get through hard work while still being a pessimist (just enough money to have freedom, lol).

I don't want to make billions or hundreds of millions. I don't want to build a VC-funded disaster for humanity in the name of shareholders, whether it's an ad dystopia or AI nightmare fuel. I'd rather leave an imprint on humanity other than my bank account number, but maybe that's me being "optimistic".

Sorry, but your comment truly ragebaited me... I have very strong opinions in this regard.


> I am pessimistic in US stocks because they are so concentrated on AI

The Russell 2000 index just made an all-time high. The bull market is diverse and global. Indexes of many countries are also at all-time highs.


I am pessimistic on the US S&P 500 for the most part, actually, given how concentrated it is in AI (refer to that Hank Green video).

I also didn't know that the other world's stocks are doing fine actually. but maybe there is a difference in economy and stocks at this point...

I believe we can all surely agree on the legendary John Bogle's philosophy and, in the current day and age, realize that US S&P stocks are too concentrated in AI and that world stocks can be better...

Regarding the Russell 2000 index: I feel like a lot of the money trickles down from the AI hype, but it's honestly great that the Russell is doing well.

The point I am trying to make is that, at least for the US right now, the political system is so shaky that I can't trust its economic system, and there is no denying that if the AI bubble bursts, it would bleed into the whole economy, including the Russell.

There was a great Hank Green video about this concept which I recommend: https://www.youtube.com/watch?v=VZMFp-mEWoM

Also, a lot of countries are definitely in turmoil right now, so I am actually surprised by your statement that world markets are doing so well; maybe stock markets are just another asset class that has gotten so inflated it is out of touch with ground reality... (Something I heard in an Atrioc video)

I am definitely a bit surprised to hear that world stocks are doing fine despite all the bloodbath of tariffs and the political issues the world is facing right now...


Politics is a distraction and largely irrelevant to investing.

The stock market has so much money going into it that it is in a bull market. Because people have nowhere else to put their money into (real estate is dead atm).

You are letting your political biases poison your financial decisions.


It isn't even a political bias; rather, we can't deny that the economy feels like kissing the ring, whether it's the US buying Intel stock or sort of forcing Nvidia to buy some Intel stock, etc.

And I feel like its in a bull market because of AI Hype which was the main comment of the original parent to which you responded I think...

If this AI hype fails to deliver, the Magnificent 7 will literally lose a huge amount of money, which will make stockholders feel less wealthy, which will make them spend less, and that will have a drastic impact on the WHOLE economy.

Yes its in a bull market but I feel like I don't want to find out if I am in the peak of a bull market for an AI craze y'know?

And I am not advocating against stocks omg, I am just saying that world stocks are better in current landscape and I doubt if its poisoning my financial decisions.

NO, I don't want all of my saved money to go into an index that is going to be heavily dictated by the future of AI, which I and many others presume to be a bubble. I would much rather invest in index funds that target the world, heck maybe even index funds that target every country ex-US.

My point is that the bubble will burst and then at least the S&P / Nasdaq will definitely bleed.

Either we can talk about if you think its a bubble or not, since I am not comfortable investing in a bubbly situation no matter how lucrative it becomes y'know?

What are your thoughts on it?


You can find excuses not to invest at any time. Easiest thing in the world has always been finding an excuse not to invest.

The Mag7 are some of the most profitable and well-run companies in history, reinvesting their insane profits.

No other country has public markets as developed, regulated and liquid as the US. Likely you are just investing into the unknown with a ton of risk factors you are not aware of. In places outside of the US politics actually is a significant factor in investing.


At least the deal is denominated in watts rather than currency which may hyperinflate soon.

I do not think the leveraging is going to end there. I suspect this will be used to justify/secure power generation investments, possibly even nuclear. Likely via one or more of the OpenAI/Altman adjacent power startups.

On the bright side, if lots of power capacity is added and most of the GPUs end up idle, then there might be cheap power available for other uses.

Power generation is not a monolithic enterprise. If more supply is built than needed, certain suppliers will go bankrupt.

They may, but that doesn’t mean that the capacity disappears. It may require some assumptions about USG willingness to backstop an acquisition but it’s not a significant leap to think that the generation capacity remains in (more capable?) hands.

Speaking of capacity, what happened to all the "dark fiber" that was supposedly built for Internet 2 or whatever? The fiber doesn't go away just because a bubble burst, right?

Railways are similar, many were built by investors who lost all their investment but the railway is still there.

What are the chances suppliers will go bankrupt but the plants get sold and still produce power?

Not if Ellison trickles it out for maximum profit.

And computing in general gets cheaper.

heating our homes next winter with clusters of h100s

Altman is all in on converting the solar system into a Dyson sphere to power OpenAI.

And it is hilarious [0]

[0] dyson spheres are a joke / Angela Collier https://youtu.be/fLzEX1TPBFM


Isn't that already happening via Oklo? Up 500%+ YTD.

This is more like reinvesting into the business as it's growing. It's a positive sum loop.

Nvidia makes money by selling to OpenAI. OpenAI makes money by selling a service to users that uses Nvidia. So Nvidia invests in the build out and expansion of the infrastructure that will use Nvidia.

This is a classic positive sum loop.

It's not that different than a company reinvesting revenue in growing the company.


But is OpenAI recouping this? I remember seeing reports a year ago that it was in the realm of $700M/mo in inference costs for them - are they earning that now?

Of course the strategy of taking a loss and reinvesting - but I don't see how OpenAI is making enough money to pay for all this, now or in the future.


It hasn't monetized any of its services. There are currently no ads. And it's not selling user data, well not yet.

That's literally hundreds of billions worth of revenue.

Just look at the options OpenAI has to generate revenue beyond subscriptions.


How would you "see" that, given that OpenAI isn't public?

Is this more of an accounting thing?

Is there some (tax?) efficiency here? OpenAI could take money from another source, pay it to Nvidia, and receive GPUs; does taking the investment from Nvidia instead act as a discount in some way?

(In addition to Nvidia being realistically the efficient/sole supplier of an input OpenAI currently needs. So this gives

  1. Nvidia an incentive to prioritize OpenAI and induces a win/win pricing component on Nvidia's GPU profit margin so OpenAI can bet on more GPUs now

  2. OpenAI some hedge on GPU pricing's effect on their valuations as the cost/margin fluctuates with new entrants
)?

It sounds like Nvidia has so much cash already that they would prefer to own x% of OpenAI instead.

It's not round tripping. Economically, Nvidia is investing property in OpenAI. It's not investing nothing, far from it.

It's interesting how deals like this are politically relevant. Nvidia refused to do deals like this (investing in companies buying large amounts of NVIDIA GPUs) after they got the hammer from Biden's SEC for self dealing due to their investment in Coreweave.

But now that there is a new SEC, they are doing a bunch of these deals. There is this one, which is huge. They also invested in Lambda, who is deploying Gigawatt scale datacenters of NVIDIA GPUs. And they are doing smaller deals too.


I'm not saying you're wrong, but with Nvidia pulling back from DGX Cloud, it makes sense that they'd continue to invest in their strategic partners (whether it's software companies like OpenAI or infrastructure vendors like Coreweave).

It forces us to confront a question.

How much investment and prioritization in scaling laws is justified?


Regardless of the scaling hypothesis, they need the compute to serve the models at scale.

Really curious how xAI is working out financially. Grok blows me away for coding.

It's interesting how profitable Tesla is despite the huge investments in their AI training infrastructure. They seem to be one of the best positioned companies that can maintain enough profitability to be able to afford their AI infrastructure without issue.

Google?

Gemini isn’t the best though, I’ve found it not nearly as good as grok

My thought is: think of all the really cheap compute that will be available to researchers. Sure, it will crash, but at the end of the day there will be a huge glut of GPUs that datacenters will be trying to rent out near cost.

I (as an uninformed rando) think that there are a lot of research ideas that have not been fully explored because doing a small training run costs $100k. If that drops to $1,000, there are a lot more opportunities to try new techniques.


With this level of power usage it might be the opposite. Once they can't subsidize the cost it might increase.

"...man this is gonna crash hard when reality sets in."

Given the amount of money invested and the expectations, the crash will be of cataclysmic proportions


I do think about this: they are creating a money-printing / cash-burning cycle where OpenAI keeps doing raises and Nvidia gets more sales...

They're just buying from and investing in each other?

I also recall reading that OpenAI is developing its own chips. What happened to that?

I don't think the NVIDIA deal is an exclusive one... They can still use TPUs and GPUs and other cloud providers if they like. They may still be planning to.

Well I hope it crashes so we can get back to normalized GPU prices.

Almost every model trained by the majors has paid for itself with inference fees.

I’m not saying there isn’t a bubble, but I am saying if the researchers and strategists absolutely closest to the “metal” of realtime frontier models are correct that AGI is in reach, then this isn’t a bubble, it’s a highly rational race. One that large players seem to be winning right now.


> Almost every model trained by the majors has paid for itself with inference fees.

Even if we assume this is true, the downstream customers paying for that inference also need it to pay for itself on average in order for the upstream model training to be sustainable, otherwise the demand for inference will dry up when the music stops. There won't always be a parade of over-funded AI startups burning $10 worth of tokens to bring in $1 of revenue.


My employer spends $100k/month or more on OpenAI fees. Money well spent, in both product features and developer process. This is just one fairly small random startup. Thousands of companies are spending this money and making more money because of it.

Curious what makes you think the money is well spent.

I can maybe digest the fact that it helped prototype and ship a bit more code in a shorter time frame... but does that bring in enough new customers, or a high-enough-value product, to justify $100k a month?!


Probably 80% of that money goes towards product features that are crucial to retention and acquisition of customers, and the business is profitable. Could those features exist without AI integrations? Some yes, but the data would be limited/inferior, other features would not be possible at all.

The 20% spent on dev tooling seems well-spent. About 10 devs on the team, and all at least 2x (hard to measure exactly, but 2x seems conservative) more productive with these tools.


Some of that $100k/month might be powering the features, rather than supporting development.

yeah it's probably 80% going to product features (processing/classifying data, and agentic workflow features), and 20% to dev tools

Isn't most of OpenAI revenue from end users and not revenue from token sales? For Anthropic, it is the opposite where almost all of their revenue comes from API usage. So even if AGI/ASI don't pan out, OpenAI will have a great consumer-focused inference business where they build useful applications (and new devices) using existing state-of-the-art LLMs and stop investing heavily in the next gen model training? I think potentially just replacing Google Search and smartphones with a new AI device would be massive consumer businesses that OpenAI could potentially go after without any major advancements in AI research.

Tokens that can be purchased for $10 may or may not provide the purchaser with almost any dollar-denominated result, from negative billions* to positive billions**.

Right now, I assume more the former than the latter. But if you're an optimistic investor, I can see why one might think a few hundred billion dollars more might get us an AI that's close enough to the latter to be worth it.

Me, I'm mostly hoping that the bubble pops soon in a way I can catch up with what the existing models can already provide real help with (which is well short of an entire project, but still cool and significant).

* e.g. the tokens are bad financial advice that might as well be a repeat of SBF

** how many tokens would get you the next Minecraft?


The thing is that AI researchers who are not focused only on LLMs do not seem to think it is in reach.

Demis Hassabis seems to think this and not only does he not focus only on LLMs, he got a nobel prize for a non-LLM system ;)

As far as I know, that Nobel prize was for being the project manager...

Ever since NLP and CSR, the two unassailable fortresses of every AI winter, fell to LLMs, I have had no doubt that AGI is within reach.

It's less "will it happen" now, and more "whether it hits in a few decades or in a few years".


Which of these model-making companies have posted a profit? I'm not familiar with any.

They account internally for each model separately; Dario said they even think of each model as a separate company on Dwarkesh some time ago.

Inference services are wildly profitable. Currently companies believe it’s economically sensible to plow that money into R&D / Investment in new models through training.

For reference, oAI’s monthly revs are reportedly between $1b and $2b right now. Monthly. I think if you do a little napkin math you’ll see that they could be cashflow positive any time they wanted to.


Again with the "this is very profitable if you don't account for the cost of creating it?"

Then my selling 2 dollars for 1 dollar is a wildly profitable business as well! Can't sell them fast enough!

Why does it seem like so many people have ceased to think critically?


OpenAI claims that each GPT generation has sold enough inference at high enough margin to recoup the cost of training it.

The company overall is still not profitable because these proceeds are being used to fund training the next GPT generation.


The idea that it’s a bubble on the frontier model side is insane. AI assisted coding alone makes it the most valuable thing we’ve ever created.

Get your head out of the proverbial, a bullshitting machine that lets some developers do things faster if they modify how they develop isn't even close to the most valuable thing we've ever created.

It easily is, nothing else is even remotely close. Software is the most valuable industry on earth and we are well on our way to fully commoditizing it.

Does this mean they pay for it through consumer GPU sales?

Last quarter, NVIDIA reported data-center revenue of $30.8 billion and gaming revenue of only $3.3 billion.

https://nvidianews.nvidia.com/news/nvidia-announces-financia...

This also explains why NVIDIA will not sell high-VRAM consumer GPUs: it would cannibalize their exorbitant data-center profits.


If they won't, somebody else will. And frankly that alone can pop their bubble - it minimizes/locks the margins they can ever charge, margins that are already negative. Apparently for every $1 made they currently pay $2.25?

No, those are a drop in the bucket.

I think you have just described the global economy.

Perhaps I should short OpenAI… would you try it?

I also ask this as a rationalist technique: the moment you ask “what outcome would you actually put money on?”, people suddenly get far more realistic about how confident they actually feel. You get a whole lot less “oh they’re DEFINITELY gonna fail/succeed!” type hyperbole when money is on the line.

> throwing more cards on a house of cards.

Nice metaphor! Huge bubbles usually get a historical name like "Tulip Craze" or "Dot Com Crash" and when this bubble bursts "House of Cards" is a good candidate.


Oh, I see now: house of cards (usual meaning) + throwing more cards on (like throwing money on the fire, and also how you destabilize house of card) + GPU cards in this case (even though they're not necessarily cards). I like it.

I just hope it works out like the dot com crash did in the long run: the internet kept going and bringing real value, it just needed a big market reset when the bubble popped.

What is this a bubble on? What does said bubble collapsing look like?

Nvidia is giving OpenAI money (through investment) to buy Nvidia chips. The bubble is that Nvidia got that money from its crazy-high stock price; the extra investment raises OpenAI's valuation, and the increased sales raise Nvidia's valuation. If the valuations see a correction then spending like this will decrease, further decreasing valuations.

Bubble collapsing looks like enshittification of OpenAI tools as they try to raise revenues. It’ll ripple all throughout tech as everyone is tied into LLMs, and capital will be harder to come by.


> The bubble is that Nvidia got that money from its crazy high stock price,

This is totally false; NVDA has not done any stock offerings. The money is coming from the ungodly amount of GPUs they are selling. In fact they are doing the opposite: they are buying back their stock because they have more money than they know what to do with.


A company buys back its stock if it thinks the stock is underpriced. Otherwise when “you have more money than you know what to with” you give it to your shareholders via a dividend. A concept mostly forgotten by tech companies.

Errm, this is nothing but a wealth transfer from the shareholders who sell at too low a price to those who don't.

A company buys back stock because distributing dividends incurs a 30% withholding tax.

Sorry guys, but this is why I don't want to see many finance-related posts here: very few know what they are talking about.

Buybacks are the preferred method of RETURNING CASH to shareholders, because dividends historically have been sticky. Buybacks are flexible.

Buybacks are also done to optimise the debt ratio, to minimise the firm's cost of capital and thereby maximise firm value.


NVDA outstanding shares are down ~1.2% year over year; the company has been buying back its own shares with —>> profits <<— to the tune of tens of billions.

Meanwhile NVDA stock is mildly up on this news, so the current owners of NVDA seem to like this investment. Or at least not hate it.

Agreed that we’ll see ad-enabled ChatGPT in about five minutes. What’s not clear is how easily we’ll be able to identify the ads.


Valuations won’t see a correction for the core players, I have no idea why people think that. Both of these companies are already money factories.

Then consider we are about to lower interest rates and kick off the growth cycle again. The only way these valuations are going is way up for the foreseeable future.


> Bubble collapsing looks like enshittification of OpenAI tools as they try to raise revenues

Why does monetizing OpenAI tools lead to bubble collapse? People are clearly willing to pay for LLMs


You read this backwards. If the bubble collapses we will see OpenAI raise capital by increasing revenue instead of investment.

AI and tech companies

Collapse might look a little like the dot com bubble (stock crashes, bankruptcies, layoffs, etc)


And it's worth reiterating that a bubble does not mean the technology is worthless. The dot com bubble collapsed despite the internet being a revolutionary technology that has shaped every decade since. Similarly LLMs are a great and revolutionary technology, but expectations, perception and valuations have grown much faster than what the technology can justify

These hype cycles aren't even bad per se. There is lots of capital to test out lots of useful ideas. But only a fraction of those will turn out to be both useful and currently viable, and the readjustment will be painful


Plus unused dark fiber = unused AI data centers and power generation capacity.

I think ultimately the conclusion that we're in a bubble is bad analysis. It jumps over a chasm and assumes that analogy to past historical situations allows us to draw conclusions.

This isn't a bubble. This is the collapse of 300 years of modern capitalism into corporate techno feudalism.

This won't crash and lead to a recession or depression. We are at the end game. Look around you. Capital is going scorched earth on labor. They are winning. Cost of living in metropolitan areas is exploding, and most of us will end up begging for scraps in peripheral areas.

This is the result of everything the elites have been working towards for the past few decades. Climate catastrophe is the cherry on the cake: they will shock therapy us into the last few bits. There will be corporate citizenship that enables one to live as a demi-god at the behest of the owners, and survival in the wastelands for the rest of us.


High-end server GPUs and AI ROI expectations.

I think everyone is underestimating the advancements in wafer tech and server compute over the last decade. Easy to miss when it’s out of sight out of mind but this isn’t going anywhere but up.

The current SOTA is going to pale in comparison to what we have 10 years from now.


> I think everyone is underestimating the advancements in wafer tech and server compute over the last decade.

What advancements?

We have done a fabulous job at lowering power consumption while exponentially increasing density of cores and to a lesser extent transistors.

Delivering power to data centers was becoming a problem 20 ish years ago. Today Power density and heat generation are off the charts. Most data center owners are lowering per rack system density to deal with the "problem".

There are literal projects pushing not only water cooling but refrigerant in the rack systems, in an attempt to get cooling to keep up with everything else.

The dot com boom and then Web 2.0 were fueled by Moore's law, by clock doubling and then the initial wave of core density. We have run out of all of those tricks. The new steps we're putting out have increased core densities but not lowered costs (because yields have been abysmal). Look at Nvidia's latest cores: they simply are not that much better in terms of real performance when compared to previous generations. If the 60 series shows the same slack gains then hardware isn't going to come along to bail out AI --- which continues to demand MORE compute cycles (tokens on thinking, anyone?) rather than fewer with each generation.


Same as they are doing with CoreWeave. In a sane world the SEC would do something, but we are past that. What about Boeing opening an airline and selling airplanes to itself?

Sure, but it's going to be a great plot for the movie that comes out in five years.

I agree that it is a bubble.

But the "round tripping" kind of makes sense. OpenAI is not listed, but if it was, some of the AI investment money would flow to it. So now, if you are an AI believer, NVidia is allocating some of that money for you.


The real question is not whether this is a bubble, since, as you mentioned, even if AI settles into somewhat useful semi-mainstream tech, there is no way any of the likely outcomes can justify this level of investment.

The real question is what are we gonna do with all this cheap GPU compute when the bubble pops! Will high def game streaming finally have its time to shine? Will VFX outsource all of its render to the cloud? Will it meet the VR/AR hardware improvements in time to finally push the tech mainstream? Will it all just get re-routed back to crypto? Will someone come up with a more useful application of GPU compute?


AI is already in semi-useful mainstream tech. There's a massive misunderstanding on this site (and other neo-luddite sites) that somehow there is no "long tail" of business applications being transformed into AI applications.

any examples?

Current systems are already tremendously useful in the medical field. And I'm not talking about your doctor asking ChatGPT random shit, I'm saying radiology results processing, patient monitoring, monitoring of medication studies... The list goes on. Not to mention many of the research advances done using automated systems already, for example for weather forecasting.

I'm getting real "put everything on the blockchain" vibes from answers like this. I remember when folks were telling me that hospitals were going to put patient records on the blockchain. As for radiology, it doesn't seem this use of AI is as much of a "slam dunk" as it first appeared[1][2]. We'll see, I guess.

Right now I kind of land on the side of "Where is all the shovelware?". If AI is such a huge productivity boost for developers, where is all the software those developers are supposedly writing[3]? But this is just a microcosm of a bigger question. Almost all the economic growth since the AI boom started has been in AI companies. If AI is revolutionizing multiple fields, why aren't relevant companies in those fields also growing at above-expected rates? Where's all this productivity that AI is supposedly unlocking?

[1] https://hms.harvard.edu/news/does-ai-help-or-hurt-human-radi...

[2] https://www.ajronline.org/doi/10.2214/AJR.24.31493

[3] https://mikelovesrobots.substack.com/p/wheres-the-shovelware...


Ok, but I am asking for uses for LLMs specifically.

Of course I agree ML has already helped in many other areas and has a bright future. But the thing everyone is talking about here is LLMs.


"The bubble will pop any minute now, any second, just you wait" is cope.

Even if AI somehow bucks the trend and stops advancing in leaps? It's still on track to be the most impactful technology since smartphones, if not since the Internet itself. And the likes of Nvidia? They're the Cisco of AI infrastructure.


The dot com bubble popped. It doesn't mean that the internet wasn't successful, just that people got way too excited about extrapolating growth rates.

AI is here to stay, but the question is whether the players can accurately forecast the growth rate, or get too far ahead of it and get financially burnt.


The importance of the Internet didn't prevent the .com bubble from bursting.

Does anyone in the finance business know how legal all this is? I am hearing terms like "round tripping" being thrown around: a practice where a company sells and buys back its own product to artificially inflate revenue.

I'm asking because it's not just OpenAI that they are apparently doing this with; it's with multiple other major GPU providers, like CoreWeave.

And it's just being done all out in the open? How?


IANAL but you can do pretty much anything as long as it's disclosed. The only problem with round-tripping is doing it secretly.

As an investor you may decide that round-tripping is dumb but in that case your recourse is to sell the stock.


First of all, it's not 'textbook round tripping' at all. The parent commenter is dead wrong but HNers upvote when they see "AI is a bubble."

Textbook round tripping is like: OpenAI buys GPUs from Nvidia, and the only reason it buys these GPUs is to resell them back to Nvidia, or just do nothing. It doesn't make it round tripping just because OpenAI is taking investment and buying stuff from Nvidia at the same time.

Unless you really believe OpenAI has no intention to use these GPUs for other purposes (like training GPT-6. I know, a crazy idea: OpenAI will train and release a model), it's not round tripping.


It's not just about OpenAI, though, even though they have the biggest/flashiest deal. The other, more obvious example is CoreWeave.

> OpenAI buys GPUs from Nvidia. And the only reason it buys these GPUs is to resell it back to Nvidia

Funny you should say this. Nvidia having those GPUs rented back to it is also something that's happening.

https://www.kerrisdalecap.com/wp-content/uploads/2025/09/Ker...

"As detailed by The Information, in early 2023 Nvidia invested $100 million in equity and signed a $1.3 billion rental agreement through 2027, under which it rents back GPUs from CoreWeave to support internal R&D and its DGX cloud offering."

"CoreWeave is not the only neocloud to benefit from Nvidia’s strategic support. Nvidia has actively supported an ecosystem of emerging AI infrastructure providers – including Lambda, Nebius, and Applied Digital –"

They are quite literally buying GPUs only to rent them right back to Nvidia.

And these are just the public deals. Is Nvidia systematically selling GPUs and having them rented back to it by every major GPU cloud provider?

https://www.investing.com/analysis/coreweave-nvidia-partners...

"This deep alliance culminates in the new $6.3 billion agreement. The deal’s most critical component is a strategic commitment from NVIDIA to purchase any of CoreWeave’s unsold cloud computing capacity through April 2032"


How I see it - the people with the money make the rules, why would they make rules against themselves?

Yes, but my point is that this almost feels like an Enron case. Things were fine, until they weren't. And then in retrospect the fraud is obvious.

I'm just surprised that nobody is yelling from the rooftops about practices that are just so out in the open right now.


You know you can sell inferencing at near 100% margins, right? More, even.

Smells like the Yahoo-driven 2000 bubble. Definitely short every ancillary business involved.

I feel like data center deployments are the new metric that companies choose to show growth, vs. the old headcount growth.

We are definitely closer to the top in this market. Do people even realize what they’re predicting in terms of energy use? It’s going to be a wasteland territory sooner than people think.

It's going to go 10x from where it is now.

Next Year: OpenAI announces it is seeking funding for a Dyson Sphere

Where do I sign? Finally back to space race instead of era of social network degradation.

I don't think it's a good idea to have the same sociopaths who brought us the current status quo be propping up a new space race...

The production rate of ~~paperclips~~ tokens isn't growing quickly enough!

Life was so much cheaper in the '80s, when you could travel in time with just 1.21 GW.

What's in it for Nvidia? At the recent $300B valuation, 25% equity?

Stating compute scale in terms of power consumption is such a backwards metric to me, assuming that you're trying to portray is as something positive.

It's like selling steel by the average fractional number of mining deaths that went into producing it. Sure, at a given moment there will be some ratio between average deaths and steel, but that's a number that you want to be as low as possible.


Stating compute scale in terms of power consumption is exactly how one looks at data centers or capacity planning right now though. It's the major constraint.

It's just a different abstraction level.


Anyone else find it fascinating that gigawatt/unit of power is the metric used for this deal?

I only have a 600W computer.

Mine is using 9W according to Coconut Battery and working fine for most purposes.

Does this affect OpenAI’s renegotiation of their deal with Microsoft?

I am wondering, do they still use 48V in those computers? That's a lot of amps.
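
For a rough sense of scale: at a 48V bus, even a single modern AI rack pulls thousands of amps. A minimal back-of-the-envelope sketch in Python, assuming ~120 kW per rack (an illustrative figure, not anything from the announcement):

  # Rough current draw on a 48V rack bus.
  # The 120 kW per-rack figure is an assumption for illustration only.
  rack_power_w = 120_000   # assumed ~120 kW AI rack
  bus_voltage_v = 48       # common DC bus voltage in servers

  current_a = rack_power_w / bus_voltage_v
  print(f"~{current_a:,.0f} A per rack at {bus_voltage_v} V")  # ~2,500 A

So yes, a lot of amps, which is one reason higher-voltage DC distribution keeps getting discussed for these deployments.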

I look forward to subsidizing this effort with my skyrocketing home power bill.

The Earth is getting warmer, and we're spending a lot of money to make it warm 10GW faster.

I do think that us humans are gonna cook ourselves on this planet.

However at least AI is doing _something_ with the energy. Cryptocurrency is such a fucking useless waste of energy I'd take anything over it.


And the bubble keeps bubble-ing

Perpetual Money Machine

The good thing about this, is that when the AI bubble bursts, we will have a lot of energy infrastructure that wouldn't be built otherwise.

Nvidia if you're listening give me 10K and i'll bu...*invest 10K+ 10 euro worth of cash in your product.

Very foolish of them not to leverage SoftwareFPU. And with minimal effort Performas are rackable.

so like 3 racks of H100s?

There's a worrying lack of structural integrity building up in this hype bubble, and this adds more fuel to the fire.

You essentially have Nvidia propping up its own valuation here by being its own customer. If they sold a bunch of H100s to themselves and then put it as revenue on their books they'd be accused of fraud. Doing it this way is only slightly better.


They should spend gigawatts on something more useful instead.

Where does this fit in with the $300 billion partnership between OpenAI and Oracle? You know, the one that also hasn't happened yet and catapulted Oracle's stock price through the stratosphere last week? Is that also getting built or is OpenAI partnering with Nvidia to get access to the GPUs that neither they nor Oracle currently own?

Yaay, one step closer to torment nexus.

low effort comment, whose content is a stale reference to other low effort memes

Ben Thompson and Doug O'Laughlin ( https://stratechery.com/2025/the-oracle-inflection-point-app... (paywall), https://www.fabricatedknowledge.com/p/capital-cycles-and-ai ) are calling it a bubble, largely because we've entered the cycle where cash flows aren't paying for it, but debt is (see Oracle: they won't be able to pay for their investment with cash flow).

I think even Byrne Hobart would agree (from his interview with Ben): -- Bubbles are this weird financial phenomenon where asset prices move in a way that does not seem justified by economic fundamentals. A lot of money pours into some industry, a lot of stuff gets built, and usually too much of it gets built and a bunch of people lose their shirts and a lot of very smart, sophisticated people are involved with the beginning, a lot of those people are selling at the peak, and a lot of people who are buying at the peak are less smart, less sophisticated, but they’ve been kind of taken in by the vibe and they’re buying at the wrong time and they lose their shirts, and that’s really bad. --

This is a classic bubble. It starts, builds, and ends the same way. The technology is valuable, but it gets overbought/overproduced. Still no telling when it may pop, but remember asset values across many categories are rich right now and this could hurt.


This is a dot-com level bubble.

Dotcom++

To the people who are calling this evidence of a bubble: There is no credible indication that AI in general is a bubble, even if not all investments will make sense in retrospect. Quite the opposite, the progress in the field over the last few years is staggering. AI systems are becoming superhuman at more and more tasks. It's only a question of time till AI will outperform us at everything.

> There is no credible indication that AI in general is a bubble, even if not all investments will make sense in retrospect.

If you add up all of the contracts that OpenAI is signing, it's buying something like $1 trillion/year worth of compute. To merely break even, it would have to make more money than literally every other company on the planet, fairly close to twice the current highest revenue company (Walmart, a retailer, which, yeah, there's a reason that has high revenue).


They are aiming at being the first to develop an AGI and eventually superintelligence. Something that can replace human workers. Walmart is small fish in comparison. OpenAI is currently in the lead, so their chances are decent.

This is an argument that OpenAI needs to achieve a supernatural outcome in order to be a financial success.

So you think superhuman intelligence is supernatural.

If you want to substitute "science fiction" that's fine too. We generally don't bank real investment expectations on science fiction outcomes. The positive expectation scenario you've provided is "OpenAI obsoletes workers, to the extent that Walmart is small fish". That's a sci-fi outcome, not a rational expectation.

You would have called ChatGPT or Dall-E science fiction just shortly before it actually came out.

There is no indication of being in a bubble when you're actually in one. It's only after the bubble pops that people recognize it in hindsight. Otherwise there would be no bubbles and we wouldn't see large institutions fall for this crap.

What is a credible indication? Who is credible? It's all subjective. It's possible to fool yourself endlessly when financial incentives are involved. The banks did it with mortgages.


This is awful. You should know the private companies building these datacenters often get back-door deals with PUCs. They do NOT pay their fair share for their consumption, and the extra cost is shouldered by the general ratepayers.

More degenerate "privatizing the profits, socializing the losses" behavior. The American public continues to get bent by billionaires and continues to elect folks that will gladly lube them up in preparation for that event.



"Hey, there's a bubble"

If I had shovels to sell, I'd definitely announce a strategic partnership to have a huge quarry dug by hand.

Seriously, is there anyone in the media keeping unbiased tabs on how much we're spending on summarizing emails and making creatives starve a little more?


Ed Zitron is an AI skeptic from the market perspective, highly recommend his stuff. It’s definitely not comforting to read, but he’s doing the math behind these headlines and it’s not adding up at all[0]

0: https://www.wheresyoured.at/


Yeah, I pretty much agree with what he's been doing. But he's not what the average person would call 'media,' so his reach is severely limited.

fancy stock buyback lol

Can we get some laws to force these companies to start subsidizing the consumer grids they're pummeling?

The electric bills are getting out of hand.


They would build their own power lines / grid if they could.

No thanks, I'll just take subsidies to my bill.

Well, I suggest you go into politics and do something about it rather than be pointlessly smug on the internet.

If you're actually interested, the reason it's important to build out the grid even more instead of "subsidizing" is because the current grid can't handle renewables well which we need to improve if we want to use sustainable energy.



... why? The current (heh) situation is that they do these big announcements and then local/state governments around the US get in a bidding war to try to shift costs from the datacenter operator onto their own citizens, in addition to offloading all of the capex.

What will happen if/when the AI bubble pops and there is far more grid capacity than demand? Power plant bailouts?

Load growth for the last 15 years has been very small but load growth going forward is expected to rise due to electrification of all things to decarbonize the economy. This means home heating, electrical cars, heavy industries, obviously data centers and the list goes on. So even if we have more grid capacity than demand (this seems unlikely), it will be used before too long.

Will just make capacity available for electrification of other infrastructure like heat pumps, electric cars, and so on. Lots of other folks would happily buy that power. The whole AI bubble is just driving up electricity pricing for everyone else at the moment.


Where is Apple? Even from an investment perspective.

My MacBook Pro runs local models better than anything else in the house and I have not yet needed to install a small nuclear reactor to run it, so, I feel like they're doing fine.

Being rationale.

Rational.

Ha, that too.

Maybe we're not sure if they're being rational or rationalizing.

Apple is doing fine, and often spends the same $100B in a year buying back Apple stock.

Losing the race

Right, but is the race to the pot of gold, or the stoplight (in which case by "losing" they save on gas)?

This is not something that can be won. The LLM architecture has been reaching its limitations slowly but surely. New foundational models are now being tweaked for user engagement rather than productive output.

$100 billion, what a number. It makes me a bit cynical. The amount of useful developments you could finance in either clean energy, education, nature preservation, medicine, anything.

But no, let's build us a slightly better code generator.

Strange times we live in...


If solar can't compete with natural gas economically, and subsidizing solar ends up disincentivizing natural gas production by artificially lowering energy prices, what's the solution here?

Your question is weird.

Solar does compete economically with methane already, and it's only going to improve even more.


If true, why aren't we mass scaling it all over the American West? We have railways running from West -> East, why not include power lines that can take power from energy farms in the West -> East? No major project in AZ, TX, or CA to give a city free power? etc

> We have railways running from West -> East, why not include power lines that can take power from energy farms in the West -> East?

Firstly, there is no such thing as an infinitely scaling system.

Secondly, because power transmission isn't moving freight. The infrastructure to move electricity long distances is extremely complicated. Even moving past basic challenges like transmission line resistance and voltage drop, power grids have to be synchronized in both phase and frequency. Phase instability is a real problem for transmission within hundreds of miles, let alone thousands upon thousands.

Also that infrastructure is quite a bit more expensive to build than rail or even roads, and it's very maintenance hungry. An express built piece of power transmission that goes direct from a desert solar farm to one of the coasts is just fragile centralization. You have a long chain of high-maintenance infrastructure, a single point of failure makes the whole thing useless. So instead you go through the national grid, and end up with nothing, because all of that power is getting sucked up by everyone between you and the solar farm. It probably doesn't even make it out of the state it's being generated in.

BTW the vast majority of the cost of electricity is in the infrastructure, not its generation. Even a nuclear reactor is cheap compared to a large grid. New York city's collection of transmission lines, transformers, etc. (not even any energy generation infrastructure, just transmission) ballparks a couple hundred billion dollars. Maintenance is complex and extremely dangerous, which means the labor is $$$$. That's what you're paying for. That's why as we continue to move towards renewables price/watt will continue to go up, even though we're not paying for the expensive fuel anymore. The actual ~$60 million worth of fuel an average natural gas plant burns in a year pales in comparison to the billions a city spends making sure the electrons are happy.
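
To make the line-loss part of this concrete, here's a minimal sketch of the resistive-loss arithmetic, with purely illustrative assumptions for the power, voltage, and conductor resistance (none of these numbers describe a real project):

  # Rough I^2 * R loss for a long high-voltage DC line.
  # All figures are illustrative assumptions, not real project data.
  power_w = 2e9          # assume 2 GW sent down the line
  voltage_v = 500e3      # assume a 500 kV DC link
  resistance_ohm = 8.0   # assume ~8 ohms of total conductor resistance

  current_a = power_w / voltage_v           # I = P / V
  loss_w = current_a**2 * resistance_ohm    # P_loss = I^2 * R
  print(f"{current_a:,.0f} A, {loss_w/1e6:.0f} MW lost ({loss_w/power_w:.1%})")
  # -> 4,000 A, 128 MW lost (6.4% of the power sent)

And that's just conductor loss; as noted above, the converters, substations, and maintenance are where most of the money actually goes.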


60% tariffs on solar components from China, an executive that is actively hostile to renewable energy, and it's still being massively scaled to some extent.

67% of new grid capacity in the US was solar in 2024 (a further 18% was batteries, 9% wind, and 6% for everything else). In the first half of 2025 that dropped to 56% solar, 26% batteries, 10% wind, and 8% everything else (gas). Source for numbers: https://seia.org/research-resources/solar-market-insight-rep...


Getting approval across multiple states for lines takes a very long time. The federal government and just about any state, municipality, or private land owner along the proposed route can block or delay it. The TransWest Express transmission line project started planning in 2007 but couldn't start construction until 2023, and it only needed to cross 4 states.

If the coast-to-coast railways hadn't been built in the past, I don't think the US could build them today. There are too many parties who can now block big projects altogether or force the project to spend another 18 months proving that it should be allowed to move forward.


It is massively scaling everywhere, and notably in Texas btw.


