
Good time to buy then; I don’t understand how stupid some traders can be.

A more efficient model is better for NVIDIA, not worse. More compute is still better given the same model. And as more efficient models proliferate, it means more edge computing, which means more customers with lower negotiating power than Meta and Google…

This is like thinking that if people need to dig only 1 day instead of an entire month to get to their nugget of gold in the midst of a gold rush, the trading post will somehow sell fewer shovels…



While everything you say may be true, this shows a fundamental misunderstanding about how the modern stock market functions. How much value a company creates is at best tangential and often completely orthogonal to how much the stock is worth. The stock market has always been a Keynesian beauty contest, but in the past few decades it has been strongly shaped and morphed by attention economies. A good example of this is DJT, a company which functionally doesn't do much of anything, but in the last year has traded at wildly differing prices. P/E, EBITDA, etc., are all useless metrics in trying to explain this phenomenon.

In other words, NVIDIA is in the red not because the company is suddenly doing worse, but because traders think other traders think it will trade down. That is a self-fulfilling prophecy, but only so long as there is sufficient attention to drive it. The same works the other way around as well: so long as there is sufficient attention to drive the AI hype train upwards, related stocks will do well too.


Everything above the street level and physical economy is becoming gambling.

There has always been a component of gambling to all investing, but that component now seems to utterly eclipse everything else. Merit doesn’t even register. Fundamentals don’t register.


I understand that part, which is why this is very much "buy the dip" for me.


It's not just the usual herding behavior though. There's a convex response to news like this because people look at higher order effects like the growth of growth for stocks. Basically the DeepSeek story is about needing 40x fewer compute resources to run inference if their benchmarks are true. The dip doesn't mean that NVidia is now doomed, it simply means that if DeepSeek is legit, you need much less NV hardware to run the same amount of inference as before. Will the demand rise to still use up all the built hardware? Probably, but we went from a very stratospheric supply constraint to a slightly less stratospheric one, and this is reflected in the prices. Generally these moves are exaggerated initially, and it takes a bit of time for people to digest the information and the price to settle. It's an oscillating system with many feedback loops.


As someone who bought NVDA in early 2023 and sold in late 2024, I can say this is wrong.

There was never a question of whether NVDA hardware would have high demand in 2025 and 2026. Everyone still expects them to sell everything they make. The reason the stock is crashing is that Wall St believed that companies who bought $50B+ of NVDA hardware would have a moat. That was obviously always incorrect; TPUs and other hardware were eventually going to be good enough for real-world use cases. But Wall St is run by people who don't understand technology.


Loving the absolute 100% confidence there and the clear view into all the traders' minds that are trading it this morning.

If they'll sell everything they make and it's all about the moat of their clients, why is NVDA still down 15% premarket? You could quote correlation effects and momentum spillover, but that is still just the higher order effects I mentioned about people's expectations being compounded and thus reactions to adverse news being convex.


> why is NVDA still down 15% premarket?

Presumably because backorders will go down, production volume and revenue won't grow as fast, Nvidia will be forced to decrease their margins due to lower demand etc. etc.

Selling everything you make is an extremely low bar relative to Nvidia's current valuation because it assumes that Nvidia will be able to grow at a very fast pace AND maintain obscene margins for the next e.g. ~5 years AND will face very limited competition.


That's literally what I wrote in my post, which the parent disagreed with. You could disagree with the part that it is because inference is now cheaper - but again I'd argue that's just a different way of saying there's no moat.


People owned NVDA because they believed that huge NVDA hardware purchases were the ONLY way to get an AI replacement for a mid-level software engineer or similar functionality.


That's basically what I wrote: "it simply means that if DeepSeek is legit, you need much less NV hardware to run the same amount of inference as before."

So I still don't understand what it is that you are so strongly disagreeing with, and I also don't understand how having owned NVidia stock somehow lends credence to your argument.

We are in agreement that this won't threaten NVidia's immediate bottom line, they'll still sell everything they build, because demand will likely rise to the supply cap even with lower compute requirements. There are probably a multitude of reasons why the very large number of people who own NVidia stock have decided to de-lever on the news, and a lot of it is simple uneducated herding.

But we are fundamentally dealing with a power law here - the forward value expectations for NVidia have exponential growth baked in to the hilt, combined with some good old fashioned tulip mania, and when that exponential growth becomes just slightly less exponential, that results in fairly significant price oscillations today - even though the basic value proposition is still there. This was the gist of my comment - you disagree with this?


Up until recently there was a belief by some investors that OpenAI was going to "cure cancer" or something as big as that. They assumed that the money flowing into OpenAI would 10x, under the assumption that no one else could catch up with them after that event and a lot of that would flow to NVDA.

Now it looks like that 10x flow of money into OpenAI will no longer exist. There will be competition and commoditization, which will cause the value of the tokens to drop way more than 40x.


You are assuming that the stock rose to this level "on merits" and now it falls due to "stupidness".

Maybe part of the growth was also "stupidness", and in that case buying the dip is a mistake because the "merit" price (value) is still way below.


I think the message everyone now accepts is: "there is no moat". It is plain stupid to think big models can be magically copy-protected - they are simply arrays of numbers, and all the components one needs to create such arrays are free and well established. This is unlike the whole infrastructure, processes, social connections, hardware, and storage one needs to recreate a service like, say, YouTube or Facebook. Large models are different - you don't need all of that - the future of LLMs is open source, like Linux.


You can buy the dip if you want, as long as you're aware that you're not betting that "stupid" traders are undervaluing NVIDIA's fundamentals. Rather, you're betting that "stupid" traders will again rally NVIDIA's share price significantly above this dip, and you will be a smart trader who will know when to sell what you bought. Good luck.


And not just that, but even if AI's future is indeed as bright as the hype says (i.e. that NVIDIA's fundamentals are solid & that the market will eventually acknowledge that after the fluctuations), they may still be wrong about the timeline.

In the .com bust you could have "bought the dip" in the early 00s right after the crash started, and it would still have taken 5 years before you weren't in the red, even on "good" (in hindsight) stocks like Amazon, eBay, Microsoft, etc. The big hype there was eCommerce - and it turned out to be true! We use eCommerce all the time now, but it took longer than predicted during the .com boom (same for broadband internet enabling the "rich web experience" - it came true, but not fast enough for some hyped companies in '00).

And if you bought some of the darling stocks back then like Yahoo or Netscape that ended up not so great in hindsight you may have never recouped your losses.


Don't try to catch a falling knife...bla bla...


Markets can stay irrational longer than you can stay solvent.


Put more simply: The stock market is not a reflection of the economy. Wall Street is not Main Street.


It's a reflection of expectations about the future economy. Obviously, such expectations are not always accurate because humans are quite fallible when trying to predict the future. This is even more true when there is a lot of hype about a certain product.

Yesterday's price of (say) NVidia was based on the expectation that companies would need to buy N billion USD of GPUs per year. Now DeepSeek comes out and makes the point that N/10 would be enough. From there it can go a few ways:

- NVidia's expected future sales drop by 90%.

- The reduced price for LLMs should allow companies to push AI into markets that were previously not cost effective. Maybe this can 10x the total available market, but since the estimated total available market was already ~everything (due to hype) that seems unlikely.

- NVidia finds another usecase for GPUs to offset the reduced demand from AI companies.

In practice, it will probably be some combination of all three. The real problems are not caused for the "shovel sellers" but for companies like OpenAI and Anthropic, who now suddenly have to compete against a competitor that can produce the same product at (apparently) a fraction of the price.


> OpenAI and Anthropic, who now suddenly have to compete against a competitor that can produce the same product at (apparently) a fraction of the price.

OpenAI and Anthropic can react by adopting DeepSeek's compute enhancements and using them to build even better models. AI training is still very clearly compute-limited from their POV (they have more data than they know what to do with already, and training "reasoning"/chains-of-thought requires a lot of reinforcement learning which is especially hard) so any improvement in compute efficiency is great news no matter where it comes from.


I think it is more expectation about expectation. You buy/sell based on whether you expect other people to expect to earn or lose. It is self-referential, hence irrational. If a new player enters and people's expectations shift, that affects your expectation of value even though the companies involved are not immediately or directly affected.


As already mentioned elsewhere, Jevons paradox will increase demand subsequent to improved efficiency. Yes, will, not can.

So if the stock market was reflective of the economy (future or the present) then stocks should go up, instead they're going down. Why? Because the stock market is not reflective of the economy.

The stock market is essentially a reflection of societal perception. DJT which was brought up earlier is a great example, because the price of DJT has next to nothing to do with Trump's businesses and almost everything to do with how he is perceived (and remember there is no such thing as bad publicity).

Personally I think the fall will be momentary and followed shortly by a climb to recovery and beyond, but who really knows.

If you don't want to lose your money: Don't let the sensationalist financial journalists and pundits get to you, don't let big red numbers in your portfolio scare you, ignore traders (they all lose their money), don't sell your stocks unless you actually need that money for something right now, re-read your investment manifesto if you have one, and maybe buy the dip for shits and giggles if you have some spare cash lying around.


I agree that it will improve demand for AI services. There's no hard rule that the demand increase will be larger than the efficiency increase though, and so total sales of GPUs may still decrease as a result.


Wall St is not Main St even if they price it perfectly.


>In other words, NVIDIA is in the red not because the company is suddenly doing worse, but because traders think other traders think it will trade down.

Well put. People need to understand that some stocks are basically one giant casino poker table. There was a comment with a link here saying that a lot of Nvidia buyers don't even know what products Nvidia is making and they don't care; they just want to buy low and sell high. Insert the old famous comment about the shoe-shine boy giving investment advice to Wall Street stock traders.


Nvidia is way too overvalued regardless of DeepSeek or the success of AI. This is just a correction (not even too big, even considering the current bubble); these traders are not stupid.


I agree with Aswath Damodaran here. NVDA is priced for perfection in AI, but also whatever is next.

In addition, IMO NVDA’s margins are a gift and a curse. They look great to investors, but also mean all their customers are aggressively looking to produce their own GPUs.


They are also priced on the idea that nothing will challenge them. If AMD, Intel, or anyone else comes out with a challenger for their top GPUs at competitive prices, that’s a problem.

I’m surprised they haven’t yet.


The biggest challengers are likely the hyperscalers and companies like Meta. It sort of flew under the radar when Meta released an update on their GPU plans last year and said their cluster would be as powerful as X NVDA GPUs, and not that it would have X NVDA GPUs [1].

Also, I should add that Deepseek just showed the top GPUs are not necessary to deliver big value.

[1] https://engineering.fb.com/2024/03/12/data-center-engineerin...

This announcement is one step in our ambitious infrastructure roadmap. By the end of 2024, we’re aiming to continue to grow our infrastructure build-out that will include 350,000 NVIDIA H100 GPUs as part of a portfolio that will feature compute power equivalent to nearly 600,000 H100s.


Exactly. GPUs have become too profitable and too strategically important not to see several deep-pocketed existing technology companies invest more and try to acquire market share. There is a mini-moat here with CUDA and existing work, but the start of commoditization must be on the <10 year horizon.


Agreed, but ASML has been chronically undervalued forever.


Have you priced in the extremely limited freedom to operate they have? There is an extreme systemic risk to being a monopoly in a strategic position. It's an extremely beneficial position to be in, until it isn't.


Would you mind expanding on what you mean by this? I'm struggling to follow. Thanks.


ASML for now has a monopoly on cutting edge EUV. Since this is considered a strategic technology, the US dictates what they can sell to whom. This places ASML in a pincer. The US will develop a competitor as soon as they can if they can't get enough control over ASML, and at that point ASML would still be forbidden to sell to BRICS while losing the 'western' market as well.

So they're in a plushy seat, until the US decides they aren't.


Do you have a model for "short-leash monopolies"? I would definitely put that in The Book.


I went for some Asian index funds to balance things out for this reason.


The people who really make the magic generally do not capture the hype.


NVIDIA has a P/E ratio of 56; that's double that of the S&P 500, but half that of AMD and about the same as Meta's.

And whether it’s overvalued or not isn’t the point here: selling a stock because the product the company produces is now even more effective is mind-bogglingly stupid.


> selling a stock because the product the company produces is now even more effective is mind-bogglingly stupid.

No it isn't. Investors are most likely expecting there will be less demand for Nvidia's product long-term due to these alleged increased training efficiencies.


There is, AFAICT, no inherent limit to expansion at the bottom end of the market. My gut feeling is that lower training costs will expand the market for hardware horizontally far faster than any vertical scaling by a select two-digit number of megacorps could.


Well, the price has a built-in presumption that the earnings will keep growing. That's why the P/E ratio is not that relevant for them; it's been over 50-70 since forever, but the stock went up 10x, which means earnings went up as well. DeepSeek might be good for their business overall, but it might mean earnings will not continue growing exponentially like they have been for the past two years. So it's time to bail.

You shouldn't underestimate the fact that a large amount of these trades are on margin. Sometimes you can't wait it out because you'll get margin called and if you can't pony up additional cash you're basically getting caught with your pants down.
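As a toy sketch of why leveraged holders can't just wait it out (the position sizes and the 25% maintenance requirement here are assumptions for illustration, not anyone's actual numbers):

    # Toy margin math; a 25% maintenance margin is assumed, brokers vary.
    own_cash, borrowed = 30_000.0, 70_000.0
    position = own_cash + borrowed       # $100,000 of stock, ~3.3x levered

    # A margin call hits once equity / position value < 25%,
    # i.e. once the position is worth less than borrowed / 0.75.
    call_value = borrowed / 0.75         # ~ $93,333
    max_drop = 1 - call_value / position
    print(f"{max_drop:.1%}")             # ~ 6.7%: one bad day can force a sale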

Disclaimer: I am not a trader, so could be way off


Why? The compute requirements would still continue to grow the more efficient and more capable the models become.

If it’s cheaper to do inference, you end up using the model for more tasks; if it’s cheaper to train, you train more models. And if you now need only 1000’s of GPUs instead of 10’s or 100’s of thousands, you’ve just unlocked a massive client base of those who can afford to invest high six to low seven figures instead of 100’s of millions or billions to try their luck.
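A rough sketch of that math (the rental rate and run length are my assumptions, purely illustrative):

    # Illustrative only: assumed $2/GPU-hour cloud rate and a 30-day training run.
    rate_usd_per_gpu_hour = 2.0
    hours = 24 * 30

    small_cluster = 1_000 * rate_usd_per_gpu_hour * hours    # ~$1.4M: low seven figures
    big_cluster = 100_000 * rate_usd_per_gpu_hour * hours    # ~$144M: nine figures
    print(f"${small_cluster:,.0f} vs ${big_cluster:,.0f}")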


It could be, but maybe the feeling is the investments now are already massive and everyone has jumped on the AI train. If you are suddenly 10x efficient, and everyone gets 10x more efficient, there's less room to grow than before. What you're saying makes a lot of sense, but it's one thing to write it on a message board and another to use it to back up your decision that affects billions of dollars you have in your fund.

The proof is in the pudding, you're welcome to prove "everyone" wrong.


Doesn't this situation also imply to some degree that China is focused on beating the US on AI and probably they will develop a competitor to NVIDIA that will cause margins to drop significantly?

They have a lot of very smart people and the will to do it; it seems like a matter of time before they succeed.


It might take 5 yrs to find the use cases. That's what happened with the dark fiber from the .com boom. Go look at Cisco 2001 for parallels


It's arguable how good a strategy it is to check against other P/Es. During the tech bubble people would say X is cheap because Y is trading at a P/E of 100 instead of 200.


Again, that may be reasonable, but it’s a completely different argument. Whether there is a bubble or not and whether NVDA is overvalued is irrelevant to the subject at hand.

If it’s cheaper to train models it means far more customers that will try their luck.

If you reduce the training requirement from 100,000 GPUs to 1,000, you’ve now opened the market to thousands and thousands of potential players instead of, like, the 10 that can afford dumping so much money into a compute cluster.


The holy grail is to not have separate training and inference steps. A model that can be updated while it is doing inference is where we're headed. DeepSeek only accelerates the need for more compute, not less.


THIS is the only correct statement in all of this.

The goal for AGI and ASI MUST BE to train, inference, train, inference and so on, all on the fly, in fractions of a second, for every token produced.

Now good luck calculating the compute and hard work in algorithms to get there.

Not possible? Then AGI won't ever work because how can AGI beat a human if it can't learn on the fly? Not to mention ASI lol.


P/E alone is useless anyway. A growth company is likely not making a profit because it is reinvesting. But no profit doesn't imply good either, of course.


AMD does not have a P/E double NVIDIA's. Its P/E is high because of the amortization of an acquisition. People on Hacker News talk a lot but have no idea what they are talking about. You might know how to write JavaScript or some other language, but clearly you have not read the earnings reports or financials of AMD, and probably a lot of the other companies you talk about. So please stop spreading nonsense.

Just for those that clearly have no idea https://old.reddit.com/r/AMD_Stock/comments/1d2okn1/when_wil...



This is Hacker News, not some boiler-room pump-and-dump forum. Please use more professional language and take your confidence down a notch. Try to learn and add to the discussion.

You seem to believe that the more inference or training value per piece of tech, the more demand there will be for that piece of tech, full stop, when there are multiple forces at play. As a simple example, you can think of this as a supply spike; while you can make the bet that demand will follow, there could be a lag on that demand spike due to the time it takes to find use cases with product/market fit. That could collapse prices over the near term, which could in turn decrease revenue. As a reminder, the stock value isn't a bet on whether "the gold trader" will sell more gold or not; it's a bet on whether the net future returns of the gold trader will occur in line with expectations - expectations that are sky high and have zero competition built in.


Indeed:

https://en.wikipedia.org/wiki/Jevons_paradox

In economics, the Jevons paradox occurs when technological progress increases the efficiency with which a resource is used, but the falling cost of use induces increases in demand enough that resource use is increased, rather than reduced.
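A toy calculation of how that can play out for GPU demand (all numbers are invented for illustration, not estimates):

    # Jevons paradox sketch: a 10x efficiency gain met by a larger rise in usage.
    old_gpu_hours_per_query = 1.0
    new_gpu_hours_per_query = 0.1        # 10x more efficient (assumed)
    old_queries = 1_000_000
    new_queries = 15_000_000             # assumed 15x demand growth at the lower price

    old_total = old_gpu_hours_per_query * old_queries   # 1,000,000 GPU-hours
    new_total = new_gpu_hours_per_query * new_queries   # 1,500,000 GPU-hours
    print(new_total > old_total)         # True: total resource use rose despite the efficiency gain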


Yes, but it will also mean that people won't need cutting-edge NVIDIA chips - it will be able to run on older-node chips, or on chips from different manufacturers. So NVIDIA won't be able to command the margins they do now.

It may be great news for VRAM manufacturers though.


They only need cutting edge to be competitive, and that will be true independently of efficiency. That's the point.


I beat your Jevons paradox and raise you the Theory of Constraints..


> I don’t understand how stupid some traders can be.

90% of traders lose money, so that's a data point...

You're trying to apply rational thinking but that's not how markets work. In the end valuations are more about narratives in the collective mind than technological merit.


I think it's because the media coverage is all focused on how this means the big AI players have lost their competitive advantage, rather than the other side of the equation.

But that's also dumb, because "huge leap forward in training efficiency" is not exactly bad news for the major players in even the medium term. Short term, it means their models are less competitive, but I don't see any reason that they can't leverage e.g. these new mixed precision training techniques on their giant GPU farms and train something even bigger and smarter.

There seems to be this weird baked in assumption that AI is at a permanent (or at least semi-permanent) plateau, and that open source models catching up is the end of the game. But this is an arms race, and we're nowhere near the finish line.


NVDA is a totally manipulated stock. The company beat earnings in each of the last three quarters, and the stock dropped 15% to 20% immediately after the results were released.

Every single time...


I don't think it's manipulated, just hot. As the old adage goes: "Buy the rumor, sell the news".


What do you think is going on here? Why is it dropping?


DeepSeek is only part of this. DeepSeek might be the trigger, but the core reason is the overvaluation.


> Good time to buy then; I don’t understand how stupid some traders can be.

Likely a "how solid is the technical moat" evaluation - this could be a one-off or could be that there are an avalanche of advancements to continue along the efficiency side of the process.

Given the style and hype of the logic in the AI space, I fully believe resources are not well allocated, in compute or in _actual_ thinking as to how it is spent.

DeepSeek's apparent 10x greater efficiency per inference token... implies a lot of other hardware meets the general use case. We also know that reasoning should take about 10W at human speed-of-thought... maybe another 1-2 orders of magnitude in power efficiency to go.

"Pre-Training: Towards Ultimate Training Efficiency

We design an FP8 mixed precision training framework and, for the first time, validate the feasibility and effectiveness of FP8 training on an extremely large-scale model. Through co-design of algorithms, frameworks, and hardware, we overcome the communication bottleneck in cross-node MoE training, nearly achieving full computation-communication overlap. This significantly enhances our training efficiency and reduces the training costs, enabling us to further scale up the model size without additional overhead. At an economical cost of only 2.664M H800 GPU hours, we complete the pre-training of DeepSeek-V3 on 14.8T tokens, producing the currently strongest open-source base model. The subsequent training stages after pre-training require only 0.1M GPU hours." [1]

[1] https://huggingface.co/deepseek-ai/DeepSeek-V3
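For scale, converting the quoted GPU-hour figures to dollars (the ~$2 per H800 GPU-hour rate below is my assumption, not a number from the model card):

    # Back-of-envelope cost from the quoted figures; the hourly rate is assumed.
    pretraining_gpu_hours = 2_664_000    # "2.664M H800 GPU hours" (quoted above)
    post_training_gpu_hours = 100_000    # "0.1M GPU hours" (quoted above)
    assumed_rate = 2.0                   # USD per H800 GPU-hour (assumption)

    total = (pretraining_gpu_hours + post_training_gpu_hours) * assumed_rate
    print(f"${total:,.0f}")              # ~$5.5M, a small fraction of commonly cited frontier training budgets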


>the trading post will somehow sell fewer shovels

There is a point where there are enough shovels circulating that the demand for new shovels falters, even with zero slowdown in the rush. And if so much gold is being mined that it overwhelms the market and reduces the commodity price, the value of better shovels is reduced.

DeepSeek and friends basically reduce the commodity value of AI (and to be fair, Facebook, Microsoft et al. are trying to do the same thing with their open-source models, trying to chop the legs out from under the upstart AI cos). If AI is worth less, there are going to be fewer mega-capitalized AI ventures buying a trillion dollars worth of rapidly depreciating GPUs in hopes of eking out some minor advantage.

I wouldn't short nvidia stock, but at the same time there is a point where the spend of GPUs just isn't rational anymore.

>And as more efficient models proliferate, it means more edge computing, which means more customers with lower negotiating power than Meta and Google

Edge compute has infinitely more competition than the data center.


For most, the good time to buy is a % of income into the S&P 500 via tax-efficient methods.

But I agree in the sense that DeepSeek just creates more demand, because people want to get AI to do more work. This makes the bang for the buck greater, opening new opportunities.

This sell off is like selling Intel in 2010 because of a new C compiler.


Maybe Nvidia is fine, but I don't understand this logic. Suppose it turns out GPUs aren't necessary at all - they still provide a performance boost, but you can do everything you can do right now with CPUs w.r.t. performance. Would that be good or bad for Nvidia?

Unless it can be said that we need more performance than is currently possible, i.e. new demand, it would be catastrophic. It is unclear that throwing more compute actually expands what is possible. If that is not the case, efficiency is bad for Nvidia because it simply results in less demand.


I could see arguments being made both ways here. If GPUs end up being more efficient/powerful (like today) it could induce even more demand, but also, if CPUs get within ~20% of how fast you can do something with a GPU, people might start opting for something like Macs with unified memory instead of GPUs.

Today a CPU setup is still nowhere near as fast as a GPU setup (for ML/AI), but who knows what it will look like in the future.

> it is unclear that throwing more compute actually expands what is possible

Wasn't that demonstrated to be true already in the GPT1/2 days? AFAIK, LLMs became a thing very much because OpenAI "discovered" that "throwing more compute (and training data) at the problem/solution expands what is possible"


Not if the current architecture plateaus around the level of O1/3.5/R1. Then any further training is useless and inference is relatively cheap.


"dig only 1 day instead of an entire month"

You answered your own question. People do not dig in the Sacramento River for gold anymore, because it is gone. If you can train models for 1/100 the cost, and you sell model-training chips, you probably are not going to sell as many chips.


That's why the shovel makers from back then are selling mining machines today.

Everyone here thinks Nvidia is doomed because of training efficiency.

But what has Nvidia been doing for the past decade? Correct: increasing training and inference efficiency by orders of magnitude.

Try training GPT-4 on 10k Volta, then Ampere, Hopper, and then Blackwell GPUs.

What has happened since then? Nvidia has increased their sales by orders of magnitude.

Why? Because thanks to improvements in data, algorithms, and compute efficiency, ChatGPT was possible in the first place.

Imagine Nvidia didn't exist. When do you think the ChatGPT moment would happen on CPUs? LOL

Going back to my first sentence: Nvidia also started with small shovels, which were GeForce cards with CUDA. Today Nvidia is selling huge GPU clusters (mining machines, yes, pun intended ^^).


In said gold rush scenario, the price of gold itself would have collapsed.


And since the gold supply is limited, fewer shovels would indeed be required.


This doesn't have anything to do with DeepSeek, despite what the media claims; it comes down to a heightened sense of fear people have about playing with fire.


>>>I don’t understand how stupid some traders can be.

I believe the saying is "The market can stay wrong for longer than you can stay solvent."


You are right, it is a Jevons paradox thing. We were already going to spend all the gigawatts on AI; it is just more AI.


> Good time to buy then

No. Even after this dip, the stock is still up 10x from 2 years ago and 40x from a few years ago.


It is a demonstration that progress is no longer due to LLM scaling.


But a more efficient model doesn't need the top-notch hardware.


nVidia is also about HPC in general, not just AI. It's remarkably silly that the stock would plunge 13% just because someone made a more compute-efficient LLM.



