Hacker News

Surely that would be amazing for NVDA? If the only 'hard' part of making AI is making/buying/smuggling the hardware, then Nvidia should expect to capture most of the value.



No. Before DeepSeek R1, Nvidia was charging $100 for a $20 shovel in the gold rush. Now, every Fortune 100 company can build an o1-level model with currently existing (and soon-to-be-online) infra. Healthy demand for H100s and Blackwell will remain, but paying $100 for a $20 shovel is unlikely.

Nvidia will definitely stay profitable for now, though, as long as DeepSeek's breakthroughs are not further improved upon. But if others find additional compression gains, Nvidia won't recapture its old premium. Its stock price hinged on 80% margins and 75% annual growth; DeepSeek broke that premise.
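The "$100 for a $20 shovel" framing and the 80%-margin / 75%-growth premise can be put into a toy calculation. All numbers below are illustrative stand-ins from the comments above, not actual Nvidia financials, and the compressed-scenario figures are purely hypothetical:

```python
def gross_margin(price, cost):
    """Fraction of the sale price kept as gross profit."""
    return (price - cost) / price

# The "$100 for a $20 shovel" markup implies an 80% gross margin.
assert gross_margin(100, 20) == 0.8

def projected_gross_profit(revenue, margin, growth, years):
    """Compound revenue at `growth` per year for `years`, then apply margin."""
    return revenue * (1 + growth) ** years * margin

# Premise said to be priced into the stock: 80% margin, 75% annual growth.
old_scenario = projected_gross_profit(100.0, margin=0.80, growth=0.75, years=3)

# Hypothetical post-DeepSeek scenario: margin compressed to a
# "$40 shovel" (50%) and growth halved.
new_scenario = projected_gross_profit(100.0, margin=0.50, growth=0.375, years=3)

print(old_scenario, new_scenario)
```

Running this, the compressed scenario comes out at well under a third of the original projection, which is the kind of gap a valuation built on the old premise would have to absorb.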


There still isn't a serious alternative to Nvidia's chips for AI training. Until competition catches up, or models become so efficient they can be trained on gaming cards, Nvidia will still be able to command the same margins.

Growth might take a short-term dip, but may well be picked up by induced demand. Being able to train your own models "cheaply" will make a lot more companies and departments want to train their own models on their own data, and lead them to retrain more frequently.

The time of being able to sell H100 clusters for inference might be coming to an end though.


> that would be amazing for NVDA?

It’s good for Nvidia. It’s not as good as it was before. (Assuming DeepSeek’s claims are replicable.)


DeepSeek revealed it's not as hard as previously thought; a much smaller number of less sophisticated chips turned out to be sufficient.


NVDA is too invested in training and underinvested in edge inference.



