
I expect they'll lose value in the step-function shape that's typical of technology, where each new generation decreases the value of the previous one. But that's fine, as long as each chip generates more revenue over its lifetime than it cost. And the lifetime can extend through multiple generations, as long as each successive generation is only a marginal improvement over its predecessor.
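To make that concrete, here's a toy back-of-the-envelope sketch (every number is invented for illustration - not real hardware prices, revenue figures, or generation cadences) of a chip whose earning power steps down with each new generation but still pays back its purchase price over its lifetime:

    # Toy illustration (all numbers invented): a chip's value steps down with
    # each new hardware generation, but it still pays off as long as its
    # cumulative revenue exceeds the purchase price before it's retired.

    chip_cost = 25_000          # assumed purchase price ($)
    revenue_per_year = 12_000   # assumed revenue at launch ($/yr)
    step_down = 0.7             # assumed revenue retained after each new generation
    years_per_generation = 2    # assumed cadence of new chip generations

    total_revenue = 0.0
    rate = revenue_per_year
    for generation in range(4):                 # keep the chip across four generations
        total_revenue += rate * years_per_generation
        rate *= step_down                       # step-function drop when the next generation ships

    print(f"Lifetime revenue: ${total_revenue:,.0f} vs cost: ${chip_cost:,.0f}")
    # With these made-up numbers the chip earns ~$61,000 over ~8 years, so it
    # pays off even though its earning power steps down each generation.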

Personally I think the bigger risk is software innovations making CPU training (and/or inference) viable enough that it becomes cheaper to train models on a commodity CPU cluster than on a proportionally more expensive GPU cluster. I don't know enough about the space to say whether that's likely, but it seems like a low risk, since pretty much any parallel algorithm will always run faster on a GPU than on a CPU - it's just a question of marginal benefit versus cost (e.g. maybe it takes more CPUs to train the same model in the same time, but each CPU is so much cheaper that buying more of them still wins).
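A rough sketch of that trade-off, again with made-up node prices and throughput ratios purely for illustration:

    # Hypothetical break-even comparison: all numbers are invented, not real
    # hardware prices or measured training throughput.

    gpu_node_cost = 30_000      # assumed purchase price per GPU node ($)
    cpu_node_cost = 5_000       # assumed purchase price per CPU node ($)
    gpu_node_throughput = 1.0   # training throughput, normalized to one GPU node
    cpu_node_throughput = 0.05  # assumed fraction of a GPU node's throughput

    # How many CPU nodes does it take to train the same model in the same time?
    cpu_nodes_needed = gpu_node_throughput / cpu_node_throughput

    # Cost of a CPU cluster with equivalent throughput.
    cpu_cluster_cost = cpu_nodes_needed * cpu_node_cost

    print(f"CPU nodes per GPU node: {cpu_nodes_needed:.0f}")
    print(f"CPU cluster: ${cpu_cluster_cost:,.0f} vs GPU node: ${gpu_node_cost:,.0f}")
    # With these made-up numbers the CPU cluster costs ~$100,000 against a
    # $30,000 GPU node, so the GPU wins; the CPU route only pays off if
    # software closes the per-node throughput gap faster than the price gap.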



If it follows the trajectory of cryptocurrency, then the opposite will happen - even more expensive custom chips will replace GPUs, the way ASICs displaced GPUs for Bitcoin mining.



