
Nvidia, and the highest amount of VRAM you can get.

Currently that's the 4090. The rumor is the 4090 Ti will have 48GB of VRAM; I don't know if it's worth waiting or not.

The more VRAM you have, the higher the parameter count you can run entirely in memory (by far the fastest option).
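Rough back-of-envelope for the weights alone (a sketch assuming fp16 and ignoring KV-cache/activation overhead, so real usage runs a bit higher):

    def weight_vram_gb(params_billion, bytes_per_param=2):
        # fp16/bf16 = 2 bytes per parameter, int8 = 1, 4-bit ~ 0.5
        return params_billion * 1e9 * bytes_per_param / 1024**3

    # e.g. a 13B model: ~24 GB at fp16, ~12 GB at int8, ~6 GB at 4-bit,
    # so 24 GB of VRAM roughly caps you at 13B in fp16 or ~30B quantized.
    print(weight_vram_gb(13), weight_vram_gb(13, 1), weight_vram_gb(13, 0.5))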

AMD is almost a joke in ML. The lack of CUDA support (CUDA being Nvidia proprietary) is pretty much lethal, and even though ROCm has much better support these days, from what I've seen performance is still a fraction of what it should be. I'm also not sure whether individual projects need to support it explicitly; I know PyTorch has a backend for it, but I'm not sure how easy it is to drop in.
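For what it's worth, the ROCm builds of PyTorch are supposed to expose the HIP backend through the regular torch.cuda API, so something like this should in principle run unchanged on an AMD card (a minimal sketch; whether a project's custom CUDA kernels port over is a separate question):

    import torch

    # On a ROCm build, torch.version.hip is set and torch.cuda.* maps to HIP,
    # so the usual "cuda" device string works as-is.
    device = "cuda" if torch.cuda.is_available() else "cpu"
    print(torch.version.hip,
          torch.cuda.get_device_name(0) if device == "cuda" else "cpu")

    x = torch.randn(1024, 1024, device=device)
    y = x @ x  # matmul runs on the GPU (via HIP on a ROCm build)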



That's good to know. A lot of people might look at the 7900 XTX with its 24GB of VRAM for a grand and think, "ah, a good deal for fitting an LLM in."


I think the most recent rumors were amended to 24GB, unfortunately.


Darn.

I mean, in all honesty there's no reason a gaming card would need 48GB at the moment, when so few games even use 24GB.

48GB really only makes sense for workstation cards.



