
Actually, there's a lot of demand in the AI data center space for such a card, e.g. for running large mixture-of-experts (MoE) models like DeepSeek v3, which is one of the best LLMs available today.
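
Rough back-of-the-envelope to show why memory capacity is the constraint here (this sketch assumes DeepSeek v3's published figures of ~671B total / ~37B active parameters and FP8 weights):

    # rough sizing, assuming DeepSeek v3's published figures:
    # ~671B total parameters, ~37B active per token, FP8 (1 byte) weights
    total_params = 671e9
    active_params = 37e9
    bytes_per_param = 1
    weights_gb = total_params * bytes_per_param / 1e9
    print(f"weights alone: ~{weights_gb:.0f} GB")                          # ~671 GB, before any KV cache
    print(f"params touched per token: {active_params/total_params:.1%}")   # ~5.5%

The whole model has to sit in memory even though only a small slice of it is active per token, so capacity matters more than raw FLOPs when serving this kind of model.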

That said, AMD would need to greatly improve their entire software stack to make running AI models on AMD hardware an attractive proposition.
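
To be concrete about what "running AI models on AMD" looks like today, here's a minimal sketch assuming a PyTorch wheel built against ROCm (ROCm builds of PyTorch reuse the CUDA API surface, so "cuda" device calls are dispatched to the AMD GPU via HIP):

    # minimal sketch, assuming a ROCm build of PyTorch and a supported AMD GPU
    import torch
    print(torch.version.hip)          # HIP/ROCm version string; None on CUDA builds
    print(torch.cuda.is_available())  # True if an AMD GPU is visible through ROCm
    x = torch.randn(4096, 4096, device="cuda", dtype=torch.float16)
    y = x @ x                         # matmul runs on the AMD GPU (rocBLAS under the hood)
    print(y.device, y.dtype)

This basic path generally works; the friction tends to be in everything around it (custom kernels, attention and quantization libraries, driver and container support), which is the part of the stack that would need the investment.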
