
AMD GPUs are a fantastic deal until you hit a problem. With some models/frameworks they work great. With others, not so much.


For sure, but I think people on the fine-tuning/training/Stable Diffusion side are more concerned with that. They make a big fuss about it and basically talk people out of a perfectly good, well-priced 16 GB VRAM card that literally works out of the box with Ollama and LM Studio for text inference.

Kind of one of the reasons AMD is a sleeper stock for me. If people only knew.



