Hacker News

You can claim that $2,000 for an M4 Pro is "not that expensive," but for about $1,000 I can buy a gaming laptop with a 3090 and spill larger models over into system RAM. That RAM is slower, but the 3090's much higher TFLOPS make up a lot of the difference. The comparison looks even worse against the Max, which as you said is better suited to LLM work, since at that price one can step up to a 4090 and still come out ahead.
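The bandwidth-vs-TFLOPS tradeoff can be sketched with back-of-the-envelope arithmetic: single-stream token generation is usually memory-bandwidth-bound, since each generated token roughly streams the full set of weights. The bandwidth and model-size figures below are assumed, illustrative round numbers, not measurements of any specific machine in the thread:

```python
# Rough ceiling: tokens/sec ~= usable bandwidth / bytes read per token
# (~model size, since generation streams all weights once per token).
# All numbers are illustrative assumptions, not benchmarks.

def est_tokens_per_sec(bandwidth_gb_s: float, model_size_gb: float) -> float:
    """Upper-bound estimate for single-stream token generation."""
    return bandwidth_gb_s / model_size_gb

MODEL_GB = 4.0  # e.g. a ~7B model quantized to ~4-bit (assumed size)

for name, bw in [
    ("Apple unified memory (assumed ~273 GB/s)", 273.0),
    ("GDDR6X on a 3090-class GPU (assumed ~936 GB/s)", 936.0),
    ("Dual-channel DDR5 system RAM (assumed ~80 GB/s)", 80.0),
]:
    print(f"{name}: ~{est_tokens_per_sec(bw, MODEL_GB):.0f} tok/s ceiling")
```

The point it illustrates: once a model spills into system RAM, that ~80 GB/s link becomes the bottleneck for the spilled layers, which is why raw GPU TFLOPS mostly help prompt processing (compute-bound) rather than generation speed.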

The local LLM scene is already pricey and confusing; recommending the wrong hardware for the task only makes it more so.


