
I have a MacBook M2 and a PC with a 4090 ("just" one of them) - the VRAM barrier is usually what gets me with the 4090 when I try to run local LLMs (not train them). For a lot of things, my MacBook is fast enough, and with more RAM, I can run bigger models easily. And, it's portable and sips battery.

The marketing hype is overblown, but for many (most? almost all?) people, the MacBook is a much more useful choice.
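The VRAM point is easy to check with back-of-the-envelope arithmetic: weight memory is roughly parameter count times bits per weight. A minimal sketch (the 24 GB 4090 figure is the card's spec; the 64 GB MacBook configuration and the bit widths are assumptions for illustration, and KV cache and runtime overhead are ignored):

```python
def weights_gb(n_params_b: float, bits_per_weight: float) -> float:
    """Approximate weight memory in GB: params * bits / 8, ignoring overhead."""
    return n_params_b * 1e9 * bits_per_weight / 8 / 1e9

budget_4090_gb = 24  # RTX 4090 VRAM
budget_mac_gb = 64   # assumed unified-memory configuration

for n_b, bits in [(7, 4), (13, 4), (70, 4)]:
    gb = weights_gb(n_b, bits)
    print(f"{n_b}B @ {bits}-bit ~= {gb:.1f} GB "
          f"(4090: {'fits' if gb < budget_4090_gb else 'no'}, "
          f"64GB Mac: {'fits' if gb < budget_mac_gb else 'no'})")
```

A 70B model at 4-bit is about 35 GB of weights alone, which is why it won't load on a single 24 GB card but runs comfortably in a large unified-memory Mac.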



Expanding on this, I have an M2Pro (mini) & a tower w/GPU... but for daily driving the M2Pro idles at 15-35W whereas the tower idles at 160W.

Under full throttle/load, even though the M2Pro is rated as less-performant, it is only using 105W — the tower/GPU are >450W!
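For an always-on machine, that idle gap dominates the yearly energy bill. A quick sketch using the idle figures from the comment (the electricity price and 24/7 uptime are my assumptions):

```python
HOURS_PER_YEAR = 24 * 365
PRICE_PER_KWH = 0.30  # assumed USD/kWh

def yearly_kwh(watts: float) -> float:
    """Energy in kWh for a constant draw over one year."""
    return watts * HOURS_PER_YEAR / 1000

mini_kwh = yearly_kwh(25)    # M2 Pro mini idle, midpoint of the 15-35W range
tower_kwh = yearly_kwh(160)  # tower idle, per the comment

print(f"mini:  {mini_kwh:.0f} kWh/yr (${mini_kwh * PRICE_PER_KWH:.0f})")
print(f"tower: {tower_kwh:.0f} kWh/yr (${tower_kwh * PRICE_PER_KWH:.0f})")
```

At those assumed rates the tower's idle draw alone costs a few hundred dollars a year more than the mini's.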



