I have a MacBook M2 and a PC with a 4090 ("just" one of them). The 24 GB VRAM ceiling is usually what gets me on the 4090 when I try to run local LLMs (not train them). For a lot of things, my MacBook is fast enough, and with more unified memory, I can run bigger models easily. Plus, it's portable and sips battery.
The marketing hype is overblown, but for many (most? almost all?) people, the MacBook is a much more useful choice.