
I have the skills to write efficient CUDA kernels, but at $2/hr an H100 would cost 10% of my hourly salary, so there's no way I'm renting one. The electricity bill for my own computer is already painful enough as it is. I'm sure there are many Eastern European developers who are more skilled and paid even less. This is a huge waste of resources, all due to NVIDIA's artificial market segmentation. Or maybe I'm just cranky because I want more VRAM for cheap.
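To be concrete about the kind of kernel work I mean, here is a minimal sketch using CuPy's RawKernel, which runs on any consumer NVIDIA card. The vec_add kernel and the sizes are purely illustrative:

    # Minimal sketch: a hand-written CUDA kernel launched from Python via
    # CuPy's RawKernel. Runs on any CUDA-capable consumer card, no H100 needed.
    import cupy as cp

    vec_add = cp.RawKernel(r'''
    extern "C" __global__
    void vec_add(const float* a, const float* b, float* out, int n) {
        int i = blockDim.x * blockIdx.x + threadIdx.x;
        if (i < n) {
            out[i] = a[i] + b[i];
        }
    }
    ''', 'vec_add')

    n = 1 << 20  # illustrative size, not tuned for any particular GPU
    a = cp.random.random(n).astype(cp.float32)
    b = cp.random.random(n).astype(cp.float32)
    out = cp.empty_like(a)

    threads = 256
    blocks = (n + threads - 1) // threads
    vec_add((blocks,), (threads,), (a, b, out, cp.int32(n)))

    assert cp.allclose(out, a + b)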



This has 128GB of unified memory. A similarly configured Mac Studio costs almost twice as much, and I'm not sure the GPU is in the same league (software-support-wise it isn't, but that's fixable).

A real shame it's not running mainline Linux - I don't like their Ubuntu LTS-based distro.


$4,799 for an M2 Ultra with 128GB of RAM, so not quite twice as much; $5,799 if you want the extra 16 GPU cores (60 vs. 76). I'm not sure what the benchmark comparison would look like.


We'll need to look into benchmarks when the numbers come out. Software support is also important, and a Mac will not help you that much if you are targeting CUDA.

I have to agree that the Mac desktop experience is great, on par with the best Linux desktops out there.


A lot of models are optimized for Metal already, especially Llama, DeepSeek, and Qwen. You're still taking a performance hit, but before this NVIDIA project came out there was no alternative for getting that much VRAM for less than $5k. I'll definitely look at it closely if it isn't just vaporware.
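For example, with llama-cpp-python the Metal offload is a single parameter. A rough sketch, assuming a build with Metal support (the default on Apple Silicon); the GGUF file name is just a placeholder:

    # Rough sketch: run a local GGUF model with all layers offloaded to the
    # GPU (Metal on Apple Silicon). The model path is a placeholder, not a
    # specific recommendation.
    from llama_cpp import Llama

    llm = Llama(
        model_path="./qwen2.5-7b-instruct-q4_k_m.gguf",  # any local GGUF file
        n_gpu_layers=-1,  # -1 = offload every layer to the GPU
        n_ctx=4096,       # context window
    )

    out = llm("Explain unified memory in one sentence.", max_tokens=64)
    print(out["choices"][0]["text"])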


They can't walk it back now without some major backlash.

The one thing I wonder about is noise. That box is awfully small for the amount of compute it packs, and high-end Mac Studios are 50% heatsink. There isn’t much space in this box for a silent fan.



