Personally, the fact that they didn't produce Linux drivers for these things is baffling. It would just make more people buy them. So what if they don't run macOS? I would buy an M1 Air tomorrow if Apple published Linux drivers for the full stack.
There are always constraints. Usually it's the time and attention of the people who are working on the problem. Throwing more people at the problem can help, but it also adds a lot of coordination overhead. It doesn't matter how many resources you have; hard problems are still hard.
I don't believe it, honestly. I would gamble that it's because helping Linux grow doesn't financially benefit them, now or in the future. And that, to me at least, feels quite a bit more likely from Apple.
> Personally the fact they didn’t produce Linux drivers for these things is baffling.
Followed by
> helping Linux grow doesn't financially benefit them, now or in the future
Is a bit perplexing. Why are you baffled?
Opportunity cost is a very real thing. Apple would probably be better served by developing Windows drivers, if they were going to take developers away from macOS drivers.
Apple, like every company, is limited by the speed at which their supply chain can produce specific parts. And, while they're often the largest customer for a given manufacturer, they're rarely the only one.
There are literally people working on it right now, for free, whom Apple could pay and give access to all of the docs for the hardware that Apple designed in house.
It's a perennial topic of discussion in the AMD vs. Intel debate, so a lot of people do seem to care about it.
And why wouldn't you, when it all becomes heat in your room? I'm not saying every single watt matters, but in cases where you're putting an M1 Ultra at 40 W against a 280 W Threadripper or Epyc... yeah, that hugely matters.
That's debate. That's just what you and I are doing right now: useless internet fluff. When push comes to shove, and you really want to shove something, you're not going to pick the most efficient machine, you're going to pick the best pusher available. And right now that's going to be something from AMD or Intel with an Nvidia GPU (probably several).
Or you offload it to GCP/AWS, but then the wattage of the M1 Ultra is still irrelevant, because you could use a Chromebook to do that.
Many people don't care about power consumption precisely because it becomes heat in the room. They live in places where the need for heating is bigger than the need for air conditioning.
In my experience, a ~500 W gaming PC does not generate any noticeable heat unless you use it in a small room behind a closed door. A 1450 W vacuum cleaner does, so I guess heat becomes significant somewhere around 1 kW of sustained power.
Do you actually care about power consumption in a desktop computer under very heavy workload?
By the way, the M1 Ultra has a 370W power supply, so really the question is do you really care about the difference between 300 and 400 watts in this use case?
> Do you actually care about power consumption in a desktop computer under very heavy workload?
Yes, many people do, especially since it becomes heat in your room.
> By the way, the M1 Ultra has a 370W power supply, so really the question is do you really care about the difference between 300 and 400 watts in this use case?
Complete non sequitur: you could put a 1.5 kW power supply on an R7 5700G or an i3; that doesn't mean it pulls 1.5 kW.
You also should be able to express this without the flamebait style; that is not appropriate or welcome on this site. Instead of "by the way...", you can simply say "the M1 Ultra pulls 370 W, so..." (or you could, if that were true).
There is no easy way to know how much the Mac Studio can pull without using a wall meter.
We know that reported power consumption is inaccurate on M1 devices from AnandTech's testing of the M1 Max, and I doubt it's very different on the Mac Studio.
In any case, the difference is not going to be much.
As far as heating a room goes, really, I can assure you that you will barely notice 100 W. There are screens that use more power than that (the XDR display, among others). I say this as someone who lived with a 900 W power hog of a computer in places where the temperature hits 42°C in the shade.
I don't know about the Mac Studio, but the MacBook Pro can exceed its stated power consumption limit. USB-C power delivery is a very plausible path, but I doubt the Studio has over 100 W of USB-C PD budget, and the included PSU is rated at 370 W continuous, so it still has headroom for powering devices.
The Studio Ultra has six Thunderbolt 4 ports. Each port is required to deliver at least 15W by the TB4 standard, so that’s already 90W. Add some budget for the USB-A ports and you are near 100W for power delivery.
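The arithmetic above can be sketched like this (the six-port count and the 15 W TB4 minimum are from the comment; the USB-A port count and per-port wattage are my assumptions for illustration):

```python
# Rough peripheral power budget for a Studio Ultra, per the figures above.
TB4_PORTS = 6        # Thunderbolt 4 ports on the Studio Ultra
TB4_MIN_WATTS = 15   # TB4 requires each port to supply at least 15 W
USB_A_PORTS = 2      # assumption: two USB-A ports
USB_A_WATTS = 4.5    # assumption: ~4.5 W each (0.9 A at 5 V, USB 3.x)

tb_budget = TB4_PORTS * TB4_MIN_WATTS            # 90 W from Thunderbolt alone
total = tb_budget + USB_A_PORTS * USB_A_WATTS    # ~99 W overall
print(f"Thunderbolt minimum: {tb_budget} W, total peripheral budget: ~{total} W")
```

So the peripheral budget alone plausibly eats close to 100 W of the 370 W continuous rating before the SoC draws anything.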
Er, machines in the M1 Ultra class do not have 400 W power supplies. The 2019 Mac Pro that it's compared to (with 28 cores) has a 1.4 kW power supply.
A 12900K has a max TDP (for the chip, not including RAM) of 210 W, and loses on Geekbench 5 to the M1 Ultra by a third or so. To get closer you'll need an i9 or Threadripper, and systems with either of those typically need quite large power supplies and still have a small fraction of the memory bandwidth.
Or even the 440 GB/s of memory bandwidth available to the CPUs.
Sure, if you are cache-friendly enough and you get enough Zen 3 or Intel cores, you can win, but you end up spending a fair chunk of change and getting less memory bandwidth, and for a clear win you often need to spend more, like getting a Lenovo Threadripper (and Lenovo has exclusive rights to the chip for six months or something).