That’s interesting because I would’ve thought having strong local compute was the old way of thinking. I run huge jobs that consume very large amounts of compute. But the machines doing the work aren’t even in the same state I’m in. Then again maybe I’m even older as I’m basically on the terminal server / mainframe compute model. :)
If your goal is to sell more MBPs (and this is a marketing presentation) then, judging by the number of comments that have the phrase "my M1" and the top comment, it seems like M1 vs M4 is the right comparison to make. Too many people are sticking with their M1 machines. Including me.
It's actually interesting to think about. Is there a speed multiplier that would get me off this machine? I'm not sure there is. For my use case the machine's performance is not my productivity bottleneck. HN, on the other hand... That one needs to be attenuated. :)
My guess is the only reason to open and upgrade a computer is if one needs (or wants) to be on the bleeding edge of what local compute is capable of on a day-to-day basis. With the advent of cloud compute, the number of use cases that meet that criterion shrinks every day. With the iMac there is a price premium, but what the user is paying for is a computer that just gets out of their way. For them the computer is simply a means, not an end.
Most of my buddies w/ PCs for gaming generally only open up their machine to upgrade their video card. Once their motherboard no longer supports the latest and greatest, they just dumpster the whole damn thing (maybe sell the card on eBay), or turn it into a Plex server or something and start over.
I agree the overall math is easier in the frequency domain, especially because you don't know which frequencies are problematic, so it's best to look at all of them. But I think the concept is best explained, at first, in the time domain.
Here’s my attempt in a couple of sentences.
It takes time for the signal to propagate from input to output in any real circuit. If that time is a substantial fraction of the period under consideration, then the input of the amplifier, which includes the feedback signal, cannot affect the output before it has already moved. And if the delay through the amplifier is just wrong relative to the signal period, one can end up in a dog-chasing-its-own-tail situation where the output oscillates.
The rest is just math. :)
P.S. This also explains why we use phase, not seconds, to measure the delay of the circuit: everything is relative to the input signal period, and if we use phase we get that for free. No extra divide.
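To make the "rest is just math" part a bit more concrete, here's a rough sketch in C with numbers I made up (the DC gain, pole frequency, and delay are assumptions, not anything measured): a single-pole amplifier with a fixed propagation delay sitting in a negative-feedback loop. Sweep frequency, find where the total extra phase shift hits 180 degrees (so the "negative" feedback comes back in phase), and check whether the loop gain there is still at or above 1.

    #include <math.h>
    #include <stdio.h>

    int main(void) {
        const double pi = acos(-1.0);
        const double A0 = 1000.0;     /* DC gain (assumed)                    */
        const double f_pole = 1e4;    /* amplifier pole, Hz (assumed)         */
        const double t_delay = 20e-6; /* propagation delay, seconds (assumed) */

        for (double f = 100.0; f < 1e9; f *= 1.001) {
            /* Single-pole gain roll-off. */
            double mag = A0 / sqrt(1.0 + (f / f_pole) * (f / f_pole));
            /* Pole phase plus delay phase. Note 360 * f * t_delay: the delay
             * expressed as a fraction of the signal period, i.e. as phase.   */
            double phase = -atan(f / f_pole) * 180.0 / pi - 360.0 * f * t_delay;
            if (phase <= -180.0) {
                printf("phase hits -180 deg near %.0f Hz, |loop gain| = %.1f -> %s\n",
                       f, mag, mag >= 1.0 ? "oscillates" : "stable");
                return 0;
            }
        }
        printf("loop never reaches -180 degrees in the sweep\n");
        return 0;
    }

With these particular numbers the loop gain is still in the hundreds when the phase wraps through -180 degrees, so it oscillates; cut the gain or the delay down far enough and the same sweep reports stable instead.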
As someone who makes HW for a living, please do make more Rube Goldberg machines out of black-box LLMs. At least for a few more years, until my kids are out of college. :)
There were so many other little ones, like right-shifting negative signed integers, which is implementation-defined. I lol'd when they told me I didn't know C well enough. Like bro, I learned on Borland C, then Microsoft (Visual) C, now mostly gcc/clang C. I don't know C at all.
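For the curious, a tiny illustration of that shifting pitfall (a hypothetical example I'm making up here, not the actual interview question): right-shifting a negative signed value is implementation-defined, so most compilers arithmetic-shift and print -1 below, but the standard doesn't require it. Left-shifting a negative value is outright undefined behavior.

    #include <stdio.h>

    int main(void) {
        int x = -2;
        /* Usually prints -1 (arithmetic shift), but implementation-defined. */
        printf("-2 >> 1 == %d\n", x >> 1);
        return 0;
    }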
Depends. If the competing universities degrade into glorified coding boot camps, they'll probably get their lunch eaten in turn. And graduates need to be getting reasonable job offers as well.
I suppose, but the 3D printer requires consumable inputs. So without active shipping, that printer is going to have a very limited lifetime. There's always a corner case, like having to 3D print on Mars or something, but that's a niche of a niche.