Not gonna argue with that. I do use MCUs myself where appropriate. I like that they’re more hardware than software: flash the thing once and it will work every time from there on out. Very few moving parts, everything is very simple, power consumption is measured in milliwatts.
But it still blows my mind that the equivalent (a vast superset, if you consider the GPU and specialized coprocessors) of my workstation from the late 90s now costs less than $5.