It's incredibly useful for companies and organizations, especially when lending computers to their employees, but why the hell would this tech be put inside consumer devices? It just sits there as an exposed attack surface without the user even having the tools to make any use of it.
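For what it's worth, you can get a rough idea of whether AMT is exposed on a machine by probing its well-known management ports: provisioned AMT listens on TCP 16992 (HTTP) and 16993 (TLS), plus 16994/16995 for redirection and 623/664 for ASF. A minimal sketch in Python; the host address is a placeholder:

    import socket

    # Intel AMT's well-known management ports: 16992 (HTTP), 16993 (HTTPS),
    # 16994/16995 (redirection), and 623/664 (ASF remote control).
    AMT_PORTS = [16992, 16993, 16994, 16995, 623, 664]

    def probe_amt(host: str, timeout: float = 2.0) -> list[int]:
        """Return the AMT-related TCP ports that accept a connection."""
        open_ports = []
        for port in AMT_PORTS:
            try:
                with socket.create_connection((host, port), timeout=timeout):
                    open_ports.append(port)
            except OSError:
                pass  # closed, filtered, or unreachable
        return open_ports

    if __name__ == "__main__":
        # "192.0.2.10" is a placeholder address for the machine under test.
        print(probe_amt("192.0.2.10"))

An empty list doesn't prove the ME is inert (unprovisioned AMT doesn't listen on the network), but open ports here are exactly the attack surface being complained about.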
Back when I worked on the sysadmin side of things, we used vPro for out-of-band management of servers in our datacenters, but we never used it for our 10k+ laptops and desktops.
Yeah, exactly. We mostly used Dell iDRAC remote management cards and HP iLO for that. We still use the latter on the few servers we have left (which is very, very few), but never on laptops/desktops.
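Both iDRAC and iLO speak the DMTF Redfish REST API these days (iLO 4 onward, iDRAC 7 onward), so one script can drive either. A rough sketch using Python's requests library; the BMC address, credentials, and system ID ("1" is common but not guaranteed, enumerate /redfish/v1/Systems to be portable) are placeholders:

    import requests

    # Placeholder BMC address and credentials; Redfish is served over HTTPS.
    BMC = "https://192.0.2.20"
    AUTH = ("admin", "changeme")

    # Query the power state of the first system.
    resp = requests.get(
        f"{BMC}/redfish/v1/Systems/1",
        auth=AUTH,
        verify=False,  # BMCs often ship self-signed certs; pin them in real use
        timeout=10,
    )
    resp.raise_for_status()
    print(resp.json()["PowerState"])  # e.g. "On" or "Off"

    # Request a reset out of band -- handy when the OS is wedged.
    requests.post(
        f"{BMC}/redfish/v1/Systems/1/Actions/ComputerSystem.Reset",
        json={"ResetType": "ForceRestart"},
        auth=AUTH,
        verify=False,
        timeout=10,
    ).raise_for_status()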
That still doesn't really explain why it's in workstation chips; in Xeons, perhaps...
It gives you OOB management on every endpoint. These days I think it's less useful (I like Autopilot/Intune), but for some field devices it's nice to solve boot-loop scenarios or similar bare-metal problems over the internet instead of making a dude drive six hours to BFE to find out why your doodad has ghosted.
If you're on Intel architecture, you need at least some SMM: it's used at startup (initial hardware configuration) and often during power-management events (CPU clock scaling, hibernation, etc.). The article mentions that they disable most, but not all, of SMM for those reasons.
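You can actually watch SMM fire on a running Linux box: Intel CPUs expose MSR_SMI_COUNT (MSR 0x34), a counter of SMIs serviced since reset. A quick sketch, assuming root and the msr kernel module loaded (modprobe msr):

    import struct
    import time

    MSR_SMI_COUNT = 0x34  # Intel-specific: SMIs serviced since last reset

    def read_msr(msr: int, cpu: int = 0) -> int:
        # /dev/cpu/N/msr requires root and the 'msr' kernel module;
        # seeking to the MSR address and reading 8 bytes returns its value.
        with open(f"/dev/cpu/{cpu}/msr", "rb") as f:
            f.seek(msr)
            return struct.unpack("<Q", f.read(8))[0]

    before = read_msr(MSR_SMI_COUNT)
    time.sleep(10)  # let clock scaling, thermal events, etc. happen
    after = read_msr(MSR_SMI_COUNT)
    print(f"SMIs in the last 10s: {after - before}")

If the count ticks up while the machine is idle, that's firmware code running underneath your OS, which is exactly why projects like the one in the article try to minimize it.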
> why the hell would this tech be put inside consumer devices
Because it's cheaper to make a single CPU die that is then sold into both the corporate and consumer channels than it is to make two: one with the ME for the corporate channel and another without it for the consumer channel.
Plus, once the ME became required to actually boot the CPU (why it became a requirement is a separate argument), omitting it from consumer-grade CPUs got much more expensive, because a non-ME consumer chip would have to be a completely different die with some alternate way to bootstrap itself.