All sorts of reasons, but this isn't a left-pad situation. Axios's functionality is something provided by a library in a lot of languages (C/C++ with libcurl and friends, Python with requests, Rust with reqwest, and so on).
That's not to say it's inherently necessary for it to be a third-party package (Go, Ruby, and Java are counterexamples). But this isn't a proliferation/anemic stdlib issue.
Pinning, escrowing, and trailing all help, but I'm not sure "this step will be eliminated" is inevitable.
Package manager ecosystems are highly centralized. The npm registry (npmjs.com) could require MFA (or rate limiting, or email verification, or whatever) and most publishers would gripe but go along with it. A minority would look for npm competitors without that requirement, and another minority would hack/automate MFA and remove the added security, but the majority of folks would benefit from a centralized requirement of this sort.
Requiring a human in the loop for final, non-prerelease publication doesn't seem like that onerous a burden. Even if you're publishing multiple releases a day on the regular (in which case ... I have questions, but anyway), there are all sorts of automations that stay secure while reducing the burden of having to manually download an artifact from CI, enter MFA, and upload it by hand.
That's a good thing (disruptive "firebreak" to shut down any potential sources of breach while info's still being gathered). The solve for this is artifacts/container images/whatnot, as other commenters pointed out.
That said, I'm sorry this is being downvoted: it's unhappily observing facts, not arguing for a different security response. I know that's toeing the rules line, but I think it's important to observe.
This is the right answer. Unfortunately, this is very rarely practiced.
More strangely (to me), this is often addressed by adding loads of fallible/partial caching for package managers (in e.g. CI/CD or deployment infrastructure) rather than building and publishing temporary, per-user or per-feature ephemeral packages for dev/testing to an internal registry. Since the latter is usually less complex and more reliable, it's odd that it's so rarely practiced.
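To make the ephemeral-package idea concrete, here's a hedged sketch in npm terms. The registry URL, base version, and CI env var names are illustrative assumptions; the point is just that a branch name maps cleanly onto a semver prerelease version and a dist-tag, so dev/test consumers install a real published artifact instead of relying on a cache layer. The script does a dry run, printing the commands it would execute:

```shell
#!/bin/sh
# Sketch: publish a per-branch ephemeral package instead of caching.
# Registry URL, base version, and CI_* variables are hypothetical.
set -eu

branch="${CI_BRANCH:-feature/fast-retry}"
run="${CI_RUN_ID:-0}"

# npm versions and dist-tags can't contain slashes, so sanitize the branch name
safe=$(printf '%s' "$branch" | tr '/' '-')

# Semver prerelease identifiers sort below the eventual real release,
# so this can never shadow a production version
version="1.4.0-${safe}.${run}"

# Dry run: print the publish commands rather than executing them
echo "npm version ${version} --no-git-tag-version"
echo "npm publish --registry https://npm.internal.example --tag ${safe}"
```

A consumer would then pull the branch build with something like `npm install mypkg@feature-fast-retry --registry https://npm.internal.example`, and the version/tag can be pruned once the branch merges.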
I like your dream. I think financial incentives make it unlikely, though. The writing's been on the wall for user-friendly general-computing OSes for a while, I think. So Microsoft's incentive is to treat Windows like a loss leader (even if it technically isn't one) and use it as a funnel for services/subscription revenue from their other products.
I hate that/wish it weren't so, but I think the last ~15 years of M$ decision-making make a lot of sense in that context.
Another aspect to this is that I really doubt consumers would go to Linux if there were any paywall or 'donate for more features' aspect to it. Something that isn't emphasized much is how much OSS/Linux work is done by the various big corporations for goals that aren't aimed at small-scale users; it's a happy byproduct that many parts of the system may run better just by swapping the OS, all free to them. Similarly, Valve's efforts seem tightly focused on what matters to their products/services, and being available to everyone is a byproduct.
The Windows cost gets hidden/de-emphasized when buying a PC, or users just ignore it, which seems to be below MS's pain tolerance for lost revenue. If Linux had a price of admission because the companies devoting resources to it couldn't treat that work as a loss leader for something else, it'd be an even tougher struggle to migrate users over. (And right now most people moving to Linux are probably somewhat enthusiasts.)
> I suspect it's going to hurt iPad sales though, as a real Mac running MacOS is vastly more capable than any iPad.
Maybe, but I somewhat doubt it, for a few reasons:
- Kids like iPads for gaming/video watching, and the overhead of computer interfaces for them might discourage laptopping (understandable for littler kids; regrettable loss of tech familiarity for older ones, but true regardless).
- Parents/rough users like iPads 'cuz there aren't moving parts or gaps to get hammered and damaged, though the screen is a risk.
- Cellular iPads/huge phone-alikes are pretty popular, and the vast majority of users are unfamiliar with the idea of hooking a computer-shaped device up to cellular internet.
- iPads are easier to MDM-manage/lock down. You can do that on MacOS too, of course, but a lot of folks find it easier to regulate kid/employee/etc. use of an iPad because the management system is familiar and simpler.
- iPads feel like a big phone. That's a pretty intuitive switch for a lot of folks who either don't have keyboarded computers at all, or associate them with non-fun (work/school) computing. Silly distinction to draw, to be sure, but very significant in the minds of many users. The single-brick/touch aspect of iPads is desirable enough that a fold-out laptop isn't going to overlap with a lot of those users.
That only grants market control so long as Microsoft keeps releasing new APIs; otherwise the people reimplementing them, like Valve/Wine, will catch up.
I think Valve’s play isn’t to steal tons of Microsoft’s gaming market share; their play is to just get enough of a market that game developers are incentivized to code to the APIs that work well in Proton, not whatever the latest and greatest in Windows is. If we cross that inflection point, Microsoft’s PC gaming chokehold will be on life support.
Does pipenv download and install prebuilt interpreters when managing Python versions? Last I used it, it relied on pyenv to do a local build, which is incredibly finicky on heterogeneous fleets of computers.