I consider myself fairly promiscuous about trusting software, but sometimes I just can't. Seeing how Signal Desktop ships 100MB updates every week, or the big ball of coalesced mud that is the TypeScript compiler, made me avoid them. Why isn't there more pushback against that complexity?
When something complex exists, it's usually because the alternatives are worse. Would you have fewer issues with 10MB updates? 1MB? One megabyte is a lot of text; a good novel for a week of evening reading can be less than that.
I think the concern OP has is why a lot of the updates are so large. I use Signal Desktop and the UI hasn't changed in years. It raises the question of what those 100MB actually contain and whether they're necessary.
That's only formally a choice. In reality, you'll depend on some delivery system (among many other systems) that is part of some build/dev system that is part of a job-market conjuncture.
And all of that is completely out of your control and competence budget, unless you're fine with shipping your first 50KB update ten (metaphorical) years later.
As long as the end result works and doesn't pile up install data ad infinitum on the system I wouldn't bat an eye at something that takes 2 seconds to download over an average internet connection.
What really grinds my gears is updates that break things. Sometimes on purpose, sometimes out of incompetence, but most often out of not giving a single fuck about backwards compatibility or the surrounding ecosystem.
Every few years I lull myself into a false sense of security about running apt upgrade, until it finally destroys one of my installs yet again. Naturally, only one previous package version is ever kept, so a revert is impossible if you've gone more than two releases without upgrading. Asshole-ass design. Don't get me started on Windows updates (actual malware) or new Python versions...
It pulls in so many dependencies, and the npm situation is a frequent topic of conversation here. And the hype makes it an attractive target. No idea how MS and GitHub relate to that.
A few years ago some guy demonstrated how vulnerable the npm ecosystem is, but npm chose to shoot the messenger instead of fixing the problem. It makes me think the three-letter agencies want software to be vulnerable to make their job easier.
Can you point out some examples of npm shooting messengers? I recall mostly silence, and new security controls appearing (albeit opt-in) in response to the crisis.
What exactly are you referring to? TypeScript specifically has zero dependencies.
Generally speaking, I agree: the npm ecosystem still has the pervasive problem that pulling in one package can drag in many transitive dependencies, but a growing number of well-known packages try to keep that as limited as possible. Looking at the transitive dependency graph is good (necessary) hygiene when picking dependencies, and when done rigorously enough there shouldn't be too many bad surprises, at least in my experience.
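To make "looking at the transitive dependency graph" concrete: `npm ls --all` prints the full tree, but you can also count installed packages straight from the lockfile. A minimal sketch, assuming the package-lock.json v2/v3 format where "packages" maps node_modules paths to metadata (the synthetic lockfile and the `helper` package below are made up for illustration):

```python
import json

def count_transitive_deps(lock_text: str) -> int:
    # In package-lock v2/v3, "packages" maps node_modules paths to metadata;
    # the "" key is the root project itself, so we exclude it from the count.
    lock = json.loads(lock_text)
    return sum(1 for path in lock.get("packages", {}) if path)

# Synthetic lockfile: one direct dependency pulling one transitive dependency.
example = json.dumps({
    "name": "app",
    "lockfileVersion": 3,
    "packages": {
        "": {"dependencies": {"left-pad": "^1.0.0"}},
        "node_modules/left-pad": {"version": "1.3.0"},
        "node_modules/left-pad/node_modules/helper": {"version": "0.1.0"},
    },
})

print(count_transitive_deps(example))  # → 2 installed packages beyond the root
```

A number in the hundreds from a handful of direct dependencies is exactly the kind of surprise this hygiene check is meant to catch before you commit to a package.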
Interesting perspective. I suppose complex minifiers would also be an attack vector, since the obfuscation makes it harder to eyeball even obvious deviations.