This is my take-away as well. Many projects let warnings fester until they hit a volume where critical warnings are missed amidst all the noise. That isn't ideal, but it seems to be the norm in many spaces (for instance the nodejs world, where it's just pages and pages of warnings and deprecations and critical vulnerabilities and...).

But pushing breaking changes just to suppress some new warning should not be the alternative. Working to minimize warnings in a pragmatic way seems more tenable.
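As a sketch of what "pragmatic" can look like in C (standard GCC/Clang pragmas; the deprecated function here is hypothetical), you can silence one known warning at the specific call site instead of shipping a breaking change to get rid of it:

    #include <stdio.h>

    /* Hypothetical old API, kept so downstream callers don't break. */
    __attribute__((deprecated("use new_init instead")))
    static void legacy_init(void) { puts("legacy init"); }

    int main(void) {
        /* Acknowledge and suppress this one deprecation warning at the
           call site; everything else still warns as usual. */
    #pragma GCC diagnostic push
    #pragma GCC diagnostic ignored "-Wdeprecated-declarations"
        legacy_init();
    #pragma GCC diagnostic pop
        return 0;
    }

The suppression is local and greppable, so the rest of the warning output stays meaningful.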




Ironically, as a NodeJS dev, I was going to say the opposite: I'm very used to the idea that you have a strict set of warnings that block the build completely if they fire, and I find it very strange in the C world that this isn't the norm. But I think that's more to do with being able to pin dependencies more easily: by default, everyone on projects I work with uses the same set of dependencies always, including build dependencies and NodeJS versions. And any changes to that set of dependencies will be recorded as part of the repository history, so if new warnings/failures show up, it's very easy to see what caused it.
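For what it's worth, you can get something like that "warnings block the build" setup in C too. A sketch (real GCC/Clang flags; module.c is a placeholder filename): promote a curated subset of warnings to hard errors while leaving the rest as ordinary warnings:

    /* Fail the build on a chosen subset of warnings, similar to a CI
       step that rejects any lint warning:

         gcc -Wall -Wextra \
             -Werror=implicit-function-declaration \
             -Werror=return-type \
             -c module.c

       Other warnings still print, but these classes stop the compile. */
    int add(int a, int b) { return a + b; }

In practice this only stays green if the toolchain itself is pinned, which ties into the dependency point below.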

Whereas in a lot of the C (and C++, and even older Python) codebases I've seen, these sorts of dependencies aren't locked to the same extent, so it's harder to track upgrades, and therefore warnings are more likely to appear, well, without warning.

But I think it's also probably the case that a C expert will produce codebases with no warnings and a C novice will produce codebases filled with warnings, and the same goes for JS. So I can imagine that if you're just "visiting" the other language's ecosystem, you'll see worse projects and results than if you've spent a while there.



