
It makes me pretty mad and sad that people think static languages solve this problem at all. If you think they do, I have a version of liblzma for you to install. And if you think they do, do you release your libraries without version numbers because the compiler will catch any mistakes?


Theoretically static languages don't solve this problem, but in practice, programmers writing packages in a static language don't gratuitously break their API every release or so, which seems far too common in Python-land.


I’m not entirely sure that’s true, and I’m not sure it makes sense to extrapolate all dynamic languages from Python.

Huge amounts of effort are expended on Linux distros ensuring that all the packages work together. Many, maybe most, of those packages are written in static languages.

Many Python packages don’t have issues with things constantly breaking. I find NumPy, SciPy, the Scikits, and more to be rather stable. I can only think of making trivial fixes in the last few years. I have lots of exotic code using things like Numba that’s been long lived. I’m guessing Flask and Django are pretty stable at this point, but I don’t work on that side of things.

Packages undergoing a lot of construction are still less nice. I think that might be the nature of all new things, though. The example at the beginning of this article, TensorFlow, is still a relatively new sort of package and is seeing tons of new development.

Packaging in Python in 2024 still sucks, and the way it sucks is uniquely Python. Python's slowness necessitating wrapping lots of platform-specific binaries doesn't help. Seemingly even major Python projects like TensorFlow have only just started making a real attempt to version their dependencies. In one of the problems described in the article, TF pinned things way too specifically in the main project; one of the satellite projects had the opposite issue and didn't even set minimum bounds. The Wild West of unpinned deps makes it hard for upstream authors to even know they're breaking things.
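Both failure modes are easy to write down. A minimal setup.py sketch, with package names and version numbers invented for illustration (nothing here is taken from TensorFlow itself):

    # Hypothetical setup.py; names and versions are made up for illustration.
    from setuptools import setup

    setup(
        name="example-project",
        version="1.0.0",
        install_requires=[
            "numpy==1.24.3",      # too strict: blocks every compatible bugfix release
            "somelib",            # too loose: no minimum bound, any ancient version "satisfies" it
            "requests>=2.28,<3",  # a middle ground: accept compatible releases, exclude the next major
        ],
    )

The resolver can only flag conflicts that are actually declared, which is why the unbounded case leaves upstream authors flying blind.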

Many people know Python packaging sucks, but I don't think they know how bad it really is. The slowness is also peculiar to Python. Other languages like Julia and Clojure seem to handle these difficulties much better, and I think in large part that's due to early investment that kept the problems from festering.

Rust vs C++ is a good comparison, I think. Cargo is better than anything C++ has by far. In C++, it's common to avoid dependencies altogether because the best you've had historically is the OS-specific package manager. The issue isn't static vs dynamic. The issue is early investment in packaging and community uptake.


> TensorFlow, is still a relatively new sort of package and is seeing tons of new development.

But I thought TensorFlow is already "dead" and everyone is moving to Torch...?

Even if it's not dead, TF has been around for almost a decade by now.

The landscape of ML is changing rapidly, I'll grant you that, so I guess that might necessitate more visible changes, especially to APIs and dependencies...


It solves the issue of finding out that a function signature changed at compile time instead of at runtime, which is infinitely better. The real answer is that serious software developers don't leave their packages on auto-update, and rarely, if ever, update dependencies unless there's a good reason.
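A toy Python sketch of what that difference looks like (the function and URL are invented): the renamed parameter imports and loads cleanly, and the breakage only surfaces when the affected path actually executes, which is exactly what a compiler would reject up front.

    # Pretend a dependency's 2.0 release renamed a keyword argument.
    def fetch(url, timeout_seconds=30):   # was: fetch(url, timeout=30) in 1.x
        return f"GET {url} (timeout={timeout_seconds}s)"

    # A caller written against 1.x imports and loads without complaint...
    def sync_data():
        return fetch("https://example.com", timeout=5)

    # ...and only fails when this path finally runs:
    # TypeError: fetch() got an unexpected keyword argument 'timeout'
    sync_data()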


It doesn't solve the problem of the function body changing, though.
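A contrived Python example: the signature and even the type annotations are identical across releases, so no static check flags anything, yet every caller's behavior changes.

    # Two releases of a hypothetical library; the interface never changes.
    def normalize_v1(path: str) -> str:
        return path.rstrip("/")             # v1: strip trailing slashes

    def normalize_v2(path: str) -> str:
        return path.rstrip("/").lower()     # v2: quietly lowercases too

    # A type checker sees the same contract for both, but:
    print(normalize_v1("/Data/"))  # "/Data"
    print(normalize_v2("/Data/"))  # "/data" -- same types, different behavior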


It only finds the most trivial kind of mistake. It isn't infinitely better, because static typing is far from free.



