Hacker News

IMO this is the main reason the Python standard library is so wildly inconsistent. They don't really have the tools to migrate stuff painlessly, and the 'batteries included' approach with weak versioning means you can't change anything without breaking everyone who upgrades to a new Python version.


It's not about tooling. For example, one issue you can't solve with better tooling: how would you update all the textbooks that have working code examples? Do they all need to monitor for changes and release minor versions as well? That'd be terrible for the community.

The impact of backwards-incompatibility is often worse than you expect. If anyone should know that by now, it's the Python core team. Or perhaps the folks that faithfully waited for Perl 6.


Even if they changed the standard library from version to version... the result would be that people would stop using the standard library, migration tools or no. Nobody wants to be pinned to one specific minor version, no older and no newer - least of all library authors.


The correct solution is to version the standard library separately from the language and allow for versioned dependencies, then a new language/VM update doesn't imply a new library that breaks everything, and vice versa.

But Python grew up in an environment where this sort of thing was not practical, and batteries-included is actually a good approach for what Python tries to do - scripting. It just doesn't scale well into maintainability.


Once you have the standard library split apart, you might as well split it up, though, and then you don't really have a standard library any more. You could go the Haskell Platform route... which isn't a wonderful idea.


> Haskell Platform route... which isn't a wonderful idea

It's a bad idea, on short timeframes. But it's a hell of an adaptability bonus that ensures the language will keep improving.

Just like Haskell's extension system, or its loose dependency on Prelude, or its multiparadigm emulation.

I keep hoping something better than Haskell appears and people move on - but it probably won't happen any time soon: as soon as something better gets traction, Haskell will simply devour it and keep growing.


I think if Python were developed from scratch now, you would have a very small classic standard library, and stuff like HTTP servers/clients and JSON parsers would be separate libraries handled by the package manager. But since Python is a scripting language, it would make sense to ship some packages by default - so not a standard library, but, say, core packages. This would let you version the core packages like any other package, while still letting you run scripts without internet access or pulling random dependencies for a one-off script.


That is pretty much the Haskell Platform. Haskell people have issues with it because it contains a whole bunch of really useful packages, but essentially pins them at old versions globally. Upgrading them, then, is a global thing, and if someone depends on an older version... you're stuck.

The solution to this, of course, is sandboxing - never install libraries globally, only on a project-specific basis, and then their dependencies can override global ones. But it's fiddly to get the UX right - in Python, managing that involves two separate tools. And you'd need to create a project directory to get a repl with some library in it, unless you had extensions to the repl to install things temporarily - and then you'd likely have two separate UXes for installing things, whether you're doing it for a REPL or a project.
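In today's Python, that two-tool workflow looks roughly like this - a sketch, assuming a stock CPython where `venv` ships with the interpreter and bundles `pip` into the sandbox:

```shell
# Hypothetical project-local setup: two separate tools, one workflow.
python3 -m venv .venv                 # tool 1: venv creates the sandbox
./.venv/bin/pip --version             # tool 2: this pip installs into .venv only
# Inside the sandbox, the interpreter's prefix diverges from the global one,
# so project-local packages shadow globally installed ones:
./.venv/bin/python -c 'import sys; print(sys.prefix != sys.base_prefix)'  # True
```

And, as the comment above notes, none of this helps you at a bare REPL - you still have to create the directory and activate the environment before `import` sees the sandboxed packages.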


Then people will complain that you need to specify dependencies just to use a leftPad function.
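Though in Python's case that particular battery is already included - left-padding is just the built-in `str.rjust`:

```python
# No dependency needed: str.rjust pads on the left to the given width.
assert "42".rjust(5, "0") == "00042"     # pad with zeros
assert "hello".rjust(8) == "   hello"    # default pad character is a space
```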


Even with a consistent standard library, you can't really be confident that you've properly refactored all usages of a class or function unless you have 100% test coverage -- and even if you did, it would be difficult to automate the refactoring the way you could with a static language.


I don't think your theory is right (the tooling simply hasn't been given enough polish), but even so, the other side of the coin is that you can relatively easily hack in temporary migration paths. For instance, a function can examine the parameters it is given and convert them to the latest API, spewing out a warning.
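A minimal sketch of such a migration shim - the `connect` function and its old single-string signature are hypothetical:

```python
import warnings

def connect(host, port=None):
    """New API takes (host, port); the old API took one "host:port" string."""
    # Detect an old-style call, convert it to the new form, and warn.
    if port is None and isinstance(host, str) and ":" in host:
        warnings.warn(
            "connect('host:port') is deprecated; use connect(host, port)",
            DeprecationWarning,
            stacklevel=2,
        )
        host, port = host.rsplit(":", 1)
        port = int(port)
    return (host, port)

# Both call styles keep working during the migration window:
assert connect("example.com", 8080) == ("example.com", 8080)
assert connect("example.com:8080") == ("example.com", 8080)
```

The warning gives downstream users a release or two to migrate before the old form is removed.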

Backwards compatibility is mostly an attitude problem.


Yeah, that only works if all you have is straight calls to functions. As soon as you do stuff like assigning a function to a variable or making method calls, you need sophisticated code analysis - and that's not even touching the untraceable stuff like string/dynamic access, monkeypatching, etc.

Refactoring in Python is bad even if you restrict yourself to "sane" code (no metaprogramming or abuse of dynamic features, so the tools can follow what you're doing). If you need something that will work for everything out there, it's just impossible.
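A small illustration of the escalation described above - the `Service` class is made up, but each call site is progressively harder for a rename tool to find:

```python
class Service:
    def fetch(self):
        return "data"

svc = Service()

# Easy for tooling: a direct method call, visible in the AST.
direct = svc.fetch()

# Harder: the bound method escapes into a variable first.
f = svc.fetch
indirect = f()

# Untraceable in general: the attribute name is assembled at runtime,
# so a rename of `fetch` can never find this string.
dynamic = getattr(svc, "fet" + "ch")()

assert direct == indirect == dynamic == "data"
```

Renaming `fetch` to anything else silently breaks the last call while static analysis reports no usages.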





