I think Python and Go are two examples where the standard library has very much proven invaluable.
Without it, it's difficult to be confident in the portability of components, and it quite frankly makes the language less attractive for use.
Like the other poster mentioned, I'm happy for the Rust folks to take a "wait-and-see" approach, but at some point, I believe "blessed" components are going to be expected and strongly desired.
Python is a prime example of a rotten standard library.
Take urllib/urllib2 (use requests instead), unittest (use py.test or nose), os (too low-level, hence arcane usage) or time as examples (use pytz for anything serious), and of course Tkinter.
Take all of the modules that solve minor/niche tasks that could easily have been put in a separate library (e.g. wave), that are usually a bad idea to use (e.g. pickle), that relegate useful functions to copy/pasteable "recipes" in the documentation (itertools), or that have documented bugs with copy/pasteable workarounds (csv).
Oh, and the way they do exceptions is a mess.
Oh, and don't try to read the Python standard library's source code. It's ugly.
No, standard libraries should constrain themselves to providing a good foundation for library designers.
(I still like Python, and use it a lot.)
(Since you mentioned Go: one thing that I love about Go is that every interaction with the underlying OS goes through the syscall package. Also, its designers went for more minimalism, and they didn't have to worry about design mistakes made 25 years ago. Python is old. I'm not a fan of Go's compatibility promise: it means Go 1 will rot away too. But they're in a much better starting position.)
It all depends what you're using it for. If I'm writing code that's going to be around for a while and can tolerate some dependencies, I'll totally use, say, requests. But I use urllib2 from the REPL all the time, especially if I'm on a machine that isn't mine. The fact that any Mac already has the tools on it to grab some JSON from an API, parse it, and do something useful with it from the command line, without having to download or install anything, is immensely useful, even if the API is a bit suboptimal. The same applies to quick and dirty things I'm shoving into a Gist to share with colleagues. Not all things have to be elegant to be useful.
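For what it's worth, that quick-and-dirty pattern looks something like this. A sketch only: the `data:` URL is a stand-in for a real API endpoint so the snippet is self-contained, and on Python 2 the import would be `urllib2` rather than `urllib.request`.

```python
import json
import urllib.request

# Percent-encoded stand-in for a real API endpoint; in practice you'd
# point this at any http(s) URL. (On Python 2: urllib2.urlopen.)
url = "data:application/json,%7B%22name%22%3A%20%22stdlib%22%2C%20%22stars%22%3A%203%7D"

# No pip, no third-party packages: fetch and parse JSON with the stdlib only.
with urllib.request.urlopen(url) as resp:
    payload = json.load(resp)

print(payload["name"], payload["stars"])
```

Not elegant, but it runs on a stock machine with nothing installed, which is the whole point.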
(That said, those kinds of use cases aren't really in Rust's wheelhouse, so I still think in Rust's case having batteries not included is probably the right call.)
It's possible to achieve the best of both worlds. We could introduce the concept of a standard distribution instead of a standard library. A release of Python should ship with specific versions of requests, nose, and other packages that have a broad community consensus. Those packages could be individually upgraded later, but every install of Python 2.7.9, for instance, would have requests 2.5.1 or greater installed. That would avoid the stagnation packages see when they enter the standard library, and would maintain the benefits of universal availability.
I agree about urllib and unittest. However, over the last decade I still find Python to be one of the best "batteries included" languages/platforms.
That is no small thing, either. It makes getting started easier, which in turn gets more people to use it.
Here is a list of modules I used and was happy there were in stdlib:
socket, shelve, cPickle, tarfile, urllib[2], Tkinter, time, ctypes, subprocess, asyncore, json, SimpleXMLRPCServer, wave (sorry, I did use it many times ;-) ), timeit, syslog, and many others.
While I'll readily agree that some parts of the standard library are rotten, that's not sufficient justification to say that there shouldn't be one.
I should also clarify my expectations about a standard library; to me, a standard library should have all of the basics covered (interaction with the underlying system, I/O, networking, etc.) and anything that benefits from better integration with the runtime (think data types such as those found in python's collections module).
If anything, I'd argue that the main problem with Python's standard library is not the library itself, but the lack of more focused curation.
Note that I never said that I expect all functionality to be available in a language's standard library; for me personally, Go's standard library has roughly the right balance.
The other thing that really needs to be in the standard library is protocols & interfaces that will be implemented by a number of userspace libraries. Go benefits immensely from having standard io.Reader and io.Writer types and most people implementing them instead of defining their own. Similarly, most of its web frameworks use http.ResponseWriter and http.Request instead of defining their own. Python's unittest module may be a mess as a test framework, but all the major Python unittest frameworks take a unittest.TestCase, which keeps tests portable among the different systems.
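To make the unittest point concrete, here's a minimal sketch: a plain unittest.TestCase that pytest (and, back then, nose) can collect and run unchanged, which is exactly what keeps suites portable across runners.

```python
import unittest

# An ordinary unittest.TestCase. Third-party runners such as pytest
# collect and run classes like this as-is, so the suite stays portable.
class TestSlicing(unittest.TestCase):
    def test_reverse(self):
        self.assertEqual([3, 2, 1][::-1], [1, 2, 3])

# Running it through the stdlib's own runner:
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestSlicing)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())  # -> True
```

The shared base class plays the same role here that io.Reader/io.Writer play in Go: a standard interface everyone targets instead of inventing their own.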
The worst case is exemplified by pre-STL C++, which didn't even have a string type in the stdlib. As a result, every project and library wrote their own, which meant that you basically had to choose a C++ ecosystem and develop for it rather than write libraries that are portable across multiple C++ projects.
... and a sufficiently large C++ application would use 10 different string classes in various parts of it, likely with different text encodings too, with lots of fun converting between them. Even a crappy UTF-16 based String like Java's is better than such a mess.
What's wrong with the exceptions? I never noticed that problem but agree completely on all your other points. (Especially itertools – why aren't the "recipes" defined in the module?!)
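For the curious, this is the kind of thing meant by the itertools "recipes": for instance, pairwise lived for years only as a copy/paste snippet in the docs rather than as an importable function (it finally landed as itertools.pairwise in Python 3.10).

```python
from itertools import tee

# The "pairwise" recipe, copied from the itertools documentation.
def pairwise(iterable):
    "s -> (s0, s1), (s1, s2), (s2, s3), ..."
    a, b = tee(iterable)
    next(b, None)
    return zip(a, b)

print(list(pairwise([1, 2, 3, 4])))  # -> [(1, 2), (2, 3), (3, 4)]
```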
In my experience, "batteries included" is one of the best features of Python.
The standard library is fine for simple and small scripts, especially in restricted environments without pip or sudo rights. There are a lot of awesome Python libs (like requests), but it is awesome to have the stdlib available everywhere and be able to depend on it.
The opposite is also true; a good package manager can keep the stdlib from bitrotting à la Python, because you can easily swap out an old module by simply repackaging it as a third-party dependency.
On the contrary, having a standard set of batteries, especially ones upon which others might be built, is fundamental to sane ecosystem development. For instance, a standard framework for async I/O programming is important because people can then write thousands of protocols that are fully interoperable; if 2-3 different frameworks arise, each develops its own ecosystem of protocols and libraries (not interoperable with the others), and it's hard to imagine each one being as rich as the single ecosystem in the former scenario. The same can be said for an HTTP library, a threading/concurrency library, an XML/JSON marshaling library, and so on.
I think that crates.io still has a ways to go before the situation is ideal. Discoverability of libraries is the first pain point to look at. After that, I'd like to see crates.io automatically parse an uploaded package's docs (thank you, rustdoc!) and host them on the site itself.
I strongly disagree with that assertion; while I readily agree that some parts of the standard library are less maintained than others, as a consumer of many third-party components, I can say without a doubt that items in the core library are generally better maintained (from a security perspective) and easier to deal with.
The ease of installation has nothing to do with the desire for core components. Core components generally bring certain expectations/guarantees about security, reliability, and support.
That's just not true. Standard library components are not better maintained, not more secure, and not easier to deal with. There are too many examples to even count of each of those -- you can take a look at PEP 476 for just one recent example.
You're arguing with a general observation. The fact that it's not universally true is unsurprising. If you take the average quality of third-party libraries and the average quality of libraries in the standard library, I think you will find that the standard library is generally pretty good.
Whether it's better than the average is irrelevant; that isn't the benchmark. It needs to be better than or equal to all of the third-party libraries for it to be worthwhile; otherwise, why does it exist? When you pick a library, you don't pick all of them; you pick the one that best meets your criteria, and the standard library can't beat the flexibility third parties have to meet those criteria without having to go through the upstream process.
(It's also not true that the standard library is generally pretty good, IMHO, but that starts to veer us into subjective territory [which I think the comment above my original post has already done anyhow].)
> When you pick a library, you don't pick all of them; you pick the one that best meets your criteria, and the standard library can't beat the flexibility third parties have to meet those criteria without having to go through the upstream process.
So, you're using three libraries that all depend on something like urllib, and they each use the library that fits their use case best. Now you need to debug/review/depend on updates for 6 (7 if you also use the stdlib for something) foreign codebases rather than 3 plus the standard library.
It's a trade-off between old/new, good/best, simple/complex (or complicated/complected). A standard lib that needs to maintain stability for 10+ years can never be "best" for all that time. But by being "good enough" it can often still be the best choice overall when the life cycle of a project is considered.
Sorry; I'll have to disagree. As part of a group that maintains Python packages for an operating system distribution, my experience has generally been more positive for the standard library compared to third-party components. Particularly when it comes to security issues.
> I think Python and Go are two examples where the standard library has very much proven invaluable.
C#. The amount of time I've wasted in Java programming teams arguing over which of three quirky XML parsing implementations[1] was the One To Use while the .NET team just went off and built useful functionality...
C# seems to be in a similar position when it comes to JSON. There's both System.Runtime.Serialization.Json and System.Web.Script.Serialization built-in plus a whole bunch of third-party libraries (e.g. Json.NET seems to be popular). I'm still not sure which one to use.
Having a standard library for a specific task doesn't forbid alternatives from being implemented and used. But it discourages them enough, which is very good for consistency (especially when reading code), lowers the barrier to entry (e.g. I don't have to learn how to parse JSON with whatever specific library the project I'm contributing to is using), reduces binary size (I don't link three different HTTP libraries: mine and the ones chosen by my dependencies), and generally reduces the time humanity wastes on useless duplication of effort. When the standard sucks, well, good alternatives will appear.
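A tiny illustration of the consistency point: because json is in the stdlib, this idiom is instantly recognizable in any Python codebase, whatever the project's other dependencies happen to be.

```python
import json

# The universal stdlib idiom: any Python reader knows what this does
# without first learning a project-specific JSON library.
data = json.loads('{"ok": true, "count": 2}')
print(data["ok"], data["count"])  # -> True 2
```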
Also, the sad state of Python packaging is the reason why the stdlib bitrotted; technically, your package manager could take care of backward compatibility: if, e.g., at some point in the future you want to switch from one json library to a (far) better one, more in line with how the idiomatic language has evolved, you just need to repackage the old one as a third-party package and make sure the package manager brings it down for the user automatically.