Codechef appears to use Python 3.1.2 for "Python3", which is pretty old (released March 21st, 2010) and predates some significant performance improvements in Python 3[0].
Fedora 22 is ... 3 releases out? So we're talking about 12-18 months depending on how soon F20 is due to be released. Still, I'm surprised, and I've been a Fedora contributor since FC5.
Adblock plugins slurp down new lists without updating the plugin code itself. Newer techniques, sure, but I don't think ABP's release cycle is substantially faster than Firefox's.
I've often seen normal people come up with deeply nested directory structures with long descriptive names, and they get bitten hard by the 260-character limit. For example, names like this are very common in my experience:
C:\Documents and Settings\Joe Random User\My Documents\Some Cloud Software\Company\Projects\2013\156 Big Customer Inc\Locations\Someville Foostreet\Problem reports\2013-10-07 product not working\images\video of product not doing what it's supposed to do.avi
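That name alone is already brushing up against the limit. A quick Python check (using the path above verbatim) shows how little headroom is left before Win32's classic MAX_PATH of 260 characters (which includes the drive letter and the terminating NUL) runs out:

```python
# Length check for the example path above. The 260 figure is the
# classic Win32 MAX_PATH, counting drive letter and terminating NUL.
path = (r"C:\Documents and Settings\Joe Random User\My Documents"
        r"\Some Cloud Software\Company\Projects\2013"
        r"\156 Big Customer Inc\Locations\Someville Foostreet"
        r"\Problem reports\2013-10-07 product not working\images"
        r"\video of product not doing what it's supposed to do.avi")

# Just shy of the limit: renaming almost any folder to something a
# few characters longer pushes the whole path over the edge.
print(len(path))
```

None of these folder names is individually unreasonable; the limit is on the total, which is why ordinary users hit it without ever doing anything exotic.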
Not necessarily want, but I've come across it just by installing a node project into my home folder on XP, something along the lines of:
C:\Documents and Settings\____._______.________\project\node_modules\package\src\tests\somelib\helpers\test123\file.html
What made this troublesome for me is that my networked profile refused to sync while the file was there (and it couldn't be removed without renaming each layer to a single-character folder). Additionally, the profile sync failure copied back files I had deleted, confusing the heck out of me :)
I seem to have some slightly longer file paths on my home computer. Do I need that? Probably not. I could move those files somewhere else. But should I really have to worry about such things?
Reached 248 here when run against my home dir. Interestingly, the winner is the path to a transitive npm dependency five(!) levels down the dependency tree. The command I used:
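(The commenter's actual command isn't shown, so this is a reconstruction, not their one-liner.) A scan like the one described can be sketched in Python; the directory to scan is an assumption:

```python
import os

def longest_path(root):
    """Walk a directory tree and return the longest full path found,
    checking both files and directories along the way."""
    longest = root
    for dirpath, dirnames, filenames in os.walk(root):
        for name in filenames + dirnames:
            candidate = os.path.join(dirpath, name)
            if len(candidate) > len(longest):
                longest = candidate
    return longest

# Scan the home directory, like the run described above.
winner = longest_path(os.path.expanduser("~"))
print(len(winner), winner)
```

A shell equivalent (`find | awk '{print length, $0}' | sort -n`) does the same job; the point either way is that npm's nested `node_modules` layout routinely wins this contest.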
Right now, sure. But suppose some piece of software goes rogue and starts creating dirs and files. Further suppose that there's enough of them to fill the disk. OK, now your disk is full and nothing on the machine works, so you need to delete those deeply-nested files. Except, whoops, the paths are too long, so you can't delete them from the command line. OK, now you're writing a script or burning some godawful boot CD to try to get to a safe mode or Linux kernel. All in order to delete some silly unnecessary files.
Problems like this aren't problems when everything is working normally. But that's beside the point. When the shit hits the fan and the system is only getting in your way and adding to the problems instead of helping solve them, it's really really really not fun.
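To make the failure mode concrete, here's a sketch (Python, run on a POSIX system where long paths are fine) of the kind of structure a runaway program could leave behind. On Windows, the usual escape hatch is the `\\?\` extended-length path prefix, which bypasses the 260-character limit for tools that support it; the directory name here is made up for illustration:

```python
import os
import shutil
import tempfile

# Simulate a runaway program nesting directories until the full
# path blows well past the classic Win32 MAX_PATH of 260 chars.
root = tempfile.mkdtemp()
path = root
while len(path) < 300:
    path = os.path.join(path, "deeply_nested_dir")
os.makedirs(path)

print(len(path))  # > 260: many Win32 tools would choke on this

# On Windows you'd prepend \\?\ (e.g. \\?\C:\...) to get past the
# limit; on POSIX, plain recursive deletion works regardless of depth.
shutil.rmtree(root)
print(os.path.exists(root))  # False: cleanup succeeded
```

The catch the comment describes is exactly that not all Windows tooling accepts the `\\?\` form, so in a disk-full emergency you can be stuck renaming layers one by one.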
Certain SBT plugins and build settings can generate really deeply nested paths under the "target" folder of a project. I think the point of this is to enforce file immutability during the build process - whenever a file is modified by a build stage, a new file is created instead.
I could be completely misunderstanding this though since SBT is fairly complex.
I don't think people do; that doesn't mean it doesn't happen by accident. Until we can get the file system as a database, with all of this abstracted away, it can be a real issue. That said, the target here is a developer - a bit of careful planning can avoid said complications.