a_cool_username's comments

Those are definitely understandable reasons to lose confidence in elections and feel disillusioned, but voter ID laws won't help you there (which was GP's point).


The PyPA team are just not capable stewards of core parts of the Python ecosystem. As a maintainer and developer of Python-based tools and libraries, it is very frustrating having these folks push some change that they want, simply oopsie a significant chunk of the Python ecosystem, and then go dark for hours.

They've done it this time by making a poor architectural decision ("Isolated builds should install the newest setuptools") and then layering on a poor library-maintenance decision ("We'll remove this feature that thousands of still-active packages depend on"). Possibly each of these decisions was fine in a vacuum, but when you maintain a system that people depend upon like this, you can't simply push this stuff out without thinking about it. And if you do decide to do those things, you can't just merge the code and call it a day without keeping an eye on things and figuring out if you need to yank the package immediately! This isn't rocket science; everyone else developing important libraries in the Python world has mostly figured this stuff out. In classic PyPA form, it sounds like there was a deprecation warning, but it only showed up if you ran the deprecated command explicitly, while the simple presence of this command causes package installs to fail. You have to at least warn on the things that will trigger the error!
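
One mitigation sketch (assuming your build still works with an older setuptools; the bound below is illustrative, not an official fix): PEP 518 lets a package pin its own build dependencies in pyproject.toml, so isolated builds stop pulling whatever setuptools was released an hour ago:

    [build-system]
    # illustrative pin; without "requires", build isolation installs the newest setuptools
    requires = ["setuptools<58", "wheel"]
    build-backend = "setuptools.build_meta"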

These days I try to rely on the absolute minimum number of packages possible, in order to minimize my PyPI exposure. That's probably good dev practice anyway, but it's really disappointing that I can't rely on third-party libraries basically at all in Python, when the vast PyPI package repository is supposed to be a big selling point of the language. But as a responsible developer I must minimize my pip/setuptools surface area, as it's the most dangerous and unreliable part of the language. Even wrapper tools are not safe, as you see in the thread.


You might want to try getting them from apt-get. They're usually more stable there and get patched if they fail to install or fail to work with a newer version of something else.
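
For example (package name illustrative; Debian and Ubuntu prefix Python 3 libraries with "python3-"):

    sudo apt-get install python3-requests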


The article notes that this is actually in the tens of thousands of dollars:

>Twitter’s API service limits tweets for non-paying users to 1,500 a month — not enough for many emergency accounts. And while a small fee for the platform's 'Blue Check' service will increase that ceiling for individual users, the cost of an enterprise account has reportedly climbed into the tens of thousands of dollars.


If you're an even remotely competent developer in a language used by more than one company, have a reasonably firm grasp of the English language, can work with others reasonably well, and can be bothered to apply for jobs, then there are definitely plenty of opportunities for your skills. Plenty of companies don't even care about the language, they figure you'll pick it up. The only way this isn't true is if you are already hugely overpaid or if you come off terrible in interviews. A year ago I would have said "or are a convicted felon" but I think plenty of companies don't even care about that anymore, that's how huge demand is. You don't even have to live somewhere with jobs anymore so long as you have a reliable Internet connection!

If you just sit around letting your current company take advantage of you, and complaining about not getting a big enough raise, you definitely won't find those opportunities though. You have to interview a lot.


They keep me from becoming an expert by having me work full stack across multiple stacks with frequent context switching. I tend to be slow in the coding screens.


Maybe there's something I don't understand, but per this table it's slightly worse to be married filing separately than it is to be single at the top brackets: https://www.bankrate.com/taxes/tax-brackets/

edit: changed "much" to "slightly"


Once you hit the sweet spot of developing cross-platform (even just Linux, macOS, and Windows), supporting ordinary non-technical users, and having (even optional!) C dependencies, Python's packaging situation quickly deteriorates into "nightmare" territory.


This is exactly my problem. I have to support Windows (a locked down corporate version) and Linux.


>A key question arises: why are so few repositories type-correct?

The authors don't seem to ever discuss the fact that mypy version changes frequently make previously-passing code fail type checks.

I don't think I have ever upgraded mypy and not found new errors. Usually they're correct, sometimes incorrect, but it's a fact of being a mypy user. Between mypy itself and typeshed changes, most mypy upgrades are going to involve corresponding code changes. The larger your code base and the more complicated your types are, the worse it'll be, but it's basically an ever-present issue for any program interesting enough to really benefit from a type checker.

How many of those repositories were "type-correct" but only on particular versions of mypy? I bet it's a lot!
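
If nothing else, a claim of "type-correct" is only reproducible against a pinned checker, something like (version illustrative):

    pip install mypy==0.790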


I don't have this experience. Can you give an example of a previously-valid codebase that failed typechecking unexpectedly on a recent mypy update, that wasn't the result of a false-negative bug in mypy?


I can't give you any links because it's not open source code, but there was a bug fix in 0.790 named "Don't simplify away Any when joining union types" that caused me some problems with bad annotations in existing code: the annotations implied that Any was possible (it wasn't), but the bug dropped Any from the final Union, so we never had to handle it. Dataclasses have had some backwards-incompatible improvements as well.

But the big culprit is typeshed. Something will get new/fixed annotations and suddenly you aren't handling all possible return types in your callers, or whatever.
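
A hypothetical sketch of that failure mode (the stub history in the comment is made up for illustration):

    import re

    def first_word(s: str) -> str:
        m = re.match(r"\w+", s)
        # If a stub update narrows a return type (say, from Any to
        # Optional[Match[str]]), previously-clean callers suddenly fail:
        #   error: Item "None" of "Optional[Match[str]]" has no attribute "group"
        return m.group(0)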


I've had this problem a few times. For example, it happened with the 0.800+ versions: mypy got stricter, and was more aggressive in finding code to check (e.g. small scripts not in the proper Python package hierarchy).

I can't show any of my professional work (no publicly available src) but this side project of mine is locked to 0.790 until I can find time to sort the issues: https://github.com/calpaterson/quarchive/tree/master/src/ser...

It's hard to classify anything as a "false negative" with mypy since it is very liberal (often unexpectedly so, which I think is one of the sharp edges of gradual typing).


I obviously can't share it here, but my primary codebase at my job needs a few dozen changes every time we change mypy versions. There are a few false positives where I've had to add a `# type: ignore` with a `# shut up, mypy` comment. Usually when I'm being clever with the subprocess module.
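
The classic instance of that pattern (this exact snippet is illustrative, but the Optional-pipe complaint is real):

    import subprocess

    proc = subprocess.Popen(["echo", "hi"], stdout=subprocess.PIPE)
    # typeshed types proc.stdout as Optional[IO[bytes]]; with stdout=PIPE it
    # can't actually be None here, hence the escape hatch:
    data = proc.stdout.read()  # type: ignore  # shut up, mypy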


I wonder if that's a mypy issue or more that the typeshed types are bugged, since typeshed versions also get shipped (used to, anyway?) with new type checker versions.

https://github.com/python/typeshed


In a codebase with tens of thousands of lines of typed code, I see maybe a few new type errors with a new mypy release. They've always been fixable within a few minutes.

I think mypy has some problems, but this isn't one of the bigger ones for me.


I don't think it's a "problem" with mypy; I just think it likely explains a lot of the programs that the authors think don't type-check.

Though I will say, while most have been fixable in a few minutes, some have been a real chore to fix. Sometimes an innocuous-looking error balloons into several hours of untangling obscure type-system behavior once you start fixing it. Regardless, it's a small price to pay for proper type checking in Python. I've more than made up the lost time in detecting bugs before they ship.


Even for fully statically typed languages like C++, it is very common that some old code can't compile with the latest compiler. Shrugs.


No it isn't. C++ has extremely good backwards compatibility.


I disagree. C++ often removes language features, standard library features (std::random_shuffle), etc. Also, object files compiled against different standard versions are often ABI-incompatible with each other, which means you can't just adopt the new C++ standard for new code; it's all or nothing.

You can argue that when it removes features it provides a replacement (sometimes), but that does not change the fact that if you have any reasonably large project (>1 million LOC), every standard upgrade will break your app.

One of the main reasons we write all new code in Rust and are migrating the C++ code base step by step to Rust is that Rust offers infinitely better backward compatibility guarantees than C++.

Rust never ever breaks your code, and you can opt in to newer Rust editions for new code only; crates on different editions interoperate freely with Rust code written using older editions.


Even aside from deliberate backwards-compatibility breaks in the standard, compilers sometimes break compatibility. Both MSVC and GCC 11 have changed their headers' transitive includes within the past few years, causing projects (like doctest and Qt5) that forgot to include headers directly to stop compiling; they built fine in the past, but not anymore. IDK if it's "very common", but it's definitely happening in the wild.

MSVC: https://github.com/onqtam/doctest/issues/183

GCC:

- https://invent.kde.org/qt/qt/qtbase/-/commit/8252ef5fc6d0430...

- https://invent.kde.org/qt/qt/qtbase/-/commit/cb2da673f53815a...


In the C and C++ worlds, this is the same line of thinking that kept new warnings out of "-Wall" for so long.


I’ve found the opposite with pyright. Code that seems right but is failing checks is fixed with the next release :)


Sometimes! A very simple example is code that uses "async" as a variable name. It became a soft keyword in 3.5 and a fully reserved keyword in 3.7, which was an enormous pain in the ass.
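
For example:

    async = True  # fine on Python 3.5, a DeprecationWarning on 3.6, a SyntaxError on 3.7+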


Let me start by saying: I love Python, and I love developing in it. It's the "a pleasure to have in class" of languages: phenomenal library support, not too painful to develop in, nice and lightweight so it's easy to throw together test scripts in the shell (contrast that with Java!), and easy to specify simple dependencies and install them (contrast that with C!).

That said... if you work on software that is distributed to less-technical users and has any number of dependencies, Python package management is a nightmare. Specifying dependencies is just a minefield of bad outcomes.

- If you specify a version that's too unbounded, users will often find themselves unable to install previous versions of your software with a simple `pip install foo==version`, because some dependency has revved in an incompatible way or, even worse, specified a dependency version that conflicts with another dependency. pip does a breadth-first search over dependencies and will happily install totally incompatible versions even when a valid satisfying set exists.[1]

- If you specify a version with strict version bounds to avoid that problem, users will whine about not getting the newest version/conflicting packages that they also want to install. Obviously you just ignore them or explain it, but it's much more of a time sink than anyone wants.

- In theory you can use virtualenvs to solve that problem, but explaining how those work to a frustrated Windows user who just spent hours struggling to get Python installed and into their `PATH` is no fun for anyone. Python's made great strides here with their Windows installers, but it's frankly still amateur hour over there.

- Binary packages are hell. Wheels were supposed to make Conda obsolete but as a packager, it's no fun at all to have to build binary wheels for every Python version/OS/bitness combination. `manylinux` and the decline of 32-bit OSes has helped here, but it's still super painful. Having a hard time tracking down a Windows machine in your CI env that supports Python 3.9? Too bad, no wheels for them. When a user installs with the wrong version, Python spits out a big ugly error message about compilers because it found the sdist instead of a wheel. It's super easy as a maintainer to just make a mistake and not get a wheel uploaded and cut out some part of your user base from getting a valid update, and screw over everyone downstream.

- Heaven help you if you have to link against C libraries you don't control that have shitty stability policies (looking at you, OpenSSL[2]). Users will experience your package breaking because of simple OS updates. Catalina made this about a million times worse on macOS.

- Python has two setup libraries (`distutils` and `setuptools`) and on a project of any real complexity you'll find yourself importing both of them in your setup.py file. I guess I should be grateful it's just the two of them.

- Optional dependencies are very poorly implemented. It still isn't possible to say "users can opt in to just a specific dependency, but by default get all options". This is such an obvious feature; instead you're supposed to write a post-install hook or something into distutils (see the sketch after this list).

- Sometimes it feels like nobody in the Python packaging ecosystem has ever written a project using PEP 420 namespace packages. It's been, what, 8 years now? And we're just starting to get real support. Ridiculous.
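
To make the optional-dependency point concrete, here's a minimal sketch of the extras mechanism (package and extra names hypothetical): every extra is strictly opt-in, so the closest you can get to "all options by default" is an "all" extra that users still have to ask for:

    from setuptools import setup

    setup(
        name="myapp",                    # hypothetical package
        install_requires=["requests"],
        extras_require={
            # installed only via `pip install myapp[fast]` etc.;
            # there is no way to make an extra on by default
            "fast": ["ujson"],
            "all": ["ujson", "lxml"],    # common workaround
        },
    )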

I could go on about this for days. Nothing makes me feel more like finding a new job in a language with a functioning dependency manager than finding out that someone updated a dependency's dependency's dependency and therefore I have to spend half my day tracking down obscure OS-specific build issues to add version bounds instead of adding actual features or fixing real bugs. I have to put tons of dependencies' dependencies into my package's setup.py, not because I care about the version, but because otherwise pip will just fuck it up every time for some percentage of my users.

[1] I am told that this is "in progress", and if you look at pip's codebase the current code is indeed in a folder marked "legacy".

[2] I 100% understand the OpenSSL team's opinion on this and as an open source maintainer I even support it to some degree, but man oh man is it a frustrating situation to be in from a user perspective. Similarly, as someone who cares about security, I understand Apple's perspective on the versioned dylib matter, but that doesn't make it suck any less to develop against.


> struggling to get Python installed and into their `PATH` ... it's frankly still amateur hour over there

But that has been solved on Windows for quite a while hasn't it?

Python installs the "py" launcher on the path, which allows you to run whichever version you want of those you have installed. Just type "py" instead of "python". Or "py -3.5-32" to specifically run 32-bit Python 3.5, or "py -0" to list the available versions.


It's gotten a lot better, but we still hit tons of issues with users who don't know which Python version they installed their application into. Oh, and of course our "binaries" in Scripts/bin don't seem to show up in the PATH by default. So I get to tell people "py -3.8-64 -m foo" on Windows, "foo" everywhere else.

This gets much much worse when a new version of Python comes out and we don't support it yet (because of the build system issues I mentioned). I spent several weeks teaching people how to uninstall 3.8 and install 3.7 before we finally got a functioning package out for 3.8.


I like Mozilla's build system on Windows: you click "start-shell.bat" and it opens a console. Python, Mercurial, and Rust just work; I've never had to check PATH.

https://firefox-source-docs.mozilla.org/setup/windows_build....


Sure, but telling people to run "py -3.7" seems a lot easier than walking them through uninstalling and reinstalling Python, as you would have had to in the bad old days. It's reliable and consistent and doesn't depend on what's installed where or how it's configured. If you run "py -3.7 -m venv my_env", it just works, always, with no special context required.

Although I don't handle user support for Python packages, if I did, that would be my go-to approach.


If only there were some graphical tool that allowed the user to see conflicts, relax version dependencies, and of course roll back changes if things didn't work out.

Or an error message like:

    There's a version conflict. In order to resolve, try one of the following:

    pip relax-dep package1 >= 1.0
    pip relax-dep package2 >= 2.0
    pip remove package3
And then you would want to have

    pip undo
(Just brainstorming here.)


Looks like any other package manager:

* developers install with the language's package manager

* in-between users install with the OS package manager

* end users install a bundle

Those who have trouble with pip, gems, cabal, etc. should look over those options first.

Wait, Bundler's Gemfile.lock has listed installed versions for at least ten years; what is "too unbounded" in pip?


It depends! Sometimes I have to lock a dependency at minor releases because every.single.release from the author breaks something new, and I've already worked around the locked version's failings. Sometimes I have to lock a dependency at a major version and everything is fine after that. Usually when the latter happens, eventually the developer releases something that fits within the version bounds but breaks. Sometimes they fix it in the next release, but then I have to deal with a week of bug reports from users saying "I couldn't pip install the latest release!". A big complaint I'll get with a flask/werkzeug app is that something or other broke because they installed something else with strict version requirements alongside it (because the authors of that program have experienced the same bullshit, I assume).

Maybe I'm spoiled from working with cargo and npm (I have almost no ruby experience so I can't comment there), but both of them have way fewer such version conflicts in my experience. Obviously there are tradeoffs and I don't want the node_modules experience for my users, but often it seems that would be a much better experience than pip for everyone. With either of those, I just "npm install" or "cargo install" and all my dependencies end up there working.

You can generate a requirements.txt file using "pip freeze" on a functioning system, but then you have to figure out a way to point users at it instead of using "pip install myapp". Also, you might have to do it for each OS, since Windows vs macOS vs Linux can have different package dependencies specified; and even if you don't do that, a dependency doing it means you have to account for it.
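
Concretely, that flow is something like:

    pip freeze > requirements.txt        # capture a known-good environment
    pip install -r requirements.txt      # what users must run instead of `pip install myapp`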

You can copy+paste the "pip freeze" output into your setup.py and add quotes+commas, but then you're back to breaking side-by-side packages.

So what am I, a developer trying to distribute my command-line application to less-technical users, supposed to do? Distribute two entirely different packages, "myapp-locked" and "myapp"? Tell people to install from a copy+pasted "requirements.txt" file? I've started distributing docker containers that have the application installed via the requirements.txt method, which is fucking stupid but at least the users of that complain less about versioning issues... until the day someone yanks a package I guess.
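
That container approach, sketched (image tag, package name, version, and entry point all hypothetical):

    FROM python:3.9-slim
    # requirements.txt is the output of `pip freeze` on a known-good env
    COPY requirements.txt .
    RUN pip install -r requirements.txt myapp==1.2.3
    # assumes myapp declares a console-script entry point named "myapp"
    ENTRYPOINT ["myapp"]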


I recently reported a bug on the xmonad GitHub; they have

### Checklist

  - [ ] I've read [CONTRIBUTING.md](https://github.com/xmonad/xmonad/blob/master/CONTRIBUTING.md)

  - [ ] I tested my configuration with [xmonad-testing](https://github.com/xmonad/xmonad-testing)
I think it is a brilliant idea; I immediately checked the latest git versions. I assume you could add

  - [ ] I tested my application with [latest stable requirements.txt](...)
And something about triangulation and reporting to another repo too.

Sorry to hear about breakage on major versions. Ruby gems (libraries) pin dependencies at a major, sometimes minor, version; see [0] for an example. But applications ship with a Gemfile and Gemfile.lock [1], [2], so `bundle install` is reproducible [3]:

> The presence of a `Gemfile.lock` in a gem's repository ensures that a fresh checkout of the repository uses the exact same set of dependencies every time. We believe this makes repositories more friendly towards new and existing contributors. Ideally, anyone should be able to clone the repo, run `bundle install`, and have passing tests. If you don't check in your `Gemfile.lock`, new contributors can get different versions of your dependencies, and run into failing tests that they don't know how to fix.

Yes: Docker, MSI, Flatpak, AppImage, whatever works for you and your users. It is sad that we can't easily compile scripting languages statically into a single file.

[0] https://github.com/teamcapybara/capybara/blob/master/capybar...

[1] https://github.com/Shopify/example-ruby-app/blob/master/Gemf...

[2] https://github.com/Shopify/example-ruby-app/blob/master/Gemf...

[3] https://bundler.io/guides/faq.html


I partially maintain a GitHub project with ~1500 stars, many of my users are not very technically skilled. I rarely encounter entitled/mean/crazy people - we probably average one obnoxious contributor a year, two or three if you count the ones who go away when you politely tell them it's not going to happen. I get a lot of PRs and reasonably well-written issues from users whose profiles indicate it's their first contribution.

I think a lot of this has to do with the effort you put into community management. We put tons of energy into making it easy to make good contributions: PR/issue templates, we're on slack, we respond to emails, we make it as easy as possible to get your code in as long as it passes tests. It's a lot of effort and I can't imagine doing it for something that was just a personal project, but I think that's how the bigger projects deal with it.


Totally agree. I have 100+ open-sourced repositories (just launched this website yesterday: https://statux.dev/), a handful of them with 1k+ stars, and for the most part it's just me maintaining those.

This is on top of my full-time job, friends, and hobbies, so I don't really have much time for growing a community. I put some effort into it in the past when I had more time and it worked fairly well, but I found community-building to be very project-oriented in general, so if you do many smaller projects (as opposed to a few large ones) the community approach doesn't scale well.

I don't encounter many mean/crazy people! I can count them on one hand. I do find a few entitled people, but most people I've come across are nice.

