Tools you can use to make sure the Python program you wrote keeps working: requirements.txt, pip, pipenv, pyenv, virtualenv, pyenv-virtualenv, virtualenvwrapper, pyenv-virtualenvwrapper, venv, pyvenv, conda, miniconda, poetry, docker, nix.

Which ones did I miss? Which of them actually ensure your program always works the same as when you first wrote it, without asterisks?



You've missed: pdm, uv, pip-tools, pipx, rye, and probably some others.

Only pdm and poetry generate cross-platform lock files by default as far as I know, but there are a lot of people trying to solve this problem right now.

It's not an easy problem to solve. Python's package management predates the package managers of most other languages, and Python itself predates Linux. There is a lot of baggage, so change is very slow.


Shameless plug: don't forget the wonderful pip-chill for simplifying gigantic requirements files (and for stripping out version numbers to make canaries easier to do).
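Usage is roughly this, if memory serves (the --no-version flag is what strips the pins):

    pip install pip-chill
    pip-chill --no-version > requirements.txt   # top-level packages only, no version pins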


    python -m venv myenv
    . myenv/bin/activate
    pip install -r requirements.txt
    python your_program.py
Ensures you have the same python environment. The other part is OS state.


> Ensures you have the same python environment.

But not the same Python runtime, which does not fulfil the request of guaranteeing it works the same later without asterisks.


I'm not sure what you might mean by runtime, but a Python venv does make sure you're using the same runtime. There are symlinks in the environment's bin directory to the specific runtime. If you're plonking a python3.12 binary over the top of the python3.11 one, then yeah, you'll be using a different one, but that'd be an issue no matter what you're using.

Just don't uninstall your python binaries and you're fine.

If you want to package everything up into an image or a zip, you can do that too, but in my 10+ year career I've not had much of an issue just using boring venvs.


> but a python venv does make sure you're using the same runtime.

It does not.

> There are symlinks in the environments bin directory to the specific runtime.

Precisely. They symlink to a path. Which means that if you have a certain version of Python at one location (let’s say /usr/bin/python3) and later update that (let’s say by upgrading macOS and the Xcode developer tools), the same virtual environment will point to a different version of Python.
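Concretely, it can look like this (paths and versions illustrative):

    $ ls -l myenv/bin/python3
    myenv/bin/python3 -> /usr/bin/python3
    $ /usr/bin/python3 --version   # whatever the OS ships today
    Python 3.9.6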

> Just don't uninstall your python binaries and you're fine.

That’s not a practical solution. Sometimes you don’t have a choice, as demonstrated above. By that logic one could say “don’t change anything about your system and you don’t even need virtual environments”. Which is somewhat true, but also profoundly unhelpful.

The point of the question was to reproduce the same thing without asterisks and you’ve introduced a major one and called it a day.


The binary installs aren't that generic in the machines I've used (mac and debian based, so covers a fair bit).

It's python3.11, not python3, and the python3 "executable" is itself a symlink to the particular binary (in this example python3.11). Upgrading just changes the symlink, which wouldn't affect the venv, since the venv isn't using python3, it's using python3.11.
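For illustration, on a Debian-ish box (exact minor version varies):

    $ ls -l /usr/bin/python3
    /usr/bin/python3 -> python3.11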

I didn't introduce anything, I explained how the links are to the versioned binaries, which isn't what you're stating is happening.

Edit to add: as someone else also points out, you don't even have to use the symlinked versions, you can use --copies


> It's python3.11, not python3, and the python3 "executable" is, itself a symlink to the particular binary (in this example python3.11).

Not for the example I gave. If you’re seeing Python 3.11, you’re definitely not using the /usr/bin/python3 on macOS which is made available by the Xcode CLI tools. That’s at 3.9.6 even on Sonoma.

> as someone else also points out, you don't even have to use the symlinked versions, you can use --copies

Which, again, doesn’t work for the stated case.

https://news.ycombinator.com/item?id=39818732


Sorry, but yes, I am using macOS for work. I use Homebrew, as Xcode is the issue here, not Python or macOS. There are solutions to your problem; you just seem resistant to using them.


> There are solutions to your problem, you just seem resistant to using them.

No, it is you who are failing to understand. I’m describing to you a real scenario, but it’s not one that bothers me. I don’t need you to come up with a solution and didn’t ask you for it. You need to understand not everyone has the same requirements and tradeoffs you do.


You can use --copies if you don't want symlinks [0].

[0]: https://docs.python.org/3/library/venv.html
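That is:

    python -m venv --copies myenv   # copy the interpreter instead of symlinking it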


Not with the macOS Python from the Xcode developer tools:

> Error: This build of python cannot create venvs without using symlinks


Don't use that one


— Doctor, it hurts when I raise my arm.

— Then don’t raise your arm.

Like I said in the original comment you replied to and are now ignoring:

> That’s not a practical solution. Sometimes you don’t have a choice, as demonstrated above. By that logic one could say “don’t change anything about your system and you don’t even need virtual environments”. Which is somewhat true, but also profoundly unhelpful.


No, what you're saying is "Doc, I want to only live on bread", to which a reasonable response is "it sounds like you're in jail and have bigger problems."

I'm not ignoring you; I just don't think your use case of "I want to capture the Python from Xcode in my dev env so it's resilient to changes and upgrades" is something anyone wants (or should want) to do. Do you really want to ship on that version? Are you asking your users to install Xcode?

> That’s not a practical solution. Sometimes you don’t have a choice, as demonstrated above. By that logic one could say “don’t change anything about your system and you don’t even need virtual environments”. Which is somewhat true, but also profoundly unhelpful.

In other words: when are you forced to use Xcode's Python for development, and why would that be a good idea? I'm earnestly asking; there may be reasons; I'm just not aware of any.


> I just don't think your use case of "I want to capture the Python from Xcode in my dev env so it's resilient to changes and upgrades" is something anyone wants (or should want) to do.

Then you are wrong. Simple as that. I’m describing a very real scenario.

> Do you really want to ship on that version?

Holy moly, is it really that hard to understand the difference between wanting and having to? For the use case, an older version in a consistent place which is easy to install is the best solution.

> Are you asking your users to install Xcode?

Triggering an Xcode CLI tools installation is simple and done graphically. And it’s one step removed from installing Homebrew or pyenv, which both need them (even for the scripted installation, pyenv requires git).

> In other words: when are you forced to use Xcode's Python for development, and why would that be a good idea?

See, for a moment there you understood it’s about users, not just your own dev environment, but then went back. Unfortunately, after this conversation with you two I no longer have the energy to go through it in detail in an uphill explanation. Another time, maybe.


>> I just don't think your use case of "I want to capture the Python from Xcode in my dev env so it's resilient to changes and upgrades" is something anyone wants (or should want) to do.

> Then you are wrong. Simple as that. I’m describing a very real scenario.

>> Do you really want to ship on that version?

> Holy moly, is it really that hard to understand the difference between wanting and having to? For the use case, an older version in a consistent place which is easy to install is the best solution.

Tone is hard on the internet, but I'm honestly trying to understand your use case. The way I understand it now is "I want to be able to develop Python programs locally using only Xcode's Python." I'm still not sure why you want to (generally when people ship Python programs they bundle a runtime) but let's set that to the side. For this use case, I don't understand why symlinks don't work for you. Xcode installs to versioned folders so you can have multiple Xcode installs side by side, that way new versions won't overwrite things. I'm obviously not an expert here though; am I missing something?


> For this use case, I don't understand why symlinks don't work for you.

I never said that. My assertion was that venv does not ensure the same Python runtime.¹ That’s it. I don’t have a problem with that. But I do know of one situation where it could make a difference and “do it another way” is not a reasonable answer.

If we ever meet in person I’ll gladly explain it in detail.

¹ https://news.ycombinator.com/item?id=39814963


This is more like:

- Doctor, it hurts when I hit my head on the wall

- Then don't hit your head on the wall

You have solutions available, use them


Apparently, “sometimes you don’t have a choice” is a foreign language.

— Hey, so you know how we have this requirement, which came about after years of dealing with and understanding a problem and the available solutions?

— Yes, what about it?

— Well, a random commenter on Hacker News who has zero context of the problem has suggested we use a method we already found inadequate for our specific use case.

— Oh wow, in that case let’s replace the whole system right now.


I think people are really looking for some examples of scenarios where this might be a requirement and your time might be better spent providing one vs. fighting with them. Maybe something like an airgapped gov network with very specific approved and audited software?


This is not the way you're supposed to be using pyenv

You're supposed to install a specific version of python in a specific place, with a specific name. Say, /usr/local/python-3.10.6

Use pyenv to use that python. Control that by creating a `.python-version` file that says 3.10.6

You now have a project that uses 3.10.6. Unless, of course, somebody installs a different version in that path - at which point you've got bigger issues

Using pyenv to use `/usr/bin/python3` and hoping for the best misses the point
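A sketch, using pyenv's default install location (~/.pyenv/versions) and an illustrative version number:

    pyenv install 3.10.6
    pyenv local 3.10.6     # writes .python-version
    python --version       # the pyenv shim now resolves to 3.10.6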


> This is not the way you're supposed to be using pyenv

Because that’s not what the conversation is about. It’s about virtual environments.


If you attempt to reproduce a project in a repo that only includes source and requirements.txt you will not have metadata about the version of Python used.

It seems that there is a way to do this in pip but I don’t think it is widely used: https://stackoverflow.com/questions/19559247/requirements-tx...
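What requirements.txt does support natively is PEP 508 environment markers, which can at least gate individual packages on the interpreter version:

    dataclasses; python_version < "3.7"
    tomli; python_version < "3.11"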


>The other part is OS state

And that's the whole problem. "The same Python environment" doesn't mean much because it underspecifies the full runtime dependency chain.


It's an inherent problem for all languages based on some runtime. Java, PHP, Ruby, whatever: they all have the same issue. And TBH it's not a very major issue to start with.

No solution besides a fixed OS (containers, Nix) can solve this problem.


>No solution besides a fixed OS (containers, Nix) can solve this problem.

You don't have to use NixOS to use Nix. You can install it on any Linux distro, or macOS, and use it as your build system.


The link posted solved this problem without containers


GitHub issues for pyenv paint a rather different picture.


This approach is full of footguns.

Forget to create or activate the virtualenv? Have fun getting rid of packages in your global site-packages.

Sub-dependencies? You'll have to think of a strategy for pinning them. Most people don't, and the reproducibility of their project suffers.

Developer vs. production dependencies? You'll have to think of a strategy for that, too.

And I think I'm forgetting one, but that's already enough.


In this case you don’t count the standard library as part of your Python environment. Since the Python version is not specified, neither is the standard library.


I am always curious how many people, outside of those building code for third-party clients, actually hit this problem. In my 10+ years of using Python I have never had a problem using the core tools. The ecosystem is far from perfect, but it has never caused me a problem.

Edit: Wow y'all are some sour people for voting down this question. I truly wonder how often people run into this problem compared to just complaining about it.


Just a requirements.txt that you can install with pip in a venv works 95% of the time. Unless you need a specific Python version, which you only figure out halfway through by reading the documentation. Or you are using PyTorch, because on Windows and Linux the version on PyPI lacks GPU support, and there's no good way to encode the download URL in your dependencies. Or people don't properly maintain the requirements.txt, because you are supposed to somehow keep it in sync manually.
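The usual PyTorch workaround is pointing pip at their own index, something like this (index URL and CUDA tag as documented on pytorch.org; they change over time):

    pip install torch --index-url https://download.pytorch.org/whl/cu121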

I run into some kind of packaging or dependency problem on basically every non-trivial project. Hence the many "solutions", but somehow most end up worse than the problems they attempt to solve.


Working in CI at a company that has a couple dozen Python packages: I'd say about once a week. There are some weeks with no incidents, and there are weeks when everything is broken for many days straight.

NB: The latest incident was Friday, when I discovered that some CI pipeline ran `setup.py install`, which down the line invoked easy_install, which doesn't have a policy of ignoring pre-release versions such as X.Y.Zrc1 or X.Y.Zb2. It ran aground trying to install scikit-learn, which wanted NumPy >=X.Y.Z, but it had already installed X.Y.Zb1 and didn't realize that this version should be OK (also, it shouldn't have installed non-release versions anyway).


I just hit it a few days ago. A third-party script that I did not write; the dev lost interest in it 3 years ago. It is written for 2.7.x (and it is a nontrivial amount of code). So my choices were some sort of pyenv thing (or one of its alternatives), or fixing up the script myself so it runs on 3.12. I lucked out and someone else had already done the second thing. One project I worked on was on 2.4.x. If they ever have to update that (which I hope they have), I could see them using something like pyenv until they port it over. And that is not even a lib or anything, that is just the main prog.


download python 2.7 and run it? no pyenv needed


2.7 has been EOL for five years. It doesn’t get security updates nor bug fixes.


Python Foundation support ended that long ago, but vendors maintained it for longer. Red Hat support for it through RHEL 7 ends 30 June 2024, however, and my guess is they’re the last ones.


For this use case it is probably fine to have 2.7.x installed just for this. Just annoying. The downside to having it that way is this weird stray Python install; I would rather use something to switch it around and make it easier to keep track. But in the end it didn't matter, as someone else had already updated the script and I did not have to bother. Maybe next time :)


If I were writing applications only for myself it'd be fine, but I've definitely run into issues when writing code with others - teammates, open source repos, etc.


Just getting me and one coworker in sync on our dev machines often causes problems. Even more when we try to deploy to a cloud service which relies on pip.


I am not sure about what 'core' means in this context.

In my experience I had many, many problems with OS packages vs pip-installed ones. There were really strange dependency issues.

In some cases I encountered dependency hell.

Even if somebody said to me that 'core' also means using virtual environments, I would disagree, as guides around the Internet do not always explicitly point you in that direction.


> Which of them actually ensure your program always works the same as when you first wrote it, without asterisks?

Nix does, if you want to actually invest the effort into the "without asterisks" part.


You didn’t miss Poetry, but I have to say that up until I started using Poetry, Python in large projects was a pain in the tooling department to set up and maintain over longer periods of time.

It’s no panacea, but it feels more stable and usable (especially from the PoV of onboarding new team members) than other tooling I’ve tried.


What does poetry deliver beyond what pyenv gets you? I haven't used either tool super extensively.


pyenv only manages Python versions, while poetry manages dependencies and virtual environments. They are complementary, but do not overlap.


To add to this, with poetry, you basically do `poetry env use python3.12` to create a python virtualenv on python3.12 (you can use whatever python version pyenv supports, doesn't even have to be CPython).

The generated virtual env name is a little wonky. I'm not sure exactly what the scheme is, but it's basically `$(package)-$(some sort of hash?)-$(pyversion)`. I can't speak for all tools, but at least VS Code detects the poetry env and suggests it (and indicates it as a Poetry env) when you go to configure the project interpreter.


You can use the poetry config to have Poetry put the virtualenv into the project folder itself, which allows most IDEs to discover it.
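If I remember the key correctly, that's:

    poetry config virtualenvs.in-project true   # add --local to scope it to one project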


Only Nix does. The builds are deterministic, transitively all the way back to gcc and libc.


What package managers are people using in other languages to make sure that software "always works the same as when you first wrote it, without asterisks"? I'd like to understand how they solve the "package no longer exists in a central registry" problem.


This is not as much about a package manager as it is about conventions and necessary dependencies.

Standards ensure that, going forward, language semantics and syntax don't change. Having minimal dependencies ensures program longevity. A package manager cannot solve these problems, no matter how good it is at its job.

Python doesn't have a standard, and it's heavily reliant on dependencies, which are plentiful and similarly unregulated. A program written in C that uses only functionality described in some POSIX standard will endure decades unmodified. Even the Python hello-world program went stale a while ago, even though it's just one line.
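E.g. (the exact error text varies across 3.x versions):

    $ python2 -c 'print "hello"'
    hello
    $ python3 -c 'print "hello"'
    SyntaxError: Missing parentheses in call to 'print'. ...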


> I'd like to understand how they solve the "package no longer exists in a central registry" problem.

This is, of course, an infrastructure/maintenance issue as much as a package manager design issue. But in Nix's case, the public 'binary cache' (for Nixpkgs/NixOS) of build outputs includes not only final build outputs but also the source tarballs that go into them. As Nix disallows network access at build time, all dependencies are represented this way, including jar files, source tarballs, or whatever: Nix itself must be the one to fetch your dependencies. Consequently, everything you fetch from the Internet for your build is a kind of intermediary Nix build that can be cached using the usual Nix tools. The Nix community's public cache has a policy of retaining copies of upstream sources forever (there has recently been talk of limiting storage of the final built packages to a retention period of only 2 years, but sources will continue to be retained indefinitely; so far the cache reaches back to its inception around a decade ago).

Taken together, these things mean that when a project disappears entirely from GitHub or Maven Central or whatever, people building against old versions of it with Nix/Nixpkgs don't even notice. Nix just fetches those upstream sources from the public cache without even reaching out to that central repository from which those sources have been removed.

For private use cases where your project and its dependencies won't be mirrored to the public cache of Nixpkgs builds, you can achieve the same effect by running your own cache or paying a hosted service to do that.

For builds outside the Nix universe, you can make special arrangements for each type of package your various builds fetch, and mirror those repos. Then configure your builds to pull from your mirrors instead of the main/public ones.


Dotnet uses NuGet[1]. Packages in the system are immutable and never change. They can be unlisted, but never deleted (except in limited and extreme cases, like malware), which means that even if a package maintainer stops publishing new versions to the repository, existing packages will continue to be publicly available for as long as Microsoft continues to exist.

[1] https://www.nuget.org/


Often a package says it works with OS version X, and not X+1. That may be true or false, but what you described does not solve either version of that problem.

Or it says it will work with >X, but doesn't.


I think you're referring to vendoring dependencies? In python/pip for example, you can download the source for a package and point to the folder directly as a dependency instead of the version or a git URL. Most package managers/languages support some version of that. I suppose if you wanted to vendor all dependencies by default, keep them updated etc it would take a little more scripting or extra tools.


> Which of them actually ensure your program always works the same as when you first wrote it, without asterisks?

There aren't such tools. Python not being a standard, and being heavily reliant on the OS that runs it and on third-party components that are also not standard, leaves you with no choice but to "be at the wheel" all the time. Virtually anything written in Python will go stale in a matter of a few years. In other words, you need to constantly update and test as the environment changes just to stand still.


> Which ones did I miss?

Using python and python libraries only from your package manager (like APT) for a specific OS version.
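I.e., treating the distro itself as your lock file, something like (Debian/Ubuntu; package names illustrative):

    apt-get install python3 python3-requests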

Micromamba as well, a lightweight version of conda/miniconda.


I have been looking at Rye, I like it so far.

The other one is Docker, which some people hate. I don't mind it, in some cases I prefer it.


None. A software container image is the best bet, but you need to keep the image and not only the build scripts.


This is not true. Nix actually solves the problem. If you package with Nix, you'll get the exact same version of Python with the exact same version of dependencies, including the exact same version of system libraries. Your build will work in a year just as it does today. You don't even need to keep any binary artifacts.

Of course this doesn't come free: packaging with Nix may involve nontrivial effort for some projects.


How does Nix solve the problem of Python dependencies not specifying their dependencies well or exhaustively? Do you have to find them out and fix them manually, or can you generate a lock file and hope for the best?


Ultimately the true bellwether of compatibility is the package's own test suite, which we try to get working (and integrated into the build process) for as many packages as possible. For packages which have poor/non-existent test suites, often a downstream package's test suite will expose compatibility problems (we've found bugs in zlib security patches using curl's test suite for instance).

nixpkgs maintainers are frequently the first to notify a project author of incompatibility with new versions of another package.


It depends. If the dependency is in nixpkgs (Nix's own package repository), someone will have done the work of figuring out the dependencies properly and you can just use that. If it's not, you can either do that yourself and pin particular versions, or you can integrate with a tool like Poetry: Poetry can generate a lockfile for you that you then reference from Nix to get the versions of Python packages. You'd still need to specify any native dependencies manually, though.


> How does nix solves the problem of Python dependencies not specifying their dependencies well or exhaustively?

In Nix lingo, a package successfully building or running against an implicit or unmanaged dependency is called 'impurity'. To help keep builds 'pure', Nix builds everything in a sandbox and does various other tricks to ensure that a package being built can't/doesn't find any dependencies at build time that you don't explicitly tell Nix to include in the build environment.

(This is also increasingly how Linux distros build their packages, to solve the same problem.)

For the most part, if building (and running the test suite during the build) a Python package with Nix succeeds, you can be confident that you've got all the dependencies sorted out— even the ones upstream forgot to tell you about.

> Do you have to find them out and fix them manually or can you generate a lock file and hope for the best?

At install time, you can be confident that Nix will bring along all of the system-level dependencies your Python package needs. You don't have to find and fill any gaps at install time 10 years from now or whatever.

When you're writing your Nix package for the first time, you'll be doing a mix of generating the Nix code that defines your package from some upstream, Python-specific lockfile and making manual corrections when the build fails. Nix doesn't have any magic for figuring out dependencies that are left out of poetry.lock or whatever, or for disambiguating guaranteed-compatible exact versions from requirements.txt.


Yeah, containers don't fully solve this problem.

We still need a generated lock file, with every top-level dependency and sub-dependency locked down to its most precise version, committed to version control, so that when you build your image today or in 6 months you end up with the same result.

Using pip to freeze your dependencies and writing a tiny shell script to generate a lock file at build time is better than nothing to solve this problem with nothing more than pip. It's what I do in https://github.com/nickjj/docker-flask-example and https://github.com/nickjj/docker-django-example. It's not perfect but it solves 80% with minimal complexity.
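The gist of that script, simplified (the linked repos have the real thing):

    # resolve once and record exact versions
    pip install -r requirements.txt
    pip freeze > requirements-lock.txt

    # later builds install only from the frozen list
    pip install --no-deps -r requirements-lock.txt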


I remember when I was a childish Perl fan trading insults with the childish Python fans of the time, one of the things "they" always chided "us" about was that we had too many different ways to accomplish the same thing...


Pyinstaller, provided you build on a reasonably old glibc to avoid glibc incompatibilities. I know that is a caveat, but it is a one-time, build-time operation.


Doesn't Pyinstaller also require CPython to have been built with '--enable-shared'? I _think_ this is the default for most system-installed Pythons that I've seen, but it's not the default if you build Python from source or install via pyenv.
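You can check with sysconfig, and with pyenv you can opt in at build time (version number illustrative):

    python -c "import sysconfig; print(sysconfig.get_config_var('Py_ENABLE_SHARED'))"
    PYTHON_CONFIGURE_OPTS="--enable-shared" pyenv install 3.12.2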


asdf can be used as an alternative to pyenv. (In fact, it is not only meant for the Python ecosystem, so it can also replace nvm and others.)

For me, the combination of asdf and Poetry has worked quite well recently: I use asdf to pin the Python & Poetry version and then use Poetry to pin everything else.
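The asdf side is just (version number illustrative):

    asdf plugin add python
    asdf install python 3.12.2
    asdf local python 3.12.2    # writes .tool-versions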


Point of clarification: asdf uses python-build which is from pyenv.

https://github.com/asdf-community/asdf-python?tab=readme-ov-...


True. I mentioned asdf mainly because it can be used with many languages & tools and is not restricted to the Python ecosystem.


You only need pip and virtualenvwrapper

requirements.txt is a text file, not a separate tool...

If you want your program to work exactly as intended in the future, there are tools like py2exe and py2app for that.


Don't need any wrappers either. venv has been built into python for ages.


yes but `workon` is too good not to have


Looks like what could be a shell alias to cd and activate? I always keep a few tabs open to my repos folder, so not sure it would help me much. Doesn't seem to support fish, though activate does.


I modified my bash prompt to detect and auto-activate Python environments, and show if one is currently active. Haven’t thought about activate in years. It’s great.
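A minimal sketch of the idea in bash, assuming envs live in ./.venv:

    _auto_venv() {
      if [ -z "$VIRTUAL_ENV" ] && [ -f .venv/bin/activate ]; then
        . .venv/bin/activate
      fi
    }
    PROMPT_COMMAND="_auto_venv${PROMPT_COMMAND:+;$PROMPT_COMMAND}"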


Yes, I did that on my work laptop with fish, since I only ever used it with the work project.

On my personal machine I don't bother with venvs, so also haven't thought about it in years. But was trying to figure out what the "workon" command did for GP.


`workon` lists all virtualenvs you have in your virtualenvs folder and `workon foo` activates environment foo

this way I have all my envs in ~/virtual and all my projects in ~/projects and just `workon xyz` when I want to be in a certain venv for a given project (which doesn't always map one-to-one)


While gross and ugly, Maven lets you specify the bytecode version, and every JDK has been able to produce backwards-compatible jars for many years.


You missed hatch, uv, rye, pdm.

I think rye hits most of the points pretty well, it ensures both python and package versions.


Dependencies are for the weak. You only need Python. Just support a range of versions if you intend to distribute.


pyenv is all you need. pip is part of python. requirements.txt is just a convention used to store the list of stuff to hand to pip.


You missed Pipenv :-)

Edit: Oh I see you didn't. I can't read.


Go


You're missing Tox[0] & Nox[1]. Tox is actually quite nice when dealing with testing code across different versions of Python.

[0] https://tox.wiki [1] https://nox.thea.codes/en/stable/



