- uv is aware of your dependencies: you can add and remove development dependencies, create groups of them (test, lint, dev, etc.), and manage each group independently. You can also declare optional dependencies for a project, think `my_app[cli,standard]`. You no longer need a separate requirements.txt for each case, and you don't have to prune dependencies by hand the way you do with pip, which doesn't remove a package's now-unused dependencies when you uninstall it. As a result, you can drop {conda,poetry,...} from your workflows.
- uv can install Python and create a virtualenv for you. Any command you run with `uv run` from the root of a repo is aware of its environment; you don't even need to activate a virtualenv anymore. This replaces {pyenv, pyenv-virtualenv, virtualenvwrapper,...}.
- uv follows the PEPs for project config (dependencies, optional dependencies, tool configs) in pyproject.toml, so if uv ever dies it's possible to migrate away, since the features are defined in the PEPs. That's not the case for, say, poetry.
- uv has a lock file, and it's possible to make deps platform-specific (Windows, Linux, macOS, etc.). This follows a PEP but isn't supported by all tools.
- uv supports custom indexes for packages so you can prefer a certain index, for example your company package index or pytorch's own index (for ML work).
- it's very fast, which makes local dev seamless and is really helpful in CI/CD, where you set up and tear down Python envs a lot.
Also, the team is responsive on GitHub, so it's easy to get help.
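To make the first point concrete, here's a sketch of what that configuration looks like in one pyproject.toml (package names and the index URL are placeholders; `[dependency-groups]` is the standardized groups table that recent uv versions manage via `uv add --group`, and `[[tool.uv.index]]` is uv's own index setting as I understand it):

```toml
[project]
name = "my_app"
version = "0.1.0"
dependencies = ["requests>=2.31"]

[project.optional-dependencies]
# installable as my_app[cli]
cli = ["click>=8.0"]

[dependency-groups]
# managed with e.g. `uv add --group test pytest` / `uv remove --group test pytest`
test = ["pytest>=8.0"]
lint = ["ruff"]

# prefer a specific index, e.g. a company index or PyTorch's
[[tool.uv.index]]
name = "pytorch"
url = "https://download.pytorch.org/whl/cpu"
```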
Does this also replace, or work well with, tox? We currently use it to run basic CI/local workflows (`tox -e lint` for all linters, `tox -e py310` / `tox -e py312` to run the test suite on chosen interpreters' environments), and to set up a local environment with the package installed in-place (so that we can run `myprogram -arg1 -arg2` as if it were installed via `pip`, but still have it be editable by directly editing the repo).
With how much the ecosystem is moving, I don't know whether the way we're doing it is unusual (Django and some other big projects still have a tox.ini), obsolete (I can't find how uv obsoletes this), or perfectly fine and I just can't find how to replace pip with uv for this use case.
I'm not personally releasing a ton of internal packages where I work, but I know of https://github.com/tox-dev/tox-uv. I haven't tried it yet, but it seems to do what you want. I also saw that nox (tox, but configured in Python instead of a tox.ini file: https://nox.thea.codes/en/stable/config.html) supports uv, from what I understand.
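For reference, tox-uv is enabled by declaring it in tox.ini, if I read its README right — a sketch matching the envs above (deps and commands are placeholders):

```ini
# tox.ini -- sketch; tox-uv makes tox provision its envs with uv
[tox]
requires =
    tox-uv
env_list = lint, py310, py312

[testenv]
package = editable
deps = pytest
commands = pytest {posargs}

[testenv:lint]
deps = ruff
commands = ruff check .
```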
tox-uv has been a great selling point for my personal use of uv. I'm typically testing across 4-5 different versions of Python and the build speedup has been significant.
Uv works fine with tox, but have you tried nox? I only dipped my toes in tox, but I found nox around the same time and gravitated to it. I replaced PDM's "scripts" concept with nox sessions. I have a project where most of the functionality is nox sessions I call in CI pipelines. Writing sessions in pure python opens so many doors.
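A sketch of what such a noxfile looks like (session bodies are made up; `default_venv_backend = "uv"` is the knob recent nox versions expose for uv-backed environments, as I understand it):

```python
# noxfile.py -- sketch; sessions replace tox envs / PDM scripts
import nox

# ask nox to create environments with uv instead of virtualenv
nox.options.default_venv_backend = "uv"

@nox.session(python=["3.10", "3.12"])
def tests(session):
    session.install("-e", ".", "pytest")
    session.run("pytest")

@nox.session
def lint(session):
    session.install("ruff")
    session.run("ruff", "check", ".")
```

Since sessions are plain Python functions, you can share helpers between them, parametrize them, or call them from CI the same way you'd call `tox -e`.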
Not only is it faster, it also provides a lock file, `uvx tool_name` (just like `npx`), and a comprehensive set of tools to manage your Python version, your venv, and your project.
You don't need `pyenv`, `poetry` and `pipx` anymore, `uv` does all of that for you.
It's a much more complete tool than pip. If you've used poetry, or (in other languages) cargo, bundler, maven, then it's like that (and faster than poetry).
If you haven't, in addition to installing dependencies it will manage and lock their versions (no requirements.txt, and much more robust), look after the environment (no venv step), hold your hand creating projects, and probably other things.
Edit to add: the one thing it won't do is replace conda et al, nor is it intended to.
The problems start as soon as your scripts should run on more than your own computer.
If you pip install something outside a virtual environment, you install it into the system Python (the interpreter at `sys.executable`). This can break systems if the wrong combination of dependencies comes together, which is why you should never install things via pip for other people unless you've asked them first.
So how else would you install them? There's a mechanism called virtual environments, which lets you install pip dependencies in such a way that they only exist within the context of that environment. This is what you should do when you distribute Python programs.
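The isolation described above can be seen directly with the stdlib `venv` module — a minimal sketch (`--without-pip` just keeps it fast; names are illustrative):

```python
# Sketch: a virtual environment gets its own interpreter whose sys.prefix
# points inside the env, so packages installed there don't touch the
# system Python.
import subprocess
import sys
import tempfile
from pathlib import Path

with tempfile.TemporaryDirectory() as tmp:
    env_dir = Path(tmp) / "venv"
    # equivalent to: python -m venv --without-pip <dir>
    subprocess.run(
        [sys.executable, "-m", "venv", "--without-pip", str(env_dir)],
        check=True,
    )

    # the env gets its own python executable...
    bindir = "Scripts" if sys.platform == "win32" else "bin"
    exe = "python.exe" if sys.platform == "win32" else "python"
    venv_python = env_dir / bindir / exe
    created = venv_python.exists()

    # ...which reports the env directory, not the system prefix
    out = subprocess.run(
        [str(venv_python), "-c", "import sys; print(sys.prefix)"],
        capture_output=True, text=True, check=True,
    )
    isolated = Path(out.stdout.strip()).name == "venv"

print(created, isolated)  # True True
```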
Now the problem is: how do you ensure that the install into the virtual environment uses specific versions? What happens when one library depends on package A at version 1.0 and another depends on package A at version 2.0? And what happens when you deploy that to an old Debian with an older Python version? Before uv I had to spend literal days resolving such conflicts.
uv solves most of these problems in one unified place, is extremely performant, just works and when it does not, it tells you precisely why.
It brings way more to the table than just being fast, like people are commenting. E.g. it manages Python for your projects: if you say you want Python 3.12 in your project and then run `uv run python my_script.py`, it will fetch and run the version of Python you specified, which pip can't do. It also creates lock files, so you know the exact set of Python package dependencies that worked, while specifying them more loosely. Plus a bunch of other stuff.
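Concretely, the version request lives in standard project metadata — a sketch (name and version are placeholders):

```toml
# pyproject.toml
[project]
name = "my-app"
version = "0.1.0"
# uv reads this range when deciding which interpreter to fetch for `uv run`
requires-python = ">=3.12"
```

There's also `uv python pin 3.12`, which writes a `.python-version` file next to it, if I recall the command correctly.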
The only advantage over pip is it's faster. But the downside is it's not written in Python.
The real point of uv is to be more than pip, though. It can manage projects, so basically CLI commands to edit your `pyproject.toml`, update a lockfile, and update your venv all in one go. Unlike earlier tools it implements a pretty natural workflow on top of existing standards where possible, but for some things there are no standards, the most obvious being lockfiles. Earlier tools used "requirements.txt" for this, which was quite lacking. uv's lockfile is cross-platform, although admittedly it produces noisier diffs than requirements.txt, which is a shame.
As a straight pip replacement, yeah, it's mostly just faster. Although it does have a few breaking changes that make it more secure (it has a more predictable way of resolving packages that reduces the risk of package squatting).
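If I understand uv's docs correctly, that predictable resolution is its `index-strategy` setting — a sketch, assuming the setting name and default are as in recent versions:

```toml
# pyproject.toml
[tool.uv]
# default: a package is only ever taken from the first index that provides it,
# so an extra index can't shadow an internal package name (dependency confusion)
index-strategy = "first-index"
```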
That’s just the same “my pip is too slow” problem which some people don’t have.
I work in a place with 200 developers, and 99% of pip usage is in automated jobs that last an hour. Shaving a couple seconds off that will not provide any tangible benefit. However, moving 200 people from a tool they know to one they don't comes at a rather significant cost.
> Shaving a couple seconds off that will not provide any tangible benefit.
It could be more than that.
I switched from pip to uv today in a Dockerized project with 45 total dependencies (top-level plus sub-dependencies).
pip takes 38 seconds and uv takes 3 seconds, both uncached. A 10x+ difference is massive, and if uv happens to be better suited to run on multiple cores it could be even bigger, because my machine is a quad-core 3.2 GHz i5 from 10 years ago.
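For reference, the Docker pattern roughly follows what uv's own Docker docs suggest, something like this sketch (image tags and the module name are placeholders):

```dockerfile
FROM python:3.12-slim

# copy the uv binary in from the official image
COPY --from=ghcr.io/astral-sh/uv:latest /uv /uvx /bin/

WORKDIR /app

# install dependencies first so this layer is cached across code changes
COPY pyproject.toml uv.lock ./
RUN uv sync --frozen --no-install-project

COPY . .
RUN uv sync --frozen

CMD ["uv", "run", "python", "-m", "myapp"]
```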
> I work in a place with 200 developers
In your case, if you have 200 developers each spending an hour on builds that could in theory be reduced to 5 minutes per build, that's 11,000 minutes, or about 183 hours, of dev time saved per build. I know you say it's automated, but someone is almost always waiting for something, right?
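The back-of-the-envelope math, for the skeptical:

```python
developers = 200
minutes_saved_per_build = 60 - 5  # an hour down to five minutes

total_minutes = developers * minutes_saved_per_build
total_hours = total_minutes / 60

print(total_minutes)       # 11000
print(round(total_hours))  # 183
```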
For what it's worth, uv is fully compatible with pip: just replace `pip --foo bar` with `uv pip --foo bar`. One project I'm working on is 100% 'classic' pip-based with no plans of moving, but I still use uv when working on it because it's completely transparent. uv manages my venvs and Python versions and makes things like switching between different versions of Python and libraries much smoother, and I can still use the same pip commands as everybody else; it's just that all my pip commands run faster.
* uv makes some design choices that aren't always strictly compatible with the spec
* uv correctly implements the spec and it turns out pip, or the underlying library, didn't (I have worked on fixing a couple of these on the pip side)
* uv doesn't support legacy features still in pip
* Tool-specific features or exact output diverge
This is not a criticism, but I've seen some users get irate with uv because they were under the impression that it was making much stronger compatibility guarantees.
I've been working with pip for so long now that I barely notice it unless something goes very wrong.