Posted by deepakjois 7 days ago

Fun with uv and PEP 723 (www.cottongeeks.com)
638 points | 225 comments
_visgean 7 days ago|
I honestly don't like that this is expressed as a comment, but I guess it makes the implementation easy and backwards compatible...
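
For reference, the block in question looks something like this (a minimal PEP 723 sketch; `requests` is just a stand-in dependency):

    # /// script
    # requires-python = ">=3.12"
    # dependencies = ["requests"]
    # ///
    import requests

    print(requests.get("https://example.com").status_code)

Run it with `uv run script.py` and uv resolves the listed dependencies into a throwaway environment.
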
timkofu 1 day ago||
Nice.
babuloseo 7 days ago||
I'd been doing this with Pipenv before, but uv is like Pipenv on steroids.
korijn 7 days ago||
There's no lockfile or anything with this approach right? So in a year or two all of these scripts will be broken because people didn't pin their dependencies?

I like it though. It's very convenient.

js2 7 days ago||
> There's no lockfile or anything with this approach right?

There are options to both lock the dependencies and limit by date:

https://docs.astral.sh/uv/guides/scripts/#locking-dependenci...

https://docs.astral.sh/uv/guides/scripts/#improving-reproduc...
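
For example, `uv lock --script example.py` writes a lockfile next to the script, and you can cap resolution by date in the metadata itself (a sketch following those docs; the date is illustrative):

    # /// script
    # dependencies = ["requests"]
    # [tool.uv]
    # exclude-newer = "2025-01-01T00:00:00Z"
    # ///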

zahlman 6 days ago|||
> So in a year or two all of these scripts will be broken because people didn't pin their dependencies?

People act like this happens all the time but in practice I haven't seen evidence that it's a serious problem. The Python ecosystem is not the JavaScript ecosystem.

nomel 6 days ago||
I think it's because you don't maintain much Python code, or use many third-party libraries.

An easy way to prove that this is the norm is to take some existing code you have now, update its dependencies to their latest versions, and watch everything break. You don't see the problem because those dependencies pin, or tightly restrict, the versions of their own dependencies, which hides the frequency of the problem from you. You'll also see that, in their issue trackers, they've closed all sorts of version-related bugs.

zahlman 6 days ago||
> An easy way to prove that this is the norm is to take some existing code you have now, and update to the latest versions your dependencies are using

I have done this many times and watched everything fail to break.

nomel 6 days ago||
Are you sure you’re reading what I wrote fully? Getting pip, or any of them, to ignore all version requirements, including those listed by the dependencies themselves, required modifying source, last I tried.

I've had to modify code this week due to changes in some popular libraries. Some recent examples: NumPy 2.0 broke most code that used NumPy. It changed the C side (full interpreter crashes with trimesh) and removed or moved common functions, like `array.ptp()`. SciPy moved a bunch of stuff lately, and fully removed some image-related things.

If you think Python libraries are somehow stable over time, you just don't use many.
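
Concretely, this kind of thing broke under NumPy 2.0 (a minimal sketch):

    import numpy as np

    arr = np.array([1, 5, 3])
    arr.ptp()    # AttributeError in NumPy 2.0: the ndarray method was removed
    np.ptp(arr)  # the replacement spelling; returns 4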

zahlman 6 days ago||
... So if the installer isn't going to ignore the version requirements, and thereby install an unsupported package that causes a breakage, then there isn't a problem with "scripts being broken because people didn't pin their dependencies". The packages listed in the PEP 723 metadata get installed by an installer, which resolves the listed (unpinned) dependencies to concrete ones (including transitive dependencies), following rules specified by the packages.

I thought we were talking about situations in which following those rules still leads to a runtime fault. Which is certainly possible, but in my experience a highly overstated risk. Packages that say they will work with `foolib >= 3` will very often continue to work with foolib 4.0, and the risk that they don't is commonly-in-the-Python-world considered worth it to avoid other problems caused by specifying `foolib >=3, <4` (as described in e.g. https://iscinumpy.dev/post/bound-version-constraints/ ).

The real problem is that there isn't a good way (from the perspective of the intermediate dependency's maintainer) to update the metadata after you find out that a new version of a (further-on) dependency is incompatible. You can really only upload a new patch version (or one with a post-release segment in the version number) and hope that people haven't pinned their dependencies so strictly as to exclude the fix. (Although they shouldn't be doing that unless they also pin transitive dependencies!)

That said, the end user can add constraints to Pip's dependency resolution by just creating a constraints file and specifying it on the command line. (This was suggested as a workaround when Setuptools caused a bunch of legacy dependencies to explode - not really the same situation, though, because that's a build-time dependency for some packages that were only made available as sdists, even pure-Python ones. Ideally everyone would follow modern practice as described at https://pradyunsg.me/blog/2022/12/31/wheels-are-faster-pure-... , but sometimes the maintainers are entirely MIA.)
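
A constraints file is just extra version bounds that apply without requesting installation, passed with something like `pip install -r requirements.txt -c constraints.txt` (a sketch; the pins are hypothetical):

    # constraints.txt - bounds applied only to packages that get installed anyway
    setuptools<60
    foolib>=3,<4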

> Numpy 2.0 is a very recent example that broke most code that used numpy.

This is fair to note, although I haven't seen anything like a source that would objectively establish the "most" part. The ABI changes in particular are only relevant for packages that were building their own C or Fortran code against Numpy.

nomel 6 days ago||
> `foolib >= 3` will very often continue to work with foolib 4.0,

Absolute nonsense. It's an industry standard that major versions are reserved for breaking changes. This is why you never see `>=` in any sane requirements list; you see `foolib == 3.*`. For anything you want to keep working for a reasonable amount of time, you see `== 3.4.*`, because deprecations often still happen within major versions, breaking all code that used those functions.
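
The difference is easy to demonstrate with the `packaging` library (a quick sketch; `foolib` is made up):

    from packaging.specifiers import SpecifierSet

    print(SpecifierSet(">=3").contains("4.0"))    # True: lets the breaking major bump in
    print(SpecifierSet("==3.*").contains("4.0"))  # False: stays on the 3.x series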

zahlman 6 days ago||
Breaking changes don't break everyone. For many projects, only a small fraction of users are broken at any given time. Firefox is on version 139 (similarly Chrome and other web browsers); how many times have you had to reinstall your plugins and extensions?

For that matter, have you seen any Python unit tests written before the Pytest 8 release that were broken by it? I think even ones that I wrote in the 6.x era would still run.

For that matter, the Python 3.x bytecode changes with every minor revision and things get removed from the standard library following a deprecation schedule, etc., and there's a tendency in the ecosystem to drop support for EOL Python versions, just to not have to think about it - but tons of (non-async) new code would likely work as far back as 3.6. It's not hard to avoid the := operator or the match statement (f-strings are definitely more endemic than that).

On the flip side, you can never really be sure what will break someone. Semver is an ideal, not reality (https://hynek.me/articles/semver-will-not-save-you).

And lots of projects are on calver anyway.

nomel 1 hour ago||
Agreed, this is a big problem, and exactly why people pin their dependencies, rather than leaving them wide open: pinning a dependency guarantees continued functionality.

If you don't pin your dependencies, you will get breakage, because your dependencies can have breaking changes from version bumps. If your dependencies don't fully pin their own dependencies, then they will get breaking changes from what they rely on. That's why exact version numbers are almost always pinned for anything that's distributed: this is a frequent problem, and you don't want the end user to have to deal with it.

Again, you don't see this problem often because you're lucky: you installed at a time when the dependencies had already resolved all the breakage or, in the more common case, the dependencies were pinned tightly enough that those breaking changes were never an issue. In other words, everyone pinning their dependencies strictly enough is already the solution to the problem. The tighter the restriction, the stronger the guarantee of continued functionality.

I would suggest reading this comment chain again.

rahimnathwani 7 days ago||
PEP 723 allows you to specify version numbers for direct dependencies, but of course indirect dependencies aren't guaranteed to be the same.
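
e.g. (a sketch; the version is made up):

    # /// script
    # dependencies = ["requests==2.32.0"]  # direct dependency, pinned
    # ///

Here `requests` itself is pinned, but its own dependencies (urllib3, certifi, ...) still resolve to whatever satisfies its ranges at install time, unless you also lock the script.
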
zidoo 6 days ago||
My only question is: who asked for faster pip?
TypingOutBugs 6 days ago||
uv has a lot more perks! It makes distributing Python tooling easier too.
0x008 6 days ago||
Comparing apples and oranges here. uv's scope is so much more than pip's.
lysace 7 days ago||
Why do I feel like I’m in an infomercial?
bjourne 6 days ago||
> For the longest time, I have been frustrated with Python because I couldn’t use it for one-off scripts.

Bruh, one-off scripts is the whole point of Python. The cheat code is to add "break-system-packages = true" to ~/.config/pip/pip.conf. Just blow up ~/.local/lib/pythonX.Y/site-packages/ if you run into a package conflict (exceedingly rare) and reinstall. All these venv, uv, metadata peps, and whatnot are pointless complications you just don't need.
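
That is (a sketch of the config; the path can vary by platform):

    # ~/.config/pip/pip.conf
    [global]
    break-system-packages = true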

tpoacher 6 days ago||
> If you are not a Pythonista (or one possibly living under a rock)

That's bait! / Ads are getting smarter!

I would also have accepted "unless you're geh", "unless you're a traitor to the republic", "unless you're not leet enough" etc.

SpaceNugget 6 days ago|
I'm not a Python dev, but if you read HN even semi-regularly you have surely come across it several times over at least the past few months, if not a year by now. It seems to be all the rage in the Python world these days.

And so, if you're the kind of person who hasn't heard of it, you probably don't read blogs about Python; therefore you're probably not reading _this_ blog. No harm, no foul.

AstroJetson 6 days ago|
> uv is an extremely fast Python package and project manager, written in Rust.

Is there a version of uv written in Python? It's weird (to me) to have an entire ecosystem for a language where a highly recommended tool for making your system work is written in another language.

sgeisenh 6 days ago||
Similar to ruff, uv mostly gathers ideas from other tools (with strong opinions and a handful of thoughtful additions and adjustments) and implements them in Rust for speed improvements.

Interestingly, the speed is the main differentiator from existing package and project management tools. Even if you are using it as a drop-in replacement for pip, it is just so much faster.
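
The drop-in part is literally just a prefix (commands from uv's pip-compatible interface; the package name is illustrative):

    $ uv venv                  # create .venv
    $ uv pip install requests  # same CLI surface as pip
    $ uv pip freeze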

zahlman 6 days ago|||
They are not making a Python version.

There are many competing tools in the space, depending on how you define the project requirements.

Contrary to the implication of other replies, the lion's share of uv's speed advantage over Pip does not, by any of the evidence available to me, come from its being written in Rust. It comes from:

* not bootstrapping Pip into the new environment - a cost you pay by default if you don't know that you don't actually have to bootstrap Pip into each environment (see https://zahlman.github.io/posts/2025/01/07/python-packaging-... for some hints; my upcoming post will be more direct about it - unfortunately I've been putting it off...)

* being designed up front to install cross-environment (if you want to do this with Pip, you'll eventually, and with much frustration, get a subtly broken installation using the old techniques; since 22.3 you can just use the `--python` flag, but this limits you to environments where the current Pip can run, and it re-launches a new Pip process, adding perhaps 200ms - still much better than bootstrapping another copy of Pip!)

* using heuristics when solving for dependencies (Pip's backtracking resolver is exhaustive, and proceeds quite stubbornly in order)

* having a smarter caching strategy (it stores uncompressed wheels in its cache and does most of the "installation" by hard-linking these into the new environment; Pip goes through a proxy that uses some opaque cache files to simulate re-doing the download, then unpacks the wheel again)

* not speculatively pre-loading a bunch of its own code that's unlikely to execute (Pip has large complex dependencies, like https://pypi.org/project/rich/, which it vendors without tree-shaking and ultimately imports almost all of, despite using only a tiny portion)

* having faster default behaviours; e.g. uv defaults to not pre-compiling installed packages to .pyc files (since Python will do this on the first import anyway) while Pip defaults to doing so

* not (necessarily) being weighed down by support for legacy behaviours (packaging worked radically differently when Pip first became publicly available)

* just generally being better architected

None of these changes require a change in programming language. (For example, if you use Python to make a hard link, you just use the standard library, which will then use code written in C to make a system call that was most likely also written in C.) Which is why I'm making https://github.com/zahlman/paper .
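
For instance, the cache trick above bottoms out in the same syscall regardless of implementation language (a sketch; the paths are made up):

    import os

    # "Install" a file from an unpacked-wheel cache into a new environment
    # by hard-linking it; os.link is a thin wrapper over the C link(2) call.
    os.link("/cache/foolib/core.py",
            "/venv/lib/python3.12/site-packages/foolib/core.py")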

jaapz 6 days ago||
But also, because it's written in Rust. There are tools written in Python that do these smart caching and resolving tricks as well, and they are still orders of magnitude slower.
zahlman 6 days ago||
Such as?

Poetry doesn't do this caching trick. It creates its own cache with the same sort of structure as Pip's, and as far as I can tell it uses its own reimplementation of Pip's core installation logic from there (including `installer`, which is a factored-out package for the part of Pip that actually unpacks the wheel and copies files).

ebb_earl_co 6 days ago|||
Well, I use Debian and Bash: pretty much everything that makes my system work, including and especially Python development, is written in C, another language!
pjc50 6 days ago|||
uv wins precisely because it isn't written in Python. As various people have pointed out, it can complete its run before competing Python implementations have finished handling their imports.

Besides, the most important tool for making python work, the python executable itself, is written in C. People occasionally forget it's not a self-hosting language.

dralley 6 days ago||
pip?

A tool written in Python is never going to be as fast as one written in Rust. There are plenty of Python alternatives and you're free to use them.