I like it though. It's very convenient.
There are options to both lock the dependencies and limit by date:
https://docs.astral.sh/uv/guides/scripts/#locking-dependenci...
https://docs.astral.sh/uv/guides/scripts/#improving-reproduc...
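For example, the date cutoff goes in the script's inline (PEP 723) metadata. A sketch along the lines of the uv docs, with a placeholder dependency and timestamp:

    # /// script
    # requires-python = ">=3.12"
    # dependencies = [
    #   "requests",
    # ]
    # [tool.uv]
    # exclude-newer = "2023-10-16T00:00:00Z"  # ignore anything published after this date
    # ///
    import requests

    print(requests.get("https://example.org").status_code)

Locking is a separate step (`uv lock --script example.py`), which writes a lockfile next to the script.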
People act like this happens all the time but in practice I haven't seen evidence that it's a serious problem. The Python ecosystem is not the JavaScript ecosystem.
An easy way to prove that this is the norm is to take some existing code you have now, update your dependencies to their latest versions, and watch everything break. You don't see the problem because those dependencies pin (or tightly restrict) the versions of whatever they depend on, which hides the frequency of the problem from you. You'll also see that, in their issue trackers, they've closed all sorts of version-related bugs.
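If you want to try this on your own project, one rough way (a sketch; it assumes a requirements.txt with pinned versions) is to strip the pins and re-resolve:

    import re
    from pathlib import Path

    # Drop version pins like "foolib==3.4.1" down to bare names like "foolib".
    lines = Path("requirements.txt").read_text().splitlines()
    names = [re.split(r"[=<>~!;\[ ]", line, maxsplit=1)[0]
             for line in lines
             if line.strip() and not line.startswith("#")]
    Path("requirements-latest.txt").write_text("\n".join(names) + "\n")
    # Then: pip install -r requirements-latest.txt, and run your test suite.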
I have done this many times and watched everything fail to break.
I’ve had to modify code this week due to changes in some popular libraries. A recent example: NumPy 2.0 broke most code that used NumPy. They changed the C side (full interpreter crashes with trimesh) and removed/moved common functions, like array.ptp(). SciPy has moved a bunch of stuff lately too, and fully removed some image-related things.
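The ptp() one is concrete: the method was removed from arrays in NumPy 2.0, and the fix is to call the module-level function instead.

    import numpy as np

    a = np.array([1, 5, 3])
    # span = a.ptp()    # worked on NumPy 1.x; AttributeError on 2.x
    span = np.ptp(a)    # works on both 1.x and 2.x
    print(span)         # 4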
If you think Python libraries are somehow stable over time, you just don’t use many.
I thought we were talking about situations in which following those rules still leads to a runtime fault. That is certainly possible, but in my experience it's a highly overstated risk. Packages that say they will work with `foolib >= 3` will very often continue to work with foolib 4.0, and the risk that they don't is commonly considered (in the Python world) worth it to avoid the other problems caused by specifying `foolib >=3, <4` (as described in e.g. https://iscinumpy.dev/post/bound-version-constraints/ ).
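To make the trade-off concrete, here is how the two specifier styles treat a hypothetical foolib 4.0 release (using the real `packaging` library; `foolib` itself is made up):

    from packaging.specifiers import SpecifierSet
    from packaging.version import Version

    v = Version("4.0")
    print(v in SpecifierSet(">=3"))     # True  - the open bound accepts the new major
    print(v in SpecifierSet(">=3,<4"))  # False - the cap excludes it (and any fixes in it)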
The real problem is that there isn't a good way (from the perspective of the intermediate dependency's maintainer) to update the metadata after you find out that a new version of a further-downstream dependency is incompatible. You can really only upload a new patch version (or one with a post-release segment in the version number) and hope that people haven't pinned your package so strictly as to exclude the fix. (Although they shouldn't be doing that unless they also pin transitive dependencies!)
That said, the end user can add constraints to Pip's dependency resolution by just creating a constraints file and specifying it on the command line. (This was suggested as a workaround when Setuptools caused a bunch of legacy dependencies to explode - not really the same situation, though, because that's a build-time dependency for some packages that were only made available as sdists, even pure-Python ones. Ideally everyone would follow modern practice as described at https://pradyunsg.me/blog/2022/12/31/wheels-are-faster-pure-... , but sometimes the maintainers are entirely MIA.)
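The constraints mechanism itself is simple; a minimal sketch (the `setuptools<66` cap is just a placeholder, not a recommendation):

    from pathlib import Path

    # A constraints file is just version specifiers, one per line. It never adds
    # packages to the install; it only caps what the resolver may pick.
    Path("constraints.txt").write_text("setuptools<66\n")
    # Then: pip install -c constraints.txt <whatever you were installing>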
> Numpy 2.0 is a very recent example that broke most code that used numpy.
This is fair to note, although I haven't seen anything like a source that would objectively establish the "most" part. The ABI changes in particular are only relevant for packages that were building their own C or Fortran code against Numpy.
Absolute nonsense. It's industry standard that major versions are widely accepted as/reserved for breaking changes. This is why you never see >= in any sane requirements list; you see `foolib == 3.*`. For anything you want to keep working for a reasonable amount of time, you see `== 3.4.*`, because deprecations and removals often still happen within major versions, breaking all code that used those functions.
For that matter, have you seen any Python unit tests written before the Pytest 8 release that were broken by it? I think even ones that I wrote in the 6.x era would still run.
For that matter, the Python 3.x bytecode changes with every minor revision and things get removed from the standard library following a deprecation schedule, etc., and there's a tendency in the ecosystem to drop support for EOL Python versions, just to not have to think about it - but tons of (non-async) new code would likely work as far back as 3.6. It's not hard to avoid the := operator or the match statement (f-strings are definitely more endemic than that).
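For instance, avoiding the := operator is usually a two-line change (a trivial sketch):

    data = [1, 2, 3]

    # 3.8+ style:
    # if (n := len(data)) > 2:
    #     print(n)

    # Works as far back as 3.6:
    n = len(data)
    if n > 2:
        print(n)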
On the flip side, you can never really be sure what will break someone. Semver is an ideal, not reality (https://hynek.me/articles/semver-will-not-save-you).
And lots of projects are on calver anyway.
If you don't pin your dependencies, you will get breakage, because your dependencies can have breaking changes from version bumps. If your dependencies don't fully pin, then they will get breaking changes from what they rely on. That's why exact version numbers are almost always pinned for anything that gets distributed: it's a frequent problem that you don't want the end user to have to deal with.
Again, you don't see this problem often because you're lucky: you've installed at a time when the dependencies have already resolved all the breakage or, in the more common case, the dependencies were pinned tightly enough that those breaking changes were never an issue. In other words, everyone pinning their dependencies strictly enough is already the solution to the problem. The tighter the restriction, the stronger the guarantee of continued functionality.
I would suggest reading this comment chain again.
Bruh, one-off scripts are the whole point of Python. The cheat code is to add "break-system-packages = true" to ~/.config/pip/pip.conf. Just blow away ~/.local/lib/pythonX.Y/site-packages/ if you run into a package conflict (exceedingly rare) and reinstall. All these venvs, uv, metadata PEPs, and whatnot are pointless complications you just don't need.
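If you do go that route, the config amounts to one line in pip's user config file. A sketch (shown under the [install] section, which is where pip install reads it; note this switches off the PEP 668 guard against clobbering distro-managed packages):

    import configparser
    from pathlib import Path

    conf = Path.home() / ".config" / "pip" / "pip.conf"
    conf.parent.mkdir(parents=True, exist_ok=True)
    cfg = configparser.ConfigParser()
    cfg.read(conf)                      # keep anything already there
    if not cfg.has_section("install"):
        cfg.add_section("install")
    cfg.set("install", "break-system-packages", "true")
    with conf.open("w") as f:
        cfg.write(f)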
That's bait! / Ads are getting smarter!
I would also have accepted "unless you're geh", "unless you're a traitor to the republic", "unless you're not leet enough" etc.
And so, if you are the kind of person who has not heard of it, you probably don't read blogs about Python, and therefore you probably aren't reading _this_ blog. No harm, no foul.
Is there a version of uv written in Python? It's weird (to me) to have an entire ecosystem for a language and a highly recommended tool to make your system work is written in another language.
Interestingly, the speed is the main differentiator from existing package and project management tools. Even if you are using it as a drop-in replacement for pip, it is just so much faster.
There are many competing tools in the space, depending on how you define the project requirements.
Contrary to the implication of other replies, the lion's share of uv's speed advantage over Pip does not, from the evidence available to me, come from being written in Rust. It comes from:
* not bootstrapping Pip into the new environment (creating a venv the standard way bootstraps Pip into it by default, even though you don't actually have to; see https://zahlman.github.io/posts/2025/01/07/python-packaging-... for some hints; my upcoming post will be more direct about it - unfortunately I've been putting it off...)
* being designed up front to install cross-environment (if you want to do this with Pip, you'll eventually and with much frustration get a subtly broken installation using the old techniques; since 22.3 you can just use the `--python` flag, but this limits you to environments where the current Pip can run, and re-launches a new Pip process taking perhaps an additional 200ms - but this is still much better than bootstrapping another copy of Pip!)
* using heuristics when solving for dependencies (Pip's backtracking resolver is exhaustive, and proceeds quite stubbornly in order)
* having a smarter caching strategy (it stores uncompressed wheels in its cache and does most of the "installation" by hard-linking these into the new environment; Pip goes through a proxy that uses some opaque cache files to simulate re-doing the download, then unpacks the wheel again)
* not speculatively pre-loading a bunch of its own code that's unlikely to execute (Pip has large complex dependencies, like https://pypi.org/project/rich/, which it vendors without tree-shaking and ultimately imports almost all of, despite using only a tiny portion)
* having faster default behaviours; e.g. uv defaults to not pre-compiling installed packages to .pyc files (since Python will do this on the first import anyway) while Pip defaults to doing so (see the sketch after this list)
* not (necessarily) being weighed down by support for legacy behaviours (packaging worked radically differently when Pip first became publicly available)
* just generally being better architected
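To illustrate the .pyc point from the list above: Pip's default is roughly equivalent to running the standard library's compileall over what it just installed, which uv skips by default. A sketch, with a made-up site-packages path:

    import compileall

    # What "pre-compiling installed packages" amounts to: walk site-packages and
    # write .pyc files now, instead of letting Python do it on first import.
    compileall.compile_dir(".venv/lib/python3.12/site-packages", quiet=1)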
None of these changes require a change in programming language. (For example, if you use Python to make a hard link, you just use the standard library, which will then use code written in C to make a system call that was most likely also written in C.) Which is why I'm making https://github.com/zahlman/paper .
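A minimal sketch of that hard-linking idea (not uv's actual code; the paths are hypothetical):

    import os
    from pathlib import Path

    cache = Path("~/.cache/mytool/foolib-3.4").expanduser()   # an already-unpacked wheel
    site_packages = Path(".venv/lib/python3.12/site-packages")

    for src in cache.rglob("*"):
        if src.is_file():
            dest = site_packages / src.relative_to(cache)
            dest.parent.mkdir(parents=True, exist_ok=True)
            if not dest.exists():
                os.link(src, dest)   # same file on disk, no copy (cache and env must share a filesystem)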
Poetry doesn't do this caching trick. It creates its own cache with the same sort of structure as Pip's, and as far as I can tell it uses its own reimplementation of Pip's core installation logic from there (including `installer`, which is a factored-out package for the part of Pip that actually unpacks the wheel and copies files).
Besides, the most important tool for making Python work, the Python executable itself, is written in C. People occasionally forget it's not a self-hosting language.
A tool written in Python is never going to be as fast as one written in Rust. There are plenty of Python alternatives and you're free to use them.