Posted by zdw 1 day ago

How uv got so fast (nesbitt.io)
1073 points | 361 comments
didip 18 hours ago|
If the uv team has spare time, they should rewrite Python in Rust without any of the legacy baggage.
simonw 19 hours ago||
This post is excellent. I really like reading deep dives like this that take a complex system like uv and highlight the unique design decisions that make it work so well.

I also appreciate how much credit this gives the many previous years of Python standards processes that enabled it.

Update: I blogged more about it here, including Python recreations of the HTTP range header trick it uses and the version comparison via u64 integers: https://simonwillison.net/2025/Dec/26/how-uv-got-so-fast/
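
For a flavour of the integer trick, here is a rough Python sketch of the packing idea; the field widths and layout are illustrative only, not uv's actual encoding, and pre-releases, epochs and local versions are ignored:

  def pack_version(major, minor, patch):
      # Illustrative only: assumes every segment fits in 16 bits.
      assert max(major, minor, patch) < 2**16
      return (major << 32) | (minor << 16) | patch

  # Comparing the packed integers gives the same ordering as comparing
  # (major, minor, patch) tuples, but as a single integer comparison.
  assert pack_version(3, 12, 1) < pack_version(3, 12, 2) < pack_version(4, 0, 0)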

ggm 19 hours ago||
Some of these speed-ups look viable to backport into pip, including parallel downloads, delayed .pyc compilation, dropping .egg support, and the version checks.

Not that I'd bother, since uv does venv so well. But "it's not all Rust runtime speed" implies pip could be faster too.
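
For the parallel-download piece at least, the gain comes from plain thread-pool concurrency rather than anything Rust-specific. A minimal Python sketch (the URLs are just example endpoints):

  from concurrent.futures import ThreadPoolExecutor
  from urllib.request import urlopen

  def fetch(url):
      # Downloads are I/O-bound, so threads overlap the network waits
      # even under the GIL.
      with urlopen(url) as resp:
          return url, resp.read()

  urls = ["https://pypi.org/simple/pip/", "https://pypi.org/simple/uv/"]
  with ThreadPoolExecutor(max_workers=8) as pool:
      for url, data in pool.map(fetch, urls):
          print(url, len(data), "bytes")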

eviks 20 hours ago||
> Every code path you don’t have is a code path you don’t wait for.

No, every code path you don't execute is that. Like

> No .egg support.

How does that explain anything if the egg format is obsolete and not used?

Similarly with the spec-strictness fallback logic: it only runs if the packages you're installing are malformed, so otherwise it won't slow you down.

And in general, instead of a list of irrelevant and potentially relevant things, it would be great to see some actual time savings per item (at least for those that deliver the most speedup)!

But otherwise a great and seemingly comprehensive list!

zahlman 19 hours ago|
> No, every code path you don't execute is that.

Even in compiled languages, binaries have to get loaded into memory. For Python it's much worse. On my machine:

  $ time python -c 'pass'

  real 0m0.019s
  user 0m0.013s
  sys 0m0.006s

  $ time pip --version > /dev/null

  real 0m0.202s
  user 0m0.182s
  sys 0m0.021s
Almost all of that extra time is either the module import process or garbage collection at the end. Even with cached bytecode, the former requires finding and reading from literally hundreds of files, deserializing via `marshal.loads` and then running top-level code, which includes creating objects to represent the functions and classes.
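
For the curious, a stripped-down sketch of roughly what that per-module work looks like (ignoring finders, loaders, __spec__ bookkeeping and hash-based .pyc validation):

  import marshal, sys, types

  def load_cached_module(pyc_path, module_name):
      # Read the cached bytecode, skip the 16-byte .pyc header (magic
      # number, flags, source mtime, source size), deserialize the code
      # object, then run the module's top-level code, which is what
      # creates the function and class objects.
      with open(pyc_path, "rb") as f:
          code = marshal.loads(f.read()[16:])
      module = types.ModuleType(module_name)
      sys.modules[module_name] = module
      exec(code, module.__dict__)
      return module

Multiply that by hundreds of modules and the 0.2 s above stops being surprising.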

It used to be even worse than this; in recent versions, imports related to Requests are deferred to the first time that an HTTPS request is needed.

eviks 13 hours ago||
> binaries have to get loaded into memory.

Unless memory mapped by the OS with no impact on runtime for unused parts?

> imports related to Requests are deferred

Exactly, so again have no impact?

zahlman 13 hours ago|||
> Unless memory mapped by the OS with no impact on runtime for unused parts?

Yeah, this is presumably why a no-op `uv` invocation on my system takes ~50 ms the first time and ~10 ms on each run after that.

> Exactly, so again have no impact?

Only if your invocation of pip manages to avoid an Internet request. Note: pip will make an Internet request if you try to install a package by symbolic name even if it already has the version it wants in cache, because its cache is an HTTP cache rather than a proper download cache.

But even then, there will be hundreds of imports mainly related to Rich and its dependencies.

eviks 13 hours ago||
> Only if your invocation of pip manages to avoid an Internet request.

Yes it does, by definition; the topic of discussion is the impact of unused code paths. How is the HTTP cache relevant here? That's a used path!

zahlman 10 hours ago||
I got confused by the direction of the discussion.

My original point was that Requests imports in pip used to not be deferred like that, so you would pay for them up front, even if they turned out to be irrelevant. (But also they are relevant more often than they should be, i.e. the deferral system doesn't work as well as it should.)

Part of the reason you pay for them is to run top-level code (to create function and class objects) that are irrelevant to what the program is actually doing. But another big part is the cost of actually locating the files, reading them, and deserializing bytecode from them. This happens at import time even if you don't invoke any of the functionality.
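
The deferral itself is just the standard lazy-import pattern, something along these lines (a sketch, not pip's actual code):

  _requests = None

  def get_requests():
      # The import of requests (and its whole dependency tree) only
      # happens the first time network access is needed; commands that
      # never hit the network never pay for it.
      global _requests
      if _requests is None:
          import requests
          _requests = requests
      return _requests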

cmrx64 5 hours ago|||
rtld does a lot of work even in “static” binaries to rewrite relocations even in “unused parts” of any PIE (which should be all of them today) and most binaries need full dyld anyway.
rao-v 13 hours ago||
I have to say it's just lovely seeing such a nicely crafted and written technical essay. It's so obvious that this is crafted by hand, and reading it just reemphasises how much we've lost because technical bloggers are too ready to hand the keys over to LLMs.
yakshaving_jgt 5 hours ago|
This post was very clearly written with an LLM.
dangoodmanUT 17 hours ago||
> Zero-copy deserialization

Just a nit on this section: zero-copy deserialization is not Rust-specific (see FlatBuffers). rkyv, as a crate for doing it in Rust, is, though.
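
The idea translates to any language: lay the data out so that fields can be read in place from the serialized buffer instead of being parsed into fresh objects. A toy Python illustration of the concept (nothing like rkyv's or FlatBuffers' actual layouts):

  import struct

  # A fixed-layout record: 4-byte id, 8-byte version, 16-byte name,
  # packed with no padding.
  buf = memoryview(struct.pack("<IQ16s", 7, 123456789, b"example-package\x00"))

  def version_of(view):
      # unpack_from reads straight out of the buffer at an offset; the
      # name bytes are never copied unless someone actually asks for them.
      (version,) = struct.unpack_from("<Q", view, 4)
      return version

  print(version_of(buf))  # 123456789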

VerifiedReports 21 hours ago||
So... will uv make Python a viable cross-platform utility solution?

I was going to learn Python for just that (file-conversion utilities and the like), but everybody was so down on the messy ecosystem that I never bothered.

pseudosavant 20 hours ago||
I write all of my scripts in Python with PEP 723 metadata and run them with `uv run`. Works great on Windows and Linux for me.
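
For anyone who hasn't tried it, the script carries its own dependency metadata in a comment block at the top, so `uv run script.py` can build the environment on the fly. A minimal example (the `requests` dependency is just for illustration):

  # /// script
  # requires-python = ">=3.12"
  # dependencies = [
  #     "requests",
  # ]
  # ///

  import requests

  print(requests.get("https://example.org").status_code)
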
zahlman 20 hours ago|||
It has been viable for a long time, and the kinds of projects you describe are likely well served by the standard library.
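
For example, a small CSV-to-JSON converter of the kind mentioned above needs nothing outside the standard library (a minimal sketch; the file paths are placeholders):

  import csv, json, sys

  def csv_to_json(csv_path, json_path):
      # Read rows as dicts keyed by the CSV header, then write them out
      # as a JSON array.
      with open(csv_path, newline="") as f:
          rows = list(csv.DictReader(f))
      with open(json_path, "w") as f:
          json.dump(rows, f, indent=2)

  if __name__ == "__main__":
      csv_to_json(sys.argv[1], sys.argv[2])
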
IshKebab 19 hours ago||
Yes, uv basically solves the terrible Python tooling situation.

In my view that was by far the biggest issue with Python - a complete deal-breaker really. But uv solves it pretty well.

The remaining big issues are a) performance, and b) the import system. uv doesn't do anything about those.

Performance may not be an issue in some cases, and the import system is ... tolerable if you're writing "a Python project". If you're writing some other project and considering using Python for its scripting system, e.g. to wrangle multiple build systems or whatever, then the import mess is a bigger issue and I would think long and hard before picking it over Deno.

VerifiedReports 13 hours ago||
Thanks! I don't really think about importing stuff (which maybe I should), because I assume I'll have to write any specialized logic myself. So... your outlook is encouraging.
zahlman 20 hours ago||
I've talked about this many times on HN this year but got beaten to the punch on blogging it seems. Curses.

... Okay, after a brief look, there's still lots of room for me to comment. In particular:

> pip’s slowness isn’t a failure of implementation. For years, Python packaging required executing code to find out what a package needed.

This is largely refuted by the fact that pip is still slow, even when installing from wheels (and getting PEP 600 metadata for them). Pip is actually still slow even when doing nothing. (And when you create a venv and allow pip to be bootstrapped into it, that bootstrap process accounts for a high-90s percentage of the total time used.)
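
For context on the quoted claim: a legacy setup.py could compute its dependency list at run time, which is why resolvers had to execute it just to learn what a package needed. A minimal illustration (not any real package's setup.py):

  # setup.py (legacy style): the dependency list only exists once this
  # code has run, so tools had to execute it to discover requirements.
  import sys
  from setuptools import setup

  deps = ["requests"]
  if sys.version_info < (3, 8):
      deps.append("importlib-metadata")

  setup(name="example", version="1.0", install_requires=deps)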

ec109685 21 hours ago||
The article info is great, but why do people put up with LLM tics and slop in their writing? These sentences add no value and treat the reader as stupid.

> This is concurrency, not language magic.

> This is filesystem ops, not language-dependent.

Duh, you literally told me that in the previous sentence and 50 million other times.

aurumque 21 hours ago|
This kind of writing goes deeper than LLMs, and reflects a decline in reading ability, patience, and attention. Without passing judgement, there are just more people now who benefit from repetition and summarization embedded directly in the article. The reader isn't 'stupid', just burdened.
twoodfin 20 hours ago||
Indeed, over the past few weeks I am coming around to the realization and acceptance that the LLM editorial voice is a benefit to an order of magnitude more HN readers than those (like us) for whom it is ice-pick-in-the-nostril stuff.

Oh well, all I can do is flag.

hk1337 18 hours ago|
It’s fast because it sucks the life force from bad developers to make them into something good.

Jokes aside…

I really like uv but also really like mise and I cannot seem to get them to work well together.

Onavo 17 hours ago|
Why? They are pretty compatible. Just set the venv in the project's mise.toml and you are good to go. Mise will activate it automatically when you change into the project directory.
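
If I remember mise's config correctly, the relevant bit looks something like the following; treat the exact keys as an assumption and check the mise docs:

  [tools]
  python = "3.12"

  [env]
  # Assumed syntax for mise's venv integration: create and activate
  # .venv when entering the project directory.
  _.python.venv = { path = ".venv", create = true }
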
hk1337 2 hours ago||
I believe I was trying it the other way around. I installed uv and Python with mise, but uv still created a .python-version file and used the Python installed on the system instead of the one from mise.