
Posted by deepakjois 6 days ago

Fun with uv and PEP 723 (www.cottongeeks.com)
636 points | 225 comments
ACAVJW4H 6 days ago|
finally feels like Python scripts can Just Work™ without a virtualenv scavenger hunt.
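
For anyone who hasn't seen the trick the article describes, a minimal sketch of such a script (the dependency and URL are just placeholder examples):

    #!/usr/bin/env -S uv run --script
    # /// script
    # requires-python = ">=3.12"
    # dependencies = ["requests"]
    # ///
    import requests

    # uv reads the PEP 723 block above, provisions a throwaway venv with
    # requests in it, and runs the script -- no manual venv or pip step
    print(requests.get("https://example.com").status_code)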

Now if only someone could do the same for shell scripts. Packaging, dependency management, and reproducibility in shell land are still stuck in the Stone Ages. Right now it’s still curl | bash and hope for the best, or a README with 12 manual steps and three missing dependencies.

Sure, there’s Nix... if you’ve already transcended time, space, and the Nix manual. Docker? Great, if downloading a Linux distro to run sed sounds reasonable.

There’s got to be a middle ground simple, declarative, and built for humans.

nothrabannosir 6 days ago||
Nix is overkill for any of the things it can do. Writing a simple portable script is no exception.

But: it’s the same skill set for every one of those things. This is why it’s an investment worth making IMO. If you’re only going to ever use it for one single thing, it’s not worth it. But once you’ve learned it you’ll be able to leverage it everywhere.

Python scripts with or without dependencies, uv or no uv (through the excellent uv2nix which I can’t plug enough, no affiliation), bash scripts with any dependencies you want, etc. suddenly it’s your choice and you can actually choose the right tool for the job.

Not trying to derail the thread but it feels germane in this context. All these little packaging problems go away with Nix, and are replaced by one single giant problem XD

exe34 5 days ago||
> Nix is overkill for any of the things it can do. Writing a simple portable script is no exception.

ChatGPT writes pretty good Nix now. You can simply paste any errors in and it will fix them.

traverseda 6 days ago|||
I don't think nix is that hard for this particular use case. Installing nix on other distros is pretty easy, and once it's installed you just do something like this

    #! /usr/bin/env nix-shell
    #! nix-shell -i bash -p imagemagick cowsay

    # scale image by 50%
    convert "$1" -scale 50% "$1.s50.jpg" &&
    cowsay "done $1.s50.jpg"
Sure, all of NixOS and packaging for Nix is a challenge, but just using it for a shell script is not too bad.
caspar 5 days ago|||
Last time I checked,[0] this works great, as long as you don't particularly care which specific versions of imagemagick or cowsay you get.

If you do care, then welcome to learning about niv, flakes, etc.

[0]: admittedly 3 years ago or so.

loremm 5 days ago||
This is a hack, but I still found it helpful. If you do want to pin a certain version without worrying about flakes [1], this can be your bash shebang (and something similar works in configuration.nix or an interactive nix-shell). It just tells Nix to use a specific nixpkgs revision as its base instead of whatever your normal channel is.

For my use case I don't mind most things tracking mainline, but some things I want to pin (chromium is very large, python changes a lot, or some version broke things).

    #! nix-shell -i bash -p "cowsay" '(import (fetchTarball { url="https://github.com/NixOS/nixpkgs/archive/eb090f7b923b1226e8b... sha256 = "15iglsr7h3s435a04313xddah8vds815i9lajcc923s4yl54aj4j";}) {}).python3'

[1] flakes really aren't bad either, especially if you think of them as just doing the above, but automatically

Imustaskforhelp 5 days ago|||
I will say this with a whole heart. My arch linux broke and I wanted to try out nix.

The most shocking part about nix is nix-shell (I know I can use it on other distros, but hear me out): it's literally so cool to install programs for one-off use.

Want to record your desktop? It's one of those tasks I do quite infrequently, and I didn't like how on Arch I either had to keep obs around as a permanent part of my system or uninstall it each time. Ephemerality was something I was looking for before nix, since I like trying out new software while keeping my home system kind of minimalist-ish. Now it's just `nix-shell -p obs-studio`, then `obs`, and you've got it.

Honestly, I like a lot of things about nix. I still haven't gone far into the flakes side of things and sadly just use it imperatively, but I found out that nix builds are sandboxed, so I had the idea of using it as a sandbox for running code from reddit, and I think I am going to do something cool with it (building something like codapi; codapi's creator is kinda cool, and if you are reading this mate, I'd love talking to ya).

And I personally feel as if some software could truly be made plug and play. Imagine Hetzner offering NixOS machines (currently I've heard support is finicky); then I feel we could get something really, really close to DigitalOcean droplets, plug and play, without the isolation Docker provides. Docker has its own use cases, but managing Docker stuff feels harder to me than managing nix stuff. Feel free to correct me, I'm just describing how using nix feels.

I also wish there were something like a functional Lua (does that exist??) -> Nix transpiler, because I'd like to write Lua instead of Nix to manage my system. But I guess Nix is fine too!

Hetzner_OL 5 days ago|||
Hi there, Since you mentioned Hetzner, I thought I would respond here. While we do not have NixOS as one of our standard images for our cloud products, it is part of our ISO library. Customers can install it manually. To do this, create a cloud server, click on it, and then on the "ISO" in the menu, and then look for it listed alphabetically. --Katie
Imustaskforhelp 5 days ago|||
Hey Hetzner. I am just a 16 year old boy (technically I am turning 17 on 2nd July haha, but I want nothing from ya haha) who has heard great things about your service being affordable, but I've never tried it because I don't have a credit card / I'm a really frugal person at the moment haha. I was also reading one of your own documents, if I remember correctly, and it said that the support isn't the best (but I guess I was wrong).

I guess I will try out nix on Hetzner for sure one day. This is really cool!!! Thanks! I didn't expect you to respond. This is really, really cool. You made my day, whoever responded with this. THANKS A LOT KATIE. LOTS OF LOVE TO HETZNER. MAY YOU STAY THE WAY YOU ARE, SINCE Y'ALL ARE PERFECT.

Hetzner_OL 5 days ago||
Hi again, I'm happy that I made your day! You seem pretty easy to please if that is all it takes. Keep in mind that customers must be 18 years old. I believe that is a legal requirement here in Germany, where we are based. Until then, if you're a fan, maybe you'd enjoy seeing what we're up to. We're on YouTube, reddit, Mastodon, Instagram, Facebook, and X. --Katie
loremm 5 days ago|||
and I've been using nixos on hetzner, nothing crazy but it's always worked great :-). A nice combination with terraform
Purplish9893 5 days ago|||
If you think nix-shell is cool, try out comma. https://github.com/nix-community/comma

When there's some random little utility I need I don't always bother to install it. It's just `, weirdlittleutil`.

bigstrat2003 6 days ago|||
> Packaging, dependency management, and reproducibility in shell land are still stuck in the Stone Ages.

IMO it should stay that way, because any script that needs those things is way past the point where shell is a reasonable choice. Shell scripts should be small, 20 lines or so. The language just plain sucks too much to make it worth using for anything bigger.

xavdid 5 days ago|||
My rule of thumb is that as soon as I write a conditional, it's time to upgrade bash to Python/Node/etc. I shouldn't have to search for the nuances of `if` statements every time I need to write them.
pxc 5 days ago|||
What nuances are there to if statements, exactly?

An if statement in, for instance, Bash just runs any command and then runs one of two blocks of code based on the exit status of that command. If the exit status indicates success (zero), it runs what follows the `then`; otherwise it runs what follows the `else`. (`elif` is admittedly gratuitous syntax; it would be better if it were just implemented as an if inside an else statement.) This seems quite similar to other programming languages and like not very much to remember.

I'll admit that one thing I do in my shell scripts is avoid "fake syntax": I never use `[` or `[[` because these obscure the real structure of the statements for the sake of cuteness. I just write `test`, which makes clear that it's just an ordinary command, and also signals to someone who isn't sure what it's doing that they can find out just by running `man test`, `help test`, `info test`, etc., from the same shell.

I also agree that if statements and if expressions should be kept few and simple. But in some ways it's actually easier to do this in shell languages than in many others! Chaining && and/or || can often get you through a substantial script without any if statements at all, let alone nested ones.
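
A tiny sketch of what I mean (the commands and messages are made up):

    #!/usr/bin/env bash
    # guard clauses: bail out early instead of nesting ifs
    command -v convert >/dev/null 2>&1 || { echo "imagemagick is required" >&2; exit 1; }
    test -n "${1:-}"                   || { echo "usage: $0 IMAGE" >&2; exit 1; }

    # chain on exit status: the second command runs only if the first succeeded
    convert "$1" -scale 50% "$1.s50.jpg" && echo "wrote $1.s50.jpg"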

passwd 3 days ago|||
The difference being, as far as I know, that `[[` is real syntax. From what I remember, this helps avoid a certain class of issues, gives better error messages, and is more certain to be a bash built-in.

What I would worry about more is that it breaks `sh` compatibility.

pxc 3 days ago||
`test` and `[` are Bash builtins just like `[[` is built into bash. But `[[`'s implementation does some things that actual commands can't do because it gets parsed differently than a normal command.

When Bash sees it, it treats it as something that needs to be matched, like a string, and won't stop taking user input in an interactive session until it sees the `]]` or something else that causes a syntax error. If I write `[` and just hit enter, I get an error from the `[` command, same as if I ran an external one. But if I use a `[[`, I get an error message back from Bash itself about a malformed conditional (and/or it will wait for input before trying to execute the command):

  2 bash  which -a [
  /Users/pxc/.nix-profile/bin/[
  /bin/[

  󰧈 2 ~ 
  bash  [
  bash: [: missing `]'

  󰧈 2 ~ 
  2 bash  /bin/[
  [: missing ]

  󰧈 2 ~ 
  2 bash  [[
  ∙ no prompt
  bash: conditional binary operator expected
  bash: syntax error near `prompt'

  󰧈 2 ~ 
  2 bash  [[
  ∙ a =
  bash: unexpected argument `newline' to conditional binary operator
  bash: syntax error near `='
The other thing `[[` does is it has you write `&&` and `||` instead of `-a` and `-o`. A normal command can't do this, because it can't influence the parser of the shell running it-- `&&` will get interpreted by the shell rather than the command unless it is escaped.

This same kind of special handling by the parser probably allows for other differences in error messages, but I don't write Bash that produces such errors, so I couldn't tell you. ;)

> more certain to be a bash built-in

If you want to be sure that you're using a built-in rather than a command, you can use the `builtin` command. But because `[[` is actually special syntax, it's technically not a builtin, so you can't use it this way! Check it out:

  ~ took 34m8s 
  2 󰈺   bash

  󰧈 2 ~ 
  bash  builtin [[
  bash: builtin: [[: not a shell builtin

  󰧈 2 ~ 
  1 bash  builtin [ 
  bash: [: missing `]'
The thing that lets `[[` yield more sophisticated error messages in some ways is actually the very reason I prefer to stay away from it: it's special syntax when an ordinary command works just fine. I think the any-command-goes-here-and-all-commands-are-equal structure of if statements in Unix shells is elegant, and it's already expressive enough for everything we want to do. Stuff like `[[` complicates things and obscures that elegance without really buying us additional power, or even additional concision.

Imo, that's the real reason to avoid it. I'm all for embracing Bashisms when it makes code more legible. For instance, I think it's great to lean on associative arrays, `shopt -s lastpipe`, and `mapfile`. They're innovations (or deviations, depending on how you look at it ;), like `[[`, but I feel like they make the language clearer and more elegant while `[[` actually obfuscates the beauty of `if` statements in shell languages, including Bash.

xavdid 5 days ago|||
I mean, there are 3 equally valid ways to write an if statement: `test`, `[`, and `[[`. In the case of the latter two, there are a mess of single-letter flags to test things about a file or condition[0]. I'm not sure what makes them "fake syntax", but I also don't know that much about bash.

It's all reasonable enough if you go and look it up, but the script immediately becomes harder to reason about. Conditionals shouldn't be this hard.

[0]: https://tldp.org/LDP/Bash-Beginners-Guide/html/sect_07_01.ht...

pxc 5 days ago||
You don't need any of those to write an if statement. I frequently write if statements like this one

    if ! grep -qF something /etc/some/config/file 2>/dev/null; then
      do_something
    fi
The `test` command is there if you want to use it, but it's just another command.

In the case of Bash, `test` is a built-in command rather than an external program, and it also has two other names, `[` and `[[`. I don't like the latter two because they look, to a naive reader, like special syntax built into the shell, like something the parser sees as unique and different and that bears a special relationship to if-statements, but they aren't and they don't. And in fact you can use them in other shells that don't have them as built-ins, if you implement them as external commands. (You can probably find a binary called `[` on your system right now.)

(Actually, it looks like `[[` is even worse than "fake syntax"... it's real special syntax. It changes how Bash interprets `&&` and `||`. Yikes.)

But if you don't like `test`, you don't have to use it; you can use any command you like!

For instance, you might use `expr`:

  if expr 1 '>' 0; then
    echo this will always run
  else
    echo this will never run
  fi
Fish has some built-ins that fall into a similar niche that are handy for simple comparisons like this, namely `math` and `string`, but there are probably others.

If you really don't like `test`, you don't even need to use it for checking the existence or type (dir, symlink, socket, etc.) of files! You can use GNU `find` for that, or even sharkdp's `fd` if you ache for something new and shiny.

Fish actually has something really nice here in the `path` built-in, which includes long options like you and I both wish `test` had. You can write:

  if path is --type=dir a/b/c
    touch a/b/c/some-file
  end
You don't need `test` for asking about or asserting equality of variables, either:

  grep -qxF "$A" <<< "$B"
is equivalent to

  test "$A" = "$B"
or with the Fish `string` built-in

  string match --entire $A $B
The key is that in a shell, every command's exit status can serve as a boolean. `&&` and `||` let you combine those exit statuses in exactly the way you'd expect, as do the (imo much more elegant) `and` and `or` combiner commands in Fish.

Finally, there's no need to use the likes of `test` for combining conditions. I certainly never do. You can just write

  test "$A" = "$B" && test "$C" = "$D"
instead of something like

  [ "$A" = "$B" -a "$C" = "$D" ]


If-statements in shell languages are so simple that there's practically nothing to them. They just take a single command (any!) and branch based on its exit status! That's it.

As for readability: any program in any language is difficult to understand if you don't know the interfaces or behaviors of the functions it invokes. `[`/`test` is no different from any such function, although it appears that `[[` is something weirder and, imo, worse.

colonial 4 days ago||||
This is a decent heuristic, although (IMO) you can usually get away with ~100 lines of shell without too much headache.

Last year I wrote (really, grew like a tumor) a 2000 line Fish script to do some Podman magic. The first few hundred lines were great, since it was "just" piping data around - shell is great at that!

It then proceeded to go completely off the rails when I went full sunk cost fallacy and started abusing /dev/shm to emulate hash tables.

E: just looked at the source code. My "build system" was another Fish script that concatenated several script files together. Jeez. Never again.

w0m 5 days ago|||
Historically, my rule of thumb is: as soon as I can't see the ~entire script without scrolling, it's time to rewrite it in Python/Ansible. I think about the rewrite, but it usually takes a while to actually do it (if ever).
pxc 6 days ago||||
When you solve the dependency management issue for shell scripts, you can also use newer language features because you can ship a newer interpreter the same way you ship whatever external dependencies you have. You don't have to limit yourself to what is POSIX, etc. Depending on how you solve it, you may even be able to switch to a newer shell with a nicer language. (And doing so may solve it for you; since PowerShell, newer shells often come with a dependency management layer.)

> any script that needs those things

It's not really a matter of needing those things, necessarily. Once you have them, you're welcome to write scripts in a cleaner, more convenient way. For instance, all of my shell scripts used by colleagues at work just use GNU coreutils regardless of what platform they're on. Instead of worrying about differences in how sed behaves with certain flags, on different platforms, I simply write everything for GNU sed and it Just Works™. Do those scripts need such a thing? Not necessarily. Is it nicer to write free of constraints like that? Yes!

Same thing for just choosing commands with nicer interfaces, or more unified syntax... Use p7zip for handling all your archives so there's only one interface to think about. Make heavy use of `jq` (a great language) for dealing with structured data. Don't worry about reading input from a file and then writing back to it in the same pipeline; just throw in `sponge` from moreutils.
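
For example (the filename and jq filter are made up):

    # jq can't safely write to the file it's still reading, but sponge (from
    # moreutils) soaks up all of stdin before it touches the output file
    jq '.retries = 5' config.json | sponge config.json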

> The language just plain sucks too much

There really isn't anything better for invoking external programs. Everything else is way clunkier. Maybe that's okay, but when I've rewritten large-ish shell scripts in other languages, I often found myself annoyed with the new language. What used to be a 20-line shell script can easily end up being 400 lines in a "real" language.

I kind of agree with you, of course. POSIX-ish shells have too much syntax and at the same time not enough power. But what I really want is a better shell language, not to use some interpreted non-shell language in their place.

m2f2 6 days ago|||
Nice, if only you could count on having it installed on your fleet, and your fleet is 100% Linux, no AIX, no HPUX, no Solaris, no SUSE on IBM Power....

Been there, tried to, got a huge slap in the face.

kstrauser 5 days ago||
Been there, done that. I am so glad I don’t have to deal with all that insanity anymore. In the build farm I was responsible for, I was always happy to work on the Linux and BSD boxes. AIX and HPUX made me want to throw things. At least the Itanium junk acted like a normal server, just a painfully slow one.

I will never voluntarily run a bunch of non-Linux/BSD servers again.

lenerdenator 5 days ago||
I honestly don't get why there are still a bunch of non-Linux/BSD servers, at least if the goal is to do UNIX-y stuff.

I haven't touched AIX or HPUX in probably a decade and I thought they were a weird idea back then: proprietary UNIX? Is it still 1993?

kstrauser 5 days ago||
At the time (10 years ago) I worked for a company with enormous customers who had all kinds of different deployment targets. I bet that list is a lot shorter today.

I hope so, for their sake. shudder

MatmaRex 6 days ago|||
Broke: Dependency management used for shell scripts

Woke: Dependency management used for installing an interpreter for a better programming language to write your script in it

Bespoke: Dependency management used for installing your script

maccard 5 days ago||||
Unfortunately there’s basically no guarantee that even the simplest scripts work.

    #!/bin/bash
    make $1
Has multiple possible problems with it.
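
For instance, a sketch of the usual fixes (exactly which of these bite depends on the environment):

    #!/usr/bin/env bash              # /bin/bash isn't guaranteed to exist (NixOS, some BSDs)
    set -euo pipefail                # don't silently ignore failures or unset variables
    test $# -ge 1 || { echo "usage: $0 TARGET" >&2; exit 1; }
    command -v make >/dev/null 2>&1 || { echo "make is not installed" >&2; exit 1; }
    make "$1"                        # quoted, so the target can't word-split or glob-expand
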
johnisgood 5 days ago|||
I have a couple of projects consisting of around 1k+ lines of Bash each. :) Not to boast, but they're pretty easy to read and maintain. They're complete as well: I tested all of the functionality and it just works(tm). Were it another language, it might have been more than around 1k LOC, or more difficult to maintain. I call external programs a lot, so I stuck with a shell script.
wpm 6 days ago|||
I simply do not write shell scripts that use or reference binaries/libraries that are not pre-installed on the target OS (which is the correct target; writing shell scripts for portability is silly).

There is no package manager that is going to make a shell script I write for macOS work on Linux if that script uses commands that only exist on macOS.

fragmede 5 days ago||
fwiw (home)brew exists on both platforms
wazzaps 6 days ago|||
Check out mise: https://mise.jdx.dev/

We use it at $work to manage dev envs and it's much easier than Docker and Nix.

It also installs things in parallel, which is a huge bonus over plain Dockerfiles.

KingMob 5 days ago||
I declared nix bankruptcy earlier this year and moved to mise. It does 90% of what I need for only 1% of the effort of nix.
ndr 6 days ago|||
Why bother writing new shell scripts?

If you're allowed to install any deps go with uv, it'll do the rest.

I'm also kinda in love with https://babashka.org/ check it out if you like Clojure.

andenacitelli 6 days ago|||
+1 for Mise, it has just totally solved the 1..N problem for us and made it hilariously easy to be more consistent across local dev and workflows
yard2010 5 days ago|||
That's a shame, as I had gotten to monk-level Python jujitsu. I can fix any problem, you name it: HTTPS nightmares, brew versions vs pyenv, virtualenv shenanigans. Now all this knowledge looks like a bad investment of time.
arcanemachiner 5 days ago||
Never say never.

Knowing the Python packaging ecosystem, uv could very well be replaced by something else. It feels different this time, but we won't know for a while yet.

w0m 5 days ago||
Agreed. I migrated ~all my personal things to uv, but I'm sure once I start adopting it widely at work I'll find edge cases where you need to know the weeds to figure out / work around.
password4321 6 days ago|||
I'm unable to resist responding that clearly the solution is to run Nix in Docker as your shell since packaging, dependency management, and reproducibility will be at theoretical maximum.
bjackman 6 days ago|||
For the specific case of solving shell script dependencies, Nix is actually very straightforward. Packaging a script is a writeShellApplication call and calling it is a `nix run`.

I guess the issue is just that nobody has documented how to do that one specific thing so you can only learn this technique by trying to learn Nix as a whole.
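
Roughly something like this (a sketch from memory, not pinned to any particular nixpkgs revision):

    # flake.nix
    {
      outputs = { self, nixpkgs }:
        let
          pkgs = nixpkgs.legacyPackages.x86_64-linux;
        in {
          packages.x86_64-linux.default = pkgs.writeShellApplication {
            name = "shrink";
            runtimeInputs = [ pkgs.imagemagick ];
            text = ''
              convert "$1" -scale 50% "$1.s50.jpg"
            '';
          };
        };
    }
    # then: nix run . -- photo.jpg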

So perhaps the thing you're envisaging could just be a wrapper for this Nix logic.

pxc 6 days ago|||
I use Nix for this with resholve and I like it a lot.
fouronnes3 6 days ago|||
Consider porting your shell scripts to Python? The language is vastly superior and subprocess.check_call is not so bad.
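
A rough sketch of what that tends to look like (the commands and filenames are just examples):

    import subprocess
    import sys

    # roughly: grep -q needle notes.txt || exit 1
    try:
        subprocess.check_call(["grep", "-q", "needle", "notes.txt"])
    except subprocess.CalledProcessError:
        sys.exit("needle not found")

    # roughly: rev=$(git rev-parse HEAD)
    rev = subprocess.run(
        ["git", "rev-parse", "HEAD"],
        check=True, capture_output=True, text=True,
    ).stdout.strip()
    print(rev)
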
Narushia 5 days ago|||
> Great, if downloading a Linux distro to run sed sounds reasonable.

There's a reason why distroless images exist. :)

db48x 5 days ago|||
Guix is easier to grok than Nix, if anyone is looking to save themselves some effort.
est 6 days ago|||
> finally feels like Python scripts can Just Work™ without a virtualenv scavenger hunt.

Hmm, last time I checked, uv installs Python into ~/.local/share/uv/python/cpython-3.xx and can't install it globally, e.g. inside a minimal Docker image without any other Python.

So basically it still runs in a venv.

ndr 5 days ago||
https://docs.astral.sh/uv/reference/settings/#pip_system
est 5 days ago||
I mean: how do I get `uv python install` to install Python system-wide?

No matter what I tried, it's always a symlink into ~/.local

bjornasm 4 days ago||
>When Python is installed by uv, it will not be available globally (i.e. via the python command). Support for this feature is in preview. See Installing Python executables for details.

>You can still use uv run or create and activate a virtual environment to use python directly.

est 4 days ago||
yes, that's exactly what I was getting at with OP's "virtualenv scavenger hunt" statement.

You still need some kind of venv, even with the power of uv.

SmellTheGlove 6 days ago||
Would homebrew do the job?
w0m 5 days ago||
Homebrew does a great job at initial setup; it does a poor job of keeping a system clean and updated over time.
epistasis 6 days ago||
This is really great, and it seems that it's becoming more popular. I saw it first on simonw's blog:

https://simonwillison.net/2024/Dec/19/one-shot-python-tools/

And there was a March discussion of a different blog post:

https://news.ycombinator.com/item?id=43500124

I hope this stays on the front page for a while to help publicize it.

soundblaster 3 days ago|
Same! Nice trick.

At the end of the article it shows an MCP server to fetch YouTube subs. I've made a similar one using simonw's llm, as a fragment plugin, in case you find it useful.

  llm -f youtube:<id>
  llm -f yt:<lang>:<id>
https://github.com/redraw/llm-fragments-youtube
puika 6 days ago||
Like the author, I find myself going more for cross-platform Python one-offs and personal scripts for both work and home and ditching Go. I just wish Python typechecking weren't the shitshow it is. Looking forward to ty, pyrefly, etc. to improve the situation a bit
SavioMak 6 days ago||
Speed is one thing, but the type system itself is another: you're basically guaranteed to hit 5-10 issues with Python's weird type system before you start grasping some of the oddities.
davidatbu 5 days ago|||
I wouldn't describe Python type checking as a shit-show. pyright is pretty much perfect. One nit against it, perhaps, is that it doesn't support non-standard typing constructs like mypy does (for Django etc.). That's an intentional decision on the maintainer's part, and I'm glad he made it, because it spurred efforts to make the standard typing constructs more expressive.

I'm also looking forward to the maturity of Rust-based type checkers, but solely because one can almost always benefit from an order of magnitude improvement in speed of type checking, not because Python type-checking is a "shit show".

I do grant you that, for outsiders, it's confusing that the type checker from the Python organization itself is actually a second-rate type checker (except when one uses Django etc., where it becomes first-rate).

ViscountPenguin 5 days ago|||
I've never particularly liked Go for cross-platform code anyway; I've always found it pretty tightly wedded to Unix. Python has its fair share of issues on Windows as well though. I've been stuck debugging weird DLL issues with libraries for far too long in my life.

Strangely, I've found myself building personal cross platform apps in game engines because of that.

silverwind 5 days ago||
I do hope the community will converge on one type checker like ty. The fact that multiple type checkers exist is a real hindrance to the language as a whole.
jkingsman 6 days ago||
uv has been fantastic to use for little side projects. Combining uv run with `uv tool run` AKA `uvx` means one can fetch, install within a VM, and execute Python scripts from Github super easily. No git clone, no venv creation + entry + pip install.
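
For example, something along these lines (the repo here is a made-up placeholder):

    uvx --from git+https://github.com/example/sometool sometool --help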

And uv is fast. I mean REALLY fast: fast to the point of suspecting something went wrong and silently errored, when in fact it did just what I wanted, only 10x faster than pip.

It (and especially its docs) are a little rough around the edges, but it's bold enough and good enough I'm willing to use it nonetheless.

lxgr 6 days ago||
Truly. uv somehow resolves and installs dependencies more quickly than pyenv manages to print its own --help output.
mikepurvis 6 days ago|||
I know there are real reasons for slow Python startup time, with every new import having to examine swaths of filesystem paths to resolve itself, but it really is a noticeable breath of fresh air working with tools implemented in Go or Rust that have sub-ms startup.
mr_mitm 5 days ago|||
You don't have to import everything just to print the help. I try to avoid top-level imports until after the CLI arguments have been parsed, so the only import until then is `argparse` or `click`. This way, startup appears to be instant even in Python.

Example:

    if __name__ == "__main__":
        from myapp.cli import parse_args

        args = parse_args()

        # The program will exit here if `-h` is given

        # Now do the heavy imports

        from myapp.lib import run_app

        run_app(args)
mikepurvis 5 days ago||
Another pattern, though, is that a top level tool uses pkg_resources and entry_points to move its core functionality out to verb plugins— in that case the help is actually the worst case scenario because not only do we have to scan the filesystem looking for what plugins are available, they all have to be imported in order to ask each for its help strings.
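
Roughly this shape, using the modern importlib.metadata equivalent of pkg_resources (the group name is made up):

    from importlib.metadata import entry_points

    # every installed plugin registered under the (hypothetical) group has to be
    # imported just so we can ask it for its help text
    for ep in entry_points(group="mytool.verbs"):  # keyword form needs Python 3.10+
        verb = ep.load()
        print(f"{ep.name}: {verb.__doc__}")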

An extreme version of this is the colcon build tool for ROS 2 workspaces:

https://github.com/colcon/colcon-core/blob/master/setup.cfg#...

Unsurprisingly, startup time for this is not great.

Spivak 6 days ago||||
Not to derail the Python speed hate train but pyenv is written in bash.

It's a tool for installing different versions of Python, it would be weird for it to assume it already had one available.

lxgr 6 days ago||
Oh, that might actually explain the slow line printing speed. Thank you, solves a long standing low stakes mystery for me :)
lxgr 6 days ago||||
The Python startup latency thing makes sense, but I really don't understand why it would take `pyenv` a long time to print each line of its "usage" output (the one that appears when invoking it with `--help`) once it's already clearly in the code branch that does only that.

It feels like it's doing heavy work between each line printed! I don't know any other CLI tool doing that either.

heavyset_go 6 days ago||
There's a launcher wrapper shell script + Python startup time that contributes to pyenv's slow launch times.
theshrike79 6 days ago|||
The "slowness" and the utter insanity of trying to make a "works on my computer" Python program work on another computer pushed me to just rewrite all my Python stuff in Go.

About 95% of my Python utilities are now Go binaries cross-compiled to whatever env they're running in. The few remaining ones use (API) libraries that aren't available for Go or aren't mature enough for me to trust them yet.

heavyset_go 6 days ago|||
Last time I looked, pyenv contributors were considering implementing a compiled launcher for that reason.

But that ship has sailed for me and I'm a uv convert.

mmcnl 5 days ago|||
I agree uv is amazing, but it's not a virtual machine, it's a virtual environment. It runs the scripts on top of your OS without any hardware virtualization. The virtual environment only isolates the Python dependencies.
bjornasm 4 days ago|||
>It (and especially its docs) are a little rough around the edges, but it's bold enough and good enough I'm willing to use it nonetheless

Thought I was the only one thinking this. I ought to open an issue; I think it would be nice to have some more examples showcasing different use cases.

TZVdosOWs3kZHus 5 days ago||
No more of the mkdocs dependency problems I used to run into every other month:

  uvx --with mkdocs-material --with mkdocs-material-extensions --with mkdocs-nav-weight mkdocs serve -a localhost:1337
Funnily enough it also feels like it is starting faster.
winterqt 5 days ago||
Is there a reason you didn’t explicitly pull in mkdocs as a dependency in that invocation? I guess uv will expose it/let you run it anyways due to the fact that it’s required by everything else you did specify.
tfitz237 5 days ago||
It's a `uvx` call, so the tool being invoked is `mkdocs`, and all the other dependencies are additions on top of that.
satvikpendem 6 days ago||
Very nice. I believe Rust is doing something similar too, which is where I initially learned of this idea of single-file, shell-style scripts in other languages (with dependency management included, which is how it differs from existing ways of writing single-file scripts in, e.g., scripting languages) [0].

Hopefully more languages follow suit on this pattern as it can be extremely useful for many cases, such as passing gists around, writing small programs which might otherwise be written in shell scripts, etc.

[0] https://rust-lang.github.io/rfcs/3424-cargo-script.html

deepakjois 6 days ago|
C# too: https://devblogs.microsoft.com/dotnet/announcing-dotnet-run-...
rednafi 5 days ago||
> Before this I used to prefer Go for one-off scripts because it was easy to create a self-contained binary executable.

I still do because:

- Go gives me a single binary

- Dependencies are statically linked

- I don’t need any third-party libs in most scenarios

- Many of my scripts make network calls, and Go has a better stdlib for HTTP/RPC/Socket work

- Better tooling (built-in formatter, no need for pytest, go vet is handy)

- Easy concurrency. Most of my scripts don’t need it, but when they do, it’s easier since I don’t have to fiddle with colored functions, external libs, or, worse, threads.

That said, uv is a great improvement over the previous status quo. But I don’t write Python scripts for reasons that go beyond just tooling. And since it’s not a standard tool, I worry that more things like this will come along and try to “improve” everything. Already scarred and tired in that area thanks to the JS ecosystem. So I tend to prefer stable, reliable, and boring tools over everything else. Right now, Go does that well enough for my scripting needs.

istjohn 5 days ago||
I needed to process a 2 GB xml file the other day. While my Python script was chugging away, I had Claude translate it to Go. The vibe-coded Go program then processed the file before my original Python script terminated. That was the first time I ever touched Go, but it certainly won't be the last.
rednafi 5 days ago||
Go is pretty awesome. I’m sure that spending some time with the script would have made it at least 50 times faster than Python.
deepakjois 5 days ago|||
(author of post here)

I still use both Go and Python. But Python gives me access to a lot more libraries that do useful stuff. For example the YouTube transcript example I wrote about in the article was only possible in Python because afaik Go doesn't have a decent library for transcript extraction.

rednafi 5 days ago||
Yeah that's a fair point. I still do a ton of Python for work. The language is fine; it's mostly tooling that still feels 30 years old.
7bit 5 days ago||
Good for you. I don't see how this is relevant to the topic.
rednafi 5 days ago||
> Before this I used to prefer Go for one-off scripts because it was easy to create a self-contained binary executable.

Here's how it's relevant :)

4dregress 6 days ago||
I’ve been a python dev for nearly a decade and never once thought dep management was a problem.

If I’ve ever had to run a “script” in any type of deployed ENV it’s always been done in that ENVs python shell .

So I still don’t see what the fuss is about?

I work on a massive Python code base and the only benefit I've seen from moving to uv is that it has sped up dependency installation, which has had a positive impact on local and CI setup times.

bboygravity 5 days ago||
How did you tell other people/noobs to run your python code (or how did you run it yourself after 5+ years of not touching older projects)?
x187463 5 days ago||
run script

"missing x..."

pip install x

run script

"missing y..."

pip install y

> y not found

google y to find package name

pip install ypackage

> conflict with other package

realize I forgot a venv and have contaminated my system python

check pip help output to remember how to uninstall a package

clean up system python

create venv at cwd

start over

...

</end of time>

psunavy03 5 days ago|||
>realize I forgot a venv and have contaminated my system python

>check pip help output to remember how to uninstall a package

>clean up system python

>create venv at cwd

>start over

This hits disturbingly close to home.

whattheheckheck 5 days ago||
This is like seeing someone complain they have to turn their computer on to do work
SAI_Peregrinus 5 days ago|||
Thankfully some newer systems will error by default if you try to mess with them via pip instead of your system's package manager. Easy to override if you want to, and saves a lot of effort fixing accidental screw ups.
jshen 5 days ago|||
Python's dependency management has been terrible until very recently compared to nearly every other mainstream language.
rednafi 5 days ago|||
I guess this is why people need to get out of this “Python dev” or “JS dev” mindset and try other languages to see why those coming to Python complain so much about dependency management.

People complain because the experience is less confusing in many other languages. Think Go, Rust, or even JS. All the tooling chaos and virtual environment jujitsu are real deterrents for newcomers. And it’s not just beginners complaining about Python tooling. Industry veterans like Armin Ronacher do that all the time.

uv is a great step in the right direction, but the issue is that as long as the basic tooling isn’t built into the language binary, like Go’s tools or Rust’s Cargo, more tools will pop up and fragment the space even further.

mmcnl 5 days ago|||
Confusing is underselling it. That implies that Python dependency management is working fine, it's just complex. But it's not working fine: there's no such thing as lock files, which makes reproducible installs a gamble and not a given. For small scripts this is probably "okay", but if you're working in a team or want to deploy something on a server, then it's absolutely not fine because you want deterministic builds and that's simply impossible without a decent package manager.

Tools like uv solve the "it works on my machine" problem. And it's also incredibly fast.
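
i.e., roughly:

    uv lock            # resolve everything and pin it in uv.lock
    uv sync --frozen   # recreate exactly that environment elsewhere (CI, prod, a teammate's laptop)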

rednafi 5 days ago||
There is a lock file now.

https://packaging.python.org/en/latest/specifications/pylock...

The issue is that since there is no standardized build tool (pip and uv are both third party), there are a zillion ways of generating this lockfile, unlike with go.mod or cargo.toml. So it doesn't work in many scenarios and it's confusing as hell.

4dregress 5 days ago|||
My view is I'm an engineer first and foremost and I use the tools that are best for the task at hand. That also means what's best for the business in terms of others working on the project; in practice this has meant Python with some sort of framework.

People have suggested using other languages that might be faster, but the business always chooses what's best for everyone to work with.

rednafi 5 days ago||
Sure, it depends on the type and maturity of the business, as well as the availability of local talent. I've worked at three companies that started out with Python and Django, then transitioned to other technologies as the business scaled. In those environments, there were two kinds of developers: those who quickly adapted and picked up new languages, and those who wanted to remain "Python devs." The latter group didn’t have a great time moving forward.

What I don't like about the "Python + Framework + Postgres" argument is that it often lacks context. This is a formidable combination for starting a business and finding PMF. But unless you've seen Python and Postgres completely break under 100k RPS and petabyte-scale data, it's hard to understand the limitations just from words. Python is fantastic, but it has its limits and there are cases where it absolutely doesn't work. This “single language everywhere” mindset is how we ended up with JavaScript on the backend and desktop.

Anyone can write Python, and with LLMs, there's not much of a moat around knowing a single language. There's also no reason not to familiarize yourself with others, since it broadens your perspective. Of course, some businesses scale quite well with Python or JavaScript. But my point isn't to abandon Python. It's to gain experience in other languages so that when people criticize Python’s build tools, you can genuinely empathize with those concerns. Otherwise, comments like “Python tooling is fine” from people who have mostly worked with only Python are hard to take seriously.

petersellers 5 days ago|||
> it’s always been done in that ENVs python shell .

What if you don't have an environment set up? I'm admittedly not a python expert by any means but that's always been a pain point for me. uvx makes that so much easier.

kinow 5 days ago|||
I wrote PHP/JS/Java before Python. Been doing Python for nearly a decade too, and like 4dregress I haven't had the need to worry much about dep management. JS and PHP had all sorts of issues; Maven & Gradle are still the ones that gave me the least trouble. With Python I found that most issues could be fixed by finding the PEP that implemented what I needed, and by trying to come up with a simple workflow & packaging strategy.

Nowadays I normally use `python venv/bin/<some-executable>`, or `conda run -n <some-env> <some-executable>`, or package it in a Singularity container. And even though I hear a lot of good things about uv, given that my job uses public money for research, we try to use open source and standards as much as possible. My understanding is that uv is still backed by a company, and at least when I checked it some time ago (in PEP discussions & GH issues) they were not implementing the PEPs that I needed. Even if they did, we would probably still stay with simple pip/setuptools to avoid having to use research budget to update our build if the company ever changed its business model (e.g. what anaconda did some months/a year? ago).

Digressing: the Singularity container is useful for research & HPC too, as it creates a single archive, which is faster to load on distributed filesystems like the two I work on (GPFS & LustreFS) instead of loading many small files over network.

arcanemachiner 5 days ago|||
Create a virtual environment:

    python3 -m venv venv

Activate the virtual environment:

    source venv/bin/activate

Deactivate the virtual environment:

    deactivate

petersellers 5 days ago||
Or: `uvx ruff`

Which one is easier to run, especially for someone who doesn't use Python every day?

arcanemachiner 5 days ago||
The one they definitely won't have to re-learn in a few years.
fastasucan 4 days ago|||
Oh, I am fine re-learning an equivalent of `uvx ruff` every few years, that's not too bad.
petersellers 5 days ago|||
It's still easier if you use virtual environments so infrequently that you have to look up how to do it every time.
linsomniac 5 days ago|||
I've been a Python dev for nearly 3 decades and feel that uv is removing a lot of the rough edges around dependency management. So maybe "problem" is the wrong word; I've usually been able to solve dependency management issues without too much trouble, but I have also spent a significant amount of time dealing with them. For close to a decade I was managing other people's Python environments on production systems, and that was a big mess, especially trying to ensure that they stayed updated and secure.

If you don't see what the fuss is about, I'm happy for you. Sounds like you're living in a fairly isolated environment. But I can assure you that uv is worth a lot of fussing about, it's making a lot of our lives a lot easier.

mmcnl 5 days ago|||
Virtual environments alone are not enough. They don't guarantee deterministic builds. What do you do to ensure that your production environment runs the same code as your local dev environment? How do you solve that problem without dependency managers like uv or poetry?
bjornasm 4 days ago||
So if you have a big project that is 4 years old and you are going to run it in a new .venv, what do you do?
js2 6 days ago||
So far I've only run into one minor ergonomic issue when using `uv run --script` with embedded metadata which is that sometimes I want to test changes to the script via the Python REPL, but that's a bit harder to do since you have to run something like:

  $ uv run --python=3.13 --with-requirements <(uv export --script script.py) -- python
  >>> from script import X
I'd love if there were something more ergonomic like:

  $ uv run --with-script script.py python
Edit: this is better:

  $ "$(uv python find --script script.py)"
  >>> from script import X
That fires up the correct python and venv for the script. You probably have to run the script once to create it.
dkdcio 6 days ago||
I think you're looking for something like this (the important part being embedding a REPL call toward the end, after whatever setup): https://gist.github.com/lostmygithubaccount/77d12d03894953bc...

You can make `--interactive` or whatever you want a CLI flag from the script. I often make these small Typer CLIs with something like that (or in this case, in another dev script like this, I have `--sql` for entering a DuckDB SQL repl)

mayli 6 days ago||
you are welcome

    cat ~/.local/bin/uve
    #!/bin/bash
    # export the script's inline (PEP 723) deps to a temp requirements file,
    # then open the script in vim inside a matching environment
    temp=$(mktemp)
    uv export --script "$1" --no-hashes > "$temp"
    uv run --with-requirements "$temp" vim "$1"
    unlink "$temp"
jcotton42 5 days ago|||
If I may ask, why `unlink` instead of `rm`?
nomel 6 days ago|||
This is rather silly.
sambaumann 6 days ago||
Between yesterday's thread and this thread I decided to finally give uv a shot today - I'm impressed, both by the speed and how easy it is to manage dependencies for a project.

I think their docs could use a little bit of work; in particular, there should be a defined path to switch from a requirements.txt-based workflow to uv. I also felt it's a little confusing how to define a Python version for a specific project (it's defined in both .python-version and pyproject.toml).

tdhopper 6 days ago||
I write an ebook on Python Developer tooling. I've attempted to address some of the weaknesses in the official documentation.

How to migrate from requirements.txt: https://pydevtools.com/handbook/how-to/migrate-requirements....

How to change the Python version of a uv project: https://pydevtools.com/handbook/how-to/how-to-change-the-pyt...

Let me know if there are other topics I can hit that would be helpful!

wrboyce 6 days ago|||
This would’ve been really handy for me a few weeks ago when I ended up working this out for myself (not a huge job, but more effort than reading your documentation would’ve been). While I can’t think of anything missing off the top of my head, I do think a PR to uv to update the official docs would help a lot of folk!

Actually, I’ve thought of something! Migrating from poetry! It’s something I’ve been meaning to look at automating for a while now (I really don’t like poetry).

tdhopper 6 days ago||
https://pydevtools.com/handbook/how-to/how-to-migrate-from-p...
kstrauser 5 days ago||
You don't have to pip install it before calling uvx, do you?
7thpower 6 days ago||||
This is wonderful. When I was learning I found the documentation inadequate and gpt4 ran in circles as I did not know what to ask (I did not realize “how do I use uv instead of conda/pip?” was a fundamentally flawed question).
bormaj 6 days ago|||
This is a great resource, thank you for putting this together
oconnor663 6 days ago|||
> it's defined in both .python-version and pyproject.toml

The `requires-python` field in `pyproject.toml` defines a range of compatible versions, while `.python-version` defines the specific version you want to use for development. If you create a new project with uv init, they'll look similar (>=3.13 and 3.13 today), but over time `requires-python` usually lags behind `.python-version` and defines the minimum supported Python version for the project. `requires-python` also winds up in your package metadata and can affect your callers' dependency resolution, for example if your published v1 supports Python 3.[old] but your v2 does not.
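
Concretely, something like this in pyproject.toml, alongside a .python-version file that contains just e.g. `3.13` (versions here are only examples):

    [project]
    requires-python = ">=3.10"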

zahlman 6 days ago|||
> how to define a python version for a specific project (it's defined in both .python-version and pyproject.toml)

pyproject.toml is about allowing other developers, and end users, to use your code. When you share your code by packaging it for PyPI, a build backend (uv is not one, but they seem to be working on providing one - see https://github.com/astral-sh/uv/issues/3957 ) creates a distributable package, and pyproject.toml specifies what environment the user needs to have set up (dependencies and python version). It has nothing to do with uv in itself, and is an interoperable Python ecosystem standard. A range of versions is specified here, because other people should be able to use your code on multiple Python versions.

The .python-version file is used to tell uv specifically (i.e. nobody else) specifically (i.e., exact version) what to do when setting up your development environment.

(It's perfectly possible, of course, to just use an already-set-up environment.)

gschizas 6 days ago|||
> there should be a defined path to switch from a requirements.txt based workflow to uv

Try `uvx migrate-to-uv` (see https://pypi.org/project/migrate-to-uv/)

0cf8612b2e1e 6 days ago|||
I have never researched this, but I thought the .python-version file only exists to benefit other tools which may not have a full TOML parser.
zahlman 6 days ago||
Read-only TOML support is in the standard library since Python 3.11, though. And it's based on an easily obtained third-party package (https://pypi.org/project/tomli/).

(If you want to write TOML, or do other advanced things such as preserving comments and exact structure from the original file, you'll want tomlkit instead. Note that it's much less performant.)

furyofantares 6 days ago|||
Same, although I think it doesn't support my idiosyncratic workflow. I have the same files synced (via Dropbox at the moment) on all my computers, macOS and Windows and WSL alike, and I just treat every computer like it's the same computer. I thought this might be a recipe for disaster when I started doing it years ago, but I have never had problems.

Some stuff like npm or dotnet do need an npm update / dotnet restore when I switch platforms. At first attempt uv seems like it just doesn't really like this and takes a fair bit of work to clean it up when switching platforms, while using venvs was fine.

avianlyric 5 days ago||
You should probably look to have the uv-managed venvs completely excluded from being synced, forcing every machine to build its own venv. Given how fast and consistent uv is, there's no real reason to share the actual venvs between machines anymore.
furyofantares 5 days ago||
Thank you! I wrap all my tools in very simple shell+batch scripts anyway so just specifying a different venv for each does the trick.
mmcnl 5 days ago||
I agree the docs are not there yet. There is a lot of documentation but it's a description of all the possible options that are available (which is a lot). But it doesn't really tell me how to actually _use_ it for a certain type of workflow, or does a mediocre job at best.
tpoacher 6 days ago|
What's going on? This whole thread reads like paid amazon reviews
indosauros 6 days ago||
What's going on is "we have 14 standards so we need to create a 15th" actually worked this time
mmcnl 5 days ago|||
To be fair, I've used Poetry for years and it works/worked amazingly well. It's just not as fast as uv.
kibwen 6 days ago|||
It works far more of the time than people give it credit for. There are a lot of good XKCDs, but that one is by far the worst one ever made, as far as being a damaging meme goes.
hnfong 5 days ago|||
It's survival bias. You'd never see the confusion from would-have-failed standards-wannabes that xkcd927 helped prevent.
mturmon 6 days ago|||
"xkcd 927 Considered Harmful" ?
nickagliano 6 days ago||
Fantastic comment
kouteiheika 6 days ago|||
Because with uv, Python dependency management is finally not a shitshow. The faster we get everyone to switch, the better.
oblio 6 days ago|||
Occasionally the reviews match reality.