Posted by deepakjois 6 days ago
For me there used to be a clear delineation between scripting languages and compiled languages. Python has always seemed to want to be both and I'm not too sure it can really. I can live with being mildly wrong about a concept.
When Python first came out, our processors were 80486 at best and RAM was measured in MB at roughly £30/MB in the UK.
"For the longest time, ..." - all distros have had scripts that find the relevant Python or Java or whatevs so that's simply daft. They all have shebang incantations too.
So we now have uv written in Rust for Python. Obviously you should install it via a shell script directly from curl!
I love all of the components involved here but please for the love of a nod to security at least suggest that the script is downloaded first, looked over and then run.
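Something like this, instead (the URL is the one uv's own docs advertise; pick your pager of choice):

$ curl -fsSLO https://astral.sh/uv/install.sh
$ less install.sh    # actually read what it's about to do to your machine
$ sh install.sh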
I recently came across a Github hosted repo with scripts that changed Debian repos to point somewhere else and install ... software. I'm sure that's all fine too.
curl | bash is cute and easy and very, very insecure.
No? You can install it via pip.
https://github.com/InboraStudio/Proxmox-VGPU
Note the quite professional looking README.md and think about the audience for this thing - kittens hitting the search bong and trying to get something very complicated working.
Read the scripts: they are pretty short and could put your hypervisor in the hands of someone else who may not be too friendly.
Now pip has the same problem except you don't normally go in with a web browser first.
I raised an issue to at least provide a hint to casual browsers and also raised it with the github AI bottie complaint thang which doesn't care about you, me or anything else for that matter.
Not an expert, but I think there are performance gains to calling the binary directly rather than going through Python.
1) Subscribe to the GitHub repo for tag/release updates.
2) When I get a notification of a new version, I run a shell function (meup-uv and meup-ruff) which grabs the latest tag via a GET request and runs an install. I don't remember the semantics off the top of my head, but it's something like:
cargo install --jobs $(( $(nproc) / 2 )) --tag ${TAG} --git ${REPO} [uv|ruff]
Of course this implies I'm willing to wait the ~5-10 minutes for these apps to compile, along with the storage costs of the registry and source caches. Build times for ruff aren't terrible, but uv is a straight up "kick off and take a coffee break" experience on my system (it gets 6-8 threads out of my 12 total depending on my mood).

Eh. There's a lot of space in the middle to "well actually" about, but Python really doesn't behave like a "compiled" language. The more important question is: what do you ship to people, and how easily can they use it? Lots of people in this thread are bigging up Go's answer of "you ship a thing which can run immediately with no dependencies". For users that solves so many problems.
Quite a few Python use cases would benefit from being able to "compile" applications in the same sense. There are py-to-exe solutions, but they're not popular or widely used.
#!/usr/bin/env bash
# expose conda's activation machinery to this non-interactive shell
eval "$(conda shell.bash hook)"
conda activate myenv
python myscript.py
Admittedly this isn't self contained like the PEP 723 solution.

$ dnf search --cacheonly uv
Matched fields: name (exact)
uv.x86_64: An extremely fast Python package installer and resolver, written in Rust
As I've gotten older I've grown weary of third-party tools, and almost always try to stick with the first-party, built-in methods for a given task.
Does uv provide enough benefit to make me reconsider?
I don't know why there is such a flurry of posts since it's a tool that is more than a year old, but it's the one and only CLI tool that I recommend when Python is needed for local builds or on a CI.
Hatch was a good contender at the time but they didn't move fast enough, and the uv/ruff team ate everybody's lunch. uv is really good and IMHO it's here to stay.
Anyway, try it for yourself. It's not a high-level tool that hides everything; it's fast and powerful, and yet you stay in control. It feels like a first-party tool that could be included in the Python installer.
The answer is an unequivocal yes in this case. uv is on a fast track to become the de facto standard and relegate pip to the 'reference implementation' tier.
Try it for <20 minutes and, if you don't like it, leave it behind. Those 20 minutes include installation, setup, everything.
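A minimal way to spend those 20 minutes, assuming you already have pip around (these are uv's documented subcommands, but treat the exact flags as a sketch):

$ pip install uv
$ uv venv                      # creates .venv in the current directory
$ uv pip install requests      # drop-in pip interface, just much faster
$ uv run python -c 'import requests; print(requests.__version__)'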
If you use anything outside the standard library the only reliable way to run a script is installing it in a virtual environment. Doing that manually is a hassle and pyenv can be stupidly slow and wastes disk space.
With uv it's fast and easy to set up throwaway venvs or run utility scripts with their dependencies. With the PEP 723 scheme in the linked article, running a utility script is even easier, since its dependencies are self-declared and a virtual environment is automatically managed. It makes using Python for system scripting/utilities practical and helps deploy larger projects.
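For instance, with the metadata block shown further down the thread embedded in a (hypothetical) fetch.py, the whole lifecycle collapses to one command:

$ uv run fetch.py    # uv reads the inline "# /// script" block, builds a cached venv, installs the deps, then runs it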
Really? `apt install pipx; pipx install sphinx` (for example) worked flawlessly for me. Pipx is really just an opinionated wrapper that invokes a vendored copy of Pip and the standard library `venv`.
The rest of your post seems to acknowledge that virtual environments generally work just fine. (Uv works by creating them.)
> Sometimes you're limited by Python version and it can be non-trivial to have multiple versions installed at once.
I built them from source and make virtual environments off of them, and pass the `--python` argument to Pipx.
> If you use anything outside the standard library the only reliable way to run a script is installing it in a virtual environment. Doing that manually is a hassle and pyenv can be stupidly slow and wastes disk space.
If you're letting it install separate copies of Python, sure. (The main use case for pyenv is getting one separate copy of each Python version you need, if you don't want to build from source, and then managing virtual environments based off of that.) If you're letting it bootstrap Pip into the virtual environment, sure. But you don't need to do either of those things. Pip can install cross-environment since 22.3 (Pipx relies on this).
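A concrete sketch of that cross-environment install, using pip's --python option (the paths are made up):

$ python3 -m venv --without-pip ~/.venvs/demo        # don't bootstrap pip into the venv
$ pip --python ~/.venvs/demo/bin/python install requests
$ ~/.venvs/demo/bin/python -c 'import requests'      # the package landed in the target venv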
Uv does save disk space, especially if you have multiple virtual environments that use the same packages, by hard-linking them.
> With uv it's fast and easy to set up throw away venvs or run utility scripts with their dependencies easily. With the PEP-723 scheme in the linked article running a utility script is even easier since its dependencies are self-declared and a virtual environment is automatically managed.
Pipx implements PEP 723, which was written to be an ecosystem-wide standard.
How much you can benefit depends on your use case. uv is a developer tool that also manages installations of Python itself (and maintains separate environments for which you can choose a Python version). If you're just trying to install someone else's application from PyPI - say https://pypi.org/project/pycowsay/ as an example - you'll likely have just as smooth of an experience via pipx (although installation will be even slower than with pip, since it's using pip behind the scenes and adding its own steps). On the other hand, to my understanding, to use uv as a developer you'll still need to choose and install a build backend such as Flit or Hatchling, or else rely on the default Setuptools.
One major reason developers are switching to uv is lockfile support. It's worth noting here that an interoperable standard for lockfiles was recently approved (https://peps.python.org/pep-0751/), uv will be moving towards it, and other tools like pip are moving towards supporting it (the current pip can write such lockfiles, and installing from them is on the roadmap: https://github.com/pypa/pip/issues/13334).
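If I'm reading the pip changelog right, this surfaced as an experimental `pip lock` subcommand in 25.1; roughly:

$ pip lock -r requirements.txt    # writes a PEP 751 pylock.toml in the current directory
$ # installing *from* pylock.toml is the part that's still on the roadmap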
If you, like me, prefer to follow the UNIX philosophy, a complete developer toolchain in 2025 looks like the following (a worked example comes after the list):
* Python itself (if you want standalone binaries like the ones uv uses, you can get them directly; you can also build from source like I do; if you want to manage Python installations then https://github.com/pyenv/pyenv is solid, or you can use the multi-language https://asdf-vm.com/guide/introduction.html with https://github.com/asdf-community/asdf-python I guess)
* Ability to create virtual environments (the standard library takes care of this; some niche uses are helped out by https://virtualenv.pypa.io/)
* Package installer (Pip can handle this) and manager (if you really want something to "manage" packages by installing into an environment and simultaneously updating your pyproject.toml, or things like that; but just fixing the existing environment is completely viable, and installers already resolve dependencies for whatever it is they're currently installing)
* Build frontend (the standard is https://build.pypa.io/en/stable/; for programmatic use, you can work with https://pyproject-hooks.readthedocs.io/en/latest/ directly)
* Build backend (many options here - by design! but installers will assume Setuptools by default, since the standard requires them to, for backwards compatibility reasons)
* Support for uploading packages to PyPI (the standard is https://twine.readthedocs.io/en/stable/)
* Optional: typecheckers, linters, an IDE etc.
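Strung together, one release cycle with those pieces looks roughly like this (the project layout is assumed, not prescribed):

$ python -m venv .venv && . .venv/bin/activate
$ pip install build twine
$ pip install -e .          # develop against the working tree
$ python -m build           # produces dist/*.tar.gz and dist/*.whl via your chosen backend
$ twine upload dist/*       # push the artifacts to PyPI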
A user, on the other hand, only needs the following (end-to-end example after the list):
* Some version of Python (the one provided with a typical Linux distribution will generally work just fine; Windows users should usually just install the current version, with the official installer, unless they know something they want to install isn't compatible)
* Ability to create virtual environments and also install packages into them (https://pipx.pypa.io/stable/ takes care of both of these, as long as the package is an "application" with a defined entry point; I'm making https://github.com/zahlman/paper which will lift that restriction, for people who want to `import` code but not necessarily publish their own project)
* Ability to actually run the installed code (pipx handles this by symlinking from a standard application path to a wrapper script inside the virtual environment; the wrappers specify the absolute path to the virtual environment's Python, which is generally all that's needed to "use" that virtual environment for the program. It also provides a wrapper to run Pip within a specific environment that it created. PAPER will offer something a bit more sophisticated here, for both aspects.)
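End to end, with the pycowsay example from above:

$ pipx install pycowsay    # builds a private venv and symlinks the entry point onto PATH
$ pycowsay moo             # the wrapper script pins the venv's interpreter, so it just works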
uv is the magic that deals with all of the rough edges/environment stuff I usually hate in Python. All I need to do is `uv run myFile.py` and uv solves everything else.
(also: possible there's always been a way and I'm an idiot)
#!/usr/bin/env -S mise x xh jq fzf gum -- bash
# mise fetches xh, jq, fzf and gum, then hands the script to bash
todo=$(xh 'https://jsonplaceholder.typicode.com/todos' | jq '.[].title' | fzf)   # pick a todo title interactively
gum style --border double --padding 1 "$todo"                                    # render it in a double-bordered box
It makes throwing together bash scripts with dependencies very enjoyable.

Firstly: I have been an HN reader for a long time, and PEP 723 Python scripts are the one topic that gets to the top of the Hacker News leaderboard every time a new person discovers them.
I don't mean to discredit the author; his work is simple and clear to understand. I'm just sharing my thesis that if someone wants karma on Hacker News, for whatever reason, this might be the best topic. (Please don't pitchfork me, since I mean no offense to the author.)
Also, can anybody explain how to create that PEP 723 metadata with uv from just a Python script and nothing else? Some command that takes a Python script and adds the metadata block to it. I'm pretty sure uv has a flag for this, but I feel the author might have missed it. When writing one-off Python scripts with AI (Gemini), I always had to paste uv's documentation to get the PEP 723 part right. If anybody knows an easier way to create the metadata from the CLI, please tell me! Thanks in advance!!
$ touch foo.py
$ uv add --script foo.py requests
Updated `foo.py`
$ cat foo.py
# /// script
# requires-python = ">=3.13"
# dependencies = [
#     "requests",
# ]
# ///
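From there `uv run foo.py` does the rest. There's also an init helper if you're starting from a blank file; I believe the flags are as below, but double-check the docs:

$ uv init --script foo.py --python 3.13    # scaffolds the "# /// script" block for you
$ uv run foo.py                            # resolves the deps into a cached venv and executes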
But my tool was really finicky, and I guess it was built by AI, so, um, yeah. You can all try it; it's on PyPI. I think it has a lot of niche cases where it doesn't work. Maybe someone can modify it to make it better; I built it 3-4 months ago, if I remember correctly, and I have completely forgotten how things work in uv.
The downside is that there are a bunch of seemingly weird lines you have to paste at the beginning of the script :D
If anyone is curious, it's on PyPI (pysolate).
Not quite the same but interesting!