
Posted by ibraheemdev 6 hours ago

Astral to Join OpenAI (astral.sh)
https://openai.com/index/openai-to-acquire-astral/
930 points | 596 comments
NiloCK 5 hours ago|
A concern:

More and more plainly, OpenAI and Anthropic are making plays to own (and lease) the "means of production" in software. OK - I'm a pretty happy renter right now.

As they gobble up previously open software stacks, how viable is it that these stacks remain open? It seems perfectly sensible to me that these providers and their users alike have an interest in further centralizing the dev lifecycle - eg, if Claude-Code or Codex are interfaces to cloud devenvs, then the models can get faster feedback cycles against build / test / etc tooling.

But when the tooling authors are employees of one provider or another, you can bet that those providers will be at least a few versions ahead of the public releases of those build tools, and will enjoy local economies of scale in their pipelines that may not be public at all.

throwaway63467 5 hours ago||
It’s a small tool shop building a tiny part of the Python ecosystem; let’s not overstate their importance. They burned through their VC money and needed an exit, and CLI toolchains are hyped now for LLMs, but this mostly sounds like an acquihire to me. Dev tools are among the hardest things to monetize, with very few real winners, so good for them to get a good exit.
druml 4 hours ago|||
Small tool shop, burning VC money, true. "Tiny part of the Python ecosystem" is an understatement given how much impact uv has made alone.
rob 4 hours ago|||
Just a tiny project with over 100 million downloads every month, over 4 million every day. No big deal. Just a small shop, don't overstate its importance.

https://pypistats.org/packages/uv

FuckButtons 3 hours ago|||
Sure, but if tomorrow uv and ruff ceased to exist, we could all go back to any number of other solutions.
arw0n 54 minutes ago|||
Ruff is nice but not important; uv is one of the few things making the Python ecosystem bearable. Python is a language for monkeys, and if you don't give monkeys good tools, they will forever entangle themselves and you. It is all garbage wrapped in garbage. At least let me deploy it without having to manually detangle all that garbage by version.

I'm done pretending this is a "right tools for the right job" kind of thing; there are wrong people in the right job, and they only know Python. If no one self-writes code anymore anyway, at least use a language that isn't a clusterfuck of bad design decisions, and has 1 trillion lines of code in the collective memory of people who don't know what a stack is.

theptip 11 minutes ago||
I agree uv is great but let’s not get carried away here. Poetry is good, pip was fine for many use-cases after they added native lock files.
tomrod 3 hours ago||||
Maybe you could. I would stare longingly into the void, wondering if I could ever work on another Python project after having experienced uv, ruff, and ty.

Such an outcome would make me wonder about the wisdom of "It is better to have loved and lost than to have never loved at all."

signal11 2 hours ago|||
I was using poetry pretty happily before uv came along. I’d probably go back.

Note that uv is fast because — yes, Rust, but also because it doesn’t have to handle a lot of legacy that pip does[1], and some smart language independent design choices.

If uv became unavailable, it’d suck but the world would move on.

[1] https://nesbitt.io/2025/12/26/how-uv-got-so-fast.html

theLiminator 2 hours ago||
Maybe I could give up uv, but giving up ruff would suck.
crdrost 1 hour ago||
This is just the weirdest thread.

Like, the whole point of open source is that this thread is not a thing. The whole point is "if this software is taken on by a malevolent dictator for life, we'll just fork it and keep going with our own thing." Or like if I'm evaluating whether to open-source stuff at a startup, the question is "if this startup fails to get funding and we have to close up shop, do I want the team to still have access to these tools at my next gig?" -- there are other reasons it might be in the company's interests, like getting free feature development or hiring better devs, but that's the main reason it'd be in the employees' best interests to want to contribute to an open-source legacy rather than keep everything proprietary.

asa400 52 minutes ago||
The leadership and product direction work are at least as hard as the code work. Astral/uv has absolutely proven this; otherwise Python wouldn't be a boneyard for build tools.

Projects - including forks - fail all the time because the leadership/product direction on a project goes missing despite the tech still being viable, which is why people are concerned about these people being locked up inside OpenAI. Successfully forking is much easier said than done.

giancarlostoro 2 hours ago||||
It is an MIT licensed project, someone will absolutely fork it.
WesolyKubeczek 2 hours ago||
You seem to be underestimating people's laziness and overestimating their resolve. Angry forks usually don't last; angst doesn't prevent maintenance burnout.
giancarlostoro 1 hour ago|||
You underestimate the value that something like uv and company bring to the ecosystem. Given enough time, I could have seen it replacing some core utilities; now that it's owned by OpenAI, I don't see that happening, unless OpenAI "donates" the project but keeps the devs on its payroll.
deadbabe 2 hours ago|||
Maybe consider something other than python.
scuff3d 52 minutes ago||
Good luck with that. I haven't been successful at convincing anyone to move away from it. I'm so fucking sick of writing Python at work lol
giancarlostoro 2 hours ago||||
While I hope it never comes to that, all the code is MIT licensed; I would assume everyone would make the sensible decision to fork it.
alsetmusic 1 hour ago||||
I see Apache and MIT license files in their GitHub. What's to prevent the community from forking and continuing development if the licenses change?
eviks 1 hour ago||
The same things that prevented the "community" from building the tool in the first place.
zem 51 minutes ago|||
that makes zero sense to me. developing something like ruff from scratch takes a lot of things happening - someone having the idea, the time to develop it from scratch in their free time, or the money to do it as a job, and perhaps the need to find collaborators if it's too large a project for one person. but now ruff is there, there's no need to build it from scratch. if I wanted to build a python linter or formatter I would simply fork ruff and build on top of it. as others have said in this subthread, that's the whole point of open source!
johnisgood 1 hour ago|||
Can we not, at some point, consider the tool to be "done"? I mean, what is there to constantly change and improve? Genuinely curious. It sounds like a tool that can be finished. Can it not be?
influx 25 minutes ago||
You’d be surprised how many features the Python runtime adds each release. It’s not trivial for tooling to keep up with language changes.
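To make that treadmill concrete, a minimal sketch (my own example; the walrus operator is just one of many grammar additions every parser-based tool had to absorb):

```shell
# PEP 572 (Python 3.8) added assignment expressions -- new syntax that
# every linter, formatter, and type checker had to teach its parser.
OUT=$(python3 - <<'EOF'
if (n := 10) > 5:  # walrus operator: bind and test in one expression
    print(n)
EOF
)
echo "$OUT"
```

Match statements, PEP 695 type-parameter syntax, and f-string grammar changes repeat the same story release after release.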
wiseowise 19 minutes ago||||
I would just ditch Python, like I did 8 years ago.
crimsoneer 3 hours ago||||
Eurgh, I do not want to ever touch Poetry or pyenv again, thank you very much.
skywhopper 1 hour ago|||
I mean, if you believe the hype on this website, Claude Code could build a perfect clone of uv in a few hours using only the documentation.
joelthelion 1 hour ago||||
That says more about the sad state of modern CI pipelines than anything about uv's popularity.

Not disputing that it's a great and widely used tool, BTW.

johnisgood 1 hour ago||||
I do feel like it is overstated, and the number of downloads is not a good metric at all. There are npm packages with many millions of downloads, too.
throwaway63467 3 hours ago||||
The “requests” package gets downloaded one billion times every month; should that be a multi-billion-dollar VC company as well? uv is a package manager and other neat tooling. It's great, but it's hardly the essence of what makes Python awesome; it's one of the many things that make this ecosystem flourish. If OpenAI were to enshittify it, people would just fork or move on. That's all I'm saying: it's not in any way a single point of failure for the Python ecosystem.
druml 3 hours ago||
> the essence of what makes Python awesome

This is not the point of uv or any good package manager. The point is that it keeps Python from sucking. For a long time, package management in Python was horrible compared to what you could see in other languages.

LtWorf 46 minutes ago||||
It's not difficult to download something yourself 4 million times every day to look popular :)
skywhopper 1 hour ago|||
I mean, these sorts of numbers speak to the mind-bogglingly inefficient CI workflows we as an industry have built. I’d be surprised if there were 4 million people in the world who actually know what ‘uv’ is.
victorbjorklund 36 minutes ago||||
They have some nice ideas. But if they turn to shit you can just fork their tools and use that instead.
__MatrixMan__ 8 minutes ago||
Agreed.

Maybe there needs to be some nonprofit watchdog which helps identify those cases in their early stages and helps bootstrap open forks. I'd fund a sort of open-capture-protection savings account if I believed it would help ensure continuity of support for the things I rely on.

swexbe 46 minutes ago||||
VC money bailing out other VCs. A tale as old as time.
Hamuko 4 hours ago|||
Do you have any statistics for that?
jengland 4 hours ago|||
uv has almost 2x the number of monthly downloads Poetry has.

- https://pypistats.org/packages/poetry - https://pypistats.org/packages/uv

In the 2024 Python developer survey, 18% of the ecosystem used Poetry. When I opened this manifold question[0], I'm pretty sure uv was about half of Poetry downloads.

Estimating from these numbers, probably about 30% of the ecosystem is using `uv` now. We'll get better numbers when the 2025 Python developer survey is published.

Also see this: https://biggo.com/news/202510140723_uv-overtakes-pip-in-ci-u...

[0]: https://manifold.markets/JeremiahEngland/will-uv-surpass-poe...
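Spelling out that back-of-envelope estimate (the 18% figure and the 2x ratio are the numbers cited above; treating survey share as proportional to relative download volume is my own simplification):

```shell
OUT=$(python3 - <<'EOF'
poetry_share_2024 = 0.18  # Poetry's share in the 2024 Python developer survey
uv_vs_poetry_now = 2.0    # uv currently sees ~2x Poetry's monthly downloads
# If survey share scales with relative download volume:
uv_share = poetry_share_2024 * uv_vs_poetry_now
print(f"{uv_share:.0%}")
EOF
)
echo "$OUT"
```

Downloads overcount CI traffic, so rounding that down toward 30% seems fair.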

pm90 4 hours ago|||
anecdotally every place ive worked at has switched over and never looked back.
_moof 4 hours ago|||
Same. It's game-changing - leaps and bounds above every previous attempt to make Python's packaging, dependency management, and dev workflow easy. I don't know anyone who has tried uv and not immediately thrown every other tool out the window.
macNchz 3 hours ago||
I use uv here and there, but I have a bunch of projects using regular pip with pip-tools to do a requirements.in -> requirements.txt lockfile workflow that I've never seen enough value in converting over. uv is clearly much faster, but that's a pretty minor consideration unless I were for some reason changing project dependencies all day long.

Perhaps it never grabbed me as much because I've been running basically everything in Docker for years now, which takes care of Python versioning issues and caches the dependency install steps, so they only take a long time if they've changed. I also like containers for all of the other project setup and environment scaffolding stuff they roll up, e.g. having a consistently working GDAL environment available instantly for a project I haven't worked on in a long time.

shawnwall 4 hours ago|||
been in the python game a long time and i've seen so many tools in this space come and go over the years. i still rely on good ol pip and have had no issues. that said, we utilize mypy and ruff, and have moved to pyproject etc to remotely keep up with the times.
jitl 4 hours ago|||
uv solved it; it will be the only tool people use in 2 more years. If you're a Python shop / expert then you can do pip etc., but uv turned incidental Python + deps from a huge PITA for the rest of us into It Just Works simplicity on the same level as, or better than, Golang.
pdntspa 3 hours ago|||
Then can they please figure out some way of invoking it that doesn't require prefixing everything with 'uv'?
arw0n 33 minutes ago|||
For any command, you can create an 'alias' in your shell config. That way you can get rid of the prefix.
maleldil 2 hours ago||||
You can source the virtualenv like normal.
cjp 3 hours ago||||
direnv, .envrc, "layout uv"

https://github.com/direnv/direnv/wiki/Python#uv

tomrod 3 hours ago||||
alias in ~/.zshrc?
malcolmgreaves 3 hours ago|||
uv run bash/zsh/your shell of choice
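All of these routes boil down to putting the project venv's bin/ first on PATH. A minimal sketch of the activation route (demo-venv here is a stand-in for the .venv that uv manages):

```shell
# Activating a virtualenv prepends its bin/ to PATH, so plain `python`
# and installed console scripts resolve inside it -- no `uv` prefix needed.
python3 -m venv --without-pip demo-venv   # stand-in for a uv-managed .venv
. demo-venv/bin/activate
PY_PATH=$(command -v python)              # now resolves inside demo-venv
deactivate
rm -rf demo-venv                          # clean up the demo environment
echo "$PY_PATH"
```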
1718627440 4 hours ago|||
I don't want software on my computer that just downloads and installs random stuff. That is the job of the OS, in particular the package manager.
wiseowise 11 minutes ago|||
Don't worry, gramps, pip won't trigger your tinfoil hat.
zbentley 3 hours ago||||
Do you not use non-OS package managers?

If not, do you develop software with source dependencies (go, java, node, rust, python)? If so, how do you handle acquiring those dependencies—by hand or using a tool?

1718627440 3 hours ago|||
> Do you not use non-OS package managers?

Mostly no, sometimes I give up and still use pip as a separate user.

> If not, do you develop software with source dependencies (go, java, node, rust, python)? If so, how do you handle acquiring those dependencies—by hand or using a tool

I haven't felt the need to use Go, and the only Java software I use is in the OS repo. I don't want to use JS software for other reasons. This is one of the reasons why I don't like Rust rewrites. Python dependencies are very often in the OS repo. If there is anything else, I compile it from source, and I curse when software doesn't use or adhere to the standards of the GNU build system.

maleldil 2 hours ago||
I hope you understand you are part of a very, very small minority.
LtWorf 43 minutes ago|||
Personally I run "apt install whateverineed"
QuantumNomad_ 3 hours ago||||
What’s the point of constraining oneself to what is in the OS package manager? I like to keep my dependencies up to date. The versions in the OS package manager are much older.

And let’s say you constrain yourself to your OS package manager. What about the people on different distros? Their package managers are unlikely to have the exact same versions of your deps that your OS has.

1718627440 3 hours ago||
> What’s the point of constraining oneself to what is in the OS package manager? I like to keep my dependencies up to date. The versions in the OS package manager are much older.

I favor stability and the stripping of unwanted features (e.g. telemetry) by my OS vendor over cutting-edge software. If I really need that, I install it into /usr/local; that is what it is for, after all.

> And let’s say you constrain yourself to your OS package manager. What about the people on different distros? Their package managers are unlikely to have the exact same versions of your deps that your OS has.

This is a reason to select the OS. Software shouldn't require exact versions, but should stick to stable interfaces.

mirekrusin 4 hours ago||||
Then don't use it?
maccard 3 hours ago|||
Do you use pip?
tomrod 3 hours ago|||
Geospatial tends to be the Achilles heel of Python projects for me. Fiona is a wily beast of a package, and GDAL too. Conda helped some but was always so slow. Pip almost uniformly fails in this area for me.
crimsoneer 2 hours ago||
Yup, the fact that uv just installed geopandas out of the box with no issues blew my mind.
woodruffw 3 hours ago||||
As a point of information: Astral did not, in fact, burn through its VC money. I agree that dev tools are difficult to monetize, though.

(Source: I'm an Astral employee.)

nullhole 3 hours ago|||
> As a point of order: Astral did not, in fact, burn through its VC money.

That's a point of information, not a point of order.

jdgoesmarching 1 hour ago|||
Is pointing out the incorrect use of a point of order itself a point of order, or also a point of information?
woodruffw 3 hours ago|||
You're right, I've edited it.
benterix 3 hours ago||||
Finally, someone competent to answer the crucial question. Taking into account the enormous amount of excellent work you did, and the fact that dev tools are hard to monetize, what was your strategy?
woodruffw 3 hours ago|||
You can find some resources on our strategy in previous blog posts, like this one on pyx[1].

[1]: https://astral.sh/blog/introducing-pyx

ontouchstart 7 minutes ago||
Are you going to join the codex team as well? I am curious how the codex code base will evolve after you guys join. It is going to affect Python/Rust toolchains tremendously.
cj 3 hours ago|||
Check this HN thread from 8 months ago: https://news.ycombinator.com/item?id=44358216
OriginalMrPink 3 hours ago|||
[dead]
19205817 47 minutes ago||||
They were hyped here without any pushback. Maybe OpenAI thinks the Astral folks will now evangelize and foist Codex and ChatGPT onto the open source "community".

People need to be very careful about resisting. OpenAI wants to make everyone unemployed, works with the Pentagon, steals IP, and copyright whistleblowers end up getting killed under mysterious circumstances.

insane_dreamer 5 minutes ago||
given that they delivered the goods I would not say they were "hyped"
anentropic 52 minutes ago||||
That was my feeling - more than 'owning' uv etc., I could see this as being about getting people on board who have a proven track record of delivering developer tooling that was loved enough to get wide adoption.
giancarlostoro 2 hours ago||||
> Dev tools are among the hardest things to monetize with very few real winners, so good for them to get a good exit.

I'm on the fence about cancelling my JetBrains subscription I've had for nearly 10 years now. I just don't use it much. Zed and Claude Code cover all my needs, the only thing I need is a serious DataGrip alternative, but I might just sit down with Claude and build one for myself.

__mharrison__ 3 hours ago||||
uv is the best thing to happen to package management in Python.

It's not perfect, but it is light-years better than what preceded it.

I jumped ship to it and have not looked back. (So have many of my clients).

gigatexal 3 hours ago||||
Uv is the de facto way to do projects. Ty is really, really good. Ruff is the de facto linter. I mean, they've earned a lot of clout.
scuff3d 53 minutes ago||||
That's kind of like saying Cargo is a small part of the Rust ecosystem.

It's not there yet, but it's getting there.

raincole 2 hours ago||||
> tiny part of the Python ecosystem

https://xkcd.com/2347/

throwaw12 4 hours ago|||
uv and ruff are not a tiny part anymore; they're growing fast.
Syntaf 3 hours ago||
Not to mention their language server + type checker `ty` is incredible. We moved our extremely large python codebase over from MyPy and it's an absolute game changer.

It's so fast, in fact, that we just added `ty check` to our pre-commit hooks, where MyPy previously had runtimes of 150+ seconds _and_ a mess of bugs around its caching.

ren_engineer 46 minutes ago|||
Somebody looked at Claude Code's binaries: Anthropic is testing out its own app platform called antspace. Not sure why people are shocked; they've been cloning features of their API customers and adding them to their core products since day 1. It makes sense that they'll take user data and do the same for Claude Code, copying features or buying up what developers are using so they can lock people into a stack. These are the same people that trained on every scrap of data they could get their hands on and now complain about others distilling models from their output.

https://x.com/AprilNEA/status/2034209430158619084

Ironically this type of stuff really makes me doubt their AGI claims, why would they bother with this stuff if they were confident of having AGI within the next few years? They would be focused on replacing entire industries and not even make their models available at any price. Why bother with a PaaS if you think you are going to replace the entire software industry with AGI?

Frieren 43 minutes ago||
> they've been cloning features of their API customers and adding them to their core products since day 1

Is this not just the strategy of all platforms? Spy on all customers, see what works for them, and copy the most valuable business models. Amazon does that with all kinds of products.

Platforms will just grow to own the whole market, hike prices, lower quality, and pay close to nothing to employees. This is why we used to have monopoly regulations, before being greedy became a virtue.

volkercraig 5 hours ago|||
It's not any different from the launch of the FSF. There's a simple solution. If you don't want your lunch eaten by a private equity firm, make sure whatever tool you use is GPL licensed.
palmotea 4 hours ago|||
> If you don't want your lunch eaten by a private equity firm, make sure whatever tool you use is GPL licensed.

1. For the record: the GPL is entirely dependent on copyright.

2. If AI "clean-room" re-implementations are allowed to bypass copyright/licenses, the GPL won't protect you.

shimman 50 minutes ago|||
"Clean room" is doing a lot of heavy lifting here. Given that LLMs are trained on the entire corpus of human knowledge, how can you honestly argue in court that this is a purely clean-room implementation?

This is right up there with Meta lawyers claiming that when they torrent it's totally legal but when a single person torrents it's copyright infringement.

goku12 4 hours ago||||
> If AI "clean-room" re-implementations are allow to bypass copyright/licenses, the GPL won't protect you.

Isn't that the same for the obligations under BSD/MIT/Apache? The problem they're trying to address is a different one from the problem of AI copyright washing. It's fair to avoid introducing additional problems while debunking another point.

islandfox100 4 hours ago||||
Maybe I'm reading wrong here, but what's the implication of the clean room re-implementations? Someone else is cloning with a changed license, but if I'm still on the GPL licensed tool, how am I "not protected"?
darkwater 4 hours ago|||
1. Company A develops Project One as GPLv3

2. BigCo buys Company A

3a. usually here BigCo would either continue to develop Project One as GPLv3, or stop working on it, and the community would fork it and continue working on it as GPLv3

3b. BigCo does a "clean-room" reimplementation of Project One and releases it under proprietary licence. Community can still fork the older version and work on it, but BigCo can continue to develop and sell their "original" version.

bloppe 4 minutes ago|||
As a real world example, Redis was both Company A and BigCo. Project One is now ValKey.
makapuf 3 hours ago|||
2. BigCo owns Project One now

3a. BigCo is now free to release version N+1 as closed source only

3b. Community can still fork the older version and work on it, but BigCo can continue to develop and sell their original version
eru 4 hours ago|||
There's basically no difference between GPL and BSD in that case.
worldsayshi 3 hours ago|||
If clean-room re-implementations are allowed to bypass copyright/licenses (software) copyright is dead in general?
justcool393 1 hour ago||
well no, (clean room) reimplementations of APIs have been done since time immemorial. copyright applies to the work itself: if you independently implement the functionality of X, software copyright protects both works separately!

patents protect ideas, copyright protects artistic expressions of ideas

munk-a 38 minutes ago||||
A GPL license helps but if support for a dependency is pulled you'll likely end up needing to divert more resources to maintain it anyways. There really isn't any guarantee against this cost - you either pay someone else to maintain it and hope they do a good job, build it in house and become an "also that thing" company, or follow a popular project without financially supporting it and just hope other people pick up your slack.

Preferring GPL licensed software means that you're immune to a sudden cut off of access so it's always advisable - but it's really important to stay on top of dependencies and be willing to pay the cost if support is withdrawn. So GPL helps but it isn't a full salve.

dirkc 3 hours ago||||
While the license is important, it's the community that plays the key role for me. VC-funded open source is not the same as community-developed open source. The first can very quickly disappear because of something like an acquihire; the second has more resilience and tends to either survive and evolve, or peter out as the context changes.

I'm careful to not rely too heavily on VC funded open source whenever I can avoid it.

petcat 4 hours ago|||
The biggest scam the mega-clouds and the Githubs ever pulled was convincing open source developers that the GPL was somehow out of vogue and BSD/MIT/Apache was better.

All so they could just vacuum it all up and resell it with impunity.

kjksf 4 hours ago|||
I don't remember GitHub or Amazon advocating MIT over GPL.

Feel free to prove me wrong by pointing out this massive amount of advocacy from "mega-clouds" that changed people's minds.

The ads, the mailing list posts, social media comments. Anything at all you can trace to "mega-clouds" execs.

colesantiago 4 hours ago||
https://choosealicense.com/

https://choosealicense.com/about/

> "GitHub wants to help developers choose an open source license for their source code."

This was built by GitHub Inc a very very long time ago.

Master_Odin 55 minutes ago|||
The site puts the MIT and GPLv3 front and center with a nice quick informative blurb on them. How are they pushing MIT over GPL?
supern0va 3 hours ago||||
>This was built by GitHub Inc a very very long time ago.

So long ago, in fact, that it was five years before their acquisition by Microsoft.

thayne 2 hours ago|||
I don't see anything on there saying that non-copyleft licenses are better, unless you are in an ecosystem that prefers a different license.
leetrout 4 hours ago||||
I remember a somewhat prominent dev in the DC area putting on Twitter around 2012 or so something like "I do plenty of open source coding and I don't put a fucking license on it" and it stuck with me for all these years that it was a weird stance to take.
roryirvine 2 hours ago|||
Dan Bernstein took that attitude back in the 90s - I think his personal theory of copyright went something like "if it doesn't have a license, then it's obviously public domain", which ran counter to the mainstream position of "if it doesn't have a license, then you have to treat it as proprietary".

And, sure, djb wasn't actually likely to sue you if you went ahead and distributed modified versions of his software... but no-one else was willing to take that risk, and it ended up killing qmail, djbdns, etc stone dead. His work ended up going to waste as a result.

chuckadams 2 hours ago|||
I doubt the lack of license was the reason DJB's projects didn't take over the world. Most of them required heavy forking to break away from hardwired assumptions about the filesystem and play nice with the OS distribution, and DJB is himself notoriously difficult to work with. Still, qmail managed to establish maildir as the standard format and kill off mbox, and for that alone I'm eternally grateful.
roryirvine 2 hours ago||
Well, there were always plenty of patches available - it's just that lots of them conflicted with each other, and that was a product of the licensing.

Agreed with the rest, though. I relied heavily on qmail for about a decade, and learned a lot from the experience, even if it was a little terrifying on occasion!

chuckadams 1 hour ago||
These days one would just most likely create a fork on github. Vim was also maintained through separate patches for a long time, but Bram was a lot more accepting about integrating and distributing those patches himself.
12_throw_away 1 hour ago|||
> his personal theory of copyright went something like "if it doesn't have a license, then it's obviously public domain"

I mean philosophically and morally, sure, one can take that position ... but copyright law does not work like that, at least not for anything published in the US after 1989 [1].

[1] https://www.copyright.gov/circs/circ03.pdf

skeeter2020 4 hours ago|||
John Carmack said that about a week ago.
xorcist 3 hours ago||||
The big cloud providers are perfectly happy to use GPL'd stuff (see: Elastic, MySQL). They don't need to use embrace-and-extend, they're content with hosting.

The ones pushing for permissive licenses are rather companies like Apple, Android (and to some extent other parts of Google), Microsoft, Oracle. They want to push their proprietary stuff and one way to do that in the face of open source competition is by proprietary extensions.

tomnipotent 2 hours ago||
> ones pushing for permissive licenses are rather companies like Apple, Android

The FOSS community at large embraced permissive licenses and it had nothing to do with the interests of big corporations.

benterix 3 hours ago||||
You probably mean AGPL. Companies hated GPL from the start and nothing has changed to this day. But the cloud is specifically against AGPL.
eru 4 hours ago|||
Huh? When you deploy something in the cloud, you don't have to share your GPL'ed stuff either. Google doesn't.
munk-a 44 minutes ago|||
I think the good news here is that since OpenAI is a zombie company at this point this particular acquisition shouldn't be too concerning - and from what I've seen Anthropic has been building out in a direction of increased specialization. That said vertical integration is as much of a problem as it always was and it'd be excellent to see some sane merger oversight from the government.
ffsm8 34 minutes ago|||
Hmm, from my perspective, an essential step to legitimize "vibecoding" in an enterprise setting is to have a clearly communicated best practice - and to have the LLM be hyper-optimized for that setting.

Like having a system prompt which takes care of the project structure, languages, libraries etc

It's pretty much the first step to replacing devs, which is their current "North Star" (to be changed to the next profession after)

Once they've nailed that, devs become even more of a tool than they already are (from the perspective of the enterprise).

rTX5CMRXIfFG 5 hours ago|||
If it ever goes bad, well, I hope that's an impetus for new open source projects to be started, with improvements over and lessons learned from incumbent technologies right at the v1 of said projects.
Maxion 5 hours ago|||
If LLMs turn out to be such a force multiplier, the way to fight it is to ensure that there are open source LLMs.
captainbland 4 hours ago|||
I think the issue is that LLMs are a cash problem as much as they are a technical problem. Consumer hardware architectures are still pretty unfriendly to running models that are actually competitive, so if you want to do inference on a model that will reliably give you decent results, you're basically in enterprise territory. Unless you want to do it really slowly.

The issue that I see is that Nvidia etc. are incentivised to perpetuate that so the open source community gets the table scraps of distills, fine-tunes etc.

butlike 4 hours ago||
You got me thinking that what's going to happen is some GPU maker is going to offer a subsidized GPU (or RAM stick, or ...whatever) if the GPU can do calculations while your computer is idle, not unlike Folding@home. This way, the company can use the distributed fleet of customer computers to do large computations, while the customer gets a reasonably priced GPU again.
vlovich123 3 hours ago||
The kinds of GPUs in use in enterprise cost $30-40k and require a ~10 kW system. The challenge with lower-power cards is that 30 $1k cards are not as powerful, especially since you usually have several enterprise cards in a single unit, joined efficiently via a high-bandwidth link. But even if someone else is paying the utility bill, what happens when the person you gave the card to just doesn't run the software? Good luck getting your GPU back.
nunez 41 minutes ago||||
Open-source models will never be _truly_ competitive as long as obtaining quality datasets and training on them remains prohibitively expensive.

Plus, most users don't want to host their own models. Most users don't care that OpenAI, Anthropic and Google have a monopoly on LLMs. ChatGPT is a household name, and most of the big businesses are forcing Copilot and/or Claude onto their employees for "real work."

This is "everyone will have an email server/web server/Diaspora node/lemmy instance/Mastodon server" all over again.

fnordpiglet 5 hours ago||||
The problem is that even if an OSS project had the resources (massive data centers the size of NYC packed with top-end custom GPU kit) to produce the weights, you need enormous VRAM-laden farms of GPUs to do inference on a model like Opus 4.6. Unless the very math of frontier LLMs changes, don't expect OSS models on par with the frontier to be practical.
lukeschlather 3 hours ago|||
I feel like you're overstating the resources required by a couple orders of magnitude. You do need a GPU farm to do training, but probably only $100M, maybe $1B of GPUs. And yes, that's a lot of GPUs, but they will fit in a single datacenter, and even in dollar terms, there are many individual buildings in NYC that are cheaper.
palmotea 4 hours ago||||
> you need enormous VRAM laden farms of GPUs to do inference on a model like Opus 4.6.

It's probably a trade secret, but what's the actual per-user resource requirement to run the model?

supern0va 3 hours ago|||
There's already an ecosystem of essentially undifferentiated infrastructure providers that sell cheap inference of open weights models that have pretty tight margins.

If the open weights models are good, there are people looking to sell commodity access to it, much like a cloud provider selling you compute.

runarberg 5 hours ago||||
That would be accepting the framing of your class enemy, there is no reason to do that.
metalliqaz 5 hours ago|||
unless they are also pirate LLMs, I don't see how any open source project could have pockets deep enough for the datacenters needed to seriously contend
bix6 5 hours ago||||
If it goes bad? It’s too late by that point. And how is open source going to compete with billions of investment dollars?
darth_avocado 5 hours ago||
If AI tools are as good as the CEOs claim, we should have no friction towards building multiple open source alternatives very quickly. Unless of course, they aren’t as good as they are being sold as, in which case, we have nothing to worry about.
hot_iron_dust 5 hours ago|||
What would the new open source projects do differently from the "old" ones? I don't think you can forbid model training on your code if your project is open source.
pixelsort 3 hours ago|||
In the many darker timelines that one can extrapolate, capturing essential tech stacks is just a pre-cursor to capturing hiring.

Once we start seeing OpenAI and Anthropic getting into certifications and testing, they'll quickly become the gold standard. They won't even need to actually test anyone. People will simply consent to having their chat interactions analyzed.

The models collect more information about us than we could ever imagine because, definitionally, those features are unknown unknowns for humans. For ML, the gaps in our thinking carry far richer information about us than our actual vocabularies, topics of interest, or stylometric idiosyncrasies.

echelon 2 hours ago||
As if there will be hiring in the fullness of time.

There will come a day when you can will an entire business into existence at the press of a button. Maybe it has one or two people overseeing the business logic to make sure it doesn't go off the rails, but the point is that this is a 100x reduction in labor and a 100,000x speed up in terms of delivery.

They'll price this as a $1M button press.

Suddenly, labor capital cannot participate in the market anymore. Only financial capital can.

Suddenly, software startups are no longer viable.

This is coming.

The means of production are becoming privatized capital outlays, just like the railroads. And we will never own again.

There is nothing that says our careers must remain viable. There is nothing that says our output can remain competitive, attractive, or in demand. These are not laws.

Knowledge work may be a thing of the past in ten years' time. And the capital owners and hyperscalers will be the entirety of the market.

If we do not own these systems (and at this point is it even possible for open source to catch up?), we are fundamentally screwed.

I strongly believe that people not seeing this - downplaying this - are looking the other way while the asteroid approaches.

This. Is. The. End.

pixelsort 2 hours ago|||
There could be opportunities we haven't anticipated.

What if labor organizes around human work and consumers are willing to pay the premium?

At that point, it's an arms race against the SotA models in order to deepen the resolution and harden the security mechanisms for capturing the human-affirming signals produced during work. Also, lowering the friction around verification.

In that timeline, workers would have to wear devices to monitor their GSR and record themselves on video to track their PPG. Inconvenient, and ultimately probably doomed, but it could extend or renew the horizon for certain kinds of knowledge work.

vineyardmike 1 hour ago||
> What if labor organizes around human work and consumers are willing to pay the premium?

We could start today, but sweat shops and factories dominate the items on our shelves.

But I’m sure people will draw the line at human made software…/s

nunez 47 minutes ago|||
And never forget that we collectively cheered it on as the asteroid's crater got deeper and wider.
charcircuit 10 minutes ago|||
As the cost of software trends toward $0, I don't see how one can realistically own "the" means of production rather than "a" means. Any competitor can generate a similar product cheaply.
cube2222 5 hours ago|||
Honestly, for now they seem to be buying companies built around Open Source projects which otherwise didn't really have a good story to pay for their development long-term anyway. And it seems like the primary reason is just expertise and tooling for building their CLI tools.

As long as they keep the original projects maintained and those aren't just acqui-hires, I think this is almost as good as we can hope for.

(thinking mainly about Bun here as the other one)

bix6 5 hours ago||
And how likely is that?

Once you’re acquired you have to do what the boss says. That means prioritizing your work to benefit the company. That is often not compatible with true open source.

How frequently do acquired projects seriously maintain their independence? That is rare. They may have more resources but they also have obligations.

And this doesn’t even touch on the whole commodification and box out strategy that so many tech giants have employed.

getpokedagain 1 hour ago|||
Stop using MIT licensed software being run by small vc backed operations if you value stability. They are risky and often costly Trojan horses.
brabel 1 hour ago||
What do you mean? MIT is essentially as open as you can get. The worst that can happen is that they will relicense, eventually, to force big users to pay, but when that happens everybody knows how it goes: some consortium of other big companies forks it and continues development as if nothing happened.
TrackerFF 4 hours ago|||
But how does this work out in the long run, in the case of AGI?

If AGI becomes available, especially at the local, open-source level, shouldn't all of this be democratized, meaning the AGI can simply roll out the tooling you need?

After all, AGI is what all these companies are chasing.

butlike 4 hours ago||
Let us assume AGI never comes. I don't plan scenarios for when aliens land, why should I for AGI? It's not particularly close.
nazgulnarsil 2 hours ago|||
it never made sense to have devs all over the world doing the same task with tiny variation. Centralization was inevitable. LLMs might have been a step change but the trajectory was already set.
Zopieux 53 minutes ago||
The exact opposite has started: every single developer with an LLM subscription now has 45 variations of any foundational tool and library, to cater to their weird use-case, because it was easier for the LLM to just modify it rather than adapting to it. Almost nobody upstreams such improvements (or they are too niche anyway).

The ecosystem will be this way for a while, if not the new normal.

butlike 4 hours ago|||
If it becomes too antagonistic, people will change. The desire to build things is larger than any given iron fist du jour. Just ask Oracle or IBM.
goku12 4 hours ago||
Could you say the same about the Chrome browser? Google is using it to EEE the web (Embrace, Extend and Extend it till it's a monstrosity that nobody else can manage). That's pretty antagonistic. But did people change?
butlike 3 hours ago||
Sample size: 1, but I use Arc browser. It's still Chromium under the hood (and in maintenance mode now), though it's actually pretty good, and last I checked it had most of the baked-in Google stuff toggled off by default
andrepd 2 hours ago|||
"Bearded German philosopher" once again being uncannily applicable to 21st century happenings...
bargainbin 3 hours ago|||
This is the logical conclusion for most open source tools in a capitalist economy; it's been this way for decades.

Equivalent or better tools will pop up eventually, heck if AI is so fantastic then you could just make one of your own, be the change you want to see in the world, right?

justinhj 3 hours ago|||
These are MIT/Apache 2. Sure they can buy and influence the direction but they can't prevent forks if they stray from what users want.
dismalaf 3 hours ago|||
Of course they're trying to capture existing tech stacks. The models themselves are plateauing (most advancement is coming from the non-LLM parts of the software), they took too much VC money so they need to make some of it back. So gobbling up wafers, software, etc... is the new plan for spending the money and trying to prevent catastrophic losses.
AndrewKemendo 4 hours ago|||
Explain to me how this is any different than Microsoft, Blackrock, Google, Oracle, Berkshire or any other giant company acquiring their way to market share?
gigatexal 3 hours ago|||
If our corporate overlords are gonna buy up all that is good, I'd rather it have been Anthropic and not that weirdo humans-need-food-and-care-for-inference-so-LLMs-aren't-that-power-hungry Sam Altman. Man, that guy is weird.

Oh well. They’ll hopefully get options and make millions when the IPO happens. Everyone eventually sells out. Not everyone can be funded by MIT to live the GNU maximalist lifestyle.

paseante 2 hours ago|||
[dead]
devnotes77 5 hours ago|||
[dead]
mountainriver 2 hours ago||
The strangest part is that Python is effectively a dead language because of agentic coding.

Why on earth would agents ever code in as terrible a language as Python when the cost of significantly better languages is essentially free? The only advantage Python ever had was that it was easy to write.

stuxnet79 2 hours ago|||
> cost of significantly better languages is essentially free

Is it? We still need meatspace humans to vet what these AI agents produce. Languages like C++ and Rust still require huge cognitive overhead relative to Python, and that will not change anytime soon.

Unless the entire global economy can run on agents with minimal human supervision someone still has to grapple with the essential complexity of getting a computer to do useful things. At least with Python that complexity is locked away within the CPython interpreter.

Also an aside, when has a language ever gotten traction based solely on its technical merits? Popularity is driven by ease-of-use, fashion, mindshare, timing etc.

amunozo 1 hour ago||||
Isn't that also an advantage for LLMs? Apart from more available data.
vaylian 51 minutes ago||||
What language is universally better than Python? I don't think Python is perfect, but it is definitely one of the best languages out there. It is elegant and it has a huge ecosystem of libraries, frameworks, and tutorials. There is a lot of battle-tested software in Python that is running businesses.
kevin42 2 hours ago||||
That's an interesting take, but I'm not sure 'easy to write' is the only advantage.

There is also a really good ecosystem of libraries, especially for scientific computing. My experience has been that Claude can write good C++ code, but it's not great at optimization. So curated Python code can often be faster than an AI's reimplementation of an algorithm in C++.

sho_hn 2 hours ago||||
Your stance is aggressive and provocative, but no less so than the challenge AI poses to software developers in general. I think what you say should be seriously entertained.

And as someone who loves Python and has written a lot of it, I tend to agree. It's increasingly clear the way to be productive with AI coding and the way to make it reliable is to make sure AI works within strong guardrails, with testsuites, etc. that combat and corral the inherent indeterminism and problems like prompt injection as much as possible.

Getting help from the language - having the static tooling be as strict and uncompromising as possible, and delegating having to deal with the pain to AI - seems the right way.

arw0n 2 hours ago||||
Let's see how it plays out. My current assumption is that degrees and CVs will become more important in the workplace. Things like good architecture, maintainability, coherence, they are all hard to measure. A true 10x developer without a college degree will lose to the PhD without any hard skills. And these types only speak python, so they will instruct the AI to type python. Or maybe they'll vibecode rust and elixir, I don't know. But the cynic in me strongly thinks this will make all our bullshitty jobs way more bullshitty, and impostors will profit the most.
jakeydus 2 hours ago||||
I feel like this is a relatively hot take. Python has advantages beyond being easy to write. It's simple. It can do just about anything any other language can do. It's not the most performant on its own, but it's performant enough for 99% of use cases, and in the 1% you can write a new C library or use an existing one instead. Its simplicity and ease of adoption make Python very well represented in the training data.

If I ask an LLM or agentic AI to build something and don't specify what language to use, I'd wager that it'll choose python most of the time. Casual programmers like academics or students who ask ChatGPT to help them write a function to do X are likely to be using Python already.

I'm not a Python evangelist by any means but to suggest that AI is going to kill Python feels like a major stretch to me.

EDIT: when I say that Python can do anything any other language can do, that's with the adage in mind: Python is the second-best language for every task.
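To make the performance point concrete, here's a minimal sketch (the data size and names are arbitrary) comparing a pure-Python loop with the C-implemented builtin `sum`, the kind of "drop to C for the hot path" win described above:

```python
import timeit

data = list(range(100_000))

def py_sum(xs):
    # Pure-Python accumulation: every iteration runs through the interpreter loop
    total = 0
    for x in xs:
        total += x
    return total

# The builtin sum() runs its loop in C inside CPython
t_py = timeit.timeit(lambda: py_sum(data), number=20)
t_c = timeit.timeit(lambda: sum(data), number=20)
print(f"pure Python: {t_py:.4f}s  C-backed sum(): {t_c:.4f}s")
```

On a typical CPython build the C-backed version wins by roughly an order of magnitude, which is why "performant enough, with C underneath for the 1%" has held up in practice.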

Bnjoroge 48 minutes ago||||
hilariously bold take with no evidence to support the claim
raincole 1 hour ago||||
It's such a laughable take. First of all, a language never gets popular simply because it's good. Actually, the most used languages are usually terrible.[0]

Secondly it's non factual. Python's market share grew in 2025[1][2][3]. Probably driven by AI demand.

[0]: even truer for natural languages.

[1]: https://survey.stackoverflow.co/2025/technology#most-popular...

[2]: https://survey.stackoverflow.co/2024/technology#most-popular...

[3]: https://pypl.github.io/PYPL.html

whattheheckheck 54 minutes ago||
Yeah, the swath of billions of new devs who now have a lower barrier to trying out coding will navigate to Python.
saltyoldman 1 hour ago|||
Absolutely agree with this. I'm hoping that, with the advent of agentic coding, Rust dominates in the next few years. It may even cause Wasm to become dominant as the new "applet" language.
dahlia 3 hours ago||
What strikes me most about this acquisition isn't the AI angle. It's the question of why so many open source tools get built by startup teams in the first place.

I maintain an open source project funded by the Sovereign Tech Fund. Getting there wasn't easy: the application process is long, the amounts are modest compared to a VC round, and you have to build community trust before any of that becomes possible. But the result is a project that isn't on anyone's exit timeline.

I'm not saying the startup path is without its own difficulties. But structurally, it offloads the costs onto the community that eventually comes to depend on you. By the time those costs come due, the founders have either cashed out or the company is circling the drain, and the users are left holding the bag. What's happening to Astral fits that pattern almost too neatly.

The healthier model, I think, is to build community first and then seek public or nonprofit funding: NLnet, STF, or similar. It's slower and harder, but it doesn't have betrayal baked into the structure.

Part of what makes this difficult is that public funding for open source infrastructure is still very uneven geographically. I'm based in Korea, and there's essentially nothing here comparable to what European developers can access. I had no choice but to turn to European funds, because there was simply no domestic equivalent. That's a structural problem worth taking seriously. The more countries that leave this entirely to the private sector, the more we end up watching exactly this kind of thing play out.

alexchantavy 2 hours ago||
I think this overstates the “betrayal” angle.

A lot of great open source comes out of startups because startups are really good at shipping fast and getting distribution (open source is part of this strategy). Users can try the tool immediately, and VC funding can put a lot of talent behind building something great very quickly.

The startup model absolutely creates incentive risk, but that’s true of any project that becomes important while depending on a relatively small set of maintainers or funders.

I’m not sure an acquisition is categorically different from a maintainer eventually moving on or burning out. In all of those cases, users who depend on the project take on some risk. That’s not unique to startups; it’s true of basically any software that becomes important.

There’s no perfect structure for open source here - public funding, nonprofit support, and startups all suck in their own ways.

And on the point you make about public funding being slow: yeah, talented people can't work full-time on important things unless there's serious funding behind them. uv got as good as it is because the funding let exceptional people work on it full-time with a level of intensity that public funding usually can't support.

dahlia 2 hours ago||
That's fair, and I don't really blame anyone for taking the startup route. It's often the only realistic path to working full-time on something you care about. My point is more that it shouldn't have to be. The more public funding flows into open source infrastructure, the less that tradeoff becomes necessary in the first place. Korea being almost entirely absent from that picture is part of why I feel this so keenly.
jackbravo 1 hour ago|||
Most likely, because it is less money :-p. But also because it is less known and harder, as you already mentioned. Personally, I'm based in Mexico, and I would never have thought about trying to get nonprofit funding for a community project, nor would I know where to start to get that.
theallan 1 hour ago||
> I maintain an open source project funded by the Sovereign Tech Fund.

I would absolutely love to know more about this if you are willing to share the story?

dahlia 1 hour ago||
Sure! Here's what I wrote about the story: https://writings.hongminhee.org/2025/10/stf-fedify/
hijodelsol 5 hours ago||
This is a serious risk for the open source ecosystem, and particularly for the scientific ecosystem that has adopted many of these technologies over the last few years. Having their future depend on a capex-heavy company that is currently (based on reporting) spending roughly $2.50 to make a dollar of revenue, and that must achieve hypergrowth in the next few years or perish, is less than ideal. This should discourage anybody doing serious work from adopting more of the upcoming Astral technologies like ty and pyx. Hopefully, ruff and uv are large enough to be forked if (when) the time comes.
rst 5 hours ago||
On the flip side, I'm not sure I ever saw a revenue plan or exit strategy for Astral other than acquihire. And most plausible bidders are unfortunate in one way or another.
japhyr 5 hours ago|||
Astral was building a private package hosting system for enterprise customers. That was their stated approach to becoming profitable, while continuing to fund their open source work.
organsnyder 5 hours ago|||
Private package hosting sounds like a commodity that would be hard to differentiate.
nunez 32 minutes ago|||
It's also a crowded and super mature space, split between JFrog (Artifactory) and Sonatype (Nexus). They already support private PyPI repositories and are super locked in at pretty much every enterprise-level company out there.
atomicnumber3 4 hours ago||||
A commodity, yes, but one that could be wrapped up to work very nicely with the latest and greatest in Python tooling. Remember, the only two ways to make money are bundling and unbundling. This seems like a pretty easy bundling story.
IshKebab 4 hours ago|||
Yeah you'd think so but somehow JFrog (makers of Artifactory) made half a billion dollars last year. I don't really understand that. Conda also makes an implausible amount of money.
nunez 28 minutes ago|||
Makes sense to me.

Most of the companies that spend $$$$ with them can't use public registries for production/production-adjacent workloads due to regulations and, secondarily a desire to mitigate supply chain risk.

Artifactory is a drop-in replacement for every kind of repository they'll need to work with, and it has a nice UI. They also support "pass-through" repositories that mirror the public repositories with the customization options these customers like to have. It also has image/artifact scanning, which cybersecurity teams love to use in their remediation reporting.

It's also relatively easy to spin up and scale. I don't work there, but I had to use Artifactory for a demo I built, and getting it up and running took very little time, even without AI assistance.

japhyr 3 hours ago|||
From my understanding there are a lot of companies that need their own package repositories, for a variety of reasons. I listened to a couple podcasts where Charlie Marsh outlined their plans for pyx, and why they felt their entry into that market would be profitable. My guess is that OpenAI just dangled way more money in their faces than what they were likely to get from pyx.

Having a private package index gives you a central place where all employees can install from, without having to screen what each person is installing. Also, if I remember right, there are some large AI and ML focused packages that benefit from an index that's tuned to your specific hardware and workflows.
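For context, the client side of such a private index is mostly just configuration; a sketch of a `pip.conf` pointing at a hypothetical internal index (the hostname is a made-up placeholder):

```ini
# ~/.config/pip/pip.conf -- route installs through a private index
# (pypi.internal.example.com is a hypothetical placeholder)
[global]
index-url = https://pypi.internal.example.com/simple/
# Optionally fall back to the public index for unscreened packages
extra-index-url = https://pypi.org/simple/
```

uv accepts equivalent index settings, so switching clients is cheap; the stickiness for pyx would be in the hosted service itself, not the configuration.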

kickopotomus 2 hours ago|||
Private artifact repositories also help to mitigate supply chain risk since you can host all of your screened packages and don't have to worry about something getting removed from mvn-central, PyPI, NPM, etc.

Plus the obvious need for a place to host proprietary internal libraries.

y1n0 2 hours ago||||
We have some kind of simple pip repo that is private where I work. What would astral bring to the table?
quadrifoliate 1 hour ago||
How many people use that simple pip repo daily? If the number is not in the high hundreds or a few thousand, maybe nothing. But once you get up there, any kind of better coordination layer is useful enough to pay a third party for, unless maintaining a layer over pip is your core competency.
tempest_ 33 minutes ago|||
I mean, that was a thing at one point, but I feel like it's baked into GitHub/GitLab etc. now
pjmlp 4 hours ago||||
What would be the added value against JFrog or Nexus, for example?
justcool393 1 hour ago||||
I mean, of course, but you can self-host PyPI, and the "Docker Hub" model doesn't produce VC-expected returns, especially as ECR, GHCR, and the other registries exist
r_lee 4 hours ago|||
that was never going to work, let's be honest
hijodelsol 5 hours ago|||
They could have joined organizations like the Linux Foundation, which try not to depend on any single donor, even though complete independence from big tech is not possible. I don't know the motivation behind Astral's approach, but this acquisition does leave a weird taste about how serious they were about truly open source software. Time will tell, I guess.
colesantiago 5 hours ago||
> I don't know the motivation behind Astral's approach, but this acquisition does leave a weird taste behind about how serious they were about truly open source software.

It was because Astral was VC funded.

https://astral.sh/blog/announcing-astral-the-company-behind-...

chis 2 hours ago|||
My hope would be that this eventually pushes pip to adopt a similar feature set and similar performance improvements. It's always a better story when the built-in tool is adequate, instead of having to pick something. And yes, uv is Rust, but it's pretty clear Python could provide something within 2-5x the speed.
materielle 2 hours ago|||
The problem is funding.

There seems to be a pervasive belief that the Python tooling and interpreter suck and are slow because the maintainers don't care, or aren't capable.

The actual problem is that there isn’t enough money to develop all of these systems properly.

Google says that Astral had 15 team members. Of course, it's hard to make these projections, but it wouldn't shock me if uv and ruff are each individually multi-million-dollar pieces of software.

If you’d like to invest a million dollars to improve pip, or work for free for 3 years to do it yourself, I’m not sure if anyone would object.

thayne 2 hours ago|||
pip isn't exactly a "built-in" tool, beyond the Python distribution shipping a stub module (ensurepip) that bootstraps pip from a bundled wheel for you.
Maxion 5 hours ago|||
These tools are open source, if they lock them down the community will just fork them.
pjmlp 5 hours ago|||
Nice idea in theory; in practice, the question is how many folks down in Nebraska are going to show up.
zem 34 minutes ago||
As someone who works in the Python tooling space, I think you underestimate the number of people who would be willing to do this. I would personally help maintain a community fork of ruff if it got to the point where one was needed, though I draw the line at moving to Nebraska first.
hijodelsol 5 hours ago|||
This might be true for uv and ruff, and hopefully that will happen. But pyx is a platform with associated hosting and if successful would lock people into the Astral ecosystem, even if the code itself was open source.
pjmlp 5 hours ago|||
I never adopted them, keep using mostly Python written stuff.

Either pay for the product, or use stuff that isn't dependent on VC money, this is always how it ends.

hijodelsol 5 hours ago|||
There are ways to independently fund open source projects, though. I have previously contributed to the Python Software Foundation and to individual open source maintainers through GitHub donations (which are not dependent on GitHub, as there are many alternatives). Projects like the Linux Foundation exist, too. And government funding, especially for scientific endeavors or where software is used to fulfill critical state tasks, is an option, too. I refuse to subject to the hypercommercialization of software and still believe in the principles behind open source.
pjmlp 5 hours ago||
Which is why I mentioned "....use stuff that isn't dependent on VC money...".
WhyNotHugo 5 hours ago|||
> I never adopted them, keep using mostly Python written stuff.

Maybe you use non-transitive pure Python dependencies, but it's likely that your tools and dependencies still rely on stuff in Rust or C (e.g.: py-cryptography and Python itself respectively).
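A quick, illustrative way to see this from the interpreter itself (nothing here is specific to any particular package):

```python
import importlib.machinery
import sys

# Modules compiled straight into the CPython binary itself
print(sorted(sys.builtin_module_names)[:5], "...")

# Filename suffixes CPython recognizes for compiled extension modules
# (e.g. ".cpython-312-x86_64-linux-gnu.so" on Linux, ".pyd" on Windows);
# any dependency shipping one of these is native code under the hood.
print(importlib.machinery.EXTENSION_SUFFIXES)
```

Walking a site-packages directory for files with those suffixes shows how few real-world environments are pure Python all the way down.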

pjmlp 4 hours ago||
I use mostly the batteries, given that the only purpose I have for Python, since version 1.6, is UNIX scripting tasks, beyond shell.

As mentioned multiple times, since my experience with Tcl and continuously rewriting stuff in C, I tend to avoid languages that don't come with JIT, or AOT, in the reference tooling.

I tend to work with Java, .NET, node, C++, for application code.

Naturally AI now changes that; still, I tend to focus on more classical approaches: Python with pip and venv, and stuff written in C or C++ that has been around for years.

dadrian 2 hours ago|||
As opposed to Pip, which is obviously free and sustainable forever.
tmaly 5 hours ago|||
Would single maintainers of critical open source projects be a better situation?
mcdonje 5 hours ago||
Are you not aware of foundations?
kjksf 4 hours ago||
The issue is lack of money not lack of legal structure.

Consider ffmpeg. You can donate via https://www.ffmpeg.org/spi.html

How much money do they make from donations? I don't know but "In practice we frequently payed for travel and hardware."

Translation: nothing at all.

If such a fundamental project, a revenue driver for so many companies, including Midas-level rich companies like Google, can't even pay decent salaries for core devs from donations, then the donation model doesn't work in terms of funding the work even at the smallest possible level of "pay a reasonable market rate for devs".

You either get people who just work for free, or businesses built around free work that provide something on top of the free software (which is hard to pull off, as we've seen with Bun, Astral, Deno, and Node).

mcdonje 2 hours ago||
Google contributed tons of developer hours for things like bug fixes, without which the project might not be where it is today.

There are examples of foundations or other similar entities paying developers, like Linux, SQLite, even Zig.

Maybe the difference is some projects rely on core contributors more because external contributions are more restricted in some way.

But sure, the entire open source model doesn't work, lol

adolph 2 hours ago|||
> This is a serious risk for the open source ecosystem and particularly the scientific ecosystem that over the last years has adopted many of these technologies.

At worst, it's just Anaconda II AI Boogaloo. The ecosystems will evolve and overcome, or will die and different ecosystems rise to meet the need going forward.

I anticipate OpenAI will get bored and ignore Astral's tools. Software entropy will do its thing and we will remember an actively developed uv as the good old days until something similar to cargo gets adopted as part of Python's standard distribution.

llll_lllllll_l 5 hours ago||
I don't know how to search for that report, can you share it?
incognito124 5 hours ago||
Possibly the worst news imaginable for the Python ecosystem. Absolutely devastating. Congrats to the team
dcre 3 hours ago||
Can't blame you for not trusting OpenAI, but it seems to me they would gain very little from fucking up uv (or, more precisely, from doing things with the side effect of fucking up uv), and they have tons of incentive to cultivate developer goodwill. Better to think of buying and supporting a project like this as a very cheap way to make developers think they're not so bad.
throwaway5752 3 hours ago||
No, they don't have an incentive to cultivate developer goodwill. They are monetizing replacing developers everywhere; that is the trillion-dollar valuation. They have the opposite incentive.
dcre 2 hours ago||
They are not. A very large proportion of their revenue comes from developers. A large proportion of their marketing and product work is aimed at developers. You have to work really hard to not see this. Just look at what Altman and Brockman tweet about.

https://xcancel.com/gdb

https://xcancel.com/sama/

throwaway5752 2 hours ago||
All the various APM companies are implementing "Assign to agent" flows. Will the foundation model providers really be satisfied getting a subscription worth 10% of a developer's total comp, instead of pocketing 60% of that comp by replacing them entirely?

The only thing that could prevent this is lack of ability to execute, like how Uber wanted to replace drivers with FSD vehicles.

dcre 1 hour ago||
It's not about what they wish would happen, it's about what they think will happen. In my view they are acting precisely like they believe they will be making a proportion of developer pay by making them more productive rather than replacing developers. I think they understand that the alternative doesn't really work out for them or anyone.

Even if they believe their systems will eventually tank employment and replace developers rather than augment them, the fate of Astral doesn't matter at all in that scenario because a) nobody has a job, and b) you can build your own uv replacement for $20.

dgb23 1 hour ago||
Could it be that they want developers to use their stuff so they get telemetry and mind share out of it? As a stepping stone for the ultimate goals so to speak?
antod 1 hour ago|||
One thing to keep in mind is that uv was the product of a whole lot of packaging PEPs finally landing and standards being set. That combined with not having to support all the old baggage meant they could have an effect modernizing community packaging standards.

I hope those two factors mean that if things go really wrong, then the clean(ish) break with all the non standard complex legacy means an easier future for community packaging efforts.

stephbook 44 minutes ago||
This. Their current approach is open sourced. There's no going back any more.
blitzar 5 hours ago|||
I hope they got paid, I will be very sad if they didn't at least get G5 money.
fortuitous-frog 4 hours ago||
Curious how well upstream contributors or projects get compensated in these sorts of headline-gathering acquisitions (probably not at all, unfortunately).

OpenClaw notably was built around Mario Zechner's pi[0]; uv I believe was highly adapted from Armin Ronacher's rye[1], and uses indygreg's python-build-standalone[2] for distributing Python builds (both of which were eventually transferred to Astral).

[0]: https://github.com/badlogic/pi-mono

[1]: https://github.com/astral-sh/rye

[2]: https://github.com/astral-sh/python-build-standalone

vdfs 5 hours ago|||
On the other hand, we get to see what other thing will try to replace pip
politelemon 1 hour ago||
wx
PurpleRamen 4 hours ago|||
Yeah, no. There is plenty of worse news than this.

In the worst case, Astral will stop developing their tools, someone else will pick them up and will continue polishing them. In the best case, they will just continue as they did until now, and nothing will really change on that front.

Astral is doing good work, but their greatest benefit to the ecosystem so far was showing what's possible and how it's done. Now everyone can take up the quest from here and continue. So any harm from here on out won't cut that deep; at worst we'll miss out on the many more cool things they could have built.

nsbk 56 minutes ago||
Let the forks roll!
huksley 5 hours ago||
UV_DISABLE_AGENT=1 UV_DISABLE_AI_HINTS=1 uv add
emmettm 29 minutes ago|
lol, underrated comment
jjice 5 hours ago||
Not who I would've liked to acquire Astral. As long as OpenAI doesn't force bad decisions on to Astral too hard, I'm very happy for the Astral team. They've been making some of the best Python tooling that has made the ecosystem so much better IME.
smallpipe 5 hours ago||
If Codex’s core quality is anything to go by, it’s time to create a community fork of UV
piskov 4 minutes ago|||
At least it’s in rust.

Unlike those react-game-engine guys over at Claude

pronik 4 hours ago|||
Maybe they are being acquired to improve the quality of Codex.
OutOfHere 3 hours ago||
That's the thing. To me that says that as soon as cash becomes tight at OpenAI, the Astral staff will no longer get to work on Python tooling like uv.
shimman 25 minutes ago||
Tale as old as time in SV, why we keep trusting venture capital to be the community's stewards I have no idea.

We need public investment in open source, in the form of grants, not more private partnerships that somehow always seem to hurt the community.

supriyo-biswas 4 hours ago|||
Eh, if it turns out to be too bad I guess I’ll just end up switching back to pipenv, which is the closest thing to uv (especially the automatic Python version management, though it's not as fast).
dec0dedab0de 2 hours ago|||
I would much rather use pipenv, if it only had the speed of uv.

Every interface Kenneth Reitz originally designed was fantastic to learn and use. I wish the influx of non-Pythonistas who've been changing the language over the last 10 years or so would go back and learn from his stuff.

zbentley 3 hours ago|||
Does pipenv download and install prebuilt interpreters when managing Python versions? Last I used it it relied on pyenv to do a local build, which is incredibly finicky on heterogenous fleets of computers.
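For anyone unfamiliar with the distinction, a rough sketch of the two approaches (assuming the tools are on PATH, and guarded so it degrades gracefully when they aren't):

```shell
if command -v pyenv >/dev/null 2>&1; then
  # pyenv compiles CPython from source on the local machine:
  # needs a C toolchain plus openssl/readline/etc. headers,
  # which is exactly what gets finicky on heterogeneous fleets.
  pyenv install --skip-existing 3.12.4
fi

if command -v uv >/dev/null 2>&1; then
  # uv downloads a prebuilt python-build-standalone interpreter:
  # no compiler required, seconds instead of minutes.
  uv python install 3.12
  uv run --python 3.12 python -c 'import sys; print(sys.version.split()[0])'
fi

checked=yes
```
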
lern_too_spel 4 hours ago||
The priorities of the tooling will change to help agents instead of human users directly. That's all that's happening.
japhyr 5 hours ago||
This has me thinking about VS Code and VS Codium. I've used VS Code for a while now, but recently grew annoyed at the increasingly prevalent prompts to subscribe to various Microsoft AI tools. I know you can make them go away, but if you bounce between different systems, and particularly deal with installing VS Code on a regular basis, it becomes annoying.

I started using VS Codium, and it feels like using VS Code before the AI hype era. I wonder if we're going to see a commercial version of uv bloated with the things OpenAI wants us all to use, and a community version that's more like the uv we're using right now.

sschueller 3 hours ago|
MS is actively making life with VS Codium a pain. They removed the download button from the extension marketplace, making it very difficult to download extensions and install them in VS Codium, since VS Codium does not have access to the official MS extension marketplace. Many don't publish outside the marketplace, for example PlatformIO. [1]

[1] https://github.com/platformio/platformio-vscode-ide/issues/1...

NewsaHackO 13 minutes ago|||
Also, Microsoft does not allow use of their Python LSP. You have to use the barebones Jedi LSP.
satya71 4 minutes ago||
Fortunately, there are competing LSPs of reasonable quality now. I'm using pyrefly. Not sure if ty/ruff have one too.
barnabee 1 hour ago||||
I've not struggled to find the things I need at https://open-vsx.org (usually by searching directly within VSCodium), but then I only use it for editing things like markdown docs and presentations, LaTeX/Typst, rather than coding, which I prefer to do in a terminal and with a modal editor.
matkoniecz 2 hours ago|||
Luckily I avoided extensions before switching to VS Codium.

Glad to hear that I am avoiding Microsoft's spam.

ragebol 5 hours ago||
Not often that I audibly groan at a HN headline :-(
alex_suzuki 5 hours ago||
Same here. I’ve adopted uv across all of my Python projects and couldn’t be happier. ty looks very promising as well.

Probably inevitable, and I don’t blame the team, I just wish it were someone else.

saalweachter 2 hours ago|||
I kind of feel like the nature of the Python ecosystem is a dozen or so extremely useful frameworks/tools that everyone uses heavily for 3 years and then abandons and never speaks of again.

I'm not very deep in Python anymore, but every time I dip my toes back in it's a completely different set of tools, with some notable exceptions (e.g., numpy).

pprotas 3 hours ago||||
Monkey paw curls tight

Microsoft acquires Astral

Wish comes with a cost

ragebol 5 hours ago|||
Ty, Ruff, UV, all great tools I recently started really using and I couldn't be happier with them.

Sigh

krick 4 hours ago||
I think it may be the first time I am actually upset by an acquisition announcement. I am usually like "well, it is what it is", but this time it just feels like betrayal.
Fervicus 3 hours ago||
> it just feels like betrayal

It was a VC backed tool. What did you expect?

krick 2 hours ago||
Nothing. I was very much aware of their prospects. Well, best-case scenario I could imagine them being acquired by Google or Microsoft, that would have looked like a prettier death, to be honest. Anyway, knowing that people eventually die doesn't mean you are immune to being sad when somebody dear actually dies. Especially when they die so young and full of potential.
lucrbvi 6 hours ago||
This is a weird pattern across OpenAI/Anthropic: buying startups building better tooling.

I don't really see the value for OAI/Anthropic, but it's nice to know that uv (+ ty and many others) and Bun will stay maintained!

jpalomaki 5 hours ago||
Somebody took a deeper look at Claude Code and claims to find evidence of Anthropic's PaaS offering [1]. There's certainly money to be made by offering a nice platform where "citizen developers" can push their code.

From Astral, the (fast) linter and type checker are pretty useful companions for agentic development.

[1] https://x.com/AprilNEA/status/2034209430158619084

lucrbvi 5 hours ago||
I wouldn't be surprised if Vercel were bought by Anthropic/OAI (but maybe it would be too expensive?)
bikelang 5 hours ago|||
No no - SpaceX/xAi must now buy Vercel so that we can deploy our bloated Next apps to space.
GCUMstlyHarmls 5 hours ago|||
Next now renamed to Xext.
dirkc 3 hours ago|||
At least in space there is lots of space and no heat /s - I'd love for Next to exist in a vacuum
jimmydoe 5 hours ago|||
Nothing is too expensive. It will be a bidding war.
luxcem 21 minutes ago|||
The value is controlling the toolchain from idea to production so agents can automate it. It's no secret that the end goal is to fully replace developers across the entire "idea to production" flow, and it's easier to control that flow if you control every tool at every step.

I won't be surprised if the next step is to acquire CI/CD tools.

synthc 5 hours ago|||
`uv agent` and `bun agent` in 3....2.....1....
rgilliotte 5 hours ago||
Totally agree

The value for Anthropic / OAI is that they have a strong interest in becoming the "default" agent.

The one that you don't need to install, because it's already provided by your package manager.

everforward 5 hours ago||
I don't think this holds because we're talking about developers who know how to use a package manager, on a piece of software you have to install anyways. The friction of "uv add $other_llm_software" is too low for it to have a real impact.

I think they're more into the extra context they can build for the LLM with ruff/ty.

dec0dedab0de 2 hours ago|||
until "uv add $openai_competitor" mysteriously breaks in odd, difficult to troubleshoot ways.
siva7 4 hours ago|||
You fool think they are targeting developers with this purchase?
everforward 4 hours ago||
I don’t think they’re targeting the C suite with it, because they don’t use uv and Microsoft already has Copilot for the “it’s bad but bundled with stuff you’re already paying for” market.
DoctorDabadedoo 5 hours ago|||
Good that they got some money and a longer runway, but I have my doubts the product will improve rather than be smothered to death.

Embrace, extend, extinguish. Time will tell.

TheCondor 1 hour ago|||
Does OpenAI use a lot of python?

There is the literal benefit of "we use the hell out of this tool, we need to make sure it stays usable for us" and then there is what they can learn from or coerce the community in to doing.

jbonatakis 17 minutes ago||
I don't know about OpenAI using a lot of Python, but Astral builds all their tools in Rust and just exposes Python bindings. Codex is all Rust. It feels like a reasonable acquisition from that perspective. They're banking, at least in part, on the Astral team being able to integrate with and supercharge Codex.
0x3f 5 hours ago|||
> it's nice to know that uv (+ ty and many others) and Bun will stay maintained!

Depends if you think the bubble is going to pop, I suppose. In some sense, independence was insulation.

butlike 4 hours ago|||
They probably prompted for what they should do next and got this as a half-hallucinated response lol
itissid 5 hours ago|||
Isn't this something to do with their paid pyx(as opposed to ty/ruff etc) thingy?
LoganDark 5 hours ago|||
I'm not so sure. I sort of wish they hadn't been acquired because these sort of acquihires usually result in stifling the competition while the incumbent stagnates. It definitely is an acquihire given OpenAI explicitly states they'll be joining the Codex team and only that their existing open-source projects will remain "maintained".
OutOfHere 3 hours ago|||
Why do you think that uv, etc. will stay maintained? They will for now, but as soon as cash is tight at OpenAI, they'll get culled so fast that you won't see it coming. This is the risk.
christina97 5 hours ago||
I mean they are “startups” on the way to mega-companies. They need internal tooling to match.
selectnull 52 minutes ago|
I see a lot of comments that are "somebody should fork this" or "community will fork it" or similar.

I didn't see a single comment of "I will fork it" type.

More comments...