Posted by willm 9/2/2025
I think there's a good reason, too, that async generators are at the core of async/await yet people rarely use them outside of abstractions. I like them, but they're a lower-level tool for sure. Once they 'click' and you feel good with them, great, but the chances aren't great that everyone you work with will feel the same.
Generators add a kind of cognitive overhead more than a mess, I guess. Sometimes it makes sense to pull them out, but often it doesn't. Promises probably cover 95% of common use cases. A promise just does one thing, whereas everywhere you use generators you've got all the power and potential of generators. Kind of a 'great power, great responsibility' problem.
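In Python terms (a coroutine being the promise-like thing), here's a rough sketch of what I mean; the names and timings are made up:

```python
import asyncio

# A coroutine is promise-like: await it once, get one result back.
async def fetch_user(user_id: int) -> dict:
    await asyncio.sleep(0.1)  # stand-in for a network call
    return {"id": user_id, "name": f"user-{user_id}"}

# An async generator is more powerful but more to think about:
# it can yield many values and be paused, resumed, or abandoned mid-stream.
async def fetch_users(user_ids: list[int]):
    for user_id in user_ids:
        yield await fetch_user(user_id)

async def main() -> None:
    user = await fetch_user(1)              # the "95% case": one await, one value
    print(user)
    async for user in fetch_users([2, 3]):  # the generator case: a stream
        print(user)

asyncio.run(main())
```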
No, if you call both functions, one will try to fetch a non-responding URL and the other will immediately raise an exception.
AWSCLI was broken for over a year; we had to do a ton of work to deal with the various packaging issues.
Don't break userspace.
Hey! We have a product that we clearly want to release worldwide. Let's build it on something that doesn't have Unicode. Or any real threading. And is slow as hell.
You picked a platform that was going to have to break user space.
At least it wasn't JavaScript
What does `create_client` return?! Don't you worry your pretty head about it, it'll be whatever you want it to be! Flexibility!!11
I didn't go digging into it, but I'd guess they used the ubiquitous "six" library to backport Unicode functionality; the point, though, is likely "why start underwater?!"
Asyncio means learning different syntax that buys me nothing over the existing tools. Why would I bother?
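For the kind of work I do, the "existing tools" version and the asyncio version come out looking like this (an illustrative sketch; the functions and timings are invented):

```python
import asyncio
import time
from concurrent.futures import ThreadPoolExecutor

def fetch_sync(n: int) -> int:
    time.sleep(0.1)  # stand-in for a blocking network call
    return n

async def fetch_async(n: int) -> int:
    await asyncio.sleep(0.1)  # the same wait, different syntax
    return n

async def fetch_all_async() -> list[int]:
    # The asyncio version: same result, but every caller up the chain
    # has to become async/await too.
    return await asyncio.gather(*(fetch_async(n) for n in range(10)))

if __name__ == "__main__":
    # The "existing tools" version: a thread pool, no new syntax to learn.
    with ThreadPoolExecutor(max_workers=10) as pool:
        print(list(pool.map(fetch_sync, range(10))))

    print(asyncio.run(fetch_all_async()))
```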
I personally blame low async adoption in Python on 1) general reduction in its popularity vs Typescript+node, which is driven by the desire to have a single stack on the frontend and backend, not by bad or good async implementations in Python (see also: Rails, once the poster child of the Web, now nearly forgotten) 2) lack of good async stdlib. parallelism and concurrency are distant thirds.
async is a concurrency mechanism.
That is, if you use external resources and can delegate work to them, then async is concurrent (async I/O, for instance).
But if you don't, then async is just regular code with extra steps.
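Roughly, as a sketch (the workloads and timings are just to illustrate the point):

```python
import asyncio
import time

async def io_bound(n: int) -> None:
    # Waiting on something external: the event loop can run other tasks
    # while this one is suspended, so these overlap.
    await asyncio.sleep(1)

async def cpu_bound(n: int) -> None:
    # No await point: this holds the event loop, so these run one after
    # another -- regular code with extra steps.
    sum(range(10_000_000))

async def main() -> None:
    start = time.perf_counter()
    await asyncio.gather(*(io_bound(n) for n in range(10)))
    print(f"io-bound:  {time.perf_counter() - start:.2f}s")  # ~1s total, concurrent

    start = time.perf_counter()
    await asyncio.gather(*(cpu_bound(n) for n in range(10)))
    print(f"cpu-bound: {time.perf_counter() - start:.2f}s")  # ~10x one call, sequential

asyncio.run(main())
```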
My understanding is that JS can't do that (besides service workers, which don't share memory), but it still has multiple concurrent blocks of code in flight at the same time, just interleaved rather than in parallel. It will just never use multiple CPU cores at once (unless it calls into some non-JS, non-shared-memory code).
If someone has already implemented it for you, like in FastAPI, then it's pretty trivial to use, but using async on your own is kind of a pain.
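Something like this is what I mean (a rough sketch; the endpoint and the plumbing are made up):

```python
import asyncio

from fastapi import FastAPI

app = FastAPI()

# With a framework that already owns the event loop (FastAPI here),
# "using async" is just declaring the handler as a coroutine.
@app.get("/users/{user_id}")
async def read_user(user_id: int) -> dict:
    await asyncio.sleep(0.05)  # stand-in for an awaitable DB query
    return {"id": user_id}

# Without the framework you own the loop and the task plumbing yourself,
# and every caller up the chain has to become async too.
async def fetch_all(ids: list[int]) -> list[dict]:
    return await asyncio.gather(*(read_user(i) for i in ids))

if __name__ == "__main__":
    print(asyncio.run(fetch_all([1, 2, 3])))
```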
Well, I had a fix: https://news.ycombinator.com/item?id=43982570
The essential idea was that I could be processing ~100 requests per vCPU in the async event loop, while threading would max out at 2-4 threads per CPU. Of course, assume for either model that we're waiting 50-2000ms for a DB query or service call to finish before sending the response.
Is this not true? And if it is true, why isn't the juice worth the squeeze: more than an order of magnitude more saturation/throughput for the same hardware and the same language, just with a new engine at its heart?
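For reference, roughly what I had in mind on the async side (a toy sketch; the request count and latencies are invented):

```python
import asyncio
import random
import time

async def handle_request(i: int) -> None:
    # Stand-in for the 50-2000ms DB query or service call; while this task
    # waits, the event loop keeps serving the other in-flight requests.
    await asyncio.sleep(random.uniform(0.05, 2.0))

async def main() -> None:
    start = time.perf_counter()
    # Hundreds of overlapping requests on one core: tasks are cheap,
    # with no thread-per-request ceiling.
    await asyncio.gather(*(handle_request(i) for i in range(500)))
    elapsed = time.perf_counter() - start
    print(f"500 overlapping 'requests' finished in {elapsed:.2f}s")

asyncio.run(main())
```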