Posted by david927 1 day ago
Ask HN: What are you working on? (May 2026)
So: apt-cacher-ng didn't reduce the impact of the DDoS, but it does cause impact when there is no DDoS. Worst of both worlds.
So I'm working on an apt-cacher that goes to great lengths to keep working when the upstream is down. It checks the repo metadata, keeps a list of your "hot packages", and downloads those before flipping the new metadata live — effectively taking a snapshot. During a DDoS it won't let you download a package you've never downloaded before, but packages you do download regularly (machine re-installs, apt updates) are guaranteed to be available in the repo.
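The "download first, flip the metadata afterwards" idea above can be sketched as a staging directory plus an atomic symlink swap, so a failed upstream fetch never disturbs the live snapshot. This is an illustrative sketch with hypothetical names (`refresh_snapshot`, `fetch_metadata`, `fetch_package`), not the actual apt-cacher-ultra code:

```python
import os
import shutil
import tempfile

def refresh_snapshot(repo_root, fetch_metadata, fetch_package, hot_packages):
    """Fetch new repo metadata and pre-download every hot package into a
    staging dir; only then atomically flip the 'current' symlink to it.
    If anything fails (e.g. upstream is down), the old snapshot stays live."""
    staging = tempfile.mkdtemp(dir=repo_root, prefix="snap-")
    try:
        with open(os.path.join(staging, "Packages"), "w") as f:
            f.write(fetch_metadata())
        for pkg in hot_packages:
            with open(os.path.join(staging, pkg), "wb") as f:
                f.write(fetch_package(pkg))
    except Exception:
        shutil.rmtree(staging)  # partial snapshot: discard, keep serving the old one
        return False
    # Atomic flip: rename a temp symlink over 'current' (atomic on POSIX).
    tmp_link = os.path.join(repo_root, "current.tmp")
    if os.path.lexists(tmp_link):
        os.remove(tmp_link)
    os.symlink(staging, tmp_link)
    os.rename(tmp_link, os.path.join(repo_root, "current"))
    return True
```

The key design point is ordering: clients only ever see metadata whose hot packages are already on disk, which is what makes each flip behave like a consistent snapshot.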
I'm calling it apt-cacher-ultra. It's pretty early days; it'll probably be another week before it's ready for a beta. I'm running it successfully in my dev cluster right now.
Big thing I made recently is moving it from SvelteKit to Hono + Inertia + Vue.
I like SvelteKit, but I was struggling with stability during periods of active development, and writing proper tests was very hard because of all the magic that needed mocking, especially once you move beyond the trivial testing tools.
Now the whole app is a straightforward Hono MVC backend with a Vue-powered UI. Logic is easy to test, and all UI states are exposed in Storybook.
I wrote a custom adapter that makes Inertia run on Hono, and coincidentally the same thing was released by the Hono author himself as an official module, which is a great sign for adoption!
So, try Inertia – it's the best of both worlds. You write the MVC backend as you like, and use modern JS frameworks for templates.
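For the curious, what makes an adapter like that feasible is that Inertia's wire protocol is tiny and framework-agnostic: the server returns a JSON "page object" when the `X-Inertia` request header is set, and a full HTML shell embedding that object on first load. A rough sketch of the contract (illustrative Python, not the Hono adapter; `inertia_render` is a hypothetical name):

```python
import json

def inertia_render(component, props, *, is_inertia_request, url="/"):
    """Return an Inertia-style response: JSON page object for XHR
    navigations (X-Inertia header present), else an HTML shell whose
    root element carries the page object in a data-page attribute."""
    page = {"component": component, "props": props, "url": url, "version": "1"}
    if is_inertia_request:
        return {
            "headers": {"X-Inertia": "true", "Content-Type": "application/json"},
            "body": json.dumps(page),
        }
    return {
        "headers": {"Content-Type": "text/html"},
        "body": f"<div id='app' data-page='{json.dumps(page)}'></div>",
    }
```

The client-side framework (Vue here) hydrates from `data-page` on first load and swaps components from the JSON responses after that.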
It replaces paper stamp cards with Apple Wallet passes (Google Wallet coming soon), without customers needing to download an app or sign up. It’s still very much a work in progress (forgive the landing page), but I’m enjoying using Ruby on Rails. Please let me know your thoughts!
It currently supports Docker, containerd, and Wasm runtimes, and I'm adding support for the JVM, KVM, etc. It works as-is on macOS. There's also a mock runtime, for testing various distributed services.
It is a fun little experimental project.
https://github.com/calfonso/rusternetes
I picked Rust as my language before the AI-driven popularity hype, so I'm biased toward building k8s tooling in my focus language.
Cool project!
Just launched Studio, which is the self-hosted version of DB Pro.
I also keep a devlog. #9 was just published to YouTube.
Self-Host Your Own Database Client | DB Pro Devlog #9 https://youtu.be/MJvSrJGtk70
Cham (https://github.com/jfim/cham) is an archive for internet content: you give it a URL and it'll archive it for you, extracting the text with readability if it's an article, or extracting the audio track and transcribing it. Content is automatically summarized and tagged, and you can start a conversation with an LLM about the article. It supports feeds too, so you can subscribe to blogs and keep the articles in case the blog goes away. I still need to add search, improve the CLI, add all the missing features, and make a lot of improvements all over the place.
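The extraction step described above is essentially a content-type dispatch: articles go through readability, audio/video goes through transcription. A toy version of that routing (hypothetical names, not Cham's actual code):

```python
def choose_pipeline(content_type):
    """Route archived content to an extraction step based on MIME type:
    readability for HTML articles, audio extraction + transcription for
    media, raw storage for everything else."""
    if content_type.startswith(("audio/", "video/")):
        return "extract-audio-then-transcribe"
    if content_type in ("text/html", "application/xhtml+xml"):
        return "readability-extract"
    return "store-raw"
```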
To improve reliability, I made passe-partout, which is basically a Chrome browser with a REST API (https://github.com/jfim/passe-partout), and veilleur (https://github.com/jfim/veilleur), which turns any blog listing into an RSS feed. This way I can take blogs that are rendered with JavaScript and don't have an RSS feed, and load the articles directly into Cham.
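Once the listing page has been scraped into (title, link) pairs, generating the feed side of something like veilleur is small; a minimal RSS 2.0 sketch (illustrative only, not veilleur's code):

```python
import xml.etree.ElementTree as ET

def listing_to_rss(feed_title, site_url, items):
    """Build a minimal RSS 2.0 feed from (title, link) pairs scraped
    from a blog's listing page."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = feed_title
    ET.SubElement(channel, "link").text = site_url
    for title, link in items:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = title
        ET.SubElement(item, "link").text = link
    return ET.tostring(rss, encoding="unicode")
```

A real feed would also want `pubDate` and `guid` per item so readers can deduplicate, but this is the shape of it.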
Also built a modular MCP server with OAuth2 dynamic registration, so that I can have my own MCP server that works with the web, desktop, and CLI versions of Claude/Claude Code. I currently have modules for editing files (so I can edit and search my Obsidian vault from Claude), fetching pages through passe-partout (since some pages block LLMs from reading them), and proxying MCP servers, so that servers that only support bearer token auth can still work with web Claude.
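The bearer-token proxy idea at the end is simple at its core: wrap the request-forwarding call and inject the `Authorization` header on the way upstream. A toy sketch (hypothetical names, nothing to do with the real proxy module):

```python
def inject_bearer(forward, token):
    """Wrap a request-forwarding function so every upstream call carries
    a static bearer token. The caller authenticates however it likes
    (e.g. OAuth2); the upstream only ever sees 'Authorization: Bearer'."""
    def proxied(path, headers, body):
        headers = dict(headers)  # copy: never mutate the caller's dict
        headers["Authorization"] = f"Bearer {token}"
        return forward(path, headers, body)
    return proxied
```

The real work in such a proxy is elsewhere (streaming responses, token storage), but this is the trick that bridges the two auth schemes.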
Also, a GNOME terminal emulator UI with some unique features, like split browser/terminal tabs. https://github.com/jfim/jfterm
Mostly it's an excuse to see how far I can push LLM code generation to write tons of software I've always wanted but never had the bandwidth to tackle, and to learn to deal with the sometimes questionable code quality that comes with it.