From time to time, test the restore process.
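Something like this is enough for a periodic drill; a minimal sketch, assuming restic (the tool and paths are placeholders for whatever you actually use):

    # Minimal restore drill, assuming restic with RESTIC_PASSWORD set in
    # the environment; paths are placeholders for your own setup.
    import hashlib
    import subprocess
    import tempfile
    from pathlib import Path

    LIVE = Path("/srv/data")            # placeholder: data you actually serve
    REPO = "/mnt/backup/restic-repo"    # placeholder: backup repository

    def sha256(p: Path) -> str:
        h = hashlib.sha256()
        with open(p, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    with tempfile.TemporaryDirectory() as tmp:
        # Restore the latest snapshot into a scratch dir, never over live data.
        subprocess.run(["restic", "-r", REPO, "restore", "latest",
                        "--target", tmp], check=True)
        # restic recreates the original absolute paths under the target dir.
        for live_file in (p for p in LIVE.rglob("*") if p.is_file()):
            restored = Path(tmp) / live_file.relative_to("/")
            assert restored.exists(), f"missing from restore: {live_file}"
            assert sha256(restored) == sha256(live_file), f"differs: {live_file}"
    print("restore drill passed")

A restore that lands in a scratch directory and gets checksummed against live data proves the backup is actually usable, which an exit code from the backup job alone never does.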
They tend to slip out of declarative mode and start making untracked changes to the system.
We've gone a step further, and made this even easier with https://zo.computer
You get a server and a lot of useful built-in functionality (like the ability to text with your server).
I agree you could use LLMs to learn how it works, but given that they both explain and perform the actions, I suspect the vast majority aren't learning anything. I've helped students who are learning to code, and very often they just copy/paste back and forth and ignore the actual content.
And I find that the stuff the average self-hoster needs is so surface-level that LLMs flawlessly provide solutions.
If you're self-hosting for other reasons, then that's fine. I self-host media for various reasons, but I also hand all my email/calendar/docs/photos over to a big tech company because I'm not motivated by that aspect.
They also aren't seeing any of your sensitive data being hosted on the server. At least the way I use them is getting suggestions for what software and configs I should go with, and then I do the actual doing. Which means I'm becoming independently more capable than I was before.
I'm asking Claude technical questions about setup, e.g. about a manual that I've skimmed but don't necessarily fully understand yet. How do I monitor this service? Oh, connect Tailscale and manage it with ACLs. But what do I do when it doesn't work or goes down? Ask Claude.
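For the monitoring part, a cron'd health check covers most of it; a rough sketch (the URL and unit name are placeholders for whatever you're running):

    # Bare-bones health check, meant to run from cron or a systemd timer.
    # URL and unit name are placeholders for your setup.
    import subprocess
    import urllib.request

    SERVICE_URL = "http://myserver.lan:8096/health"   # placeholder

    try:
        with urllib.request.urlopen(SERVICE_URL, timeout=10) as resp:
            ok = resp.status == 200
    except OSError:   # covers URLError/HTTPError and timeouts
        ok = False

    if not ok:
        # Placeholder remediation: kick the unit and note it in the journal.
        subprocess.run(["systemctl", "restart", "myservice"], check=False)
        print("health check failed; restarted myservice")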
To get more accurate setup and diagnostics, I need to share config files, firewall rules, IPv6 GUAs, Tailscale ACLs... and Claude just eats it up, and now Anthropic knows it forever too. Sure, CGNAT, WireGuard, and SSH logins stand between us, but... Claude is running in a terminal window on a LAN device, next to another terminal window that does have access to my server. Do I trust VS Code? Anthropic? The FOSS? Is this really self-hosting? Ahh, but I am learning stuff, right?
For now I'm just using Cloudflare Tunnels, but ideally I also want to do that myself (without getting DDoSed).
I’ll bite. You can save a lot of money by buying used hardware. I recommend looking for old Dell OptiPlex towers on Facebook Marketplace or from local used computer stores. Lenovo ThinkCentres (e.g., m700 tiny) are also a great option if you prefer something with a smaller form factor.
I’d recommend disregarding advice from non-technical folks pushing brand-new, expensive hardware, since it’s usually overkill.
And then you can only use distros that have a Raspberry Pi-specific build. Generic ARM ones won't work.
I build out my server in Docker, and I've been surprised that every image I've ever wanted to download has an ARM variant.
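If you want to verify before pulling, "docker manifest inspect" lists the platforms an image was published for; a small sketch around it (the default image name is just an example):

    # Checks whether an image publishes ARM builds, using the real
    # "docker manifest inspect" command; the default image is an example.
    import json
    import subprocess
    import sys

    image = sys.argv[1] if len(sys.argv) > 1 else "nginx:latest"
    out = subprocess.run(["docker", "manifest", "inspect", image],
                         capture_output=True, text=True, check=True).stdout
    manifest = json.loads(out)
    # Multi-arch images carry a manifest list; single-arch ones don't.
    archs = {m["platform"]["architecture"] for m in manifest.get("manifests", [])}
    print(image, "->", sorted(archs) if archs else "single-arch image")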
I'm not familiar with Dell product names specifically, but 'tower' sounds like it'll sit there burning 200W idle. Old laptops (with the battery slid out) are what I've been opting for; they use barely any more power than the router they sit next to. Especially if you just want to serve static files, as GP seems to be looking for, an old smartphone would be enough, but there you can't remove the battery (since it won't run off just the charger).
My first "server" was a 65€ second-hand laptop, shipping included, iirc in ~2010 euros, so call it maybe 100€ now after inflation. I used it for a number of years and got a good idea of what I wanted from my next setup (which wasn't much beefier, but a slightly newer CPU wasn't amiss after 3 years). I don't think you even need to go as far as $200 for a "local Bandcamp archive" (static file storage) served via some streaming webserver.
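For that use case, Python's built-in server basically covers it. No Range/seek support, so treat this as a minimal sketch (the archive path is a placeholder):

    # About the smallest "streaming webserver" there is: Python's built-in
    # file server. No Range/seek support, but whole tracks stream fine over
    # a LAN. The archive path is a placeholder.
    from http.server import SimpleHTTPRequestHandler, ThreadingHTTPServer

    ARCHIVE = "/srv/music"   # placeholder: your downloaded albums

    class Handler(SimpleHTTPRequestHandler):
        def __init__(self, *args, **kwargs):
            super().__init__(*args, directory=ARCHIVE, **kwargs)

    ThreadingHTTPServer(("0.0.0.0", 8000), Handler).serve_forever()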
Jellyfin docs do mention "Not having a GPU is NOT recommended for Jellyfin, as video transcoding on the CPU is very performance demanding", but that's for on-the-fly video transcoding. If you transcode your videos to the desired format(s) upon import, or don't have any videos at all yet, as in GP's case, it doesn't matter if the hardware is 20x slower. Worst case, you watch the movie at source quality: on a LAN you won't hit network speed bottlenecks anyway, and a GPU for transcoding costs far more (purchase plus ongoing power) than the gigabit Ethernet that already comes by default on every laptop and router.
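"Transcode upon import" can be as simple as a loop around ffmpeg; a sketch with generic H.264/AAC settings (paths and flags are starting-point assumptions, not Jellyfin's recommended values):

    # One-shot "transcode on import": walk an inbox and convert everything
    # to H.264/AAC so most clients can direct-play without a GPU. Paths are
    # placeholders and the ffmpeg flags are a generic starting point.
    import subprocess
    from pathlib import Path

    INBOX = Path("/srv/media/incoming")    # placeholder
    LIBRARY = Path("/srv/media/library")   # placeholder

    for src in INBOX.rglob("*.mkv"):
        dst = LIBRARY / src.with_suffix(".mp4").name
        if dst.exists():
            continue                        # already converted
        subprocess.run([
            "ffmpeg", "-i", str(src),
            "-c:v", "libx264", "-crf", "20",   # quality-targeted H.264
            "-c:a", "aac", "-b:a", "192k",     # widely supported audio
            str(dst),
        ], check=True)

It's slow on weak hardware, but it only has to happen once per file, at import time, instead of on every playback.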
What I've found: Claude Code is great at the "figure out this docker/nginx/systemd incantation" part, but the orchestration layer (health checks, rollbacks, zero-downtime deploys) still benefits from purpose-built tooling. The AI handles the tedious config generation while you focus on the actual workflow.
github.com/elitan/frost if curious
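To make the health-check/rollback idea concrete, the core loop is roughly the following; this is a generic docker compose sketch of the pattern, not frost's actual implementation (the health URL, the tags, and the ${APP_TAG} convention in the compose file are all assumptions):

    # Generic shape of a checked deploy with rollback via docker compose.
    # A sketch of the pattern, not frost's implementation.
    import os
    import subprocess
    import time
    import urllib.request

    HEALTH_URL = "http://localhost:8080/health"   # placeholder

    def deploy(tag: str) -> None:
        # Assumes the compose file references ${APP_TAG} for the image tag.
        subprocess.run(["docker", "compose", "up", "-d"],
                       env={**os.environ, "APP_TAG": tag}, check=True)

    def healthy(retries: int = 10, delay: float = 3.0) -> bool:
        for _ in range(retries):
            try:
                with urllib.request.urlopen(HEALTH_URL, timeout=5) as resp:
                    if resp.status == 200:
                        return True
            except OSError:
                pass
            time.sleep(delay)
        return False

    deploy("v2")        # placeholder: new tag
    if not healthy():
        deploy("v1")    # placeholder: last known-good tag
        raise SystemExit("deploy failed health check; rolled back")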
(In)famous last words?