The current wave of AI agents is diminishing the value of identity as a DDoS or content-moderation signal. The formula until now was bot = bad, but unless your service wants to exclude everyone using OpenClaw and friends, that's no longer a valid heuristic.
If identity is no longer a strong signal, then the internet must move away from CAPTCHAs, logins, and reputation, and focus more on the proposed content or action instead. Which might not be so bad. After all, if I read a thought-provoking, original, enriching comment on HN, do I really care if it was actually written by a dog?
We might finally be getting close to https://xkcd.com/810/.
One more half thought: what if the solution to the Sybil problem is deciding that it's not a problem? Go ahead and spin up your bot network, join the party. If we can design systems that assign zero value to uniqueness and require originality or creativity for a contribution to matter, then successful Sybil "attacks" are no longer attacks, but free work donated by the attacker.
I would rather read the thought as it was originally expressed by a human somewhere in the AI's training data than a version of it that's been laundered through AI and deployed according to the separate, hidden intent of the AI's operator.
That always-on device? To reach critical mass beyond just the nerds, you'd need it to ship inside devices that are already always on, like routers/gateways and smart TVs. Then you're back to being at the mercy of centralized companies who also don't love patching their security vulnerabilities.
(1) Security. An always-on, externally accessible device will always be a target for break-ins. You want the device to be bulletproof, with defense in depth, so that breaking into one service does not affect anything else. Something like Proxmox that runs on low-end hardware and is as easy to administer as a mobile phone would do. We are still far from this. A very limited thing like a static site may be made both easy and bulletproof, though.
(2) Connectivity providers should allow that. Most home routers don't get a static IP, or even a globally routable IPv4 at all. Or even a stable IPv6. This complicates the DNS setup, and without DNS such resources are basically invisible.
From the pure resilience POV, it seems more important to keep control of your domain, and have an automated way to deploy your site / app on whatever new host, which is regularly tested. Then use free or cheap DNS and VM hosting of convenience. It takes some technical chops, but can likely be simplified and made relatively error-proof with a concerted effort.
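That "automated, regularly tested redeploy" can be quite small. A minimal sketch, assuming a static site, SSH access for a `deploy` user, and placeholder host/domain names (all illustrative):

```python
# Sketch of an automated, testable redeploy for a static site.
# Host names, paths, and the deploy user are placeholders.
import subprocess
import urllib.request


def rsync_command(src_dir: str, host: str, dest_dir: str) -> list[str]:
    """Build the rsync invocation that mirrors the site onto a fresh host."""
    return ["rsync", "-az", "--delete",
            src_dir.rstrip("/") + "/",
            f"deploy@{host}:{dest_dir}"]


def site_is_up(url: str, timeout: float = 5.0) -> bool:
    """Post-deploy smoke test: does the site answer with HTTP 200?"""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except OSError:
        return False


# Usage (against a real host):
#   subprocess.run(rsync_command("./public", "newbox.example.com", "/srv/www"),
#                  check=True)
#   assert site_is_up("https://example.com")  # check via your own domain
```

Run that from cron or CI against a throwaway VM and you have the "regularly tested" part: if the smoke test fails, you know your escape hatch is broken before you actually need it.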
With IPv6 it would theoretically be possible, but currently, with IPv4 and NATs everywhere, your website would almost never be reachable, even with fancy workarounds like dynDNS.
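For what it's worth, the dynDNS workaround itself is tiny. A sketch, assuming a provider with an HTTP update endpoint; the endpoint, hostname, and token below are hypothetical (real providers like DuckDNS or Cloudflare each have their own API, but the shape is the same):

```python
# dynDNS-style updater sketch: discover the public IP your NAT presents,
# then tell the DNS provider about it. Endpoint/token are placeholders.
import urllib.parse
import urllib.request


def current_public_ip() -> str:
    """Ask an IP-echo service what address the world sees us as."""
    with urllib.request.urlopen("https://api.ipify.org") as resp:
        return resp.read().decode().strip()


def build_update_url(base: str, hostname: str, token: str, ip: str) -> str:
    """Compose the provider update URL (parameter names are illustrative)."""
    qs = urllib.parse.urlencode({"hostname": hostname, "token": token, "ip": ip})
    return f"{base}?{qs}"


# Usage (with a real provider account):
#   ip = current_public_ip()
#   urllib.request.urlopen(
#       build_update_url("https://dyndns.example/update",
#                        "home.example.org", "SECRET_TOKEN", ip))
```

The catch the parent points out stands: this only helps if you have a routable address at all. Behind CGNAT, no amount of DNS trickery makes the box reachable.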
The challenge I've always felt is shared services -- if I'm running infra myself, I can depend upon it, but if someone else is running it, I'm never really sure I can, which makes external services really hard to rely on and invest in.
Maybe you can get further than expected with individual services? But shared services at some point seem really useful.
I think web2 solved that in an unfortunate way, where you know the corporations operating the services / networks are aligned in some ways but not in others.
But it would be great to have shared services that do have better guarantees. Disclaimer: we're working on something in that direction, but I'm really curious what others have seen or are thinking in this area.
Here's my small contribution to that. https://github.com/micro/mu - an app platform without ads, algorithms or tracking.
HTTP requires always-on + always-discoverable infrastructure
It's all over the place.
I have tried to get them to publish markdown sites using GitHub Pages, but the pain of having to git commit and do it via desktop was the blocker.
So I recently made them a mobile app called JekyllPress [0] with which they can publish their posts, similar to the WordPress mobile app. And now a bunch of them regularly publish on GitHub Pages. I think with more tools to simplify the publishing process, more people will start using GitHub Pages (my app still requires some painful onboarding: creating a repo, enabling GitHub Pages, and getting a PAT; no OAuth, as I don't have a server).
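Publishing without a local git checkout comes down to one call to GitHub's Contents API with the PAT, roughly what an app like this has to do under the hood. A sketch, with owner/repo/token as placeholders (creating a new file; updating an existing one additionally requires its current `sha`):

```python
# Sketch: push a markdown post straight into a GitHub Pages repo via the
# REST Contents API, authenticated with a personal access token (PAT).
import base64
import json
import urllib.request


def build_commit_payload(markdown: str, message: str) -> dict:
    """The Contents API expects the file body base64-encoded."""
    return {
        "message": message,
        "content": base64.b64encode(markdown.encode()).decode(),
    }


def publish(owner: str, repo: str, path: str, token: str, markdown: str) -> None:
    """PUT the file; GitHub Pages rebuilds the site after the commit."""
    payload = build_commit_payload(markdown, f"Publish {path}")
    req = urllib.request.Request(
        f"https://api.github.com/repos/{owner}/{repo}/contents/{path}",
        data=json.dumps(payload).encode(),
        method="PUT",
        headers={
            "Authorization": f"Bearer {token}",
            "Accept": "application/vnd.github+json",
        },
    )
    urllib.request.urlopen(req)


# Usage (with a real repo and PAT):
#   publish("alice", "alice.github.io", "_posts/2024-01-01-hello.md",
#           "ghp_...", "# Hello\n\nFirst post.")
```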
When you publish to Facebook, WordPress, etc., you can't easily get your stuff out. Even if they allow you to download your content as a zip archive, you'll still have to process it: the images will be broken, links between pages won't work, and so on.
You can't run your own email server. The large email providers will consider your self-hosted emails spam by default. It's understandable why they took this stance (due to actual spam), but it's also awfully convenient that it increases their market power.
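And even before reputation enters the picture, the big providers expect a self-hosted server to publish SPF, DKIM, and DMARC records just to be considered. A sketch of the DNS records involved (domain, IP, selector, and key are all placeholders):

```
example.org.                  TXT  "v=spf1 ip4:203.0.113.7 -all"
mail._domainkey.example.org.  TXT  "v=DKIM1; k=rsa; p=<base64-encoded public key>"
_dmarc.example.org.           TXT  "v=DMARC1; p=quarantine; rua=mailto:postmaster@example.org"
```

Getting these right is the easy, documented part; the opaque part is the IP-reputation scoring layered on top, which is where self-hosters actually get stuck.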
We are now at the whim of large corps even if we get a custom domain with them.
This is like talking about how book authors don't need Amazon when you have a printer and glue at home.