Posted by speckx 2 hours ago

The “small web” is bigger than you might think(kevinboone.me)
125 points | 46 comments
GuB-42 18 minutes ago|
I don't expect many people to agree but I think that the "small web" should reject encryption, which is the opposite direction that Gemini is taking.

I don't deny the importance of encryption; it is really what shaped the modern web, allowing for secure payment, private transfer of personal information, etc. See what I am getting at?

Removing encryption means that you can't reasonably do financial transactions, accounts and access restriction, exchange of private information, etc. You only share what you want to share publicly, with no restrictions. It seriously limits commercial potential, which is the point.

It also helps technically. If you want to make a tiny web server, say on a microcontroller, encryption is the hardest part. In addition, TLS comes with expiring certificates, requiring regular maintenance: you can't just set up your server and leave it alone for years, still working. Dropping encryption can also bring back simple caching proxies, which are great for poor connectivity.
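The certificate-maintenance burden is easy to see in miniature. A sketch, assuming openssl is available (the CN and file names are made up): even a freshly minted self-signed certificate is already on a countdown, which a plain-text server never has.

```shell
# Mint a 90-day self-signed cert (hypothetical names) and read its expiry.
# Any TLS server carries such a deadline; a plain-HTTP one does not.
openssl req -x509 -newkey rsa:2048 -nodes \
  -keyout key.pem -out cert.pem -days 90 \
  -subj '/CN=tiny.example' 2>/dev/null
openssl x509 -in cert.pem -noout -enddate
```

The last command prints a `notAfter=` line: the date after which the server stops working unless someone renews the certificate.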

Two problems remain with the lack of encryption. The first is authenticity: anyone can man-in-the-middle the connection and change the web page, which TLS prevents. But what I think is an even better solution is to handle it at the content level: sign the content, as with a GPG signature, rather than the server. That way you can guarantee the authenticity of the content no matter where you are getting it from.
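The signing idea above can be sketched with stock GPG. This is only a sketch, assuming gpg is installed; the throwaway keyring, the file name, and the author identity are all made up for illustration:

```shell
# Create a throwaway keyring so this doesn't touch your real one.
export GNUPGHOME="$(mktemp -d)"
gpg --batch --passphrase '' --quick-generate-key \
    'Author <author@example.org>' default default never 2>/dev/null

# Sign the page content itself, not the server: a detached ASCII signature.
echo '<h1>hello, small web</h1>' > index.html
gpg --batch --armor --detach-sign --local-user author@example.org index.html

# Anyone holding the author's public key can verify index.html no matter
# which mirror, cache, or proxy delivered it.
gpg --verify index.html.asc index.html
```

Because verification depends on the content alone rather than the transport, this is also what lets simple caching proxies back into the picture.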

The other thing is the usual argument about oppressive governments, etc. Well, if you want to protect yourself, TLS won't save you: you will be given away by your IP address. They may not see exactly what you are looking at, but the simple fact that you are connecting to a server containing sensitive data may be evidence enough. Protecting your identity is what networks like Tor are for, and you can hide a plain-text server behind the Tor network, which would act as the privacy layer.

throw5 5 minutes ago||
> But what I think is an even better solution is to do it at the content level: sign the content, like a GPG signature

How would this work in reality? With the current state of browsers this is not possible, because the ISP can still insert its own content into the page, and the browser will still load the modified content that does not match the signature. Nothing forces GPG signature verification with current tech.

If you mean that browsers need to be updated to verify GPG signatures, I'm not sure how realistic that is. Browsers cannot verify a GPG signature and vouch for it until you solve key revocation and key expiry, and if you try to solve key revocation and key expiry, you are back to the same problems that certificates have.

marginalia_nu 9 minutes ago||
The big thing that made encryption a requirement was arguably that ISPs started injecting crap into webpages.

Governments can still track you with little trouble, since SNI is unencrypted. It's also very likely that Cloudflare and the like are sharing what they see as they MITM 80% of your connections.

susam 1 hour ago||
A little shell function I have in my ~/.zshrc:

  pages() {
    # Fetch 5 random small-web URLs from indieblog.page, stripping utm tracking params.
    for _ in {1..5}; do
      curl -sSw '%header{location}\n' https://indieblog.page/random | sed 's/.utm.*//'
    done
  }
Here is an example output:

  $ pages
  https://alanpearce.eu/post/scriptura/
  https://jmablog.com/post/numberones/
  https://www.closingtags.com/blog/home-networking
  https://www.unsungnovelty.org/gallery/layers/
  https://thoughts.uncountable.uk/now/
On macOS, we can also automatically open the random pages in the default web browser with:

  $ open $(pages)
Another nice place to discover independently maintained personal websites is: https://kagi.com/smallweb
varun_ch 1 hour ago||
A fun trend on the "small web" is the use of 88x31 badges that link to friends' websites or form webrings. I have a few on my website, and you can browse a ton of small-web websites that way.

https://varun.ch (at the bottom of the page)

There are also a couple of directories/network graphs: https://matdoes.dev/buttons and https://eightyeightthirty.one/

101008 32 minutes ago||
A beautiful trend that has been going for 30 years ;-)

One of the happiest moments of my childhood (I'm exaggerating) was when my button was placed on that website that I loved to visit every day. It was one of the best validations I ever received :)

bigbuppo 19 minutes ago||
I might dox myself with it, but I proudly display the cheez-its button on my site because it's true.
8organicbits 1 hour ago||
One objection I have to the Kagi smallweb approach is the avoidance of infrequently updated sites. Some of my favorite blogs post very rarely, but when they post it's a great read. When I discover a great new blog that hasn't been updated in years, I'm excited to add it to my feed reader, because it's a really good signal that when they publish again it will be worth reading.
freediver 46 minutes ago||
To clarify, the criterion is less than 2 years since the last blog post.
oopsiremembered 37 minutes ago||
I'm with you. Also, sometimes I'm specifically looking for some dusty old site that has long been forgotten about. Maybe I'm trying to find something I remember from ages ago. Or maybe I'm trying to deeply research something.

There's a lot more to fixing search than prioritizing recency. In fact, I think recency bias sometimes makes search worse.

afisxisto 54 minutes ago||
Cool to see Gemini mentioned here. A few years back I created Station, Gemini's first "social network" of sorts, still running today: https://martinrue.com/station
freediver 38 minutes ago||
Kagi Small Web has about 32K sites, and I'd like to think that we have captured most of the (English-speaking) personal blogs out there (we are adding about 10 per day, and a significant effort went into discovering/finding them).

It is kind of sad that the entire size of this small web is only 30k sites these days.

Cyan488 15 minutes ago||
I'm noticing sites that break the rules. I report (flag) them; is that useful, or should I just open a PR to remove them?
savolai 31 minutes ago|||
Does this use frames or an iframe? https://kagi.com/smallweb

I would expect a raw link in the top bar to the page shown, to be able to bookmark it etc.

shermantanktop 1 hour ago||
This is a specific definition of "small web" which is even narrower than the one I normally think of. But reading about Gemini, it does make me wonder if the original sin is client-side dynamism.

We could say: that's Javascript. But some Javascript operates only on the DOM. It's really XHR/fetch and friends that are the problem.

We could say: CSS is ok. But CSS can fetch remote resources and if JS isn't there, I wonder how long it would take for ad vendors to have CSS-only solutions...or maybe they do already?

akkartik 1 hour ago|
Yeah, CSS is Turing Complete: https://lyra.horse/x86css
upboundspiral 1 hour ago||
I think the article briefly touches on an important point: people still write blogs, but they are buried by Google, which now optimizes its algorithm for monetization rather than usefulness.

Anyone interested in seeing what the web looks like when the search engine selects for real people and not SEO-optimized slop should check out https://marginalia-search.com .

It's a search engine with the goal of finding exactly that - blogs, writings, all by real people. I am always fascinated by what it unearths when using it, and it really is a breath of fresh air.

It's currently funded by NLnet (temporarily), and the project's scope is really promising. It's one of those projects that I really hope succeeds long term.

The old web is not dead, just buried, and it can be unearthed. In my opinion, an independent, non-monetized search engine is a public good as valuable as the Internet Archive.

So far as I know, Marginalia is the only project that, instead of just taking Google's index and massaging it a bit (like all the other search engines), is truly seeking to be independent and practical in its scope and goals.

marginalia_nu 56 minutes ago||
Thanks for shilling.

Regarding the financials, even though the second nlnet grant runs out in a few weeks, I've got enough of a war chest to work full time probably a good bit into 2029 (modulo additional inflation shocks). The operational bit is self-funding now, and it's relatively low maintenance, so if worse comes to worst I'll have to get a job (if jobs still exist in 2029, otherwise I guess I'll live in the shameful cardboard box of those who were NGMI ;-).

boxedemp 1 hour ago|||
I think that's a cool project, though I found the results to be less relevant than Google.
janalsncm 51 minutes ago||
Whether the results are less relevant or not depends massively on what you searched and whether the best results even exist in the Marginalia search index or not.

If Google is ranking small web results better than Marginalia, that’s actionable.

If the best result isn’t in the index and it should be, that’s actionable.

marginalia_nu 46 minutes ago||
Well, to be fair, Marginalia is also developed by 1 guy (me), and Google has like 10K people and infinite compute they can throw at the problem. There have been definite improvements, and there will be more improvements still, but Google's still got hands.
lich_king 1 hour ago||
> Google that now optimizes their algorithm for monetization and not usefulness.

I don't think they do that. Instead, "usefulness" is mostly synonymous with commercial intent: searching for <x> often means "I want to buy <x>".

Even for non-commercial queries, I think the sad reality is that most people subconsciously prefer LLM-generated or content-farmed stuff too. It looks more professional, has nice images (never mind that they're stock photos or AI-generated), etc. Your average student looking for an explanation of why the sky is blue is more interested in a TikTok-style short than some white-on-black or black-on-gray webpage that gives them 1990s vibes.

TL;DR: I think that Google gives the average person exactly the results they want. It might not be what a small minority on HN wants.

marginalia_nu 25 minutes ago|||
Google and most search engines optimize for what is most likely to be clicked on. This works poorly and creates a huge popularity bias at scale because it starts feeding on its own tail: What major search engines show you is after all a large contributor to what's most likely to be clicked on.

The reason Marginalia (for some queries) feels like it shows such refreshing results is that it simply does not take popularity into account.

BrenBarn 1 hour ago|||
> I think that Google gives the average person exactly the results they want.

There is some truth in this, but to me it's similar to saying that a drug dealer gives their customers exactly what they want. People "want" those things because Google and its ilk have conditioned them to want those things.

sdenton4 39 minutes ago||
On the one hand, a search engine is not heroin... It's a pretty broken analogy.

On the other hand, we could probably convince Cory Doctorow to write a piece about how fentanyl is really about the enshittification of opiates.

lich_king 1 hour ago||
It's easy to hand-curate a list of 5,000 "small web" URLs. The problem is scaling. For example, Kagi has a hand-curated "small web" filter, but I never use it because far more interesting and relevant "small web" websites are outside the filter than in it. The same is true for most other lists curated by individual folks. They're neat, but also sort of useless because they are too small: 95% of the things you're looking for are not there.

The question is how do you take it to a million? There probably are at least that many good personal and non-commercial websites out there, but if you open it up, you invite spam & slop.

freediver 40 minutes ago|
I mainly use Kagi Small Web as a starting point of my day, with my morning coffee. Especially now that categories are added, I always find something worth reading. The size does not present a problem here, as I would usually browse 20-30 sites this way.
lich_king 30 minutes ago||
Right, but that basically works as a retro alternative to scrolling through social media. If you're looking for something specific, it's simultaneously true that there's a small web page that answers your question and that it's not on any "small web" list because the owner of the webpage never submitted it there, or didn't meet the criteria for inclusion.

For example, I have several non-commercial, personal websites that I think anyone would agree are "small web", but each of them fails the Kagi inclusion criteria for a different reason. One is not a blog, another is a blog but with the wrong cadence of posts, etc.

freediver 11 minutes ago||
Feel free to suggest changes to criteria for inclusion. It is mostly the way it is now as the entire project is maintained by one person - me :)
lasgawe 1 hour ago|
mm, yeah. I like the idea of the small web not as a size category but as a mindset. people publishing for the sake of sharing rather than optimizing for attention or monetization.
rapnie 1 hour ago||
The fediverse is also generally experienced as a small web, when it comes to mindset. Though that is not always to the liking or preference of those expecting to find alternatives to big church social media platforms.
apples_oranges 1 hour ago||
Feeding llms you mean
8organicbits 1 hour ago|||
Is there a good free-but-subscriber-only solution for blogs? It seems like a contradiction, but in practice it may be manageable.
pixl97 25 minutes ago||
If it takes off in any amount, then LLMs will just subscribe and pull said data from sites at a reasonable pace (or not, it's free so make many accounts).
stronglikedan 1 hour ago|||
they gotta eat too!