
Posted by simedw 7/1/2025

Show HN: Spegel, a Terminal Browser That Uses LLMs to Rewrite Webpages (simedw.com)
426 points | 180 comments
ohadron 7/1/2025|
This is a terrific idea and could also have a lot of value with regards to accessibility.
taco_emoji 7/1/2025|
The problem, as always, is that LLMs are not deterministic. Accessibility needs to be reliable and predictable above all else.
Jotalea 7/2/2025||
Insanely resource-expensive, but still a very interesting "why not?" idea. I think a fitting use case would be adapting newer websites to work on older hardware. That is, assuming the new technologies used are not vital to the functionality of the website (ex. Spotify, YouTube, WhatsApp) and can be adapted to older technologies (ex. Google Search, from all the styles that it has, to a simple input and a button).

In theory this could be used for ad blocking; it would be more expensive and less efficient than conventional blockers, but the idea is there.

So, it is a very curious idea, but we still have to find an appropriate use case.

hyperific 7/1/2025||
Why not use pandoc to convert html to markdown and have the LLM condense from there?
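The suggested pipeline could be sketched like this (a rough illustration, assuming pandoc is on PATH; the function names are made up and the LLM condensing step is left out):

```python
import subprocess

def pandoc_cmd(to_fmt: str = "gfm") -> list[str]:
    # pandoc reads HTML on stdin and writes GitHub-flavored Markdown on stdout
    return ["pandoc", "--from=html", f"--to={to_fmt}"]

def html_to_markdown(html: str) -> str:
    """Convert HTML to Markdown via pandoc; requires pandoc on PATH."""
    proc = subprocess.run(
        pandoc_cmd(), input=html, capture_output=True, text=True, check=True
    )
    return proc.stdout  # this Markdown would then be handed to the LLM to condense
```

The upside is that pandoc's conversion is deterministic, so only the condensing step would involve the model.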
cheevly 7/1/2025||
Very cool! My retired AI agent transformed live webpage content; here's an old video clip of transforming HN to My Little Pony (with some annoying sounds): https://www.youtube.com/watch?v=1_j6cYeByOU. Skip to ~37 seconds for the outcome. I made an open-source standalone Chrome extension as well; it should probably still work for anyone curious: https://github.com/joshgriffith/ChromeGPT
js2 7/2/2025||
I was unfamiliar with Textual, which seems more interesting than Slowly Braised Lamb Ragu:

https://textual.textualize.io/

mossTechnician 7/1/2025||
Changes Spegel made to the linked recipe's ingredients:

Pounds of lamb become kilograms (more than doubling the quantity of meat), a medium onion turns large, one celery stalk becomes two, six cloves of garlic turn into four, tomato paste vanishes, we lose nearly half a cup of wine, beef stock gets an extra ¾ cup, rosemary is replaced with oregano.

simedw 7/1/2025||
Fantastic catch! It led me down a rabbit hole, and I finally found the root cause.

The recipe site was so long that it got truncated before being sent to the LLM. Then, based on the first 8,000 characters, Gemini hallucinated the rest of the recipe; it was definitely in its training set.

I have fixed it and pushed a new version of the project. Thanks again, it really highlights how we can never fully trust models.
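One way to avoid that failure mode is to split long pages into chunks that each fit under the limit and summarize them separately, rather than silently cutting at 8,000 characters and letting the model invent the tail. A rough sketch (the function name and limit are illustrative, not Spegel's actual code):

```python
def chunk_text(text: str, limit: int = 8000) -> list[str]:
    """Split text into chunks of at most `limit` characters,
    preferring paragraph boundaries over hard cuts."""
    chunks: list[str] = []
    current = ""
    for para in text.split("\n\n"):
        # hard-split any single paragraph longer than the limit
        while len(para) > limit:
            if current:
                chunks.append(current)
                current = ""
            chunks.append(para[:limit])
            para = para[limit:]
        candidate = f"{current}\n\n{para}" if current else para
        if len(candidate) <= limit:
            current = candidate
        else:
            chunks.append(current)
            current = para
    if current:
        chunks.append(current)
    return chunks
```

Each chunk would then be sent to the model in turn, so it never has to fill in content it was never shown.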

jugglinmike 7/1/2025|||
Great catch. I was getting ready to mention the theoretical risk of asking an LLM to be your arbiter of truth; it didn't even occur to me to check the chosen example for correctness. In a way, this blog post is a useful illustration not just of the hazards of LLMs, but also of our collective tendency to eschew verity for novelty.
andrepd 7/1/2025|||
> Great catch. I was getting ready to mention the theoretical risk of asking an LLM to be your arbiter of truth; it didn't even occur to me to check the chosen example for correctness.

It's beyond parody at this point. Shit just doesn't work, but this fundamental flaw of LLMs is just waved away or simply not acknowledged at all!

You have an algorithm that rewrites textA to textB (so nice), where textB potentially has no relation to textA (oh no). Were it anything else, this would mean "you don't have an algorithm to rewrite textA to textB", but for gen AI? Apparently this is not a fatal flaw; it's not even a flaw at all!

I should also note that there is no indication that this fundamental flaw can be corrected.

throwawayoldie 7/2/2025|||
> the theoretical risk of asking an LLM be your arbiter of truth

"Theoretical"? I think you misspelled "ubiquitous".

orliesaurus 7/1/2025|||
oh damn...
achierius 7/1/2025||
Did you actually observe this, or is just meant to be illustrative of what could happen?
mossTechnician 7/1/2025||
This is what actually happened in the linked article. The recipe is around the text that says

> Sometimes you don't want to read through someone's life story just to get to a recipe... That said, this is a great recipe

I compared the list of ingredients to the screenshot, did a couple unit conversions, and these are the discrepancies I saw.

hambes 7/2/2025||
I've thought about getting a web browser to work in the terminal for a while now. This is an idea that hadn't occurred to me yet, and I'm intrigued.

But I feel it doesn't solve the main issue of terminal-based web browsing. Displaying HTML in the terminal is often kind of ugly, and CSS-based fanciness does not work at all, but that can usually just be ignored. The main problem is JavaScript and dynamic content, which this approach just ignores.

So no real step forward for CLI web browsing, imo.

deadbabe 7/1/2025||
I would like to see a version of this where an LLM takes your social media feed and surfaces only the content worth watching, excluding the crap you had no interest in that was simply inserted into your feed. Fight algorithms with algorithms. Eliminate doomscrolling.
adrianpike 7/1/2025||
Super neat - I did something similar on a lark to enable useful "web browsing" over 1200-baud packet radio - I have Starlink back at my camp but might be a few miles away, so as long as I can get line of sight I can Google up stuff, albeit slowly. It worked well, but I never productionized it beyond some weekend tinkering.
coder543 7/1/2025|
Just a typo note: the flow diagram in the article says "Gemini 2.5 Pro Lite", but there is no such thing.
simedw 7/1/2025|
You're right, it's Gemini 2.5 Flash Lite.