Posted by vkuprin 3 hours ago

Show HN: I built a tool that watches webpages and exposes changes as RSS (sitespy.app)
I built Site Spy after missing a visa appointment slot because a government page changed and I didn’t notice for two weeks.

It watches webpages for changes and shows the result like a diff. The part I think HN might find interesting is that it can monitor a specific element on a page, not just the whole page, and it can expose changes as RSS feeds.

So instead of tracking an entire noisy page, you can watch just a price, a stock status, a headline, or a specific content block. When it changes, you can inspect the diff, browse the snapshot history, or follow the updates in an RSS reader.
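To make the element-level idea concrete, here is a minimal sketch of the core trick: extract one element's text from two page snapshots and diff only that, instead of diffing the whole page. This is an illustration using Python's stdlib, not Site Spy's actual implementation (the `id`-based lookup and the snapshot strings are made up for the example).

```python
# Illustrative sketch of element-level change detection (NOT Site Spy's
# actual code): pull one element's text out of two HTML snapshots and
# diff just that element, so noise elsewhere on the page is ignored.
from html.parser import HTMLParser
import difflib

class ElementText(HTMLParser):
    """Collect the text inside the first tag with a given id."""
    def __init__(self, target_id):
        super().__init__()
        self.target_id = target_id
        self.depth = 0          # > 0 while inside the target element
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if self.depth:
            self.depth += 1     # nested tag inside the target
        elif dict(attrs).get("id") == self.target_id:
            self.depth = 1      # entered the target element

    def handle_endtag(self, tag):
        if self.depth:
            self.depth -= 1

    def handle_data(self, data):
        if self.depth:
            self.chunks.append(data)

def element_text(html, element_id):
    parser = ElementText(element_id)
    parser.feed(html)
    return "".join(parser.chunks).strip()

def element_diff(old_html, new_html, element_id):
    old = element_text(old_html, element_id).splitlines()
    new = element_text(new_html, element_id).splitlines()
    return "\n".join(difflib.unified_diff(old, new, "before", "after", lineterm=""))

# Two hypothetical snapshots: only the watched element's change shows up.
snap1 = '<html><body><div id="price">$19.99</div><p>noise</p></body></html>'
snap2 = '<html><body><div id="price">$14.99</div><p>other noise</p></body></html>'
print(element_diff(snap1, snap2, "price"))
```

A real watcher would fetch snapshots on a schedule and match elements by a stored selector, but the extract-then-diff shape is the same.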

It’s a Chrome/Firefox extension plus a web dashboard.

Main features:

- Element picker for tracking a specific part of a page

- Diff view plus full snapshot timeline

- RSS feeds per watch, per tag, or across all watches

- MCP server for Claude, Cursor, and other AI agents

- Browser push, Email, and Telegram notifications

Chrome: https://chromewebstore.google.com/detail/site-spy/jeapcpanag...

Firefox: https://addons.mozilla.org/en-GB/firefox/addon/site-spy/

Docs: https://docs.sitespy.app

I’d especially love feedback on two things:

- Is RSS actually a useful interface for this, or do most people just want direct alerts?

- Does element-level tracking feel meaningfully better than full-page monitoring?
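For anyone weighing the RSS question, here is roughly what a per-watch feed entails: each detected change becomes an RSS item. A minimal stdlib sketch (hypothetical field names and feed structure for illustration, not Site Spy's real schema):

```python
# Minimal sketch of a per-watch RSS 2.0 feed built from change events
# (hypothetical structure for illustration, not Site Spy's real schema).
import xml.etree.ElementTree as ET
from email.utils import formatdate

def build_feed(watch_name, watch_url, changes):
    """changes: list of (title, diff_summary, unix_timestamp) tuples."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = f"Changes: {watch_name}"
    ET.SubElement(channel, "link").text = watch_url
    ET.SubElement(channel, "description").text = f"Change history for {watch_url}"
    for title, summary, ts in changes:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = title
        ET.SubElement(item, "description").text = summary
        ET.SubElement(item, "pubDate").text = formatdate(ts)  # RFC 2822 date
        ET.SubElement(item, "guid").text = f"{watch_url}#change-{ts}"
    return ET.tostring(rss, encoding="unicode")

feed = build_feed(
    "Visa appointments",
    "https://example.gov/appointments",  # made-up URL
    [("Slot status changed", "-No slots\n+Slots available", 1700000000)],
)
print(feed)
```

Any RSS reader polling that feed then gets one entry per change, which is what makes it composable with existing workflows.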

43 points | 15 comments
tene80i 16 minutes ago|
RSS is a useful interface, but: "Do most people just want direct alerts?" Yes, of course. RSS is beloved but niche. Depends who your target audience is. I personally would want an email, because that's how I get alerts about other things. RSS to me is for long form reading, not notifications I must notice. The answer to any product question like this totally depends on your audience and their normal routines.
xnx 1 hour ago||
I like https://github.com/dgtlmoon/changedetection.io for this. Open source and free to run locally, or use their SaaS service.
vkuprin 49 minutes ago||
Yep, changedetection.io is a good project. With Site Spy, I wanted to make the browser-first workflow much easier: install the extension, connect it to the dashboard, click the exact part of the page you care about, and then follow changes as diffs, history, or RSS with very little setup. I can definitely see why the open-source / self-hosted route is appealing too.
raphman 48 minutes ago|||
There's also https://github.com/thp/urlwatch/ - (not aware of any SaaS offer - self-hosted it is).
vkuprin 42 minutes ago||
Yep, urlwatch is a good one too. This category clearly has a strong self-hosted tradition. With Site Spy, what I’m trying to make much easier is the browser-first flow: pick the exact part of a page visually, then follow changes through diffs, history, RSS, and alerts with very little setup.
beepbooptheory 7 minutes ago|||
Sure but this one has a MCP server, costs money, and was presumably made last night!
pelcg 43 minutes ago||
Looks cool, and this can be self-hosted for free.

Nice, will try this out!

enoint 1 hour ago||
Quick feedback:

1. RSS is just fine for updates. Given the importance of your visa use-case, were you thinking of push notifications?

2. Your competition does element-level tracking too. Maybe they use XPath?

vkuprin 52 minutes ago|
Yep, Site Spy already has push notifications, plus email and Telegram alerts. I see RSS as the open interface for people who want to plug updates into their own reader or workflow. For urgent things like visa slots or stock availability, direct alerts are definitely the main path.

And yeah, element-level tracking isn't a brand new idea by itself. The thing I wanted to improve was making it easy to pick the exact part of a page you care about and then inspect the change via diffs, history, or RSS instead of just getting a generic "page changed" notification.

hinkley 11 minutes ago||
Back in 2000 I worked for a company that was trying to turn something like this into the foundation for a search engine.

Essentially instead of having a bunch of search engines and AI spamming your site, the idea was that they would get a feed. You would essentially scan your own website.

As crawlers grew from an occasional visitor to an actual problem (an inordinate percent of all consumer traffic at the SaaS I worked for was bots rather than organic traffic, and would have been more without throttling) I keep wondering why we haven’t done this.

Google has already solved the problem of people lying about their content, because with RSS feeds or user-agent sniffing you can still bear false witness about your site’s content and purpose. But you’d only have to be scanned when there was something to see. And really you could play games with time delays on the feed to smear out bot traffic over the day if you wanted.

bananaflag 53 minutes ago||
Very good!

This is something that existed in the past and that I used successfully, but services like this tend to disappear.

vkuprin 45 minutes ago|
That’s a completely fair concern. Services in this category do need to earn trust over time. I built the backend to handle a fair amount of traffic, so I’m not too worried about growth on that side. My goal is definitely to keep this running for the long term, not treat it like a one-off project.
makepostai 1 hour ago||
This is interesting, gonna try it on our next project! Thumbs up.
digitalbase 1 hour ago||
Cool stuff. You should make it OSS and ask a one-time fee for it. I would run it on my own infra but pay you once.
pwr1 1 hour ago|
Interesting... added to bookmarks. Could come in handy in the future