Posted by tosh 12 hours ago
However, I skip permashortlinks; I try to keep my regular links relevant and short. Also, I like seeing full links: they often hint at what content awaits, whereas short links are more opaque.
That's one more benefit of this workflow: it can be adjusted to fit one's personal preferences. I suppose others might prefer short links, or maybe at some point I'll change my mind; with POSSE, making these kinds of changes is easy.
https://indieweb.org/permashortlink does give a few reasons, but they’re bunk. “More reliable in email”? Not meaningfully so. “Quicker to recall / copy due to size”? Not typically a concern; maybe a nice-to-have, but if you adjust your regular URL style it can be even better. “Less effort to manually enter”? A repeat of the previous point.
And it doesn’t address the problems of permashortlinks themselves: the cost, diluting your links across different domains, and having something extra to maintain and remember.
Don’t do separate permashortlinks. Just fix your regular links to not be bad.
However, I am not sure about "perma-shortlinks" as a means of networking and discovering content on other sites. They seem clunky to maintain, since a human or some automation has to curate the links. If a blog removes a link to another blog, that pathway is closed.
It would be cool if we could solve that with a "DNS for tags/topics", a Domain Content Server (DCS), e.g.:
1. tomaytotomato.com ==> publishes to the DCS of topics (tech, java, travel)
2. DCS adds domain to those topics
3. Content on the site is rated or evaluated based on those tags (I'm not sure of the mechanics here, and it could be exploited or gamed)
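To make the idea concrete, here is a minimal in-memory sketch of the flow above. Everything here is hypothetical: the class and method names (`DomainContentServer`, `register`, `lookup`) and the second example domain are made up for illustration, since no protocol actually exists.

```python
class DomainContentServer:
    """Maps topics to the domains that publish under them,
    much like DNS maps names to addresses."""

    def __init__(self):
        self.topics = {}  # topic -> set of domains

    def register(self, domain, topics):
        """Steps 1-2: a site publishes its topics;
        the DCS records the domain under each one."""
        for topic in topics:
            self.topics.setdefault(topic, set()).add(domain)

    def lookup(self, topic):
        """A client asks: which domains publish about this topic?"""
        return sorted(self.topics.get(topic, set()))


dcs = DomainContentServer()
dcs.register("tomaytotomato.com", ["tech", "java", "travel"])
dcs.register("example.dev", ["tech"])  # hypothetical second site

print(dcs.lookup("tech"))  # ['example.dev', 'tomaytotomato.com']
print(dcs.lookup("java"))  # ['tomaytotomato.com']
```

Step 3 (rating/evaluation) is deliberately left out, since the mechanics there are the open question.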
You could have several topic DCS servers run by organisations or individuals.
e.g. the Lobsters DNS-for-topics server would be really fussy about #tech or #computerscience blog posts, and would self-select for more highbrow stuff.
Meanwhile, a more casual tech group would score YouTube videos or Tom's Hardware articles higher.
This is just spitballing.
The whole point of syndication is that it's curated by humans (you, if it's your own feed).
A social media feed implies one feed for n users, curated by 1 algorithm hosted on Facebook/Twitter/Instagram
What I was thinking is:
- foo.social
- bar.social (tech curations)
- java.bar.social (sub-curated Java list)
All these DCS (domain content servers) would be polled by your own local client
Your client can then aggregate or organise how it shows this feed
e.g. I could have a trending aggregator for situations where a blog post is shown on multiple domains (which sort of indicates virality)
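The client-side aggregation could be sketched like this. Again, everything is an assumption for illustration: the server names come from the list above, the post URLs are invented, and `aggregate` is a stand-in for whatever your local client would do after polling each DCS.

```python
from collections import defaultdict

def aggregate(polled):
    """polled: {dcs_server: [post_url, ...]} — a stand-in for the
    results your local client got by polling each DCS."""
    surfaced_by = defaultdict(set)  # post_url -> which DCS servers listed it
    for server, urls in polled.items():
        for url in urls:
            surfaced_by[url].add(server)
    # A post surfaced by more than one curated server hints at virality.
    trending = [url for url, servers in surfaced_by.items() if len(servers) > 1]
    return dict(surfaced_by), trending


polled = {
    "foo.social": ["https://blog.example/posse-post", "https://other.dev/hello"],
    "bar.social": ["https://blog.example/posse-post"],
    "java.bar.social": ["https://blog.example/posse-post"],
}
surfaced, trending = aggregate(polled)
print(trending)  # ['https://blog.example/posse-post']
```

The nice part is that ranking and display live entirely in the client, so each person can aggregate the same servers differently.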
There's an RSS and JSON feed for each collection and a combined feed as well.