Posted by mostcallmeyt 6 hours ago
One thing that would be on my personal wish list for any Wikipedia alternative is ease of machine processing: the MediaWiki markup and its templates are horribly inconsistent and a nightmare to parse. Any serious successor should do this better. Wikipedia has the excuse of having grown historically; a successor doesn't.
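To illustrate why wikitext resists naive machine processing, here is a minimal sketch (my own toy code, not MediaWiki's actual grammar): template calls like `{{foo|{{bar|x}}}}` nest arbitrarily, so a single regex can't extract them and you have to track brace depth by hand.

```python
def extract_templates(text):
    """Return top-level {{...}} template spans by counting brace depth."""
    templates, depth, start, i = [], 0, None, 0
    while i < len(text) - 1:
        pair = text[i:i + 2]
        if pair == "{{":
            if depth == 0:
                start = i
            depth += 1
            i += 2
        elif pair == "}}" and depth > 0:
            depth -= 1
            if depth == 0:
                templates.append(text[start:i + 2])
            i += 2
        else:
            i += 1
    return templates

# Nested template handled; a regex like r"\{\{.*?\}\}" would cut it short.
print(extract_templates("{{Infobox|name={{PAGENAME}}}} text {{cite web}}"))
# → ['{{Infobox|name={{PAGENAME}}}}', '{{cite web}}']
```

And even this fails on real pages, since wikitext allows unbalanced braces inside `<nowiki>` tags and table syntax (`{| ... |}`), and templates can themselves emit braces, which is exactly the inconsistency that makes the format so painful.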
Do you mean something different, e.g. not just structured data?
In the past, when there have been Wikipedia forks, they haven't generally tried to stay in sync with Wikipedia, at least not in both directions. Do we have an example of long-term forks of collaborative software or text editing projects that did manage to keep sharing productively in multiple directions? Maybe the BSDs to some extent?
I wonder how much work people are willing to do to keep actively collaborating with people whom they have big ongoing disagreements with (at least in areas where those disagreements don't have an impact). Or can such collaboration be made relatively seamless with appropriate tooling?
Wikipedia's whole value function (don't get me wrong, it's a great value function) is that it is a curated, centralized web.
You can find the article reposted here:
https://helenofdestroy.substack.com/p/49-wikipedia-rotten-to...
I think many platforms would benefit from implementing that. Especially git forges, though I think someone is already working on that.
The difference between a wiki and a social media network is that anyone can spin up a template social media site; the fundamental user-side barrier to entry is pretty small. The same is not true of wikis, at least not high-quality ones. Documentary standards, tone, quality, review, consistency, policy, moderation, accountability, leadership, thoroughness: these are all qualities that take time and commitment to develop. They are hallmarks of centralization for a reason; arguably the innovation of human governance is centered around qualities like these.
As a counterpoint to Wikipedia, well... fragmentation is often a death knell for efficient knowledge transfer. We are already losing massive swathes of our early Internet history to fragmentation, attrition, and destruction. The thought that any piece of knowledge stored in a safehouse could go offline all at once, without replication or warning, scares me a bit. The thought that we don't really know who we're trusting as stewards of human knowledge in a federated model disturbs me too. You can have your issues with Wikipedia, but at least you know who they are. You know their biases.
That's not to say there aren't use cases for this... but man, this seems like an easy way to lose or destroy important parts of our shared history by accident.