Posted by foresto 3 hours ago
It's becoming increasingly apparent that if you don't use something truly free and open source and host it yourself, you're just setting yourself up for more of this sort of thing.
You can't trust anyone to properly handle the problem of "how the hell do we keep creeps the f*ck away from kids?" with any amount of common sense.
https://telegra.ph/why-not-matrix-08-07
There are even custom message/media types that people use to upload hidden content you can't see even if you're joined to the same channel using a typical client.
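A hedged sketch of how that can work (all names here are made up, not real Matrix event types): clients only render event types they recognize, so media referenced from an unrecognized custom type is never shown, even though the event and the file it points to exist in the room.

```python
# Sketch, under assumptions: "com.example.hidden" is a hypothetical custom
# event type, and KNOWN_TYPES stands in for whatever a typical client renders.
KNOWN_TYPES = {"m.room.message", "m.room.member", "m.room.topic"}

# An event carrying a media reference under a type no typical client knows.
hidden_event = {
    "type": "com.example.hidden",
    "content": {"url": "mxc://example.org/abc123"},  # media reference
}

def visible_to_typical_client(event: dict) -> bool:
    """A typical client skips events whose type it doesn't recognize,
    so it never renders them and never fetches the referenced media."""
    return event["type"] in KNOWN_TYPES

print(visible_to_typical_client(hidden_event))  # False
```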
Edit: It seems I've suddenly been rate-limit banned.
19. "media downloads are unauthenticated by default" -> fixed in Jun 2024: https://matrix.org/blog/2024/06/26/sunsetting-unauthenticate...
20. "ask someone else’s homeserver to replicate media" -> also fixed by authenticated media
21. "media uploads are unverified by default" - for E2EE this is very much a feature; running file transfers through an antivirus scanner would break E2EE. (Some enterprisey clients like Element Pro do offer scanning at download, but you typically wouldn't want to do it at upload, given that the AV definitions used at upload time may be stale by the time people download.) For non-encrypted media, content can be, and is, scanned on upload - e.g. by https://github.com/matrix-org/synapse-spamcheck-badlist
22. "all it takes is for one of your users to request media from an undesirable room for your homeserver to also serve up copies of it" - yes, this is true. similarly, if you host an IMAP server for your friends, and one of them gets spammed with illegal content, it unfortunately becomes your problem.
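To illustrate item 21: scanning unencrypted uploads often boils down to checking the file against a known-bad list, e.g. by hash. A minimal sketch in that spirit (this is not the linked synapse module's actual API; the names and the bad list are illustrative):

```python
import hashlib

# Hypothetical bad list of SHA-256 digests; seeded here for the demo.
BAD_HASHES = {hashlib.sha256(b"known bad bytes").hexdigest()}

def allow_upload(file_bytes: bytes) -> bool:
    """Reject uploads whose SHA-256 digest is on the bad list.
    This only works for unencrypted media: E2EE uploads arrive as
    ciphertext, so the server can't hash the plaintext at all."""
    return hashlib.sha256(file_bytes).hexdigest() not in BAD_HASHES

print(allow_upload(b"known bad bytes"))  # False
print(allow_upload(b"harmless photo"))   # True
```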
In terms of "invisible events in rooms can somehow download abusive content onto servers and clients" - I'm not aware of how that would work. Clients obviously download media when users try to view it; if the event is invisible then the client won't try to render it and won't try to download the media.
Nowadays many clients hide media in public rooms, so you have to manually click on the blurhash to download the file to your server anyway.
https://news.ycombinator.com/item?id=46982421
https://tech.yahoo.com/social-media/articles/now-bypass-disc...
The 3D model method might work on Persona, but that demo only shows it fooling K-IDs classifier.
https://piunikaweb.com/2026/02/12/discord-uk-age-verificatio...
Moderation and centralization, while typically not independent, aren't necessarily dependent. One can imagine one person viewing content with one set of moderation actions applied, and another person viewing the same content with a different set of moderation actions.
We sort of have this on HN already with viewing flagged content. It's essentially using an empty set of mod actions.
I believe it's technically viable to syndicate mod actions, and that possibly solves the moderation labor problem, but whether it's a socially viable way to build a network is another question.
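A minimal sketch of what syndicated mod actions could look like (all names hypothetical, not any real system's API): everyone stores the same posts, and each reader applies the union of whichever moderation feeds they subscribe to, with an empty subscription list being the HN-style "show everything" view.

```python
# The shared content, identical for every reader.
posts = {1: "benign post", 2: "spam", 3: "flamebait"}

# Syndicated moderation feeds, each publishing a set of post ids to hide.
mod_feeds = {
    "strict": {2, 3},
    "spam_only": {2},
}

def view(posts: dict, subscriptions: list, feeds: dict) -> dict:
    """Apply the union of the subscribed feeds' hide actions at view time.
    An empty subscription list applies no actions, like viewing [flagged]
    content on HN."""
    hidden = set().union(*(feeds[s] for s in subscriptions)) if subscriptions else set()
    return {pid: text for pid, text in posts.items() if pid not in hidden}

# Two readers, same content, different moderation:
print(view(posts, ["strict"], mod_feeds))  # only the benign post survives
print(view(posts, [], mod_feeds))          # everything, empty mod set
```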
It seems far too risky to sign up on a service for the purpose of intercommunication that is able (or even likely) to burn bridges with another for any reason at any time. In the end people will just accumulate on 2 or 3 big providers and then you have pseudo-federation anyway.
However, all the LGBT+ friendly servers federate with each other, and that's good enough for me. I like not having to see toxicity; there's too much of it in the world already.
"People will just accumulate on 2 or 3 big providers" is far from an inevitable circumstance, but there are conditions that make it more likely. That, too, is largely down to negligence or malice (but less so than the abusive communications problem).
Is that still true? As the admin of a small instance, I find the abuse coming from mastodon.social has been really low for a few years. There is the occasional spammer, but they often deal with it as quickly as I do.
> It's neutral to this topic, it's about tech.
this thread began by xe bringing up failures in moderation affecting trans people
Asking trans people to ignore this is like asking Jews to be comfortable in a bar where only ten percent of the patrons are Nazis. Arguing that "well not everyone is a Nazi" doesn't help, an attitude of "we're neutral about Nazis, we serve drinks to anyone" still makes it a Nazi bar, just implicitly rather than explicitly.
We do discuss all kinds of different topics here. Despite what many people here want to believe, Hacker News isn't exclusively for tech and tech-related subjects.
>and pretty sure anything openly transphobic would be flagged or deleted pretty soon.
But not banned, that's the problem. The guidelines are extremely pedantic, but nowhere is bigotry, racism, antisemitism or transphobia mentioned as being against those guidelines. You might say that shouldn't be necessary, but it's weird that so much effort is put into tone policing specific edge cases while the closest the guidelines come to defending marginalized groups is "Please don't use Hacker News for political or ideological battle. It tramples curiosity." Transphobia is treated as a mere faux pas, on a par with being too snarky or tediously repetitive. The real transgression is not the bigotry but "trampling curiosity." Any trans person who posts here knows that bigots who hate them and want to do them harm aren't going to suffer meaningful consequences (especially if they just spin up a green account), and that the culture here isn't that concerned about their safety.
Read the green account just below me. That sort of thing happens all the time. Yes, the comment is [dead] but why should a trans person be comfortable here, or consider themselves welcome, knowing that this is the kind of thing they'll encounter?
No thanks, there are other services I can use.