
Posted by markatlarge 4 days ago

A Developer Accidentally Found CSAM in AI Data. Google Banned Him for It (www.404media.co)
120 points | 93 comments | page 2
bsowl 4 days ago|
More like "A developer accidentally uploaded child porn to his Google Drive account and Google banned him for it".
jkaplowitz 4 days ago||
The penalties for unknowingly possessing or transmitting child porn are far too harsh, both in this case and in general (far beyond just Google's corporate policies).

Again, to avoid misunderstandings, I said unknowingly - I'm not defending anything about people who knowingly possess or traffic in child porn, other than for the few appropriate purposes like reporting it to the proper authorities when discovered.

jjk166 4 days ago|||
The issue is that when you make ignorance a valid defense, the optimal strategy is to deliberately turn a blind eye, since that reduces your risk exposure. It also gives refuge to those who can convincingly feign ignorance.

We should make tools readily available and user friendly so it is easier for people to detect CSAM that they have unintentionally interacted with. This both shields the innocent from being falsely accused, and makes it easier to stop bad actors as their activities are detected earlier.
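One concrete shape such a tool could take (a hypothetical sketch, not any vendor's actual scanner): hash local files and compare them against a published list of known-bad digests. The file name known_bad_sha256.txt and the directory path below are invented for illustration, and exact cryptographic hashes only catch byte-identical copies; production systems use perceptual hashes such as PhotoDNA, which are not publicly available.

    # Sketch of a user-facing check: compare local files against a list
    # of known-bad SHA-256 digests. The hash-list file and scan directory
    # are hypothetical placeholders.
    import hashlib
    from pathlib import Path

    def load_known_hashes(hash_list_path: str) -> set[str]:
        """Read one lowercase hex SHA-256 digest per line."""
        with open(hash_list_path) as f:
            return {line.strip().lower() for line in f if line.strip()}

    def sha256_of_file(path: Path) -> str:
        """Stream the file in 1 MiB chunks so large files don't exhaust memory."""
        digest = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def scan_directory(root: str, known_hashes: set[str]) -> list[Path]:
        """Return files whose exact contents match a known-bad digest."""
        matches = []
        for path in Path(root).rglob("*"):
            if path.is_file() and sha256_of_file(path) in known_hashes:
                matches.append(path)
        return matches

    if __name__ == "__main__":
        known = load_known_hashes("known_bad_sha256.txt")  # hypothetical hash list
        for hit in scan_directory("./downloaded_dataset", known):
            print(f"flag for review: {hit}")

Something along these lines would let a researcher check a downloaded dataset before uploading it anywhere, which is exactly the "detect it before it detects you" workflow the comment is asking for.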

pixl97 3 days ago||
No, it should be law enforcement's job to determine intent, not a blanket "you're guilty." Treating this as a pure actus reus offense is a huge mess that makes it easy to frame people and get them in trouble without any guilty intent.
jjk166 3 days ago||
Determining intent takes time, is often impossible, and an intent requirement encourages people to specifically avoid checking whether something needs to be flagged. Not checking is at best negligent. Having everybody check and flag is the sensible option.
pixl97 3 days ago||
Ah, yes, the "assume everyone is guilty and let God sort them out" method. Authoritarians love it.
jjk166 1 day ago||
Everyone in this case meaning "people demonstrated to be in possession of child porn who took no action". And they are not assumed guilty, they are exactly as innocent as anyone with a dead body in their fridge that they also "had no idea about."
burnt-resistor 4 days ago|||
That's the root problem with all mandated, invasive CSAM scanning. Non-signature-based scanning creates an unreasonable panopticon that leads to lifelong banishment based on imprecise, evidence-free guessing. It also hyper-criminalizes every parent who accidentally takes a picture of their kid who isn't fully dressed. And what about DoS victims who are anonymously sent CSAM without their consent in order to get them banned for "possession"? Pedophilia is gross and evil, no doubt, but extreme "think of the children" measures that sacrifice liberty and privacy create another, different evil. Handing over total responsibility and ultimate decision-making for critical matters to a flawed algorithm is lazy, negligent, and immoral. There's no easy solution to any such process, but requiring human review (a human in the loop, HITL) should be the moral and ethical minimum standard before drastic measures are taken.
jmogly 3 days ago||
On one hand, I would like to say this could happen to anyone; on the other hand, what the F? Why are people passing around a dataset that contains child sexual abuse material? And on another hand, I think this whole thing just reeks of techy bravado, and I don't exactly blame him. If one of the inputs to your product (OpenAI, Google, Microsoft, Meta, X) is a dataset that you can't even say for sure does not contain child pornography, that's pretty alarming.
UberFly 4 days ago|
Posting articles that are paywalled is worthless.
stronglikedan 3 days ago||
Finding the non-paywalled link in the comments is trivial.