
Posted by mrjaeger 4 hours ago

Meta and YouTube found negligent in landmark social media addiction case(www.nytimes.com)
290 points | 139 comments | page 3
jmyeet 3 hours ago|
I believe social media is on a collision course with an iceberg called Section 230.

Broadly speaking, Section 230 differentiates between publishers and platforms. A platform is like Geocities (back in the day), where the platform provider isn't liable for the content as long as they satisfy certain requirements about having processes for taking down content when required. A bit like the Cox decision today: you're broadly not responsible for the actions of people using your service unless your service is explicitly designed for such things.

A publisher (in the Section 230 sense) is like any media outlet. The publisher is liable for their content but they can say what they want, basically. It's why publishers tend to have strict processes around not making defamatory or false statements, etc.

I believe that any site that uses an algorithmic news feed is, legally speaking, a publisher acting like a platform.

Example: let's just say that you, as Twitter, FB, IG or Youtube were suddenly pro-Russian in the Ukraine conflict. You change your algorithm to surface and distribute pro-Russian content and suppress pro-Ukraine content. Or you're pro-Ukrainian and you do the reverse.

How is this different from being a publisher? IMHO it isn't. You've designed your algorithm knowingly to produce a certain result.

I believe that all these platforms will end up being treated like publishers for this reason.

So, with today's ruling about platforms creating addiction, (IMHO) it's no different to surfacing content. You are choosing content to produce a certain outcome. Intentionally getting someone addicted is functionally no different to changing their views on something.

I actually blame Google for all this because they very successfully sold the idea that "the algorithm" ranks search results like it's some neutral black box but every behavior by an algorithm represents a choice made by humans who created that algorithm.

timdev2 2 hours ago|
Why do you believe that "Section 230 differentiates between publishers and platforms"?
jmyeet 2 hours ago||
Section 230(c)(1) [1]:

> (c) Protection for “Good Samaritan” blocking and screening of offensive material

> (1) Treatment of publisher or speaker

> No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.

This is a protection for being a platform for third-party (including user-generated) content.

Some more discussion on this distinction [2]:

> Section 230’s legal protections were created to encourage the innovation of the internet by preventing an influx of lawsuits for user content.

It goes on to talk about publishers, distributors and Internet Service Providers, the last of which I characterize as "platforms".

By the way, my view here isn't a fringe view [3]:

> One argument advanced by those who want to limit immunity for platforms is that these algorithms are a form of content creation, and should therefore be outside the scope of Section 230 immunity. Under this theory, social media companies could potentially be held liable for harmful consequences related to content otherwise created by a third party.

This is exactly my view.

[1]: https://www.law.cornell.edu/uscode/text/47/230

[2]: https://bipartisanpolicy.org/article/section-230-online-plat...

[3]: https://www.naag.org/attorney-general-journal/the-future-of-...

Dracophoenix 1 hour ago||
This isn't good reasoning. According to your analysis, any website, ISP, or hosting provider that uses a firewall or Cloudflare is by definition a publisher, since they algorithmically shape traffic to prohibit suspicious IP addresses from access.
jmyeet 1 hour ago||
Not at all. Intent matters. Is Cloudflare trying to shape user behavior or push a particular position or content? No.

Just look at the Cox decision from the Supreme Court today. As long as the (Internet) service isn't designed for or sold as a method of downloading copyrighted material, the provider isn't responsible for any actions by its users. In other words, intent matters.

I find that technical people really get stuck on this aspect of the law. They look for technical compliance or an absolute proof standard because we're used to doing things like proving something works mathematically. But the law is subjective and holistic. It looks at the totality of evidence and applies a subjective test.

And intent here is fairly easy to establish. We could take an issue like Russia and look at all the posts and submissions and see how many views and interactions those posts got. We then divide them into pro-Russian and pro-Ukraine and establish a clear bias. We also look at any modifications made to the algorithm to achieve those goals.

This is nothing like Cloudflare DDoS protection.

parsimo2010 2 hours ago||
I don’t feel good about this case. On the one hand, I’m all for sticking it to big corporations. On the other hand, nobody has claimed that Meta and YouTube were doing anything illegal, so this case is different from civil suits brought after a criminal case finds someone guilty. This is a case where the jury decided they don’t like how two corporations acted, and are just giving money to one person. Why does this plaintiff in particular deserve this money?

I’ve argued in the past that the right way to create the change in corporations we want is to change the laws, and people have made valid points that Congress has basically given up on doing that. But even so, civil cases with fines don’t seem like the way to make lasting change. In the analogues to the tobacco fights, there are LAWS that regulate tobacco company behaviors as a result. The civil case here isn’t going to result in any law.

So what are companies supposed to do? Tiptoe around some ill-defined social boundary and hope they don’t get sued? Because apparently the defense of, “no, I didn’t target that person and I didn’t break any laws” is still going to get you fined. What happens when a company from a conservative location gets sued in a liberal location for causing a social ill? Oh, we’re cool with that. But what if a company from a liberal location gets sued in a conservative location for the same thing? Oh, maybe we don’t like that as much.

I’m taking the libertarian side here. I know plenty of people who don’t watch TV, don’t use Facebook, and I know plenty of people who recognized that they were spending too much time on digital platforms and decided to quit or cut back. So a healthy person can self-regulate on these apps; I’ve seen it and done it. I’m just not sure how much responsibility Meta and YouTube bear in my mind. If they’re getting fined $3M plus some TBD punitive amount, are we saying that this 20-year-old person lost out on earning that much money in their life, or would need to spend $3M on therapy because of Meta or YouTube? It feels a little steep of a fine for one person.

If Meta and YouTube really were/are making addictive products, wouldn’t a lot more people be harmed? Shouldn’t this be a class action suit where anyone with mental trauma or depression can be included?

I don’t know the details of the case, but I highly doubt that this one plaintiff was targeted specifically, and I doubt their case is that unique. I read tons of news articles about cyber bullying, depression, suicide attempts, and tech addiction. Does every one get to sue Meta and YouTube for $3M now?

lokar 59 minutes ago|
The case was brought under product liability law.

If I sell you gizmo, and I know, or should know, that using the gizmo could seriously harm you, and I don't tell you or do anything about it, I am liable for damages you incur.

ChrisArchitect 4 hours ago||
Notably a different case from the other one in New Mexico:

Jury finds Meta liable in case over child sexual exploitation on its platforms

https://news.ycombinator.com/item?id=47509984

SpicyLemonZest 4 hours ago|
And one with much deeper implications on how they operate. It's easy for Meta to just hire more moderators or treat reports of exploitation with higher priority; if this verdict stands, I think they have no realistic choice but to abandon usage targets.
aprilthird2021 3 hours ago||
Realistically they will hire expensive lawyers, pay out hundreds of millions to billions in settlements, fire lots of people (workforce is predominantly American), etc.

Even if they do what you're saying, lots of people who've used any Meta property in the last 15 years have a potentially viable case, and no future changes can swat those away.

2OEH8eoCRo0 3 hours ago||
Huge if upheld. This was the bellwether case for thousands of other similar cases.
Handy-Man 3 hours ago||
IMO, parents share just as much blame here, if not more. Giving your kids independence doesn't mean being oblivious to what they're doing online. Too many parents confuse hands-off parenting with not parenting at all.
bluedevil2k 3 hours ago||
Have you met kids? They’re devious, tech knowledgeable, and scheming and can find ways around any rule. Plus, no matter how good of a parent you are, you’re somewhat at the mercy of their friends’ parents as well. I can block TikTok from my daughter’s phone, but can’t block her from watching her friend’s phone while she’s out of the house.
intended 2 hours ago||
I dont think parents going up against psychologists, data scientists, product managers and software engineers with the best pay in the world is any kind of fair fight.
aprilthird2021 3 hours ago||
I can't help but feel these are "revenge" verdicts. Public perception of these companies is dirt low, and there are so few levers the average person has to change what they feel is an increase in atomization, loneliness, breakdown of civic discourse, Cambridge Analytica level political targeting, misinformation, etc.

Maybe the social media companies could do more to combat all these. They certainly have a level of profit compared to what they provide to the average person that makes people squirm.

But does anyone believe for a second that YouTube is responsible for a person's internet / video watching addiction? It's like saying cable television is responsible for people who binge watch TV.

It's hard to square this circle while sports gambling apps and Polymarket / Kalshi are tearing through the landscape right now with no real pushback.

bitwank 3 hours ago|
>But does anyone believe for a second that YouTube is responsible for a person's internet / video watching addiction?

Yes? Is there an algorithm or not?

edwardsrobbie 4 hours ago||
[dead]
apopapo 4 hours ago|
Will they also find liable all the companies that produce addictive food by injecting sugar into everything?

What about the "infinite" broadcasts found on all television channels?

This is ridiculous and pathetic.

btmiller 2 hours ago|||
A full sentence answer for you: yes.
BoredPositron 4 hours ago|||
In other countries that's the case so I don't know why it shouldn't be applicable in the US?
richwater 3 hours ago||
Please provide proof that other countries apply punitive damages to food companies knowingly adding sugar to food.
BoredPositron 3 hours ago||
8 countries in Europe, 4 in South America and 3 in Asia have an added sugar tax. So yes they did.
pixl97 3 hours ago||
"Libertarian demands companies have unlimited freedom until a corporation with unlimited freedom repeatedly eats their face with no consequences, wonders why the face eating leopards they voted for are actually allowed"