Posted by ibotty 10/28/2024

The sins of the 90s: Questioning a puzzling claim about mass surveillance (blog.cr.yp.to)
243 points | 193 comments | page 2
convivialdingo 10/28/2024|
I used to work with the guy who was named by DJB in the crypto export case which removed the restrictions. IIRC, the NSA guy used to be his student!
ForHackernews 10/28/2024||
I haven't seen the talk, but it sounds plausible to me: Technical people got strong crypto so they didn't worry about legislating for privacy.

We still have this blind spot today: Google and Apple talk about security and privacy, but what they mean by those terms is making it so only they get your data.

MattJ100 10/28/2024|
> Technical people got strong crypto so they didn't worry about legislating for privacy.

The article debunks this, demonstrating that privacy was a primary concern decades ago (e.g. A Cypherpunk's Manifesto), and that mass surveillance was already happening even further back.

I think it's fair to say that security has made significantly more progress over the decades than privacy has, but I don't think there is evidence of a causal link. Rather, privacy rights are held back because of other separate factors.

thecrash 10/28/2024|||
As you point out, decades ago privacy was a widespread social value among everyone who used the internet. Security through cryptography was also a widespread technical value among everyone (well at least some people) who designed software for the internet.

Over time, because security and cryptography were beneficial to business and government, cryptography got steadily increasing technical investment and attention.

On the other hand, since privacy as a social value does not serve business or government needs, it has been steadily de-emphasized and undermined.

Technical people have coped with the progressive erosion of privacy by pointing to cryptography as a way for individuals to uphold their privacy even in the absence of state-protected rights or a civil society which cares. This is the tradeoff being described.

ForHackernews 10/28/2024|||
> demonstrating that privacy was a primary concern (e.g. Cypherpunk's Manifesto) decades ago. Also that mass surveillance was already happening even further back.

How does that debunk it? If they were so concerned, why didn't they do anything about it?

One plausible answer: they were mollified by cryptography. Remember when it was revealed that the NSA was sniffing cleartext traffic between Google data centers[0]? In response, rather than campaigning for changes to legislation (requiring warrants for data collection, etc.), the big tech firms just started encrypting their internal traffic. If you're Google and your adversaries are nation state actors and other giant tech firms, that makes a lot of sense.

But as far as user privacy goes, it's pointless: Google is the adversary.

[0] https://theweek.com/articles/457590/why-google-isnt-happy-ab...

MattJ100 10/29/2024|||
I think it's a bit dismissive to claim that "they didn't do anything about it", just because you're not living in a perfect world right now.

As one prominent example, the EFF has been actively campaigning all this time: "The Electronic Frontier Foundation was founded in July of 1990 in response to a basic threat to speech and privacy." A couple of decades later, the Pirate Party movement probably reached its peak. These organizations represent political activism for digital rights and privacy, by precisely the kind of people who are here accused of "doing nothing".

In a few decades, people will probably look back on this era and ask why we didn't do anything about it either.

warkdarrior 10/28/2024|||
Sure, that line of thinking makes sense, but I do not understand the alternative. Are you saying that if we (the users) got new legislation (e.g., requiring warrants), then big tech wouldn't do mass surveillance anymore?
ForHackernews 10/29/2024|||
Yes, I think if there were laws that forbade mass data collection by private companies, or assessed sufficiently high penalties in the case of a breach (such that keeping huge troves of PII became a liability rather than an asset), then big tech firms would largely obey those laws.
immibis 10/29/2024|||
I think they're saying if they couldn't do cryptography they'd push for legislation.
RamAMM 10/28/2024||
The missed opportunity was to provide privacy protection before everyone stepped into the spotlight. The limitations on RSA key sizes, etc. (symmetric key lengths, 3DES limits) did not materially affect the outcomes, as we can see today. What did happen is that regulation was passed to allow 13 year olds to participate online much to the detriment of our society. What did happen was that businesses, including credit agencies, leaked ludicrous amounts of PII with no real harm to the bottom lines of these entities. The GOP themselves leaked the name, SSN, sex, and religion of over a hundred million US voters, again with no harm to the leaking entity.

We didn't go wrong in limiting export encryption strength to the evil 7, and we didn't go wrong in loosening encryption export restrictions. We entirely missed the boat on what matters by failing to define and protect the privacy rights of individuals until nearly all that mattered was publicly available to bad actors through negligence. This is part of the human propensity to prioritize today over tomorrow.
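(For scale, a minimal back-of-the-envelope sketch of what the export-era limits mentioned above meant in practice: export cipher suites were roughly capped at 40-bit symmetric keys and 512-bit RSA, versus the 128-bit symmetric keys standard today. The brute-force rate in the snippet below is an assumed number chosen purely for illustration.)

    # Rough, illustrative comparison of export-grade vs. modern symmetric key sizes.
    # ASSUMPTION: an attacker who can test 1e9 keys per second (made-up rate, for scale only).

    KEYS_PER_SECOND = 1e9  # assumed brute-force rate, purely illustrative

    def worst_case_days(bits: int) -> float:
        """Worst-case days to exhaust a keyspace of the given bit length."""
        return (2 ** bits) / KEYS_PER_SECOND / 86400

    for label, bits in [
        ("40-bit export-grade cipher", 40),
        ("56-bit DES", 56),
        ("128-bit modern cipher (e.g. AES-128)", 128),
    ]:
        print(f"{label}: {2 ** bits:.2e} keys, ~{worst_case_days(bits):.3g} days to exhaust")

At that assumed rate the 40-bit keyspace falls in well under a day, while the 128-bit one is effectively out of reach; the gap between them, not the exact rate, is the point.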

elric 10/28/2024|
> What did happen is that regulation was passed to allow 13 year olds to participate online much to the detriment of our society.

That's a very hot take. Citation needed.

I remember when the US forced COP(P?)A into being. I helped run a site aimed at kids back in those days. Suddenly we had to tell half of those kids to fuck off because of a weird and arbitrary age limit. Those kids were part of a great community, had a sense of belonging which they often didn't have in their meatspace lives, and had a safe space to explore ideas and engage with people from all over the world.

But I'm sure that was all to the detriment of our society :eyeroll:.

Ad peddling, stealing and selling personal information, that has been detrimental. Having kids engage with other kids on the interwebs? I doubt it.

ryandrake 10/28/2024|||
Kids are not stupid, though. They know about the arbitrary age limit, and they know that if they are under that limit, their service is nerfed and/or not allowed. So, the end effect of COPPA is that everyone under 13 simply knows to use a fake birthdate online that shows them to be over the limit.
elric 10/28/2024|||
Sure, it's one of the many rules that's bent and broken on a daily basis. Doesn't make it any less stupid. And it falls on the community owner to enforce, which is doubly stupid, as the only way to prove age is to provide ID, which requires a lot of administration, and that data then becomes a liability.
RamAMM 10/29/2024||
If you care about something (say, a child from the guardian's perspective, or perhaps a business from the owner's perspective), you find solutions.
RamAMM 10/29/2024||||
I was one of those kids at one point. In meatspace we have ways to deal with it, and online we do as well. Of course, if there is no risk to a business, then they will put no resources into managing that risk.
1oooqooq 10/29/2024|||
ah, to be 13 and have to lie about being 30 to not be banned from some game. so later you can be 30 and lie about being 13 to be able to play without too many ads.
dfxm12 10/28/2024||||
> COP(P?)A

COPA [0] is a different law which never took effect. COPPA [1] is what you're referring to.

> Ad peddling, stealing and selling personal information, that has been detrimental.

I agree and what's good for the gander is good for the goose. Why did we only recognize the need for privacy for people under an arbitrary age? We all deserve it!

0 - https://en.wikipedia.org/wiki/Child_Online_Protection_Act

1 - https://en.wikipedia.org/wiki/Children%27s_Online_Privacy_Pr...

RamAMM 10/29/2024||||
> Ad peddling, stealing and selling personal information, that has been detrimental.

So we agree on this part.

> What did happen is that regulation was passed to allow 13 year olds to participate online much to the detriment of our society.

My claim is that if "we" hadn't allowed 13 year olds to sign away liabilities when they registered on a website, there would be fewer minors using social media in environments that are mixed with adults. More specifically, guardians of minors would be required to decide if their kids should have access, and in doing so would provide the correct market feedback to ensure that sites of great value to minors (education resources being top of mind for me) would receive more market demand. At the same time, social platforms would have less impact on children, as there would be fewer kids participating in anti-nurturing environments.

burningChrome 10/28/2024||||
>> Having kids engage with other kids on the interwebs? I doubt it.

Unless those kids aren't interacting with kids at all, but instead with pedos masquerading as kids for nefarious reasons. Which, yes, has been VERY detrimental to our society.

elric 10/28/2024||
Nah. I'm not buying it. What's the rate of kids interacting with pedos instead of other kids?

Knee-jerk responses like yours, and "what about the children"-isms in general, are likely more detrimental than actual online child abuse. Something about babies and bathwater.

umanwizard 10/29/2024||||
I remember routinely clicking on some checkbox to say I was over 13 well before I was actually over 13. I'm sure most of the kids who actually cared about being on your site were still on it after the ban.
bippihippi1 10/28/2024|||
the issue with online kids isn't just the availability of the internet to kids but the availability of the kids to the internet
ikmckenz 10/28/2024|
This is a good article, and it thoroughly debunks the proposed tradeoff between fighting corporate vs. government surveillance. It seems to me that the people who concentrate primarily on corporate surveillance mostly want government solutions (privacy regulations, for example), and eventually get it in their heads that the NSA are their friends.