Posted by robin_reala 3/29/2025

237 points | 66 comments
scq 3/29/2025|
This seems like a bug in the ScreenAI service? There's no evidence whatsoever for his claim that Google "trains a machine vision model on the contents of my screen".

According to https://chromium.googlesource.com/chromium/src/+/main/servic... it is just inference.

> These functionalities are entirely on device and do not send any data to network or store on disk.

There is also this description in the Chrome OS source code:

> ScreenAI is a binary to provide AI based models to improve assistive technologies. The binary is written in C++ and is currently used by ReadAnything and PdfOcr services on Chrome OS.

yard2010 3/29/2025||
You're paraphrasing a Google source; it might be biased, or simply lying. They need the money, after all; they can't risk it.
scq 3/30/2025||
You can read the source for the integration, it lines up exactly with what I said.

You can make up any conspiracy theory you want, but there's no evidence for it.

bri3d 3/29/2025||
This. Go to chrome://flags and disable “Enable OCR For Local Image Search” and I bet the problem goes away.

It’s a stupid feature for Google to enable by default on systems that are generally very low spec and badly made, but it’s not some evil data slurp. One of the most obnoxious things about enshittification is the corrosive effect it seems to have had on technical users’ curiosity: instead of researching and fixing problems, people now seem very prone to jump to “the software is evil and bad” and give up at doing any kind of actual investigation.

NoNotTheDuo 3/29/2025|||
> but it’s not some evil data slurp.

Not yet anyway. We’ve just seen Amazon change how all Echos and Alexas operate. It had been local-only for years and years, but now they want the audio data, so they’ve changed the Terms of Service. There’s no reason to believe Google won’t do the same thing sometime in the future.

simpaticoder 3/29/2025||
"Trauma" is when one horrible experience lowers your danger threshold so much that it triggers on everything, and becomes useless and harmful. "Learning" is when new threat awareness lowers the threshold an 'appropriate amount'. Even if the GP was strictly wrong about their conclusion, in my personal opinion they are quite right to remain vigilant.

Note to parent: it is strictly unfair to lump Google in with Amazon (and if you demonize a good actor long enough, eventually they'll acquiesce, since they are already paying the reputational price). However, given that they are American corporations operating on similar incentives during the Wild West (or World War) of AI, aka WWAI, it makes sense to be suspicious. Heaven knows "reputational downside" is just about the only countervailing incentive left, since Trump has stripped consumers and investors of virtually all legal protection (see: CFPB elimination; SEC declining to prosecute the Hawk Tuah coin grift; Trump pardoning Trevor Milton). I think it is an excellent time for all of us to be extremely careful with the software we use.

broknbottle 3/29/2025||
Google is an advertising company. Everything they do revolves around slurping up the most valuable data to better identify people and spot trends. They’ve become less and less open with each passing year, and they still haven’t found their next big cash cow to offset the decline of their current one.
mystified5016 3/29/2025||||
> it’s not some evil data slurp

This puts a dangerous amount of trust onto a company which has very clearly and explicitly signaled to everyone for decades that they do not care one iota about you, your privacy, or your safety.

Assuming that Google isn't doing anything malicious is a very unwise and ill-informed stance to take. If it isn't malicious now, it will be very soon. Absolutely no exceptions.

_Algernon_ 3/29/2025||||
Learned helplessness is a common symptom of abuse. Not surprising that we would see it here as well.
bbarnett 3/29/2025||||
> It’s a stupid feature for Google to enable

Enter Google 2025!

No longer just terrible search due to lack of care and conflicts of interest.

Instead, now terrible search due to AI, terrible everything due to AI, pushed everywhere and everyplace, degrading and reducing capabilities ecosystem wide.

Ridiculous and often just wrong AI gibberish on search pages, Android camera apps that blur people's faces when trying to "enhance" pics you take, and of course replacing OCR that works well with half-finished, buggy AI junk.

From their doctored and made-up AI demos to an inability to make anything stable or of quality, Google has gone from world-class to Nikola in just a couple of years.

TeMPOraL 3/29/2025||||
> One of the most obnoxious things about enshittification is the corrosive effect it seems to have had on technical users’ curiosity: instead of researching and fixing problems, people now seem very prone to jump to “the software is evil and bad” and give up at doing any kind of actual investigation.

There's little here worth being curious about. Tech companies made sure of that. They mostly aren't doing anything particularly groundbreaking in situations like these - they're doing the stupid or the greedy thing. And, on the off chance that the tech involved is in any way interesting, it tends to have decades of security research behind it applied to mathematically guarantee we can't use it for anything ourselves - and in case that isn't enough, there's also decades of legal experience applied to stop people from building on top of the tech.

Nah, it's one thing to fix bugs for the companies back when they tried or pretended to be friendly; these days, when half the problems are intentional malfeatures or bugs in those malfeatures, it stops being fun. There are other things to be curious about, that aren't caused by attempts to disenfranchise regular computer users.

bri3d 3/29/2025||
> There's little here worth being curious about.

I’m all for OP returning the computer Google broke, as sibling comments have suggested, but the curiosity route would have been fruitful for them too; I’m pretty sure the flag I posted or one of the adjacent ones will fix their issue.

I also personally found this feature kind of interesting in itself; I didn’t know that Google were doing model-based OCR and content extraction.

> on the off chance that the tech involved is in any way interesting, it tends to have decades of security research behind it applied to mathematically guarantee we can't use it for anything ourselves

My current profession and hobby is literally breaking these locks and I’m still not quite sure what you mean here. What interesting tech do you feel you can’t use or apply due to security research?

> there's also decades of legal experience applied to stop people from building on top of the tech.

Again… I’m genuinely curious what technology you feel is locked up in a legal and technical vault?

I feel that we’ve really been in a good age lately for fundamental technologies, honestly - a massive amount of AI research is published, almost all computing related sub-technologies I can think of are growing increasingly strong open-source and open-research communities (semiconductors all the way from PDK through HDL and synthesis are one space that’s been fun here recently), and with a few notable exceptions (3GPP/mobile wireless being a big one), fewer cutting edge concepts are patent encumbered than ever before.

> There are other things to be curious about, that aren't caused by attempts to disenfranchise regular computer users.

If anything I feel like this is a counter-example? It’s an innocuous and valuable feature with a bug in it. There’s nothing weird or evil going on to intentionally or even unintentionally disenfranchise users. It’s something with a feature toggle that’s happening in open-source code.

> it's one thing to fix bugs for the companies back when they tried or pretended to be friendly

Here, we can agree. If a company are going to ship automatic updates, they need to be more careful about regressions than this, and they don’t deserve any benefit of the doubt on that.

ashoeafoot 3/29/2025||||
And what were you doing when they took over, dad? Oh, I was an intestinal villain, secreting liberal free-choice messages for a corporation that didn't even pay me.
throwaway48476 3/29/2025||||
In other words, tech companies have lost the benefit of the doubt.

That's what a decade of enshittification gets them.

oefrha 3/29/2025||
A decade ago people were also posting these outrage posts about Big Tech (and Small Tech) that more often than not turned out to be bugs/nothingburgers. I was here.
josephg 3/29/2025||
I was too. There was a period of great optimism around the web, Google and (in part), Apple. But it was closer to 20 years ago now.

I remember talking to someone from Microsoft around that time (they were an enemy of the open-source world back then). They said the shine would wear off, and everyone would get annoyed and distrustful of Google too. I remember my conscious brain agreeing. But my emotional mind loved Google; we all did. I just couldn’t imagine it.

Well. It’s pretty easy to imagine now.

bri3d 3/30/2025||
I think the fall happened a long time ago. It's funny - I'm recently accused often of wearing rose-colored glasses on HN, but I think the present is actually quite a bit better than 15 years ago was when it comes to privacy. It's easy to forget how bad things were for a while in there.

15 years ago now I think Google were at their worst. Google were doing a good job in my eyes until roughly the time of the DoubleClick acquisition, when they pivoted away from "we're going to do ads the Good Way with AdWords" and into "screw it, we're just going to do ads," picked up the infamous DoubleClick cookie and their general "we profile people using every piece of data we can possibly think of" approach, and started making insane product decisions like public-contacts-by-default Google Buzz.

Since then, through a combination of courts forcing them to and what seems like a somewhat genuine internal effort, Google have been adding privacy controls back in many places. I certainly don't agree with the model still, but I think that Google in 2025 are actually much less of a privacy threat than 2010 Google were.

Outside Google, 15 years ago was also the peak Browser Toolbar and Installer Wrapper Infostealers era, where instead of building crypto scams or AI-wrapper companies, the hustle bros were busy building flat-out spyware instead.

I know I'm outside of the majority on HN recently, but I generally feel that the corporate _notion_ of user privacy has actually gotten a lot better since the early 2000s, while the _implementation_ has gotten worse. That is to say, companies, especially large ones, care much more about internal controls and have much less of a "we steal lots of data and figure out how to sell it later" model. Unfortunately, at the same time, we've seen the rise of "data driven" product management, always on updates, and "product telemetry," which erode the new attitude towards privacy at a technical level by building easily exploitable troves of sensitive information.

Of course, in exchange for large companies becoming more conscious about privacy, we now have a million smaller companies working to fill the "we steal all the data" shoes. It's still a battle that's far from won.

acuntcalleddan 3/29/2025|||
[flagged]
hilbert42 3/29/2025||
As tossandthrow says, return it as defective. I reckon this should apply anywhere a warranty applies.

If what you say is correct then the device is (a) not fit for purpose and (b) it's possible you may be able to claim damages on the basis that the manufacturer has changed its modus operandi without your permission or consent and it's now incompatible with the way you work, etc., etc.

If Google reckons it had the right to alter your device because you agreed to its EULA, then it seems you'd still have a case on grounds that it no longer functions as it should.

There are only two things that will stop these bastards—them realizing such behavior is draining money from their hip pockets and proper consumer and privacy legislation.

But forget the latter, democracy is stuffed, and Big Tech has it by the balls anyway.

josephg 3/29/2025||
> But forget the latter, democracy is stuffed, and Big Tech has it by the balls anyway.

Not everywhere. Here in Vic, Australia, I can return a product for defects any time within its “expected product lifetime”. How long is that? It’s never specified explicitly! So yeah, it kinda doesn’t matter how old a laptop is if the manufacturer pulls stunts like this. You can still give them a headache if you want to.

Europe also has great customer protection laws. And this domain is .NZ - I wouldn’t be surprised if New Zealand has decent customer protection laws too.

The US’s democracy is stuffed. But thankfully the world is much bigger than the United States.

CamouflagedKiwi 3/29/2025||
Yes, NZ does. The Consumer Guarantees Act is pretty strong - goods must work for a "reasonable period", which is similarly not defined but has generally been upheld as you'd hope by the courts. Companies can't contract out of it.
_heimdall 3/29/2025||
> proper consumer and privacy legislation.

> But forget the latter, democracy is stuffed

What does consumer and privacy legislation have to do with democracy?

They may both be important, but I see no connection between the two other than the fact that those democratically elected would be the ones making the legislation (and any legislation).

hilbert42 3/29/2025||
Democracy doesn't work as it should when Big Tech and Big Business pull the strings to get what they want.

When entities other than ordinary citizens get their way—as they do—then citizens are disadvantaged. That ought to be pretty damn obvious, if not then take a look at the world around you.

For starters, examine the myriad of legislation that's beneficial to ordinary citizens that has been blocked or neutered by Big Tech/Business. Citizens may have the vote but they don't hold the power.

_heimdall 3/30/2025||
Whether it worked under a democratic system depends on whether the legislation you are concerned with was passed legally.

Democracy would have worked perfectly fine if democratically elected officials made decisions and passed legislation that they were legally allowed to pass. We may disagree with what was passed, but that's a concern about the outcome rather than the process by which those people were elected.

I very much agree with you with regards to the problems of big tech, big business, and lobbying in general. They are technically operating within the laws created by democratically elected officials, though. That's the problem.

We need a smaller government with less reach and fewer powers. We don't need to claim that those who were democratically elected somehow escaped democracy while working within the bounds of the rules they were given, we need to limit the rules.

hilbert42 3/30/2025||
"They are technically operating within the laws created by democratically elected officials, though."

Two issues: first, they may be technically operating within the law, but if the legislation which enacted the law was achieved by processes or means that were biased or not truly democratic (i.e., ones that benefit them), then citizens are disadvantaged. Unequal representation is undemocratic.

Second, laws may be on the books, but if the State does not prosecute when they are violated, it makes a mockery of the law. Big Tech/Business has used political power and influence to stop the State from prosecuting. For example, the Sherman Antitrust Act (and its successors) has been on the books since the 1890s, and the State has done essentially nothing to rein in the monopolistic practices of these companies.

That's just for starters. By any objective measure, democracy in the US is essentially non-functional. One only has to look at the polarized political divide, which is widening further by the day, to see that.

tossandthrow 3/29/2025||
If the machine is less than 2 years old (and you are in Europe), just return it as defective.
em-bee 3/29/2025||
laws should be adapted to extend the warranty every time a remote change is made to the device. basically, the warranty should hold as long as the device is maintained: say, each update comes with half a year of warranty. it's a bit tricky, as it could motivate companies to stop updating, but that could be solved with a separate law forcing companies to keep providing updates for a set number of years. (if that doesn't exist already)
juergbi 3/29/2025|||
This may make sense if the extended warranty is limited to defects introduced by the remote change. I.e., if they remotely break your device, they should be responsible for fixing the damage. A full warranty extension doesn't seem reasonable to me, though.

With regards to your last sentence, I think a good first step would be to require at least security and other critical updates to be provided within the full warranty period. And this would make sense even without the (limited) warranty extension, and I actually consider it more important.

em-bee 3/29/2025||
> if the extended warranty is limited to defects introduced by the remote change

yes, of course. it may be hard to distinguish, though. the device getting hot may put additional stress on the mainboard, RAM, or other parts, causing them to break faster.

haswell 3/29/2025||||
I don’t think relying on new laws as the primary incentive is going to get far, especially when big tech has the outsized influence they do on government. This then just leaves a strong incentive to stop updating things.
ginko 3/29/2025|||
Sounds like a good way to make sure that companies drop SW support as quickly as possible.
tossandthrow 3/29/2025|||
Not if it is accompanied by laws setting a minimum bar for the functioning of software, e.g. regular security patches.

However, this would be a great way of separating hardware and software products - and would that be so bad?

exe34 3/29/2025||||
As long as they release all the source and build tools, I'm okay with that.
TeMPOraL 3/29/2025|||
They already do.
ginko 3/29/2025||
If OP is in Europe Google could be drawn and quartered for GDPR violations.
kleiba 3/29/2025|||
"could" being the point here - as Joe Bloggs, you're not going to get yourself into a legal dispute with Google, but it's not very hard to return an electronic device as an ordinary end consumer while it's still under warranty.
dgellow 3/29/2025|||
Matthyze 3/29/2025||||
That's not how the GDPR works. Just as in criminal procedure, subjects do not sue alleged offenders themselves. The state instead acts on their behalf.
nottorp 3/29/2025|||
You don't need to sue Google, just file a complaint with whatever the GDPR authority is in your country ...
kleiba 3/29/2025|||
I'd be interested to know if anyone on HN has actually done that in the past, and what the experience was?
robin_reala 3/29/2025||
Confiks managed to get Spotify to reinstate the API that allowed connections to SongShift with a GDPR complaint about the right to data portability (or threat thereof): https://news.ycombinator.com/item?id=24764371
zazazache 3/29/2025|||
Better to just contact NOYB directly, they are responsible for about 50% of all GDPR fines to date. Outside of Norway the DPAs seem mostly useless, especially when it comes to big tech…
mardifoufs 3/29/2025||||
GDPR forbids local OCR? Can you be more specific?
hilbert42 3/29/2025|||
Exactly!
rs186 3/29/2025||
I flashed UEFI firmware on my Chromebook to use Linux. I have been wondering if that's the correct decision, given a number of compatibility issues I have run into on that distro. But seeing this, I know I can live with those issues but not with Chrome.
tjpnz 3/29/2025||
If OP is a Kiwi he should be covered under the Consumer Guarantees Act.

https://www.consumerprotection.govt.nz/general-help/consumer...

bri3d 3/29/2025||
The source is available: https://chromium.googlesource.com/chromium/src/+/refs/tags/1...

It’s not “training an AI model on screen contents without consent.”

It is a stupid feature for Google to enable by default: likely what’s making OP’s machine useless is that it’s running an OCR inference model on the OP’s images to index them for search.
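For intuition, here's a minimal sketch of what OCR-based indexing for local image search could look like. This is entirely illustrative: ChromeOS's ScreenAI has its own on-device model and internal APIs, and the `ocr` stub below merely stands in for that inference step.

```python
from dataclasses import dataclass, field


def ocr(image_bytes: bytes) -> str:
    # Stand-in for on-device OCR inference; a real implementation
    # would run a vision model over the image here.
    return image_bytes.decode("utf-8", errors="ignore")


@dataclass
class LocalImageIndex:
    # Inverted index: word -> set of image paths containing that word.
    index: dict = field(default_factory=dict)

    def add(self, path: str, image_bytes: bytes) -> None:
        # OCR each image once at index time; queries are then cheap.
        for word in ocr(image_bytes).lower().split():
            self.index.setdefault(word, set()).add(path)

    def search(self, word: str) -> set:
        return self.index.get(word.lower(), set())


idx = LocalImageIndex()
idx.add("screenshot1.png", b"Invoice total 42 dollars")
idx.add("photo2.png", b"beach sunset")
print(idx.search("invoice"))  # → {'screenshot1.png'}
```

The expensive part is the `ocr` call over every image at index time, which is consistent with OP's machine heating up while the feature churns through local files.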

Go to chrome://flags and disable “Enable OCR For Local Image Search” and I bet the problem goes away. The AI Service does have a few other features, but that’s the one that’s likely to be cooking the machine.

As for the other comments on this thread, I doubt there’s anything to do with GDPR here. It’s all local.

xg15 3/29/2025|
This also seems crazy to me on a technical level: OP says this operates on the contents of the screen. So essentially the OS already has the text in memory, renders it to the frame buffer, then OCRs it back out - I suppose because it's "cheaper" in dev time to just slap OCR on a screenshot than to spend some time looking up what's already accessible in memory and through the UI toolkits.

CPU time is indeed cheaper than dev time, especially if it's your users' CPUs and not yours.

bri3d 3/29/2025||
Well, kind of, I don’t think it’s that crazy. This service does two things:

* Performs image OCR on images, generically. This is then used for several features: “I type a word in the search box and it can look through my screenshots and photos,” “I’m in one of those horrible scanned image-only PDFs and I want to search,” and so on.

* Performs “main content extraction” on websites by using a screenshot of the website _alongside_ the accessibility tree for that website’s structure. It basically says “given this tree of elements and screenshot, can you prune the tree to just the elements a user would care about.” The fact that this is necessary is more an indictment of the DOM than this feature, IMO :)
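A hedged sketch of that second idea: given an accessibility-style tree and the set of node ids a vision model judged salient (simulated here as a plain set), prune the tree down to what a reader cares about. All names are illustrative, not ScreenAI's actual API.

```python
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class Node:
    id: int
    role: str  # e.g. "page", "nav", "main", "paragraph", "ad"
    children: List["Node"] = field(default_factory=list)


def prune(node: Node, salient: set) -> Optional[Node]:
    # Keep a node if it, or any descendant, was marked salient;
    # everything else (nav chrome, ads) drops out of the tree.
    kept = [c for c in (prune(ch, salient) for ch in node.children) if c]
    if node.id in salient or kept:
        return Node(node.id, node.role, kept)
    return None


tree = Node(0, "page", [
    Node(1, "nav", [Node(2, "link")]),
    Node(3, "main", [Node(4, "paragraph"), Node(5, "ad")]),
])

# Pretend the model flagged only node 4 as main content.
result = prune(tree, salient={4})
# Only the path page -> main -> paragraph survives.
```

The point is that the screenshot and the accessibility tree are complementary inputs: the tree supplies structure, the model supplies the judgment of which subtrees matter.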

xg15 3/29/2025||
Ah OK, that seems more sensible. My understanding was that it would pretty much literally take screenshots and then run OCR on them. If there is enough smartness to only run OCR on the parts that have no text information, it makes more sense. (Though as we see here, even that approach can be too much.)
Animats 3/29/2025||
Is this the "PDF Searchify" feature?[1]

[1] https://windowsreport.com/chromes-new-feature-makes-scanned-...

scq 3/29/2025|
Looking at the Chromium source, PDF Searchify indeed uses this service (search for "ENABLE_SCREEN_AI_SERVICE").
ur-whale 3/29/2025||
I'm trying to remember the last time some actually positive and surprising (in a good way) news came out of Google.

I seem to remember a time when they produced one of those every week.

Lately it seems to be mostly the kind of fuck-up and misstep this article talks about.

And I'm not even mentioning the cases where the misbehavior is actually willful.

meta-level 3/29/2025|
This is what Microsoft did with their Copilot thing, but they knew 95% would not care or even realize.