Posted by erohead 11/13/2025

Android developer verification: Early access starts (android-developers.googleblog.com)
1362 points | 676 comments
rom1v 11/13/2025|
I want to be able to install apps from alternative app stores like F-Droid and receive automatic updates, without requiring Google's authorization for app publication.

Manually installing an app via adb must, of course, be permitted. But that is not sufficient.

> Keeping users safe on Android is our top priority.

Google's mandatory verification is not about security, but about control (they want to forbid apps like ReVanced that could reduce their advertising revenue).

When SimpleMobileTools was sold to a shady company (https://news.ycombinator.com/item?id=38505229), the new owner was able to push any user-hostile changes they wanted to all users who had installed the original app through Google Play (that's the very reason why the initial app could be sold in the first place, to exploit a large, preexisting user base that had the initial version installed).

That was not the case on F-Droid, which blocked the new user-hostile version and recommended the open source fork (Fossify Apps). (see also this comment: https://news.ycombinator.com/item?id=45410805)

ferguess_k 11/13/2025||
Yes, it's all about control. Control the platform. Control the access to the platform, and the world is your oyster. And the political and legislative systems are their friends. It is the establishment.

The only way to fight is to indoctrinate the next generation, at home and in school, to use FOSS. People tend to stick to whatever they used in childhood. We software engineers should volunteer to give talks to students about this. It is much easier to sell ideologies to young people while they are still rebellious toward institutions.

wiz21c 11/14/2025|||
I agree with you. But you do realize that it's been like that for about 20 years now. It started because of Microsoft (proprietary software), then Google (proprietary platform), now ChatGPT (proprietary knowledge).

And I tried to tell my kids. It mostly failed.

But in the long run (a decade), what is exceptional and proprietary will become common FOSS. And everybody will benefit.

ferguess_k 11/14/2025||
I envision this as an ideology. We don't need every kid to follow it, and I don't expect the majority to. 1-2% is good enough. That's why giving talks to teenagers might be the best bang for the buck. There are always kids who need to escape into some cool ideas, and it could be the idea of FOSS.
Workaccount2 11/13/2025||||
Really it's probably the dumbass judge that told Google "The Apple App Store isn't anti-competitive because they don't allow any competitors on their platform" when Google asked why the Play Store was ruled a monopoly and the App Store wasn't.

I cannot think of a more detached and idiotic ruling than that.

viktorcode 11/14/2025|||
US antitrust legislation punishes the abuse of monopoly power, not being a monopoly in itself. Google was found guilty of leveraging its dominant position on the platform to do just that.

On the other hand, in the US, Apple's App Store was not found to be a monopoly in the first place. Separate cases about abuse of a dominant position also didn't go far.

Gunax 11/13/2025||||
Hmm, having read that, I am starting to sympathize with Google if they are going to be punished for being open.

No one seems to care that Apple has never allowed freedom on their devices. Even the comments here don't seem to mention it. Google was at least open for a while.

Or maybe no one mentions it just because the closed iPhone is a fait accompli at this point.

neuralRiot 11/13/2025||
Perhaps because Apple never "promised" to be open. Google, by contrast, built itself up by playing the good guy and started to switch when money called, so those who chose them for that reason feel betrayed.
ferguess_k 11/13/2025||||
I guess they are going to say whatever proves the case. The legal system is highly... closed and shuns laymen.
tenuousemphasis 11/13/2025||
It's the JUDGE that came up with that reasoning.
xxpor 11/13/2025|||
Because that's the law, like it or not. Apple doesn't have a problem because the rules were the rules from day 1. Google did a bait and switch, legally.
AnthonyMouse 11/14/2025||
What does antitrust law have to do with "day 1"? So if Ford and GM are both already in all 50 states and then they try to divide up territory between them, that's illegal, but at the point when there were still areas one of them wasn't in, they could publicly announce a contractual agreement to not enter into the other's territory? That seems not just questionable but actively bad policy with an enormous perverse incentive.

And if you're going to say this:

> Because that's the law, like it or not.

I would ask you to point me to the text in the statute requiring the courts to do that.

ferguess_k 11/13/2025|||
Yeah it's the judge.
saturnite 11/13/2025||
I think you missed the point that judges aren't part of the legislative branch. They're in the judicial branch.
ferguess_k 11/14/2025||||
Please allow me to correct my bad English and replace it with "the law circle".
gudnuff 11/13/2025|||
[dead]
Spivak 11/13/2025|||
But the ruling is correct. You can't have it both ways, if you invite competition you're not allowed to be anti-competitive. You can be Nintendo, offer a single store, only allow first party hardware, and exercise total control over your product. Then your anticompetitive behavior can only be evaluated externally. But if you open yourself up to internal competition with other phone vendors, other stores, and then you flex your other business units (gapps) to force those other vendors to favor you then you're in big trouble.
chii 11/14/2025|||
> But the ruling is correct. You can't have it both ways, if you invite competition you're not allowed to be anti-competitive

That's just stupid, because being anti-competitive is an emergent outcome, rather than anything specific.

Apple is definitely anti-competitive, but they exploited such a ruling so that they can skirt it. Owning a platform that no other entrants are allowed onto is anti-competitive - whether you're small or large. It's only when you're large that you should become a target for antitrust enforcement. This allows small players to grow, but always facing the threat of a purge - which makes them wary of trying to take too much advantage, and that results in better consumer outcomes.

amne 11/14/2025||
That's like Karcher opening a megamall to sell their whole offering - vacuums, pressure washers, floor washers, you name it - and then you, Bosch, complaining you can't sell your vacuum in Karcher's megamall where all the people go.

What are you even saying?

Whereas Google was letting Bosch sell vacuums in their megamall, but only if it uses Google dust filters, people buy only Google-made dust filters, and Bosch isn't allowed to sell its own dust filters in the megamall.

AnthonyMouse 11/14/2025|||
It's like a company buying all the land within a 100 mile radius and then nominally "selling" plots to people but with terms of service attached that restrict what you can do with the land you bought and that allow the company to change the terms at any time. And then, after people have moved in, most of them having not even read the terms or realized it wasn't an ordinary sale, they start enforcing the terms against competitors. Which most people don't notice because they aren't competitors, and because the terms also prohibited anyone in the city from telling people what's going on[1]. Then people eventually notice and start to ask whether terms locking out competitors like that are an antitrust violation, and someone says that they're not because the people there agreed to them.

[1] https://som.yale.edu/sites/default/files/2022-01/DTH-Apple-n...

But how is an agreement prohibiting people from patronizing competitors not an antitrust violation? It's not a matter of who agreed to it, it's matter of what they're requiring you to agree to.

astafrig 11/14/2025||
> nominally "selling" plots to people but with terms of service attached that restrict what you can do with the land you bought and that allow the company to change the terms at any time.

So, a lease.

AnthonyMouse 11/14/2025||
That's, to begin with, not even how a lease generally works. A lease isn't where you pay once up front to take permanent possession of something.

Moreover, did people buying iPhones on "day 1" think they were buying them or leasing them? Did Apple call it a sale or a rental agreement?

chii 11/14/2025|||
> Karcher opening a megamall to sell all their offering

And their mall is monopolistic if it is only for Karcher products. However, because a competitor can easily open a mall next door, it means this Karcher mall is small, and so the enforcers should leave it be. Until the day Karcher buys up all the mall space, in which case, they (regulators) start purging their mall monopoly.

The threat of being purged because you've acquired a large enough monopoly should _always_ be there. It's part of doing business in a fair environment.

AnthonyMouse 11/14/2025|||
> You can be Nintendo, offer a single store, only allow first party hardware, and exercise total control over your product.

How is this not even more anti-competitive?

It's fine to be mad at Google for being duplicitous, but treachery is in the nature of false advertising or breach of contract. Antitrust is something else.

"You can monopolize the market as long as you commit to it from the start" seems like the text of the law a supervillain would be trying pass in order to destroy the world.

Spivak 11/14/2025||
You can't monopolize a market where there is no market. Nintendo can be anticompetitive in the wider games industry, but there is no market for software that runs on a Switch.

I didn't say I liked the ruling, just that it's correct. The opposite conclusion would be absurd, that you can invent a market where there isn't one and claim a company has a monopoly over it. You would be asking the court to declare that every computing device is a de facto marketplace for software that could run on it and that you can't privilege any specific software vendor. I would love if that were true but you can hopefully agree that such a thing would be a huge stretch legally.

AnthonyMouse 11/14/2025||
> You can't monopolize a market where there is no market. The opposite conclusion would be absurd, that you can invent a market where there isn't one and claim a company has a monopoly over it.

There is no such thing as "there is no market". There is always a market. The question is, what's in the market? The typical strategy is to do the opposite -- have Nintendo claim that they're competing with Sony and Microsoft in the same market to try to claim that it isn't a monopoly.

But then the question is, are they the same market? So to take some traditional examples, third party software that could run on MS-DOS could also run on non-Microsoft flavors of DOS. OS/2 could run software for Windows. The various POSIX-compliant versions of Unix and Linux could run the same software as one another. Samsung phones can run the same apps as Pixel phones. Which puts these things in the same market as each other, because they're actually substitutes, even though they're made by different companies.

Conversely, you can't run iOS apps on Android or get iOS apps from Google Play or vice versa. It's not because they're different companies -- both of them could support both if they wanted to -- it's that they choose not to and choices have consequences.

If you intentionally avoid competing in the same market as another company then you're not competing in the same market as that company and the absurdity is trying to have it both ways by doing that and then still wanting to claim them as a competitor.

Spivak 11/14/2025||
You avoided the important part, there is no market for hardware that can play Nintendo Switch games and there is no market for software providers on Nintendo Switch. And they are legally allowed to do that. You can sell appliances that are bound to a single vendor and you are allowed to not license your hardware or software to 3rd parties.

Since that is a legally permissible action it would be an odd thing for a court to declare that doing such a thing is anticompetitive. If they did they would be declaring all locked down hardware effectively illegal. And while that might be nice it's a bit of a pipedream. Where Google fucked up is that they did license their software to 3rd parties—good for them. But then Google had some regrets and didn't like the fact that they didn't have control over those 3rd parties. But they did have some leverage in the form of Google Play and GMS because users expect it to be there on every Android phone. And then they used that leverage. That's the fuckup. They used Google Play and GMS access to make 3rd parties preinstall Chrome and kill 3rd party Android forks. They used anticompetitive practices on their competitors—other Android device manufacturers.

This situation can't occur for Apple or Nintendo because there aren't other iOS/Switch device manufacturers and they don't have to allow them to exist. They can be anticompetitive for other reasons but not this.

AnthonyMouse 11/15/2025||
> You avoided the important part, there is no market for hardware that can play Nintendo Switch games and there is no market for software providers on Nintendo Switch.

There is a market for these things. Nintendo sells hardware that can play Nintendo Switch games and people buy it. That's a market.

It seems like you're trying to claim that a monopoly isn't a market, but how can that possibly be how antitrust laws work? Your argument is that they don't apply to something if it is a monopoly?

> And they are legally allowed to do that.

That's just assuming the conclusion. Why should it be legal for them to exclude competitors from selling software to their customers? The obviously anti-competitive thing should obviously be a violation of any sane laws prohibiting anti-competitive practices. The insanity is the number of people trying to defend the practice.

Consider what it implies. 20th century GE could have gone around buying houses, installing a GE electrical panel and then selling the houses with a covenant that no one could use a non-GE appliance in that house ever again, or plug in any device that runs on electricity without their permission. They could buy and sell half of all the housing stock in the country and Westinghouse the other half and each add that covenant and you're claiming it wouldn't be an antitrust violation.

Apple wouldn't have been able to get their start because they'd have needed permission from GE or Westinghouse for customers to plug in an Apple II or charge an iPhone and they wouldn't get it because those companies were selling mainframes or flip phones and wouldn't want the competition. If that's not an antitrust violation then we don't have antitrust laws.

> If they did they would be declaring all locked down hardware effectively illegal.

It's fine for hardware to be locked down by and with the specific permission of the person who owns it. But how is it even controversial for the manufacturer locking down hardware for the purpose of excluding competitors to be a violation of the laws against inhibiting competition? It's exactly the thing those laws are supposed to be prohibiting.

sylos 11/14/2025||||
So basically you're saying we're fucked. People don't care about FOSS in general, let alone when their phone says it's dangerous.
johnnyanmac 11/14/2025|||
If people cared about privacy as much as politics pretends they do, we'd have solved so many problems in society.

Fortunately, those fighting, albeit a minority, have done great work in protecting this. No reason to stop now.

ferguess_k 11/14/2025|||
Yeah we are fucked, but as long as a small percentage of us, like 1% of the population knows, understands and agrees with the idea I think we are fine.
eth0up 11/14/2025||
We'll have to initiate a solid self-defense protocol though. I think the first thing we should do is get a new logical fallacy term officially coined.

Stallman/StallManned: abusing the principles of the Slippery Slope to discredit perfectly rational predictions.

fithisux 11/14/2025||||
Really difficult, because you need to have two devices.

One mandated by the establishment and one guided by vision and freedom.

But it would be a great start.

On my work laptop I am mandated to use Windows 11, but I run FOSS (and, when I have time, develop it).

motbus3 11/16/2025|||
Imagine needing to agree to a TOS that can lock you out of your phone when they change or add some random new policy
leoedin 11/13/2025|||
I don't really see how you can both allow developers to update their apps automatically (which is widely promoted as being good security practice) and also defend against good developers turning bad.

How does Google know if someone has sold off their app? In most cases, F-Droid couldn't know either. A developer transferring their accounts and private keys to someone else is not easily detected.

jlokier 11/13/2025|||
> In most cases, F-Droid couldn't know either.

F-Droid is quite restrictive about what kinds of apps they accept: they build the app from source code themselves, and the source code must be published under a FLOSS license. They have some checks that have to pass for each new version of an app.

Although it's possible for a developer to transfer their accounts and private keys to someone shady, F-Droid's checks and open source requirements limit the damage the new developer can do.

https://f-droid.org/docs/Inclusion_Policy/

https://f-droid.org/docs/Anti-Features/

Sophira 11/13/2025|||
One thing worth noting: these checks and restrictions only apply if you're using the original F-Droid repository.

Many times I've seen the IzzyOnDroid repository recommended, but that repo explicitly gives you the APKs from the original developers, so you don't get these benefits.

Zak 11/13/2025||
That's true. The whole point of an open ecosystem is that you get to decide who you get your software from. You can decide on the official F-Droid repository and get the benefits and drawbacks of a strict open source rule with the F-Droid organization's curation if that's your preference. You can add other repositories with different curation if you prefer that.
Hizonner 11/13/2025|||
You know what? That's bullshit.

Anybody slightly competent can put horrendous back doors into any code, in such a way that they will pass F-Droid's "checks", Apple's "checks", and Google's "checks". Source code is barely a speed bump. Behavioral tests are a joke.

johnnyanmac 11/14/2025||
Anyone determined enough can break into any house. If not through ingenuity, then by a brick through your window. Doesn't mean we shouldn't lock our doors, turn off our lights, and close our curtains anyway.

The fortunate thing is that 99% of people won't bother trying to break your app if it's not dead simple. Advanced security mechanisms to check for backdoors are probably something only billionaire tech companies need to worry about.

Hizonner 11/14/2025||
You totally misunderstand the threat model. It's not about anybody breaking your app. It's about people making their own apps do things they're not supposed to do.

... and there's always a tradeoff in terms of how much of a deterrent anything is. The app store checks are barely measurable.

johnnyanmac 11/14/2025||
The app store checks are barely measurable, yes. Hence why being open source is the best check for any undocumented changes. Even if it's not discovered on F-Droid, reports will come out for those who dig. Much easier to view source code than to decompile an APK and analyze it.

But at some point there needs to be some level of trust in anything you install. You can't rely on institutions to make sure everything is squeaky clean. They can't even do that on content platforms (or at least, they choose not to afford it).

bogwog 11/13/2025||||
> In most cases, F-Droid couldn't know either. A developer transferring their accounts and private keys to someone else is not easily detected.

1. The Android OS does not allow installing app updates if the new APK uses a different signing key than the existing one. It will outright refuse, and this works locally on the device. There's no need to ask some third party server to verify anything. It's a fundamental part of how Android security works, and it has been like this since the first Android phone ever released.

2. F-Droid compiles all APKs on its store, and signs them with its own keys. Apps on F-Droid are not signed by the developers of those apps. They're signed by F-Droid, and thus can only be updated through and by F-Droid. F-Droid does not just distribute APKs uploaded by random people, it distributes APKs that F-Droid compiled themselves.

So to answer your question, a developer transferring their accounts/keys to someone else doesn't matter. It won't affect the security of F-Droid users, because those keys/accounts aren't used by F-Droid. The worst that can happen is that the new owner tries injecting malware into the source code, but F-Droid builds apps from source and is thus positioned to catch those types of things (which is more than can be said about Google's ability to police Google Play).
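Roughly, the signer identity the OS pins can be read back like this (a Kotlin sketch, assuming API 28+; names here are illustrative, not from any particular app):

  import android.content.Context
  import android.content.pm.PackageManager
  import java.security.MessageDigest

  // Sketch: read the SHA-256 digests of the certificates an installed package is
  // currently signed with. These are the identities the platform pins -- an update
  // APK signed by a different key is rejected by the package installer itself,
  // before any store-level policy is involved.
  fun signerDigests(context: Context, packageName: String): List<String> {
      val info = context.packageManager.getPackageInfo(
          packageName, PackageManager.GET_SIGNING_CERTIFICATES
      )
      val md = MessageDigest.getInstance("SHA-256")
      return info.signingInfo.apkContentsSigners.map { sig ->
          md.digest(sig.toByteArray()).joinToString("") { "%02x".format(it) }
      }
  }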

And finally,

> How does Google know if someone has sold off their app?

Google should not know anything about the business dealings of potential competitors. Google is a monopoly[1], so there is real risk for developers and their businesses if Google is given access to this kind of information.

[1]: https://www.google.com/search?q=is+google+a+monopoly%3F&udm=...

saturnite 11/13/2025|||
Android also has the feature of warning the user if an update comes from a different source than the one the app was installed from. This will happen even if they have the same key. This reply isn't trying to argue against anything you've said; I'm just adding to the list of how Android handles updates.
AshamedCaptain 11/13/2025||||
> F-Droid compiles all APKs on its store, and signs them with its own keys. Apps on F-Droid are not signed by the developers of those apps. They're signed by F-Droid, and thus can only be updated through and by F-Droid. F-Droid does not just distribute APKs uploaded by random people, it distributes APKs that F-Droid compiled themselves.

For most programs I use, they just publish the developer's built (and signed) APK. They do their own build in parallel and ensure that the result is the same as the developer's build (thanks to reproducible builds), but they still end up distributing the developer's APK.
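As a rough sketch of the idea only (the real F-Droid verification tooling is more involved; as I understand it, it re-applies the developer's signature to the rebuilt APK and re-verifies it), comparing two APKs while ignoring the signature files looks something like this:

  import java.security.MessageDigest
  import java.util.zip.ZipFile

  // Hash every entry of two APKs except the signature files under META-INF/,
  // then compare. A reproducible build means the locally rebuilt APK matches
  // the developer's signed one everywhere but there.
  fun contentsMatch(devApk: String, rebuiltApk: String): Boolean {
      fun digests(path: String): Map<String, String> {
          val result = mutableMapOf<String, String>()
          ZipFile(path).use { zip ->
              val entries = zip.entries()
              while (entries.hasMoreElements()) {
                  val entry = entries.nextElement()
                  if (entry.isDirectory || entry.name.startsWith("META-INF/")) continue
                  val md = MessageDigest.getInstance("SHA-256")
                  zip.getInputStream(entry).use { input ->
                      val buf = ByteArray(8192)
                      var n = input.read(buf)
                      while (n > 0) {
                          md.update(buf, 0, n)
                          n = input.read(buf)
                      }
                  }
                  result[entry.name] = md.digest().joinToString("") { "%02x".format(it) }
              }
          }
          return result
      }
      return digests(devApk) == digests(rebuiltApk)
  }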

bogwog 11/13/2025||
Can you give some examples? I've heard that's a thing, but I'm not familiar with any apps that actually pull it off (reproducible builds are difficult to achieve)
AshamedCaptain 11/13/2025||
Reproducible builds may be hard to achieve, but that doesn't mean you don't have a list of such builds long enough to crash your browser: https://verification.f-droid.org/verified.html
crumpled 11/13/2025||
Weird to have a page like that if a human can't use it. Needs some pagination, f-droid!

It's like we're supposed to save the page and grep it or something. Doesn't work in my Firefox.

nandomrumber 11/13/2025||||
You have to trust somebody.

Who is F-Droid? Why should I trust them?

How do I know they aren't infiltrated by TLAs (Three Letter Agencies), or outright bad actors?

Didn’t F-Droid have 20 or so apps that contained known vulnerabilities back in 2022?

Who are all these people? Why should I trust them, and why do most of them have no link to a bio or repository, or otherwise no way to verify they are who they say they are and are doing what they claim to be doing in my best interests?

https://f-droid.org/en/about/

bogwog 11/13/2025|||
I trust them, at least a lot more than I do Google, which is a known bad actor, and collaborator with "TLAs". F-Droid has been around for a very long time, if you didn't know. They've built and earned the trust people have in them today.

> Didn’t F-Droid have 20 or so apps that contained known vulnerabilities back in 2022?

Idk what specific incident you're referring to, but since they build apks themselves in an automated way, if a security patch to an app breaks the build, that needs to be fixed before the update can go out (by F-Droid volunteers, usually). In that case, F-Droid will warn about the app having known unpatched vulnerabilities.

Again, this is above and beyond what Google does in their store. Google Play probably has more malware apps than F-Droid has lines of code in its entire catalog.

nandomrumber 11/13/2025||
This incident

https://gitlab.com/fdroid/fdroiddata/-/merge_requests/11496

rpdillon 11/14/2025||
Right, that's literally the team marking 12 apps as having known vulnerabilities (seems like it was because of a WebRTC vulnerability that was discovered). It's the F-Droid system working as intended to inform users about what they're installing.

You're calling it an incident like it was an attack or something, but it just seems like everyday software development. Google Play and the App Store don't let me know when apps have known vulnerabilities. I think F-Droid is coming out way ahead here.

rpdillon 11/14/2025||||
So Google and Apple are already known to work with US government agencies. This was revealed in the Snowden leaks in 2013, and confirmed on multiple occasions since. Neither Google nor Apple tell you when apps you're downloading from the store contain known vulnerabilities. We know for a fact that both Google Play and the App Store are filled with scams and malware: it's widely documented.

So to my reading F-Droid comes out ahead on every metric you've listed: It has no known associations with US government agencies. They do inform you when your apps have known vulnerabilities. I'm not aware of any cases of scams or malware being distributed through F-Droid.

I highly recommend it. It's the main store I've been using on my phone for probably more than a decade now.

AshamedCaptain 11/13/2025||||
Because you can literally verify every single step of what they do. That's the reason you can trust them.

You cannot apply this logic to almost anyone else. Apple, Google, etc. can only give you empty promises.

botanical76 11/13/2025||||
I understand your concern, though your suspicion is a little shortsighted. It can be personally dangerous to volunteer for projects that directly circumvent the control of the establishment.
rstuart4133 11/13/2025|||
> Who is F-Droid? Why should I trust them?

For the same reason you trust many things. They have a long track record of doing the right thing. As gaining a reputation for doing the wrong thing would more or less destroy them, it's a fair incentive to continue doing the right thing. It's a much better incentive than many random developers of small apps in Google's Play store have.

However, that's not the only reason to trust them. They also follow a set of processes, starting with a long list of criteria saying what apps they will accept: https://f-droid.org/docs/Inclusion_Policy/ That doesn't mean malware won't slip past them on occasion, but if you look at the amount of malware that slips past F-Droid and projects with similar policies like Debian, and compare them to other app stores like Google's, Apple's and Microsoft's, there is no comparison. Some malware slips past Debian's defences once every few years. I would not be surprised if new malware is uploaded to Google's app store every few minutes. The others aren't much better.

The net outcome of all that is that open source distribution platforms like F-Droid and Debian, which have procedures in place like tight acceptance policies and reproducible builds, are by a huge margin the most reliable and trustworthy on the planet right now. That isn't saying they are perfect, but rather that if Google's goal is to keep their users safe, they should be doing everything in their power to protect and promote F-Droid.

> How do I know they aren’t infiltrated by TLAs? (Three Letter Agencies), or outright bad-actors.

You don't know for sure, but F-Droid's policies make it possible to detect if a TLA did something nefarious. The combination of reproducible builds, open source, and open source's tendency to use source code management systems that provide an audit trail showing who changed every line shines a lot of sunlight into the area. Sunlight those TLAs you're so paranoid about hate.

This is the one thing that puzzles me about F-Droid opposition in particular. Google is taking a small step here towards increasing accountability of app developers. But a single person signing an app is in reality a very small step. There are likely tens if not hundreds of libraries underpinning it, developed by thousands of people. That single developer can't monitor them all, and consequently libraries with malware inserted from upstream repositories like NPM or PyPI regularly slip through. The transparency the open source movement enforces is far greater. You can't even modify the amount of whitespace in a line without it being picked up by some version control system that records who did it, why they did it, and when. So F-Droid is complaining about a small increase in enforced transparency from Google, when they demand far, far more from their contributors.

I get that Google's change probably creates some paper-cuts for F-Droid, but I doubt it's something that can't be worked around if both sides collaborate. This blog post sounds like Google is moving in that direction. Hear, hear!

nandomrumber 11/13/2025||
> They also follow a set of processes, starting with a long list of criteria saying what app's they will accept

How is this an argument in favour of being able to run whatever software you want on hardware you own?

rstuart4133 11/13/2025||
You can run any software you like on Android, if it's open source. You just compile it yourself, and sign it with the limited distribution signature the blog post mentions. Hell, I've never done it, but re-signing any APK with your own signature sounds like it should be feasible. If it is, you can run any APK you want on your own hardware.

Get a grip. Yes, it might be possible the world is out to get you. But it's also possible Google is trying to do exactly what they say on the tin - make the world a safer place for people who don't know shit from clay. In this particular case, if they are trying to restrict what a person with a modicum of skillz can do on their own phone it's a piss poor effort, so I'm inclined to think it's the latter. They aren't even removing the adb app upload hole.

Someone 11/14/2025|||
>> In most cases, F-Droid couldn't know either. A developer transferring their accounts and private keys to someone else is not easily detected.

> 1. The Android OS does not allow installing app updates if the new APK uses a different signing key than the existing one. It will outright refuse, and this works locally on device

You missed the "and private keys" part of the original claim.

bogwog 11/14/2025||
No I didn't. Finish reading the rest of the comment.
lopis 11/13/2025||||
If an app updates to require new permissions, or to suddenly require network access, or the owner contact details change, Google Play should ideally stop that during the update review process and let the users know. But that wouldn't be good for business.
berkes 11/13/2025|||
An update can become malicious even without change in permissions.

E.g. my now perfectly fine QR reader already has access to the camera (obvious), media (to read a QR in an image file or photo) and network (enhanced security by on-demand checking the URL for me and showing OG previews etc., so I can make a more informed choice to open the URL).

But it could now start sending all my photos to train an LLM, or secretly take pictures of the inside of my home, or start mining crypto or whatnot. Without me noticing.

kuschku 11/13/2025||
See that's what the intent system was originally designed to prevent.

Your QR reader requires no media permission if it uses the standard file dialogs. Then it can only access files you select, during that session.

Similarly for the camera.

And in fact, it should have no network access whatsoever (and network should be a user controllable permission, as it used to be — the only reason that was removed is that people would block network access to block ads)
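For illustration, picking an image through the system picker needs no storage or media permission at all (a sketch using the AndroidX Activity Result API; the class and helper names below are made up):

  import android.net.Uri
  import androidx.activity.result.contract.ActivityResultContracts
  import androidx.appcompat.app.AppCompatActivity

  // Sketch: a QR reader that decodes a user-picked image needs no storage/media
  // permission. The system picker grants one-off read access to just the file
  // the user chose.
  class PickAndDecodeActivity : AppCompatActivity() {

      private val pickImage =
          registerForActivityResult(ActivityResultContracts.GetContent()) { uri: Uri? ->
              uri?.let { decodeQrFrom(it) }   // access is scoped to this Uri only
          }

      fun onScanFromFileClicked() {
          pickImage.launch("image/*")         // opens the standard picker UI
      }

      private fun decodeQrFrom(uri: Uri) {
          // contentResolver.openInputStream(uri) ... feed the bytes to a QR decoder
      }
  }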

berkes 11/14/2025|||
> And in fact, it should have no network access whatsoever (and network should be a user controllable permission, as it used to be — the only reason that was removed is that people would block network access to block ads)

Sure, a QR code scanner can work fine without network. But it could, for example, use the network to check a scanned URL against the "safe browsing API" or to pre-fetch the URL and show me a nice OG preview. You are correct to say you may not need nor want this. But I and others may like such features.

The point is not to discuss whether a QR scanner should have network access, but to say that once a permission is there for obvious or correct reasons, it can in future easily get abused for other reasons. Without changing the permissions.

My mail app needs network. Nothing prohibits it from abusing this after an update to pull in ads, or send telemetry to third parties. My sound recording app needs microphone permissions. Nothing prohibits it from "secretly" recording my conversations after an update (detectable since an LED and icon will light up).

If you want to solve "app becoming malicious after an update", permissions aren't the tool. They are a tiny piece of that puzzle, but "better permissions" aren't the solution either. Nor is "better awareness of permissions by users".

iggldiggl 11/14/2025|||
> See that's what the intent system was originally designed to prevent.

> Your QR reader requires no media permission if it uses the standard file dialogs. Then it can only access files you select, during that session.

On the one hand, yes, good point, but it runs into the usual problem with strict sandboxing – it works for the simple default use case, but as soon as you want to do more advanced stuff, offer a nicer UI, etc. etc. it breaks down.

E.g. barcode scanners – yes, technically you could send a media capture intent to ask the camera app to capture a single photo without needing the camera permission yourself, but then you run into the problem that maybe the photo isn't suitable enough for successful barcode detection, so you have to ask the user to take another picture, and perhaps another, and another, and…

So much nicer to request the camera permission after all and then capture a live image stream and automatically re-run the detection algorithm until a code has been found.
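For reference, the capture-intent variant mentioned above looks roughly like this (a sketch only; class names and the FileProvider authority are illustrative, and a matching provider entry in the manifest is assumed):

  import android.net.Uri
  import androidx.activity.result.contract.ActivityResultContracts
  import androidx.appcompat.app.AppCompatActivity
  import androidx.core.content.FileProvider
  import java.io.File

  // Sketch: the system camera app takes the photo, so this app never needs the
  // CAMERA permission. The trade-off is the one described above: no live preview
  // stream, so a blurry shot means asking the user to try again.
  class CaptureOnceActivity : AppCompatActivity() {

      private lateinit var photoUri: Uri

      private val takePicture =
          registerForActivityResult(ActivityResultContracts.TakePicture()) { ok: Boolean ->
              if (ok) tryDecodeBarcode(photoUri) else askUserToRetry()
          }

      fun onScanClicked() {
          val file = File(cacheDir, "scan.jpg")
          photoUri = FileProvider.getUriForFile(this, "$packageName.fileprovider", file)
          takePicture.launch(photoUri)
      }

      private fun tryDecodeBarcode(uri: Uri) { /* run detection; may fail on a bad shot */ }
      private fun askUserToRetry() { /* prompt for another attempt */ }
  }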

fauigerzigerk 11/13/2025||||
>...or to suddenly require network access...

That's the most baffling thing to me. There is simply no option to remove network permissions from any app on my Pixel phone.

It's one of the reasons why I avoid using mobile apps whenever I can.

uyzstvqs 11/13/2025|||
It's weird because GrapheneOS does have this. Networking is a permission on Android, but stock Android doesn't give you the setting.
dns_snek 11/13/2025||
I believe that permission is currently "leaky". The app can't access the network but it can use Google Play services to display ads.

I believe that would theoretically allow exfiltration of data but I don't understand all of the details behind this behavior and how far it goes.

OrangeMusic 11/13/2025||||
Google wants 0 friction for apps to display ads.
fauigerzigerk 11/13/2025|||
So does Apple apparently.
sharperguy 11/13/2025|||
What incentive is there for OEMs to not add this option though? Does Google refuse to verify their firmware if they offer this feature?
Arnt 11/13/2025|||
The network permission was displayed in the first versions of Android, then removed. I heard (hearsay alert) at the time that it was because so many apps needed it, and they wanted to get rid of always-yes questions. IIRC this happened before the rise of in-app advertising.

If people always answer yes, they grow tired and eventually don't notice the question. I've seen it happen with "do you want to overwrite the previous version of the document you're editing, which you saved two minutes ago?" At that point your question is just poisoning the well. Makes sense, but still, hearsay alert.

fauigerzigerk 11/13/2025|||
As far as I'm concerned they can grant this permission by default. I just want the power to disable it.

A while ago I wanted to scan the NFC chip in my passport. Obviously, I didn't want this information to leave my device.

There are many small utility apps and games that have no reason to require network access. So "need" is not quite the right word here. They _want_ network access and they _want_ to be able to bully users into granting it.

That's a weird justification for granting it by default. But I wouldn't care if I could disable it.

Arnt 11/13/2025||
Android doesn't grant this by default, strictly speaking. Rather, an application can enable it by listing it in the application manifest. Most permissions require a question to the user.

Did you find a suitable app? I don't really remember, but https://play.google.com/store/apps/details?id=com.nxp.taginf... might suit you.

fauigerzigerk 11/13/2025||
I did find one but it was years ago so I don't remember.
u8080 11/13/2025||||
Could have been easily solved by granting it by default, but I doubt that was the original intent.
Arnt 11/13/2025||
Well, the original intent was to ask the user for permission at installation time, which turned out to be a poor idea after a while. Perhaps you mean that it would have been simple to change the API in some particular way, while retaining compatibility with existing apps? If I remember the timeline correctly, which is far from certain, this happened around the same time as Android passed 100k apps, so a fairly strong compatibility requirement.
u8080 11/13/2025||
I mean, just make it "Granted" by default and give the user the ability to control it. The permissions API was already broken a few times (e.g. Location for Bluetooth and granular Files permissions).
cesarb 11/13/2025||||
> Does Google refuse to veriy their firmware if they offer this feature?

If a manufacturer doesn't follow the Android CDD (https://source.android.com/docs/compatibility/cdd), Google will not allow them to bundle Google's closed source apps (which include the Google Play store). It was originally a measure to prevent fragmentation. I don't know whether this particular detail (not exposing this particular permission) is part of the CDD.

rickdeckard 11/13/2025||
It's not explicitly part of the CDD, but implicitly. The device must support the Android permissions model and is only allowed to extend this implementation using its OWN permissions (in a different namespace than 'android'), but not allowed to deviate from it.

INTERNET is a "normal permission", automatically granted at install time if declared in the manifest. OEMs cannot change the grant behavior without breaking compatibility because:

The CDD explicitly states that the Android security model must remain intact. Any deviation would fail CTS (Compatibility Test Suite) and prevent Play certification.
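To see the "normal vs. dangerous" distinction in code (a small sketch, assuming API 28+ for PermissionInfo.protection):

  import android.content.Context
  import android.content.pm.PermissionInfo

  // Sketch: INTERNET reports "normal", i.e. auto-granted at install if declared
  // in the manifest, while CAMERA reports "dangerous" and needs a runtime prompt.
  fun describeProtection(context: Context, permission: String): String {
      val info = context.packageManager.getPermissionInfo(permission, 0)
      return when (info.protection) {
          PermissionInfo.PROTECTION_NORMAL -> "$permission: normal (granted at install)"
          PermissionInfo.PROTECTION_DANGEROUS -> "$permission: dangerous (runtime prompt)"
          else -> "$permission: signature/other protection level"
      }
  }

  // describeProtection(ctx, android.Manifest.permission.INTERNET)
  // describeProtection(ctx, android.Manifest.permission.CAMERA)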

rickdeckard 11/13/2025|||
Well, apart from the OEM violating the Android Compatibility Definition Document (CDD), failing the Compatibility Test Suite (CTS) and thus not getting their device Play-certified (so not being able to preload all the Google services), there is an economic impact as well:

As OEM you want Carriers to sell your device above everything else, because they are able to sell large volumes.

Carriers make money from network traffic, and Google pays revenue share for ads to carriers (and OEMs of a certain size). Carriers measure this as part of the average revenue per user (ARPU).

--> The device would be designed to create less ARPU for the Carrier and Google and thus be less attractive for the entire ecosystem.

bmacho 11/13/2025||||
It is solvable from user space.

E.g. TrackerControl (https://github.com/TrackerControl/tracker-control-android) can do it: it is a local VPN which sees which application is making a request and blocks it.

You can write your own version of it if you don't trust them.
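A bare-bones sketch of how such a user-space firewall hangs together (heavily simplified; the real apps parse raw IP packets from the tun device, and the service also needs a manifest entry plus user consent via VpnService.prepare()):

  import android.content.Intent
  import android.net.VpnService

  // Skeleton only: the OS routes device traffic into the tun interface established
  // below. Real apps (TrackerControl, NetGuard, ...) read raw IP packets from it,
  // attribute each connection to an app UID (Android 10+ offers
  // ConnectivityManager.getConnectionOwnerUid, mapped to package names via
  // PackageManager.getPackagesForUid), drop blocked apps' packets and forward the
  // rest over sockets excluded from the VPN with protect().
  class FirewallVpnService : VpnService() {

      override fun onStartCommand(intent: Intent?, flags: Int, startId: Int): Int {
          val tun = Builder()
              .setSession("user-space firewall")
              .addAddress("10.0.0.2", 32)   // private address for the tun interface
              .addRoute("0.0.0.0", 0)       // capture all IPv4 traffic
              .establish()
          // packet-pump loop over `tun` omitted
          return START_STICKY
      }
  }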

lopis 11/18/2025||
I've been using a similar VPN solution. It works great for apps that absolutely should not be connected, like my keyboard. But it has an obvious downside: you can't use a VPN on your phone while you're using that.
raxxorraxor 11/13/2025|||
Some apps would use this for loopback addresses, which as far as I know will then need network permission. The problem here is the permission system itself because ironically Google Play is full of malicious software.

And neither Android nor iOS is safer than modern desktop systems. On the contrary, because leaking data is its own security issue.

LtWorf 11/13/2025||
Wasn't the loopback address recently used maliciously?
salawat 11/13/2025||
Yes. Facebook/Meta was using a locally hosted proxy to get info smuggled back without using routes that are increasingly obstructed by things like ad blockers if I recall correctly.

https://securityonline.info/androids-secret-tracking-meta-ya...

Search string for DDG: Meta proxy localhost data exfiltration
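Conceptually the bridge is nothing more exotic than a loopback listener inside the app; a minimal sketch of the app-side half (the port and the line-based "protocol" are invented for illustration; the published reports describe WebRTC/HTTP variants):

  import java.net.InetAddress
  import java.net.ServerSocket
  import kotlin.concurrent.thread

  // Conceptual sketch: listen on 127.0.0.1 and accept data handed over by script
  // running in the device's browser. This is also why loopback-only traffic still
  // requires the INTERNET permission.
  fun startLoopbackBridge() = thread(isDaemon = true) {
      ServerSocket(12387, 50, InetAddress.getLoopbackAddress()).use { server ->
          while (true) {
              server.accept().use { client ->
                  val line = client.getInputStream().bufferedReader().readLine()
                  // correlate `line` (e.g. a web cookie or event id) with the
                  // identity the app already knows about its logged-in user
              }
          }
      }
  }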

asmor 11/13/2025||||
This is a huge problem in the Chrome Web Store and Google is doing very little about it. If you ever made an extension that is even just a little popular, expect to get acquisition offers from people who want to add malicious features - anything from click fraud to residential IP services or even password stealers.
babuskov 11/13/2025||
Same for Play Store. I have 2 games and I keep getting offers all the time. The last one offered $2000 for the developer account or a $100 monthly rent.

From their email pitch:

> We’re now offering from $500 to $2000 for a one-time purchase of a developer account that includes apps, or a rental deal starting from $100.

> No hidden conditions — quick process, secure agreement, and immediate payment upon verification.

> We’re simply looking for reliable accounts to publish our client apps quickly, and yours could be a perfect match.

bmacho 11/13/2025|||
Indeed, an update can't be more malicious than the permissions allow it to be. If you have a calculator app with limited permissions, it is "safe" to allow the developer to update it automatically. No danger in that.

But I don't think it is enough, or that it is the right model. In other cases, when the app already has dangerous permissions, auto-update should be a no-go.

cesarb 11/13/2025||
> Indeed, an update can't be more malicious than the permissions allow it to be.

...in the absence of sandbox escape bugs.

mid-kid 11/13/2025||||
> F-Droid couldn't know either

F-Droid is not just a repository and an organization providing the relevant services, but a community of like-minded *users* that report on and talk about such issues.

rixed 11/13/2025||||
> which is widely promoted as being good security practice

Maybe that's the mistake right there?

It is a good practice only as long as you can trust the remote source for apps. Illustration: it is a good security practice for a Debian distro, not so much for a closed source phone app store.

eMPee584 11/13/2025||
OPEN SOURCE EVERYTHING is the premier solution.. again.
Aissen 11/13/2025||||
By using the distributor model, where a trusted 3rd party builds & distributes the apps. Like every Linux distro or like what F-droid does.
GuB-42 11/13/2025||||
The point here is that app developers have to identify themselves. Google has no intention to verify the content of sideloaded apps, just that they are signed by a real person, for accountability.

They don't know if the person who signed the app is the developer, but should the app happen to be a scam and there is a police investigation, that is the person who will have to answer questions, like "who did you transfer these private keys to?".

This, according to Google and possibly regulators in countries where this will be implemented, will help combat a certain type of scam.

It shouldn't be a problem for YouTube Vanced, at least in the proposed form. The authors, who are already identified, just need to sign their APK. AFAIK, what they are doing is not illegal or they would have been shut down long ago. It may be a problem for others though, and particularly F-Droid: because F-Droid recompiles apps, they can't reasonably be signed by the original author.

The F-Droid situation can resolve itself if F-Droid is allowed to sign the apps it publishes, and in fact, doing that is an improvement in security as it can be a guarantee that the APK you got is indeed the one compiled by F-Droid from publicly available source code.

sharperguy 11/13/2025|||
APKs are already signed. Now Google requires that they be signed by a key which is verified by Google itself. Which means they can selectively refuse to verify whichever keys are inconvenient to them.
tcfhgj 11/13/2025||||
> Google has no intention to verify the content of sideloaded apps, just that it is signed by a real person, for accountability.

for now

raxxorraxor 11/13/2025|||
Still believe that signing binaries this way is always bullshit.

I stopped developing for mobile systems ages ago because it just isn't fun anymore and the devices are vastly more useless. As a user, I don't use apps anymore either.

But you can bet I won't ever id myself to Google as a dev.

bmacho 11/13/2025||||
> I don't really see how you can both allow developers to update their apps automatically (which is widely promoted as being good security practice) and also defend against good developers turning bad.

These are not compatible, but only because the first half is simply false. Allowing a developer to send updates is not "good" but "bad" security practice.

maybewhenthesun 11/13/2025||||
That's true in theory. But what you can see in practice is that Google does very little to protect their users, while F-Droid at least tries.

Which shows that the whole 'security' rigmarole by google is bullshit.

niutech 11/13/2025||||
In many cases developer e-mail address changes, IP address changes, billing address changes, tax ID changes...
rollcat 11/13/2025||
This exactly. Transferring ownership is a business transaction. Track that. If the new owner is trying to hide it, this is fraud, and should be dealt with in court.
IshKebab 11/14/2025||||
This is a big problem with Chrome extensions and Google hasn't done anything about it there, so I don't think they actually care about it. I'm not actually sure how you would solve that problem even theoretically.
4u00u 11/13/2025||||
To be fair, on Google Play you have the option to transfer the app to someone else's account. People don't need to trade accounts...
nandomrumber 11/13/2025||
That doesn’t help mitigate the class of attack you responded to.
fukka42 11/13/2025|||
Quite simple: Actual human review that works with the developers.

But this costs money, and the lack of it is proof Google doesn't really care about user security. They're just lying.

curtisnewton 11/13/2025|||
> without requiring Google's authorization for app publication.

Funnily enough, I am installing Google Drive for computers right now (macOS): I had to download a .pkg and basically sideload the app, which is not published on the Apple Store.

Why the double standard, dear Google?

curt15 11/13/2025|||
>I had to download a .pkg and basically sideload the app, which is not published on the Apple Store

You mean install the app? The fact that Apple and Google wish to suggest that software from outside their gardens is somehow subnormal doesn't mean other people need to adopt their verbiage.

curtisnewton 11/13/2025||
> You mean install the app?

Correct, I mean install the app.

Sideloading is the corporate jargon for "installing an app".

tom1337 11/13/2025||||
Probably because they require APIs which cannot be used when publishing to the App Store. The whole Microsoft Office suite is available in the macOS App Store - but Microsoft Teams must be downloaded from their website and cannot be installed via the App Store...
curtisnewton 11/13/2025||
> Probably because they require APIs which cannot be used when publishing to the AppStore

That's the funny part.

They do stuff they want to prohibit to other developers because "safety".

But we all know that Google can do massively more harm than scammers pushing their scammy apps to a few hundred people.

For example, in today's news "Google hit with EU antitrust investigation into its spam policy".

There's a bit of irony in it and a lot of hypocrisy, IMO.

jhasse 11/13/2025|||
Bad example because that .pkg was probably signed with a developer certificate with approval from Apple - just as would be the case on Android in the future.
Lapel2742 11/13/2025|||
> > Keeping users safe on Android is our top priority.

Somebody tell them that I do not want to be kept safe by Big Brother.

wiseowise 11/13/2025|||
Your personal data will be kept safe on our servers, citizen, whether you like it or not.
A4ET8a8uTh0_v2 11/13/2025|||
Enforcer, informing citizen on basic practices undermines citizen's delusion of being free. Please refer to room 22a for re-alignment and training.
m463 11/16/2025|||
> Your personal data will be kept safe on our servers, citizen, whether you like it or not.

... and our business partners. And app developers that grab your clipboard. And their business partners. and a few more levels of data brokers. The spi^H^H^H data-vacuum must flow

ThatMedicIsASpy 11/13/2025|||
EU did more by mandating 5 years of updates…
pxc 11/13/2025|||
And of course, code signing can't protect you from such a thing. When software publishing rights get bought, so (usually) do the signing keys.

Curation (and even patching) by independent, third-party volunteers with strong value commitments does protect users from this (and many other things). Code signing is still helpful for F/OSS distributions of software, but the truth is that most of the security measures related to app installation serve primarily to solve problems with proprietary app markets like Google's Play Store and Apple's App Store. Same thing with app sandboxing.

It's unfortunate but predictable when powerful corporations taint genuine security features (like anti-tampering measures, built-in encryption devices, code signing, sandboxing, malware scanning, etc.) by using them as instruments of control to subdue their competitors and their own users.

soulofmischief 11/13/2025|||
The entire SimpleMobileTools situation left such a bad taste in my mouth. No upfront communication, it had to be discovered in a GitHub issue thread after people started asking questions.

It was shady as fuck on Kaputa's part, especially given ZipoApps is an Israeli adware company, a.k.a. surveillance company, and given Israel's track record with things like using Pegasus against journalists/activists or blowing up civilian-owned beepers, this should automatically be a major security incident and at least treated as seriously as the TikTok debacle.

Kaputa should be extremely ashamed of himself and ousted from the industry. I and many others would have gladly paid a yearly subscription for continued updates of the suite instead of a one-time fee, but instead of openly discussing such a model with his userbase, he went for the dirtiest money he could find.

1vuio0pswjnm7 11/13/2025|||
If "automatic updates" were optional and off-by-default then users would not be vulnerable to something like SimpleMobileTools

Why not let the user decide

Letting someone else decide has potential consequences

Using F-Droid app ("automatic updates") is optional, as it should be

"Automatic updates" is another way of saying "allow somone else to remotely install software on this computer"

Some computer owners might not want that. It's their decision to make

I disable internet access to all apps by default, including system apps

When source code is provided I can remove internet access before compilation

Anyway, the entire OS is "user-hostile" requiring constant vigilance

It's controlled by an online ad services company

Surveillance as a business

binkHN 11/13/2025|||
> If "automatic updates" were optional and off-by-default then users would not be vulnerable to something like SimpleMobileTools

The problem is the vast majority of users want this on by default; they don't want to be bothered with looking at every update and deciding if they should update or not.

CodeMage 11/13/2025||
The vast majority of users want their apps to work. They don't care whether that happens through automatic updates or not.

It's the developers who don't want the headache of not having automatic updates.

1vuio0pswjnm7 11/14/2025|||
"Automatic updates" is "remote code execution (RCE)" by permission

Given the frequent complaints about the former, the notion of "permission" is dubious

jeroenhd 11/13/2025|||
> I want to be able to install apps from alternative app stores like F-Droid and receive automatic updates

That's actually possible, though app stores need to implement the modern API, which F-Droid doesn't seem to handle quite well (the basic version of F-Droid (https://f-droid.org/eu/packages/org.fdroid.basic/) seems to do better). Updating from different sources (e.g. downloading Signal from GPlay and then updating it from F-Droid or vice versa) also causes issues. But plain old alternative app stores can auto-update in the background. Could be something added in a relatively recent version of Android, though.
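For reference, the modern background-update path looks roughly like this (a sketch assuming Android 12+, the REQUEST_INSTALL_PACKAGES and UPDATE_PACKAGES_WITHOUT_USER_ACTION permissions, and that the store is the app's installer of record; the result-receiver plumbing is omitted):

  import android.content.Context
  import android.content.IntentSender
  import android.content.pm.PackageInstaller
  import java.io.File

  // Sketch: a store that meets the platform's conditions can commit an update
  // that installs without a confirmation dialog. The IntentSender receiving the
  // result is assumed to be set up elsewhere.
  fun startSilentUpdate(context: Context, apk: File, statusReceiver: IntentSender) {
      val installer = context.packageManager.packageInstaller
      val params = PackageInstaller.SessionParams(
          PackageInstaller.SessionParams.MODE_FULL_INSTALL
      ).apply {
          // API 31+: ask the platform to skip the user prompt where allowed
          setRequireUserAction(PackageInstaller.SessionParams.USER_ACTION_NOT_REQUIRED)
      }
      val sessionId = installer.createSession(params)
      installer.openSession(sessionId).use { session ->
          session.openWrite("base.apk", 0, apk.length()).use { out ->
              apk.inputStream().use { it.copyTo(out) }
              session.fsync(out)
          }
          session.commit(statusReceiver)
      }
  }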

If this Verified bullshit makes it through, I expect open source Android development to slowly die off. Especially for smaller hobbyist-made apps.

johnnyWx0021 11/13/2025||
[dead]
svat 11/13/2025||
From the very first announcement of this, Google has hinted that they were doing this under pressure from the governments in a few countries. (I don't remember the URL of the first announcement, but https://android-developers.googleblog.com/2025/08/elevating-... is from 2025-August-25 and mentions "These requirements go into effect in Brazil, Indonesia, Singapore, and Thailand".) The "Why verification is important" section of this blog post goes into a bit more detail (see also the line "We are designing this flow specifically to resist coercion, ensuring that users aren't tricked into bypassing these safety checks while under pressure from a scammer"), but ultimately the point is:

there cannot exist an easy way for a typical non-technical user to install “unverified apps” (whatever that means), because the governments of countries where such scams are widespread will hold Google responsible.

Meanwhile this very fact seems fundamentally unacceptable to many, so there will be no end to this discourse IMO.

thisislife2 11/13/2025||
I don't buy this argument at all that this specific implementation is under pressure from the government - if the problem is indeed malware getting access to personal data, then the very obvious solution is to ensure that such personal data is not accessible by apps in the first place! Why should apps have access to a user's SMS / RCS? (Yeah, I know it makes onboarding / verification easy and all, if an app can access your OTP. But that's a minor convenience that can be sacrificed if it's also being used for scams by malware apps).

But that kind of privacy-based security model is anathema to Google because its whole business model is based on violating its users' privacy. And that's why they have come up with such a convoluted implementation that further gives them control over a user's device. Obviously some governments too may favour such an approach, as they too can then use Google or Apple to exert control over their citizens (through censorship or denial of services).

Note also that while they are not completely removing sideloading (for now) they are introducing further restrictions on it, including gate-keeping by them. This is just the "boil the frog slowly" approach. Once this is normalised, they will make a move to prevent sideloading completely, again, in the future.

cesarb 11/13/2025|||
> Why should apps have access to a user's SMS / RCS?

It could be an alternative SMS app like TextSecure. One of the best features of Android is that even built-in default applications like the keyboard, browser, launcher, etc can be replaced by alternative implementations.

It could also be a SMS backup application (which can also be used to transfer the whole SMS history to a new phone).

Or it could be something like KDE Connect making SMS notifications show up on the user's computer.

thisislife2 11/13/2025|||
That's all indeed valid.

> One of the best features of Android is that even built-in default applications like the keyboard, browser, launcher, etc can be replaced by alternative implementations.

When sideloading is barred all that can easily change. If you are forced to install everything from the Google Play Store, Google can easily bar such things, again in the name of "security" - alternate keyboards can steal your password, alternate browsers can have adware / malware, alternate launcher can do many naughty things etc. etc.

And note that if indeed giving apps access to SMS / RCS data is really such a desirable feature, Google could have introduced gate-keeping on that to make it more secure, rather than gate-keeping sideloading. For example, their current proposal says that they will allow sideloading with special Google Accounts. Instead of that, why not make it so that an app can access SMS / RCS only when that option is allowed when you have a special Google Account?

The point is that they want to avoid adding any barriers where a user's private data can't be easily accessed.

AnthonyMouse 11/13/2025|||
> Instead of that, why not make it so that an app can access SMS / RCS only when that option is allowed when you have a special Google Account?

Because then you still need a special Google Account to install your app when it needs to access SMS / RCS.

How about solving this problem in a way that doesn't involve Google rather than the owner of the device making decisions about what they can do with it? Like don't let the app request certain permissions by default, instead require the user to manually go into settings to turn them on, but if they do then it's still possible. Meanwhile apps that are installed from an app store can request that permission when the store allows it, so then users have an easy way to install apps like that, but in that case the app has been approved by Google or F-Droid etc. And the "be an app store" permission works the same way, so you have to do it once when you install F-Droid but then it can set those permissions the same as Google Play.

It's not Google's job to say no for you. It's only their job to make sure you know what you're saying yes to when you make the decision yourself.

hanikesn 11/13/2025||
>instead require the user to manually go into settings to turn them on, but if they do then it's still possible

They clearly addressed this option in the post, under sufficient social engineering pressure these settings will easily be circumvented. You'd need at least a 24h timeout or similar to mitigate the social pressure.

AnthonyMouse 11/13/2025||
> They clearly addressed this option in the post, under sufficient social engineering pressure these settings will easily be circumvented. You'd need at least a 24h timeout or similar to mitigate the social pressure.

"Under sufficient social engineering pressure" is the thing that proves too much. A 24h timeout can't withstand that either. Nor can the ability for the user to use their phone to send money, or access their car or home, or read their private documents, or post to their social media account. What if someone convinces them to do any of those things? The only way to stop it is for the phone to never let them do it.

By the time you're done the phone is a brick that can't do anything useful. At some point you have to admit that adults are responsible for the choices they make.

hexage1814 11/13/2025||
>By the time you're done the phone is a brick that can't do anything useful. At some point you have to admit that adults are responsible for the choices they make.

Absolutely this! It's just nanny state all over again.

AnthonyMouse 11/13/2025|||
This is somehow even worse. It's strictly enforced with no regard for context; you don't have the constitutional rights you have against the government, and you can't vote them out.

Markets are supposed to be better because you can switch to a competitor but that only applies when there is actually competition. Two companies both doing the same thing is not a competitive market.

OptionX 11/13/2025||||
It'd just devolve into a security whack-a-mole over which permissions need the special account and which don't, ending with basically all of them requiring it, which makes it the same as just needing dev verification anyway for anything remotely useful.

And despite that, you're assuming that dev verification means no malware. The Play Store requires developers to register with the same verification measures we're talking about, and malware is hardly unheard of there.

john01dav 11/13/2025|||
> alternate keyboards can steal your password, alternate browsers can have adware / malware, alternate launcher can do many naughty things etc. etc.

It's plausible that Google has done some of these things, like doing some sort of data mining on everything that you type ("steal your password"), and many official Google apps have ads if you don't pay them.

thisislife2 11/13/2025||
Definitely. All mobile keyboards become keyloggers if you enable the spellcheck, autocomplete / suggestion, or any AI feature on them (because they need to collect data to "improve service"). Apple also has made changes to its mobile OS when it helps data collection. E.g. allowing messenger apps like WhatsApp to integrate with the Phone app ensures that Apple now knows who you call (voice / video) on WhatsApp.
KurSix 11/13/2025|||
I'm not sure it's entirely fair to say this is just Google flexing control
nandomrumber 11/13/2025||
Last year Australians reported losing AU$20 million to phishing attacks, and AU$318 million to scams of all types.

It stands to reason that financial service industry peak bodies are in conversation with governments and digital service providers, including data providers, to try to better protect users.

There are obvious conflicting goals, and the banks / governments can’t really appear to be doing nothing.

And technical users are almost certainly lacking a representative at the table, and are the group that has the least at stake. Whacko fringe software-freedom extremists, they probably call us.

ori_b 11/13/2025||
Does that mean that Google and the government are taking full legal liability for protecting me from scams?
BrenBarn 11/13/2025||||
Yeah. I mean the irony is that the one advantage of having a controlled and monitored app store would be that the entity monitoring it enforces certain standards. Games don't need access to your contacts, ever. If Google Play would just straight up block games that requested unnecessary permissions, it might have value. Instead we have 10,000 match-three games that want to use your camera and read all your data and Google is just fine with that. If the issue was access to personal data, a large proportion of existing apps should just be banned.
Groxx 11/14/2025||
I really think all permissions systems need what we had back in xposed/appops days:

Permissions should ~always be "accept (with optional filters)", "deny", and "lie". If the game wants contacts access and won't take no for an answer, I should be able to feed it a lie: empty and/or fake and/or sandboxed data. It's my phone and my data, not the app's.

We had it over a decade ago, xposed supported filtered and fake data for many permissions. It's strictly user-hostile that Android itself doesn't have this capability.
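
To make the "lie" option concrete, here's a hypothetical shim, not a real Android facility (frameworks like Xposed effectively patched something like this in), that answers contact queries with an empty result instead of real data:

    import android.content.ContentProvider
    import android.content.ContentValues
    import android.database.Cursor
    import android.database.MatrixCursor
    import android.net.Uri

    // Hypothetical illustration only: a provider that "lies" by returning an
    // empty contact list to any app that queries it.
    class EmptyContactsProvider : ContentProvider() {
        override fun onCreate(): Boolean = true

        override fun query(
            uri: Uri, projection: Array<String>?, selection: String?,
            selectionArgs: Array<String>?, sortOrder: String?
        ): Cursor? =
            // An empty cursor with whatever columns the caller asked for.
            MatrixCursor(projection ?: arrayOf("_id"))

        override fun getType(uri: Uri): String? = null
        override fun insert(uri: Uri, values: ContentValues?): Uri? = null
        override fun delete(uri: Uri, selection: String?, selectionArgs: Array<String>?): Int = 0
        override fun update(uri: Uri, values: ContentValues?, selection: String?, selectionArgs: Array<String>?): Int = 0
    }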

Groxx 11/13/2025||||
re OTPs, there's a special permission-less way to request sms codes, with a special hash in the content so it's clearly an opt-in by both app and sender: https://developers.google.com/identity/sms-retriever/overvie...

so no, it's not necessary at all. and many apps identify OTPs and give you an easy "copy to clipboard" button in the notification.

but that isn't all super widely known and expected (partly because not all apps or messages follow it), so it's not something you can rely on users denying access to.
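
A rough sketch of that flow, assuming the Play Services SMS Retriever client and its broadcast action (registering the receiver in the manifest or at runtime is omitted here):

    import android.content.BroadcastReceiver
    import android.content.Context
    import android.content.Intent
    import com.google.android.gms.auth.api.phone.SmsRetriever
    import com.google.android.gms.common.api.CommonStatusCodes
    import com.google.android.gms.common.api.Status

    // Sketch: the app never holds READ_SMS; it only ever sees messages that
    // carry its own 11-character app hash.
    class OtpReceiver : BroadcastReceiver() {
        override fun onReceive(context: Context, intent: Intent) {
            if (intent.action != SmsRetriever.SMS_RETRIEVED_ACTION) return
            val extras = intent.extras ?: return
            val status = extras.get(SmsRetriever.EXTRA_STATUS) as Status
            if (status.statusCode == CommonStatusCodes.SUCCESS) {
                val message = extras.getString(SmsRetriever.EXTRA_SMS_MESSAGE)
                // Parse the one-time code out of `message` and fill it in.
            }
        }
    }

    // Called before the backend sends the SMS containing the code.
    fun startListeningForOtp(context: Context) {
        SmsRetriever.getClient(context).startSmsRetriever()
    }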

omnifischer 11/13/2025||||
The Play Store is the one that contains the majority of the malware, and that's the main way people get it. I rarely hear of people who sideload having issues.

https://www.google.com/search?q=ars+technica+playstore+malwa...

reddalo 11/13/2025||
Installing apps from sources that are not the Play Store requires a bit of technical knowledge anyway. My grandma is not going to download a random APK and give all the necessary permissions to install it and run it.
nandomrumber 11/13/2025||
It’s been a few months since I used an Android device.

What was the process? Enable developer mode and grant ‘can install apps’ to a browser or file browser?

Am I remembering this correctly?

The only other step is to download a file from the internet, or otherwise receive one. That’s not a technical-knowledge step though

prmoustache 11/13/2025||
No, that is not done via developer mode. When you download or try to open an APK from any app, it asks you if you want to allow that app to install apps and sends you to the configuration dialog. You still have to validate the app installation manually through another dialog. In that case I usually leave the config dialog open while the app is installed, then disable the app permission right after install because that option is usually not easy to find. I usually only do it once on a new smartphone to install F-Droid from a browser, then allow F-Droid and Aurora Store permanently.

I think that is the part that should be fixed: users should be able to allow a one-time exception to avoid leaving that permission activated by mistake. I don't need to permanently allow a web browser to install apps.
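
For reference, this is roughly what that gate looks like from the installing app's side, as a sketch assuming Android 8+, the standard per-app "install unknown apps" setting, and a REQUEST_INSTALL_PACKAGES declaration in the app's manifest; there is indeed no one-time variant in the current API:

    import android.app.Activity
    import android.content.Intent
    import android.net.Uri
    import android.os.Build
    import android.provider.Settings

    // Sketch: a browser/file manager/store checks whether it may hand APKs to
    // the system installer, and if not, sends the user to its own toggle.
    fun ensureInstallPermission(activity: Activity) {
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O &&
            !activity.packageManager.canRequestPackageInstalls()
        ) {
            activity.startActivity(
                Intent(
                    Settings.ACTION_MANAGE_UNKNOWN_APP_SOURCES,
                    Uri.parse("package:${activity.packageName}")
                )
            )
        }
    }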

nandomrumber 11/13/2025||
Point being: it’s easier than my middle aged blue collar tradesman’s brain remembered it.

The comment I replied to tried to tell us some technical knowledge is required.

Doesn’t sound like it?

krzyk 11/13/2025||||
Because Tasker is fundamental for some. Those arguments are similar to "think of the children".
lern_too_spel 11/13/2025||||
> Note also that while they are not completely removing sideloading (for now) they are introducing further restrictions on it, including gate-keeping by them.

This blog post is specifically saying there will be a way to bypass the gatekeeping on Google-blessed Android builds, just as we wanted.

> But that kind of privacy based security model is anathema to Google because its whole business model is based on violating its users' privacy.

Despite this, they sell some of the most privacy-capable phones available, with the Pixels having unlockable bootloaders. Even without unlocking the bootloader to install something like GrapheneOS, they support better privacy than the other mass market mobile phones by Samsung and Apple, which both admittedly set a low bar.

JulianHC 11/13/2025||||
I concur.

If they are concerned about malware then one of the obvious solutions would be safeguarding their Play Store. There is significantly less scamming on iPhone because Apple polices its App Store. Meanwhile scam apps that I reported are still up on the Google Play Store.

miki123211 11/13/2025||||
> if the problem is indeed malware getting access to personal data, then the very obvious solution is to ensure that such personal data is not accessible by apps

Then you'd have the other "screaming minority" on HN show up, the "antitrust all the things" folks.

AnthonyMouse 11/13/2025||
The "let's actually enforce antitrust laws" people are in the majority:

https://today.yougov.com/economy/articles/47798-most-america...

https://www.antitrustinstitute.org/wp-content/uploads/2024/1...

nandomrumber 11/13/2025||
Your first link shows a graph that indicates more than 50% of Americans believe there is at least some competition, or a lot of competition; and that less than 1/3rd believe there is not enough, or no, competition in every sector of the economy that would be relevant to this discussion.

And that most Americans believe that bigger companies tend to have lower prices than smaller ones.

It’s not particularly clear then that there should be a lot of motivation to change things.

AnthonyMouse 11/13/2025||
You're choosing the questions that have framing issues:

> more than 50% of Americans believe there is at least some competition, or a lot of competition in every sector of the economy that would be relevant to this discussion.

We're talking about Google and Apple but the relevant category would be "technology companies". Do phone platforms or mobile app distribution stores have "a lot of competition"? It's hard to see how anybody could think that. Do games and AI and web hosting? Sure they do. But they're lumping them all together.

They're also using "some competition" as the second-to-highest amount of competition even though that term could reasonably apply to a market where one company has 90% market share but not 100%, and it's confusingly similar to "not much competition". And they're somehow showing oil and gas as having less competition than telecommunications when oil and gas is a textbook fungible commodity and telecommunications is Comcast. That question has issues.

> And that most Americans believe that bigger companies tend to have lower prices than smaller ones.

This is the thing where Walmart has lower prices than the mom and pop. That doesn't imply that Walmart has better quality or service than a smaller company, and it doesn't imply that Walmart is operating in a consolidated market. Retail is objectively competitive in most areas.

Whereas when a big company is in a consolidated market, "big companies tend to have lower prices" doesn't hold and you get Google and Apple extracting 30%.

Moreover, the relevant part of that link was this part: More than two thirds of people, including the majority of both parties, support antitrust laws, six times as many people think they're not strict enough as think they're too strict, and significantly more people agree with "the government should break up big tech" than disagree.

nandomrumber 11/13/2025||
On the other hand, maybe if the railways weren’t broken up the USA might have been crisscrossed with high speed rail by now.

Then we could argue how high speed rail would have been cheaper if the railways had been broken up.

PS I appreciate your thoughtful response, and your contributions to HN more generally.

AnthonyMouse 11/14/2025||
> On the other hand, maybe if the railways weren’t broken up the USA might have been crisscrossed with high speed rail by now.

Eh. The rails themselves are a natural monopoly in the same way roads are. It's one of the few things it makes sense to have the government build, or at least contract to have someone build, and then provide to everyone without restriction.

Meanwhile train cars and freight hauling and passenger service aren't any more of a natural monopoly than taxis or trucks. They get monopolized if someone is allowed to leverage a monopoly over the tracks into a monopoly over the rest of it, but that's unnecessary and undesirable. Separating them out allows the market that can be competitive to be competitive. Which is the same reason you don't want a tech monopoly leveraging it into control over ancillary markets that could otherwise be competitive.

There are two main reasons train service in the US is a shambles. The first is that the population density is too low, especially in the west. How many people do you expect to be riding a train from Boise to Des Moines on a regular basis? And the second is that truck drivers don't like freight rail, car companies don't like passenger rail and oil companies don't like either one, and they all lobby against anything that would make it better in the parts of the country where it could actually work. It's hard to make something good when there are millions of voters and billions of dollars trying to get it to suck.

notatoad 11/13/2025||||
>Why should apps have access to a user's SMS / RCS?

Can you imagine the outrage from all the exact same people who are currently outraged about developer verification if Google said they were cutting off any third-party app access to SMS/RCS?

trueismywork 11/13/2025|||
It's a fact even if you don't buy this.
Lammy 11/13/2025|||
Google have their own reasons too. They would love to kill off YouTube ReVanced and other haxx0red clients that give features for free which Google would rather sell you on subscription.

Just look at everything they've done to break yt-dlp over and over again. In fact their newest countermeasure is a frontpage story right beside this one: https://news.ycombinator.com/item?id=45898407

svat 11/13/2025|||
I can easily believe that Google's YouTube team would love to kill off such apps, if they can make a significant (say ≥1%) impact on revenue. (After all, being able to make money from views is an actual part of the YouTube product features that they promise to “creators”, which would be undermined if they made it too easy to circumvent.)

But having seen how things work at large companies including Google, I find it less likely for Google's Android team to be allocating resources or making major policy decisions by considering the YouTube team. :-) (Of course if Android happened to make a change that negatively affected YouTube revenue, things may get escalated and the change may get rolled back as in the infamous Chrome-vs-Ads case, but those situations are very rare.) Taking their explanation at face value (their anti-malware team couldn't keep up: bad actors can spin up new harmful apps instantly. It becomes an endless game of whack-a-mole. Verification changes the math by forcing them to use a real identity) seems justified in this case.

My point though was that whatever the ultimate stable equilibrium becomes, it will be one in which the set of apps that the average person can easily install is limited in some way — I think Google's proposed solution here (hobbyists can make apps having not many users, and “experienced users” can opt out of the security measures) is actually a “least bad” compromise, but still not a happy outcome for those who would like a world where anyone can write apps that anyone can install.

Zak 11/13/2025||
I would like a world where buying something means you get final say over how it operates even if you might do something dangerous/harmful/illegal.
miki123211 11/13/2025|||
I would like a world where I have the final say over whether I should have a final say.

One way to achieve this is to only allow sideloading in "developer mode", which could only be activated from the setup / onboarding screen. That way, power users who know they'll want to sideload could still sideload. The rest could enjoy the benefits of an ecosystem where somebody more competent than their 80-year-old nontechnical self can worry about cybersecurity.

Another way to do this would be to enforce a 48-hour cooldown on enabling sideloading, perhaps waived if enabled within 48 hrs of device setup. This would be enough time for most people to literally "cool off" and realize they're being scammed, while not much of an obstacle for power users.
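
A sketch of the decision rule being proposed here; nothing in it is a real Android API, it only spells out the 48-hour logic:

    import java.time.Duration
    import java.time.Instant

    // Hypothetical cooldown rule: enabling sideloading is instant during the
    // first 48 hours after device setup, otherwise it takes effect 48h later.
    val COOLDOWN: Duration = Duration.ofHours(48)

    fun sideloadingAllowedAt(deviceSetupTime: Instant, requestedAt: Instant): Instant =
        if (Duration.between(deviceSetupTime, requestedAt) <= COOLDOWN) {
            requestedAt                 // waived: still within the setup window
        } else {
            requestedAt.plus(COOLDOWN)  // cooling-off period applies
        }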

vrighter 11/13/2025|||
You can sideload, I mean INSTALL, software on any Linux desktop. Yet there are still tons of people saying that desktop Linux has gotten good enough for almost everyone's grandma to daily-drive.
stackghost 11/13/2025||
When everyone's Grandma is running Linux then the Indian scammers will know how to trick Grandma into thinking dmesg spam is "a virus" and just install this totally-not-malware, just like they do with the windows event viewer.

In other words, it's not any quality of Linux other than how niche it is.

Lammy 11/13/2025|||
It's an excellent example of the fruitlessness of technical solutions to people problems. Some people are just destined to get scammed, and it isn't worth throwing away General Purpose Computing to try to help them. Be present in Grandma's life and she won't be desperate to trust the nice man on the phone just to have someone to talk to. If it weren't this it would be iTunes gift cards, or Your Vehicle's Extended Warranty, or any number of other avenues.
uyzstvqs 11/13/2025|||
The actual stopping power here is that any grandma who uses a Linux desktop has a family member (or other contact) who helps with technical matters. They've been educated about internet & phone scams, and will immediately call their technical contact when anything is suspicious.
Zak 11/13/2025||||
This becomes a problem when someone asks me for help with their phone and I want to point them to some apps from F-Droid to reduce their exposure to surveillance marketing.

Of course that's a side effect Google probably wouldn't be sad about.

jraph 11/13/2025||||
These two solutions wouldn't work for me. My phone is covered, I use a custom ROM, but I like being able to help people install cool stuff that's not necessarily on the Play store, organically, without planning.
HumanOstrich 11/13/2025||||
I'm not sure I like the idea of "you have to wait 48 hours now for sideloading in case you are an idiot". Most idiots will then have sideloading on after 48 hours and still get hit with the next scam anyway.
curtisnewton 11/13/2025||||
> more competent than their 80-year-old nontechnical self can worry about cybersecurity

80-year-old nontechnical self can easily operate machines and devices that are much more complex and easily more dangerous than a smartphone.

And yet we're here pretending that those same people will install apps without even thinking about it.

Careless people are careless, we know that, we don't make them safer by treating everyone else like toddlers with a gun in their hands.

consp 11/13/2025|||
> which could only be activated from the setup / onboarding screen

Yea no. Now companies have to supply two phones, one for dev and one for calling. It is hard enough to get one...

Aurornis 11/13/2025||||
You’re still proving the point above, which is ignoring the fact that the restriction is specifically targeted at a small number of countries. Google is also rolling out processes for advanced users to install apps. It’s all in the linked post (which apparently isn’t being read by the people injecting their own assumptions)

Google is not rolling this out to protect against YouTube ReVanced but only in a small number of countries. That’s an illogical conclusion to draw from the facts.

unsungNovelty 11/13/2025|||
It's my device. Not Google's. Imagine being told which NPM/PIP packages you can install from your terminal.

Also, it's not SIDE loading. It's installing an app.

freefaler 11/13/2025|||
Well... it would be good if this was true, but read the ToS and it looks more like a licence to use than "ownership" sadly :(
AnthonyMouse 11/13/2025|||
"Android" is really a lot of different code but most of it is the Apache license or the GPL. Google Play has its own ToS, but why should that have to do with anything when you're not using it?
johnnyanmac 11/14/2025|||
Google doesn't own AOSP. We don't need any Google apps on an Android phone for it to function.
xnx 11/13/2025||||
I agree, but I don't see why Google gets more critical attention than the iPhone or Xbox.
_blk 11/13/2025|||
iPhone has always been that way (try installing an .ipa file that's not signed with a valid Apple developer certificate). For Google, forced app verification is a major change. Xbox I don't know.
da_chicken 11/13/2025|||
Yeah, let's ask the Debian team about installing packages from third party repos.

I'm not on the side of locking people out, but this is a poor argument.

cookiengineer 11/13/2025||
> Yeah, let's ask the Debian team about installing packages from third party repos.

Debian is already sideloaded by the grace of Microsoft's UEFI bootloader keys. Without that key, you could not install anything other than MS Windows.

Hence you don't realize how good of an argument it is, because you even bamboozled yourself without realizing it.

The argument gets even weaker if we want to discuss Qubes and other distributions that are actually focused on security, e.g. via firejail, hardened kernels or user namespaces to sandbox apps.

Ms-J 11/13/2025||
"Debian already is sideloaded on the graciousness of Microsoft's UEFI bootloader keys. Without that key, you could not install anything else than MS Windows."

This is only true if you use Secure Boot. It is not needed and is itself insecure, so it should be turned off. Then any OS can be installed.

cesarb 11/13/2025|||
> This is only true if you use Secure boot. [...] so should be turned off. Then any OS can be installed.

You can only turn off Secure Boot because Microsoft allows it. In the same way Android has its CDD with rules all OEMs must follow (otherwise they won't get Google's apps), Windows has a set of hardware certification requirements (otherwise the OEM won't be able to get Windows pre-installed), and it's these certification requirements that say "it must be possible to disable Secure Boot". A future version of Windows could easily have in its hardware certification requirements "it must not be possible to disable Secure Boot", and all OEMs would be forced to follow it if they wanted Windows.

And that already happened. Some time ago, Microsoft mandated that it must not be possible to disable Secure Boot on ARM-based devices (while keeping the rule that it must be possible to disable it on x86-based devices). I think this rule was changed later, but for ARM-based Windows laptops of that era, it's AFAIK not possible to disable Secure Boot to install an alternate OS.

Lammy 11/13/2025||||
I agree with you and run with it disabled myself, but some anti-cheat software will block you if you do this. Battlefield 6 and Valorant both require it.
a96 11/13/2025||
This is the real malware that people should be protected from.
cookiengineer 11/13/2025||||
Now tell me how

Turning off UEFI secure boot on a PC to install another "unsecure distribution"

vs.

Unlocking fastboot bootloader on Android to install another "unsecure ROM"

... is not the exact same language, which isn't really about security but about absolute control of the device.

The parallels are astounding, given that Microsoft's signing process of binaries also meanwhile depends on WHQL and the Microsoft Store. Unsigned binaries can't be installed unless you "disable security features".

My point is that it has absolutely nothing to do with actual security improvements.

Google could've invested that money instead into building an EDR and called it Android Defender or something. Everyone worried about security would've installed that Antivirus. And on top of it, all the fake Anti Viruses in the Google Play Store (that haven't been removed by Google btw) would have no scamming business model anymore either.

Ms-J 11/13/2025||
"... is not the exact same language, which isn"t really about security but about absolute control of the device.

The parallels are astounding, given that Microsoft's signing process of binaries also meanwhile depends on WHQL and the Microsoft Store. Unsigned binaries can't be installed unless you "disable security features".

My point is that it has absolutely nothing to do with actual security improvements."

I agree. It is the same type of language.

HumanOstrich 11/13/2025|||
While it's possible to install and use Windows 11 without Secure Boot enabled, it is not a supported configuration by Microsoft and doesn't meet the minimum system requirements. Thus it could negatively affect the ability to get updates and support.

> It is already not needed and insecure so should be turned off.

You know what's even less secure? Having it off.

Lammy 11/13/2025||
The name “Secure Boot” is such an effective way for them to guide well-meaning but naïve people's thought process to their desired outcome. Microsoft's idea of Security is security from me, not security for me. They use this overloaded language because it's so hard to argue against. It's a thought-terminating cliché.

Oh, you don't use <thing literally named ‘Secure [Verb]’>?? You must not care about being secure, huh???

Dear Microsoft: fuck off; I refuse to seek your permission-via-signing-key to run my own software on my own computer.

Ms-J 11/13/2025|||
Agreed.

Also, Secure Boot is vulnerable to many types of exploits. Having it enabled can be a danger in itself, as it can be used to infect the OS that relies on it.

codethief 11/13/2025||
Could you elaborate? This is news to me.
codethief 11/13/2025|||
> Dear Microsoft: fuck off; I refuse to seek your permission-via-signing-key to run my own software on my own computer.

No one is stopping you from installing your own keys, though?

Lammy 11/13/2025||
I do not want to be in the business of key management. This is not something that needed encryption. More encryption ≠ better.

I also dual-boot Windows and that's a whole additional can of worms; not sure it would even be possible to self-key that. Microsoft's documentation explicitly mentions OEMs and ODMs and not individual end users: https://learn.microsoft.com/en-us/windows-hardware/manufactu...

codethief 11/15/2025||
> This is not something that needed encryption. More encryption ≠ better than.

Securing the boot chain protects against a whole range of attacks, so yes, it is objectively better from a security POV.

cookiengineer 11/16/2025||
Name a single bootkit that was actually prevented, i.e. one that wasn't able to avoid the encryption and signature verification toolchain altogether.

Malware developers know how to avoid this facade of an unlocked door.

Users do not.

That's the problem. It's not about development, it's about user experience. Most users are afraid to open any Terminal window, let alone aren't even capable of typing a command in there.

If you argue about good intent from Microsoft here, think again. It's been 12 years since Stuxnet, and the malware samples still work today. Ask yourself why, if the reason isn't utter incompetence on Microsoft's part. It was never about securing the boot process, otherwise this would've been fixed within a day back in 2013.

Pretty much all other bootkits also still work btw, it's not a singled out example. It's the norm of MS not giving a damn about it.

jeroenhd 11/13/2025||||
The countries that go after Google are just the first wave; these restrictions will be applied globally not much later.

The linked post is full of fluff and low on detail. Google doesn't seem to have the details themselves; they're continuing with the rollout while still designing the flow that will let experienced users install apps like normal.

Aeolun 11/13/2025|||
A small number of countries now. The rest of the world in 2027 and beyond.
ashleyn 11/13/2025||||
yt-dlp's days are fairly numbered, as Google has a trump card they can eventually deploy: gating all content behind DRM. IIRC the only reason YouTube content is not yet served exclusively through DRM is to maintain compatibility with older hardware like smart TVs.
kldg 11/13/2025|||
YouTube already employs DRM on some of their videos (notably their free* commercial movies). If you try to take a screenshot, the frame is blacked out. This can be bypassed by applying a CSS blur effect of 0 pixels, permitting extraction; detecting the DRM protection and applying the bypass is likely trivial for the kinds of people already writing scripts and programs utilizing yt-dlp. The CSS bypass has been widely disseminated for years (over a decade?), but programmers love puzzles, so a sequel to the current DRM implementation seems justified. YT could also substantially annoy me by expiring their login cookies more frequently; I think I have to pull them from my workstation every month or two as-is. At some point, they could introduce enough fragility to my scripts that it's such a bother to maintain that I won't bother downloading/watching the 1-3 videos per day I do today -- but OTOH, I've been working on a wasm/Rust mp4 demuxer and a from-scratch WebGL2 renderer for video and I'm kind of attached to seeing it through (I've had the project shelved for ~3 weeks after getting stuck on a video seek issue), so I might be willing to put a lot of effort into getting the videos as a point of personal pride.

The real pain in the butt at present is Patreon, because I can't be arsed to write something separate for it. As-is, I subscribe to people on Patreon and then never bother watching any of the exclusive content because it's too much work. Some solutions like Ghost (providing an API for donor content access) get part of the way there, but they are not themselves a video host, and I've never seen anyone use it.

quotemstr 11/13/2025||
> this can be bypassed by applying a CSS blur effect of 0 pixels, permitting extraction

That's not real DRM then. The real DRM is sending the content such that it flows down the protected media path (https://en.wikipedia.org/wiki/Protected_Media_Path) or equivalent. Userspace never sees decrypted plaintext content. The programmable part of the GPU never sees plaintext decrypted content. Applying some no-op blur filter would be pointless since anything doing the blur couldn't see the pixels. It's not something you can work around with clever CSS. To compromise it, you need to do an EoP into the ordinarily non-programmable scanout of the GPU, or find bad cryptography or a side channel that lets you get the private key that can decode the frames. Very hard.

Is this how YT works today? Not on every platform. Could it work this way? Definitely. The only thing stopping them is fear of breaking compatibility with a long tail of legacy devices.

etatoby 11/13/2025||||
Something I've never understood about DRM is, if the content is ultimately played on my device, what stops me from reverse engineering their code to make an alternative client or downloader? Is it just making it harder to do so? Or is there a theoretical limit to reverse engineering that I'm not getting? Do they have hardware decryption keys in every monitor, inside the LCD controller chip?
gear54rus 11/13/2025|||
in short and simple terms, those parasites colluded with hardware manufacturers and put a special chip in your computer and monitor that runs enslavement software

without opening it up physically there is no way to make it stop or get the raw stream before it's displayed

A4ET8a8uTh0_v2 11/13/2025||
This. Some ways back I actually purchased bluray recording device only to learn that its firmware is deliberately crippled to accommodate someone's business model. There are people who do the unsung hero work, but those types of skills are not exactly common and a business asshole is a dime a dozen any century you want to pick.
ploek 11/13/2025||||
Yes, the decryption happens in hardware. For your OS (and potential capturing software running on it) the place where you see the video is just an empty canvas on which the hardware renders the decrypted image.
potwinkle 11/13/2025|||
All levels of Widevine are cracked, but only the software-exclusive vulnerabilities are publicly available. It's only used for valuable content though (netflix/disney+/primevideo), so it might still work out for YouTube as no one will want to waste a vulnerability on a Mr. Beast slop video.
AnthonyMouse 11/13/2025|||
The reason they have different levels is that the DRM pitchmen got tired of everyone making fun of their ineffective snake oil, so they tried to make a version that was harder to break at the cost of not supporting most devices.

Naturally that got broken too, and even worse, broken when it's only supported by a minority of devices and content, because the more devices and content it's used for the easier it is to break and the larger the incentive to do it.

If you tried to require that for all content then it would have to be supported by all devices, including the bargain bin e-waste with derelict security, and what do you expect to happen then?

darkwater 11/13/2025|||
Do you have any link? All the things I can find are about the 2019 L3 crack
kotaKat 11/13/2025||
I don’t have any personal links but know that there is a constant cat-and-mouse game of cracking Widevine devices for their L1 keyboxes and using them on high-value content (as mentioned).

That’s why a lot of low end Android devices often have problems playing DRMed content on the Web: their keyboxes got cracked open and leaked wide enough for piracy that they got revoked and downgraded down to L3.

khannn 11/13/2025||||
Too bad that I'm going iPhone if Google removes sideloading, and now I know about ReVanced, so they aren't getting any more than the zero dollars that YouTube and YouTube Music are worth from me.

If I'm going to live in a walled garden it's going to the fanciest

m4rtink 11/13/2025||
I still don't get this mindset - all is lost, I am not going to do anything about that AND I will punish them by going with the even worse option!
Perz1val 11/13/2025||
If neither does what you want, you'll use other metrics, which often make iOS a better choice. Simple as that.
khannn 11/13/2025||
If they're going to reduce me to a user, iOS is the better choice. I had an iPhone before and it's a picture taking, instagram, social media machine with iMessage—bringing the console wars to normies since inception.

Because the hardware is so constrained, an iPhone lasts forever compared to a similar Android. My two-year-old Pixel is slow now, but I know people completely happy with a five-year-old iPhone. Pause, I checked and the oldest iPhone that receives updates is an iPhone 11, which is the exact model I had before going back to Android.

akimbostrawman 11/13/2025||
I have multiple generations of Pixel phones and could not tell the difference in performance between them in basic tasks. Maybe because I installed GrapheneOS, which makes both stock Android and iOS feel like a bloat- and spyware-riddled toy.
khannn 11/14/2025||
I have a Pixel 7 and it's ridiculously slow so I've been thinking about GrapheneOS.
akimbostrawman 11/14/2025||
The only reason for me to get a Pixel is GOS. I never want to go back; it makes other mobile OSes feel icky.
charcircuit 11/13/2025|||
You would still be able to adb install them. They wouldn't die.
gdulli 11/13/2025|||
Developers of these apps would have little motivation if the maximum audience size was cut down to the very few who would use adb. The ecosystem would die.
userbinator 11/13/2025|||
Or someone comes up with an easy adb wrapper and now it becomes the go-to way to install apps.
xyzzy_plugh 11/13/2025||
Shizuku[0][1] already exists; it would certainly suck, but it wouldn't be the end of the world.

Of course I would be much happier if I didn't need to use Shizuku in the first place.

[0]: https://play.google.com/store/apps/details?id=moe.shizuku.pr...

[1]: https://shizuku.rikka.app/

celsoazevedo 11/13/2025||
That uses a workaround based on WiFi debugging even though it's all local. It doesn't run if you're not connected to a trusted WiFi network, you have to set it all up when connecting to a new network, etc.

Not only are users not connected to WiFi all the time, but in many developing countries people often have no WiFi at home and rely on mobile data instead. It's a solution, but not a solution for everyone or one that works all the time.

wolvesechoes 11/13/2025|||
And how do you estimate the audience that even cares about those issues?

I think the number of people caring about alternative app stores, F-Droid or whatever, is very similar to the number of people willing to use adb if necessary, so rather small.

gdulli 11/13/2025||
But the ecosystem exists, regardless of what the absolute number is, and it would be bad to lose it. If the platform was more open like Windows the ecosystem would grow, if it was less open like iOS it would die.
gblargg 11/13/2025||||
Somehow I think having to use ADB instead of something like F-Droid with automatic updates would put a damper on things.
AuthError 11/13/2025||||
How many people will do this though? I would expect sub-1% conversion from existing users if they had to do that.
dns_snek 11/13/2025|||
> Google has hinted

I beg to differ:

> In early discussions about this initiative, we've been encouraged by the supportive initial feedback we've received.

> the Brazilian Federation of Banks (FEBRABAN) sees it as a “significant advancement in protecting users and encouraging accountability.” This support extends to governments as well

> We believe this is how an open system should work

Google isn't "hinting" that they're doing this under pressure, that announcement makes it quite clear that this is Google's initiative which the governments are supportive of because it's another step on a ratcheting mechanism that centralizes power.

> because the governments of countries where such scams are widespread will hold Google responsible

Your comment is normalizing highly problematic behavior. Can we agree that vague "pressure from the government" shouldn't be how policies and laws are enacted? They should make and enforce laws in a constitutional manner.

If you believe that it's normal for these companies and government officials to make shadow deals that bypass the rule of law, legal procedures, separation of powers and the entire constitutional system of governance that our countries have, then please drop the pretense that you stand for democracy and the rule of law (assuming that you haven't already).

Otherwise we need to be treating it for what it is - a dangerous, corrupt, undemocratic shift in our system of governance.

thaumasiotes 11/13/2025|||
> there cannot exist an easy way for a typical non-technical user to install “unverified apps” (whatever that means), because the governments of countries where such scams are widespread will hold Google responsible.

What, the same way they hold Microsoft responsible for the fact that you can install whatever you want in Windows?

Obviously, there can exist an easy way for a non-technical user to install unverified apps, because there has always been one.

svat 11/13/2025||
This is actually a good point, and something I've been wondering about too. What changed between the 90s and now, that Microsoft didn't get blamed for malware on Windows, but Google/Apple would be blamed now for malware on their devices? It seems that the environment today is different, in the sense that if (widespread) PCs only came into existence now, the PC makers would be considered responsible for harms therefrom (this is a subjective opinion of course).

Assuming this is true (ignore if you disagree), why is that? Is it that PCs never became as widespread as phones (used by lots of people who are likely targets for scammers and losing their life savings etc), or technology was still new and lawmakers didn't concern themselves with it, or PCs (despite the name) were still to a large extent "office" devices, or the sophistication of scammers was lower then, or…? Even today PCs are being affected by ransomware (for example) but Microsoft doesn't get held responsible, so why are phones different?

pmontra 11/13/2025|||
What changed is that Apple made the masses familiar with the concept of installing software only from a store with a vetting process. For short, the walled garden. That was mostly an alien thing in the world of software. All of us grew with the possibility of getting an installer and install it whenever we wanted. There were some form of protections against piracy but nothing else.

Once Apple created the walled garden every other company realized how good it could be for their bottom lines and attempted to do the same thing.

So, to answer your question, Microsoft got blamed for viruses and made fun of but there wasn't a better way in the mainstream. There is one now.

PCs will resist this trend for a while because it's also mainstream that they are used to do work. Many people use a PC every day with some native application from a company they have a direct contract with. For example: accounting software. Everybody can add another example from their own experience. Those programs don't come from the Windows store and it will be a long term effort to gatekeep everything into the store or move them into a web browser.

The .NET MAUI technology we had a post about yesterday is one of the bricks that can build the transition.

StopDisinfo910 11/13/2025||
> So, to answer your question, Microsoft got blamed for viruses and made fun of but there wasn't a better way in the mainstream. There is one now.

I don't think App Store is a better way.

From my point of view, people keep mistaking the actual progress - generalised sandboxing and a reduced API surface - for the major regression - controlled distribution. At the beginning of the App Store, when the sandboxing and APIs were poor, there were frequent security issues.

Apple marketing magic is somehow convincing people that it's their questionable vetting which made things secure, and not the very real security innovations.

pmontra 11/13/2025||
I'm with you and personally I boycott Apple because of the walled garden, for what it's worth. However it is a better way (a more convenient way?) for companies to make money and it gave an idea to legislators and regulators. Now they expect that the owner of the OS can decide what runs and what does not run on their OS and be made accountable for it.
travisgriggs 11/13/2025||||
Windows 95 (and patronage) had become a shitshow. It’s easy to forget how much time us tech types were spending “fixing” uncle’s PC that somehow got malware on it. How we touted Linux as an escape from the hellscape of crapware.

It was into this void that the “everything seems new” iPhone stepped and ventured out on a different course. I’m speaking neither for nor against Apple's normalization of an App Store as a primary source of updates, just recalling the way things were, and positing that Apple was trying a different approach that initially offered a computing platform that wasn’t the hellscape the MS platform was quickly becoming.

vladms 11/13/2025||
Windows 95 was fundamentally broken, as, if I recall correctly, there were far fewer security features (accounts, file permissions, etc.). Nowadays there are fewer problems with it.
calgoo 11/13/2025||
It's not that it was broken, it's that security was not really a thing. You had your antivirus to protect you from people adding stuff to discs, but that's it. Windows 95 was just an exe file in the Windows folder that you could run from DOS.

Windows NT / OS/2 did have more security, as it was meant for shared environments, but even there, corporations ended up using stuff like Novell NetWare to get the actual networking services.

Windows 2000 was the first version of consumer Windows based on the NT kernel instead of the DOS / Windows 95/98/ME based systems. I still remember running around the office updating Windows 2000 machines to Service Pack 4 to protect us against the first real massive virus, "ILOVEYOU".

Edit: Still on first coffee, sorry about the ramblings

vladms 11/13/2025||
Sure, my point was that even if iPhone ecosystem is more secure than Windows 95, I do not think this is due mostly to the "walled garden", but because (as you mention) Windows 95 just did not care about security at all. By the time iPhone appeared the security of Windows systems (2000 and later) had already improved (even if not perfect) and there was a possibility to configure it more "locked down", if you wanted.
wmf 11/13/2025|||
I always blamed Microsoft for Windows insecurity. But seriously, Windows did not have any vetting process for apps and apps didn't really have access to money. Google's problem is that they claim Android is a secure way to do banking but it isn't.
tomrod 11/13/2025|||
I bought the hardware, therefore I have the right to modify and repair. Natural right, full stop. That right ends at your nose, as the saying goes.
kccqzy 11/13/2025|||
Consider whether your natural right argument might not stand in several other countries’ legal systems.

The era of United States companies using common sense United States principles for the whole world is coming to an end.

orbital-decay 11/13/2025|||
Okay, but currently it's the opposite: a US company is forcing the principles of these few legal systems onto the whole world.
tomrod 11/13/2025||||
Nah, that's the beauty of it. Liberal principles make a much more robust political foundation than post-liberal principles. The US is known for the former despite current flirtations with the latter. However, liberal principles aren't tied to any one country. Fortunately for us!
Krasnol 11/13/2025|||
The era of common sense in the United States came to an end.
ashikns 11/13/2025||||
Yeah, then you have the choice not to buy the locked-down hardware; you don't have a right to get open hardware FROM Google.

Of course there are no good options for open hardware, but that is a related but separate problem.

orbital-decay 11/13/2025|||
It's not a separate problem; Google are actively suppressing any possibility of open mobile hardware. They force HW manufacturers to keep their specs secret and make them choose between their ecosystem and any other, not both. There's a humongous conflict of interest and they're abusing their dominant position.
dmitrygr 11/13/2025||
> They force HW manufacturers to keep their specs secret

Spoken like someone who has never ever worked with any hardware manufacturers. They do not need reasons for that. They all believe their mundane shit is the most secret-worthy shit ever. They have always done this. This predates google, and will outlive it.

renewiltord 11/13/2025||
Often it is because they don't know their own devices. We got a dev board from Qualcomm once and the documentation was totally bogus.
procaryote 11/13/2025||||
Regulating this is the way to not let general computing die to fuel google and apple profits.

People should have the right to run whatever software they like on the computing hardware they own. They should have the right to repair it.

The alternative is that everything ends up like smart-tvs where the options are "buy spyware ridden crap" or "don't have a tv"

m4rtink 11/13/2025|||
Given how antitrust is not really working right now I would say this is debatable. Also monopolies in the past were forced to do various things to keep their status for longer.
pessimizer 11/13/2025||||
> I bought the hardware, therefore I have the right to modify and repair. Natural right, full stop.

There is absolutely nothing "natural" about trading your pile of government promises for the right to call government men with guns and sticks if you are alienated from the option to physically control an object. Your natural right is to control what you can defend.

Rights are what we decide them to be. Or rather, what people in power decide them to be, i.e. people who hold and issue large amounts of government promises, and recruit and direct the most men with guns and sticks.

Ms-J 11/13/2025||||
This is correct. Our natural rights go much further than unnatural prohibitions from the government.

Do what you please and get enough people to do it with you, and no one can stop you.

tjwebbnorfolk 11/13/2025||||
Oh, so you're good with everyone having the "natural right" to turn handguns into automatic weapons simply because they find themselves in possession of the correct atoms? How about adding a 3rd story on the top of your house without needing a permit or structural evaluation?

Note that adding "full stop" pointlessly to the end of sentences does not strengthen your argument.

xigoi 11/13/2025|||
The difference is that you can’t kill other people by installing an app.
tjwebbnorfolk 11/18/2025|||
Sorry, I thought we were talking about "natural rights" here, not rights conditional on whether the hardware does something you like or not.
tomrod 11/13/2025||||
Guns aren't a natural right by any stretch. Defense is, but you're confusing the US bill of rights with natural rights of all humans.
hooverd 11/13/2025|||
Now that's just some stupid hyperbole.
colordrops 11/13/2025||||
I don't think it's illegal to do whatever you want with your phone. That doesn't mean Google is legally required to make it easy or even possible. That being said, I think ethically they should allow it, and considering their near-monopoly status they should be forced to keep things open. In fact there should be right-to-repair laws too.
procaryote 11/13/2025||
The way to go from fervently hoping they make the ethical choice to actually protecting the users is to regulate it
calvinmorrison 11/13/2025||||
I suppose you have the right to do whatever you want with it, including zapping it in the microwave or using it as a rectal probe. I am not sure that right extends are far as forcing companies to deliver a product to your specifications (open software, hardware, or otherwise)
yehat 11/13/2025|||
You won't believe it, but many years ago the TVs for sale were required to come with their full schematics, and they really did.
tomrod 11/13/2025|||
Right to repair requires it, thank goodness.
Aurornis 11/13/2025|||
> Natural right, full stop.

You’re still missing the point the comment is making: In countries where governments are dead set on holding Google accountable for what users do on their phones, it doesn’t matter what you believe to be your natural right. The governments of these countries have made declarations about who is accountable and Google has no intention of leaving the door open for that accountability.

You can do whatever you want with the hardware you buy, but don’t confuse that with forcing another company to give you all of the tools to do anything you want easily.

brazukadev 11/13/2025||
That's deflection. There's Google blocking users from installing apps, and there's OP insinuating that it might be because of government coercion, but there's no evidence to support this. Scammers pay Google to show ads that push app installs; that's what governments are holding Google responsible for, and it won't change by blocking app installs.
vachina 11/13/2025||
Malicious app delivery goes beyond Google ads. In Singapore, most scam app installs are from social engineering, e.g. install new app to receive payment, install new app to buy something for cheap.

I’m amazed at how gullible some people are but that’s how it is.

brazukadev 11/13/2025||
That's not how it is. Google helps scammers and makes a lot of money from it, so they are responsible and should pay for it.
xg15 11/13/2025|||
> there cannot exist an easy way for a typical non-technical user to install “unverified apps” (whatever that means), because the governments of countries where such scams are widespread will hold Google responsible.

You can also view this as a "tragedy of the commons" situation. Unverified apps and sideloading are actively abused by scammers right now.

> Meanwhile this very fact seems fundamentally unacceptable to many, so there will be no end to this discourse IMO.

I get that viewpoint and I'm also very glad an opt-out now exists (and the risk that the verification would be abused is also very real), but yeah, more information on what else to do against scammers would then also be needed.

LoganDark 11/13/2025|||
It's not possible to provide a path for advanced users that a stupid person can't be coerced to use.

Moreover, it's not possible to provide a path for advanced users that a stupid person won't use by accident, either.

These are what drive many instances of completely missing paths for advanced users. It's not possible to stop coercion or accidents. It is literally impossible. Any company that doesn't want to take the risk can only leave advanced users completely out of the picture. There's nothing else they can do.

Google will fail to prevent misuse of this feature, and advanced users will eventually be left in the dust completely as Google learns there's no way to safely provide for them. This is inevitable.

edent 11/13/2025|||
Android could have, for example, a 24 hour "cooling off" period for sideloading approval. Much like some bootloader unlocking - make it subject to a delay.

That immediately takes the pressure off people who are being told that their bank details are at immediate risk.

cesarb 11/13/2025|||
> Android could have, for example, a 24 hour "cooling off" period for sideloading approval.

And, to prevent the scammer from simply calling back once the 24 hours are gone, make it show a couple of warnings (at random times so they can't be predicted by the scammer) explaining the issue, with rejecting these warnings making the cooling off timer reset (so a new attempt to enable would need another full 24 hours).

hattmall 11/13/2025|||
The people gullible enough to fall for a scam like that are also gullible enough to follow more instructions 24 hours later. I think if you could force a call to the phone and have an agent or even AI that talks to user and makes sure no scam is involved then gives an unlock code based on deviceID or something. But that would cost money and scammers would work around it anyway.
0xDEAFBEAD 11/13/2025|||
>It's not possible to provide a path for advanced users that a stupid person can't be coerced to use.

I actually think you might be wrong about this? Imagine if Google forced you to solve a logic puzzle before sideloading. The puzzle could be very visual in nature, so even if a scammer asked the victim to describe the puzzle over the phone, this usually wouldn't allow the scammer to solve it on the victim's behalf. The puzzle could be presented in a special OS mode to prevent screenshots, with phone camera disabled so the puzzle can't be photographed in a mirror, and phone call functionality disabled so a scammer can't talk you through it as easily. Scammers would tell the victim to go find a friend, have the friend photograph the puzzle, and send the photo to the scammer. At which point the friend hopefully says "wait, wtf is going on here?" (Especially if the puzzle has big text at the top like "IF SOMEONE ASKS YOU TO PHOTOGRAPH THIS, THEY ARE LIKELY VICTIM OF AN ONGOING SCAM, YOU SHOULD REFUSE", and consists of multiple stages which need to be solved sequentially.)

In addition to logic puzzles, Google could also make you pass a scam awareness quiz =) You could interleave the quiz questions with logic puzzle stages, to help the friend who's photographing the puzzle figure out what's going on.

I guess this could fail for users who have two devices, e.g. a laptop plus a phone, but presumably those users tend to have a little more technical sophistication. Maybe display a QR code in the middle of the puzzle which opens up scam awareness materials if photographed?

Or, instead of a "scam awareness quiz" you could give the user an "ongoing scam check", e.g.: "Did a stranger recently call you on the phone and tell you to navigate to this functionality?" If the user answers yes, disable sideloading for the next 48 hours and show them scam education materials.

LoganDark 11/13/2025||
It would also fail for users who are differently abled. That sounds like an absolute nightmare for accessibility. Good news for preventing scams, but bad news for anyone without full mental and physical faculties.
0xDEAFBEAD 11/15/2025||
I'm not sure why you couldn't make the flow I describe just as accessible as anything else in Android? But I'll grant your premise and respond anyways.

If the user lacks full mental faculties, they are part of the userbase we need to protect from scams. Most likely, a user without full mental faculties who is trying to sideload will be a scam victim.

If the user lacks the necessary physical faculties to "solve a puzzle on their phone", they probably get help from friends regularly; a friend should be able to help with sideloading. Enabling sideloading should be a one-time operation right?

sharperguy 11/13/2025|||
Considering phone scammers often convince their victims to:

- install remote desktop software

- run commands in the windows terminal

- withdraw cash from the bank

- lie to the bank teller about their purpose

- insert their cash into a bitcoin ATM at a gas station

- ignore warnings about scams which appear on the screen of the ATM

- insert the scammer's bitcoin address into the machine

It isn't a stretch to imagine they could convince the victim to install adb and sideload an app.
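
For context, a minimal sketch of what that would take over adb (the APK file name here is hypothetical), assuming USB debugging has already been switched on in Developer options:

    adb devices                     # confirm the phone is connected and authorized
    adb install suspicious-app.apk  # installs the APK directly, no app store or "unknown sources" toggle involved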

extraduder_ire 11/13/2025|||
A change Google made to Android earlier this year prevents you from enabling unknown sources or installing APKs while you are on a phone call.

I'm surprised they didn't think of doing that sooner.

Ajedi32 11/13/2025||||
Notice though that we don't forbid people from withdrawing cash from the bank in order to prevent this.

Warning about scams is fine, as is taking steps to make it harder, but once you start trying to completely remove the agency of mentally sound adults "for their own good" then we have a problem.

0xDEAFBEAD 11/13/2025||||
It seems to me if you raise the difficulty enough, and lower the success rate enough, at some point a given scam stops being economical. https://news.ycombinator.com/item?id=45913529
port11 11/13/2025|||
It's waaaay more complicated to download ADB and side load a random APK.

This is either a move towards tighter control of the platform or a government request. And somewhat ironic, given that iOS is being pressured to be a bit more open.

Frieren 11/13/2025|||
> there cannot exist an easy way for a typical non-technical user to install “unverified apps” (whatever that means), because the governments of countries where such scams are widespread will hold Google responsible.

But it is perfectly fine to sell crypto and other complex financial assets, through apps in the Play Store, to kids and other people who do not understand what they are buying.

If "safety" takes control from you then it is implemented. If real safety puts profits in danger then it is fight against. Quite a dystopia.

wkat4242 11/13/2025|||
Then let them do that for those countries. Not for everyone. I'm not in any of those autocratic countries. Or offer an opt out in the countries where this isn't a thing. Using adb is not really great for doing updates.

And also, I'm the owner of my device. Not my country.

m4rtink 11/13/2025|||
Once they do it in one country, there will be much more pressure and incentives to do it everywhere.
curtisnewton 11/13/2025|||
> I'm not in any of those autocratic countries

Autocratic Albania banned ads on YouTube by law, so if you are in Albania (or your VPN is - wink! wink!) you get to watch YouTube without ads, legally.

I, too, hate those autocratic countries where governments act for the good of the people instead of ruling in favour of greedy billionaires.

woliveirajr 11/13/2025|||
I'm pretty sure Brazil doesn't have a law saying that Google must forbid sideload. I'm sure that government (be it President, Central Bank etc) doesn't pressure Google about it.

I'm sure some private actors (for example, banks) would love that smartphones are as tight as possible (reason: [0]). Perhaps the same reason applies to Google [1]. But no, "Brazil" isn't demanding that from Google.

[0]: consider that some virus (insecure apps, for example) could somehow steal information from bank apps (even something as simple as capturing login information). The client might sue the bank, and the bank might have to prove that their app is secure and the problem was in the client's smartphone.

[1]: the client, the bank etc might complain to Google that their Android is insecure

shevy-java 11/13/2025|||
Aha - that is a much better explanation than I assumed, aka "the people forced Google to behave". So Google is scared of having to pay fines or having their CEOs end up in jail. I actually think there should be a new rule - easy-jail mode for CEOs globally. It does not have to be long, but say a few days in jail for ignoring the law, and hold the CEOs directly responsible for it. You earn a lot of money, so you also gotta take the risk.
mschuster91 11/13/2025|||
> From the very first announcement of this, Google has hinted that they were doing this under pressure from the governments in a few countries. (I don't remember the URL of the first announcement, but https://android-developers.googleblog.com/2025/08/elevating-... is from 2025-August-25 and mentions “These requirements go into effect in Brazil, Indonesia, Singapore, and Thailand”.)

In ye goode olde times, the US would have threatened invasion and that would have been the end of it.

Half /s, because it actually used to be the case that the US government exercised its massive influence (and not just militarily) over other countries for the benefit of its corporations and/or its citizens... these days, the geopolitical influence of the US has been reduced to shreds, and the executive's priorities aren't set by doing what is (perceived as being) right but by whoever pays the biggest bribes.

KurSix 11/13/2025|||
The tension here is classic: governments want accountability, Google wants plausible deniability, and users want control
sebastiennight 11/13/2025||
...and users want ̶c̶o̶n̶t̶r̶o̶l̶ convenience.

Seems more appropriate.

xzjis 11/13/2025|||
Why can't they just put up a big, red warning: "Never enable software installation if someone asks you to (over the phone or via message). If you're unsure, check out this article on scams."?
thw_9a83c 11/13/2025||
> "Never enable software installation if someone asks you..."

Imagine a situation in which a frightened, stressed user sees such a message on their screen. Meanwhile, a very convincing fake police officer or bank representative is telling them over the phone that they must ignore this message due to a specific, dangerous emergency in order to save the money in their bank account. Would the user realize at that moment that the message is right and the person on the phone is a thief? I'm not so sure.

0xDEAFBEAD 11/13/2025||
What if there is a 12-hour delay to unlock "power user mode", and during that entire 12-hour unlock period, the phone keeps displaying various scam education information to help even an unsophisticated user figure out what's going on? Surely Google can devote a few full-time employees to keeping such educational materials up to date, so they ideally contain detailed descriptions of the most common scams a user is going to be subject to at any given time.
thw_9a83c 11/13/2025||
This would help for sure. Ideally, the phone should stay in "expert mode" for a limited time only, like 1 hour.

However, there is still a danger that scammers will call after 12 hours, and they will be more convincing than educational material (or the user may not have read it).

ordu 11/13/2025||
> However, there is still a danger that scammers will call after 12 hours

It is unlikely to work. Scammers talk the whole time and create a sense of urgency; people have trouble thinking and listening at the same time, and they tend to stop thinking completely when in a hurry. A 12-hour break will at least give the victim time to think. It will probably also give them time to talk about it with someone, or to google things.

like_any_other 11/14/2025|||
> because the governments of countries where such scams are widespread will hold Google responsible.

How many virus infections and scams was Microsoft held responsible for? What about Red Hat, or Debian?

And at least let Google plainly state this, instead of inventing legal theories based on vague hints from their press releases, to explain why their self-serving user-hostile actions are actually legally mandatory.

makeitdouble 11/13/2025|||
> the governments of countries where such scams are widespread will hold Google responsible.

This argument is FUD at this point.

Sovereign governments have ways to make clear what they want: they pass laws, and there need be no backroom deals or veiled threats. If they intend to punish Google for the rampant scams, they'll need a legal framework for that. That's exactly how it went down with the DMA, and how other countries are dealing with Google/Apple.

Otherwise we're just fantasizing about vague rumors, exchanges that might have happened but represent nothing (some politician talking bullshit isn't a law of the country that will lead to enforcement).

This would be another story if we were discussing exchanges with the mafia and/or private parties, but here you're explicitly mentioning governments.

hexage1814 11/13/2025||
> they'll need a legal framework for that

Not really. It should, but Google operates in a bunch of countries without proper rule of law.

jacquesm 11/13/2025|||
That's a disingenuous argument though: they are in that position because they chose to make themselves the only way that a "normal" user is able to install software on these devices. If not for that, these governments wouldn't have a pressure point to lean on in the first place.
m4rtink 11/13/2025||
BTW, Stallman and FSF have been saying this the whole time - if you become the only gatekeeper, don't be surprised when government people show up and force you to ban apps or users from your platform.
dev1ycan 11/13/2025|||
These are just lies spread by the very people who created this system in the first place. If PCs can have apps without "verification", then so can a phone.
immibis 11/13/2025|||
Imagine if they tried to hold the entire world to the standards of Russia, China or North Korea. Yet they don't. This is just an excuse from them, or else they would only enable it in those countries. They don't hold the entire world to Chinese standards so why should they hold them to Brazilian standards? The only reasonable answer is: they also like those standards.
wmf 11/13/2025|||
Or maybe Google just has empathy for people losing millions to scams?
jacquesm 11/13/2025|||
No; if they did, many Google web searches would not put scam sites above the official sites. Google is fine with people being scammed, as long as they get their cut. Large corporations don't have empathy.
vachina 11/13/2025||
Meta ads too. It’s bonkers the type of ads they approve, straight up scams or obvious misinformation (some prominent figure is in jail! Click here to find out!)
spaqin 11/13/2025||||
From what I've seen, the millions lost to scams come from social engineering: cold calls masquerading as the authorities, phishing, pig butchering. There are plenty of scam apps on the Play Store harvesting data as well, but not a single real-life instance of malware installed outside the officially sanctioned platform.
tjpnz 11/13/2025||||
The same scams Google's ad network facilitates and Google in turn profits from?
sunaookami 11/13/2025||||
The Play Store is full of scam apps, so obviously they don't.
hooverd 11/13/2025|||
Tell that to their advertising arm.
Aurornis 11/13/2025|||
> because the governments of countries where such scams are widespread will hold Google responsible.

This is the unsurprising consequence of trying to hold big companies accountable for the things people do with their devices: The only reasonable response is to reduce freedoms with those devices, or pull out of those countries entirely.

This happened a lot in the early days of the GDPR regulations when the exact laws were unclear and many companies realized it was safer to block those countries entirely. Despite this playing out over and over again, there are still constant calls on HN to hold companies accountable for user-submitted content, require ID verification, and so on.

raincole 11/13/2025|||
Yes. The same goes for payment processing. I hate Visa/Mastercard as much as the next person. But if the court says they're accountable for people who buy drugs/firearms/child porn, then it seems quite a reasonable reaction for them to preemptively limit what users can buy or sell.

The government(s) have to treat the middlemen as middlemen. Otherwise they are forced to act as gatekeepers.

jacquesm 11/13/2025|||
These two things are not the same. The GDPR afforded rights to common people. Those companies that would pull out are the ones that were abusing data that was never theirs and could no longer do so.
fingerlocks 11/13/2025||
Nah. I know of several startups that had nothing but anonymous telemetry, and they blocked all of Europe because there was no capacity for compliance. I was at an incubator at the time and the decision was unanimous across a dozen or so companies. It's not like anyone was going to lose out on VC money from that market.
sebastiennight 11/13/2025||
> anonymous telemetry

is not covered by GDPR.

And it's a bit hard to believe that these several startups functioned without ever collecting names, emails, IP, phone number, or address of any lead or customer ever.

fingerlocks 11/13/2025||
Maybe they did? Who knows? Never gonna find out because no one had time to look into it. It certainly wasn’t done with malicious intent, perhaps by accident or oversight, which is likely the situation in most small companies.
m463 11/13/2025|||
this is an unresolvable issue

  security = 1/convenience
or in this case:

  security = 1/freedom  or agency
procaryote 11/13/2025||
Security isn't an absolute and this doesn't notably improve it
phendrenad2 11/13/2025||
If nobody pushed back on anything we'd all be subjected to the laws of the worst country on earth, because big tech companies want to do business there, and putting an if/else around the user's country takes effort.
nirui 11/13/2025||
Excuse me, what exactly is "sideloading"? If I want to run third-party code on a system through means the system supports, then it should be called "running"; it's a part of normal operation.

The word "sideload" makes it sound like you're smuggling something you shouldn't onto the system. Subtle word tricks like this can sneak poisons into your mind, so be watchful.

rollcat 11/13/2025||
You can't make people just stop using a word. The best course of action is to reclaim it. Look at us, we're posting on Hacker News. With a sideloaded browser.
glenstein 11/13/2025||
They already did! The word was install. Or, as GP noted, run. Even now, those are the more conventional and widely understood terms, and if anything it's Google attempting to swim against the stream and normalize "sideload" as the language for software installation. Theirs is an object lesson, I think, in appropriately registering the objection and pushing us back to normal language.
troyvit 11/13/2025|||
I keep hearing that here, and people have good reasons for thinking it, but to me sideloading always meant having your phone physically next to the device you're pulling an APK from; in other words, loading the app from the side.
greycol 11/17/2025|||
I always thought of it as coming from side-channel, which (until I searched just now and was only offered side-channel attacks as a result) I generally construed as a good thing, because the system was assumed to be broken. Things like track 2 diplomacy, or messaging the CEO/minister because customer service/bureaucracy was broken. You can go in the side door of a business if you own it or belong there; only dis-empowered customers are forced to go in the front.

Sideloading was getting something to work, because it should work, when the system hadn't caught up to that fact.

glenstein 11/13/2025|||
Yeah, that strikes me as a familiar use also. They seem to be using it to mean not only that but any software installation that doesn't happen via the Play Store, so it's rooted in real history but also conveniently re-appropriated to imply it's veering outside of typically intended use cases.
neop1x 11/13/2025|||
I am not sideloading anything, I am installing apps from f-droid on my device.
1970-01-01 11/13/2025|||
The old Indian word for setting up software was in-sta-lin-it. It was so common, anyone with basic tribal knowledge could gather next to their "Pee See" and execute the code.
rtkwe 11/13/2025|||
You're about two decades late to the complaint party, in this context at least. I can find references to sideloading on Google Books as far back as 2006.

https://www.google.com/books/edition/CNET_Do_It_Yourself_IPo...

glenstein 11/13/2025||
I'm ready to grant that you found an occurrence in the wild but it takes more than that to demonstrate prevalence, conventional usage, or semantic fidelity to originally intended meanings. Also they are appealing to a usage that's practically as old as the paradigm of personal computing itself, so I don't think they're the one that's out of date.

I happen to remember "sideload" as a term of art for some online file locker sites to mean saving it to your cloud drive instead of downloading it to your computer. A cool usage, but it never caught on.

I think nomenclature as it exists in the PC software universe is closest in spirit on all fronts, in describing running software as, well, running software, and describing installing as installing. While a little conspiratorial in tone they're not wrong that "sideload" pushes the impression that controlling what software you run on your phone should be understood as non-default.

rtkwe 11/13/2025||
This is an instance of on-target usage though, relating to the unofficial loading of software onto the device. And in my eyes, finding it in a published work from a major publication means it was likely in wider usage in the same context; at the very least it can be an indicator of the start of that particular usage.
hooverd 11/13/2025|||
I'm using sideloaded Firefox right now!
aargh_aargh 11/13/2025||
newspeak FTW!
Aachen 11/13/2025||
Edit: be sure to read geoffschmidt's reply below /edit

The buried lede:

> a dedicated account type for students and hobbyists. This will allow you to distribute your creations to a limited number of devices without going through the full verification

So a natural limit on how big a hobby project can get. The example they give, where verification would require scammers to burn an identity to build another app instead of just being able to do a new build whenever an app gets detected as malware, shows that apps with few installs are where the danger is. This measure just doesn't add up

geoffschmidt 11/13/2025||
But see also the next section ("empowering experienced users"):

> We are building a new advanced flow that allows experienced users to accept the risks of installing software that isn't verified

Aachen 11/13/2025|||
Oh! I thought I had found the crucial piece finally after ~500 words, but there's indeed better news in the section after that! Thanks, I can go sleep with a more optimistic feeling now :)

Also, this will kill any impetus that was growing on the Linux phone development side, for better or worse. We get to live in this ecosystem a while longer; let's see if people keep Damocles' sword in mind, and we might see more efforts towards cross-platform builds, for example.

ryandrake 11/13/2025||
Let's take the "W". This is pretty good news!
Grimblewald 11/13/2025|||
That's like accepting Vader's "altered" deal, and being grateful it hasn't been altered further.

If Google wants a walled garden, let it wall off its own devices, but what right does it have to command other manufacturers to bow down as well? At this stage we've got the choice of dictato-potato phone prime, or misc flavour of peasant.

If you want a walled garden, go use Apple. The option is there. We don't need to bring that here.

throawayonthe 11/13/2025||
i mean, this program is specifically for Google-verified devices...
roblabla 11/13/2025||
A Google Certified Device is any device that has GMS (Google Mobile Services) installed - ergo almost all of them. It's worth noting that a _lot_ of apps stop functioning when GMS is missing, because Google has been purposefully putting as much functionality as possible into GMS instead of into AOSP. So you end up in a situation where, to make an Android phone compatible with most apps, you need GMS. Which in turn means you need your phone to be Google Certified, and hence it must implement this specification.
catlikesshrimp 11/13/2025||||
I am not a native English speaker. Is "the W" a synonym for "a win", meaning a positive outcome after a contest? Is there more nuance or context than that?
Aachen 11/13/2025|||
The others answered the question, but I wanted to add that this is "new English" to me as well (also non native though). I first saw it in chats with mostly teenagers in ~2021, where I've also learned "let's go" isn't about going anywhere at all (it means the same as w)

This is the first sign we're getting old :) new language features feel new. The language features I picked up in school, that my parents remarked upon, were simply normal to me, not new at all. I notice it pretty strongly nowadays with my grandma, where I keep picking up new terms in Dutch (mainly loan words) but she isn't exposed to them and so I struggle to find what words she knows. Not just new/updated concepts like VR, gender-neutral pronouns, or a new word for messages that are specifically in an online chat, but also old concepts like bias. It's always been there but I'd have no idea what she'd use to describe that concept

arcfour 11/13/2025||||
Yes, but it's often just "a W" or simply "W" in response to something good or seen as a "win."

There is also the same thing with L for loss/loser. "that's an L take", "L [person]", "take the L here", etc.

They are pretty straightforward in their meaning, basically what you described. I believe it comes from sports but they are used for any good or bad outcome regardless of whether it was a contest.

thristian 11/13/2025||||
I think it's from people reporting sports statistics for a player or team as "W:5 L:7" meaning "five wins and seven losses".

https://knowyourmeme.com/memes/l-and-w-slang

qingcharles 11/15/2025|||
I've never seen it in English outside of the USA, but it's very common inside.
benatkin 11/13/2025||||
This isn't a "W", but I am finding my own "W" from this by seeing others distrust Google, and remembering to continue supporting and looking for open alternatives to Google.
echelon 11/13/2025|||
This is not a win. This is having independent distribution shut down and controlled.

We no longer own our devices.

We're in a worse state than we were in before. Google is becoming a dictator like Apple.

rbits 11/13/2025||
It's not being shut down though. The article says that there will be a way to install unverified apps.
klez 11/13/2025|||
Ok, but sideloading is already a thing. What will this way to install unverified apps be? I doubt it will be an extra screen asking "Are you super-duper sure you want to enable sideloading???" after the one already asking the same question.
exe34 11/13/2025||
They talk about doing it under pressure, so my guess is there might be a waiting period before you're allowed free rein, or maybe it will be per-app. Or some level of calling Google and listening to 10 minutes of how poor billionaires are going to starve if you have control of your own device, before being allowed to unlock it.
echelon 11/13/2025|||
You'll have to sign if you wish to distribute. That's an easy way for them to control you.
DavideNL 11/13/2025||||
> We are building a new advanced flow that allows experienced users to accept the risks of installing software that isn't verified

Sure, they'll keep building it forever — this is just a delay tactic.

advisedwang 11/13/2025||||
That doesn't say that you can just build an APK and distribute it. I suspect this path _still_ requires you to create a developer console account and distribute binaries signed by it... just that that developer account doesn't have to have completed identity verification.
consp 11/13/2025||
So you will now need a useless and needless account to build and run your own apps? It's like Microsoft forcing online login on pcs.
advisedwang 11/13/2025||
useless, needless, and terminable at Google's pleasure!
rrix2 11/13/2025||||
it's probably just gonna be under the Developer Options "secret" menu
magguzu 11/13/2025||
Which is totally fine IMO, it was weird to me that they weren't going with this approach when they first announced it.

Macs block launching apps from unverified devs, but you can override that in settings. I thought they could just do something along those lines.

kelnos 11/13/2025|||
That's not fine at all. A developer who doesn't want to (or can't) distribute through the Play Store will now need to teach their users how to enable developer mode and toggle a hidden setting. This raises the barrier a bit more than the current method of installing outside the Play Store.
hasperdi 11/13/2025|||
It's not fine. Some apps, particularly banking apps, have developer mode detection and refuse to work if developer mode is enabled.
exe34 11/13/2025||
I've switched banks for less.
Aachen 11/13/2025||
Until there are no banks left to switch to

Maybe this sounds dark but see also how the net is tightening around phones that allow you to run open firmware after you've bought the hardware for the full and fair price. We're slowly being relegated to crappy hobbyist projects once the last major vendors decide on this as well, and I don't even understand what crime it is I'm being locked out for

We're too small a group for commercial vendors to care. Switching away isn't enough, especially when there's no solidarity, not even among hackers. Anyone who uses Apple phones votes with their wallet for locking down the ability to run software of your choice on hardware of your choice. It's as anti-hacker as you can get but it's fairly popular among the HN audience for some reason

If not even we can agree on this internally, what's a bank going to care about the fifty people in the country that can't use a banking app because they're obstinately using dev tools? What are they gonna do, try to live bankless?

Of course, so long as we can switch away: by all means. But it's not a long-term solution

exe34 11/13/2025||
I think pretty soon I'll carry a "normal" phone in my bag for things like communication and banking/ticketing, but I'll carry a device I actually like in my pocket. It'll be the best of both worlds - content I want to see often and easily in my pocket, and the stuff I don't want to be distracted by will be harder to reach on a whim.
Aachen 11/13/2025||
Yes, I think I'll have to do the same. I've been in the market for a new phone but the one I had pretty much settled on removed the option to update the boot verification chain so I'm obviously not buying that. Might as well buy apple then

It seems like a finite solution though. Having a second phone is not something most people will do, so the apps that are relegated to run on such devices will become less popular, less maintained, less and less good

Currently, you can run open software alongside e.g. government verification software. I think it's important to keep that option if somehow possible

gblargg 11/13/2025||||
Let me guess, a warning box that requires me to give permission to the app to install from third-party sources? Is that not clear enough confirmation that I know what I'm doing? /s
metadat 11/13/2025|||
So.. all this drama over an alert(yes/no) box?

Wow, this really pulls back the veil. This Vendor (google) is only looking out for numero uno.

cesarb 11/13/2025|||
> So.. all this drama over an alert(yes/no) box?

A simple yes/no alert box is not "[...] specifically to resist coercion, ensuring that users aren't tricked into bypassing these safety checks while under pressure from a scammer". In fact, AFAIK we already have exactly that alert box.

No, what they want is something so complicated that no muggle could possibly enable it, either by accident or by being guided on the phone.

Zak 11/13/2025||
I imagine what they're going to do involves a time delay so a scammer cannot wait on the phone with a victim while they do it.
kitesay 11/13/2025||
I agree. Waiting to see for how long. Has to be 24 hours at a minimum I'd guess.
exe34 11/13/2025||
They could make us fill in captchas to pass the time...
Aurornis 11/13/2025|||
> So.. all this drama over an alert(yes/no) box?

The angry social media narratives have been running wild from people who insert their own assumptions into what’s happening.

It’s been fairly clear from the start that this wasn’t the end of sideloading, period. However that doesn’t get as many clicks and shares as writing a headline claiming that Google is taking away your rights.

lern_too_spel 11/13/2025|||
> The angry social media narratives have been running wild from people who insert their own assumptions into what’s happening.

No. Until this post, Google had said that it would not be possible at all to install, on your own device, an app from a developer who hadn't been blessed by Google. That is unacceptable. This blog post contains a policy change from Google.

devsda 11/13/2025||||
> The angry social media narratives have been running wild from people who insert their own assumptions

There may have been exaggerations in some cases, but hand-wavy responses like "you can still do X, you just can't do Y, and Z is now mandatory" or "you can always use Y" are how we got to this situation in the first place.

This is just the next evolution of SafetyNet and the Play Integrity API. Remember how many people said to use alternatives. I'm not saying SafetyNet is bad, but I don't believe their intention was to stop at just that.

gumby271 11/13/2025||||
Sorry what? Their original plan absolutely was the end of sideloading on-device outside of Google's say so. That's what the angry social media narratives were that you seem upset about. Anyone being pedantic and pointing out that adb install is still an option therefore sideloading still exists can fuck off at this point.
advisedwang 11/13/2025||||
I don't think this section is actually the same as the present state just with a new alert box.

I suspect they mean you have to create an Android developer account and sign the binaries; this new policy just allows you to proceed without completing the identity verification on that account.

kcb 11/13/2025||||
What are you talking about? This change for "experienced users" was only just announced and not part of any previous announcement. It has not been clear from the start at all.
Superblazer 11/13/2025|||
Have you missed the plot entirely? This is absurd
jacquesm 11/13/2025|||
And of course: you need an account, rather than simply allowing you to tell your OS that yes, you know what you're doing.
KurSix 11/13/2025||
You're right: if the logic is that low-install apps are the most dangerous (because they can fly under the radar), then making it easier for unverified apps to reach a "small" audience doesn't really solve the problem
xyzzy_plugh 11/13/2025||
> we are building a new advanced flow that allows experienced users to accept the risks of installing software that isn't verified. We are designing this flow specifically to resist coercion, ensuring that users aren't tricked into bypassing these safety checks while under pressure from a scammer. It will also include clear warnings to ensure users fully understand the risks involved, but ultimately, it puts the choice in their hands.

As long as this is a one-time flow: Good, great, yes, I'll gladly scroll through as many prompts as you want to enable sideloading. I understand the risks!

But I fear this will be no better than Apple's flow for installing unsigned binaries in macOS.

Please do better.

reddalo 11/13/2025||
I also think we should stop calling it "sideloading". We need a better word. Sideloading has a negative vibe, as if it's a dangerous thing to install apps from sources other than the Play Store.
shaky-carrousel 11/13/2025|||
Sideloading should be called installing, and installing from the play store should be called jailloading.
exe34 11/13/2025||||
I call it installing. If it's from play store I'd say "Install from Play Store".
Dilettante_ 11/13/2025|||
>Sideloading has a negative vibe

Maybe you've just been drinking the propaganda? "Sideloading" to me rolls off the tongue no worse than "hotswapping" or "overclocking".

stavros 11/13/2025||
We've always called it "install".
Dilettante_ 11/13/2025||
There is a distinction between installing something via the primary or a secondary mechanism. If someone said I just had to "install" a windows program and it turned out I had to compile it from scratch and set all the registry entries myself, I would be "astonished"(as in: The Principle Of Least Astonishment).

I fully understand that language matters and if this was an attempt by Google to de-legitimize this way of installing, that's no good. But for Christ's sake, having different names for different things is not inherently malicious.

stavros 11/13/2025||
I don't see why you'd be astonished here. The Play Store downloads the APK and installs the APK. If you've downloaded it already (eg with a browser), you just install the APK. How is that comparable to compiling from scratch and setting the registry entries yourself?
Dilettante_ 11/13/2025||
About five clicks more(than a single click) and a scary safety setting to turn off. But I didn't mean that installing an apk was as involved as my windows example. That was meant to illustrate that there are two completely different lines of action, two completely different levels of user competence at play.

Installing from the play store involves exactly zero knowledge of what an apk even is.

I want to flip the question around and ask you: How are you not seeing that there is a distinction?

advisedwang 11/13/2025|||
Does this allow unsigned binaries like today? Or does this now require a binary signed by an Android developer account, just one without full identity verification?
izacus 11/13/2025||
All Android devices require signed binaries and have done so since 1.0.
noname120 11/13/2025||
Red herring. Self-signed certificates have always been accepted, and generating a certificate is a one-liner:

    keytool -genkeypair -keystore mykey.jks -alias myalias -keyalg RSA
The public testkey certificate is also accepted so you don’t even need to generate one.
NoGravitas 11/14/2025||
Yes, but then when you update the app, it has to be signed with the same certificate. Android effectively uses TOFU (trust on first use) for APK signatures.
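
To illustrate, a minimal sketch with the standard SDK tools, reusing the keystore from the example above (the APK file name is hypothetical):

    # sign the update with the same self-signed key as the original install
    apksigner sign --ks mykey.jks --ks-key-alias myalias app-update.apk

    # print the signing certificate, which must match what the device trusted on first install
    apksigner verify --print-certs app-update.apk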
lokar 11/13/2025|||
What if it imposed a longish (one time) cooldown period? A day?
huem0n 11/13/2025|||
Exactly, this would greatly reduce the ability for scammers in "urgent" situations, but for power users who flip the switch on day one it would rarely be a problem. What would be terrible though ... is if Google made it require a network connection or Google approval.
rbits 11/13/2025|||
1 day is not longish. That would greatly harm apps like F-Droid. You'd have to go through it every time you want to update your apps.
IshKebab 11/13/2025||
He said one-time.
lokar 11/13/2025||
Yeah, just to turn on the mode.

Perhaps make you do it again after each major OS update, or once a year or something.

KurSix 11/13/2025||
The key will be whether they treat experienced users like adults after the initial opt-in
themafia 11/13/2025||
> Keeping users safe on Android is our top priority.

I highly doubt this is your "top" priority. Or if it is, then you've gotten there by completely ignoring Google account security.

> intercepts the victim's notifications

And who controls these notifications and forces application developers to use a specific service?

> bad actors can spin up new harmful apps instantly.

Like banking applications that use push or SMS for two factor authentication. You seem to approve those without hesitation. I guess their "top" priority is dependent on the situation.

klabb3 11/13/2025||
> > intercepts the victim's notifications

> And who controls these notifications and forces application developers to use a specific service?

Am I alone in being alarmed by this? Are they admitting that their app sandboxing is so weak that a malicious app can exfil data from other unaffiliated apps? And they must instead rely on centralized control to disable those apps after the crime? So.. what’s the point of the sandboxing - if this is just desktop level lack of isolation?

Glossing over this ”detail” is not confidence inspiring. Either it’s a social engineering attack, in which case an app should have no meaningful advantage over traditional comms like web/email/social media impersonation. Or, it’s an issue of exploits not being patched properly, in which case it’s Google and/or vendor responsibility to push fixes quickly before mass malware distribution.

The only legit point for Google, to me, is apps that require very sensitive privileges, like packet inspection or OS control. You could make an argument that some special apps probably could benefit from verification or special approvals. But every random app?

Zak 11/13/2025|||
> Are they admitting that their app sandboxing is so weak that a malicious app can exfil data from other unaffiliated apps?

An app can read the content of notifications if the appropriate permissions are granted, which includes 2FA codes sent by SMS or email. That those are bad ways to provide 2FA codes is its own issue.

I want that permission to exist. I use KDE Connect to display notifications on my laptop, for example. Despite the name, it's not just for KDE or Linux - there are Windows and Mac versions too.
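
For what it's worth, notification access is a user-granted "special app access" rather than something apps get by default. A quick sketch of inspecting it over adb (assuming a connected device with USB debugging enabled):

    # list which apps currently hold notification access
    adb shell settings get secure enabled_notification_listeners

    # open the settings screen where the user grants or revokes it
    adb shell am start -a android.settings.ACTION_NOTIFICATION_LISTENER_SETTINGS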

godshatter 11/13/2025|||
> An app can read the content of notifications if the appropriate permissions are granted, which includes 2FA codes sent by SMS or email.

Do apps generally do this? I've never run into one that doesn't expect me to type in the number sent via SMS or email, rather than grabbing it themselves.

I don't use a lot of apps on my android phone, though, so maybe this is a dumb question to those who do.

Zak 11/13/2025||
Most apps don't read notifications for that purpose, and I'm not sure they'd be allowed in the Play Store if they wanted the permission just for that. It's mainly used for automation and sending notifications to other devices like PCs and maybe smartwatches.
klabb3 11/13/2025|||
Yes, but see my last paragraph. Reading notifications doesn’t apply to the majority of apps. It’s not a binary choice. On iOS, you need special entitlements for certain high level privileges. Isn’t it already the same on Android?
Zak 11/13/2025||
It's similar. I think there's a difference in that special entitlements have to be approved by Apple. Read/manage notifications is under "special app access", which has a different prompt where the user has to pick the app from a list and flip a toggle to grant the permission rather than just tapping OK.
Groxx 11/13/2025||||
yes, they're admitting that their APIs are powerful enough to build accessibility tools (which often must read notifications) and many other useful things (e.g. Pushbullet) that are not possible on iOS.

powerful stuff has room for abuse. I don't really think there's much of a way to make that not the case. it's especially true for anything that you grant accessibility-level access to, and "you cannot build accessibility tools" is a terrible trade-off.

(personally I think there's some room for options with taint analysis and allowing "can read notifications = no internet" style rules, but anything capable enough will also be complex enough to be a problem)

klabb3 11/13/2025|||
You may be overthinking it. Verification of some sort isn’t the end of the world, it’s arguably an acceptable damage control stop-gap that has precedent on other platforms like special entitlements on iOS and kernel extensions on Windows.

Google's proposal was to require everyone to verify in order to publish any app through any channel. That would be the equivalent of a web browser enforcing a whitelist of websites because one scam site asked for access to something bad.

If scam apps use an API designed by Google to steal user data, then they should fix that, without throwing the baby out with the bathwater.

Groxx 11/13/2025||
might have meant to reply to someone else? I haven't said anything about verification here
reorder9695 11/14/2025|||
I mean, the solution really is a comprehensive permissions system: for an accessibility tool that needs to read notifications, you should be able to deny it network permission and whitelist which apps' notifications it's allowed to read.
Groxx 11/14/2025||
entirely agreed, but in the context of this thread that means you just have to convince someone to enable it for the one app, rather than the phone as a whole. which doesn't seem to help at all with the coercion scenario (if anything that might make it safer-sounding and therefore easier), just under normal use / to limit possibly-malicious apps.
realusername 11/13/2025|||
> Are they admitting that their app sandboxing is so weak that a malicious app can exfil data from other unaffiliated apps?

It's not news, both iOS and Android sandboxing are Swiss cheese compared to a browser.

People should only install apps from trusted publishers (and not everything from the store is trusted, as the store just does very basic checks).

Groxx 11/13/2025||
browsers are really not much better. on an absolute level, I definitely agree they're better (e.g. they have per-url and only-after-click permissions for some things), but they've all got huge gaps still once you start touching extensions. and beyond that it remains to be seen, since OS-level permissions are significantly broader-possibility than in-browser due to being able to touch far more sensitive data.
boxedemp 11/13/2025|||
Only a few things in life are for sure. Death, taxes, and corpospeak.
_factor 11/13/2025||
Hey, sometimes the dumbest people it works on are also the ones with the decision making ability. What a world to live in.
BrenBarn 11/13/2025|||
Their top priority is making money.
shirro 11/13/2025|||
Making money and complying with the law. They are obligated to do both. In many countries laws are still enforced.

Protecting their app store revenues from competition exposes them to scrutiny from competition regulators and might be counter productive.

Many governments are moving towards requiring tech companies to enforce verification of users and limit access to some types of software and services or impose conditions requiring software to limit certain features such as end to end encryption. Some prominent people in big tech believe very strongly in a surveillance state and we are seeing a lot of buy in across the political spectrum, possibly due to industry lobbying efforts. Allowing people to install unapproved software limits the effectiveness of surveillance technologies and the revenues of those selling them. If legal compliance risks are pushing this then it is a job for voters, not Google to fix.

BrenBarn 11/13/2025||
Complying with the law is just another way of protecting your money. I have no doubt they would break laws if they judged it better for the bottom line --- in fact I have little doubt they're already doing so. On the flip side, if there were ruinous penalties for their anticompetitive behaviors (i.e., in the tens or hundreds of billions of dollars) they might change course.

Certainly voters need to have their say, but often their message is muffled by the layers of political and administrative material it passes through.

hekkle 11/13/2025|||
BINGO! Google doesn't care at all about user security.

- Just yesterday there was a story on here about how Google found esoteric bugs in FFMPEG and told the volunteers to fix them.

- Another classic example of how Google doesn't give a stuff about their users' security is the scam ads they allow on YouTube. Google knows these are scams, but doesn't care, because there isn't regulation requiring oversight.

gpm 11/13/2025||
> Just yesterday there was a story on here about how Google found [a security vulnerability that anyone running `ffmpeg -i <untrusted file> ...` was vulnerable to] in FFMPEG, and told [the world about it so that everyone could take appropriate action before hackers found the same thing and exploited it, having first told the ffmpeg developers about it in case they wanted to fix it before it was announced publicly]

Fixed that for you. Google's public service was both entirely appropriate and highly appreciated.

hekkle 11/13/2025||
> and highly appreciated.

Not by the maintainers it wasn't, Mr. Google.

gpm 11/13/2025||
Yes, but it was a public service, not a service for the maintainers, and as a member of the public who, like anyone who had run `ffmpeg -i <thing I downloaded from the internet>`, was previously exposed to the vulnerability, I highly appreciate their service.

I'd highly appreciate it even if the maintainers never did anything with the report, because in that case I would know to stop using ffmpeg on untrusted files.

hekkle 11/13/2025||
So you were using untrusted video files that required the LucasArts Smush codec?

Again, if YOU highly appreciate their service, that's great, but FFMPEG isn't fixing a codec for a decades old game studio, so all Google has done is tell cyber criminals how to infect your Rebel Assault 2. I'm glad you find that useful.

gpm 11/13/2025||
No, I was running on normal untrusted video files. The standard ffmpeg command line would happily attempt to parse those with the LucasArts Smush codec even though I'd never heard of it before.

See the POC in the report by google, the command they run is just `./ffmpeg -i crash.anim -f null /dev/null -loglevel repeat+trace -threads 1` and the only relevant part of that for being vulnerable is that crash.anim is untrusted.

Edit: And to be clear, it doesn't care about the extension. You can name it kittens.mp4 instead of crash.anim and the vulnerability works the same way.

reddalo 11/13/2025|||
Their top priority is preventing people from using YouTube ReVanced or uBlock Origin on Firefox. That's their top priority.
ajkjk 11/13/2025||
this is an absurd rant. they invest, like, billions into security. It's not as perfect as you want it to be but "completely ignoring" is a joke. if you've got actual grievances you should say what they are so that we can actually get on your side instead of rolling our eyes
asadotzler 11/13/2025|||
They absolutely do completely ignore many security and privacy things, because they're very selective in what they focus on, particularly around how those things might impact their ad revenue.

How much they spend is no indicator of how and where they spend it, so is hardly a compelling argument.

wmf 11/13/2025|||
I'm not the OP but we know that SMS is not secure. Google should try banning that first.
arcfour 11/13/2025||
Some security is better than no security. It already took years to even get some of these backwards-thinking companies and services to adopt SMS OTP and it's simple for non-technical users to intuit. Also, believe it or not, some people don't have smartphones, and they will riot if you try to make them switch to any other MFA method...

Of course, I'm not saying we shouldn't push to improve things, but I don't think this is the right reaction either.

notepad0x90 11/13/2025||
This is the worst of both worlds: you can spread your malware as a sideloaded APK just fine, but once it's so big that you're probably burned anyway, then you need to verify your account.

I think a better compromise would have been for Google to require developer verification, but also allow third-party app stores like F-Droid that don't require verification yet are still required to "sign" the APKs, instead of users enabling wide-open APK sideloading. That way, hobbyists can still publish apps in third-party stores, and it is a couple more steps for users to fall for social engineering, because they now have to install/enable F-Droid, and then find the right malicious app and download it. An APK downloaded straight from the malicious site won't be loaded no matter what.

Google can then require third-party app stores to highlight things like number of downloads and developer reputation, and maybe even require an inconsistent set of steps to search for and find apps, to make it harder to socially engineer people (names of buttons, UX arrangements, number of clicks, etc. - randomize it all).

What has frustrated me on this topic from the beginning is that solutions like what I'm proposing (and better ones) are possible. But the prevailing sentiment on HN (and elsewhere) is pitchforks and torches. Ok, disagree with Google, but let's discuss how to solve the Android malware problem that is hurting real people; it is irresponsible to do otherwise.

flakiness 11/13/2025||
It's not super clear from the post, but if I read it correctly there are two modifications suggested.

   - 1: Separate verification type for "student and hobbyist"
   - 2: "advanced flow" for "power users" that allows sideloading of unverified apps - I imagine this is some kind of scare-screen, but we'll see.
What you describe as "worst of both worlds" is about point 1. I'm not sure point 2 is powerful enough to support things like F-Droid, but again, we'll see.
notepad0x90 11/13/2025||
Malware is good at getting users to click past scare screens, unfortunately. This isn't a solved problem, even with desktop browsers.
zarzavat 11/13/2025|||
If you don't look both ways when you cross the road, then you may get hit by a car. The solution is to pay attention.

It's acceptable to build a system where human error can lead to catastrophic consequences, even death. Every time you go outside you encounter many of these systems.

Not everything in life can be made 100% safe, but that's no reason to stop living.

consp 11/13/2025||
> The solution is to pay attention.

Swindlers working around that is a story as old as time. Even snake-oil salesmen were good at distracting people from obvious signs of false promises and warnings. People often greatly overestimate their own capabilities, just as there are no bad drivers on the road when you ask people about themselves.

vladms 11/13/2025||
I'm afraid the solution is to learn from mistakes. Which can be painful, takes effort, and at which some people will fail.

Society must be aware we are balancing "protection" and "responsibility". If you want some freedom you must have some responsibility.

I do not mind offering to some people more "protection" if it is clear they give up some "freedom". Some might accept the risks, some will not.

IshKebab 11/13/2025|||
There are definitely things you could do to improve it though. E.g. you can't activate "I know what I'm doing" mode while on the phone or for 1 hour after a phone call. Someone else suggested a one-day cooldown.

Also for the specific scam they mentioned, why do apps even have permission to intercept all notifications?? Just fix that!

mzajc 11/13/2025||
> why do apps even have permission to intercept all notifications?? Just fix that!

I fear "fixing" it would mean removing the feature entirely, which breaks many workflows. Primarily this is used for accessibility (and is controlled in the accessibility settings), but applications such as KDE Connect also make good use of it.

lern_too_spel 11/13/2025|||
> Google can then require highlighting things like number of downloads and developer reputation by 3rd party appstores

F-droid doesn't want to track number of installs because that is an invasion of privacy.

> require developer verification, but also allow third party appstores like f-droid that don't require verification

Now you've moved the problem from Google gatekeeping apps to Google gatekeeping app stores. We don't want either.

notepad0x90 11/13/2025||
Then I guess you can't publish apps? One of those issues where I should be "writing to my congressman" or whatever, I guess. The problem is real, and people like you are being obtuse, unwilling to find a solution or a compromise. Something as simple as the number of installs is an invasion of privacy? How? It's a number; you increment a counter when someone hits download, that's it.

Yeah, Google gets to have rules over what apps carrying its seal of approval can do; that's how seals of approval work. You're not entitled to these things. You don't have the right to publish to the Android platform, and if Google, wary of anti-trust suits, allows a 3rd party app store, it can institute reasonable requirements.

If an app store is willingly hosting malware, should Google still provide its seal of approval? That was supposed to be rhetorical, but I wouldn't be surprised if you told me that they should.

This is willful ignorance. I only hope you educate yourself on the harms caused by malware and malicious actors, and consider taking a practical approach to finding solutions instead of dying on every single hill.

curtisnewton 11/13/2025|||
> Then i guess you can't publish apps?

I want to distribute apps (someone might also want to simply sell them), not publish them

I don't need a publisher; the internet is already a publishing medium.

> you don't have the right to publish to the android platform

then let me install an alternative OS on the HW i legally bought and own or pay me back.

> the harms caused by malware and malicious actors

life is full of people doing harm and of malicious actors, but we don't let Google or any other company gatekeep our lives

notepad0x90 11/13/2025||
> life is full of people doing harms and malicious actors, but we don't let Google or any other company gatekeep our lives

Yeah, you're certainly not speaking for malware victims here. Android is not your life, so Google gatekeeping Android (actually only Google-approved builds) is not gatekeeping your life.

You certainly should be able to load an alternative OS. Isn't that what Lineage and other Android distributions do already?

monegator 11/14/2025|||
Not when Google lobbies your government and banks to require "Play Integrity" in order to use government apps and bank apps.

Not device integrity (locked bootloader, signed image, which can be done with an alternative OS) but "Play Integrity", i.e. approved by Google. In other words, you can't run Android without Google's services and Google's built-in ads.

And the alternative is iOS.
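For reference, the gate described above is the Play Integrity API: the app asks Google Play for a signed verdict, and the bank's backend refuses to proceed unless the decoded verdict carries a label like MEETS_DEVICE_INTEGRITY, which custom ROMs without Google certification generally cannot obtain. A rough client-side sketch using the official library (a classic request; nonce generation and error handling omitted):

    import android.content.Context
    import com.google.android.play.core.integrity.IntegrityManagerFactory
    import com.google.android.play.core.integrity.IntegrityTokenRequest

    // Sketch of a classic Play Integrity request. The returned token is opaque;
    // the app's backend decodes it via Google's servers and checks the
    // deviceIntegrity verdict before allowing, e.g., a banking login.
    fun requestIntegrityVerdict(context: Context, nonce: String) {
        val integrityManager = IntegrityManagerFactory.create(context)
        integrityManager
            .requestIntegrityToken(
                IntegrityTokenRequest.builder()
                    .setNonce(nonce)
                    .build()
            )
            .addOnSuccessListener { response ->
                val token = response.token() // sent to the backend for verification
            }
    }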

debazel 11/14/2025||||
Loading alternative OSes is not an option anymore because play store attestation is now mandatory to live a normal life in society.
curtisnewton 11/13/2025|||
[dead]
Yokolos 11/13/2025||||
How about the harms of fascist authoritarian governments that will use this functionality to ban any apps they don't like? Why do you people only care about malware and not essential fundamental freedoms that affect us every fucking day?
curtisnewton 11/13/2025|||
I guess it's because propaganda and scare tactics work.
notepad0x90 11/13/2025|||
Talk about a straw man. "Fascist authoritarian" is rich; governments don't need that to ban apps. Google can ban APKs on all Android phones with a Play Store any time it wants. Microsoft can do this on any Windows machine with Windows Update turned on (they have in the past), and Apple can do it with its OSes too.

Your freedoms are not the subject of this topic, not even remotely. Google isn't even banning you from doing anything on an Android phone; this is strictly about approving Android builds by phone vendors, so you're not even the subject here. Google doesn't want to approve Android builds that allow sideloading. You can still install Lineage.

Your argument here is actually "fascist authoritarian": you want to impose your view on the general public that sideloading should be enabled. Having the option for yourself and other willing people to run something other than vendor-built Android is not enough; you want the public to also leave the gates open so you can sideload your random APKs.

Oh, and for the record, my post was about finding a compromise, not the false dichotomy you presented. If you made a car without a seatbelt, it wouldn't be allowed on the roads; if a phone vendor builds an unsafe Android where random devs can sideload APKs, that shouldn't be allowed either. Forget Google, governments should be enforcing the sideload ban lol.

You don't appreciate your freedoms and insist on abusing them, so actual freedoms end up being taken away!

StopDisinfo910 11/13/2025|||
> people like you are being obtuse, unwilling to find a solution or a compromise.

How are people being obtuse for refusing to compromise on solutions to a problem that doesn't exist?

You can't misrepresent the situation, establish that one American company having absolute control over what people do with their devices is somehow the norm, and then complain that people won't meet you halfway.

notepad0x90 11/13/2025||
> How are people being obtuse for refusing to compromise for solutions on a problem which doesn’t exist?

I'll give you the benefit of the doubt and assume you're just not well informed.

Millions of people are losing billions of dollars. Women are having their private media published to the masses. People are getting divorced, fired from jobs, etc., because of Android malware. The problem is, for the most part, nearly nonexistent on iPhones, because Apple locks that down (though now, thanks to the "my freedom" type of freedom abusers, that is changing too).

Apple already does this. You can't publish a driver for Windows without verifying your identity and buying an expensive code-signing cert. Google isn't doing anything new; as a matter of fact, they're not doing enough! This still permits things like LineageOS and other Android builds to be installed -- that's your freedom. But since the prevailing sentiment is to resist a more secure way of doing things, the outcome will be that all smartphones will only load signed kernels/firmware in the future, and all signers will be required to identify themselves; this will kill a lot of Android builds.

This is why compromise is important. Your liberties are important to you, but you can't just dismiss the harm to the masses like that and refuse to find a compromise or a solution; that's how you lose what little freedom you have.

This is why things like "chat control" keep creeping up, and they will succeed down the road.

huem0n 11/13/2025|||
> hobbyists can still publish apps in third party stores

I shouldn't need an internet connection just to make an app for a device I own.

wiseowise 11/13/2025||
Why do I need a store to install something on my phone that I own?
AdmiralAsshat 11/13/2025||
In light of Google's recent push to eliminate this, I went and installed F-Droid to see what we'd be losing. I had thought about it for years, but always held off on doing it on my daily driver phone because I simply didn't want to open the floodgates on allowing apps to start randomly installing on my phone.

But having done it, I'm actually pretty impressed with the existing security. At least on my S24, you have to both enable sideloading at the system level, and enable each specific app to be allowed to "Install other apps" (e.g. when I first tried to launch the APK that I had downloaded from Firefox, I received a notification that I would need to whitelist Firefox to be allowed to install apps. I decided no, and instead whitelisted my File Manager app and then opened the APK through that).

I then installed F-Droid, allowed it to install other apps, installed NewPipe, and then toggled the system-level sideloading setting back off. NewPipe still works, and I don't think anything else can install. This addresses my security paranoia that once the door to sideloading is opened, apps can install other apps willy-nilly. Not so.
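What's described above is Android's per-app "Install unknown apps" permission (API 26+): each would-be installer (browser, file manager, store) must declare REQUEST_INSTALL_PACKAGES and be individually whitelisted by the user. A small sketch of how an installer app might check for it and send the user to that settings screen (function name is made up):

    import android.app.Activity
    import android.content.Intent
    import android.net.Uri
    import android.provider.Settings

    // If this app hasn't been whitelisted under "Install unknown apps",
    // open the per-app settings screen described in the comment above.
    // REQUEST_INSTALL_PACKAGES must also be declared in the manifest.
    fun ensureInstallPermission(activity: Activity) {
        if (!activity.packageManager.canRequestPackageInstalls()) {
            activity.startActivity(
                Intent(
                    Settings.ACTION_MANAGE_UNKNOWN_APP_SOURCES,
                    Uri.parse("package:${activity.packageName}")
                )
            )
        }
    }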

So I really don't see what this new initiative by Google solves, other than, as others have said, control. The idea that all user security woes come from sideloaded apps and that users would somehow be safe if they stuck strictly to the Play Store is patently untrue, given the number of malware-laden apps currently lurking there.

NoGravitas 11/14/2025|
You can also de-whitelist your file manager app from installing apps after you install F-Droid.
erohead 11/13/2025||
Sounds like they're rolling back the mandatory verification flow:

> Based on this feedback and our ongoing conversations with the community, we are building a new advanced flow that allows experienced users to accept the risks of installing software that isn't verified. We are designing this flow specifically to resist coercion, ensuring that users aren't tricked into bypassing these safety checks while under pressure from a scammer. It will also include clear warnings to ensure users fully understand the risks involved, but ultimately, it puts the choice in their hands. We are gathering early feedback on the design of this feature now and will share more details in the coming months.

Ajedi32 11/13/2025||
I'm a little nervous about what this advanced flow is going to look like, given that sideloading already requires jumping through a bunch of hoops to enable and even that apparently wasn't enough to satisfy Google.

I'm cautiously optimistic though. I'm generally okay with nanny features as long as there's a way to turn them off and it sounds like that's what this "advanced flow" does.

gowthamgts12 11/13/2025|||
> Sounds like they're rolling back the mandatory verification flow

Absolutely not. This is for the user side. But if you're a developer planning to publish an app in an alternative app store or from your own website, you still have to go through the verification flow. Please read the full text.

Ajedi32 11/13/2025|||
That's only if you don't want your users to have to jump through whatever hoops are needed to bypass the verification requirement.
rbits 11/13/2025|||
But it's not mandatory anymore because people can install it without it being verified.
silisili 11/13/2025||
I feel like if safety were really their top priority, they would have done this long ago and not bothered with this mandatory-signing nonsense to begin with...

Still, it seems like good news, so I'll take it.

A4ET8a8uTh0_v2 11/13/2025|
"Allow". This is the entirety of the problem. They are allowing things on my machine that I purchased with monies that I leased my soul for.

Anyway, I am already planning for a future in which Google does not feature as prominently as it has until now. Small steps so far (GrapheneOS), but to me the writing on the wall is unmistakable. Google got cold feet over the feedback, and now they can "allow" things.

When the negative publicity ends, they will start working towards locking it down further again. I am personally done with passively accepting it. It might be annoying, but degoogling is simply a necessity.

sdoering 11/13/2025||
> I am already planning for a future in which Google does not feature

This. Currently I am still a paying Google customer for a few things running my freelance side business. I am in the process of migrating my data out of Google Drive and migrating my photos out as well.

Next step is taking back control over my email infrastructure, especially as Google nowadays sorts a relevant number of important mails into spam while letting more and more crap through to my inbox.

They also unilaterally raised the price because AI is now included. Fuck them - I am not using their shitty AI and I did not buy that. I use AI daily - just not the crap product Google shoved down my throat.

GrapheneOS/postmarketOS are next on my list. As I have a tertiary device around, I will set this up during the dark months ahead and see if it fits my needs.

With Arch now my daily driver (except for the main job), I plan to use far less US tech vendor crap. There are so many beautiful and not too difficult to use OS solutions out there, easily hostable on servers inside a more sensible jurisdiction.

I'm also currently working on a solution to get around the enshittified YouTube experience, without it becoming an unreasonable effort to still watch the interesting things on the big screen in my living room. The automated AI audio translations did it in for me. I already find the automated title translations abhorrent - but starting a video and having it dubbed by an awful AI voice was just a bit too much for me.

xandrius 11/13/2025||
Consider Ubuntu Touch - really nice ecosystem and community, and you can run many Android APKs.
worksonmine 11/14/2025||
Ironic suggestion in this context considering how hard Ubuntu has pushed snaps over competing solutions. Canonical is the Google of the Linux ecosystem.