Posted by bookofjoe 3 hours ago
This is why the FBI can compel Microsoft to provide the keys. It's possible, perhaps even likely, that the suspect didn't even know they had an encrypted laptop. Journalists love the "Microsoft gave" framing because it makes Microsoft sound like they're handing these out because they like the cops, but that's not how it works. If your company has data that the police want and they can get a warrant, you have no choice but to give it to them.
This makes the privacy purists angry, but in my opinion it's the reasonable default for the average computer user. It protects their data in the event that someone steals the laptop, but still allows them to recover their own data later from the hard drive.
Any power users who prefer their own key management should follow the steps to enable Bitlocker without uploading keys to a connected Microsoft account.
Except the steps to do that are: disable BitLocker, create a local user account (assuming you initially signed in with a Microsoft account, since MS now forces one on you for Home editions of Windows), delete your existing keys from OneDrive, then re-encrypt using your local account and make sure never to sign into your Microsoft account or link it to Windows again.
A much more sensible default would be to give the user a choice right from the beginning, much like Apple does. When you go through the Setup Assistant on a Mac, it doesn't assume you're an idiot; it literally asks you up front: "Do you want to store your recovery key in iCloud or not?"
That's not so easy. Microsoft tries really hard to get you to use a Microsoft account. For example, logging into MS Teams will automatically link your local account with the Microsoft account, thus starting the automatic upload of all kinds of stuff unrelated to MS Teams.
In the past I also had Edge importing Firefox data (including stored passwords) without me agreeing to do so, and then uploading those into the Cloud.
Nowadays you just need to assume that all data on Windows computers is available to Microsoft; even if you temporarily find a way to keep your data out of their hands, an update will certainly change that.
I used to be a windows user, it has really devolved to the point where it's easier for me to use Linux (though I'm technical). I really feel for the people who aren't technical and are forced to endure the crap that windows pushes on users now.
That’s the real problem MS has. It’s becoming a meme how bad the relationship between the user and windows is. It’s going to cause generational damage to their company just so they can put ads in the start menu.
Btw - my definition of “possible” would include anything possible in the UI - but if you have to edit the registry or do shenanigans in the filesystem to disable the upload from happening, I would admit that it’s basically mandatory.
I mean, this is one application nobody should ever log into!
I did this myself for about 8 years, from 2016-2024. During that time my desktop system at home was running Linux with ZFS and libvirt, with Windows in a VM. That Windows VM was my usual day-to-day interface for the entire system. It was rocky at first, but things did get substantially better as time moved on. I'll do it again if I have a compelling reason to.
The only windows I am using is the one my company makes me use but I don't do anything personal on it. I have my personal computer next to it in my office running on linux.
Admittedly, the risks of choosing this option are not clearly laid out, but the way you are framing it also isn't accurate
Whether you opt in or not, if you connect your account to Microsoft and the account is not local-only, then they do have the ability to fetch the BitLocker key. [0] Global Reader is built in to everything +365.
[0] https://github.com/MicrosoftDocs/entra-docs/commit/2364d8da9...
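For the curious, escrowed keys are readable programmatically through Microsoft Graph by anyone holding an appropriate directory role. A minimal sketch of what that request looks like, based on the public Graph docs for the `bitlockerRecoveryKey` resource — treat the exact path and permissions as assumptions and check current documentation; the key ID and token below are made up, and nothing is actually sent:

```python
# Sketch: building (not sending) the Graph request an admin would use to
# read an escrowed BitLocker recovery key. Endpoint path per public docs;
# verify against current Microsoft Graph documentation before relying on it.

GRAPH = "https://graph.microsoft.com/v1.0"

def bitlocker_key_request(key_id: str, token: str) -> tuple[str, dict]:
    # $select=key is required: by default the key material itself is omitted
    url = f"{GRAPH}/informationProtection/bitlocker/recoveryKeys/{key_id}?$select=key"
    headers = {"Authorization": f"Bearer {token}"}
    return url, headers

url, headers = bitlocker_key_request("b465e4e8-example", "eyJ-example-token")
print(url)
```

The point is that this is an ordinary, audited API surface, not a secret backdoor — which cuts both ways in this thread's argument.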
The question is do they ever fetch and transmit it if you opt out?
The expected answer would be no. Has anyone shown otherwise? Because hypotheticals that they could are not useful.
Hell, what about all the times they keep enabling OneDrive despite it being really clear I don't want it, and then uploading stuff to the cloud that I don't want uploaded?
I have zero trust for Microsoft now, and not much better for them in the past either.
This does not apply to standalone devices. MS doesn't have a magic way to reach into your laptop and pluck the keys.
I'd be curious to see a conclusive piece of documentation about this, though
Of course they do! They can just create a Windows Update that does it. They have full administrative access to every single PC running Windows in this way.
1. Is there any indication it forcibly uploads your recovery keys to microsoft if you're signed into a microsoft account? Looking at random screenshots, it looks like it presents you an option https://helpdeskgeek.com/wp-content/pictures/2022/12/how-to-...
2. I'm pretty sure you don't have to decrypt and re-encrypt the entire drive. The actual key used for encrypting data is never revealed, even if you print or save a recovery key. Instead, BitLocker generates "protectors": each protector encrypts the actual key using a recovery key (or TPM, PIN, etc.) and stores the encrypted version on the drive. If you remove a recovery method (i.e. a protector), the associated recovery key becomes immediately useless. Therefore, if your recovery keys were backed up to Microsoft and you want to opt out, all you have to do is remove the protector.
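The protector scheme described above can be sketched in a few lines. This is a toy model — the XOR "wrap" stands in for the real AES key-wrapping BitLocker uses, and all names are made up — but it shows why deleting a protector invalidates an escrowed recovery key without touching the encrypted data:

```python
import secrets, hashlib

def wrap(volume_key: bytes, protector_key: bytes) -> bytes:
    # Toy "wrap": XOR the volume key with a stream derived from the
    # protector key. Real BitLocker uses AES; this is only illustrative.
    stream = hashlib.sha256(protector_key).digest()
    return bytes(a ^ b for a, b in zip(volume_key, stream))

def unwrap(blob: bytes, protector_key: bytes) -> bytes:
    return wrap(blob, protector_key)  # XOR is its own inverse

# One volume key encrypts the data; it never changes.
volume_key = secrets.token_bytes(32)

# Independent protectors, each a wrapped copy of the volume key.
tpm_key = secrets.token_bytes(32)
recovery_key = secrets.token_bytes(32)
protectors = {
    "tpm": wrap(volume_key, tpm_key),
    "recovery": wrap(volume_key, recovery_key),
}

# Either protector recovers the same volume key.
assert unwrap(protectors["tpm"], tpm_key) == volume_key
assert unwrap(protectors["recovery"], recovery_key) == volume_key

# "Opting out": delete the recovery protector. Any escrowed copy of
# recovery_key is now useless, but the volume key -- and the encrypted
# data on disk -- is untouched. No re-encryption needed.
del protectors["recovery"]
```

So removing the escrowed protector is cheap; what you can't verify is whether a copy of the old recovery key still sits on a server somewhere, which is why you'd rotate the protector rather than just trust the deletion.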
Once the feature exists, it's much easier to use it by accident. A finger slip, a bug in a Windows update, or even a cosmic ray flipping the "do not upload" bit in memory, could all lead to the key being accidentally uploaded. And it's a silent failure: the security properties of the system have changed without any visible indication that it happened.
ETA: You're not wrong; folk who have specific, legitimate opsec concerns shouldn't be using certain tools. I just initially read your post a certain way. Apologies if it feels like I put words in your mouth.
Stats on this very likely scenario?
From the wikipedia article on "Soft error", if anyone wants to extrapolate.
So roughly you could expect this to happen once every two hundred million years per machine.
Assuming there are about 2 billion Windows computers in use, that’s about 10 computers a year that experience this bit flip.
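The arithmetic checks out; here it is spelled out (both inputs are the thread's rough assumptions, not measurements):

```python
# Back-of-the-envelope check of the expected yearly count of this
# specific bit flip across the installed base.
windows_machines = 2_000_000_000          # assumed ~2 billion Windows PCs
years_per_flip_per_machine = 200_000_000  # assumed one flip per ~200M machine-years

flips_per_year = windows_machines / years_per_flip_per_machine
print(flips_per_year)  # → 10.0
```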
That's wildly more than I would have naively expected to experience a specific bit-flip. Wow!
This was covered at DEF CON 19.
Can't find an explanatory video though :(
More on the topic: Single-event upset[1]
Also, around 1999-2000, Sun blamed cosmic rays for bit flips for random crashes with their UltraSPARC II CPU modules.
Yep, hardware failures, electrical glitches, EM interference... All things that actually happen to actual people every single day in truly enormous numbers.
It ain't cosmic rays, but the consequences are still flipped bits.
This is absurd, because it's basically a generic argument about any sort of feature that vaguely reduces privacy. Sorry guys, we can't have automated backups in windows (even opt in!), because if the feature exists, a random bitflip can cause everything to be uploaded to microsoft against the user's will.
Before you downvote, please entertain one question: could you have predicted that mandatory identification of online users, under the guise of protecting children, would be implemented in leading Western countries this quickly? If you could, then upvote my comment, because you know what comes next. If you couldn't even imagine this in, say, 2023, then upvote my comment anyway, because you likewise can't imagine mandatory key extraction.
We have mandatory identification for all kinds of things that are illegal to purchase or engage in under a certain age. Nobody wants to prosecute 12-year-old kids for lying when they clicked the "I am at least 13 years old" checkbox when registering an account. The only alternative is to do what we do with R-rated movies, alcohol, tobacco, firearms, risky physical activities (e.g. a bungee-jumping liability waiver), etc.: we put the onus of verifying identification on the suppliers.
I've always imagined this was inevitable.
When I go buy a beer at the gas station, all I do is show my ID to the cashier. They look at it to verify DOB and then that's it. No information is stored permanently in some database that's going to get hacked and leaked.
We can't trust every private company that now has to verify age to not store that information with whatever questionable security.
If we aren't going to do a national registry that services can query to get back only a "yes or no" on whether a user is of age or not, then we need regulation to prevent the storage of ID information.
We should still be able to verify age while remaining pseudonymous.
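A registry like that only needs to answer one boolean question. Here's a hypothetical sketch of the API shape — every name, token, and record below is made up for illustration; a real design would also need blinded tokens so the registry can't build a query log, which is exactly the timebomb mentioned below:

```python
from datetime import date

# Stand-in for the government's records: opaque token -> date of birth.
# In a real system the service would never see the DOB, only the answer.
REGISTRY = {
    "token-alice": date(1990, 5, 1),
    "token-bob": date(2012, 8, 15),
}

def is_of_age(token: str, minimum_age: int, today: date) -> bool:
    """Answer only yes/no; no identifying data leaves the registry."""
    dob = REGISTRY[token]
    # Subtract one if this year's birthday hasn't happened yet.
    age = today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))
    return age >= minimum_age

today = date(2026, 1, 1)
print(is_of_age("token-alice", 18, today))  # → True
print(is_of_age("token-bob", 18, today))    # → False
```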
And note that if we are, the records of the request to that database are an even bigger privacy timebomb than those of any given provider, just waiting for malicious actors with access to government records.
Beer, sure. But if you buy certain decongestants, they do log your ID. At least that's the case in Texas.
Yeah, but many people don't actually think War on Drugs policies are a model for civil liberties that should be extended beyond that domain (or, in many cases, even tolerated in that domain.)
> that just so happens to take a screenshot of your screen every few seconds
Recall is off by default. You have to go turn it on if you want it.
Microsoft also happens to own LinkedIn which conveniently "forgets" all of my privacy settings every time I decide to review them (about once a year) and discover that they had been toggled back to the privacy-invasive value without my knowledge. This has happened several times over the years.
How?
There is no point locking your laptop with a passphrase if that passphrase is thrown around.
Sure, maybe some thief can't get access, but they probably can if they can convince Microsoft to hand over the key.
Microsoft should not have the key, thats part of the whole point of FDE; nobody can access your drive except you.
The cost of this is that if you lose your key: you also lose the data.
We have trained users about this for a decade; there have been countless dialogs explaining this. And even if we were dumber than we are (we're not, despite what we're told: users just have fatigue from overstimulation due to shitty UX everywhere), it would still be a bad default.
There is the old password for candy bar study: https://blog.tmb.co.uk/passwords-for-chocolate
Do users care? I would posit that the bulk of them do not, because they just don't see how it applies to them, until they run into some kind of problem.
Most (though not all) users are looking for encryption to protect their data from a thief who steals their laptop and who could extract their passwords, banking info, etc. Not from the government using a warrant in a criminal investigation.
If you're one of the subset of people worried about the government, you're generally not using default options.
There are a lot of people here criticising MSFT for implementing a perfectly reasonable encryption scheme.
This isn’t some secret backdoor, but a huge security improvement for end-users. This mechanism is what allows FDE to be on by default, just like (unencrypted) iCloud backups do for Apple users.
Calling bs on people trying to paint this as something it’s not is not “whiteknighting”.
>Microsoft is an evil corporation, so we must take all bad stories about them at face value. You're not some corpo bootlicker, now, are you? Now, in unrelated news, I heard Pfizer, another evil corporation with a dodgy history[1] is insisting their vaccines are safe...
Yes. The thing is: Microsoft made the design decision to copy the keys to the cloud, in plaintext. And they made this decision with the full knowledge that the cops could ask for the data.
You can encrypt secrets end-to-end - just look at how password managers work - and it means the cops can only subpoena the useless ciphertext. But Microsoft decided not to do that.
I dread to think how their passkeys implementation works.
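The password-manager model mentioned above is straightforward: encrypt the escrowed secret client-side with a key derived from something only the user knows, so the server (and anyone who subpoenas it) holds only ciphertext. A toy sketch — the XOR "cipher" stands in for real AES-GCM and must not be used for actual data, and all names are illustrative:

```python
import hashlib, secrets

def derive_key(passphrase: str, salt: bytes) -> bytes:
    # Derive a 32-byte key from the user's passphrase (stdlib PBKDF2).
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 200_000)

def toy_encrypt(key: bytes, data: bytes) -> bytes:
    # Toy XOR cipher, symmetric (encrypt == decrypt). Illustration only.
    stream = hashlib.sha256(key).digest()
    return bytes(a ^ b for a, b in zip(data, stream))

salt = secrets.token_bytes(16)
recovery_key = secrets.token_bytes(32)        # the BitLocker-style secret
client_key = derive_key("user passphrase", salt)

# This blob is all the server ever stores -- and all a subpoena yields.
escrowed_blob = toy_encrypt(client_key, recovery_key)
assert escrowed_blob != recovery_key

# The user can always recover the key with their passphrase.
assert toy_encrypt(derive_key("user passphrase", salt), escrowed_blob) == recovery_key
```

The trade-off is real, though: in this design a forgotten passphrase means the escrow is unrecoverable, which is exactly the support nightmare Microsoft's plaintext escrow avoids.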
> It protects their data in the event that someone steals the laptop, but still allows them to recover their own data later from the hard drive.
It allows /anyone/ to recover their data later. You don't have to be a "purist" to hate this.
With this scheme the drive is recoverable by the user and unreadable to everyone except you, Microsoft, and the police. Surely that's a massive improvement over sitting in plaintext readable by the world. The people who are prepared to do proper key management will know how to do it themselves.
The real issue is that you can't be sure that the keys aren't uploaded even if you opt out.
At this point, the only thing that can restore trust in Microsoft is open sourcing Windows.
The fully security conscious option is to not link a Microsoft account at all.
I just did a Windows 11 install on a workstation (Windows mandatory for some software) and it was really easy to set up without a Microsoft account.
And newer builds of Windows 11 are removing these methods, to force use of a Microsoft account. [0]
[0] https://www.windowslatest.com/2025/10/07/microsoft-confirms-...
By "really easy" do you mean you had a checkbox? Or "really easy" in that there's a secret sequence of key presses at one point during setup? Or was it the domain join method?
Googling around, I'm not sure any of the methods could be described as "really easy" since it takes a lot of knowledge to do it.
I'd be more concerned about access to cloud data (emails, photos, files.)
It's 2026. The abuses of corporations are well documented. Anyone who still chooses Windows of their own volition is quite literally asking for it and they deserve everything that happens to them.
I'd already long since migrated away from Windows but if I'd been harbouring any lingering doubts, that was enough to remove them.
What do you suggest? I’ll try it in a VM or live usb.
If you care a little more about your privacy and are willing to sacrifice some convenience, go for Fedora. It's community-run and fairly robust. You may have issues with media codecs, Nvidia drivers, and a few other wrinkles though. The "Workstation" flavor is the most mature, but you may want to give the KDE version a try.
If you want an adventure, try everything else people are recommending here :)
Regardless of which distro you choose, your "desktop experience" will be mostly based on what desktop environment you pick, and you are free to switch between them regardless of distro. Ubuntu for example provides various installers that come with different DEs installed by default (they call them "flavours": https://ubuntu.com/desktop/flavors), but you can also just switch them after installation. I say "mostly" because some distros will also customise the DE a bit, so you might find some differences.
"Nicest desktop experience" is also too generic to really give a proper suggestion. There are DEs which aim to be modern and slick (e.g. GNOME, KDE Plasma, Cinnamon), lightweight (LXQt), or somewhere in between (Xfce). For power users there's a multitude of tiling window managers (where you control windows with a keyboard). Popular choices there are i3/sway or, lately, Niri. All of these are just examples, there are plenty more DEs / WMs to pick from.
Overall my suggestion would be to start with something straightforward (Mint would probably be my first choice here), try all the most popular DEs and pick the one you like, then eventually (months or years later) switch to a more advanced distro once you know more what your goals are and how you want to use the system. For example I'm in the middle of migrating to NixOS because I want a fully declarative system which gives the freedom to experiment without breaking your system because you can switch between different temporary environments or just rollback to previous generations. But I definitely wouldn't have been ready for that at the outset as it's way more complex than a more traditional distro.
Bon appetit!
I heard Kubuntu is not a great distro for KDE, but I can't comment on that personally.
If you want something that "just works," Linux Mint[1] is a great starting point. That gets you into Linux without any headache. Then, later when bored, you can branch out into the thousands[2] of Linux distributions that fill every possible niche.
This is a part of Settings that you will never see at a passing glance, so it's easy to forget that you may have it on.
I'd also like to gently push back against the cynicism expressed about having a feature like this. There are more people who benefit from a feature like this than not. They're more likely thinking "I forgot my password and I want to get the pictures of my family back" than fully internalizing the principles and practices of self custody - one of which is that if you lose your keys, you lose everything.
There are two ways to log into macOS: a local user account or an LDAP (e.g. OpenDirectory, Active Directory) account. Either of these types of accounts may be associated with an iCloud account. macOS doesn’t work like Windows where your Microsoft account is your login credential for the local machine.
FileVault key escrow is something you can enable when enabling FileVault, usually during initial machine setup. You must be logged into iCloud (which happens in a previous step of the Setup Assistant) and have iCloud Keychain enabled. The key that wraps the FileVault volume encryption key will be stored in your iCloud Keychain, which is end-to-end encrypted with a key that Apple does not have access to.
If you are locked out of your FileVault-encrypted laptop (e.g. your local user account has been deleted or its password has been changed, and therefore you cannot provide the key to decrypt the volume encryption key), you can instead provide your iCloud credentials, which will use the wrapping key stored in escrow to decrypt the volume encryption key. This will get you access to the drive so you can copy data off or restore your local account credentials.
And just in case it wasn't clear enough, I'd add: a local user account is standard. The only way you'd end up with an LDAP account is if you're in an organization that deliberately set your computer up for networked login; it's not a typical configuration, nor is it a component used by iCloud.
False. If you only put the keys on the Microsoft account, and Microsoft closes your account for whatever reason, you are done.
BitLocker on by default (even if Microsoft does have the keys and complies with warrants) is still a hell of a lot better than the old default of no encryption. At least some rando can't steal your laptop, pop out the HDD, and take whatever data they want.
They can fight the warrant; if you don't at least object to it, then "giving the keys away" is not an incorrect characterization.
Users absolutely 100% will lose their password and recovery key and not understand that even if the bytes are on a desk physically next to you, they are gone. Gone baby gone.
In university, I helped a friend set up encryption on a drive w/ his work after a pen drive with work on it was stolen. He insisted he would not lose the password. We went through the discussion of "this is real encryption. If you lose the password, you may as well have wiped the files. It is not in any way recoverable. I need you to understand this."
6 weeks is all it took him.
So all the state needs to get into your laptop is to get access from Apple to your iCloud account.
I have W11 with a local account and no BitLocker on my desktop computer, but the sheer amount of nonsense MS has been pulling these days has really made me question whether 'easy modding'* is enough of a benefit to not just nuke it and install Linux yet again.
* You can get the MO2 mod manager running under linux, but it's a pain, much like you can also supposedly run executable mods (downgraders, engine patches, etc) in the game's context, but again, pain
I recently needed to make a bootable key and found that Rufus out of the box allows you to modify the installer, game changer.
But the 5th Amendment is also why it's important not to rely on biometrics. Generally (there are some gray areas) in the US you cannot be compelled to give up your password, but biometrics are viewed as physical evidence and are not protected by the 5th.
The 5th Amendment gives you the right to refuse speech that might implicate you in a crime. It doesn’t protect Microsoft from being compelled to provide information that may implicate one of its customers in a crime.
The only fix, apparently, is waiting until there's enough support to cram through an amendment or set a precedent.
One of the reasons given for (usually) now requiring a warrant to open a phone they grab from you is the amount of third-party data you can access through it, although IIRC they framed it as a regular 4th Amendment issue by saying that if you had a security camera inside your house, the police would be bypassing the warrant requirement by seeing directly into your abode.
In practice: https://en.wikipedia.org/wiki/In_re_Boucher
The government gets what the government wants.
You bet I have that enabled.
But this is irrelevant to the argument made above, right?
These two statements are in no way mutually exclusive. Microsoft is gobbling up your supposedly private encryption keys because they love cops and want an excuse to give your supposedly private data to cops.
Microsoft could simply not collect your keys and then would have no reason or excuse to hand them to cops.
Microsoft chose to do this.
Do not be charitable to fascists.
I think the reasonable default here would be to not upload to MS servers without explicit consent about what that means in practice. I suspect if you actually asked the average person whether they're okay with MS having access to all of the data on their device (including browser history, emails, photos), they'd probably say no if they could.
Maybe I'm wrong though... I admit I have a bad theory of mind when it comes to this stuff because I struggle to understand why people don't value privacy more.
Eg in England you're already an enemy of the state when you protest against Israel's actions in Gaza. In America if you don't like civilians being executed by ICE.
This is really a bad time to throw "enemy of the state" around as if this only applies to the worst people.
Current developments are the ideal time to show that these powers can be abused.
As of today at 00:00 UTC, no. But there's an increasingly possible future where authoritarian governments will brand users who practice 'non-prescribed use' as enemies of the state. And when we have a government whose leader openly gifts deep, direct access to federal power to unethical tech leaders who've funded elections (e.g. Thiel), that branding would be a powerful perk to have access to (even if indirectly).
But you can still help prevent abuses of mass surveillance without probable cause by making such surveillance as expensive and difficult as possible for the state.
Criticizing the current administration? That sounds like something an enemy of the state would do!
Prepare yourself for the 3am FBI raid, evildoer! You're an enemy of the state, after all, that means you deserve it! /s
Tl;dr - "Basically, you’re either dealing with Mossad or not-Mossad. If your adversary is not-Mossad, then you’ll probably be fine if you pick a good password and don’t respond to emails from ChEaPestPAiNPi11s@ virus-basket.biz.ru. If your adversary is the Mossad, YOU’RE GONNA DIE AND THERE’S NOTHING THAT YOU CAN DO ABOUT IT" (Mickens, 2014)
Will they shoot me in head?
What if I truly forgot the password to my encrypted drive? Will they also shoot me in the head?
What about your wife's head? Your kids' heads?
There is always a choice.
In the opposite scenario - without key escrow - we would be reading sob stories of "Microsoft raped my data and I'm going to sue them for irreparably ruining my business" after someone's drive fails and they realize the data is encrypted and they never saved a backup key (and of course have no backups either).
>Journalists love the "Microsoft gave" framing because it makes Microsoft sound like they're handing these out because they like the cops, but that's not how it works.
Because in 2026 "journalism" has devolved into ragebait farming for clicks. Getting the details correct is secondary, and sometimes optional.
Back in the day hackernews had some fire and resistance.
Too many tech workers decided to roll over for the government, and that's why we are in this mess now.
This isn't an argument about law, it's about designing secure systems. And lazy engineers build lazy key escrow the government can exploit.
Most of the comments are fire and resistance, but they commonly take ragebait and run with the assumptions built-in to clickbait headlines.
> Too many tech workers decided to rollover for the government and that's why we are in this mess now.
I take it you've never worked at a company when law enforcement comes knocking for data?
The internet tough guy fantasy where you boldly refuse to provide the data doesn't last very long when you realize that it just means you're going to be crushed by the law and they're getting the data anyway.
The solution to that is to not have the data in the first place. You can't avoid the warrants for data if you collect it, so the next best thing is to not collect it in the first place.
Where I live, government passed a similar law to the UK's online identification law not too long ago. It creates obligations for operating system vendors to provide secure identity verification mechanisms. Can't just ask the user if they're over 18 and believe the answer.
The goal is of course to censor social media platforms by "regulating" them under the guise of protecting children. In practice the law is meant for and will probably impact the mobile platforms, but if interpreted literally it essentially makes free computers illegal. The implication is that only corporation owned computers will be allowed to participate in computer networks because only they are "secure enough". People with their own Linux systems need not apply because if you own your machine you can easily bypass these idiotic verifications.
Microsoft (and every other corporation) wants your data. They don't want to be a responsible custodian of your data, they want to sell it and use it for advertising and maintaining good relationships with governments around the world.
The same way companies used to make money, before they started bulk harvesting of data and forcing ads into products that we're _already_ _paying_ _for_?
I wish people would have integrity instead of squeezing out every little bit of profit from us they can.
> I'm sure there's some cryptographic way to avoid Microsoft having direct access to the keys here.
FTA (3rd paragraph): don't default upload the keys to MSFT.
>If you design it so you don't have access to the data, what can they do?
You don't have access to your own data? If not, they can compel you to reveal testimony on who/what is the next step to accessing the data, and they chase that.
It has nothing to do with the state. It has to do with getting the RSUs to pay the down payment for a house in a HCOL area in order to maybe have children before 40, and making the KPIs so you don't get stack-ranked into the bottom 30% and fired at big tech, or grinding 996 to make your investors richer, and yourself rich-ish, if you're lucky enough to exit in the upper decile with your idea. This doesn't include the contingent of people who fundamentally believe in the state, too.
Most people are activists only to the point of where it begins to impede on their comfort.
The people going 'well of course' or 'this is for the user' drive me insane here, because, as said, there are secure ways to build a key escrow system so that your data and systems are actually secure. From a secure-design standpoint, it feels more and more like we're living in Idiocracy as people argue that insecure solutions are actually secure and perfectly acceptable.
govt := new_govt
terrorist := yous/workers/Corporations/
False. You can design truly end-to-end encrypted secure system and then the state comes at you and says that this is not allowed, period. [1]
[1] https://medium.com/@tahirbalarabe2/the-encryption-dilemma-wh...
Trying to resist building ethically questionable software usually means quitting or being fired from a job.
In the 90s and 00s people overwhelmingly built stuff in tech because they cared about what they were building. The money wasn't bad, but no one started coding for the money. And that mindset was so obvious when you looked at the products and cultures of companies like Google and Microsoft.
Today however people largely come into this industry and stay in it for the money. And increasingly tech products are reflecting the attitudes of those people.
When you get high up in an org, choosing Microsoft is the equivalent of the old "nobody ever got fired for buying IBM". You are off-loading responsibility. If you ever get high up at a fortune 500 company, good luck trying to get off of behemoths like Microsoft.
What happens if I forget my keys? Same thing that happens if my computer gets struck by a meteor. New drive, new key, restore contents from backups.
It's simple, secure, set-and-forget, and absolutely nobody but me and your favored deity has any idea what's on my drives. Microsoft and the USGov don't have any business having access to my files, and it's computationally infeasible for them to gain access within the next few decades.
Don't use Windows. Use a secure operating system. Windows is not security for you, it's security for a hostile authoritarian government.
Privacy is not a crime.
This is so much more reasonable than (for example) all the EU chat control efforts that would let law enforcement ctrl+f on any so-called private message in the EU.
Will this make people care? Probably not, but you never know.
It's time - it's never been easier, and there's nothing you'll miss about Windows.
At least they are honest about it, but a good reason to switch over to linux. Particularly if you travel.
If Microsoft is giving these keys out to the US government, they are almost certainly giving them to all other governments that request them.
I use BitLocker on my Windows box without uploading the keys. I don't even have it connected to a Microsoft account. This isn't a requirement.
> If they have a key in their possession [...]
So they do have a choice.
The only explanation that makes sense to me is that there's an element of irrationality to it. Apple has a well known cult, but Microsoft might have one that's more subtle? Or maybe it's a reverse thing where they hate Linux for some equally irrational reasons? That one is harder to understand because Linux is just a kernel, not a corporation with a specific identity or spokesperson (except maybe Torvalds, but afaik he's well-regarded by everyone)
https://cointelegraph.com/news/fbi-cant-be-blamed-for-wiping...
Perhaps next time, an agent will copy the data, wipe the drive, and say they couldn't decrypt it. 10 years ago agents were charged for diverting a suspect's Bitcoin, I feel like the current leadership will demand a cut.
Don't think Apple wouldn't do the same.
If you don't want other people to have access to your keys, don't give your keys to other people.
As a US company, it's certainly true that given a court order Apple would have to provide these keys to law enforcement. That's why getting the architecture right is so important. Also check out iCloud Advanced Data Protection for similar protections over the rest of your iCloud data.
[0] https://sixcolors.com/post/2025/09/filevault-on-macos-tahoe-...
As of macOS Tahoe, the FileVault key you (optionally) escrow with Apple is stored in the iCloud Keychain, which is cryptographically secured by HSM-backed, rate-limited protections.
You can (and should) watch https://www.youtube.com/watch?v=BLGFriOKz6U&t=1993s for all the details about how iCloud is protected.
Unbreakable phones are coming. We’ll have to decide who controls the cockpit: The captain? Or the cabin?
The security in iOS is not designed to make you safer, in the same way that cockpit security doesn't protect economy class from rogue pilots or business-class terrorists. Apple made this decision years ago; it's right there in slide 5 of the Snowden PRISM disclosure. Today, Tim stands tall next to POTUS. Any preconceived principle Apple might once have clung to is forfeit next to their financial reliance on American protectionism: https://www.cnbc.com/2025/09/05/trump-threatens-trade-probe-...
Of course Apple offers a similar feature. I know lots of people here are going to argue you should never share the key with a third party, but if Apple and Microsoft didn't offer key escrow they would be inundated with requests from ordinary users to unlock computers they have lost the key for. The average user does not understand the security model and is rarely going to store a recovery key at all, let alone safely.
> https://support.apple.com/en-om/guide/mac-help/mh35881/mac
Apple will escrow the key to allow decryption of the drive with your iCloud account if you want, much like Microsoft will optionally escrow your BitLocker drive encryption key with the equivalent Microsoft account feature. If I recall correctly it's the default option for FileVault on a new Mac too.
Besides, Apple's lawyers aren't stupid enough to forget to carve out a law-enforcement demand exception.
https://news.ycombinator.com/item?id=46252114
https://news.ycombinator.com/item?id=45520407
Nope. For this threat model, E2E is a complete joke when both E's are controlled by the third party. Apple could be compelled by the government to insert code in the client to upload your decrypted data to another endpoint they control, and you'd never know.
An open source project absolutely cannot do that without your consent if you build your client from the source. That's my point.
> For this threat model
We're talking about a hypothetical scenario where a state actor getting the information encrypted by the E2E encryption puts your life or freedom in danger.
If that's you, yes, you should absolutely be auditing the source code. I seriously doubt that's you though, and it's certainly not me.
Except for that time they didn't.
PGP WDE was a preferred corporate solution, but now you have to trust Broadcom.