Posted by robin_reala 2 days ago
https://x.com/runasand/status/2017659019251343763?s=20
The FBI was able to access Washington Post reporter Hannah Natanson's Signal messages because she used Signal on her work laptop. The laptop accepted Touch ID for authentication, meaning agents could compel her to unlock it.
A few bookmarklets:
javascript:(function(){var h=location.hostname;if(h==='x.com'||h.endsWith('.x.com'))location.host='xcancel.com';})()
javascript:(function(){var h=location.hostname;if(h==='youtube.com'||h.endsWith('.youtube.com'))location.host='inv.nadeko.net';})()
javascript:(function(){var h=location.hostname;if(h==='instagram.com'||h.endsWith('.instagram.com'))location.replace('https://imginn.com'+location.pathname);})()
[1] https://www.reddit.com/r/uBlockOrigin/comments/1cc0uon/addin...
For example, duck://player/fqtK3s7PE_k, where the video ID is the same as in the YouTube URL https://www.youtube.com/watch?v=fqtK3s7PE_k
But it doesn't have that overview page like inv.nadeko.net does
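In the spirit of the bookmarklets above, a sketch that hops from a YouTube watch page to Duck Player (assuming the duck://player/<id> scheme keeps working, and only in DuckDuckGo's browser):
javascript:(function(){var id=new URLSearchParams(location.search).get('v');if(id)location.href='duck://player/'+id;})()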
No, orders of magnitude are exponential, not linear, so conventionally "on the order of 1 billion" would be between 100 million × sqrt(10) and 1 billion × sqrt(10). But "billionaire" isn't "net worth on the order of 1 billion"; it's "net worth of 1 billion or more", or, when used hierarchically alongside trillionaire and millionaire, "net worth of at least one billion and less than one trillion".
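For concreteness, the convention described above in symbols: a net worth $x$ is "on the order of $10^9$" when

  $10^{8.5} \le x < 10^{9.5}$, i.e. roughly $3.16 \times 10^8 \le x < 3.16 \times 10^9$.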
Assuming that number turns out to be close to reality, how do you weigh so many unnecessary deaths against VTOL rockets and electric cars?
Perhaps a practitioner of Effective Altruism could better answer that question.
Nor how many deaths will be caused by his support for far-right parties across Europe, when they start ethnic cleansing.
This doesn't make corruption OK. But he tore out a lifeline for some people without giving them an alternative way to get aid.
Sure. It's a transactional purchase of stability and goodwill, via which the US has benefited enormously.
I mean, by way of the atrocities we've committed around the world, we kinda do.
Even if we buy your thesis, foregoing morals, geopolitics, and history, it's a useful soft power strategy...
I'm not saying fund USAID before healthcare for all in america. I'm saying of all the insane things our government wastes money on, USAID was far down on the list of most egregious.
I've committed no atrocities. Going to guess that you've committed no atrocities. As for the atrocities that did occur, most of those who committed them are dead, and the rest are senile in nursing homes. I have no guilt and certainly feel no guilt for those events.
>it's a useful soft power strategy.
Sure, if you're some sort of tyrant. I thought the left was against colonialism... but you guys really just want a more clever, subtle colonialism, eh? Figures.
>I'm saying of all the insane things our government wastes money on, USAID was far down on the list of most egregious.
What you're saying is that no cuts can or should be made, unless they are your favorite cuts first. And maybe after you get those, no others need be made at all.
That in itself should make you hate the dude.
Wasn't Edison an asshole?
Children were exploited, and we're doing this net-positive analysis on whether he should face the scorn. I'm not having a go at you - it's just frustrating to see very little happening after so much has been exposed, and I think part of it comes from this mindset - 'oh, he's a good guy, this is a mistake/misstep' - while people who were exploited as children can't even get justice.
It's sickening.
I'd rather have both. Hawthorne doesn't get nuked if Elon Musk goes to jail.
> Children were exploited
Abuse. Exploitation. CSAM. We're mushing words.
Child rape. These men raped children. Others not only stayed silent in full knowledge of it, but supported it directly and indirectly. More than that, they arrogantly assumed–and, by remaining in the United States, continue to assume–that they're going to get away with it.
Which category is Elon Musk in? We don't know. Most of the people in the Epstein files are innocent. But almost all of them seem to have been fine with (a) partying with an indicted and unrepentant pedophile [1] and (b) not saying for decades–and again, today–anything to the cops about a hive of child rape.
A lot of them should go to jail. All of them should be investigated. And almost all of them need to be retired from public life.
[1] https://web.archive.org/web/20220224113217/https://www.theda...
(Clarification: I’m using the term colloquially. Whether Epstein had a mental condition is unclear.)
[1] https://www.justice.gov/usao-sdny/press-release/file/1180481...
Don't believe me? Go to the Epstein emails and try to find it.
Yes, a judge is unlikely to order your execution if you refuse. Based on the recent pattern of their behavior, masked secret police who are living their wildest authoritarian dreams are likely to execute you if you anger them (for example, by refusing to comply with their desires).
I do not follow the logic here; what does that even mean? It seems very dubious. And what happens if one legitimately forgets? Do they just get to keep you there forever?
This is an area that seems to confuse a lot of people because of what the 5th amendment says and doesn't say.
The reason they can't force you to unlock your phone is not because your phone contains evidence of stuff. They have a warrant to get that evidence. You do not have a right to prevent them from getting it just because it's yours. Most evidence is self-incriminating in this way - if you have a murder weapon in your pocket with blood on it, and the police lawfully stop you and take it, you really are incriminating yourself in one sense by giving it to them, but not in the 5th amendment sense.
The right against self-incrimination is mostly about being forced to give testimonial evidence against yourself. That is, it's mostly about being forced to testify against yourself under oath, or otherwise give evidence that is testimonial in nature. In the case of passwords, courts now often view it as you being forced to disclose the contents of your mind (i.e., live testimony against yourself), and, equally important, even if it is not live testimony, it testimonially proves that you have access to the phone (more on this in a second). Biometrics are in a weird state, with some courts finding them like passwords/PINs, and some finding them just a physical fact with no testimonial component at all other than proving your ability to access.
The foregone conclusion part comes into play because, excluding being forced to disclose the contents of your mind for a second, the testimonial evidence you are being forced to give when you unlock a phone is that you have access to the phone. If they can already prove it's your phone or that you have access to it, then unlocking it does not matter from a testimonial standpoint, and courts will often require you to do so in the jurisdictions that don't consider any other part of unlocking to be testimonial. (Similarly, if they can't prove you have access to the phone, and whether you have access to it matters to the case in a material way, they generally will not be able to force you to unlock it or try to unlock it, because that would be a 5th amendment violation.)
Hope this helps.
This seems like a key point though. What's the legal distinction between compelling someone to unlock a phone using information in their mind, and compelling them to speak what's in their mind?
If I had incriminating info on my phone at one point, and I memorized it and then deleted it from the phone, now that information is legally protected from being accessed. So it just matters whether the information itself is in your mind, vs. the ability to access it?
You can actually eliminate phones entirely from your second example.
If you had incriminating info on paper at one point, and memorized it and deleted it, it would now be legally protected from being accessed.
One reason society is okay with this is because most people can't memorize vast troves of information.
Otherwise, the view here would probably change.
These rules exist to serve various goals as best they can. If they no longer serve those goals well, because of technology or whatever else, the rules will change. Being completely logical and self-consistent is not one of these goals, nor would it make sense as a primary goal for rules meant to try to balance societal vs personal rights.
This is, for various reasons, often frustrating to the average HN'er :)
With that in mind...
> Being completely logical and self-consistent is not one of these goals, nor would it make sense as a primary goal for rules meant to try to balance societal vs personal rights.
Do we really know that it wouldn't make sense, or is that just an assumption because the existing system doesn't do it? (Alternatively, perhaps a consistent logical theory simply hasn't been identified and articulated.)
This reminds me of how "sovereign citizens" argue their position. Their logic isn't consistent, it’s built around rhetorical escape hatches. They'll claim that their vehicle is registered with the federal DOT, which is a commercial registration, but then they'll also claim to be a non-commercial "traveler". They're optimizing for coverage of objections, not global consistency.
What you seem to be telling me is that the prevailing legal system is the same, just perhaps with more of the obvious rough edges smoothed out over the centuries.
brb, going to try encoding the USC in Rocq.
The 4th amendment would protect you from them seizing your phone in the first place for no good reason, but would not protect you from them seizing your phone if they believe it has evidence of a crime.
Regardless, it is not the thing that protects you (or doesn't, depending) from having to give or otherwise type in your passcode/pin/fingerprint/etc.
https://news.ycombinator.com/item?id=44746992
This command will make your MacBook hibernate when the lid is closed or the laptop sleeps, so RAM is written to disk and the system powers down. The downside is that it increases the time it takes to resume.
A nice side benefit, though, is that a fingerprint is not accepted on first unlock; I believe secrets are still encrypted at this stage, similar to a cold boot. A fingerprint still unlocks from the screensaver normally, as long as the system does not sleep (and therefore hibernate).
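For reference, the command being discussed is presumably pmset's hibernate mode; a sketch of the usual incantation, since the command itself isn't quoted in this thread:

  # hibernatemode 25: on sleep, write RAM to disk and cut power
  sudo pmset -a hibernatemode 25
  # optionally, on FileVault machines: evict the encryption key from memory on standby
  sudo pmset -a destroyfvkeyonstandby 1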
Does this mean that the Signal desktop application doesn't lock/unlock its (presumably encrypted) database with a secret when locking/unlocking the laptop?
Signal itself wouldn’t even be detectable as an app
Another reason to use my dog's nose instead of a fingerprint.
uhm, are you saying that i'm saying that? if so, please show me where i said that. thank you
and biometrics have "legal problems" as stated above
a pin, or allowing touchid to be automatically disabled after a period of time or after the computer moves ("please enter password to log in"), would be greatly appreciated
as it stands now, i have biometrics disabled.
Also, IANAL, but I'm pretty sure that if law enforcement has a warrant to seize property from you, they're not obligated to do so immediately the instant they see you - they could have someone follow you and watch to see how you unlock your phone before seizing it.
Except when they can: https://harvardlawreview.org/print/vol-134/state-v-andrews/
It's interesting in the case of social media companies. Technically the data held is the company's data (Google, Meta, etc.); however, courts have ruled that a person still has an expectation of privacy, and therefore police need a warrant.
Imagine it's 1926 and none of this tech is an issue yet. The police can fingerprint and photograph you at intake, they can't compel speech or violate the 5th.
That's exactly what's being applied here. It's not that the police can do more or less than they could in 1926, it's that your biometrics can do more than they did in 1926. They're just fingerprinting you / photographing you .. using your phone.
There's no known technique to force you to input a password.
I fully agree, forced biometrics is bullshit.
I say the same about forced blood draws for BAC testing. They can get a warrant for your blood; that's crazy to me.
Out of habit, I keep my phone off during the flight and turn it on after clearing customs.
Not really, because tools like Cellebrite are more limited with BFU - hence the manual informing LEO to keep (locked) devices charged, and the countermeasure of iOS forcefully rebooting devices that have been locked for too long.
1. Quick-press Volume Up, then quick-press Volume Down.
2. Hold the side (power) button until the screen turns black (approx. 10 seconds).
3. Immediately hold both the side button and the Volume Down button for 5 seconds.
4. Release the side button but continue holding the Volume Down button for another 10 seconds.
The screen will remain black. If the Apple logo appears, the side button was held too long, and the process must be repeated.
If you mean forcing an iOS device out of BFU, that's impossible. The device's storage is encrypted using a key derived from the user's passcode. That key is only available once the user has unlocked the device once, using their passcode.
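A toy model of that state machine (Node.js; the names and KDF choice are illustrative, not Apple's actual design):

  const crypto = require('crypto');

  const enclaveSecret = crypto.randomBytes(32); // stand-in for the device-unique hardware secret
  let fsKey = null; // lives only in RAM, so every reboot returns to the BFU state

  function firstUnlock(passcode) {
    // the filesystem key cannot exist until the passcode arrives
    fsKey = crypto.pbkdf2Sync(passcode, enclaveSecret, 100000, 32, 'sha256');
  }

  function readUserData() {
    if (!fsKey) throw new Error('BFU: no key derived this boot; data is ciphertext only');
    // ... decrypt with fsKey ...
  }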
At least a password or PIN is something you choose to give over.
The real news here isn't privacy control in a consumer OS or the right to privacy, but the USA, the leader of the free world, becoming an autocracy.
I want some of the lockdown stuff (No facetime and message attachments from strangers, no link previews, no device connections), but like half of the other ones I don't want.
Why can't I just toggle an iMessage setting for "no link previews, no attachments", or a general setting for "no automatic device connection to untrusted computers while locked"? Why can't I turn off "random dick pics from strangers on iMessage" without also turning off my browser's JavaScript JIT and a bunch of other random crap?
Sure, leave the "Lockdown mode" toggle so people who just want "give me all the security" can get it, but split out individual options too.
Just to go through the features I don't want:
* Lockdown Mode disables the JavaScript JIT in the browser - I want fast JavaScript, I use some websites and apps that cannot function without it, and non-JIT JS drains the battery faster
* Shared photo albums - I'm okay viewing shared photo albums from friends, but lockdown mode prevents you from even viewing them
* Configuration profiles - I need this to install custom fonts
Apple's refusal to split out more granular options here hurts my security.
The other feature I miss is Screen Time requests. This one is kinda weird - I'm sure there's a reason they're blocked, but it's a message from Apple (or directly from a trusted family member? I'm not 100% sure how they work). I still _receive_ the notification, but it's not actionable.
While I share your frustration, I do understand why Apple might want to have it as "all-or-nothing". If they allow users to enable even one "dangerous" setting, that ultimately compromises the entire security model. An attacker doesn't care which way they can compromise your device. If there's _one_ way in, that's all they need.
Ultimately, for me the biggest PITA with Lockdown Mode is not knowing if it's to blame for a problem I'm having. I couldn't tell you how many times I've disabled and re-enabled it just to test something that should work, or to check whether it's the reason a feature/setting is not showing up. To be fair, most of the time it's not the issue, but sometimes I just need to rule it out.
This feature has the benefit of teaching users (correctly) that browsing the internet on a phone has always been a terrible idea.
It's all objectively terrible, and it accomplishes nothing except allowing the user to use the internet right then and there.
Also, I don't get how the situation with your home internet connection changes much. Your ISP knows exactly where you are because your house doesn't move.
Which sure, not using your phone is more secure, but good luck convincing users that they shouldn't use any apps or websites on the go.
Educate us. What makes it less secure?
[1] https://asahilinux.org/docs/platform/security/
[2] https://support.apple.com/guide/security/hardware-security-o...
[3] https://eclecticlight.co/2022/01/04/booting-an-m1-mac-from-h...
Telegram allows you to have distinct disappearing settings for each chat/group. Not sure how it works on Signal, but a solution like this could be possible.
My understanding of Lockdown Mode was that it babyifies the device to reduce the attack surface against unknown zero-days. Does the government saying that Lockdown Mode barred them from entering imply that they've got an unknown zero-day that would work in the PIN-unlock state, but not Lockdown Mode?
Curious.
I think this is pretty unlikely here but it's within the realm of possibility.
The fingerprint sensor does not make access control decisions, so the fault would have to be somewhere else (e.g. the software code branch structure that decides what to do with the response from the secure enclave).
Apple does it different(ly), and I'd argue more securely. Being able to specify the full chain of hardware, firmware, and software always has its advantages.
Apple's fingerprint readers do not perform matching on the sensor itself -- instead, the data read from the sensor (or derivatives thereof) is compared against a reference stored in the secure enclave of the Apple silicon (Ax, Tx, or Mx) in the Mac or iOS device itself.
How did it know the print even?
* The reporter lied.
* The reporter forgot.
* Apple devices share fingerprint matching details and another device had her details (this is supposed to be impossible, and I have no reason to believe it isn't).
* The government hacked the computer such that it would unlock this way (probably impossible as well).
* The fingerprint security is much worse than years of evidence suggests.
Mainly it was buried at the very end of the article, and I thought it worth mentioning here in case people missed it.
> Apple devices share fingerprint matching details and another device had her details
I looked into this quite seriously for Windows ThinkPads: unless Apple does it differently, you cannot share fingerprints - they're stored in a local chip and never leave it.
Fingerprint security being poor is also unlikely, because that would only apply if a different finger had been registered.
1. If they can get in but say they can't, people - including high-value targets like journalists - will keep using security that is actually bad.
2. If the FBI (or another agency) has an unknown capability, it must either say it can't get in or reveal its capability to all adversaries, including even higher-profile targets such as counter-intelligence targets. Saying nothing also risks revealing the capability.
3. Similarly, if Apple helped them, Apple might insist that not be revealed. The same applies to any third party with the capability. (Also, less significantly, saying they can't get in puts more pressure on Apple and adds momentum to calls for backdoors, even if HN readers will see it the other way.)
Also, the target might think they are safe, which could be a tactical advantage. It may also exclude recovered data from evidence-handling rules, even if it's unusable in court. And at best they haven't got in yet - there may be an exploit for this OS version someday, and the FBI can try again then.
The problem with low-entropy security measures is that the low-entropy secret is used to instruct the secure enclave (TEE) to release/use the actual high-entropy key. So the key must be stored physically (e.g., as voltage levels) somewhere in the device.
It's a similar story when the device is locked: on most computers the RAM isn't even encrypted, so a locked computer is no major obstacle to an adversary. On devices where RAM is encrypted, the encryption key is also stored somewhere - if only while the device is powered on.
I also recommend looking up PUFs and how modern systems use them in conjunction with user-provided secrets to derive keys - a password or fingerprint is one of many inputs into a KDF that produces the final keys.
The high level idea is that the key that's being used for encryption is derived from a very well randomized and protected device-unique secret setup at manufacturing time. Your password/fingerprint/whatever are just adding a little extra entropy to that already cryptographically sound seed.
Tl;dr this is a well solved problem on modern security designs.
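A minimal sketch of that derivation (Node.js; the names and parameters are mine, not any vendor's actual scheme):

  const crypto = require('crypto');

  // high-entropy, device-unique seed set at manufacturing; never leaves the enclave
  const deviceSeed = crypto.randomBytes(32);

  // the user's low-entropy input is just one more KDF input; brute-forcing the
  // passcode offline is useless without this specific device's seed
  function deriveDiskKey(passcode) {
    return crypto.hkdfSync('sha256', Buffer.from(passcode), deviceSeed, Buffer.from('disk'), 32);
  }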
What does this have to do with anything? Tweakable block ciphers, or XTS (which converts a block cipher into a tweakable one), operate with an actualized key - the entropy has long since been turned into a key.
> Your password/fingerprint/whatever are just adding a little extra entropy to that already cryptographically sound seed.
Correct. The "cryptographically sound seed", however, is stored inside the secure enclave for anyone with the capability to extract it. Which is the issue I referenced.
And if what you add to the KDF is just a minuscule amount of entropy, you may as well have added nothing at all - they perform the addition for the subset of users who actually use high-entropy passwords, and because it can't hurt. I don't think anyone adds fingerprint entropy, though.
Sorry, I'm not sure I follow here. Is anyone believed to have the capability to extract keys from the SE?
The secure enclave (or any root of trust) does not allow direct access to keys; it keeps the keys locked away internally and uses them at your request to perform crypto operations. You never get direct access to the keys. The keys are protected by using IVs, tweaks, or similar as inputs during cryptographic operations, so that the root keys cannot be derived from the ciphertext, even if the plaintext is controlled by an attacker and they have access to both the plaintext and the ciphertext.
Is your concern that the secure enclave in an iPhone is defeatable, and in such a way as to allow key extraction of the device-unique seeds it protects?
Do you have any literature or references where this is known to have occurred?
Tone is sometimes hard in text, so I want to be clear, I'm legit asking this, not trying to argue. If there are any known attacks against Apple's SE that allow key extraction, would love to read up on them.
This is a safe assumption to make as the secret bits are sitting in a static location known to anyone with the design documents. Actually getting to them may of course be very challenging.
> Do you have any literature or references where this is known to have occurred?
I'm not aware of any, which isn't surprising given the enormous resources Apple spent on this technology. Random researchers aren't very likely to succeed.
I never attach my iPhone to anything that's not a power source. I would totally enable an "enhanced protection for external accessories" mode. But I'm not going to enable a general "Lockdown mode" that Apple tells me means my "device won’t function like it typically does"
Note that it behaves subtly differently from how you described if it was connected to something before being locked. In that case, data access will remain -- even though the phone is now locked -- until the device is disconnected.
Anyone has been able to do this for over a decade now, and it's fairly straightforward:
- 2014: https://www.zdziarski.com/blog/?p=2589
- recent: https://reincubate.com/support/how-to/pair-lock-supervise-ip...
This goes beyond the "wired accessories" toggle.
Set to ask for new accessories or always ask.
The lack of optional granularity on security settings is super frustrating because it leads to many users just opting out of any heightened security.
It's "attached" to the wifi and to the cell network. Pretty much the same thing.
FBI unable to extract data from iPhone 13 in Lockdown Mode in high profile case [pdf]
https://storage.courtlistener.com/recap/gov.uscourts.vaed.58...
Obviously, the theoretical answer is yes, given an advanced-enough exploit. But let's say Apple is unaware of a specific rootkit. If each OS update is a wave, is the installed exploit more like a rowboat or a frigate? Will it likely be defeated accidentally by minor OS changes, or is it likely to endure?
This answer is actionable. If exploits are rowboats, installing developer OS betas might be security-enhancing: the exploit might break before the exploiters have a chance to update it.
Modern iOS has an incredibly tight, secure chain-of-trust bootloader. If you shut your device down to a known-off state (using the hardware key sequence), then on power-on you can be 99.999% certain only Apple-signed code will run, all the way from secureROM to iOS userland. The exception is if the secureROM is somehow compromised and exploited, but that requires hardware access at boot time, so I don't buy it happening remotely.
So, on a fresh boot, you are almost definitely running authentic Apple code. The easiest path to a form of persistence is reusing whatever vector initially pwned you (malicious attachment, website, etc) and being clever in placing it somewhere iOS will attempt to read it again on boot (and so automatically get pwned again).
But honestly, exploiting modern iOS is already difficult enough (exploits go for tens of millions of USD); persistence is an order of magnitude more difficult.
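A toy model of that chain-of-trust claim (stage names and shapes are mine, purely illustrative): each stage verifies the next image against Apple's public key before handing off, so persisting across a reboot means either holding a valid signature or re-exploiting from scratch.

  const crypto = require('crypto');

  // boot = secureROM -> iBoot -> kernel -> userland; halt on the first bad link
  function boot(stages, applePubKey) {
    for (const { name, image, signature } of stages) {
      if (!crypto.verify('sha256', image, applePubKey, signature)) {
        throw new Error('boot halted: ' + name + ' failed signature check');
      }
    }
    return 'only Apple-signed code ran';
  }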
Apple bought out all the jailbreakers, as Denuvo did for the game crackers.
Do you have sources for these statements?
> in 2018, the prominent Denuvo cracker known as "Voksi" (of REVOLT) was arrested in Bulgaria following a criminal complaint from Denuvo.
https://www.dsogaming.com/news/denuvo-has-sued-revolts-found...
That's how you get off such charges: "I'll work for you if you drop the charges." There was a reddit post I can't find, from when EMPRESS had one of her episodes, where she was asked if she wanted to work for them. It's happened in the cracking scene before.
> The jailbreaking community is fractured, with many of its former members having joined private security firms or Apple itself. The few people still doing it privately are able to hold out for big payouts for finding iPhone vulnerabilities. And users themselves have stopped demanding jailbreaks, because Apple simply took jailbreakers’ best ideas and implemented them into iOS.
https://www.vice.com/en/article/iphone-jailbreak-life-death-...
And from the jailbreak community Discord.
Even a cursory glance, with even a basic understanding, would show it's literally impossible on iOS.