Posted by aarghh 3 hours ago
Few in this world have done as much for privacy as the people who built Signal. Yes, it’s not perfect, but building security systems with good UX is hard. There are all sorts of tradeoffs and sacrifices one needs to make.
For those interested in the underlying technology, they're basically combining reproducible builds, remote attestation, and transparency logs. It's the same thing Apple Private Cloud Compute is doing, among a few others. I call it system transparency, or runtime transparency. Here's a lightning talk I did last year: https://youtu.be/Lo0gxBWwwQE
Signal's achievement is that it's very private while being extremely usable (it just works). Under that lens, I don't think it could be improved much.
Exactly. Plus it basically pioneered multi-device E2EE. Telegram, for example, claimed that defaulting to E2EE would kill multi-client support:
"Unlike WhatsApp, we can allow our users to access their Telegram message history from several devices at once thanks to our built-in instant cloud sync"
https://web.archive.org/web/20200226124508/https://tgraph.io...
Signal just did it, and in a fantastic way: there's no cross-device key-verification hassle or anything. And Telegram never caught up.
ChatGPT already knows more about me than Google did before LLMs, but would I switch to inferior models to preserve privacy? Hard tradeoff.
>broken SGX metadata protections
Citation needed. Also, SGX is only there to let clients verify what the server is doing, including that it isn't collecting metadata. The real proof is in the responses to warrants (https://signal.org/bigbrother/), where the only things Signal has been able to hand over are two timestamps: when the user created their account and when they were last seen. If that's not good enough for you, you're better off with serverless Tor-based p2p messengers that collect no metadata at all, such as Cwtch or Quiet.
>weak supply chain integrity
You can download the app as an .apk from their website if you don't trust Google Play Store.
>a mandate everyone supply their phone numbers
That's how you combat spam. It sucks, but there are very few options outside the corner of Zooko's triangle where your username looks like "4sci35xrhp2d45gbm3qpta7ogfedonuw2mucmc36jxemucd7fmgzj3ad".
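For a sense of where those unreadable names come from: they're self-authenticating, i.e. derived from a public key rather than chosen by a human. A rough sketch below, which just base32-encodes a random 32-byte "key" to show the shape; real Tor v3 onion addresses also fold in a checksum and version byte per the Tor spec, which this skips:

```python
import base64
import os

# A self-authenticating name: derived from the key itself, so no registry
# (and no human-readable vanity name) is needed. NOT the exact Tor encoding.
pubkey = os.urandom(32)  # stand-in for an ed25519 public key
name = base64.b32encode(pubkey).decode().lower().rstrip("=")
print(name + ".onion")
```

Because the name *is* (a function of) the key, it can't be both memorable and decentralized at once, which is exactly the Zooko's triangle tradeoff.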
>and agree to Apple or Google terms of service to use it?
Yeah that's what happens when you create a phone app for the masses.
"You have to agree to Apple's terms to use it"? What's Signal meant to do, jailbreak your phone before installing itself on it?
I wish Apple & Google provided a way to verify that an app was actually compiled from a specific git SHA. Right now applications can claim they're open source, and claim that you can read the source code yourself, but there's no way to check that the authors haven't added any extra nasties to the code before building and submitting the APK / iOS application bundle.
It would be pretty easy to do. Just have a build service at Apple / Google that you can point at a git repo, and let them build the application. Or, even easier, just expose the signature / hash of the app binary in the app store. Then open-source app developers could compile their APK / iOS app using GitHub Actions, and third parties could check that the hash matches the binaries in the store. (That does require the build to be reproducible, but that's an achievable property.)
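That third-party check at the end is just a byte-for-byte digest comparison. A minimal sketch, assuming a store-published per-release digest and a reproducible build (neither exists today, so the final assertion is hypothetical and left commented out):

```python
import hashlib

def sha256_file(path):
    """Stream a file and return its SHA-256 hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical flow: rebuild the app from the claimed commit, then compare
# against the digest the store would publish for that release:
# assert sha256_file("local-build.apk") == store_published_digest
```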
>Citation needed.
https://en.wikipedia.org/wiki/Software_Guard_Extensions#List...
Is that a typo or are you really implying half the human population use Signal?
Edit: I misread, you are counting almost every messaging app user.
> Yeah that's what happens when you create a phone app for the masses.
No, that's what happens when you actively forbid alternative clients and servers, prevent (secure) alternative delivery methods for your app, and force people to rely on the American megacorps known for helping governments spy on users: https://news.ycombinator.com/item?id=38555810
>Data and conversations originating from users and the resulting responses from the LLMs are encrypted in a trusted execution environment (TEE) that prevents even server administrators from peeking at or tampering with them.
I think what they meant to say is that data is decrypted only in a trusted execution environment, and otherwise is stored/transmitted in an encrypted format.
"Confer - Truly private AI. Your space to think."
"Your Data Remains Yours, Never trained on. Never sold. Never shared. Nobody can access it but you."
"Continue With Google"
Make of that what you will.
When a cypherpunk deploys E2EE, it usually means only the intended parties have access to the plaintexts. And when it's you chatting with a server, it's like cloud backups: the data must be encrypted by the time it leaves your device and decrypted only once it reaches your device again. For remote computing, that would require the LLM to handle only ciphertexts, i.e., basically fully homomorphic encryption (FHE). If it's that, then sure, shut up and take my money, but AFAIK the science of FHE isn't nearly there yet.
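To make "the LLM handles ciphertexts only" concrete: the building block already exists in partially homomorphic schemes like Paillier, where multiplying two ciphertexts yields an encryption of the sum of the plaintexts. A toy sketch with tiny hardcoded demo primes (fine for illustration, worthless for real security); scaling this idea to arbitrary computation, let alone LLM inference, is exactly the part FHE research hasn't made practical yet:

```python
import math
import random

# Toy Paillier cryptosystem: additively homomorphic encryption.
p, q = 2357, 2551            # demo primes; real keys are thousands of bits
n = p * q
n2 = n * n
g = n + 1
lam = math.lcm(p - 1, q - 1)
mu = pow(lam, -1, n)         # modular inverse of lambda mod n

def encrypt(m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    # L(x) = (x - 1) // n, then unblind with mu
    return (((pow(c, lam, n2) - 1) // n) * mu) % n

# Homomorphic property: the server multiplies ciphertexts it cannot read,
# and the result decrypts to the sum of the plaintexts.
c1, c2 = encrypt(5), encrypt(7)
print(decrypt((c1 * c2) % n2))  # 12
```

The server here computes on data it can never see, which is the property you'd want from a "private" remote LLM; the catch is that addition (and scalar multiplication) is all Paillier gives you.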
So the only alternative I can see here is SGX, where the client verifies what the server is doing with the data. That probably works against surveillance capitalism, hostile takeovers, etc., but it's also a US NOBUS backdoor. Intel is a PRISM partner, after all, and who knows whether national security requests can compel SGX keys; the USG did go after Lavabit's RSA keys.
So I'd really want to see this explained, or at least conveyed in the product's threat-model documentation, with that threat model linked from the project's front page. Security is about knowing the limits of a privacy design so the user can make an informed decision.
It wouldn't be long before Google and Gemini could read this information, and Google would know you're using Confer.
I wouldn't trust it regardless, even if email login is available.
The fact that Confer allows Google login shows that Confer doesn't care about users' privacy.
This is a perfect example of purism vs. pragmatism. Moxie is a pragmatist who builds things that the average person can actually use. If it means that millions of people who would otherwise have used ChatGPT will migrate because of the reduced friction and get better privacy as a result, that's a win even if at the margin they're still leaking one insignificant piece of metadata to Google.