
Posted by aarghh 3 hours ago

Signal creator Moxie Marlinspike wants to do for AI what he did for messaging (arstechnica.com)
47 points | 44 comments
vaylian 2 hours ago|
previous discussion: https://news.ycombinator.com/item?id=46600839
kfreds 27 minutes ago||
It’s exciting to hear that Moxie and colleagues are working on something like this. They definitely have the skills to pull it off.

Few in this world have done as much for privacy as the people who built Signal. Yes, it’s not perfect, but building security systems with good UX is hard. There are all sorts of tradeoffs and sacrifices one needs to make.

For those interested in the underlying technology, they’re basically combining reproducible builds, remote attestation, and transparency logs. They’re doing the same thing that Apple Private Cloud Compute and a few others are doing. I call it system transparency, or runtime transparency. Here’s a lightning talk I gave last year: https://youtu.be/Lo0gxBWwwQE
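
The combination described above can be sketched in miniature. This is a hypothetical illustration, not any real attestation API: the client accepts a server only if the attested code measurement matches the hash of a reproducible build, and that hash appears in a public append-only log.

```python
import hashlib

# Toy model of system/runtime transparency (all names hypothetical):
# 1) reproducible builds let anyone recompute the code measurement,
# 2) remote attestation reports what code the server (TEE) is running,
# 3) a transparency log makes a targeted binary swap publicly visible.

def measurement(build_artifact: bytes) -> str:
    """Hash that anyone can recompute from source, given reproducible builds."""
    return hashlib.sha256(build_artifact).hexdigest()

def verify_server(attested_hash: str, rebuilt_artifact: bytes,
                  transparency_log: list[str]) -> bool:
    expected = measurement(rebuilt_artifact)
    if attested_hash != expected:        # attestation check
        return False
    return expected in transparency_log  # log-inclusion check

artifact = b"server-binary-v1"
log = [measurement(artifact)]
assert verify_server(measurement(artifact), artifact, log)
assert not verify_server("deadbeef", artifact, log)
```

Real deployments use signed attestation quotes and Merkle inclusion proofs rather than plain list membership, but the trust argument is the same.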

stavros 25 minutes ago|
I don't know, I'd say Signal is perfect, as it maximizes "privacy times spread". A solution that's more private wouldn't be as widespread, and thus wouldn't benefit as many people.

Signal's achievement is that it's very private while being extremely usable (it just works). Under that lens, I don't think it could be improved much.

maqp 9 minutes ago||
>Signal's achievement is that it's very private while being extremely usable (it just works).

Exactly. Plus it basically pioneered multi-device E2EE. Telegram, for example, claimed that defaulting to E2EE would kill multi-client support:

"Unlike WhatsApp, we can allow our users to access their Telegram message history from several devices at once thanks to our built-in instant cloud sync"

https://web.archive.org/web/20200226124508/https://tgraph.io...

Signal just did it, and in a fantastic way, given that there's no cross-device key verification hassle or anything. And Telegram never caught up.

frankdilo 1 hour ago||
I do wonder what models it uses under the hood.

ChatGPT already knows more about me than Google did before LLMs, but would I switch to inferior models to preserve privacy? Hard tradeoff.

lrvick 1 hour ago||
What he did with messaging... So he will centralize all of it with known broken SGX metadata protections, weak supply chain integrity, and a mandate everyone supply their phone numbers and agree to Apple or Google terms of service to use it?
rcxdude 39 minutes ago||
The issue is that there's no really credible better option. Matrix is the next best: it does avoid the tie-in to phone numbers and such, but its cryptographic design is not as strong (or rather, it makes more tradeoffs for usability and decentralisation), and it's a lot buggier and harder to use.
pousada 58 minutes ago|||
Do you know a better alternative that I can get my elderly parents and non-technical friends to use? I haven’t come across one and from my amateur POV it seems much better than WhatsApp or Telegram.
fsflover 1 hour ago||
Not sure why you're getting downvoted. This is exactly what he did to instant messaging: extremely damaging to everyone, and without solid arguments for such a design.
maqp 57 minutes ago||
Or, he took a barely-known niche messaging-app plugin (OTR), improved it to provide forward secrecy without requiring a round trip, and deployed the current state-of-the-art end-to-end encryption to over 3,000,000,000 users, since Signal isn't the only tool that uses double-ratchet E2EE.

>broken SGX metadata protections

Citation needed. Also, SGX is just there to try to verify what the server is doing, including that the server isn't collecting metadata. The real proof is in the responses to warrants (https://signal.org/bigbrother/), where they've been able to hand over only two timestamps: when the user created their account and when they were last seen. If that's not good enough for you, you're better off using Tor p2p messengers that have no servers collecting your metadata at all, such as Cwtch or Quiet.

>weak supply chain integrity

You can download the app as an .apk from their website if you don't trust Google Play Store.

>a mandate everyone supply their phone numbers

That's how you combat spam. It sucks, but there are very few options outside the corner of Zooko's triangle where your username looks like "4sci35xrhp2d45gbm3qpta7ogfedonuw2mucmc36jxemucd7fmgzj3ad".

>and agree to Apple or Google terms of service to use it?

Yeah that's what happens when you create a phone app for the masses.

stavros 22 minutes ago|||
Exactly. These arguments are so weak that they read more like a smear campaign than an actual technical discussion.

"You have to agree to Apple's terms to use it"? What's Signal meant to do, jailbreak your phone before installing itself on it?

kelipso 8 minutes ago||
Moxie Marlinspike sounds like some 90s intelligence guy’s understanding of what an appealing name to hacker groups would sound like. Put a guy like that as so-called creator of some encryption protocol for messaging and promote the app like it’s for secret conversations and you think people won’t be suspicious? It screams honeypot like nothing else.
stavros 5 minutes ago||
So the argument against Signal is now "the creator's nickname sounds odd"? I mean, OK? Keep using WhatsApp, Telegram or Instagram if you think those are more private than Signal.
josephg 41 minutes ago||||
> You can download the app as an .apk from their website if you don't trust Google Play Store.

I wish Apple & Google provided a way to verify that an app was actually compiled from some specific git SHA. Right now applications can claim they're open source, and claim that you can read the source code yourself. But there's no way to check that the authors haven't added any extra nasties to the code before building and submitting the APK / iOS application bundle.

It would be pretty easy to do. Just have a build process at Apple / Google which you can point at a git repo, and let them build the application. Or, even easier, just have a way to see the application's signature in the app store. Then open-source app developers could compile their APK / iOS app using GitHub Actions, and third parties could check that the SHA matches the app binaries in the store.
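
The third-party check described above is straightforward to sketch, assuming reproducible builds. All names and the store/CI plumbing here are hypothetical, not a real Apple or Google API:

```python
import hashlib

# Hypothetical third-party verification: rebuild the app from the claimed
# git SHA (e.g. in CI), then compare digests with the binary the store serves.

def digest(binary: bytes) -> str:
    return hashlib.sha256(binary).hexdigest()

def matches_claimed_source(store_binary: bytes, rebuilt_binary: bytes) -> bool:
    """True iff the store binary is byte-identical to the independent rebuild."""
    return digest(store_binary) == digest(rebuilt_binary)

assert matches_claimed_source(b"app-build-abc123", b"app-build-abc123")
assert not matches_claimed_source(b"app-build-abc123", b"app-build-evil")
```

One wrinkle: the developer's signing block (e.g. an APK v2 signature) can't be reproduced by a third party, so it would have to be stripped or compared separately before hashing.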

rcxdude 37 minutes ago|||
This is what F-Droid does (though I suspect most apps don't have reproducible builds that would allow third-party verification), but Signal doesn't want third-party builds of their client anyhow.
actionfromafar 31 minutes ago||
They could still figure out a way to attest their builds against source.
sudahtigabulan 4 minutes ago||||
>>broken SGX metadata protections

>Citation needed.

https://sgx.fail

https://en.wikipedia.org/wiki/Software_Guard_Extensions#List...

Maken 41 minutes ago||||
>over 3,000,000,000 users

Is that a typo or are you really implying half the human population use Signal?

Edit: I misread, you are counting almost every messaging app user.

maqp 34 minutes ago|||
Just WhatsApp. Moxie's ideas are used in plenty of other messengers. The context was "what Moxie did for the field of instant messaging".
rcxdude 35 minutes ago||||
Yeah, whatsapp uses the same protocol.
fsflover 13 minutes ago|||
>> and agree to Apple or Google terms of service to use it?

> Yeah that's what happens when you create a phone app for the masses.

No, that's what happens when you actively forbid alternative clients and servers, prevent (secure) alternative methods of delivery for your app and force people to rely on the American megacorps known for helping governmental spying on users, https://news.ycombinator.com/item?id=38555810

imustachyou 22 minutes ago||
I’m missing something: won't the input to the LLM necessarily be plaintext? And the output too? Then, as long as the LLM keeps logs, users' real inputs will be available somewhere on their servers.
fasterik 6 minutes ago|
According to the article:

>Data and conversations originating from users and the resulting responses from the LLMs are encrypted in a trusted execution environment (TEE) that prevents even server administrators from peeking at or tampering with them.

I think what they meant to say is that data is decrypted only in a trusted execution environment, and otherwise is stored/transmitted in an encrypted format.

bookofjoe 28 minutes ago||
https://news.ycombinator.com/item?id=46619643
colesantiago 1 hour ago||
The website is: https://confer.to/

"Confer - Truly private AI. Your space to think."

"Your Data Remains Yours, Never trained on. Never sold. Never shared. Nobody can access it but you."

"Continue With Google"

Make of that what you will.

maqp 44 minutes ago||
My issue is that it claims to be end-to-end encrypted, which is really weird. Sure, TLS between you and your bank's server is end-to-end encrypted, but that places your trust in the service provider.

Usually, in a context where a cypherpunk deploys E2EE, it means only the intended parties have access to the plaintext. And when it's you chatting with a server, it's like cloud backups: the data must be encrypted by the time it leaves your device, and decrypted only once it has reached your device again. For remote computing, that would require the LLM to handle only ciphertexts, which basically means fully homomorphic encryption (FHE). If it's that, then sure, shut up and take my money, but AFAIK the science of FHE isn't nearly there yet.
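
A toy of the "handle ciphertexts only" idea: unpadded textbook RSA is multiplicatively homomorphic, so a server can multiply two numbers it never sees in the clear. The parameters below are tiny and insecure, purely for illustration; real FHE generalizes this to arbitrary computation, which is what serving an LLM over ciphertexts would require.

```python
# Textbook RSA with toy parameters (insecure, illustration only).
p, q = 61, 53
n = p * q                          # public modulus (3233)
e = 17                             # public exponent
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent (Python 3.8+ modular inverse)

def encrypt(m: int) -> int:
    return pow(m, e, n)

def decrypt(c: int) -> int:
    return pow(c, d, n)

m1, m2 = 6, 7
c1, c2 = encrypt(m1), encrypt(m2)

# Server side: combine ciphertexts without ever decrypting them.
c_prod = (c1 * c2) % n

# Homomorphic property: Enc(m1) * Enc(m2) decrypts to m1 * m2.
assert decrypt(c_prod) == m1 * m2  # 42
```

Scaling this from one multiplication to a full transformer forward pass is exactly the open FHE performance problem alluded to above.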

So the only alternative I can see here is SGX, where the client verifies what the server is doing with the data. That probably works against surveillance capitalism, a hostile takeover, etc., but it's also a US NOBUS backdoor. Intel is a PRISM partner after all, and who knows whether national security requests allow compelling SGX keys; the USG did go after Lavabit's RSA keys, after all.

So I'd really want to see this either explained or conveyed in the product's threat-model documentation, with that threat model offered on the project's front page. Security is about knowing the limits of the privacy design so that the user can make an informed decision.

irl_zebra 1 hour ago||
Looks like it uses Google for login. You can also "Continue with Email." Logging in with Google is pretty standard.
colesantiago 1 hour ago||
It's not privacy-oriented if you're sharing login and profile information with Google and Confer.

It won't be long before Google and Gemini can read this information, and Google already knows you're using Confer.

I wouldn't trust it regardless of whether email login is available.

The fact that Confer allows Google login shows that Confer doesn't care about its users' privacy.

fasterik 22 minutes ago|||
Most people don't care about Google knowing whether they're using a particular app. If they do, they have the option not to use it. The main concern is that the chats themselves are E2E encrypted, which we have every reason to believe.

This is a perfect example of purism vs. pragmatism. Moxie is a pragmatist who builds things that the average person can actually use. If it means that millions of people who would otherwise have used ChatGPT will migrate because of the reduced friction and get better privacy as a result, that's a win even if at the margin they're still leaking one insignificant piece of metadata to Google.

pousada 56 minutes ago|||
You don’t have to use Google login though? People building solutions like this that aim for broad adoption have to make certain compromises and this seems OK to me (just talking about offering a social login option, haven’t checked the whole project in detail)
moralestapia 58 minutes ago||
Backdoor it?
throwpoaster 1 hour ago|
Add a defunct cryptotoken?
temp8830 54 minutes ago|
Hey, Telegram had one. He had to get to feature parity.