
Posted by thadt 5 hours ago

A cryptography engineer's perspective on quantum computing timelines (words.filippo.io)
155 points | 70 comments | page 2
krunck 2 hours ago|
This would also be a good time for certain governments to knowingly push broken PQ KE standards while there is a panicked rush to get PQ tech in place.
FiloSottile 2 hours ago||
Remember that the entities most likely to heed those governments' recommendations are the ones providing services to said governments and their militaries.

I feel like the story of the NSA pushing a (definitely misguided, and obviously later exploited by adversaries) NOBUS backdoor has percolated poorly into the collective consciousness, with the NOBUS part missing entirely.

See https://keymaterial.net/2025/11/27/ml-kem-mythbusting/ for whether the current standards can hide NOBUS backdoors. It talks about ML-KEM, but all the recent standards I've read look like this.

adgjlsfhk1 2 hours ago||
IMO the idea that the NSA only uses NOBUS backdoors is obviously false (see, for example, DES's 56-bit key size). The NSA is perfectly capable of publicly calling for an insecure algorithm and then keeping secret documentation saying not to use it for anything important.
FiloSottile 1 hour ago|||
DES is the algorithm that was secretly modified by the NSA to protect it against differential cryptanalysis. Capping a key size is hardly a "backdoor."

Also, that was the time of export ciphers and Suite A vs Suite B, which were very explicit about there being different algorithms for US NatSec vs. everything else. This time there's only CNSA 2.0, which is pure ML-KEM and ML-DSA.

So no, there is no history of the NSA pushing non-NOBUS backdoors into NatSec algorithms.

bawolff 1 hour ago|||
> see for example DES's 56 bit key size

In fairness, that was from 1975. I don't particularly trust the NSA, but I don't think things they did half a century ago are a great way to extrapolate their current interests.

some_furry 2 hours ago||
Which governments are you thinking of?
amluto 3 hours ago||
I was in this field a while back, and I always found it baffling that anyone ever believed in the earlier large estimates for the size of a quantum computer needed to run Shor's algorithm. For a working quantum computer, Shor's algorithm is about as difficult as modular exponentiation or elliptic curve scalar multiplication: if it can compute or verify signatures or encrypt or decrypt, then it can compute discrete logs. To break keys of a few hundred bits, you need a few hundred qubits plus not all that much overhead. And the error correction keeps improving all the time.

Also...

> Trusted Execution Environments (TEEs) like Intel SGX and AMD SEV-SNP and in general hardware attestation are just f**d. All their keys and roots are not PQ and I heard of no progress in rolling out PQ ones, which at hardware speeds means we are forced to accept they might not make it, and can’t be relied upon.

This part is embarrassing. We’ve had hash-based signatures that are plenty good for this for years and inspire more confidence for long-term security than the lattice schemes. Sure, the private keys are bigger. So what?

We will also need some clean way to upgrade WebAuthn keys, and WebAuthn key management currently massively sucks.

hujun 27 minutes ago|
> Trusted Execution Environments (TEEs) like Intel SGX and AMD SEV-SNP and in general hardware attestation are just f*d. All their keys and roots are not PQ and I heard of no progress in rolling out PQ ones, which at hardware speeds means we are forced to accept they might not make it, and can’t be relied upon.

Compared to SGX, a more critically impacted component is the TPM chip: secure/measured boot depends on the TPM, and then there's the cost of replacing all the servers and OSes ...

Animats 2 hours ago||
We'll know it's been cracked when all the lost Bitcoins start to move.
xvector 1 hour ago||
The bitcoins won't move until the technology is commoditized (i.e., well past mainstream usage by governments).

Having PQ and your adversaries not knowing is far more valuable than the few hundred billion you could get from cracking (and tanking) BTC.

sunshine-o 2 hours ago||
Yep, I was looking into it and from what I understand:

- There is a dark outlook on Bitcoin, as the community and devs can't seem to coordinate, especially on what to do with the "Satoshi coins"

- Ethereum has a hard but clear path (pretty much full rewrite) with a roadmap [0]

- The highly optimized "fast chains" (Solana & co) are in a lot of trouble too.

It would be funny if Bitcoin the asset ended up migrating to Ethereum as another ERC-20 token

- [0] https://pq.ethereum.org/

PretzelPirate 1 hour ago|||
> pretty much full rewrite

This is far from my understanding. Swapping out the signature scheme is hard work, but it doesn't require a rewrite of the VM.

sunshine-o 16 minutes ago||
Ethereum is way more complex than, say, Bitcoin, and all parts are affected. This is not just the "signature scheme".

The fact that the signature size is multiplied by ~10 will greatly affect things like blockspace (which I guess is even more of a problem for Bitcoin!)

Also, I believe they are the only blockchain that puts an emphasis on allowing a large number of validators to run on very modest hardware (in the ballpark of an RPi, an N100, or a phone).

My understanding is they will need to pack it with a larger upgrade to solve all those problems, the so called zkVM/leanVM roadmap.

And then there are the L2 that are an integral part of the ecosystem.

So this is the greatest upgrade ever made to Ethereum, pretty much a full rewrite, larger than the transition to proof of stake. I remember that before the proof-of-stake migration they were planning to redo the EVM too (with something WASM-based at the time), but they had to abandon that plan. Now it seems there is no choice but to do it.

nullc 27 minutes ago|||
Adding new signature schemes to Bitcoin is relatively trivial and has been done previously (today Bitcoin supports both Schnorr and ECDSA signatures).

Existing PQ standards have signatures with the wrong efficiency tradeoffs for use in Bitcoin: large signatures that are durable across heavy use and support fast signing, while for Bitcoin signature+key size is critical, keys should be close to single-use, and signing time is irrelevant.

To the extent that I've seen any opposition related to this, it has only been to schemes that were too inefficient, or to proposals to confiscate the assets of people not adopting the proponent's scheme (which immediately raises concerns about backdoors and consent).

There is active development of PQ signature standards tailored to Bitcoin's needs, e.g. https://delvingbitcoin.org/t/shrimps-2-5-kb-post-quantum-sig... and I think progress looks pretty reasonable.

Claims that there is no development are, as far as I can tell, just backscatter from a massive ongoing fraud scheme (actually, at least two distinct cons with an almost identical script). There are criminal fraudsters out there seeking investments in a scheme to raise money to build a quantum computer and steal Bitcoins. One of them has reportedly raised funds approaching a substantial fraction of a billion dollars from victims. For every sucker they convince to give them money, they probably create 99 other people panicked about it (since believing it'll work is a prerequisite to handing over your money).

commandersaki 1 hour ago||
RemindMe! 3 years "impending doom"
pdhborges 4 hours ago||
What do you recommend as reading material for someone who was in college a while ago (before AE modes got popular) to get up to speed with the new PQ developments?
FiloSottile 4 hours ago|
If you want something book-shaped, the 2nd edition of Serious Cryptography is updated to when the NIST standards were near-final drafts, and has a nice chapter on post-quantum cryptography.

If you want something that includes details on how they were deployed, I'm afraid that's all very recent and I don't have good references.

Sparkyte 3 hours ago||
There is always a price to encryption. The cost goes up the more you have to cater to different and older encryption schemes while supporting the latest.
vonneumannstan 4 hours ago||
This seems like something uniquely suited to the startup ecosystem, i.e. offering PQ Encryption Migration as a Service. PQ algorithms exist, and now there's a large lift required to get them into the tech, with substantial possible value.
hlieberman 3 hours ago|
… really? This is simultaneously so far down in the plumbing and so resistant to measuring the impact of that I can't imagine anyone building a company off of this who's not already deep in the weeds (lookin' at you, WolfSSL).

The idea that a startup would be competitive in the VC “the only thing that matters are the feels” environment seems crazy to me.

OhMeadhbh 3 hours ago||
Yeah... I spent the 90s working for RSADSI and Certicom implementing algorithms. Crypto is a vitamin, not an aspirin. Hardly anyone is capable of properly assessing risk in general, much less the technical world of information risk management. Telling someone they should pay you money to reduce the impact of something that may or may not happen in the future is not a sales win.
bjourne 2 hours ago||
> Traveling back from an excellent AtmosphereConf 2026, I saw my first aurora, from the north-facing window of a Boeing 747.

Given the author's "safety first" stance on PQC, it seems a bit incongruous to keep flying to conferences...

OsrsNeedsf2P 3 hours ago|
Why do we "need to ship"? 1,000 qubit quantum computers are still decades away at this point
OhMeadhbh 3 hours ago|
So... in 2013 I was working for Mozilla, adding TLS 1.1 and 1.2 support to Firefox. It turns out that some of the extensions common in 1.1 in some instances caused PDUs to grow beyond 16k (or maybe it was 32k, I can't remember). This caused middleboxes to barf. Sure, they shouldn't barf, but they did. We discovered the problem (or rather, one of our users discovered it) by increasing the key size on server and client certs to push PDU sizes over the limit.

At the very least, you want to start using hybrid legacy / pqc algorithms so engineers at Cisco will know not to limit key sizes in PDUs to 128 bytes.

ekr____ 2 hours ago||
A few points here: There is already very wide use of PQ algorithms in the Web context [0], which is the most problematic one because clients need to be able to connect to any site and there's no real coordination between sites and clients. So we're exercising the middleboxes already.

The incident you're thinking of doesn't sound familiar. None of the extensions in 1.1 were really that big, though of course certs can get that big if you work hard enough. Are you perhaps thinking instead of the 256-511 byte ClientHello issue addressed in [1]?

[0] https://blog.cloudflare.com/pq-2025/ [1] https://datatracker.ietf.org/doc/html/rfc7685
