Posted by A_D_E_P_T 1 day ago
Quantum computing is currently stuck somewhere in the 1800s, when a lot of the theory was still being worked out and few functional devices had even been constructed.
Still, we should certainly keep funding both quantum computing and nuclear fusion research.
If you were to guess what reasons there might be that it WON’T happen, what would some of those reasons be?
- Too few researchers. In my area of quantum computing, I would say there is only one other group with any academic rigour that is actually making significant and important progress. The two other groups are using non-reproducible results to win credit and funding for private companies. FAANG-style companies are also doing research, and what comes out is still clearly aimed at funding: it doesn't stand up to scrutiny of method or repeatability (usually there isn't a documented method at all, though that will soon change, as I am in the process of producing a recipe to get to the point we are currently at, which is as far as anyone is).
- Too little progress. This is due to the research focus being spread too thin. We currently have the classic digital (qubit) vs. analogue (photonic) quantum computing fight, and even within each camp there are broad variations in where to focus. Each category is therefore still really at the start, because we are going in so many different directions instead of pooling our resources and making progress together. A lack of openness about results and methods harms us here, as does a lack of automation. Most significant research is done by human hand, which means that building on it at a different facility often requires learning the method in person from whoever developed it, or at worst re-developing it from scratch, which is a waste of time. If we don't show results, the funding won't be there. Classical computing eventually found a use case and became useful to the public, but I fear we may not reach that stage because we may take too long.
As an aside, we may also reach a stage where quantum computing is useful, but only in a military/security setting. I have worked on a security project (surprisingly, I was not bound by any NDA, but I'm still wary) featuring a quantum setup that could loosely be compared to a single-board computer (say, an ESP32), although much larger. There is some value to it, and that particular project could be implemented in a security context right now (I believe it was a viability study; I don't believe it has been or will be deployed), and it isn't that far off. But that particular project has no uses outside the military/security domain.
> I’m going to close this post with a warning. When Frisch and Peierls wrote their now-famous memo in March 1940, estimating the mass of Uranium-235 that would be needed for a fission bomb, they didn’t publish it in a journal, but communicated the result through military channels only. As recently as February 1939, Frisch and Meitner had published in Nature their theoretical explanation of recent experiments, showing that the uranium nucleus could fission when bombarded by neutrons. But by 1940, Frisch and Peierls realized that the time for open publication of these matters had passed.
> Similarly, at some point, the people doing detailed estimates of how many physical qubits and gates it’ll take to break actually deployed cryptosystems using Shor’s algorithm are going to stop publishing those estimates, if for no other reason than the risk of giving too much information to adversaries. Indeed, for all we know, that point may have been passed already. This is the clearest warning that I can offer in public right now about the urgency of migrating to post-quantum cryptosystems, a process that I’m grateful is already underway.
Does anyone know how far along that migration is? Do we need to worry that the switch away from RSA won't be broadly deployed before quantum decryption becomes available?
This estimate, however, assumes that an interaction can be turned on between any two qubits. In practice, we can only do nearest-neighbour interactions on a square lattice, and we have to simulate the interaction between two arbitrary qubits by repeated application of SWAP gates, shuttling the interaction through the lattice as in the 15-puzzle. This two-qubit simulation adds about `n` SWAP gates, which multiplies the noise by the same factor, so we now need an error rate for logical qubits on a square lattice of around ~n^(-4/3).
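As a toy illustration of that routing overhead (a hypothetical sketch of my own, not taken from the linked paper): the number of SWAPs needed to bring one worst-case pair of qubits adjacent on a √n × √n lattice grows with the Manhattan distance across the chip, so the overhead per two-qubit gate grows as the device scales. The helper name `swap_overhead` is invented for this example.

```python
import math

def swap_overhead(n: int) -> int:
    """SWAPs needed to make two worst-case-separated qubits adjacent.

    On a side x side square lattice (side = floor(sqrt(n))), the
    worst-case Manhattan distance between two qubits is 2*(side - 1);
    bringing them next to each other takes distance - 1 SWAP gates.
    """
    side = math.isqrt(n)
    dist = 2 * (side - 1)
    return max(dist - 1, 0)

for n in (16, 256, 4096):
    print(n, swap_overhead(n))
# 16 -> 5, 256 -> 29, 4096 -> 125
```

Each of those SWAPs is itself a noisy two-qubit gate, which is where the extra noise factor in the estimate above comes from.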
Now comes the error correction. The estimates are somewhat hard to make here, as they depend on the sensitivity of the readout mechanism, but for example let's say a 10-bit number can be factored with a logical-qubit error rate of 10^{-5}. Then we apply a surface code whose suppression scales exponentially, reducing the error rate by a factor of 10 for every 10 physical qubits, which we could express as ~1/10^{m/10}, where m is the number of physical qubits per logical qubit (which is rather optimistic). Putting in the numbers, we would need 50 physical qubits per logical qubit, hence, for the ~10,000 logical qubits involved, about 500k physical qubits in total.
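The arithmetic above can be sketched directly. This is a back-of-the-envelope calculation under the comment's assumed (and optimistic) scaling of logical error rate ~10^{-m/10}; the target error rate and logical-qubit count are illustrative inputs, not figures from any published resource estimate, and the function names are invented for this example.

```python
import math

def physical_per_logical(target_logical_error: float) -> int:
    """Smallest m such that 10**(-m/10) <= target_logical_error,
    under the assumed suppression of 10x per 10 physical qubits."""
    return math.ceil(-10 * math.log10(target_logical_error))

def total_physical(target_logical_error: float, n_logical: int) -> int:
    """Total physical qubits for n_logical logical qubits."""
    return physical_per_logical(target_logical_error) * n_logical

print(physical_per_logical(1e-5))    # 50 physical qubits per logical qubit
print(total_physical(1e-5, 10_000))  # 500000
```

Note how sensitive the total is to the assumed suppression rate: halving the exponent's denominator (10× suppression per 20 physical qubits instead of 10) would double the count.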
That may sound reasonable, but we also assumed that while the individual physical qubits are being manipulated, the others do not decohere while waiting for their turn. In fact this scales poorly with the number of qubits on the chip, because physical constraints limit the number of coaxial cables that can be attached; multiplexing of control signals, and hence waiting qubits, is therefore inevitable. The waiting is even more pronounced in the quantum-computer cluster proposals that surface from time to time.
[1]: https://link.springer.com/article/10.1007/s11432-023-3961-3
Once someone makes a widget that extracts an RSA payload, their government will seize it, then spend and scale.
They will try to keep it quiet, but they will start a spending spree that will be visible from space.
Either way, he must have known people would read it the way you did when he wrote it, so we can safely assume it's boasting at the very least.
> This is the clearest warning that I can offer in public right now about the urgency of migrating to post-quantum cryptosystems...
That has a clear implication that he knows something he doesn't want to say publicly.
Still, if that's true, it's an example of the very thing Scott's talking about: there are advances in the field that aren't being made public.
It doesn't need to be imminent for people to start moving to post-quantum now.
If he thinks we are 10 years away from QC, we need to start moving now.
They would hack random, long-unused, dead addresses holding five-figure amounts and slowly convert those to money. If too greedy, they would eventually depress the value significantly and could crash Bitcoin, but they could get filthy rich.