
Posted by A_D_E_P_T 1 day ago

More on whether useful quantum computing is “imminent” (scottaaronson.blog)
142 points | 115 comments
sallveburrpi 1 day ago|
So the summary is that useful quantum computing is definitely not imminent (as in, probably happening in the next 10-20 years) - or am I misreading?
ktallett 1 day ago||
As someone who works in quantum computing research, both academic and private: no, it isn't imminent in my understanding of the word, but it will happen. We are still at a point comparable to 1960s general computing development: many different platforms, and we have more or less decided on the best next step, but we still have many issues to solve. A lot of the key issues have solutions; the problem is more getting everyone to focus in the right direction, which will also determine when funding starts to focus in the right direction. There are snake oil sellers right now, and life will be immensely easier when they are removed.
ecshafer 1 day ago||
Wouldn't the comparison be more like the 1920s for computing? We had useful working computers in the 1940s, working on real problems and doing what was not possible beforehand. By the 1950s we had computers doing nuclear bomb simulations, and by the 1960s we had computers in banks doing accounting and inventory. So we had computers by then; not in homes, but we had them. In the 1920s we had mechanical calculators and emerging theories of computation, but no general-purpose computer. Until we have a quantum computer doing work at least at the level of a digital computer, I can't really believe it's the 1960s.
ktallett 1 day ago||
I'm not going to pretend that I am that knowledgeable about classical computing history from that time period. I was primarily going off the fact that the transistor was built in the late 1940s, and I would say we have the quantum version of that in both qubit- and photonic-based computing; they work, and we have been developing them for some time now. The key difference is that there are many more steps to get to the stage of making them useful. The transistor became useful extremely quickly; in quantum computing, these just haven't quite yet.
tdeck 1 day ago||
Relay based electromechanical computers were doing real, practical work in the early 1940s.
tokai 1 day ago|||
Not to be snarky, but how is it comparable to 1960s computing? There was a commercial market for computers, and both private- and public-sector adoption and use, in the 1960s.
ktallett 1 day ago||
There is private-sector adoption and planning now of specific single-purpose quantum devices in military and security settings. They work and exist, although I do not believe they are installed yet. I may be wrong on the exact date, as my classical computing knowledge isn't spot on. The point I was trying to make was that we have all the bits we need. We have the ability to make the photonic quantum version of a transistor (which, spoiler alert, is where the focus needs to move, over the qubit method of quantum computing), so we have hit the 1950s at least. The fundamentals at this point won't change. What will change is how they are put together and how they are made durable.
GrilledChips 1 day ago|||
In the 1960s we actually had extremely capable, fully developed computers: advanced systems like the IBM System/360 and the CDC 6600.

Quantum computing is currently stuck somewhere in the 1800s, when a lot of the theory was still being worked out and few functional devices had even been constructed.

ktallett 17 hours ago||
Oh no, that isn't factually correct. We have the theory; it is viable, provable, and demonstrated in both of the major branches of quantum computing, qubit and photonic. The key issue, as I say, is that each branch has multiple 'architectures', for lack of a better term. We do have functional devices; it is just that the function they provide is useless, as we can already do it on a laptop. That is partially a massive issue, as quantum computing almost needs to skip ahead of the development years classical computing was afforded.
noname120 1 day ago|||
What makes it more akin to 1960s general computing development than 1960s fusion power development (which is still ongoing!)? The former is incremental; the latter requires major technological breakthroughs before reaching any sort of usefulness. Quantum computing feels more like there are roadblocks that can't be ironed out without several technological revolutions.
BlackFly 1 day ago|||
Eh, quantum computing could very well be the next nuclear fusion where every couple of years forever each solved problem brings us to "We're 5 years away!"

Yet, for sure we should keep funding both quantum computing and nuclear fusion research.

andsoitis 1 day ago|||
> it will happen.

If you were to guess what reasons there might be that it WON’T happen, what would some of those reasons be?

ktallett 1 day ago||
So in my view, the issues I think about now are:

- Too few researchers. In my area of quantum computing, I would say there is only one other group that has any academic rigour and is actually making significant and important progress. Two other groups are using non-reproducible results to gain credit and funding for private companies. FAANG-style companies are also doing research, and the research that comes out is still clearly for funding. It doesn't stand up to scrutiny of method (there usually isn't one, although that will soon change, as I am in the process of producing a recipe to get to the point we are currently at, which is as far as anyone is) or repeatability.

- Too little progress. This is due to the research focus being spread too thin. We currently have the classic digital (qubit) vs. analogue (photonic) quantum computing fight, and even within each we have broad variations of where to focus. Each category is therefore still really just at the start, as we are going in so many different directions. We aren't pooling our resources and trying to make progress together. This is also where a lack of openness regarding results and methods harms us, as does a lack of automation. Most significant research is done by human hand, which means building on it at a different research facility often requires learning from the person who developed the method in person if possible, or at worst developing a method again, which is a waste of time. If we don't see results, the funding won't be there. Classical computing eventually found a use case and then became useful for the public, but I fear we may not get to that stage, as we may take too long.

As an aside, we may also get to the stage where it is useful, but only in a military/security setting. I have worked on a security project (I was surprisingly not bound by any NDA, but I'm still wary) featuring a quantum setup that could loosely be compared to a single-board computer (say an ESP32), although much larger. There is some value to it, and that particular project could be implemented into security right now (I do not believe it has been or will be; I believe it was a viability study), and isn't that far off. But that particular project has no other uses outside of the military/security setting.

layer8 1 day ago|||
What makes you confident that it will happen?
mvkel 1 day ago||
The people who are inside the machine are usually the least qualified to predict the machine's future
uejfiweun 1 day ago||
What an idiotic take. The LEAST qualified? Should I go ask some random junkie off the street where quantum computing will be in 5 years?
charcircuit 1 day ago||
Yes. Apply wisdom of the crowd.
cubefox 1 day ago||
This sounds slightly alarming:

> I’m going to close this post with a warning. When Frisch and Peierls wrote their now-famous memo in March 1940, estimating the mass of Uranium-235 that would be needed for a fission bomb, they didn’t publish it in a journal, but communicated the result through military channels only. As recently as February 1939, Frisch and Meitner had published in Nature their theoretical explanation of recent experiments, showing that the uranium nucleus could fission when bombarded by neutrons. But by 1940, Frisch and Peierls realized that the time for open publication of these matters had passed.

> Similarly, at some point, the people doing detailed estimates of how many physical qubits and gates it’ll take to break actually deployed cryptosystems using Shor’s algorithm are going to stop publishing those estimates, if for no other reason than the risk of giving too much information to adversaries. Indeed, for all we know, that point may have been passed already. This is the clearest warning that I can offer in public right now about the urgency of migrating to post-quantum cryptosystems, a process that I’m grateful is already underway.

Does anyone know how far underway it is? Do we need to worry that the switch away from RSA won't be broadly deployed before quantum decryption becomes available?

JanisErdmanis 20 hours ago|
From analytical arguments considering a rather generic error type, we already know that for Shor's algorithm to produce a useful result, the error rate needs to decrease with the number of logical qubits as ~n^(-1/3), where `n` is the number of bits in the number being factored [1].

This estimate, however, assumes that an interaction can be turned on between any two qubits. In practice, we can only do nearest-neighbour interactions on a square lattice, so we need to simulate the interaction between two arbitrary qubits by repeated application of SWAP gates, shuttling the interaction through as in the 15-puzzle. This simulation adds about `n` SWAP gates per two-qubit gate, which multiplies the noise by the same factor; hence we now need the error rate for logical qubits on a square lattice to be around ~n^(-4/3).

Now comes error correction. The estimates are somewhat hard to make here, as they depend on the sensitivity of the readout mechanism, but, for example, let's say a 10-bit number can be factored with a logical qubit error rate of 10^{-5}. Then we apply a surface code whose suppression scales exponentially, reducing the error rate by a factor of 10 for every 10 physical qubits, which we could express as ~1/10^{m/10}, where m is the number of physical qubits per logical qubit (which is rather optimistic). Putting in the numbers, it follows that we need about 40 physical qubits per logical qubit, hence around 400k physical qubits in total.

That may sound reasonable, but we assumed that, while the individual physical qubits are being manipulated, the other qubits do not decohere while waiting their turn. This, in fact, scales poorly with the number of qubits on the chip, because physical constraints limit the number of coaxial cables that can be attached; hence multiplexing of control signals, and hence waiting, is unavoidable. This waiting is even more pronounced in the quantum computer cluster proposals that surface from time to time.
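As a sanity check, the surface-code arithmetic in this estimate can be reproduced in a few lines of Python. Note that the baseline physical error rate (10^-1) and the logical-qubit count (10,000) are not stated above; they are assumptions chosen so that the 40-per-logical and 400k-total figures come out:

```python
import math

def physical_qubits_per_logical(p_phys, p_target, suppression_per_10=10):
    """Physical qubits per logical qubit, assuming the error rate is
    suppressed by a factor of `suppression_per_10` for every 10 physical
    qubits added (the exponential surface-code scaling described above)."""
    decades = math.log10(p_phys) - math.log10(p_target)
    return round(10 * decades / math.log10(suppression_per_10))

# Hypothetical baseline: 1e-1 physical error rate, 10,000 logical qubits.
m = physical_qubits_per_logical(p_phys=1e-1, p_target=1e-5)
print(m, m * 10_000)  # 40 physical qubits per logical, 400,000 total
```

The exponential suppression is what makes the headline number look tame: each extra order of magnitude in target error rate costs only 10 more physical qubits per logical qubit under this (optimistic) model.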

[1]: https://link.springer.com/article/10.1007/s11432-023-3961-3

osn9363739 1 day ago||
This is the worst quantum computing will ever be.
ktallett 17 hours ago||
That vastly depends on whether we choose to go in the right direction.
AlexandrB 1 day ago||
You assume no civilizational collapse is in our future.
qgin 23 hours ago||
Tbh this reply works for pretty much anything anyone ever says
osn9363739 13 hours ago||
I was attempting a sarcastic jab at the whole "this is the worst AI will ever be" line. Yet tech (or anything, really) often hits a wall and ends up about as good as it's ever going to get.
tgi42 1 day ago||
I worked in this field for years and helped build one of the recognizable companies. It has been disappointing to see, once again, promising science done in earnest be taken over by grifters. We knew many years ago that it was going to take FAR fewer qubits to crack encryption than pundits (and even experts) believed.
nacozarina 1 day ago||
another late signal will be a funding spike

once someone makes a widget that extracts an RSA payload, their govt will seize, spend & scale

they will try to keep it quiet but they will start a spending spree that will be visible from space

Traubenfuchs 1 day ago||
Cloud providers will love it when we need to buy more compute and memory for post-quantum TLS.
eightysixfour 1 day ago||
Did anyone else read the last two paragraphs as “I AM NOT ALLOWED TO TELL YOU THINGS YOU SHOULD BE VERY CONCERNED ABOUT” in bright flashing warning lights or is it just me?
svara 1 day ago||
He's making it sound that way, although he might plausibly deny that by claiming he just doesn't want to speculate publicly.

Either way he must have known people would read it like you did when he wrote that; so we can safely assume it's boasting at the very least.

bahmboo 1 day ago|||
I don't think he is saying that. As I said in my other comment here, I think he is just drawing a potential parallel to other historic work that was done in a private (secret) domain. The larger point is that we simply don't know, so it's best to act as if it will be broken even if it hasn't been already. Hence the move to post-quantum cryptography is probably a good idea!
griffzhowl 1 day ago||
Aaronson says:

> This is the clearest warning that I can offer in public right now about the urgency of migrating to post-quantum cryptosystems...

That has a clear implication that he knows something that he doesn't want to say publicly.

andrewflnr 1 day ago|||
Very much so. But the specificity and severity of what he knows are not clear just from this; not necessarily to the point of "bright flashing warning lights", as the top-level comment put it. Anyway, I certainly am glad that people are (as far as I can tell?) more or less on top of the post-quantum transition.
griffzhowl 20 hours ago||
Yes, it can easily just mean that he has some kind of inside information about progress that he doesn't want to divulge, but this is still far from "crypto is broken, guize"

Still, if that's true, it's an example of the very thing Scott's talking about: there are advances in the field that aren't being made public.

machinationu 1 day ago|||
A cryptosystem is expected to resist attack for about 30 years.

So the threat doesn't need to be imminent for people to start moving to post-quantum now.

If he thinks we are 10 years away from QC, we need to start moving now.

ktallett 1 day ago|||
It is more that many companies can't do what they claim, or have done it once at best with no consistency since. I sense most companies in the quantum computing space right now are of this ilk. As someone who works in academic and private quantum computing research: repeatability and methodology are severely lacking, which always rings alarm bells. Some companies are funded off the back of one very poor-quality research paper, reviewed by people who are not experts, which then leads to a company that looks professional but behind the scenes, I would imagine, is saying "Oh shit, now we actually have to do this thing we said we could do."
William_BB 1 day ago|||
Just you
belter 1 day ago||
I ran it through ROT13, base64, reversed the bits, and then observed it... The act of decoding collapsed it into... not imminent...
willmadden 1 day ago||
We'll know when all of the old Bitcoin P2PK addresses, and addresses that have been transacted from, are swept.
GrilledChips 1 day ago|
The funny thing is that nobody will ever do that. The moment someone uses quantum computing, or any other technology, to crack Bitcoin in a visible way, the coins they just gave themselves become worthless, because confidence collapses.
pona-a 1 day ago|||
There are some Bitcoin puzzles or old wallets that give some plausible deniability.
Traubenfuchs 1 day ago|||
Well, they wouldn't go for the trillion-dollar whale addresses.

They would hack random, long-unused, dead addresses holding 5-figure amounts and slowly convert those to money. They would eventually start to significantly lower the value, and would crash Bitcoin if too greedy, but could get filthy rich.

hellobluelings 1 day ago|
[dead]