Posted by ingve 8/31/2025
I don't get this part.
If the author already produced "115x", how can optimizations make it worse?
Randomness is something I feel is a pretty weird phenomenon. I am definitely one of those 'God doesn't play dice' types.
Randomness is also a label we apply to things that are only random from a subjective perspective. If we knew more about a system, the randomness would just fall away. E.g. if we knew the exact physical properties of a die roll, we could probably predict it better than chance.
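A toy illustration of that "subjective randomness" point: a seeded pseudorandom generator looks random to an observer who can't see its internal state, but is fully determined once you know the seed (the seed value here is arbitrary, chosen just for the sketch):

```python
import random

# Two generators initialized with the same hidden "state" (the seed)
# produce identical "random" sequences. The randomness is only
# apparent to an observer who doesn't know that state.
a = random.Random(42)
b = random.Random(42)

rolls_a = [a.randint(1, 6) for _ in range(5)]  # five "die rolls"
rolls_b = [b.randint(1, 6) for _ in range(5)]

print(rolls_a == rolls_b)  # True: knowing the state removes the randomness
```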
What if quantum mechanics is similar? I.e. what we think of as randomness isn't really randomness, but only appears that way given the limits of what we can observe. If this is the case, and if our algorithms rely on some sort of genuine randomness inherent in the universe, then doesn't that suggest there's a problem? Perhaps part of the errors we see in quantum mechanics arise from something fundamental to the universe being different from our model.
I don't think this is that far-fetched given the large holes our current understanding of physics has in predicting the universe. It just seems that in the realm of quantum mechanics this isn't the case, apparently because experiments have verified things. However, I think there really is something to the proof being in the pudding (i.e. providing a practical use case).
You are probably talking about the Copenhagen interpretation, involving superposition.
Personally, I don't think this is the final theory.
Any theory using calculus cannot be considered discrete, is therefore not quantized, and so cannot be "physical".
Gerard 't Hooft has more to say on this if you want to hear something from a Nobel laureate on the subject.
I think what I've just said fits with your calculus comment, and also that a Wolfram-like interpretation is closer to "truth" and to your point on discretisation.
Why do you think discretisation/quantisation is necessary for the "physical"?
What can I search for to find his comments on this subject?
We are trying to explain the physical reality we find ourselves in, so if the universe is fundamentally quantized, it must be discrete, as continuous math would reify infinities.
> What can I search for to find his comments on this subject?
You could check Curt Jaimungal's YouTube channel; 't Hooft was on it recently.
But if you are up for an existential crisis, just google “hidden variable theories”
Sorry, I was destined to make this joke before I was even born.
I mean no disrespect, but I don’t think it’s a particularly useful activity to speculate on physics if you don’t know the basic equations.
> Aside: multiplication by 16 mod 21 is the inverse of multiplying by 4 mod 21, and the circuits are reversible, so multiplying by 16 uses the same number of Toffolis as multiplying by 4.
I couldn't really find anything explaining the significance of this. The only info I found said that "4 mod 21 = 4" (but I don't know if it was AI slop or not).
Is "multiplying by 4 mod 21" something distinct to quantum computing?
For instance the following are equivalent:
2 = 6 mod 4
6 = 2 mod 4
This 'mod 4' can also appear in parentheses or in some other way, but it must appear at the end. Like I said, it is not an operator; rather, it denotes that the entire preceding statement takes place in the appropriate quotient space.
So it is not (multiplying by (4 mod 21)) but ((multiplying by 4) mod 21)
For example, under mod 21 a half can actually be represented by 11. Try it: multiply any even number by 11 and you'll see you've halved it.
Take any number that's a multiple of 4 and multiply it by 16 under mod 21. You now have that number divided by 4.
Etc.
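A quick sanity check of those claims (mod 21 taken from the article's example; Python 3.8+ computes modular inverses directly with three-argument `pow`):

```python
M = 21  # the modulus from the article's factoring example

# A number b is the modular inverse of a when a * b % M == 1.
inv4 = pow(4, -1, M)  # inverse of 4 mod 21
inv2 = pow(2, -1, M)  # inverse of 2 mod 21

print(inv4, inv2)      # 16 11
print(4 * 16 % M)      # 1 -- multiplying by 16 undoes multiplying by 4
print(6 * 11 % M)      # 3 -- "half" of 6, as claimed
print(8 * 16 % M)      # 2 -- 8 divided by 4
```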
Absolutely nothing to do with quantum computers.
Digital computers were much easier than that. Make it smaller, make more of them, and you're set.
A quantum computer's complexity goes up with ~n^2 (or possibly ~e^n), where n is the number of qubits.
At the same time, things like D-Wave might be the most 'quantum' we get in the practical sense.
There are no theoretical reasons QEC can't exist. In fact, it already does. Is it already good enough for universal fault tolerance? No. But then again, no one said it would be yet. We are slowly getting closer every year.
In his book, Dyakonov offers zero solid reasons other than "it's hard" and thus likely not possible. That's just an opinion.