The classes of complexity of interest here are P, NP, BPP, and BQP.
— P corresponds to the problems that can be solved by a classical computer (deterministic Turing machine) in polynomial time. For example, finding a shortest path between two nodes in a graph is in this complexity class.
— NP corresponds to the problems for which “verifying a solution” is in P. In other words, it's “easy” to check whether a proposed solution is actually valid. We know that P ⊆ NP, which means that when it's easy to find a solution, it's also easy to check that a solution is valid (but we don't know whether the converse is true). This class contains interesting problems such as prime factorization, on which much of the current Web's security depends: it's easy to check that a factorization is valid, but it's conjectured to be hard to actually find one (see the first sketch just after this list). Now the interesting part is that we're not sure this is actually true, so the security house of cards might tumble at some point (if it hasn't already, secretly).
— BPP contains all the problems that you can solve in polynomial time (so like P) but with a probability of less than 1/3 of being incorrect! In practice, we run the algorithm several times and take a majority vote, which reduces the probability of being wrong to less than 2^(−ck) (c is a positive constant, k is the number of runs); the second sketch after this list illustrates this.
— BQP is probably the most interesting class here. It's just like BPP but on a quantum computer. Now the interesting part is that we don't know (yet) an efficient way to factorize integers on a classical computer, but on a quantum computer we have the famous Shor's algorithm, which solves this problem in polynomial time with bounded error!
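To make the “easy to verify, hard to find” asymmetry concrete, here is a toy Python sketch (the numbers are illustrative and the brute-force search is not how real attacks work):

    # Toy illustration: checking a claimed factorization is a single
    # multiplication; finding a factor by brute force takes ~sqrt(n) steps,
    # which is exponential in the number of digits of n.
    def verify(n, p, q):
        return 1 < p < n and 1 < q < n and p * q == n   # polynomial-time check

    def find_factor(n):
        d = 2
        while d * d <= n:
            if n % d == 0:
                return d
            d += 1
        return None                                     # n is prime

    n = 1000003 * 1000033                # two (small) primes
    print(verify(n, 1000003, 1000033))   # instant: True
    print(find_factor(n))                # ~10^6 trial divisions already,
                                         # and ~10x more for every 2 extra digits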
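And a quick numerical check of the repetition claim for BPP, assuming a hypothetical algorithm that is wrong with probability 1/3 and a simple majority vote over k independent runs:

    # The majority vote is wrong only if more than half of the k runs are
    # wrong; that probability decays exponentially in k (Chernoff bound).
    from math import comb

    def majority_failure_prob(k, p_err=1/3):
        return sum(comb(k, i) * p_err**i * (1 - p_err)**(k - i)
                   for i in range(k // 2 + 1, k + 1))

    for k in (1, 11, 51, 101):   # odd k avoids ties
        print(k, majority_failure_prob(k))
    # The printed failure probabilities shrink roughly like 2^(-c*k).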
The current knowledge about complexity classes is that P ⊆ BPP ⊆ BQP. We don't know how NP and BQP relate to each other.
What does this mean? We don't know whether quantum computers can solve harder problems in theory, but in practice they sometimes seem to, to the best of our knowledge. Coming back to the Shor's algorithm example: we don't know any classical algorithm that solves the factorization problem as efficiently, but that doesn't mean such an algorithm doesn't exist!
Now you might think that all of this is just theoretical consideration that only mathematicians care about. After all, we don't care whether efficient classical algorithms for supposedly hard problems might exist in theory if we never find them anyway, right? That's a valid observation. If big enough quantum computers become a reality one day, we'll be able to solve some hard problems right away. In the meantime, however, we might well find efficient classical algorithms that make quantum computers less useful. In fact, it already happened last year[1]! A recommendation problem that we thought was intractable by classical means (but easy to solve with quantum computers) turned out to be solvable more easily than expected on a classical computer. That classical algorithm does draw its inspiration from quantum algorithms, however. So if anything, quantum computing will force us to look at problems from another perspective and solve them in novel ways.
Note that they have a quantum computer that can solve a very specific type of problem (but it solves it exponentially faster than a classical computer). They don't have a general-purpose quantum computer; in particular, they can't run Shor's algorithm to factorize numbers and break a big part of today's encryption methods.
But keep in mind that QC does not mean you can solve _any_ problem exponentially faster than a binary computer. If something is NP-complete (or in any of the less familiar “hard problem” classes) then it’s still gonna be hard for the quantum computer.
So you’ll get an exponential speed-up for some specific problems like quantum circuit sampling, but it can’t give you an exponential speed-up on the Traveling Salesman Problem unless we find a new algorithm, and even then that would mean quantum computers can efficiently solve NP-complete problems (NP ⊆ BQP), which most complexity theorists consider unlikely.
There’s still lots that QC could bring to the table even if it only works for some problems, but it’s not expected to completely replace classical computers—more like augment them with some new super powers.
The Traveling Salesman Problem is indeed a potential application of quantum computing. Grover’s Search could theoretically find a solution with a quadratic speedup (on the order of sqrt(n!) steps versus n!). On a physical quantum computer, this would have a profound impact on many different real-world applications.
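To get a feel for what a quadratic speedup does (and does not) buy on a brute-force tour search, here is a back-of-envelope computation; the 20-city instance and the step counts are purely illustrative and ignore constants and the cost of each step:

    from math import factorial, isqrt

    n = 20                                   # number of cities
    classical_steps = factorial(n)           # unstructured search over all tours
    grover_steps = isqrt(classical_steps)    # quadratic (Grover-style) speedup

    print(f"{classical_steps:.3e} classical steps")         # ~2.4e18
    print(f"{grover_steps:.3e} with a quadratic speedup")   # ~1.6e9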
> They don't have a general purpose quantum computer, in particular, they can't run Shor's algorithm to factorize numbers and break a big part of encryption methods.
What limitations does it have, other than not having enough qubits to run Shor's algorithm (Scott Aaronson mentions it would require thousands of logical qubits, translating to possibly millions of physical qubits due to error-correction requirements)?
Scott's FAQ says: "The key difference with Google’s effort is that they have a fully programmable device—one that you can program with an arbitrary sequence of nearest-neighbor 2-qubit gates, just by sending the appropriate signals from your classical computer."
Unfortunately, I don't know what "arbitrary sequence of nearest-neighbor 2-qubit gates" means and what kinds of algorithms are out of reach for that method.
> arbitrary sequence of nearest-neighbor 2-qubit gates
The classical equivalent is (very handwavily) a processor that can perform all the ops from a reasonably small ISA, but where each op has a 1% chance of failing silently.
Hence, any quantum algorithm can in principle be put on this device, but you are limited to stuff that uses only about 50 qubits and does not mind the 1% error. There are not really any useful things you can do in the presence of (accumulating) 1% errors, besides proof-of-concept contrived problems.
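A rough way to see why an accumulating 1% per-gate error is so limiting (the gate counts are made up, and real devices model noise in more sophisticated ways):

    # Chance of an error-free run if every gate independently fails
    # silently with probability 1%.
    error_per_gate = 0.01

    for gates in (10, 100, 1000, 10000):
        p_clean = (1 - error_per_gate) ** gates
        print(f"{gates:>6} gates -> {p_clean:.2e} chance of a clean run")
    # 10 -> ~0.9, 100 -> ~0.37, 1000 -> ~4e-5, 10000 -> essentially zero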
Yes, it takes a classical computer more time to simulate a quantum computer than just running that quantum computer and checking the output.
But did anyone think this wasn't going to be the case? Quantum circuit sampling is just one example of this phenomenon; physically based rendering is another: it's much faster to stick a camera in a room and measure the resulting image than it is to simulate the light transport of the identical scene in a computer using an unbiased rendering algorithm.
I'm sure there are other examples: any physical phenomenon that is "NP-like" will have this property where setting up the (physical) system and measuring the end result after it has "settled" is exponentially faster than simulating the entire system in a classical computer and then measuring the result.
> in particular, they can't run Shor's algorithm to factorize numbers and break a big part of encryption methods
Given a sufficient time horizon, large swathes of crypto will be broken by such a computer, which has many people worried. NIST is working on standardizing algorithms resistant to quantum attacks, but we're still far off.
Yes, but isn't there a competition to find an algorithm that can be widely deployed and isn't subject to weird requirements? A sort of 'goldilocks' algorithm. For example, I can't recall the name of the algorithm, but one post-quantum candidate required key sizes of 1 TB. Another had large computation requirements and thus couldn't do on-the-fly crypto (which is needed for LUKS-style encryption of disks, for example). (Sorry for my lack of sources, but it was a YouTube video of some talk about post-quantum crypto algorithms that I randomly stumbled upon and forgot to bookmark.)
Excuse me if I’m being ignorant, but isn’t a quantum computer unable to reverse hashes? Are they going to mount a preimage attack or a collision attack on SHA-512, for example, by trying all combinations at once?
If hashes are safe, then why not simply base our signatures and other stuff on them? The only thing that may be missing is actual encryption / decryption, but that can be done by XORing with “one-time” pads built from peeling onion layers from hashes.
A quantum computer can find hash preimages in roughly the square root of the time a classical computer can, using Grover's algorithm (so a 256-bit hash becomes about as hard to reverse as a 128-bit hash is classically). Symmetric encryption (e.g. AES) is affected in the same way, so encryption itself doesn't have any problems with quantum computing (if you consider 128-bit AES to be safe in general).
Quantum computing does affect the distribution of symmetric keys using public-key encryption (and so affects a large amount of the encryption used in people's daily lives), which is vulnerable to quantum computers with sufficiently many qubits.
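Back-of-envelope arithmetic for the square-root claim above: an n-bit brute-force search needs ~2^n classical guesses but only on the order of 2^(n/2) Grover iterations, so the effective security level in bits is roughly halved.

    # Effective brute-force security against a Grover-style search.
    for n_bits in (128, 256, 512):
        print(f"{n_bits}-bit key/hash: ~2^{n_bits} classical guesses, "
              f"~2^{n_bits // 2} Grover iterations (~{n_bits // 2}-bit effective)")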
In that case we are perfectly fine. We can distribute symmetric keys in a different way: just run a hash 10,000,000 times and your public key will be the last result, while the private key will be the first input. Only you can reconstruct the other layers going backwards as you “peel the onion”, and you can use those numbers any way you want. My favorite application is to generate random numbers among untrusting participants by having everyone reveal the next number and then using functions of that as seeds to an RNG. It can be used for tons of Byzantine-fault-tolerant applications.
The only problem is that it has a kind of built-in deniability: once you’ve revealed the next onion layer, anyone could have constructed the transcript up to that point. So you can only be sure that the author signed something if you are sure they didn’t reveal the next private key to anyone else (to verify it).
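A minimal sketch of the hash-chain (“onion”) idea described above, using SHA-256; the chain length, seed, and reveal order are illustrative and this is not a vetted scheme:

    import hashlib, os

    def sha256(data: bytes) -> bytes:
        return hashlib.sha256(data).digest()

    # Build the chain: private seed -> hash applied N times -> public commitment.
    N = 10_000                        # 10,000,000 in the comment above; smaller here
    seed = os.urandom(32)             # the "private key": the first input
    chain = [seed]
    for _ in range(N):
        chain.append(sha256(chain[-1]))
    public_commitment = chain[-1]     # the "public key": the last result

    # Reveal layers in reverse order: anyone can check that a revealed value
    # hashes to the previously known one, but only the holder of the seed
    # could have produced it ahead of time.
    revealed = chain[-2]
    assert sha256(revealed) == public_commitment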
This doesn't work for encryption, though. You'd need a cryptosystem where it's possible to encrypt a message with H(x) such that it can only (efficiently) be decrypted given x, for a hash function H. To the best of my knowledge, no such hash function is known.
Note, however, that you would need a nonce/salt transmitted with each message, because otherwise two messages encrypted with the same key leak information: XORing the ciphertexts yields the XOR of the plaintexts!
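A tiny demonstration of that point (toy messages, and repeated hashing stands in for a real keystream, which is the kind of construction being discussed):

    import hashlib

    def keystream(key: bytes, length: int) -> bytes:
        # Toy keystream derived from the key only -- no nonce, which is the bug.
        out, block = b"", key
        while len(out) < length:
            block = hashlib.sha256(block).digest()
            out += block
        return out[:length]

    def xor(a: bytes, b: bytes) -> bytes:
        return bytes(x ^ y for x, y in zip(a, b))

    key = b"shared secret"
    m1 = b"attack at dawn!!"
    m2 = b"retreat at noon!"

    c1 = xor(m1, keystream(key, len(m1)))
    c2 = xor(m2, keystream(key, len(m2)))

    # Without a nonce, the two ciphertexts leak the XOR of the plaintexts:
    assert xor(c1, c2) == xor(m1, m2)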
Snuffle is symmetric key encryption, and yes, it's a well-known construction. However, you previously suggested that an asymmetric encryption scheme (with public and private keys) could be constructed using a hash.
I meant having a sufficient amount of time before certain crypto schemes are broken. Maybe I really meant reasonable. In a reasonable amount of time we can expect quantum computers to attack crypto schemes.
If you mean brute forcing an encrypted message, then yeah: that could take until the heat death of the universe for complex passphrases. In fact, you might even need several universes of time to crack some passphrases.
It's not an adiabatic machine. It's a gate-type machine, with registers that experience programmable interactions with their neighbors. That's much more like a classical computer than an adiabatic machine.
In this context "adiabatic" usually refers to "slow continuous transition" (even if this is not the literal meaning of the word). This device uses the circuit model (discrete gates), not the adiabatic model (continuous transition between Hamiltonians) of quantum computing.
Your initial comment also had an imprecision that is pretty common:
1. Adiabatic quantum computers and circuit-model quantum computers are equivalent models.
2. D-Wave never constructed an adiabatic quantum computer, rather something they call a quantum annealer. These are interesting devices inspired by the model of adiabatic computers, but we do not have a reason to think they are better than classical computers.
No surprise coming from The Economist, this article has done the best job (that I've seen) of explaining these complex topics in a language accessible to most. I now understand the significance of this new discovery and its limitations. And I like how they've stayed humble with their tie-in to Watson's famous prediction in the last sentence.
These little steps towards usable QC keep coming. Is there a paper or doc that outlines what breakthroughs are still needed before we can consider ECC or RSA broken? Is QC interesting for signal processing? How would having a million qubits in my iPhone change things for us?
Predicting how home users will benefit from QC is like predicting the iPhone on the day when Edison made an electronic tube. I am a researcher in the field and have no idea how it would be useful in a smartphone.
However, here are the breakthroughs necessary for scalable QC: make the quantum operations about 10 times more precise and the qubits about 10 times longer-lived (currently the information stored in the qubits decays a bit too fast). Then make it possible to manufacture not just 50, but a couple of thousand qubits. At that point you will be able to simulate important chemistry (for drug discovery and material design) and important field theories (for high-energy physics and cosmology).
Breaking RSA will be possible later on, when we get to millions of qubits, but it is not that big of a deal, as we can switch to other crypto schemes (though recordings of today's important messages will then be decryptable).
To your other question about signal processing: quantum computing is providing technologies that would be very useful in metrology and the creation of crazy sensitive sensors (sensing various fields or displacements or taking pictures at crazy resolutions and signal to noise ratios).
Good points. It’s hard to predict where these QC developments will lead.
One little aside, John Ambrose Fleming[1] is credited for inventing the first electronic tube (a vacuum tube based diode) around 1904 that initiated the development of modern (tube based) electronics.
However, the work at Thomas Edison’s laboratories in the previous decades on the electric light bulb was almost certainly an important prerequisite to Fleming’s work. He even worked for Edison for a period of time.
The interesting part for signal processing is that you can run the discrete Fourier transform via the QFT [1] in O((log N)^2) time. That's much faster than the classical Fast Fourier Transform, which is O(N log N), which in turn is much faster than the naïve calculation of the Fourier transform, which is O(N^2). https://en.wikipedia.org/wiki/Quantum_Fourier_transform
You get an extremely fast Fourier Transform, but for signal processing the problem is how to enter the data and how to extract the data.
In Shor's algorithm, the initial value is easy to construct, so you don't need O(N) time just to enter the data, and the output you need to read is only a single value, not the value at every frequency, so you don't need O(N) time just to extract the data. https://en.wikipedia.org/wiki/Shor%27s_algorithm
[1] I prefer to call it QFFT because the method is more similar to the FFT than to the FT, but most of the people disagree with me.
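To make the gap between those three scalings concrete (constant factors ignored, and saying nothing about the cost of loading or reading out the data, which is exactly the catch mentioned above):

    from math import log2

    for N in (2**10, 2**20, 2**30):
        qft   = log2(N) ** 2     # O((log N)^2) -- quantum Fourier transform
        fft   = N * log2(N)      # O(N log N)   -- classical FFT
        naive = float(N) ** 2    # O(N^2)       -- naive DFT
        print(f"N=2^{int(log2(N))}: (log N)^2 ~ {qft:.0f}, "
              f"N log N ~ {fft:.1e}, N^2 ~ {naive:.1e}")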
These are between a micrometer and a millimeter in size, depending on the technology and exactly what you count. But miniaturizing them is by far not the most pressing problem. And it will not be a server rack for now, rather a helium dilution refrigerator that operates at less than 1 Kelvin.
[1] https://arxiv.org/abs/1807.04271