We recently looked at Quantum Computing's value proposition. We then looked at scenarios where Quantum Computing would have a big impact.
A Quantum Computer can speed up some problems by many orders of magnitude, turning problems that are intractable today into problems that are easy to solve tomorrow.
The key reason for this computational power is Quantum Superposition. In this article we'll look at how it quantitatively impacts computing: how does Quantum Computing perform compared to Classical Computing at different scales?
Classical bits vs. qubits
A classical bit can be either 0 or 1. A quantum bit, or qubit, is a superposition of 0 and 1.
A single qubit therefore holds two classical values at once, and every operation on the qubit acts on both values simultaneously.
This is why we often hear that a qubit packs more information than a classical bit.
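A simulator makes this concrete: it represents a qubit as a two-component complex vector of amplitudes. Here is a minimal sketch using NumPy (the variable names are just illustrative), putting a qubit into an equal superposition with a Hadamard gate:

```python
import numpy as np

# Minimal sketch of how a simulator represents one qubit: a 2-component
# complex vector holding the amplitudes of the values 0 and 1.
zero = np.array([1, 0], dtype=complex)  # the definite value 0

# The Hadamard gate turns a definite value into an equal superposition.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

superposed = H @ zero
print(superposed)               # amplitudes of ~0.707 for both 0 and 1
print(np.abs(superposed) ** 2)  # measurement probabilities: 0.5 and 0.5
```

Squaring the amplitudes gives the probability of measuring each value, which is why equal amplitudes mean the qubit is genuinely "both" 0 and 1 until measured.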
If we look at two bits, they can take the following values:
- 0, 0
- 0, 1
- 1, 0
- 1, 1
Two qubits take all those values at once.
We can see the pattern. One qubit can take the values of two bits. Two qubits can take the values of four bits. In general, n qubits can take the values of 2^n bits.
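This doubling is easy to check by enumeration. A small sketch (plain Python, nothing quantum-specific) lists the classical values n bits can take:

```python
from itertools import product

# The four values of two bits, matching the list above:
print(list(product([0, 1], repeat=2)))  # [(0, 0), (0, 1), (1, 0), (1, 1)]

# In general, n bits (the values n qubits hold in superposition) give 2**n combinations:
for n in (1, 2, 3, 13):
    assert len(list(product([0, 1], repeat=n))) == 2 ** n
```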
There are two ways of looking at this capacity scaling. Let's frame it in terms of simulating a Quantum Computer on a Classical Computer.
We could carry all possible classical states through every operation. This would increase the memory needed.
We could alternatively process one classical state at a time. This would increase the number of iterations, and hence the compute time.
We are going to use those two angles to look at the scaling of qubits.
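As a toy illustration of the trade-off, here is a sketch of both strategies applied to the same computation. The parity "oracle" and the qubit count are made up for the example:

```python
# Toy illustration of the two simulation strategies. The "oracle" here is
# made up for the example: it marks basis states with an even number of 1s.
n = 10  # number of simulated qubits

def oracle(state: int) -> bool:
    return bin(state).count("1") % 2 == 0

# Strategy 1: keep all 2**n classical states in memory at once (space cost).
in_memory = [oracle(s) for s in range(2 ** n)]  # a 2**n-entry list
print(sum(in_memory))  # 512

# Strategy 2: visit one classical state at a time (time cost, tiny memory).
count = 0
for s in range(2 ** n):  # 2**n loop iterations
    count += oracle(s)
print(count)  # 512
```

Both strategies get the same answer; one pays in RAM, the other in loop iterations, and both costs grow as 2^n.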
Here is a table giving us a comparison between classical and quantum bits at different scales.

| # of qubits | # bits / # loops | RAM |
|---|---|---|
| 3 | 8 | 1 B |
| 8 | 256 | 32 B |
| 10 | 1024 | 128 B |
| 13 | 8192 | 1 kB |
We can see the explosive nature of the beast. It takes only 13 qubits to store a kilobyte. 13 bits is just a byte and a half in comparison!
Let's look at larger scales with both memory (as we did) and time.
For time, let’s assume we have a Classical Computer with a clock speed of 3 GHz. Let’s also assume one operation on a classical state can be done in one clock cycle. The computer could therefore perform 3 billion operations per second.
This estimate is a little optimistic but it gives an order of magnitude.
| # of qubits | # bits / # loops | RAM | Simulation time |
|---|---|---|---|
| 13 | 8192 | 1 kB | 2.73×10⁻⁶ s |
| 33 | 8589934592 | 1 GB | 2.9 s |
| 43 | 8.8×10¹² | 1 TB | 49 mins |
| 53 | 9.0×10¹⁵ | 1 PB | 35 days |
| 63 | 9.2×10¹⁸ | 1 EB | 97.5 years |
| 1000 | 1.1×10³⁰¹ | 1.3×10²⁸² EB | 1.1×10²⁸⁴ years |
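These figures follow directly from the assumptions above (2^n bits, 3 billion operations per second); a short script reproduces them:

```python
# Reproduce the table under the stated assumptions: n qubits hold 2**n
# classical bits, and the classical machine performs 3 billion
# state-operations per second (one per 3 GHz clock cycle).
CLOCK_HZ = 3e9
SECONDS_PER_YEAR = 3600 * 24 * 365

for n in (13, 33, 43, 53, 63, 1000):
    bits = 2 ** n
    ram_bytes = bits / 8
    years = bits / CLOCK_HZ / SECONDS_PER_YEAR
    print(f"{n:>4} qubits: {bits:.3g} bits, {ram_bytes:.3g} B, {years:.3g} years")
```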
Those numbers drive the point home.
63 qubits span as many classical bits as an exabyte of data.
By comparison 63 bits is just under 8 bytes. Just enough to store 8 characters!
If we look at it from an execution time perspective, it would take a century to simulate an operation on 63 qubits.
We added the thousand qubits row to show where this trend is going.
10³⁰¹ classical bits is a staggering number of bits. For reference, it is believed the universe contains about 10⁸⁰ hydrogen atoms. So even if every hydrogen atom stored one bit, we would need around 10²²¹ universes' worth of hydrogen atoms to store those bits!
Operations on a thousand qubits would take 10²⁸⁴ years to simulate. The age of the universe is estimated at 13 billion (1.3×10¹⁰) years. Hence we would need to wait roughly 10²⁷⁴ times the current age of the universe to achieve that!
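These magnitudes take only a few lines of arithmetic to sanity-check:

```python
import math

# Sanity-check the universe-scale comparisons.
bits_needed = 2 ** 1000      # ~1.1e301 classical bits
hydrogen_atoms = 10 ** 80    # rough estimate for the universe

# Universes of hydrogen atoms needed if each atom stored one bit: ~10^221.
print(math.log10(bits_needed / hydrogen_atoms))

# Simulation time vs. the age of the universe: ~10^274 times longer.
age_of_universe_years = 1.3e10
sim_years = bits_needed / 3e9 / (3600 * 24 * 365)
print(math.log10(sim_years / age_of_universe_years))
```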
Clearly, this isn’t just a convenient speedup.
Quantum Computing opens up a different space of problems.
1000 qubits isn't where the train stops. It keeps going after that, bringing the ability to solve problems we do not yet have the vocabulary to express!