# Quantum Computing - How does it scale?

We recently looked at the Quantum Computing value proposition.  We then looked at scenarios where Quantum Computing would have a big impact.

A Quantum Computer can solve some problems several orders of magnitude faster.  Problems that are intractable today could become easy to solve tomorrow.

The key reason for this computational power is Quantum Superposition.  In this article we’ll look at how it quantitatively impacts computing:  how does Quantum Computing perform compared to Classical Computing at different scales?

## Classical Bits vs Qubits

A classical bit can be either 0 or 1.  A quantum bit, or qubit, is a superposition of 0 and 1.

A single qubit therefore takes two classical values at once.  Every operation on the qubit acts on both values at once.

This is why we often hear that a qubit packs more information than a classical bit.

If we look at two bits, they can take the following values:

- 0, 0
- 0, 1
- 1, 0
- 1, 1

Two qubits take all those values at once.

We can see the pattern.  One qubit can take the values of two bits.  Two qubits can take the values of four bits.  In general, n qubits can take the values of 2ⁿ bits.
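This doubling is easy to sketch in a few lines of Python (the function name here is our own, just for illustration):

```python
# The state of n qubits spans 2**n classical basis values, so a
# classical description needs storage that doubles with every qubit.
def classical_values(n_qubits: int) -> int:
    """Number of classical basis values n qubits hold at once."""
    return 2 ** n_qubits

for n in (1, 2, 3, 10):
    print(n, "qubit(s) ->", classical_values(n), "classical values")
```

Each extra qubit doubles the count: the growth is exponential from the very first line.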

## Interpretation

There are two ways of looking at this capacity scaling.  Let’s frame it in terms of simulating a Quantum Computer on a Classical Computer.

We could carry all possible classical states through each operation.  This would increase the memory needed.

We could alternatively use one classical state at a time.  This would increase the number of iterations, and hence the compute time.

We are going to use those two angles to look at the scaling of qubits.
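A minimal sketch of the two angles, using a toy operation of our own invention (flip every bit) just to make the scaling concrete:

```python
# Toy 'operation' on a basis state: flip every bit (a NOT on each
# wire).  The operation itself doesn't matter; the scaling does.
def flip_all(state: int, n: int) -> int:
    return state ^ ((1 << n) - 1)

n = 3

# Angle 1: carry all 2**n classical states at once -> memory grows as 2**n.
all_states = [flip_all(s, n) for s in range(2 ** n)]

# Angle 2: one classical state at a time -> constant memory, but the
# loop count (and hence compute time) grows as 2**n.
iterations = 0
for s in range(2 ** n):
    _ = flip_all(s, n)
    iterations += 1

print(len(all_states), iterations)  # both scale as 2**n = 8
```

Either way, something pays the 2ⁿ bill: memory in the first angle, time in the second.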

## Scaling

Here is a table comparing classical and quantum bits at different scales.

| # of qubits | # bits / # loops | RAM |
|---:|---:|---|
| 1 | 2 | 2 bits |
| 2 | 4 | 4 bits |
| 3 | 8 | 1 byte |
| 4 | 16 | 2 bytes |
| 5 | 32 | 4 bytes |
| 6 | 64 | 8 bytes |
| 7 | 128 | 16 bytes |
| 8 | 256 | 32 bytes |
| 9 | 512 | 64 bytes |
| 10 | 1024 | 128 bytes |
| 11 | 2048 | 256 bytes |
| 12 | 4096 | 512 bytes |
| 13 | 8192 | 1 kB |

We can see the explosive nature of the beast.  It takes only 13 qubits to hold a kilobyte’s worth of classical states.  13 bits, by comparison, is just over a byte and a half!

Let’s look at larger scale with both memory (as we did) and time.

For time, let’s assume we have a Classical Computer with a clock speed of 3 GHz.  Let’s also assume one operation on a classical state can be done in one clock cycle.  The computer could therefore perform 3 billion operations per second.

This estimate is a little optimistic but it gives an order of magnitude.
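Under those assumptions the table below can be reproduced mechanically; this sketch (with names of our own choosing) computes the loop count and simulation time:

```python
# Assumptions from the text: a 3 GHz clock and one operation on one
# classical state per clock cycle.
CLOCK_HZ = 3e9

def loops(n_qubits: int) -> int:
    # Number of classical states to visit, one at a time.
    return 2 ** n_qubits

def sim_time_seconds(n_qubits: int) -> float:
    return loops(n_qubits) / CLOCK_HZ

print(loops(13), sim_time_seconds(13))
```

Plugging in 13 qubits gives 8192 loops and about 2.73×10⁻⁶ seconds, matching the first row below.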

| # of qubits | # bits / # loops | RAM | Time |
|---:|---:|---|---|
| 13 | 8192 | 1 kB | 2.73×10⁻⁶ s |
| 20 | 1048576 | 128 kB | 3.5×10⁻⁴ s |
| 23 | 8388608 | 1 MB | 2.8×10⁻³ s |
| 33 | 8589934592 | 1 GB | 2.9 s |
| 43 | 8.8×10¹² | 1 TB | 49 mins |
| 53 | 9.0×10¹⁵ | 1 PB | 35 days |
| 63 | 9.2×10¹⁸ | 1 EB | 97.5 years |
| 1000 | 1.1×10³⁰¹ | 1.3×10²⁸² EB | 1.1×10²⁸⁴ years |

Those numbers drive the point home.

63 qubits contain as many classical bits as an Exabyte of data.

By comparison, 63 bits is just under 8 bytes, barely enough to store 8 characters!

If we look at it from an execution time perspective, it would take about a century to simulate a single operation on 63 qubits.

We added the thousand qubits row to show where this trend is going.

10³⁰¹ classical bits is a staggering number of bits.  For reference, it is believed the universe contains 10⁸⁰ hydrogen atoms.  Even storing one bit per atom, we would need about 10²²¹ universes’ worth of hydrogen atoms to store those bits!

Operations on a thousand qubits would take 10²⁸⁴ years to simulate.  The age of the universe is estimated at 13 billion (1.3×10¹⁰) years.  Hence we would need to wait roughly 10²⁷⁴ times the current age of the universe to achieve that!
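Those comparisons are easy to sanity-check.  A quick sketch under the same assumptions (one bit per hydrogen atom, a 3 GHz clock), working in orders of magnitude:

```python
import math

bits = 2 ** 1000                       # ~1.1e301 classical bits for 1000 qubits
atoms_per_universe = 10 ** 80          # estimated hydrogen atoms in the universe

# Storing one bit per hydrogen atom:
universes = bits / atoms_per_universe  # universes' worth of atoms needed

# Simulating at 3e9 operations per second:
seconds = bits / 3e9
years = seconds / (365.25 * 24 * 3600)
universe_ages = years / 1.3e10         # vs a 13-billion-year-old universe

print(round(math.log10(universes)), round(math.log10(universe_ages)))
# -> 221 274
```

That is roughly 10²²¹ universes of atoms and 10²⁷⁴ ages of the universe: exponents this large barely move when divided by merely astronomical numbers.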

## Summary

Clearly, this isn’t just a convenient speedup.

Quantum Computing opens up a different space of problems.

1000 qubits isn’t where the train stops.  It keeps going after that, bringing the ability to solve problems we do not even have the vocabulary to express today!

### 5 responses

You made a bit of a mistake in your math at the end! 10^284 years isn’t 28 times longer than the current age of the universe at 1.3x10^10 years. Each exponent is 10x the years. So 10^(284-10) is the actual number of universe ages it would take to process 1000 qubits classically. So it would take 10^274 universe ages to process that same info. Not 28.

2. Ammaorjwns 2021-12-19 at 10:51

Also 10⁸⁰ is 10x smaller than 10⁸¹, so no, 10⁸⁰ x 3 doesn’t mean 10³⁰¹

3. Reuben 2022-01-12 at 08:55

I’m confused: the number of possible states for a group of classical bits is also 2^n, where n is the number of classical bits?

4. Craig 2022-01-20 at 15:29

Reuben, I was confused by this at first too. The qubits allow each of those 16 states to be stored simultaneously due to superposition (they only assume a discrete state when observed). Classical bits can indeed represent 2^n states with n bits but they can only represent one of those states at any moment. It would take 4 bits x 16 combinations to hold all 16 values in memory but it would only require 4 qubits to represent all 16 values.

5. Joe 2022-02-17 at 14:14

Source?