Classical computing power has historically doubled roughly every two years (Moore's law). Progress now appears to be slowing, and certain problems require computational power that, mathematically, cannot be achieved efficiently using Classical Computers.
Quantum Mechanics is a fundamental theory in physics describing the properties of nature on an atomic (i.e. really small) scale. Quantum Mechanics has certain features which do not occur in standard or “classical” physics such as “Superposition” and “Entanglement”.
Whilst development is required, Quantum Computers are expected to be faster than Classical Computers for certain use cases by harnessing such features. We cover this in more detail here.
The two most important Quantum Mechanical behaviours to understand are Superposition and Entanglement.
Superposition is the ability of a quantum object, such as an electron, to exist in multiple "states" simultaneously until it is measured (at which point the superposition collapses). A great deal of theoretical and experimental work has gone into explaining exactly how this happens, but for our purposes it is best not to think too literally about the concept; instead, observe that existing in multiple states allows one quantum object to store more information than a binary classical bit.
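As a purely illustrative sketch (in Python with NumPy; the numbers here are our own example, not tied to any particular hardware), a single qubit's superposition can be written as two amplitudes, with the squared magnitude of each amplitude giving the probability of measuring that outcome (the Born rule):

```python
import numpy as np

# A qubit state is a pair of amplitudes (a, b) with |a|^2 + |b|^2 = 1.
# This equal superposition gives a 50/50 chance of measuring 0 or 1.
state = np.array([1, 1]) / np.sqrt(2)

# Born rule: each outcome's probability is the squared magnitude
# of its amplitude.
probs = np.abs(state) ** 2
print(probs)  # -> [0.5 0.5]

# "Measurement" collapses the superposition to one classical outcome.
outcome = np.random.choice([0, 1], p=probs)
```

Until the final line, the qubit genuinely carries both amplitudes at once; measurement is what forces it to a single classical bit.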
Entanglement describes a strong correlation between quantum particles, whereby two or more particles become linked so that measuring one instantly tells you about the others, even if they are separated by great distances. Again, this is so hard to understand intuitively (Einstein described it as "spooky action at a distance") that it should suffice to say, for our purposes, that it allows for greater connectivity of information.
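The perfect correlation can be seen in a small sketch (again Python/NumPy, purely illustrative) using the simplest entangled pair, a so-called Bell state, whose amplitudes sit only on the outcomes where both qubits agree:

```python
import numpy as np

# Two-qubit states have four amplitudes, one per joint outcome: 00, 01, 10, 11.
# The Bell state puts amplitude only on 00 and 11 - the qubits are entangled.
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)

probs = np.abs(bell) ** 2
outcomes = ["00", "01", "10", "11"]

# Sample many measurements: the two qubits always agree,
# no matter how far apart they are when measured.
samples = np.random.choice(outcomes, size=1000, p=probs)
print(set(samples))  # only '00' and '11' ever appear
```

Neither qubit on its own has a definite value before measurement, yet the two results always match, which is exactly the correlation Einstein found so troubling.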
The basic idea behind quantum computing is to utilise these Quantum Mechanical behaviours to our advantage. Most articles incorrectly write that quantum computers obtain their power by trying every possible answer to a problem in parallel. Whilst a helpful heuristic, this is not strictly true. Quantum Computers use entanglement between qubits and the amplitudes associated with superpositions (which, unlike ordinary probabilities, can be negative as well as positive) to carry out a series of operations such that certain probabilities are enhanced and others reduced, even to zero. By pushing the probabilities of wrong answers towards zero, the right answer becomes far more likely to be measured.
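This cancellation of amplitudes (interference) can be seen in a minimal sketch (Python/NumPy, illustrative only) using the Hadamard gate, a standard one-qubit operation: applying it once creates an equal superposition, but applying it twice makes the negative and positive amplitudes cancel, so one outcome's probability is driven to zero:

```python
import numpy as np

# The Hadamard gate maps |0> to an equal superposition of |0> and |1>.
# Note the negative entry: amplitudes, unlike probabilities, can be negative.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
zero = np.array([1.0, 0.0])

superposed = H @ zero
print(np.abs(superposed) ** 2)  # -> [0.5 0.5], both outcomes equally likely

# Applying H again makes the amplitudes interfere: the two "paths" to |1>
# cancel exactly (destructive interference), so we are certain to measure 0.
final = H @ superposed
print(np.abs(final) ** 2)  # -> [1. 0.]
```

Quantum algorithms are built from exactly this kind of choreography, arranging operations so wrong answers interfere destructively and right answers constructively.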
First we must remind ourselves of 'bits'. A bit is the smallest unit of classical information and can be in one of two states (we call these states 0 and 1). We can make a bit from anything that has two states: computer scientists used to store bits by punching holes in card, where a hole represented a 1 and the absence of a hole represented a 0. Later technology such as compact discs (CDs) stored bits using tiny dents in the metal surface of the disc, where a variation in the surface represented a 1 and a constant surface represented a 0.
Quantum mechanics is a more accurate model of the world that emerged in the early 20th century. One of the many results of this new model was that the most basic unit of information was not the bit but instead the quantum bit, or “qubit”. More interestingly, it turned out that this new unit of information could be useful for computations and communications, and since then there has been an effort to create a physical qubit.
As with bits, the qubit is an abstract idea that isn't tied to a specific object, and we must find a suitable physical system to store them. Unfortunately, qubits are much harder to implement than classical bits: while we can carry around billions of bits on a hard drive in a rucksack, it is difficult to sustain even a handful of qubits under laboratory conditions. As a result, many of the qubits in a quantum computer must be dedicated to error correction.
For those of you already familiar with the basics, we maintain a table of qubit implementations here.
The core of a Quantum Computer is its processor which contains qubits, the quantum equivalent of a classical computer’s bits (the most basic unit of information). As outlined above, qubits can be represented in a number of different ways and thus the kit that is required to make them useful varies too.
Common to most Quantum Computers today is that they are large, complex engineering feats. As covered in more detail here, much of this engineering goes into isolating the qubits from the outside world so that they maintain their quantum mechanical properties.
This is easiest to visualise with the Superconducting Quantum Computers used by the likes of IBM and Google. These are essentially large refrigerators with wires to carry signals back and forth. The actual quantum processor (and indeed the qubits) is a small chip that you could hold in your hand, sitting right at the bottom of the machine.
Quantum Computers may not always need to be this large and cumbersome. One only needs to look at computers from the 1950s to get a sense for how technology can develop.
Large companies like IBM are aware that captivating design can bring new technologies to life. Indeed, IBM worked with the Italian design company Goppion on the concept and construction of the case for their IBM Q System One, which was unveiled at the Consumer Electronics Show (CES) in Las Vegas in 2019.
Further miniaturization of components has forced engineers to consider quantum mechanical effects. As chipmakers have added more transistors onto a chip, transistors have become smaller, and the distances between different transistors have decreased. Today, electronic barriers that were once thick enough to block current are now thin enough that electrons can tunnel through them (a phenomenon known as quantum tunnelling). Though there are further ways to increase computing power that avoid further miniaturization, scientists have looked to see whether they can harness quantum mechanical effects to create different kinds of computers. Most people will point to the 1980s as the start of physicists actively looking at computing with quantum systems.
You can learn more in our history of quantum computing.
Firstly Quantum Computers are not just a natural improvement of today’s computers; rather they are a different approach to computing utilising Quantum Mechanics (as outlined above).
Secondly, we are still in a relatively early development stage for Quantum Computers. Humanity has been refining the "ingredients" of fast Classical Computing for nearly 100 years (vs. about 20 years for Quantum Computing).
With that being said, a team at Google were able to demonstrate what has been unfortunately termed Quantum Supremacy using their 53 Qubit Sycamore Chip. John Preskill defined Quantum Supremacy as the point where quantum computers “can do things that classical computers can’t, regardless of whether those tasks are useful”. As outlined here, Google’s achievement, whilst remarkable, was purposely contrived to be easy for a Quantum Computer and hard for a Classical Computer. It was able to do a calculation in three minutes which would have taken the world’s fastest supercomputer anywhere from 2.5 days to 10,000 years (depending on who you ask). The calculation itself has no known useful applications.