What is quantum computing? A prodigious leap from bits to qubits
Integrating the fundamentals of quantum mechanics into computer science will bring about a sea change in the depth and breadth of computing power. Although there are still hurdles to overcome, recent advances suggest these obstacles will soon be a thing of the past.
We live under the widespread impression that computing power progresses according to Moore’s Law, continuously and unrelentingly: computers grow ever more powerful even as they become smaller and less expensive. But this law, set forth in 1965 by Intel co-founder Gordon E. Moore, is not eternal. Nature has its limits – it is impossible to build a transistor smaller than an atom – and some have even put a date on Moore’s Law’s demise: 2023. Given that growing digitization and connectivity demand ever more computing power, advances in quantum computing are being eyed with particular eagerness.
A computer works with bits, the unit of data that underpins its algorithms and is represented by one of two possible digits: 0 or 1. That is how classical computing works today. But a radical change is underway, driven by bringing the fundamentals of quantum mechanics into computing – in particular, quantum phenomena such as superposition and entanglement.
By virtue of these properties, the qubits of quantum computing can be 0, 1, or both 0 and 1 at the same time. “This means that quantum computing is intrinsically parallel and its calculating power grows exponentially with the number of qubits,” explains Vicente Moret, researcher at the Research Center on Information and Communication Technologies (CITIC) at the University of A Coruña, professor in its Computer Science Department, and author of ‘Adventures in Computer Science,’ chosen by BookAuthority as one of the twelve best books on quantum computing this year.
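One rough way to make this concrete on a classical machine is to simulate the quantum state vector directly. The Python sketch below (using NumPy; the variable names and the 10-qubit example are ours, chosen purely for illustration) shows a single qubit in an equal superposition, and why describing n qubits takes 2^n amplitudes – the exponential growth Moret refers to:

```python
import numpy as np

# Computational basis state of a single qubit.
ket0 = np.array([1, 0], dtype=complex)  # |0>

# A Hadamard gate puts the qubit into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
psi = H @ ket0
print(np.abs(psi) ** 2)  # [0.5, 0.5]: equal odds of measuring 0 or 1

# Describing an n-qubit register takes 2**n complex amplitudes, which is why
# the state space (and the parallelism) grows exponentially with qubit count.
n = 10
state = ket0
for _ in range(n - 1):
    state = np.kron(state, ket0)
print(state.size)  # 1024 == 2**10
```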
According to Moret, the practical applications of quantum computing have the potential to be enormously valuable. “In addition to artificial intelligence and automated machine learning, quantum computing will play a hugely important role in pharmaceutical development, in genomics, and generally in all fields related to bioinformatics. It will also be decisive for problems involving data encryption, cryptography, and scrambling, as well as in the world of telecommunications.”
When and how will it become a reality?
Quantum computing faces a number of hurdles, the most significant being the “mere” physical construction of a computer that avoids qubit “decoherence” and the errors it can cause. “A quantum computer does not have to be very different from a conventional computer, but it does need to be perfectly sealed and completely isolated,” points out Vicente Moret: “This is due to a property called ‘quantum coherence.’ When a qubit interacts with its environment – and a measurement is one such interaction – it loses its quantum properties and becomes a standard bit. That is why our quantum algorithms have to finish executing before the qubit’s decoherence time: in quantum computing we must be able to run the full algorithm before we measure, and before the qubit loses its quantum properties.” To ensure an environment of near-total isolation in which qubits are not disturbed, quantum computers have to operate at temperatures approaching absolute zero (-273 °C, or -460 °F) and in vacuum conditions.
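To get a feel for the timing constraint Moret describes, consider a toy model in which a qubit’s coherence decays exponentially with a characteristic time T2. The figures below (a 100-microsecond coherence time and 50-nanosecond gates) are illustrative assumptions of ours, not the specifications of any real machine:

```python
import numpy as np

# Toy model: a qubit's coherence decays as exp(-t / T2) over time t.
# T2 and the gate time are assumed values, chosen only for illustration.
T2 = 100e-6          # coherence time: 100 microseconds (assumed)
gate_time = 50e-9    # time per quantum gate: 50 nanoseconds (assumed)

def coherence_remaining(num_gates: int) -> float:
    """Fraction of the initial coherence left after num_gates gates."""
    return float(np.exp(-(num_gates * gate_time) / T2))

for num_gates in (10, 100, 1_000, 10_000):
    print(f"{num_gates:>6} gates -> {coherence_remaining(num_gates):.3f} coherence left")

# Once the circuit's total run time approaches T2, the qubit behaves like a
# classical bit - hence the whole algorithm must execute, and be measured,
# before decoherence sets in.
```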
Tech giants like Intel, IBM, and Google have been investing in quantum computing for some time, and Canada’s D-Wave has a 2,000-qubit computer. For its part, Fujitsu has launched Digital Annealer, a technology that simulates how a quantum processor works. The Japanese company offers a practical example of the advantage Digital Annealer provides over traditional computing: to determine what impact pedestrianizing a street would have on 49 adjacent streets, a normal computer would need a year to perform the calculations and a supercomputer between eight and nine months, whereas their technology performs them in 0.3 seconds.
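Street-pedestrianization planning is a combinatorial optimization problem, the class of problem annealing technologies of this kind target (often expressed as QUBO: quadratic unconstrained binary optimization). Purely as a classical sketch of the underlying idea – not Fujitsu’s actual method – here is a simulated-annealing loop for a small, randomly generated QUBO; the matrix, cooling schedule, and parameters are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy QUBO: minimize x^T Q x over binary vectors x. Q is a random symmetric
# matrix standing in for a real cost model (e.g., traffic impact per street).
n = 12
Q = rng.normal(size=(n, n))
Q = (Q + Q.T) / 2

def energy(x: np.ndarray) -> float:
    return float(x @ Q @ x)

# Classical simulated annealing: flip one bit at a time, always accept
# improvements, and accept uphill moves with a probability that shrinks
# as the "temperature" is lowered.
x = rng.integers(0, 2, size=n)
temperature = 2.0
for step in range(5_000):
    i = rng.integers(n)
    candidate = x.copy()
    candidate[i] ^= 1
    delta = energy(candidate) - energy(x)
    if delta < 0 or rng.random() < np.exp(-delta / temperature):
        x = candidate
    temperature *= 0.999  # cooling schedule (arbitrary)

print("best configuration:", x, "energy:", round(energy(x), 3))
```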
Professor Vicente Moret points out that the Nobel Prize-winning physicist Richard Feynman – “one of the pioneers of quantum computing” – speculated that there would not be desktop quantum computers until 2050. Moret is more optimistic: “In my opinion, it won’t take 15 or 20 years for us to be using ‘household’ quantum computers.”