Mint Explainer: What does Google's new quantum computer achievement mean?
Updated: 23 Feb 2023, 03:36 PM IST | Google announced on Wednesday that it has reduced errors in quantum computers, the second of its six declared milestones, but fully fault-tolerant quantum computers are still many years away
While Microsoft and Google battle each other with their artificial intelligence-powered chatbot-cum-search-engines called Bing and Bard, respectively, in a bid to stay relevant to users, Google is simultaneously racing to develop a fault-tolerant quantum computer.
The idea is to achieve the low error rates needed to use quantum computers for meaningful real-world applications, such as factoring large whole numbers into primes or understanding the detailed behaviour of chemical catalysts – problems that classical or traditional computers cannot solve in any practical amount of time.
The computers we use in our homes and offices today process information with bits (ones and zeroes), while quantum computers use qubits, which can exist as a combination of one and zero at the same time thanks to a property known as superposition.
Two bits in a normal computer can be in four possible states (00, 01, 10, or 11), but they can represent only one of these states at any given time. Two qubits, on the other hand, can represent all four of those states at once because of superposition. This is akin to four computers running simultaneously. Further, as you add more qubits, the power of your quantum computer grows exponentially: n qubits can hold 2^n states at once.
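To make that concrete, here is a minimal sketch in Python with NumPy (our illustration, not from Google's announcement): it builds the state vector of two qubits in equal superposition and shows how the number of amplitudes grows as 2^n.

```python
import numpy as np

# A single qubit state is a length-2 vector of complex amplitudes.
# (|0> + |1>) / sqrt(2) is an equal superposition of 0 and 1.
plus = np.array([1, 1]) / np.sqrt(2)

# Two qubits in superposition: the joint state is the tensor (Kronecker)
# product, a length-4 vector with equal weight on 00, 01, 10 and 11.
two_qubits = np.kron(plus, plus)
print(two_qubits)          # [0.5 0.5 0.5 0.5]

# n qubits require 2**n amplitudes -- the exponential growth mentioned above.
for n in (2, 10, 50):
    print(n, "qubits ->", 2**n, "amplitudes")
```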
Quantum computers also take advantage of another property of quantum physics called quantum entanglement. Albert Einstein referred to entanglement, a property that links quantum particles regardless of their location in the universe, as “spooky action at a distance". When you measure one qubit in an entangled pair, the outcome tells you what you will see when you measure the other. Quantum entanglement keeps qubits correlated in this way even if they are miles (or even millions of miles) apart.
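For readers who want to see that correlation rather than take it on faith, here is a small simulation in Python with NumPy (again our own illustration): it prepares the standard two-qubit Bell state and samples measurement outcomes, which always agree.

```python
import numpy as np

# Bell state (|00> + |11>) / sqrt(2): amplitudes over the outcomes 00, 01, 10, 11.
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)
probs = np.abs(bell) ** 2          # Born rule: probability of each outcome

# Simulate 10 measurements of both qubits.
rng = np.random.default_rng(0)
outcomes = rng.choice(["00", "01", "10", "11"], size=10, p=probs)
print(outcomes)   # only '00' or '11' ever appear: the two qubits' results always match
```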
In 2019, Google said its Sycamore quantum computer had achieved ‘quantum advantage or quantum supremacy’ by performing a calculation that would have taken classical computers thousands of years to perform. But in August 2022, Pan Zhang at the Chinese Academy of Sciences in Beijing and his colleagues wrote an improved algorithm for a non-quantum computer that could solve the random sampling problem much faster, challenging Google’s claim that a quantum computer was the only practical way to do it.
One step closer to a fault-tolerant quantum computer
Nevertheless, on Wednesday, Google researchers announced their second milestone on the path to a useful quantum computer: a demonstration that they could lower the error rate of calculations by making their error-correcting code bigger. The result was reported in Nature on February 22 in a paper titled 'Suppressing quantum errors by scaling a surface code logical qubit', and announced by Hartmut Neven, VP of Engineering, and Julian Kelly, director of quantum hardware, on behalf of the Google Quantum AI team.
As we saw above, quantum computers work by manipulating qubits, but qubits are very sensitive – even stray light can cause calculation errors. The problem escalates as quantum computers become bigger. This necessitates quantum error correction, which protects information by encoding it across multiple physical qubits to form "logical qubits". Physical qubits are the actual hardware qubits on the processor, whereas a logical qubit is a group of physical qubits treated as a single, more reliable qubit in computations.
As an example, we can have 100 physical qubits but only 10 logical qubits. Many physical qubits on a quantum processor act as one logical qubit in an error-correcting code called a surface code. "By encoding larger numbers of physical qubits on our quantum processor into one logical qubit, we hope to reduce the error rates to enable useful quantum algorithms," Sundar Pichai, CEO of Google and its parent Alphabet, wrote in a blog.
Google cites an example from communication to drive home the point: "Bob wants to send Alice a single bit that reads “1" across a noisy communication channel. Recognizing that the message is lost if the bit flips to “0", Bob instead sends three bits: “111". If one erroneously flips, Alice could take a majority vote (a simple error-correcting code) of all the received bits and still understand the intended message. Repeating the information more than three times — increasing the ‘size’ of the code — would enable it to tolerate more individual errors."
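The same majority-vote idea can be written out in a few lines of Python. This is a purely classical sketch of our own to illustrate the repetition code in Google's example; it is not the surface code used on the actual processor.

```python
from collections import Counter

def encode(bit: int, copies: int = 3) -> list[int]:
    """Repetition code: send the same bit several times."""
    return [bit] * copies

def decode(received: list[int]) -> int:
    """Majority vote over the received copies."""
    return Counter(received).most_common(1)[0][0]

sent = encode(1)        # Bob sends [1, 1, 1]
noisy = [1, 0, 1]       # the noisy channel flips one copy
print(decode(noisy))    # 1 -- Alice still recovers the intended message
```

Increasing `copies` to five or seven lets the code tolerate more flipped bits, which is the sense in which a bigger code suppresses more errors.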
A larger surface code better protects the logical information it contains. But as Google itself qualifies, it may be many more years before scientists are able to use fault-tolerant quantum computers for large-scale computations with applications across science and industry. "These quantum computers will be much bigger than today, consisting of millions of coherent quantum bits, or qubits," the company explained.
The ultimate goal, Google said, is to demonstrate a pathway to achieving the low error rates needed for using quantum computers in meaningful applications. According to its declared plan, Google has six milestones in all: quantum advantage was the first, the current announcement is the second, and the sixth is a machine made of one million physical qubits encoding 1,000 logical qubits. "With further improvements toward our next milestone, we anticipate entering the fault-tolerant regime, where we can exponentially suppress logical errors and unlock the first useful error-corrected quantum applications.
"In the meantime, we continue to explore various ways of solving problems using quantum computers in topics ranging from condensed matter physics to chemistry, machine learning, and materials science."
IBM's similar efforts
At its annual Quantum Summit in November, International Business Machines (IBM) Corp showcased its 433-qubit ‘Osprey’ processor, which it claimed "brings us a step closer to the point where quantum computers will be used to tackle previously unsolvable problems". IBM Osprey has the largest qubit count of any IBM quantum processor, more than tripling the 127 qubits on the IBM Eagle processor that was unveiled in 2021.
This processor, according to IBM, has the potential to run complex quantum computations well beyond the computational capability of any classical computer. For reference, the number of classical bits that would be necessary to represent a state on the IBM Osprey processor far exceeds the total number of atoms in the known universe.
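To put that claim in perspective, here is a back-of-the-envelope check (ours, not IBM's) in Python:

```python
# A full description of a 433-qubit state needs 2**433 complex amplitudes.
amplitudes = 2 ** 433
print(f"{amplitudes:.2e}")        # roughly 2.2e130
# The observable universe is estimated to hold on the order of 10**80 atoms.
print(amplitudes > 10 ** 80)      # True, by some 50 orders of magnitude
```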
IBM is also focusing on addressing noise in quantum computers, which remains a key barrier to wider adoption of the technology. To help with this, IBM released a beta update to Qiskit Runtime that lets a user trade speed for a reduced error count through a simple option in the application programming interface (API).
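As a rough illustration of what such an option looks like in practice, here is a sketch assuming the qiskit-ibm-runtime package roughly as it stood at the time, where the `resilience_level` option controls how much error mitigation (and therefore extra runtime) is applied. Exact class names and signatures have changed across versions, so treat this as indicative rather than definitive.

```python
# Sketch only: assumes qiskit and qiskit-ibm-runtime are installed and an
# IBM Quantum account has already been saved locally.
from qiskit import QuantumCircuit
from qiskit_ibm_runtime import QiskitRuntimeService, Session, Options, Sampler

# A small test circuit: a Bell pair with measurements.
circuit = QuantumCircuit(2)
circuit.h(0)
circuit.cx(0, 1)
circuit.measure_all()

service = QiskitRuntimeService()
options = Options()
options.resilience_level = 1       # higher levels: fewer errors, slower runs
options.optimization_level = 3

with Session(service=service, backend="ibmq_qasm_simulator") as session:
    sampler = Sampler(session=session, options=options)
    result = sampler.run(circuit).result()
    print(result.quasi_dists)
```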
As IBM's quantum systems scale towards the company's stated goal of more than 4,000 qubits by 2025 and beyond, IBM plans to go beyond the capabilities of existing physical electronics. It also updated the details of the new IBM Quantum System Two, a system designed to be modular and flexible, combining multiple processors into a single system with communication links. IBM aims to bring this system online by the end of 2023.
In an April 2022 interview, Dario Gil, senior VP and director of IBM Research, said, “We still haven't crossed the threshold of quantum advantage (the so-called quantum advantage or quantum supremacy is a point when a quantum system performs functions that today's classical computers cannot) but they are quantum computers, nonetheless...The error rate of the qubits is also improving tremendously (we can get to 10 to the power of minus 4 error rates). And the algorithms and software – the techniques we use for error mitigation and error correction – is also improving. If you combine all of this, (and) if you want to be conservative, we're going to see quantum advantage in this decade."
"We have seen AI-centric or GPU-centric supercomputers, and we are most definitely going to see quantum-centric supercomputers. This is how it may work out. Imagine a quantum computer with hundreds or thousands of qubits with a single cryostat (Heat creates error in qubits hence they need to be cooled to near absolute zero in a device called a cryostat that contains liquid helium), and now imagine a quantum data center with multiple cryostats in a data center.
“You could build a data center that has thousands or tens of thousands of qubits but the connection between these different cryostats in the first generation is classical. If you're smart enough to take a problem and partition the problem in such a way that you can run parallel workloads in the quantum machines and then connect them and stick them classically, you still incur an exponential cost in the classical piece but can still get to a good answer.
"The next step is to combine the field of quantum communications and quantum computing. It's a roadmap over the next 10-20 years, but we will see quantum supercomputers and they are going to work in concert with the current supercomputers."
IBM, meanwhile, also cautions that as quantum computers grow more powerful, it is crucial that technology providers take steps to protect their systems and data against a potential future quantum computer capable of breaking today's encryption standards.