IBM Aims to Build a 100,000-Qubit Quantum Computer Within 10 Years
Category Technology Friday - May 26 2023, 03:07 UTC - 1 year ago IBM announced on May 22 at the G7 summit in Hiroshima, Japan, its plan to build a 100,000-qubit machine within the next 10 years. It will partner with the University of Tokyo and the University of Chicago in a $100 million initiative to push quantum computing into the realm of full-scale operation. Building a quantum computer of that size will require advances in error correction, qubit performance, and software-led error mitigation.
Late last year, IBM took the record for the largest quantum computing system with a processor that contained 433 quantum bits, or qubits, the fundamental building blocks of quantum information processing. Now, the company has set its sights on a much bigger target: a 100,000-qubit machine that it aims to build within 10 years.
IBM made the announcement on May 22 at the G7 summit in Hiroshima, Japan. The company will partner with the University of Tokyo and the University of Chicago in a $100 million initiative to push quantum computing into the realm of full-scale operation, where the technology could potentially tackle pressing problems that no standard supercomputer can solve.
Or at least it can’t solve them alone. The idea is that the 100,000 qubits will work alongside the best "classical" supercomputers to achieve new breakthroughs in drug discovery, fertilizer production, battery performance, and a host of other applications. "I call this quantum-centric supercomputing," IBM’s VP of quantum, Jay Gambetta, told MIT Technology Review in an in-person interview in London last week.
Quantum computing holds and processes information in a way that exploits the unique properties of fundamental particles: electrons, atoms, and small molecules can exist in multiple energy states at once, a phenomenon known as superposition, and the states of particles can become linked, or entangled, with one another. This means that information can be encoded and manipulated in novel ways, opening the door to a swath of classically impossible computing tasks.

As yet, quantum computers have not achieved anything useful that standard supercomputers cannot do. That is largely because they haven't had enough qubits and because the systems are easily disrupted by tiny perturbations in their environment that physicists call noise.
Researchers have been exploring ways to make do with noisy systems, but many expect that quantum systems will have to scale up significantly to be truly useful, so that they can devote a large fraction of their qubits to correcting the errors induced by noise.
IBM is not the first to aim big. Google has said it is targeting a million qubits by the end of the decade, though error correction means only 10,000 will be available for computations. Maryland-based IonQ is aiming to have 1,024 "logical qubits," each of which will be formed from an error-correcting circuit of 13 physical qubits, performing computations by 2028. Palo Alto–based PsiQuantum, like Google, is also aiming to build a million-qubit quantum computer, but it has not revealed its time scale or its error-correction requirements.
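The error-correction overheads behind these targets can be made concrete with a toy calculation. The figures below are the round numbers cited in this article, not vendor specifications, and the helper function is purely illustrative:

```python
# Toy calculation of error-correction overhead, using the figures
# cited above (illustrative round numbers, not vendor specs).

def physical_qubits_needed(logical_qubits: int, physical_per_logical: int) -> int:
    """Physical qubits required if each logical qubit is encoded
    in a fixed-size error-correcting circuit."""
    return logical_qubits * physical_per_logical

# IonQ's stated 2028 target: 1,024 logical qubits,
# each built from 13 physical qubits.
print(physical_qubits_needed(1024, 13))  # 13312 physical qubits

# Google's framing, inverted: of 1,000,000 physical qubits,
# only 10,000 would be available for computation,
# implying roughly 100 physical qubits per usable one.
print(1_000_000 // 10_000)  # 100
```

The point of the arithmetic is that headline qubit counts shrink dramatically once error correction is accounted for, which is why the companies' targets are hard to compare directly.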
Because of those requirements, citing the number of physical qubits is something of a red herring—the particulars of how they are built, which affect factors such as their resilience to noise and their ease of operation, are crucially important. The companies involved usually offer additional measures of performance, such as "quantum volume" and the number of "algorithmic qubits." Over the next decade, progress will hinge on advances in error correction, qubit performance, and software-led error "mitigation."