In the next couple of years, quantum computers may outperform classical ones on specific tasks, achieving the so-called “quantum advantage”, said Jay M. Gambetta, vice-president of quantum computing at International Business Machines (IBM) Corp.
However, achieving this “advantage,” according to him, will require improving both the hardware and the software platform, and engaging scientists in developing new algorithms for these systems.
“We’ve reached a point where our quantum computers can’t be simulated using classical computers. This stage is what we refer to as quantum utility (implying that researchers can use current quantum computers to explore meaningful scientific problems),” Gambetta told Mint in a recent interview during his India visit. He added, though, that this doesn’t imply the existence of an algorithm that outperforms classical methods.
“You can think of it as a scientific resource, a foundation for future work. We’ve surpassed 100 qubits and 3,000 quantum gates, putting significant effort into enhancing Qiskit—our software platform—so it can support these advancements,” he explained.
Quantum gates are the building blocks of quantum circuits, analogous to logic gates in classical computers: each gate is an operation that transforms the state of one or more qubits. Sequences of gates allow quantum computers to perform certain calculations more efficiently than classical computers.
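As a toy illustration (a hand-rolled sketch, not IBM’s Qiskit stack), a single qubit can be represented as two complex amplitudes and a gate as a 2×2 matrix acting on them. The snippet below applies a Hadamard gate to a qubit in the |0⟩ state, putting it into an equal superposition:

```python
import math

def apply_gate(gate, state):
    """Multiply a 2x2 gate matrix into a 2-element state vector."""
    a, b = state
    return (gate[0][0] * a + gate[0][1] * b,
            gate[1][0] * a + gate[1][1] * b)

# Hadamard gate: maps |0> to an equal superposition of |0> and |1>.
H = [[1 / math.sqrt(2), 1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

state = (1.0, 0.0)            # start in |0>
state = apply_gate(H, state)  # now (1/sqrt(2), 1/sqrt(2))

# Squared amplitudes give measurement probabilities: ~50/50.
probs = (abs(state[0]) ** 2, abs(state[1]) ** 2)
```

A real quantum program composes thousands of such gates into a circuit, which is what the gate counts quoted above refer to.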
Gambetta, who holds a Ph.D. in physics, was designated an IBM Fellow (the company’s highest honour; it has been awarded to just 338 employees since its founding in 1911, with 89 active as of March 13) in 2018 for his leadership in advancing superconducting quantum computing and shaping IBM’s quantum strategy.
IBM’s quantum strategy, according to Gambetta, involves consistently improving the hardware and enabling algorithms that fit within their error-mitigated systems for now. Gambetta explained that IBM has used error mitigation since 2017 to enhance accuracy, enabling more complex circuits. Error correction, expected by 2029, will further deepen circuits without disrupting user experience, much like in classical computing.
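Error mitigation covers several techniques; one widely used example is zero-noise extrapolation, in which the same circuit is run at deliberately amplified noise levels and the measured result is extrapolated back to the zero-noise limit. The sketch below is a generic illustration with made-up numbers, not IBM’s exact implementation:

```python
def zero_noise_extrapolate(points):
    """Fit a least-squares straight line through
    (noise_scale, measured_value) pairs and return its value
    at noise scale 0 -- the estimated noiseless result."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    return (sy - slope * sx) / n  # intercept at noise scale 0

# Hypothetical measurements: noise at amplification factors 1x and
# 3x pulls the observed expectation value below its ideal value.
measured = [(1.0, 0.90), (3.0, 0.70)]
mitigated = zero_noise_extrapolate(measured)  # extrapolates to 1.0
```

The trade-off is extra circuit runs per result, which is why error correction, once practical, is expected to supersede mitigation for deep circuits.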
Achieving reliable quantum systems will require not only good qubits but also effective codes, gates, measurements, and state preparation, he explained. “While much of the foundational science is solved, scaling to larger, cost-effective, and reliable systems is the next engineering challenge,” Gambetta said.
“Eventually, users won’t need to worry about whether error correction is in place—they’ll simply have access to a growing library of quantum applications. The focus remains on balancing hardware advances with algorithmic innovations to unlock practical quantum computing,” Gambetta explained.
A year ago, for instance, most users were working with only about 10 qubits. Now they’re using systems with 100 qubits and more. IBM itself, which introduced Condor, a 1,121-qubit superconducting quantum processor, in December 2023, aims to build a 100,000-qubit system by 2033, for which it is collaborating with the University of Tokyo and the University of Chicago.
A qubit, or quantum bit, is the basic unit of information in quantum computing. Unlike a classical bit, which is always either 0 or 1, a qubit can exist in a superposition of both states, which lets quantum computers encode and process information in ways classical machines cannot for certain computations. India, too, plans to develop quantum computers with 50-100 qubits in about five years as part of the National Quantum Mission, and to scale to 1,000 qubits and beyond within eight years.
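A rough back-of-the-envelope sketch (my own illustration, not a figure from the interview) shows why qubit counts in the hundreds put systems beyond classical simulation: a full description of an n-qubit state needs 2^n complex amplitudes, so the memory required doubles with every added qubit:

```python
def statevector_bytes(n_qubits, bytes_per_amplitude=16):
    """Memory needed to hold a full n-qubit state vector,
    assuming one 16-byte complex number per amplitude."""
    return (2 ** n_qubits) * bytes_per_amplitude

# 30 qubits fit in roughly 17 GB of RAM; 50 qubits already need
# about 18 petabytes -- and 100 qubits are astronomically beyond
# any classical machine.
gb_30 = statevector_bytes(30) / 1e9   # ~17.2 GB
pb_50 = statevector_bytes(50) / 1e15  # ~18.0 PB
```

This exponential growth is what underlies the “quantum utility” claim: beyond roughly 50-100 qubits, brute-force classical simulation of the full state becomes infeasible.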
The focus will also be on developing quantum error correction (QEC) to make quantum computers stable and functional for everyday use, and on building quantum algorithms for practical applications. That said, the Tata Institute of Fundamental Research’s (TIFR) six-qubit quantum computer is dwarfed by IBM’s Condor, or even Atom Computing’s neutral-atom quantum computer with 1,180 qubits.
“If you put the bet just on the qubits, I don’t think you’re maximizing the success. Success shouldn’t rely solely on qubits; components, software, and algorithms are equally crucial,” Gambetta said. India, according to him, has the potential to accelerate algorithm discovery and hardware development.
“While scaling quantum systems requires significant engineering, the real opportunity lies in building key components—like amplifiers and controllers—forming a ‘quantum critical supply chain’ that doesn’t yet exist,” he explained, adding, “Every time I come here (to India), I’m impressed with the talent”.
Superconducting qubits
Quantum computers use various types of qubits—superconducting qubits, trapped ion qubits, quantum dots, photons, and neutral atoms. IBM, according to Gambetta, favours superconducting qubits for their balance of scalability, quality, and speed.
Although other technologies like ions and neutral atoms existed earlier, superconducting qubits have evolved rapidly since their first demonstration in 1999. IBM believes this technology offers the best potential to scale and operate efficiently.
McKinsey’s latest analysis for the third annual Quantum Technology Monitor highlights that chemicals, life sciences, finance, and mobility are poised to benefit first from quantum computing, potentially generating up to $2 trillion in value by 2035.
That said, the next big challenge, according to Gambetta, is encouraging more algorithm research. “Specifically, we need to implement reliable heuristic algorithms—practical, trial-and-error-based methods—on quantum hardware. Doing so will accelerate the path to quantum advantage by helping us identify useful tasks that quantum computers can perform better than classical ones,” he explained.
According to him, the “Condor” phase focused on “solving scaling, packaging, and manufacturing challenges”. While Condor wasn’t ideal for algorithm research, it helped prove the hardware’s scalability. The next phase, “Heron”, emphasizes improved performance by increasing gate complexity and lowering error rates, even with fewer qubits, according to Gambetta. He added that the focus now will shift to modularity, and connecting multiple quantum processors to enhance scalability.
Error per layered gate
IBM, according to Gambetta, has also introduced a new metric called error per layered gate (EPLG), alongside Quantum Volume, to evaluate how well circuits are compiled from abstract to physical form. Quantum Volume, which measures how large a quantum circuit the hardware can run successfully, has been doubling every year—a trend dubbed “Gambetta’s law”, akin to “Moore’s law” in traditional computing.
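That doubling trend can be sketched as simple arithmetic (an illustrative projection with a hypothetical baseline, not official IBM figures): if Quantum Volume doubles annually, it grows from a starting value as baseline × 2^years:

```python
def projected_qv(baseline_qv, years):
    """Project Quantum Volume forward under the 'Gambetta's law'
    assumption of one doubling per year from a given baseline."""
    return baseline_qv * (2 ** years)

# With a hypothetical baseline of QV 64, five years of annual
# doubling would reach QV 2048 -- a 32-fold increase.
qv_in_5_years = projected_qv(64, 5)
```

As with Moore’s law, this is an observed industry trend rather than a physical guarantee, and sustaining it is precisely the engineering challenge Gambetta describes.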
IBM plans to demonstrate error correction on the Starling processor by 2029 and release the BlueJay processor (a 2,000 qubit quantum processing unit, or QPU, capable of circuits with a billion gates) by 2033. BlueJay aims to push technological boundaries, requiring innovations in cryogenics, controls, and quantum interconnects that do not yet exist, according to Gambetta.
IBM, according to Gambetta, anticipates the need for “modular cryogenics that allow multiple fridges to operate independently, advanced Application-Specific Integrated Circuit (ASIC)-based controls, and dense quantum interconnects”. Research on these areas must begin now to ensure readiness by 2033. Collaboration with universities worldwide focuses on both technical challenges and training future talent in quantum algorithms and system design.
The company, meanwhile, is working with universities in Chicago, South Korea, and Japan to train 40,000 students in quantum algorithms. There is a growing demand for individuals with a background in applied mathematics and computer science who can transition into quantum fields.
While IBM continues to hire physicists and engineers, the greatest challenge lies in finding developers who understand both the mathematical underpinnings and practical applications of quantum computing, Gambetta acknowledged.
Also, given that quantum computing draws from diverse fields, combining microwave engineering, solid-state physics, and software development, he said that IBM emphasises continuous learning cycles and systems thinking to drive progress. Mentorship plays a crucial role in adapting to these evolving demands, he added.
Interestingly, Gambetta believes room-temperature operation isn’t necessary for quantum systems, “as the energy required to cool them is minimal compared to the energy needed to control the electronics”. The real challenge, according to him, “lies in scaling—as systems grow, even small failure probabilities can add up”.
“Scaling the electronics and reducing their cost is crucial, as the current FPGA-based systems (field-programmable gate arrays, or FPGAs, are integrated circuits sold off-the-shelf) are expensive per qubit. Future solutions may involve cold ASICs and more affordable, high-quality electronics,” he explained.