IBM has made significant strides in quantum computing, aiming to demonstrate that useful calculations can be performed on quantum systems before fully error-corrected quantum computing arrives, something still expected to be years away.
The company is optimistic that improvements in hardware and software will make quantum systems more efficient and less error-prone. None of the newly announced changes is revolutionary on its own, but collectively the updates across the hardware and software stacks have made quantum calculations more reliable, enabling more advanced computations than ever before on IBM’s quantum hardware.
In its early efforts, IBM focused on increasing the number of qubits in its systems, eventually crossing the 1,000-qubit mark. However, the error rates of those systems made it clear that large-scale calculations were not yet feasible. To address this, IBM shifted its focus to improving the performance of smaller processors.
The latest announcement centers on the second revision of the Heron processor, which has 133 qubits. That is a qubit count beyond what classical computers can simulate by brute force, provided the qubits operate with sufficiently low error rates.
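To put that qubit count in perspective, here is a back-of-the-envelope estimate (an illustrative calculation, not something from IBM) of the memory a classical machine would need to store the full quantum state as a dense vector of complex amplitudes:

```python
# Rough memory estimate for brute-force classical simulation of an n-qubit state.
# Assumes a dense statevector with one complex128 (16-byte) amplitude per basis state;
# smarter classical methods can do far better for some circuits, so this only bounds
# the naive approach.
n_qubits = 133
amplitudes = 2 ** n_qubits          # number of basis states to track
bytes_needed = amplitudes * 16      # 16 bytes per complex double
print(f"{bytes_needed / 2**80:.2e} yobibytes")  # ~1.4e17 YiB, far beyond any real machine
```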
The key improvement in Heron Revision 2 addresses TLS (two-level system) errors, which had been limiting the coherence of the qubits. These errors arise when defects in the chip’s materials couple to nearby qubits, causing them to lose their quantum state.
IBM has made small adjustments to the operating frequencies of the qubits to mitigate these interactions, so the Heron chip holds its quantum states longer during calculations. That change, combined with a rewritten software stack that provides tighter control and faster operation, has produced significant performance gains, cutting the time required for a complex calculation from 122 hours to just a couple of hours.
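The frequency-tuning idea can be illustrated with a toy sketch (the numbers, margins, and search logic below are invented for illustration and are not IBM's calibration procedure): if a qubit's operating frequency sits too close to a known defect's resonance, nudge it until it is clear.

```python
# Toy illustration of avoiding TLS defects by retuning qubit operating frequencies.
# All values are hypothetical; real calibration must also respect couplings to
# neighboring qubits and the control hardware's limits.

def retune(qubit_freq_ghz, defect_freqs_ghz, margin_ghz=0.01, step_ghz=0.002):
    """Shift a qubit frequency until it sits at least `margin_ghz` from every defect."""
    freq = qubit_freq_ghz
    while any(abs(freq - d) < margin_ghz for d in defect_freqs_ghz):
        freq += step_ghz  # nudge upward until clear of all defect resonances
    return freq

defects = [4.712, 4.838]            # hypothetical TLS resonances (GHz)
print(retune(4.705, defects))       # moves the qubit just past the 4.712 GHz defect
```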
Despite this progress, errors remain a challenge for any substantial quantum calculation. To combat them, IBM is focusing on error mitigation rather than full error correction. The company runs a computation with the processor’s noise deliberately amplified to several known levels, measures the results at each level, and uses those measurements to estimate what the system would have produced in the absence of noise.
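The estimation step amounts to extrapolating measurements taken at known noise levels back toward zero noise. Here is a minimal sketch of that idea with made-up numbers, assuming a simple linear fit; IBM's production techniques are considerably more sophisticated:

```python
import numpy as np

# Minimal zero-noise-extrapolation sketch: run the same circuit with the noise
# deliberately amplified by known factors, then fit the measured expectation
# values and read off what the fit predicts at zero noise.
noise_factors = np.array([1.0, 1.5, 2.0, 3.0])   # how much the noise was scaled up
measured = np.array([0.82, 0.74, 0.66, 0.51])    # made-up noisy expectation values

slope, intercept = np.polyfit(noise_factors, measured, deg=1)
zero_noise_estimate = intercept                  # value the fit predicts at zero noise
print(f"estimated noiseless value: {zero_noise_estimate:.3f}")
```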
IBM has optimized this process with algorithmic improvements and GPU acceleration, allowing error mitigation to be applied to somewhat larger quantum circuits. The classical post-processing remains computationally expensive, but these optimizations have made error mitigation practical for certain tasks.
One of IBM’s achievements with these methods is a successful simulation of the Ising model, a simple model of interacting quantum spins, using circuits that perform 5,000 quantum operations. The milestone shows that IBM’s quantum system can produce reasonable results at that scale and suggests that quantum computing is becoming a viable tool for scientific research.
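For a sense of what such a simulation circuit looks like, here is a small Trotterized transverse-field Ising circuit written with Qiskit (assumed to be installed); the qubit count, angles, and depth are arbitrary and far smaller than the 5,000-operation circuits described above:

```python
from qiskit import QuantumCircuit

# Toy Trotterized evolution of a 1D transverse-field Ising chain.
# Each Trotter step applies ZZ interactions between neighboring qubits followed by
# an X-field rotation on every qubit; repeating the step approximates continuous
# time evolution under the Ising Hamiltonian.
n_qubits, n_steps = 6, 10
j_angle, h_angle = 0.3, 0.2             # arbitrary coupling and field angles

qc = QuantumCircuit(n_qubits)
for _ in range(n_steps):
    for q in range(n_qubits - 1):
        qc.rzz(2 * j_angle, q, q + 1)   # nearest-neighbor ZZ coupling
    for q in range(n_qubits):
        qc.rx(2 * h_angle, q)           # transverse-field rotation
qc.measure_all()

print(qc.count_ops())  # tally of gate operations in this much smaller toy circuit
```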
However, IBM VP Jay Gambetta cautioned that quantum computers cannot yet consistently outperform classical systems. Determining when quantum computing will surpass classical methods, he added, remains a complex scientific challenge that will require further research and iteration over the next few years.