Quantinuum scientists making adjustments to a beam line array used to deliver laser pulses in H-Series quantum computers. Photo courtesy of Quantinuum.
Today marks a major achievement for the entire quantum ecosystem: Microsoft and Quantinuum have demonstrated the most reliable logical qubits on record. By applying Microsoft’s breakthrough qubit-virtualization system, with error diagnostics and correction, to Quantinuum’s ion-trap hardware, we ran more than 14,000 individual experiments without a single error. We also demonstrated more reliable quantum computation by performing error diagnosis and correction on logical qubits without destroying them. This finally moves us beyond the current noisy intermediate-scale quantum (NISQ) level to Level 2 Resilient quantum computing.
This is a crucial milestone on our path to building a hybrid supercomputing system that can transform research and innovation across many industries. It is made possible by the collective advancement of quantum hardware, qubit virtualization and correction, and hybrid applications that take advantage of the best of AI, supercomputing, and quantum capabilities. With a hybrid supercomputer powered by 100 reliable logical qubits, organizations would start to see scientific advantage, while scaling closer to 1,000 reliable logical qubits would unlock commercial advantage.
Advanced capabilities based on these logical qubits will be available in private preview for Azure Quantum Elements customers in the coming months.
A purpose-built computing platform for science
Many of the hardest problems facing society, such as reversing climate change, addressing food insecurity and solving the energy crisis, are chemistry and materials science problems. However, the number of possible stable molecules and materials may surpass the number of atoms in the observable universe. Even a billion years of classical computing would be insufficient to explore and evaluate them all.
That’s why the promise of quantum is so appealing. Scaled quantum computers would offer the ability to simulate the interactions of molecules and atoms at the quantum level beyond the reach of classical computers, unlocking solutions that can be a catalyst for positive change in our world. But quantum computing is just one layer for driving these breakthrough insights.
Whether the goal is to supercharge pharmaceutical productivity or pioneer the next sustainable battery, accelerating scientific discovery requires a purpose-built, hybrid compute platform. Researchers need access to the right tool at the right stage of their discovery pipeline to efficiently solve every layer of a scientific problem and drive insights where they matter most. This is what we built with Azure Quantum Elements, empowering organizations to transform research and development with capabilities including screening massive data sets with AI, narrowing down options with high-performance computing (HPC), and, in the future, improving model accuracy with the power of scaled quantum computing.
We continue to advance the state of the art across all of these hybrid technologies for our customers, with today’s quantum milestone laying the foundation for useful, reliable, and scalable simulations of quantum mechanics.
Moving toward resilience
In an article I wrote on LinkedIn, I used a ‘leaky boat’ analogy to explain why fidelity and error correction are so important to quantum computing. In short, fidelity is the value we use to measure how reliably a quantum computer can produce a meaningful result. Only with good fidelity will we have a solid foundation to reliably scale a quantum machine that can solve practical, real-world problems.
For years, one approach to patching this leaky boat has been to increase the number of noisy physical qubits and apply techniques that compensate for the noise, but this falls short of true logical qubits with superior error rates. The main shortcoming of most of today’s NISQ machines is that their physical qubits are too noisy and error-prone to make robust quantum error correction possible. The industry’s foundational components are simply not yet good enough for quantum error correction to work, which is why even larger NISQ systems are not practical for real-world applications.
The task at hand for the entire quantum ecosystem is to increase the fidelity of qubits and enable fault-tolerant quantum computing so that we can use a quantum machine to unlock solutions to previously intractable problems. In short, we need to transition to reliable logical qubits, created by combining multiple physical qubits into a single logical qubit that protects against noise and sustains a long (i.e., resilient) computation. This can only be achieved through careful hardware and software co-design. With high-quality hardware components and breakthrough error-handling capabilities designed for that machine, we can get better results than any individual component could deliver on its own. Today, we’ve done just that.
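To see why combining physical qubits helps, consider a toy model (illustrative only; the actual Microsoft and Quantinuum scheme is far more sophisticated): a distance-3 repetition code with majority-vote decoding turns a physical error rate p into a logical error rate of roughly 3p², so for any p below 50% the encoded qubit is more reliable than its parts, and for small p the improvement is dramatic.

```python
# Toy illustration (not the actual Microsoft/Quantinuum scheme): a
# distance-3 repetition code with majority-vote decoding suppresses
# errors whenever the physical error rate p is below 50%.
from itertools import product

def logical_error_rate(p: float, n: int = 3) -> float:
    """Probability that a majority of n independent physical qubits flip."""
    rate = 0.0
    for flips in product([0, 1], repeat=n):
        if sum(flips) > n // 2:  # majority flipped -> logical error
            prob = 1.0
            for f in flips:
                prob *= p if f else (1.0 - p)
            rate += prob
    return rate

p = 0.002  # a physical error rate comparable to a 99.8% gate fidelity
print(logical_error_rate(p))  # roughly 3*p**2, far below p itself
```

With p = 0.002, this toy logical rate lands near 1.2 × 10⁻⁵, more than two orders of magnitude below the physical rate; real codes achieve stronger suppression at the cost of more qubits and more elaborate decoding.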
“Breakthroughs in quantum error correction and fault tolerance are important for realizing the long-term value of quantum computing for scientific discovery and energy security. Results like these enable continued development of quantum computing systems for research and development.”
Dr. Travis Humble, Director, Quantum Science Center, Oak Ridge National Laboratory
A breakthrough for handling quantum errors
That’s why today is such a historic moment: for the first time on record as an industry, we’re advancing from Level 1 Foundational to Level 2 Resilient quantum computing. We’re now entering the next phase of solving meaningful problems with reliable quantum computers. Our qubit-virtualization system, which filters and corrects errors, combined with Quantinuum’s hardware demonstrates the largest gap between physical and logical error rates reported to date. This is the first demonstrated system with four logical qubits in which the logical error rate improves on the physical error rate by such a wide margin.
This gap puts us far below the break-even point, within a regime where quantum error correction is both effective and valuable.
Just as importantly, we can now diagnose and correct errors in the logical qubits without destroying them, a process referred to as “active syndrome extraction.” This represents a major step forward for the industry, as it enables more reliable quantum computation.
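The idea behind syndrome extraction can be sketched classically with the textbook 3-qubit bit-flip code (a toy illustration, not the code used in this work): parity checks pinpoint where an error occurred without ever reading out, and therefore without destroying, the encoded logical value.

```python
# Toy classical sketch of syndrome extraction on the 3-qubit bit-flip
# code: two parity checks reveal *where* a single error occurred without
# revealing (and thus destroying) the encoded logical value itself.
def syndrome(bits):
    """Parity checks on qubit pairs (0,1) and (1,2)."""
    return (bits[0] ^ bits[1], bits[1] ^ bits[2])

def correct(bits):
    """Use the syndrome to locate and undo a single bit flip."""
    flip = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(syndrome(bits))
    if flip is not None:
        bits[flip] ^= 1
    return bits

encoded = [1, 1, 1]      # logical 1 encoded as three copies
encoded[2] ^= 1          # inject a single bit-flip error
print(correct(encoded))  # -> [1, 1, 1]: error fixed, logical value intact
```

Note that the syndrome (0, 1) is the same whether the encoded value was a logical 0 or a logical 1; the checks identify the error without measuring the data. In a real device, these parity checks are performed with ancilla qubits and mid-circuit measurement rather than by inspecting the data qubits.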
With this system, we ran more than 14,000 individual experiments without a single error. You can read more about these results here.
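As a statistical sanity check (our own back-of-envelope reading, not a figure from the announcement): observing zero failures in 14,000 independent trials bounds the per-run error rate via the standard “rule of three,” which gives a one-sided 95% upper confidence bound of roughly 3/n.

```python
# With zero observed failures in n trials, the exact one-sided 95%
# upper confidence bound on the failure probability p solves
# (1 - p)**n = 0.05, which is well approximated by 3/n ("rule of three").
n = 14_000
exact = 1 - 0.05 ** (1 / n)
rule_of_three = 3 / n
print(f"exact bound ~ {exact:.2e}, rule of three ~ {rule_of_three:.2e}")
```

Both bounds come out near 2.1 × 10⁻⁴ per run; the actual per-operation logical error rates claimed in the technical results are derived differently and are much lower.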
“Quantum error correction often seems very theoretical. What’s striking here is the massive contribution Microsoft’s midstack software for qubit optimization is making to the improved step-down in error rates. Microsoft really is putting theory into practice.”
Dr. David Shaw, Chief Analyst, Global Quantum Intelligence
A long-standing collaboration with Quantinuum
Since 2019, Microsoft has been collaborating with Quantinuum to enable quantum developers to write and run their own quantum code on ion-trap qubit technology, which offers high-fidelity gates, full qubit connectivity, and mid-circuit measurement. Multiple published benchmark tests recognize Quantinuum as having the highest quantum volume, making them well positioned to enter Level 2.
“Today’s results mark a historic achievement and are a wonderful reflection of how this collaboration continues to push the boundaries for the quantum ecosystem. With Microsoft’s state-of-the-art error correction aligned with the world’s most powerful quantum computer and a fully integrated approach, we are so excited for the next evolution in quantum applications and can’t wait to see how our customers and partners will benefit from our solutions especially as we move towards quantum processors at scale.”
Ilyas Khan, Founder and Chief Product Officer, Quantinuum
Quantinuum’s hardware achieves a physical two-qubit gate fidelity of 99.8%. This level of fidelity enables the application of our qubit-virtualization system, with its error diagnostics and correction, and makes today’s announcement possible. This quantum system, co-innovated by Microsoft and Quantinuum, ushers us into Level 2 Resilient.
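Putting the headline numbers together (a back-of-envelope illustration using the 800x figure from this post’s title, not a quoted result): a 99.8% two-qubit fidelity corresponds to a physical error rate of 2 × 10⁻³, so an 800x improvement would place the logical error rate near 2.5 × 10⁻⁶.

```python
# Back-of-envelope arithmetic: a 99.8% two-qubit fidelity implies a
# physical error rate of 2e-3 per gate; the 800x improvement cited in
# the headline would put the logical error rate near 2.5e-6.
physical_fidelity = 0.998
physical_error = 1 - physical_fidelity  # 2.0e-3
logical_error = physical_error / 800    # ~2.5e-6
print(physical_error, logical_error)
```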
Pioneering quantum supercomputing, together
At Microsoft, our mission is to empower every individual and organization to achieve more. We’ve brought the world’s best NISQ hardware to the cloud with our Azure Quantum platform so our customers can embark on their quantum journey. This is why we’ve integrated artificial intelligence with quantum computing and cloud HPC in the private preview of Azure Quantum Elements. We used this platform to design and demonstrate an end-to-end workflow that integrates Copilot, Azure compute and a quantum algorithm running on Quantinuum processors to train an AI model for property prediction.
Today’s announcement continues this commitment by advancing quantum hardware to Level 2. Advanced capabilities based on these logical qubits will be available in private preview for Azure Quantum Elements in the coming months.
Lastly, we continue to invest heavily in progressing beyond Level 2, scaling to the level of quantum supercomputing. This is why we’ve been advocating for our topological approach, the feasibility of which our Azure Quantum team has demonstrated. At Level 3, we expect to be able to solve some of our most challenging problems, particularly in fields like chemistry and materials science, unlocking new applications that bring quantum at scale together with the best of classical supercomputing and AI — all connected in the Azure Quantum cloud.
We are excited to empower the collective genius and make these breakthroughs accessible to our customers. For more details on how we achieved today’s results, explore our technical blog, and register for the upcoming Quantum Innovator Series with Quantinuum.
The post Advancing science: Microsoft and Quantinuum demonstrate the most reliable logical qubits on record with an error rate 800x better than physical qubits appeared first on The Official Microsoft Blog.