Quantum Nuggets: Riverlane’s 2024 QEC Study, IBM’s V-score, Twisted Semiconductors

Quantum error correction specialist Riverlane today released a fascinating report — The Quantum Error Correction Report 2024 — that’s worth scanning; IBM and 28 collaborators last week released the V-score, a new metric for comparing quantum versus classical methods used to solve many-body problems; and three reports suggest twisted semiconductors hold promise for fault-tolerant quantum computing.

Let’s start with Riverlane’s QEC report.

Goodbye NISQ, Hello QEC Era?

The underlying theme is that the NISQ (noisy intermediate-scale quantum) era is ending and the QEC era is beginning.

Today’s quantum computers have high error rates – roughly around one error in every hundred operations: “Once we reduce this to one in a million, we will start unlocking applications that are intractable on today’s supercomputers. For quantum computers to be useful, error rates must be as low as one in a trillion. Qubit and/or quantum algorithm improvements alone will not be enough to reliably run algorithms with millions or billions of operations,” contends the report.
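
To put those numbers in perspective, here is some back-of-the-envelope arithmetic (our illustration, not the report's): if each operation fails independently with probability p, a circuit of M operations runs error-free with probability roughly (1 − p)^M ≈ e^(−pM).

    import math

    def success_probability(error_rate: float, num_ops: int) -> float:
        """Probability an entire circuit runs error-free, assuming
        independent errors at a fixed per-operation rate."""
        return math.exp(-error_rate * num_ops)

    # Today's hardware (~1 error per 100 ops): a million-op circuit is hopeless.
    print(success_probability(1e-2, 1_000_000))       # ~0.0
    # At 1 error in a million, the same circuit succeeds only ~37% of the time.
    print(success_probability(1e-6, 1_000_000))       # ~0.37
    # At 1 error in a trillion, even a billion-op circuit almost always succeeds.
    print(success_probability(1e-12, 1_000_000_000))  # ~0.999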

To accomplish that, QEC will be needed; error mitigation techniques won’t be enough. QEC, of course, is Riverlane’s bailiwick. It is one of a handful of quantum companies focused exclusively on quantum error correction, and it is building Deltaflow, a QEC stack intended to solve this problem for all quantum computers and qubit types. Riverlane touts that “Deltaflow’s core is the world’s most powerful quantum error decoder. Deltaflow is powered by a new class of patented QEC semiconductors designed and built by Riverlane.”

So the caveat here is that Riverlane has a vested interest in promoting the QEC effort. That said, few would disagree with the QEC imperative; basically all of the quantum computing hardware developers and a small army of academic and government lab researchers are also attacking the problem. The Riverlane report, though not overly granular, covers a lot of ground and includes a variety of industry voices.

Fault-tolerant quantum computing requires operating below what’s called the QEC threshold: physical error rates low enough that error correction can keep up with the rate at which errors occur. Below threshold, adding redundancy suppresses logical errors rather than amplifying them. The report has a good and accessible discussion of the QEC threshold, including its limitations.
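
The threshold idea can be made concrete with the rule-of-thumb surface-code scaling law: below threshold, the logical error rate falls exponentially with code distance d, roughly p_L ≈ A(p/p_th)^((d+1)/2). The numbers below (p_th ≈ 1%, A ≈ 0.1) are illustrative assumptions, not figures from the report.

    def logical_error_rate(p: float, d: int, p_th: float = 1e-2,
                           prefactor: float = 0.1) -> float:
        """Rule-of-thumb surface-code logical error rate per round for
        physical error rate p and code distance d (assumed constants)."""
        return prefactor * (p / p_th) ** ((d + 1) / 2)

    # Below threshold (p = 0.1%), each increase in distance helps exponentially:
    for d in (3, 7, 11, 15):
        print(d, logical_error_rate(1e-3, d))  # 1e-3, 1e-5, 1e-7, 1e-9
    # Above threshold (p = 2%), adding qubits makes the logical qubit *worse*
    # (the formula stops being a probability here, but the trend is the point):
    print(logical_error_rate(2e-2, 11))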

In the past few years, QEC has made rather dramatic advances (a timeline figure in the report covers 2024 alone), prompting the report’s writers to say we’re on the verge of entering the early QEC era, described loosely as a transitional period at the end of NISQ, when errors are controlled mostly by mitigation techniques, blending into the era when QEC is the driver.

There are several ways to implement error correction, generally using redundant physical qubits to form logical qubits. The additional qubits are used to detect errors without interfering with the ongoing quantum calculation or violating various quantum tenets (no cloning, for example). The report broadly examines some of the QEC methods (surface codes, color codes, qLDPC codes, the Bacon-Shor code, bosonic codes, and MBQC approaches) and how they work. A fair amount of classical compute power is also needed to keep up with the error-correcting workload, and there is currently debate over whether that is best handled by FPGAs or ASICs.
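
As a toy illustration of the redundancy idea (a minimal sketch, far simpler than the codes the report covers), the three-qubit repetition code catches a single bit flip by measuring parities between neighboring qubits, the so-called syndrome, without ever reading out the encoded state itself:

    def decode_bit_flip(syndrome: tuple[int, int]) -> int | None:
        """Majority-vote decoder for the 3-qubit repetition code.
        syndrome = (parity of qubits 0 and 1, parity of qubits 1 and 2).
        Returns the index of the flipped qubit, or None for no error."""
        return {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}[syndrome]

    assert decode_bit_flip((0, 0)) is None  # clean syndrome: nothing to fix
    assert decode_bit_flip((1, 1)) == 1     # both checks trip: middle qubit flipped

Decoders for surface codes solve a much harder matching problem on every round, under tight latency budgets, which is why the FPGA-versus-ASIC question matters.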

Here are the key report highlights as called out by Riverlane:

  • The QEC era is here: spurred by the recent publication of impressive experimental results, 2024 marks the worldwide alignment of governments, investors and quantum computing companies on the need for real-time quantum error correction and clear steps for getting there.
  • For quantum computers to scale, so must QEC: we now require a new breed of classical QEC technologies to reduce the cost-per-qubit and increase power efficiency while continuing to improve fidelity at scale. These technologies must include more resource-efficient QEC codes and perform error correction rounds at fast (<1µs) and deterministic speeds while dealing with massive data volumes (scaling to 100TB/s; see the sketch after this list).
  • Quantum hardware companies are embracing QEC: almost two-thirds (66%) of companies are now actively implementing QEC or have a heavy focus on QEC as a differentiating element of their operation. This typically involves setting up an internal division or research group dedicated to QEC.
  • New metrics, new era: traditional benchmarks are becoming obsolete. QuOps (reliable Quantum Operations) are a good measure of quantum computing performance, but future machines need new metrics and application-specific benchmarks.
  • The future breakthrough target: within two to three years, we will arrive at the MegaQuOp (million QuOps) inflexion point when quantum computers will surpass the reach of supercomputers.
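
The 100TB/s figure above is easier to appreciate with rough arithmetic. The assumptions below (machine size, round time, raw readout volume per qubit) are ours for illustration; the report does not spell them out.

    num_qubits = 1_000_000          # assumed future machine scale
    rounds_per_second = 1_000_000   # one QEC round per microsecond
    bytes_per_qubit_round = 100     # assumed raw readout volume per qubit per round

    data_rate = num_qubits * rounds_per_second * bytes_per_qubit_round
    print(f"{data_rate / 1e12:.0f} TB/s")  # -> 100 TB/s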

One of the more engaging aspects of the report is its scattering of quotes from prominent researchers and executives, and it is worth hearing their perspectives directly. Here are four examples.

  • Jan Goetz, Co-CEO and Co-Founder at IQM Quantum Computers — “The only devices currently available are NISQ, so companies are getting quantum ready by demonstrating how their business problems can be tackled with a NISQ quantum computer. Of course, the conclusion is that, with the current size and current fidelity of those QPUs, it’s only a proof of concept. However, that’s the value of NISQ: it can be used to educate the market and show that real problems can be tackled if we had much more powerful quantum computers.”
  • Simone Severini, General Manager, Quantum Technologies, AWS — “Enterprise customers don’t usually ask for error-corrected devices yet. They’re more focused on seeing what the current hardware can do and preparing for the future. However, there is a growing consensus that error correction has a much more important role than initially thought in making quantum computers commercially relevant. Without error correction, it will be very difficult to build a commercially relevant quantum computer. This has been central to AWS’s quantum hardware effort from day one.”
  • Nicolas Delfosse, Principal Researcher at IonQ — “Right now, we are using error mitigation in all our machines, but it has an issue – the circuit size scaling is exponential, so the cost of error mitigation will quickly become too large to be useful. At the same time, full error correction is too expensive for wide use, so we need something in between. That’s why I’m excited about building partial fault-tolerant operation schemes that could work in the near term and boost the capabilities of current machines.” (A rough sketch of this scaling appears after this list.)
  • Jérémie Guillaud, Chief of Theory at Alice & Bob — “Quantum error correction is making fast progress… we see companies switching roadmaps to include error correction. We are at the point where everybody is realizing it will be necessary. In terms of roadmaps, some actors initially focused on NISQ [Noisy Intermediate-Scale Quantum] systems and are now switching to error correction because NISQ hasn’t delivered the expected value.”
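
Delfosse’s point about exponential scaling can be sketched concretely. In probabilistic error cancellation, one standard mitigation technique, the shot (sampling) overhead grows roughly like γ^(2L) for L noisy gates, with γ > 1 set by the noise level; the numbers below are illustrative, not IonQ’s.

    def pec_sampling_overhead(gamma_per_gate: float, num_gates: int) -> float:
        """Multiplicative shot overhead of probabilistic error cancellation,
        assuming an overhead factor gamma > 1 per noisy gate."""
        return gamma_per_gate ** (2 * num_gates)

    print(pec_sampling_overhead(1.01, 100))   # ~7x more shots: tolerable
    print(pec_sampling_overhead(1.01, 1000))  # ~4e8x more shots: impractical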

Riverlane’s quantum error correction (QEC) study, conducted with market researcher Resonance, surveys the QEC landscape and its opportunities and challenges. The report draws on public data from 29 hardware companies as well as interviews with 12 global industry experts from companies such as Amazon Web Services (AWS), Rigetti Computing, IQM, IonQ, and Alice & Bob. Quantum watchers will recognize many of those quoted.

Download the full report here: https://www.riverlane.com/quantum-error-correction-report-2024

IBM-led Team Creates Tool – V-score – for Ground State Problems

Solving for the ground state of many-body systems is a notoriously difficult problem, and one thought to be well suited to quantum computers. IBM and researchers from 28 other institutions have developed a new metric — the V-score — to quantify the accuracy of algorithms used to solve these many-body problems and to compare performance on quantum and classical systems.

The new work (Variational benchmarks for quantum many-body problems) was published in Science last week, and IBM issued a blog post about the project.

IBM researchers Antonio Mezzacapo and Javier Robledo-Moreno wrote in the blog, “[We] have defined a quality metric called the V-score, useful for benchmarking our ability to approximate ground states of quantum systems. We used it for the most comprehensive set of many-body problems to date with state-of-the-art classical techniques, and hope it will help with defining quantum advantage for future quantum computing calculations.”

Quantum advantage has long been a hot topic, particularly as the lexicon of quantum “milestones” — e.g., quantum supremacy, utility computing — has grown.

Before jumping into the main topic, Mezzacapo and Robledo-Moreno offer their take on quantum advantage:

“You might hear the words ‘quantum advantage’ tossed around in various contexts. Its meaning is based on the three parameters that define any computation: accuracy, runtime, and cost. A quantum advantage is a demonstration of a solution for a problem for which a quantum computer can provide a demonstrable improvement over any classical method and classical resources in terms of accuracy, runtime or cost requirements. It’s not about being faster one time — it’s about quantum (or quantum in combination with classical) being the objectively better tool for solving the problem, versus classical computing alone.

“In the search for quantum advantage, the emphasis often falls on the quantum runtime to achieve a certain computation compared to the runtime of classical state-of-the-art techniques. However, runtime is only one metric by which quantum advantage is benchmarked. Achieving quantum advantage requires the addition of quantum subroutines to make an algorithm cheaper, faster, or more accurate.

“While runtime is an easy metric to measure, measuring cost and accuracy requires more effort. A systematic analysis of the resources used in the computation, despite requiring some non-trivial work, will inevitably come to a good measure of the total cost. This leaves us with the third metric, accuracy. Defining a metric for accuracy can be a challenging problem, since it depends on the computational task at hand and the user’s specific reasons for using a specific algorithm for solving the task.”

Turning to the problem at hand, the researchers note, “Today, algorithms designed to solve this problem mostly rely on what we call variational methods, which are algorithms guaranteed to output an energy for a target system which cannot be lower than the exact solution — or the deepest valley — up to statistical uncertainties. An ideal quality metric for the ground state problem would not only allow the user to benchmark different methods against the same problem, but also different target problems when tackled by the same method.”

What the team of researchers did was construct an accuracy metric from an estimate of the energy and its variance for any specific algorithm used to solve the ground state problem, along with additional parameters of the system such as its size and the nature of its interactions. They called this metric the “variational score,” or “V-score,” and showed that it is an absolute metric for this benchmark.
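
Concretely, the V-score is built from the variational energy E, its variance Var E, and the number of degrees of freedom N. Here is a minimal sketch, assuming the dimensionless form reported in the paper, V-score = N·Var E / (E − E∞)² with E∞ a reference zero of energy; the function name and defaults are ours:

    def v_score(energy: float, variance: float, n_dof: int,
                e_ref: float = 0.0) -> float:
        """Variational score: lower is better. An exact eigenstate has zero
        energy variance, so it scores exactly 0. e_ref fixes the zero of energy."""
        return n_dof * variance / (energy - e_ref) ** 2

    # An exact ground state (zero variance) scores 0:
    print(v_score(energy=-10.0, variance=0.0, n_dof=16))  # 0.0
    # A rough variational ansatz with sizable variance scores worse:
    print(v_score(energy=-9.5, variance=0.5, n_dof=16))   # ~0.089

The key property is that the score vanishes for an exact eigenstate and, being dimensionless, remains comparable across different Hamiltonians and methods.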

They reported testing the benchmark against the “largest and most complete set of local Hamiltonian problems to date, which can be found online. We found that the V-score correlated very well with the hardness of these problems and the ability of different methods to address them.”

Mezzacapo and Robledo-Moreno contend, “For quantum computing practitioners and algorithm developers, there are some important implications: First, the V-score can be used to benchmark existing classical algorithms. This will enable us then to assess which ground state problems are the hardest for classical algorithms, and therefore best poised for quantum advantage. Second, the identification of hard problems flags systems for which modeling is potentially incomplete. In turn, this means that these systems hold the most potential for new discoveries. Third, the V-score can be used as a quality metric to assess quantum advantage of quantum computing algorithms for cases where classical verifiability is not available.”

They argue that when new quantum algorithms are discovered, the V-score can be used as a tool to assess the quality of their output and to identify quantum advantage.

Here’s the abstract to the paper:

“The continued development of computational approaches to many-body ground-state problems in physics and chemistry calls for a consistent way to assess its overall progress. In this work, we introduce a metric of variational accuracy, the V-score, obtained from the variational energy and its variance. We provide an extensive curated dataset of variational calculations of many-body quantum systems, identifying cases where state-of-the-art numerical approaches show limited accuracy and future algorithms or computational platforms, such as quantum computing, could provide improved accuracy. The V-score can be used as a metric to assess the progress of quantum variational methods toward a quantum advantage for ground-state problems, especially in regimes where classical verifiability is impossible.”

Link to blog: https://www.ibm.com/quantum/blog/v-score

Link to paper: https://www.science.org/doi/10.1126/science.adg9774

Chasing non-Abelian States in Twisted Semiconductors

In quantum computing, the race to develop and apply non-Abelian states for error-resistant quantum computers remains an ongoing quest. Last week, two research groups published work predicting that twisted semiconductor bilayers can host non-Abelian states, adding support to a third effort published earlier this year.

Broadly, the big idea is that it is possible to encode information in an error-resistant way using hypothesized non-Abelian phases of matter. Many organizations, including Microsoft and the Quantum Science Center (QSC) at ORNL, are digging into ways to develop both the materials needed and the encoding techniques.

Physics.org has a short account of the recent work: “Scientists think that the performance of quantum computers could be improved by using hypothesized phases of matter known as non-Abelian states, which have the potential to encode information in an error-resistant way. But realizing a material that could host such states typically requires a powerful magnetic field, which would hinder device integration. Now three teams have predicted that non-Abelian states can form in certain semiconductor structures without a magnetic field. If this prediction is confirmed experimentally, it could lead to more reliable quantum computers that can execute a wider range of tasks,” writes Ryan Wilkinson in the physics.org account.

“The three teams considered a material in which two single layers of the semiconductor molybdenum ditelluride are stacked with a slight twist between them. Using theoretical modeling and advanced simulations, the groups investigated whether this material could harbor non-Abelian states in zero magnetic field. All three teams found that these states could emerge at a twist angle of about 2° if one of the material’s energy levels called the second moiré band were half-filled with electrons.

“The teams explore different aspects of this predicted phenomenon. Aidan Reddy at the Massachusetts Institute of Technology and his colleagues predict that non-Abelian states could also form in similar 2D structures involving other semiconductors. Gil Young Cho at Pohang University of Science and Technology, South Korea, and his colleagues argue that the emergence of non-Abelian states may be related to similarities between the second moiré band and more conventional energy levels called Landau levels. Lastly, Yang Zhang at the University of Tennessee, Knoxville, and his colleagues posted an e-print of a detailed model that explains how individual electrons behave in the twisted semiconductor bilayer.”

Here are abstracts to the three recent works:

Non-Abelian Fractionalization in Topological Minibands — “Motivated by the recent discovery of fractional quantum anomalous Hall states in moiré systems, we consider the possibility of realizing non-Abelian phases in topological minibands. We study a family of moiré systems, skyrmion Chern band models, which can be realized in two-dimensional semiconductor-magnet heterostructures and also capture the essence of twisted transition metal dichalcogenide homobilayers. We show using many-body exact diagonalization that, in spite of strong Berry curvature variations in momentum space, the non-Abelian Moore-Read state can be realized at half filling of the second miniband. These results demonstrate the feasibility of non-Abelian fractionalization in moiré systems without Landau levels and shed light on the desirable conditions for their realization. In particular, we highlight the prospect of realizing the Moore-Read state in twisted semiconductor bilayers.” (https://journals.aps.org/prl/abstract/10.1103/PhysRevLett.133.166503)

Non-Abelian fractional quantum anomalous Hall states and first Landau level physics of the second moiré band of twisted bilayer MoTe2 — “Utilizing the realistic continuum description of twisted bilayer MoTe2 and many-body exact diagonalization calculation, we establish that the second moiré band of twisted bilayer MoTe2, at a small twist angle of approximately 2°, serves as an optimal platform for achieving the long-sought non-Abelian fractional quantum anomalous Hall states without the need for external magnetic fields. Across a wide parameter range, our exact diagonalization calculations reveal that the half-filled second moiré band demonstrates the ground state degeneracy and spectral flows, which are consistent with the Pfaffian state in the first Landau level. We further elucidate that the emergence of the non-Abelian state is deeply connected to the remarkable similarity between the second moiré band and the first Landau level. Essentially, the band not only exhibits characteristics akin to the first Landau level, (1/2π)∫_BZ d²k tr g(k) ≈ 3 where g(k) is the Fubini-Study metric of the band, but also that its projected Coulomb interaction closely mirrors the Haldane pseudopotentials of the first Landau level. Motivated by this observation, we introduce a metric of ‘first Landau level’-ness of a band, which quantitatively measures the alignment of the projected Coulomb interaction with the Haldane pseudopotentials in Landau levels. This metric is then compared with the global phase diagram of the half-filled second moiré band, revealing its utility in predicting the parameter region of the non-Abelian state. In addition, we uncover that the first and third moiré bands closely resemble the lowest and second Landau levels, revealing a remarkable sequential equivalence between the moiré bands and Landau levels. We finally discuss the potential implications on experiments.” (https://journals.aps.org/prb/abstract/10.1103/PhysRevB.110.L161109)
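
For context on the trace condition in the abstract above (our gloss, not part of the paper): the integrated trace of the Fubini-Study metric is a fingerprint of Landau levels. For the n-th Landau level it takes the value

    \frac{1}{2\pi} \int_{\mathrm{BZ}} d^2k \, \operatorname{tr} g(\mathbf{k}) = 2n + 1,

i.e., 1 for the lowest level and 3 for the first excited level, so the ≈ 3 quoted above is the signature of first-Landau-level character.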

Multiple Chern bands in twisted MoTe2 and possible non-Abelian states — “We investigate the moiré band structures and possible even denominator fractional quantum Hall state in small angle twisted bilayer MoTe2, using combined large-scale local basis density functional theory calculation and continuum model exact diagonalization. Via large-scale first principles calculations at θ = 1.89°, we find a sequence of C = 1 moiré Chern bands, in analogy to Landau levels. Constructing the continuum model with multiple Chern bands and uniform Berry curvature in the second moiré band, we undertake band-projected exact diagonalization using unscreened Coulomb repulsion to pinpoint possible ν = −3/2 non-Abelian states across a wide range of twist angles below θ = 2.5°.” (https://arxiv.org/abs/2403.17003)
