Algorithm could unleash the power of quantum computers

A new algorithm that fast-forwards simulations could bring greater usability to current and near-term quantum computers, opening the way for applications to run past the strict time limits that hamper many quantum calculations.

Fast-forwarding quantum calculations skips past the time limits imposed by decoherence, which plagues today's machines.
Source: DOE/Los Alamos National Laboratory

“Quantum computers have a limited time to perform calculations before their useful quantum nature, which we call coherence, breaks down,” said Andrew Sornborger of the Computer, Computational, and Statistical Sciences division at Los Alamos National Laboratory. “With a new algorithm we have developed and tested, we will be able to fast-forward quantum simulations to solve problems that were previously out of reach.”

Computers built of quantum components, known as qubits, can potentially solve extremely difficult problems that exceed the capabilities of even the most powerful modern supercomputers. Applications include faster analysis of large data sets, drug development, and unraveling the mysteries of superconductivity, to name a few of the possibilities that could lead to major technological and scientific breakthroughs in the near future.

Recent experiments have demonstrated the potential for quantum computers to solve problems in seconds that would take the best conventional computer millennia to complete. The challenge remains, however, to ensure a quantum computer can run meaningful simulations before quantum coherence breaks down.

“We use machine learning to create a quantum circuit that can approximate a large number of quantum simulation operations all at once,” said Sornborger. “The result is a quantum simulator that replaces a sequence of calculations with a single, rapid operation that can complete before quantum coherence breaks down.”

The Variational Fast Forwarding (VFF) algorithm that the Los Alamos researchers developed is a hybrid that combines aspects of classical and quantum computing. Although well-established theorems rule out general fast-forwarding with perfect fidelity for arbitrary quantum simulations, the researchers get around the problem by tolerating small errors at intermediate times in order to provide useful, if slightly imperfect, predictions.

In principle, the approach allows scientists to quantum-mechanically simulate a system for as long as they like. Practically speaking, the errors that build up as simulation times increase limit the achievable calculations. Still, the algorithm allows simulations far beyond the time scales that quantum computers can reach without the VFF algorithm.
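The core idea can be sketched classically: if a short-time evolution step U can be approximately diagonalized as U ≈ W D W⁻¹ with D diagonal, then n steps of evolution collapse into the single fixed-depth operation W Dⁿ W⁻¹, instead of applying U n times. The following is a minimal NumPy sketch of that idea, not the paper's implementation — the one-qubit Hamiltonian, time step, and step count are illustrative, and the diagonalization is done exactly here, whereas VFF learns it variationally on quantum hardware.

```python
import numpy as np

# Illustrative 1-qubit Hamiltonian: H = X (Pauli-X), evolved for a
# short time step dt, so U = exp(-i X dt).
X = np.array([[0, 1], [1, 0]], dtype=complex)
dt = 0.1
U = np.cos(dt) * np.eye(2) - 1j * np.sin(dt) * X

# "Training" step, done exactly here with classical linear algebra.
# VFF instead learns an approximate diagonalization variationally,
# with a quantum-classical optimization loop.
eigvals, W = np.linalg.eig(U)
D = np.diag(eigvals)

# Fast-forward n steps with a single fixed-depth operation:
# U^n ~ W D^n W^{-1}. Powering a diagonal matrix is trivial, so the
# circuit depth does not grow with simulation time.
n = 1000
U_ff = W @ np.linalg.matrix_power(D, n) @ np.linalg.inv(W)

# Compare against applying U a thousand times in sequence.
U_exact = np.linalg.matrix_power(U, n)
error = np.max(np.abs(U_ff - U_exact))
print(error)  # tiny: the fast-forwarded and exact evolutions agree
```

On real hardware the diagonalization is only approximate, which is why errors accumulate with simulation time, as described above.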

One quirk of the process is that it takes twice as many qubits to fast-forward a calculation as make up the quantum computer being fast-forwarded. In the newly published paper, for example, the research group verified their approach by implementing the VFF algorithm on a two-qubit computer to fast-forward the calculations that would be performed in a one-qubit quantum simulation.
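The doubling arises during training: the algorithm scores how well a trial circuit V matches the target short-time unitary U using a Hilbert–Schmidt-style overlap, which on hardware entangles two copies of the n-qubit register (2n qubits in total). Classically, the same overlap is just a trace, as in this hedged sketch (the matrices and function name below are illustrative):

```python
import numpy as np

def hs_cost(U, V):
    """Hilbert-Schmidt-style cost in [0, 1]; zero exactly when V
    matches U up to an unobservable global phase."""
    d = U.shape[0]
    overlap = np.abs(np.trace(V.conj().T @ U)) ** 2 / d ** 2
    return 1.0 - overlap

# A circuit that matches the target up to a global phase has zero cost;
# a mismatched circuit has nonzero cost.
U = np.array([[1, 0], [0, 1j]], dtype=complex)
print(hs_cost(U, 1j * U))     # 0.0 (match up to global phase)
print(hs_cost(U, np.eye(2)))  # 0.5 (mismatch)
```

Minimizing a cost of this form over the parameters of V is what drives the classical optimization half of the hybrid loop.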

In future work, the Los Alamos researchers plan to explore the limits of the VFF algorithm by increasing the number of qubits they fast-forward and by checking how far in time they can push the simulations.

The research was published in npj Quantum Information.
