Researchers disentangle quantum machine learning
An international team of researchers has discovered an important barrier that prevents quantum machine learning models from being trained: too much quantum entanglement.
Quantum machine learning studies the advantages of quantum computers for artificial intelligence (AI). The hope is that quantum neural networks will one day combine the strengths of quantum computation and traditional neural networks; however, recent theoretical research points to potential difficulties.
Machine learning requires algorithms to learn from data in a phase known as training, during which the algorithm progressively improves at the given task. However, a large class of quantum algorithms has been mathematically proven to experience only negligible improvement due to a phenomenon known as a barren plateau, first reported by a team from Google in 2018. Encountering a barren plateau can stop a quantum algorithm from learning altogether.
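The concentration effect behind barren plateaus can be illustrated with a minimal sketch (not the paper's own construction): for highly entangled, generic (Haar-random) n-qubit states, the expectation value of a local observable such as Z on the first qubit concentrates sharply around zero as n grows, so the cost landscape flattens and training gradients become vanishingly small. The function names here are illustrative.

```python
import numpy as np

def haar_random_state(n_qubits, rng):
    """Sample a Haar-random pure state as a normalized complex vector."""
    dim = 2 ** n_qubits
    vec = rng.normal(size=dim) + 1j * rng.normal(size=dim)
    return vec / np.linalg.norm(vec)

def z0_expectation(state):
    """<Z> on qubit 0: +1 where the leading bit is 0, -1 where it is 1."""
    half = state.size // 2
    probs = np.abs(state) ** 2
    return probs[:half].sum() - probs[half:].sum()

def expectation_variance(n_qubits, n_samples=2000, seed=0):
    """Estimate the variance of <Z_0> over random highly entangled states."""
    rng = np.random.default_rng(seed)
    vals = [z0_expectation(haar_random_state(n_qubits, rng))
            for _ in range(n_samples)]
    return float(np.var(vals))

if __name__ == "__main__":
    for n in (4, 6, 8, 10):
        # The variance shrinks roughly as 1/2^n: the landscape flattens
        # exponentially, which is the signature of a barren plateau.
        print(n, expectation_variance(n))
```

For a Haar-random state the variance of a local expectation value scales as 1/(2^n + 1), so each added qubit roughly halves the signal available to a gradient-based trainer.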
The theoretical research, published in PRX Quantum, further investigates the causes of barren plateaus, with a new focus on the impact of too much entanglement. Entanglement of qubits, or quantum bits, is the quantum effect that underpins the exponential speedups promised by quantum computing.
“While entanglement is necessary for quantum speedups, the research indicates the need for careful design of which qubits should be entangled and how much,” says research co-author Dr Maria Kieferova, Research Fellow at the ARC Centre for Quantum Computation and Communication Technology based at the University of Technology Sydney.
“This contradicts the common understanding that more quantum entanglement provides greater speedups.”
“We have proven that excess entanglement between the output qubits, or visible units, and the rest of the quantum neural network hinders the learning process and that large amounts of entanglement can be catastrophic for the model,” says lead author Dr Carlos Ortiz Marrero, who is currently a Research Assistant Professor at North Carolina State University.
“This result teaches us which structures of quantum neural networks we need to avoid for successful algorithms.”
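The mechanism described above can be sketched in a few lines (an illustrative toy, not the paper's model): when the output (visible) qubits of a generic, highly entangled state are considered on their own, their reduced state is nearly maximally mixed, so they carry almost no usable information about the rest of the network. The helper names and the 2-visible/8-hidden split are assumptions for the example.

```python
import numpy as np

def haar_random_state(n_qubits, rng):
    """Sample a Haar-random pure state as a normalized complex vector."""
    dim = 2 ** n_qubits
    vec = rng.normal(size=dim) + 1j * rng.normal(size=dim)
    return vec / np.linalg.norm(vec)

def visible_entropy(state, n_visible, n_hidden):
    """Von Neumann entropy (in bits) of the reduced state on the visible qubits."""
    psi = state.reshape(2 ** n_visible, 2 ** n_hidden)
    # Singular values across the visible/hidden cut give the Schmidt spectrum.
    s = np.linalg.svd(psi, compute_uv=False)
    p = s ** 2
    p = p[p > 1e-12]
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(1)
# 2 visible (output) qubits entangled with 8 hidden qubits.
S = visible_entropy(haar_random_state(10, rng), n_visible=2, n_hidden=8)
# For a generic highly entangled state, S sits close to the maximum of
# 2 bits: the visible qubits alone look nearly maximally mixed.
```

An entropy near the 2-bit maximum means measurements on the output qubits return almost uniform noise, which is why excess entanglement between visible units and the rest of the network hinders learning.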
“Even though the research showed that a range of straightforward translations from classical machine learning models to the quantum realm isn’t beneficial, there is a way forward,” says Dr Ortiz Marrero.
“By limiting the depth and connectivity of the network, we might be able to avoid the regimes where quantum machine learning algorithms cannot be trained.”
This can be achieved by precisely and deliberately deploying entanglement in quantum machine learning models.
“While entanglement is a powerful tool to add to our models, it must be used like a scalpel and not a sledgehammer,” says co-author Dr Nathan Wiebe, University of Toronto.
Journal: PRX Quantum
Method of Research: Computational simulation/modeling
Subject of Research: Not applicable
Article Title: Entanglement-Induced Barren Plateaus
Article Publication Date: 25-Oct-2021