In a recent issue of Physical Review A, Argonne researchers reported a new method for alleviating the effects of "noise" in quantum information systems, a challenge scientists around the globe are working to meet in the race toward a new era of quantum technologies. The new method has implications for the future of quantum information science, including quantum computing and quantum sensing.
Many current quantum information applications, such as carrying out an algorithm on a quantum computer, suffer from "decoherence" -- a loss of information due to "noise," which is inherent to quantum hardware. Matthew Otten, a Maria Goeppert Mayer Fellow at Argonne, and Stephen Gray, group leader of Theory and Modeling at the Center for Nanoscale Materials, a U.S. Department of Energy Office of Science User Facility, have developed a new technique that recovers this lost information by repeating the quantum process or experiment many times, with slightly different noise characteristics, and then analyzing the results.
This is an example of a 'hypersurface' fit to many experiments with slightly different noise parameters, 1 and 2. Black points are measurements of an observable with different noise rates. The red 'X' is the noise-free result. Blue, orange and green surfaces are first, third and fourth order fits.
Credit: Argonne National Laboratory
After gathering results by running the process many times, in sequence or in parallel, the researchers construct a hypersurface in which one axis represents the result of a measurement and the other two (or more) axes represent different noise parameters. This hypersurface yields an estimate of the noise-free observable and shows how strongly each noise rate affects the result.
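The idea can be sketched in a few lines of Python. This is a hedged illustration, not the authors' code: a toy observable is "measured" at a grid of known noise rates (here called g1 and g2), a low-order polynomial surface is fit by least squares, and the fitted constant term is the surface's value at zero noise, i.e. the noise-free estimate.

```python
# Sketch of noise extrapolation via a hypersurface fit (illustrative only):
# sample an observable at several known noise rates, fit a polynomial
# surface, and read off its value at zero noise.
import numpy as np

def toy_noisy_observable(g1, g2, noise_free=1.0):
    # Hypothetical noise model: the measured value drifts away from the
    # noise-free result linearly in each rate, plus a small cross term.
    return noise_free - 0.8 * g1 - 0.5 * g2 + 0.3 * g1 * g2

# "Experiments" at a grid of slightly different noise rates.
g1, g2 = np.meshgrid(np.linspace(0.01, 0.1, 5), np.linspace(0.01, 0.1, 5))
g1, g2 = g1.ravel(), g2.ravel()
y = toy_noisy_observable(g1, g2)

# Second-order polynomial design matrix in (g1, g2).
A = np.column_stack([np.ones_like(g1), g1, g2, g1 * g2, g1**2, g2**2])
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)

# The constant term is the hypersurface's value at g1 = g2 = 0.
noise_free_estimate = coeffs[0]
print(noise_free_estimate)  # close to 1.0
```

Because the toy model lies inside the polynomial basis, the fit recovers the noise-free value essentially exactly; with real device data, the order of the fit trades bias against sensitivity to shot noise, which is why the figure compares first-, third- and fourth-order fits.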
"It's like taking a series of flawed photographs," said Otten. "Each photo has a flaw, but in a different place in the picture. When we compile all the clear pieces from the flawed photos together, we get one clear picture."
Applying this technique effectively reduces quantum noise without the need for additional quantum hardware.
"This is a versatile technique that can be done with separate quantum systems undergoing the same process at the same time," said Otten.
"One could create several small quantum devices and run them in parallel," said Gray. "Using our method, one would combine the results on the hypersurface and generate approximate noise-free observables. The results would help extend the usefulness of the quantum devices before decoherence sets in."
"We successfully performed a simple demonstration of our method on the Rigetti 8Q-Agave quantum computer," said Otten. "This class of methods will likely see much use in near-term quantum devices."
The researchers' work appears in Physical Review A under the title "Recovering noise-free quantum observables."
Otten and Gray have also developed a similar, somewhat less computationally demanding approach to noise reduction: it corrects the noise on one qubit at a time and combines the results to approximate what would be obtained if all qubits were corrected simultaneously. A qubit, or quantum bit, is the quantum computing equivalent of the binary digit, or bit, used in classical computing.
"In this approach, we assume that the noise can be reduced on each qubit individually, which, while experimentally challenging, leads to a much simpler data processing problem and results in an estimate of the noise-free result," noted Otten.
This second method was recently published in npj Quantum Information as "Accounting for errors in quantum algorithms via individual error reduction."
This research was performed at the Center for Nanoscale Materials, a U.S. Department of Energy User Facility at Argonne, and was supported by the U.S. Department of Energy, Office of Science. Bebop, a high-performance computing cluster operated by the Laboratory Computing Resource Center at Argonne, was used to perform simulations that helped hone the new method and demonstrate it in situations that are not currently available with quantum hardware.
Argonne National Laboratory seeks solutions to pressing national problems in science and technology. The nation's first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems, advance America's scientific leadership and prepare the nation for a better future. With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy's Office of Science.
The U.S. Department of Energy's Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit the Office of Science website.
Diana Anderson | idw - Informationsdienst Wissenschaft