Forum for Science, Industry and Business

Quantum computers may be easier to build than predicted

03.03.2005


The new NIST architecture for quantum computing relies on several levels of error checking to ensure the accuracy of quantum bits (qubits). The image above illustrates how qubits are grouped in blocks to form the levels. To implement the architecture with three levels, a series of operations is performed on 36 qubits (bottom row), each one representing either a 1, a 0, or both at once. The operations on the nine sets of qubits produce two reliably accurate qubits (top row). The purple spheres represent qubits that are used either in error detection or in actual computations. The yellow spheres are qubits that are measured to detect or correct errors but are not used in final computations.


A full-scale quantum computer could produce reliable results even if its components performed no better than today’s best first-generation prototypes, according to a paper in the March 3 issue of the journal Nature* by a scientist at the Commerce Department’s National Institute of Standards and Technology (NIST).

In theory, such a quantum computer could be used to break commonly used encryption codes, to improve optimization of complex systems such as airline schedules, and to simulate other complex quantum systems.

A key issue for the reliability of future quantum computers--which would rely on the unusual properties of nature’s smallest particles to store and process data--is the fragility of quantum states. Today’s computers use millions of transistors that are switched on or off to reliably represent values of 1 or 0. Quantum computers would use atoms, for example, as quantum bits (qubits), whose magnetic and other properties would be manipulated to represent 1 or 0 or even both at the same time. These states are so delicate that qubit values would be unusually susceptible to errors caused by the slightest electronic "noise."
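To make the idea of a qubit holding 0 and 1 at once a little more concrete, here is a minimal toy sketch (not NIST software, and only an illustration of the paragraph above): a single qubit represented as a two-component complex vector, with an example of how a small amount of stray "noise" shifts its measurement probabilities.

```python
import numpy as np

ZERO = np.array([1.0, 0.0], dtype=complex)   # the state |0>
ONE = np.array([0.0, 1.0], dtype=complex)    # the state |1>

# An equal superposition: the qubit holds 0 and 1 "at the same time"
# until it is measured.
plus = (ZERO + ONE) / np.sqrt(2)

def measurement_probabilities(state):
    """Probability of reading out 0 or 1 when the qubit is measured."""
    return np.abs(state) ** 2

print(measurement_probabilities(plus))       # -> [0.5 0.5]

# The fragility mentioned above: even a tiny stray rotation ("noise")
# shifts those probabilities, which is why constant error checking is needed.
angle = 0.05                                 # illustrative noise strength (assumed)
noise = np.array([[np.cos(angle), -np.sin(angle)],
                  [np.sin(angle),  np.cos(angle)]], dtype=complex)
print(measurement_probabilities(noise @ plus))
```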



To get around this problem, NIST scientist Emanuel Knill suggests using a pyramid-style hierarchy of qubits made of smaller and simpler building blocks than envisioned previously, and teleportation of data at key intervals to continuously double-check the accuracy of qubit values. Teleportation was demonstrated last year by NIST physicists, who transferred key properties of one atom to another atom without using a physical link.

"There has been a tremendous gap between theory and experiment in quantum computing," Knill says. "It is as if we were designing today’s supercomputers in the era of vacuum tube computing, before the invention of transistors. This work reduces the gap, showing that building quantum computers may be easier than we thought. However, it will still take a lot of work to build a useful quantum computer."

Use of Knill’s architecture could lead to reliable computing even if individual logic operations made errors as often as 3 percent of the time--performance levels already achieved in NIST laboratories with qubits based on ions (charged atoms). The proposed architecture could tolerate several hundred times more errors than scientists had generally thought acceptable.

Knill’s findings are based on several months of calculations and simulations on large, conventional computer workstations. The new architecture, which has yet to be validated by mathematical proofs or tested in the laboratory, relies on a series of simple procedures for repeatedly checking the accuracy of blocks of qubits. This process creates a hierarchy of qubits at various levels of validation.

For instance, to achieve relatively low error probabilities in moderately long computations, 36 qubits would be processed in three levels to arrive at one corrected pair. Only the top-tier, or most accurate, qubits are actually used for computations. The more levels there are, the more reliable the computation will be.
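The effect of stacking checking levels can be illustrated with a generic concatenated-code estimate. The sketch below is a toy model built on stated assumptions, not Knill's actual architecture: it assumes that a logical error at one level requires at least two failures at the level below, so the error probability is roughly squared (scaled by an assumed tolerance of about 3 percent) with each additional level.

```python
P_TOLERANCE = 0.03   # the roughly 3 percent figure quoted in the article, used here as an assumed threshold

def logical_error_rate(p_component, levels, p_threshold=P_TOLERANCE):
    """Toy estimate: each extra checking level squares the error (scaled by the threshold)."""
    p = p_component
    for _ in range(levels):
        p = p * p / p_threshold
    return p

# With components somewhat below the tolerance, every added level helps;
# three levels is the example shown in the article's figure.
for levels in range(5):
    print(levels, logical_error_rate(0.01, levels))
```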

Knill’s methods for detecting and correcting errors rely heavily on teleportation. Teleportation enables scientists to measure how errors have affected a qubit’s value while transferring the stored information to other qubits not yet perturbed by errors. The original qubit’s quantum properties would be teleported to another qubit as the original qubit is measured.
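The sketch below simulates the standard single-qubit teleportation circuit with plain state vectors, only to illustrate the point above: the sender's qubit is measured, yet its quantum state reappears on the receiver's qubit once two classical bits are used to choose a correction. It is a textbook toy example, not Knill's error-correcting protocol.

```python
import numpy as np

I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)          # bit flip
Z = np.array([[1, 0], [0, -1]], dtype=complex)         # phase flip
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

def kron(*ops):
    out = np.array([[1.0]], dtype=complex)
    for op in ops:
        out = np.kron(out, op)
    return out

# Qubit 0 holds the state to be teleported; qubits 1 and 2 share a Bell pair.
alpha, beta = 0.6, 0.8                                 # arbitrary example amplitudes
psi = np.array([alpha, beta], dtype=complex)
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
state = np.kron(psi, bell)                             # full 3-qubit state (8 amplitudes)

# Sender's side: entangle qubit 0 with the pair, then rotate it for measurement.
state = kron(CNOT, I) @ state                          # CNOT: qubit 0 controls qubit 1
state = kron(H, I, I) @ state                          # Hadamard on qubit 0

# Measure qubits 0 and 1 (sample one outcome, keep the conditional state of qubit 2).
rng = np.random.default_rng(seed=1)
grouped = state.reshape(4, 2)                          # rows: outcomes of qubits 0 and 1
probs = (np.abs(grouped) ** 2).sum(axis=1)
outcome = rng.choice(4, p=probs)
m0, m1 = int(outcome) >> 1, int(outcome) & 1           # the two classical bits sent along
qubit2 = grouped[outcome]
qubit2 = qubit2 / np.linalg.norm(qubit2)

# Receiver's side: the measurement destroyed the original qubit, but after a
# correction chosen from the two classical bits, qubit 2 carries its state.
correction = np.linalg.matrix_power(Z, m0) @ np.linalg.matrix_power(X, m1)
received = correction @ qubit2
print("original:", psi.real, " teleported:", np.round(received.real, 6))
```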

The new architecture allows trade-offs between error rates and computing resource demands. To tolerate 3 percent error rates in components, massive amounts of computing hardware and processing time would be needed, partly because of the "overhead" involved in correcting errors. Fewer resources would be needed if component error rates can be reduced further, Knill’s calculations show.
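That trade-off can be illustrated by extending the toy model above: count how many checking levels are needed to push the logical error rate below a target, then multiply an assumed per-level qubit overhead. The overhead factor of 6 below is purely hypothetical, chosen for illustration, not a figure from Knill's paper.

```python
P_TOLERANCE = 0.03        # assumed tolerance, as in the sketch above

def levels_needed(p_component, p_target, p_threshold=P_TOLERANCE):
    """Checking levels required before the toy logical error rate falls below the target."""
    p, levels = p_component, 0
    while p > p_target:
        p = p * p / p_threshold
        levels += 1
        if levels > 50:   # at or above the tolerance the error never shrinks
            return None
    return levels

QUBITS_PER_LEVEL = 6      # hypothetical per-level overhead factor (an assumption)
for p in (0.03, 0.02, 0.01, 0.001):
    lv = levels_needed(p, 1e-12)
    cost = None if lv is None else QUBITS_PER_LEVEL ** lv
    print(f"component error {p}: levels needed = {lv}, rough qubits per logical qubit = {cost}")
```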

Laura Ost | EurekAlert!
Further information:
http://www.nist.gov


