New frontier in error-correcting codes

02.10.2014

Coding scheme for interactive communication is the first to approach optimality on three classical measures.

Error-correcting codes are one of the glories of the information age: They're what guarantee the flawless transmission of digital information over the airwaves or through copper wire, even in the presence of the corrupting influences that engineers call "noise."

But classical error-correcting codes work best with large chunks of data: The bigger the chunk, the higher the rate at which it can be transmitted error-free. In the Internet age, however, distributed computing is becoming more and more common, with devices repeatedly exchanging small chunks of data over long periods of time.

So for the last 20 years, researchers have been investigating interactive-coding schemes, which address the problem of long sequences of short exchanges. Like classical error-correcting codes, interactive codes are evaluated according to three criteria: How much noise can they tolerate? What's the maximum transmission rate they afford? And how time-consuming are the encoding and decoding processes?

At the IEEE Symposium on Foundations of Computer Science this month, MIT graduate students past and present will describe the first interactive coding scheme to approach the optimum on all three measures.

"Previous to this work, it was known how to get two out of three of these things to be optimal," says Mohsen Ghaffari, a graduate student in electrical engineering and computer science and one of the paper's two co-authors. "This paper achieves all three of them."

Vicious noise

Moreover, where Claude Shannon's groundbreaking 1948 analysis of error-correcting codes considered the case of random noise, in which every bit of transmitted data has the same chance of being corrupted, Ghaffari and his collaborator — Bernhard Haeupler, who did his graduate work at MIT and is now an assistant professor at Carnegie Mellon University — consider the more stringent case of "adversarial noise," in which an antagonist is trying to interfere with transmission in the most disruptive way possible.

"We don't know what type of random noise will be the one that actually captures reality," Ghaffari explains. "If we knew the best one, we would just use that. But generally, we don't know. So you try to generate a coding that is as general as possible." A coding scheme that could thwart an active adversary would also thwart any type of random noise.

Error-correcting codes — both classical and interactive — work by adding some extra information to the message to be transmitted. They might, for instance, tack on some bits that describe arithmetic relationships between the message bits. Both the message bits and the extra bits are liable to corruption, so decoding a message — extracting the true sequence of message bits from the sequence that arrives at the receiver — is usually a process of iterating back and forth between the message bits and the extra bits, trying to iron out discrepancies.
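As a concrete illustration of this idea in the classical, non-interactive setting, here is a sketch of the textbook Hamming(7,4) code in Python. Three parity bits each record an XOR relationship over subsets of the four message bits; the decoder recomputes those relationships, and the pattern of violated checks pinpoints a single corrupted bit. (This is a standard toy example, not the authors' construction, and real iterative decoders repeat such consistency checks many times rather than resolving them in one pass.)

```python
def hamming74_encode(d):
    """Encode 4 data bits with 3 parity bits; each parity bit records the
    XOR (an arithmetic relationship) of a fixed subset of the data bits."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(r):
    """Recompute the parity relationships; the pattern of violated checks
    (the "syndrome") names the single corrupted position, which is flipped."""
    r = list(r)
    s1 = r[0] ^ r[2] ^ r[4] ^ r[6]
    s2 = r[1] ^ r[2] ^ r[5] ^ r[6]
    s3 = r[3] ^ r[4] ^ r[5] ^ r[6]
    syndrome = s1 + 2 * s2 + 4 * s3   # positions 1..7; 0 means no error
    if syndrome:
        r[syndrome - 1] ^= 1
    return [r[2], r[4], r[5], r[6]]

codeword = hamming74_encode([1, 0, 1, 1])
codeword[3] ^= 1                      # corrupt one bit in transit
assert hamming74_decode(codeword) == [1, 0, 1, 1]
```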

In interactive communication, the maximum tolerable error rate is one-fourth: If the adversary can corrupt more than a quarter of the bits sent, perfectly reliable communication is impossible. Intuitively, that is because the adversary can spend its entire budget on just one party's messages; since each party sends roughly half the bits, a quarter of the total is enough to corrupt half of what one party says. Some prior interactive-coding schemes, Ghaffari explains, could handle that error rate without requiring too many extra bits. But the decoding process was prohibitively complex.

Making a list

To keep the complexity down, Ghaffari and Haeupler adopted a technique called list decoding. Rather than iterating back and forth between message bits and extra bits until the single most probable interpretation emerges, their algorithm iterates just long enough to create a list of likely candidates. At the end of their mutual computation, each of the interacting devices may have a list with hundreds of entries.
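The contrast with unique decoding is easy to sketch. In the toy Python snippet below (an illustration of the general idea of list decoding, not the authors' algorithm), the decoder keeps every codeword within a given Hamming distance of the received word instead of committing to the single nearest one:

```python
from itertools import product

def hamming_distance(a, b):
    return sum(x != y for x, y in zip(a, b))

def list_decode(received, codebook, radius):
    """Return every codeword within `radius` of the received word,
    rather than committing to the single most likely candidate."""
    return [c for c in codebook if hamming_distance(received, c) <= radius]

# Hypothetical 4-bit codebook: each 2-bit message simply sent twice.
codebook = [msg * 2 for msg in product((0, 1), repeat=2)]
print(list_decode((0, 1, 1, 1), codebook, radius=1))
# -> [(0, 1, 0, 1), (1, 1, 1, 1)]: two live candidates, disambiguated later
```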

But each device, while it has only imperfect knowledge of the messages sent by the other, has perfect knowledge of the messages it sent. So if, at the computation's end, the devices simply exchange lists, each has enough additional information to zero in on the optimal decoding.
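A minimal sketch of that final step, again with hypothetical data structures rather than the paper's actual protocol: each device discards every candidate transcript that disagrees with the messages it knows it sent.

```python
def reconcile(my_sent, candidate_transcripts):
    """Each candidate is a guess at the full conversation. A device knows its
    own half of the conversation exactly, so it discards every candidate that
    disagrees with any message it actually sent."""
    return [t for t in candidate_transcripts if t["mine"] == my_sent]

# Alice's list decoder left two candidate conversations, but only one
# is consistent with what she really transmitted.
alice_sent = ["01", "10"]
candidates = [
    {"mine": ["01", "10"], "theirs": ["11", "00"]},
    {"mine": ["01", "11"], "theirs": ["11", "01"]},
]
print(reconcile(alice_sent, candidates))   # only the first candidate survives
```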

The maximum tolerable error rate for an interactive-coding scheme — one-fourth — is a theoretical result. The minimum length of an encoded message and the minimum decoding complexity, on the other hand, are surmises based on observation.

But Ghaffari and Haeupler's decoding algorithm is nearly linear, meaning that its execution time is roughly proportional to the length of the messages exchanged.

Linear relationships, however, are still defined by constants: y = x is a linear relationship, but so is y = 1,000,000,000x. A linear algorithm that takes an extra second of computation for each additional bit of data it considers isn't as good as a linear algorithm that takes an extra microsecond.
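The point is easy to demonstrate. Both functions in this illustrative Python snippet run in time linear in n, but their hidden constants differ by a factor of a thousand:

```python
import time

def linear_cheap(n):
    total = 0
    for i in range(n):
        total += i              # one cheap operation per element
    return total

def linear_expensive(n):
    total = 0
    for i in range(n):
        for _ in range(1000):   # same O(n) shape, 1000x the constant
            total += 1
    return total

for f in (linear_cheap, linear_expensive):
    start = time.perf_counter()
    f(10_000)
    print(f.__name__, f"{time.perf_counter() - start:.4f}s")
```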

###

Written by Larry Hardesty, MIT News Office

Andrew Carleen | EurekAlert!
Further information:
http://www.mit.edu
