04.04.2012

As computer scientists this year celebrate the 100th anniversary of the birth of Alan Turing, the mathematical genius who in the 1930s laid the foundations of digital computing and anticipated the electronic age, they still quest after a machine as adaptable and intelligent as the human brain.

Now, computer scientist Hava Siegelmann of the University of Massachusetts Amherst, an expert in neural networks, has taken Turing’s work to its next logical step. She is translating her 1993 discovery of what she has dubbed “Super-Turing” computation into an adaptable computational system that learns and evolves, using input from the environment in a way much more like our brains do than classic Turing-type computers. She and her post-doctoral research colleague Jeremie Cabessa report on the advance in the current issue of Neural Computation.

“This model is inspired by the brain,” she says. “It is a mathematical formulation of the brain’s neural networks with their adaptive abilities.” The authors show that when the model is installed in an environment offering constant sensory stimuli, like the real world, and when all stimulus-response pairs are considered over the machine’s lifetime, the Super-Turing model yields an exponentially greater repertoire of behaviors than the classical Turing model. They demonstrate that the Super-Turing model is superior for human-like tasks and learning.

“Each time a Super-Turing machine gets input it literally becomes a different machine,” Siegelmann says. “You don’t want this for your PC. They are fine and fast calculators and we need them to do that. But if you want a robot to accompany a blind person to the grocery store, you’d like one that can navigate in a dynamic environment. If you want a machine to interact successfully with a human partner, you’d like one that can adapt to idiosyncratic speech, recognize facial patterns and allow interactions between partners to evolve just like we do. That’s what this model can offer.”


In 1948, Turing himself predicted another kind of computation that would mimic life itself, but he died without developing his concept of a machine that could use what he called “adaptive inference.” In 1993, Siegelmann, then at Rutgers, showed independently in her doctoral thesis that a very different kind of computation, vastly different from the “calculating computer” model and more like Turing’s prediction of life-like intelligence, was possible. She published her findings in Science and in a book shortly after.

“I was young enough to be curious, wanting to understand why the Turing model looked really strong,” she recalls. “I tried to prove the conjecture that neural networks are very weak and instead found that some of the early work was faulty. I was surprised to find out via mathematical analysis that the neural models had some capabilities that surpass the Turing model. So I re-read Turing and found that he believed there would be an adaptive model that was stronger based on continuous calculations.”

Each step in Siegelmann’s model starts with a new Turing machine that computes once and then adapts. The size of the set of natural numbers is denoted aleph-zero, which is also the number of different infinite calculations possible for classical Turing machines operating in a real-world environment on continuously arriving inputs. By contrast, Siegelmann’s most recent analysis demonstrates that Super-Turing computation has 2 to the power of aleph-zero possible behaviors. “If the Turing machine had 300 behaviors, the Super-Turing would have 2^300, more than the number of atoms in the observable universe,” she explains.
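The scale of that comparison is easy to verify: the observable universe is commonly estimated to hold on the order of 10^80 atoms, and 2^300 exceeds that by many orders of magnitude. A quick check (the 10^80 atom count is the standard rough estimate, not a figure from the article):

```python
# Compare 2^300 against the common ~10^80 estimate
# for the number of atoms in the observable universe.
behaviors = 2 ** 300
atoms = 10 ** 80

print(f"2^300 has {len(str(behaviors))} digits")   # a 91-digit number
print(behaviors > atoms)                           # True
```

The gap illustrates the exponential blow-up Siegelmann describes: each doubling per behavior compounds, so even a modest base repertoire of 300 yields a space of combinations far beyond any physical enumeration.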

The new Super-Turing machine will not only be flexible and adaptable but economical. This means that when presented with a visual problem, for example, it will act more like our human brains and choose salient features in the environment on which to focus, rather than using its power to visually sample the entire scene as a camera does. This economy of effort, using only as much attention as needed, is another hallmark of high artificial intelligence, Siegelmann says.

“If a Turing machine is like a train on a fixed track, a Super-Turing machine is like an airplane. It can haul a heavy load, but also move in endless directions and vary its destination as needed. The Super-Turing framework allows a stimulus to actually change the computer at each computational step, behaving in a way much closer to that of the constantly adapting and evolving brain,” she adds.
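The "machine that changes with every stimulus" idea can be sketched in a few lines. The toy network below is purely illustrative, not Siegelmann's formal model: a single recurrent unit produces an output from each stimulus, then a Hebbian-style rule (an assumption chosen here for simplicity) updates the weight, so the next input is processed by a literally different machine.

```python
import math

def step(state: float, weight: float, stimulus: float) -> float:
    """One computation step of a single recurrent unit."""
    return math.tanh(weight * state + stimulus)

def adapt(weight: float, state: float, stimulus: float, rate: float = 0.1) -> float:
    """Hebbian-style update: the stimulus itself reshapes the machine.
    (Illustrative rule, not the update used in the published model.)"""
    return weight + rate * state * stimulus

state, weight = 0.0, 0.5
for stimulus in [1.0, -0.5, 0.8]:
    state = step(state, weight, stimulus)
    weight = adapt(weight, state, stimulus)  # a "different machine" each step
    print(f"stimulus={stimulus:+.1f}  state={state:+.3f}  weight={weight:.3f}")
```

A fixed-weight network, like a classical Turing machine, would map the same stimulus to the same response forever; here the mapping itself drifts with experience, which is the qualitative behavior the airplane analogy is pointing at.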

Siegelmann and two colleagues recently were notified that they will receive a grant to make the first ever Super-Turing computer, based on Analog Recurrent Neural Networks. The device is expected to introduce a level of intelligence not seen before in artificial computation.

Hava Siegelmann, 413/577-4282

hava@cs.umass.edu

Hava Siegelmann | Newswise Science News

Further information:

http://www.umass.edu

