Predicting decisions by taking a glimpse into the brain
Recent studies have indicated that the brain makes decisions about visual and auditory stimuli by accumulating sensory evidence, and that this process is orchestrated by a network of neurons spanning the front (the prefrontal area) and the back (the parietal area) of the brain.
Ksander de Winkel and his colleagues from the department of Prof. Bülthoff at the Max Planck Institute for Biological Cybernetics investigated whether these findings also apply to decisions about self-motion stimuli (passive motion of one’s own body). The scientists could predict from the brain recordings how well a participant was able to tell different motions apart, indicating that an accumulation of sensory evidence about self-motion was indeed being measured.
These findings support the idea that the network of prefrontal and parietal neurons is ‘modality-independent’: the neurons in this network collect evidence and make decisions using any type of sensory information, rather than being tied to a specific modality such as vision or the vestibular sense (the sense of balance).
The scientists placed participants in a motion simulator and rotated them around an earth-vertical axis that was aligned with the spine. More specifically, participants experienced a large number of pairs of such rotations, for which one was always slightly more intense than the other.
The order of the smaller and larger rotations was randomized for each pair, and participants had to judge which rotation of each pair was more intense. While the participants performed the task, the blood flow in the prefrontal and parietal areas was measured using a novel technique: functional Near-Infrared Spectroscopy (fNIRS). The scientists then used these recordings to test whether it was possible to predict the participants’ judgments for every single pair of rotations.
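As an illustration (not the authors’ actual analysis pipeline), per-trial prediction of this kind can be framed as binary classification: features extracted from the fNIRS signal on each trial are used to predict which of the two rotations the participant judged as more intense. A minimal sketch with synthetic data and a simple logistic-regression decoder:

```python
import numpy as np

# Hypothetical illustration: decode a binary per-trial judgment
# ("first rotation more intense" vs. "second") from simulated
# fNIRS features using logistic regression fit by gradient descent.

rng = np.random.default_rng(0)

n_trials, n_features = 200, 8           # e.g., mean oxy-Hb change per channel
true_w = rng.normal(size=n_features)    # ground-truth weights (simulation only)

X = rng.normal(size=(n_trials, n_features))
# Judgments depend (noisily) on a weighted sum of the features.
p = 1.0 / (1.0 + np.exp(-(X @ true_w)))
y = (rng.random(n_trials) < p).astype(float)

# Fit logistic regression by gradient descent on the log-loss.
w = np.zeros(n_features)
for _ in range(2000):
    pred = 1.0 / (1.0 + np.exp(-(X @ w)))
    w -= 0.1 * (X.T @ (pred - y)) / n_trials

accuracy = np.mean(((X @ w) > 0) == (y == 1))
print(f"decoding accuracy: {accuracy:.2f}")  # chance level is 0.50
```

In the actual study, any above-chance prediction of this kind would be evidence that the recorded areas carry decision-related information.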
Research on brain activity of participants or patients in motion is scarce, because the readings of common neuroimaging methods, such as electroencephalography (EEG) or functional magnetic resonance imaging (fMRI), are distorted by body motion and electromagnetic interference, such as electric noise in vehicles. This is not the case with fNIRS. Infrared light is emitted through the scalp into the brain tissue, and the light reflected back is measured.
Since the intensity of the infrared light is very low, the method is non-invasive and harmless. Blood flow and oxygen levels increase in active brain regions (the haemodynamic response), which fNIRS captures. From this, it is possible to draw conclusions about activity in these brain areas.
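In fNIRS analysis, measured changes in light attenuation are commonly converted into changes in oxy- and deoxy-hemoglobin concentration via the modified Beer-Lambert law. The sketch below uses illustrative (not calibrated) extinction coefficients and geometry values to show the principle; with two wavelengths, the conversion reduces to solving a 2x2 linear system:

```python
import numpy as np

# Modified Beer-Lambert law (illustrative values, not calibrated):
#   delta_OD(wavelength) = (eps_HbO * dHbO + eps_HbR * dHbR) * d * DPF
# Measuring at two wavelengths gives a 2x2 system for the two unknown
# concentration changes dHbO and dHbR.

# Extinction coefficients [1/(mM*cm)]; rows: ~760 nm, ~850 nm,
# columns: oxy-hemoglobin (HbO), deoxy-hemoglobin (HbR).
E = np.array([[1.5, 3.8],    # at ~760 nm, HbR absorbs more strongly
              [2.5, 1.8]])   # at ~850 nm, HbO absorbs more strongly

d = 3.0     # source-detector separation [cm]
dpf = 6.0   # differential pathlength factor (dimensionless)

delta_od = np.array([0.010, 0.018])               # measured OD changes
delta_c = np.linalg.solve(E * d * dpf, delta_od)  # [dHbO, dHbR] in mM
print("dHbO = %.5f mM, dHbR = %.5f mM" % (delta_c[0], delta_c[1]))
```

With these example numbers, the recovered oxy-hemoglobin change is positive and the deoxy-hemoglobin change slightly negative, the typical signature of a haemodynamic response in an activated region.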
“This method is very promising”, says de Winkel. “Up to now, we had to rely on what the participants could tell us about their perception. Now we get to glance directly into the brain.” The results showed that the participants’ ability to tell the motions apart could be predicted from the fNIRS recordings, indicating that the areas under investigation were indeed involved in decision making on self-motion. The more sensory evidence participants collect, the better they are able to tell two motions apart.
“If we know how the brain makes decisions and what areas are involved, we can relate specific behavioral problems and physical trauma to these areas”, de Winkel explains. Moreover, since conventional neuroimaging techniques are not suitable for use with moving participants, the results are encouraging for the use of fNIRS to perform neuroimaging on participants in moving vehicles and simulators. This might pave the way for a completely new line of research.
Authors: Dr. Ksander de Winkel, Alessandro Nesti, Hasan Ayaz, Heinrich H. Bülthoff
Scientist Dr. Ksander de Winkel
Phone: +49 7071 601-643
Media Liaison Officer Beate Fülle
Head of Communications and Public Relations
Phone: +49 (0)7071 601-777
Max Planck Institute for Biological Cybernetics
The Max Planck Institute for Biological Cybernetics studies the processing of signals and information in the brain. We know that our brain must constantly process an immense wealth of sensory impressions to coordinate our behavior and enable us to interact with our environment. Surprisingly little is known, however, about how our brain actually manages to perceive, recognize, and learn. The scientists at the Max Planck Institute for Biological Cybernetics therefore investigate which signals and processes are necessary to generate a consistent picture of our environment, and the corresponding behavior, from the various streams of sensory information. Scientists from three departments and seven research groups work on fundamental questions of brain research using different approaches and methods.