Without words – what if computers could intuitively understand us?
The experimental setting is not entirely unlike the popular children’s party game Topfschlagen (“Hit the Pot!”): one child is blindfolded and has to find a hidden pot by “blindly feeling around”. The other children sit in a circle, calling out “hot” or “cold”, depending on how close the blindfolded child is to the pot.
Dr. Thorsten Zander, postdoc in the research group for Biopsychology and Neuroergonomics led by Prof. Dr. Klaus Gramann at TU Berlin, working in collaboration with Laurens Krol (TU Berlin) and Prof. Dr. Nils Birbaumer (University of Tübingen), conducted an experiment with his participants based on a similar principle: the participants were instructed to watch a computer screen showing a flashing cursor that randomly jumps through a grid of 16 nodes toward a pre-specified target in one of the corners.
Each participant wore a headset with a network of several electrodes. Their brainwaves were captured by a so-called Brain-Computer Interface (BCI) and sent to a special software application to be analyzed and evaluated. The participants were given one single task: watch the cursor as it randomly jumps around.
“The results were spectacular: over time, as the participants simply watched the screen, the cursor found the target more and more quickly. In the first round, it needed 27 jumps on average to reach the objective. But in subsequent rounds, it could do it in 13,” reported Thorsten Zander, a mathematician by training.
The computer “learns” from the participants without requiring them to participate intentionally – or without them necessarily even knowing. The participants’ brains naturally play the role of the children in the party game, unknowingly revealing information to the computer about which movements are “hot” (when the cursor moves toward the target) and which are “cold” (when the cursor moves away from the target).
“We were able to show for the very first time that a passive brain-computer interface is capable of detecting unconscious brain signals, analyzing them, and turning them into an actionable instruction for the computer,” explained Thorsten Zander. The results have now been published in the prestigious journal “Proceedings of the National Academy of Sciences” (PNAS).
“The brain activity used by the computer to determine the accuracy of the motion of the cursor is emitted by the medial prefrontal cortex region of the brain. We know from the literature that precisely this area of the brain is where so-called ‘predictive coding’ takes place.” ‘Predictive coding’ describes the brain’s tendency to construct a specific model of its surroundings and continuously make predictions about what will happen next in order to be able to respond adequately. For example, this allows people to anticipate the trajectory of a falling cup within a split second, so that they can catch it before it hits the floor.
“When the participants look at the flashing cursor, which they do not know they can influence, and see that it moves in the ‘hot’ direction – toward the target – the brain’s prediction is confirmed, resulting in a certain ‘peak’ in their brain activity. If the cursor jumps in the ‘cold’ direction, the brain’s prediction is rejected, which creates a different peak,” explained Thorsten Zander. Each type of peak is detected by the BCI and converted into motion commands for the cursor by a special algorithm. The algorithm deduces the direction of the target from the brain’s unconscious reactions.
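The inference step described above can be sketched in simplified form. In this toy simulation – not the researchers' actual implementation – a stand-in function (`simulated_bci`, a hypothetical name) plays the role of the EEG classifier, labeling each random jump “hot” or “cold”, and the algorithm scores each candidate corner by how consistent it is with those labels; the true target is the only corner that agrees with every label. In the real system, the labels would of course come from classified brain signals, not from computed distances:

```python
import random

GRID = 4
CORNERS = [(0, 0), (0, GRID - 1), (GRID - 1, 0), (GRID - 1, GRID - 1)]

def dist(a, b):
    # Manhattan distance between two grid nodes
    return abs(a[0] - b[0]) + abs(a[1] - b[1])

def simulated_bci(before, after, target):
    # Stand-in for the BCI classifier: the brain's "prediction confirmed"
    # peak ('hot') fires when a jump moves the cursor toward the target.
    return "hot" if dist(after, target) < dist(before, target) else "cold"

def corner_scores(jumps, labels):
    # Score each candidate corner by how consistent it is with the
    # observed hot/cold labels; the true target matches every label.
    scores = {c: 0 for c in CORNERS}
    for (before, after), label in zip(jumps, labels):
        for c in CORNERS:
            closer = dist(after, c) < dist(before, c)
            if (label == "hot") == closer:
                scores[c] += 1
    return scores

random.seed(0)
target = (GRID - 1, GRID - 1)   # known only to the simulated classifier
pos, jumps, labels = (0, 0), [], []
for _ in range(60):             # observe 60 random jumps
    nxt = (random.randrange(GRID), random.randrange(GRID))
    jumps.append((pos, nxt))
    labels.append(simulated_bci(pos, nxt, target))
    pos = nxt

scores = corner_scores(jumps, labels)
print("inferred target:", max(scores, key=scores.get))
```

After enough observed jumps, the target corner accumulates the highest consistency score, so the algorithm can then bias the cursor's movements toward it – which is why, in the experiment, the cursor reached the target faster in later rounds.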
Thorsten Zander’s vision is not just to move cursors reliably across the screen. He is working toward a new type of neuroadaptive technology: “We’ve had computers for around 60 years. In that time, the performance of these computers has grown exponentially, but the interaction between humans and computers remains limited by the bottleneck of communicating human intentions to the machine by pressing keys or moving the mouse.
“In this paper, we demonstrated for the first time that, after a suitable calibration phase, passive brain-computer interfaces can do more than simply detect yes/no decisions from our brain activity. They can reconstruct a model of complex thought processes from it and extract various pieces of information, allowing the computer to independently deduce machine instructions without requiring conscious human input.” This opens completely new avenues for interaction: consider for example how computers are already capable of suggesting frequently visited websites. In Thorsten Zander’s vision, the computers of the future might be able to use the current peaks in our brain activity to determine whether we are more likely to go online shopping or look for work.
This also carries serious ethical implications. Because of this and other similar work, a large conference on neuroadaptive technologies will be held in Berlin in July 2017, focusing on the ethical aspects of this technology, among other things.
One interesting question remains unanswered: can participants suppress the unconscious information revealed to the computer by their brain activity via the BCI? “That’s exactly what our next project will attempt to find out. It has just now been approved by the DFG,” said Thorsten Zander.
For further information please contact:
Dr. Thorsten Zander
Biological Psychology and Neuroergonomics
Stefanie Terp | idw - Informationsdienst Wissenschaft