This will not only help robots navigate their environments better, it will also enable robot self-perception for the first time. A single robotic arm, already partially equipped with the sensors, demonstrates that the concept works.
Our skin is a communicative wonder: its nerves convey temperature, pressure, shear forces and vibrations – from the finest breath of air to touch and pain. At the same time, the skin is the organ by which we set ourselves apart from our surroundings and distinguish between environment and body. Scientists at TUM are now developing an artificial skin for robots with a similar purpose: it will provide the robot with important tactile information and thus supplement the perception it forms through camera eyes, infrared scanners and gripping hands. As with human skin, a touch on the artificial skin could, for example, trigger a spontaneous retreat (when the robot hits an object) or prompt the machine to use its eyes for the first time to search for the source of contact.
Such behavior is especially important for robotic assistants to people, since they operate in constantly changing environments. From a robot's perspective, even an ordinary apartment, in which objects often change position and people and pets move around, is such an environment. “In contrast to the tactile information provided by the skin, the sense of sight is limited because objects can be hidden,” explains Philip Mittendorfer, a scientist who develops the artificial skin at the Institute of Cognitive Systems at TUM.
The centerpiece of the new robotic shell is a hexagonal circuit board measuring about 5 square centimeters. Each of these small boards carries four infrared sensors that detect anything closer than 1 centimeter. “We thus simulate light touch,” explains Mittendorfer. “This corresponds to our sense of the fine hairs on our skin being gently stroked.” There are also six temperature sensors and an accelerometer. The accelerometer allows the machine to accurately register the movement of individual limbs, for example its arms, and thus to learn which body parts it has just moved. “We try to pack many different sensory modalities into the smallest of spaces,” explains the engineer. “In addition, the circuit boards can easily be expanded later to include other sensors, for example for pressure.”
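The composition of one sensor board can be pictured as a simple data structure. The sketch below is purely illustrative – the class and field names are hypothetical, not part of the TUM design – but it captures the modalities the article lists: four short-range infrared proximity sensors, six temperature sensors, and an accelerometer, with the 1 cm threshold standing in for "light touch".

```python
from dataclasses import dataclass, field

@dataclass
class SkinCell:
    """Hypothetical model of one hexagonal sensor board: four infrared
    proximity sensors, six temperature sensors, and a 3-axis accelerometer.
    New modalities (e.g. pressure) could be added as further fields."""
    cell_id: int
    # Distances reported by the four IR sensors; 10.0 cm here means
    # "nothing nearby" (the real sensors only react below 1 cm).
    proximity_cm: list = field(default_factory=lambda: [10.0] * 4)
    temperature_c: list = field(default_factory=lambda: [20.0] * 6)
    acceleration_g: tuple = (0.0, 0.0, 0.0)  # 3-axis accelerometer reading

    def light_touch(self) -> bool:
        # Anything closer than 1 cm counts as a light touch, analogous
        # to the fine hairs on human skin being gently stroked.
        return any(d < 1.0 for d in self.proximity_cm)

cell = SkinCell(cell_id=1)
cell.proximity_cm[2] = 0.4   # an object approaches one IR sensor
print(cell.light_touch())    # → True
```

Keeping each modality in its own field mirrors the article's point that the boards are easy to extend: a pressure reading would simply become one more field on the cell.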
Plate by plate, the boards are joined into a honeycomb-like, planar structure to be worn by the robot. For the machine to register what the skin senses, the signals from the sensors must reach a central computer. Each sensory module therefore not only passes on its own information but also serves as a data hub for other modules. This happens automatically, ensuring that signals can take alternative routes if a connection fails.
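The fault-tolerant relaying described above can be sketched as path-finding over the cell-to-cell links. The following is a minimal illustration, not the actual TUM firmware: it assumes each cell knows its neighbors and uses a breadth-first search to find a route to the central computer, so that when one link fails, a cell's data still reaches the host through its neighbors.

```python
from collections import deque

def route_to_host(links, start, host="host"):
    """Breadth-first search for a path from one skin cell to the
    central computer over cell-to-cell links. Returns the path as a
    list of node names, or None if the cell is completely cut off."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == host:
            return path
        for nxt in links.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

# A small honeycomb patch: cells A-D, two of them wired to the host.
links = {
    "A": ["B", "C"], "B": ["A", "host"], "C": ["A", "D"],
    "D": ["C", "host"], "host": ["B", "D"],
}
print(route_to_host(links, "A"))   # → ['A', 'B', 'host']

# If the link between B and the host fails, A's data automatically
# takes the alternative route via C and D.
links["B"].remove("host")
links["host"].remove("B")
print(route_to_host(links, "A"))   # → ['A', 'C', 'D', 'host']
```

The point of the sketch is the redundancy: because every cell doubles as a relay, no single broken connection silences the rest of the skin.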
Only a small patch of skin is complete so far. Its 15 sensors, with at least one on each segment of a long robot arm, already show that the principle works: a light pat or tap is enough to make the arm react. “We will close the skin and generate a prototype which is completely enclosed with these sensors and can interact anew with its environment,” says Mittendorfer’s supervisor, Prof. Gordon Cheng, who expounds that this will be “a machine that notices when you tap it on the back… even in the dark.”
The pioneering aspects of the concept do not end with its sensory accomplishments. Beyond this, such machines may someday incorporate fundamental neurobiological capabilities like ours and form a self-impression. The robot has moved a step closer to humanity.