Scientists at Fraunhofer FIT have developed a next-generation noncontact gesture and finger recognition system. The novel system detects hand and finger positions in real time and translates them into appropriate interaction commands. The system requires no special gloves or markers and is capable of supporting multiple users.
The system detects the hands and fingers in real-time. Source: Fraunhofer FIT
With touch screens becoming increasingly popular, classic interaction techniques such as the mouse and keyboard are used less and less. One breakthrough example is the Apple iPhone, released in summer 2007. Since then many other devices featuring touch screens and similar characteristics have been successfully launched, with more advanced devices, such as the Microsoft Surface table, even supporting multiple users simultaneously. In that case the entire tabletop serves as the input surface. However, this form of interaction remains confined to two-dimensional surfaces.
Fraunhofer FIT has developed the next generation of multi-touch environment, one that requires no physical contact and is entirely gesture-based. The system detects multiple fingers and hands simultaneously and allows users to interact with objects on a display: they move their hands and fingers in the air, and the system automatically recognizes and interprets the gestures.
Cinemagoers will remember the 2002 science-fiction thriller Minority Report, starring Tom Cruise. In the film, Cruise works in a 3-D software arena and interacts with numerous programs at unimaginable speed. However, that system relied on special gloves and tracked only three fingers per hand.
The FIT prototype provides a generation of gesture-based interaction well beyond the Minority Report system. It tracks the user's hand in front of a 3-D camera. The camera works on the time-of-flight principle: for each pixel, the system measures how long light takes to travel from the camera to the tracked object and back. From this travel time, the distance between the camera and the object can be calculated.
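The time-of-flight distance calculation described above can be sketched in a few lines. This is a minimal illustration, not Fraunhofer's implementation; the function name and the example round-trip time are assumptions for demonstration:

```python
# Speed of light in a vacuum, metres per second
C = 299_792_458.0

def distance_from_tof(round_trip_time_s: float) -> float:
    """Distance to an object given the measured round-trip travel
    time of light (camera -> object -> camera). The light covers
    twice the distance, hence the division by two."""
    return C * round_trip_time_s / 2.0

# A round trip of 10 nanoseconds corresponds to roughly 1.5 metres.
d = distance_from_tof(10e-9)
```

A real time-of-flight camera performs this measurement per pixel (typically via modulated light and phase differences rather than a direct stopwatch), producing a full depth image at video rate.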
"A special image analysis algorithm was developed which filters out the positions of the hands and fingers. This is achieved in real time through intelligent filtering of the incoming data. The raw data can be viewed as a kind of 3-D mountain landscape, with the peak regions representing the hands or fingers," said Georg Hackenberg, who developed the system as part of his Master's thesis. In addition, plausibility criteria are applied, based on hand size, finger length, and the possible coordinates.
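The "mountain landscape" idea, where peaks in the depth map correspond to fingertips, amounts to finding local maxima in a 2-D height field. The sketch below is a hypothetical simplification of that step, assuming the depth data has already been converted so that points closer to the camera have higher values; it is not the actual FIT algorithm:

```python
import numpy as np

def find_peaks_2d(height: np.ndarray, threshold: float) -> list:
    """Return (row, col) positions of strict local maxima above
    `threshold` in a 2-D height map. Each candidate pixel is compared
    against its 3x3 neighbourhood; border pixels are skipped."""
    rows, cols = height.shape
    peaks = []
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            v = height[r, c]
            if v < threshold:
                continue  # too low to be a hand/finger candidate
            window = height[r - 1:r + 2, c - 1:c + 2]
            # strict maximum: the centre value appears exactly once
            if v >= window.max() and (window == v).sum() == 1:
                peaks.append((r, c))
    return peaks
```

In a full pipeline, candidate peaks would then be filtered by the plausibility criteria the article mentions (hand size, finger length, coordinate consistency) before being interpreted as fingers.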
A user study found the system both easy to use and fun. However, work remains to be done on removing elements that confuse the system, for example reflections caused by wristwatches, or palms positioned orthogonally to the camera.
"With Microsoft announcing Project Natal, it is likely that similar techniques will very soon become standard across the gaming industry. This technology also opens up the potential for new solutions in the range of other application domains, such as the exploration of complex simulation data and for new forms of learning," predicts Prof. Dr. Wolfgang Broll of the Fraunhofer Institute for Applied Information Technology FIT.