Researchers at Ben-Gurion University of the Negev (BGU) in Israel have developed a new hand gesture recognition system, tested at a Washington, D.C. hospital, that enables doctors to manipulate digital images during medical procedures by motioning instead of touching a screen, keyboard or mouse, contact that compromises sterility and could spread infection, according to a recently published article.
The June article, "A Gesture-based Tool for Sterile Browsing of Radiology Images," in the Journal of the American Medical Informatics Association (2008;15:321-323, DOI 10.1197/jamia.M24), reports what the authors believe is the first time a hand gesture recognition system was successfully implemented in an actual "in vivo" neurosurgical brain biopsy. The system was tested at the Washington Hospital Center in Washington, D.C.
According to lead researcher Juan P. Wachs, a recent Ph.D. recipient from the Department of Industrial Engineering and Management at BGU, "A sterile human-machine interface is of supreme importance because it is the means by which the surgeon controls medical information while avoiding contamination of the patient, the operating room (OR) and the other surgeons. It could replace the touch screens now used in many hospital operating rooms, which must be sealed to prevent the accumulation or spread of contaminants and require smooth surfaces that must be thoroughly cleaned after each procedure, but sometimes aren't. With infection rates at U.S. hospitals now unacceptably high, our system offers a possible alternative."
Helman Stern, a principal investigator on the project and a professor in the Department of Industrial Engineering and Management, explains that Gestix functions in two stages: "[There is] an initial calibration stage, where the machine recognizes the surgeon's hand gestures, and a second stage, where surgeons must learn and implement eight navigation gestures, rapidly moving the hand away from a 'neutral area' and back again. Gestix users even have the option of zooming in and out by moving the hand clockwise or counterclockwise."
To avoid sending unintended signals, users may enter a "sleep" mode by dropping the hand. Gestures for the sterile interface are captured by a Canon VC-C4 camera positioned above a large flat-screen monitor and processed on an Intel Pentium PC with a Matrox Standard II video-capture device.
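The interaction logic described above, a neutral area, eight directional navigation gestures, and a sleep mode entered by dropping the hand, can be illustrated as a small state machine. The sketch below is purely illustrative; the class and method names, and the rule of issuing one command per excursion from the neutral area, are assumptions for clarity, not the published Gestix implementation.

```python
# Illustrative sketch of a Gestix-style gesture state machine.
# Names, states, and the one-command-per-excursion rule are assumptions.

from enum import Enum
from typing import Optional

class Mode(Enum):
    SLEEP = "sleep"        # hand dropped: input is ignored
    NEUTRAL = "neutral"    # hand resting in the neutral area
    GESTURE = "gesture"    # hand has moved out of the neutral area

# The eight navigation directions recognized when the hand leaves
# the neutral area (hypothetical labels).
DIRECTIONS = {"up", "down", "left", "right",
              "up-left", "up-right", "down-left", "down-right"}

class GestixSketch:
    def __init__(self) -> None:
        self.mode = Mode.NEUTRAL
        self.commands: list[str] = []   # navigation commands issued so far

    def update(self, hand_visible: bool, direction: Optional[str]) -> None:
        """Feed one tracked-hand observation per video frame."""
        if not hand_visible:
            self.mode = Mode.SLEEP      # dropping the hand suspends control
        elif direction is None:
            self.mode = Mode.NEUTRAL    # back in the neutral area: re-arm
        elif self.mode == Mode.NEUTRAL and direction in DIRECTIONS:
            self.mode = Mode.GESTURE    # issue one command per excursion
            self.commands.append(direction)
```

Requiring the hand to return to the neutral area between gestures mirrors the rapid out-and-back motion Stern describes, and prevents a single sweep of the hand from firing the same command on every video frame.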
The project lasted two years. In the first year, Juan Wachs worked at IMI in Washington, D.C. as an informatics fellow on the development of the system. During the second year, under a contract between BGU and WHC (Washington Hospital Center), Wachs continued the work at BGU with Professors Helman Stern and Yael Edan, the project's principal investigators.
At BGU, several M.Sc. theses, supervised by Profs. Helman Stern and Yael Edan, have used hand gesture recognition as part of an interface to evaluate how different aspects of interface design affect performance in a variety of tele-robotic and tele-operated systems. Ongoing research aims to expand this work with additional control modes (e.g., voice) to create a multimodal telerobotic control system.
In addition, Dr. Tal Oron and her students are currently using the gesture system to evaluate human performance measures. Further research, based on video motion capture, is being conducted by Prof. Helman Stern and Dr. Tal Oron of the Department of Industrial Engineering and Management and Dr. Amir Shapiro of the Department of Mechanical Engineering. This system, combined with a tactile body display, is intended to help the vision impaired sense their surroundings.
Andrew Lavin | EurekAlert!