Both the hand-gesture recognition and robotic nurse innovations might help to reduce the length of surgeries and the potential for infection, said Juan Pablo Wachs, an assistant professor of industrial engineering at Purdue University.
The "vision-based hand gesture recognition" technology could have other applications, including the coordination of emergency response activities during disasters.
"It's a concept Tom Cruise demonstrated vividly in the film 'Minority Report,'" Wachs said.
Surgeons routinely need to review medical images and records during surgery, but stepping away from the operating table and touching a keyboard and mouse can delay the surgery and increase the risk of spreading infection-causing bacteria.
The new approach is a system that uses a camera and specialized algorithms to recognize hand gestures as commands to instruct a computer or robot.
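At its core, such a system maps each recognized gesture label to an imaging or instrument command. The gesture names and commands below are illustrative assumptions, not the published system's vocabulary — a minimal dispatch sketch:

```python
# Minimal sketch: mapping recognized gesture labels to imaging commands.
# Gesture names and commands are illustrative, not from the published system.

COMMANDS = {
    "swipe_left": "previous_image",
    "swipe_right": "next_image",
    "pinch": "zoom_in",
    "spread": "zoom_out",
}

def dispatch(gesture: str) -> str:
    """Translate a recognized gesture into an image-browser command."""
    return COMMANDS.get(gesture, "ignore")

print(dispatch("swipe_right"))  # next_image
print(dispatch("wave"))         # ignore: unrecognized gestures are dropped
```

Dropping unknown gestures to a safe "ignore" default, rather than guessing, matters in a sterile environment where a wrong command costs time.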
At the same time, a robotic scrub nurse represents a potential new tool that might improve operating-room efficiency, Wachs said.
Findings from the research will be detailed in a paper appearing in the February issue of Communications of the ACM, the flagship publication of the Association for Computing Machinery. The paper, featured on the journal's cover, was written by researchers at Purdue, the Naval Postgraduate School in Monterey, Calif., and Ben-Gurion University of the Negev, Israel.
Research into hand-gesture recognition began several years ago in work led by the Washington Hospital Center and Ben-Gurion University, where Wachs was a research fellow and doctoral student, respectively.
He is now working to extend the system's capabilities in research with Purdue's School of Veterinary Medicine and the Department of Speech, Language, and Hearing Sciences.
"One challenge will be to develop the proper shapes of hand poses and the proper hand trajectory movements to reflect and express certain medical functions," Wachs said. "You want to use intuitive and natural gestures for the surgeon, to express medical image navigation activities, but you also need to consider cultural and physical differences between surgeons. They may have different preferences regarding what gestures they may want to use."
Other challenges include providing computers with the ability to understand the context in which gestures are made and to discriminate between intended gestures versus unintended gestures.
"Say the surgeon starts talking to another person in the operating room and makes conversational gestures," Wachs said. "You don't want the robot handing the surgeon a hemostat."
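One common way to filter out conversational hand movements — a sketch of the general idea, not the paper's specific method — is to act on a gesture only after it has been held steadily for several consecutive frames:

```python
from collections import deque

class GestureFilter:
    """Accept a gesture only after it appears in N consecutive frames,
    so brief conversational hand movements are ignored."""

    def __init__(self, n_frames=15):
        self.n = n_frames
        self.recent = deque(maxlen=n_frames)

    def update(self, label):
        """Feed one per-frame gesture label; return it once it is sustained."""
        self.recent.append(label)
        if (label is not None
                and len(self.recent) == self.n
                and len(set(self.recent)) == 1):
            return label  # held long enough: treat as intended
        return None       # still ambiguous: do nothing

f = GestureFilter(n_frames=3)
f.update("point")            # None: seen once
f.update("point")            # None: seen twice
print(f.update("point"))     # point: sustained for 3 frames, now intended
```

The frame threshold trades responsiveness against false triggers; at 30 frames per second, 15 frames is roughly half a second of deliberate holding.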
A scrub nurse assists the surgeon and hands the proper surgical instruments to the doctor when needed.
"While it will be very difficult using a robot to achieve the same level of performance as an experienced nurse who has been working with the same surgeon for years, often scrub nurses have had very limited experience with a particular surgeon, maximizing the chances for misunderstandings, delays and sometimes mistakes in the operating room," Wachs said. "In that case, a robotic scrub nurse could be better."
The Purdue researcher has developed a prototype robotic scrub nurse, in work with faculty in the university's School of Veterinary Medicine.
Researchers at other institutions developing robotic scrub nurses have focused on voice recognition. However, little work has been done in the area of gesture recognition, Wachs said.
"Another big difference between our focus and the others is that we are also working on prediction, to anticipate what images the surgeon will need to see next and what instruments will be needed," he said.
Wachs is developing advanced algorithms that isolate the hands and apply "anthropometry," or predicting the position of the hands based on knowledge of where the surgeon's head is. The tracking is achieved through a camera mounted over the screen used for visualization of images.
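The anthropometric idea can be sketched as follows: given the detected head position in the image, restrict the hand search to a region predicted from body proportions. The ratios below are rough assumptions for illustration, not values from the research:

```python
# Illustrative sketch of anthropometric hand prediction: given the detected
# head position and size in the image, restrict the hand search to a region
# predicted from body proportions. The ratios are assumed, not measured.

def hand_search_region(head_x, head_y, head_height_px):
    """Return a bounding box (x0, y0, x1, y1), in pixels, below and beside
    the head where the hands are most likely to appear."""
    arm_reach = 3.0 * head_height_px   # assumed lateral reach vs. head size
    torso_drop = 1.5 * head_height_px  # hands usually appear below the shoulders
    return (head_x - arm_reach, head_y + torso_drop,
            head_x + arm_reach, head_y + torso_drop + arm_reach)

# Head detected at (320, 80), 60 px tall, in a camera mounted over the screen:
box = hand_search_region(320, 80, 60)
print(box)  # (140.0, 170.0, 500.0, 350.0)
```

Searching only this region instead of the full frame is what keeps the tracking fast enough to follow a surgeon's hands in real time.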
"Another contribution is that by tracking a surgical instrument inside the patient's body, we can predict the most likely area that the surgeon may want to inspect using the electronic image medical record, and therefore saving browsing time between the images," Wachs said. "This is done using a different sensor mounted over the surgical lights."
The hand-gesture recognition system uses a new type of camera developed by Microsoft, called Kinect, which senses three-dimensional space. The camera is used in new consumer gaming systems that can track a player's hands without a handheld wand.
"You just step into the operating room, and automatically your body is mapped in 3-D," he said.
Accuracy and gesture-recognition speed depend on advanced software algorithms.
"Even if you have the best camera, you have to know how to program the camera, how to use the images," Wachs said. "Otherwise, the system will work very slowly."
The research paper defines a set of requirements, including recommendations that the system should:
* Use a small vocabulary of simple, easily recognizable gestures.
* Not require the user to wear special virtual reality gloves or certain types of clothing.
* Be as low-cost as possible.
* Be responsive and able to keep up with the speed of a surgeon's hand gestures.
* Let the user know whether it understands the hand gestures by providing feedback, perhaps just a simple "OK."
* Use gestures that are easy for surgeons to learn, remember and carry out with little physical exertion.
* Be highly accurate in recognizing hand gestures.
* Use intuitive gestures, such as two fingers held apart to mimic a pair of scissors.
* Be able to disregard unintended gestures by the surgeon, perhaps made in conversation with colleagues in the operating room.
* Be able to quickly configure itself to work properly in different operating rooms, under various lighting conditions and other criteria.
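Two of the requirements above — a small fixed vocabulary and explicit feedback to the user — can be sketched together in a few lines. The gesture names and confidence threshold are illustrative assumptions:

```python
# Toy sketch of two requirements from the list above: a small fixed gesture
# vocabulary, and feedback so the user knows whether a gesture was understood.
# Gesture names and the threshold are illustrative assumptions.

VOCABULARY = {"next", "previous", "zoom_in", "zoom_out", "rotate"}

def recognize(label, confidence, threshold=0.9):
    """Accept only high-confidence gestures from the known vocabulary,
    and always report back what happened."""
    if label in VOCABULARY and confidence >= threshold:
        return f"OK: {label}"
    return "ignored"

print(recognize("next", 0.95))        # OK: next
print(recognize("next", 0.5))         # ignored: low confidence
print(recognize("handshake", 0.99))   # ignored: not in the vocabulary
```

A small vocabulary keeps recognition accurate and the gestures easy to remember, while the "OK" response gives the surgeon the confirmation the requirements call for.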
"Eventually we also want to integrate voice recognition, but the biggest challenges are in gesture recognition," Wachs said. "Much is already known about voice recognition."
The work is funded by the U.S. Agency for Healthcare Research and Quality. The article is accessible online at http://cacm.acm.org/magazines/2011/2/104397-vision-based-hand-gesture-applications/fulltext
Writer: Emil Venere, 765-494-4709, firstname.lastname@example.org
Source: Juan Pablo Wachs, 765-496-7380, email@example.com
Emil Venere | EurekAlert!