With almost all of the U.S. population armed with cellphones – and close to 80 percent carrying a smartphone – mobile phones have become second nature for most people.
What's coming next, say University of Washington researchers, is the ability to interact with our devices not just with touchscreens, but through gestures in the space around the phone. Some smartphones are starting to incorporate 3-D gesture sensing based on cameras, for example, but cameras consume significant battery power and require a clear view of the user's hands.
UW engineers have developed a new form of low-power wireless sensing technology that could soon contribute to this growing field by letting users "train" their smartphones to recognize and respond to specific hand gestures near the phone.
The technology – developed in the labs of Matt Reynolds and Shwetak Patel, UW associate professors of electrical engineering and of computer science and engineering – uses the phone's wireless transmissions to sense nearby gestures, so it works when a device is out of sight in a pocket or bag and could easily be built into future smartphones and tablets.
"Today's smartphones have many different sensors built in, ranging from cameras to accelerometers and gyroscopes that can track the motion of the phone itself," Reynolds said. "We have developed a new type of sensor that uses the reflection of the phone's own wireless transmissions to sense nearby gestures, enabling users to interact with their phones even when they are not holding the phone, looking at the display or touching the screen."
Team members will present their project, called SideSwipe, and a related paper Oct. 8 at the Association for Computing Machinery's Symposium on User Interface Software and Technology in Honolulu.
When a person makes a call or an app exchanges data with the Internet, a phone transmits radio signals on a 2G, 3G or 4G cellular network to communicate with a cellular base station. When a user's hand moves through space near the phone, the user's body reflects some of the transmitted signal back toward the phone.
The new system uses multiple small antennas to capture the changes in the reflected signal and classify the changes to detect the type of gesture performed. In this way, tapping, hovering and sliding gestures could correspond to various commands for the phone, such as silencing a ring, changing which song is playing or muting the speakerphone. Because the phone's wireless transmissions pass easily through the fabric of clothing or a handbag, the system works even when the phone is stowed away.
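The paper describes the pipeline only at a high level – capture per-antenna changes in the reflected signal, then classify them into gesture types after a per-user training phase. As a rough illustration of that idea (not the authors' actual algorithm), the sketch below featurizes each antenna's amplitude trace with simple statistics and uses a nearest-centroid classifier trained on a few user-supplied examples; all names and features here are hypothetical.

```python
import numpy as np

def extract_features(traces):
    """Illustrative features: per-antenna mean and standard deviation
    of the reflected-signal amplitude.

    traces: array of shape (n_antennas, n_samples).
    """
    return np.concatenate([traces.mean(axis=1), traces.std(axis=1)])

class NearestCentroidGesture:
    """Minimal per-user 'training': store one feature centroid per
    gesture label, then classify new traces by nearest centroid."""

    def __init__(self):
        self.centroids = {}

    def train(self, label, example_traces):
        # Average the feature vectors of the user's training examples.
        feats = np.stack([extract_features(t) for t in example_traces])
        self.centroids[label] = feats.mean(axis=0)

    def classify(self, traces):
        # Return the gesture label whose centroid is closest in
        # feature space (Euclidean distance).
        f = extract_features(traces)
        return min(self.centroids,
                   key=lambda lbl: np.linalg.norm(f - self.centroids[lbl]))
```

In this toy setup, a steady "hover" (roughly constant reflected amplitude) and a brief "tap" (a short spike) separate cleanly on mean and variance alone; the real system would work with far noisier cellular-band reflections and a richer classifier.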
"This approach allows us to make the entire space around the phone an interaction space, going beyond a typical touchscreen interface," Patel said. "You can interact with the phone without even seeing the display by using gestures in the 3-D space around the phone."
A group of 10 study participants tested the technology by performing 14 different hand gestures – including hovering, sliding and tapping – in various positions around a smartphone. The phone was first calibrated to each user's hand movements, then trained to recognize and respond to the gestures. The team found the smartphone recognized gestures with about 87 percent accuracy.
Other gesture-based technologies exist, such as AllSee and WiSee, recently developed at the UW, but the researchers say the new approach offers important advantages.
"SideSwipe's directional antenna approach makes interaction with the phone completely self-contained, because you're not depending on anything in the environment other than the phone's own transmissions," Reynolds said. "Because the SideSwipe sensor is based only on low-power receivers and relatively simple signal processing compared with video from a camera, we expect SideSwipe would have a minimal impact on battery life."
The team has filed patents on the technology and will continue developing SideSwipe, integrating the hardware and making a "plug and play" device that could be built into smartphones, said Chen Zhao, project lead and a UW doctoral student in electrical engineering.
Other co-authors are Ke-Yu Chen, a UW doctoral student in electrical engineering, and Md Tanvir Islam Aumi, a doctoral student in computer science and engineering.
This research was funded by the UW.
For more information, contact Reynolds at email@example.com or 206-616-5046.
Project website: http://www.keyuc.com/research/SideSwipe/
Research paper: http://www.keyuc.com/research/SideSwipe/SideSwipe_UIST2014.pdf
Project video: http://youtu.be/KN3GWZ8pt4w
Posted with video, images: http://www.washington.edu/news/2014/09/19/reflected-smartphone-transmissions-enable-gesture-control/
Michelle Ma | EurekAlert!