Future AR/VR controllers could be the palm of your hand
Carnegie Mellon University’s EgoTouch creates simple interfaces for virtual and augmented reality.
The new generation of augmented and virtual reality controllers may not just fit in the palm of your hand. They could be the palm of your hand.
A recent paper by researchers in Carnegie Mellon University’s Human-Computer Interaction Institute introduces EgoTouch, a tool that uses AI to control AR/VR interfaces by touching your skin with a finger.
The team ultimately wanted to design a control that would provide tactile feedback using only the sensors that come with a standard AR/VR headset. OmniTouch, a previous method developed by Chris Harrison, an associate professor in the HCII and director of the Future Interfaces Group, came close, but it required a special, clunky depth-sensing camera. Vimal Mollyn, a Ph.D. student advised by Harrison, had the idea of using machine learning to recognize touch from the footage of an ordinary camera.
“Try taking your finger and see what happens when you touch your skin with it. You’ll notice that there are these shadows and local skin deformations that only occur when you’re touching the skin,” Mollyn said. “If we can see these, then we can train a machine learning model to do the same, and that’s essentially what we did.”
Mollyn collected the data for EgoTouch by using a custom touch sensor that ran along the underside of the index finger and the palm. The sensor collected data on different types of touch at different forces while staying invisible to the camera. The model then learned to correlate the visual features of shadows and skin deformations to touch and force without human annotation. The team broadened its training data collection to include 15 users with different skin tones and hair densities and gathered hours of data across many situations, activities and lighting conditions.
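In rough terms, that pipeline amounts to automatically labeling each camera frame with the hidden sensor's reading and training a model to predict touch state and force from the image alone. The sketch below is a minimal, hypothetical illustration of that idea in Python with PyTorch; the dataset fields, network architecture and training setup are assumptions for clarity, not the authors' actual code.

```python
# Hypothetical sketch of EgoTouch-style training: headset camera frames are paired
# with readings from a hidden touch sensor, so no human annotation is required.
# Names, shapes and the tiny CNN are illustrative assumptions only.
import torch
import torch.nn as nn
from torch.utils.data import Dataset, DataLoader

class SensorLabeledFrames(Dataset):
    """Pairs each camera frame with the touch-sensor reading captured at the same time."""
    def __init__(self, frames, touch_states, forces):
        self.frames = frames              # (N, 3, H, W) crops around the fingertip
        self.touch_states = touch_states  # (N,) 0 = hovering, 1 = touching
        self.forces = forces              # (N,) normalized contact force from the sensor

    def __len__(self):
        return len(self.frames)

    def __getitem__(self, i):
        return self.frames[i], self.touch_states[i], self.forces[i]

class TouchNet(nn.Module):
    """Small CNN that predicts touch state and force from shadow/deformation cues."""
    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.touch_head = nn.Linear(32, 1)  # binary touch / no-touch logit
        self.force_head = nn.Linear(32, 1)  # continuous force estimate

    def forward(self, x):
        feats = self.backbone(x)
        return self.touch_head(feats).squeeze(-1), self.force_head(feats).squeeze(-1)

def train_one_epoch(model, loader, optimizer):
    bce, mse = nn.BCEWithLogitsLoss(), nn.MSELoss()
    for frames, touch, force in loader:
        touch_logit, force_pred = model(frames)
        loss = bce(touch_logit, touch.float()) + mse(force_pred, force.float())
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

if __name__ == "__main__":
    # Random tensors stand in for the sensor-labeled recordings described above.
    frames = torch.randn(256, 3, 64, 64)
    touch = torch.randint(0, 2, (256,))
    force = torch.rand(256)
    model = TouchNet()
    loader = DataLoader(SensorLabeledFrames(frames, touch, force), batch_size=32, shuffle=True)
    train_one_epoch(model, loader, torch.optim.Adam(model.parameters(), lr=1e-3))
```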
EgoTouch can detect touch with more than 96% accuracy and has a false positive rate of around 5%. It recognizes pressing down, lifting up and dragging. The model can also classify whether a touch was light or hard with 98% accuracy.
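For readers unfamiliar with the terms, accuracy and false positive rate can be scored by comparing the camera-based predictions against the ground-truth touch sensor. The snippet below is only an illustration of those two definitions, not the paper's evaluation code.

```python
# Illustrative only: scoring touch detection against ground-truth sensor labels.
import numpy as np

def touch_metrics(predicted, actual):
    """predicted/actual: boolean arrays, True = frame classified/labeled as a touch."""
    predicted = np.asarray(predicted, dtype=bool)
    actual = np.asarray(actual, dtype=bool)
    accuracy = np.mean(predicted == actual)
    # False positive rate: fraction of genuine no-touch frames flagged as touches.
    fpr = np.mean(predicted[~actual]) if (~actual).any() else 0.0
    return accuracy, fpr

acc, fpr = touch_metrics([1, 0, 1, 1, 0, 0], [1, 0, 0, 1, 0, 0])
print(f"accuracy={acc:.2f}, false positive rate={fpr:.2f}")
```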
“That can be really useful for having a right-click functionality on the skin,” Mollyn said.
Detecting variations in touch could enable developers to mimic touchscreen gestures on our skin. For example, your smartphone can recognize when you scroll up or down a page, zoom in, swipe right, or press and hold on an icon. To translate this to a skin-based interface, the camera needs to recognize the subtle differences between the type of touch and the force of touch.
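One plausible way a developer might turn the per-frame output (touch down, drag, lift, plus the light/hard classification) into touchscreen-style gestures is a small state machine like the hypothetical sketch below. The thresholds, event names and the "hard press as right-click" mapping are assumptions for illustration, not part of the published system.

```python
# Hypothetical mapping from per-frame touch output to touchscreen-style gestures.
from dataclasses import dataclass

@dataclass
class TouchFrame:
    touching: bool
    x: float          # fingertip position on the skin, in millimeters
    y: float
    hard_press: bool  # output of the light/hard force classifier

class GestureRecognizer:
    HOLD_SECONDS = 0.6   # assumed press-and-hold threshold
    DRAG_MM = 5.0        # assumed movement threshold before a touch counts as a drag

    def __init__(self):
        self.down_at = None
        self.start = None
        self.moved = False

    def update(self, frame: TouchFrame, now: float):
        """Feed one frame with a timestamp; returns a gesture name when one completes."""
        if frame.touching and self.down_at is None:
            # Finger just pressed down.
            self.down_at, self.start, self.moved = now, (frame.x, frame.y), False
            return "right_click" if frame.hard_press else None
        if frame.touching and self.down_at is not None:
            # Finger is dragging; fire a scroll/swipe once it moves far enough.
            dx, dy = frame.x - self.start[0], frame.y - self.start[1]
            if not self.moved and (dx * dx + dy * dy) ** 0.5 > self.DRAG_MM:
                self.moved = True
                return "scroll" if abs(dy) > abs(dx) else "swipe"
            return None
        if not frame.touching and self.down_at is not None:
            # Finger lifted: decide between tap and press-and-hold.
            held, self.down_at = now - self.down_at, None
            if self.moved:
                return None
            return "press_and_hold" if held > self.HOLD_SECONDS else "tap"
        return None
```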
Accuracies were about the same across diverse skin tones and hair densities, and at different locations on the hand and forearm, such as the front of the arm, the back of the arm, the palm and the back of the hand. The system did not perform well on bony areas like the knuckles.
“It’s probably because there wasn’t as much skin deformation in those areas,” Mollyn said. “As a user interface designer, what you can do is avoid placing elements on those regions.”
Mollyn is exploring ways to use night vision cameras and nighttime illumination to enable the EgoTouch system to work in the dark. He’s also collaborating with researchers to extend this touch-detection method to surfaces other than the skin.
“For the first time, we have a system that just uses a camera that is already in all the headsets. Our models are calibration free, and they work right out of the box,” said Mollyn. “Now we can build off prior work on on-skin interfaces and actually make them real.”
Article Title: EgoTouch: On-Body Touch Input Using AR/VR Headset Cameras
Article Publication Date: 11-Oct-2024
Media Contact
Aaron Aupperlee
Carnegie Mellon University
aaupperlee@cmu.edu
Office: 412-268-9068