Throwing a perfect strike in virtual bowling doesn't require your gaming system to precisely track the position and orientation of your swinging arm. But if you're operating a robotic forklift around a factory, manipulating a mechanical arm on an assembly line or guiding a remote-controlled laser scalpel inside a patient, the ability to pinpoint exactly where it is in three-dimensional (3-D) space is critical.
To make that measurement more reliable, a public-private team led by the National Institute of Standards and Technology (NIST) has created a new standard test method to evaluate how well an optical tracking system can define an object's position and orientation--known as its "pose"--with six degrees of freedom: up/down, right/left, forward/backward, pitch, yaw and roll.
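Those six degrees of freedom can be pictured as three translations plus three rotations. As an illustrative sketch only (this is not code from the standard, and the class and field names are our own), a pose might be represented like this:

```python
import math
from dataclasses import dataclass

@dataclass
class Pose6DOF:
    """A minimal 6-DOF pose: three translations plus three rotations (radians)."""
    x: float      # right/left
    y: float      # up/down
    z: float      # forward/backward
    roll: float
    pitch: float
    yaw: float

    def rotation_matrix(self):
        """Z-Y-X (yaw-pitch-roll) rotation matrix as nested lists."""
        cr, sr = math.cos(self.roll), math.sin(self.roll)
        cp, sp = math.cos(self.pitch), math.sin(self.pitch)
        cy, sy = math.cos(self.yaw), math.sin(self.yaw)
        return [
            [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
            [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
            [-sp,     cp * sr,                cp * cr],
        ]

# A pose translated in space and rotated 90 degrees about the vertical axis:
pose = Pose6DOF(x=1.0, y=2.0, z=0.5, roll=0.0, pitch=0.0, yaw=math.pi / 2)
R = pose.rotation_matrix()
```

An optical tracker's job, in these terms, is to recover all six of those numbers for a target at each instant.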
Optical tracking systems work on a principle similar to human stereoscopic vision. A person's two eyes work together to simultaneously take in their surroundings and tell the brain exactly where all of the people and objects within that space are located.
In an optical tracking system, the "eyes" consist of two or more cameras that record the room and are partnered with beam emitters that bounce a signal--infrared, laser or LIDAR (Light Detection and Ranging)--off objects in the area. With both data sources feeding into a computer, the room and its contents can be virtually recreated.
Determining the pose of an object is relatively easy if it doesn't move, and previous performance tests for optical tracking systems relied solely on static measurements. However, for systems such as those used to pilot automated guided vehicle (AGV) forklifts--the robotic beasts of burden found in many factories and warehouses--that isn't good enough. Their "vision" must be 20/20 for both stationary and moving objects to ensure they work efficiently and safely.
To address this need, a recently approved ASTM International standard (ASTM E3064-16) now provides a standard test method for evaluating the performance of optical tracking systems that measure pose in six degrees of freedom for static--and for the first time, dynamic--objects.
NIST engineers helped develop both the tools and procedure used in the new standard. "The tools are two barbell-like artifacts for the optical tracking systems to locate during the test," said NIST electronics engineer Roger Bostelman. "Both artifacts have a 300-millimeter bar at the center, but one has six reflective markers attached to each end while the other has two 3-D shapes called cuboctahedrons [a solid with 8 triangular faces and 6 square faces]." Optical tracking systems can measure the full poses of both targets.
According to Bostelman's colleague, NIST computer scientist Tsai Hong, the test is conducted by having the evaluator walk two defined paths with each artifact--one up and down the test area, the other from left to right. Moving an artifact along the course provides the X-, Y- and Z-axis measurements, while turning it three ways relative to the path provides the pitch, yaw and roll components.
"Our test bed at NIST's Gaithersburg, Maryland, headquarters has 12 cameras with infrared emitters stationed around the room, so we can track the artifact throughout the run and determine its pose at multiple points," Hong said. "And since we know that the reflective markers or the irregular shapes on the artifacts are fixed at 300 millimeters apart, we can calculate and compare with extreme precision the measured distance between those poses."
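The comparison Hong describes relies on the bar's known 300-millimeter length as ground truth: the tracker reports a position for each end of the artifact, and the deviation of the measured separation from 300 mm is the error. A minimal sketch of that check (the function names and the sample coordinates are hypothetical, not taken from the standard):

```python
import math

BAR_LENGTH_MM = 300.0  # known separation of the artifact's two ends

def euclidean(p, q):
    """Straight-line distance between two 3-D points."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def bar_length_error_mm(end_a, end_b):
    """Deviation of the tracker's measured bar length from the known 300 mm."""
    return euclidean(end_a, end_b) - BAR_LENGTH_MM

# Hypothetical tracked positions (in mm) of the two ends at one sample point:
measured_a = (120.00, 455.10, 880.25)
measured_b = (120.00, 455.10, 1180.27)
err = bar_length_error_mm(measured_a, measured_b)
```

Repeating this comparison at many points along the walked paths yields the static and dynamic error figures quoted below.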
Bostelman said that the new standard can evaluate the ability of an optical tracking system to locate things in 3-D space with unprecedented accuracy. "We found that the margin of error is 0.02 millimeters for assessing static performance and 0.2 millimeters for dynamic performance," he said.
Along with robotics, optical tracking systems are at the heart of a variety of applications including virtual reality in flight/medical/industrial training, the motion capture process in film production and image-guided surgical tools.
"The new standard provides a common set of metrics and a reliable, easily implemented procedure that assesses how well optical trackers work in any situation," Hong said.
The E3064-16 standard test method was developed by the ASTM Subcommittee E57.02 on Test Methods, a group with representatives from various stakeholders, including manufacturers of optical tracking systems, research laboratories and industrial companies.
The E3064-16 document detailing construction of the artifacts, setup of the test course, formulas for deriving pose measurement error and the procedure for conducting the evaluation may be found on the ASTM website, http://www.
Michael E. Newman | EurekAlert!