Prototype systems evaluated by NIST performed surprisingly well for a developing technology: half of the prototypes were accurate at least 80 percent of the time, and one had a near-perfect score. Automating the manual portion of the work frees trained examiners to concentrate on the very difficult images that the software has little hope of processing.
As any TV crime series fan knows, latent prints are left behind any time someone touches something. While ubiquitous, “latents” often include only part of the finger—maybe just a few ridges—and sometimes are left on textured materials, adding even more challenges.
To identify the owner, a fingerprint examiner must first carefully mark the distinguishing features of the full or partial print, beginning with the positions where ridges end or branch. Then the latent is entered into a counter-terrorist or law enforcement identification system such as the Federal Bureau of Investigation’s Integrated Automated Fingerprint Identification System (IAFIS). The FBI’s system compares latents against the 55 million sets of ten-print cards taken at arrest.
IAFIS was a significant advance. Now the manual, mark-up portion of latent fingerprint identification is being automated with an emerging technology called Automatic Feature Extraction and Matching (AFEM). NIST biometric researchers assessed prototypes that eight vendors are developing.
In the evaluation, researchers used a data set of 835 latent prints and 100,000 fingerprints that had been used in real case examinations.
The AFEM software extracted the distinguishing features of the latent prints, then compared them against the 100,000 fingerprints. For each latent, the software returned a list of 50 candidate matches that fingerprint specialists then compared by hand; most correct identities appeared within the top 10.
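The candidate-list step described above can be sketched in a few lines of code. This is a purely illustrative toy, not any vendor's algorithm: minutiae are modeled here as made-up tuples, and similarity is a simple Jaccard index over feature sets, where real AFEM software uses far more sophisticated spatial matching.

```python
def jaccard(a, b):
    """Jaccard similarity between two feature sets (0.0 to 1.0)."""
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

def rank_candidates(latent, gallery, top_k=50):
    """Score the latent against every gallery print and return the
    top_k record identifiers, best match first."""
    scored = sorted(gallery.items(),
                    key=lambda kv: jaccard(latent, kv[1]),
                    reverse=True)
    return [record_id for record_id, _ in scored[:top_k]]

# Toy gallery keyed by hypothetical ten-print record IDs; each minutia
# is an invented (x, y, type) tuple standing in for real feature data.
gallery = {
    "record-001": {(10, 22, "end"), (31, 40, "bifurcation"), (55, 8, "end")},
    "record-002": {(12, 90, "end"), (70, 14, "bifurcation")},
    "record-003": {(10, 22, "end"), (90, 5, "end"), (44, 44, "bifurcation")},
}
# A partial latent sharing two minutiae with record-001, one with record-003.
latent = {(10, 22, "end"), (31, 40, "bifurcation")}
candidates = rank_candidates(latent, gallery, top_k=2)  # ["record-001", "record-003"]
```

The examiner's role in the evaluation corresponds to inspecting that ranked list by hand, which is why most identities being found within the top 10 of 50 matters in practice.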
In order of performance, the most accurate prototypes were furnished by NEC Corp., Cogent Inc., SPEX Forensics Inc., Motorola Inc., and L1 Identity Solutions. Results ranged from nearly 100 percent for the most accurate product to around 80 percent for the last three listed.
The evaluations also showed, across all prototypes, a strong correlation between the number of distinguishing features in a latent print and the likelihood of a correct match, and that the quality of the image data strongly influences accuracy.
“While the testing has demonstrated accuracy beyond pre-test expectations, the potential of the technology remains undefined and further testing is required,” said computer scientist Patrick Grother. “In the future we will look at lower quality latent images to understand the technology’s limitations and we will support development of a standardized feature set that extends the one currently used by examiners for searches.”
The research was funded by the Department of Homeland Security’s Science and Technology Directorate and the FBI’s Criminal Justice Information Services Division. The report, An Evaluation of Automated Latent Fingerprint Identification Technologies, is available at http://fingerprint.nist.gov/latent/NISTIR_7577_ELFT_PhaseII.pdf
Evelyn Brown | EurekAlert!