A study by Illinois researchers demonstrates that intensive screening of all passengers actually makes the system less secure by overtaxing security resources.
University of Illinois computer science and mathematics professor Sheldon H. Jacobson, in collaboration with Adrian J. Lee at the Central Illinois Technology and Education Research Institute, explored the benefit of matching passenger risk with security assets. The pair detailed their work in the journal Transportation Science.
“A natural tendency, when limited information is available about from where the next threat will come, is to overestimate the overall risk in the system,” Jacobson said. “This actually makes the system less secure by over-allocating security resources to those in the system that are low on the risk scale relative to others in the system.”
When the population risk is overestimated, a larger proportion of high-risk passengers is designated for too little screening while a larger proportion of low-risk passengers is subjected to too much. With security resources devoted to the many low-risk passengers, fewer resources remain to identify or address high-risk passengers. Nevertheless, current policies favor broad screening.
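The effect of overestimation can be illustrated with a minimal simulation. This is a hypothetical sketch, not the authors' model: passenger risk levels, the screening budget, and the inflation amount are all assumed values chosen to show the mechanism. A fixed screening budget is split in proportion to each passenger's estimated risk; when low-risk passengers' estimates are inflated, the share of the budget reaching genuinely high-risk passengers shrinks.

```python
import random

random.seed(0)

# Hypothetical population (assumed numbers): 1,000 passengers,
# roughly 2% genuinely high-risk (0.9), the rest low-risk (0.05).
true_risk = [0.9 if random.random() < 0.02 else 0.05 for _ in range(1000)]

BUDGET = 100.0  # total screening resource units (assumed)

def allocate(estimated_risk, budget=BUDGET):
    """Split the screening budget in proportion to estimated risk."""
    total = sum(estimated_risk)
    return [budget * r / total for r in estimated_risk]

# Accurate estimates vs. overestimates that inflate everyone's risk.
accurate = allocate(true_risk)
inflated = allocate([min(r + 0.4, 1.0) for r in true_risk])

def screening_on_high_risk(alloc):
    """Resources that actually reach the high-risk passengers."""
    return sum(a for a, r in zip(alloc, true_risk) if r > 0.5)

print(screening_on_high_risk(accurate))  # larger share of the budget
print(screening_on_high_risk(inflated))  # smaller share: overestimation
                                         # flattens the allocation
```

Because the budget is fixed, inflating every estimate compresses the relative differences between passengers, so the high-risk few receive a smaller slice, which is the misallocation the study describes.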
“One hundred percent checked baggage screening and full-body scanning of all passengers is the antithesis of a risk-based system,” Jacobson said. “It treats all passengers and their baggage as high-risk threats. The cost of such a system is prohibitive, and it makes the air system more vulnerable to successful attacks by sub-optimally allocating security assets.”
In an effort to address this problem, the Transportation Security Administration (TSA) introduced a pre-screening program in 2011, available to select passengers on a trial basis. Jacobson’s previous work has indicated that resources could be more effectively invested if the lowest-risk segments of the population – frequent travelers, for instance – could pass through security with less scrutiny since they are “known” to the system.
A challenge with implementing such a system is accurately assessing the risk of each passenger and using such information appropriately. In the new study, Jacobson and Lee developed three algorithms dealing with risk uncertainty in the passenger population. Then, they ran simulations to demonstrate how their algorithms, applied to a risk-based screening method, could estimate risk in the overall passenger population – instead of focusing on each individual passenger – and how errors in this estimation procedure can be mitigated to reduce the risk to the overall system.
They found that risk-based screening, such as the TSA’s new Pre-Check program, increases the overall expected security. Rating a passenger’s risk relative to the entire flying population allows more resources to be devoted to passengers with a high risk relative to the passenger population.
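A relative ranking of this kind can be sketched in a few lines. This is an assumed illustration, not the TSA's or the authors' actual algorithm: the tier names, the top-fraction cutoff, and the scores are invented for the example. Passengers are ranked against the whole population, and only the top fraction by relative risk is routed to enhanced screening.

```python
def tier_passengers(risk_scores, enhanced_fraction=0.05):
    """Assign a screening tier per passenger: 'enhanced' for the top
    fraction by risk relative to the population, 'standard' otherwise."""
    ranked = sorted(range(len(risk_scores)),
                    key=lambda i: risk_scores[i], reverse=True)
    cutoff = max(1, int(len(risk_scores) * enhanced_fraction))
    tiers = ["standard"] * len(risk_scores)
    for i in ranked[:cutoff]:
        tiers[i] = "enhanced"
    return tiers

# Ten passengers; two stand out relative to the rest of the population.
scores = [0.02, 0.01, 0.85, 0.03, 0.02, 0.04, 0.90, 0.02, 0.01, 0.03]
print(tier_passengers(scores, enhanced_fraction=0.2))
# → the passengers scoring 0.85 and 0.90 are 'enhanced', all others 'standard'
```

The point of ranking relatively rather than applying an absolute threshold is that the cutoff adapts to the population: even if every score were doubled, the same few outliers would still draw the extra screening.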
The paper also discusses scenarios of how terrorists may attempt to thwart the security system – for example, blending in with a high-risk crowd so as not to stand out – and provides insights into how risk-based systems can be designed to mitigate the impact of such activities.
“The TSA’s move toward a risk-based system is designed to more accurately match security assets with threats to the air system,” Jacobson said. “The ideal situation is to create a system that screens passengers commensurate with their risk. Since we know that very few people are a threat to the system, relative risk rather than absolute risk provides valuable information.”

The National Science Foundation and the U.S. Air Force Office of Scientific Research supported this work.
Liz Ahlberg | University of Illinois