Intelligent video cameras, large video screens, and geo-referencing software are among the technologies that will soon be available to law enforcement and security agencies.
In the recent Proceedings of the 2008 IEEE International Conference on Advanced Video and Signal Based Surveillance, James W. Davis and doctoral student Karthik Sankaranarayanan report that they have completed the first three phases of the project: one software algorithm creates a wide-angle video panorama of a street scene, a second maps the panorama onto a high-resolution aerial image of the scene, and a third method actively tracks a selected target.
The ultimate goal is a networked system of “smart” video cameras that will let surveillance officers observe a wide area quickly and efficiently. Computers will carry much of the workload.
"In my lab, we've always tried to develop technologies that would improve officers' situational awareness, and now we want to give that same kind of awareness to computers," said Davis, an associate professor of computer science and engineering at Ohio State University.
The research isn't meant to gather specific information about individuals, he explained.
"In our research, we care what you do, not who you are. We aim to analyze and model the behavior patterns of people and vehicles moving through the scene, rather than attempting to determine the identity of people. We are trying to automatically learn what typical activity patterns exist in the monitored area, and then have the system look for atypical patterns that may signal a person of interest -- perhaps someone engaging in nefarious behavior or a person in need of help."
The first piece of software expands the small field of view that traditional pan-tilt-zoom security cameras offer.
When surveillance operators look through one of these video cameras, they get only a tiny image -- what some refer to as a "soda straw" view of the world. As they move the camera around, they can easily lose a sense of where they are looking within a larger context.
The Ohio State software takes a series of snapshots from every direction within a camera's field of view, and combines them into a seamless panorama.
Commercially available software can turn overlapping photographs into a flat panorama, Davis explained. But this new software creates a 360-degree high-resolution view of a camera's whole viewspace, as if someone were looking at the entire scene at once. The view resembles that of a large fish-eye lens.
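The paper's exact projection isn't spelled out here, but the core idea of laying pan/tilt snapshots into a single fisheye-style panorama can be sketched as a mapping from camera angles to panorama pixels. The polar convention below (tilt as radius from the center, pan as angle) is an illustrative assumption, not the authors' stated method:

```python
import math

def fisheye_coords(pan_deg, tilt_deg, size=1000, max_tilt_deg=90.0):
    """Map a camera (pan, tilt) direction to a pixel in a square
    fisheye-style panorama of side `size`.

    Assumed convention: tilt 0 = straight down (panorama center),
    tilt grows toward the horizon, so radius is proportional to tilt;
    pan sweeps the angular position around the center.
    """
    r = (tilt_deg / max_tilt_deg) * (size / 2)   # radial distance from center
    theta = math.radians(pan_deg)                # angular position
    cx = cy = size / 2                           # panorama center
    x = cx + r * math.cos(theta)
    y = cy + r * math.sin(theta)
    return int(round(x)), int(round(y))
```

Stitching then amounts to copying each snapshot's pixels into the panorama at the coordinates this mapping assigns to their viewing directions.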
The fish-eye view isn't a live video image; it takes a few minutes to produce. But once it's displayed on a computer screen, operators can click a mouse anywhere within it, and the camera will pan and tilt to that location for a live shot.
Or, they could draw a line on the screen, and the camera will orient along that particular route -- down a certain street, for instance. Davis and his team are also looking to add touch-screen capability to the system.
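Clicking in the panorama amounts to inverting an angle-to-pixel mapping: the mouse position is converted back into a pan/tilt command for the physical camera. A minimal sketch, assuming an illustrative polar layout in which radius encodes tilt and angle encodes pan:

```python
import math

def click_to_pan_tilt(x, y, size=1000, max_tilt_deg=90.0):
    """Turn a mouse click (x, y) in a polar fisheye panorama into a
    (pan, tilt) command, assuming radius encodes tilt and the angle
    around the center encodes pan (a convention chosen for illustration).
    """
    cx = cy = size / 2
    dx, dy = x - cx, y - cy
    r = math.hypot(dx, dy)                       # distance from center
    tilt = (r / (size / 2)) * max_tilt_deg       # radius -> tilt angle
    pan = math.degrees(math.atan2(dy, dx)) % 360 # angle -> pan, in [0, 360)
    return pan, tilt
```

The same inversion, applied point by point, would let an operator's drawn line become a sequence of pan/tilt waypoints for the camera to follow.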
A second piece of software maps locations within the fish-eye view onto an aerial map of the scene, such as a detailed Google map. A computer can use this information to calculate where the viewspaces of all the security cameras in an area overlap. Then it can determine the geo-referenced coordinates -- latitude and longitude -- of each ground pixel in the panorama image.
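A standard way to realize such a pixel-to-coordinate mapping is a planar homography fit beforehand from a handful of known pixel/map correspondences; whether the authors use exactly this model is not stated, so the sketch below is an assumption:

```python
import numpy as np

def pixel_to_latlon(H, x, y):
    """Apply a 3x3 homography H (fit in advance from ground-pixel /
    map-coordinate correspondences) to map a panorama ground pixel
    to geo-referenced coordinates."""
    p = H @ np.array([x, y, 1.0])   # homogeneous coordinates
    return p[0] / p[2], p[1] / p[2]

# A hypothetical homography: uniform scale plus an offset near Columbus, OH.
H = np.array([[0.001, 0.0, 40.0],
              [0.0, 0.001, -83.0],
              [0.0, 0.0, 1.0]])
```

Once every ground pixel carries a latitude and longitude, overlap between two cameras' viewspaces reduces to intersecting their geo-referenced footprints.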
In the third software component, the combination map/panorama is used for tracking. As a person walks across a scene, the computer can calculate exactly where the person is on the panorama and aerial map. That information can then be used to instruct a camera to follow him or her automatically using the camera’s pan-and-tilt control. With this system, it will be possible for the computer to “hand-off” the tracking task between cameras as the person moves in and out of view of different cameras.
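The hand-off decision can be sketched as picking, among the cameras whose geo-referenced viewspaces contain the target, the one best positioned to continue the track. The circular footprint model and nearest-center rule below are simplifications assumed for illustration:

```python
import math

def choose_camera(cameras, target):
    """Pick the camera whose viewspace contains the target, preferring
    the one whose footprint center is nearest.

    `cameras` maps camera id -> (center, radius), a crude circular
    model of each camera's geo-referenced footprint (an assumption).
    Returns None if no camera can see the target.
    """
    best, best_d = None, float("inf")
    for cam_id, (center, radius) in cameras.items():
        d = math.hypot(target[0] - center[0], target[1] - center[1])
        if d <= radius and d < best_d:
            best, best_d = cam_id, d
    return best
```

Running this rule at each tracking step yields the hand-off automatically: when the target leaves one footprint and enters another, the selected camera changes.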
"That's the advantage of linking all the cameras together in one system -- you could follow a person's trajectory seamlessly," Davis said.
His team is now working on the next step in the research: determining who should be followed.
The system won't rely on traditional profiling methods, he said. A person's race or sex or general appearance won't matter. What will matter is where the person goes, and what they do.
"If you're doing something strange, we want to be able to detect that, and figure out what's going on," he said.
To first determine what constitutes normal behavior, they plan to follow the paths of many people who walk through a particular scene over a long period of time. A line tracing each person's trajectory will be saved to a database.
"You can imagine that over a few months, you're going to start to pick up where people tend to go at certain times of day -- trends," he said.
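One simple way to turn a database of stored trajectories into a notion of "typical" is to count visits per ground cell and flag new paths that spend most of their time in rarely visited cells. This is an illustrative sketch, not the team's published method:

```python
from collections import Counter

def cell(p, cell_size=10.0):
    """Discretize a ground point (x, y) into a grid cell."""
    return (int(p[0] // cell_size), int(p[1] // cell_size))

def build_model(trajectories, cell_size=10.0):
    """Count how often each ground cell was visited across all stored paths."""
    counts = Counter()
    for traj in trajectories:
        for p in traj:
            counts[cell(p, cell_size)] += 1
    return counts

def atypicality(traj, counts, min_count=2, cell_size=10.0):
    """Fraction of a new path spent in rarely visited cells:
    0.0 = a well-trodden route, 1.0 = entirely off the usual paths."""
    rare = sum(1 for p in traj if counts[cell(p, cell_size)] < min_count)
    return rare / len(traj)
```

Paths scoring near 1.0 under such a model would be the "atypical patterns" the system surfaces for a human operator to inspect.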
People who stop in an unusual spot or leave behind an object like a package or book bag might be considered suspicious by law enforcement.
But Davis has always wanted to see if this technology could find lost or confused people. He suspects it can, since he can easily pick out lost people himself when he watches footage from the experimental camera system surrounding his building at Ohio State.
It never fails -- during the first week of fall quarter, as most students hurry directly to class, some will circle the space between buildings. They'll stop, maybe look around, and turn back and forth a lot.
"Humans can pick out a lost person really well," he said. "I believe you could build an algorithm that would also be able to do it."
He's now exploring the possibility of deploying a large test system across the state of Ohio. Law enforcement could link video cameras around the major cities, map video panoramas to publicly available aerial maps (such as those maintained by the Ohio Geographically Referenced Information Program), and use the team's software to provide a higher level of "location awareness" for surveillance.
Three Ohio State students are currently working on this project. Doctoral student Karthik Sankaranarayanan is funded by the National Science Foundation, and two undergraduate students, Matthew Nedrich and Karl Salva, are funded by the Air Force Research Laboratory.
Contact: James W. Davis, (614) 292-1553; Davis.firstname.lastname@example.org
Pam Frost Gorder | Newswise Science News