In a paper published online Sept. 7, 2010, in the journal Nature Communications, Thomas Serre and a team of colleagues at the McGovern Institute for Brain Research at MIT and the California Institute of Technology describe a new computer system that is as accurate as people in identifying mouse behaviors in videos.
What’s more, the team is making the fully customizable open-source software available for free. Given standard camcorder footage of a mouse, the software will automatically identify a mouse’s behavior frame by frame.
“We measured the agreement between any two human observers and it was more than 70 percent,” said Serre, who joined the faculty of Brown University in January 2010 after conducting his doctoral and postdoctoral studies, including the work described in the paper, in Tomaso Poggio’s lab at MIT. “The system agreed with humans at the same level. There was no significant difference between the annotations provided by our system and any two human observers.”
The value of the software is not just that it could spare graduate students and lab technicians some tedium. Fully annotating an hour of mouse video takes about 25 person-hours, so even a small experiment with 10 mice, each observed for 5 hours, requires 1,250 person-hours of work. Because it is computerized, the system may also annotate more consistently than a human team would and could therefore be less susceptible to bias.
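The annotation-cost arithmetic above can be spelled out in a few lines (the figures are the ones quoted in the article):

```python
# Cost of manual annotation for a small mouse-behavior experiment.
PERSON_HOURS_PER_VIDEO_HOUR = 25  # to fully annotate one hour of footage
NUM_MICE = 10
HOURS_OBSERVED_PER_MOUSE = 5

total_person_hours = (
    PERSON_HOURS_PER_VIDEO_HOUR * NUM_MICE * HOURS_OBSERVED_PER_MOUSE
)
print(total_person_hours)  # 1250
```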
“This is a small step toward developing automatic tools for quantitative phenotyping of behavior,” said Poggio. “In the quest to understand the causes of mental diseases, labs at McGovern and elsewhere can rely on precise, quantitative and affordable tools to analyze the genes that contribute to disease. The bottleneck is the lack of corresponding techniques for quantifying the behavioral effects of mental diseases in animal models and in humans. The combination of large-scale genotyping and phenotyping will allow powerful data analysis techniques to help uncover the complex relationship between multiple genes and complex behaviors.”
There are a few commercial programs on the market, some of which cost thousands of dollars. They mostly base their behavioral coding on sensors, rather than video, and therefore have agreement rates with human observers of around 60 percent, substantially lower than the rates between people or between people and the system reported in the paper.
Although matching human perception is a notable feat for an artificial system, it should come as no surprise that the software reaches human levels of observation. It is, after all, based on a computer model of how the human brain interprets what it sees.
“It’s mimicking what the visual system does when you process motion,” Serre said.
In addition, the system learns from experience. To train it to detect grooming, for example, the researchers fed it many videos of mice grooming themselves, each labeled so the system knew what behavior it was seeing. From there the software could identify new scenes of grooming without further coaching. In the paper, the team shows that the software performs this chore across different strains of mice and under a variety of lighting and other conditions. Serre said the software should also be easy to train on other lab animals.
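The supervised workflow described above — train on human-labeled clips, then classify unseen footage — can be sketched in miniature. This is an illustrative toy, not the authors' method: the behavior labels are hypothetical, the "features" are random numbers standing in for per-frame motion features, and a simple nearest-centroid rule stands in for the paper's biologically inspired vision model.

```python
# Toy sketch of supervised behavior classification: learn from labeled
# clips, then label new frames automatically.
import random

random.seed(0)

BEHAVIORS = ["groom", "eat", "walk"]  # hypothetical label set


def fake_features(behavior):
    """Stand-in for per-frame motion features; each behavior is given a
    different mean so a simple classifier can separate them."""
    center = BEHAVIORS.index(behavior) * 10.0
    return [random.gauss(center, 1.0) for _ in range(4)]


# "Training": compute the mean feature vector of the human-labeled clips
# for each behavior.
training = {b: [fake_features(b) for _ in range(50)] for b in BEHAVIORS}
centroids = {
    b: [sum(col) / len(col) for col in zip(*clips)]
    for b, clips in training.items()
}


def classify(frame_features):
    """Assign a new, unlabeled frame to the nearest behavior centroid."""
    def dist_sq(centroid):
        return sum((a - c) ** 2 for a, c in zip(frame_features, centroid))
    return min(centroids, key=lambda b: dist_sq(centroids[b]))


# A new frame of a grooming mouse is labeled with no human annotation.
print(classify(fake_features("groom")))  # groom
```

The same train-once, classify-forever pattern is what lets the real system annotate hour after hour of footage frame by frame once it has seen enough labeled examples.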
“Neuroscience is beginning to give us useful blueprints for a more powerful artificial vision technology,” Poggio explains. “It is encouraging that studies of the brain can lead to a system like ours that can help the scientific community better understand mental diseases.”
The paper’s other authors are Hueihan Jhuang, Estibaliz Garrote, Jim Mutch and Tomaso Poggio at MIT, and Xinlin Yu, Vinita Khilnani, and Andrew D. Steele at Caltech.
Funding: McGovern Institute Neurotechnology (MINT) Program at the McGovern Institute for Brain Research at MIT, Broad Fellows Program in Brain Circuitry at Caltech, and the Taiwan National Science Council.
Julie Pryor | Newswise Science News