Big Data is everywhere, and we are constantly told that it holds the answers to almost any problem we want to solve. Companies collect information on how we shop, doctors and insurance companies gather our medical test results, and governments compile logs of our phone calls and emails. In each instance, the hope is that critical insights are hidden deep within massive amounts of information, just waiting to be discovered.
But simply having lots of data is not the same as understanding it. Increasingly, new mathematical tools are needed to extract meaning from enormous data sets. In work published online today, two researchers at Cold Spring Harbor Laboratory (CSHL) now challenge the most recent advances in this field, using a classic mathematical concept to tackle the outstanding problems in Big Data analysis.
What does it mean to analyze Big Data? A major goal is to find patterns between seemingly unrelated quantities, such as income and cancer rates. Many of the most common statistical tools are only able to detect patterns if the researcher has some expectation about the relationship between the quantities. Part of the lure of Big Data is that it may reveal entirely new, unexpected patterns. Therefore, scientists and researchers have worked to develop statistical methods that will uncover these novel relationships.
In 2011, a distinguished group of researchers from Harvard University published a highly influential paper in the journal Science that advanced just such a tool. But in a paper published today in Proceedings of the National Academy of Sciences, CSHL Quantitative Biology Fellow Justin Kinney and CSHL Assistant Professor Gurinder "Mickey" Atwal demonstrate that this new tool is critically flawed. "Their statistical tool does not have the mathematical properties that were claimed," says Kinney.
Kinney and Atwal show that the correct tool was hiding in plain sight all along. The solution, they say, is a well-known mathematical measure called "mutual information," first described in 1948. It was initially used to quantify the amount of information that could be transmitted electronically through a telephone cable; the concept now underlies the design of the world's telecommunications infrastructure. "What we've found in our work is that this same concept can also be used to find patterns in data," Kinney explains.
Applied to Big Data, mutual information is able to reveal patterns in large lists of numbers. For instance, it can be used to analyze patterns in data sets on the numerous bacterial species that help us digest food. "This particular tool is perfect for finding patterns in studies of the human microbiome, among many other things," Kinney says.
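The idea can be illustrated with a minimal sketch (not the authors' actual estimator, which is more sophisticated): a simple histogram-based estimate of mutual information picks up a nonlinear relationship between two lists of numbers that ordinary correlation misses entirely. The variable names and bin count here are illustrative choices, not anything specified in the paper.

```python
import numpy as np

def mutual_information(x, y, bins=10):
    """Estimate mutual information (in bits) between two samples
    using a 2-D histogram of their joint distribution."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()              # joint probabilities
    px = pxy.sum(axis=1, keepdims=True)    # marginal of x (column vector)
    py = pxy.sum(axis=0, keepdims=True)    # marginal of y (row vector)
    nonzero = pxy > 0                      # avoid log(0)
    # I(X;Y) = sum p(x,y) * log2( p(x,y) / (p(x) p(y)) )
    return float((pxy[nonzero] * np.log2(pxy[nonzero] / (px @ py)[nonzero])).sum())

rng = np.random.default_rng(0)
x = rng.normal(size=10_000)
y = x**2 + rng.normal(scale=0.5, size=10_000)  # nonlinear dependence

# Pearson correlation is near zero for this symmetric relationship,
# but the mutual information estimate is clearly positive.
print("correlation:", np.corrcoef(x, y)[0, 1])
print("mutual info:", mutual_information(x, y))
print("mutual info (shuffled):", mutual_information(x, rng.permutation(y)))
```

Shuffling one list destroys the pairing between the variables, so the third value drops to near zero: the measure flags a pattern only when one actually exists, with no prior assumption about its shape.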
Importantly, mutual information provides a way of identifying all types of patterns within the data without reliance upon any prior assumptions. "Our work shows that mutual information very naturally solves this critical problem in statistics," Kinney says. "This beautiful mathematical concept has the potential to greatly benefit modern data analysis, in biology and many other important fields."
The research described here was supported by the Simons Center for Quantitative Biology at Cold Spring Harbor Laboratory.
"Equitability, mutual information, and the maximal information coefficient" appears online in PNAS on February 17, 2014. The authors are: Justin Block Kinney and Gurinder Singh Atwal. The paper can be obtained online at: http://www.pnas.org/content/early/2014/02/14/1309933111.abstractAbout Cold Spring Harbor Laboratory
Jaclyn Jansen | EurekAlert!