Using a $2.1 million grant from the National Science Foundation, a group led by computer scientist and astrophysicist Alexander Szalay of Johns Hopkins' Institute for Data Intensive Engineering and Science is designing and developing such a tool, dubbed the Data-Scope.
Once built, the Data-Scope, which is actually a cluster of sophisticated computers capable of handling colossal sets of information, will enable data analysis tasks that are simply not possible today, said Szalay, the Alumni Centennial Professor in the Krieger School's Henry A. Rowland Department of Physics and Astronomy.
"Computer science has drastically changed the way we do science and the science that we do, and the Data-Scope is a crucial step in this process," Szalay said. "At this moment, the huge data sets are here, but we lack an integrated software and hardware infrastructure to analyze them. Data-Scope will bridge that gap."
Co-investigators on the Data-Scope project, all from Johns Hopkins, are Kenneth Church, chief scientist for the Human Language Technology Center of Excellence, a Department of Defense-funded center dedicated to advancing technology for the analysis of speech, text and document data; Andreas Terzis, associate professor in the Department of Computer Science at the Whiting School of Engineering; Sarah Wheelan, assistant professor of oncology bioinformatics in the School of Medicine; and Scott Zeger, professor of biostatistics in the Bloomberg School of Public Health and the university's vice provost for research.
Data-Scope will be able to handle 5 petabytes of data. That's the equivalent of 100 million four-drawer file cabinets filled with text. (Fifty petabytes would equal the entire written work of humankind, from the beginning of history until now, in all languages.)
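The article's comparisons can be checked with back-of-envelope arithmetic. The sketch below assumes the common convention of 1 petabyte = 10^15 bytes; the per-cabinet and per-photo figures it derives are illustrative, not official numbers from the project.

```python
# Back-of-envelope check of the article's storage comparisons.
# Assumption: 1 petabyte (PB) = 10**15 bytes.
PB = 10**15

# 5 PB spread across 100 million four-drawer file cabinets of text
# implies roughly 50 MB of text per cabinet:
bytes_per_cabinet = 5 * PB / 100_000_000
print(f"{bytes_per_cabinet / 10**6:.0f} MB of text per cabinet")

# 3 PB spread across about 20 billion photos
# implies roughly 150 KB per photo:
bytes_per_photo = 3 * PB / 20_000_000_000
print(f"{bytes_per_photo / 10**3:.0f} KB per photo")
```

Both implied figures (tens of megabytes of plain text per filled cabinet, on the order of 100 KB per compressed photo) are plausible, which is why such comparisons are a common way to make petabyte-scale numbers concrete.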
The new apparatus will allow Szalay and a host of other Johns Hopkins researchers (not to mention those at other institutions, including universities and national laboratories such as Los Alamos in New Mexico and Oak Ridge in Tennessee) to conduct research directly in the database, which is where Szalay contends that more and more science is being done.
"The Data-Scope will allow us to mine out relationships among data that already exist, but that we can't yet handle, and to sift discoveries from what seems like an overwhelming flow of information," he said. "New discoveries will definitely emerge this way. There are relationships and patterns that we just cannot fathom buried in that onslaught of data. Data-Scope will tease these out."
According to Szalay, there are at least 20 research groups within Johns Hopkins that are grappling with data problems totaling 3 petabytes. (Three petabytes is equal to about 20 billion photos on Facebook.) Without Data-Scope, "they would have to wait years in order to analyze that amount of data," Szalay said.
The two-year NSF grant, to be supplemented with almost $1 million from Johns Hopkins, will underwrite the design and building of the new instrument and its first year of operation, expected to begin in May 2011. Szalay said that the range of material that the Data-Scope will handle will be "breathtakingly large, from genomics to ocean circulation, turbulence, astrophysics, environmental science, public health and beyond."
"There really is nothing like this at any university right now," Szalay said. "Such systems usually take many years to build up, but we are doing it much more quickly. It's similar to what Google is doing-of course on a thousand-times-larger scale than we are. This instrument will be the best in the academic world, bar none."
Zeger said he is excited about the research possibilities and collaborations that the new instrument will make possible.
"The NSF funding of a high-performance computing system, specially designed by Dr. Szalay and his team to solve large computational problems, will contribute to Johns Hopkins' remaining in the forefront of many areas, including biomedicine, where I work," he said. "The new genomic data are voluminous. Their analysis requires machines faster than are currently available. Dr. Szalay's machine will enable our biomedical and computational scientists to work together to solve problems that would have been beyond them otherwise."
Jonathan Bagger, vice provost for graduate and postdoctoral programs and special projects, said he believes that the Data-Scope positions Johns Hopkins to play a crucial role in the next revolution in science: data analysis.
"The Data-Scope is specially designed to bring large amounts of data literally under the microscope," he said. "By manipulating data in new ways, Johns Hopkins researchers will be able to advance their science in ways never before possible. I am excited that Johns Hopkins is in the forefront of this new field of inquiry: developing the calculus of the 21st century."
The instrument will be part of a new energy-efficient computing center that is being constructed in the basement of the Bloomberg Center for Physics and Astronomy on the Homewood campus. The house-sized room once served as a mission control center for the Far Ultraviolet Spectroscopic Explorer, a NASA satellite. This computing center is being built using a $1.3 million federal stimulus grant from the National Science Foundation.
Lisa De Nike | Newswise Science News