Using a $2.1 million grant from the National Science Foundation, a group led by computer scientist and astrophysicist Alexander Szalay of Johns Hopkins' Institute for Data Intensive Engineering and Science is designing and developing such a tool, dubbed the Data-Scope.
Once built, the Data-Scope, which is actually a cluster of sophisticated computers capable of handling colossal sets of information, will enable the kind of data analysis tasks that simply are not otherwise possible today, said Szalay, the Alumni Centennial Professor in the Krieger School's Henry A. Rowland Department of Physics and Astronomy.
"Computer science has drastically changed the way we do science and the science that we do, and the Data-Scope is a crucial step in this process," Szalay said. "At this moment, the huge data sets are here, but we lack an integrated software and hardware infrastructure to analyze them. Data-Scope will bridge that gap."
Co-investigators on the Data-Scope project, all from Johns Hopkins, are Kenneth Church, chief scientist for the Human Language Technology Center of Excellence, a Department of Defense-funded center dedicated to advancing technology for the analysis of speech, text and document data; Andreas Terzis, associate professor in the Department of Computer Science at the Whiting School of Engineering; Sarah Wheelan, assistant professor of oncology bioinformatics in the School of Medicine; and Scott Zeger, professor of biostatistics in the Bloomberg School of Public Health and the university's vice provost for research.
Data-Scope will be able to handle 5 petabytes of data. That's the equivalent of 100 million four-drawer file cabinets filled with text. (Fifty petabytes would equal the entire written work of humankind, from the beginning of history until now, in all languages.)
The new apparatus will allow Szalay and a host of other Johns Hopkins researchers (not to mention those at other institutions, including universities and national laboratories such as Los Alamos in New Mexico and Oak Ridge in Tennessee) to conduct research directly in the database, which is where Szalay contends that more and more science is being done.
"The Data-Scope will allow us to mine out relationships among data that already exist, but that we can't yet handle, and to sift discoveries from what seems like an overwhelming flow of information," he said. "New discoveries will definitely emerge this way. There are relationships and patterns that we just cannot fathom buried in that onslaught of data. Data-Scope will tease these out."
According to Szalay, there are at least 20 research groups within Johns Hopkins that are grappling with data problems totaling 3 petabytes. (Three petabytes is equal to about 20 billion photos on Facebook.) Without Data-Scope, "they would have to wait years in order to analyze that amount of data," Szalay said.
The two-year NSF grant, to be supplemented with almost $1 million from Johns Hopkins, will underwrite the design and building of the new instrument and its first year of operation, expected to begin in May 2011. Szalay said that the range of material that the Data-Scope will handle will be "breathtakingly large, from genomics to ocean circulation, turbulence, astrophysics, environmental science, public health and beyond."
"There really is nothing like this at any university right now," Szalay said. "Such systems usually take many years to build up, but we are doing it much more quickly. It's similar to what Google is doing, of course on a thousand-times-larger scale than we are. This instrument will be the best in the academic world, bar none."
Zeger said he is excited about the research possibilities and collaborations that the new instrument will make possible.
"The NSF funding of a high-performance computing system, specially designed by Dr. Szalay and his team to solve large computational problems, will contribute to Johns Hopkins' remaining in the forefront of many areas, including biomedicine, where I work," he said. "The new genomic data are voluminous. Their analysis requires machines faster than are currently available. Dr. Szalay's machine will enable our biomedical and computational scientists to work together to solve problems that would have been beyond them otherwise."
Jonathan Bagger, vice provost for graduate and postdoctoral programs and special projects, said he believes that the Data-Scope positions Johns Hopkins to play a crucial role in the next revolution in science: data analysis.
"The Data-Scope is specially designed to bring large amounts of data literally under the microscope," he said. "By manipulating data in new ways, Johns Hopkins researchers will be able to advance their science in ways never before possible. I am excited that Johns Hopkins is in the forefront of this new field of inquiry: developing the calculus of the 21st century."
The instrument will be part of a new energy-efficient computing center that is being constructed in the basement of the Bloomberg Center for Physics and Astronomy on the Homewood campus. The house-sized room once served as a mission control center for the Far Ultraviolet Spectroscopic Explorer, a NASA satellite. This computing center is being built using a $1.3 million federal stimulus grant from the National Science Foundation.
Lisa De Nike | Newswise Science News