But there’s a problem: the latest DNA sequencing instruments are burying researchers in trillions of bytes of data, overwhelming the existing tools of biological computing. It doesn’t help that a variety of sequencing instruments feeds a diverse set of applications.
Iowa State University’s Srinivas Aluru is leading a research team that’s developing a set of solutions using high performance computing. The team aims to develop core techniques, parallel algorithms and software libraries that help biologists apply parallel computing to high-throughput DNA sequencing, the next generation of sequencing technologies.
Those technologies are now ubiquitous, “enabling single investigators with limited budgets to carry out what could only be accomplished by an international network of major sequencing centers just a decade ago,” said Aluru, the Ross Martin Mehl and Marylyne Munas Mehl Professor of Computer Engineering at Iowa State.
“Seven years ago we were able to sequence DNA one fragment at a time,” he said. “Now researchers can read up to 6 billion DNA sequences in one experiment.
“How do we address these big data issues?”
A three-year, $2 million grant from the BIGDATA program of the National Science Foundation and the National Institutes of Health will support the search for a solution by Aluru and researchers from Iowa State, Stanford University, Virginia Tech and the University of Michigan. In addition to Aluru, the project’s leaders at Iowa State are Patrick Schnable, Iowa State’s Baker Professor of Agronomy and director of the centers for Plant Genomics and Carbon Capturing Crops, and Jaroslaw Zola, a former research assistant professor in electrical and computer engineering who recently moved to Rutgers University.
The majority of the grant – $1.3 million – will support research at Iowa State. And Aluru is quick to say that none of the grant will support hardware development.
Researchers will start by identifying a large set of building blocks frequently used in genomic studies. They’ll develop the parallel algorithms and high performance implementations needed to do the necessary data analysis. They’ll then package those technologies in software libraries that researchers can call from their own analyses. On top of all that, they’ll design a domain-specific language that automatically generates computing code for researchers.
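To give a sense of what such a building block might look like: one of the most common kernels in genomic analysis is k-mer counting, which is naturally parallelized by fanning reads out to workers and merging their partial counts. The sketch below is purely illustrative and assumes nothing about the project's actual libraries or APIs; the function names are hypothetical.

```python
# Hypothetical sketch of one genomics "building block": parallel k-mer
# counting with Python's standard library. Illustrative only; this is
# not the project's actual code or interface.
from collections import Counter
from multiprocessing import Pool

def count_kmers(args):
    """Count every k-length substring (k-mer) in one DNA read."""
    read, k = args
    return Counter(read[i:i + k] for i in range(len(read) - k + 1))

def parallel_kmer_count(reads, k, workers=4):
    """Fan reads out to worker processes, then merge the partial counts."""
    with Pool(workers) as pool:
        partials = pool.map(count_kmers, [(r, k) for r in reads])
    total = Counter()
    for partial in partials:
        total.update(partial)
    return total

if __name__ == "__main__":
    reads = ["ACGTACGT", "TTACGTAA"]
    print(parallel_kmer_count(reads, k=3).most_common(3))
```

A real high performance implementation would use distributed memory, careful load balancing and tuned data structures; the point of a library of such blocks is that the end user calls one function instead of writing that machinery.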
Aluru said that should be much more effective than asking high performance computing specialists to develop parallel approaches to each and every application.
“The goal is to empower the broader community to benefit from clever parallel algorithms, highly tuned implementations and specialized high performance computing hardware, without requiring expertise in any of these,” says a summary of the research project.
Aluru said the resulting software libraries will be fully open-sourced. Researchers will be free to use the libraries and to extend or modify them as needed.
“We’re hoping this approach can be the most cost-effective and fastest way to gain adoption in the research community,” Aluru said. “We want to get everybody up to speed using high performance computing.”
Mike Krapfl | Newswise Science News