Forum for Science, Industry and Business

Whole genome analysis, stat

20.02.2014
Supercomputer dramatically accelerates rapid genome analysis

Although the time and cost of sequencing an entire human genome has plummeted, analyzing the resulting three billion base pairs of genetic information from a single genome can take many months.


Beagle, a Cray XE6 supercomputer at Argonne National Laboratory, supports computation, simulation and data analysis for the biomedical research community.

Credit: Argonne National Laboratory

In the journal Bioinformatics, however, a University of Chicago-based team—working with Beagle, one of the world's fastest supercomputers devoted to life sciences—reports that genome analysis can be radically accelerated. This computer, based at Argonne National Laboratory, is able to analyze 240 full genomes in about two days.
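A back-of-the-envelope calculation using the figures in this article (240 genomes in "about two days", so the exact wall-clock time is approximate) shows what that throughput means per genome:

```python
# Amortized per-genome throughput implied by the article's figures (approximate).
genomes = 240
hours = 2 * 24  # "about two days" of wall-clock time

per_genome_hours = hours / genomes
print(f"~{per_genome_hours:.1f} h per genome, amortized")
# 48 / 240 = 0.2 h, i.e. about 12 minutes per genome when run in bulk
```

Compared with the "many months" cited above for a single genome, the gain comes largely from processing many genomes at once rather than from speeding up any single analysis.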

"This is a resource that can change patient management and, over time, add depth to our understanding of the genetic causes of risk and disease," said study author Elizabeth McNally, MD, PhD, the A. J. Carlson Professor of Medicine and Human Genetics and director of the Cardiovascular Genetics clinic at the University of Chicago Medicine.

"The supercomputer can process many genomes simultaneously rather than one at a time," said first author Megan Puckelwartz, a graduate student in McNally's laboratory. "It converts whole genome sequencing, which has primarily been used as a research tool, into something that is immediately valuable for patient care."

Because the genome is so vast, those involved in clinical genetics have turned to exome sequencing, which focuses on the two percent or less of the genome that codes for proteins. This approach is often useful. An estimated 85 percent of disease-causing mutations are located in coding regions. But the rest, about 15 percent of clinically significant mutations, come from non-coding regions, once referred to as "junk DNA" but now known to serve important functions. If not for the tremendous data-processing challenges of analysis, whole genome sequencing would be the method of choice.
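The appeal of exome sequencing follows directly from the two figures above. Treating "two percent or less" as roughly two percent, the implied density of disease-causing mutations in coding regions is:

```python
# Rough mutation-density estimate from the article's figures.
coding_fraction = 0.02    # coding regions: ~2% or less of the genome
coding_mutations = 0.85   # ~85% of known disease-causing mutations fall there

density_vs_average = coding_mutations / coding_fraction
print(f"Coding DNA carries ~{density_vs_average:.0f}x the genome-average mutation density")
# 0.85 / 0.02 = 42.5, so roughly 42x
```

The remaining ~15 percent of clinically significant mutations are spread across the other 98 percent of the genome, which is why whole-genome sequencing is needed to catch them.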

To test the system, McNally's team used raw sequencing data from 61 human genomes and analyzed that data on Beagle. They used publicly available software packages and one quarter of the computer's total capacity. They found that shifting to the supercomputer environment improved accuracy and dramatically accelerated speed.
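The key property being exploited is that whole-genome analysis is embarrassingly parallel across samples: each genome can be processed independently, so a large machine can run many at once. The sketch below illustrates only that dispatch pattern; the stage names are generic placeholders, since the article does not say which publicly available packages the team used, and a real pipeline would run an aligner and variant caller per sample across many nodes.

```python
# Minimal sketch: per-sample genome analysis dispatched concurrently,
# instead of one genome at a time. Threads stand in for cluster nodes
# so the example is self-contained.
from concurrent.futures import ThreadPoolExecutor

def analyze_genome(sample_id):
    # Placeholder for a real per-sample pipeline (alignment, variant
    # calling, annotation); stubbed here so the script runs anywhere.
    stages = ["align_reads", "call_variants", "annotate"]
    return {"sample": sample_id, "completed": stages}

def run_cohort(sample_ids, workers=4):
    # map() preserves input order, so results line up with sample_ids.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(analyze_genome, sample_ids))

if __name__ == "__main__":
    results = run_cohort([f"genome_{i:02d}" for i in range(8)])
    print(f"{len(results)} genomes analyzed")
```

On a shared supercomputer, the same idea is typically expressed through the batch scheduler (one job per sample) rather than in-process workers, which is how a quarter of a machine's capacity can be devoted to a cohort.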

"Improving analysis through both speed and accuracy reduces the price per genome," McNally said. "With this approach, the price for analyzing an entire genome is less than the cost of looking at just a fraction of the genome. New technology promises to bring the costs of sequencing down to around $1,000 per genome. Our goal is to get the cost of analysis down into that range."

"This work vividly demonstrates the benefits of dedicating a powerful supercomputer resource to biomedical research," said co-author Ian Foster, director of the Computation Institute and Arthur Holly Compton Distinguished Service Professor of Computer Science. "The methods developed here will be instrumental in relieving the data analysis bottleneck that researchers face as genetic sequencing grows cheaper and faster."

The finding has immediate medical applications. McNally's Cardiovascular Genetics clinic, for example, relies on rigorous interrogation of the genes from an initial patient as well as multiple family members to understand, treat and prevent disease. More than 50 genes can contribute to cardiomyopathy. Other genes can trigger heart failure, rhythm disorders or vascular problems.

"We start genetic testing with the patient," she said, "but when we find a significant mutation we have to think about testing the whole family to identify individuals at risk."

The range of testable mutations has radically expanded. "In the early days we would test one to three genes," she said. "In 2007, we did our first five-gene panel. Now we order 50 to 70 genes at a time, which usually gets us an answer. At that point, it can be more useful and less expensive to sequence the whole genome."

The information from these genomes, combined with careful attention to patient and family histories, "adds to our knowledge about these inherited disorders," McNally said. "It can refine the classification of these disorders. By paying close attention to family members with genes that place them at increased risk, but who do not yet show signs of disease, we can investigate early phases of a disorder. In this setting, each patient is a big-data problem."

Beagle, a Cray XE6 supercomputer housed in the Theory and Computing Sciences (TCS) building at Argonne National Laboratory, supports computation, simulation and data analysis for the biomedical research community. It is available for use by University of Chicago researchers, their collaborators and "other meritorious investigators." It was named after the HMS Beagle, the ship that carried Charles Darwin on his famous scientific voyage in 1831.

The National Institutes of Health and the Doris Duke Charitable Foundation funded this study. Additional authors include Lorenzo Pesce, Viswateja Nelakuditi, Lisa Dellefave-Castillo and Jessica Golbus of the University of Chicago; Sharlene Day of the University of Michigan; Thomas Coppola of the University of Pennsylvania; and Gerald Dorn of Washington University.

John Easton | EurekAlert!
Further information:
http://www.uchospitals.edu
