Microarrays are a tool for studying gene expression: researchers can examine thousands of genes at a time, all on a single glass slide. In oncology, scientists have used microarrays to identify gene expression patterns characteristic of specific tumor types, to discover new drug targets, and to profile the unique characteristics of a particular tumor so that doctors can tailor treatment to an individual patient. However, such studies produce volumes of data that are easily misinterpreted, and they have proved difficult to replicate, even though replication is considered the best way to validate scientific findings.
To study the statistical methods used in cancer-focused microarray studies, Alain Dupuy, M.D., and Richard M. Simon, D.Sc., of the National Cancer Institute in Bethesda, Md., reviewed 90 studies published through the end of 2004 that related microarray expression profiling to clinical outcome. The most common cancers in those studies were hematologic malignancies (24 studies), lung cancer (12 studies), and breast cancer (12 studies). The studies fell into three general categories: outcome-related gene finding, such as searching for specific genes that are expressed differently in patients with a good versus a poor prognosis; class discovery, in which researchers cluster together tumors with similar gene expression profiles; and supervised prediction, in which gene expression profiles are used to build an algorithm, or set of rules, that predicts a patient's clinical outcome from his or her individual expression profile.
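The first category, screening thousands of genes for an association with outcome, runs into a multiplicity problem: when that many tests are performed at once, some genes will look significant purely by chance. A minimal simulation sketches why; the gene count, group sizes, and t-statistic cutoff here are illustrative assumptions, not figures from the reviewed studies.

```python
import math
import random
import statistics

random.seed(1)
N_GENES = 2000       # genes screened (illustrative)
N_PER_GROUP = 25     # patients per prognosis group (illustrative)

false_positives = 0
for _ in range(N_GENES):
    # every gene is pure noise: no real expression difference exists
    good = [random.gauss(0, 1) for _ in range(N_PER_GROUP)]
    poor = [random.gauss(0, 1) for _ in range(N_PER_GROUP)]
    se = math.sqrt(statistics.variance(good) / N_PER_GROUP
                   + statistics.variance(poor) / N_PER_GROUP)
    t = (statistics.mean(good) - statistics.mean(poor)) / se
    # |t| > 2.01 approximates p < 0.05 at ~48 degrees of freedom
    if abs(t) > 2.01:
        false_positives += 1

print(false_positives)  # roughly 5% of 2000 genes flagged "significant" by chance
```

With no real signal at all, on the order of a hundred genes still clear the conventional p < 0.05 bar, which is why methods for controlling false-positive findings are central to this kind of study.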
The authors closely scrutinized the statistical methods and reporting in 42 studies published in 2004. Half of these studies (21) contained at least one basic flaw. Of the 23 studies reporting an outcome-related gene finding, nine used inadequate, unclear, or unstated methods for controlling false-positive findings. In 13 of the 28 studies focused on class discovery, the authors made claims that the discovered classes were meaningfully related to outcome without performing adequate analyses to support those conclusions. Among the 28 studies reporting supervised prediction, Dupuy and Simon found that 12 used biased estimates of the accuracy of their predictions.
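The biased accuracy estimates mentioned above typically arise when predictive genes are selected using all samples before cross-validation begins, so the left-out test sample has already influenced the gene list. The sketch below illustrates the effect on pure-noise data; the nearest-centroid classifier, sample sizes, and gene counts are illustrative assumptions, not details from the reviewed studies.

```python
import random
import statistics

random.seed(0)
N_SAMPLES, N_GENES, K = 40, 1000, 10
labels = [i % 2 for i in range(N_SAMPLES)]   # arbitrary "good"/"poor" outcomes
# pure noise: gene expression carries no real outcome information
data = [[random.gauss(0, 1) for _ in range(N_GENES)] for _ in range(N_SAMPLES)]

def top_genes(train_idx):
    """Rank genes by absolute difference in class means on the given samples."""
    scores = []
    for g in range(N_GENES):
        a = [data[i][g] for i in train_idx if labels[i] == 0]
        b = [data[i][g] for i in train_idx if labels[i] == 1]
        scores.append((abs(statistics.mean(a) - statistics.mean(b)), g))
    return [g for _, g in sorted(scores, reverse=True)[:K]]

def classify(test_i, train_idx, genes):
    """Nearest-centroid prediction using only the selected genes."""
    centroids = {}
    for c in (0, 1):
        idx = [i for i in train_idx if labels[i] == c]
        centroids[c] = [statistics.mean(data[i][g] for i in idx) for g in genes]
    def dist(c):
        return sum((data[test_i][g] - m) ** 2
                   for g, m in zip(genes, centroids[c]))
    return 0 if dist(0) < dist(1) else 1

def loocv(select_inside):
    """Leave-one-out accuracy, with gene selection inside or outside the loop."""
    genes_all = top_genes(range(N_SAMPLES))  # uses every sample, incl. test
    correct = 0
    for i in range(N_SAMPLES):
        train = [j for j in range(N_SAMPLES) if j != i]
        genes = top_genes(train) if select_inside else genes_all
        correct += classify(i, train, genes) == labels[i]
    return correct / N_SAMPLES

print(f"selection outside CV (biased):   {loocv(False):.2f}")
print(f"selection inside CV (unbiased):  {loocv(True):.2f}")
```

Even though the data contain no signal, the biased procedure reports accuracy far above the 50% expected by chance, while repeating the gene selection inside every cross-validation fold brings the estimate back to roughly chance level.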
"…Microarray studies are a fast-growing area for both basic and clinical research with an exponentially growing number of publications," the authors write. "As demonstrated by our results, common mistakes and misunderstandings are pervasive in studies published in good-quality, peer-reviewed journals." To avoid such errors, Dupuy and Simon provide guidelines in the form of a list of "Do's and Don'ts" for researchers. "We believe that following these guidelines should substantially improve the quality of analysis and reporting of microarray investigations," the authors write.
Andrea Widener | EurekAlert!