This finding, from a current project funded by the Austrian Science Fund FWF, was presented at the 5th International Conference on Multiple Comparison Procedures (MCP2007) in Vienna, which recently drew to a close. The conference, held at the Medical University, focused on the increasingly important issue of improving the statistical efficiency of medical studies.
30,000 – this is the number of genes that can be analysed simultaneously with state-of-the-art instruments. Such analyses provide a means of identifying whether individual genes have a decisive impact in the course of a disease or therapy. However, the more genes that are examined in a study, the greater the probability of incorrectly identifying a gene as a factor when, in reality, it has no influence.
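The inflation of the error probability follows directly from testing many genes at once: if each of m independent tests is run at a 5% significance level and no gene truly has an effect, the chance of at least one false finding is 1 − (1 − 0.05)^m. A minimal sketch of this arithmetic:

```python
# Probability of at least one false positive when m independent genes are
# each tested at significance level alpha and none truly has an effect.
def familywise_error_rate(m, alpha=0.05):
    return 1 - (1 - alpha) ** m

print(familywise_error_rate(1))      # a single test stays at the 5% level
print(familywise_error_rate(100))    # with 100 genes a false hit is near-certain
print(familywise_error_rate(30000))  # at 30,000 genes it is a virtual certainty
```

Already at a hundred genes the familywise error rate exceeds 99%, which is why uncorrected gene-by-gene testing is untenable at this scale.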
MORE CONCENTRATION – FEWER ERRORS
Dr. Sonja Zehetmayer from the Department of Medical Statistics, Medical University of Vienna, says: "The problem of identifying factors incorrectly could be countered by a very high number of repetitions. However, repetitions normally need to be kept to a minimum, owing to high costs. A more innovative approach to solving this problem is offered by multi-stage methods. These involve preselecting genes after the first examination stage. In subsequent stages, only these selected genes are subject to further analysis. Concentrating on fewer genes thereby cuts the error probability."
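As an illustration of the principle, the following hypothetical two-stage screen ranks genes by a noisy measurement, keeps only the best candidates, and re-measures just those in a second stage. The gene counts, effect size, and thresholds here are invented for the sketch and are not the procedure used by the Vienna group:

```python
import random

def two_stage_screen(n_genes=1000, n_true=10, keep=100, seed=1):
    """Illustrative two-stage screen: stage 1 ranks all genes by a noisy
    score and keeps the top candidates; stage 2 independently re-measures
    only those, so the multiplicity burden shrinks from n_genes to keep."""
    rng = random.Random(seed)

    # Genes 0..n_true-1 carry a real effect (mean 3); the rest are pure noise.
    def measure(g):
        effect = 3.0 if g < n_true else 0.0
        return effect + rng.gauss(0, 1)

    # Stage 1: screen everything, keep the highest-scoring candidates.
    stage1 = sorted(range(n_genes), key=measure, reverse=True)[:keep]
    # Stage 2: independent re-measurement of the preselected genes only.
    return [g for g in stage1 if measure(g) > 2.5]

hits = two_stage_screen()
print(hits)
```

Because the second stage tests only 100 genes instead of 1,000, a far less stringent threshold suffices to keep false positives rare, while most truly active genes survive both stages.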
The question of exactly how many stages are needed to deliver an optimal cost-benefit ratio had until now remained unresolved. The answer has now been calculated, published, and discussed by Dr. Zehetmayer and her colleagues at the MCP2007, held in Vienna from 8 to 11 July. In fact, the solution turned out to be unexpectedly straightforward: three stages deliver the optimal ratio between the accuracy of the results obtained and the costs required to achieve it. Although a fourth stage would offer greater accuracy, the resources it would require are out of all proportion to the additional accuracy gained.
Dr. Zehetmayer also found surprising results when she compared two different test designs with each other: "Multi-stage series of tests can be analysed either by integrating the results of all levels or by analysing the results of only the last stage. While the choice of test design for four-stage methods has a marked effect on its statistical characteristics, this effect is mitigated in the case of a three-stage method."
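One standard way of integrating the results of all stages is Fisher's classical combination of independent stage p-values; the sketch below uses it purely as an illustration of the "integrate all levels" design, and may differ from the test actually studied by the Vienna group:

```python
import math

def fisher_combination(p_values):
    """Fisher's method: combine k independent stage p-values into a single
    p-value via a chi-square statistic with 2k degrees of freedom."""
    k = len(p_values)
    stat = -2.0 * sum(math.log(p) for p in p_values)
    # For even degrees of freedom 2k the chi-square survival function
    # has the closed form  exp(-x/2) * sum_{i<k} (x/2)^i / i!,  x = stat.
    x = stat / 2.0
    return math.exp(-x) * sum(x**i / math.factorial(i) for i in range(k))

# Two moderate stage results combine into stronger overall evidence
# than the final stage alone would provide:
print(fisher_combination([0.04, 0.06]))
```

Here two stages that are individually only borderline significant yield a clearly significant combined p-value, whereas a design that looks only at the last stage would report 0.06.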
Dr. Zehetmayer’s colleague Alexandra Goll presented a further aspect contributing to the optimum configuration of test methods at the MCP2007. She showed that the individual stages of multi-stage test methods can be configured very differently without a major detrimental effect on the accuracy of the end result. This means the initial stages can be made considerably more cost-effective, while more accurate and more expensive methods are reserved for the later stages, which involve fewer genes.
There is good reason why the latest trends in the statistical analysis of clinical data are being initiated and analysed at the Department of Medical Statistics at the Medical University of Vienna. Prof. Peter Bauer published a paper there in 1989 refuting a basic principle of biostatistics: that the test design of an ongoing study must not be changed until its end. This result remains the basis for multi-stage adaptive analytical methods, which have recently attracted worldwide research interest due to cost pressures in healthcare and are supported by the FWF in Austria.
Till C. Jelitto | alfa