This finding from a current project funded by the Austrian Science Fund FWF was presented at the 5th International Conference on Multiple Comparison Procedures (MCP2007) in Vienna, which recently drew to a close. The conference, held at the Medical University, focused on the increasingly important issue of improving the statistical efficiency of medical studies.
30,000 – this is the number of genes that can be analysed simultaneously with state-of-the-art instruments. Such analyses make it possible to identify whether individual genes have a decisive impact on the course of a disease or therapy. However, the more genes examined in a study, the greater the probability of incorrectly identifying a gene as a factor when, in reality, it has no influence.
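The scale of this multiple testing problem can be illustrated with a short simulation. The sketch below assumes every gene is tested at the conventional 5% significance level and that none of the genes actually has an effect; the gene count comes from the article, everything else is a demo assumption:

```python
import random

random.seed(0)

N_GENES = 30_000      # genes tested simultaneously (figure from the article)
ALPHA = 0.05          # conventional per-test significance level

# Under the null hypothesis every p-value is uniform on [0, 1],
# so each individual test has a 5% chance of a false positive.
p_values = [random.random() for _ in range(N_GENES)]
false_positives = sum(p < ALPHA for p in p_values)
print(false_positives)    # roughly 1,500 spurious "hits"

# A classical Bonferroni correction controls the overall error rate
# by testing each gene at the much stricter level ALPHA / N_GENES.
bonferroni_hits = sum(p < ALPHA / N_GENES for p in p_values)
print(bonferroni_hits)    # almost always zero
```

With 30,000 tests, a naive per-gene threshold produces on the order of 1,500 false hits, which is exactly the problem the multi-stage methods described below are designed to address.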
MORE CONCENTRATION – FEWER ERRORS
Dr. Sonja Zehetmayer from the Department of Medical Statistics, Medical University of Vienna, says: "The problem of identifying factors incorrectly could be countered by a very high number of repetitions. However, repetitions normally need to be kept to a minimum owing to high costs. A more innovative solution to this problem is offered by multi-stage methods. These involve preselecting genes after the first examination stage. In subsequent stages, only these selected genes are subjected to further analysis. Concentrating on fewer genes thereby cuts the error probability."
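The preselection idea described in the quote can be sketched as a minimal two-stage simulation. All numbers below (gene counts, effect sizes, thresholds) are invented for illustration; this is a generic screen-then-confirm design, not the authors' actual procedure:

```python
import random

random.seed(1)

N_GENES = 10_000     # illustrative; smaller than the 30,000 in the article
N_TRUE = 100         # genes with a genuine effect (a demo assumption)
SCREEN_ALPHA = 0.10  # lenient first-stage screening threshold (assumed)
FINAL_ALPHA = 0.05   # family-wise error target at the final stage

def stage_p_values():
    """One independent batch of measurements: small p-values for the
    N_TRUE genes with a real effect, uniform p-values for the rest."""
    effects = [random.random() ** 6 for _ in range(N_TRUE)]
    nulls = [random.random() for _ in range(N_GENES - N_TRUE)]
    return effects + nulls

# Stage 1: cheap screen, keep only the promising genes.
p1 = stage_p_values()
selected = [i for i, p in enumerate(p1) if p < SCREEN_ALPHA]

# Stage 2: fresh, independent data; Bonferroni over the *selected*
# genes only -- a far milder correction than dividing by all N_GENES.
p2 = stage_p_values()
hits = [i for i in selected if p2[i] < FINAL_ALPHA / len(selected)]

print(len(selected), len(hits))
```

Because the second-stage correction divides by roughly a thousand preselected genes rather than all ten thousand, genuine effects survive far more easily, which is the gain in error probability the quote describes.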
The question of exactly how many stages are needed to deliver an optimal cost-benefit ratio had until now remained unresolved. The answer has now been calculated, published, and discussed by Dr. Zehetmayer and her colleagues at the MCP2007, held in Vienna from 8 to 11 July. In fact, the solution turned out to be unexpectedly straightforward – three stages deliver the optimal ratio between the accuracy of the results obtained and the costs required to achieve it. Although a fourth stage would offer greater accuracy, the resources it would require are out of all proportion to the additional accuracy gained.
Dr. Zehetmayer also found surprising results when she compared two different test designs: "Multi-stage series of tests can be analysed either by integrating the results of all stages or by analysing the results of only the last stage. While the choice of test design has a marked effect on the statistical characteristics of four-stage methods, this effect is mitigated in the case of three-stage methods."
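The difference between the two designs can be made concrete for a single gene. The sketch below pools two stages with Fisher's combination method, a classical rule for combining independent p-values; the article does not say which combination rule the authors used, and the two p-values are made-up numbers:

```python
import math

# Two independent p-values for the same gene, one per stage (invented).
p_stage1, p_stage2 = 0.04, 0.03

# Design A: analyse only the final stage.
final_only = p_stage2

# Design B: integrate both stages via Fisher's combination method.
# The statistic -2 * sum(ln p_i) follows, under the null hypothesis,
# a chi-square distribution with 2k = 4 degrees of freedom.
chi2 = -2.0 * (math.log(p_stage1) + math.log(p_stage2))
# For 4 degrees of freedom the survival function has a closed form:
combined = math.exp(-chi2 / 2) * (1 + chi2 / 2)

print(final_only, combined)
```

Here the integrated design yields a combined p-value of about 0.009, noticeably stronger than the 0.03 of the final stage alone: pooling the evidence from all stages changes what the test concludes, which is why the choice of design matters.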
Dr. Zehetmayer’s colleague Alexandra Goll presented a further aspect of the optimum configuration of test methods at the MCP2007. She showed that the individual stages of multi-stage test methods can be configured very differently without a major detrimental effect on the accuracy of the end result. This means that the initial stages can use cheaper, less precise methods, while more accurate and more expensive methods are reserved for the later stages, which involve fewer genes.
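The economic logic of reserving the expensive assay for the later, smaller stages is simple arithmetic. All per-gene costs and gene counts below are invented purely to illustrate the point:

```python
# Hypothetical three-stage design: (stage name, genes measured, cost per gene).
stages = [
    ("screen", 30_000, 0.10),   # cheap assay over all genes
    ("confirm", 1_000, 2.00),   # mid-cost assay over preselected genes
    ("validate", 50, 40.00),    # precise, expensive assay over finalists
]

staged_cost = sum(n * cost for _, n, cost in stages)

# Compare with running the expensive assay on every gene from the start.
flat_cost = 30_000 * 40.00

print(staged_cost, flat_cost)
```

With these made-up figures the staged design costs 7,000 units against 1,200,000 for the single-assay approach, while the most accurate measurements are still applied to every gene that reaches the final stage.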
There is good reason why the latest trends in the statistical analysis of clinical data are being initiated and analysed at the Department of Medical Statistics at the Medical University of Vienna. Prof. Peter Bauer published a paper there in 1989 refuting a basic principle of biostatistics: that the test design of an ongoing study must not be changed until its end. This insight remains the basis for multi-stage adaptive analytical methods, which have recently attracted worldwide research interest owing to cost pressures in healthcare and are supported by the FWF in Austria. Image and text will be available online from Monday, 16th July 2007, 09.00 a.m. CET onwards.
Till C. Jelitto | alfa