The study found that acceptable levels of at least one carcinogen may be 500 to 1,500 times higher than is currently believed. It also indicates that for many purposes trout may be a superior animal model to laboratory rats, and that traditional methods of assessing the risk of carcinogens need to be re-evaluated.
The health impact of carcinogens is not always "linear," the OSU researchers reported in this study. This means experiments that are done using high concentrations of a carcinogen – a common practice made necessary by cost and logistics – may not accurately predict the actual risks of the compound when exposure in the real world is at lower levels over long time periods.
In practice, this suggests that some chemicals or toxins are safe at levels far higher than is currently believed, and that some previous research may have significantly erred on the side of conservatism. Such studies have been "severely limited by inadequate experimental data at environmentally relevant exposures," the researchers wrote in their report.
"The whole foundation of modern toxicology is that the dose makes the poison," said George Bailey, an OSU distinguished professor emeritus of molecular and environmental toxicology. "You can die from eating a few tablespoons of ordinary table salt at one time, but that doesn't mean that table salt is a poison at the doses that humans normally consume.
"With compounds that we know can cause cancer, the real question is how much is too much," Bailey said. "What we have found is that traditional approaches to making that evaluation, which are almost always based on studies done at very high doses with laboratory rodents, may not always give us answers that are reasonably accurate."
Researchers are usually trying to determine what can cause cancer at levels considered unacceptable, such as one more case of cancer per million people. But the age-old problem they have faced is that cost and laboratory logistics make it virtually impossible to test millions of rats at a time to see how many more cases of cancer appear when a compound is administered at a very low dose. So, scientists traditionally have raised the dose, tested it on a relatively small but affordable number of rats to see the results, and then extrapolated the results to determine one-in-a-million dosages that are considered unsafe.
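The extrapolation described above can be sketched in a few lines. This is an illustrative example with hypothetical numbers (the dose, incidence rate, and function name are not from the study); it assumes risk scales linearly with dose through zero, which is precisely the assumption the OSU findings call into question.

```python
def linear_extrapolate(high_dose, high_dose_risk, target_risk=1e-6):
    """Estimate the dose producing `target_risk` under the traditional
    linear model: risk = slope * dose."""
    slope = high_dose_risk / high_dose  # excess cancer risk per unit dose
    return target_risk / slope          # dose at the one-in-a-million level

# Hypothetical high-dose rodent study: 10% excess tumor incidence
# observed at 50 mg/kg/day.
safe_dose = linear_extrapolate(high_dose=50.0, high_dose_risk=0.10)
print(f"Estimated one-in-a-million dose: {safe_dose:.2e} mg/kg/day")
```

If the true dose-response curve flattens at low doses, as the trout data suggest for the PAH compound, a linear estimate like this can be off by a factor of hundreds or more.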
"There have always been questions and criticism over use of this methodology, and that's one reason we've had to be conservative, to err on the side of caution," said Gayle Orner, an assistant professor in the Linus Pauling Institute at OSU. "When using rodents, it simply was not possible to study larger numbers of animals; the cost was prohibitive."
What has changed, the OSU researchers said, is the realization that rainbow trout may for many purposes be as accurate as, or more accurate than, rodents in determining which compounds, at what levels, can pose a risk of human cancer. OSU has pioneered the use of trout for studies of this type for 40 years, and it may now be time to greatly expand the use of that research, the scientists said.
"We can do experiments with trout in large numbers at very low cost, about 5 percent of what a rodent study would cost," Bailey said. "For most studies of carcinogens, exposing 2,000 rodents would be a huge project. For us, working with 2,000 trout is a pilot study."
The OSU scientists recently completed the largest study ever done with animals in toxicology, exposing 40,800 trout to what's considered an "ultra-low" dose of dibenzo[a,l]pyrene, a chemical that can cause liver cancer and belongs to a broad family of toxic compounds called polycyclic aromatic hydrocarbons, or PAHs.
The study determined that a tolerable threshold for human exposure to this toxic chemical would be 500 to 1,500 times higher than the level outlined by the Environmental Protection Agency. And in other work, still in preliminary stages, studies appear to show that previous risk estimates for aflatoxins, another common class of carcinogens, are reasonably accurate.
"The EPA levels of exposure for both the PAH compound and aflatoxins were determined using essentially the same methodology based on rodent studies," Bailey said. "But our research suggests that the findings for aflatoxins are pretty accurate, while for the PAH compound we're off by a factor of about 1,000."
In addition, the OSU study determined that use of "biomarkers" such as DNA adducts to determine carcinogenic potential can also be flawed. That was a "huge surprise" and very significant, Bailey said, since much of the carcinogen research around the world is based on this.
Together, the findings suggest that the methodology used in the past to assess the danger of some of the world's most common carcinogens may be questionable; the resulting risk estimates may, or may not, be accurate.
"In the past, our regulatory agencies have done the best they can with the data they have available," Bailey said. "The key is that now we have animal models that can far more accurately determine the real cancer risk some compounds pose, and in biochemical detail that's more valuable than the kill-them-and-count-them approaches of the past."
"It may be time for government agencies and medical researchers to reconsider the way we approach carcinogen research," he said.
The findings of this study were just published in Chemical Research in Toxicology, a professional journal, in work funded by the National Institute of Environmental Health Sciences. Researchers from the University of North Carolina also collaborated on the work.
George Bailey | EurekAlert!