Hurricane Katrina caused more than 200 onshore releases of petroleum and other hazardous materials, according to a new study funded by the National Science Foundation.
Drawing on government incident databases, the researchers found that about 8 million gallons of petroleum releases were reported as a result of Katrina striking the U.S. Gulf coast in 2005, nearly 75 percent of the total volume of the 1989 Exxon Valdez oil spill in Alaska. The releases were largely due to storage tank failure and to the shutdown and restart of production processes. Storm surge flooding was the primary cause, but some incidents occurred as a result of hurricane- and tropical-storm-strength winds where no surge was present, according to the authors.
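The "nearly 75 percent" comparison can be checked with simple arithmetic, assuming the commonly cited estimate of roughly 10.8 million gallons for the Exxon Valdez spill (a figure not stated in the article itself):

```python
# Rough sanity check of the volume comparison above.
# The Exxon Valdez figure is an assumption: the widely cited
# official estimate of about 10.8 million gallons.
katrina_gallons = 8_000_000      # Katrina-related onshore releases (from the study)
valdez_gallons = 10_800_000      # assumed 1989 Exxon Valdez spill volume

share = katrina_gallons / valdez_gallons
print(f"{share:.0%}")            # about 74%, i.e. "nearly 75 percent"
```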
The study “Petroleum and Hazardous Material Releases from Industrial Facilities Associated with Hurricane Katrina” appears in the April issue of the journal Risk Analysis published by the Society for Risk Analysis.
The authors include consultant Nicholas Santella, Laura Steinberg of Syracuse University, and Hatice Sengul of the Turkish Scientific and Technological Research Council. Ten onshore releases of petroleum products exceeded 10,000 gallons each, consisting primarily of crude oil that leaked from storage tanks. Fewer and smaller releases were reported from chemical and manufacturing industries handling hazardous materials. Of the releases from onshore facilities and storage tanks, 76 percent were petroleum, 18 percent were chemicals, and 6 percent were natural gas. Many refineries and other facilities shut down in anticipation of large storms to minimize damage and prevent process upsets, and are required to do so for safety purposes. However, shutdowns and restarts have the disadvantage of potentially producing large emissions of volatile organic compounds, particulate matter, and other chemicals.
“More attention should be given to planning for shutdowns, including coordination with government entities responsible for evacuation, and to plant startup after an emergency shutdown in order to minimize burning off excess gas by flaring and other releases,” according to the authors. For example, storage tanks can be filled with water and other steps can be taken to mitigate damage during severe storms and floods.
“Where large releases do occur, in-depth analysis by each plant of the mechanism of failure and contributing factors should be required,” the authors add. Significant factors slowing response to the Katrina damage included indirect disruptions, such as displacement of workers, loss of electricity and communication systems, and difficulty acquiring supplies and contractors for operations and reconstruction. Of the industrial facilities responding to a survey in the study, 55 percent experienced indirect disruptions, far more than experienced environmental releases of hazardous materials, indicating that improved risk-based facility design and better prevention and response planning may be warranted.
“Chemical accident prevention and emergency response regulations in the U.S. and elsewhere generally do not address the threat of natural hazards directly. While many companies are proactive in taking steps to mitigate natural hazard risk, others may make only the minimum effort required by statute,” the authors conclude. The study is the first to comprehensively analyze the incidence and causes of releases from all types of onshore industrial facilities as a result of Hurricane Katrina. The analysis relies on a key incident reporting database, the National Response Center (NRC) Incident Reporting Information System (IRIS), administered by the U.S. Coast Guard. In addition, interviews and data were obtained from federal and Gulf state environmental agencies, energy and chemical associations, public accounts of particular incidents, and a small industry survey.
Risk Analysis: An International Journal is published by the nonprofit Society for Risk Analysis (SRA). SRA is a multidisciplinary, interdisciplinary, scholarly, international society that provides an open forum for all those who are interested in risk analysis. Risk analysis is broadly defined to include risk assessment, risk characterization, risk communication, risk management, and policy relating to risk, in the context of risks of concern to individuals, to public and private sector organizations, and to society at a local, regional, national, or global level.
Note to editors: The complete study is available upon request from Lisa Pellegrin/Steve Gibb or here: http://www3.interscience.wiley.com/cgi-bin/fulltext/123322882/HTMLSTART
Steve Gibb | Newswise Science News