In this week's special issue of the Proceedings of the National Academy of Sciences (online), Indiana University political scientist Elinor Ostrom and colleagues argue that while many basic conservation strategies are sound, their use is often flawed. The strategies are applied too generally, they say, as an inflexible, regulatory "blueprint" that foolishly ignores local customs, economics and politics.
"We now ridicule the doctors who long ago used to tell us, 'Take two aspirin and call me in the morning' as a treatment for every single illness," said Ostrom, a member of the National Academy of Sciences. "Resource management is just as complex as the human body. It needs to be approached differently in different situations."
In her own contribution, Ostrom proposes a flexible "framework" for determining what factors will influence resource management, whether that resource is forest, fish... even air. Ostrom edited the special issue with Arizona State University's Marco Janssen and John Anderies.
"What we are learning is that you shouldn't ignore what's going on at the local level," Ostrom said. "It may even be beneficial to work with local people, including the resource exploiters, to create effective regulation."
Modern conservation theory relies on well-established mathematical models that predict what will happen to a species or habitat over time. One thing these models cannot account for is the unpredictable behavior of the human beings whose lives influence, and are influenced by, conservation efforts.
The framework is divided into tiers that allow conservationists and policymakers to delineate those factors most likely to affect the protection or management of a given resource.
The first tier imposes four broad variables: the resource system, the resource units, the governance system and the resource users. The second tier examines each of these variables in greater detail, such as the government and non-government entities that may already be regulating the resource, the innate productivity of a resource system, the size and placement of the system, the system's economic value and what sorts of people use the resource -- from indigenous people to heads of state. The third tier digs even deeper into each of the basic variables.
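To make the tier structure concrete, the nesting described above can be pictured as a hierarchical classification. The following is a minimal illustrative sketch in Python; the labels below the first tier are examples drawn from this article's description, not the paper's actual variable names, and the helper function is hypothetical.

```python
# Illustrative sketch (not from the paper): Ostrom's multi-tier framework
# as a nested classification. The first tier names four broad variables;
# deeper tiers refine each one. Labels below the first tier are examples
# taken from the article's description, not an exhaustive or official list.

framework = {
    "resource system": {
        "productivity": None,          # innate productivity of the system
        "size and placement": None,
        "economic value": None,
    },
    "resource units": {},
    "governance system": {
        "government regulators": None,      # existing government entities
        "non-government regulators": None,  # existing non-government entities
    },
    "resource users": {
        "user types": None,            # from indigenous people to heads of state
    },
}

def second_tier_factors(first_tier_variable):
    """Hypothetical helper: list the example second-tier factors
    recorded under one first-tier variable."""
    return sorted(framework[first_tier_variable].keys())

print(second_tier_factors("governance system"))
# prints ['government regulators', 'non-government regulators']
```

A policymaker applying the framework would, in effect, walk these tiers top-down, recording which second- and third-tier factors matter for the resource at hand.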
"I admit it's ambitious," Ostrom said. "It lays out a research program for the next 15-20 years."
Applying Ostrom's framework, policymakers are encouraged first to examine the behaviors of resource users, then establish incentives for resource users to aid a conservation strategy or, at least, not interfere with it.
Ostrom's framework could also serve to normalize the effects of political upheavals that occur regularly at both national and state/provincial levels. It also accommodates non-political changes that may come with economic development and environmental change. In short, the framework's flexibility would allow resource managers to modify a plan without scrapping it entirely.
Nicole Todd | EurekAlert!