A new article published in the latest issue of Review of Policy Research examines the evolution and devolution of speed limit laws and their effects on fatality rates. The author found no significant increase in fatalities per mile driven after speed limits ceased to be set nationally and states could, as some did, raise their highway limits above fifty-five miles per hour. "Automobile safety features and enforcement emerge as important factors in increasing highway safety; speed limits are far less important," author Robert O. Yowell explains.
Although speed limits bring to mind the notion of public safety, they were introduced in the 1970s to combat a gasoline shortage. In the 1980s the focus shifted to public safety even as some speed limit regulation devolved back to the states; the maximum speed on rural interstates could be raised to 65 mph. In 1995, amid some controversy, Congress returned all speed limit authority to the states. Analysis of highway deaths per mile driven after the 1974 nationalization of the maximum highway speed indicates an initial decline in deaths steeper than the existing trend, but the long-term decreasing trend reemerged following that shock. Dr. Yowell's research identifies reasons besides speed for the long-term trend of increased highway safety. (From 1968 to 1991, the fatality rate per 100 million miles driven declined by 63.2%.) Technical progress in car manufacturing, increased use of seat belts by drivers and passengers, an increase in the minimum legal drinking age, and the general maintenance of roads all affect this rate.
Jill Yablonski | EurekAlert!