Members of Congress were, of course, concerned. Then, the next month, a similar paper in the journal Nature presented a model of how a cascade of failures across interconnected networks led to the blackout that covered Italy in 2003.
These two papers are part of a growing reliance on a particular kind of mathematical model -- a so-called topological model -- for understanding complex systems, including the power grid.
And this has University of Vermont power-system expert Paul Hines concerned.
"Some modelers have gotten so fascinated with these abstract networks that they've ignored the physics of how things actually work -- like electricity infrastructure," Hines says, "and this can lead you grossly astray."
For example, the Safety Science paper came to the "highly counter-intuitive conclusion," Hines says, that the smallest, lowest-flow parts of the electrical system -- say a minor substation in a neighborhood -- were likely to be the most effective spots for a targeted attack to bring down the U.S. grid.
"That's a bunch of hooey," says Seth Blumsack, Hines's colleague at Penn State.
Hines and Blumsack's recent study, published in the journal Chaos on Sept. 28, found just the opposite. Drawing on real-world data from the Eastern U.S. power grid and accounting for the two most important laws of physics governing the flow of electricity, they show that "the most vulnerable locations are the ones that have most flow through them," Hines says. Think highly connected transformers and major power-generating stations. Score one point for common sense.
"If the government takes these topological models seriously," Hines says, "and changes their investment strategy to put walls around the substations that have the least amount of flow -- it would be a massive waste of resources."

At the speed of light
In August of 2003, a blackout started in Ohio and spread to New York City. Cleveland went down, and soon Toronto was affected. The blackout jumped over long distances.
"The way topological cascades typically occur -- is they're more like real dominoes," says Hines, an assistant professor in UVM's College of Engineering and Mathematical Sciences. "When you push a domino the only thing that can fall is the one next to it. Whereas in a power grid you might push one domino and the next one to fall might be a hundred miles away."
That's because, "when a transmission line fails -- instantly, at nearly the speed of light, everything changes. Everything that is connected will change just a little bit," Hines says, "But in ways that are hard to predict." This strangeness is compounded by the fact that the U.S. electric grid is more an intractable patchwork of history than a rational design.
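A toy calculation makes the point. The sketch below uses the standard DC power-flow linearization of Ohm's and Kirchhoff's laws on a hypothetical three-bus network (the buses, line susceptances, and injections here are invented for illustration and are not from the study's data): tripping one line instantly changes the flow on every surviving line, even lines not adjacent to the outage.

```python
# Toy DC power-flow sketch on a hypothetical 3-bus network.
# Kirchhoff's current law plus the DC approximation of Ohm's law give
# B * theta = P; removing one line redistributes flow everywhere at once.

def solve2(a11, a12, a21, a22, p1, p2):
    """Solve a 2x2 linear system by Cramer's rule."""
    det = a11 * a22 - a12 * a21
    return ((p1 * a22 - a12 * p2) / det, (a11 * p2 - a21 * p1) / det)

def flows(lines, injections):
    """DC power flow for a 3-bus net: bus 0 is the slack (angle = 0)."""
    b = {k: lines.get(k, 0.0) for k in [(0, 1), (0, 2), (1, 2)]}
    # Reduced susceptance matrix for the two non-slack buses.
    t1, t2 = solve2(b[(0, 1)] + b[(1, 2)], -b[(1, 2)],
                    -b[(1, 2)], b[(0, 2)] + b[(1, 2)],
                    injections[1], injections[2])
    theta = [0.0, t1, t2]
    # Line flow = susceptance * angle difference across the line.
    return {(i, j): b[(i, j)] * (theta[i] - theta[j]) for (i, j) in b}

inj = {1: 0.0, 2: -1.0}  # 1 p.u. load at bus 2, supplied from the slack
base = flows({(0, 1): 10, (0, 2): 10, (1, 2): 10}, inj)
outage = flows({(0, 2): 10, (1, 2): 10}, inj)  # line 0-1 tripped
print(base)    # every line carries a share of the load
print(outage)  # flows on BOTH surviving lines change instantly
```

In the base case the load splits across all three lines; after the 0-1 outage, line 0-2 picks up the full load and line 1-2 drops to zero, even though neither was the line that failed.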
Which is why he and Blumsack decided to "run a horse race," he says, between topological models and a physics-based one -- applied to the actual arrangement of the North American Eastern Interconnect, the largest portion of the U.S. electric grid.
Using real-world data from a 2005 North American Electric Reliability Corporation test case, they compared how vulnerable parts of the grid appeared in the differing models. The topological measures -- so-called "characteristic path lengths" and "connectivity loss" between nodes -- came up with dramatically different and less accurate results than a model that calculated blackout size driven by the two rules that most influence actual electric transmissions -- Ohm's and Kirchhoff's laws.
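For concreteness, here is a minimal sketch of the two topological metrics named above, computed on an invented five-node toy graph rather than the NERC test case; the paper's exact definitions may differ in detail. Characteristic path length is the mean shortest-path distance between node pairs, and connectivity loss measures how many pairs can no longer reach each other after a node is removed.

```python
# Hedged sketch of two topological vulnerability metrics on a toy graph.
from collections import deque
from itertools import combinations

def bfs_dists(adj, src):
    """Breadth-first shortest-path distances from src (unweighted graph)."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def char_path_length(adj):
    """Mean shortest-path length over all connected node pairs."""
    total, pairs = 0, 0
    for u, v in combinations(adj, 2):
        d = bfs_dists(adj, u)
        if v in d:
            total += d[v]
            pairs += 1
    return total / pairs

def connectivity_loss(adj, removed):
    """Fraction of remaining node pairs disconnected by removing one node."""
    keep = {u: [v for v in adj[u] if v != removed]
            for u in adj if u != removed}
    before = sum(1 for _ in combinations(keep, 2))
    after = sum(1 for u, v in combinations(keep, 2)
                if v in bfs_dists(keep, u))
    return 1 - after / before

# A hub-and-spoke toy grid: node 0 is the hub, nodes 1-4 are spokes.
adj = {0: [1, 2, 3, 4], 1: [0], 2: [0], 3: [0], 4: [0]}
print(char_path_length(adj))      # 1.6
print(connectivity_loss(adj, 0))  # 1.0 -- removing the hub cuts every pair
```

Metrics like these see only the wiring diagram, not the electrical flows on it, which is exactly the gap the physics-based comparison exposed.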
In other words, the physics horse won. Or, as their paper concludes, "evaluating vulnerability in power networks using purely topological metrics can be misleading," and "results from physics-based models are more realistic and generally more useful for infrastructure risk assessment." Score one for gritty reality.

The value of unpredictability
"Our system is quite robust to small things failing -- which is very good," he says. "Even hurricanes have trouble taking out power systems. They do cause failures, but they don't often take out the whole system."
Blumsack agrees. "Our paper confirms that it would be possible for somebody who wanted to do something disruptive to the power grid to do so," he says. "A lot of the infrastructure is out in the open," which does create vulnerability to planned attack. "But if you wanted to black out half of the U.S., it will be much more difficult than some of these earlier models imply," he says.
"If you were a bad guy, there is no obvious thing to do to take out the power system," Hines says. "What we learned from doing the simulations is that if you take out the biggest substation, with the most flow, you get the biggest failure on average. But there were also a number of cases where, even if you took out the biggest one, you don't get much of a blackout."
"It takes an incredible amount of information," he says, "to really figure out how to make the grid fail."
Joshua Brown | EurekAlert!