A method of assessing the stability of large-scale power grids in real time could bring the world closer to its goal of producing and using a smart grid. The algorithmic approach, developed by UC Santa Barbara professor Igor Mezic along with Yoshihiko Susuki from Kyoto University, can predict impending large-scale instabilities in the power grid and could make power outages a thing of the past.
"If we can get these instabilities under control, then people won't have to worry about losing power," said Mezic, who teaches in UCSB's Department of Mechanical Engineering, "And we can put in more fluctuating sources, like solar and wind."
While the development of more energy-efficient machines and devices and the emergence of alternative forms of energy give us reason to be optimistic about a greener future, the promise of sustainable, reliable energy is only as good as the infrastructure that delivers it. Conventional power grids, the systems that still distribute most of our electricity today, were built for the demands of almost a century ago. As the demand for energy steadily rises, not only will supply become inadequate with today's technology, but its distribution will also become inefficient and wasteful.
"Each individual component does not know what the collective state of affairs is," said Mezic. Current methods rely on a steady, abundant supply, producing enough energy to flow through the grid at all times, regardless of demand, he explained. However, should part of a grid already operating at capacity fail — say in times of disaster, attack or malfunction — widespread blackouts all over the system can occur.
"Everybody shuts down," Mezic said. The big surges of power left unregulated by the malfunctioning component can either overload and burn out other parts of the grid, or cause them to shut down to avoid damage, he explained. The result is a massive power outage and subsequent economic and physical damage. The Northeast Blackout of 2003 was one such event, affecting several U.S. states and part of Canada, crippling transportation, communication and industry.
One way to address the situation would be to build more power plants, producing a steady supply to feed the grid with enough spare capacity to handle unpredictable failures, fluctuations and shutdowns. It's a solution that's costly both for the environment and for the checkbook.
However, the method developed by Mezic and partners promises to prevent cascading blackouts and their subsequent effects by monitoring the entire grid for early signs of failure, in real time. Called Koopman Mode Analysis (KMA), it is a dynamical approach based on a concept related to chaos theory, and is capable of monitoring seemingly innocuous fluctuations in measured physical power flow. Using data from existing monitoring methods, like Supervisory Control And Data Acquisition (SCADA) and Phasor Measurement Units (PMUs), KMA can track power fluctuations against the greater landscape of the grid and predict emerging events. The result is the ability to prevent and control large-scale blackouts and the damage they can cause.
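The idea of extracting oscillatory "modes" and their growth rates from streaming grid measurements can be illustrated numerically. A minimal sketch follows, using Dynamic Mode Decomposition, a standard numerical route to Koopman mode analysis; this is an illustration of the general technique, not the specific algorithm in the paper, and the function name, the `rank` parameter and the synthetic signals are assumptions for the example.

```python
import numpy as np

def koopman_modes(snapshots, dt, rank=None):
    """Estimate Koopman (DMD) modes from a sequence of measurement snapshots.

    snapshots: array of shape (n_signals, n_times), e.g. PMU voltage angles
               sampled every dt seconds.
    Returns (modes, freqs, growth): spatial mode shapes, oscillation
    frequencies in Hz, and growth/decay rates (growth > 0 flags a mode
    whose oscillation is building up, an early sign of instability).
    """
    # Pair each snapshot with its successor: Y ≈ A X for a linear operator A
    X, Y = snapshots[:, :-1], snapshots[:, 1:]
    U, s, Vh = np.linalg.svd(X, full_matrices=False)
    if rank is not None:  # truncate to suppress noise / rank deficiency
        U, s, Vh = U[:, :rank], s[:rank], Vh[:rank]
    # Project the one-step dynamics onto the leading singular subspace
    A_tilde = U.conj().T @ Y @ Vh.conj().T @ np.diag(1.0 / s)
    eigvals, W = np.linalg.eig(A_tilde)
    # Lift eigenvectors back to the full measurement space (exact DMD modes)
    modes = Y @ Vh.conj().T @ np.diag(1.0 / s) @ W
    freqs = np.angle(eigvals) / (2 * np.pi * dt)   # Hz
    growth = np.log(np.abs(eigvals)) / dt          # 1/s; > 0 means growing
    return modes, freqs, growth

# Synthetic example: a 0.5 Hz inter-area swing seen at four buses with
# different phases, sampled at 10 Hz.
t = np.arange(0.0, 20.0, 0.1)
data = np.array([np.cos(2 * np.pi * 0.5 * t + 0.4 * i) for i in range(4)])
modes, freqs, growth = koopman_modes(data, dt=0.1, rank=2)
```

On this synthetic record the analysis recovers a mode pair at ±0.5 Hz with growth rate ~0, i.e. a sustained but non-growing oscillation; a positive growth rate on a dominant mode would be the kind of early-warning signal the article describes.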
Additionally, this approach can lead to wider development of, demand for and use of renewable sources of energy, said Mezic. Because energy from systems like wind, water and sun is weather-dependent, it tends to fluctuate naturally, and this ability to respond to fluctuations can dispel whatever reservations utilities may have about relying on renewables to a greater degree.
Mezic's research is published in the Institute of Electrical and Electronics Engineers journal IEEE Transactions on Power Systems. Other collaborators in Koopman Mode Analysis research include researchers from Princeton University, Tsinghua University in China and the Royal Institute of Technology in Sweden.
Sonia Fernandez | EurekAlert!