The study finds that the tests used by regulators fail to detect when VaR models inaccurately account for significant swings in the market. This matters because VaRs are key risk-assessment tools that financial institutions use to determine how much capital they need to keep on hand to cover potential losses.
"Failing to modify the VaR to reflect market fluctuations is important," study co-author Dr. Denis Pelletier says, "because it could lead to a bank exhausting its on-hand cash reserves." Pelletier, an assistant professor of economics at NC State, says "Problems can come up if banks miscalculate their VaR and have insufficient funds on hand to cover their losses."
VaRs are a way to measure the risk exposure of a company's portfolio: economists determine the range of potential future losses and attach a statistical probability to those losses. For example, there may be a 10 percent chance that a company could lose $1 million. The VaR is generally defined as the loss amount that a portfolio stands only a one percent chance of exceeding.
In other words, the VaR is not quite the worst-case scenario, but it is close. The smaller a company's VaR, the less risk its portfolio is exposed to. If a company's portfolio is valued at $1 billion, for example, a VaR of $15 million signals significantly less risk than a VaR of $25 million.
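To make the definition concrete, here is a minimal sketch of how a one percent VaR could be estimated by historical simulation. The synthetic profit-and-loss figures and the choice of estimator are assumptions made purely for illustration; the study does not prescribe this particular method.

```python
import numpy as np

# Synthetic one-year history of daily profit-and-loss for a hypothetical
# portfolio; the figures are invented purely for illustration.
rng = np.random.default_rng(seed=0)
daily_pnl = rng.normal(loc=0.0, scale=6e6, size=250)  # 250 trading days

# One-day 99% VaR by historical simulation: the loss that the daily P&L
# has only a 1 percent chance of exceeding.
var_99 = -np.percentile(daily_pnl, 1)
print(f"One-day 99% VaR: ${var_99 / 1e6:.1f} million")
```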
The NC State study indicates that regulators could use additional tests to detect when the models used by banks fail to accurately assess the statistical probability of losses in financial markets. The good news, Pelletier says, is that the models banks use tend to be overly conservative, meaning banks rarely lose more than their VaR. The bad news is that those models do not adjust the VaR quickly when the market is in turmoil: when banks do "violate," or lose more than, their VaR, they tend to do so multiple times in a short period.
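The clustering problem can be illustrated with a simple check on a backtest record. This is a hypothetical sketch in the spirit of the formal independence tests the literature uses; the function names and logic are assumptions for illustration, not code from the study.

```python
import numpy as np

def var_violations(daily_pnl, var_forecasts):
    """Flag the days on which losses exceeded the VaR forecast."""
    return np.asarray(daily_pnl) < -np.asarray(var_forecasts)

def longest_violation_run(hits):
    """Length of the longest streak of consecutive violations.

    A well-calibrated 1% VaR should be violated on roughly 1 percent
    of days, with violations scattered independently over time; long
    streaks suggest the model is not adapting to market turmoil.
    """
    longest = current = 0
    for hit in hits:
        current = current + 1 if hit else 0
        longest = max(longest, current)
    return longest
```

A regulator applying such a check could flag a model whose violations arrive in streaks even when the total violation count looks acceptable.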
This could have serious consequences, Pelletier explains. "For example, if a bank has a VaR of $100 million it would keep at least $300 million in reserve, because banks are typically required to keep three to five times the VaR on hand in cash as a capital reserve. So it could afford a bad day – say, $150 million in losses. However, it couldn't afford several really bad days in a row without having to sell illiquid assets, putting the bank further in distress."
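Pelletier's arithmetic can be traced in a few lines; the string of hypothetical consecutive losses below is assumed for illustration.

```python
var = 100e6               # daily VaR of $100 million
reserve = 3 * var         # capital held at three times the VaR
bad_days = [150e6, 120e6, 90e6]  # hypothetical consecutive daily losses

for day, loss in enumerate(bad_days, start=1):
    reserve -= loss
    print(f"Day {day}: lost ${loss / 1e6:.0f}M, reserve now ${reserve / 1e6:.0f}M")
# One $150M day leaves $150M in reserve, but three bad days in a row
# drive the reserve below zero, forcing sales of illiquid assets.
```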
Regulatory authorities, such as the Federal Deposit Insurance Corporation, require banks to calculate their VaR on a daily basis. Pelletier says the new study indicates that those authorities need to do more to ensure that banks use dynamic models and don't face multiple VaR violations in a row.
The study, "Evaluating Value-at-Risk Models with Desk-Level Data," was co-authored by Pelletier, Jeremy Berkowitz of the University of Houston and Peter Christoffersen of McGill University. The study will be published in a forthcoming special issue of Management Science on interfaces of operations and finance.
Matt Shipman | EurekAlert!