In this paper we study copula-based models for the aggregation of operational risk capital across business lines in a bank. A commonly used method, summation of the value-at-risk (VaR) measures, relies on the assumption of perfect correlation of losses; it becomes inappropriate in the presence of dependence between business lines and may lead to overestimation of the capital charge. The problem can be further aggravated by the heavy tails of operational loss data: in some cases the subadditivity property of value-at-risk fails, and the capital charge becomes underestimated. We use α-stable heavy-tailed distributions to model the loss data and then apply the copula approach, in which the marginal distributions are consolidated within the symmetric and skewed Student t-copula framework. In our empirical study, we compare VaR and conditional VaR estimates with those obtained under the full-correlation assumption. Our results demonstrate a significant reduction in capital when a t-copula is employed. However, the capital reduction is significantly smaller than in cases where a moderately heavy-tailed or thin-tailed distribution is calibrated to the loss data. We also show that for confidence levels below 94% VaR exhibits the superadditivity property.
[Authors: Rosella Giacometti, Svetlozar Rachev, Anna Chernobai, Marida Bertocchi]
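The copula-based aggregation described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's calibration: the lognormal and Pareto marginals are stand-ins for the authors' α-stable fits, the symmetric t-copula is built by the standard normal-mixture construction, and all parameter values (tail indices, correlation, degrees of freedom) are purely illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, nu, rho = 200_000, 4, 0.3

# Bivariate Student t-copula via the normal-mixture construction:
# correlated normals divided by a chi-square mixing variable,
# then mapped to uniforms through the t CDF.
cov = np.array([[1.0, rho], [rho, 1.0]])
z = rng.multivariate_normal([0.0, 0.0], cov, size=n)
w = rng.chisquare(nu, size=n) / nu
t_samples = z / np.sqrt(w)[:, None]
u = stats.t.cdf(t_samples, df=nu)

# Heavy-tailed marginals for two hypothetical business lines.
loss1 = stats.lognorm.ppf(u[:, 0], s=2.0)
loss2 = stats.pareto.ppf(u[:, 1], b=1.5)

q = 0.999
var_joint = np.quantile(loss1 + loss2, q)                 # copula-based VaR
var_added = np.quantile(loss1, q) + np.quantile(loss2, q) # "full correlation" add-up
print(f"copula VaR: {var_joint:.0f}  added-up VaR: {var_added:.0f}")
```

Comparing the two printed figures reproduces, in miniature, the paper's question of whether the simple add-up over- or understates the joint quantile once dependence and tail heaviness are modelled explicitly.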
[Giacometti0, 10497 downloads, 05.06.2008]
Operational losses are generally observed at specific points in time and vary from moderate to possibly very large amounts. Both variables, the time of the event and the amplitude of the associated loss, are random variables whose distributions must be estimated. The compound Poisson process provides an accurate analytical framework for this modelling problem. In this paper, we analyse the class of parametric distributions that best fits the observed empirical loss data classified by business lines. Particular attention is devoted to fitting the tail of the loss distribution.
[Authors: Rosella Giacometti, Svetlozar Rachev, Anna Chernobai, Marida Bertocchi, Giorgio Consigli]
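A compound Poisson loss model of the kind the abstract describes can be simulated in a few lines. This is a generic sketch under assumed, illustrative parameters (25 events per year, lognormal severities as one heavy-tailed candidate), not the distributions the authors fit to their data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

def annual_losses(lam, severity, n_years=5_000):
    """Aggregate loss per year under a compound Poisson model:
    a Poisson(lam) number of events, each with an i.i.d. severity draw."""
    counts = rng.poisson(lam, size=n_years)
    return np.array([severity.rvs(size=k, random_state=rng).sum()
                     for k in counts])

# Illustrative business line: 25 loss events per year on average,
# lognormal severity as one heavy-tailed candidate distribution.
totals = annual_losses(lam=25, severity=stats.lognorm(s=1.8, scale=10_000))
print(f"mean annual loss: {totals.mean():.0f}")
print(f"99.9% quantile:   {np.quantile(totals, 0.999):.0f}")
```

Swapping the `severity` object for other scipy distributions (Pareto, Weibull, generalized Pareto) is how one would compare candidate fits for the loss tail, as the paper does across business lines.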
[Giacometti, 5720 downloads, 28.05.2008]
Dealing with risk has always been a core task of every insurance company. Data collection, risk analysis, pricing, and reserving are essential parts of the business of insurers and reinsurers. Until now, the focus has mostly been on analysing the risks of policyholders and the resulting actuarial risks. Comparatively new is the systematic consideration of the risks affecting the insurance company itself, such as fluctuations in the assets held or the default of counterparties. The consistent continuation of this development towards a holistic analysis is the consideration of the risks to the operation of the company itself: operational risk (OpRisk). The article by Niels Kunzelmann and Markus Quick (Dr. Peter & Company AG) discusses approaches for identifying, measuring, and managing these risks in insurance companies.
[MQuick, 18150 downloads, 27.05.2008]
This paper investigates the generalized parametric measurement methods of aggregate operational risk in compliance with the regulatory capital standards for operational risk in the New Basel Capital Accord ("Basel II"). Operational risk is commonly defined as the risk of loss resulting from inadequate or failed internal processes and information systems, from misconduct by people, or from unforeseen external events. Our analysis informs an integrated assessment of the quantification of operational risk exposure and the consistency of current capital rules on operational risk. Given the heavy-tailed nature of operational risk losses, we employ extreme value theory (EVT) and the g-and-h distribution within a "full data" approach to derive point estimates of unexpected operational risk at the 99.9th percentile in line with the Advanced Measurement Approaches (AMA). Although such internal risk estimates substantiate a close analytical representation of operational risk exposure, the accuracy and order of magnitude of point estimates vary greatly by percentile level, estimation method, and threshold selection. Since the scarcity of historical loss data defies back-testing at high percentile levels and requires the selection of extremes beyond a threshold level around the desired level of statistical confidence, the quantitative criteria of AMA standards appear overly stringent. A marginally lower regulatory percentile of 99.7% would entail an outsized reduction of the optimal loss threshold and unexpected loss at disproportionately smaller estimation uncertainty.
[Author: Andreas A. Jobst / Journal of Operational Risk, Vol. 2, No. 2, 2007]
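The EVT route mentioned in the abstract, estimating a 99.9th-percentile loss from exceedances over a threshold, can be sketched with a peaks-over-threshold fit. The data below are simulated Pareto losses standing in for internal loss data, and the 95% empirical threshold is an assumed choice; the paper's point is precisely that such threshold and method choices move the estimate materially.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Simulated heavy-tailed loss sample standing in for internal loss data.
losses = stats.pareto(b=1.2, scale=1_000).rvs(size=20_000, random_state=rng)

# Peaks-over-threshold: fit a generalized Pareto distribution (GPD)
# to the exceedances above a high empirical threshold.
u = np.quantile(losses, 0.95)
excess = losses[losses > u] - u
xi, _, beta = stats.genpareto.fit(excess, floc=0.0)

# Standard EVT quantile estimator for VaR at level p (xi != 0):
#   VaR_p = u + (beta/xi) * ( ((n/N_u) * (1 - p))**(-xi) - 1 )
n, n_u = len(losses), len(excess)
p = 0.999
var_evt = u + (beta / xi) * (((n / n_u) * (1 - p)) ** (-xi) - 1)
print(f"threshold: {u:.0f}  EVT VaR(99.9%): {var_evt:.0f}")
print(f"empirical VaR(99.9%): {np.quantile(losses, p):.0f}")
```

Re-running this with different thresholds (say the 90% or 99% empirical quantile) illustrates the estimation-uncertainty issue the paper raises: the further into the tail the target percentile lies relative to the available exceedances, the more the point estimate swings.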
[Jobst, 9214 downloads, 13.05.2008]
In providing support for disaster-prone areas such as the Caribbean, the development community has begun to progress from disaster reconstruction assistance to funding for investment in mitigation as an explicit tool for sustainable development. Now it must enter a new phase: applying risk transfer mechanisms to address the financial risk of exposure to catastrophic events that require funding beyond what can be controlled solely through mitigation and physical measures.
Residual stochastic risks from catastrophic natural events can be addressed through insurance pooling and risk transfer mechanisms that provide the basis for financial protection and instill strong incentives for reducing vulnerability.
To reduce the economic stress after disasters, Pollner shows, World Bank instruments could be used to support initiatives to help correct market imperfections in catastrophe insurance. He takes a step-by-step approach to showing how both risk pooling structures and alternative catastrophe coverage mechanisms (long-maturity risk financing facilities, weather-indexed contracts, and capital market instruments) can achieve better risk protection and financing terms - enough to allow the expansion of insurance coverage of public assets and private property.
Pollner examines the insurable assets (private and public) in eight countries in the easternmost part of the Caribbean and, by quantifying the portion of the premium and risk used to fund catastrophe losses, shows that through pooling and the use of credit-type instruments for catastrophe coverage, governments and uninsured property owners or enterprises (with insurable assets) could expect to improve their terms of coverage, without reducing the profitability of either local insurers or reinsurers.
[World Bank Policy Research Working Paper No. 2560, 2001]
[Pollner, 7319 downloads, 08.05.2008]
In the past decade, the legal system has done a remarkable job in absorbing the shockwaves of digital technology. As a result, the use of information and communication technologies in corporate settings in general, and E-Business solutions in particular, has become business as usual not only for dot-com managers, but increasingly also for in-house lawyers and outside counsel.
The authors of this article, however, argue that the widespread use of digital communication technology on the part of business organizations leads at least in part (and most likely also latently) to new types of challenges when it comes to the management of risks at the intersection of law, technology, and the marketplace. In order to effectively manage these challenges and associated risks in diverse areas such as security, privacy, consumer protection, IP, and content governance, the authors call for an integrated and comprehensive compliance concept in response to the structural and substantive peculiarities of the digital environment in which corporations - both in and outside the dot-com industry - operate today.
The article starts with a brief overview of what we might describe as a shift from traditional compliance to E-Compliance. It then maps the central themes of E-Compliance and the characteristics of a comprehensive E-Compliance strategy. After discussing the key challenges of E-Compliance, the article outlines practical guidelines for the management of E-Compliance activities and ends with recommendations.
[Authors: Urs Gasser, Harvard University - Berkman Center for Internet & Society; University of St. Gallen / Daniel M. Haeusermann, University of St. Gallen - Research Center for Information Law (FIR-HSG)]
[Gasser, 8756 downloads, 08.05.2008]
We examine the quantification of operational risk for banks. We adopt a financial-economics approach and interpret operational risk management as a means of optimizing the profitability of an institution along its value chain. We start by defining operational risk and then propose a framework to model risk mitigation through the bank's value chain over time. Using analytical and numerical methods, we obtain answers concerning capital allocation, network stability, risk figures, and diversification issues. Interpreting the results shows that the usual intuition gained from market and credit risk does not apply to the quantification of operational risk.
[Authors: Markus Leippold; Paolo Vanini]
[Leippold0, 9600 downloads, 14.04.2008]
The modernisation of the outsourcing regulations reinforces the risk-oriented planning and management of outsourcing projects. From this follows the need for a holistic risk management approach that addresses the risks associated with outsourcing across the organisation. This can be achieved by adequately integrating outsourcing management into enterprise-wide risk management. The article presents the wide range of design options for a holistic value- and risk-oriented management approach (outsourcing governance). Successful implementation of outsourcing governance includes, among other things, identifying and analysing value-creation potential within the company, effectively managing the central tasks in all phases of the outsourcing project, and anchoring the most important outsourcing control instruments in the company.
[Source: RISIKO MANAGER, issue 4/2008, pp. 1-17 / Authors: André Baumgart, Thomas Falk, Nicolas Fandrey, Helge Lautenbach]
[Baumgart, 8426 downloads, 10.04.2008]
For a long time, operational risks were systematically underestimated in the financial world. Not least because of the debate around Basel II, this risk category has only moved into the focus of many banks in recent years. This grievous neglect may be explained, at least in part, by the fact that numerous financial institutions did not record or recognise their losses from operational risks as such, but simply assigned the resulting damage to other risk classes. As the collapse of the British bank Barings shows, however, operational risks are often the true cause of many losses that ultimately manifest themselves elsewhere.
[Source: RISKNEWS 01/2004]
[Erben, 7731 downloads, 04.03.2008]
The DSGV data pools for loss events and scenarios from operational risks (OR) are fed by the data that the savings banks collect via the OR instruments loss-event database, risk map, and risk inventory. At the same time, they complement that toolkit, because clever people, and savings banks, learn from their own mistakes, but the wise learn from the mistakes of others. (Betriebswirtschaftliche Blätter 55, February 2007, p. 89)
[JohannesVoit, 10963 downloads, 29.02.2008]