Prudent valuation of financial instruments, based on independently verified prices, has become increasingly important in recent years. The implications of severe mismarking and even fraudulent actions – write-downs, decreased shareholder value and a loss of trust in the financial industry – together with the demanding requirements of banking authorities, force financial institutions to improve their functional and technical setup to ensure a correct valuation of trading positions on a daily basis.
A consistent price-set – verified by state-of-the-art test methods, flexibly adaptable to changes in the markets, generated daily by highly automated IT processes, based on a clearly defined governance model, and presented on a central distribution layer for shared usage by all market data customers within the bank – is a future success factor and the subject of the following article.
The Challenge
Independent Price Verification (IPV) is defined by the Basel II Prudent Valuation Guidance as "the process by which market prices or model inputs are regularly verified for accuracy … (and which) should be performed by a unit independent of the dealing room, at least monthly (or, depending on the nature of the market/trading activity, more frequently)" (Basel 2.5, July 2009, §718(civ)).
An appropriate framework to fulfill this regulatory requirement is the so-called Market Data End-of-Day (MD-EoD), normally established in the Market Risk Department of the institution, which guarantees the required independence.
The MD-EoD aims at sourcing and validating independent observable prices and verifiable market input parameters used by the valuation engines in the daily Fair Value Measurement of financial instruments. The MD-EoD produces the official EoD-price-set of a financial institution, which is a crucial input for various overnight processes, and in particular the Front Office EoD, which computes the daily P&L.
When dealing with the valuation of financial instruments, financial institutions are confronted with the following challenges:
- Increased volatility of the financial markets: Market conditions can change abruptly, and risk factors that were once deemed negligible have gained importance in recent years.
- Inconsistency in pricing: Multiple (internal and external) channels deliver and produce market data, which often leads to inconsistencies in the valuations of financial instruments. This is especially an issue given the growing importance of model calibration and back-testing.
- Increased pressure from regulators: Regulators steadily demand higher quality of market data and of the control and pricing processes, but also extended coverage of the instrument universe and a higher frequency of controls.
In this article, we demonstrate that an agile IPV platform can go well beyond fulfilling regulatory requirements and also provide a powerful instrument to tackle the aforementioned challenges.
Success Factors
When designing an in-house IPV platform, based on a database application that can cope with a high volume of intraday ticks and EoD time-series (available on the market), the following features must be considered as success factors.
Central market data repository: A consistent set of quality-assured ("IPVed") prices is generated centrally every day and presented to all market data customers of the financial institution.
Governance model: The IPV is assigned to centers of competence, which act on clearly defined validation rules and quality standards as well as on appropriate escalation procedures.
Standardized STP (straight-through processing): In order to keep the software lean and, in the long run, maintainable and testable, all types of market parameters or model inputs – liquid, semi-liquid or illiquid, or, using different categories, plain (spots and pillars) and structured (curves and surfaces) – should run through a standardized IPV process.
The necessary flexibility is granted by a configuration framework: the users can define, implement and adapt methods and functions and combine these into rulesets, which can be seen as a sequence of tests to validate a price (a minimal sketch follows below). This ruleset framework can be used to validate different objects like single instruments, curves or surfaces, but also to configure various process types – not only the IPV process, but also support functions like liquidity or contributor scoring.
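To illustrate the ruleset idea, here is a minimal sketch – assuming, purely for illustration, Python as implementation language and hypothetical rule names: a rule maps a candidate price and its context to PASS or FAILED, and a ruleset is an ordered sequence of such tests.

```python
# Minimal sketch of the ruleset idea (all names hypothetical).
from dataclasses import dataclass
from typing import Callable, Sequence

@dataclass
class PriceContext:
    price: float
    prev_price: float
    tolerance: float  # calibrated from historical time-series

Rule = Callable[[PriceContext], bool]  # True = test passed

def is_positive(ctx: PriceContext) -> bool:
    return ctx.price > 0

def within_tolerance(ctx: PriceContext) -> bool:
    return abs(ctx.price - ctx.prev_price) <= ctx.tolerance

def run_ruleset(rules: Sequence[Rule], ctx: PriceContext) -> str:
    # A price passes only if every test in the sequence passes.
    return "PASS" if all(rule(ctx) for rule in rules) else "FAILED"

spot_ruleset = [is_positive, within_tolerance]
print(run_ruleset(spot_ruleset, PriceContext(1.0842, 1.0835, 0.0050)))  # PASS
```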
Adaptability is supported by a further core component, the workflow builder: the user can specify (groups of) "tasks", which are performed by the different batch processes in the sequence provided by the user. One category of tasks consists in particular of computations, which format the input parameters as needed to call a function in the central financial math library and define how the return value should be used. The workflow ensures that dependencies between asset classes or instruments are always treated consistently: prices of many instruments are computed from other instruments, and modifications of the input parameters must be propagated along the whole dependency graph (see the sketch below).
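A minimal sketch of this dependency handling, with an illustrative, hypothetical parameter graph: any change to an input parameter triggers recomputation of everything downstream, in an order that respects the dependencies.

```python
# Hypothetical sketch: tasks form a dependency graph, and a change to
# an input parameter is propagated along the graph in topological order.
from graphlib import TopologicalSorter

# node -> set of inputs it depends on (illustrative parameter names)
dependencies = {
    "EURUSD_spot": set(),
    "EUR_swap_curve": {"EURUSD_spot"},
    "EUR_fwd_points": {"EURUSD_spot", "EUR_swap_curve"},
}

def recompute_order(changed: str) -> list[str]:
    # Collect everything downstream of the changed parameter ...
    affected, grew = {changed}, True
    while grew:
        grew = False
        for node, inputs in dependencies.items():
            if node not in affected and inputs & affected:
                affected.add(node)
                grew = True
    # ... and return it in an order that respects the dependencies.
    order = TopologicalSorter(dependencies).static_order()
    return [n for n in order if n in affected]

print(recompute_order("EURUSD_spot"))
# ['EURUSD_spot', 'EUR_swap_curve', 'EUR_fwd_points']
```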
Data management: In order to supply all potential market data customers within the financial institution with a complete and consistent set of validated prices, the market data perimeter must be kept in sync with the instrument universe in end-user systems. Therefore, a daily import of instruments, curves and surfaces, with an automatic reconciliation and update of the market parameter universe, is essential (a sketch follows below). Furthermore, quality-assured repositories must be connected that deliver exposure data and valuation-relevant static instrument data. A user interface on the IPV platform for instrument, curve and surface management allows the users to run consistency checks and to customize the structures of market parameters that must be validated as a whole, like blocks of instruments, curves and surfaces.
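The daily reconciliation step can be sketched as follows, with hypothetical identifiers: the instrument universes are compared via set differences to find parameters to onboard or to retire.

```python
# Hypothetical sketch of the daily reconciliation: the market data
# perimeter is kept in sync with the end-user instrument universe.
def reconcile(ipv_universe: set[str], end_user_universe: set[str]) -> dict:
    return {
        "to_onboard": end_user_universe - ipv_universe,  # new instruments
        "to_retire": ipv_universe - end_user_universe,   # obsolete ones
        "in_sync": ipv_universe & end_user_universe,
    }

report = reconcile({"DE0001102333", "XS0205935470"},
                   {"DE0001102333", "FR0013154028"})
print(report["to_onboard"])  # {'FR0013154028'}
```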
Equipped in this way, the IPV is a fully automatic process that provides a price for each market parameter, enriched by a status of either PASS or FAILED, the latter marking suspect prices. FAILED means that the price did not pass the demanding tests and should be re-checked by the user and remediated if needed.
For liquid and semi-liquid instruments, a user interface for remediation allows the correction of FAILED prices just before distribution – important for those that may cause a large P&L impact. Illiquid financial instruments, however, can consume a great deal of resources, are often challenging to value, and cannot be modified on a daily basis before distribution. Instead, a valuation based on the best available source (mostly the front office) is checked by the automatic IPV but distributed in any case. The results of the automatic IPV are analyzed on the next day at a given frequency (weekly, monthly, etc.).
A precondition for the fully automatic IPV is an IT infrastructure of high stability and performance. To ensure consistency of the price-set, advanced emergency handling is needed, including a sophisticated fallback procedure that considers all dependencies between subsets of prices (a sketch follows below).
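The fallback logic can be sketched as follows (the subset names and the dependency map are illustrative assumptions): if a subset cannot be produced, every subset depending on it falls back as well, e.g. to the previous day's set, keeping the price-set internally consistent.

```python
# Hypothetical sketch of a dependency-aware fallback.
dependents = {
    "FX_spots": ["IR_curves", "FX_vol_surfaces"],
    "IR_curves": ["CRD_curves"],
}

def fallback_scope(failed_subset: str) -> set[str]:
    # Everything downstream of the failed subset must fall back too.
    scope, stack = set(), [failed_subset]
    while stack:
        current = stack.pop()
        if current not in scope:
            scope.add(current)
            stack.extend(dependents.get(current, []))
    return scope

print(sorted(fallback_scope("FX_spots")))
# ['CRD_curves', 'FX_spots', 'FX_vol_surfaces', 'IR_curves']
```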
The price-set is published on a central distribution layer, where the customers can subscribe to the subsets they need, and which is ideally connected to a transformation service that converts the prices into any type of representation required by end-users (sketched below).
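A minimal sketch of such a transformation service, with hypothetical output formats: subscribers request a subset and a representation, and the service converts the validated prices accordingly.

```python
# Hypothetical sketch: the distribution layer filters the price-set
# per subscription and converts it to the requested representation.
import csv, io, json

def to_json(prices: dict[str, float]) -> str:
    return json.dumps(prices, sort_keys=True)

def to_csv(prices: dict[str, float]) -> str:
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["instrument", "price"])
    for instrument, price in sorted(prices.items()):
        writer.writerow([instrument, price])
    return buf.getvalue()

FORMATS = {"json": to_json, "csv": to_csv}

def publish(prices: dict[str, float], subset: set[str], fmt: str) -> str:
    selection = {k: v for k, v in prices.items() if k in subset}
    return FORMATS[fmt](selection)

eod = {"EURUSD": 1.0842, "USDJPY": 149.32, "GBPUSD": 1.2701}
print(publish(eod, {"EURUSD", "USDJPY"}, "json"))
```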
IPV Workflow
In the following, the above-mentioned STP is outlined for the asset classes FX (Foreign Exchange) and IR (Interest Rates) as an example. It applies analogously to EQ/COM (Equity, Commodity) and CRD (Credit & Bonds), whereby dependencies must be considered (e.g. the validation of credit products may require already validated IR curves).
The target of the daily MD-EoD is to provide a complete and consistent price-set that represents the state of the market on the observed day.
Usually, the MD-EoD process is scheduled between 5 pm (defined as the end of the trading day) and 7 pm, before front office systems start their P&L computation. The time window is tight, considering the high number of IPV objects (up to 100,000), the time-consuming (re-)calculations and the necessity to provide a time slot for user interventions to correct detected outliers, at least those that may cause material P&L jumps.
Straight-through-processing – Step by step:
- After the 5 pm snapshot of market quotes, the EoD preparation starts, which normalizes input data to make it comparable.
- The automatic IPV is performed with the aim of having a price for each market parameter and providing each of them with a status of either PASS or FAILED.
FX spots are validated first, because they are often needed as input for the IPV of other instruments and asset classes.
The IPV of the rates instruments must follow a specific workflow (illustrated in figure 1):
- First, all instruments are checked for which quotes are observable on the markets. The validation applies the configured rulesets of the IPV framework, with thresholds calibrated from historical time-series and ideally weighted by a dynamic component – either an appropriate index or simply a factor that can be adjusted intraday ahead of the EoD process in case of strong market movements (see the calibration sketch following figure 1). Outliers are replaced by the default value (e.g. the previous day's price) and flagged as FAILED for further analysis by the user.
- Next – using the output of the previous IPV step – the so-called "derived" market parameters are computed, as configured in the workflow builder, receiving their status from their input parameters (in particular, FAILED if an input value is FAILED).
- Thereafter, curves are checked as a structure, whereby detected misshapen curves or curve sectors are marked as FAILED.
- This is the phase for user remediation: the results of the IPV process are presented to users in so-called "problem lists", one for single instruments and one for curves. The user can correct FAILED prices and smooth curve shapes by replacing spikes with values from another source or by interpolation. Each modification requires a re-calculation of the "derived" market parameters. The automation through the ruleset framework tends to keep these lists minimal and user remediation exceptional. This short phase of user action is ideally supported by a ranking of the IPV failures according to their relevance (exposure).
- Now the EoD-set can be completed by the volatility surfaces. As soon as the IR curves are validated, external reference surfaces can be computed and used to check the internal ones. It is recommended to use the internal surfaces as the EoD-set and conduct the automatic IPV, but to perform the analysis of FAILED surfaces on the next day, because analysis and possible remediation are very time-consuming.
- The whole universe of prices is eventually written into the central market data repository and provided on the central distribution layer, where the customers can subscribe to the subsets they need.
The IPV workflow is illustrated in figure 1.
Figure 1: IPV Workflow for Rates Instruments
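A minimal sketch of the outlier test mentioned in the first workflow step, with a calibrated, dynamically weighted threshold (the calibration rule and all values are illustrative assumptions): the tolerance is derived from the standard deviation of historical daily changes and scaled by a factor that can be raised intraday in case of strong market moves.

```python
# Hypothetical sketch of threshold calibration and the outlier test.
import statistics

def calibrate_threshold(history: list[float], k: float = 4.0,
                        stress_factor: float = 1.0) -> float:
    # Tolerance from the dispersion of historical daily changes,
    # scaled by a dynamic component for stressed markets.
    changes = [abs(b - a) for a, b in zip(history, history[1:])]
    return k * statistics.stdev(changes) * stress_factor

def validate(price: float, prev_price: float,
             threshold: float) -> tuple[float, str]:
    if abs(price - prev_price) <= threshold:
        return price, "PASS"
    # Outlier: replace by the default value (previous day's price).
    return prev_price, "FAILED"

history = [1.081, 1.083, 1.080, 1.084, 1.082, 1.085]
threshold = calibrate_threshold(history, stress_factor=1.5)
print(validate(1.20, 1.085, threshold))  # (1.085, 'FAILED')
```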
Support Functions
The core IPV tool can be enriched by support features, which increase the quality of the automatic IPV and deliver additional benefits.
Liquidity Scoring: This function classifies instruments as liquid, semi-liquid or illiquid and supports the IFRS 13 FVHL (fair value hierarchy level 1, 2 or 3) assignment. Furthermore, the scores can be used by the automatic process to apply liquidity-dependent thresholds and IPV rules (sketched below).
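A minimal sketch of such a classification; the concrete criteria (number of contributors, quote age) and the mapping to IFRS 13 levels are illustrative assumptions.

```python
# Hypothetical liquidity score based on contributing sources and
# the days since the last observable quote.
def liquidity_class(num_contributors: int, days_since_quote: int) -> str:
    if num_contributors >= 3 and days_since_quote <= 1:
        return "liquid"        # e.g. candidates for IFRS 13 level 1
    if num_contributors >= 1 and days_since_quote <= 10:
        return "semi-liquid"   # level 2
    return "illiquid"          # level 3

# Liquidity-dependent tolerance: the less liquid, the wider the band.
TOLERANCE_FACTOR = {"liquid": 1.0, "semi-liquid": 2.0, "illiquid": 4.0}

print(liquidity_class(5, 0))   # liquid
print(liquidity_class(1, 7))   # semi-liquid
```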
Contributor Scoring: Normally the so-called preferred contributor determines the source price. For methods like "one vs. reference", the second best is needed, or the median or average calculated from a group of "good contributors". The results of a daily scoring ahead of the EoD can automatically be used by the IPV (see the sketch below).
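A minimal sketch of a "one vs. reference" check, with hypothetical contributor names: the preferred contributor's quote is compared against the median of the remaining good contributors.

```python
# Hypothetical "one vs. reference" test using the daily scoring output.
import statistics

def one_vs_reference(quotes: dict[str, float], preferred: str,
                     good: list[str], tolerance: float) -> str:
    # Reference: median of the remaining "good contributors".
    reference = statistics.median(quotes[c] for c in good if c != preferred)
    return "PASS" if abs(quotes[preferred] - reference) <= tolerance else "FAILED"

quotes = {"A": 101.2, "B": 101.1, "C": 101.3, "D": 99.0}
print(one_vs_reference(quotes, "A", ["A", "B", "C", "D"], 0.5))  # PASS
```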
Process Quality Scoring: The automatic IPV must deliver reliable results. Therefore the applied IPV rulesets must be continuously assessed and improved. Rules and rule parameters that deliver FAILED or stale results to an untypical extent must be identified. A report tool that can filter for FAILED reasons (set by the system, using standardized and meaningful text strings) supports the users in systematically analyzing the need and options for improvements (see the sketch below).
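A minimal sketch of the reporting core, with hypothetical reason strings: standardized FAILED reasons are counted so that rules failing to an untypical extent stand out for recalibration.

```python
# Hypothetical sketch: count standardized FAILED reasons.
from collections import Counter

results = [
    ("EURUSD", "PASS", ""),
    ("USDJPY", "FAILED", "MOVE_EXCEEDS_THRESHOLD"),
    ("GBPUSD", "FAILED", "STALE_PRICE"),
    ("USDCHF", "FAILED", "MOVE_EXCEEDS_THRESHOLD"),
]

reasons = Counter(reason for _, status, reason in results if status == "FAILED")
for reason, count in reasons.most_common():
    print(f"{reason}: {count}")
# MOVE_EXCEEDS_THRESHOLD: 2
# STALE_PRICE: 1
```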
Calibration & Simulation: Such a module allows for running an IPV for a specific instrument and EoD-date in the past and supports herewith two use-cases. (1) New or modified IPV-rulesets can be calibrated and tested before activating. (2) Each price of the past can be replicated, which is helpful to resolve complaints of customers and supports inquiries by auditors.
Task Tracker: In particular for illiquid instruments, regular checks are essential, but unfortunately analysis as well as mitigation actions are time-consuming, so daily checks are not realistic.
To resolve this, subsets are assigned to different check frequencies (daily, weekly, monthly, quarterly, etc.) according to their exposure. It must be ensured that each price is checked at least quarterly. The module calculates the next check date and presents the experts of the competence center daily with a manageable workload, composed of randomly selected elements from all frequency classes (see the sketch below). Important for this procedure: conducted checks, their results and mitigation actions must be documented with date, status information and comments, the latter based on predefined text strings, which allow for filtering and reporting. Mitigation actions – e.g. the creation of a more advanced IPV method or a switch of the price source – can normally not be completed at short notice, which is why the module generates automatic reminders of open tasks to the user until their completion.
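A minimal sketch of the task tracker logic; the frequency mapping, the sample data and the daily capacity are illustrative assumptions.

```python
# Hypothetical sketch: compute next check dates and draw a
# manageable daily sample across all frequency classes.
import random
from datetime import date, timedelta

FREQ_DAYS = {"daily": 1, "weekly": 7, "monthly": 30, "quarterly": 90}

def next_check(last_check: date, frequency: str) -> date:
    return last_check + timedelta(days=FREQ_DAYS[frequency])

def daily_workload(tasks: list[dict], today: date, capacity: int) -> list[dict]:
    due = [t for t in tasks if next_check(t["last_check"], t["frequency"]) <= today]
    random.shuffle(due)  # random selection across frequency classes
    return due[:capacity]

tasks = [
    {"id": "bond_1", "frequency": "quarterly", "last_check": date(2015, 1, 2)},
    {"id": "bond_2", "frequency": "monthly",   "last_check": date(2015, 3, 1)},
]
print(daily_workload(tasks, date(2015, 4, 15), capacity=10))
```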
Upshot: As shown above, it is possible to provide a financial institution daily with a consistent and complete set of independently validated prices, reflecting the market to the maximum possible extent, by means of a highly automated IT platform with generic user interfaces, which allows for adaptations to changes in the markets or in the bank's universe of financial instruments.
However, the implementation is expensive – it is therefore worth looking at the benefits in a nutshell.
Central pricing service for the whole bank: The broader the user group, the higher the benefit. Competence centers generate a "price pool", which provides all market data customers of the bank with (sub-)sets in the requested format. Capacities and expertise are centrally bundled, and duplicate work in different departments can be avoided.
Not only Fair Value Measurement is supported: processes like Fair Value Hierarchy Classification or Fair Value Adjustment can also use the outcome of the LQS (liquidity scoring) module.
Furthermore, the core IPV process can run intraday (on demand or at a defined frequency) and thereby support market conformity checks, which need validated prices close to trading time.
A highly automated IPV increases the productivity of market data experts, replaces the widespread, time-consuming (semi-)manual, Excel-based solutions, and thereby provides room for a deeper treatment of the problems caused by complex and/or illiquid market parameters.
Buying market data is very expensive. Therefore, the quality of the contributors used should be assessed regularly. The daily contributor scoring makes it possible to identify the really "good contributors" for the different (groups of) products and helps to optimize contracts to the most efficient extent.
Authors:
Dorothee Hockel, Vice President Market Data and Valuations, UniCredit Bank AG, implemented an IPV platform for UniCredit Group, E-Mail: dorothee.hockel@unicredit.de
Yvan Robert, CFA, Senior Principal, Accenture GmbH, implemented an IPV platform for UniCredit Group, E-Mail: yvan.robert@accenture.com