IBM RegTech Innovations

The problem of calculating dynamic initial margin


The requirement for financial institutions to calculate dynamic initial margin is driven mainly by regulation. In 2017 the European Banking Authority (EBA) launched its Targeted Review of Internal Models (TRIM) project to assess whether the models currently used by banks comply with regulatory requirements and whether their results are reliable. TRIM states that banks need to capture time-dependent initial margin for both cleared and non-cleared trades in line with “contractual arrangements.”

In this blog, we will be discussing the problem of calculating dynamic initial margin in the context of counterparty credit risk. We will also look at some innovative solutions to this problem that the financial engineering research group at IBM has been investigating.

The definition of initial margin

Margin is defined as the amount that the holder of a financial instrument must deposit, either with their counterparty or exchange, to cover the risk associated with the instrument. Variation margin covers day-to-day changes in the instrument mark-to-market value and initial margin covers potential losses in the event of counterparty default. The term “initial” can be misleading, implying that the collateral is posted only at trade inception while the variation margin is posted day-on-day as the mark-to-market value of the trade changes. In fact, the amount of initial margin posted can vary day-on-day as well.

The calculation of initial margin

There are two main approaches used for the calculation of Initial Margin: a historical VaR (Value at Risk) simulation approach and a formulaic sensitivity-based approach that has been developed by ISDA (the International Swaps and Derivatives Association). This sensitivity-based approach is referred to as the Standard Initial Margin Model (SIMM).
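As a minimal sketch of the historical VaR approach (the portfolio P&L numbers, the 99% confidence level and the sign convention below are illustrative assumptions, not taken from any particular implementation), initial margin can be read off as a high quantile of the loss distribution produced by applying historical shocks to the portfolio:

```python
import numpy as np

def historical_var_im(pnl_scenarios, confidence=0.99):
    """Initial margin as a loss quantile of a historical P&L distribution.

    pnl_scenarios: netted portfolio P&L under each historical shock
                   (losses are negative numbers).
    Returns a positive margin amount covering losses at the given confidence.
    """
    # The 1st percentile of P&L corresponds to the 99th percentile of losses.
    return -np.percentile(pnl_scenarios, 100 * (1 - confidence))

# Hypothetical netting-set P&L under 1,000 historical scenarios.
rng = np.random.default_rng(0)
pnl = rng.normal(0.0, 1_000_000.0, size=1_000)
im = historical_var_im(pnl)
```

The SIMM alternative replaces this simulation with a formula over trade sensitivities, so no scenario revaluation is needed at all.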

Initial margin as a component of counterparty credit risk

Why do we need to calculate forward/dynamic initial margin?

Both front office traders interested in calculating pricing adjustments such as CVA (Credit Valuation Adjustment), and risk managers interested in monitoring the bank’s exposure to its counterparties using measures such as PFE (Potential Future Exposure), need to take account of initial margin through time. This initial margin forecast through time is referred to as forward or dynamic initial margin. If we use the historical VaR approach to calculate initial margin in the context of counterparty credit risk, the computational burden increases enormously. This is illustrated in the diagram below: each of the blue lines represents one of the market data scenarios (or paths) used to calculate credit risk. At each time-step along these paths, the red lines represent the nested historical simulation that must be carried out to determine the initial margin.

Calculating initial margin using historical VaR approach
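To get a feel for the scale of the nested burden, consider some purely hypothetical batch sizes (all of the numbers below are assumptions for illustration):

```python
# All sizes below are hypothetical, chosen only to illustrate the scaling.
credit_scenarios = 5_000     # Monte Carlo paths (the blue lines)
time_steps = 100             # exposure grid points per path
historical_scenarios = 250   # nested VaR shocks per node (the red lines)

# Portfolio valuations without dynamic IM: one per path per time-step.
base_valuations = credit_scenarios * time_steps

# Fully nested IM: every node is additionally revalued under each shock.
nested_valuations = base_valuations * historical_scenarios

print(base_valuations, nested_valuations)  # 500000 125000000
```

The nested run multiplies the valuation count by the size of the historical scenario set, here a factor of 250, which is why approximations to the brute-force calculation are attractive.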

Possible solutions to the problem of calculating initial margin

IBM has two solutions that are used to calculate and monitor counterparty credit risk: Algorithmics Integrated Market and Credit Risk (IMCR) and IBM XVA Sensitivities Foundation. Both of these solutions will need to be able to calculate dynamic initial margin. Below we outline three possible approaches to the calculation of dynamic initial margin that have been investigated by the financial engineering research group at IBM:

1. “Tail VaR” approach

This approach attempts to optimize the calculation of initial margin using the nested historical VaR methodology by reducing the number of historical scenarios that must be used. If we consider that we have a credit risk scenario set where at each time-step and scenario we also have to evaluate a nested historical scenario set, then the steps involved in the calculation are as follows:

  • A first pass of the trade simulation is carried out using a reduced number of the credit risk scenarios and the full set of historical scenarios.
  • Under each of the credit risk scenarios, the initial margin is determined by applying the historic shocks and taking the 99th percentile of the netted results.
  • Across all of the credit risk scenarios, we then determine the subset of historical scenarios that were most often used to determine the initial margin (e.g. the top 20 most often used historical scenarios).
  • A second pass through the calculation is then carried out using the full set of credit risk scenarios and the reduced set of (tail) historic scenarios.
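The first-pass scenario selection above can be sketched as follows. This is a minimal illustration under assumed conventions (losses as negative P&L, a 99% confidence level, a top-20 cut-off); the actual IMCR implementation may differ:

```python
import numpy as np

def select_tail_scenarios(nested_pnl, confidence=0.99, top_k=20):
    """First pass of the 'tail VaR' approach.

    nested_pnl: array of shape (reduced_credit_scenarios, historical_scenarios)
                holding netted P&L under each historical shock (losses negative).
    Returns indices of the top_k historical scenarios that most often fall in
    the loss tail across the credit-risk scenarios.
    """
    n_hist = nested_pnl.shape[1]
    # Number of historical scenarios in the loss tail at this confidence.
    tail_size = max(1, int(np.ceil(n_hist * (1 - confidence))))
    # For each credit scenario, which historical shocks produced the worst losses?
    tail_idx = np.argsort(nested_pnl, axis=1)[:, :tail_size]
    # Count how often each historical scenario appears in a tail.
    counts = np.bincount(tail_idx.ravel(), minlength=n_hist)
    return np.argsort(counts)[::-1][:top_k]
```

In the second pass, the full credit-scenario set is revalued against only the selected tail scenarios, and the initial margin percentile is taken over that reduced set.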

Testing by the research group at IBM has shown the results from this approach to be within 5% of the results obtained using the fully nested calculation, while delivering a performance improvement of approximately 30x. The disadvantage of this approach is that a batch calculation still takes an appreciable amount of time and hardware compared to a ‘normal’ credit risk batch without any initial margin calculation.

2. Cornish Fisher approximation

This approach attempts to calculate an approximation for the initial margin based solely on the results of the credit risk scenarios. That is, there is no nested historic simulation to be carried out. The steps of the calculation are as follows:

  • At each time-step there will be a distribution of values from the credit risk scenarios at the netting set level.
  • The first four “moments” of this distribution are calculated: the mean, the variance, the skewness and the kurtosis (a measure of how heavy the tails of the distribution are).
  • These values are then used with the Cornish-Fisher expansion to determine the value of initial margin.
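The steps above can be sketched as follows. The moment conventions (standardised skewness, excess kurtosis) and the fourth-order Cornish-Fisher expansion used here are the standard textbook forms, assumed for illustration rather than taken from IBM’s exact formulation:

```python
import numpy as np
from statistics import NormalDist

def cornish_fisher_im(exposures, confidence=0.99):
    """Approximate IM from the first four moments of the credit-scenario
    exposure distribution via the Cornish-Fisher expansion."""
    x = np.asarray(exposures, dtype=float)
    mu, sigma = x.mean(), x.std()
    z = (x - mu) / sigma
    skew = np.mean(z**3)                  # standardised third moment
    ex_kurt = np.mean(z**4) - 3.0         # excess kurtosis
    q = NormalDist().inv_cdf(confidence)  # Gaussian quantile (~2.326 at 99%)
    # Fourth-order Cornish-Fisher adjustment to the Gaussian quantile.
    q_cf = (q
            + (q**2 - 1) * skew / 6
            + (q**3 - 3 * q) * ex_kurt / 24
            - (2 * q**3 - 5 * q) * skew**2 / 36)
    return mu + sigma * q_cf
```

For a Gaussian exposure distribution the correction terms vanish and the result collapses to the usual mean-plus-quantile formula; the approximation degrades as the true distribution moves away from what four moments can describe, which is consistent with the poor fit observed in testing.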

The advantage of this approach is that it is very easy to implement and has no significant effect on system performance. However, testing by the research group has shown that the approximation is not a good fit when compared to the results of the full brute-force nested historical simulation.

3. “Simple VaR” approach

This approach also attempts to calculate an approximation for the initial margin based solely on the results of the credit risk scenarios. The steps of the calculation are as follows:

  • At each time-step Ti there will be a distribution of exposure values from the credit risk scenarios at the netting set level.
  • We then consider the values at a previous time-step which may be a number of days before Ti. The number of days will normally be set by the Margin Period of Risk (MPoR).
  • Under each scenario we then determine the change in value (delta) between Ti-MPoR and Ti.
  • Once this calculation has been carried out for every scenario, we will have a distribution of delta values.
  • We then take some percentile (e.g. the 99th) of the delta distribution to be the initial margin value.
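The steps above can be sketched as follows, assuming (for illustration) that exposure paths are stored as a scenarios-by-time-steps array and that the MPoR is expressed as a whole number of grid steps:

```python
import numpy as np

def simple_var_im(exposure_paths, step, mpor_steps, confidence=0.99):
    """'Simple VaR' dynamic IM at a single time-step.

    exposure_paths: array of shape (credit_scenarios, time_steps) with
                    netting-set values along each credit-risk path.
    step:           index of the time-step Ti at which IM is required.
    mpor_steps:     Margin Period of Risk expressed in grid steps.
    """
    # Change in value (delta) per scenario between Ti - MPoR and Ti.
    deltas = exposure_paths[:, step] - exposure_paths[:, step - mpor_steps]
    # IM is a high percentile of the delta distribution.
    return np.percentile(deltas, 100 * confidence)
```

Because the deltas are computed from the exposure values the credit-risk simulation already produces, no nested revaluation is required.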

This process is illustrated in the simple diagrams below:

The advantage of this approach is that it is fairly easy to implement and has no significant effect on system performance. Testing by the research group at IBM has shown the approximation to be within 10% of the values produced by the fully nested historical simulation.

In conclusion, the current thinking at IBM is to implement the “Simple VaR” approach within its Integrated Market and Credit Risk and XVA Sensitivities solutions. Once this initial approximation is in place, further enhancements to the basic algorithm will be investigated.

Click here for more information about Algorithmics Integrated Market and Credit Risk or IBM XVA Sensitivities.

Offering Manager, Industry Platforms Services
