May 24, 2016

FRTB Market Risk Capital Fundamentals Explained

In this video blog, Risk Product Specialist Sammy Colas breaks down the two methodologies outlined by the final FRTB market risk capital regulation – the standardized (sensitivities-based) approach and the internal models approach. He explains how they differ and the impact each may have on capital requirements. He also explores some of the challenges institutions could face when moving to the new FRTB framework, including data storage for back-testing and PnL attribution exercises, capital optimization of the desk structure, and the role of model validation.

Video Transcript:

Jim Jockle (Host): Hi, welcome to the Numerix video blog, I'm your host Jim Jockle. Today we're going to talk about the Fundamental Review of the Trading Book, Basel's introduction of arguably the first major rewrite of market risk guidelines in over a decade. Joining me today is Sammy Colas, Risk Product Specialist at Numerix. Sammy, welcome.

Sammy Colas (Guest): Thank you for having me, Jim.

Jockle: Alright, so let's just start at the beginning. We have the sensitivities-based (standardized) approach and we have the internal models approach. Why don't you give us a quick breakdown of both approaches and some of the impacts as they relate to capital?

Colas: Yes, absolutely. So let me start with the standardized approach. There are really three segments to it. When banks compute these capital charges, they will first have to compute a sensitivities-based capital charge. On top of this charge, they will have to add a default risk charge and then a residual risk add-on, and I can go into a little more detail on each of the charges if you want.
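To make that three-part structure concrete, here is a minimal Python sketch of how a standardized approach charge stacks up, with one delta bucket aggregated under the sensitivities-based method. All inputs, risk weights and correlations below are assumptions for illustration, not the prescribed regulatory values.

```python
import math

# Illustrative sketch of the standardized approach: one bucket of the
# sensitivities-based method, plus the default risk charge and residual
# risk add-on stacked on top. All figures are assumed, not regulatory.

sensitivities = [10.0, -4.0, 6.0]      # net delta sensitivities per risk factor ($mm)
risk_weights  = [0.017, 0.017, 0.017]  # assumed risk weights
rho = 0.97                             # assumed intra-bucket correlation

ws = [s * rw for s, rw in zip(sensitivities, risk_weights)]  # weighted sensitivities

# Bucket-level aggregation: K_b = sqrt(max(0, sum_k sum_l rho_kl * WS_k * WS_l))
inner = sum(ws[k] * ws[l] * (1.0 if k == l else rho)
            for k in range(len(ws)) for l in range(len(ws)))
bucket_charge = math.sqrt(max(0.0, inner))

# The total standardized approach charge stacks three components:
sbm_charge  = bucket_charge   # in practice, aggregated across buckets and risk classes
drc_charge  = 3.5             # assumed default risk charge ($mm)
rrao_charge = 0.8             # assumed residual risk add-on ($mm)
total_sa = sbm_charge + drc_charge + rrao_charge
print(round(total_sa, 3))
```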

Jockle: Ok, and as it relates to the internal model approach, how is that different?

Colas: Yes, so the internal model approach is a risk-based approach which banks can choose to use to report their capital charges. It will more than likely produce lower capital charges, but having said that, even though banks want to use or will use the internal model approach to report their charges, they still need to compute the standardized approach capital charges. Why? For two reasons. First, as a benchmark: the regulator wants the banks to compare the two sets of numbers on a monthly basis. Second, and more importantly, as a fallback: under the internal model approach banks will have to pass rigorous back-testing and PnL attribution tests, and should they fail these tests, they will have to revert back to the standardized approach.
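As a small illustration of that benchmark-and-fallback logic at the desk level, here is a minimal sketch; the function name and inputs are assumptions for illustration, not anything prescribed by the regulation.

```python
# Illustrative sketch: the internal model charge applies only while a desk
# stays eligible; failing back-testing or P&L attribution forces a fallback
# to the standardized approach, which is computed monthly in any case as a
# benchmark. Names and numbers are hypothetical.

def desk_capital_charge(ima_charge, sa_charge, passed_backtest, passed_pnl_attribution):
    """Return the charge a desk reports: IMA if it remains eligible, else SA."""
    if passed_backtest and passed_pnl_attribution:
        return ima_charge
    return sa_charge

print(desk_capital_charge(ima_charge=80.0, sa_charge=120.0,
                          passed_backtest=True, passed_pnl_attribution=False))  # -> 120.0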

Jockle: So, you bring up the back-testing issue. The backbone of market risk management for many, many years has been VaR, but with Basel and FRTB we're now starting to see the introduction of expected shortfall and a sharper focus on tail risk. What are some of the challenges that are stressing existing infrastructures as banks start to move to this new environment, if they're going to get optimal capital relief under IMA?

Colas: Yes, so there are going to be many challenges. The first one will be the data challenge; data will be coming from everywhere. At the desk level they want the trade information and market data, obviously, but banks will also have to store historical data because of these rigorous back-testing and PnL attribution exercises, potentially going back in time to rerun capital computations and so forth. On top of this, these capital charges are calculated at the desk level, meaning that the risk models used could differ across desks and will need to be aggregated at the bank level at some point. So it's going to be a big infrastructure problem.
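On the expected shortfall point the host raises: under the final rules the internal models approach replaces the 99% VaR of the previous framework with a 97.5% expected shortfall. A minimal sketch on synthetic P&L data (assumptions throughout) shows the difference between the two measures:

```python
import numpy as np

# Contrast VaR with expected shortfall on a vector of daily P&L.
# Synthetic data; real FRTB ES is also computed on stressed calibrations
# with liquidity-horizon scaling, which is omitted here.

rng = np.random.default_rng(0)
pnl = rng.normal(loc=0.0, scale=1.0, size=10_000)   # assumed daily P&L in $mm

def var(pnl, level=0.99):
    """Loss threshold exceeded on (1 - level) of days."""
    return -np.quantile(pnl, 1.0 - level)

def expected_shortfall(pnl, level=0.975):
    """Average loss on the worst (1 - level) of days -- a tail measure."""
    cutoff = np.quantile(pnl, 1.0 - level)
    return -pnl[pnl <= cutoff].mean()

print(f"99% VaR:  {var(pnl):.3f}")
print(f"97.5% ES: {expected_shortfall(pnl):.3f}")
```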

Jockle: So you bring up an interesting point. A lot of systems have been designed differently when it comes to risk management systems versus front office models. A lot of institutions have invested millions and millions of dollars in front office models versus risk systems that were designed more for portfolio analytics. What are some of the challenges in terms of reconciling those models, given the implications for PnL attribution?

Colas: Yeah, so that's exactly the purpose of this PnL attribution test. So basically, why do banks need to run this test? First off, to understand: these tests will be validating the risk management models at the desk level, meaning that regulators want to make sure that whatever set of risk factors is being used in the risk management model at the desk level is sufficient to explain the PnL at that desk. So it's really about gaining trust that the internal models used to compute these capital charges are actually accurate enough.
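A rough sketch of the kind of desk-level check being described, using unexplained-P&L metrics in the spirit of the January 2016 text; the data, names and exact thresholds here should be read as illustrative assumptions rather than the definitive test.

```python
import numpy as np

# Sketch of a P&L attribution check: the "unexplained" P&L is the gap between
# the hypothetical P&L from front-office (full revaluation) pricing and the
# risk-theoretical P&L produced by the desk's risk-model risk factors.
# Synthetic data and illustrative thresholds.

rng = np.random.default_rng(1)
hypothetical_pnl = rng.normal(0.0, 1.0, 250)                      # assumed daily HPL
risk_theoretical = hypothetical_pnl + rng.normal(0.0, 0.2, 250)   # assumed RTPL

unexplained = hypothetical_pnl - risk_theoretical

mean_ratio = unexplained.mean() / hypothetical_pnl.std(ddof=1)
var_ratio  = unexplained.var(ddof=1) / hypothetical_pnl.var(ddof=1)

# Illustrative pass criteria: small mean and small variance of unexplained P&L
passes = (abs(mean_ratio) <= 0.10) and (var_ratio <= 0.20)
print(f"mean ratio {mean_ratio:+.3f}, variance ratio {var_ratio:.3f}, pass={passes}")
```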

Jockle: So one of the other areas that has come into the IMA approach has really been around model validation. The definitive guideline here is the Fed's SR 11-7, introduced a couple of years ago, and a lot of documents refer back to it. What changes does the FRTB framework bring around model validation at this point?

Colas: The final rules were just published in January 2016 and are expected to be put in place in January 2019 under local regulation. So it's still an open topic, but there will be external auditors that will need to validate these internal models specifically. So we expect banks to reach out to external auditors to help them validate their models, making sure that they can pass the back-testing exercises and the PnL attribution exercises.

Jockle: So as these teams are coming together, where’s the starting point? Is it looking at existing architecture and figuring out how to adapt? Or are there other challenges that institutions should be thinking about as they’re looking to implement these guidelines?

Colas: We discussed a lot about the data, but I think in parallel a really, really hot topic will be optimization of the desk structure for capital, because the Basel Committee released a quantitative impact study showing that on average, under the internal model approach, capital charges will increase by 41% under this new FRTB framework compared to the current one. So banks will definitely try to optimize their desks, assess the profitability of each desk, and maybe split some of them.

Jockle: I want to thank Sammy Colas for his thoughts on FRTB. And of course, stay up to date on all that Numerix is doing for FRTB via our resource page on numerix.com. You can check out a new white paper on FRTB, "Finalized but Far from the Finish Line," as well as a replay of our webinar on FRTB CVA. Of course we always want to talk about the topics that you want to talk about, follow us on LinkedIn and on Twitter @nxanalytics, I’m your host Jim Jockle. Thanks for joining.
