The fundamental review of the trading book by the Basel Committee on Banking Supervision is proving a tough piece of prudential regulation to implement, because it probes weaknesses such as banks’ siloed nature and differing data standards across departments. Justin Pugsley investigates.

What is happening?

The Basel Committee on Banking Supervision (BCBS) has been revising aspects of the trading book since as far back as 2009, and has produced numerous consultations and updates on a complex and demanding piece of regulation billed as the fundamental review of the trading book (FRTB). 

This is part of the BCBS’s mission to clamp down on regulatory arbitrage, improve risk management and make banks more robust.

For banks themselves, the emphasis is very much on the word ‘fundamental’. Fitch Ratings has described the plan as one of the most profound market risk regulatory changes in decades, entailing a complete revamp of the way banks identify, measure and hedge risk.

Reg rage anxiety

Unsurprisingly, banks are struggling with the momentous task of implementation, and there are concerns that not all banks, or even supervisors, will be completely ready by the go-live date of January 2019.

Why is it happening?

In the words of the BCBS: “The financial crisis exposed material weaknesses in the overall design of the framework for capitalising trading activities.” Banks had insufficient capital to absorb losses, and the BCBS wants to increase those levels, especially for more exotic and complex instruments such as over-the-counter derivatives.

The BCBS aims to make it harder for banks to engage in risky behaviour and to game the rules in doing so. One such method was to shift assets between the trading book, designed for short-term holdings, and the banking book, for long-term holdings, depending on which offered the more favourable capital treatment.

To stamp that out, the BCBS has tightened the criteria determining which assets qualify for which book.

What do the banks say?

To borrow a line from Led Zeppelin’s Stairway To Heaven, “There are two paths you can go by.”  

In the regulatory purgatory where banks currently find themselves, those two paths are the standardised approach (SA) and the internal model approach (IMA). Naturally, the big banks prefer the IMA, because the SA will make capital requirements on many exposures up to four times higher, if not more, particularly for complex and esoteric products.

FRTB does not come cheap, particularly if IMA is the favoured route.

Various estimates put the cost of FRTB at anywhere between $100m and $200m over a bed-down period of three years, requiring a deep dive into IT systems and data-gathering processes. For example, depending on the instruments handled, some trading desks will generate 60 times more data than they do now.

And even when those models satisfy local supervisors, banks will still need to hold 40% more capital against exposures in the trading book, according to the BCBS’s own calculations.

Though the requirements of FRTB are mostly understood, some areas remain vague, and these are troubling banks. They include whether, in certain circumstances, third-party data can be used instead of, or to complement, data generated by the trading desk, a question with implications for capital charges.

Another area of concern is the profit-and-loss attribution test, in particular how risk calculations can capture valuation adjustments that are not part of the risk management process.

Annoyingly for banks, local supervisors have apparently not been very helpful in clarifying these points, probably because they are themselves grappling with the demands FRTB will place on their resources.

Will it provide the right incentives?

It should make banks safer and more selective about what they trade, concentrating on areas where they already have a strong competitive edge, experience and plenty of data. Greater specialisation will of course mean more expensive products for end-users, but maybe that is a price worth paying for a sounder financial system.

There is also a potential silver lining to this traumatic regulatory surgery. Banks will be forced to rethink how they gather, distribute and analyse data, and to put in place a more joined-up approach across the front office, risk and finance. That will mean collapsing inter-departmental silos, which should in turn drive operational efficiencies. And along with greater automation, that might bring about much-needed productivity gains to offset some of the soaring regulatory costs.
