Banks can make their Basel II compliance journey easier if they know which stage they are at, what problems they are likely to face and what solutions they can rely on. Arun Pingaley and Kiran Narsu outline a compliance continuum.

As they face impending Basel II deadlines in key jurisdictions and push ahead with frenetic compliance initiatives, several banks are beginning to realise that their perceptions of the capital accord have been off the mark. Our research shows that a Basel II compliance programme passes through several distinct stages. Banks can benefit enormously from understanding this continuum of stages in their journey to compliance: if they can identify which stage they are at, they can prepare themselves for the accompanying challenges and gauge how much work lies ahead.

Internationally active banks seeking Basel II compliance typically proceed through three distinct stages along the continuum. Although banks may have implemented various vendor solutions or built their own technology frameworks at each stage, these choices do not appear to be a barrier to progress at subsequent stages.

STAGE 1: DATA

At the first stage of compliance, banks often wrestle with data challenges, including availability, movement and quality.

Data availability challenges

Banks' data requirements vary with the Basel approach they adopt. For instance, if bank A adopts the advanced internal ratings-based (IRB) approach to credit risk, it will have to wrestle with significantly more data than bank B, which plans to adopt the standardised approach. Common to both approaches is the validation of internal ratings data, for which banks typically depend on default data supplied by an external provider, such as Moody’s KMV. However, only the IRB approach requires data about recoveries.

In all cases, data availability begins with an understanding of the data needed for the chosen approach. Data element identification ensures that the bank knows exactly which data items its preferred approach requires.
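As a purely illustrative sketch, a bank might begin by cataloguing the minimum data items each approach demands and checking what it already holds. The element names below are assumptions made for the example, not a definitive list drawn from the accord.

# Illustrative only: a first-cut catalogue of data elements per Basel II
# credit risk approach. Element names are assumptions for this sketch,
# not an exhaustive or authoritative list.
REQUIRED_ELEMENTS = {
    "standardised": [
        "exposure_at_default",
        "external_rating",        # drives the supervisory risk weight
        "eligible_collateral",    # for credit risk mitigation
    ],
    "foundation_irb": [
        "exposure_at_default",
        "internal_rating",
        "probability_of_default",  # the bank's own PD estimate
        "eligible_collateral",
    ],
    "advanced_irb": [
        "exposure_at_default",
        "internal_rating",
        "probability_of_default",
        "loss_given_default",      # requires recovery history
        "effective_maturity",
        "eligible_collateral",
    ],
}

def missing_elements(approach: str, available: set) -> list:
    """Return the data elements the chosen approach needs but the bank lacks."""
    return [e for e in REQUIRED_ELEMENTS[approach] if e not in available]

print(missing_elements("advanced_irb", {"exposure_at_default", "internal_rating"}))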

Data movement and consolidation challenges

Once data is identified and available, banks need to ensure that the proper ‘pipes’ are laid to move it out of the appropriate transaction systems into a repository or central ‘risk warehouse’. Many banks already have several such warehouse investments, but our research shows that these often lack many of the risk data elements required. Even where the elements exist, they are often not held at the level of granularity needed for downstream computation of risk-weighted assets and capital. Consequently, enhancing such repositories to carry the appropriate data is a challenge that must be properly addressed.
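A minimal sketch of such a ‘pipe’, assuming a hypothetical loan-system extract and warehouse staging file: the column names, file paths and the facility-level granularity rule are all illustrative assumptions.

import pandas as pd

# Hypothetical extract from a transaction (loan) system; the layout is an
# assumption for this sketch, not a prescribed schema.
loans = pd.read_csv("loan_system_extract.csv")

REQUIRED = ["facility_id", "obligor_id", "exposure_at_default",
            "internal_rating", "collateral_value"]

# 1. Completeness: the warehouse feed must carry every required risk element.
missing = [c for c in REQUIRED if c not in loans.columns]
if missing:
    raise ValueError(f"Source extract is missing risk data elements: {missing}")

# 2. Granularity: downstream risk-weight calculations need one record per
#    facility. Duplicate facility_ids mean the feed still needs consolidating;
#    a feed with no facility identifier at all is aggregated too coarsely to use.
if loans["facility_id"].duplicated().any():
    raise ValueError("Extract is not consolidated to facility-level granularity")

# 3. Load into the central risk warehouse staging area (illustrative target).
loans[REQUIRED].to_csv("risk_warehouse_staging_credit_exposures.csv", index=False)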

Data quality challenges

Although availability and completeness of data are critical, banks must also ensure that the data used for capital computation is accurate. They must therefore reconcile accounting system data with the general ledger, and pass the necessary adjustment entries to keep the two synchronised.
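A minimal sketch of such a reconciliation, assuming hypothetical account-level totals from the risk repository and the general ledger; the file layouts, column names and tolerance are illustrative assumptions.

import pandas as pd

# Hypothetical inputs; layouts are assumptions for this sketch.
risk = pd.read_csv("risk_warehouse_balances.csv")   # account_id, risk_balance
gl = pd.read_csv("general_ledger_balances.csv")     # account_id, gl_balance

recon = risk.merge(gl, on="account_id", how="outer").fillna(0.0)
recon["difference"] = recon["gl_balance"] - recon["risk_balance"]

TOLERANCE = 0.01  # illustrative materiality threshold
breaks = recon[recon["difference"].abs() > TOLERANCE]

# Each break becomes a candidate adjustment entry so that the figures feeding
# the capital computation tie back to the general ledger.
adjustments = breaks[["account_id", "difference"]].rename(
    columns={"difference": "adjustment_amount"})
adjustments.to_csv("proposed_adjustment_entries.csv", index=False)
print(f"{len(adjustments)} accounts require adjustment before capital computation")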

In general, banks should not underestimate the importance of strong data quality processes during the building and delivery of the central risk repository. Banks will potentially be serving multiple regulatory masters during the Basel II supervisory review process, and bad data entering a capital computation system, no matter how powerful that system, will result in extra work to correct the numbers.

Typically, the primary owners of these first-stage challenges are the bank’s technology staff, who have in-depth knowledge of the data elements involved.

STAGE 2: MODELLING

As banks grow more confident in the quality, sufficiency and availability of their data, and move along the continuum, they face a new set of challenges. Inherent to the Basel II compliance process is an accurate means of estimating the probability of default (PD) of an obligor or the loss given default (LGD) of an exposure. Here, numerous hurdles await most banks.

Given that PD models must be based on historical default data, a lack of such data severely compromises a bank’s ability to estimate and validate them. Many banks attempt to solve this problem by purchasing published default data from vendors. If they do so, they must check how well their portfolios correlate with the vendor’s data to determine just how applicable the external default history is to their own customer base.
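As a rough sketch of that applicability check, a bank might compare its observed default frequencies per rating grade with the vendor’s published rates. The grade labels, file names and the use of a simple rank correlation are assumptions made for illustration.

import pandas as pd

# Hypothetical inputs; layouts are assumptions for this sketch.
# internal_default_history.csv: rating_grade, obligors, defaults (internal history)
# vendor_default_rates.csv:     rating_grade, vendor_default_rate (published rates)
internal = pd.read_csv("internal_default_history.csv")
vendor = pd.read_csv("vendor_default_rates.csv")

internal["internal_default_rate"] = internal["defaults"] / internal["obligors"]
merged = internal.merge(vendor, on="rating_grade")

# A high rank correlation suggests the vendor's data orders risk in the same
# way as the bank's own portfolio; a low one argues against relying on it.
corr = merged["internal_default_rate"].corr(
    merged["vendor_default_rate"], method="spearman")
print(merged[["rating_grade", "internal_default_rate", "vendor_default_rate"]])
print(f"Rank correlation with vendor default rates: {corr:.2f}")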

However, for some businesses – private banking, for example – such data is difficult to collect because there is seldom sufficient internal default history or any external data provider. This restricts a bank’s ability to use validated IRB models for estimating PDs or LGDs.

These problems also raise the possibility that supervisors will be unwilling to accept the PD/LGD estimates a bank provides, because the estimates cannot be back-tested and validated.
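A simple illustration of the kind of back-test a supervisor would expect: compare realised defaults in each grade against the number the assigned PD implies, here using a normal approximation to the binomial distribution. The grade data and the 99% cut-off are assumptions made for the sketch.

from math import sqrt

# Hypothetical one-year outcome per grade: (assigned PD, obligors, observed defaults)
grades = {
    "A": (0.0010, 4000, 7),
    "B": (0.0100, 2500, 31),
    "C": (0.0500, 800, 55),
}

Z_99 = 2.326  # one-sided 99% normal quantile, an illustrative threshold

for grade, (pd_est, n, defaults) in grades.items():
    expected = n * pd_est
    std = sqrt(n * pd_est * (1 - pd_est))
    z = (defaults - expected) / std
    verdict = "PD looks understated" if z > Z_99 else "no evidence against PD"
    print(f"grade {grade}: expected {expected:.1f}, observed {defaults}, "
          f"z = {z:.2f} -> {verdict}")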

In most cases, the bank’s risk group is the primary owner of the development and implementation of the statistical models used to estimate PD, LGD and other parameters.

STAGE 3: CAPITAL COMPUTATION

A bank’s Basel II compliance initiative is complete when the institution can compute regulatory capital and file the appropriate capital adequacy returns with its regulator or regulators. Banks that have progressed through the first two stages of the continuum begin to explore in detail, at stage three, a capital computation process they perceive to be both well defined (which it is) and easy to implement with simple tools such as spreadsheets (which it is not).
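To give a flavour of how well defined, and how far from spreadsheet-friendly, that computation is, the risk-weight function for corporate exposures under the IRB approaches runs broadly as follows (N is the cumulative standard normal distribution, G its inverse, M effective maturity; the accord text itself remains the definitive source for the calibration):

\[
R = 0.12\,\frac{1 - e^{-50\,PD}}{1 - e^{-50}} + 0.24\left(1 - \frac{1 - e^{-50\,PD}}{1 - e^{-50}}\right)
\]
\[
b = \left(0.11852 - 0.05478\,\ln PD\right)^{2}
\]
\[
K = \left[\,LGD \cdot N\!\left(\frac{G(PD) + \sqrt{R}\;G(0.999)}{\sqrt{1 - R}}\right) - PD \cdot LGD\,\right] \cdot \frac{1 + (M - 2.5)\,b}{1 - 1.5\,b}
\]
\[
RWA = K \times 12.5 \times EAD
\]

Other asset classes use their own correlation functions, and all of them must be supported alongside the standardised alternatives.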

The capital computation process is probably the most challenging stage of the compliance continuum, due in large part to the gulf between early expectations and late-stage realisation of the nitty-gritty details – compounded by the ever-shrinking number of days left for a bank to comply.

A bank’s computations span credit, market and operational risk (although there is precious little new on market risk in the Basel II accord).

Credit risk computational challenges

Many banks believe that the only decision they will need to take is which one of the three approaches for calculating unexpected loss to use: advanced IRB, foundation IRB or standardised. However, they must bear in mind that there is an abundance of options prescribed in the accord within these three basic approaches. For instance, there are two options for assigning risk weights and two collateral handling options to choose from in the standardised approach itself. In the IRB approaches, there are two different options for treating specialised lending and equity exposures.
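Even the ‘simple’ standardised option involves mapping every exposure to a supervisory risk weight before applying the 8% minimum capital ratio. A bare-bones sketch for corporate exposures follows; the rating buckets reflect the accord’s corporate table, while the portfolio data and the omission of collateral treatment are simplifying assumptions.

# Supervisory risk weights for corporate exposures under the standardised
# approach (Basel II corporate table); collateral and other mitigants are
# ignored in this sketch for brevity.
CORPORATE_RISK_WEIGHTS = {
    "AAA to AA-": 0.20,
    "A+ to A-": 0.50,
    "BBB+ to BB-": 1.00,
    "Below BB-": 1.50,
    "Unrated": 1.00,
}

MIN_CAPITAL_RATIO = 0.08  # pillar I minimum

# Hypothetical portfolio: (exposure amount, external rating bucket)
portfolio = [
    (10_000_000, "AAA to AA-"),
    (5_000_000, "BBB+ to BB-"),
    (2_000_000, "Unrated"),
]

rwa = sum(amount * CORPORATE_RISK_WEIGHTS[bucket] for amount, bucket in portfolio)
capital = rwa * MIN_CAPITAL_RATIO
print(f"Risk-weighted assets: {rwa:,.0f}")
print(f"Minimum capital requirement: {capital:,.0f}")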

When all the permutations are considered, coupled with the requirement for multi-jurisdictional reporting (for example, reporting in line with CAD III in Europe and with OCC requirements in the US) and the need to support different approaches for each jurisdiction (including all the options within each approach), the computation process becomes immensely challenging. Any computational engine, whether bought or built, must therefore be able to support all of these approaches simultaneously and to switch between them when necessary.
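One way to picture that requirement is an engine that keeps every approach behind a common interface and selects one per jurisdiction and portfolio. This is a sketch only: the function names, jurisdictions and the two toy approaches shown are illustrative assumptions, not a product design.

from typing import Callable, Dict, Tuple

# Each approach is a function from an exposure record to risk-weighted assets.
# Only two toy approaches are shown; a real engine would register every
# approach and option combination the bank needs.
def standardised_rwa(exposure: dict) -> float:
    return exposure["ead"] * exposure["supervisory_risk_weight"]

def advanced_irb_rwa(exposure: dict) -> float:
    return exposure["ead"] * 12.5 * exposure["capital_requirement_k"]

# The engine's registry: (jurisdiction, portfolio) -> approach. Switching
# approach for a jurisdiction becomes a configuration change, not a rebuild.
REGISTRY: Dict[Tuple[str, str], Callable[[dict], float]] = {
    ("EU", "corporate"): advanced_irb_rwa,
    ("US", "corporate"): advanced_irb_rwa,
    ("EU", "retail"): standardised_rwa,
}

def compute_rwa(jurisdiction: str, portfolio: str, exposure: dict) -> float:
    return REGISTRY[(jurisdiction, portfolio)](exposure)

print(compute_rwa("EU", "retail",
                  {"ead": 1_000_000, "supervisory_risk_weight": 0.75}))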

Caught up in the mechanics of the pillar I calculations, many banks overlook the need to address pillars II and III of the accord to the satisfaction of their supervisors. Pillar III’s requirement to support multiple reporting formats brings its own challenges. For example, supervisors may at any point, for market discipline purposes, call on a bank to disclose new information about the risk carried on its books, implying a new set of reports at short notice and, in turn, new computations that were not previously available in the engine. Banks must ensure that the solution they implement provides the appropriate level of pillar III capability.

Similarly, the challenge of providing supervisory oversight of the pillar I calculations is significant: the computational processes need to be transparent, accessible and detailed enough to explain every aspect of the results.

Banks’ finance departments are typically tasked with performing the capital computations and the necessary reporting to supervisors.

Conclusion

Banks are discovering this continuum the hard way, finding the problems as they stumble along the path to compliance. However, if they can benchmark which stage they are at and brace themselves for the challenges that lie ahead, it will not only serve them well in their Basel II compliance efforts, but also help them to enjoy the benefit of reduced capital requirements.

Arun Pingaley is head of the functional solutions group, and Kiran Narsu is vice-president of business development at Reveleus, an i-flex business.
