Does increasing regulatory pressure on banks to simplify their finance function spell the end of the general ledger as we know it? Dan Barnes investigates.

Structural changes to system architecture in banks are likely to have a knock-on effect on business processes, changing (and possibly removing) some traditional tools. In the finance function, changing accounting approaches, combined with improving analytical tools, may kill off the general ledger as we know it.

Certainly, banks have an opportunity to simplify and enhance the way they track their financial position, superseding older ways of working. They must assess whether the increased short-term cost of developing such a system will prove fruitful in the medium to long term.

Regulatory pressures

In recent years, changes to the finance function have been occurring in response to various pressures such as regulation and the competitive advantage delivered by new tools. These pressures threaten traditional operational systems that cannot provide the detailed levels of data necessary for modern performance management.

A typical response has been to expand the functionality of existing tools, but this has repercussions, as Simon Doherty of Teradata notes. “The general ledger is an overhead. To a bank it is an operational system. Clearly, operational systems are ideally kept lean and efficient so that they don’t become difficult and costly to maintain,” he says.

However, this ideal is hard to maintain in reality. Business requirements put pressure on the departments and this extends to their tools, Mr Doherty says. “The process of accretion means [these systems] do become more complex as they gain functionality and the finance department clearly has a desire to simplify this; there is a great resistance in these functions to increasing the complexity of the general ledger.”

At the same time, International Financial Reporting Standards (IFRS) have pushed accounting towards value-based, rather than transaction-based, reporting. This means that banks should have a more realistic view of their financial position through tracking information from a number of sources, such as customer and risk data, rather than only transactions.

“The fundamental difference,” says Doug Squibb, product development manager at SAS Institute, “is that one provides for due diligence and the other has been created to provide understanding and transparency.”

In the light of the Parmalat and Enron scandals, it is understandable that the ability of the general ledger to provide due diligence is under question. This alone has led to the growth of management information systems to supplement existing processes, as data-based systems allow analysis, development of cash-flow forecasts and risk assessment. The need to apply such technology for the purpose of compliance may well be the start of greater developments.

Positive motives

Appetite is not driven by enforcement alone, though; there are positive reasons to change. IT can enable a bank to get a better view of performance so that it can exploit areas of growth.

Pieter Jordaan, First National Bank: the decision to change was not purely technological
At First National Bank, Pieter Jordaan, head of business intelligence, stresses that understanding the customer is what has driven the development of an activity-based costing approach at his bank. “When you’ve got millions of customers on your books you need to understand where you create value and obviously the general ledger isn’t going to give you that. We need to understand in terms of pricing and cost management and in terms of marketing initiatives. We need to see where we can derive the biggest return from our customer base,” he says. Mr Jordaan explains that initially the change came about because internal processes needed to be altered. The decision was not purely a technology play, although system provider Armstrong Laing Group then became involved.
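Mr Jordaan does not detail the mechanics, but a minimal sketch of the activity-based costing idea might look like the following (the activity names, driver volumes, customer identifiers and figures are purely hypothetical): costs are pooled by activity, a unit rate is derived from an activity driver, and each customer is charged for the activities they consume, giving the per-customer view of value that a chart of general-ledger accounts cannot.

```python
# Rough sketch of activity-based costing (all names and figures hypothetical).
# Costs are pooled by activity, a unit rate is derived from an activity
# driver, and each customer is charged for the activities they consume.

activity_cost_pools = {            # total annual cost of each activity
    "process_payment": 120_000.0,
    "handle_branch_visit": 300_000.0,
    "open_account": 80_000.0,
}
activity_volumes = {               # driver volume for each activity
    "process_payment": 2_000_000,
    "handle_branch_visit": 150_000,
    "open_account": 40_000,
}
# Cost per unit of driver for each activity
unit_rates = {activity: activity_cost_pools[activity] / activity_volumes[activity]
              for activity in activity_cost_pools}

# Activity consumption recorded against individual customers
customer_usage = {
    "customer_001": {"process_payment": 420, "handle_branch_visit": 12},
    "customer_002": {"process_payment": 35, "open_account": 1},
}

def customer_cost(usage: dict[str, int]) -> float:
    """Attribute activity costs to a single customer via the unit rates."""
    return sum(unit_rates[activity] * volume for activity, volume in usage.items())

for customer, usage in customer_usage.items():
    print(customer, round(customer_cost(usage), 2))
```

Set against per-customer revenue, an allocation of this kind is what indicates where a bank with millions of customers creates value.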

“Although there was a change of software, this was very much about changing processes that were limiting our ability to understand – to identify better methods of understanding costs and any areas where we were misaligning price in comparison with the standard market,” he says.

Over-engineered

Purpose-built systems are enabling many banks to undertake projects in this area. “A lot of banks tried to develop their financial system into this perfect tool and over-engineered the environment so it just became too costly,” notes Mr Jordaan.

The key to successful project management in this area is to understand which information to include to generate value. As the industry improves its analysis of internal processes, such over-engineering should become less likely.

Chris Field, Extensity: banks may construct a ‘data mart’ rather than a full warehouse
A central inhibitor to moving to a full data warehouse is the time, risk and cost involved in a full implementation, says Chris Field, practice manager at Extensity. In the medium term, it is more likely that banks will try to construct a ‘data mart’ in which specific data is gathered for a specific purpose, rather than build a multi-purpose warehouse. According to Mr Field, rumours of the demise of the ledger are premature. “The general ledger can’t go away; as a record of your transactions it provides information that something like a data warehouse can’t provide. The biggest problem with it is the sheer volume of data,” he says.

“It comes down to the best practices for analysis. Typically, in best practice you are controlling 50-60 metrics. In the general ledger, you have typically got thousands and thousands of accounts so it’s the ‘wood for the trees’ argument,” says Mr Field.
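Neither the data mart nor the roll-up is specified in detail, but a rough sketch of Mr Field’s point might look like the following (the account codes, the mapping and the figures are hypothetical): thousands of general-ledger balances are collapsed into a short list of management metrics of the kind a purpose-specific data mart would serve.

```python
# Rough sketch: collapsing general-ledger account balances into a short
# list of management metrics (all account codes and figures hypothetical).

# A tiny slice of what would, in practice, be thousands of GL balances
gl_balances = {
    "4010": 1_250_000.0,   # retail interest income
    "4020": 310_000.0,     # card fee income
    "5010": -540_000.0,    # staff costs
    "5020": -210_000.0,    # premises costs
}

# Map GL account-code prefixes onto a handful of reporting metrics
metric_map = {
    "40": "total_income",
    "50": "operating_costs",
}

metrics: dict[str, float] = {}
for account, balance in gl_balances.items():
    metric = metric_map.get(account[:2])
    if metric is None:
        continue  # accounts outside this mart's narrow scope are ignored
    metrics[metric] = metrics.get(metric, 0.0) + balance

# A derived ratio of the kind a performance dashboard would actually track
metrics["cost_income_ratio"] = -metrics["operating_costs"] / metrics["total_income"]
print(metrics)
```

The design choice is the narrowness of the extract: a few dozen derived metrics can be controlled, whereas the thousands of underlying accounts cannot.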

Retention likely

It is more likely that banks will combine existing tools into larger systems. Experience shows that banks are hesitant to exchange one tool for another when they could keep both. Legacy systems dating back 30-plus years still play a huge role in banks of a certain age with a tolerance for the cost.

Teradata’s Mr Doherty does not see the demise of the ledger as imminent: “We’re not going to see the death of the general ledger because it is a sort of bipolar entity – it has always been a transaction processing system in its own right.”

But he believes that this may change in time – particularly as costs build. “There can be a tendency to build system on system rather than replace them and add functionality because we get stuck in a certain way of thinking. When a big change comes along, there’s always a period in which we try to use yesterday’s tools to meet that new environment,” he says.

These environmental shifts show no sign of slowing down. If anything, regulation and new technologies are arriving more rapidly than ever before. In turn, this suggests that costs will rise faster than they have in the past.

At this rate, the general ledger of today may not last through the night.
