Packaged solutions are making headway as the best approach to replacing legacy core-banking infrastructure, but organisations should look to current trends when planning their system migrations. By Peter Middleton, VP financial services, Oracle EMEA, and Andre Loustau, CTO, Temenos

Core banking system replacement and enhancement are high on the agenda for many banks worldwide – a fact attested to by a number of recent surveys and reports from industry analysts. A survey published in the April issue of The Banker showed that replacement of core systems or components thereof is considered, alongside multi-channel distribution, as the highest priority for retail banks. Reports from Celent and TowerGroup predict large growth in spending in this area by all banks over the coming years, and a shift towards more vendor-built solutions, even among larger banks.

The cost and risk of tinkering with the infrastructure remain concerns for banks. But the industry’s oft-discussed business drivers – the need to reduce risk and cost across the business while increasing agility and competitiveness – mean that most have decided to embark on some form of replacement, be it full-scale or just of selective components.

One size may not fit all

Once the decision to replace has been made, what next? Functionality is the key consideration when choosing a new solution, and every bank must consider how well a solution fits its current needs. Does it support the local accounting and regulatory requirements of the markets in which the bank operates? Does it support the native languages and the multiple channels the bank requires?

The cost of new hardware and ongoing maintenance of the entire solution are also important considerations. Banks that operate in more than one country should also ask themselves if there is an opportunity for rolling out a single solution across the entire operation.

Beyond a solution’s fit for current purpose, banks should be considering business and technology trends and how these might shape their core banking requirements in the next 10 to 15 years.

Chief among these considerations is the push to truly embrace the ‘real-time’ concept in banking business and infrastructure. This includes the real-time enterprise model described by Gartner, and its impact on traditional overnight batch processing. It also includes the rationalisation of customer, product and channel data, and how this can be used to provide real-time business information to support decision-making.

For many banks, the current state of play in both is holding back progress on customer service and on cost and risk minimisation. But Gartner analysts predict that by 2007, the dominant players in every business segment will have achieved leadership or fortified their lead through real-time capabilities.

“Real-time is becoming an important competency as businesses fight to retain demanding customers while coping with economic uncertainty and turbulence,” says Mark Raskino, research director for Gartner. “This is a key finding as we believe technology is creating a business opportunity gap and fear of the consequences will overcome uncertainty, compelling companies to take action on renovating their enterprise architecture and process base.”

The death of ‘end-of-day’

Back-office processing activities are usually not foremost in the thoughts of a bank’s business executives. But if problems emerge and the processes that traditionally take place overnight are not completed, the business can grind to a halt. A delay of just half an hour in systems coming online at the start of the day can cost an organisation a significant amount of money.

Most banking systems to date have focused on increasing throughput and reducing the time it takes to complete the required overnight batch processes. The ideal for many has been to leave enough time to fix any errors and run the whole process again if required. But the constant worry remains that overnight processes will not be completed by the time the bank must begin operations at the start of a new business day.

This is especially problematic for organisations with global operations that are looking to take advantage of the benefits that can be gained through running a common core banking solution across multiple sites. Doing this across a number of time zones is difficult if the overnight batch has to be completed in the time window between the western-most branch closing and the eastern-most branch opening for business. A failure of the overnight batch would typically lead to a system restore, a rectification of the problem and a re-run of the batch. Such a scenario could severely delay the availability of the system to those users in the eastern-most branches.

But today, banks are looking to use technology to move beyond the traditional approach and remove this worry. As new channels have emerged and put more pressure on banks to offer 24x7 access to their services, work-arounds have been created to give customers a semblance of real-time by making some kind of information always available. But this is not an accurate and real-time picture of the customer’s status or the status of the bank, and the work-arounds have many limitations.

By embracing technology that enables posting to live account records while end-of-business tasks are being processed, organisations can free themselves from the constraints of the traditional batch environment. They can also re-evaluate the transaction processes that traditionally run in overnight batches, streamlining and grouping them more logically and efficiently.

Non-stop processing

In a non-stop processing architecture, all the functions traditionally performed during the end-of-day process are still required, but they are carried out as online transactional jobs running as background tasks, known collectively as the close-of-business process. All online functions can continue uninterrupted while close-of-business processing is running, thereby ensuring a true 24x7 environment. From an accounting perspective, value-dated and book-dated account balances are validated and updated, as appropriate, by the close-of-business processes and by online users.

An additional benefit of the close-of-business process operating in a transactional manner is that, in the event of an error occurring, the relevant transaction will simply be rolled back and the process will continue with the next transaction. This means that the traditional problems and time constraints associated with restoring the system following an end-of-day error are removed, thereby adding a significant degree of resilience.
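To make the mechanism concrete, here is a minimal sketch in Python of such a transactional close-of-business run. The `accounts` table, its columns and the posting functions are illustrative assumptions for the example, not any vendor's actual schema or API; the point is that each account is processed in its own transaction, so an error rolls back one posting rather than forcing a restore and re-run of the whole batch.

```python
# Minimal sketch of a transactional close-of-business (COB) run, assuming
# a hypothetical `accounts` table with id, balance, rate and book_date
# columns. Names and schema are illustrative, not a vendor implementation.
import sqlite3

def accrue_interest(conn: sqlite3.Connection, account_id: int) -> None:
    # Illustrative posting: credit one day's interest to the live balance.
    conn.execute(
        "UPDATE accounts SET balance = balance + balance * rate / 365.0 "
        "WHERE id = ?", (account_id,))

def roll_book_date(conn: sqlite3.Connection, account_id: int,
                   business_date: str) -> None:
    conn.execute("UPDATE accounts SET book_date = ? WHERE id = ?",
                 (business_date, account_id))

def run_close_of_business(conn: sqlite3.Connection,
                          business_date: str) -> list[tuple[int, str]]:
    """Process each account as its own transaction: an error rolls back
    only that account and the run continues, so online service never
    has to stop."""
    failed = []
    account_ids = [row[0] for row in conn.execute("SELECT id FROM accounts")]
    for account_id in account_ids:
        try:
            with conn:  # commits on success, rolls back on exception
                accrue_interest(conn, account_id)
                roll_book_date(conn, account_id, business_date)
        except sqlite3.Error as err:
            # No system restore, no batch re-run: queue this one account
            # for repair and continue with the next transaction.
            failed.append((account_id, str(err)))
    return failed
```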

This offers enormous advantages for banks operating multiple branches or legal entities: organisations can run independent close-of-business processes for each company or entity, which is of great benefit to larger or more complex groups running multiple entities on a central system with regional hubs. It enables a truly global installation, with multiple entities across the globe running on a single, integrated, real-time banking platform with centralised reporting and risk management.

Dutch multinational bank ING is in the process of standardising on a common core banking application across its wholesale banking operations, which were previously running on systems that dated from the 1980s. The bank is taking a shared service centre approach where front and back office are separated, and back-office functionality consolidated into a single central location for the region.

“When selecting a packaged solution to roll out across our different branches in this way, it was obviously important that the solution had the ability to handle multiple languages and multiple legal entities,” says Erik Dralans, head of operations IT Europe, ING Bank. “But we were also very interested in the possibility of removing the need for traditional batch processing in each of the branches. By doing this we can achieve a more cost-effective and flexible basis for our business.”

Operational = analytical

As well as being able to achieve real-time processing, often across multiple locations, organisations are increasingly looking to have access to all data in real time, and that is where an integrated operational package provides an immediate advantage. Traditional, in-house developed systems were built up department by department, with no common data structures. The newest core banking systems, by contrast, are designed to keep most of the data in one place.

This helps the bank move from a macro-level assessment of risk to dealing with it at a granular level, in real time. That means greater accuracy, which can benefit the bank in terms of regulatory compliance. Better, more detailed reporting capabilities, for example, could enable a bank to reduce its capital adequacy requirement under Basel II.

This is significant because it can free up capital to invest in other areas. With the removal of end-of-day processing, banks can embrace the philosophy of direct access to real-time transactional data, and the whole concept of a separate data warehouse becomes anathema. The practice of extracting customer data from different sources into a centralised customer information file is a diluting process: because information is collected at a summary level, it is out of date and incomplete almost immediately.

Speaking at a Gartner symposium at the end of last year, Gartner Research Asia Pacific chief Bob Hayward identified real-time data warehousing as one of the top 10 strategic technologies for 2004. Interestingly, though, he argued that the best approach is not to have a data warehouse at all.

“Previously, there has been a time-lag with information being a day-old because of a policy of overnight back-ups,” he said. “Many executives are telling Gartner today that this level of service is no longer good enough.

“The idea of rejecting a data warehouse completely does not mean ignoring the need to have information available. Instead, the strategy involves deployment of a new generation of business intelligence tools residing on a database of information.”

Efficiency drive

Today, with the right tools, banks have the option of running analytics in a real-time operating environment, without any degradation of performance on the operational systems because a totally different set of processors can handle the analytical side. So not only can operational efficiency be increased, but information required for Basel II, for example, or information used to pursue business opportunities, can be more easily identified and extracted.
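As a rough illustration of the principle, the sketch below routes short operational writes to a primary data store and heavy analytical reads to a replica of the same live data. The connection paths, the schema and the existence of a replica are assumptions made for the example, not a prescribed architecture.

```python
# Hedged sketch: operational writes go to the primary store, long-running
# analytical reads go to a replica of the same live data, so reporting
# cannot degrade posting throughput. Paths and schema are assumptions.
import sqlite3

class QueryRouter:
    def __init__(self, primary_path: str, replica_path: str) -> None:
        self.primary = sqlite3.connect(primary_path)  # operational side
        self.replica = sqlite3.connect(replica_path)  # analytical side

    def post(self, sql: str, params: tuple = ()) -> None:
        """Operational path: short transactional writes to live accounts."""
        with self.primary:
            self.primary.execute(sql, params)

    def analyse(self, sql: str, params: tuple = ()) -> list:
        """Analytical path: heavy aggregation against the replica, handled
        by a different set of processors than the live system."""
        return self.replica.execute(sql, params).fetchall()

# Example: real-time exposure by customer without touching the live store.
# router = QueryRouter("primary.db", "replica.db")
# router.analyse("SELECT customer_id, SUM(balance) FROM accounts "
#                "GROUP BY customer_id")
```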

Banks that embrace this model will find that their core banking environment doubles as the real-time analytical and operational data environment. In short, banks are able to work with a single source of truth. This truth can provide the organisation with a number of business benefits, including:

  • seamless multi-channel access to banking services, with transparent and synchronised account status information;

  • faster time to market for new products, supporting competition and differentiation;

  • improved customer service generally;

  • more accurate risk management across all categories of risk;

  • reduced cost of maintaining and supporting legacy systems;

  • simpler management across different time zones for global organisations.

UK bank Abbey is one financial institution that has implemented an information superstore to ensure it has consistent data about all its processes. “We know what drives profitability at the product level,” says Margaret Schwartz, the bank’s director of retail information. “But our understanding will be enhanced by looking at why profitability varies between customers who might otherwise look similar.”

Real-time a reality

The concept of real-time has been around for decades, but today the technology exists to make it feasible. For banks looking to replace legacy core banking systems that are constraining their business activities, there are two main areas where real-time capabilities should be considered. In the back office, removing the need to shut down systems for end-of-day processing and backups can deliver significant cost benefits while reducing systemic risk. In the realm of customer and business data, real-time business intelligence should be the goal. With both in place, banks can greatly enhance their agility and competitiveness.

The power of grid

Overnight processing of the day’s transactions has always been a very computer-intensive task, and the processing power required has meant large mainframes dedicated to the job. But with the evolving grid technologies available on the market, organisations can take a more distributed approach to both processing and analytics.

Grid computing is the coordinated use of a large number of servers and storage devices acting as one computer. It is also sometimes described as ‘utility’ or ‘on demand’ computing, depending on which vendors are involved. Whatever the name, this technology promises that banks need no longer worry about demand spikes or the cost of excess capacity: computing power is available when they need it. Grids are built with low-cost modular components, so it is easy to start small and increase the computing power when required.

According to Larry Tabb of the Tabb Group: “There is tremendous opportunity in both data and service grids as firms strive to better utilise their existing data, manage their heterogeneous infrastructure and leverage component-based web services technology to extend and implement services-based architectures.”

Grid computing introduces sophisticated workload management capabilities that make it possible for applications to share resources across many servers. Data processing capacity can be added or removed on demand, and resources within a location can be dynamically provisioned. As banks look to standardise their core banking solutions and improve the way they handle transaction processes, grid computing can provide increased flexibility.
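By way of a simple sketch, the example below fans a day’s postings out across a worker pool whose size can be changed on demand, standing in for grid-style capacity that grows for the overnight spike and shrinks afterwards. The posting logic and all names are illustrative assumptions, built only on the Python standard library.

```python
# Illustrative grid-style workload distribution: the day's postings are
# fanned out across a pool of workers whose size models capacity added
# or removed on demand. Posting logic is a placeholder assumption.
from concurrent.futures import ProcessPoolExecutor

def process_posting(posting: dict) -> dict:
    # Placeholder for a CPU-intensive per-posting computation.
    posting["processed"] = True
    return posting

def run_on_grid(postings: list[dict], workers: int) -> list[dict]:
    """Scale `workers` up for the overnight spike and back down when the
    load subsides, rather than sizing a dedicated machine for the peak."""
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(process_posting, postings, chunksize=100))

if __name__ == "__main__":
    day = [{"id": i, "amount": 100} for i in range(1_000)]
    done = run_on_grid(day, workers=8)
    print(f"processed {len(done)} postings")
```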

Shift to vendor-built solutions

Analyst firm Celent claims that today, approximately 70% of the top 100 and approximately 50% of the top 1000 financial institutions in the world have internally-built core banking systems in place. But many who were scarred by attempts at replacement during the 1980s and 1990s now view core systems replacement as too large a project to undertake alone.

As smaller banks are already benefiting from the implementation of packaged solutions, large banks are beginning to consider them. There is obviously little competitive advantage to be gained from sticking with existing systems: competitive advantage comes from delivering higher levels of customer satisfaction at a lower cost than your competitors, and many legacy core banking infrastructures make this difficult. Even the banks with the largest in-house IT resources are beginning to realise that rebuilding everything from scratch to provide the required flexibility is not an effective use of resources.

E-mail Peter Middleton at peter.middleton@oracle.com and Andre Loustau at aloustau@temenos.com
