Technology is not always a primary consideration in banking M&A activity. But the success, or failure, of subsequent IT integrations can make or break any deal.

Nomura’s European CIO Mark Butterfield first learnt of the Tokyo-based bank’s plans to buy Lehman Brothers’ Europe, Middle East and Africa investment banking and equities business on Wednesday, September 17, 2008. That left him with just two days to assemble and prepare a team to carry out a hasty due-diligence process, which would take place over a matter of hours that Saturday, and play a vital part in Nomura’s eventual decision to go through with the acquisition.

In the frenzy of shotgun marriages and fire-sale acquisitions which swept the banking world in 2008, Mr Butterfield’s position was unusual, but by no means unique. Many other IT heads were faced with having to make similar calls on the viability of acquisitions of an often unprecedented scale. No small task in an industry where technology underpins almost every competitive advantage.

IT was especially critical to Nomura’s planned purchase, however. Because Lehman’s various segments were split up between a number of different parties, Nomura did not have the luxury of acquiring a fully functional, profitable company and performing an unhurried integration over a number of years. Instead, the bank’s management had to decide whether it could actually afford to take on 3,000 European staff as part of a business unit that was unable to operate in isolation.

For the acquisition to be a success, Mr Butterfield had to be able to quickly integrate, and bring back online, the IT systems which had helped Lehman become a top player in the European market. “There were other concerns too, but essentially they were irrelevant if we couldn’t get the technology to work,” he says, something which the bank’s executive board was quick to realise. “If we had said that we wouldn’t be able to achieve that for a year, I’m not sure if the deal would have gone ahead. IT was so fundamental that if integration wasn’t achievable relatively quickly, there would have been a huge question mark over the deal.”

Across the industry, the success or failure of banking merger and acquisition (M&A) activity can very easily depend on IT integration, even after a merger is confirmed on paper, warns Peter Redshaw, a managing vice-president with consultancy Gartner. “Despite proper due diligence, a lot of banks will be stumped by the complexity of mergers.” Complexity, he adds, which is invariably underestimated.

Diligence deficiencies

When IT can make or break even a more measured M&A deal, the exceptionally short timeframes and high-stress situations which characterised 2008’s flurry of takeovers and mergers may well raise concerns. Unsurprisingly, in many cases little or no systematic due diligence or planning was conducted before deals were finalised, says Mr Redshaw, and as a result some final integration goals may prove unreachable. This, along with broader concerns about the merits of certain acquisitions, may explain why many of the banks that made significant acquisitions at the height of the crisis are reluctant to discuss merger progress, and why others do so with extreme caution.

Mr Butterfield’s team was left with scant hours on the Saturday of the deal weekend to survey IT systems and gather as much information as possible from Lehman staff, before spending an evening sequestered with lawyers and representatives from PricewaterhouseCoopers, which was acting as administrator for the US bank. The next morning saw a board meeting, at which Mr Butterfield presented what limited findings and views on technology integration he had been able to garner.

He concluded that the integration was possible, based on his assessment of a few key criteria, including the practicability of routing trade flow from Lehman’s trading operations to Nomura’s processing environment, and of distributing Nomura’s underlying reference data back to Lehman’s platforms.
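Neither bank has published details of how those checks were carried out. Purely as an illustration of the concept, the sketch below (in Python, with every name and identifier hypothetical) shows the general shape of such an adapter: a trade from the acquired platform is translated against the acquirer’s reference data before being handed to its processing environment, with unmapped instruments flagged for manual repair.

```python
# Illustrative sketch only -- not Nomura's or Lehman's actual systems.
# Shows the two checks described above in miniature: routing trade flow into
# an acquirer's processing environment, and mapping reference data between
# the two firms' instrument identifiers.
from __future__ import annotations
from dataclasses import dataclass


@dataclass
class Trade:
    local_instrument_id: str   # identifier used by the acquired trading platform
    quantity: int
    price: float


class ReferenceDataMap:
    """Hypothetical mapping between the two firms' instrument identifiers."""

    def __init__(self, local_to_acquirer: dict[str, str]):
        self._map = local_to_acquirer

    def translate(self, local_id: str) -> str | None:
        return self._map.get(local_id)


def route_trade(trade: Trade, ref_data: ReferenceDataMap) -> dict | None:
    """Convert a trade from the acquired platform into a message the acquirer's
    processing environment could accept; return None if the instrument cannot
    be mapped (in practice it would go to an exception/repair queue)."""
    acquirer_id = ref_data.translate(trade.local_instrument_id)
    if acquirer_id is None:
        return None  # unmapped instrument: flag for manual repair
    return {"instrument": acquirer_id, "quantity": trade.quantity, "price": trade.price}


if __name__ == "__main__":
    ref_data = ReferenceDataMap({"LEH-XYZ-001": "NMR-0042"})
    print(route_trade(Trade("LEH-XYZ-001", 100, 12.5), ref_data))
```

The point of the sketch is the dependency it makes explicit: trade flow can only be routed as fast as the reference data mapping can be built, which is why both featured in the weekend assessment.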

However, Mr Butterfield frankly admits that because he and his team did not know exactly which systems Nomura would eventually take ownership of, establishing how long the process would last was not easy. “It was impossible to pin down exactly who owned what in a weekend, no matter how many lawyers you had sitting round a table… these firms are just too big and too complicated to ever get to that point,” he says. Despite these uncertainties, bank officials settled on a 12-week schedule to get Lehman’s EMEA franchise up and running, and eventually, after board meetings with Tokyo, the decision to formally enter into the deal was made.

IT systems may have been a potential deal breaker for Nomura, and always play a vital part in cementing the relationship between new banking partners, but they are often not given the consideration they deserve in M&A deals, says Alistair Maughan, partner at US law firm Morrison Foerster and co-chair of its technology transactions group. “The whole issue of how systems are going to work together is too often not taken into account until the deal has been done, and that then leaves IT departments stuck figuring out how to actually do it.”

Of course, technology integration may not be a huge cost when set against a large-scale acquisition or divestment exercise. But the undertaking itself can be huge, covering everything from telecoms systems, networks and desktops to far more complex business-facing applications, banking systems and trading platforms. Delays, or even outright failure, can cost a bank dearly.

Surveying systems

It is only once the ink has dried and technology teams finally get to take a proper look at a new acquisition’s systems that the challenges involved in an IT integration project become fully apparent. When Alvarez & Marsal (A&M) – the New York-based restructuring firm charged with unwinding Lehman’s remaining assets – first began to survey the failed bank’s IT infrastructure, it found itself facing a mass of unmapped systems, says Ann Cairns, managing director with A&M and head of its European financial industry group. “One of the things we started to look for was what the blueprint of the bank was; what systems kept it running, how they worked, and who the key people who ran them were. And while some of that was there, there wasn’t a blueprint in the cupboard we could pull out to see how it all worked.”

The challenges faced by A&M in unravelling Lehman’s IT platforms, as well as the methods it used to tackle them, are common to many a banking M&A situation, says Ms Cairns’ colleague Jeffrey Donaldson, a managing director and head of the firm’s IT team. In particular, A&M relied heavily on Lehman’s business continuity and disaster recovery systems, which already identified critical applications and so provided the first tranche of systems needed to deliver basic services to the parties responsible for different parts of Lehman – PwC, Barclays Capital and Nomura – under the terms of a transaction sharing agreement. This was no small task, given that the firm’s London-based exotic options portfolio alone required somewhere in the region of 20,000 pricing codes, says Ms Cairns.
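A&M’s actual tooling is not described in the article. As a rough sketch of the approach only – the tiers, application names and structure below are assumptions, not Lehman’s – a business-continuity register of applications might be used to pick the first tranche of systems to restore for each stakeholder.

```python
# Illustrative sketch only -- hypothetical names, not A&M's or Lehman's tooling.
# Uses a (assumed) business-continuity register of applications to select the
# most critical systems a given stakeholder depends on, as a restore list.
from __future__ import annotations
from dataclasses import dataclass


@dataclass
class Application:
    name: str
    criticality_tier: int      # 1 = most critical, per the hypothetical BCP register
    stakeholders: set[str]     # parties that depend on the system


def first_tranche(apps: list[Application], stakeholder: str, max_tier: int = 1) -> list[str]:
    """Return the applications a stakeholder depends on, up to a criticality
    tier, ordered most critical first."""
    relevant = [a for a in apps
                if stakeholder in a.stakeholders and a.criticality_tier <= max_tier]
    return [a.name for a in sorted(relevant, key=lambda a: a.criticality_tier)]


if __name__ == "__main__":
    register = [
        Application("settlement-engine", 1, {"Nomura", "PwC"}),
        Application("exotics-pricing", 1, {"PwC"}),
        Application("hr-intranet", 3, {"PwC"}),
    ]
    print(first_tranche(register, "PwC"))  # ['settlement-engine', 'exotics-pricing']
```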

A lack of blueprints and documented planning may be further compounded by the sheer age of some of the critical systems in most large banks, not to mention the fact that the IT architects who actually designed them are often no longer a part of the firm. Even worse, younger employees may not have the skills to thoroughly disentangle systems.

Take Common Business-Oriented Language (Cobol), for example. The venerable programming language – apocryphally said to have more money spent on it annually than Switzerland’s gross domestic product – celebrated its 50th birthday recently, but is still remarkably commonplace in banking IT. One large European banking firm which consultancy and software provider Micro Focus has worked with currently has all of its card and retail banking systems, as well as much of its corporate and investment banking platforms, written in Cobol, says John Rogers, business director of Micro Focus’s financial services arm.

However, employees familiar with the programming language are becoming an increasingly rare commodity. According to a survey of IT heads by Computerworld magazine, 59% of in-house Cobol programmers are aged 45 or older, compared to just 5% under 35.

For some, systems are even more archaic. A majority of systems running in the datacentre of a major UK bank, which was acquired at the height of the crisis (and which declined to comment), are written in a low-level programming language dating from the 1950s. That may well prove to be something of an integration headache for its new owner.

Companies such as Micro Focus offer software and guidance to customers updating legacy systems to more modern platforms. However, the dangers are clear, especially given that M&A activity often involves large numbers of redundancies. Sacrificing expertise as part of a ruthless cost-cutting exercise may not always be a smart long-term move.

“In many cases, when the acquiring bank asks how a particular system works, the answer is often that the person who wrote it left the bank 25 years ago, and it hasn’t been touched since,” says David Parker, head of Accenture’s London-based banking practice. At best, that will merely lengthen the time taken to complete the migration. However, he adds that very few banks ever manage to completely switch off all of a new acquisition’s systems. “In many cases, they just don’t have a handle on the different processes because what is really there just isn’t documented, and it’s just too painful and risky to turn it off.”

An unwelcome inheritance

Poorly documented systems are not the only unwelcome inheritance acquiring firms are likely to find themselves saddled with. A newly purchased bank will likely come complete with numerous pre-existing relationships with technology vendors and service providers, from software licences to outsourcing arrangements.

As the trend for banks to develop fewer systems internally and buy more software and services from third parties continues, this is becoming a bigger problem than ever. These often unwanted obligations are rendered even less welcome by the contractual rigidity blindly accepted by banks for which failure or takeover was never a consideration.

As a result, consolidating these services can be tough. Acquiring banks are often reduced to frantic renegotiation to deal with a mass of external commitments, which in many cases are a hangover from over-procurement undertaken in the heady days of the mid-2000s. Mr Maughan says renegotiating such contracts has made up the bulk of financial services technology lawyers’ work over the past two years. “In the past, the issue of flexibility was not given the attention it ought to have been by some banks, because they never thought they’d be in the position of actually reducing volumes,” he says.

Once systems and obligations have been surveyed, integration can begin in earnest, and for most banks, the speed at which this progresses is crucial. This was certainly the case at Nomura, where a simple re-badging of Lehman’s operations was not a viable option.

If the Japanese bank had bought a full firm, it could have integrated and rationalised systems over time. As it was, the firm acquired thousands of employees with no functional business. And while the asking price for Lehman’s European division may only have been $2, Nomura was facing a hefty wages bill, with no prospect of immediate returns. Bank officials were naturally keen to put newly acquired staff to work, and not only to ensure value for money. “Obviously what you don’t want is a couple of thousand traders sitting around with no way to do business, because they’re not going to stick with you if they’re not back in the market,” says Mr Butterfield. “There was an unbelievable focus to get the franchise up and running, and we had to move quickly because it was the only way to hold everyone together.”

The situation was further exacerbated by Nomura’s larger competitors' attempts to poach its newly acquired staff. One employee describes rival firms’ efforts to recruit ex-Lehman employees directly outside Nomura’s Bank Street offices.

Nomura was ultimately successful, however, and within four weeks of the deal officially closing on October 13, 2008, Mr Butterfield’s team had some of its newly acquired equities trading systems running in test mode, and by week six, some tentative proprietary trades were made. A small number of clients with very low trade flows were taken on shortly afterwards, and in January 2009, the process of reconnecting to Lehman’s former markets got under way in earnest.

For M&A deals conducted under slightly less pressing circumstances, integrating systems as fast as possible should still be the goal, says Intesa Sanpaolo’s CIO Silvio Fraternali, who oversaw the IT side of the 2007 merger between Banca Intesa and Sanpaolo IMI. “Time is money in this type of project, so the shorter the integration, the better,” he says. “The best approach is to merge first, and then make the systems more sophisticated later.”

Planning for success

However, being too eager may also prove costly in the long run, cautions Likhit Wagle, global leader for banking and financial markets at IBM Global Business Services. “Some of the investment banking integrations which took place post-crisis were achieved very fast, but because they were not done with a pre-determined plan or operating model in place, they have built in a huge amount of expense. At the moment it isn’t creating a problem because volumes are high enough. But the cost per transaction is very large and they’ve stored up quite a significant problem for the future.”

Mr Wagle points to Spanish banking giant Santander as the antithesis of this approach. The Madrid-based group has acquired operations in the UK, the US, continental Europe and Brazil in recent years, and a standardised integration strategy taking two to three years has been used in every instance, says CIO Jose Maria Fuster. The process results in a full implementation of Santander’s proprietary banking systems and complete re-branding. But before that takes place comes an initial evaluation and design phase to help tailor the integration to the unique aspects of the local market and the acquired bank’s IT architecture.

This process is central to Santander’s operating ethos, says Mr Fuster. “The way we manage the group is very much based on ensuring that we have the same systems implemented in all of our different banks. It enables us to better manage the group, so from a governance perspective, it’s very important.” It pays dividends from a business perspective too, he says, and rolling out the bank’s own systems and applications in each new acquisition helps achieve cost reductions which Mr Fuster describes as “crucial for the numbers of the operation”. Certainly, Santander claimed to net £300m ($487m) in annual cost savings following the conversion of Abbey’s systems in the UK, and the group currently sports an impressive cost-to-income ratio of below 40%. 

Bank of America also has an extensive pedigree in acquisitions, and a well-defined process for post-merger integration, says Eric Livingston, a senior change executive with the Charlotte, North Carolina-based firm. It is a process which, Mr Livingston says, has been employed in almost all of the M&A activity the bank has been involved in for the past 20 years.

Of course, Bank of America’s most notable recent acquisition – its $50bn purchase of Merrill Lynch – also took place at the height of the crisis, and Mr Livingston was responsible for post-merger integration efforts, leading the central transition programme. Despite the extraordinary circumstances and timing, however, he says the Merrill integration ran to a similar timeframe and process as the rest of BoA’s acquisitions, although he concedes that market conditions and customer expectations did set it apart.

Diligent planning and extensive experience are no guarantee of success, however. Santander’s migration of Alliance & Leicester and Bradford & Bingley’s systems to its own platforms reportedly blocked customers from accessing internet banking services for a time. Alliance & Leicester customers also complained of cards arriving late and of sort codes changed without notification.

Mergers will always be incredibly complex undertakings, so perhaps completely eliminating such issues is unrealistic. But there is certainly room for improvement, and many of the technical risks banks face when conducting M&A activity could be mitigated by giving proper consideration to IT integration when entering into a deal. Most in the market claim to have learnt from the events of recent years, and certainly, the costs and operational risks involved have caused many banks to give more thought to the topic when considering expansion of one sort or another.

But the real question, says Mr Maughan, is whether those lessons will prove long-lasting. “This isn’t the first time we’ve had a recession, and neither is it the first time the banking world has seen major acquisitions and divestments, but these issues still remain. It will be a wise bank which actually tries to institutionalise what it has learnt into its corporate memory.”
