Fintech | May 6 2007

Integration takes leap of faith

Could the difficulties surrounding integration of the front, middle and back offices be political rather than technical? Alan Duerden reports.

Financial institutions find themselves in a double bind. In a market that is constantly evolving and growing, a bank’s ability to reinvent itself is key. To generate alpha, banks need to come up with new instruments, be more innovative and more flexible, and be able to establish quickly where that flexibility and innovation should be channelled.

At the same time, global trends towards cost cutting and efficiency have increased the risks involved in dealing with trading partners, generating compliance issues. Banks need to remain competitive in a market that is becoming increasingly complex and restrictive. It is a causality dilemma, and data integration is at the heart of it.

The front office wants to give customers the best service it can, churning out data as it updates instruments and creates new ones. The back office must match, reconcile and settle transactions accurately, and must push for further data integration with the front office if it is to meet its compliance obligations and achieve the best possible risk management.

“Orchestrating the flow of data through all of this supply chain is not a trivial task,” says Tim Lind, senior vice-president of product management at GoldenSource, the enterprise data management company. “If it were easy, it would have been sorted out by now.”

Complicated process

One problem behind integrating data between the front, middle and back offices is the heterogeneous way in which most financial institutions have developed, with separate business, customer and technology ‘silos’. Using data across an enterprise becomes complicated when there are different ways of accomplishing the same thing.

“There is not one vendor that can supply all the business applications required to run a modern financial programme,” says Mr Lind. “Nor is there any modern financial institution gifted enough to integrate all these different applications in a seamless way.”

The bespoke and legacy systems that financial institutions have built up over time exacerbate the interoperability problem: they were set up to deal with very specific financial instruments at a time when interoperability was not on the agenda.

These bespoke systems are not usually integrated with anything else, and bringing them together can mean deconstructing individual instruments to find their common traits and what can be reused across a palette of instruments. This is costly, time consuming and can disrupt the everyday running of the business.
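To make the idea concrete, the sketch below shows what a ‘common traits’ model might look like, written in Python purely for illustration; the class and field names are hypothetical, not drawn from any system mentioned in this article.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class Instrument:
    """Traits shared by every instrument, whichever silo owns it."""
    identifier: str   # e.g. an ISIN
    currency: str
    issue_date: date

@dataclass
class Bond(Instrument):
    """Bond-specific traits layered on top of the common base."""
    coupon_rate: float = 0.0
    maturity: Optional[date] = None

@dataclass
class EquityOption(Instrument):
    """Option-specific traits layered on top of the same base."""
    underlying: str = ""
    strike: float = 0.0
    expiry: Optional[date] = None
```

Once a common base like this exists, downstream systems can handle the shared fields uniformly, and only the instrument-specific extensions need bespoke treatment.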

Problem solving

Consistent data can be streamed between data silos and between the front, middle and back offices through service-oriented architecture (SOA), which unifies data across a whole enterprise by connecting up the numerous software applications that the financial industry has accumulated. “The challenge is to feed those business applications with a consistent source of validated content,” explains Mr Lind.
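A minimal sketch of that idea follows, assuming a hypothetical in-process ‘golden copy’ service; a real SOA deployment would expose the same operations behind web-service or messaging interfaces rather than direct calls.

```python
class SecurityMasterService:
    """A single validated source of instrument reference data.

    Front-, middle- and back-office applications call this service
    instead of each maintaining its own copy of the data.
    """

    MANDATORY_FIELDS = ("name", "currency")

    def __init__(self):
        self._golden_copy = {}  # identifier -> validated record

    def publish(self, identifier: str, record: dict) -> None:
        """Validate a record once, centrally, before anyone consumes it."""
        for field in self.MANDATORY_FIELDS:
            if field not in record:
                raise ValueError(f"missing mandatory field: {field}")
        self._golden_copy[identifier] = record

    def lookup(self, identifier: str) -> dict:
        """Every consuming application sees the same validated record."""
        return self._golden_copy[identifier]
```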

SOA can be introduced either through an ‘open-heart surgery’ rip-and-replace approach, or through a project delivered in smaller steps rather than executed in one go.

Francis Wenzel, vice-president of product management, data management solutions at SunGard, a software and processing solutions provider, likens the rip-and-replace tactic to trying to boil the ocean. “You can’t just go out there and do it in one hit. It’s a maddeningly complex process and one of the biggest challenges is the business case justification for such a large project,” he says.

Step-by-step projects are preferred because ‘home-grown’ systems are used to manage specific aspects of the data flow between the front and back offices; if these are taken away to be replaced or updated, parts of the business could grind to a halt. The data integration process also suffers from the perennial problem of standards, whether in messaging, symbology or data models.

“When you are looking at mid and back-office applications that are consuming reference data, securities data, counterparty data and end-of-day pricing, it is a hodgepodge of different ways of expressing data and there is no standard way of working,” explains Alexis Kalmanovitz, global head of strategy in the enterprise division at Reuters. Similarly, in the front office, a broker and a client communicating about a trade often do not use the same references for instruments.

Through XML (Extensible Markup Language) and the International Organization for Standardization (ISO), standards are being put in place to provide a lingua franca to which people can subscribe when dealing with different investment instruments.
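As an illustration of why such a lingua franca helps, the sketch below translates two proprietary symbologies onto one common identifier and emits a simplified XML message. All identifiers and element names are made up for the example; they do not follow any official schema.

```python
import xml.etree.ElementTree as ET

# Two vendors refer to the same (fictitious) instrument differently.
VENDOR_A = {"VOD.L": "GB0000000000"}   # RIC-style code    -> common ISIN
VENDOR_B = {"VOD LN": "GB0000000000"}  # ticker-style code -> common ISIN

def to_common_id(symbol: str) -> str:
    """Translate any known proprietary symbol to the shared identifier."""
    for mapping in (VENDOR_A, VENDOR_B):
        if symbol in mapping:
            return mapping[symbol]
    raise KeyError(f"no cross-reference for {symbol!r}")

def trade_message(symbol: str, quantity: int, price: str) -> str:
    """Build a simplified XML trade message keyed by the common identifier."""
    trade = ET.Element("trade")
    ET.SubElement(trade, "instrumentId", scheme="ISIN").text = to_common_id(symbol)
    ET.SubElement(trade, "quantity").text = str(quantity)
    ET.SubElement(trade, "price", currency="GBP").text = price
    return ET.tostring(trade, encoding="unicode")

# Broker and client use different symbols yet produce the same message.
assert trade_message("VOD.L", 1000, "1.52") == trade_message("VOD LN", 1000, "1.52")
```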

“The advances that have been made in standards show people how difficult it will be for them in the future, and if they are already using the standards, it reduces the complexity and the cost of connecting up significantly,” says Susan Scott, senior lecturer in information systems at the London School of Economics.

Excuses can always be found for postponing development projects, the usual suspects being cost, gaps in knowledge and technological constraints. With data integration, however, these excuses are not nearly as viable as they may seem, says Dr Scott. “The technology is around, the know-how of how you integrate data is available. The main issue seems to be the management of change.”

The political agenda

Underpinning all of these problems and solutions is internal politics: the difficulty of changing processes, of deciding who takes ownership of data integration, and of getting different departments within an organisation actually to work together.

In a market that stimulates the ‘get rich or die trying’ philosophy, personal and departmental successes are based on quarterly figures and quick results. “The typical programme to fix data lasts three times longer than the average tenure of the CIO that would need to make the decision to make it happen,” says Paul Mee, director of European strategic IT at Mercer Oliver Wyman, the financial services consultancy. “The data problem won’t get fixed unless ambition is set out with a collective plan and the right level of recognition. Fixing data is not something that currently makes corporate heroes. However, doing a big acquisition, entering a new market or launching a new product does.”

No ownership

If you sat people down one-to-one, they would all probably agree that data integration is a necessary part of a department’s development, yet no one seems able to take ownership of the situation. At the likes of JPMorgan, Citi, Nasdaq and Barclays, however, data ownership has become more formalised and institutionalised with the emergence of executive-level chief data officers and data managers.

As in other disciplines, dedicated data management and data integration groups are being created, with the expertise to source data content from external and internal providers, to apply the business rules that keep that content accurate, and to distribute and publish that data to consuming systems. What is needed is a means for people to think about data, to care for it and to ensure it is looked after throughout the organisation.

“It is all about the mentality within the institution and whether they want to look at data and its integration on an enterprise-wide basis,” says Darren Purcell, director at Standard & Poor’s. “People are looking at how they manage data internally and how they can improve that process.”

In a complex and restrictive marketplace that demands higher profits and quicker returns, the tools are there for banks to embrace data integration between the front, middle and back office. A jump over the internal political hurdle is all that is needed to integrate both data and people effectively.
