Unless banks take a more holistic view of their data management systems, they will find regulatory compliance and cross-asset analysis increasingly difficult. Heather McKenzie explains.

Since the late 1990s, when the US securities markets tried and failed to reach T+1 settlement, data content has gained importance at financial institutions. Data is no longer considered an add-on; it is an asset whose role in risk management, customer services and regulatory compliance is well recognised.

Michael Atkin, managing director of the EDM Council (EDMC), a business forum for senior financial services industry executives involved in data management, says financial institutions must ensure that data content is consistent, accurate and can be compared and shared across multiple internal processes and functions as well as externally with counterparties and customers. EDMC’s 55 member firms include Credit Suisse, Vanguard, Franklin Templeton and Morgan Stanley.

“The world is changing for financial institutions – market restructures, increased complexity of shareholder cycles and the need for operational efficiency are all driving change,” says Mr Atkin. “For the most part, financial institutions have built this industry on business unit-oriented silos, each of which has its own data warehousing strategy and data models that work for that silo only.”

Any institution attempting to share data, compare it and verify it between different silos will be faced with significant reconciliation and repair issues. “Banks don’t have a holistic view of their enterprise and don’t fully understand their customer requirements,” he says.

If firms pay greater attention to data content as an asset, they will be able to use it to support more innovative trading, improve customer services, conduct cross-asset risk analysis and comply with regulatory obligations, says Mr Atkin.

Multiple data sources

In his report to the EDMC published earlier this year, Principia Verbum – Entering the Age of Reason in Data Management, Mr Atkin wrote: “EDM and the need to both understand and address complex data dependencies across functions, between applications and among multiple lines of business is pushing data content to a level of equivalence with technology as part of a financial institution’s core operational infrastructure.”

The EDMC’s message on data as infrastructure is being pushed by its members and sponsoring companies, BearingPoint, SunGard, Cicada, GoldenSource and IBM. David Beeston, entity analytics business manager, northern Europe, at IBM, says: “While there is still an element of data being a slave to the application layers, this is starting to change.”

Regulators require banks to share customer data across applications, which has been difficult because banks have traditionally kept customer data separate. Growth through mergers and acquisitions has exacerbated the problem of disparate data silos.

IBM argues that legacy infrastructures have been set up to manage data, “not information”, and fall short when it comes to providing insight into customers and delivering operational information and analytics across the enterprise. To comply with multiple regulations, banks compile, cleanse and reconcile the same information many times, says IBM, an approach that will become too costly and complex as compliance shifts from a periodic to an ongoing task.

Inconsistent data from multiple payment systems causes operational inefficiency and increases fraud risk. Legacy systems are inflexible and difficult to change, technically incompatible, functionally inconsistent and operationally redundant, resulting in high operational costs and a dependence on antiquated skill sets.

“The greatest asset an organisation has is its data, be it customer, product, entity or unstructured,” says Mr Beeston. “We are trying to cut across all the data silos by implementing open standards and heterogeneous infrastructure, based on service-oriented architecture.”

Mr Atkin cites four significant drivers of the move to data as infrastructure – risk, regulation, operational efficiency and business development. “Regulators are looking at risk as a determinant of capital requirements. How banks manage their own risk, as well as counterparty risk, is a data-driven process,” he says.

Regulators are also insisting that banks are able to verify their business partners and report on transactions in order to justify investment decisions. “If you cut through all of the nonsense surrounding regulatory requirements, it all boils down to these two capabilities. Financial institutions need provenance over data and data processes,” says Mr Atkin.

For greater operational efficiency, banks should leverage data across all of their activities, enabling greater automation and thereby cutting the costs of repair and reconciliation and reducing trade delays.

Finally, if data content is elevated in importance within a bank’s infrastructure, new business development, in the form of new products and innovation, will follow. Unless a financial institution fully understands its customers and its own business structures, it will not be able to deliver appropriate products and services.

“If you look at all four of these drivers, what financial institutions need is control over the data asset,” says Mr Atkin. “Most of the time, however, data content is treated as a detail of the technology environment. But for the first time, the precision, granularity and meaning of data have become very important, and that is different from technology. Technology is only about processing data and providing access to and integration of data.”

Mr Atkin describes data content as a “third leg” in financial institutions’ operations, in addition to technology and trade servicing. “These three elements support all of the processes of the financial institution and that is the reason the EDMC is looking at data as a core component of infrastructure,” he says.

Cross wires

There is no shortage of vendor moves on the data front. In corporate actions, for example, data about particular corporate events is paramount, but financial institutions face issues of inconsistent data coming from a wide variety of sources.

In April, FT Interactive Data, a supplier of financial information to global markets, formed an alliance with Avox, a company that provides business entity data content, to offer a joint business entity data service. The service is aimed at financial institutions that have to comply with the European Commission’s Undertakings for Collective Investment in Transferable Securities (Ucits III) directive.

Under Ucits III, total exposure to a particular parent organisation through all its subsidiaries must be calculated and managed in order to ensure that the total exposure falls within the limits set by the regulator.
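The arithmetic itself is simple once the entity linkages are known: each holding is attributed to its ultimate parent and the exposures are summed per group. A minimal Python sketch of such a roll-up follows; the entity names, positions and flat 5% concentration limit are illustrative assumptions (Ucits III’s actual limits vary by instrument and counterparty type), not a representation of any vendor’s service.

# A hypothetical sketch of the group-exposure roll-up described above.
# Entity names, positions and the flat 5% limit are illustrative only.

# Each subsidiary maps to its direct parent; ultimate parents are absent.
PARENT = {
    "AlphaBank Securities": "AlphaBank Holdings",
    "AlphaBank Leasing": "AlphaBank Holdings",
    "BetaCorp Finance": "BetaCorp Group",
}

# Fund positions: (issuing entity, exposure as a percentage of net assets).
POSITIONS = [
    ("AlphaBank Securities", 3.5),
    ("AlphaBank Leasing", 2.0),
    ("AlphaBank Holdings", 1.0),
    ("BetaCorp Finance", 4.0),
]

LIMIT_PCT = 5.0  # illustrative concentration limit per parent organisation

def ultimate_parent(entity):
    # Walk the linkage chain up to the top-level parent organisation.
    while entity in PARENT:
        entity = PARENT[entity]
    return entity

def group_exposures(positions):
    # Aggregate exposure per ultimate parent across all its subsidiaries.
    totals = {}
    for issuer, pct in positions:
        parent = ultimate_parent(issuer)
        totals[parent] = totals.get(parent, 0.0) + pct
    return totals

for parent, total in sorted(group_exposures(POSITIONS).items()):
    status = "over limit" if total > LIMIT_PCT else "within limit"
    print(f"{parent}: {total:.1f}% ({status})")

Run against the sample data, the AlphaBank exposures of 3.5%, 2% and 1% roll up to 6.5% at the parent level, breaching the illustrative limit even though no single position does. This is precisely why accurate linkage data matters: without it, the breach is invisible.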

Business entity data

The two firms are banking on increased demand for business entity data among European fund management administrators. In addition to Ucits III, requirements under Basel II and the Markets in Financial Instruments Directive (MiFID) will have an impact on business entity data. “Many customers tell us they need a comprehensive and affordable service offering business entity linkages data that will help them meet their Ucits III compliance requirements,” says Bob Cumberbatch, European business director at FT Interactive Data.

“They need to identify and maintain linkages that determine which issues link into which issuers, and which issuers link into which higher entities or parent organisations. Business entity data is a specialised area – corporate actions alone cannot be relied upon to capture all of the changes that occur either to an individual business entity or within a corporate group’s organisational structure.”

Mr Atkin says upgrading infrastructures and understanding data dependencies enables financial institutions to shift their focus from low value-added data scrubbing processes to high value-added data manufacturing processes. “The focus on data models, business rules and application requirements results in the creation of tools to measure capabilities.

“Audit and monitoring tools promote comparison and quality benchmarking. Comparison facilitates competition and encourages suppliers to pay attention to utilisation rates rather than just bulk delivery.”
