Fintech, July 2 2006

Grand designs

Michael Meriton explains what drives the move to adopt an EDM approach and, once in place, what it allows businesses to achieve.

When an organisation is looking to take the first step towards a shared data infrastructure, it is important to be practical and to limit the functional scope of the first project, rather than aiming for a global, enterprise-wide project that never gets delivered. There is a danger, however, that this pragmatic focus on quick wins and functional fit obscures the point of an enterprise data management (EDM) strategy: to adopt an operating and IT model that separates the data management function from existing business and IT processes.

It is clear that all fund managers, investment banks and custodians will benefit from a better security master, but while the functional differences between competing security master applications will typically be small, there are very significant architectural differences that can compromise future EDM deployment.

There are essentially three defining attributes of a true EDM architecture:

  • It establishes a separate data management layer within an organisation’s overall systems architecture. This layer exists separately from any technical integration layers, such as message-based middleware, and from the applications that supply and consume data, which means it is not coupled to any specific technology or business application. Its sole purpose is to act as the natural home for all critical referential data as it moves through the stages of the data lifecycle: acquisition, validation, enrichment, matching, exception management and publishing.
  • It is capable of performing inbound many-to-one logic. This means that multiple data sources and formats can be translated and normalised into a single structured record. An example is the creation of a composite record from multiple cross-validated source records, known as the ‘Golden Copy’: cleansed, validated, enriched data that can be trusted.
  • It is capable of performing outbound one-to-many logic. Once a Golden Copy or otherwise validated record has been created, it can be published in different formats and frequencies to different applications. This means clean data is available when and where it is needed.
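
Taken together, these three attributes describe a pipeline that acquires, validates, enriches, matches and publishes data independently of any one application. The following is a minimal sketch of that flow in Python; the feed contents, field names and stage implementations are illustrative assumptions rather than a description of any particular product.

```python
# A skeletal data management layer: records flow through the lifecycle stages
# described above, independent of the applications that supply or consume them.
# All data and field names here are invented for illustration.

def acquire(sources):
    # Pull raw records from each (hypothetical) vendor feed.
    return [record for feed in sources for record in feed]

def validate(records):
    # Reject anything without a usable identifier.
    return [r for r in records if r.get("isin")]

def enrich(records):
    # Add derived attributes; a real layer would apply far richer rules.
    return [dict(r, asset_class=r.get("asset_class", "equity")) for r in records]

def match(records):
    # Group source records that describe the same instrument (many-to-one).
    groups = {}
    for r in records:
        groups.setdefault(r["isin"], []).append(r)
    return groups

def manage_exceptions(groups):
    # Placeholder: a real system would queue conflicts for expert review.
    return {isin: recs for isin, recs in groups.items() if recs}

def publish(groups):
    # Merge each group into one record ready for downstream consumers (one-to-many).
    return {isin: {k: v for rec in recs for k, v in rec.items()}
            for isin, recs in groups.items()}

feeds = [[{"isin": "US0378331005", "price": 71.89}],
         [{"isin": "US0378331005", "name": "Apple Inc"}]]
print(publish(manage_exceptions(match(enrich(validate(acquire(feeds)))))))
```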

The decision to implement a separate data management layer to create and distribute a Golden Copy record for all critical reference data puts high demands on the technical architecture that is used to deliver this trusted data to mission-critical business applications. It needs to be:

  • available without downtime;
  • guaranteed never to lose data;
  • easy to integrate and maintain;
  • scalable in a cost-effective way;
  • equipped with built-in security and audit functions; and
  • able to re-use existing investments in enterprise technologies and legacy systems, limiting the number of ‘points of failure’.

But what does a separate data management layer mean for the business?

Only when data is de-coupled from specific functions and applications is it possible to move to more flexible IT and operating models such as:

  • centralised versus distributed data management;
  • in-house or outsourced data management; and
  • a service-oriented architecture (SOA) environment for application development and legacy integration.

Centralised versus distributed

Much data management is a high-volume, low-expertise function, so IT operations and basic administration can be moved to centralised, low-cost centres. At the same time, there are a few high-impact, specialised exception management and validation tasks that still need to be handled by the experts who use the data for their business needs – a distributed model.

Unless a physically separate data management layer is in place, the data specialists are condemned to also spend much of their time on the commoditised and low value-added elements of the data acquisition and validation process.

In-house versus outsourced

Linked to the centralised versus distributed model is the ability to move any or all of the data management process to an outsourced service provider. As more of the banking and capital markets industry is focusing on core competencies, the pressure on IT departments to be able to facilitate these new models without introducing new risks is growing. Separating data from core business processes is a precondition to achieving this.

Service-oriented architecture

In an SOA environment, it is not just data and applications that are decoupled from each other. Discrete business functions can be invoked independently by ‘virtual’ applications as and when they need them. If, for instance, a securities identifier such as a CUSIP needs to be cross-referenced to another identifier such as an ISIN, the SOA model allows this check to take place without performing process- and database-intensive look-up tasks in a conventional shrink-wrapped application.
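
The idea can be sketched in a few lines of Python. The service name, interface and mapping below are hypothetical, but they show the cross-reference being invoked as a discrete, shared function rather than duplicated inside each application.

```python
# Hypothetical sketch of an identifier cross-reference exposed as a service.

class CrossReferenceService:
    """Stands in for a shared cross-reference service in the data management layer."""

    def __init__(self, cusip_to_isin):
        self._cusip_to_isin = cusip_to_isin

    def isin_for_cusip(self, cusip):
        # A single, discrete look-up; no application-local database join required.
        return self._cusip_to_isin.get(cusip)

# Any 'virtual' application invokes the service only when it needs the check.
xref = CrossReferenceService({"037833100": "US0378331005"})
print(xref.isin_for_cusip("037833100"))   # -> US0378331005
```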

This is the approach being adopted by 80% of financial institutions and software providers to build new application stacks, as well as to integrate new business functions within legacy systems.

These technical architecture models provide the platform that allows banks and investment firms to:

  • Integrate data from acquisitions without having to replace business applications.
  • Plug a separate data management layer into existing service offerings in order to increase scalability and robustness of legacy processing platforms, effectively extending the usable shelf life of previous-generation IT investments.
  • Focus on core competencies. The trend towards standardised market operating systems and relational database providers is spilling over to data management. The complexity of modelling and managing a complete data lifecycle in this diverse industry is such that the investment needed to develop total in-house solutions is not justified by the competitive advantages it is perceived to deliver.

Why do firms need the many-to-one functionality of EDM architecture? There are two reasons: to achieve data completeness and quality.

First, a single business function often requires a data record that cannot be sourced from a single supplier. The only way to achieve completeness for the data set is to obtain it from multiple sources and create a Golden Copy. Typical back-office processes need instrument attributes such as price, credit-rating and fundamental data. In some cases the only way to create this record is to combine different sources by matching against a single identifier.
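
A minimal sketch of that combination step is shown below, assuming three hypothetical feeds keyed on an ISIN; the vendor feeds and attribute names are invented for illustration.

```python
# Each (hypothetical) feed knows only part of the picture for one instrument.
pricing_feed      = {"US0378331005": {"price": 71.89, "currency": "USD"}}
ratings_feed      = {"US0378331005": {"credit_rating": "AA+"}}
fundamentals_feed = {"US0378331005": {"sector": "Technology"}}

def golden_copy(isin, *feeds):
    # Combine attributes from every feed that knows the instrument (many-to-one).
    record = {"isin": isin}
    for feed in feeds:
        record.update(feed.get(isin, {}))
    return record

print(golden_copy("US0378331005", pricing_feed, ratings_feed, fundamentals_feed))
```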

A second reason for many-to-one functionality is to improve the quality of the same data set by validating a record against a second or third supplier. This validation can be automatic: if two sources agree on an attribute, it is validated; if they disagree, manual exception management can take place.

If there are too few sources, manual research can take place. In the event of data conflicts, the many-to-one function can automatically resolve the issue according to pre-defined rules, or the record can be escalated to an expert user for research and resolution.
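
The agree/disagree logic might look something like the sketch below, where the source names, the precedence rule and the exception queue are assumptions rather than a prescribed design.

```python
# Cross-validate one attribute across sources: agreement validates automatically,
# a pre-defined precedence rule resolves known conflicts, and anything that cannot
# be resolved is queued for an expert. All names here are illustrative.

PRECEDENCE = ["vendor_a", "vendor_b", "vendor_c"]   # assumed house rule

def validate_attribute(attribute, source_values, exceptions):
    if len(source_values) < 2:
        exceptions.append((attribute, "too few sources", source_values))
        return None
    if len(set(source_values.values())) == 1:        # all sources agree
        return next(iter(source_values.values()))
    for source in PRECEDENCE:                        # conflict: apply the rule
        if source in source_values:
            exceptions.append((attribute, "auto-resolved by precedence", source_values))
            return source_values[source]
    exceptions.append((attribute, "unresolved", source_values))
    return None

exceptions = []
price = validate_attribute("price", {"vendor_a": 71.89, "vendor_b": 71.92}, exceptions)
print(price, exceptions)   # 71.89, plus a note for the exception queue
```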

The most effective way to achieve a scalable many-to-one data management model is to translate all data (securities, customers, counterparties, positions and transactions) to a normalised data model with strong vertical business domain knowledge embedded: a data model that describes Customer A not just as a string of characters with basic data typing and rudimentary relationships, but one that also depicts how different types of business relationships work together and link to the core business of banking and investments.

This is especially true for position management. Here the value of normalisation is not just quality improvement, but also completeness. The information collected is cumulative and builds a new data set, creating, for example, a consolidated position for a single customer from multiple balances and transactions.
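
A minimal sketch of that roll-up is shown below, assuming balances and transactions have already been mapped to one normalised movement record; the field names and customer identifiers are illustrative.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Movement:
    # One normalised record for both opening balances and signed transactions.
    customer: str
    isin: str
    quantity: float

def consolidated_positions(movements):
    # Roll movements up into one position per customer and instrument.
    positions = defaultdict(float)
    for m in movements:
        positions[(m.customer, m.isin)] += m.quantity
    return dict(positions)

movements = [
    Movement("CUST-001", "US0378331005", 1000),   # custody balance
    Movement("CUST-001", "US0378331005",  250),   # buy booked in the trading system
    Movement("CUST-001", "US0378331005", -100),   # sell booked elsewhere
]
print(consolidated_positions(movements))   # {('CUST-001', 'US0378331005'): 1150.0}
```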

More than a securities master

For instruments and entities, normalisation can offer much in terms of quality and completeness improvements, but it does not create truly new information. Normalised position management, however, does: it adds to the total pool of knowledge, and is therefore the stuff of which competitive advantage is made. The fact that having true real-time positions requires trusted instrument data and standardised entity data only reinforces that EDM gains exponential strategic value in each phase of deployment.

EDM powers profit generation

Value is already being realised when verified static data is available, but only when that data is linked to profit-generating activity, such as trading, lending and cash management, will EDM be able to deliver its full potential.

If many-to-one functionality can deliver higher-quality, complete data at the individual record level, the main business benefits of combining the many-to-one model with one-to-many functions are:

  • The ability to rationalise data supply once a Golden Copy exists. There is no longer a need to purchase the same data set again for a different business purpose.
  • Once trusted data exists, the need to mitigate downstream risk through control mechanisms and reconciliation tasks is greatly reduced.
  • Higher quality data means better decisions and fewer errors.
  • Integrated reference and positions data delivers the full transformation to a strategic enterprise data management environment.

An EDM platform brings standardisation and normalisation logic to the inbound data flowing from many sources (many-to-one). Once the data has been transformed into a Golden Copy, the same logic is applied in reverse: a single flow of complete, high-quality data to many consuming applications (one-to-many). There is huge value in translating potentially hundreds of representations into a single business meaning, but the greater value is only realised when this single meaning can be sent to business applications without complex integration.
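
That outbound step can be sketched as follows; the consumer names and output formats are invented, but they illustrate one Golden Copy being rendered differently for each subscribing application.

```python
import json

# One (illustrative) Golden Copy record...
golden = {"isin": "US0378331005", "name": "Apple Inc", "price": 71.89, "currency": "USD"}

# ...published in each consumer's own format (one-to-many).
consumers = {
    "settlement_csv":  lambda g: f"{g['isin']},{g['name']},{g['currency']}",
    "risk_json":       lambda g: json.dumps({"id": g["isin"], "px": g["price"]}),
    "reporting_fixed": lambda g: f"{g['isin']:<15}{g['price']:>10.2f}",
}

for name, render in consumers.items():
    print(f"{name}: {render(golden)}")
```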

This is where the lesson learned from the middleware revolution pays dividends; namely, once data can be trusted, it should flow freely.

The EDM circle begins with a message delivered in a bespoke format to a central reference data engine, where it is normalised to a single data model and aggregated with transaction and positions data. The circle is then complete: trusted data is available to power strategic enterprise applications, enabling business growth, improving operational efficiency and minimising risk.

Michael Meriton is president and CEO at GoldenSource Corporation.
