Fintech | March 3, 2010

Decoding the issue of reference data

Inconsistent reference data is an ongoing industry-wide problem that leads to time-consuming trade breaks and results in millions of dollars of unnecessary annual expenditure. The industry is considering creating a centralised reference data utility that would provide a global store of transaction information. In this Masterclass, John Mason, CEO of SmartStream's DClear Utilities, outlines the industry's efforts to address these challenges. Writer: Michelle Price

The participants

John Mason - CEO, DClear Utilities, SmartStream

Michelle Price - Business editor, The Banker

What is the significance of reference data and can you explain the challenges faced by institutions when attempting to manage it?

Reference data is core to the securities business: if you think about any transaction that is undertaken, there are very specific elements to the activity itself, but also aspects that do not change. For example, if I were to buy IBM on the stock exchange, that company will have a code by which it will always be referred to. That is the reference data element, and it exists across all transactions carried out by any institution.

The challenge for businesses is that there are various codes involved in such transactions - every single market has its own identification codes. IBM may have a certain code if traded on the London Stock Exchange, but a different code if traded on one of the alternative trading venues throughout Europe. A number of codes actually refer to the same company, and the challenge is maintaining consistency across multiple institutions and geographic borders.

If an organisation trades a product with a particular entity on behalf of a certain client then any number of different codes can impact that transaction. The trick is making sure that each and every code refers to the same thing in the same way.
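To make that concrete, here is a minimal sketch of how a firm might resolve venue-specific codes to a single canonical identifier; the symbols, the identifier and the mapping are illustrative assumptions rather than real market data.

```python
# A minimal sketch, assuming a hypothetical mapping: venue-specific symbols
# are resolved to one canonical identifier so that every code "refers to the
# same thing in the same way". All symbols and identifiers are illustrative.

CANONICAL_IDS = {
    "IBM":   "US4592001014",   # primary listing symbol (illustrative)
    "IBM.l": "US4592001014",   # hypothetical London Stock Exchange symbol
    "IBM.x": "US4592001014",   # hypothetical alternative-venue symbol
}

def canonical_id(venue_symbol: str) -> str:
    """Resolve a venue-specific symbol to the canonical identifier."""
    try:
        return CANONICAL_IDS[venue_symbol]
    except KeyError:
        # An unmapped symbol is precisely the kind of inconsistency
        # that leads to a trade break downstream.
        raise ValueError(f"unknown symbol: {venue_symbol!r}")

assert canonical_id("IBM.l") == canonical_id("IBM.x")  # same company
```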


In the wake of the Markets in Financial Instruments Directive (Mifid) and the rise of new trading venues, is it fair to say that the problem is getting worse?

Yes, it is becoming more difficult. More codes have been introduced: for example, an instrument traded on the multilateral trading facility Chi-X carries a '.x' suffix, while on the London Stock Exchange it carries '.l', and so on. There are a number of extra codifications, or symbols, as we refer to them within the industry, and the cross-coding matrix is becoming more complex; as a result, there are added costs and risks.
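As a rough illustration of why the cross-coding matrix grows so quickly, the sketch below translates a symbol between two venues' conventions via a shared canonical key; the venue names, suffixes and symbols are assumptions for illustration only.

```python
# A rough sketch of a cross-coding matrix, keyed by a canonical identifier;
# venue names, suffixes and symbols below are assumed for illustration.

CROSS_CODES = {
    "US4592001014": {          # canonical key for one instrument
        "CHI-X": "IBM.x",      # '.x' suffix convention
        "LSE":   "IBM.l",      # '.l' suffix convention
    },
}

def translate(symbol: str, from_venue: str, to_venue: str) -> str:
    """Translate a symbol from one venue's convention to another's."""
    for codes in CROSS_CODES.values():
        if codes.get(from_venue) == symbol:
            return codes[to_venue]
    raise ValueError(f"{symbol!r} is not known on {from_venue}")

print(translate("IBM.x", "CHI-X", "LSE"))  # -> 'IBM.l'
```

Routing every translation through a canonical key keeps the number of mappings linear in the number of venues; direct venue-to-venue mappings would grow quadratically, which is one way the cross-coding matrix becomes complex.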

Are companies beginning to address this issue?

They are aware of the situation - they can't just bury their heads in the sand. Some firms have internal teams, while others outsource to third-party organisations. Others have looked to businesses such as ourselves to provide managed services, where we take on board some of that capability. The most prevalent way of managing the problem, however, is still internally, using some form of enterprise data management platform and employing sufficient staff to manage the data, from sourcing it from vendors through to distributing it to the end clients within the banks themselves. For some organisations that costs tens, if not hundreds, of millions of dollars every year.

Is the enormous cost proving the greatest impetus for bringing about change?

A number of factors come into play. Money is always a factor, particularly when you consider the impact of the past 18 months or so, when trading volumes declined and banking revenues fell. These institutions are always going to look at maximising value for money, and an area where they are typically spending tens, if not hundreds, of millions of dollars a year is going to be one area of focus. Risk and regulation are other areas that banks need to be aware of, in terms of exposure to a counterparty or a particular instrument. We saw the significance of that with the collapse of Lehman Brothers.

Are the regulators now examining reference data directly, or is the pressure more indirect?

I think they are looking at it. Last year the Financial Services Authority (FSA) fined a major bank for inconsistencies in the reference data around certain trades that were filed. We also have initiatives from the European Central Bank (ECB) and in the US. There's no doubt that regulators are taking a very hard look at what can be done about reference data - not necessarily just at the high level, but right down to the fundamental issue: what do we describe things as? What language do we use to talk about the products so we get it right much earlier in the cycle, rather than allowing multiple codes and various descriptions to take over further down the processing stream? It is open to debate, however, how far the regulators will step into the whole issue - be it in terms of legislation or just advising.

The debate around how to address this issue is gaining momentum in reference data circles and the suggestion is to create a central reference data utility. What are your views on this?

It is likely that this will come about, and it would be beneficial. At the moment, the discussion frames the development of standards versus the utility as if they were mutually exclusive. I don't see it that way: standards and utilities reinforce transparency within the industry, but I think the utility should come first and can be a tool in driving those standards throughout the industry. A central reference data utility will ultimately want to minimise the number of symbols it has to manage, so it will be a necessary tool in pushing towards standardisation.

The industry is likely to prefer a bank-owned service and there is a concern that more than one utility might emerge. What are your views on this and where do vendors fit into the picture?

There are many definitions of such a utility. For a lot of people, a bank-owned model translates as free: it means free data and a move away from the traditional aggregators as data sources. For me, the utility is a commercial proposition. Someone has to manage and validate that data and create consistency, and organisations need help to migrate towards a standard format. That's where businesses such as DClear can play a role in collaborating with banks or other companies to make this happen.

This is not the first time the industry has mooted a utility for reference data. What are the challenges in moving to this model?

A clear definition of what the utility is about and what it is looking to provide would help. There is also a huge industry around data management, so there is always a fear factor in moving towards the unknown. In the past 18 months, people have started speaking about disruptive solutions, and I think there has been much more of an appetite for this in the marketplace - a fundamental change in the way in which we do things. I think the utility has returned to pre-eminence because of that: it would fundamentally change how we do things in the future.

Do you think regulators should drive the agenda, or can the industry take the lead on this issue?

The industry has the power to bring about change. I don't necessarily think it should or needs to rely on legislation, as it's ultimately for the industry's benefit. People often associate regulation with additional cost, and comply because they have to. In this instance I think there's a genuine desire to lower transaction costs and to limit the number of trade breaks: about 16% of trades break due to reference data problems, and 29% break in derivatives processing due to reference data issues. There is a genuine desire for businesses to get beyond where we are today and improve processing rates.

How much do you think will be achieved over the next two years?

Businesses will be looking at the potential role of a utility. The ECB is talking about it, and in the US the National Institute of Finance (NIF) is being mooted as a body that can take on a utility for securities information and legal entity information. In the US they're also thinking about a central store of transaction information, primarily because of transparency and data access. Over the next two to three years there will be some consolidation of those concepts. I hope if we do move down that road, we receive some central governance. I don't think there is any merit in the ECB and the US ploughing separate furrows. It needs to be consolidated or, again, we'll end up in the same boat with disparate solutions to the same problem.

The issues

- Confusion created by a variety of codes

- Complex cross-coding matrix

- High cost of managing data

- Call for central reference data utility

- Challenges that lie ahead

Watch now: view the debate or individual chapters at thebanker.com/media
