Now that speed is king on the trading floor, banks must optimise their systems’ engineering – or find themselves left behind. Dan Barnes reports.

The exchange-traded options market is growing at pace, but the nature of the product puts more pressure on technology than other investment instruments do. Because an option’s price is driven by its underlying instrument, the data around an option can change without any trading having occurred.

The amount of movement can vary tremendously, so systems must be robust enough to cope with the volume of information even when there is little trading activity. Gerhard Lessmann, member of the executive board of Deutsche Boerse Systems, gives this example: “In the year 2000 we had 1.2 quotes per traded contract. In 2006 this increased to 17.09, so we have roughly 15 times the number of quotes per traded contract that we had in 2000. We are not earning money with quotes; we are earning money with a traded contract. But we believe these quotes are very valuable for the market as a whole.”

Significant volumes

Frédéric Ponzo, managing director of consultancy NET2S, says other exchanges see similar volumes. “On Euronext there are about 4000 products that generate around 400,000 orders a day. So for almost half a million orders a day, you are generating a bid price every single time the underlying is changed. And do you know how many trades they make a day? About a thousand. The ratio of executions to orders is half a per cent.”

For market makers, it is crucial to update prices frequently. To do this they must follow the market very closely, picking up on small changes in the underlying or in the overall market situation. Doing so allows them to give “newer and better quotes so they can quote with newer and narrower spreads than they would do otherwise”, as Mr Lessmann puts it. But whether market maker or trader, anyone in the options business faces a considerable amount of data processing if they wish to take advantage of, and make money from, the opportunities presented.

Into the machine

This translates into huge pressure on the systems within exchanges and banks, first to receive the data and then to process it. Abdullah Malik, director FX options at Dresdner Kleinwort, notes that the bank has been designing systems with this in mind – regardless of prevailing market conditions.

“We have had a tough few years in this market, although now the volume is back. Fortunately we have always developed our systems to cope with large volumes of data; that is not true for all of our peers. This requires end-to-end performance, so your systems must be constructed and measured with capacity for large volumes built in from start to finish; there is no point having pools of capacity and bottlenecks between them.”

According to Mr Ponzo, there are very real threats to players whose IT is not up to scratch. “Every time there is a big announcement – such as a change in employment figures – between three and 10 banks have their market data platform crash on them,” he says.

Awareness is an important part of the solution. Mr Ponzo recounts one case he worked on in which a US bank, monitoring its data levels closely, found that the volume of data feeding into the bank in July 2006 was 2.5 times what it had been in July 2005.

“The first challenge is getting the data into the bank,” he says. “I have at least two clients who have called me in a hurry because they have lost all data feeds thanks to the sheer volume of data coming in. This is not a potential issue; it is happening now.”

Once a bank has managed to get the data into its system, the next stage is to process it. The more efficiently this is done, the faster the bank can respond. Yet this is where many organisations fail. Although hardware is important in establishing low latency, the most common problem for banks lies in the engineering of their software. Mr Malik believes that packaged, off-the-shelf software will not suffice for the ambitious, “as you have no competitive edge if your systems are the same as those of other players”.

Another issue with commercially available software is that it may have become outdated. Software built in the late 1990s or early 2000s may not have been designed to scale to the required degree, making its other strengths less valuable.

Over-engineered

“From a technically minded point of view, [systems] can be beautifully created. But they were not designed to cope with this much data. They are over-engineered for requirements,” says Mr Ponzo. Using the example of market data systems, he notes that they will often generate up to seven pieces of data for every piece received, detailing how the received data is to be treated. Mr Malik agrees that the process flow can be crucial when trying to gain an edge.

“Often the pricing engine and the risk management platform are separate. In our instance, pricing and risk management are run together to manage our risk and distribute products/prices,” he adds.
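To illustrate the idea of running pricing and risk together – a minimal, hypothetical sketch rather than Dresdner Kleinwort’s actual platform – a single calculation can emit a fair value and its risk sensitivities in one pass, so the risk numbers never lag the prices. Black-Scholes is assumed here purely for illustration; the article names no model.

```python
# Hypothetical sketch: pricing and risk computed in one pass, so the
# risk platform is never working from stale prices. Black-Scholes is
# an assumption made for illustration only.
from dataclasses import dataclass
from math import erf, exp, log, pi, sqrt

def _norm_cdf(x: float) -> float:
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def _norm_pdf(x: float) -> float:
    return exp(-0.5 * x * x) / sqrt(2.0 * pi)

@dataclass
class PriceAndRisk:
    price: float   # fair value of the call
    delta: float   # sensitivity to the underlying
    vega: float    # sensitivity to volatility

def price_call(spot: float, strike: float, vol: float,
               rate: float, expiry: float) -> PriceAndRisk:
    """Return the Black-Scholes call price and its key Greeks together."""
    d1 = (log(spot / strike) + (rate + 0.5 * vol ** 2) * expiry) / (vol * sqrt(expiry))
    d2 = d1 - vol * sqrt(expiry)
    price = spot * _norm_cdf(d1) - strike * exp(-rate * expiry) * _norm_cdf(d2)
    # The intermediate d1 is reused for the Greeks: one pass, several outputs.
    return PriceAndRisk(price=price,
                        delta=_norm_cdf(d1),
                        vega=spot * _norm_pdf(d1) * sqrt(expiry))

print(price_call(spot=100.0, strike=105.0, vol=0.2, rate=0.05, expiry=0.5))
```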

Latest developments

An important development in software design that assists with processing at this level is parallelisation: running computing tasks simultaneously across different processors to complete the overall job faster. It is not suitable for every task, as some require operations to run in a fixed sequence, referred to as serial processing. But applications built this way can exploit the latest dual-core chips, which allow a single PC or server to process in parallel.
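A minimal sketch of the technique (illustrative only, again assuming a Black-Scholes pricer): each option in a book can be priced independently of the others, so the work fans out across every available core, while anything order-dependent would stay serial.

```python
# Hypothetical sketch of parallelisation: identical, independent pricing
# tasks spread across CPU cores. Pricing a book is "embarrassingly
# parallel"; order matching, by contrast, is inherently serial.
from concurrent.futures import ProcessPoolExecutor
from math import erf, exp, log, sqrt

def bs_call(args):
    """Black-Scholes call price for one option (model assumed for illustration)."""
    spot, strike, vol, rate, expiry = args
    d1 = (log(spot / strike) + (rate + 0.5 * vol ** 2) * expiry) / (vol * sqrt(expiry))
    d2 = d1 - vol * sqrt(expiry)
    n = lambda x: 0.5 * (1.0 + erf(x / sqrt(2.0)))
    return spot * n(d1) - strike * exp(-rate * expiry) * n(d2)

if __name__ == "__main__":
    # A toy book: 10,000 strikes against one underlying.
    book = [(100.0, 80.0 + k * 0.005, 0.2, 0.05, 0.5) for k in range(10_000)]
    # The executor fans the independent pricing tasks out across all cores.
    with ProcessPoolExecutor() as pool:
        prices = list(pool.map(bs_call, book, chunksize=500))
    print(f"priced {len(prices)} options")
```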

At Eurex this has been used to great effect in dealing with large volumes of data at great speed. “Originally we would take the order, lock everything in the respective product, process the order and then we would take the next one,” says Mr Lessmann. “We are overlapping the non-critical parts of transactions so that they can process in parallel, and the critical time span of the transaction – where we really have to work serially, where we cannot process in parallel – is as short as possible.”
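In concurrency terms, this is the classic pattern of shrinking the critical section: do everything that can overlap outside the lock, and hold the lock only for the step that truly must be serial. A toy sketch follows, with hypothetical product codes and a deliberately simplified order book; it is not a description of Eurex’s actual system.

```python
# Hypothetical sketch of the pattern Mr Lessmann describes: non-critical
# work (parsing, validation) overlaps freely across threads, and the
# per-product lock guards only the short, truly serial step.
import threading

PRODUCTS = ["ODAX", "OESX"]                        # hypothetical product codes
book_locks = {p: threading.Lock() for p in PRODUCTS}
order_books = {p: [] for p in PRODUCTS}            # toy order books

def validate(order):
    # Non-critical: many threads can run this part at the same time.
    assert order["qty"] > 0 and order["price"] > 0
    return order

def process_order(order):
    order = validate(order)                        # done outside the lock
    with book_locks[order["product"]]:             # serial span kept as short as possible
        order_books[order["product"]].append(order)

threads = [threading.Thread(target=process_order,
                            args=({"product": "ODAX", "qty": 10, "price": 100.0 + i},))
           for i in range(8)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(len(order_books["ODAX"]), "orders booked")
```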

The need for processing speed is producing not just operations at scale – one market maker client of Mr Ponzo is using 160 CPUs to price its options – but also innovation, as he recounts: “At one stage they were contemplating doing their pricing on PlayStation 2 games consoles. Three-dimensional graphics use floating point calculations; options pricing uses floating point calculations. What they are now doing is using the graphics adaptors on the video cards. It is an example of where you really need to push the boundaries if you want to keep up. And not everybody can.”
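The connection Mr Ponzo draws is that options pricing, like 3D graphics, is bulk floating-point arithmetic over large arrays – exactly the workload graphics hardware is built for. A hypothetical sketch, with NumPy on a CPU standing in for the GPU (and assuming SciPy is available for the normal CDF); a GPU library evaluates the same array expressions in parallel across its many cores.

```python
# Hypothetical sketch, not the market maker's actual system: pricing a
# whole ladder of strikes as one burst of array-wide floating-point
# arithmetic, the kind of work graphics hardware parallelises well.
import numpy as np
from scipy.stats import norm  # assumption: SciPy available for the normal CDF

def bs_calls(spot, strikes, vol, rate, expiry):
    """Black-Scholes call prices for an array of strikes at once."""
    d1 = (np.log(spot / strikes) + (rate + 0.5 * vol ** 2) * expiry) / (vol * np.sqrt(expiry))
    d2 = d1 - vol * np.sqrt(expiry)
    return spot * norm.cdf(d1) - strikes * np.exp(-rate * expiry) * norm.cdf(d2)

strikes = np.linspace(50.0, 150.0, 1_000_000)   # a million strikes in one shot
prices = bs_calls(100.0, strikes, 0.2, 0.05, 0.5)
print(prices[:3], prices[-1])
```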

If an organisation is serious about the options business, it had better be serious about its technology too.
