Transaction banking | November 5 2007

Algorithmic trading: villain or scapegoat?

Algorithmic trading was blamed for the escalating volatility spikes in August. Frances Maguire asks whether this is fair criticism, and looks at how algorithms are getting smarter.

The US subprime credit crunch in August blazed across all financial markets, setting new records both in volatility spikes and in volumes traded. Indeed, the industry saw three days on which more than 10 billion shares traded. The Chicago Board Options Exchange’s Volatility Index (VIX) soared past 30, and by the end of the month VIX futures had set three trading records.

While it is too early to draw firm conclusions about the cause and effect of the market volatility, many are already pointing the finger at algorithmic traders as the cause of the rapid, volatile price swings that occurred in even the most illiquid stocks. Moreover, the ability of computer-driven funds to respond to extreme market events is being questioned.

Volatility suppressor

According to TABB Group senior analyst Adam Sussman, algorithmic trading did not contribute to the volatility spikes, and in some cases it actually lowered volatility. He says: “If anything, the volatility would have been worse in the absence of algorithmic trading. It would have been spread out over a longer period of time – algorithmic trading simply means that trading happens faster, and in time it lowers volatility.”

Rob Boardman, head of algorithmic sales at Investment Technology Group, says that much of the herding pattern of trading, where everyone is buying and selling the same stocks, has less to do with algorithmic trading and more to do with the fact that investment managers are using similar strategies.

He says: “Many quant [quantitative] funds not only use quant techniques to trade but also to construct portfolios and decide when to buy or sell, and many of them are taking the same market data. As they end up with the same input parameters, they often trade in similar patterns.”

Mr Boardman adds that during normal trading quant funds do not necessarily increase volatility, and that some of them actually decrease it by using reversion strategies. However, in August stock price fluctuations were so violent that most of the momentum strategies were throwing out similar signals.
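
A minimal Python sketch, with invented prices and not any fund’s actual model, illustrates the difference: fed the same violent sell-off, every momentum model emits the same sell signal, while a reversion model leans the other way.

```python
# A toy comparison of the two signal families discussed above.
# Invented prices; not any fund's actual model.

def momentum_signal(prices, lookback=5):
    """Buy if the price rose over the lookback window, sell if it fell."""
    change = prices[-1] - prices[-1 - lookback]
    return 1 if change > 0 else (-1 if change < 0 else 0)

def reversion_signal(prices, lookback=5):
    """Bet on a return to the recent mean: sell above it, buy below it."""
    window = prices[-lookback:]
    mean = sum(window) / len(window)
    if prices[-1] > mean:
        return -1  # price above its mean: expect a fall
    if prices[-1] < mean:
        return 1   # price below its mean: expect a rise
    return 0

# A violent one-way move of the kind August produced: every momentum
# model sees the same sell signal, so similar funds trade the same way.
crash = [100, 98, 95, 90, 84, 77, 69]
print(momentum_signal(crash))   # -1: momentum sells into the fall
print(reversion_signal(crash))  # +1: reversion leans against it
```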

“This, combined with the fact that many of the quant funds were facing margin calls from prime brokers, meant that many were forced into making the same trade. This was the origin behind this herd mentality witnessed in August, rather than the algorithms themselves,” says Mr Boardman.

He believes that many of the quant funds realised that they were not as risk-neutral as they thought, and were perhaps not as hedged as they presumed in terms of sector weightings. “Many quant managers have learnt that their risk controls were inadequate. They reacted to the unstable markets by reducing leverage and that unfortunately led to increased volatility as everyone was paring back the same bets. This reaction brought about the very demise that they feared.”

Alasdair Haynes, CEO of ITG International, adds that August was an exceptional market, with volatility hitting new highs of 30% and above. “Algorithms are models based on statistical data and if they hit a market that is an outlier, the deviations are always going to throw up bizarre results in exceptional circumstances.”
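
Mr Haynes’s point can be made concrete with a small sketch, using illustrative figures only: a model calibrated on quiet historic returns scores an August-style move as a many-sigma outlier, far outside the distribution its parameters were fitted to.

```python
# Illustrative figures only: a model fitted to quiet daily returns
# treats a 5% one-day drop as an extreme outlier.
import statistics

historic_daily_returns = [0.002, -0.004, 0.001, 0.003, -0.002,
                          0.005, -0.001, 0.002, -0.003, 0.004]

mu = statistics.mean(historic_daily_returns)
sigma = statistics.stdev(historic_daily_returns)

todays_return = -0.05                    # a 5% one-day fall
z = (todays_return - mu) / sigma
print(f"z-score: {z:.1f}")               # roughly -17 sigma: far outside
                                         # the fitted distribution
```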

Proceed with caution

Mr Haynes believes it is down to the algorithmic trader to realise that the distribution has changed and that strategy and course need to be altered quickly. “What the algorithmic traders have learnt over the past two months is that these tools should be used correctly, with caution, and they must be able to adapt to changing market conditions. They didn’t use algorithmic trading to cut positions very quickly and that exacerbated the problem, rather than actually reducing the risk,” he says.

ITG has developed a next-generation set of algorithms designed to cope better with volatility. Relying solely on historic data in volatile markets proved inadequate, and ITG now has a range of algorithms that can be adapted within minutes to cope with fast-changing markets.

Mr Boardman says: “We have also developed algorithms that have intrinsic hedging capabilities, with algorithms that understand the risk implied by the whole basket of stocks rather than the list of individual stocks. These new techniques adapt algorithms to use the correlation between stocks to trade in a more intelligent way.”
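
As an illustration of basket-level risk (a generic calculation, not ITG’s implementation), the correlation between stocks can make a long/short list far less risky than the sum of its individual legs suggests.

```python
# A generic basket-risk calculation, not ITG's implementation: a
# long/short pair of highly correlated stocks carries far less risk
# than the two legs viewed in isolation.
import numpy as np

weights = np.array([1.0, -1.0])      # long one stock, short the other
vols = np.array([0.25, 0.25])        # annualised volatility of each leg
corr = np.array([[1.0, 0.9],
                 [0.9, 1.0]])        # the pair moves closely together

cov = np.outer(vols, vols) * corr    # covariance matrix of the basket
basket_vol = np.sqrt(weights @ cov @ weights)

print(f"sum of leg vols: {np.abs(weights * vols).sum():.3f}")  # 0.500
print(f"basket vol:      {basket_vol:.3f}")                    # ~0.112
```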

Intelligent algorithms

Dynamic implementation shortfall (DIS) was built originally for transition managers who wanted to control large and complex portfolio trades. Users can select from a list of parameters, including cash imbalance, sector neutrality, time horizon and speed of execution, and the algorithm reacts in real time to benchmark, spread levels, volatility and liquidity to execute the portfolio over single or multiple days. DIS differs from most list-based implementation shortfall algorithms, which are static and follow a predetermined trading schedule based on historic volatility and liquidity estimates.
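
By way of illustration, here is a sketch, with invented field names rather than ITG’s actual parameters, of the kind of constraints such a portfolio algorithm exposes, and of one of them, cash imbalance, being managed in flight.

```python
# Invented field names, not ITG's actual parameters: the kind of
# constraints a dynamic portfolio algorithm lets the user set, with
# cash imbalance managed by throttling the side that has run ahead.
from dataclasses import dataclass

@dataclass
class PortfolioTradeParams:
    cash_imbalance_limit: float   # max net cash exposure, e.g. 0.02 = 2%
    sector_neutral: bool          # keep sector weightings balanced
    horizon_days: int             # execute over single or multiple days
    urgency: float                # 0 = passive, 1 = aggressive

def next_side(bought_value, sold_value, params):
    """Decide which side of the trade list may trade next."""
    gross = bought_value + sold_value
    imbalance = (bought_value - sold_value) / gross if gross else 0.0
    if imbalance > params.cash_imbalance_limit:
        return "sells_only"       # buys have run ahead; let sells catch up
    if imbalance < -params.cash_imbalance_limit:
        return "buys_only"
    return "both_sides"

params = PortfolioTradeParams(cash_imbalance_limit=0.02,
                              sector_neutral=True,
                              horizon_days=1, urgency=0.5)
print(next_side(5_300_000, 4_700_000, params))  # 6% imbalance: "sells_only"
```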

“DIS has technology that understands the relative sector weightings within a trade list and can actually manage them down automatically. It recognises the sector differences and responds to trading by hedging appropriately. No algorithm is ever going to completely remove risk, especially in volatile markets, but the understanding of how to build algorithms to cope with outlier cases has improved tremendously,” says Mr Boardman.

During the trading crunch in August, it was reported that volume in dark pools soared, implying that as volatility increases, traders are less inclined to show their hand. Two new algorithms from UBS use a price improvement network and other dark pools and non-displayed liquidity to minimise market impact.

Tim Wildenberg, managing director and head of direct execution services, Europe, at UBS, says algorithmic trading should not be blamed for the recent volatility spikes; they had more to do with the nature of the positions the quant funds were running.

He says: “There are two kinds of algorithmic trading: direct execution services [services provided to clients to help them execute more efficiently] and proprietary trading technology, used by hedge funds and banks that play the market in order to generate alpha. It is the latter that suffered during the subprime crisis. The models used tend to aggregate and come up with similar strategies, so that when there is a market move it is exaggerated by everyone unwinding at the same time, creating volatility.”

Volume trading

In making the distinction, he adds that many of the quant funds do not use algorithms to execute trades but use pure direct market access (DMA) systems designed for high-volume, rapid access; UBS saw large volumes go through its DMA infrastructure in August.

The algorithms used to execute trading strategies, as opposed to those used to drive investment decisions, are already moving to next-generation versions. Mr Wildenberg believes that algorithms relying only on a stock’s historic volume curve would struggle in today’s markets, which deviate from the norm.
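
A small sketch, with illustrative numbers only, makes the weakness concrete: an algorithm slicing to a historic volume curve keeps its pre-set pace on a record day, while a real-time version tracks the volume actually printing.

```python
# Illustrative numbers only: a static slicer fixed to a historic
# volume estimate versus one recomputed from the live tape.

historic_interval_volume = 1_000_000   # expected market volume per interval
target_participation = 0.05            # aim to be 5% of market volume

def static_slice():
    # Fixed up front from the historic estimate.
    return historic_interval_volume * target_participation

def adaptive_slice(observed_interval_volume):
    # Recomputed each interval from the volume actually printing.
    return observed_interval_volume * target_participation

# An August-style interval: volume triples versus the historic norm.
live_volume = 3_000_000
print(static_slice())               # 50,000 shares: drops to ~1.7% of volume
print(adaptive_slice(live_volume))  # 150,000 shares: holds the 5% target
```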

He says the support model behind algorithmic trading is crucial: the algorithms need to be controlled. He adds: “Algorithms now provide a real-time review of the market, tracking volatility and market depth constantly and reacting to them to provide best execution. This is overlaid by traders and technicians who are tuning the algorithms in real time.

“What we have learnt from August is that our clients are much more interested in what we are doing in terms of real-time tweaking and modification.”

Mr Wildenberg adds that volume-proofing work carried out at UBS earlier in the year paid off in August. Following the new peaks in market volumes, which have set fresh records for back-testing purposes, the bank is reviewing its back-up and spill-over infrastructure further to improve data processing.

UBS has also just launched a set of next-generation smart algorithms for execution strategies in response to the imminent arrival of the Markets in Financial Instruments Directive (MiFID), which is expected to generate multiple new pools of liquidity for stocks.

UBS Tap is a liquidity-tapping algorithm, designed to seek out liquidity and take advantage of the market fragmentation that is expected to follow MiFID. It uses UBS’s crossing capabilities so that it minimises market impact by rarely putting shares out on the order book.

Mr Wildenberg says: “There are a number of settings for Tap. Its most passive setting, Tap One, only looks for hidden liquidity, but at its highest setting, Tap Now, it becomes a very aggressive liquidity-seeking algorithm, and will be cunning about how much it leaves so that other sellers will not know if there is a big buyer in the market.”
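
Tap’s internals are proprietary, but the passive-to-aggressive dial Mr Wildenberg describes can be sketched generically, with invented names and thresholds:

```python
# A generic sketch of a passive-to-aggressive dial; the names and
# thresholds are invented, not UBS's actual logic.

def route_order(shares, setting):
    if setting == "most_passive":
        # Rest only in non-displayed venues; show nothing on the book.
        return {"venues": "dark_only", "display_qty": 0,
                "cross_spread": False}
    if setting == "most_aggressive":
        # Take liquidity wherever it appears, but display only a
        # sliver so the size of the parent order stays hidden.
        return {"venues": "all", "display_qty": min(shares, 200),
                "cross_spread": True}
    # Middle settings blend the two behaviours.
    return {"venues": "all", "display_qty": min(shares, 1_000),
            "cross_spread": False}

print(route_order(50_000, "most_passive"))
print(route_order(50_000, "most_aggressive"))
```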

Quick reflexes

Part of the problem in August was the speed at which algorithms could react to changing market conditions. The production of news by wire services Reuters and Dow Jones in a format that algorithms can ‘read’ and respond to faster than humans could help to make next-generation algorithms smarter.

Kirsti Suutari, global business manager of algorithmic trading at Reuters, says: “The machines have only executed on the models built, so it is more about the research and the input to the models that caused these market circumstances.”

Richard Brown, global business manager for Reuters NewsScope, adds that because the subprime crisis was a unique situation, there was no way for news algorithms to know what to do, which is why human intervention was needed. He says: “Pricing data is something that quantitative researchers have much experience with, but the analysis of news quantitatively, rather than just qualitatively, is an emerging area that will offer new data sets for algorithmic traders to differentiate their models.”
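
How such a feed might be consumed can be sketched with an invented event format (not the real NewsScope schema): each story arrives as tagged, scored data a model can act on in milliseconds, long before a human has read the headline.

```python
# An invented event format, not the real NewsScope schema: each story
# arrives as tagged, scored data a model can act on in milliseconds.

def on_news_event(event, positions):
    """React to one machine-scored news event."""
    ric = event["ric"]                 # instrument code, e.g. "IBM.N"
    if ric not in positions:
        return "ignore"
    if event["relevance"] < 0.8:
        return "ignore"                # the story barely concerns the stock
    score = event["sentiment"]         # -1.0 (very bad) .. +1.0 (very good)
    if score < -0.5:
        return "reduce_position"       # strongly negative, highly relevant
    if score > 0.5:
        return "add_to_position"
    return "hold"

event = {"ric": "IBM.N", "sentiment": -0.7, "relevance": 0.95}
print(on_news_event(event, {"IBM.N": 10_000}))  # reduce_position
```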
