Quant: Of arms and the market
High frequency trading not only borrows military technology, it also looks like an arms race on a global liquidity battlefield. Stuart Baden Powell asks if innocent bystanders are getting hurt
High frequency trading (HFT) is a subset of automated trading and is thus both algorithmic and quantitative in approach - driven by models that attempt to strip out small increments of alpha from a multitude of opportunistic probability-based forecasts. While frequently conducted by the proprietary trading desks of bulge-bracket banks, it is independent proprietary trading houses such as Getco and Optiver that draw the most attention. HFT firms execute large numbers of trades at high speed, have holding periods on the order of milliseconds and end the day flat, with marginal, if any, overnight LIBOR-based carry risk.
The consultancy TABB Group notes that HFT accounts for approximately 60% of US and 35% of European total cash equities volume and generates tens of billions of dollars in annual net profits and unusually high risk-adjusted returns.
Key to the rise of HFT were two major pieces of legislation that changed the way cash equities were traded: Regulation National Market System (NMS), which came into force in the US in 2005, and the Markets in Financial Instruments Directive (MiFID) for Europe in 2007.
The central point of both was to foster competition among execution venues and, subsequently, both the US and EU experienced a ‘fragmentation’ of volume as market share migrated to ‘start-up’ venues. The alternatives differentiated themselves on speed and price, with the majority offering ‘co-location’ facilities (the practice of locating the HFT server in the same data centre as the execution venue to reduce the ‘latency’, or delay, between messages) and paying a rebate to those who post passive liquidity.
Today the London Stock Exchange accounts for approximately 55% of FTSE 100 trading and the NYSE just 25% of NYSE-listed names. Although this fragmentation is still in progress, to date the net result has been that firms with faster connections to multiple trading venues can utilise key information to trade ahead of those without. HFT firms have been at the forefront of this change and hold equity stakes in many alternative venues; the profit opportunities - and the conflicts of interest - are intriguing.
HFT techniques are often split into three categories: market making, statistical arbitrage and latency arbitrage. However, we would argue that latency arbitrage is not a distinct group but rather a component of the operational layer underpinning the other two categories.
Market making and statistical arbitrage can be seen as synonyms for the more traditional inventory and information models of trading. Increasingly, though, with HFT able to move faster than others across multiple venues, and with the importance of trading off ‘tick-level’ data, market making can be significantly more profitable with information components attached, blurring the separation between the two.
To provide a practical example, the London Stock Exchange recently suffered a four-hour outage, during which many expected trading to migrate to an alternative venue such as Chi-X or BATS. In the event, many buy-side trading desks preferred to cease trading. These two major alternatives are heavily utilised by HFT, and yet spreads widened significantly with virtually no volume migrating; it was as if the HFT firms that ‘make markets’ were unable to trade higher volume in either capacity without incoming institutional order flow and the information it provides. Beneath this layer, we have heard some firms propose that ‘co-location’ for the broker removes the risks of latency arbitrage. However, the issue at stake is more complex than speed in isolation and instead involves a range of sophisticated mathematical applications to successfully counteract HFT.
Dropping a level, HFT makes extensive use of parallel computing on multi-core processors, as well as borrowing from other industries, particularly defence, for components of artificial intelligence and algorithmic decision-making. For example, high signal-to-noise separation is desired in both military and trading applications: models such as ‘logistic regression’ are often used to identify real signals within high-noise environments, be that multi-venue trading or a battlefield with multi-target tracking systems; both aim for near-zero false-alarm rates. Military examples include the well-publicised ‘Predator’ drones.
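To make the technique concrete - as a minimal sketch only, not any firm's actual model - a logistic regression can be trained to assign a probability that an observed reading is a genuine signal rather than background noise. The feature, data and parameters below are all invented for illustration:

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(samples, labels, lr=0.5, epochs=2000):
    """Fit logistic regression weights and bias by batch gradient descent."""
    n_feat = len(samples[0])
    w = [0.0] * n_feat
    b = 0.0
    for _ in range(epochs):
        gw = [0.0] * n_feat
        gb = 0.0
        for x, y in zip(samples, labels):
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            err = p - y  # gradient of the log-loss w.r.t. the logit
            for i in range(n_feat):
                gw[i] += err * x[i]
            gb += err
        for i in range(n_feat):
            w[i] -= lr * gw[i] / len(samples)
        b -= lr * gb / len(samples)
    return w, b

# Synthetic data: 'signal' events cluster around +1, noise around -1.
random.seed(0)
X = [[random.gauss(1.0, 0.5)] for _ in range(50)] + \
    [[random.gauss(-1.0, 0.5)] for _ in range(50)]
y = [1] * 50 + [0] * 50

w, b = train(X, y)
p_signal = sigmoid(w[0] * 1.0 + b)   # probability a reading of +1.0 is real
p_noise = sigmoid(w[0] * -1.0 + b)   # probability a reading of -1.0 is real
```

The same mechanics scale to many features - order arrival rates, quote imbalances and so on - which is what makes the classifier useful in a high-noise, multi-venue environment.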
Another HFT application is the Kalman filter and its extended variant, also used in the navigation and guidance systems of Tomahawk cruise missiles. As part of the ‘data fusion’ family of algorithms, the filter is deployed to integrate data from multiple sources (execution venues) and combine that information to draw statistical inferences. In effect, the filter estimates the real internal state of a linear dynamic system - such as the market - from a series of noisy measurements, and acts as a secondary signal overlay for logistic regression. In the world of HFT, high numbers of small orders - ‘noise liquidity’ - are deployed as an aggressive and/or defensive technique, distorting the market for those without suitable counter-trading skill. The surveillance department at the French regulator, the AMF, discovered that in April 2010 three firms accounted for 39.6% of CAC-40 orders, yet cancelled 96.5% of those orders.
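In its simplest one-dimensional form, the data-fusion idea can be sketched as follows - a toy filter, with assumed noise parameters, that fuses noisy mid-price readings from several venues into one estimate of the underlying ‘true’ price:

```python
def kalman_predict(x, p, q):
    """Time update: the state may drift, so process noise q inflates the
    variance p of the current estimate x."""
    return x, p + q

def kalman_update(x, p, z, r):
    """Measurement update: blend the prior estimate x (variance p) with a
    new observation z whose noise variance is r."""
    k = p / (p + r)          # Kalman gain: how much to trust the new reading
    x = x + k * (z - x)      # shift the estimate towards the observation
    p = (1.0 - k) * p        # uncertainty shrinks after each update
    return x, p

# Quotes for the same instrument arriving from three venues.
quotes = [100.02, 99.97, 100.05]

x, p = 100.0, 1.0            # diffuse prior: estimate and its variance
for z in quotes:
    x, p = kalman_predict(x, p, q=0.001)
    x, p = kalman_update(x, p, z, r=0.01)
# x is now a fused price estimate; p measures the remaining uncertainty
```

Each pass tightens the variance, which is the filtering property that lets a model discount ‘noise liquidity’ while still reacting to genuine price moves.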
A final key underlying model to note is Bayes’ theorem: utilised by HFT for probabilistic forecasting, it operates under the rubric of ‘genetic algorithms’. These are able to raise their internal confidence level and alter trading decisions as inputs change - in effect, to adapt to a dynamic environment. Factor in Markov switching models for cross-market arbitrage opportunities, and neural networks with their simplified process structures to further speed up execution, and HFT deploys models that can infer a profit probability of close to 1.
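The Bayesian updating described above can be sketched in a few lines. In this hypothetical example, a strategy revises its confidence that a price move is genuine as each new piece of evidence - a fill, a quote change - arrives; the likelihood figures are invented purely for illustration:

```python
def bayes_update(prior, p_obs_if_real, p_obs_if_noise):
    """Posterior P(real move | observation) via Bayes' theorem."""
    num = p_obs_if_real * prior
    den = num + p_obs_if_noise * (1.0 - prior)
    return num / den

confidence = 0.5  # agnostic prior: real move and noise equally likely

# Each observation is (P(obs | real move), P(obs | noise)).
observations = [(0.8, 0.3), (0.7, 0.4), (0.9, 0.2)]
for p_real, p_noise in observations:
    confidence = bayes_update(confidence, p_real, p_noise)
# after three consistent observations, confidence is well above the prior
```

Run sequentially, the posterior from one tick becomes the prior for the next - which is how a model's confidence can climb towards 1 as corroborating evidence accumulates.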
The area of HFT is relatively nascent in academic terms, with few notable studies available, and HFT activity can be difficult to detect. That said, a body of increasingly credible research is forming. A widely quoted study by Jonathan Brogaard noted that HFT "dampens intraday volatility" and "adds substantially to the price discovery process". However, the reliability and validity of those results have recently been called into question. Robert Jarrow of Cornell University and Philip Protter of Columbia University found that HFT activities "move the market price away from the fundamental value and increase the return's volatility [as well as] market volatility". Frank Zhang of Yale University found that HFT is "positively correlated with stock price volatility", "hinders price discovery" and overall "generates some harmful effects for the US capital market".
With regard to the institutional portfolio, total expense ratios have risen not only in real terms but also through less visible means - such as when market-impact components are rarely factored into transaction cost analysis (TCA) tools and as such never reach the bottom line. Most TCA measures score trades that were completed, not those that weren't. This lacuna in cost tracking is related to ‘noise trading’ and is termed ‘disappearing liquidity’; it is what the SEC and CFTC report on the 6 May ‘flash crash’ was referring to when it noted that "high trading volume is not necessarily an indicator of market liquidity". Indeed, as Invesco's global head of trading, Kevin Cronin, noted to an audience in London recently, "some of these orders have no intention to trade and it seems to me to be manipulation". Ultimately, trading is a zero-sum game and, as such, alpha migrates from the pension fund to the HFT firm - or, as Nobel laureate Paul Krugman puts it: "It's a kind of tax on investors".
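The gap in cost tracking is easy to quantify with a Perold-style implementation-shortfall calculation - a simple sketch with illustrative numbers, not a production TCA tool. A fills-only measure sees just the execution cost; the opportunity cost of the shares that never filled, the ‘disappearing liquidity’, goes unrecorded:

```python
def shortfall(decision_px, avg_fill_px, close_px, ordered, filled):
    """Total cost per share ordered versus a paper portfolio (for a buy):
    execution cost on the filled shares plus opportunity cost on the rest."""
    execution_cost = filled * (avg_fill_px - decision_px)
    opportunity_cost = (ordered - filled) * (close_px - decision_px)
    return (execution_cost + opportunity_cost) / ordered

# 100,000 shares wanted at a decision price of 50.00; liquidity vanished,
# only 60,000 filled at an average of 50.05, and the stock closed at 50.40.
cost = shortfall(50.00, 50.05, 50.40, ordered=100_000, filled=60_000)

fill_only = 50.05 - 50.00   # what a fills-only TCA measure would report
```

Here the unfilled 40,000 shares contribute the bulk of the true cost, yet a TCA tool that scores only completed trades reports a fraction of it.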
The impact on the portfolio is one concern. However, there is a wider impact of HFT on the economics of the capital market. Secondary markets exist to allow the efficient raising of capital and to allocate it to its most productive use. But as another Nobel prize-winning economist, Kenneth Arrow, noted in 1973: "Speculation based on private information imposes a ‘double social loss', by using up resources and undermining markets".
As for possible next steps, some, like French finance minister Christine Lagarde, have called for HFT to be banned; some exchanges and investment banks that provide services to HFT have called for little, if any, alteration; and European and US regulators may well address some concerns shortly. Computers have taken centre stage in the trading space, HFT has engaged in intra-day speculation and, definition depending, some would say computers themselves are now capable of making ‘investments’. For now, at least, no matter how sophisticated the algorithms are reported to be, the more traditional side of the community must use improved techniques to ensure that trading costs are minimised and that the wider secondary market remains strong, fair and stable for the benefit of the long-term investor.
Stuart Baden Powell is head of European electronic trading strategy at RBC Capital Markets