The Theory of Everything – and TCA
In the Oscar-winning film The Theory of Everything, Stephen Hawking lays out his vision of a single equation that would explain all physical aspects of the universe.
The scientist explains in lay terms the challenges of integrating the two broad areas of theoretical physics that have emerged over the last century – Einstein’s general relativity and quantum field theory (analysing the properties and effects of sub-atomic particles) – into one over-arching ‘Theory of Everything’. One approach looks at very broad aspects of the universe and space and time, while the other focuses on infinitesimally small objects as the basis for broader theories and interpretation.
This rarefied scientific debate has echoes in the more prosaic world of Transaction Cost Analysis in financial markets, where the availability of more granular data coupled with pressure from regulators is driving a whole new wave of research and analysis, writes Michael Sparkes.
There is a risk that some will assume these latest tools can answer every question on trading costs and best execution. This is clearly not the case: a combination of analytical methods is vital.
Traditionally TCA was conducted at a relatively high level, focusing on the outcome of orders and looking at the implicit costs incurred by price movements caused by market impact or by delays in the execution process (as distinct from explicit costs such as commissions).
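The implicit cost described above is typically expressed as an implementation shortfall against the price prevailing when the investment decision was made. A minimal sketch of that calculation, with hypothetical prices, might look like this:

```python
# Illustrative implementation shortfall calculation (prices hypothetical).
# Implicit cost is measured against the decision price, separately from
# explicit costs such as commissions.

def implementation_shortfall_bps(decision_price, avg_exec_price, side):
    """Implicit cost in basis points relative to the decision price.

    side: +1 for a buy (paying up is a cost), -1 for a sell.
    A positive result means value was lost to impact or delay.
    """
    return side * (avg_exec_price - decision_price) / decision_price * 10_000

# A buy decided at 100.00 but filled at an average of 100.15
# has cost 15 bps before commissions.
cost = implementation_shortfall_bps(100.00, 100.15, side=+1)
print(round(cost, 2))  # → 15.0
```

The same sign convention works for sells: a sale decided at 50.00 but executed at 49.90 also shows a positive cost, of 20 bps.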
Every institution has an investment process, which forms a sort of investment DNA for everything it does. It is reflected in activities such as portfolio construction, stock selection, decision timing and trading strategies. Some firms are value-oriented and incur relatively low transaction costs, as they are typically trading against the consensus. Others are more event-driven and momentum-oriented; inherently they need to trade more quickly than others, incurring higher impact costs in order to capture as much alpha as possible before others do so. Similarly some portfolios are made up of many small positions which can be easily and cheaply traded, while others consist of fewer positions which may be highly illiquid, and cannot be readily and quickly traded without severe loss of value.
All of this should be reflected in the approach to TCA a firm employs, and the metrics used to monitor efficiency.
There have been calls in some quarters for a standardised approach to TCA; such thinking should be firmly resisted, given the wide range of needs and types of analysis. High-level analysis must take into account many aspects of the underlying investment process, since costs will be linked to factors beyond the control of the trader.
But a whole new level of complexity has been introduced to European markets, starting with the fragmentation of trading that followed the first MiFID set of regulations in 2007. At the same time, new trading systems allowed asset managers to record and analyse details of every fill generated by their orders. With algorithms often slicing a large order into very small pieces, a single order can produce thousands of separate executions. The final element of complexity has been the use of data tags to track and report on these individual fills.
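The kind of roll-up this fill-level tagging enables can be sketched simply. The field names and figures below are hypothetical; in practice the tags would come from execution records such as FIX messages:

```python
# A minimal sketch of fill-level aggregation, assuming each child fill
# carries tags for its parent order, venue and quantity
# (field names and data are hypothetical).
from collections import defaultdict

fills = [
    {"order_id": "A1", "venue": "XLON", "qty": 400, "price": 100.10},
    {"order_id": "A1", "venue": "BATE", "qty": 300, "price": 100.12},
    {"order_id": "A1", "venue": "XLON", "qty": 300, "price": 100.11},
]

# Roll the algo's child fills back up to the parent order, per venue,
# to see where the order actually executed.
venue_qty = defaultdict(int)
for f in fills:
    venue_qty[f["venue"]] += f["qty"]

total = sum(venue_qty.values())
for venue, qty in sorted(venue_qty.items()):
    print(venue, f"{qty / total:.0%}")  # → BATE 30% / XLON 70%
```

With real orders the list would run to thousands of fills, but the principle is the same: the tags make it possible to reconstruct exactly how and where an order was worked.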
Together these factors have driven a rapid evolution in analytical approaches which have taken on added urgency since the publication of the FCA’s Thematic Review on Best Execution and the final draft of the proposed MiFID II regulations. These stipulate that investors must not just monitor the venues on which their trades are being executed, but must describe the steps they take in the choice of those venues and their strategies to achieve best execution. This will entail much more precision and forensic analysis of the tactics used at the most granular levels in terms of sizes, timing and venues of trades.
The linking of venues to execution strategies is more than coincidental – indeed, it is crucial: the way in which an algo is designed to route an order is inextricably linked to the execution strategy selected. This may be a fixed participation strategy, or liquidity-seeking, or aimed at trading only in the so-called dark pools or crossing networks. Each strategy will tend to execute in different venues, or in different sequences, or in different volumes at different times. Hence it is essential for the latest applications of TCA to link the analysis of venues to that of execution strategies.
With this new granularity of data, new metrics also come into play. Simple average price or implementation shortfall calculations may not be as relevant when comparing venues; shorter-term statistics on reversion or spread capture may be more revealing. Similarly, the number and sequence of venues used can be analysed, as can the costs or benefits of trading in lit or dark venues.
Traders now regularly use such data to monitor the ways in which their brokers execute their orders, for instance in the differing patterns of behaviour of smart order routers or algo strategies. This data can also be used to predict the most efficient way to execute a given order type, so, as with more traditional approaches, post-trade data can be a vital input to pre-trade decision making.