Insights from High Frequency Trading World

Wednesday, December 08, 2010

Sean Hendelman, CEO of T3Live, was featured on a panel at the High Frequency Trading World Conference in New York yesterday. Sean runs high frequency trading strategies at T3 Capital Management, a separate company affiliated with T3Live. The event was the largest HFT conference I've attended so far, seemingly pointing to the increasing interest in the business and, in particular, the greater awareness of HFT following the May 6th flash crash. There was even a panel of strictly institutional investors, where the broad conclusion seemed to be: we have no idea what HFT is doing to us. I also met a venture capitalist at the event who is looking to break into the business by funding a shop or strategy. Market participants all across the spectrum are recognizing the prevalence and staying power of HFT and looking to 'get in on the action'.

My takeaway from the event

If I had one key takeaway from the event it would be this: computers are faster than human traders. It is amazing how crystal clear this message has become as I listen to HFT practitioners speak about their businesses. As I have written in the past, the manual side of the business requires more sophistication than ever as HFT has come to dominate the efficiency-creating spectrum, an arena daytraders found profitable in the past. With HFT now driving fantastic efficiency in US equity markets in terms of short-term supply/demand pricing, profitability in scalp-style daytrading has diminished. As a general rule, traders continuing to find success have been expanding their timeframes to profit from inefficiencies over longer periods. Below is my summary of Sean's panel; let us know what you think!

Volume and Latency Concerns: The Need for Speed

Moderator: John Cogman, VP, Autobahn Sales, Deutsche Bank Securities
Adam Afshar, President and CEO, Hyde Park Global Investments
Chris Bartlett, Director, Nobilis Capital
Sean Hendelman, Chief Executive Officer, Managing Partner, T3 Capital Management
Tim Cox, Director of Execution Services, New York, Bank of America - Merrill Lynch
Feargal O'Sullivan, Head of Trading Solutions, Americas, NYSE Technologies

The motives for latency reduction

The discussion started with participants giving their thoughts on the motives for latency reduction: why do you want to be the fastest, and how important is latency? Sean took the lead, stating that low latency is a key component of his business and that he strives to achieve the lowest possible latency for his strategies. The barriers to entry are quite high, with costs that many cannot afford when getting started. Bartlett chimed in to say that latency requirements are a product of the particular strategy employed. For any given strategy, the relevant question is: can you get the fill you need? How that question is answered will drive your latency needs.

Afshar, CEO of Hyde Park Global, a 100% robotic trading firm built on artificial intelligence, went about re-defining the goal of HFT. He explained that the purpose is to deal with the non-linearity of price data by splitting it into chunks over time intervals small enough that each piece can be modeled linearly. He believes the limiting factor for HFT will not be technology, which is falling in price over time, but rather strategies, which are becoming more intellectually demanding. The real challenge is finding the talent capable of creating profitable strategies.
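
To make that idea concrete, here is a minimal Python sketch of a piecewise approach: a price series that is non-linear over long horizons is broken into short windows, and a simple linear model is fit to each one. The window size and the synthetic data are my own assumptions for illustration, not anything Hyde Park Global actually runs.

import numpy as np

def piecewise_linear_fits(prices, window=100):
    # Fit an ordinary least-squares line to each non-overlapping window,
    # treating the series as locally linear even though it is non-linear overall.
    fits = []
    for start in range(0, len(prices) - window + 1, window):
        chunk = prices[start:start + window]
        t = np.arange(window)
        slope, intercept = np.polyfit(t, chunk, deg=1)
        fits.append((start, slope, intercept))
    return fits

# Synthetic, non-linear price path used purely for illustration
t = np.linspace(0, 10, 1000)
prices = 100 + 5 * np.sin(t) + np.random.normal(0, 0.2, t.size)
for start, slope, intercept in piecewise_linear_fits(prices):
    print(f"window at {start}: local slope {slope:+.4f}")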

Bartlett threw some numbers into the debate to define the HFT playing field. Ultra-low latency is largely needed for capitalizing on fleeting arbitrage opportunities and typically lives in the 100-200 microsecond range. Normal HFT players, the likes of liquidity providers and rebate traders, sit somewhere around 500 microseconds. The accuracy of these numbers can certainly be debated, but for the average daytrader, suffice it to say these speeds are far faster than the human eye can see and interpret the Level II.

Afshar once again reiterated his belief that, in the end, it will simply be a matter of the smartest people winning the race. He used GETCO, the largest electronic trading firm in the world, as an example. Founded by two guys in Chicago in 1999, just over a decade ago, the firm quickly became a powerhouse in the liquidity provision space because they had the best ideas. While they are certainly a major technology user, their business grew from nothing into something because of programming ingenuity.

The keys to the technology itself

Sean took hold of the discussion on the technology angle, building on O'Sullivan's clarifying remarks about the pyramid of HFT players. The top of the pyramid is a very small, select group that engages in ultra-high frequency trading, while below it sit the vast majority of traders who do not strive to be the absolute fastest in the market. Sean argued that while technology costs may be falling over time, maintaining a place at the top of the pyramid continues to require substantial investment. The goal when designing a system architecture is to keep it light and flexible. Large institutions often struggle with dinosaur technology because changing one piece requires updating many other pieces. In order to remain adaptable, technology must be kept as light as possible. O'Sullivan added that the need for technology is driven not only by strategy complexity but also by the efficiency of the code itself in executing the operations.

Regulatory impact on HFT operation

Afshar explained how he thinks about the goal of technology: reducing friction costs. He contended that lowering transaction costs and reducing the time in which money is tied up are the key purposes of buying new technologies. Regulations, in effect, impose arbitrary friction costs on traders.

Sean jumped in to say that regulation isn't really a major problem at the current time. Most HFT firms are already going through the process of pre- and post-trade checks through their brokerages. The checks on the order side are extremely rapid and nothing to worry about, yet the quote side has significant room for improvement. As Bartlett added, there is simply a ton of data to receive and interpret on the quote side, which slows processing down.
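
For readers unfamiliar with what an order-side check actually looks like, here is a toy Python sketch of the kind of pre-trade check Sean is describing. The specific limits and field names are my own assumptions for illustration, not any broker's actual rules, and real implementations are far faster and more involved.

MAX_ORDER_SHARES = 10_000
MAX_NOTIONAL = 500_000.0
PRICE_COLLAR = 0.05  # reject orders more than 5% away from the last print

def pre_trade_check(shares, limit_price, last_price):
    # Returns (accepted, reason); kept deliberately cheap so it adds minimal latency.
    if shares <= 0 or shares > MAX_ORDER_SHARES:
        return False, "share size outside allowed range"
    if shares * limit_price > MAX_NOTIONAL:
        return False, "notional value exceeds limit"
    if abs(limit_price - last_price) / last_price > PRICE_COLLAR:
        return False, "price outside collar around last trade"
    return True, "accepted"

print(pre_trade_check(500, 25.10, 25.00))   # accepted
print(pre_trade_check(500, 27.00, 25.00))   # rejected by the price collar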

Measuring latency

Some very intriguing discussion on measuring latency ensued. Bartlett spoke about the ineffectiveness of using the Windows clock because it is only precise down to about 15 milliseconds. If one uses exchange timestamps instead, they're all different and simply don't match up because there is no universal clock they can be synced to efficiently.
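
As a rough illustration of the clock-granularity problem, the short Python sketch below measures the smallest step each clock actually reports; on a typical Windows box the system clock moves in coarse ticks while the high-resolution counter is far finer. This is a generic sketch of the measurement idea, not the tooling any of the panelists use.

import time

def observed_tick(clock, changes_needed=5, max_reads=5_000_000):
    # Read the clock repeatedly and record the smallest non-zero step it reports.
    smallest = None
    prev = clock()
    changes = 0
    for _ in range(max_reads):
        now = clock()
        if now != prev:
            step = now - prev
            smallest = step if smallest is None else min(smallest, step)
            prev = now
            changes += 1
            if changes >= changes_needed:
                break
    return smallest

print(f"system clock tick:       {observed_tick(time.time) * 1e6:.1f} microseconds")
print(f"high-resolution counter: {observed_tick(time.perf_counter) * 1e6:.3f} microseconds")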

Sean changed the discussion a bit and offered HFT participants the easiest way to measure the latency of various vendors and exchanges: build two copies of the exact same strategy and run them simultaneously, with the single variable being a particular vendor. Whichever copy gets the print is the fastest; no more discussion needed.
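
A back-of-the-envelope version of that head-to-head test might look like the sketch below: tally, signal by signal, which copy of the strategy actually captured the fill. The record format and sample results are hypothetical, purely to illustrate the scoring.

from collections import Counter

def score_race(fill_log):
    # fill_log: (signal_id, winner) pairs, where winner is "A" or "B"
    # depending on which copy of the strategy got the print.
    wins = Counter(winner for _, winner in fill_log)
    total = sum(wins.values())
    for route in ("A", "B"):
        share = wins[route] / total if total else 0.0
        print(f"route {route}: {wins[route]} fills ({share:.0%})")

# Hypothetical session where copy A runs on the vendor being evaluated
score_race([(1, "A"), (2, "A"), (3, "B"), (4, "A"), (5, "A")])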


Brandon R. Rowley
"Chance favors the prepared mind."