WRG High Frequency Trading: Bookstaber on The Future of Finance

Wednesday, October 27, 2010

I had the privilege of attending the World Research Group's Buy-Side Tech: Global Equities Trading Summit. Longtime readers will remember I attended this conference five months ago and wrote up my thoughts then (Day One & Day Two). With different speakers and topics, this second time around was just as interesting and enlightening. This post will cover the first speech of the conference, with more posts to come this week.

What lies ahead in financial markets

The conference's Keynote Speaker was Rick Bookstaber, Senior Policy Advisor to the SEC. Bookstaber is a brilliant and influential mind (I discussed his derivative market reform ideas a while back). There was so much great content in his speech titled "The Future of Finance" that I am at a loss where to start.

Bookstaber began his speech by tracing the history of information proliferation. He articulated that we are living in a new age where firms "hide information in plain sight". While disclosure rules are effective in forcing companies to release information, Bookstaber rhetorically asked, "who actually reads (or believes) a company prospectus?" He told a story of surfing: as a novice, he paddled out at a beach and tried to surf. After nearly killing himself, he was warned by a much more experienced surfer that the way the waves break on that particular beach leads to a very dangerous end of the run if you don't know what you're doing. This helpful gentleman recommended that Bookstaber move down the coast to another, safer beach. There were warning signs at the beach, but Bookstaber ignored them, exactly as we all do now that we've become immune to the relentless disclosures of risk. This is certainly a concept the SEC must study, because disclosure is useless if no one reads it or if firms muddy the waters by simply listing every imaginable risk.

A new risk to companies is the concept of "viral information flow". Firms can no longer control with any degree of certainty the news that travels around about their businesses. We don't know what will go viral and what will never gain traction.

Bookstaber's explanation of the flash crash

Bookstaber briefly explained what he believes happened during the flash crash and likened it to the 1987 crash. An exogenous shock exposed the structural deficiencies in the market. Like the portfolio insurance of 1987, today's market was filled with many retail stop loss orders and aggressive high frequency traders. With markets now moving faster than humans could intervene, algorithms simply hit the next bid in the book. The computers cannot read intention, so they hit bids at silly levels, literally down to a penny in some stocks.
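The mechanics of "hitting the next bid in the book" can be sketched in a few lines. This is my own toy illustration, not Bookstaber's model; the prices, sizes, and the penny "stub" bid are all hypothetical, but they show how a market sell order walks down a thin book when no one steps in.

```python
def walk_the_book(bids, sell_qty):
    """Fill a market sell order against a list of (price, size) bids,
    best price first. Returns the list of (price, quantity) fills."""
    fills = []
    for price, size in sorted(bids, key=lambda b: -b[0]):
        if sell_qty <= 0:
            break
        take = min(size, sell_qty)  # take whatever liquidity is posted
        fills.append((price, take))
        sell_qty -= take
    return fills

# A thin, hypothetical book: a few bids near the market,
# then nothing until a stub bid sitting at a penny.
book = [(40.00, 500), (39.95, 200), (39.50, 100), (0.01, 10_000)]
fills = walk_the_book(book, 1_000)
# The first 800 shares fill near $40; the last 200 trade at $0.01,
# because the algorithm simply takes the next bid, however silly.
```

With a deep book the same order would never get near the stub; it is the thinness of the intermediate levels that turns an ordinary sell into a penny print.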

The real question then is, why is the order book so thin that this could happen? Bookstaber postulates that decimalization is a primary reason. Market makers need to collect spreads in order to be willing to take on the risk of assuming positions. He thinks minimum increments in stocks should reflect the "true" spreads and allow market makers the ability to profit. With a reliable business model for profiting from spreads, regulators could increase market makers' obligations in liquidity provision. Another idea would be to give priority to liquidity providers willing to commit to staying on the book for a certain amount of time.

Algorithmic shredding: the tactic of the moment

A strategy now pervasive among HFT desks is termed algorithmic shredding. The goal of the algorithm is to shred information that would otherwise have been useful to other market participants. The term refers both to breaking up large orders into small, individual orders and to using smoke screens of false bids and offers and other games to shield intent.
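The first half of that definition, breaking a parent order into small child orders, can be sketched simply. This is a hypothetical illustration of the idea, not any desk's actual algorithm; the size parameters are invented, and real implementations also randomize timing and venue.

```python
import random

def shred_order(total_shares, min_child=50, max_child=200):
    """Slice a large parent order into randomized child orders so the
    full size is never visible on the book at once."""
    children = []
    remaining = total_shares
    while remaining > 0:
        # Randomizing child size makes the slices harder to stitch
        # back together into the true parent order.
        child = min(remaining, random.randint(min_child, max_child))
        children.append(child)
        remaining -= child
    return children

children = shred_order(10_000)
# Same total size, but revealed to the market only in small pieces.
```

The information other participants lose is exactly what a block trade used to reveal: that a single institution wants 10,000 shares.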

Bookstaber's discussion of this topic got me thinking about game theory in economics. I recently listened to an interview on block trading and equity market liquidity with Kurt Kujawa, Head Trader at Cortina Asset Management. Kujawa laments the declining use of block trades among institutions and harshly criticizes the change to "young kids just throwing their orders into an algorithm" for execution.

It occurred to me that in the relatively transparent world of the past, the early adopters of algorithms likely found better execution. Yet now that everyone has adopted algorithms, transparency is gone and blocks are only found in dark pools. When everyone is anonymous and only out for themselves, we may end up with worse execution for all.

Don't worry too much about May 6th

Bookstaber is not particularly worried about the implications of May 6th. In his book, A Demon of Our Own Design: Markets, Hedge Funds, and the Perils of Financial Innovation, he outlines products that can create a crisis. Systems that have tight coupling and complexity are prone to 'normal accidents'. He defines tight coupling on his blog:
Tight coupling is a term I have borrowed from systems engineering. A tightly coupled process progresses from one stage to the next with no opportunity to intervene. If things are moving out of control, you can’t pull an emergency lever and stop the process while a committee convenes to analyze the situation. Examples of tightly coupled processes include a space shuttle launch, a nuclear power plant moving toward criticality and even something as prosaic as bread baking. (Source)
Along with the tight coupling seen in high frequency trading, complexity is clearly present in programs and code few can understand. Therefore, with both tight coupling and complexity, HFT will have normal accidents. The real question is whether these accidents create a crisis, and Bookstaber believes they do not. While losing 500 points on the Dow in a matter of minutes is scary and shocking, the market quickly rebounded, and we now have some safeguards in place to prevent it from happening again (individual stock circuit breakers so far).

Creating robust systems

Bookstaber discussed his "cockroach theory" for robust risk management systems. He critiques the use of value-at-risk (VAR) because it is pro-cyclical. Prior to a crisis, volatility and correlations are low, yet when a crisis hits both jump, often unexpectedly and dramatically. VAR will lead firms to increase risk at just the times they should not, and will comfort those taking on the greatest risk just before collapse.
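The pro-cyclicality is easy to see in numbers. Here is a toy sketch of my own (not from the speech) of a one-day 99% parametric VAR scaled off trailing volatility; the position size and volatility figures are hypothetical.

```python
Z_99 = 2.326  # one-tailed 99% quantile of the standard normal

def var_99(position_value, daily_vol):
    """One-day 99% parametric VaR for a single position."""
    return Z_99 * daily_vol * position_value

calm   = var_99(1_000_000, 0.005)  # 0.5% daily vol in a quiet market
stress = var_99(1_000_000, 0.03)   # 3% daily vol once the crisis hits

# The same $1M position shows roughly 6x the measured risk after
# volatility jumps. A firm managing to a fixed VaR budget is therefore
# invited to hold ~6x the exposure in the calm period, right before
# the regime change, which is exactly Bookstaber's complaint.
```

Because the volatility input is estimated from recent history, the measure is lowest precisely when risk is quietly building.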

Over-optimization to a particular financial environment will inevitably lead to disaster as the environment changes. Throughout evolution, species that were perfectly optimized to particular climates or landscapes have gone extinct when the world changed. Yet the cockroach, while not a highly optimized species, can survive in all kinds of environments, helped by its simple defense mechanism of running away when wind crosses the spines on its legs.

Much more to come from this conference...

Brandon R. Rowley
"Chance favors the prepared mind."

*DISCLOSURE: Nothing relevant.