Is Quant the New King in Town?
The “quant” revolution in trading is well underway, but how do these funds leverage their data, and what new risks does this introduce?
Despite what you might hear in the news, quantitative, or systematic, hedge funds* are not new. In fact, many of these funds date back to the 1980s; however, for most of the industry’s history, their fundamentally oriented, sexy stock-picking cousins have “captured the focus, the flows, and the glory.”[1] All of that has changed in the last decade or so, as a combination of increased computing power, more asset classes shifting from over-the-counter to exchange trading, and growing investor confidence in algorithms has paved the way for skyrocketing growth. The industry is poised to pass $1tn in assets this year.[2]
One company that has been a huge beneficiary of this trend is AQR Capital Management. Founded in 1998 as a hedge fund, the firm today offers quantitative strategies across many asset classes (hedge funds, commodities, bonds, stocks, etc.), boasts $224bn in assets under management,[3] and manages the second-largest pool of hedge fund assets in the world. The firm earns money by charging investors a management fee based on a percentage of dollars invested and, in some cases, a performance-based fee as well.
Without question, data is the oxygen of this firm. The firm’s investment process is as follows[4]:
- Idea generation: As with traditional investing, idea generation comes from an individual, generally a PhD economist who has an economic intuition for some signal that could predict outperformance.
- Signal testing: The researchers run regressions on a given data set to assess the signal’s predictive power. They run a number of related tests to try to establish a causal relationship and to determine whether the signal measures something materially different from signals already leveraged in the model. For robustness, the researcher must find additional “out of sample” data sets, in other asset classes or time periods, to gain confidence that the relationship isn’t spurious (a minimal sketch of this out-of-sample check follows this list).
- Portfolio construction: After potentially months of testing, signals that have made it through the entire process are incorporated into the model, again through empirical testing. This requires an entirely new set of data on which to test how the new signal affects the risks in the portfolio, so that its weighting in the broader model can be sized appropriately (see the signal-blending sketch below).
- Portfolio optimization: Once the model has generated an ideal paper portfolio, it needs to be translated into a portfolio that can be transacted efficiently and that is palatable to investors. AQR’s researchers have developed a process of reverse optimization that incorporates the cost of trading, diversification requirements, and risk constraints to generate the actual set of holdings they want in the portfolio (see the optimization sketch below).
- Execution: Once a set of buys and sells has been generated, AQR’s trading team brings the portfolio to life, again using a set of algorithms. Over the course of the last 20 years, the firm has collected trillions of dollars’ worth of “tick data,” or trading information, which it has leveraged to develop patient trading algorithms that aim to reduce both the cost of paying the “bid/ask” spread and market impact (the supply and demand dynamics that can cause prices to rise and fall when investors try to transact); a simple order-slicing sketch follows below.
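To make the signal-testing step concrete, here is a minimal sketch of the in-sample/out-of-sample check described above. The data is synthetic and the setup deliberately simplified; it illustrates the logic, not AQR’s actual methodology.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Synthetic stand-in for 20 years of monthly observations: a candidate
# signal and the following month's returns. Real research would use
# decades of data across many assets and asset classes.
signal = rng.normal(size=240)
returns = 0.02 * signal + rng.normal(scale=0.05, size=240)

# Fit the regression on the first half of the history only.
split = 120
in_sample = sm.OLS(returns[:split], sm.add_constant(signal[:split])).fit()
print(f"in-sample t-stat: {in_sample.tvalues[1]:.2f}")

# A signal that only "works" in-sample is likely data-mined. Confirm the
# coefficient keeps its sign and significance on the held-out data.
out_sample = sm.OLS(returns[split:], sm.add_constant(signal[split:])).fit()
print(f"out-of-sample t-stat: {out_sample.tvalues[1]:.2f}")
```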
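Once a signal survives testing, it has to be blended with the existing ones. One common (and here purely illustrative) approach is to standardize each signal and combine them with weights sized from that testing; the signals and weights below are invented, not AQR’s.

```python
import numpy as np

rng = np.random.default_rng(1)

def zscore(x):
    """Standardize a signal so different signals are on a comparable scale."""
    return (x - x.mean()) / x.std()

# Three hypothetical signals measured across 500 assets.
value, momentum, new_signal = rng.normal(size=(3, 500))

# Blend standardized signals; the new signal starts with a modest weight
# until further evidence justifies more (these weights are made up).
weights = np.array([0.45, 0.45, 0.10])
composite = weights @ np.vstack([zscore(value), zscore(momentum), zscore(new_signal)])
print("composite forecast for first 5 assets:", composite[:5].round(2))
```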
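The optimization step can be sketched as a trade-off among expected return, risk, and trading costs. The tiny three-asset universe and penalty parameters below are assumptions for illustration; AQR’s actual reverse-optimization process is far more elaborate.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical three-asset universe (all numbers invented).
mu = np.array([0.04, 0.03, 0.05])        # expected returns
cov = np.diag([0.04, 0.02, 0.09])        # simplified (diagonal) covariance
w_now = np.array([0.30, 0.40, 0.30])     # holdings before rebalancing
risk_aversion, cost_rate = 5.0, 0.02     # assumed penalty parameters

def objective(w):
    # Maximize return net of a risk penalty and the cost of trading away
    # from current holdings (so minimize the negative).
    ret = mu @ w
    risk = risk_aversion * (w @ cov @ w)
    costs = cost_rate * np.abs(w - w_now).sum()
    return -(ret - risk - costs)

res = minimize(objective, w_now, bounds=[(0.0, 1.0)] * 3,
               constraints={"type": "eq", "fun": lambda w: w.sum() - 1.0})
print("target weights:", res.x.round(3))
```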
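Finally, execution: a patient trading algorithm works a large “parent” order into small “child” orders over time to limit market impact. The time-weighted schedule below is the simplest possible version; production algorithms adapt to live volume, spreads, and the tick data described above.

```python
import math

def twap_schedule(total_shares: int, horizon_minutes: int, slice_minutes: int = 5):
    """Split a parent order into roughly equal child orders over time."""
    n_slices = math.ceil(horizon_minutes / slice_minutes)
    base = total_shares // n_slices
    schedule = [base] * n_slices
    schedule[-1] += total_shares - base * n_slices  # remainder in last slice
    return schedule

# Example: work a 100,000-share buy over a 6.5-hour (390-minute) trading day.
print(twap_schedule(100_000, horizon_minutes=390))
```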
This process occurs every single day (and even intraday) at the firm and requires a significant amount of technological infrastructure and data to bring to life. While this value creation process is rigorous and robust, it is clearly not without its own risks and costs:
- It is increasingly challenging to get differentiated data sources, so the firm must also rely on differentiated individuals and researchers, just like traditional firms
- As quant assets grow, certain signals become more “crowded,” potentially reducing their future returns
- With any empirical process, there is always a risk of data mining; while checks are in place, it is impossible to avoid completely
- The research is inherently backward-looking, and past relationships may not persist in the future
- There have been “quant crashes” in the past, where systematic strategies reacted in similar ways to anomalous data, causing broader instability in markets
These are new risks introduced by quant strategies, but the quant approach also eliminates many of the risks inherent in traditional processes (investor biases, poor risk management and portfolio construction, concentration, etc.). For this reason, many traditional managers are working to implement elements of quantitative rigor into their own processes. While it remains to be seen where the ultimate equilibrium between traditional and quant will lie, what is clear is that quant investing is here to stay, and it is likely to get much bigger than it is today.
* For those who are unfamiliar, a quantitative investor employs researchers to build models, and those models generate the buy and sell decisions the firm uses to invest. As in any other industry, the nature of the models and investment process can vary dramatically from firm to firm, so the term “quantitative” is used liberally to describe a wide variety of strategies and approaches. One thing they all have in common is that the use of “big data” is critically important in every step of the value creation process.
Sources:
[1] https://www.bloomberg.com/news/articles/2017-06-20/rise-of-robots-inside-the-world-s-fastest-growing-hedge-funds
[2] https://www.ft.com/content/ff7528bc-ec16-11e7-8713-513b1d7ca85a
[4] Firm employee conversations, personal experience