"The Future of Computer Trading in Financial Markets," a 2011 working paper commissioned by the UK Government Office of Science, concluded, "[I]t is not impossible that human traders will simply no longer be required at all in some market roles. The simple fact is that we humans are made from hardware that is just too bandwidth-limited, and too slow, to compete with coming waves of computer technology."
Things have moved a long way from 1971, when the NASDAQ (National Association of Securities Dealers Automated Quotations) first started generating electronic quotes that traders then acted on. While estimates vary significantly, a research paper by the U.S. Securities and Exchange Commission found that high-frequency trading (HFT) alone accounts for over half of equity trading activity, and underscored that HFT is only a subset of all computer-based trading.
A 2012 update of the UK paper noted, "Markets are already 'socio-technical' systems, combining human and robot participants." That meshing isn't without friction.
In 2012, Knight Capital lost $440 million in 30 minutes before the company's engineers could shut things down. The cause? Errors in a software upgrade meant a "buggy algorithm was apparently buying high and selling low," noted Bloomberg Businessweek. "Do that 40 times a second, 2,400 times a minute, and you now have a system that's very efficient at burning money."
Flash crashes of individual stocks now happen with some regularity. Automated circuit breakers intended to limit obviously erroneous spikes help somewhat, but post-mortems often trace the glitches to human error, some as simple as "fat finger" orders that append one or more unintended zeros.
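Exchanges and brokers guard against such keying errors with pre-trade sanity checks. A minimal sketch of the idea follows; the function name, size cap, and price-deviation threshold are illustrative assumptions, not any exchange's actual rules.

```python
def validate_order(qty, price, ref_price, max_qty=10_000, max_deviation=0.10):
    """Reject orders whose size or price looks like a keying error.

    A hypothetical pre-trade check: block orders that exceed a size cap
    or whose price strays too far from a recent reference price.
    """
    if qty <= 0 or qty > max_qty:
        return False  # e.g., an extra zero turning 1,000 shares into 10,000+
    if abs(price - ref_price) / ref_price > max_deviation:
        return False  # price more than 10% away from the reference price
    return True
```

Real risk controls are far more elaborate (per-symbol limits, notional caps, kill switches), but the principle is the same: a cheap automated check between the keyboard and the market.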
HFT, in particular, has become a lightning rod. "[W]hat some have dubbed 'the rise of the machine'—ha[s] been blamed for adding volatility and prompted exchanges and banks to increase controls, especially after a 'flash crash' on Wall Street in May 2010, when the Dow fell more than 600 points in a matter of minutes, sparked by a large seller creating an imbalance in the market," according to Reuters.
New regulation of HFT is expected both in the U.S. and in overseas markets, but crafting it is a challenge. "There is a relative lack of evidence and analysis to inform the development of new regulations, not least because of the time lag between rapid technological developments and research into their effects, and the lack of available, comprehensive and consistent data," according to the 2012 UK paper.
Looking at the broader landscape of quantitative investing, Robert Litterman, chair of the risk committee at the hedge fund Kepos Capital and a co-developer of the Black-Litterman Global Asset Allocation Model, said, "You can think of it as using computers, but of course it predates computers and really just has to do with finding metrics that predict both returns and risk and then optimizing portfolios along those dimensions."
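The workflow Litterman describes — estimate expected returns and risk, then optimize a portfolio along those dimensions — can be sketched as a bare-bones mean-variance step. The numbers below are illustrative assumptions, not estimates from real data, and the unconstrained weights are simply rescaled to sum to one.

```python
import numpy as np

# Hypothetical expected returns (mu) and covariance matrix (Sigma)
# for three assets; purely illustrative values.
mu = np.array([0.05, 0.07, 0.06])
Sigma = np.array([[0.04, 0.01, 0.00],
                  [0.01, 0.09, 0.02],
                  [0.00, 0.02, 0.06]])

# Unconstrained mean-variance solution: weights proportional to
# Sigma^-1 * mu (solve the linear system rather than inverting).
raw = np.linalg.solve(Sigma, mu)

# Rescale so the weights sum to 1 for readability.
w = raw / raw.sum()
```

Production systems add constraints (leverage, turnover, position limits) and far more careful return and risk estimates — the Black-Litterman model itself exists largely to make the return inputs to this step better behaved — but the core optimization is this small.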
"When I started, very simple algorithms worked very well," he recalled. "As more people did it, it became more sophisticated. It became statistical arbitrage and that has evolved into high-frequency trading."
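A classic statistical-arbitrage signal of the kind Litterman alludes to is pairs trading: watch the spread between two historically related prices and bet on it reverting when it deviates unusually far from its mean. A toy sketch, with a hypothetical function name and an illustrative two-standard-deviation entry threshold:

```python
import statistics

def zscore_signal(spread_history, threshold=2.0):
    """Classify the latest spread observation as 'short', 'long', or 'flat'.

    spread_history: recent values of the price spread between two
    related instruments (illustrative; a real system would also decide
    the lookback window, hedge ratio, and exit rules).
    """
    mean = statistics.fmean(spread_history)
    std = statistics.stdev(spread_history)
    z = (spread_history[-1] - mean) / std
    if z > threshold:
        return "short"  # spread unusually wide: bet it narrows
    if z < -threshold:
        return "long"   # spread unusually tight: bet it widens
    return "flat"
```

As Litterman notes, once many participants run the same signal, the mispricings it targets shrink — which is exactly the crowding dynamic that pushed simple signals toward ever-faster execution.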
That sense of continual evolution is key to Litterman. "It's not a fixed landscape," he noted. "One of the concerns about quantitative investing is that, to the extent it is understandable and can be reproduced, it's hard to keep that as a source of alpha because other people are going to come in and try to do the same thing." When the approach becomes crowded, its efficacy typically fades.
Investing is a business with few constants. He noted two: you get paid for taking risks, and the landscape continually evolves. While there seems to be a strong trend toward incorporating more machines into markets, since they have the bandwidth to pull a signal from the noise of market activity, Litterman insists that the human part of the equation remains essential: "You can bring statistical evidence to bear but you also have to bring judgment to bear as well."