Market surveillance: a watching brief
The US Securities and Exchange Commission is often accused of using skateboards to chase Ferraris in its attempts to keep up with trading houses. But less than a year after announcing that it intended to create a new market surveillance system – and six months after going live with it – its cloud-based approach is paying dividends, allowing it to examine the structure of the markets it supervises at a finer level of granularity than ever before.
Midas – the Market Information Data Analytics System – is based on high-frequency trading software from Tradeworx and allows the US regulator to perform what Gregg Berman, associate director of the SEC’s Office of Analytics and Research, calls “full depth-of-book analysis”.
“We have learnt how important it is to analyse all order book information,” Berman told a session at the Sifma Tech Expo event in New York. This means collecting information from a billion data points, he added, but the fact that the system is cloud based means that the analysis of the data is not as onerous as might be thought. “Because it’s in the cloud, access to processing power is just not an issue for us right now. There is no hardware to support, no software upgrades to maintain, no data feeds to handle, and hence no SEC resources are required for these tasks. Users at the SEC simply log into Midas from their own desktops whenever they want to access the system. If we need to perform a very large analysis, we employ multiple servers and invoke parallel jobs,” said Berman.
The Tradeworx HFT system quickly brought the SEC up to some sort of parity with the powerful systems used by Wall Street investment firms, but it will always be used to analyse the market and understand its structure rather than to monitor it continually, Berman said.
This has led to some criticism of the agency’s technical capabilities – criticism he said is unfounded in the wake of Midas. The idea that market surveillance – as opposed to the monitoring of individual firms – could ever move to real time, or even intraday, is based on a misunderstanding of the complexity of the problem. Looking across the market at those billion data points lets the SEC investigate the interplay of activities that leads to events like the Flash Crash of May 2010 – and to conclude that HFT was not the underlying issue.
“Given the interconnected nature of our markets, the first step in our analysis was to determine whether the Flash Crash was initially triggered by events in the cash equity markets or in the derivative futures markets. To perform such an analysis we unfortunately needed to fully reconstruct the order books for thousands of individual stocks – a process that involved building analytics to process billions of individual records and took us nearly four months,” said Berman. “Recall that at that time all bets were on the problem originating in the equity markets – the press, the pundits, and even the markets themselves believed the crash must have been directly caused by something in the cash equity markets. Speculation ranged from delays in market data; to problems arising from market fragmentation; to claims that one or more equity-based high-frequency traders suddenly went wild.”
The result of the SEC’s analysis (still disputed by some) was that a completely different set of events had caused the crash.
“But what we were able to show by careful and painstaking analysis, was that contrary to initial perceptions, the problem actually originated in the futures market for S&P 500 “E-Mini” contracts and then quickly cascaded to the markets for individual equities,” said Berman. “Our findings were initially unexpected, and even to this day there are those who remain sceptical of some of our conclusions. If you find yourself in that category I’d encourage you to download a recent and completely independent analysis of the Flash Crash performed by two researchers at the Duisenberg School of Finance in Amsterdam.”
Berman’s presentation would not have pleased anyone in the audience opposed to HFT. “What’s illegal is illegal at any speed,” he said. While there are many questions about the role of HFT in the market, it is the SEC’s responsibility to better understand what is happening in order to inform the debate, which is what Midas will be assisting with. “If we don’t diagnose the problem properly we won’t get the right prognosis,” he said, adding that many problems in the industry are caused by “sloppiness combined with a lack of checks and balances”.
For this reason, he is dismissive of calls to regulate against HFT. “Some have called for regulators to “slow down” the market, and some foreign jurisdictions have gone as far as proposing or adopting rules they believe will do just that, but I personally find it difficult to approach the question of whether we should, and if so how we might, slow down the markets, or even determine that the markets are indeed too fast, until we meaningfully measure its actual speed,” he said.
“Here are the issues: First, we need to be more deliberate and careful in the way we describe market speed, especially in the media. There is a humongous difference between a millisecond and a microsecond: the latter is a thousand times smaller than the former. So it behoves us all to stop throwing out terms that have precise meaning without any consideration of what we are really saying. Second, I don’t think market participants really care how long it takes the average person to blink, so I’m not sure why everyone keeps comparing the speed of the market to this benchmark. I’m also not sure why, absenting any other facts, people should be assuming it is problematic for trading to occur faster than the blink of an eye.”
More importantly, he thinks it is not central to the task of the agency: “Aside from my concerns about general market misperceptions and hyperbole, I think we can and must do a much better job at assessing the speed of the market in ways that would more directly inform policy.”
He gives a simple example: assume that 10 market participants are each bidding 100 shares to buy the same stock on the same exchange at a price of $25. If $25 was the best bid on that exchange, the consolidated tape would show this as 1,000 shares bid at $25.
Now consider what happens if an 11th participant decides to join the bid with an additional 100 shares, but shortly thereafter an existing bidder decides to cancel his prior order for 100 shares. In this case the tape would show 1,000 shares of interest at $25 jumping to 1,100 shares, only to quickly fall back to 1,000 shares.
“For all intents and purposes, it would seem that someone posted, and then quickly cancelled, 100 shares – producing what is often called a flickering quote,” he said. “But who, in this instance, is the party that produced this flickering quote? The answer is, surprisingly, that no individual party produced the flickering quote. It is simply a ramification of distinct parties each acting, and then reacting, at similar times. I think if we are to better understand the speed of the market, address concerns, and even consider actions, we must start by analysing data that informs us on how market participants actually behave, and not limit ourselves to simply observing the results of their behaviour in aggregate. Fortunately, we can do just that using tools like Midas.”
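Berman’s flickering-quote example is easy to reproduce in code. The sketch below is a hypothetical toy model – the participant labels and the aggregation function are illustrative, not the SEC’s analytics:

```python
# Toy model of the flickering-quote example: individual orders at the
# best bid versus the aggregated size shown on the consolidated tape.

def tape_size(orders):
    """Aggregate displayed size at the best bid, as the tape reports it."""
    return sum(orders.values())

# Ten participants each bidding 100 shares at $25.
orders = {f"participant_{i}": 100 for i in range(1, 11)}
print(tape_size(orders))  # 1000 shares shown at $25

# An 11th participant joins the bid with another 100 shares...
orders["participant_11"] = 100
print(tape_size(orders))  # tape jumps to 1100

# ...then an existing bidder cancels shortly afterwards.
del orders["participant_3"]
print(tape_size(orders))  # tape falls back to 1000

# In aggregate it looks as if one party posted and quickly cancelled
# 100 shares, yet no single participant did so: the "flicker" emerges
# from independent actors acting at similar times.
```

The point of the sketch is that the tape only ever shows the aggregate; recovering who did what requires the per-participant view that systems like Midas are built to analyse.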
Processing the data is a formidable undertaking. “What makes this process tricky is not necessarily the availability of data, or even the underlying technologies (though building a reliable and meaningful real-time monitoring system is a huge task). Rather, the tricky part is figuring out what is abnormal or outlier activity, and what is just an uncommon activity.
“Many people would say that an outlier activity is something that occurs only 1% of the time, or maybe even 0.1% of the time. But remember we have a billion records a day. If outliers are defined at the 0.1% level, that would imply one million potential outlier events.
“The fact is that simple broad-brush strokes won’t teach us much. The vast majority of outlier events are nothing more than just that – outlier events. As humans, it is in our DNA to try to interpret all such events, look for meaning, and take action. It’s why we see faces in clouds.”
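The scale problem Berman describes is quick to verify with back-of-the-envelope arithmetic, using the billion-records figure quoted above:

```python
# Why naive outlier thresholds break down at market-data scale:
# even a rare-sounding cutoff flags an unmanageable number of events.
records_per_day = 1_000_000_000  # ~1 billion data points, per Berman

for threshold in (0.01, 0.001):  # the 1% and 0.1% definitions he cites
    flagged = int(records_per_day * threshold)
    print(f"{threshold:.1%} threshold -> {flagged:,} potential outliers per day")
```

A 0.1% cutoff still yields a million candidate events every day, which is why, as Berman says, broad-brush definitions teach little on their own.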
By the end of this year, however, the regulators of the US markets will have another powerful tool: the Consolidated Audit Trail (CAT), created under a joint plan of the seventeen national securities exchanges and FINRA (collectively, the SROs) at the direction of SEC Rule 613, which requires them to submit a plan by 6 December 2013.
By design, this system will provide a much more complete view of the equity and equity options markets, and will include such data elements as customer ID.
Randy Snook, executive vice president of business policy and practices at Sifma, said: “CAT will be a massive undertaking that will require significant resources from the industry and technology professionals in particular. The sheer size of the CAT database makes it unlike any other system: it is estimated that it will process over 50 billion records resulting in approximately 10 terabytes of data per day. And the number of records is expected to grow 25% annually. That means the system will have to support over 12 petabytes of data in order to retain data over a period of five years.”
The SEC’s Berman said that in some respects, “Midas is a bit of a precursor to CAT, at least for those of us performing the analyses and buried deep in the data. As such, it’s really just the beginning of an exciting trend in market regulation that combines advanced technologies and data, with, most importantly, talented and dedicated staff with a passion for understanding the markets. And for technologists and quants, the SEC has become an incredibly interesting place to work. I find it incredible that in a period of only six months we went from being significantly behind most market participants in this area of technology to leap-frogging most buy-side firms and landing on par with regard to the data collection and analysis capabilities of many high-frequency trading firms.”