CFTC mulls retooling market surveillance for high-frequency trading

An alphabet soup of regulators collects data on financial transactions, but a universal eye into markets is still on the drawing board.


U.S. financial regulators are in the midst of upgrading their IT capabilities to monitor and police high-speed computer trading. As documented in Michael Lewis' recent book "Flash Boys," high-frequency traders go to considerable expense to co-locate their network infrastructure inside exchanges to gain a latency advantage, measured in microseconds, over less nimble rivals. They hire top mathematicians to write algorithms that predict the movement of markets and trade accordingly. Events like the 2010 "Flash Crash" have alerted regulators to the need for investigative capabilities that match the speed and dynamism of the market.
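
To put that advantage in physical terms, here is a back-of-envelope sketch (with illustrative figures, not drawn from the article) of how distance from an exchange's matching engine translates into propagation delay over fiber:

```python
# Back-of-envelope sketch of why co-location matters: signals in optical
# fiber travel at roughly two-thirds the speed of light, so every kilometer
# of distance from the exchange's matching engine costs measurable time.
# Figures here are illustrative only.

SPEED_OF_LIGHT_KM_S = 299_792          # vacuum, kilometers per second
FIBER_FACTOR = 2 / 3                   # typical refractive-index penalty

def one_way_latency_us(distance_km: float) -> float:
    """One-way propagation delay over fiber, in microseconds."""
    return distance_km / (SPEED_OF_LIGHT_KM_S * FIBER_FACTOR) * 1_000_000

# A co-located rack vs. a site across town vs. a nearby city.
for km in (0.1, 10, 100):
    print(f"{km:>6} km -> {one_way_latency_us(km):8.1f} microseconds one way")
```

At 100 kilometers the one-way delay is roughly 500 microseconds, an eternity next to the half-microsecond enjoyed by a rack inside the exchange.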

The consensus is that regulators are still a long way off.

The Commodity Futures Trading Commission is collecting more data than ever via new rules established under the authority of the Dodd-Frank legislation. Over the past year, the CFTC has seen a 166 percent increase in the kinds of data its systems take in, according to Jorge Herrada, associate director of the CFTC's Office of Data and Technology. Loading structured data into CFTC systems requires standing up new architecture, customized loaders, and quality checks between the agency and designated swap data repositories (SDRs).

"Each new data set is a major project unto itself, analogous to standing up a new factory production line," Herrada said at a June 3 meeting of the CFTC's Technology Advisory Committee.

The CFTC has collected about 160 terabytes of transaction data, covering billions of transactions. But lingering questions remain about what the data actually represents.

There is room for interpretation in fields such as the time a trade was executed or modified, and until those definitions are harmonized across the industry, the ability to track a trade's movement from one SDR to another is limited.
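
The timestamp problem is concrete. Here is a hedged sketch, with invented repository formats, of what harmonization entails: parse each SDR's convention, convert to UTC, and truncate to a common precision so the "same" execution time compares equal across repositories:

```python
from datetime import datetime, timedelta, timezone

# Two repositories report the "same" execution time under different
# conventions. Normalizing both to UTC at a fixed precision is a
# precondition for matching trades across SDRs. Formats are invented.

def normalize_ts(raw: str, fmt: str, tz: timezone) -> datetime:
    """Parse a repository-specific timestamp, express it in UTC, and
    truncate to whole milliseconds so differing precisions compare equal."""
    dt = datetime.strptime(raw, fmt).replace(tzinfo=tz)
    dt = dt.astimezone(timezone.utc)
    return dt.replace(microsecond=(dt.microsecond // 1000) * 1000)

# One SDR reports local Eastern time to the second; another reports UTC
# with microseconds. After normalization, the two records line up.
eastern = timezone(timedelta(hours=-4))  # EDT offset, illustrative only
a = normalize_ts("2014-06-03 09:30:01", "%Y-%m-%d %H:%M:%S", eastern)
b = normalize_ts("2014-06-03T13:30:01.000250", "%Y-%m-%dT%H:%M:%S.%f",
                 timezone.utc)
print(a == b)  # True: both resolve to 13:30:01.000 UTC
```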

"We have the raw materials for a great surveillance program, but we still have a lot to do," Herrada said.

Regulators are also limited by law to specific roles, but it is technically feasible to design a system that gives regulators a window into activity across asset classes, whether stocks, bonds, futures or other instruments.

Bad actors move across markets, said Steve Joachim, executive vice president of the Financial Industry Regulatory Authority, a self-regulatory organization. "Frankly, today the regulatory community does not treat the marketplace on a broad scale in one view," Joachim said. A 21st century system would give a more complete picture of the markets across the world and across sectors.
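
One way to picture such a system: merge time-sorted feeds from separate markets into a single chronological stream keyed to a common participant identifier, so one firm's activity can be followed across asset classes. The sketch below is illustrative only; the feed formats and identifiers are invented:

```python
import heapq
from typing import Iterator, NamedTuple

# Illustrative "one view" merge: per-market feeds, each already sorted by
# time, are combined into a single chronological stream keyed to a shared
# participant identifier. Formats and IDs are invented.

class Event(NamedTuple):
    ts: float          # epoch seconds
    market: str        # "equities", "futures", "swaps", ...
    participant: str   # harmonized legal-entity identifier
    detail: str

def unified_view(*feeds: Iterator[Event]) -> Iterator[Event]:
    """Time-ordered merge of independently sorted market feeds."""
    return heapq.merge(*feeds, key=lambda e: e.ts)

equities = iter([Event(1.0, "equities", "LEI-123", "sell 10k shares"),
                 Event(3.0, "equities", "LEI-123", "cancel")])
futures = iter([Event(2.0, "futures", "LEI-123", "buy 50 contracts")])

for ev in unified_view(equities, futures):
    print(ev.ts, ev.market, ev.detail)
```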

Pivoting to this new reality on a government timeline might prove challenging. While the 160-terabyte dataset at the CFTC is large by most government standards, it is dwarfed by the processing demands of the NSA, NASA, NOAA and even some private companies. Dave Lauer, president and managing partner at KOR Group, said there are open-source platforms capable of managing more data than the entire financial services industry generates.
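
As a rough illustration of Lauer's point, an open-source engine such as Apache Spark can express a rollup over a very large trade set in a few lines. The paths and column names below are hypothetical placeholders, not any regulator's actual schema:

```python
from pyspark.sql import SparkSession, functions as F

# Hypothetical daily rollup over a large trade dataset using Apache Spark,
# one of the open-source platforms Lauer alludes to. Paths and columns
# are placeholders for illustration.

spark = SparkSession.builder.appName("trade-rollup").getOrCreate()

trades = spark.read.parquet("/data/swaps/")  # hypothetical location

daily = (trades
         .withColumn("trade_date", F.to_date("execution_ts"))
         .groupBy("trade_date", "asset_class")
         .agg(F.count("*").alias("trade_count"),
              F.sum("notional").alias("gross_notional")))

daily.write.mode("overwrite").parquet("/data/rollups/daily/")
```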

These big data issues "are intimidating, but they are problems that have been solved," Lauer said, adding that government regulators are behind the times when it comes to advanced analytics, machine learning, and other capabilities that could improve surveillance of financial markets.
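
One of the simpler primitives in that analytics toolkit is burst detection: flagging intervals whose message counts sit far outside the recent baseline. The sketch below uses invented data and an arbitrary threshold; real surveillance models are far richer:

```python
import statistics

# Minimal surveillance primitive: flag one-second windows whose order-
# message counts exceed the trailing baseline by several standard
# deviations. Data and threshold are invented for illustration.

def flag_bursts(counts: list[int], window: int = 30,
                z_threshold: float = 4.0) -> list[int]:
    """Return indices whose count exceeds the trailing mean by
    z_threshold standard deviations."""
    flagged = []
    for i in range(window, len(counts)):
        baseline = counts[i - window:i]
        mu = statistics.fmean(baseline)
        sigma = statistics.pstdev(baseline) or 1.0  # avoid divide-by-zero
        if (counts[i] - mu) / sigma > z_threshold:
            flagged.append(i)
    return flagged

# Steady ~100 messages per second, then a sudden 10x burst at index 40.
stream = [100] * 40 + [1000] + [100] * 9
print(flag_bursts(stream))  # -> [40]
```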

The real issue is redeploying resources to attack the problem: the government IT workforce as currently constituted can't fix it, according to Lauer.

"You can't do this at GS-13 or GS-14. That's why a radical reallocation of budget resources is necessary. This is what regulation is going to be in the 21st century," he said.