Can the human regulators defeat the robot traders?

Traders on the floor of the NYSE on December 1, 2008.
Getty Images

The Commodity Futures Trading Commission is looking to build a regulatory framework around high-speed and algorithmic futures trading.

The announcement, made earlier this week, comes in the form of something called a "concept release." It's a 137-page document that requests public input on more than 100 questions about dozens of proposed ways to control risks from technology that allows many more trades to be made, much faster and with far less human interaction, than ever before.

Despite the report's length, however, the CFTC doesn't seem to be addressing the underlying problem directly: by forcing traders to accept responsibility for their trades.

(Read more: Government takes first steps to regulate high-speed trading)

Right now algorithmic traders can externalize the costs of their errors because the exchanges have agreed to a policy of canceling errant trades. This reduces the risk and the cost of trading electronically, which is one reason electronic trading has grown so large.

A simple rule that required all executed trades to stand would create a huge incentive to improve systems or restore the role of human judgment in trading.

Would we still have some risk of market instability? Of course. Some algorithmic traders would still design flawed trading schemes. Some of the safety measures put in place would fail. There's no way to design a system that won't have any failures.

But there likely would be a diversity of approaches to reducing losses from flawed algorithmic trading. Some of those would be better than others. But the diversity would mean that if one of the approaches turned out to have an unexpected flaw, the damage would be limited to the firms that used that approach.

Regulatory fixes lead to market homogenization. Everyone follows the same set of rules, taking the same precautions. But when the government-certified safety measures wind up being flawed, the damage is universal because everyone has the same system.

(Read more: Cramer: It's time to embrace high-frequency trading)

That's the story of what happened to the banks during the financial crisis.

Capital adequacy regulations told banks how much capital they needed to hold against different kinds of risk. Highly rated mortgage-backed securities had very low risk weightings, which meant banks didn't have to hold much capital against them.

That, in turn, encouraged banks to buy up gobs of mortgage bonds. This meant that the balance sheets of most banks were highly correlated. When the regulatory view of mortgage risk turned out to be wrong, the entire system was very nearly brought down.
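The arithmetic behind that incentive can be sketched in a few lines. This is an illustrative example, not a description of any particular bank: the 8% minimum capital ratio and the 20% versus 100% risk weights follow the broad shape of the Basel standardized approach, and the dollar figures are hypothetical.

```python
# Sketch of risk-weighted capital requirements (illustrative figures).
# Required capital = exposure x risk weight x minimum capital ratio.

MIN_CAPITAL_RATIO = 0.08  # Basel-style 8% minimum against risk-weighted assets

def required_capital(exposure: float, risk_weight: float) -> float:
    """Capital a bank must hold against a given exposure."""
    return exposure * risk_weight * MIN_CAPITAL_RATIO

# $100M of ordinary corporate loans at a 100% risk weight
loans_capital = required_capital(100_000_000, 1.00)

# $100M of highly rated mortgage-backed securities at a 20% risk weight
mbs_capital = required_capital(100_000_000, 0.20)

print(f"Corporate loans: ${loans_capital:,.0f}")   # five times more capital
print(f"Highly rated MBS: ${mbs_capital:,.0f}")    # than the same-sized MBS book
```

With the same $100 million on the balance sheet, the low-risk-weighted securities tie up a fifth of the capital that the loans do, which is exactly the regulatory nudge that pushed so many banks toward the same assets.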

(Read more: Would better capital requirements have prevented the crisis?)

There's good reason to worry that the CFTC, despite its many questions, might design a flawed regulatory system. For one thing, the rules likely will be incredibly complex. Throwing complex rules at complex algorithms is a recipe for opacity, not stability.

What's more, the attempt to regulate could backfire. By reducing the apparent risks of high-speed trading, the CFTC would be inviting even more of it. This would make the costs of regulatory error even greater.

Rules aren't going to defeat the robots. To do that, we need actual humans.

Follow John Carney on Twitter @Carney