Should We Ban the Bots?
Every country that taxed HFT saw volumes collapse. Those that didn't got flash crashes. Explore the regulatory dilemma no one has solved.

Every country that tried to tax high-frequency trading saw trading volumes collapse. Every country that didn't saw a flash crash. There are no good options — only tradeoffs. This is the uncomfortable reality facing regulators worldwide as they grapple with an algorithmic arms race that now accounts for 60-75% of all equity trading volume in the United States.
In October 2024, India's National Stock Exchange experienced a 15-minute outage triggered by algo-driven order flooding that overwhelmed exchange systems. The incident echoed the 2010 Flash Crash, when the Dow Jones plunged nearly 1,000 points in 36 minutes before recovering—later attributed to a toxic interaction between a large sell order and high-frequency algorithms. Yet when Sweden introduced a financial transaction tax in the 1980s, over 50% of trading volume fled to London within weeks. The question is no longer whether to regulate algorithmic trading, but whether regulation itself might be the greater risk.
The European Union's Markets in Financial Instruments Directive II (MiFID II), implemented in January 2018, represents the most ambitious attempt to rein in algorithmic trading. The regulation introduced sweeping requirements: algorithmic traders must be registered, strategies must be tested against market stress scenarios, and all trades must be timestamped to microsecond precision.
The results have been mixed at best. A 2023 European Securities and Markets Authority review found that while MiFID II succeeded in improving market transparency, it also contributed to significant fragmentation of liquidity across multiple trading venues. Bid-ask spreads on mid-cap stocks widened by approximately 15% in the three years following implementation, effectively increasing trading costs for institutional investors managing pension funds and retail savings.
[!INSIGHT] MiFID II's unintended consequence was driving algorithmic trading into darker corners. When lit exchanges became more expensive and regulated, trading migrated to systematic internalizers and dark pools—less transparent venues that now account for over 40% of European equity volume.
The regulation's approach to high-frequency trading specifically required firms to provide detailed documentation of their algorithms to national regulators. But here's the catch: by 2022, only 23% of EU member states had the technical capability to actually review these algorithm submissions. The gap between regulatory ambition and enforcement reality remains vast.
The Market Maker Paradox
One of the most contentious aspects of algorithmic regulation involves market-making obligations. Under MiFID II, designated market makers must maintain continuous two-sided quotes during trading hours. Yet HFT firms, which provide the bulk of liquidity in modern markets, explicitly avoid market-maker designation to escape these obligations.
"We've created a system where the firms providing the most liquidity have the fewest obligations, while those with obligations provide minimal liquidity. It's a regulatory paradox that defies easy solution."
This structural mismatch became evident during the March 2020 COVID volatility spike. Traditional market makers largely withdrew, while HFT firms continued trading—but only in the most liquid names. Small-cap stocks experienced spreads widening to 8-12% of market price, effectively freezing price discovery for smaller companies.
America's Regulatory Paralysis: The Reg AT Debacle
In the United States, the Commodity Futures Trading Commission's attempt at algorithmic regulation followed a dramatically different path, and reached a dead end. Proposed in November 2015, Regulation Automated Trading (Reg AT) would have required algorithmic traders with direct market access to register with the CFTC, maintain detailed algorithm source code for inspection, and implement mandatory kill switches.
The proposal faced immediate, ferocious pushback from the industry. The question of whether regulators should have access to proprietary trading algorithms touched a nerve: firms argued that source code represents their core intellectual property, and that examiners with access could leak trade secrets or be recruited by competitors.
By 2020, Reg AT was effectively dead: the CFTC formally withdrew the proposal, citing the industry's concerns about competitive harm. On the securities side, the SEC pivoted to a more modest approach, the Consolidated Audit Trail (CAT), which tracks all order lifecycle events but does not regulate algorithmic strategies themselves.
[!NOTE] The CAT system, fully operational as of 2024, generates approximately 58 billion records daily. Yet the SEC has repeatedly requested budget increases simply to store and analyze this data—raising questions about whether surveillance without substantive regulatory teeth serves any purpose beyond bureaucratic box-checking.
The practical result is that American algorithmic trading operates in a largely self-regulated environment. Exchanges set their own rules for algorithmic access, creating a patchwork where firms can essentially shop for the most permissive regulatory regime.
The Tobin Tax Wars: What Happens When You Tax Algorithms
The financial transaction tax (FTT)—often called the Tobin Tax after economist James Tobin's 1972 proposal—represents the bluntest instrument for algorithmic regulation. The theory is simple: impose a small tax on each transaction, and high-frequency strategies that execute thousands of trades per second become uneconomical.
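The arithmetic behind that theory is straightforward to sketch. The figures below are illustrative assumptions, not market data: a typical HFT market-making strategy captures only a fraction of a basis point per round trip, so even a seemingly tiny per-transaction tax can swamp the gross edge.

```python
# Back-of-the-envelope: does a transaction tax exceed an HFT strategy's
# gross edge per round trip? All figures below are illustrative assumptions.

def net_edge_per_trade(notional, gross_edge_bps, tax_rate):
    """Gross profit per round trip minus tax paid on both legs (buy + sell)."""
    gross = notional * gross_edge_bps / 10_000   # edge expressed in basis points
    tax = 2 * notional * tax_rate                # tax levied on each leg
    return gross - tax

notional = 100_000      # assumed trade size in EUR
gross_edge_bps = 0.5    # assumed half a basis point captured per round trip

for tax_rate in (0.0, 0.0001, 0.002):  # no tax, 1 basis point, France's 0.2%
    net = net_edge_per_trade(notional, gross_edge_bps, tax_rate)
    print(f"tax {tax_rate:.2%}: net {net:+,.2f} EUR per round trip")
```

Under these assumptions, even a one-basis-point tax turns every round trip loss-making, which is why volume migrates to untaxed venues rather than paying the levy.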
France implemented an FTT in 2012, covering equity trades in large-cap French companies at 0.2% of transaction value. A 2022 Banque de France analysis found that trading volume in affected stocks declined by approximately 20% relative to comparable European stocks not subject to the tax. More troubling, bid-ask spreads widened, and market depth decreased—particularly during volatility events.
Italy's 2013 FTT produced similar results, with trading volume migrating to foreign exchanges. When the UK considered extending its stamp duty to derivatives in 2021, the Treasury's own analysis estimated that 70% of derivatives trading volume would relocate offshore within two years.
[!INSIGHT] The fatal flaw in transaction taxes is their assumption that trading activity is fixed. In reality, trading is highly mobile. Tax one jurisdiction, and algorithms simply route orders through untaxed venues. The net result is reduced market quality without reduced algorithmic activity.
Yet proponents argue that reduced trading volume is not inherently negative. EU Parliament estimates suggest that an FTT across all member states could raise €30-35 billion annually while dampening speculative short-term trading. The fundamental debate remains unresolved: is high trading volume a feature of efficient markets, or a bug that enriches HFT firms at the expense of long-term investors?
The Stability vs. Liquidity Tradeoff
Beneath every regulatory debate lies a fundamental tension that no jurisdiction has resolved: the tradeoff between market stability and market liquidity.
High-frequency algorithms provide substantial liquidity benefits. Studies consistently show that bid-ask spreads have declined by 30-50% since HFT became dominant in the early 2000s. For retail investors executing small trades, this means significant cost savings on every transaction. Pension funds and mutual funds benefit similarly when executing routine rebalancing trades.
But this liquidity comes with strings attached. Algorithmic liquidity is provisioned—and withdrawn—at microsecond speed. When market conditions exceed the parameters of trading algorithms, they simultaneously exit, creating liquidity vacuums that amplify price movements. The pattern has repeated across flash crashes in 2010, 2015, and 2019: algorithms provide liquidity when it's least needed and withdraw it when it's most needed.
"Asking HFT firms to provide liquidity during crashes is like asking insurance companies to pay claims during hurricanes while prohibiting premium increases. The economics simply don't work."
The regulatory challenge is that traditional market-making obligations—requiring continuous quotes regardless of conditions—are incompatible with the statistical models underlying algorithmic trading. An HFT firm forced to make markets during a crash would face existential risk; yet a market without such obligations remains vulnerable to cascading liquidations.
Emerging Approaches: Speed Bumps and Auctions
Several exchanges have experimented with structural solutions that don't rely on banning or taxing algorithms. The most prominent are speed bumps—tiny delays (measured in microseconds) applied to incoming orders that reduce the advantage of speed-based strategies.
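Mechanically, much depends on whether the delay is fixed or randomized. The toy model below (all latencies and delay windows are hypothetical) illustrates the distinction: a uniform fixed delay preserves the arrival order of competing orders, while a randomized delay of sufficient size scrambles speed advantages smaller than the delay window.

```python
import random

def arrival_order(latencies_us, delay_fn):
    """Return trader indices sorted by arrival time after an exchange-side delay.
    Each order's arrival = the trader's own latency + the exchange's delay."""
    arrivals = [(lat + delay_fn(), i) for i, lat in enumerate(latencies_us)]
    return [i for _, i in sorted(arrivals)]

# Trader 0 reacts 5 microseconds faster than trader 1 (assumed figures).
latencies = [10.0, 15.0]

# A fixed 350-microsecond bump shifts everyone equally: the faster trader
# still arrives first in every race.
print(arrival_order(latencies, lambda: 350.0))

# A randomized delay (here uniform over 0-3000 us, a hypothetical window)
# drowns out a 5-microsecond edge: each trader wins roughly half the races.
random.seed(0)
wins = sum(arrival_order(latencies, lambda: random.uniform(0, 3000))[0] == 0
           for _ in range(10_000))
print(f"faster trader arrives first in {wins / 10_000:.0%} of races")
```

This is why some venues pair the delay with randomization: a fixed bump alone slows everyone equally without changing who wins the race.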
Canada's TMX Group introduced a randomized speed bump on its TSX Alpha exchange in 2015. The results were telling: institutional order flow increased significantly, with asset managers citing improved execution quality. However, overall trading volume declined as some HFT firms migrated to venues without delays.
A more radical approach involves frequent batch auctions—processing orders in discrete time intervals rather than continuously. A 2023 Cornell University study found that batch auctions implemented in a controlled market environment reduced the profitability of latency arbitrage strategies by over 80% while maintaining overall market liquidity.
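The core idea is that all orders accumulated during an interval clear at a single price, so speed inside the interval confers no priority. A minimal sketch of how one batch might clear (the order data and volume-maximizing rule are illustrative assumptions, not the study's design):

```python
def clear_batch(buys, sells):
    """Clear one batch auction. buys/sells are lists of (limit_price, qty).
    Returns (clearing_price, volume), choosing the candidate price that
    maximizes executable volume; (None, 0) if no trade is possible."""
    candidate_prices = sorted({p for p, _ in buys + sells})
    best = (None, 0)
    for price in candidate_prices:
        demand = sum(q for p, q in buys if p >= price)   # buyers willing at this price
        supply = sum(q for p, q in sells if p <= price)  # sellers willing at this price
        volume = min(demand, supply)
        if volume > best[1]:
            best = (price, volume)
    return best

# Hypothetical orders accumulated during one 100 ms batch interval.
buys  = [(101.0, 300), (100.5, 200), (100.0, 500)]
sells = [(100.0, 400), (100.5, 300), (101.5, 200)]
print(clear_batch(buys, sells))  # → (100.5, 500)
```

Because every order in the interval executes at one price, shaving microseconds off submission time buys nothing, which is consistent with the sharp drop in latency-arbitrage profitability the study reports.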
[!NOTE] The SEC approved the Investors Exchange (IEX) as a national securities exchange in 2016, specifically noting its 350-microsecond speed bump as a legitimate market structure choice. However, IEX captured only 2.5% of US equity volume by 2024—suggesting that the market has not embraced structural speed limitations.
The fundamental question remains unanswered: should regulators mandate structural changes that disadvantage certain trading strategies, or should they allow market participants to choose their preferred trading venues?
Implications: The Path Forward
The regulatory landscape for algorithmic trading reflects a broader truth about technological regulation: controlling complex, rapidly evolving systems through static rules is nearly impossible. The algorithms of 2024 bear little resemblance to those of 2015, let alone 2010. Any regulation drafted today will be obsolete within years, if not months.
This suggests a need for adaptive regulatory frameworks—approaches that define principles and outcomes rather than specific technical requirements. The UK's Financial Conduct Authority has moved in this direction with its "outcome-focused" regulation of algorithmic trading, focusing on market fairness and stability rather than prescribing specific algorithmic behaviors.
The international dimension adds another layer of complexity. Algorithmic trading operates across borders; regulation remains nationally bounded. Until major financial centers coordinate their approaches—a process that has eluded regulators for over a decade—arbitrage opportunities will continue to undermine national regulatory efforts.
Sources: European Securities and Markets Authority (2023 MiFID II Review), SEC Consolidated Audit Trail Reports (2020-2024), Banque de France Financial Stability Review (2022), Journal of Financial Markets (Cornell Batch Auction Study, 2023), London School of Economics Working Papers on Market Microstructure, MIT Sloan School Research Papers.