I actually co-wrote a paper about this at the time, and it's very rare I get a chance to talk more about it! https://jot.pm-research.com/content/13/3/5 (sadly it's paywalled)
Periodic auctions still need tie breakers. CBOE for instance falls back to size then time. This is the same tie breaker that some CME futures contracts have used in a continuous order book. Those contracts always had more gamesmanship than standard price/time contracts when I was trading.
Has that become true in the auction space as well?
When they were first introduced, each of the FBA books had slightly different mechanics (matching priority, timing, price determination), so each book needed a slightly different approach. I guess you could see it as gamesmanship, but in equity markets dealing with market mechanics properly is just part of the job.
Yeah it's a shame. I don't own the IP, my previous employer does, and I think when they got it published in the journal of trading they had to agree to some level of exclusivity which meant it can't just be uploaded elsewhere.
I just linked to it in case someone here had a subscription.
I was working in HFT as a dev team lead around the time of this article (2015).
I remember this was being seriously considered by one of our target exchanges (can't remember if it was Eurex or Globex).
Our main HFT trader didn't seem worried - he said that the race would just change from a race to pick off an opportunity into a race to align with any auction timeframe.
Back then, our strategies were implemented in FPGA so our response to events could be timed very accurately. Even randomly-timed rolling auctions wouldn't have posed any challenges.
Probably explains why this idea never ended up being implemented by any of the major exchanges.
Haha yeah! Though times for these auctions are double-digit milliseconds, a lifetime for your fpga strategies!
And it's still fairly niche, these are complementing CLOBs/dark pools rather than replacing them.
Where did you move to from hft?
Moved into a small company that does process control (SCADA) systems development. Took a fairly large drop in salary but the work/life balance improved and job satisfaction increased.
I'd previously done a lot of work in embedded SCADA systems (hence the fit for working with FPGAs in HFT).
I left mainly because I genuinely felt there was a certain futility to ultra-low-latency trading... it's less about trading and more an arms race between quants and techies at different companies, all with deep pockets.
I guess embedded SCADA systems are my comfort blanket :-)
I'm merely an observer, but it feels, intuitively, like HFT was great when things were fairly predictable -- when the major indices and individual names moved at most 2% on any given day. Now, in the post-COVID era -- starting with the DPZ spike, and you could argue the original TSLA call-buying spree -- everything is in shambles. A lot of HFTs I know of suffered serious losses...
Could you explain, indirectly or without identifiers, why now is different than back then?
It's kind of an open secret, but retail traders hopping onto meme stocks like DPZ and TSLA is, counterintuitively to an outsider, actually very profitable for HFTs and market makers.
A good example might be - imagine you are a car dealership, so serving as a rough approximation of a market maker. What kind of entities do you want to trade against? Other car dealerships (informed counterparties), or your average suburban minivan owner (uninformed counterparties)?
It's immediately obvious - the rationale is that when you trade against uninformed order flow, your measure of adverse selection is far lower than if you trade against informed order flow. Your average suburban minivan owner is going to be more time-sensitive and price-insensitive than another car dealership who is willing to look high and low for better deals.
Adverse selection, in this context, means that of the orders you're offering to the market, the subset most likely to immediately lose you money is the one that gets selected. From the perspective of your counterparty, they will only lift your offer if they think it will make them an immediate unrealized profit. Keeping track of your adverse selection is an extremely important part of HFT - in fact, HFTers will try to identify informed vs uninformed order flow and only trade against the latter, to reduce immediate unrealized losses due to adverse selection.
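To make "keeping track of your adverse selection" concrete, here's a minimal sketch of the kind of markout calculation involved: compare each fill price to the market midpoint a short time later, signed by which side you were on. The field names, horizon, and numbers are invented for illustration, not any firm's actual methodology.

```python
# Toy markout calculation for measuring adverse selection on passive fills.
# All field names, horizons, and numbers are illustrative only.
from dataclasses import dataclass

@dataclass
class Fill:
    side: str        # "buy" if our bid was hit, "sell" if our offer was lifted
    price: float     # fill price
    mid_later: float # market midpoint a short horizon (e.g. 1s) after the fill

def markout(fill: Fill) -> float:
    """Positive markout = the market moved in our favour after the fill;
    persistently negative markouts indicate adverse selection."""
    if fill.side == "buy":
        return fill.mid_later - fill.price
    return fill.price - fill.mid_later

fills = [
    Fill("sell", 100.02, 100.05),  # sold just before the market ticked up: adverse
    Fill("buy",  99.98, 100.00),   # bought and the market ticked up: favourable
]
avg = sum(markout(f) for f in fills) / len(fills)
print(f"average markout: {avg:+.4f}")
```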
This is why PFOF (payment for order flow) exists. It's because companies like Virtu think that traders on RobinHood have no clue what they're doing, and they [Virtu] can come in and eat all the alpha. Virtu doesn't frontrun RH orderflow - instead, they get what's called "first look" at the flow. They get to decide to either immediately fill the offer, or let it hit the real market. From the perspective of a RH user, this is really no harm, because whether Virtu trades against you, or your offer gets lifted against the broader market, doesn't really matter to you.
So my inner self says that these are red herrings, or false flags set by 'real players' to lead the stampede into your living room.
It feels like there's never been a better 'cover' than r/wallstreetbets for firms with real capital to put material volume behind names that would in other scenarios be extremely suspect, or near manipulation. Now you can say, 'see, retail said they like AMD, AMC, etc... (THE STOCK)' because, yes, some random user with karma can now be the input to your 'algo' to move $name_of_stock, or generally, SPY calls / 'poots' in size wildly impossible otherwise.
Humbly, I think it's brilliant
[Edit] to put this comment in context of my response to another poster, I believe the macro can be independent of the micro (aka many little names can get blown up while your main indices maintain some sense of normalcy).
The more retail traders, the more money that institutional traders can make! Institutional traders love the uninformed flow. It's no fun trading purely against other algorithms (and also not nearly as profitable)
I'll just chime in and say I believe you are misinformed and wrong on this issue. I can say definitively that my firm, which engages in HFT, and all the other quant/HFT firms I know of have decided to stay clear of the meme stocks, i.e. GME, BB, AMC, and a few others, and I am not aware of any open secret that HFT firms are sneakily taking advantage of this situation.
Your statement that HFT firms think that retail traders have no clue what they're doing is a complete misrepresentation of the intention behind PFOF. It's not at all that we think retail traders are idiots, it's that retail orders are usually not coordinated and sustained activities the same way that institutional orders are. If an institution is buying and I sell into it, it is quite likely that the institution will continue buying more and more over a long period of time which increases the duration of my exposure to that institution's order flow. Furthermore it's unlikely that institutional order flow on the continuous market will balance out with other institutional order flow, since in situations where such an opportunity exists, brokers for said institutions will arrange for a block trade or use auctions instead of the continuous market. So trading against an institution means assuming exposure for an extended period of time.
With retail orders, usually a trader buys with a few orders in a way that's typically uncoordinated with other orders and that's it. I don't need to be worried that if I sell to a retail trader that a whole bunch of further traders will follow behind them in the same direction, increasing my exposure.
This is not to say that institutions know what they're doing and retail traders don't, or vice-versa. An institution may have no idea what they're doing and pissing their money away and I still won't want to trade against it simply because as an HFT firm my goal is to lock in a spread as quickly as possible as opposed to speculate on the long term prospects of a company. If anything, to the extent that there is an open secret in this industry, it's that institutions don't perform much better or have much of an advantage over anyone else. That said even if they did it wouldn't matter one way or another, what I care about is that the order flow that I am trading against can balance out over a short period of time so that I can lock in the spread.
It is precisely because retail traders are behaving in a coordinated manner on meme stocks that my firm and all the other ones I know about are not participating in them. Retail flow on meme stocks is often coordinated, at least implicitly, so as an HFT firm you risk holding a significant position for a long time, which is not ideal.
That said the market is very big and the meme stocks constitute but a tiny fraction of a fraction of the activity. It's not a particularly big deal one way or another.
Having worked in HFT for many years now, this absolutely rings true. I would add, backtesting a meme stock strategy sounds like an exercise in overfitting. There are so few previous examples to look at. And all for something that rarely happens, relative to how many stocks are out there and how many trading days there are in a year.
I’m not sure about delta one firms but almost all the options MM firms have been having record years in the COVID / meme stock era.
In broad strokes, the things that hurt market makers the most are long, sustained price trends and accumulation of inventory. So generally MMs can and often will eat large initial losses (depending on how many wings they happened to own at the time) when huge volatility spikes happen. But when the raised volatility stays at that level for some amount of time (you'll sometimes hear this referred to as a market "regime") and the MM was able to not blow out from the initial spike, they'll more than make up their losses from the good trading environment after the fact.
Market makers as a whole were suffering during the mid 2010s when volatility was low year to year, correlation with SPY was high, and all the indices basically just went straight up every month.
The way I see it, HFT firms provide liquidity to the market, which is good. They do so in an automatic fashion, which makes it cheaper than the past system of human traders. But they also run a speed competition which is mostly wasteful. There may be some benefit to the overall market from faster communications, but it is pretty low.
All systems have waste, some more and some less. This is unavoidable. So the discussion might be more productive if it was framed like this: which system provides more benefits with less waste? Would frequent batch auctions lead to fewer resources being spent on wasteful racing, and more resources performing useful services for other market participants?
Or if we zoom out even more, the two main purposes of the market are allocating capital efficiently to businesses, and redistributing money from the working population to retired people (401k, IRA). And we can ask which market structure makes it better at these tasks.
Framed like this, it becomes natural to look at the other side of the equation. Instead of asking which structure would screw over HFTs the most, we can ask which structure would be most convenient for, say, index funds or other mutual funds. And which structure would be most convenient for the individual stock picker. We could even start to ask which structure would help HFTs provide more liquidity with less risk.
How do HFTs add valuable liquidity to the market? Does the 500ns faster transaction time for a block of AMZN matter to literally anyone? Other than the two sides of the trade who lost some money to the HFT who MITM'd them.
When someone buys liquidity, they don't do so to close their order 500ns faster. They do it to ensure they can trade at the current market price because they don't want to take the risk that the market will move away from them while waiting for a counter-party to trade with.
Those that are comfortable taking this risk can simply issue a LIMIT order instead of a MARKET order.
So your contention is that HFTs make prices more stable? That they somehow assume risk and that justifies their profits. How would that work? I thought HFTers only got involved in between two parties when they knew they could make a profit. That's why HFTs don't have days when they lose money.
Faster transaction times result in tighter spreads, as HFT firms compete with each other on the price-time priority queue. They compete against each other on time, and while yes, that's a zero-sum game, the end result for the broader market is useful.
An analogy might be something like Uber and Lyft competing with each other for clients and drivers. From the perspective of everyone else, it doesn't matter much if they ride Uber or they ride Lyft. But the adversarial games that they play against each other [Uber and Lyft] are beneficial to both riders and drivers. Perhaps a duopoly isn't the best example, so you may extrapolate this to any industry where there's a sufficient amount of participants to keep things competitive.
But the spread in many, if not most, stocks is limited by the sub-penny rule (SEC rules, as of 2005, say you can't have a spread < $0.01) rather than by the supply of market makers in that stock. Extra competition in those markets is negative-sum.
The SEC rule is that one may not quote a spread of less than $0.01, but that does not mean one cannot trade with a spread of less than $0.01. There are several well-known workarounds, the simplest being midpoint-pegged orders, which allow spreads down to half a penny. On top of that, U.S. exchanges offer a variety of different fee combinations, including negative fees (rebates), which can be used to reduce the effective cost even further. All HFT firms take advantage of these fees; furthermore there are liquidity enhancing programs offered by the major exchanges as well as by ETFs. These can all be used to reduce the effective spread of a stock below the 1 cent limit.
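As a rough illustration of how a midpoint peg plus fees/rebates can push the effective cost per share below the quoted one-cent spread (all numbers here are invented and are not any venue's actual fee schedule):

```python
# Illustrative only: a midpoint-pegged fill plus a rebate can make the effective
# spread paid smaller than the one-cent quoted minimum. Numbers are made up.
bid, ask = 100.00, 100.01          # quoted market, one tick (1 cent) wide
mid = (bid + ask) / 2              # 100.005: a midpoint peg trades here
taker_fee = 0.0030                 # hypothetical per-share access fee
rebate = -0.0020                   # hypothetical rebate, i.e. a negative fee

quoted_half_spread = (ask - bid) / 2                 # 0.005 per share
effective_cost_lit = quoted_half_spread + taker_fee  # crossing the lit spread
effective_cost_mid = 0.0 + rebate                    # midpoint fill pays no half-spread

print(f"lit fill cost/share:      {effective_cost_lit:+.4f}")
print(f"midpoint fill cost/share: {effective_cost_mid:+.4f}")
```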
The main reason they compete on speed is because it's illegal to compete on price. Abolish the sub-penny rule and it'd go back to being boring market plumbing.
This isn't really a good take: many markets do not trade one tick wide in the first place and people are still competing on speed. Delta neutral trading is a zero sum game and no matter what rules you put in place it will still be incredibly cut-throat.
Moreover at this point speed is a commodity, if you're willing to shell out cash, you can get access to top tier infra right out of the gate. The real game is not how fast you are (though obviously that's important too), but how smart you can be while maintaining good tick-to-trade latency.
Trading has never been a vanilla/boring business and it likely never will be either.
> many markets do not trade one tick wide in the first place and people are still competing on speed
True but not something that I find compelling. If you can only compete on price in penny increments, then you'd have to be hugely more confident to undercut a 2c spread down to 1c; if you could offer a 1.8c spread by taking a little more time over your calculations, that would change things.
> Delta neutral trading is a zero sum game and no matter what rules you put in place it will still be incredibly cut-throat.
I mean yes, to the extent that there's profit in it at all. But the profits have already been shrinking year-on-year. Plenty of mature industries like supermarkets are utterly cut-throat, but don't bother regular people.
> Moreover at this point speed is a commodity, if you're willing to shell out cash, you can get access to top tier infra right out of the gate. The real game is not how fast you are (though obviously that's important too), but how smart you can be while maintaining good tick-to-trade latency.
Well, it's the same thing, like the project management triangle - you can always trade quality for speed and vice versa, the hard part is when you want to improve both. But I do agree that at this point a lot more of it is known quantities and techniques.
> Trading has never been a vanilla/boring business and it likely never will be either.
My sense is that it's no longer where the best and the brightest go (and as I said before, profits are shrinking a lot). More and more of it is commodified. Which is what we should expect from any industry, honestly - at some point things are new and exciting and profitable, then they become mature and less so.
Swap to an auction batch model and they don't provide liquidity; they simply don't have significant stakes relative to the number of daily transactions. Essentially they're an outgrowth of all trades needing to be instantaneous, which lets them reuse the same capital thousands of times per day.
Add to that the fact that HFTs are profitable, and they must therefore provide negative economic value. Either the seller or the buyer is failing to capture value.
Nonsense. Trade can be positive sum in utility to both parties. One party making profits doesn't imply anything about how utility is changing on the other side of the trade. If you buy an iPhone, Apple have gotten richer but you have also gained something (utility).
Your understanding of HFT is wrong as well. It's not all low latency arbitrage. It's also execution finesse, risk management and ML-heavy. HFT firms will still be extremely active under this new market structure.
High turnover rates are part of the definition of high frequency trading. If you're holding positions for 30 minutes you're not doing HFT. Ultimate buyers and sellers aren't gaining anything on the timescales we are talking about.
I don't know why you're bringing this up because it's not relevant, nor is it accurate. You can have high turnover rates in a batch auction strategy as well as a continuous trading strategy; this market structure change won't change that. And you can have an HFT strat with low turnover, e.g. in a large-tick name that barely moves, where you might get five latency-sensitive trades per day.
Now it's true an individual stock may only see 5 trades per day from an HFT algo, but there's more than just one stock. The larger the pool of money sitting around waiting for those 5 trades, the lower your ROI. So the obvious strategy is to reuse the same pool of money to back multiple different strategies.
A HFT that's trading bond futures may only have a handful of trades per day and hold for a long time because it's hard to liquidate effectively. Sometimes they just range all day.
Anyway my point is that it's wrong to think that HFT are going out of business with this change because it's a fundamental misunderstanding of HFT. There's almost always an ML component and always an execution component and these two skills are going to be critical to profiting off the new market structure. Citadel, Jump, Tower, you name it. I promise you they will be all over this new structure.
I completely agree, and they are going to use the same tools. The question is if this change is a net positive trade for the economy, and that I don’t know but I have heard reasonable arguments in favor.
The idea is that HFTs provide value by tightening spreads.
The slower a market maker is, the more risk they take on when they quote, because they are more likely to be caught by market moves - less likely to cancel their quote when the market starts moving, less likely to be able to hedge if they get filled at the start of a move. To make up for that risk, they have to earn more per trade. The only way to do that is to quote a wider spread [1]. That means that real money participants end up paying more when they cross that spread.
The value captured by HFTs has not come from real money participants, but from other, slower, market makers, and they have shared that value with real money participants.
[1] Or to demand a bigger stipend, or steeper maker-taker pricing, from the exchange, either of which means bigger fees for other participants.
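As a toy illustration of that risk/spread trade-off (a deliberately crude break-even model with made-up numbers, not how any desk actually prices quotes): if a quote earns the half-spread on benign fills but loses roughly the size of the move when it's picked off, the break-even half-spread grows quickly with the pick-off probability, which is exactly what being slower increases.

```python
# Toy break-even model (all numbers invented): a quote earns the half-spread on
# "benign" fills but loses roughly the size of the move when it's picked off.
# A slower market maker gets picked off more often, so it must quote wider.
def breakeven_half_spread(pickoff_prob: float, avg_adverse_move: float) -> float:
    """Half-spread s such that (1 - p) * s - p * avg_adverse_move = 0."""
    return pickoff_prob * avg_adverse_move / (1.0 - pickoff_prob)

for p in (0.02, 0.05, 0.10):  # pick-off probability rises as the quoter gets slower
    s = breakeven_half_spread(p, avg_adverse_move=0.05)
    print(f"pick-off prob {p:.0%} -> break-even half-spread {s:.4f}")
```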
It's not always that straightforward. In the HFT space you're essentially always responding to one event: a trade, an order, a cancellation, etc. Events are generally spread far enough apart that you can consider them independently (obviously not always, but often). So it doesn't matter how complicated your model is; if you can pose the question "what would you do if X happened" to your model, then you can prepare your FPGA with "if X happens, do Y". As long as your universe of possibilities is tractable (which it often is), then an arbitrary model can have all latency-sensitive events offloaded to the FPGA.
The part of the strategy that lives on the FPGA has to be fairly simple, but that doesn't have to be the only part of the strategy. You can do all sorts of deep and meaningful computation on a real computer, then export some settings for a simple event-driven model to the FPGA, updated many times a second.
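A minimal sketch of that split (purely illustrative, and obviously not in Python in real life): a slower process re-runs a richer model every so often and exports a flat event-to-action table, while the latency-critical path, standing in here for the FPGA logic, is nothing more than a lookup.

```python
# Illustrative split between a slow model and a precomputed fast-path table.
# In a real system the "fast path" would be FPGA logic; the dict lookup here
# just stands in for "if X happens, do Y" with all decisions made ahead of time.
from typing import Dict, Tuple

Event = Tuple[str, int]   # e.g. ("best_ask_cancelled", 0) -- names are made up
Action = str              # e.g. "cancel_our_bid", "send_ioc_buy"

def slow_model_recompute(position: int, volatility: float) -> Dict[Event, Action]:
    """Runs every few milliseconds on a normal CPU; enumerates the tractable
    universe of events and decides a response for each."""
    table: Dict[Event, Action] = {}
    table[("trade_through_our_bid", 0)] = "cancel_our_bid"
    if abs(position) < 500 and volatility < 0.02:
        table[("best_ask_cancelled", 0)] = "send_ioc_buy"
    else:
        table[("best_ask_cancelled", 0)] = "do_nothing"
    return table

def fast_path(table: Dict[Event, Action], event: Event) -> Action:
    """The latency-critical step: no modelling, just 'if X then Y'."""
    return table.get(event, "do_nothing")

table = slow_model_recompute(position=200, volatility=0.01)
print(fast_path(table, ("best_ask_cancelled", 0)))   # -> send_ioc_buy
```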
This is interesting. May I ask how you do it from a software point of view? As for CPU and memory, I'm sure it's possible, but I'm more interested in how you connect to the exchange/broker.
Contiki, SLIP, PPP, or maybe some Ethernet expansion?
If you're a market maker, you really, really want to be able to do low-latency trading in order to hedge fills before the market moves against you. If market makers can't do this, they will make worse markets - show less size and wider prices, or just get out of the game. How do you do this under continuous batch auctions?
I have an underdeveloped idea that what we really need is limit order types with built-in hedging. "Bid to buy 100 gizmos at 30c each, and for every five gizmos bought, immediately offer to sell 1 widget at $1.20; cancel this order if the best offer for widgets moves below that price" sort of thing. Basically, you're moving the simple reasoning that has to be executed at low latency from the market maker's FPGA to the exchange's matching engine.
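Here's a purely hypothetical sketch of what such an order could look like as a data structure for a matching engine to act on. No exchange offers exactly this today, and every field name below is invented.

```python
# Hypothetical "limit order with built-in hedge" expressed as data the matching
# engine could act on. No venue offers exactly this; all field names are invented.
from dataclasses import dataclass

@dataclass
class HedgeLeg:
    symbol: str
    side: str            # "buy" or "sell"
    ratio: float         # hedge quantity per unit filled on the parent order
    limit_price: float
    cancel_parent_if_beyond: float  # pull the parent if the hedge market moves past this

@dataclass
class HedgedLimitOrder:
    symbol: str
    side: str
    qty: int
    price: float
    hedge: HedgeLeg

order = HedgedLimitOrder(
    symbol="GIZMO", side="buy", qty=100, price=0.30,
    hedge=HedgeLeg(symbol="WIDGET", side="sell", ratio=0.2,   # 1 widget per 5 gizmos
                   limit_price=1.20, cancel_parent_if_beyond=1.20),
)

def on_parent_fill(order: HedgedLimitOrder, filled_qty: int):
    """What the matching engine would emit on a (partial) fill of the parent."""
    hedge_qty = filled_qty * order.hedge.ratio
    return (order.hedge.side, order.hedge.symbol, hedge_qty, order.hedge.limit_price)

print(on_parent_fill(order, filled_qty=5))   # -> ('sell', 'WIDGET', 1.0, 1.2)
```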
Sometimes, you can do this by putting orders in spreads, but only where a spread exists (or can be defined) for the two legs you care about, in the right ratio.
You might also want to do more complicated things, like pulling an order in one product if another product moves a lot, because you think that presages a move in the product you're quoting.
The idea would be, firstly, to make it much easier to make markets without having to invest in low-latency infrastructure, broadening the base of participants who can do it, and secondly, to reduce the negative impact of speed-blunting interventions like continuous batch auctions or speed bumps.
The hedging scenario you describe is one of the hallmarks of combinatorial auctions[0], which let participants enter bids on packages. (Disclaimer, I'm a founder at OneChronos which is applying these auctions to US equities.) So a market maker can express something like: "fill me for any package that includes `x` gizmos AND `k * x` anti-gizmos simultaneously".
The more powerful and general version of this is: "Buy and sell any mix of products, subject to the total package being neutral across these 10 risk factors I care about."
> You might also want to do more complicated things, like pulling an order in one product if another product moves a lot
This is a key problem in US equities or any market with similar fragmentation. The way we're approaching that is to allow those package bids to also include constraints on "current" market conditions at the moment of the auction. A simple one would be "if the momentary spread between asset A and B is greater than X, don't trade."
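To make that a bit more tangible, here's a toy representation of a package bid with a momentary-market-condition constraint. This is my own illustrative sketch, not OneChronos's actual bidding language.

```python
# Toy combinatorial-auction package bid with a constraint on market conditions
# at call time. Illustrative sketch only; not any venue's real bid format.
from dataclasses import dataclass, field

@dataclass
class PackageBid:
    legs: dict                      # symbol -> signed quantity (+buy / -sell)
    limit_net_price: float          # max net cash paid for the whole package
    max_spread_constraints: dict = field(default_factory=dict)  # symbol -> max spread

def bid_is_eligible(bid: PackageBid, spreads: dict) -> bool:
    """Check the bid's momentary-market-condition constraints at auction call time."""
    return all(spreads.get(sym, float("inf")) <= cap
               for sym, cap in bid.max_spread_constraints.items())

bid = PackageBid(
    legs={"GIZMO": +100, "ANTI_GIZMO": -300},        # x gizmos and k*x anti-gizmos, k = 3
    limit_net_price=12.50,
    max_spread_constraints={"GIZMO": 0.02},          # don't trade if GIZMO's spread > 2c
)
print(bid_is_eligible(bid, spreads={"GIZMO": 0.01, "ANTI_GIZMO": 0.05}))  # True
```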
Putting what amounts to trading bot logic into the orders themselves seems like a bad idea for scalability. Is all order logic visible to all market participants? If so, then everyone has to run their own local market logic resolver to determine actual liquidity. If not, then true market liquidity is now opaque.
If anyone else is faster than you, then true market liquidity is already opaque. Indeed, this is one of the criticisms of HFT - that the liquidity they present is "fake", "ghost", "phantom" liquidity, because it will be pulled at the slightest provocation.
Existing mechanisms which obscure liquidity are iceberg orders, market maker protection [1] [2] [3], and various kinds of non-displayed orders [4] [5] which I confess I am not very familiar with.
I think this illustrates that exchanges are sometimes willing to sacrifice a little transparency in order to encourage more liquidity provision. This is a fundamental axis of market design. At one end are classic lit exchanges, at the other end is OTC dealing, and there are all sorts of shades of grey in between. Which is most appropriate will depend on the specific balance of participants and activity in the market in question.
While I was at JPMorgan I actually spent some time thinking about alternative auction structures (vs the order book model). The current trading model is ultimately a mechanization of the rules from when trading happened on a literal trading floor, and a lot of the structural issues stem from those rules treating time as having infinite resolution and the speed of information propagation (the speed of light) as instantaneous.
There’s some interesting details about how large trades are done today that could perhaps be better reflected into some element of auction design.
Worth studying the multiple mechanisms that drive the Tokyo Stock Exchange. Itayose vs Zaraba - and the role of the saitori. Here's an interesting paper from 1991:
So I think there's a huge design space, and I think it partially turns into a "mechanism design" challenge to articulate a landscape of transaction / market auction mechanisms that:
1) incentivize maximizing market liquidity
2) recognize the speed of light is finite, and have that inform the minimal time scale matching can happen on.
3) obviate/remove the need to obscure large trades as a large number of smaller trades (which is half the value of so-called algorithmic trading strategies to institutional investors). This could be via having one design constraint on auctions be that the market impact of the sum of the small trades should be equivalent to that of the single large trade (ignoring the issue of the exogenous information that there was a large trade).
Some interesting knock-on consequences of these ideas are the following:
1) the larger the time scale you're willing to wait for the trade to be matched to "the other side", the cheaper it should be to trade! (creating liquidity is valuable! -- see the toy sketch after this list)
2) if you're willing to allow your trade to be "partially matched" instead of all or nothing, that too creates liquidity.
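As a toy illustration of point 1 (entirely made-up numbers, just to show the shape of the idea), a venue could charge a fee that decays with how long a resting order is willing to wait to be matched:

```python
# Toy illustration (numbers invented): the longer an order is willing to wait
# to be matched, the lower the fee it pays, rewarding liquidity provision.
import math

def fee_per_share(base_fee: float, patience_seconds: float, half_life: float = 30.0) -> float:
    """Fee halves every `half_life` seconds of declared patience."""
    return base_fee * math.exp(-patience_seconds * math.log(2) / half_life)

for t in (0, 30, 120):
    print(f"willing to wait {t:>3}s -> fee {fee_per_share(0.0030, t):.5f}/share")
```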
The point being, my perspective is sort of: start with "what are all the complications of how people do large/complicated trades today that should just be trivial with the right auction?" That's glossing over a lot of complexity and other concerns, but those are some high notes.
That said, this is just the tip of the iceberg. These sorts of market design questions are genuinely understudied in my mind, and I could easily spend hours talking about this in greater depth over coffee or such.
Couldn't agree more on the mechanism design front. There are so many dimensions to the problem - especially for US equities - that the design becomes quite nuanced and tradeoffs have to be made somewhere or other. The duration aspect of waiting for trades to be matched (and for liquidity to accumulate) is an especially challenging one in fast financial markets.
I'm curious if you've come across OneChronos (upcoming US equities ATS - I'm a founder there along with @lpage; disclaimer). A lot of what you describe is baked in as a goal of our auction design. Most importantly, drawing out liquidity by incentivizing truthful bidding and allowing people to encode things like substitutability. We try to do that by giving people the tools to express their full intent to the venue[1]. Don't wanna get any more salesy than I already have here, but I always like to get points of view from other practitioners.
There are severe technological consequences for pushing for synthetic discrete time. Exchanges that currently execute in a serialized fashion may no longer be able to support the trading volume if the underlying platform is unable to develop batch sizes that naturally align with hardware capabilities and timings.
Put differently, I think what is going to happen is you will start stacking way more orders at each interval than you can process before the next because the wonderful CPU pipelining effects get wrecked each time you hit an arbitrary time slice boundary. I suppose you could intentionally spin the CPU instead of yielding back to the OS during these delays, but that means you are not able to process any orders that are currently arriving, so your tail ends up growing longer and longer.
I would be very worried if the machines actually executing orders today were anywhere close to 100% load on average, because of the well-known issue where tail latency explodes as you get closer to 100% load. So I doubt the relevance of everything you wrote there.
Batched auctions require different algorithms, sure. They may even be more expensive to execute. I suppose you have to sort the batch once instead of sorting as you go. Maybe that makes it O(n log n) instead of O(n)? Can you keep a traditional order book up-to-date in O(1) per transaction? Either way, seems like this should be a non-issue. Even if exchanges need to add more shards for order processing, that's just not a big deal.
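For what it's worth, the core clearing step for one batch is short. Here's a minimal sketch of a uniform-price cross (ignoring tie-breaking, priority rules, partial-fill allocation, and venue-specific price determination), where the two sorts dominate at O(n log n):

```python
# Minimal uniform-price clearing for one batch of limit orders. Ignores
# tie-breaking, time/size priority and allocation of partial fills; purely a
# sketch of the O(n log n) sort-and-cross step.
def clear_batch(bids, asks):
    """bids/asks: lists of (price, qty). Returns (clearing_price, matched_qty)."""
    bids = sorted(bids, key=lambda o: -o[0])   # highest bid first
    asks = sorted(asks, key=lambda o: o[0])    # lowest ask first
    i = j = 0
    bid_rem = ask_rem = 0
    marginal_bid = marginal_ask = None
    matched = 0
    while True:
        if bid_rem == 0:
            if i >= len(bids):
                break
            bid_px, bid_rem = bids[i]; i += 1
        if ask_rem == 0:
            if j >= len(asks):
                break
            ask_px, ask_rem = asks[j]; j += 1
        if bid_px < ask_px:
            break                              # remaining orders no longer cross
        traded = min(bid_rem, ask_rem)
        matched += traded
        bid_rem -= traded
        ask_rem -= traded
        marginal_bid, marginal_ask = bid_px, ask_px  # prices of the last matched pair
    if matched == 0:
        return None, 0
    return round((marginal_bid + marginal_ask) / 2, 4), matched  # one common convention

print(clear_batch(bids=[(10.02, 300), (10.00, 200)],
                  asks=[(9.99, 250), (10.01, 100)]))   # -> (10.015, 300)
```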
This hasn't materialised as a problem (batch auctions are one of the MiFID II venue models) - some EU venues have run this model for around four years now. It's definitely less widely used than other models, but it has a niche.
Why would a premade queue be worse for pipelining than a random queue which you also have to modify as you process it? Seems like a single pause per batch rather than constantly checking whether the work queue has something new put in it.
Is there any mathematical proof that it's harder to game batch auctions than what we have now?
For example, while other markets and the real world moves on, you gain info. So the later in the batch you can submit a trade, the greater your advantage.
See Proposition 6 on pg. 1600. The argument is purely economic and based on a simplified (but first-order reasonable) model of liquidity provisioning and price action. Price-priority uniform clearing price batch auctions (with or without randomized call times) transform competition over speed into competition over price.
Technical/practitioner notes:
1. Gaming is a bit of an overloaded term, and in this context, implies that agents are doing something wrong. Mechanism design assumes that agents will respond rationally and strategically to the mechanism they're presented with, so whatever happens is on the designer. Ideally, the mechanism chosen will result in individual responses that collectively optimize the designer's objective function, e.g., maximizing social welfare or the auctioneer's revenue. Suppose the mechanism chosen isn't the "best" one for a specific set of agents & goods. In that case, agents might have individually rational behaviors that result in sub-optimal outcomes relative to what was achievable with another mechanism.
2. Prop 6 isn't entirely predicated on having a random call time (there will be competition over price as long as there are two "fast" types in the market). However, randomizing auction call times is still practically speaking useful.
The US Markets generally operate multiple sessions: pre open, on-open, regular-market trading, on-close, post-close or after-hours.
Open and closing auctions are batch auctions.
To facilitate liquidity seeking, the major exchanges publish periodic "imbalance" feeds in the minutes and seconds leading up to the auction. As you can imagine, it's gamed in a multitude of ways, including special order types (D-Quotes on the NYSE, anyone?) that only a select few market participants know about or have access to.
The point is, whatever mechanism you choose: over time that mechanism morphs and transforms. New options are provided under the guise of liquidity seeking but really serve to benefit the HFTs.
That was my initial thought as well: this would just become an arms race to submit your trade last? But maybe if the trades were priority queued it would negate that.
Gameability is distinct from whether there is an advantage to low latency. A random batch auction will be gameable, but not so much via speed. Not sure how a math proof would work for that, though.
The ones that are currently live tend to have features that try to negate this (randomised uncrossing times, price collars, order priority based on arrival time or size)
Budish et al. start with the statement that "The high-frequency trading arms race is a symptom of flawed market design" and present a mechanism that mostly addresses a specific feature of the current market structure. In 2021, most folks on all sides of the table (liquidity providers, executing brokers, financial institutions) agree that the arms race is individually and collectively value-destroying. They don't claim that it's the "best mechanism" for capital markets. And every mechanism is a set of tradeoffs, so no one should ever really make that claim. At OneChronos (YC S16) [1], we view the arms race as something that's very much worth solving for, but a tiny piece of a much bigger opportunity to make markets function better for all players.
HFT seems like it should be illegal. The 'idea' of the markets is that it's fair access to all, at least in theory. I understand there is always asymmetry, but things like insider knowledge have been made illegal to try to keep the fairness (or at least attempt to). HFT does what no ordinary person can; it's an unfair advantage. So should it not be banned? I recall there was a company setting up a market with a minimum latency, ensured by all trades going through a spool of miles of fibre optic cable in their office. The idea was to prevent HFT at source, which was cool, but it's a shame wider markets just let HFT slide, and the fairness asymmetry widen.
The exchange you are referring to is IEX [1]. Michael Lewis (author of The Big Short) wrote Flash Boys about the intricacies of low latency trading and this exchange [2]. One of the primary "unfair" aspects of HFT is front running, which has been illegal since before electronic trading (the name comes from traders racing ahead of big buyers in the trading pit). It was a big problem during the advent of electronic trading but has since been tamed. Exchanges arbitraging off their clients' order books is another matter. Trading fast in reaction to real-time (public) events is a market efficiency. Not accessible to the masses, but neither was traveling to Wall Street to place a trade.
I thought part of the problem is that exchanges charge more for faster access (e.g. physical colocation), so the exchanges' profitability depends in part on creating a bidding war between HFT companies?
The ad-tech world worked on Vickrey auctions for quite some time. I've often wondered what things would look like if the financial world worked that way instead.
(Vickrey auctions are pretty much dead now because websites saw that bidders were bidding $X and automatically assumed that, because they weren't getting $X but rather $(X - Y), they were being ripped off)
Have you ever wondered how it is possible that when you buy some shares of SPY, someone out there is somehow able to collect 500 securities to fulfil your order? Even if it's not a literal action-reaction, that is what must be happening at the margin.
Also, if you still don't believe me, then try to find out how to unpack X amount of shares of SPY (for some non-small amount of X) into individual securities. Can you do it yourself, for example? Whom to call, where's the button for that, who can do it?
Around the introduction of MiFID II regulation in 2018, several exchange operators added these frequent auction books.
Cboe's periodic auctions book is the biggest of these by volume: https://www.cboe.com/europe/equities/trading/periodic_auctio...
In addition to Cboe, Turquoise, Goldman Sachs, UBS, Virtu and Aquis also run frequent batch auction venues: https://www.cboe.com/europe/equities/market_share/market/ven...