Facebook's chaotic float is spawning its own chaos, with the shares taking a beating, investors suing, and regulators investigating both the underwriters and NASDAQ, the platform that hosted the float.
Amid allegations that analysts from the investment banks underwriting the float selectively briefed investors on their downgraded revenue and earnings forecasts, even as the size and price of the offering were being raised, there is a lot of finger-pointing going on in the wake of the evaporation of some $US17 billion of the company's initial market valuation.
The actions of the underwriters, led by Morgan Stanley, will be tested by the regulators and the lawsuits. But the embarrassing and costly technology glitches that marred the listing and generated massive confusion on that opening morning are perhaps the most significant issues to emerge from the float, given the inexorable rise of automated and high-frequency trading in global markets.
Unlike the so-called “flash crash” in US markets in 2010, when the interaction between automated algorithmic trading and high-frequency traders during a period of market volatility caused US equity and futures markets to plunge nearly 10 per cent in minutes, the 20-minute delay before Facebook shares began trading, and the hours before investors had their trades and cancelled trades confirmed, appear to have had little to do with high-frequency trading but a lot to do with automated trading.
Earlier this week NASDAQ issued its explanation of what happened. Despite preparing for the float by testing trading volumes of a billion securities under 100 different scenarios, what actually occurred in the minutes leading up to the final pricing of the shares ahead of their listing and initial trading revealed a major shortcoming in the exchange's systems. In that pre-opening period its system matches buy and sell orders to establish the opening price, and it allows continuous trading through that period, enabling traders to add or cancel orders.
After its system had calculated the price for the opening trade, NASDAQ was hit by orders to cancel more trades, altering the order book and forcing the system to recalculate its state. Again, however, before it could “print” the opening trade, more cancellations were received, a process that kept repeating itself in what became a loop.
NASDAQ's CEO, Bob Greifeld, colourfully described the cancellations as “fitting in between raindrops”. The massive volume of Facebook trades added two milliseconds to the time it was taking to process trades, and it was in those two milliseconds – between raindrops – that the cancellations kept pouring in.
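The loop NASDAQ described can be sketched in a few lines of Python. This is purely illustrative – the function and data-structure names are hypothetical, not NASDAQ's actual matching-engine code – but it captures the dynamic: each pricing pass takes a few milliseconds, and if any cancellation lands in that window the book has changed and the price must be recomputed from scratch.

```python
def compute_opening_price(order_book):
    # Hypothetical stand-in for the price calculation the matching
    # engine runs over the accumulated order book.
    buys = sum(qty for side, qty in order_book if side == "buy")
    sells = sum(qty for side, qty in order_book if side == "sell")
    return 42.00 if buys and sells else None  # placeholder price

def print_opening_trade(order_book, incoming_cancels, max_attempts=1000):
    """Illustrates the recalculation loop: the opening trade can only
    be 'printed' once a pricing pass completes with no cancellation
    having slipped in 'between raindrops'."""
    attempts = 0
    while attempts < max_attempts:
        attempts += 1
        price = compute_opening_price(order_book)
        if incoming_cancels:
            # A cancellation arrived while the price was being
            # computed: the book has changed, so loop and recalculate.
            cancelled = incoming_cancels.pop(0)
            order_book.remove(cancelled)
            continue
        return price  # book stable: the opening trade can be printed
    return None  # still looping, as NASDAQ's system was
```

With a steady stream of cancellations arriving faster than the pricing pass completes, `print_opening_trade` never reaches the stable case – which is, in essence, the loop the exchange found itself in.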
Eventually NASDAQ switched to another version of its matching engine and was able to manually complete the pre-listing process and print the opening trade.
In the process, however, its systems effectively “lost” 20 minutes' worth of activity: orders for about 30 million shares, and cancellations within that 20-minute window, weren't “distributed” by its system, and it took several hours before traders knew whether their trades had occurred and at what prices. NASDAQ thinks about half of those trades weren't executed at the opening price. It may have to make good any losses.
The embarrassing and potentially costly glitch in its systems while trying to manage one of the more high-profile listings in its history wasn't due to high-frequency trading, which has been blamed for aberrant trading in the past. High-frequency traders need stocks to be trading before they can piggyback on market activity. When the problem involves milliseconds and cancelled orders, however, it presumably does relate to automated and algorithmic trading. Modern exchanges woo high-frequency traders and the investment banks that dominate automated trading because of their trading volumes, their claimed beneficial impact on market liquidity, and the exorbitant prices the exchanges can charge for co-locating the traders' servers right next to their own – a prerequisite when trading occurs in milliseconds.
NASDAQ has altered its policy and will no longer accept cancellations after the pre-listing process enters its final calculation phase and before the initial trade has been printed. The question is why it allowed such late modifications in the first place, with some suggesting it was pandering to its most valuable customers.
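NASDAQ's remedy amounts to freezing the order book once the final calculation begins. A minimal sketch of that policy change (again with hypothetical names, not the exchange's actual code):

```python
class OpeningCross:
    """Hypothetical sketch of a pre-listing cross that, per NASDAQ's
    revised policy, rejects changes once the final price calculation
    has begun."""

    def __init__(self):
        self.orders = []
        self.final_calculation_started = False

    def add_order(self, order):
        if self.final_calculation_started:
            return False  # book is frozen; order rejected
        self.orders.append(order)
        return True

    def cancel_order(self, order):
        if self.final_calculation_started:
            return False  # revised policy: no late cancellations
        if order in self.orders:
            self.orders.remove(order)
            return True
        return False

    def begin_final_calculation(self):
        # Under the old behaviour, cancellations were still accepted
        # past this point – which is what produced the loop.
        self.final_calculation_started = True
```

Once `begin_final_calculation` has run, the book can no longer change under the pricing pass, so the opening trade prints on the first attempt.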
The 2010 flash crash and the Facebook float highlight what can go wrong when most trading is automated and managed by algorithms that may not behave as envisaged when confronted with novel circumstances, while running on platforms that are themselves highly complex and software-driven. Add the connectedness of electronic markets and the ability to hedge and arbitrage between physical and derivatives markets, and between transparent exchanges and opaque “dark pools”, all in microseconds, and it's a wonder things don't go spectacularly wrong more often.