The Evolution of Algorithmic Trading

 

The use of trading algorithms has grown steadily; by 2013, roughly 70% of US shares were traded algorithmically, according to industry estimates. Algorithmic trading did not appear overnight, however, so it is worth looking back at how it developed.

Algorithmic trading came into existence with the development of the internet in the late 1980s and early 1990s. Computerised high-frequency trading did not, however, become widely used until 1998, when the U.S. Securities and Exchange Commission (SEC) approved electronic exchanges.

The following are the topics we shall examine in this article:

Algorithmic trading: what is it?
What happened to make algo trading possible?
Early in the process
An elaboration and expansion phase
The boom
Analyzing social media and incorporating it into algorithmic trading
The current situation


Algorithmic trading: what is it?

Algorithmic trading executes orders using trading algorithms: automated computer programmes containing trading instructions that take time, price, and volume into account. This kind of trading turns winning trading tactics into computer algorithms that tell the machine when to buy and sell an asset, at what price, and in what volume.

Algorithmic trading, sometimes referred to as automated trading, takes advantage of computers' speed and constant availability to outperform human traders. It uses specialised software to automate the execution of numerous trading techniques and intricate mathematical models. Some of these tactics involve trend following, while others rely on mean reversion.
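As a toy illustration of the mean-reversion idea, here is a minimal Python sketch (an illustrative example, not any firm's actual strategy) that flags a trade when the latest price strays far from its recent average:

```python
import statistics

def zscore_signal(prices, window=5, threshold=1.0):
    """'buy' when price is far below its recent mean, 'sell' when far above."""
    window_prices = prices[-window:]
    mean = statistics.mean(window_prices)
    stdev = statistics.stdev(window_prices)
    if stdev == 0:
        return "hold"  # flat price series, no signal
    z = (prices[-1] - mean) / stdev
    if z > threshold:
        return "sell"  # price stretched above the mean
    if z < -threshold:
        return "buy"   # price stretched below the mean
    return "hold"

# A sharp drop below the recent average produces a buy signal.
print(zscore_signal([100, 101, 100, 99, 94]))  # buy
```

The window length and threshold here are arbitrary choices; a real strategy would calibrate them against historical data.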

They can also be spread-betting or arbitrage methods, which are frequently associated with high-frequency trading (HFT) and are characterised by high order-to-trade ratios and rapid turnover. For these, computer algorithms are set up to make complex judgments and initiate trades based on electronically received information, before human traders can even comprehend the information they see.

Since the turn of the century, algorithmic trading has become more popular among both individual and institutional traders. In fact, a 2019 study revealed that trading algorithms, rather than humans, performed almost 92% of trading on the forex market. Most investment banks, pension funds, mutual funds, and hedge funds use algorithmic trading to spread out the execution of their larger orders.



What happened to make algo trading possible?

Numerous events in the financial markets paved the way for algo trading before it became popular. Some of the more significant ones are as follows:

In 1949, the first rule-based trading fund was introduced: Richard Donchian, an American trader, founded Futures, Inc., a publicly traded commodities fund dealing in the futures markets. The fund was the first to generate real buy and sell trading signals from a set of established rules, using a mathematical framework built on moving averages of commodity prices. With no internet to support it, the founders had to track the markets manually using information from ticker tapes. Thanks to its rule-based structure, this might be viewed as the very first attempt to automate trading.
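Donchian's rule-based approach can be sketched in modern code. The example below is an illustrative moving-average crossover (a toy, not Donchian's actual rules): buy when a short moving average crosses above a long one, sell on the opposite cross.

```python
def moving_average(prices, window):
    """Simple moving average of the last `window` prices."""
    return sum(prices[-window:]) / window

def crossover_signal(prices, short_w=3, long_w=5):
    """Return 'buy', 'sell', or 'hold' based on the latest crossover."""
    if len(prices) < long_w + 1:
        return "hold"  # not enough history yet
    prev_short = moving_average(prices[:-1], short_w)
    prev_long = moving_average(prices[:-1], long_w)
    cur_short = moving_average(prices, short_w)
    cur_long = moving_average(prices, long_w)
    if prev_short <= prev_long and cur_short > cur_long:
        return "buy"   # short average just crossed above the long one
    if prev_short >= prev_long and cur_short < cur_long:
        return "sell"  # short average just crossed below the long one
    return "hold"

# A rebound after a decline pushes the short average above the long one.
print(crossover_signal([105, 104, 103, 102, 101, 100, 104, 110]))  # buy
```

Donchian's team computed these averages by hand from ticker-tape data; the logic itself is simple enough that a computer, once available, could run it automatically.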

Harry Max Markowitz developed the Markowitz Model in 1950 to address the problem of portfolio selection. The model served as the foundation for modern portfolio theory (MPT), which was published in The Journal of Finance in 1952. Markowitz is considered the father of quantitative analysis.
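The core calculation behind the Markowitz model fits in a few lines. For a two-asset portfolio, the expected return is the weighted mean of the assets' returns, and the portfolio variance combines the individual variances with the covariance between the assets. The numbers below are purely hypothetical:

```python
def portfolio_stats(w1, w2, r1, r2, var1, var2, cov12):
    """Expected return and variance for a two-asset portfolio.

    w1, w2   -- portfolio weights (should sum to 1)
    r1, r2   -- expected returns of each asset
    var1, var2 -- variances of each asset's returns
    cov12    -- covariance between the two assets' returns
    """
    expected_return = w1 * r1 + w2 * r2
    variance = w1**2 * var1 + w2**2 * var2 + 2 * w1 * w2 * cov12
    return expected_return, variance

# Hypothetical 60/40 split between an 8%-return and a 4%-return asset.
ret, var = portfolio_stats(0.6, 0.4, 0.08, 0.04, 0.04, 0.01, -0.002)
print(round(ret, 4), round(var, 5))  # 0.064 0.01504
```

The negative covariance in the example is the key insight of MPT: combining imperfectly correlated assets lowers portfolio risk below the weighted average of the individual risks.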

The first computerised arbitrage transaction took place in 1960 thanks to the collaboration of Harry Markowitz and hedge fund managers Ed Thorp and Michael Goodkin. Many computational finance applications were created with the advent of personal computers in the late 1970s and early 1980s, and signal processing techniques including time series analysis and optimization were widely used.

The New York Stock Exchange introduced the Market Data System I (MDS-I) in 1965 to provide automated quotations. The MDS-II, said to be three times as capable as the MDS-I, was created as a result of the MDS-I's success and was fully operational by July 1972.

The Instinet Trading System, founded in 1967 by Herbert R. Behrens and Jerome M. Pustilnik, was the earliest electronic communications network on Wall Street. Once introduced, Instinet became a major rival of the NYSE, allowing large institutional investors to trade pink-sheet or over-the-counter stocks directly with one another in an electronic setting.

The Intermarket Trading System was introduced in 1978: The Intermarket Trading System (ITS) was a significant game-changer. Managed by the Securities Industry Automation Corporation (SIAC), the network connected the trading floors of several exchanges and enabled real-time trading across them. Through the ITS, any broker on the floor of a participating exchange could react to real-time price movements and place an order on another participating exchange.


The founding of Nasdaq: Nasdaq was established in 1971 to provide fully automated over-the-counter (OTC) trading. Initially it only supplied quotations, but it later added electronic trading and was the first exchange to offer online trading.

Renaissance Technologies' 1982 launch: Jim Simons established Renaissance Technologies as a quant fund. The fund deployed some $10 billion in black-box algorithmic trading, leveraging statistical models to forecast changes in the prices of financial assets, and relied purely on quantitative analysis to select the trades it entered.

The NYSE automated order flow was introduced in 1984: Although order flow began to be computerised in the 1970s, the New York Stock Exchange introduced the "designated order turnaround" system (DOT), which later evolved into the SuperDOT, in 1984. The DOT electronically routed orders to the appropriate trading post, where they were executed manually. The SuperDOT allowed a member firm to transmit a market order directly to the NYSE trading floor and receive an order confirmation once the order had been executed there. By allowing orders of up to 2,000 shares to be electronically routed to a specialist, the SuperDOT system represented a significant advance in the speed and volume of equity trade execution.

In 1993, Interactive Brokers was established: Interactive Brokers, founded by Thomas Peterffy in 1993, was among the first companies to offer computerised trading. The business helped popularise the technology that Timber Hill had created (the first handheld trading computer) for electronic network and customer trade-execution services. Before starting Interactive Brokers, Peterffy had developed the first fully automated algorithmic trading system in 1987, which used an IBM computer to fully automate trade execution and data extraction from a linked Nasdaq interface.


The 1996 debut of Island, an ECN: The electronic communication network (ECN) known at the time as Island was introduced in 1996. The network allowed subscribing traders to receive stock information via an electronic feed, consistently offering up-to-date execution prices and volume data.

Early in the process

The US Securities and Exchange Commission (SEC) approved alternative trading systems in 1998, opening the door for automated high-frequency trading by enabling electronic exchanges.

Under the SEC's newly approved rules and rule amendments, Regulation ATS allowed alternative trading systems to choose, depending on their activity and trading volume, whether to register as broker-dealers or as national securities exchanges subject to additional regulations. The regulation accelerated the development of algorithmic trading by giving it legitimacy and transparency.

The completion of US decimalization in 2001 was another milestone that aided the widespread adoption of algo trading. By lowering the minimum tick size from 1/16 of a dollar (US$0.0625) to US$0.01 per share, and thereby allowing smaller gaps between bid and offer prices, the change brought further modifications to the market structure.

The conversion was made to adhere to accepted international trading standards, but it also had several advantages for investors. For instance, it made it simpler for investors to recognise and react to changing price bids, and the smaller incremental price changes tightened spreads.

The phase-in period for US decimalization started on August 28, 2000, and ended on April 9, 2001, although the NYSE and the American Stock Exchange did not adopt decimalization until January 29, 2001. Since then, fractional price quotations have not been used; only the decimal trading format has been in effect.

An elaboration and expansion phase

In addition to the 1998 rules that permitted alternative trading systems, further regulations were introduced to improve and modernise electronic trading. For instance, the Regulation National Market System (Reg NMS) was adopted in 2005, although it did not take effect until 2007.

Reg NMS was a collection of measures to improve and strengthen the national market system for stocks. Its order protection rule requires exchanges to communicate real-time data to a centralised organisation and, when matching buyers and sellers, requires exchanges and brokers to honour the most advantageous quote. It changed how firms operate under the trade-through rule and appears to have encouraged high-frequency algo trading.

Another significant step was the creation of pandas at AQR Capital Management in 2008. pandas is a Python library for data analysis and manipulation that is extremely helpful for doing quantitative research on financial data quickly.
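A small example of the kind of quantitative work pandas makes easy: building a dated price series and computing a rolling moving average over it (the prices and dates here are made up for illustration).

```python
import pandas as pd

# A made-up daily closing-price series.
prices = pd.Series([100.0, 102.0, 101.0, 105.0, 107.0],
                   index=pd.date_range("2008-01-01", periods=5))

# Three-day rolling mean; the first two entries are NaN
# because a full window is not yet available.
rolling_mean = prices.rolling(window=3).mean()
print(rolling_mean.iloc[-1])  # mean of the last three prices
```

The same one-liner scales from five points to millions of rows, which is exactly what made the library useful for quant research.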

Similarly, in August 2010 a private company called Spread Networks introduced the fastest and most direct dark-fibre connection between the greater New York and greater Chicago metropolitan regions. The route covered only 825 fibre miles and carried signals between the two cities in just 13.3 milliseconds, which greatly accelerated algo trading and fostered the growth of the HFT ecosystem.

The boom

Algorithmic trading saw significant growth in the late 2000s. It made up less than 10% of equity orders in the early 2000s, but it expanded quickly, and by the end of 2009 it accounted for 70% of trading in the US securities markets. According to the NYSE, the volume of algo trading increased by 164% between 2005 and 2009.

Execution times also fell sharply alongside the rise of algorithmic trading. In 2001, HFT transactions took several seconds to execute; by 2010 this had dropped to milliseconds and even microseconds, and by 2012 to nanoseconds.

Algorithmic trading's huge volume and lightning-fast execution also increase the possibility of flash crashes, which are sudden market collapses. In fact, a $4.1 billion sell order executed by an algorithm triggered a flash crash in May 2010. The crash wiped out roughly a trillion dollars of market value, with the Dow Jones Index dropping 1,000 points in a single trading day, including 600 points in just five minutes, before recovering shortly afterwards. After the flash crash, the SEC introduced circuit breakers to temporarily halt trading during such periods of extreme volatility.
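The circuit-breaker idea itself is simple: compare the current level against a reference level and halt trading when the decline reaches a threshold. The sketch below is illustrative only; the threshold is a placeholder, not the SEC's actual rule set.

```python
def should_halt(reference_level, current_level, threshold_pct=7.0):
    """True if the decline from the reference level reaches the threshold.

    threshold_pct is an illustrative placeholder, not the SEC's exact rule.
    """
    drop_pct = (reference_level - current_level) / reference_level * 100
    return drop_pct >= threshold_pct

print(should_halt(10000, 9200))  # 8% drop -> True
print(should_halt(10000, 9500))  # 5% drop -> False
```

A real implementation layers several thresholds with different halt durations, but the core check is this single percentage comparison.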

As technology advanced, so did execution speed. In 2011, Fixnetix, a tech company based in London, introduced nano-trading technology by creating a microchip that could carry out trades in nanoseconds. The field-programmable gate array (FPGA) microchip, dubbed iX-eCute, was designed for ultra-low-latency trading and could perform twenty or more pre-trade risk checks in under 100 nanoseconds. For rapid execution of transactions, financial institutions connect to stock exchanges and electronic communication networks (ECNs) through algorithmic trading systems and network routes.

In 2015, the US Commodity Futures Trading Commission (CFTC) adopted new rules for greater oversight of automated trading on US designated contract markets (DCMs). The rules set standards for the development, testing, and supervision of automated trading systems, mitigating the potential hazards of algorithmic trading by incorporating risk controls such as maximum order-message and maximum order-size limits.

Another factor in the expansion of algo trading was the advent of Quantopian, founded in 2011 by John Fawcett and Jean Bredeche. The company provided free access to open-source resources, such as data sources and Python-built tools, so that algorithm developers could create and test their own trading algorithms.

The goal was to develop automated trading tools and methods that Quantopian could use to enhance its services for institutional clients. To that end, the company held competitions dubbed the "Quantopian Open", which were open to everybody regardless of educational or professional background. Quantopian also drew institutional investors who placed funds under the management of the best algorithms. These investor-members, who profited from the strategies, paid a royalty or commission to the developer-members who created the winning algos. However, Quantopian shut down in 2020.

 

 Analyzing social media and incorporating it into algorithmic trading

 

In September 2012, backed by a $30 million investment, a New York-based start-up named Dataminr launched a new service that converts social media posts into usable trading signals. The goal was to deliver breaking business news up to 54 minutes faster than conventional news sources. The platform could recognise specific "micro-trends", giving clients special insights and the ability to anticipate what was most likely to happen in the near future.

Business news articles were only part of the discussion on social media, however. Consumer reactions to products, conversational shifts in specialised online communities, and patterns of attention growth and decay are a few other crucial social media signals.

Other businesses also created AI algorithms to identify linguistic and propagation trends in the more than 340 million tweets shared on Twitter every day, analysing them in real time to find promising signals. The growing, almost immediate influence of social media on stocks initially worried regulators, however, and on April 2, 2013, the SEC and CFTC placed restrictions on public corporate announcements made via social media.

Nevertheless, the number of business organisations using Twitter grew over time. In fact, on April 4, 2013, two days after the SEC and CFTC imposed their limits, Bloomberg Terminals started integrating live tweets into their economic data offering, with the Bloomberg Social Velocity feature tracking unusually high chatter about particular businesses.

On April 23, 2013, a bogus tweet sent from the Associated Press account claimed that two explosions had struck the White House. The incident is a striking illustration of how irregular news items can affect stock markets. The news spread panic on Wall Street, and the Dow Jones fell 143 points, or about 1%, in three minutes, from 14,699 to 14,555.

 
The current situation

Algorithmic trading has advanced greatly. It now makes use of AI and machine learning to increase its effectiveness. Speed, however, remains as essential as ever. To minimise trade-execution delay, institutions and prop trading firms strive to situate their computers as close as possible to an exchange's servers in New York, or to a source of crucial market-relevant information in Washington. This gives these companies access to stock prices and other market-moving data moments before the general investing public.

However, given the significance of HFT in maintaining market integrity, the relevant authorities keep a close eye on what goes on in the area of algorithmic trading.

 

 

 
