Nine Challenges in Modern Algorithmic Trading and Controls
Algorithmic Trading and Controls, Vol. 1, No. 1, 2021, pp. 1-9, to appear
Editorial Article
Jackie Shen

Abstract – This editorial article informs the algorithmic trading community about the launch of the new journal Algorithmic Trading and Controls (ATC). ATC is an online open-access journal that publishes novel works on algorithmic trading and its control methodologies. In this inaugural article, we discuss nine major challenges that contemporary Algo trading faces. There is nothing superstitiously magical about the number "nine"; any other count would do just as well. Several of these challenges are at the strategy level, including, for example, trading of illiquid securities or optimal portfolio execution. Others sit more at the level of risk management and controls, such as how to develop automated controls, testing, and simulations. The editorial views are inevitably personal and biased, but are offered with the sincere intention of contributing to this important field in modern financial services and technologies.
Keywords –
Algos, liquidity, portfolio, correlation, special days, derivative pricing, universe, clustering, machine learning, auctions, shortfall, transaction cost, unit test, regression test, simulation, automated controls
1. Introduction
By this inaugural article, the new journal Algorithmic Trading and Controls (ATC) is formally rolled out. ATC is a discretionary, moderately for-profit, and online open-access journal that publishes novel works on algorithmic trading (a.k.a. "Algo" or "Algo trading") and related control frameworks. Automated Algo trading generates both revenues and risks, and hence the importance of automated controls should never be underestimated. We refer interested readers to the website of ATC for more details: https://atc.deepquantech.com.

In this editorial article, we discuss nine major challenges that contemporary Algo trading faces. Before diving into the details, we first clarify that, as for the journal ATC, we restrict the notion of "Algo trading" to its two most common and influential activities: optimal trade execution and market making. The latter covers both exchange market making and automated over-the-counter (OTC) liquidity provision such as Request-For-Quote (RFQ). Neither the journal ATC nor this editorial article intends to cover purely opportunistic trading activities that seek alphas or arbitrage opportunities for a principal account. The signals and strategies involved in such trading activities are confidential and proprietary, and by default prohibited from publishing.

Deep QuanTech, LLC, New York, NY 10025, USA
Corresponding Author: Jackie Shen, [email protected]
In terms of asset classes, we focus more on well-regulated markets such as those of equities, futures, rates products, or liquid foreign exchange (FX). Whenever applicable, we also comment on others such as OTC derivatives or credits. This article also focuses primarily on the markets in North America, esp. in the United States. The nine challenges being explored are as follows.

[1] Trading Illiquid Securities
[2] Optimal Portfolio Execution
[3] Clustering of the Trading Universe
[4] Handling of Special Days
[5] Real-Time Pricing of Derivatives
[6] Trading in (Close) Auctions
[7] Transaction Cost Models
[8] Automated Controls for Automated Trading
[9] Full-Scale Testing and Simulation

Some of these challenges call for either strategy revamping or more intelligent data analytics, while others are more concerned with risk controls or robust testing and simulation frameworks. Both are in scope for this new exciting journal, and require broker-dealers or trading houses to make further investments in talents, analytics, infrastructures, and the Three Lines of Defense.

Finally, an editorial article of such nature could be inevitably personal and biased. But its purpose is to initiate further healthy dialogues within the Algo trading community, which includes fund managers, broker-dealers, trading agents or houses, and regional supervisory functions such as the Federal Reserve Board or the Prudential Regulation Authority.

Algo trading has been playing an increasingly vital role in the modern landscape of financial technologies and services, and profoundly impacting both the main street and Wall Street. As a result, it can no longer be operated in the conventional style of black boxes, and must start to promote a healthy culture of open discussions, information sharing, and collaborations.
2. The Nine Challenges for ATC
We now elaborate on the nine challenges. The order is somewhat arbitrary, as the challenges are relatively independent of one another.
2.1. Trading Illiquid Securities

On the market side, the limit order books of an illiquid name are typically "thin," and active market participants are also very limited, e.g., mostly registered market makers who are obliged to post two-sided quotes. Such poor liquidity qualities make these names highly sensitive to supply-demand imbalances.

As a result, most illiquid names invariably share these macro characteristics: lower average daily volumes (ADV), wider average bid-offer spreads, and higher volatilities. All these negative factors aggravate implementation shortfalls (IS) whenever trading these names.

On the trading side, the constraint of completion (within a given execution horizon) raises a major hurdle for Algos. Any passive waiting and inaction "now" may pile up positions for "later" stages and hence increase the squeezing risk and impact costs. For this reason, traders may better turn to Algos with flat execution profiles such as VWAP/TWAP or Participation of Volume (PoV). Further diligence must be exercised because the intraday volume profiles of illiquid names demand either longer time bins to accumulate sufficient volumes, or otherwise to be interpreted in probabilistic manners.

For equities, dark pools (wherever available in the regional markets) offer alternative venues to minimize information leakage or seek extra liquidity. They are indeed indispensable for many Algos crafted specifically for illiquid or small-cap names. When placing pegging orders in dark pools (e.g., pegged to the mid of the National Best Bid and Offer (NBBO)), Algo designers must pay extra attention to monitor the dynamics and toxicity of the pegged prices. In the scenario of a single market maker, for example, the prices may merely reflect the inventory pressure being experienced by the market maker at the moment, and may not necessarily reflect the "true" values (TV).

For some less liquid fixed-income products, execution usually hybridizes agency and principal trading.
An execution Algo constantly monitors external venues as well as the estimated TVs and spreads of the agent herself. Whenever the agent can offer prices improved from external venues, the Algo may execute some portions in the principal capacity to benefit a client. Such practice must be governed by the Principle of Best Execution universally required by regulators (e.g., MiFID of the European Securities and Markets Authority (ESMA)). This demands robust market data connections with low latency, as well as accurate real-time computation of the TVs and spreads, etc.

To automate and integrate all the aforementioned logic and signals remains a major challenge for Algo developers, in order to most effectively trade small-cap names or many illiquid fixed-income products. For instance, to automate the trading of municipal bonds, many of which are illiquid, the main hurdle turns out to be very elementary: how to properly estimate their TVs in real time when actual trades are very sparse and hence credible footholds for pricing are simply not there.
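As a toy illustration of this last point, one could anchor a real-time TV estimate on a time-decayed average of the sparse trade prints, and let the reported staleness drive how much the quoted spread is widened. The following is only a hedged sketch under made-up conventions (the function name, half-life, and inputs are all illustrative), not a production pricing method:

```python
import math

def estimate_tv(trades, now, half_life=3600.0, min_weight=1e-6):
    """Time-decayed true-value (TV) estimate from sparse trade prints.

    trades: list of (timestamp_seconds, price) tuples, possibly hours apart.
    Returns (tv, staleness_seconds) so the caller can widen quoted
    spreads when the estimate is stale.  All parameters are illustrative.
    """
    if not trades:
        raise ValueError("no trades to anchor a TV estimate")
    num = den = 0.0
    last_ts = max(ts for ts, _ in trades)
    for ts, px in trades:
        # exponential decay: a print one half-life old carries half weight
        w = math.exp(-math.log(2.0) * (now - ts) / half_life)
        if w < min_weight:
            continue
        num += w * px
        den += w
    if den == 0.0:
        # all prints too stale: fall back to the most recent one
        tv = max(trades)[1]
    else:
        tv = num / den
    return tv, now - last_ts

# three prints over two hours; the estimate leans toward the latest one
tv, staleness = estimate_tv([(0.0, 99.5), (1800.0, 99.8), (5400.0, 100.1)],
                            now=7200.0)
```

A real engine would blend in quotes of correlated names, dealer runs, and evaluated prices, and would treat the staleness signal far more carefully.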
2.2. Optimal Portfolio Execution

Unstructured portfolios can be executed using asynchronous single-name Algos. By "unstructured," we informally refer to baskets that have been formed without any systematic objectives in mind, e.g., delta neutrality or long-short balance.

For a sizable portfolio whose execution spans a sufficiently long time window, execution risk can be reduced if the individual names happen to hedge among themselves to some degree. This is usually the case for a structured portfolio, for instance, one resulting from index rebalancing when some new names are to be acquired and an equal amount of selected old names to be liquidated.

In the modern portfolio theory, hedging is quantified by the correlations and volatilities of the names. They can be calibrated directly from the historically observed returns, or indirectly assembled from multi-factor models such as the Barra™ or Axioma™ models. Conventionally these are often end-of-day (EOD) models. One major challenge is how to revamp the
EOD risk models to land on effective real-time risk mod-els. A popular practice is to use the EOD correlationmodels as the backbone, assuming that correlations varyslowly over longer horizons. The con of this assumptionis that EOD correlations are typically calibrated overlonger horizons and may miss any emerging correlatingpatterns for the purpose of intraday trading . Such sce-narios can emerge when a given name starts to breakaway from a cohort due to some major breakthroughsof products or services, e.g., a pharmaceutical companywith an important new drug approved, or a public com-pany announced to be included in a popular market index.The information has been released, but classical EODmodels may act too slow to reflect it in a timely manner.Using weighting schemes like the exponential movingaverage (EWA)) helps catch up in reaction but still actstoo passively for the purpose of intraday trading.Calibrating intraday volatilities imposes another chal-lenge. For any given single name, static profiling is toestablish a static volatility curve σ ( t ) so that, for ex-ample, σ (
10 : 18am ) denotes the average volatility at10:18am. Such profiles can be prepared overnight andstand by ready before a day-trading session commences.It is a stable and predictable tool, but may lose touch withthe intraday reality of a particular given day. A moreideal solution would turn to dynamic profiling when theentire intraday curve is not pre-calculated but graduallyrolled out. At each “current” time t , the future profile σ ( t : EOD ) can be modeled as a stochastic process orupdated belief based on what has been observed in themarket “so far.” This can be computationally more ex-pensive but surf well with real-time market waves.Away from risk correlations, correlations among im-pact costs have been largely muted in both academicand industrial works. The assumption of independent impact costs might approximate well for most individ-ual names. But there are scenarios well worth fur-ther data-driven studies. For example, suppose that aportfolio contains both a single common stock namedSABC and an exchange-traded fund (ETF) named FXYZthat has SABC as one of its sizable constituents (e.g.,SABC=Exxon Mobil Corp. and FXYZ=XLE - EnergeSelect SPDR Fund). Thanks to index arbitragers, anysudden push-ups of SABC can be transferred to FXYZalmost instantaneously, and vice versa. As a result, theimpact costs of trading sizable amounts of SABC andFXYZ could be intricately entangled, potentially leadingto non-negligible and verifiable observables.To organically integrate all these risk and cost ana-lytics into coherent portfolio execution models imposes another major challenge. Such Algo models must bemathematically tractable and computational feasible andefficient. 
Among them, self-contained dynamic programming models are notoriously harder.

Finally, as broker-dealers and trading agents become more enthusiastic in integrating and unifying their platforms and offering cross-asset trading Algos, it is another challenge to optimally trade portfolios containing multiple but correlated asset classes, e.g., common stocks, ETFs, futures, options, or general FICC products.

2.3. Clustering of the Trading Universe

The security universe covered by a typical investment bank, execution agent, or registered market maker is usually vast, potentially including tens of thousands of different names depending on asset classes. This is especially true for global equities trading.

Clustering helps organize trading universes and drastically reduces operating complexities. In a stationary trading environment, clustering can substantially improve operational efficiency by sharing a set of common strategies, parameter factories, implementations, or risk controls within individual clusters. In an emergency scenario, clustering can also offer a default framework or procedure for automated handling, e.g., instantaneous imputation of certain risk characteristics when data servers or connections are experiencing unauthorized naps or unexpected glitches.

Conventional frameworks or risk models have already been able to offer some rudimentary schemes of clustering or classification, e.g., via the directions of sectors, industrial groups, or fundamental risk drivers such as market capital sizes, value vs. growth, etc.

Modern machine learning (ML) techniques probably can offer more.
For the purpose of intraday optimal execution or continuous market making, an overnight process of clustering and classification is more ideal than the traditionally static segmentation schemes, such as those merely driven by sectors or industrial groups. Here the main challenge is to sort out all security characteristics, be they fundamental or technical, that are most relevant to intraday trading for either optimal execution or market making. ML-driven clustering may also have to accommodate conventional risk models and to set proper clustering objectives. Unlike in prevailing risk models, non-numerical characteristics can also be accommodated using modern ML techniques, such as categorical feature variables or those derived from alternative data like investment sentiments on social media.
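As a minimal sketch of such data-driven clustering, the following groups names by standardized liquidity features with a plain k-means loop. The feature set (log-ADV, spread in bps, intraday volatility) and all numbers are illustrative assumptions, not a recommended configuration:

```python
import numpy as np

def kmeans_universe(features, k, iters=50, seed=7):
    """Minimal k-means over per-name feature vectors (e.g., log-ADV,
    spread in bps, intraday volatility).  Features are standardized so
    that no single characteristic dominates the distance metric.
    Returns (labels, centroids)."""
    rng = np.random.default_rng(seed)
    X = np.asarray(features, dtype=float)
    # standardize each feature column
    X = (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-12)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # assign each name to its nearest centroid
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # move each centroid to the mean of its members
        for j in range(k):
            if (labels == j).any():
                centroids[j] = X[labels == j].mean(axis=0)
    return labels, centroids

# two obvious liquidity cohorts: large-cap-like vs small-cap-like (made up)
feats = [[9.0, 2.0, 0.15], [8.8, 3.0, 0.18], [9.1, 2.5, 0.14],
         [4.0, 40.0, 0.55], [4.2, 35.0, 0.60], [3.9, 45.0, 0.50]]
labels, _ = kmeans_universe(feats, k=2)
```

A production pipeline would add categorical and alternative-data features (suitably encoded), pick k via a clustering objective, and reconcile the resulting clusters with the conventional risk-model taxonomy.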
2.4. Handling of Special Days

There does not seem to exist a formal theory about special days, but industrial practitioners know that they ought to be treated differently. In the modern era, when technologies can facilitate quicker and more adaptive responses to market dynamics, they indeed deserve more customized and responsive strategies.

Well-known examples include the Fed Announcement Days, Half Trading Days, Month-End or Quarter-End Days, Index Rebalancing Days, Triple Witching Days, etc. Trading patterns are more pronounced on certain special days than on some others. Also, days like Index Rebalancing Days may observe more salient impacts on certain specific individual names, i.e., the new joiners and dropouts for an index.

On special days, trading patterns may differ in all sessions, including, for instance, the continuous core session and the open and close auctions. As a result, the corresponding trading parameters should be prepared using proper statistical methods or machine learning techniques. During the real-time implementation of a special day, trade beliefs and forecasting must be further updated based on the specially calibrated models, parameter factories, and the actually observed market dynamics. Each special day may assume its very special identities.

An equally stimulating notion, though less popularly implemented in the Algo trading industry, is "Special Periods." Special periods may be anchored around special days, for example, the first trading week or month of an Initial Public Offering (IPO) for equities, or the week near a roll date of a major index futures contract for futures, etc. They may signify periods of known transitions and uncertainties, and hence are more tractable than latent periods in general regime-switching models.

Special days or periods represent the non-stationary moments of the life cycles of securities or trading environments, and may present good opportunities for those who master them.
Trading them with specialized and effective strategies is certainly not a trivial task, and requires special investments in talents and analytics.
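A bare-bones way to operationalize special-day handling is a calendar-driven parameter switch. The day categories below follow the examples above, while the dates, names, and parameter values are fabricated placeholders:

```python
import datetime as dt

# Illustrative parameter factories per day type: how heavily to weight the
# close auction, and how to rescale the core-session volatility profile.
# All numbers are made-up placeholders, not calibrated values.
SPECIAL_DAY_PARAMS = {
    "half_day":        {"close_auction_weight": 1.8, "core_vol_scale": 0.6},
    "index_rebalance": {"close_auction_weight": 2.5, "core_vol_scale": 1.2},
    "triple_witching": {"close_auction_weight": 2.0, "core_vol_scale": 1.4},
    "regular":         {"close_auction_weight": 1.0, "core_vol_scale": 1.0},
}

def classify_day(day: dt.date, half_days, rebalance_days, witching_days):
    """Map a calendar date onto one of the special-day categories."""
    if day in half_days:
        return "half_day"
    if day in rebalance_days:
        return "index_rebalance"
    if day in witching_days:
        return "triple_witching"
    return "regular"

def day_params(day, half_days=(), rebalance_days=(), witching_days=()):
    """Look up the parameter set the Algo should load for this date."""
    kind = classify_day(day, half_days, rebalance_days, witching_days)
    return SPECIAL_DAY_PARAMS[kind]
```

In practice the calendars would come from exchange notices and index providers, and the per-category parameters from models calibrated on past special days of the same kind.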
2.5. Real-Time Pricing of Derivatives

Traditionally in most investment banks, pricing models have been developed for booking trades manually. Traders have to add further overhead premiums such as spreads or commission fees on top of the "true" prices projected by pricing models. Pricing models and the associated risk analytics are also utilized by mid- and back-offices for monitoring the end-of-day (EOD) positions and aggregated risks.

With increasing demands on derivatives and more efficient exchange or OTC trading, automated trading and clearing of derivatives have gradually become a priority for many investment banks. Among all building blocks, pricing models stand out as the core pillars.

Here the main challenges include: (A) speeding up computation for pricing and associated risks, and (B) revamping pricing infrastructures, including data connections and servers, to facilitate fast and robust real-time price queries. The two are clearly intertwined.

Conventional EOD reading and construction of pricing curves, e.g., interest rate curves or credit curves, are too sluggish for intraday and real-time trading. Ideally, a pricing engine must be able to query market data (e.g., money markets, bills, notes, bonds, or swaps and swaptions on rates or credits) in real time, and then to construct on the fly the implied rates or credit curves. This demands rewiring or upgrading of market data subscriptions and connections, as well as of the computing engines for curve calibration.

Furthermore, the heavy machinery of partial differential equations (PDE) for option pricing, or Monte-Carlo (MC) simulations for exotics, has to be re-designed in order to substantially catch up in speed for real-time dynamic environments. Taylor or asymptotic expansions, spline interpolation or extrapolation, and more general approximation techniques probably have to be adopted to reduce frequent calls of the heavy artillery.
Computation windows at the scale of seconds or minutes must be compressed toward milliseconds or microseconds for Algo trading, at the sacrifice of some accuracy.

On the control side, both the Tech Risk Management (TRM) and Model Risk Management (MRM) functions must step up in scrutinizing the soundness and robustness of these novel infrastructures and real-time pricing logic.
Previous approvals of EOD pricing models do not automatically transfer to real-time models!
Needless to say, automated derivative trading will become the most exciting area for most investment banks or trading firms. It requires serious investments in the best infrastructures, analytics, and above all, IT talents.
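To make the approximation idea concrete, here is a hedged sketch in which a Black-Scholes call (standing in for a heavier PDE/MC engine) is precomputed on a (spot, volatility) grid off-line, so that real-time queries reduce to bilinear interpolation. The grid ranges and class names are illustrative:

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    """Black-Scholes call price; stands in for a heavier PDE/MC engine."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

def _bracket(xs, x):
    """Index i of the grid interval [xs[i], xs[i+1]] containing x."""
    for i in range(len(xs) - 1):
        if x < xs[i + 1]:
            return i
    return len(xs) - 2

class GridPricer:
    """Precompute a (spot, vol) price grid off-line; answer real-time
    queries by bilinear interpolation instead of re-running the model."""
    def __init__(self, spots, vols, K, T, r):
        self.spots, self.vols = spots, vols
        self.grid = [[bs_call(S, K, T, r, v) for v in vols] for S in spots]

    def price(self, S, v):
        i = max(0, min(len(self.spots) - 2, _bracket(self.spots, S)))
        j = max(0, min(len(self.vols) - 2, _bracket(self.vols, v)))
        s0, s1 = self.spots[i], self.spots[i + 1]
        v0, v1 = self.vols[j], self.vols[j + 1]
        ws, wv = (S - s0) / (s1 - s0), (v - v0) / (v1 - v0)
        g = self.grid
        return ((1 - ws) * (1 - wv) * g[i][j] + ws * (1 - wv) * g[i + 1][j]
                + (1 - ws) * wv * g[i][j + 1] + ws * wv * g[i + 1][j + 1])
```

On a fine enough grid, the interpolated price tracks the exact model to within a small tolerance, while each query costs only a handful of arithmetic operations; the same idea extends to splines and to rate or credit curve lookups.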
2.6. Trading in (Close) Auctions

There are open and close auctions in the US national market system (NMS).
Intraday auctions also exist in some regions such as Europe. Auctions provide important alternatives for liquidity sourcing and price formation, and play a critical role in modern-day trading.

Among all, close auctions are becoming the most prominent trading sessions across global markets. Perhaps this can be best justified by the simple keyword "completion," which universally governs intraday trading activities, be they high-touch or low-touch.

When traders or portfolio managers are mandated to liquidate or acquire certain positions before any given EOD, the close auction offers the final substantive pool of liquidity. This applies, for instance, to index fund managers who attempt to minimize fund tracking errors on an index rebalancing day, or to traders on a central risk book who attempt to stay compliant with a firm's internal EOD risk limits and allocations.

Close auction volumes have been on a steady rise, and no serious liquidity-seeking traders can afford to miss them. For developed markets such as the US or Europe, average close auction volumes have already stepped into the double-digit zone (as a percentage of the average daily volume (ADV)).

Algos offer a variety of options to traders or clients for effectively tapping auction liquidity. Unless explicitly instructed to complete execution before the close, in theory any Algo can offer participation in close auctions. Taking VWAP for instance, a natural way is to treat the close auction volume as a Dirac "point" mass and then to allocate the auction participation proportionally.

In addition, there also exist dedicated auction Algos that are marketed under the name of "Target Close (TC)." Different broker-dealers or execution agents may design them with their own objectives and customizable options. Each TC Algo attempts to benchmark against the close price while maximally curbing potential price impact or information leakage. This means that some portions may have to be traded in the continuous core session just before a close auction.

To best serve the interests of trading clients, all these Algos that tap close liquidity must develop forecasting models on auction volumes and prices, as well as on their pre-close dynamics.
They must properly digest information such as imbalances and indicative prices that is continuously disseminated to the public after a certain time before a close auction (e.g., 3:45pm in the US).

Modern data analytics and machine learning methods can probably improve these predictive models. Traditionally, only straightforward statistics have been explored. The main challenges here are that each primary exchange has its own auction roll-out procedures and rules, and that some specific operations could perplex modeling efforts, e.g., special orders like NYSE's Closing D Orders. Finally, it is also nontrivial to seamlessly integrate these predictive analytics into a self-contained and objective-driven optimization problem.
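The Dirac point-mass treatment of the close auction mentioned above can be sketched in a few lines; the profile numbers and the function name are illustrative assumptions:

```python
def vwap_slices(total_qty, core_profile, close_auction_frac):
    """Allocate a parent order along a VWAP profile, treating the close
    auction as a Dirac point mass of expected volume.

    core_profile: expected volume fractions per continuous-session bin
                  (need not be normalized);
    close_auction_frac: expected close-auction share of day volume,
                  e.g. 0.10 for a close worth 10% of the day.
    Returns (per-bin quantities, close-auction quantity)."""
    core_total = sum(core_profile)
    day_total = core_total + close_auction_frac
    if day_total <= 0:
        raise ValueError("empty volume profile")
    bins = [total_qty * f / day_total for f in core_profile]
    auction_qty = total_qty * close_auction_frac / day_total
    return bins, auction_qty

# e.g., two equal core bins plus a close auction worth 10% of day volume
bins, auction_qty = vwap_slices(100_000, [0.45, 0.45], 0.10)
```

A real VWAP or Target Close Algo would replace the static fractions with forecasted intraday and auction volumes, and would re-optimize as imbalance feeds update.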
2.7. Transaction Cost Models

In the current article, TCM, the Transaction Cost Model, is restricted to pre-trade forecasting models for estimating the transaction costs of trading any proposed positions. We shall reserve TCA, Transaction Cost Analysis, for any post-trade analysis on the costs of actually executed trades. The costs due to fees and commissions are out of scope, since they are either published or contracted. In addition, transaction costs here mainly refer to the impact costs, not the market risk costs associated with innate market fluctuations.

In an editorial article like this, savvy readers might have been searching for the keyword "TCM" from the very start. Indeed, TCM is probably the most celebrated metric in Algo trading, though this does not mean that it has been thoroughly understood.

In fact, TCM to the Algo community behaves a bit like the concept of "gravity" to society. For thousands of years, human beings have been aware of the existence of gravity and successfully applied it to important social-economic activities such as measuring the weight of grains for taxing purposes, or the weight of gold and silver as currencies. The true enlightenment about gravity, however, did not emerge until Newton and Einstein uncovered the laws behind it.

In the earlier years, broker-dealers or trading agents did openly reveal their TCM models, either formally or informally. But the trend is that these models sink deeper and deeper underwater, and become proprietary and confidential. This is especially true for many emerging TCM models for FICC, such as those for bonds or Foreign Exchange (FX). Freely accessible TCM models are very rare. For instance, only the Kissell Research Group (KRG) still maintains an open and free TCM model, under the brand name of "I-Star," at the time when this article is published.

In theory, for any given security there should be a single ground-truth TCM model, which should be kept open, transparent, and accessible to any traders or fund managers. The fact that different firms develop their own confidential and proprietary TCM models perhaps already suggests something disturbing. Or rather, it may also signal the very complexity and ambiguity of the notion of TCM. Even for post-trade TCA, when trade data have been completely observed, it is not so straightforward to carve out the net impact costs.

Let us dip into some light details. Most TCM models seek a functional form of:
TCM = Φ(Q, [T₀, T₁] | s),

where s denotes a given security, Q a targeted buy/sell position in s, and [T₀, T₁] a designated execution window, e.g., Q measured in shares and T₀ = 10:00 am. The security s supplies all the cost and risk parameters, e.g., the average spread θ_s, average daily volume ADV_s, and average volatility σ_s. Hence the expanded functional form is given by:

TCM = Φ(Q, [T₀, T₁], θ_s, ADV_s, σ_s).

It is a convenient format for pre-trade cost forecasting, as well as for portfolio optimization when trading costs are factored in. But it does not differentiate among the actual Algos. Such a model often implicitly assumes the VWAP or PoV (Participation of Volume) Algo. A more ideal model should indicate such dependency, i.e.,
TCM = Φ(Q, [T₀, T₁], θ_s, ADV_s, σ_s | Algo).

For example, the net impact cost is very different for a typical front-loading Implementation Shortfall (IS) Algo that is benchmarked against the arrival price, as versus a more flat-loading VWAP Algo. Using an Algo-independent TCM model to project IS impact costs is doomed to be inaccurate. The reality is that few broker-dealers or trading houses provide Algo-specific TCM models, to our best knowledge.

Furthermore, TCM modelling also faces some theoretical challenges. Consider a scheduling-based Algo, be it a non-dynamic VWAP or IS Algo, for which the execution path q_{t∈[T₀,T₁]} is pre-scheduled by a suitable optimization model and satisfies

∫_{T₀}^{T₁} q_t dt = Q,

with q_t denoting trading speed. It is generally held true that the final netted IS, expressed as the basis-point spread over the arrival price, bears the form of:

IS = C + Z.

Here C = TCM = F(q_{t∈[T₀,T₁]} | s) is a deterministic functional of the execution path q_{t∈[T₀,T₁]}, for the given security s, and Z is a zero-mean random component resulting from the innate stochastic price fluctuations of the market (e.g., Brownians, as in most published works including Bertsimas and Lo, Almgren and Chriss, or Shen, just to name a few). In particular, one has the convenient interpretation of the TCM:

TCM = C = E[IS].

In reality, even for a given deterministic schedule q_{t∈[T₀,T₁]}, C is still stochastic. The cost component C involves the complex interactions of numerous real-time factors, including the dynamics of the limit order books, the strategy of allocating marketable vs. limit orders, and the usage of dark or grey venues and different order types. Some of these variables are latent and not directly observable, e.g., the liquidity in a dark or grey venue, or the waiting queues of limit orders.

In addition, it is also less obvious why the two random components C and Z should be independent. Here perhaps one needs a somewhat bold revolution that is parallel, in spirit, to the "Theory of Relativity."
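For concreteness, a popular (and deliberately simplified) instance of the functional form Φ(Q, [T₀, T₁], θ_s, ADV_s, σ_s) is a square-root participation law. The sketch below is illustrative only; the coefficients are placeholders rather than anyone's calibrated model:

```python
import math

def sqrt_tcm(Q, T0, T1, spread_bps, adv, sigma_daily, a=0.5, b=1.0):
    """Pre-trade cost estimate in basis points over the arrival price,
    in the spirit of TCM = Phi(Q, [T0, T1], theta_s, ADV_s, sigma_s):

        TCM = a * spread + b * sigma * sqrt(Q / (ADV * horizon_frac))

    Q in shares, [T0, T1] in fractions of the trading day, spread in bps,
    sigma_daily in bps.  a and b are placeholder coefficients that a real
    desk would calibrate from its own fills, per security cluster."""
    horizon_frac = T1 - T0
    if horizon_frac <= 0 or adv <= 0:
        raise ValueError("bad horizon or ADV")
    # effective participation rate over the chosen window
    participation = Q / (adv * horizon_frac)
    return a * spread_bps + b * sigma_daily * math.sqrt(participation)

# e.g., 50k shares over half a day in a 1 MM-ADV name (made-up inputs)
cost_bps = sqrt_tcm(Q=50_000, T0=0.0, T1=0.5,
                    spread_bps=5.0, adv=1_000_000, sigma_daily=150.0)
```

Note that this form still carries the Algo-independence criticized above: the same Q and window yield the same estimate whether the schedule front-loads (IS) or stays flat (VWAP), which is precisely why conditioning on the Algo is needed.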
For a sizable trading path ripping through a given market (e.g., with an average market participation rate of 30%, say), why should one still believe in the existence of an "absolute" market where the remaining participants still trade according to a pre-designed and undisturbed Brownian motion?

There is still a long way to go before reaching a more coherent and mature theory of TCM, and more rigorous computational implementations. Notice that the R-squared scores are universally low for TCM models of the current generation, as low as in the teens or even single digits. TCM 2.0, which is yet to come, can perhaps substantially benefit from modern data analytics as well as machine learning techniques. A coherent theory clearly holds the key. On the other hand, for many less liquid FICC securities, or ones whose markets are still at their infancy stages, it will take some extra miles to materialize even TCM 1.0.

2.8. Automated Controls for Automated Trading

Algos differ from many conventional financial products or services. First and foremost, Algos directly access the National Market System (US), Regulated Markets (MiFID, Europe), or general national exchanges. Any serious system glitches, operational incidents, or design flaws could generate broad impacts on the regional security prices, major index levels, associated derivative markets, or even the net asset values (NAV) of pension or retirement funds.

Algos expose their owners, agents, or clients, be they investment banks, broker-dealers, trading firms, or various funds, to all major types of risks. The main risk types include, for example,

(a) regulatory risks for infringing rules, laws, or regulations on securities, markets, and trading,
(b) financial risks for suffering substantial principal losses as a result of erroneous trading activities,
(c) reputational risks for violating the core principles
of financial integrity, or for offering poorly managed products and services to clients, and
(d) operational and technological risks for inadequately testing and monitoring trading systems, networks, or servers and data centers, etc.

Because of their autonomous nature, most behaviors of Algos have to be controlled in an automated or low-touch way, instead of via manual or high-touch interventions. The latter applies only to ultimate controls such as the Emergency Shutdown Procedure (a.k.a. the "Kill Switch"), when the entire Algo system or exchange connections have to be shut down via manual commands (as in Linux) or by clicking on-screen "panic" buttons.

Controls throughout the life cycles of orders, e.g., incoming parent orders and child orders at different stages, have to be automated and embedded within the order or execution management systems (OMS/EMS). There should be blocks of control codes or scripts residing within the OMS/EMS that can automatically police order activities, e.g., parent order acceptance, child order generation, order splitting and routing, and messaging with external exchanges or venues.

No senior management teams or clients can feel truly at ease with black-box Algo systems unless it is confirmed that these systems are largely self-regulatory and that comprehensive controls are automated and algorithmic as well.
Controls could be kept simple for a small proprietary trading firm that focuses on only a very limited set of securities using a limited set of Algos. For a large-scale investment bank, broker-dealer, or trading firm, however, it is a daunting task to develop a rigorous and effective control framework that is transparent and auditable, e.g., by either the internal audit teams or external regulators. These firms trade hundreds or thousands of names across multiple asset classes on each business day, relying on tens or hundreds of Algos.

At the minimum, such a control framework means

(i) to establish control governance structures or committees within a given firm,
(ii) to develop formal control policies and procedures,
(iii) to construct and maintain a detailed control inventory, including some key pieces such as identified risks, proposed controls, actual implementations within the OMS/EMS or beyond, and unit or regression tests that prove the effectiveness of the implemented controls,
(iv) to clearly delegate and orchestrate the responsibilities within the Three Lines of Defense, including Algo desks, risk management, and independent internal or external audit teams, and
(v) to monitor and document the entire life cycles of the controls, including (a) any onboarding requirements for new Algos and associated controls, (b) change management of existing controls, (c) effectiveness and breaching incidents of the established controls, and (d) periodic reviews of the controls.
To establish a mature control framework often requires multiple years of commitment and investment from investment banks or trading firms!

To better elucidate the above discussion, which is somewhat abstract, let us walk through a relatively "simple" example. Suppose that for a given Algo named OGLA, among its 280 proposed controls, there is one specific control with identification number CTL-ID9988, which is to limit an incoming parent order to a compliance limit of Θ = 128 MM USD in notional. Do not be fooled by its illusory simplicity! From the control-framework point of view, one can and should challenge it from multiple facets.

• (Ownership) Who defines this limit of 128 MM? And who are the validators and approvers?

• (Documentation) What is the rationale in the historical context of this Algo named OGLA? And where is this rationale documented?

• (Data Security) Within the trading system of Algo OGLA, where is this limit number of "128 MM USD" stored? And who has the right to access and overwrite it?

• (Ongoing Monitoring) In the past quarters, what is the rejection rate under this limit? If the rejection rate has been consistently high, which may signal a systematic increase of trading scales from clients (instead of fat fingers), should the Algo desk consider increasing this limit for legitimate business? In the same fashion, if the maximum position has been only 30% of this limit consistently in the past quarters, should the Algo desk consider lowering it in order to more effectively curb fat-finger errors?
• (Exception Handling) When there is a legitimate reason for a trade to go above this limit, e.g., when both the external client and the internal sales trader(s) have manually communicated about and confirmed such a trade size, what is the emergency procedure for such a trade to legitimately pass the limit check, instead of being rejected outright?

The example has been fabricated, but the above points are profoundly real. The actual impact could go way beyond the couple of lines that implement such a deceptively trivial control:

    if (order.notional > CTL_ID9988.limit)
        order.status = REJECTED; ...
Another major challenge for developing a coherent control framework is that Algos by nature are dynamic. In response to emerging trading environments, new client requirements, or novel IT developments, Algo systems are in a constant state of morphing and revamping. It is highly burdensome to scrutinize and approve frequent but legitimate (and occasionally very urgent) changes while maintaining a consistent policy.

One partial solution is perhaps to replace human approvers and validators with automated algorithms, e.g., via machine learning or artificial intelligence. But this could take away the already shrinking pool of jobs for working daddies and mommies, a ubiquitous wrestle between humans and AIs in the modern era.
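One possible flavor of such automated screening, sketched here under entirely invented rules and thresholds, is a pre-screen that auto-approves only small parameter tweaks on non-control logic and escalates everything else to human approvers:

```cpp
#include <string>

// Hypothetical rule-based pre-screen for Algo change requests.
enum class Decision { AutoApprove, Escalate };

struct ChangeRequest {
    std::string component;   // e.g. "child-order pacing"
    bool touchesHardLimit;   // does it modify a compliance control?
    double paramDeltaPct;    // relative size of the parameter change
};

Decision screen(const ChangeRequest& cr) {
    // Any change to a hard compliance control always needs a human.
    if (cr.touchesHardLimit) return Decision::Escalate;
    // Small parameter tweaks (< 5%, an invented threshold) on
    // non-control logic are auto-approved to reduce the burden.
    if (cr.paramDeltaPct < 5.0) return Decision::AutoApprove;
    return Decision::Escalate;
}
```

Even such a crude triage would let human validators focus on the changes that actually threaten policy consistency.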
If analytics and strategies define the mind and soul of an Algo, then line after line of code builds up the very flesh and body. Ensuring the health of this body is the highest priority for Algo development and maintenance.

The code embodies all the critical functions of Algos, such as messaging with external venues or clients, listening to real-time market trade and quote (TAQ) data, and querying reference data, profiling data, or parameter factories. Most importantly, it implements all the core EMS/OMS logic. The code as a whole constitutes a complex ecosystem of interacting units.

In general, object-oriented programming (OOP) offers an effective framework for large-scale code design, component structuring, sharing of common functionalities, and multi-developer collaboration. For example, C++ and Java have been broadly employed as the mainstream languages for Algo development. But sound OOP structuring does not always guarantee a bulletproof shelter from coding errors.

As time passes, any given Algo system has to evolve in order to fix bugs, incorporate novel strategies, or offer new functions and features. Logic and strategies then become more and more involved, and coding structures more complex. As a result, an Algo system may become increasingly vulnerable to programming bugs and flaws.

The human factor is a significant source of such potential errors. Developers and strategists come and go. Consequently, coding styles may change, and many hidden intentions of the initial designs (e.g., on classes, variables, or functions) may gradually get lost or misused. Formal documentation of all coding details is virtually impossible, while informal in-line commenting is also insufficient to maintain code sanity.

The other major source of errors arises from all the revamping efforts for expanding new features, functionalities, or products.
It is highly nontrivial to ensure an organic integration of the new and old code, especially for large-scale or in-depth projects such as platform migrations or the adoption of complex quantitative models. Here are two example scenarios in which due diligence must be exercised; in reality, one must face all types of challenging scenarios.

(a) Inserting into an existing class a new member function that modifies an existing global variable could turn very risky without careful examination of how the variable has been utilized in the existing framework. This is especially true when the global variable has been used somewhere else as a control signal for making trade decisions, such as order splitting or cancellation.

(b) New Algo products are often built upon existing modules. These units must be organically integrated, instead of being linked perfunctorily. Previously they may have been operating independently; once encapsulated under the hood of some new parent logic, they may have to run in parallel or in series. As a result, a responsible developer has to carefully examine the signals that these modules listen to, the controls they are governed by, as well as the complete flow of cause-effect events. Otherwise, serious glitches could surface under certain trading environments that happen to awaken previously dormant bugs.
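Scenario (a) can be made concrete with a toy, entirely hypothetical sketch: a global flag that existing order-splitting logic relies on as a control signal is silently cleared by a newly added "cleanup" helper:

```cpp
// Hypothetical illustration of scenario (a): 'haltTrading' is a global
// flag used elsewhere as a trade-decision control signal.
bool haltTrading = false;

// Existing logic: refuse to split child orders while trading is halted.
bool maySplitOrder() { return !haltTrading; }

// A newly added helper, written without examining how the global is
// used elsewhere, resets the flag as an incidental "cleanup".
void resetSessionState() {
    haltTrading = false;   // BUG: silently clears an active trading halt
}

// Safer variant: leave control signals untouched unless explicitly asked.
void resetSessionStateSafe(bool alsoClearHalt) {
    if (alsoClearHalt) haltTrading = false;
}
```

The buggy helper makes a halted Algo resume splitting orders, exactly the kind of glitch that surfaces only in the trading environments that exercise this path.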
Unit Tests are designed to verify that individual member functions or task blocks have been coded up as intended.
Regression Tests make sure that these units or other general functionalities remain stable and predictable during rounds of code changes. Regression tests are especially critical for hard compliance controls, such as those on notional or credit limits.

These popular and automated tests, if sufficiently comprehensive and accurate, can indeed deliver a high level of assurance on the soundness of an Algo system. But an Algo system is not merely an inorganic stack of individual units or functionalities. In general, neither unit tests nor regression tests can go all the way bottom-up to cover the entire dynamic decision trees embedded within an Algo system.
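A minimal, hypothetical flavor of such a unit test for a hard notional-limit check might look as follows; a regression suite would re-run it on every code change so that the hard limit can never silently drift:

```cpp
#include <cassert>

// The control under test: a hard compliance check (hypothetical sketch).
bool passesNotionalLimit(double notional, double limit) {
    return notional <= limit;
}

// Unit test: pin down the boundary behavior of the hard limit.
void testNotionalLimit() {
    const double limit = 128e6;
    assert(passesNotionalLimit(128e6, limit));        // at the limit: pass
    assert(!passesNotionalLimit(128e6 + 1.0, limit)); // above: reject
    assert(passesNotionalLimit(0.0, limit));          // trivial order: pass
}
```

Real suites are of course far larger, but the principle is the same: each behavior the control promises is pinned by an explicit assertion.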
Most often, bugs sneak around in these decision trees where no tests have ever probed.
As a result, unit and regression tests must be augmented by full-scale simulations of an entire Algo system. This is where the ultimate challenge lies.

First of all, it is highly nontrivial to simulate the dynamics and all possible scenarios of the markets. For example, for testing purposes, one must be able to simulate a sudden trade halt (e.g., as triggered by a circuit-breaker rule) and to test how an Algo system handles the entire life cycle of such a halt. Similarly, for a global market whose trading hours revolve just as the Earth does, e.g., the FX market, the simulation system must be able to highlight and react to the particular open and close periods of the various regional markets and the companion liquidity spikes. These are merely two examples.

Even with well-designed mocked market dynamics and event sequencing, the other side of the challenge is to require a simulation system to go through all scenario paths that an Algo system can possibly wander through. The dynamic actions of an Algo may involve numerous control or switch statements, typically coded up by constructs like "if-else if-else" or "switch." In addition, they are often cascaded from parent-level requests to child-level responses. The net effect is that a typical Algo amounts to a growing decision tree with numerous branches along the time axis. Failure to simulate through any particular path may expose the Algo to a potentially unregistered bug. Yet it is a daunting task to ensure that all possible paths be fully visited and simulated.

Finally, Algo developers should pay extra attention to the testing and simulation of system capacity. An Algo system or action that runs smoothly for 5 test names in simulation does not necessarily behave so in a real trading environment with 500 synchronous names. The financial risk could turn very high when an Algo system fails at critical actions like the "Kill Switch."
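The path-coverage difficulty can be quantified with a toy calculation: an Algo whose life cycle passes through n independent binary branch points admits 2^n distinct scenario paths, so even a modest decision tree quickly outgrows exhaustive simulation:

```cpp
#include <cstdint>

// Toy illustration of scenario-path explosion: an Algo with n independent
// binary branch points (if/else decisions) along its time axis admits
// 2^n distinct paths that a full-scale simulation would need to visit.
std::uint64_t scenarioPaths(unsigned binaryBranches) {
    return std::uint64_t{1} << binaryBranches;
}
```

With only 40 such branch points there are already over a trillion paths, which is why failure to visit any particular path may leave a bug unregistered.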
3. Conclusion
Algorithmic trading and its effective controls have been playing a fundamental role in contemporary financial technologies and services. Optimized trade execution for pension funds, retirement funds, and non-profit endowment funds, for example, directly impacts the lives of hundreds of millions of main-street citizens. Similarly, automated market making is vital for maintaining orderly and robust markets through stable liquidity pooling. Therefore, it is beneficial for the entire Algo trading community to nurture an open and collaborative culture, as well as to ensure the healthiness and further advancement of ATC. This editorial article has been written in this very spirit.

We have summarized the current status and challenges facing nine important facets of ATC. When turned over, the coin of challenges also reveals the other side: exciting opportunities and competitive edges in the Algo trading industry. Trading institutions that are committed to making further investments in talents, analytics, technologies, and control frameworks will ultimately excel.

Finally, we emphasize that by no means do we claim that these challenges constitute an exhaustive list. There exist other important ones, for example, with regard to the robustness of data servers and connections, the effectiveness of integrating modern data analytics, the handling of less liquid asset classes or OTC derivatives, and the development of ultra-low-latency trading systems. We also remind readers that the editorial views herein could be inevitably personal and biased.
Acknowledgments
The author is very grateful to all the former colleagues at the equities Algo trading teams at both J.P. Morgan and Barclays, all the risk and control teams at Goldman Sachs for electronic trading on all asset classes, as well as all the consultant colleagues from E&Y and KPMG who have played critical roles in disseminating and fusing knowledge and practices on electronic trading. The views in this article are, however, personal, and by no means imply any endorsement by or representation of these institutions or colleagues.