Mark Salmon
University of Cambridge
Publications
Featured research published by Mark Salmon.
Computing in Economics and Finance | 2005
Matthew Hurd; Mark Salmon; Christoph Schleicher
We model the joint risk neutral distribution of the euro-sterling and the dollar-sterling exchange rates using option-implied marginal distributions that are connected via a copula function that satisfies the triangular no-arbitrage condition. We then derive a univariate distribution for a simplified sterling effective exchange rate index (ERI). Our results indicate that standard parametric copula functions, such as the commonly used Normal and Frank copulas, fail to capture the degree of asymmetry observed in the data. We overcome this problem by using a non-parametric dependence function in the form of a Bernstein copula, which is shown to produce a very close fit. We further give an example of how our approach can be used to price currency index options.
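For intuition, the sketch below shows the basic mechanics of coupling two exchange-rate marginals through a copula and simulating a simplified index. It is not the paper's method: a Gaussian copula and lognormal marginals stand in for the Bernstein copula and the option-implied densities, and the index weights are purely illustrative.

```python
# Sketch: coupling two exchange-rate marginals with a Gaussian copula and
# simulating a simplified sterling effective exchange rate index (ERI).
# Illustrative only: lognormal marginals and a Gaussian copula stand in for
# the paper's option-implied densities and Bernstein copula.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 100_000
rho = 0.4                                   # assumed euro-sterling / dollar-sterling dependence

# Step 1: draw from a Gaussian copula (correlated uniforms).
z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)
u = stats.norm.cdf(z)

# Step 2: push the uniforms through the marginal quantile functions.
eur_gbp = stats.lognorm(s=0.10, scale=1.15).ppf(u[:, 0])
usd_gbp = stats.lognorm(s=0.12, scale=1.27).ppf(u[:, 1])

# Step 3: a simplified sterling ERI as a weighted geometric average of the
# two bilateral rates (weights are illustrative).
w = 0.6
eri = eur_gbp ** w * usd_gbp ** (1.0 - w)

print(f"ERI mean {eri.mean():.4f}, std {eri.std():.4f}, skewness {stats.skew(eri):.4f}")
```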
international conference on machine learning and applications | 2014
Andreea-Ingrid Funie; Mark Salmon; Wayne Luk
Advances in high frequency trading in financial markets have exceeded the ability of regulators to monitor market stability, creating the need for tools that go beyond market microstructure theory and examine markets in real time, driven by algorithms as they are in practice. This paper investigates the design, performance and stability of high frequency trading rules using a hybrid evolutionary algorithm based on genetic programming, with particle swarm optimisation layered on top to improve the performance of the genetic operators. Our algorithm learns relevant trading signal information from Foreign Exchange market data. Execution time is significantly reduced by implementing computationally intensive tasks using Field Programmable Gate Array technology. This approach is shown to provide a reliable platform for examining the stability and nature of optimal trading strategies under different market conditions, through robust statistical results on the performance of the optimal rules and their economic value.
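The sketch below illustrates only the particle-swarm layer of such a hybrid: a minimal PSO tunes the window lengths of a moving-average crossover rule on synthetic FX prices. The rule structure, the genetic-programming component and the FPGA acceleration of the paper are not represented, and all parameters and data are placeholders.

```python
# Sketch: particle swarm optimisation tuning the two window lengths of a
# moving-average crossover rule on synthetic FX prices. In the paper PSO is
# layered on top of genetic programming (which evolves the rule structure)
# and the heavy computation runs on FPGAs; this shows the PSO layer only.
import numpy as np

rng = np.random.default_rng(1)
prices = 1.30 * np.exp(np.cumsum(rng.normal(0, 1e-4, 5_000)))   # synthetic FX mid prices

def moving_average(x, w):
    c = np.cumsum(np.insert(x, 0, 0.0))
    return (c[w:] - c[:-w]) / w

def fitness(params):
    """In-sample log return of a moving-average crossover rule."""
    wf = int(np.clip(params[0], 2, 50))
    ws = int(np.clip(params[1], 10, 200))
    if wf >= ws:
        return -1e9
    ma_fast = moving_average(prices, wf)[ws - wf:]   # align both MAs on the same ticks
    ma_slow = moving_average(prices, ws)
    position = np.where(ma_fast[:-1] > ma_slow[:-1], 1.0, -1.0)   # held over the next tick
    returns = np.diff(np.log(prices[ws - 1:]))
    return float(np.sum(position * returns))

# Minimal particle swarm over the two window lengths.
n_particles, n_iter = 20, 30
pos = rng.uniform([2, 10], [50, 200], size=(n_particles, 2))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([fitness(p) for p in pos])
gbest = pbest[np.argmax(pbest_val)]

for _ in range(n_iter):
    r1, r2 = rng.random((2, n_particles, 1))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel
    vals = np.array([fitness(p) for p in pos])
    better = vals > pbest_val
    pbest[better], pbest_val[better] = pos[better], vals[better]
    gbest = pbest[np.argmax(pbest_val)]

print("best windows (fast, slow):", np.round(gbest, 1),
      "in-sample log return:", round(fitness(gbest), 5))
```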
Archive | 2008
Roman Kozhan; Mark Salmon
This paper examines the predictability of exchange rates at the transaction level using both past transaction prices and the structure of the order book. In contrast to the existing literature, we also recognise that the trader may be subject to (Knightian) uncertainty, as opposed to risk, regarding the structure by which exchange rates are determined, and hence regarding both the model he employs to make predictions and the reliability of any conditioning information. This uncertainty leaves the trader facing a two-stage decision problem: first, a question of market timing, when to enter the market, and second, how to trade. We provide a formalisation of this two-stage decision problem. Statistical tests indicate significant out-of-sample ability to predict directional changes, and the economic value of this predictability, using one week of tick-by-tick data on the USD-DM exchange rate drawn from the Reuters DM2002 electronic trading system. These conclusions rest critically on the frequency of trading, which is controlled by an inertia parameter reflecting the degree of uncertainty; trading too frequently significantly reduces profitability once transaction costs are taken into account.
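A minimal sketch of the second-stage trading decision is given below: positions are taken only when a directional signal clears an inertia threshold, and profitability is computed net of transaction costs. The signal here is a synthetic placeholder rather than the paper's order-book-based forecast under Knightian uncertainty, and all parameters are invented for illustration.

```python
# Sketch: trading on directional predictions only when they clear an
# "inertia" threshold, net of transaction costs. The predictive signal is a
# placeholder; the paper builds it from past transaction prices and the
# order book under Knightian uncertainty.
import numpy as np

rng = np.random.default_rng(2)
n = 10_000
true_drift = rng.normal(0, 1e-4, n)                  # latent direction per tick
returns = true_drift + rng.normal(0, 5e-4, n)        # realised log returns
signal = true_drift + rng.normal(0, 2e-4, n)         # noisy directional forecast

def pnl(inertia, cost=1e-4):
    """Net P&L of trading only when |signal| exceeds the inertia threshold."""
    position = np.where(signal > inertia, 1.0, np.where(signal < -inertia, -1.0, 0.0))
    trades = np.abs(np.diff(position, prepend=0.0))  # each position change incurs a cost
    return float(np.sum(position * returns) - cost * np.sum(trades))

for inertia in [0.0, 1e-4, 2e-4, 4e-4]:
    print(f"inertia={inertia:.0e}  net P&L={pnl(inertia):+.4f}")
```

With these placeholder numbers, a very low inertia value trades on every tick and the transaction costs erode the gross predictive gain, mirroring the qualitative point made in the abstract.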
Application-Specific Systems, Architectures, and Processors | 2015
Andreea Ingrid Funie; Paul Grigoras; Pavel Burovskiy; Wayne Luk; Mark Salmon
In recent years, examining financial markets has become a crucial part of both the trading and regulatory processes. Recently, genetic programs have been used to identify patterns in financial markets which may lead to more advanced trading strategies. We investigate the use of Field Programmable Gate Arrays to accelerate the evaluation of the fitness function, an important kernel in genetic programming. Our pipelined design makes use of the massive amounts of parallelism available on chip to evaluate the fitness of multiple genetic programs simultaneously. An evaluation of our designs on both synthetic and historical market data shows that our implementation evaluates the fitness function up to 21.56 times faster than a multi-threaded C++11 implementation running on two six-core Intel Xeon E5-2640 processors using OpenMP.
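As a software analogue of evaluating many genetic programs simultaneously, the sketch below scores a whole population of candidate rules against the same market data in one vectorised pass. The candidate "programs" are reduced to single-threshold rules and the data are synthetic, so this illustrates only the batch-evaluation idea, not the pipelined FPGA design.

```python
# Sketch: batch fitness evaluation of many candidate rules over the same
# market data in one vectorised pass -- a software analogue of evaluating
# multiple genetic programs simultaneously in a hardware pipeline.
import numpy as np

rng = np.random.default_rng(3)
ticks = 20_000
returns = rng.normal(0, 5e-4, ticks)                 # synthetic per-tick log returns
indicator = np.roll(returns, 1)
indicator[0] = 0.0                                   # lagged return as a toy signal

# A population of candidate rules, each reduced here to a single threshold.
thresholds = rng.uniform(0, 1e-3, size=256)

# Evaluate every rule on every tick in one (population x ticks) pass.
positions = np.sign(indicator)[None, :] * (np.abs(indicator)[None, :] > thresholds[:, None])
fitness = positions @ returns                        # cumulative return of each rule

best = int(np.argmax(fitness))
print(f"best of {len(thresholds)} rules: threshold={thresholds[best]:.2e}, "
      f"fitness={fitness[best]:+.4f}")
```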
Archive | 2008
Wing Wah Tham; Mark Salmon
The paper approaches the modeling of the yield curve from a stochastic volatility perspective based on time deformation. The way in which we model time deformation is new, differs from the alternatives that currently exist in the literature, and is based on the market microstructure theory of the impact of information flow on a market. We model the stochastic volatility process by treating the instantaneous volatility as a function of price intensity, in the spirit of Cho and Frees (1988), Engle and Russell (1998) and Gerhard and Hautsch (2002). One contribution of the paper therefore lies in the introduction of a new transaction-level approach to the econometric modelling of stochastic volatility in a multivariate framework, exploiting the intensity-based point processes previously used by Bowsher (2003) and Hall and Hautsch (2003). We find that the individual yields of U.S. Treasury notes and bonds appear to be driven by different operational clocks, as suggested by the market segmentation theory of the term structure, but that these are related to each other through a multivariate Hawkes model which effectively coordinates activity along the yield curve. The results offer some support to the market segmentation or preferred habitat models, as the univariate Hawkes models we find at each maturity are statistically significantly different from each other and the major impact on each maturity is activity at that maturity. However, there are flows between the different maturities that die away relatively quickly, which indicates that the markets are not completely segmented. Diagnostic tests show that the point process models are relatively well specified, and a robustness comparison with realized volatility indicates the close relationship between the two estimators of integrated volatility, but also some differences between the structural intensity model and the model-free realized volatility. We also show that bond returns standardized by the instantaneous volatility estimated from our Hawkes model are Gaussian, which is consistent with the theory of time deformation for security prices quite generally.
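The sketch below simulates a univariate Hawkes process with an exponential kernel via Ogata's thinning algorithm, the basic building block behind intensity-based approaches of this kind. The parameters are illustrative; the paper's model is multivariate across maturities, with the estimated intensity feeding an instantaneous volatility measure.

```python
# Sketch: a univariate Hawkes process with exponential kernel, simulated via
# Ogata's thinning algorithm. Intensity:
#   lambda(t) = mu + sum_{t_i < t} alpha * exp(-beta * (t - t_i)).
# The paper uses a multivariate Hawkes model across maturities.
import numpy as np

rng = np.random.default_rng(4)
mu, alpha, beta, horizon = 0.5, 0.6, 1.2, 200.0   # alpha/beta < 1 keeps the process stationary

def intensity(t, events):
    past = events[events < t]
    return mu + np.sum(alpha * np.exp(-beta * (t - past)))

events, t = np.array([]), 0.0
while t < horizon:
    lam_bar = intensity(t, events) + alpha        # upper bound on the intensity after time t
    t += rng.exponential(1.0 / lam_bar)
    if t < horizon and rng.random() * lam_bar <= intensity(t, events):
        events = np.append(events, t)

print(f"{len(events)} events over [0, {horizon:.0f}]; "
      f"empirical rate {len(events) / horizon:.2f} vs theoretical {mu / (1 - alpha / beta):.2f}")
```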
Archive | 2008
Ba M. Chu; Mark Salmon
Stein (1972, 1986) provides a flexible method for measuring the deviation of any probability distribution from a given distribution, effectively giving an upper bound on the approximation error, which can be represented as the expectation of a Stein operator. Hosking (1990, 1992) proposes the concept of L-moments, which summarize the characteristics of a distribution better than conventional moments (C-moments). The purpose of the paper is to propose new tests for conditional parametric distribution functions with weakly dependent and strictly stationary data generating processes (DGP) by constructing a set of Stein equations as the L-statistics of conceptual ordered sub-samples drawn from the population sample of the distribution; these are hereafter referred to as the generalized method of L-moments (GMLM) tests. The limiting distributions of our tests are nonstandard, depending on the test criterion functions used in the conditional L-statistics restrictions; the covariance kernel in the tests reflects the parametric dependence specification. The GMLM tests can resolve the choice of orthogonal polynomials, which remains an identification issue in GMM tests using the Stein approximation (Bontemps and Meddahi, 2005, 2006), because L-moments are simply expectations of quantiles which can be linearly combined to characterize a distribution function. Thus, our test statistics can be represented as functions of the quantiles of the conditional distribution under the null hypothesis. In the broad context of goodness-of-fit tests based on order statistics, the methodologies developed in the paper differ from existing methods, such as tests based on the (weighted) distance between the empirical distribution and a parametric distribution under the null, or the likelihood-ratio-based tests of Zhang (2002), in two respects: 1) our tests are motivated by L-moment theory and Stein's method; 2) they offer more flexibility because we can select an optimal number of L-moments so that the sample size necessary for a test to attain a given level of power is minimal. Finally, we provide some Monte Carlo simulations for IID data to examine the size, power and robustness of the GMLM test and compare it with both existing moment-based tests and tests based on order statistics.
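For reference, the sketch below computes the first four sample L-moments from probability-weighted moments of the ordered sample, the unconditional building block underlying L-statistics. The conditional L-statistics, Stein equations and test construction of the paper are not reproduced here.

```python
# Sketch: the first four sample L-moments computed from probability-weighted
# moments b_r of the ordered sample. This is the unconditional building block;
# the paper works with conditional L-statistics and Stein equations.
import numpy as np

def sample_l_moments(x):
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    j = np.arange(1, n + 1)
    # Probability-weighted moments b_r = n^{-1} * sum_j [C(j-1, r) / C(n-1, r)] * x_(j)
    b0 = x.mean()
    b1 = np.sum((j - 1) / (n - 1) * x) / n
    b2 = np.sum((j - 1) * (j - 2) / ((n - 1) * (n - 2)) * x) / n
    b3 = np.sum((j - 1) * (j - 2) * (j - 3) / ((n - 1) * (n - 2) * (n - 3)) * x) / n
    l1 = b0
    l2 = 2 * b1 - b0
    l3 = 6 * b2 - 6 * b1 + b0
    l4 = 20 * b3 - 30 * b2 + 12 * b1 - b0
    return l1, l2, l3 / l2, l4 / l2    # mean, L-scale, L-skewness, L-kurtosis

rng = np.random.default_rng(5)
print("normal sample:", np.round(sample_l_moments(rng.normal(size=10_000)), 3))
# For a normal distribution: L-skewness ~ 0 and L-kurtosis ~ 0.1226.
```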
Archive | 2007
Roman Kozhan; Mark Salmon
In this paper we test whether investors are uncertainty averse during a real-life trading process in the foreign exchange market. We do this through an agent-based model in which fundamentalist and chartist beliefs about the exchange rate are allowed to be either uncertainty neutral or uncertainty averse. Uncertainty aversion is modelled via the maxmin expected utility approach. We find that traders are uncertainty averse in the FX market. The estimation results show that the inclusion of uncertainty-averse agents improves the performance of the model and that the uncertainty aversion parameter is significantly different from zero. Fundamentalists are found to be uncertainty neutral, while chartists are mainly uncertainty averse.
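The sketch below illustrates the maxmin expected utility rule in isolation: an uncertainty-averse agent evaluates each action against a set of candidate probability models and picks the action with the best worst-case expected payoff. The payoffs and priors are invented for illustration; in the paper the prior set is tied to an estimated uncertainty-aversion parameter.

```python
# Sketch: maxmin expected utility. An uncertainty-averse agent evaluates each
# action under a set of candidate models (priors) and chooses the action with
# the best worst-case expected payoff. Payoffs and priors below are invented
# purely for illustration.
import numpy as np

# Payoffs of three actions (rows: buy, hold, sell) in three states of the world.
payoffs = np.array([
    [ 2.0,  0.5, -2.5],   # buy
    [ 0.0,  0.0,  0.0],   # hold
    [-2.0, -0.5,  2.5],   # sell
])

# A set of candidate probability models over the three states.
priors = np.array([
    [0.5, 0.3, 0.2],
    [0.4, 0.3, 0.3],
    [0.3, 0.3, 0.4],
])

expected = payoffs @ priors.T                        # expected payoff per action under each prior
maxmin_choice = np.argmax(expected.min(axis=1))      # worst case over priors, then best action
bayes_choice = np.argmax(expected[:, 0])             # uncertainty-neutral agent with one prior

print("uncertainty-averse (maxmin) action:", ["buy", "hold", "sell"][maxmin_choice])
print("uncertainty-neutral action:        ", ["buy", "hold", "sell"][bayes_choice])
```

With these numbers the uncertainty-neutral agent buys, while the maxmin agent stays out of the market because the worst-case model makes buying unattractive, which is the behavioural difference the estimation in the paper exploits.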
Self-aware Computing Systems | 2016
Maciej Kurek; Tobias Becker; Ce Guo; Stewart Denholm; Andreea-Ingrid Funie; Mark Salmon; Tim Todman; Wayne Luk
This chapter describes self-awareness in four financial applications. We apply some of the design patterns of Chapter 5 and techniques of Chapter 7. We describe three applications briefly, highlighting the links to self-awareness and self-expression. The applications are (i) a hybrid genetic programming and particle swarm optimisation approach for high-frequency trading, with fitness function evaluation accelerated by FPGA; (ii) an adaptive point process model for currency trading, accelerated by FPGA hardware; (iii) an adaptive line arbitrator synthesising high-reliability and low-latency feeds from redundant data feeds (A/B feeds) using FPGA hardware. Finally, we describe in more detail a generic optimisation approach for reconfigurable designs automating design optimisation, using reconfigurable hardware to speed up the optimisation process, applied to applications including a quadrature-based financial application. In each application, the hardware-accelerated self-aware approaches give significant benefits: up to 55× speedup for hardware-accelerated design optimisation compared to software hill climbing.
Archive | 2008
Nektaria V. Karakatsani; Mark Salmon
The time-series relationship between investor sentiment and market returns, in particular the direction and size of the effects, remains ambiguous, having typically been assessed under the restrictive assumption of linearity. This paper reveals the presence of four intuitive regimes in price and sentiment formation in the US stock market at the monthly level over the period 1965-2003, even after controlling for various economic and financial factors. An optimistic state of high returns (occurrence probability: 44%) alternates with a pessimistic state of low returns (35%), while two infrequent, highly volatile states capture temporal irregularities: episodes of extreme negative returns and strong pessimism (13%) and a reversal phase of intense optimism (8%). Five main findings arise: i) In the high-return (low-return) state, only individual (institutional) sentiment is influential, acting as a contrarian (momentum) signal for the subsequent return and responding positively (negatively) but weakly to its lagged value. In the former case, the impact of sentiment is consistent with correction of a previous mispricing, possibly induced by individuals, while in the latter it indicates institutions' correct predictive ability. ii) The impact of institutional sentiment is substantial but constrained in the pessimistic state, while the effect of individual sentiment is moderate but augmented substantially at irregular times. iii) Individuals interpret institutional optimism as a positive signal, whereas institutions perceive individuals' optimism as a contrarian indicator. iv) Total arbitrage cost exerts a positive impact on both subsequent returns and institutional optimism. v) Interest rate reductions amplify investors' optimism at irregular times, most evidently during the market reversal phase.
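A hedged sketch of this kind of econometric setup is given below: a four-regime Markov-switching regression of returns on a sentiment proxy with regime-dependent variance, fitted with statsmodels on simulated monthly data. The specification is deliberately simplified relative to the paper, which uses actual US data, both individual and institutional sentiment, and additional economic and financial controls.

```python
# Sketch: a four-regime Markov-switching regression of returns on a sentiment
# proxy with regime-dependent variance. Data are simulated (regimes drawn
# i.i.d. for simplicity) and the specification is much simpler than the paper's.
import numpy as np
from statsmodels.tsa.regime_switching.markov_regression import MarkovRegression

rng = np.random.default_rng(6)
n = 480                                              # roughly 40 years of monthly observations
sentiment = rng.normal(size=n)
regime = rng.choice(4, size=n, p=[0.44, 0.35, 0.13, 0.08])
mu = np.array([0.02, -0.01, -0.08, 0.06])[regime]    # regime-specific mean returns
beta = np.array([-0.5, 0.8, 1.5, -1.0])[regime]      # regime-specific sentiment loadings
sigma = np.array([0.03, 0.03, 0.10, 0.08])[regime]   # regime-specific volatilities
returns = mu + beta * 0.01 * sentiment + sigma * rng.normal(size=n)

model = MarkovRegression(returns, k_regimes=4, exog=sentiment, switching_variance=True)
result = model.fit(search_reps=20)                   # random restarts help with four regimes
print(result.summary())
print("expected regime durations:", np.round(result.expected_durations, 1))
```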
Signal Processing Systems | 2018
Andreea Ingrid Funie; Paul Grigoras; Pavel Burovskiy; Wayne Luk; Mark Salmon
Genetic programming can be used to identify complex patterns in financial markets which may lead to more advanced trading strategies. However, the computationally intensive nature of genetic programming makes it difficult to apply to real-world problems, particularly in real-time constrained scenarios. In this work we propose the use of Field Programmable Gate Array technology to accelerate the fitness evaluation step, one of the most computationally demanding operations in genetic programming. We develop a fully-pipelined, mixed-precision design that uses run-time reconfiguration to accelerate fitness evaluation, and show that run-time reconfiguration can reduce resource consumption by a factor of 2 compared to previous solutions on certain configurations. The proposed design is up to 22 times faster than an optimised, multithreaded software implementation while achieving comparable financial returns.
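The sketch below illustrates the precision trade-off in software terms only: the fitness reduction is computed in single precision and checked against a double-precision reference on synthetic data. Run-time reconfiguration and per-stage precision choice are hardware features of the paper's design with no direct analogue here.

```python
# Sketch: the mixed-precision idea in software -- evaluate a fitness reduction
# in float32 and compare it against a float64 reference. The paper's design
# instead selects precisions per pipeline stage on an FPGA and uses run-time
# reconfiguration, which has no direct software analogue.
import numpy as np

rng = np.random.default_rng(7)
ticks = 200_000
returns64 = rng.normal(0, 5e-4, ticks)                        # double-precision reference data
signals32 = rng.normal(0, 1, (16, ticks)).astype(np.float32)  # 16 candidate programs' signals

def fitness(signals, returns):
    """Cumulative signed return of each candidate, accumulated in the input dtype."""
    return np.sign(signals) @ returns

f32 = fitness(signals32, returns64.astype(np.float32))
f64 = fitness(signals32.astype(np.float64), returns64)

rel_err = np.max(np.abs(f32 - f64) / np.maximum(np.abs(f64), 1e-12))
print(f"max relative error of float32 vs float64 fitness: {rel_err:.2e}")
```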