Apostolos-Paul N. Refenes
London Business School
Publications
Featured research published by Apostolos-Paul N. Refenes.
Neural Networks | 1994
Apostolos-Paul N. Refenes; Achilleas Zapranis; Gavin Francis
We examine the use of neural networks as an alternative to classical statistical techniques for forecasting within the framework of the APT (arbitrage pricing theory) model for stock ranking. We show that neural networks outperform these statistical techniques in terms of forecasting accuracy and give in-sample model fit that is better by an order of magnitude. We identify intervals for the network parameter values over which these performance figures are statistically stable. Neural networks have been criticised for not being able to explain how they interact with their environment and how they reach an outcome. We show that, by using sensitivity analysis, neural networks can provide a reasonable explanation of their predictive behaviour and can model their environment more convincingly than regression models.
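For illustration only, the kind of sensitivity analysis referred to in this abstract can be sketched as follows: train a small feed-forward network on synthetic factor data and estimate the average partial derivative of the output with respect to each input by finite differences. The data, network size, and factor structure below are hypothetical placeholders, not the paper's APT setup.

```python
# Illustrative sketch only: a toy stand-in for a factor-based return model.
# The data, network size, and factor structure are hypothetical placeholders.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n, m = 500, 3                      # observations, explanatory factors
X = rng.normal(size=(n, m))        # hypothetical factor exposures
# hypothetical return-generating process with a mild nonlinearity
y = 0.8 * X[:, 0] - 0.3 * X[:, 1] ** 2 + 0.1 * X[:, 2] + 0.05 * rng.normal(size=n)

net = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0).fit(X, y)

def sensitivities(model, X, eps=1e-3):
    """Average absolute finite-difference derivative of the output w.r.t. each input."""
    sens = np.zeros(X.shape[1])
    for j in range(X.shape[1]):
        Xp, Xm = X.copy(), X.copy()
        Xp[:, j] += eps
        Xm[:, j] -= eps
        sens[j] = np.mean(np.abs(model.predict(Xp) - model.predict(Xm)) / (2 * eps))
    return sens

print(sensitivities(net, X))       # larger values = inputs the fitted network relies on more
```

Inputs with larger average derivatives are the ones the fitted network relies on most, which is the kind of explanation of predictive behaviour the abstract refers to.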
IEEE Transactions on Neural Networks | 1997
Apostolos-Paul N. Refenes; A.N. Burgess; Y. Bentz
Neural networks have shown considerable success in modeling financial data series. However, a major weakness of neural modeling is the lack of established procedures for testing misspecified models and for testing the statistical significance of the estimated parameters. This is a serious disadvantage in applications where there is a strong culture of testing not only the predictive power of a model or the sensitivity of the dependent variable to changes in the inputs, but also the statistical significance of the finding at a specified level of confidence. Rarely is this more important than in financial engineering, where the data-generating processes are dominantly stochastic and only partially deterministic. Partly a tutorial, partly a review, this paper describes a collection of typical applications in options pricing, cointegration, the term structure of interest rates, and models of investor behavior that highlight these weaknesses, and it proposes and evaluates a number of solutions. We describe alternative ways to deal with the problem of variable selection, show how to use model misspecification tests, deploy a novel cointegration-based approach to the problem of nonstationarity, and generally describe approaches to predictive neural modeling that are more in tune with the requirements for modeling financial data series.
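The cointegration-based treatment of nonstationarity mentioned above can be illustrated in outline with a standard Engle-Granger test: if two nonstationary series are cointegrated, their stationary spread can serve as a modeling input. The simulated price series and the 5% threshold below are hypothetical; this is a generic sketch, not the paper's specific procedure.

```python
# Generic illustration of a cointegration pre-test on two simulated price series;
# the data and significance threshold are hypothetical, not taken from the paper.
import numpy as np
import statsmodels.api as sm
from statsmodels.tsa.stattools import coint

rng = np.random.default_rng(1)
n = 1000
common = np.cumsum(rng.normal(size=n))           # shared stochastic trend
p1 = common + rng.normal(scale=0.5, size=n)      # two nonstationary prices driven by it
p2 = 0.7 * common + rng.normal(scale=0.5, size=n)

t_stat, p_value, _ = coint(p1, p2)               # Engle-Granger two-step test
print(f"cointegration p-value: {p_value:.3f}")

if p_value < 0.05:
    # stationary spread (residual of p1 regressed on p2) can serve as a model input
    beta = sm.OLS(p1, sm.add_constant(p2)).fit().params
    spread = p1 - (beta[0] + beta[1] * p2)
```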
Archive | 1999
Achilleas Zapranis; Apostolos-Paul N. Refenes
Contents: 1. Introduction; 2. Neural Model Identification; 3. Review of Current Practice in Neural Model Identification; 4. Neural Model Selection: the Minimum Prediction Risk Principle; 5. Variable Significance Testing: a Statistical Approach; 6. Model Adequacy Testing; 7. Neural Networks in Tactical Asset Allocation: a Case Study; 8. Conclusions. Appendices: A. Computation of Network Derivatives; B. Generating Random Normal Deviates. References.
Journal of Forecasting | 1999
Apostolos-Paul N. Refenes; Achilleas Zapranis
In recent years an impressive array of publications has appeared claiming considerable success for neural networks in modelling financial data, but sceptical practitioners and statisticians are still asking whether neural networks really are ‘a major breakthrough or just a passing fad’. A major reason for this is the lack of procedures for testing misspecified models and for testing the statistical significance of the estimated parameters, which makes it difficult to assess a model's significance and the possibility that any reported short-term successes are due to ‘data mining’. In this paper we describe a methodology for neural model identification which facilitates hypothesis testing at two levels: model adequacy and variable significance. The methodology includes a model selection procedure to produce consistent estimators, a variable selection procedure based on statistical significance, and a model adequacy procedure based on residuals analysis. We propose a novel, computationally efficient scheme for estimating the sampling variability of arbitrarily complex statistics for neural models and apply it to variable selection. The approach is based on sampling from the asymptotic distribution of the neural model's parameters (‘parametric sampling’). Controlled simulations are used for the analysis and evaluation of our model identification methodology. A case study in tactical asset allocation demonstrates how the methodology can be applied to real-life problems in a way analogous to stepwise forward regression analysis. Neural models are contrasted with multiple linear regression. The results indicate the presence of non-linear relationships in modelling the equity premium.
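The ‘parametric sampling’ scheme can be sketched in outline: draw parameter vectors from a normal approximation centred on the fitted weights and recompute the statistic of interest for each draw, which yields an estimate of its sampling variability. The parameter vector, covariance estimate, and statistic below are toy placeholders; the paper's own estimators differ in detail.

```python
# Outline of 'parametric sampling': resample parameters from their asymptotic
# normal approximation and recompute a statistic for each draw.  theta_hat,
# sigma_hat, and the statistic below are hypothetical placeholders.
import numpy as np

rng = np.random.default_rng(2)

theta_hat = np.array([0.9, -0.4, 0.2])      # fitted model parameters (toy values)
sigma_hat = 0.01 * np.eye(3)                # e.g. an inverse-Hessian estimate of Cov(theta)

def statistic(theta):
    """Placeholder statistic, e.g. a sensitivity measure for one input variable."""
    return theta[0] ** 2 + 0.5 * theta[1]

draws = rng.multivariate_normal(theta_hat, sigma_hat, size=2000)
values = np.apply_along_axis(statistic, 1, draws)

point = statistic(theta_hat)
se = values.std(ddof=1)                              # sampling variability of the statistic
lo, hi = np.percentile(values, [2.5, 97.5])          # rough 95% interval
print(f"estimate {point:.3f}, std. error {se:.3f}, 95% interval ({lo:.3f}, {hi:.3f})")
```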
Neurocomputing | 1997
Apostolos-Paul N. Refenes; Y. Bentz; D.W. Bunn; A.N. Burgess; Achilleas Zapranis
We propose a simple modification to the error backpropagation procedure which takes into account gradually changing input-output relations. The procedure is based on the principle of discounted least squares (DLS), whereby learning is biased towards more recent observations, with long-term effects experiencing exponential decay through time. This is particularly important in systems in which the structural relationship between input and response vectors changes gradually over time but certain elements of long-term memory are still retained. The procedure is implemented by a simple modification of the least-squares cost function commonly used in error backpropagation. We compare the performance of the two cost functions using both a controlled simulation experiment and a non-trivial application in estimating stock returns on the basis of multiple factor exposures. We show that in both cases the DLS procedure gives significantly better results. Typically, there is an average improvement of over 30% (in MSE terms) for the stock return modelling problem.
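A minimal sketch of the discounted least-squares idea, assuming a simple exponential discount factor applied to each observation's squared error (the paper's exact discount schedule and normalisation may differ):

```python
# Minimal sketch of a discounted least-squares (DLS) cost: squared errors are
# weighted so that recent observations count more and older ones decay
# exponentially.  The discount factor and data are hypothetical.
import numpy as np

def dls_cost(y_true, y_pred, discount=0.98):
    """Discounted least-squares cost; observation t (t = 1..n, oldest first)
    receives weight discount**(n - t)."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    n = len(y_true)
    weights = discount ** np.arange(n - 1, -1, -1)   # oldest observation -> smallest weight
    return np.sum(weights * (y_true - y_pred) ** 2) / np.sum(weights)

y_true = np.array([1.0, 1.2, 0.9, 1.1])
y_pred = np.array([1.1, 1.0, 1.0, 1.0])
print(dls_cost(y_true, y_pred))   # discount=1.0 recovers the ordinary least-squares cost
```

When used inside backpropagation, the same weights simply multiply each observation's contribution to the gradient, so older examples pull the weights less than recent ones.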
Defence and Peace Economics | 1995
Apostolos-Paul N. Refenes; Christos Kollias; Achilleas Zapranis
Greece has regularly ranked as the country with the highest defence burden in NATO and the European Union. Over the past decades it has allocated an average of 6% of GDP to defence each year. This study uses neural networks to examine the external security determinants of Greek military expenditure in the context of the ongoing Greek-Turkish conflict.
Archive | 1999
Achilleas Zapranis; Apostolos-Paul N. Refenes
Suppose that the sample \( D_n = \left\{ (x_{i,1}, x_{i,2}, \ldots, x_{i,m}, y_i) \right\}_{i=1}^{n} = \left\{ (x_i, y_i) \right\}_{i=1}^{n} \) comprises n independent observations on m explanatory variables \( x_j \), \( j = 1, \ldots, m \), and one dependent variable y, and that each observation can be regarded as a realization of an (m + 1)-dimensional distribution function \( \Xi(x, y) = \Psi(y \mid x)\,\Omega(x) \) (the operating model), which will sometimes also be denoted by F for simplicity. We view the observations as being generated by an unknown function \( \phi(x) \) with the addition of a stochastic component, commonly taken to be independently and identically distributed (i.i.d.) with zero mean and constant variance \( \sigma^2 \), i.e.
Archive | 1999
Achilleas Zapranis; Apostolos-Paul N. Refenes
Archive | 1999
Achilleas Zapranis; Apostolos-Paul N. Refenes
\( y_i = \phi(x_i) + \varepsilon_i \)
Archive | 1999
Achilleas Zapranis; Apostolos-Paul N. Refenes