Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Andreas S. Weigend is active.

Publication


Featured research published by Andreas S. Weigend.


International Journal of Neural Systems | 1997

A First Application of Independent Component Analysis to Extracting Structure from Stock Returns

Andrew D. Back; Andreas S. Weigend

This paper explores the application of a signal processing technique known as independent component analysis (ICA) or blind source separation to multivariate financial time series such as a portfolio of stocks. The key idea of ICA is to linearly map the observed multivariate time series into a new space of statistically independent components (ICs). We apply ICA to three years of daily returns of the 28 largest Japanese stocks and compare the results with those obtained using principal component analysis. The results indicate that the estimated ICs fall into two categories, (i) infrequent large shocks (responsible for the major changes in the stock prices), and (ii) frequent smaller fluctuations (contributing little to the overall level of the stocks). We show that the overall stock price can be reconstructed surprisingly well by using a small number of thresholded weighted ICs. In contrast, when using shocks derived from principal components instead of independent components, the reconstructed price is less similar to the original one. ICA is shown to be a potentially powerful method of analyzing and understanding driving mechanisms in financial time series. The application to portfolio optimization is described in Chin and Weigend (1998).
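The thresholded-reconstruction idea in this abstract can be sketched numerically. The sketch below assumes the independent components and mixing matrix are already known (it simulates them rather than estimating them with ICA), and all sizes and values are illustrative, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
T = 500  # daily observations (illustrative; the paper uses 3 years of data)

# Two hypothetical independent components: rare large shocks vs.
# frequent small fluctuations (the two categories found in the paper).
shocks = np.where(rng.random(T) < 0.02, rng.normal(0.0, 5.0, T), 0.0)
noise = rng.normal(0.0, 0.2, T)
S = np.vstack([shocks, noise])            # (n_components, T) component returns

A = np.array([[1.0, 0.8],                 # hypothetical mixing matrix
              [0.5, 1.2]])                # (n_stocks, n_components)
X = A @ S                                 # observed stock returns

# Key reconstruction idea: zero every component value below a threshold,
# keep only the large shocks, and rebuild the cumulative price path.
tau = 1.0
S_big = np.where(np.abs(S) > tau, S, 0.0)
price = np.cumsum(X, axis=1)              # original cumulative price path
approx = np.cumsum(A @ S_big, axis=1)     # path rebuilt from large shocks only

print(price.shape, approx.shape)
```

Thresholding wipes out essentially all of the small-fluctuation component, so the reconstructed path is driven by a handful of shocks, mirroring the paper's observation that a small number of thresholded weighted ICs suffice.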


Information Retrieval | 1999

Exploiting Hierarchy in Text Categorization

Andreas S. Weigend; Erik D. Wiener; Jan O. Pedersen

With the recent dramatic increase in electronic access to documents, text categorization—the task of assigning topics to a given document—has moved to the center of the information sciences and knowledge management. This article uses the structure that is present in the semantic space of topics in order to improve performance in text categorization: according to their meaning, topics can be grouped together into “meta-topics”, e.g., gold, silver, and copper are all metals. The proposed architecture matches the hierarchical structure of the topic space, as opposed to a flat model that ignores the structure. It accommodates both single and multiple topic assignments for each document. Its probabilistic interpretation allows its predictions to be combined in a principled way with information from other sources. The first level of the architecture predicts the probabilities of the meta-topic groups. This allows the individual models for each topic on the second level to focus on finer discriminations within the group. Evaluating the performance of a two-level implementation on the Reuters-22173 testbed of newswire articles shows the most significant improvement for rare classes.
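The two-level probabilistic combination described above amounts to the chain rule P(topic | doc) = P(meta-topic | doc) × P(topic | meta-topic, doc). A minimal sketch with made-up probabilities (not the paper's trained models; topic names are illustrative):

```python
# Hypothetical first-level predictions: P(meta-topic | document).
meta_probs = {"metals": 0.7, "grains": 0.3}

# Hypothetical second-level predictions: P(topic | meta-topic, document).
within = {
    "metals": {"gold": 0.5, "silver": 0.3, "copper": 0.2},
    "grains": {"wheat": 0.6, "corn": 0.4},
}

def topic_probabilities(meta_probs, within):
    """Chain rule: P(topic | doc) = P(meta | doc) * P(topic | meta, doc)."""
    out = {}
    for meta, p_meta in meta_probs.items():
        for topic, p_topic in within[meta].items():
            out[topic] = p_meta * p_topic
    return out

probs = topic_probabilities(meta_probs, within)
print(probs["gold"])  # 0.7 * 0.5 = 0.35
```

Because the outputs are probabilities, they can be combined with evidence from other sources in a principled way, which is the property the abstract emphasizes.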


IEEE Transactions on Neural Networks | 1994

Computing second derivatives in feed-forward networks: a review

Wray L. Buntine; Andreas S. Weigend

The calculation of second derivatives is required by recent training and analysis techniques for connectionist networks, such as the elimination of superfluous weights and the estimation of confidence intervals for both weights and network outputs. We review and develop exact and approximate algorithms for calculating second derivatives. For networks with |w| weights, simply writing the full matrix of second derivatives requires O(|w|^2) operations. For networks of radial basis units or sigmoid units, exact calculation of the necessary intermediate terms requires on the order of 2h+2 backward/forward-propagation passes, where h is the number of hidden units in the network. We also review and compare three approximations (ignoring some components of the second derivative, numerical differentiation, and scoring). The algorithms apply to arbitrary activation functions, networks, and error functions.
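Of the three approximations reviewed, numerical differentiation is the simplest to sketch. The toy loss below (a one-input, one-hidden-sigmoid-unit network, purely illustrative and not the paper's setup) shows the O(|w|^2) cost of filling the full matrix of second derivatives by central finite differences:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def loss(w, x, y):
    """Squared error of a tiny net: w = [w_in, b_hidden, w_out]."""
    h = sigmoid(w[0] * x + w[1])
    return 0.5 * (w[2] * h - y) ** 2

def numeric_hessian(f, w, eps=1e-5):
    """Approximate the Hessian by central finite differences
    (the 'numerical differentiation' approximation).
    Fills all |w| x |w| entries, i.e. O(|w|^2) function evaluations."""
    n = len(w)
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            wpp = w.copy(); wpp[i] += eps; wpp[j] += eps
            wpm = w.copy(); wpm[i] += eps; wpm[j] -= eps
            wmp = w.copy(); wmp[i] -= eps; wmp[j] += eps
            wmm = w.copy(); wmm[i] -= eps; wmm[j] -= eps
            H[i, j] = (f(wpp) - f(wpm) - f(wmp) + f(wmm)) / (4 * eps**2)
    return H

w = np.array([0.5, -0.3, 1.2])
H = numeric_hessian(lambda v: loss(v, x=1.0, y=0.8), w)
print(np.allclose(H, H.T))  # True: the Hessian of a smooth loss is symmetric
```

The double loop makes the |w|^2 scaling of the full second-derivative matrix explicit; the exact backpropagation-based algorithms in the paper avoid redundant passes but face the same storage cost.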


IEEE Transactions on Neural Networks | 1998

A bootstrap evaluation of the effect of data splitting on financial time series

Blake LeBaron; Andreas S. Weigend

This paper exposes problems with the commonly used technique of splitting the available data into training, validation, and test sets that are held fixed, warns against drawing too strong conclusions from such static splits, and shows potential pitfalls of ignoring variability across splits. Using a bootstrap or resampling method, we compare the uncertainty in the solution stemming from the data splitting with neural-network-specific uncertainties (parameter initialization, choice of number of hidden units, etc.). We present two results on data from the New York Stock Exchange. First, the variation due to different resamplings is significantly larger than the variation due to different network conditions. This result implies that it is important not to over-interpret a model (or an ensemble of models) estimated on one specific split of the data. Second, on each split, the neural-network solution with early stopping is very close to a linear model; no significant nonlinearities are extracted.
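The resampling half of this comparison can be sketched as follows. The data, model (ordinary least squares rather than a neural network), and split sizes are all illustrative, and the paper's second axis of variation (initialization, number of hidden units) is omitted:

```python
import numpy as np

rng = np.random.default_rng(1)
T = 400
x = rng.normal(size=T)
y = 0.3 * x + rng.normal(scale=1.0, size=T)   # synthetic "returns"

def fit_and_score(xtr, ytr, xte, yte):
    """Least-squares slope on the training part; test-set MSE."""
    beta = np.dot(xtr, ytr) / np.dot(xtr, xtr)
    return np.mean((yte - beta * xte) ** 2)

# Bootstrap the split itself: resample which observations land in the
# training vs. test set, and record the resulting test performance.
scores = []
for _ in range(200):
    idx = rng.permutation(T)
    tr, te = idx[:300], idx[300:]
    scores.append(fit_and_score(x[tr], y[tr], x[te], y[te]))

split_std = np.std(scores)
print(len(scores), round(split_std, 3))
```

The spread of `scores` quantifies how much the apparent performance depends on which split one happened to choose; the paper's point is that this spread dominates the spread due to network-specific choices.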


Archive | 1998

Discovering Structure in Finance Using Independent Component Analysis

Andrew D. Back; Andreas S. Weigend

Independent component analysis is a new signal processing technique. In this paper we apply it to a portfolio of Japanese stock price returns over three years of daily data and compare the results with those obtained using principal component analysis. The results indicate that the independent components fall into two categories: (i) infrequent but large shocks (responsible for the major changes in the stock prices), and (ii) frequent but rather small fluctuations (contributing little to the overall level of the stocks). The small number of major shocks indicates turning points in the time series and, when used to reconstruct the stock prices, gives good results in terms of morphology. In contrast, when shocks derived from principal components are used instead of independent components, the reconstructed price does not resemble the original nearly as well. Independent component analysis is shown to be a potentially powerful method of analysing and understanding driving mechanisms in financial time series.


International Journal of Neural Systems | 1997

Modeling Volatility Using State Space Models

Jens Timmer; Andreas S. Weigend

In time series problems, noise can be divided into two categories: dynamic noise which drives the process, and observational noise which is added in the measurement process, but does not influence future values of the system. In this framework, we show that empirical volatilities (the squared relative returns of prices) exhibit a significant amount of observational noise. To model and predict their time evolution adequately, we estimate state space models that explicitly include observational noise. We obtain relaxation times for shocks in the logarithm of volatility ranging from three weeks (for foreign exchange) to three to five months (for stock indices). In most cases, a two-dimensional hidden state is required to yield residuals that are consistent with white noise. We compare these results with ordinary autoregressive models (without a hidden state) and find that autoregressive models underestimate the relaxation times by about two orders of magnitude since they do not distinguish between observational and dynamic noise. This new interpretation of the dynamics of volatility in terms of relaxators in a state space model carries over to stochastic volatility models and to GARCH models, and is useful for several problems in finance, including risk management and the pricing of derivative securities. Data sets used: Olsen & Associates high frequency DEM/USD foreign exchange rates (8 years). Nikkei 225 index (40 years). Dow Jones Industrial Average (25 years).
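A minimal simulation of the dynamic-vs-observational noise distinction, with illustrative parameter values: an AR(1) hidden log-volatility has relaxation time tau = -1/ln(a), but an autoregressive fit applied directly to the noisy observations sees an attenuated coefficient and therefore a much shorter relaxation time, which is the underestimation the abstract describes.

```python
import numpy as np

rng = np.random.default_rng(2)
T = 2000
a = 0.95            # AR(1) coefficient of hidden log-volatility (illustrative)
q, r = 0.1, 0.5     # dynamic vs. observational noise standard deviations

# Hidden state: AR(1) process driven by dynamic noise.
h = np.zeros(T)
for t in range(1, T):
    h[t] = a * h[t - 1] + q * rng.normal()

# Observation: hidden state plus observational noise, which is added in
# the measurement and does NOT feed back into the dynamics.
y = h + r * rng.normal(size=T)

# Relaxation time of a shock in the AR(1) state: tau = -1 / ln(a).
tau = -1.0 / np.log(a)          # about 19.5 steps for a = 0.95

# Fitting an AR(1) directly to the noisy observations attenuates the
# estimated coefficient, so the relaxation time comes out far too short.
yc = y - y.mean()
a_hat = np.dot(yc[1:], yc[:-1]) / np.dot(yc, yc)
tau_hat = -1.0 / np.log(a_hat)

print(round(tau, 1), round(tau_hat, 1))
```

Here the observational noise variance swamps the hidden-state variance, so the naive autoregressive fit drastically underestimates the persistence of volatility shocks; a state space model with an explicit observation equation avoids this bias.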


IEEE Conference on Computational Intelligence for Financial Engineering Economics | 1998

What drives stock returns? An independent component analysis

Andrew D. Back; Andreas S. Weigend

The paper discusses the application of a signal processing technique known as independent component analysis (ICA), also called blind source separation, to multivariate financial time series. The key idea of ICA is to linearly map observed multivariate time series (such as a portfolio of stocks) into a new space of components that are statistically independent. The authors apply ICA to daily returns of the 28 largest Japanese stocks and compare the ICA results to principal component analysis. Their results indicate that the estimated ICs fall into two categories, (i) infrequent but large shocks (responsible for the major changes in the stock prices), and (ii) frequent but rather small fluctuations (contributing little to the overall level of the stocks). They show that the overall stock price can be reconstructed surprisingly well by thresholding the weighted ICs and using, on average, only one such shock per quarter. In contrast, when using shocks derived from principal components instead of independent components, the reconstructed price does not resemble the original one. The technique of ICA is shown to be a potentially powerful method to analyze and understand driving mechanisms in financial time series.


WIT Transactions on Information and Communication Technologies | 1970

Modeling Financial Data Using Clustering and Tree-based Approaches

Fei Chen; Stephen Figlewski; Andreas S. Weigend

This paper compares tree-based approaches with clustering. We model a set of three million T-bond futures transactions using these two techniques and compare their predictive performance on trade profit. We illustrate their respective strengths and weaknesses.
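A toy contrast between the two techniques, with entirely synthetic trade data (the paper's T-bond futures data and feature set are not reproduced here): a one-split tree predicts the mean profit on each side of a threshold, while clustering groups trades first and then predicts each cluster's mean profit.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1000
size = rng.normal(size=n)                      # hypothetical trade feature
profit = np.where(size > 0, 1.0, -1.0) + rng.normal(scale=0.5, size=n)

# Tree-style approach: one axis-aligned split, mean profit per side.
def stump_predict(size, profit, threshold=0.0):
    left = profit[size <= threshold].mean()
    right = profit[size > threshold].mean()
    return np.where(size > threshold, right, left)

# Clustering-style approach: 1-d k-means with two centers, then the mean
# profit of each cluster as the prediction.
def kmeans2_predict(size, profit, iters=20):
    centers = np.array([size.min(), size.max()])
    for _ in range(iters):
        labels = np.argmin(np.abs(size[:, None] - centers[None, :]), axis=1)
        for k in range(2):
            if np.any(labels == k):
                centers[k] = size[labels == k].mean()
    means = np.array([profit[labels == k].mean() for k in range(2)])
    return means[labels]

mse_tree = np.mean((profit - stump_predict(size, profit)) ** 2)
mse_clust = np.mean((profit - kmeans2_predict(size, profit)) ** 2)
print(round(mse_tree, 3), round(mse_clust, 3))
```

The tree conditions its groups on the target directly, while clustering groups on the features alone; which wins depends on how well the feature clusters align with profit, which is exactly the kind of trade-off the paper examines.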


Archive | 1998

Modeling Financial Time Series Using State Space Models

Jens Timmer; Andreas S. Weigend

In time series problems, noise can be divided into two categories: dynamic noise, which drives the process, and observational noise, which is added in the measurement process but does not influence future values of the system. In this framework, empirical volatilities (the squared relative returns of prices) exhibit a significant amount of observational noise. To model and predict their time evolution adequately, we estimate state space models that explicitly include observational noise. We obtain relaxation times for shocks in the logarithm of volatility. We compare these results with ordinary autoregressive models and find that autoregressive models underestimate the relaxation times by about two orders of magnitude because they do not distinguish between observational and dynamic noise. This new interpretation of the dynamics of volatility in terms of relaxators in a state space model carries over to stochastic volatility models and to GARCH models, and is useful for several problems in finance, including risk management and the pricing of derivative securities.


Archive | 1995

A neural network approach to topic spotting

Erik D. Wiener; Jan O. Pedersen; Andreas S. Weigend

Collaboration


Dive into Andreas S. Weigend's collaborations.

Top Co-Authors


Jens Timmer

University of Freiburg


Andrew W. Lo

Massachusetts Institute of Technology
