Anton Maximilian Schäfer
Siemens
Publications
Featured research published by Anton Maximilian Schäfer.
international conference on artificial neural networks | 2006
Anton Maximilian Schäfer; Hans Georg Zimmermann
Recurrent neural networks (RNNs) have been developed for a better understanding and analysis of open dynamical systems. Still, the question often arises whether RNNs are able to map every open dynamical system, which would be desirable for a broad spectrum of applications. In this article we give a proof of the universal approximation ability of RNNs in state space model form and extend it to Error Correction and Normalized Recurrent Neural Networks.
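The state space model form referred to in this abstract can be sketched as a recurrence s_t = tanh(A s_{t-1} + B x_t) with output y_t = C s_t (a minimal illustration of the standard formulation; the matrix names and weights below are illustrative, not taken from the paper):

```python
import math

def rnn_state_space(A, B, C, x_seq, s0):
    """Simulate an RNN in state space model form:
    s_t = tanh(A s_{t-1} + B x_t),  y_t = C s_t."""
    s, ys = list(s0), []
    for x in x_seq:
        # State transition: s <- tanh(A s + B x)
        s = [math.tanh(sum(a * sj for a, sj in zip(row, s)) +
                       sum(b * xk for b, xk in zip(B[i], x)))
             for i, row in enumerate(A)]
        # Output equation: y = C s
        ys.append([sum(c * sj for c, sj in zip(row, s)) for row in C])
    return ys

# Two hidden states, one input, one output, illustrative weights:
A = [[0.5, -0.3], [0.2, 0.4]]
B = [[1.0], [-0.5]]
C = [[1.0, 1.0]]
y = rnn_state_space(A, B, C, x_seq=[[1.0], [0.0], [0.5]], s0=[0.0, 0.0])
print(len(y))  # one output per time step -> 3
```

The universal approximation result concerns networks of this shape: with enough hidden states, the map from input sequences to output sequences can approximate any open dynamical system.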
International Journal of Neural Systems | 2007
Anton Maximilian Schäfer; Hans-Georg Zimmermann
Recurrent neural networks (RNNs) have been developed for a better understanding and analysis of open dynamical systems. Still, the question often arises whether RNNs are able to map every open dynamical system, which would be desirable for a broad spectrum of applications. In this article we give a proof of the universal approximation ability of RNNs in state space model form and extend it to Error Correction and Normalized Recurrent Neural Networks.
international conference on artificial neural networks | 2006
Anton Maximilian Schäfer; Steffen Udluft; Hans Georg Zimmermann
Recurrent neural networks (RNNs) unfolded in time are in theory able to map any open dynamical system. Still, they are often claimed to be unable to identify long-term dependencies in the data. In particular, when they are trained with backpropagation through time (BPTT), it is claimed that RNNs unfolded in time fail to learn inter-temporal influences more than ten time steps apart. This paper disproves this often-cited statement. We show that RNNs, and especially normalised recurrent neural networks (NRNNs), unfolded in time are indeed capable of learning time lags of at least a hundred time steps. We further demonstrate that the problem of a vanishing gradient does not apply to these networks.
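The vanishing gradient problem the paper addresses can be illustrated with a one-dimensional linear recurrence (a toy sketch, not the networks studied in the paper): for s_t = w * s_{t-1}, the gradient of s_T with respect to s_0 is w^T, which shrinks geometrically when |w| < 1 and is preserved when |w| = 1, the regime that normalisation aims to maintain:

```python
def gradient_through_time(w, T):
    """For the recurrence s_t = w * s_{t-1}, the gradient
    d s_T / d s_0 is the product of T identical factors w."""
    g = 1.0
    for _ in range(T):
        g *= w  # one factor per unfolded time step
    return g

print(gradient_through_time(0.9, 100))  # ~2.66e-05: the signal has vanished
print(gradient_through_time(1.0, 100))  # 1.0: preserved over 100 steps
```

Over a hundred time steps, even a mildly contracting factor wipes out the gradient, which is why keeping the recurrent dynamics near norm one matters for learning long time lags.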
international conference on artificial neural networks | 2006
Hans Georg Zimmermann; Lorenzo Bertolini; Ralph Grothmann; Anton Maximilian Schäfer; Christoph Tietz
In econometrics, the behaviour of financial markets is described by quantitative variables. Mathematical and statistical methods are used to explore economic relationships and to forecast future market development. However, econometric modelling is often limited to a single financial market. In the age of globalisation, financial markets are highly interrelated, and single-market analyses are therefore misleading. In this paper we present a new way to model the dynamics of coherent financial markets. Our approach is based on so-called dynamical consistent neural networks (DCNNs), which are able to map multiple scales and different sub-dynamics of the coherent market movement. Unlike standard econometric methods, small market movements are not treated as noise but as valuable market information. We apply the DCNN to forecast monthly movements of major foreign exchange (FX) rates. Based on the DCNN forecasts, we develop a technical trading indicator to support investment decisions.
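A forecast-based technical trading indicator of the kind mentioned at the end of the abstract can be sketched as a simple sign rule (an illustrative assumption; the paper does not specify its indicator, and the threshold below is hypothetical):

```python
def trading_signal(forecast_return, threshold=0.0):
    """Map a forecast monthly FX return to a position:
    +1 go long, -1 go short, 0 stay flat when the forecast
    lies inside the (-threshold, +threshold) band."""
    if forecast_return > threshold:
        return 1
    if forecast_return < -threshold:
        return -1
    return 0

# Three forecast monthly returns; the small one falls inside the band:
signals = [trading_signal(r, threshold=0.001) for r in (0.02, -0.005, 0.0004)]
print(signals)  # [1, -1, 0]
```

The band reflects the abstract's point about small movements: rather than trading on every tiny forecast, an indicator can treat near-zero forecasts as "no position" while still using them as information elsewhere in the model.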
the european symposium on artificial neural networks | 2008
Alexander Hans; Daniel Schneegaß; Anton Maximilian Schäfer; Steffen Udluft
Archive | 2007
Anton Maximilian Schäfer; Steffen Udluft
Archive | 2007
Anton Maximilian Schäfer; Steffen Udluft; Hans-Georg Zimmermann
Archive | 2007
Anton Maximilian Schäfer; Steffen Udluft
the european symposium on artificial neural networks | 2007
Anton Maximilian Schäfer; Steffen Udluft; Hans-Georg Zimmermann
Lecture Notes in Computer Science | 2006
Anton Maximilian Schäfer; Steffen Udluft; Hans Georg Zimmermann