Publication


Featured research published by Joseph Rynkiewicz.


Neurocomputing | 2012

General bound of overfitting for MLP regression models

Joseph Rynkiewicz

Multilayer perceptrons (MLP) with one hidden layer have long been used for non-linear regression. However, in some tasks, MLPs are too powerful, and a small mean square error (MSE) may be due more to overfitting than to actual modeling. If the noise of the regression model is Gaussian, the overfitting of the model is entirely determined by the behavior of the likelihood ratio test statistic (LRTS); however, in numerous cases the assumption of normality of the noise is arbitrary, if not false. In this paper, we present a universal bound for the overfitting of such models under weak assumptions; the bound holds without Gaussian or identifiability assumptions. The main application of this bound is to give a hint about determining the true architecture of the MLP model as the number of data points goes to infinity. As an illustration, we use this theoretical result to propose and compare effective criteria for finding the true architecture of an MLP.
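The architecture-selection idea in the abstract can be sketched as a penalized training-error criterion. This is a minimal illustration, not the paper's actual estimator: random tanh features stand in for trained MLP hidden units, the Laplace noise mimics the non-Gaussian setting, and the BIC-like penalty is an assumption on our part.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data with non-Gaussian (Laplace) noise.
n = 400
x = rng.uniform(-2, 2, size=(n, 1))
y = np.tanh(2 * x[:, 0]) + rng.laplace(scale=0.1, size=n)

def fit_mse(k):
    """Least-squares fit with k random tanh features (a fixed-hidden-layer
    stand-in for a k-unit MLP); returns training MSE and parameter count."""
    w = rng.normal(size=(1, k))
    b = rng.normal(size=k)
    h = np.tanh(x @ w + b)                      # hidden activations, (n, k)
    design = np.column_stack([np.ones(n), h])   # add an intercept column
    beta, *_ = np.linalg.lstsq(design, y, rcond=None)
    resid = y - design @ beta
    return np.mean(resid ** 2), design.shape[1]

# Penalized criterion: n*log(MSE_k) + log(n)*dim_k, evaluated over candidate sizes.
scores = {}
for k in range(1, 9):
    mse, dim = fit_mse(k)
    scores[k] = n * np.log(mse) + np.log(n) * dim

best_k = min(scores, key=scores.get)
print("selected number of units:", best_k)
```

The penalty term keeps the criterion from always preferring the largest model, which is the role the overfitting bound plays in justifying such criteria.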


Workshop on Self-Organizing Maps | 2006

Self-organizing map algorithm and distortion measure

Joseph Rynkiewicz

We study the statistical meaning of the minimization of distortion measure and the relation between the equilibrium points of the SOM algorithm and the minima of the distortion measure. If we assume that the observations and the map lie in a compact Euclidean space, we prove the strong consistency of the map which almost minimizes the empirical distortion. Moreover, after calculating the derivatives of the theoretical distortion measure, we show that the points minimizing this measure and the equilibria of the Kohonen map do not match in general. We illustrate, with a simple example, how this occurs.
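The empirical distortion measure studied here can be computed directly. The following is a small sketch under our own assumptions: a 1-D map lattice, a Gaussian neighborhood kernel (the paper's kernel is generic), and random code vectors in the unit square as the compact set.

```python
import numpy as np

rng = np.random.default_rng(1)

# Observations and a small 1-D map of code vectors, both in a compact set.
data = rng.uniform(0, 1, size=(500, 2))
codebook = rng.uniform(0, 1, size=(5, 2))   # 5 units on a line lattice

def neighborhood(i, j, sigma=1.0):
    """Gaussian neighborhood kernel on the 1-D lattice (an assumed choice)."""
    return np.exp(-((i - j) ** 2) / (2 * sigma ** 2))

def empirical_distortion(data, codebook, sigma=1.0):
    """Mean over observations of sum_j h(c(x), j) * ||x - m_j||^2,
    where c(x) is the best-matching unit of observation x."""
    k = len(codebook)
    d2 = ((data[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)  # (n, k)
    bmu = d2.argmin(axis=1)                                        # c(x) per row
    h = neighborhood(bmu[:, None], np.arange(k)[None, :], sigma)
    return (h * d2).sum(axis=1).mean()

print(empirical_distortion(data, codebook))
```

A map that "almost minimizes" this quantity over the data is the object whose strong consistency the paper establishes; the SOM update rule's equilibria need not coincide with its minimizers.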


Neurocomputing | 2008

Estimating the number of components in a mixture of multilayer perceptrons

Madalina Olteanu; Joseph Rynkiewicz

The Bayesian information criterion (BIC) is widely used by the neural-network community for model selection, although its convergence properties are not always theoretically established. In this paper we focus on estimating the number of components in a mixture of multilayer perceptrons and on proving the convergence of the BIC criterion in this framework. The penalized marginal likelihood for mixture models and hidden Markov models introduced by Keribin [Consistent estimation of the order of mixture models, Sankhya Indian J. Stat. 62 (2000) 49-66] and Gassiat [Likelihood ratio inequalities with applications to various mixtures, Ann. Inst. Henri Poincare 38 (2002) 897-906], respectively, is extended to mixtures of multilayer perceptrons, for which a penalized-likelihood criterion is proposed. We prove its convergence under hypotheses involving essentially the bracketing entropy of the generalized score-function class, and we illustrate it with numerical examples.
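The component-counting procedure can be illustrated on a much simpler family. This sketch uses a 1-D Gaussian mixture fitted by plain EM as a stand-in for the mixture of MLP experts, and scores each candidate order with the BIC penalty; all modeling choices here are ours, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(2)

# Sample from a true two-component mixture.
n = 600
y = np.concatenate([rng.normal(-2, 1, n // 2), rng.normal(2, 1, n // 2)])

def fit_mixture(y, k, iters=200):
    """Plain EM for a k-component 1-D Gaussian mixture; returns the max log-likelihood."""
    mu = np.quantile(y, (np.arange(k) + 0.5) / k)     # spread initial means out
    sd = np.full(k, y.std())
    pi = np.full(k, 1.0 / k)
    for _ in range(iters):
        dens = pi * np.exp(-0.5 * ((y[:, None] - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))
        resp = dens / dens.sum(axis=1, keepdims=True)  # E-step responsibilities
        nk = resp.sum(axis=0) + 1e-12
        pi = nk / len(y)                               # M-step updates
        mu = (resp * y[:, None]).sum(axis=0) / nk
        sd = np.maximum(np.sqrt((resp * (y[:, None] - mu) ** 2).sum(axis=0) / nk), 1e-3)
    dens = pi * np.exp(-0.5 * ((y[:, None] - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))
    return np.log(dens.sum(axis=1)).sum()

def bic(loglik, k, n):
    dim = 3 * k - 1               # (k-1) weights + k means + k std devs
    return -2 * loglik + np.log(n) * dim

scores = {k: bic(fit_mixture(y, k), k, n) for k in (1, 2, 3)}
print("selected order:", min(scores, key=scores.get))
```

The paper's contribution is precisely that this kind of penalized criterion remains consistent when the components are multilayer perceptrons, where standard regularity fails.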


International Symposium on Neural Networks | 2003

Estimation of multidimensional regression model with multilayer perceptrons

Joseph Rynkiewicz

Recurrent neural networks remain a challenge in neural-network research. Most commonly used methods must deal with several problems, such as local minima, slow convergence, or poor learning results caused by bifurcations through which the learning system is driven. The following approach, inspired by Echo State networks [1], overcomes these problems and enables learning of complex dynamical signals and tasks.


Neurocomputing | 2011

Asymptotic properties of mixture-of-experts models

Madalina Olteanu; Joseph Rynkiewicz

The statistical properties of the likelihood ratio test statistic (LRTS) for mixture-of-experts models are addressed in this paper. This question is essential when estimating the number of experts in the model. Our purpose is to extend the existing results for simple mixture models (Liu and Shao, 2003 [8]) and mixtures of multilayer perceptrons (Olteanu and Rynkiewicz, 2008 [9]). In this paper we first study a simple example which embodies all the difficulties arising in such models. We find that in the most general case the LRTS diverges but that, under additional assumptions, the behavior of such models can be fully characterized.


International Symposium on Neural Networks | 2008

Asymptotic Law of Likelihood Ratio for Multilayer Perceptron Models

Joseph Rynkiewicz

We consider regression models involving multilayer perceptrons (MLP) with one hidden layer and Gaussian noise. The data are assumed to be generated by a true MLP model, and the parameters of the MLP are estimated by maximizing the likelihood of the model. When the number of hidden units of the model is over-estimated, the Fisher information matrix of the model is singular, and the asymptotic behavior of the LR statistic is unknown or can be divergent if the set of possible parameters is too large. This paper deals with this case and gives the exact asymptotic law of the LR statistic. Namely, if the parameters of the MLP lie in a suitable compact set, we show that the LR statistic converges to the maximum of the square of a Gaussian process indexed by a class of limit score functions.
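The LR statistic at issue can be made concrete on a simpler nested pair of models. In this sketch, linear least-squares models stand in for MLPs (an assumption of ours): the "large" model adds redundant terms to a true "small" model, mirroring the over-estimated hidden-unit setting, and the variance is profiled out of the Gaussian likelihood.

```python
import numpy as np

rng = np.random.default_rng(3)

# Data generated by the "small" true model with Gaussian noise.
n = 300
x = rng.normal(size=(n, 1))
y = 1.5 * x[:, 0] + rng.normal(scale=0.5, size=n)

def gaussian_loglik(y, design):
    """Maximized Gaussian log-likelihood of a least-squares fit
    (the noise variance is replaced by its MLE)."""
    beta, *_ = np.linalg.lstsq(design, y, rcond=None)
    s2 = np.mean((y - design @ beta) ** 2)
    return -0.5 * len(y) * (np.log(2 * np.pi * s2) + 1)

small = np.column_stack([np.ones(n), x])               # true model
large = np.column_stack([small, x ** 2, np.tanh(x)])   # redundant extra regressors
lrts = 2 * (gaussian_loglik(y, large) - gaussian_loglik(y, small))
print("LRTS:", lrts)
```

For regular nested linear models this statistic has a simple chi-squared limit; the point of the paper is that for over-parametrized MLPs, where the Fisher information is singular, the limit is instead the supremum of a squared Gaussian process over a class of score functions.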


Neurocomputing | 2006

Efficient estimation of multidimensional regression model using multilayer perceptrons

Joseph Rynkiewicz

This work concerns the estimation of multidimensional non-linear regression models using multilayer perceptrons (MLPs). The main problem with such models is that we need to know the covariance matrix of the noise to get an optimal estimator. However, we show in this paper that if we choose as the cost function the logarithm of the determinant of the empirical error covariance matrix, then we get an asymptotically optimal estimator. Moreover, under suitable assumptions, we show that this cost function leads to a very simple asymptotic law for testing the number of parameters of an identifiable MLP. Numerical experiments confirm the theoretical results.
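The cost function described here, the log-determinant of the empirical error covariance matrix, is easy to compute. The following sketch contrasts it with a naive summed MSE on a toy two-dimensional regression with correlated noise; the data-generating choices are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)

# Two-dimensional regression with correlated noise of unknown covariance.
n = 500
x = rng.normal(size=(n, 1))
noise = rng.multivariate_normal([0, 0], [[0.2, 0.15], [0.15, 0.3]], size=n)
truth = np.column_stack([np.sin(x[:, 0]), x[:, 0] ** 2])
y = truth + noise

def logdet_cost(residuals):
    """Cost function from the paper: log det of the empirical
    covariance matrix of the residuals."""
    cov = residuals.T @ residuals / len(residuals)
    sign, logdet = np.linalg.slogdet(cov)   # numerically stable log-determinant
    return logdet

resid = y - truth   # residuals of the true regression function
print("log-det cost:", logdet_cost(resid))
print("summed MSE  :", np.mean(resid ** 2, axis=0).sum())
```

Unlike the summed MSE, the log-determinant cost accounts for correlations between output components, which is why minimizing it yields an asymptotically optimal estimator without knowing the noise covariance in advance.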


Conference on Air Pollution Modelling and Simulation | 2002

Ozone Modeling in an Urban Atmosphere Using Artificial Neural Network. Hybrid Hidden Markov Model/Multi-layer Perceptron. The NEUROZONE Model

Joseph Rynkiewicz; A.L. Dutot; F. Steiner

Ozone concentrations depend both on emissions and chemical reactions involving nitrogen oxides and volatile organic compounds in the presence of sunlight, and on meteorological parameters. In this study, we present a statistical model based on these meteorological predictors.


Environmental Modelling and Software | 2007

A 24-h forecast of ozone peaks and exceedance levels using neural classifiers and weather predictions

A.L. Dutot; Joseph Rynkiewicz; Frédy E. Steiner; Julien Rude


European Journal of Economic and Social Systems | 2004

Some Known Facts about Financial Data

Eric de Bodt; Joseph Rynkiewicz; Marie Cottrell

Collaboration


Dive into Joseph Rynkiewicz's collaborations.

Top Co-Authors

Denys Pommeret

Aix-Marseille University
