D. G. Sotiropoulos
University of Patras
Publications
Featured research published by D. G. Sotiropoulos.
International Conference on Industrial Informatics | 2009
M.S. Apostolopoulou; D. G. Sotiropoulos; Ioannis E. Livieris; Panayiotis E. Pintelas
We present a new curvilinear algorithmic model for training neural networks which is based on a modification of the memoryless BFGS method that incorporates a curvilinear search. The proposed model exploits the nonconvexity of the error surface based on information provided by the eigensystem of memoryless BFGS matrices, using a pair of directions: a memoryless quasi-Newton direction and a direction of negative curvature. In addition, the negative curvature direction is computed without any matrix storage or factorization. Simulation results verify that the proposed modification significantly improves the efficiency of the training process.
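The combined use of the two directions can be sketched as follows, assuming a curvilinear path of the form x(t) = x + t^2 p + t d and a plain decrease test; the path parametrization, acceptance rule, and all names are assumptions for illustration, not the authors' exact training algorithm.

```python
import numpy as np

def curvilinear_step(x, p, d, f, t0=1.0, beta=0.5, max_backtracks=30):
    """Backtracking search along the curvilinear path x(t) = x + t^2 * p + t * d,
    where p is a quasi-Newton descent direction and d is a direction of
    negative curvature (hypothetical illustration, not the paper's exact rule)."""
    fx = f(x)
    t = t0
    for _ in range(max_backtracks):
        x_trial = x + t**2 * p + t * d
        if f(x_trial) < fx:      # simple decrease test for illustration
            return x_trial
        t *= beta                # shrink the step and try again
    return x                     # no acceptable step found; keep the current point

# toy usage on f(x1, x2) = x1^2 - x2^2 at the saddle point (0, 0)
f = lambda v: v[0]**2 - v[1]**2
x = np.zeros(2)
p = np.zeros(2)                  # quasi-Newton direction (zero gradient here)
d = np.array([0.0, 1.0])         # negative curvature direction of the Hessian
print(curvilinear_step(x, p, d, f))   # moves away from the saddle along d
```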
Optimization Methods & Software | 2008
M.S. Apostolopoulou; D. G. Sotiropoulos; Panayiotis E. Pintelas
We present a new matrix-free method for the large-scale trust-region subproblem, assuming that the approximate Hessian is updated by the L-BFGS formula with m=1 or 2. We determine via simple formulas the eigenvalues of these matrices and, at each iteration, we construct a positive definite matrix whose inverse can be expressed analytically, without using factorization. Consequently, a direction of negative curvature can be computed immediately by applying the inverse power method. The computation of the trial step is obtained by performing a sequence of inner products and vector summations. Furthermore, it immediately follows that the strong convergence properties of trust region methods are preserved. Numerical results are also presented.
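The claim that the eigenvalues follow from simple formulas can be illustrated with a small dense sanity check: for m = 1 with initial matrix gamma*I, the BFGS update alters at most two eigenvalues, so the remaining n - 2 stay equal to gamma. The snippet below uses assumed toy data and a dense eigendecomposition purely for checking; it is not the paper's matrix-free procedure or its inverse power method.

```python
import numpy as np

# Small dense sanity check (toy data, assumed dimensions): the memoryless BFGS
# update of B0 = gamma * I with one pair (s, y),
#   B = gamma*I - gamma*s s^T/(s^T s) + y y^T/(y^T s),
# leaves n - 2 eigenvalues equal to gamma, so only two need closed forms.
n, gamma = 8, 2.0
rng = np.random.default_rng(0)
s, y = rng.standard_normal(n), rng.standard_normal(n)

B = gamma * np.eye(n) - gamma * np.outer(s, s) / (s @ s) + np.outer(y, y) / (y @ s)

eigvals, eigvecs = np.linalg.eigh(B)
print("eigenvalues:", np.round(eigvals, 4))      # n - 2 of them equal gamma
if eigvals[0] < 0:                               # if the smallest eigenvalue is negative,
    d = eigvecs[:, 0]                            # its eigenvector is a direction of
    print("negative curvature direction:", d)    # negative curvature: d^T B d < 0
```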
Archive | 1997
Michael N. Vrahatis; D. G. Sotiropoulos; E. C. Triantafyllou
A new method for the computation of the global minimum of a continuously differentiable real-valued function f of n variables is presented. This method, which is composed of two parts, is based on the combinatorial topology concept of the degree of a mapping associated with an oriented polyhedron. In the first part, interval arithmetic is implemented for a “rough” isolation of all the stationary points of f. In the second part, the isolated stationary points are characterized as minima, maxima or saddle points and the global minimum is determined among the minima. The described algorithm can be successfully applied to problems with imprecise function and gradient values. The algorithm has been implemented and tested. It is primarily useful for small dimensions (n ≤ 10).
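A one-dimensional sketch of the first part, the interval-based isolation of stationary points, is shown below for the toy function f(x) = x^3 - 2x^2: sub-boxes whose derivative enclosure excludes zero are discarded, and the surviving small boxes isolate the stationary points. The degree-based characterization of the second part is not reproduced, and all names are assumptions.

```python
from dataclasses import dataclass

@dataclass
class Interval:
    lo: float
    hi: float

def dfdx_enclosure(I: Interval) -> Interval:
    """Natural interval extension of f'(x) = 3x^2 - 4x for the toy function
    f(x) = x^3 - 2x^2 (hypothetical example, not one from the paper)."""
    sq_lo, sq_hi = min(I.lo**2, I.hi**2), max(I.lo**2, I.hi**2)
    if I.lo < 0 < I.hi:
        sq_lo = 0.0                      # [x]^2 contains 0 when [x] straddles 0
    return Interval(3*sq_lo - 4*I.hi, 3*sq_hi - 4*I.lo)

def isolate_stationary(I: Interval, tol=1e-3, boxes=None):
    """Bisect I, keeping only sub-boxes whose derivative enclosure contains 0,
    i.e. boxes that may contain a stationary point of f."""
    if boxes is None:
        boxes = []
    dI = dfdx_enclosure(I)
    if dI.lo > 0 or dI.hi < 0:           # 0 not enclosed: no stationary point here
        return boxes
    if I.hi - I.lo < tol:                # box is small enough: report it
        boxes.append(I)
        return boxes
    mid = 0.5 * (I.lo + I.hi)
    isolate_stationary(Interval(I.lo, mid), tol, boxes)
    isolate_stationary(Interval(mid, I.hi), tol, boxes)
    return boxes

print(isolate_stationary(Interval(-2.0, 3.0)))   # boxes around x = 0 and x = 4/3
```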
Optimization Letters | 2011
M.S. Apostolopoulou; D. G. Sotiropoulos; C.A. Botsaris; Panayiotis E. Pintelas
We present a nearly-exact method for the large-scale trust region subproblem (TRS) based on the properties of the minimal-memory BFGS method. Our study concentrates on the case where the initial BFGS matrix can be any scaled identity matrix. The proposed method is a variant of the Moré–Sorensen method that exploits the eigenstructure of the approximate Hessian B, and incorporates both the standard and the hard case. The eigenvalues of B are expressed analytically, and consequently a direction of negative curvature can be computed immediately by performing a sequence of inner products and vector summations. Thus, the hard case is handled easily while the Cholesky factorization is completely avoided. An extensive numerical study is presented, covering all possible cases arising in the TRS with respect to the eigenstructure of B. Our numerical experiments confirm that the method is suitable for very large-scale problems.
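A rough sketch of the standard case of a Moré–Sorensen-type solver is given below: it searches for lam >= 0 such that ||(B + lam*I)^(-1) g|| = delta by bisection on the secular equation. For simplicity it relies on a dense eigendecomposition where the paper uses analytically known eigenvalues, and the hard case is omitted; all names and tolerances are assumptions.

```python
import numpy as np

def trs_standard_case(B, g, delta, tol=1e-10, max_iter=200):
    """Solve min g^T p + 0.5 p^T B p subject to ||p|| <= delta in the standard
    case, by bisection on the secular equation ||(B + lam*I)^{-1} g|| = delta.
    Dense eigendecomposition is used here for illustration only; the hard case
    (handled in the paper) is omitted from this sketch."""
    w, V = np.linalg.eigh(B)                 # eigenvalues in ascending order
    gt = V.T @ g
    if w[0] > 0:                             # B positive definite: try the interior step
        p = -V @ (gt / w)
        if np.linalg.norm(p) <= delta:
            return p
    lo = max(0.0, -w[0])                     # lam must keep B + lam*I positive definite
    hi = lo + np.linalg.norm(g) / delta + abs(w[0]) + 1.0   # guarantees ||p(hi)|| <= delta
    for _ in range(max_iter):
        lam = 0.5 * (lo + hi)
        p = -V @ (gt / (w + lam))
        if abs(np.linalg.norm(p) - delta) < tol:
            break
        if np.linalg.norm(p) > delta:
            lo = lam                         # step too long: increase lam
        else:
            hi = lam                         # step too short: decrease lam
    return p

# toy usage with an indefinite B
B = np.diag([-1.0, 2.0, 3.0])
g = np.array([1.0, 1.0, 1.0])
p = trs_standard_case(B, g, delta=1.5)
print(p, np.linalg.norm(p))                  # boundary solution with ||p|| ~ 1.5
```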
Applied Mathematics and Computation | 2005
D. G. Sotiropoulos; T. N. Grapsa
We present an interval branch-and-prune algorithm for computing verified enclosures for the global minimum and all global minimizers of univariate functions subject to bound constraints. The algorithm works within the branch-and-bound framework and uses first order information of the objective function. In this context, we investigate valuable properties of the optimal center of a mean value form and prove optimality. We also establish an inclusion function selection criterion between natural interval extension and an optimal mean value form for the bounding process. Based on optimal centers, we introduce linear (inner and outer) pruning steps that are responsible for the branching process. The proposed algorithm incorporates the above techniques in order to accelerate the search process. Our algorithm has been implemented and tested on a test set and compared with three other methods. The suggested method shows a significant improvement over previous methods on the numerical examples solved.
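The role of the center in a mean value form can be illustrated with a toy univariate example, f(x) = x^2 - 2 over [1, 2]: the form F(X) = f(c) + F'(X)(X - c) yields different enclosures for different centers c. The snippet below only illustrates this sensitivity; it does not implement the paper's optimal-center criterion or its pruning steps.

```python
# Toy objective f(x) = x^2 - 2 on X = [1, 2]; derivative enclosure F'(X) = [2, 4].
def fprime_enclosure(lo, hi):
    return 2.0 * lo, 2.0 * hi          # natural interval extension of f'(x) = 2x

def mean_value_form(lo, hi, c):
    """Enclosure of f over [lo, hi] via F(X) = f(c) + F'(X) * (X - c)."""
    dlo, dhi = fprime_enclosure(lo, hi)
    prods = [d * t for d in (dlo, dhi) for t in (lo - c, hi - c)]  # interval product
    fc = c ** 2 - 2.0
    return fc + min(prods), fc + max(prods)

# Different centers give enclosures of different width; the paper characterises
# the optimal choice, here we only compare the midpoint with another candidate.
for c in (1.5, 1.2):
    print("center", c, "->", mean_value_form(1.0, 2.0, c))
```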
Applied Mathematics and Computation | 2010
M.S. Apostolopoulou; D. G. Sotiropoulos; C.A. Botsaris
We present a new matrix-free method for the computation of negative curvature directions based on the eigenstructure of minimal-memory BFGS matrices. We determine via simple formulas the eigenvalues of these matrices and we compute the desirable eigenvectors in explicit form. Consequently, a negative curvature direction is computed in a way that avoids storing or factorizing any matrix. We propose a modification of the L-BFGS method in which no information is kept from old iterations, so that memory requirements are minimal. The proposed algorithm incorporates a curvilinear path and a linesearch procedure, which combines two search directions: a memoryless quasi-Newton direction and a direction of negative curvature. Results of numerical experiments for large-scale problems are also presented.
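Since the eigenvectors for the at most two eigenvalues different from the scaling gamma lie in span{s, y}, they can be obtained from a 2x2 restriction of B using only inner products and a few vector operations. The sketch below is an illustrative reconstruction of that idea with assumed names and formulas, not the paper's explicit expressions.

```python
import numpy as np

def negative_curvature_direction(s, y, gamma):
    """Matrix-free sketch: for B = gamma*I - gamma*s s^T/(s^T s) + y y^T/(y^T s)
    (memoryless BFGS, m = 1), any eigenvalue different from gamma has its
    eigenvector in span{s, y}, so we diagonalise the 2x2 restriction of B to
    that subspace instead of forming or factorising the n x n matrix.
    Illustrative reconstruction, not the paper's exact formulas."""
    q1 = s / np.linalg.norm(s)                         # orthonormal basis of span{s, y}
    r = y - (q1 @ y) * q1
    q2 = r / np.linalg.norm(r)
    Q = np.column_stack([q1, q2])                      # n x 2

    def B_mv(v):                                       # matrix-vector product with B
        return gamma * v - gamma * (s @ v) / (s @ s) * s + (y @ v) / (y @ s) * y

    B_small = Q.T @ np.column_stack([B_mv(Q[:, 0]), B_mv(Q[:, 1])])   # 2 x 2 restriction
    w, U = np.linalg.eigh(B_small)
    if w[0] >= 0:
        return None                                    # B has no negative curvature
    return Q @ U[:, 0]                                 # direction d with d^T B d = w[0] < 0

# toy usage with assumed data where the curvature condition y^T s > 0 fails,
# so the memoryless BFGS matrix is indefinite
s = np.array([1.0, 0.0, 0.0])
y = np.array([-1.0, 1.0, 0.0])
print(negative_curvature_direction(s, y, gamma=1.0))
```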
Panhellenic Conference on Informatics | 2009
Ioannis E. Livieris; D. G. Sotiropoulos; Panayiotis E. Pintelas
In this paper, we evaluate the performance of descent conjugate gradient methods and we propose a new algorithm for training recurrent neural networks. The presented algorithm preserves the advantages of classical conjugate gradient methods while simultaneously avoiding the usually inefficient restarts. Simulation results are also presented using three different recurrent neural network architectures in a variety of benchmarks.
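A generic nonlinear conjugate gradient training loop (Polak–Ribière+ with a fixed step length) is sketched below as a rough point of reference; it does not reproduce the proposed descent CG method or its restart-avoiding strategy, and all names are assumptions.

```python
import numpy as np

def cg_train(w, grad, step=1e-2, iters=200):
    """Generic nonlinear conjugate gradient loop (Polak-Ribiere+, fixed step
    length); a rough illustration only, not the proposed descent CG method."""
    g = grad(w)
    d = -g
    for _ in range(iters):
        w = w + step * d                                  # in practice a line search sets the step
        g_new = grad(w)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))    # PR+ truncation keeps beta >= 0
        d = -g_new + beta * d                             # new search direction
        g = g_new
    return w

# toy usage: minimise a quadratic "loss" ||A w - b||^2
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
w = cg_train(np.zeros(2), lambda w: 2 * A.T @ (A @ w - b))
print(w)   # approaches the least-squares solution
```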
European Symposium on Algorithms | 2005
Josep Díaz; G. Grammatikopoulos; Alexis C. Kaporis; Lefteris M. Kirousis; Xavier Pérez; D. G. Sotiropoulos
We show that uniformly random 5-regular graphs on n vertices are 3-colorable with probability bounded below by a positive constant independent of n.
Panhellenic Conference on Informatics | 2009
Ioannis E. Livieris; M.S. Apostolopoulou; D. G. Sotiropoulos; Spyros Sioutas; Panayiotis E. Pintelas
Artificial neural networks have been widely used for knowledge extraction from biomedical datasets and play an important role in bio-data exploration and analysis. In this work, we propose a new curvilinear algorithm for training large neural networks which is based on the analysis of the eigenstructure of the memoryless BFGS matrices. The proposed method preserves the strong convergence properties provided by the quasi-Newton direction while simultaneously exploiting the nonconvexity of the error surface through the computation of the negative curvature direction, without any matrix storage or factorization. Moreover, to improve the generalization capability of the trained ANNs, we explore the incorporation of several dimensionality reduction techniques as a pre-processing step.
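As an example of such a pre-processing step, the sketch below projects a (samples x features) data matrix onto its leading principal components via PCA; this is one possible dimensionality reduction technique and not necessarily among those evaluated in the paper.

```python
import numpy as np

def pca_reduce(X, k):
    """Project the (samples x features) data matrix X onto its first k principal
    components; a sketch of one possible dimensionality-reduction pre-processing
    step before ANN training, not necessarily one of the techniques in the paper."""
    Xc = X - X.mean(axis=0)                        # centre each feature
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T                           # reduced inputs for the network

# toy usage: reduce 50 hypothetical biomedical samples with 20 features to 5 inputs
X = np.random.default_rng(0).standard_normal((50, 20))
print(pca_reduce(X, 5).shape)                      # (50, 5)
```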
International Conference on Industrial Informatics | 2009
M.S. Apostolopoulou; D. G. Sotiropoulos; C.A. Botsaris; Panayiotis E. Pintelas
We present a matrix-free method for the large scale trust region subproblem (TRS), assuming that the approximate Hessian is updated using a minimal-memory BFGS method, where the initial matrix is a scaled identity matrix. We propose a variant of the Moré–Sorensen method that exploits the eigenstructure of the approximate Hessian, and incorporates both the standard and the hard case. The eigenvalues and the corresponding eigenvectors are expressed analytically, and hence a direction of negative curvature can be computed immediately. The most important merit of the proposed method is that it completely avoids factorization, and the trust region subproblem can be solved by performing a sequence of inner products and vector summations. Numerical results are also presented.
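To illustrate how a trial step can be obtained with only inner products and vector summations, the sketch below applies two Sherman–Morrison updates to compute -(B + lam*I)^(-1) g for the memoryless BFGS matrix with a scaled-identity start. The formulas, names, and the assumptions lam > 0 and y^T s > 0 are for illustration only and need not match the paper's derivation.

```python
import numpy as np

def trs_step_matrix_free(s, y, gamma, g, lam):
    """Compute p = -(B + lam*I)^{-1} g without forming or factorising B, where
    B = gamma*I - gamma*s s^T/(s^T s) + y y^T/(y^T s) is the memoryless BFGS
    matrix (m = 1, scaled-identity start).  Only inner products and vector sums
    are used, via two Sherman-Morrison updates.  Assumes y^T s > 0 and lam > 0
    so both updates are well defined; illustrative sketch only."""
    alpha = gamma + lam
    u = np.sqrt(gamma / (s @ s)) * s          # B + lam*I = alpha*I - u u^T + v v^T
    v = y / np.sqrt(y @ s)

    def M1(x):                                 # (alpha*I - u u^T)^{-1} x
        return x / alpha + u * (u @ x) / (alpha * (alpha - u @ u))

    M1g, M1v = M1(g), M1(v)
    Binv_g = M1g - M1v * (v @ M1g) / (1.0 + v @ M1v)   # second Sherman-Morrison step
    return -Binv_g

# quick check against a dense solve (dense B is built for verification only)
rng = np.random.default_rng(1)
n, gamma, lam = 6, 1.5, 0.3
s, g = rng.standard_normal(n), rng.standard_normal(n)
y = s + 0.1 * rng.standard_normal(n)           # keeps y^T s > 0
B = gamma*np.eye(n) - gamma*np.outer(s, s)/(s @ s) + np.outer(y, y)/(y @ s)
print(np.allclose(trs_step_matrix_free(s, y, gamma, g, lam),
                  -np.linalg.solve(B + lam*np.eye(n), g)))
```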