Ioannis E. Livieris
University of Patras
Publications
Featured research published by Ioannis E. Livieris.
Applied Mathematics and Computation | 2013
Ioannis E. Livieris; Panayiotis E. Pintelas
Conjugate gradient methods have been established as excellent neural network training methods, due to the simplicity of their iteration, their numerical efficiency and their low memory requirements. In this work, we propose a conjugate gradient neural network training algorithm which guarantees sufficient descent using any line search, thereby avoiding the usually inefficient restarts. Moreover, it approximates the second-order curvature information of the error surface with high-order accuracy by utilizing a new modified secant condition. Under mild conditions, we establish the global convergence of our proposed method. Experimental results provide evidence that our proposed method is in general superior to the classical conjugate gradient training methods and has the potential to significantly enhance the computational efficiency and robustness of the training process.
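The iteration this family of training methods shares can be sketched generically. Below is a minimal sketch of a nonlinear conjugate gradient loop with the Fletcher–Reeves coefficient on a toy quadratic objective; the objective, the fixed step length standing in for a proper line search, and all names are illustrative assumptions, not the authors' algorithm.

```python
# Minimal nonlinear conjugate gradient sketch (Fletcher-Reeves variant).
# Illustrative only: the quadratic objective stands in for a network
# error surface, and a fixed step replaces a Wolfe line search.

def grad(x):
    # Gradient of f(x, y) = x^2 + 2*y^2 (a stand-in for the error function).
    return [2.0 * x[0], 4.0 * x[1]]

def cg_minimize(x, iters=100, alpha=0.1):
    g = grad(x)
    d = [-gi for gi in g]                  # initial direction: steepest descent
    for _ in range(iters):
        # Step along the current direction (fixed alpha as a stand-in).
        x = [xi + alpha * di for xi, di in zip(x, d)]
        g_new = grad(x)
        # Fletcher-Reeves coefficient: beta = ||g_new||^2 / ||g||^2.
        beta = sum(gi * gi for gi in g_new) / max(sum(gi * gi for gi in g), 1e-12)
        # Conjugate direction update: d <- -g_new + beta * d.
        d = [-gn + beta * di for gn, di in zip(g_new, d)]
        g = g_new
    return x

x_star = cg_minimize([3.0, -2.0])          # converges toward the minimizer (0, 0)
```

The methods discussed in the abstracts differ mainly in how the coefficient `beta` is chosen and in the safeguards that guarantee sufficient descent.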
International Conference on Industrial Informatics | 2009
M.S. Apostolopoulou; D. G. Sotiropoulos; Ioannis E. Livieris; Panayiotis E. Pintelas
We present a new curvilinear algorithmic model for training neural networks which is based on a modification of the memoryless BFGS method that incorporates a curvilinear search. The proposed model exploits the nonconvexity of the error surface, based on information provided by the eigensystem of the memoryless BFGS matrices, using a pair of directions: a memoryless quasi-Newton direction and a direction of negative curvature. In addition, the negative curvature direction is computed without any storage or matrix factorization. Simulation results verify that the proposed modification significantly improves the efficiency of the training process.
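For orientation, curvilinear search schemes of this kind typically advance along a curve that combines the two directions; one common parameterization from the literature (an illustrative form, not necessarily the paper's exact scheme) is:

```latex
% Curvilinear step combining a quasi-Newton direction d_k and a
% negative-curvature direction c_k (illustrative standard form):
x_{k+1} = x_k + \alpha_k^{2} d_k + \alpha_k c_k,
% where c_k satisfies c_k^{T} \nabla^{2} f(x_k)\, c_k < 0 and
% \alpha_k is the step length chosen by the curvilinear search.
```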
Applied Mathematics and Computation | 2012
Ioannis E. Livieris; Panayiotis E. Pintelas
Conjugate gradient methods are probably the most famous iterative methods for solving large-scale optimization problems in scientific and engineering computation, characterized by the simplicity of their iteration and their low memory requirements. In this paper, we propose a new conjugate gradient method which is based on the MBFGS secant condition, obtained by modifying Perry's method. Our proposed method ensures sufficient descent independent of the accuracy of the line search and is globally convergent under some assumptions. Numerical experiments are also presented.
Journal of Computational and Applied Mathematics | 2013
Ioannis E. Livieris; Panayiotis E. Pintelas
Conjugate gradient methods have played a special role in solving large-scale optimization problems due to the simplicity of their iteration, their convergence properties and their low memory requirements. In this work, we propose a new class of spectral conjugate gradient methods which ensures sufficient descent independent of the accuracy of the line search. Moreover, an attractive property of our proposed methods is that they achieve a high-order accuracy in approximating the second-order curvature information of the objective function by utilizing the modified secant condition proposed by Babaie-Kafaki et al. [S. Babaie-Kafaki, R. Ghanbari, N. Mahdavi-Amiri, Two new conjugate gradient methods based on modified secant equations, Journal of Computational and Applied Mathematics 234 (2010) 1374-1386]. Further, a global convergence result for general functions is established provided that the line search satisfies the Wolfe conditions. Our numerical experiments indicate that our proposed methods are preferable and in general superior to the classical conjugate gradient methods in terms of efficiency and robustness.
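For reference, the sufficient descent property and the Wolfe conditions invoked in these convergence results take the following standard form (standard notation, with $g_k = \nabla f(x_k)$; the constants are generic, not the paper's specific choices):

```latex
% Sufficient descent: for some constant c > 0 and all k,
g_k^{T} d_k \le -c \, \| g_k \|^{2}.

% Wolfe conditions on the step length \alpha_k, with 0 < c_1 < c_2 < 1:
f(x_k + \alpha_k d_k) \le f(x_k) + c_1 \alpha_k \, g_k^{T} d_k,
\qquad
g(x_k + \alpha_k d_k)^{T} d_k \ge c_2 \, g_k^{T} d_k.
```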
International Journal on Artificial Intelligence Tools | 2012
Ioannis E. Livieris; Panayiotis E. Pintelas
Conjugate gradient methods constitute excellent neural network training methods which are characterized by their simplicity and their very low memory requirements. In this paper, we propose a new s...
International Scholarly Research Notices | 2012
Ioannis E. Livieris; Panagiotis Pintelas
We propose a conjugate gradient method which is based on a study of the Dai-Liao conjugate gradient method. An important property of our proposed method is that it ensures sufficient descent independent of the accuracy of the line search. Moreover, it achieves a high-order accuracy in approximating the second-order curvature information of the objective function by utilizing the modified secant condition proposed by Babaie-Kafaki et al. (2010). Under mild conditions, we establish that the proposed method is globally convergent for general functions provided that the line search satisfies the Wolfe conditions. Numerical experiments are also presented.
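For context, the Dai-Liao scheme that this work starts from has the following standard form (the paper modifies this scheme; notation is generic):

```latex
% Dai-Liao search direction and coefficient (standard form):
d_{k+1} = -g_{k+1} + \beta_k^{DL} d_k,
\qquad
\beta_k^{DL} = \frac{g_{k+1}^{T} \left( y_k - t\, s_k \right)}{d_k^{T} y_k},
\quad t \ge 0,
% where s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k.
```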
Panhellenic Conference on Informatics | 2009
Ioannis E. Livieris; D. G. Sotiropoulos; Panayiotis E. Pintelas
In this paper, we evaluate the performance of descent conjugate gradient methods and we propose a new algorithm for training recurrent neural networks. The presented algorithm preserves the advantages of classical conjugate gradient methods while simultaneously avoiding the usually inefficient restarts. Simulation results are also presented using three different recurrent neural network architectures on a variety of benchmark problems.
Journal of Imaging | 2018
Ioannis E. Livieris; Andreas Kanavos; Vassilis Tampakas; Panayiotis E. Pintelas
A critical component in the computer-aided medical diagnosis of digital chest X-rays is the automatic detection of lung abnormalities, since effective identification at an initial stage constitutes a significant and crucial factor in the patient's treatment. The vigorous advances in computer and digital technologies have ultimately led to the development of large repositories of labeled and unlabeled images. Due to the effort and expense involved in labeling data, training datasets are of a limited size, while in contrast, electronic medical record systems contain a significant number of unlabeled images. Semi-supervised learning algorithms have become a hot topic of research as an alternative to traditional classification methods, combining the explicit classification information of labeled data with the knowledge hidden in the unlabeled data to build powerful and effective classifiers. In the present work, we evaluate the performance of an ensemble semi-supervised learning algorithm for the classification of chest X-rays for tuberculosis. The efficacy of the presented algorithm is demonstrated by several experiments and confirmed by nonparametric statistical tests, illustrating that reliable and robust prediction models can be developed utilizing a few labeled and many unlabeled data.
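The core idea of wrapping a supervised learner in a semi-supervised loop can be illustrated with plain self-training: pseudo-label the unlabeled points the current model is most confident about, then fold them into the training set. The 1-nearest-neighbour base learner, the 1-D data, the class names, and the confidence threshold below are all illustrative assumptions; this is not the authors' ensemble algorithm.

```python
# Minimal self-training sketch: a 1-NN model pseudo-labels unlabeled
# points that lie within a confidence threshold of a labeled point.
# Data, labels, and threshold are illustrative assumptions.

def nn_predict(labeled, x):
    # 1-nearest-neighbour prediction on 1-D points: labeled is a list
    # of (value, label) pairs; returns (label, distance).
    value, label = min(labeled, key=lambda p: abs(p[0] - x))
    return label, abs(value - x)   # smaller distance = higher confidence

def self_train(labeled, unlabeled, threshold=1.0):
    labeled, unlabeled = list(labeled), list(unlabeled)
    changed = True
    while changed and unlabeled:
        changed = False
        for x in list(unlabeled):
            label, dist = nn_predict(labeled, x)
            if dist <= threshold:  # confident enough: pseudo-label it
                labeled.append((x, label))
                unlabeled.remove(x)
                changed = True
    return labeled

# Two labeled points seed the model; four unlabeled points get
# pseudo-labels as the labeled set grows outward.
model = self_train([(0.0, "healthy"), (10.0, "abnormal")],
                   [0.8, 1.6, 9.2, 8.5])
```

Ensemble variants, like the one evaluated in the paper, replace the single base learner with several and typically accept a pseudo-label only when the learners agree.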
Optimization Letters | 2016
Ioannis E. Livieris; Panagiotis Pintelas
In this work, we present a new limited memory conjugate gradient method which is based on a study of Perry's method. An attractive property of the proposed method is that it corrects the loss of orthogonality that can occur in ill-conditioned optimization problems, which can decelerate the convergence of the method. Moreover, an additional advantage is that the memory is used only to monitor orthogonality, which is relatively cheap; when orthogonality is lost, the memory is used to generate a new orthogonal search direction. Under mild conditions, we establish the global convergence of the proposed method provided that the line search satisfies the Wolfe conditions. Our numerical experiments indicate the efficiency and robustness of the proposed method.
Optimization Letters | 2015
Ioannis E. Livieris; Panagiotis Pintelas
In this work, we propose a new conjugate gradient method which consists of a modification of Perry’s method and ensures sufficient descent independent of the accuracy of the line search. An important property of our proposed method is that it achieves a high-order accuracy in approximating the second order curvature information of the objective function by utilizing a new modified secant condition. Moreover, we establish that the proposed method is globally convergent for general functions provided that the line search satisfies the Wolfe conditions. Our numerical experiments indicate that our proposed method is preferable and in general superior to classical conjugate gradient methods in terms of efficiency and robustness.
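To make "modified secant condition" concrete: quasi-Newton updates classically ask the new Hessian approximation to satisfy the secant equation, and modified variants replace the gradient difference with a corrected vector that also uses function values. One representative form from the literature is shown below (the paper's own condition may differ in its details):

```latex
% Classical secant condition:
B_{k+1} s_k = y_k.

% A representative modified secant condition using function values:
B_{k+1} s_k = \tilde{y}_k,
\qquad
\tilde{y}_k = y_k + \frac{\theta_k}{s_k^{T} u_k}\, u_k,
\qquad
\theta_k = 6\left(f_k - f_{k+1}\right) + 3\left(g_k + g_{k+1}\right)^{T} s_k,
% where s_k = x_{k+1} - x_k, y_k = g_{k+1} - g_k, and u_k is any
% vector with s_k^{T} u_k \ne 0 (commonly u_k = s_k).
```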