Publications


Featured research published by Panayiotis E. Pintelas.


Artificial Intelligence Review | 2006

Machine learning: a review of classification and combining techniques

Sotiris B. Kotsiantis; Ioannis D. Zaharakis; Panayiotis E. Pintelas

Supervised classification is one of the tasks most frequently carried out by so-called Intelligent Systems. Thus, a large number of techniques have been developed based on Artificial Intelligence (logic-based techniques, perceptron-based techniques) and Statistics (Bayesian networks, instance-based techniques). The goal of supervised learning is to build a concise model of the distribution of class labels in terms of predictor features. The resulting classifier is then used to assign class labels to testing instances where the values of the predictor features are known but the value of the class label is unknown. This paper describes various classification algorithms and the recent move toward improving classification accuracy with ensembles of classifiers.
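
The families surveyed above (logic-based, perceptron-based, Bayesian, instance-based) and the ensemble idea can be pictured with a short scikit-learn sketch. This is my own illustration, not code from the paper; the dataset, the chosen representatives, and the settings are arbitrary.

```python
# Illustrative only: one representative per classifier family plus a simple
# hard-voting ensemble, evaluated with 5-fold cross-validation.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.tree import DecisionTreeClassifier      # logic-based
from sklearn.linear_model import Perceptron          # perceptron-based
from sklearn.naive_bayes import GaussianNB           # Bayesian
from sklearn.neighbors import KNeighborsClassifier   # instance-based
from sklearn.ensemble import VotingClassifier        # ensemble of classifiers

X, y = load_breast_cancer(return_X_y=True)

base = {
    "tree": DecisionTreeClassifier(random_state=0),
    "perceptron": make_pipeline(StandardScaler(), Perceptron(random_state=0)),
    "naive_bayes": GaussianNB(),
    "knn": make_pipeline(StandardScaler(), KNeighborsClassifier()),
}
for name, clf in base.items():
    print(name, cross_val_score(clf, X, y, cv=5).mean())

# Combining the individual classifiers by majority (hard) voting.
ensemble = VotingClassifier(estimators=list(base.items()), voting="hard")
print("voting ensemble", cross_val_score(ensemble, X, y, cv=5).mean())
```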


International Conference on Knowledge-Based and Intelligent Information and Engineering Systems | 2003

Preventing Student Dropout in Distance Learning Using Machine Learning Techniques

Sotiris B. Kotsiantis; Christos Pierrakeas; Panayiotis E. Pintelas

Student dropout occurs quite often in universities providing distance education. The scope of this research is to study whether the usage of machine learning techniques can be useful in dealing with this problem. Subsequently, an attempt was made to identifying the most appropriate learning algorithm for the prediction of students’ dropout. A number of experiments have taken place with data provided by the ‘informatics’ course of the Hellenic Open University and a quite interesting conclusion is that the Naive Bayes algorithm can be successfully used. A prototype web based support tool, which can automatically recognize students with high probability of dropout, has been constructed by implementing this algorithm.
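
The core of such a tool can be pictured as a Naive Bayes model whose predicted dropout probability triggers a flag for tutor follow-up. The sketch below is a hypothetical illustration: the feature names, the toy data, and the 0.7 threshold are invented and are not taken from the paper or the Hellenic Open University course.

```python
# Hypothetical sketch: flag students whose predicted dropout probability is high.
import numpy as np
from sklearn.naive_bayes import GaussianNB

# Invented toy features: [assignments_submitted, average_grade, forum_logins]
X_train = np.array([[4, 7.5, 20], [1, 4.0, 2], [5, 8.2, 35],
                    [0, 3.1, 1], [3, 6.0, 10], [2, 5.5, 4]])
y_train = np.array([0, 1, 0, 1, 0, 1])         # 1 = dropped out

model = GaussianNB().fit(X_train, y_train)

X_new = np.array([[1, 4.5, 3], [4, 7.0, 18]])  # two current students
p_dropout = model.predict_proba(X_new)[:, 1]   # P(dropout) per student
at_risk = p_dropout > 0.7                      # illustrative threshold
print(p_dropout, at_risk)
```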


Artificial Intelligence: Methodology, Systems, and Applications | 2004

Increasing the Classification Accuracy of Simple Bayesian Classifier

Sotiris B. Kotsiantis; Panayiotis E. Pintelas

The simple Bayes algorithm rests on the assumption that every feature is independent of the rest of the features, given the state of the class feature. The fact that this independence assumption is clearly almost always wrong has led to a general rejection of the crude independence model in favor of more complicated alternatives, at least by researchers knowledgeable about theoretical issues. In this study, we attempted to increase the prediction accuracy of the simple Bayes model. Because combining classifiers has been proposed as a new direction for improving the performance of individual classifiers, we made use of AdaBoost, with the difference that in each iteration of AdaBoost we applied a discretization method and removed redundant features using a filter feature selection method. Finally, we performed a large-scale comparison with other attempts to improve the accuracy of the simple Bayes algorithm, as well as with other state-of-the-art algorithms and ensembles, on 26 standard benchmark datasets, and obtained better accuracy in most cases while also requiring less training time.
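
A much simplified reading of this approach is an AdaBoost ensemble of Naive Bayes classifiers preceded by discretization and filter feature selection. The sketch below is an approximation under stated assumptions: it applies both preprocessing steps once, whereas the paper applies them inside each boosting iteration, and the `estimator=` argument assumes scikit-learn 1.2 or newer (older releases use `base_estimator=`).

```python
# Rough approximation only: discretize, select features, then boost Naive Bayes.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import KBinsDiscretizer
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.naive_bayes import GaussianNB
from sklearn.ensemble import AdaBoostClassifier

X, y = load_breast_cancer(return_X_y=True)

boosted_nb = Pipeline([
    ("discretize", KBinsDiscretizer(n_bins=5, encode="ordinal", strategy="uniform")),
    ("select", SelectKBest(mutual_info_classif, k=10)),   # filter feature selection
    ("boost", AdaBoostClassifier(estimator=GaussianNB(),  # scikit-learn >= 1.2
                                 n_estimators=25, random_state=0)),
])
print(cross_val_score(boosted_nb, X, y, cv=5).mean())
```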


Mediterranean Conference on Control and Automation | 2007

Local cost sensitive learning for handling imbalanced data sets

M. Karagiannopoulos; D. S. Anyfantis; Sotiris B. Kotsiantis; Panayiotis E. Pintelas

Many real-world data sets exhibit skewed class distributions in which almost all cases are allotted to one class and far fewer cases to a smaller, usually more interesting class. A classifier induced from an imbalanced data set typically has a low error rate for the majority class and an unacceptable error rate for the minority class. This paper first provides a systematic study of the various methodologies that have tried to handle this problem. It then presents an experimental study of these methodologies together with a proposed local cost-sensitive technique and concludes that such a framework can be a more effective solution to the problem. Our method appears to improve the identification of difficult small classes in predictive analysis while keeping the classification ability on the other classes at an acceptable level.
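
For context, a plain (global, non-local) cost-sensitive baseline can be written in a few lines by giving the minority class a larger misclassification weight. This is not the authors' local technique; the synthetic data and the 10x weight are arbitrary.

```python
# Generic cost-sensitive baseline on a skewed synthetic dataset.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import classification_report

X, y = make_classification(n_samples=2000, weights=[0.95, 0.05],
                           n_informative=4, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

plain = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)
costly = DecisionTreeClassifier(class_weight={0: 1, 1: 10},   # minority errors cost 10x
                                random_state=0).fit(X_tr, y_tr)

for name, clf in [("no costs", plain), ("cost-sensitive", costly)]:
    print(name)
    print(classification_report(y_te, clf.predict(X_te), digits=3))
```

The usual effect is higher recall on the minority class at some cost in precision, which is the trade-off the paper's local variant aims to manage more carefully.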


Applied Mathematics and Computation | 2013

A new conjugate gradient algorithm for training neural networks based on a modified secant equation

Ioannis E. Livieris; Panayiotis E. Pintelas

Conjugate gradient methods have been established as excellent neural network training methods, due to the simplicity of their iteration, their numerical efficiency and their low memory requirements. In this work, we propose a conjugate gradient neural network training algorithm which guarantees sufficient descent with any line search, thereby avoiding the usually inefficient restarts. Moreover, it approximates the second-order curvature information of the error surface with high-order accuracy by utilizing a new modified secant condition. Under mild conditions, we establish the global convergence of our proposed method. Experimental results provide evidence that the proposed method is in general superior to the classical conjugate gradient training methods and has the potential to significantly enhance the computational efficiency and robustness of the training process.
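
The overall iteration structure can be sketched with a generic nonlinear conjugate gradient routine. The code below is only a skeleton: it uses the standard Polak-Ribiere+ update and a basic Armijo backtracking line search, not the paper's update rule, modified secant condition, or sufficient-descent guarantee, and it minimizes a toy test function rather than a neural-network error surface.

```python
# Generic nonlinear CG skeleton (Polak-Ribiere+ with Armijo backtracking).
import numpy as np

def cg_minimize(f, grad, x0, iters=500, tol=1e-6):
    x = x0.astype(float)
    g = grad(x)
    d = -g                                     # start along steepest descent
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:                         # restart if d is not a descent direction
            d = -g
        t, fx, gd = 1.0, f(x), g @ d
        for _ in range(40):                    # backtracking (Armijo) line search
            if f(x + t * d) <= fx + 1e-4 * t * gd:
                break
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))   # Polak-Ribiere+ coefficient
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Toy test on the 2-D Rosenbrock function instead of a network error surface.
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
print(cg_minimize(f, grad, np.array([-1.2, 1.0])))
```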


Artificial Intelligence Applications and Innovations | 2007

Robustness of learning techniques in handling class noise in imbalanced datasets

D. Anyfantis; M. Karagiannopoulos; Sotiris B. Kotsiantis; Panayiotis E. Pintelas

Many real-world datasets exhibit skewed class distributions in which almost all instances are allotted to one class and far fewer instances to a smaller, but more interesting class. A classifier induced from an imbalanced dataset has a low error rate for the majority class and an undesirable error rate for the minority class. Many research efforts have addressed class noise, but none of them was designed for imbalanced datasets. This paper provides a study of the various methodologies that have tried to handle imbalanced datasets and examines their robustness in the presence of class noise.
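
A minimal version of this kind of experiment can be set up by flipping a fraction of training labels in a skewed synthetic dataset and comparing minority-class recall. The sketch below is my illustration; the 90/10 imbalance, the 10% noise rate, and the Naive Bayes learner are arbitrary choices.

```python
# Illustrative class-noise experiment on an imbalanced synthetic dataset.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import recall_score

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=3000, weights=[0.9, 0.1], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

def minority_recall(train_labels):
    clf = GaussianNB().fit(X_tr, train_labels)
    return recall_score(y_te, clf.predict(X_te), pos_label=1)

noisy = y_tr.copy()
flip = rng.random(len(noisy)) < 0.10     # flip 10% of training labels
noisy[flip] = 1 - noisy[flip]

print("clean labels:", minority_recall(y_tr))
print("noisy labels:", minority_recall(noisy))
```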


International Conference on Industrial Informatics | 2009

A memoryless BFGS neural network training algorithm

M.S. Apostolopoulou; D. G. Sotiropoulos; Ioannis E. Livieris; Panayiotis E. Pintelas

We present a new curvilinear algorithmic model for training neural networks which is based on a modification of the memoryless BFGS method that incorporates a curvilinear search. The proposed model exploits the nonconvexity of the error surface, based on information provided by the eigensystem of the memoryless BFGS matrices, using a pair of directions: a memoryless quasi-Newton direction and a direction of negative curvature. In addition, the negative-curvature direction is computed without any storage or matrix factorization. Simulation results verify that the proposed modification significantly improves the efficiency of the training process.
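
The storage-free direction computation can be sketched as an L-BFGS two-loop recursion that keeps only the most recent step/gradient-change pair, one common way to realize a memoryless BFGS direction without forming or factorizing matrices. The curvilinear search and the negative-curvature direction described in the paper are not reproduced; the quadratic test problem is invented for illustration.

```python
# Memoryless BFGS (single-pair L-BFGS) search direction, no matrices stored.
import numpy as np

def memoryless_bfgs_direction(g, s, y):
    """Return d = -H g, where H approximates the inverse Hessian using only
    the latest step s = x_k - x_{k-1} and gradient change y = g_k - g_{k-1}."""
    rho = 1.0 / (y @ s)
    q = g.copy()
    alpha = rho * (s @ q)
    q -= alpha * y
    gamma = (s @ y) / (y @ y)          # standard scaling of the initial H0 = gamma * I
    r = gamma * q
    beta = rho * (y @ r)
    r += (alpha - beta) * s
    return -r

# Invented toy usage on a quadratic f(x) = 0.5 x^T A x - b^T x (grad = A x - b).
A = np.array([[3.0, 0.5], [0.5, 2.0]])
b = np.array([1.0, 1.0])
x_prev, x = np.array([0.0, 0.0]), np.array([0.2, 0.1])
g_prev, g = A @ x_prev - b, A @ x - b
s, y = x - x_prev, g - g_prev
print(memoryless_bfgs_direction(g, s, y))
```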


Studies in Educational Evaluation | 2003

Evaluating and improving educational material and tutoring aspects of distance learning systems

Christos Pierrakeas; Michalis Nik Xenos; Panayiotis E. Pintelas

This paper presents the approach followed for the evaluation of the educational system of the Informatics Course of the Hellenic Open University. Emphasis is placed on the evaluation method itself, which is general enough to be applied in similar evaluation cases, and on the discussion of the lessons learned during this evaluation. The presented method can be used to improve the two major components of distance learning systems: the educational material and the role of the tutor. The paper discusses the conclusions drawn from the evaluation procedure and the actions taken towards improvement, aiming to serve as an example for similar attempts and to share experience in the field of distance education.


Journal of Computers | 2006

Local Boosting of Decision Stumps for Regression and Classification Problems

Sotiris B. Kotsiantis; Dimitris Kanellopoulos; Panayiotis E. Pintelas

Numerous data mining problems involve an investigation of associations between features in heterogeneous datasets, where different prediction models can be more suitable for different regions. We propose a technique of boosting localized weak learners: rather than attaching constant weights to each learner (as in standard boosting approaches), we allow the weights to be functions over the input domain. To determine these functions, we identify local regions with similar characteristics and then build local experts on each of these regions describing the association between the data characteristics and the target value. We performed a comparison with other well-known combining methods on standard classification and regression benchmark datasets using decision stumps as the base learner, and the proposed technique produced the most accurate results.
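
A crude way to picture the "local experts" idea is to partition the input space and boost decision stumps separately in each region, routing each test instance to its region's expert. The sketch below is only that crude picture: the paper learns input-dependent weight functions rather than a hard k-means partition, and the data, number of regions, and ensemble sizes here are arbitrary. The `estimator=` argument again assumes scikit-learn 1.2 or newer.

```python
# Crude sketch: k-means regions, one boosted-stump expert per region.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=2000, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

k = 4
regions = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X_tr)

experts = {}
for r in range(k):
    mask = regions.labels_ == r
    experts[r] = AdaBoostClassifier(
        estimator=DecisionTreeClassifier(max_depth=1),   # decision stump
        n_estimators=50, random_state=0).fit(X_tr[mask], y_tr[mask])

# Route each test instance to the expert of its nearest region.
test_regions = regions.predict(X_te)
y_pred = np.array([experts[r].predict(x.reshape(1, -1))[0]
                   for r, x in zip(test_regions, X_te)])
print(accuracy_score(y_te, y_pred))
```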


International Journal of Knowledge-based and Intelligent Engineering Systems | 2005

Local voting of weak classifiers

Sotiris B. Kotsiantis; Panayiotis E. Pintelas

Many data mining problems involve an investigation of relationships between features in heterogeneous datasets, where different learning algorithms can be more appropriate for different regions. We propose herein a technique of localized voting of weak classifiers. The technique identifies local regions that have similar characteristics and then uses the votes of each local expert to describe the relationship between the data characteristics and the target class. We performed a comparison with other well-known combining methods on standard benchmark datasets, and the proposed method achieved higher accuracy.
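
A per-query variant of the same local idea can be sketched by training a few weak classifiers on the nearest neighbors of each test instance and taking their majority vote. Again, this is my own simplified illustration, not the paper's exact procedure; the neighborhood size and the choice of weak learners are arbitrary.

```python
# Simplified per-query local voting of weak classifiers.
import numpy as np
from sklearn.neighbors import NearestNeighbors
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import Perceptron
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=1500, random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)

nn = NearestNeighbors(n_neighbors=100).fit(X_tr)

y_pred = []
for x in X_te:
    idx = nn.kneighbors(x.reshape(1, -1), return_distance=False)[0]
    local_y = y_tr[idx]
    if len(np.unique(local_y)) == 1:            # pure neighborhood: no vote needed
        y_pred.append(local_y[0])
        continue
    votes = [clf.fit(X_tr[idx], local_y).predict(x.reshape(1, -1))[0]
             for clf in (DecisionTreeClassifier(max_depth=1),
                         GaussianNB(),
                         Perceptron(max_iter=200))]
    y_pred.append(np.bincount(votes).argmax())  # majority vote (labels are 0/1)
print(accuracy_score(y_te, y_pred))
```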
