Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Cemre Zor is active.

Publication


Featured research published by Cemre Zor.


IEEE Transactions on Neural Networks | 2013

Ensemble Pruning Using Spectral Coefficients

Terry Windeatt; Cemre Zor

Ensemble pruning aims to increase efficiency by reducing the number of base classifiers, without sacrificing and preferably enhancing performance. In this brief, a novel pruning paradigm is proposed. Two-class supervised learning problems are pruned using a combination of first- and second-order Walsh coefficients. A comparison is made with other ordered aggregation pruning methods, using multilayer perceptron base classifiers. The Walsh pruning method is analyzed with the help of a model that shows the relationship between second-order coefficients and added classification error with respect to Bayes error.
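As a rough illustration of the idea, the sketch below estimates first- and second-order Walsh coefficients from ±1-coded base-classifier outputs and target labels, then orders classifiers by a combined coefficient score. The estimator (sample means of products) and the scoring rule are simplifications chosen for illustration, not the paper's exact method.

```python
from itertools import combinations

def walsh_coefficients(outputs, labels):
    """Estimate first- and second-order Walsh coefficients of the ensemble's
    Boolean mapping under +/-1 coding (illustrative estimator: order-1 for
    classifier i is mean(y * x_i); order-2 for pair (i, j) is mean(y * x_i * x_j))."""
    n = len(outputs[0])          # number of base classifiers
    m = len(labels)              # number of training patterns
    first = [sum(y * x[i] for x, y in zip(outputs, labels)) / m
             for i in range(n)]
    second = {(i, j): sum(y * x[i] * x[j] for x, y in zip(outputs, labels)) / m
              for i, j in combinations(range(n), 2)}
    return first, second

def prune_by_walsh(outputs, labels, keep):
    """Order classifiers by a combined first/second-order magnitude score
    and keep the top `keep` (one plausible ordered-aggregation criterion)."""
    first, second = walsh_coefficients(outputs, labels)
    n = len(first)
    score = [abs(first[i]) + sum(abs(second[tuple(sorted((i, j)))])
                                 for j in range(n) if j != i)
             for i in range(n)]
    return sorted(range(n), key=lambda i: -score[i])[:keep]
```

Here `outputs` is one tuple of ±1 classifier decisions per training pattern and `labels` the corresponding ±1 targets; both function names are hypothetical.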


IEEE Transactions on Neural Networks | 2011

Minimising Added Classification Error Using Walsh Coefficients

Terry Windeatt; Cemre Zor

Two-class supervised learning in the context of a classifier ensemble may be formulated as learning an incompletely specified Boolean function, and the associated Walsh coefficients can be estimated without knowledge of the unspecified patterns. Using an extended version of the Tumer-Ghosh model, the relationship between added classification error and second-order Walsh coefficients is established. In this brief, the ensemble is composed of multilayer perceptron base classifiers, with the number of hidden nodes and epochs systematically varied. Experiments demonstrate that the mean second-order coefficients peak at the same number of training epochs as ensemble test error reaches a minimum.


Pattern Recognition | 2017

A decision cognizant Kullback-Leibler divergence

Moacir Ponti; Josef Kittler; Mateus Riva; Teofilo de Campos; Cemre Zor

In decision making systems involving multiple classifiers there is the need to assess classifier (in)congruence, that is to gauge the degree of agreement between their outputs. A commonly used measure for this purpose is the Kullback–Leibler (KL) divergence. We propose a variant of the KL divergence, named decision cognizant Kullback–Leibler divergence (DC-KL), to reduce the contribution of the minority classes, which obscure the true degree of classifier incongruence. We investigate the properties of the novel divergence measure analytically and by simulation studies. The proposed measure is demonstrated to be more robust to minority class clutter. Its sensitivity to estimation noise is also shown to be considerably lower than that of the classical KL divergence. These properties render the DC-KL divergence a much better statistic for discriminating between classifier congruence and incongruence in pattern recognition systems.
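A minimal sketch of the contrast between the two measures is given below, assuming (from the abstract, not the paper's exact definition) that the decision-cognizant variant keeps each classifier's most probable class and pools the remaining minority-class mass into a single clutter bin before computing KL. The names `kl` and `dc_kl` are hypothetical.

```python
import math

def kl(p, q, eps=1e-12):
    """Classical KL divergence between two discrete distributions."""
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

def dc_kl(p, q):
    """Hypothetical decision-cognizant variant: retain the two classifiers'
    top-decision classes, pool all other (minority) mass into one clutter
    bin, then compute KL on the reduced distributions."""
    keep = {max(range(len(p)), key=p.__getitem__),
            max(range(len(q)), key=q.__getitem__)}
    def pool(d):
        return ([d[i] for i in sorted(keep)]
                + [sum(d[i] for i in range(len(d)) if i not in keep)])
    return kl(pool(p), pool(q))
```

With this reduction, fluctuations spread across many minority classes collapse into a single term, which is the intuition behind the robustness-to-clutter claim.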


Ensembles in Machine Learning Applications | 2011

Bias-Variance Analysis of ECOC and Bagging Using Neural Nets

Cemre Zor; Terry Windeatt; Berrin A. Yanikoglu

One of the methods used to evaluate the performance of ensemble classifiers is bias and variance analysis. In this chapter, we analyse bootstrap aggregating (Bagging) and Error Correcting Output Coding (ECOC) ensembles within a bias-variance framework, and make comparisons with single classifiers, using Neural Networks (NNs) as base classifiers. As the performance of the ensembles depends on the individual base classifiers, it is important to understand the overall trends when the parameters of the base classifiers (nodes and epochs for NNs) are changed. We show experimentally on 5 artificial and 4 UCI MLR datasets that there are clear trends in the analysis that should be taken into consideration when designing NN classifier systems.


multiple classifier systems | 2013

ECOC Matrix Pruning Using Accuracy Information.

Cemre Zor; Terry Windeatt; Josef Kittler

The target of ensemble pruning is to increase efficiency by reducing the ensemble size of a multi classifier system and thus computational and storage costs, without sacrificing and preferably enhancing the generalization performance. However, most state-of-the-art ensemble pruning methods are based on unweighted or weighted voting ensembles; and their extensions to the Error Correcting Output Coding (ECOC) framework is not strongly evident or successful. In this study, a novel strategy for pruning ECOC ensembles which is based on a novel accuracy measure is presented. The measure is defined by establishing the link between the accuracies of the two-class base classifiers in the context of the main multiclass problem. The results show that the method outperforms the ECOC extensions of the state-of-the-art pruning methods in the majority of cases and that it is even possible to improve the generalization performance by only using 30% of the initial ensemble size in certain scenarios.


international symposium on computer and information sciences | 2011

FLIP-ECOC: a greedy optimization of the ECOC matrix

Cemre Zor; Berrin A. Yanikoglu; Terry Windeatt; Ethem Alpaydin

Error Correcting Output Coding (ECOC) is a multiclass classification technique, in which multiple base classifiers (dichotomizers) are trained using subsets of the training data, determined by a preset code matrix. While it is one of the best solutions to multiclass problems, ECOC is suboptimal, as the code matrix and the base classifiers are not learned simultaneously. In this paper, we show an iterative update algorithm that reduces this decoupling. We compare the algorithm with the standard ECOC approach, using Neural Networks (NNs) as the base classifiers, and show that it improves the accuracy for some well-known data sets under different settings.


international conference on biometrics | 2009

Upper Facial Action Unit Recognition

Cemre Zor; Terry Windeatt

This paper concentrates on the comparisons of systems that are used for the recognition of expressions generated by six upper face action units (AU s) by using Facial Action Coding System (FACS ). Haar wavelet, Haar-Like and Gabor wavelet coefficients are compared, using Adaboost for feature selection. The binary classification results by using Support Vector Machines (SVM ) for the upper face AU s have been observed to be better than the current results in the literature, for example 96.5% for AU2 and 97.6% for AU5 . In multi-class classification case, the Error Correcting Output Coding (ECOC ) has been applied. Although for a large number of classes, the results are not as accurate as the binary case, ECOC has the advantage of solving all problems simultaneously; and for large numbers of training samples and small number of classes, error rates are improved.


Pattern Recognition | 2018

Error sensitivity analysis of Delta divergence - a novel measure for classifier incongruence detection

Josef Kittler; Cemre Zor; Ioannis Kaloskampis; Yulia Hicks; Wenwu Wang

Abstract The state of classifier incongruence in decision making systems incorporating multiple classifiers is often an indicator of anomaly caused by an unexpected observation or an unusual situation. Its assessment is important as one of the key mechanisms for domain anomaly detection. In this paper, we investigate the sensitivity of Delta divergence, a novel measure of classifier incongruence, to estimation errors. Statistical properties of Delta divergence are analysed both theoretically and experimentally. The results of the analysis provide guidelines on the selection of threshold for classifier incongruence detection based on this measure.


international conference on acoustics, speech, and signal processing | 2017

Maritime anomaly detection in ferry tracks

Cemre Zor; Josef Kittler

This paper proposes a methodology for the automatic detection of anomalous shipping tracks traced by ferries. The approach comprises a set of models as a basis for outlier detection: A Gaussian process (GP) model regresses displacement information collected over time, and a Markov chain based detector makes use of the direction (heading) information. GP regression is performed together with Median Absolute Deviation to account for contaminated training data. The methodology utilizes the coordinates of a given ferry recorded on a second by second basis via Automatic Identification System. Its effectiveness is demonstrated on a dataset collected in the Solent area.


international conference on pattern recognition | 2016

BeamECOC: A local search for the optimization of the ECOC matrix

Cemre Zor; Berrin A. Yanikoglu; Erinc Merdivan; Terry Windeatt; Josef Kittler; Ethem Alpaydin

Error Correcting Output Coding (ECOC) is a multi-class classification technique in which multiple binary classifiers are trained according to a preset code matrix such that each one learns a separate dichotomy of the classes. While ECOC is one of the best solutions for multi-class problems, one issue which makes it suboptimal is that the training of the base classifiers is done independently of the generation of the code matrix.

Collaboration


Dive into the Cemre Zor's collaboration.

Top Co-Authors

Avatar
Top Co-Authors

Avatar
Top Co-Authors

Avatar
Top Co-Authors

Avatar
Top Co-Authors

Avatar
Top Co-Authors

Avatar
Top Co-Authors

Avatar
Top Co-Authors

Avatar
Top Co-Authors

Avatar

Mateus Riva

University of São Paulo

View shared research outputs
Top Co-Authors

Avatar

Moacir Ponti

University of São Paulo

View shared research outputs
Researchain Logo
Decentralizing Knowledge