Network


Latest external collaborations at the country level. Dive into the details by clicking on the dots.

Hotspot


Dive into the research topics where Gurkan Ozturk is active.

Publication


Featured research published by Gurkan Ozturk.


Computational Optimization and Applications | 2015

An incremental clustering algorithm based on hyperbolic smoothing

Adil M. Bagirov; Burak Ordin; Gurkan Ozturk; A. E. Xavier

Clustering is an important problem in data mining. It can be formulated as a nonsmooth, nonconvex optimization problem that is challenging for most global optimization techniques, even on medium-sized data sets. In this paper, we propose an approach that allows one to apply local methods of smooth optimization to solve clustering problems. We use an incremental approach to generate starting points for cluster centers, which enables us to deal with the nonconvexity of the problem. The hyperbolic smoothing technique is applied to handle the nonsmoothness of the clustering problem and to make it possible to apply smooth optimization algorithms. Results of numerical experiments on eleven real-world data sets, together with a comparison against state-of-the-art incremental clustering algorithms, demonstrate that smooth optimization algorithms combined with the incremental approach are a powerful alternative to existing clustering algorithms.
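The hyperbolic smoothing idea can be shown with a minimal sketch (not the paper's implementation; all names and values below are illustrative): the nonsmooth min over cluster centers is expressed through an absolute value, and |t| is replaced by sqrt(t^2 + tau^2), which is smooth for tau > 0, so a gradient-based local method can be applied.

```python
import math

def smooth_min(a, b, tau):
    # min(a, b) = 0.5 * (a + b - |a - b|); hyperbolic smoothing replaces
    # |t| with sqrt(t**2 + tau**2), which is differentiable for tau > 0
    return 0.5 * (a + b - math.sqrt((a - b) ** 2 + tau ** 2))

def clustering_objective(points, centers, tau):
    # smoothed two-center clustering objective: sum of the (smoothed)
    # minimum squared distance from each point to its nearest center
    total = 0.0
    for x in points:
        d0 = (x - centers[0]) ** 2
        d1 = (x - centers[1]) ** 2
        total += smooth_min(d0, d1, tau)
    return total

points = [0.0, 0.2, 1.9, 2.1]
exact = sum(min((x - 0.1) ** 2, (x - 2.0) ** 2) for x in points)
approx = clustering_objective(points, (0.1, 2.0), tau=1e-4)
```

As tau tends to zero the smoothed objective converges to the original nonsmooth one from below, which is what makes the substitution safe for local minimization.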


Machine Learning | 2015

An incremental piecewise linear classifier based on polyhedral conic separation

Gurkan Ozturk; Adil M. Bagirov; Refail Kasimbeyli

In this paper, a piecewise linear classifier based on polyhedral conic separation is developed. This classifier builds nonlinear boundaries between classes using polyhedral conic functions. Since the number of polyhedral conic functions separating the classes is not known a priori, an incremental approach is proposed to build the separating functions. These functions are found by minimizing an error function that is nonsmooth and nonconvex. A special procedure, based on the incremental approach, is proposed to generate starting points for minimizing the error function. The discrete gradient method, a derivative-free method for nonsmooth optimization, is applied to minimize the error function starting from those points. The proposed classifier is applied to classification problems on 12 publicly available data sets and compared with several mainstream and piecewise linear classifiers.
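A polyhedral conic function, in the form commonly used in this line of work, is g(x) = ⟨w, x − c⟩ + ξ‖x − c‖₁ − γ with vertex c; a point is typically assigned to the target class when g(x) ≤ 0. A minimal evaluation sketch (the parameter values are made up for illustration):

```python
def pcf_value(x, w, c, xi, gamma):
    # polyhedral conic function g(x) = <w, x - c> + xi * ||x - c||_1 - gamma
    diff = [xj - cj for xj, cj in zip(x, c)]
    linear = sum(wj * dj for wj, dj in zip(w, diff))
    return linear + xi * sum(abs(dj) for dj in diff) - gamma

# a single PCF with its vertex at the origin; hand-picked toy parameters
w, c, xi, gamma = [0.0, 0.0], [0.0, 0.0], 1.0, 1.0
inside = pcf_value([0.3, 0.4], w, c, xi, gamma)   # 0.7 - 1.0 = -0.3
outside = pcf_value([2.0, 0.0], w, c, xi, gamma)  # 2.0 - 1.0 = 1.0
```

The sublevel set g(x) ≤ 0 is a polyhedron with a cone-like shape around the vertex, which is what lets a small collection of such functions trace a nonlinear class boundary.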


International Journal of Information Technology and Decision Making | 2010

An Automated Multi-Objective Invigilator-Exam Assignment System

Zehra Kamisli Ozturk; Gurkan Ozturk; Mujgan Sagir

This paper is concerned with the invigilator-exam assignment problem. A web-based Automated Invigilator Assignment System (AIAS), consisting of a mathematical model, a database storing the relevant information, and web-based user interfaces, is constructed to solve the problem and to provide an environment for practical use. The core of the system is the mathematical model developed for obtaining the exact solution. We conclude the paper by presenting a real-life problem solved by the proposed approach.
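The paper's model is not reproduced here, but the flavor of an exact invigilator-exam assignment can be sketched on a toy instance (entirely made up): each exam needs one available invigilator, and among feasible assignments we pick one with the most balanced workload, found by brute-force enumeration.

```python
from itertools import product

# toy instance (not from the paper): 3 exams, 2 invigilators;
# availability[i][e] is True when invigilator i can cover exam e
availability = [
    [True, True, False],   # invigilator 0
    [True, True, True],    # invigilator 1
]

def workload_spread(assignment):
    # objective: gap between the heaviest and lightest workload
    loads = [assignment.count(i) for i in range(len(availability))]
    return max(loads) - min(loads)

# exact solution: enumerate every feasible exam -> invigilator map
feasible = [a for a in product(range(len(availability)), repeat=3)
            if all(availability[i][e] for e, i in enumerate(a))]
best = min(feasible, key=workload_spread)
```

Real instances are far too large for enumeration, which is why the paper relies on a mathematical programming model instead; the toy only illustrates the feasibility and balance criteria.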


International Conference on Computational Science | 2016

Arrhythmia Classification via k-Means Based Polyhedral Conic Functions Algorithm

Emre Cimen; Gurkan Ozturk

Heart disease is one of the leading causes of death. In this study, we used ECG data from the MIT-BIH database to classify arrhythmias. We selected five classes: normal beat (N), right bundle branch block (RBBB), left bundle branch block (LBBB), atrial premature contraction (APC), and ventricular premature contraction (VPC). We applied the k-means based Polyhedral Conic Functions (k-means PCF) algorithm to classify instances. The performance of the proposed classifier is shown with numerical experiments: with the proposed algorithm we obtained a 98% accuracy rate. This result is compared with other well-known classification methods.
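In a k-means-plus-conic-functions classifier of this kind, each class is covered by one fitted function per k-means cluster, and a new beat goes to the class whose best (smallest) function value wins. A sketch of that argmin decision rule, with made-up 1-D stand-ins for the fitted per-cluster functions:

```python
def predict(x, class_functions):
    # assign x to the class whose smallest per-cluster function value
    # is minimal; each entry maps a class label to a list of callables
    scores = {label: min(g(x) for g in funcs)
              for label, funcs in class_functions.items()}
    return min(scores, key=scores.get)

# toy stand-ins for fitted conic functions (not real ECG features):
# g(x) = |x - center| - radius, negative inside the cluster's region
def cone(center, radius):
    return lambda x: abs(x - center) - radius

models = {
    "N":    [cone(0.0, 1.0), cone(5.0, 1.0)],
    "RBBB": [cone(10.0, 2.0)],
}
```

Because each class can own several clusters, the rule naturally handles classes whose beats fall into multiple distinct regions of feature space.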


Archive | 2014

Piecewise Linear Classifiers Based on Nonsmooth Optimization Approaches

Adil M. Bagirov; Refail Kasimbeyli; Gurkan Ozturk; Julien Ugon

Nonsmooth optimization provides efficient algorithms for solving many machine learning problems. In particular, nonsmooth optimization approaches to supervised data classification lead to the design of very efficient algorithms. In this chapter, we demonstrate how nonsmooth optimization algorithms can be applied to design efficient piecewise linear classifiers for supervised data classification problems. Such classifiers are developed using max–min and polyhedral conic separabilities as well as an incremental approach. We report results of numerical experiments and compare the piecewise linear classifiers with a number of other mainstream classifiers.
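Max–min separability builds a piecewise linear decision function of the form f(x) = max_i min_j (⟨w_ij, x⟩ + b_ij), with the predicted side given by sign(f(x)). A minimal evaluation sketch with hand-picked toy weights (not fitted to any data):

```python
def max_min_value(x, groups):
    # piecewise linear function f(x) = max_i min_j (<w_ij, x> + b_ij);
    # each group is a list of (w, b) pairs whose min forms one piece
    def affine(x, w, b):
        return sum(wk * xk for wk, xk in zip(w, x)) + b
    return max(min(affine(x, w, b) for (w, b) in group) for group in groups)

# toy 1-D boundary: f(x) = min(x, 2 - x), positive exactly on (0, 2)
groups = [[((1.0,), 0.0), ((-1.0,), 2.0)]]
```

The max of mins of affine pieces can represent any piecewise linear boundary, which is why fitting the (w, b) pairs yields such flexible classifiers; the fitting itself is the nonsmooth optimization problem the chapter addresses.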


Digital Signal Processing | 2017

Incremental conic functions algorithm for large scale classification problems

Emre Cimen; Gurkan Ozturk; Ömer Nezih Gerek

In order to cope with classification problems involving large datasets, we propose a new mathematical programming algorithm that extends the clustering-based polyhedral conic functions approach. Despite the high classification efficiency of polyhedral conic functions, the previous realization required a nested implementation of k-means and conic function generation, whose computational load grows with the number of data points. In the proposed algorithm, an efficient data reduction method is applied in the k-means phase prior to the conic function generation step. The new method not only improves the computational efficiency of the successful conic function classifier but also helps avoid model over-fitting by producing fewer (but more representative) conic functions.
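The abstract does not spell out the reduction method, so the sketch below shows one generic possibility only: grid binning, where all points in a cell collapse to a single weighted representative, so the k-means phase sees far fewer points.

```python
from collections import defaultdict

def grid_reduce(points, cell):
    # replace all 2-D points falling in the same grid cell by the
    # cell's centroid plus a weight (how many originals it stands for)
    bins = defaultdict(list)
    for x, y in points:
        bins[(round(x / cell), round(y / cell))].append((x, y))
    reps = []
    for members in bins.values():
        cx = sum(p[0] for p in members) / len(members)
        cy = sum(p[1] for p in members) / len(members)
        reps.append(((cx, cy), len(members)))
    return reps

points = [(0.0, 0.0), (0.1, 0.1), (5.0, 5.0), (5.1, 4.9)]
reps = grid_reduce(points, cell=1.0)
```

Downstream, a weighted k-means treats each representative as `weight` identical points, so cluster centers land where they would have with the full dataset, at a fraction of the cost.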


Optimization Methods & Software | 2018

A sharp augmented Lagrangian-based method in constrained non-convex optimization

Adil M. Bagirov; Gurkan Ozturk; Refail Kasimbeyli

In this paper, a novel sharp augmented Lagrangian-based global optimization method is developed for solving constrained non-convex optimization problems. The algorithm consists of outer and inner loops. At each inner iteration, the discrete gradient method is applied to minimize the sharp augmented Lagrangian function. Depending on the solution found, the algorithm either stops, updates the dual variables in the inner loop, or updates the upper or lower bounds by going to the outer loop. Convergence results for the proposed method are presented. The performance of the method is demonstrated on a wide range of smooth and non-smooth constrained nonlinear optimization test problems from the literature.
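The exact form of the sharp augmented Lagrangian is given in the paper; as a rough illustration only, the sketch below uses a sharp (norm, not squared) penalty with a multiplier term on a toy equality-constrained problem, and stands in for the discrete gradient method with plain grid search as the derivative-free inner solver.

```python
def sharp_lagrangian(x, lam, c):
    # toy problem: minimize f(x) = x**2 subject to h(x) = x - 1 = 0;
    # the sharp penalty uses |h|, not h**2, so for c large enough the
    # penalty is exact and the unconstrained minimizer is feasible
    f = x ** 2
    h = x - 1.0
    return f + lam * h + c * abs(h)

def inner_solve(lam, c):
    # stand-in for the derivative-free inner solver: coarse grid search
    grid = [(i - 2000) / 1000.0 for i in range(4001)]   # covers [-2, 2]
    return min(grid, key=lambda x: sharp_lagrangian(x, lam, c))

x_star = inner_solve(lam=0.0, c=3.0)
```

Note that |h(x)| is nonsmooth at feasible points, which is precisely why a nonsmooth, derivative-free method is needed for the inner minimization.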


International Conference on Computational Science | 2016

Max Margin Polyhedral Conic Function Classifier

Gurkan Ozturk; Gurhan Ceylan

In classification problems, generalization ability plays a key role in successful prediction. The well-known Support Vector Machine classifier tries to increase generalization ability by maximizing the margin, i.e. the distance between two parallel hyperplanes passing through the closest points. In this work we investigate maximizing the margin on non-parallel multiple surfaces by adapting GEPSVM to Polyhedral Conic Function classifiers.
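The margin the abstract refers to is the distance between the parallel hyperplanes ⟨w, x⟩ + b = +1 and ⟨w, x⟩ + b = −1, which works out to 2/‖w‖. A quick sketch of that quantity with toy weight vectors:

```python
import math

def margin(w):
    # distance between the parallel hyperplanes <w, x> + b = +1 and -1,
    # which equals 2 / ||w||_2
    return 2.0 / math.sqrt(sum(wk * wk for wk in w))

# shrinking ||w|| widens the margin, which SVM training exploits by
# minimizing ||w|| subject to the separation constraints
wide = margin([0.5, 0.0])    # ||w|| = 0.5
narrow = margin([3.0, 4.0])  # ||w|| = 5.0
```

For non-parallel surfaces, as in GEPSVM-style methods, there is no single such formula, which is what makes extending the margin idea to conic function classifiers a research question rather than a bookkeeping exercise.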


Top | 2013

A novel piecewise linear classifier based on polyhedral conic and max–min separabilities

Adil M. Bagirov; Julien Ugon; Dean Webb; Gurkan Ozturk; Refail Kasimbeyli


SoftwareX | 2017

ICF: An algorithm for large scale classification with conic functions

Emre Cimen; Gurkan Ozturk; Ömer Nezih Gerek

Collaboration


Dive into Gurkan Ozturk's collaborations.

Top Co-Authors


Adil M. Bagirov

Federation University Australia


Julien Ugon

Federation University Australia


Mujgan Sagir

Eskişehir Osmangazi University


Dean Webb

Federation University Australia
