Dražen Bajer
Josip Juraj Strossmayer University of Osijek
Publications
Featured research published by Dražen Bajer.
International Journal of Applied Mathematics and Computer Science | 2014
Goran Martinović; Dražen Bajer; Bruno Zorić
The feature selection problem often occurs in pattern recognition and, more specifically, classification. Although patterns may contain a large number of features, some of them can prove irrelevant, redundant, or even detrimental to classification accuracy. Removing such features reduces problem dimensionality and can ultimately improve classification accuracy. This paper presents a wrapper approach to dimensionality reduction based on differential evolution, which explores the solution space directly. The solutions, subsets of the whole feature set, are evaluated using the k-nearest neighbour algorithm. High-quality solutions found during the run of the differential evolution fill an archive, and the final solution is obtained by conducting k-fold cross-validation on the archived solutions and selecting the best one. Experimental analysis is conducted on several standard test sets. The classification accuracy of the k-nearest neighbour algorithm using the full feature set is compared with the accuracy obtained using only the subset provided by the proposed approach, as well as by several other optimization algorithms used as wrappers. The analysis shows that the proposed approach successfully determines good feature subsets which may increase the classification accuracy.
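The wrapper idea described above lends itself to a compact illustration. Below is a minimal sketch of a differential-evolution wrapper for feature selection evaluated with k-nearest neighbours; the dataset (scikit-learn's wine data), the 0.5 thresholding of continuous genes into a feature mask, and all parameter values are assumptions made for illustration rather than the authors' exact setup, and the solution archive is omitted for brevity.

```python
# Hedged sketch of a DE-based wrapper for feature selection with k-NN evaluation.
# Dataset, parameters, and the thresholding scheme are illustrative assumptions.
import numpy as np
from sklearn.datasets import load_wine
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X, y = load_wine(return_X_y=True)
n_feat, pop_size, F, CR, gens = X.shape[1], 20, 0.5, 0.9, 30

def fitness(vec):
    mask = vec > 0.5                      # continuous genes thresholded to a feature mask
    if not mask.any():
        return 0.0
    knn = KNeighborsClassifier(n_neighbors=3)
    return cross_val_score(knn, X[:, mask], y, cv=3).mean()

pop = rng.random((pop_size, n_feat))
fit = np.array([fitness(p) for p in pop])
for _ in range(gens):
    for i in range(pop_size):
        a, b, c = pop[rng.choice([j for j in range(pop_size) if j != i], 3, replace=False)]
        mutant = np.clip(a + F * (b - c), 0.0, 1.0)   # DE/rand/1 mutation
        cross = rng.random(n_feat) < CR
        trial = np.where(cross, mutant, pop[i])
        f = fitness(trial)
        if f >= fit[i]:                   # greedy selection
            pop[i], fit[i] = trial, f

best = pop[fit.argmax()] > 0.5
print("selected features:", np.flatnonzero(best), "cv accuracy:", fit.max())
```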
Hybrid Artificial Intelligence Systems | 2015
Dražen Bajer; Bruno Zorić; Goran Martinović
When creating a classification model, it is vital to keep track of numerous parameters and to produce a model based on limited knowledge, often inferred from very confined data. Methods that aid the construction of the classification model, or build it completely automatically, are a fairly common research interest. This paper proposes an approach that employs differential evolution, enhanced through the incorporation of additional knowledge about the problem, to design a radial basis function neural network. The knowledge is inferred from an unsupervised learning procedure that aims to ensure an initial population of good solutions. Also, the search space is dynamically adjusted, i.e., narrowed during runtime in terms of the number of decision variables. The results obtained on several datasets suggest that the proposed approach is able to find well-performing networks while keeping their structure simple. Furthermore, a comparison with a differential evolution algorithm without the proposed enhancements and with a particle swarm optimization algorithm illustrates the benefits of the proposed approach.
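As a rough illustration of seeding the initial population from an unsupervised learning step, the sketch below builds candidate RBF-network encodings from k-means runs; the dataset, the width heuristic, and the encoding (flattened centres followed by per-centre widths) are assumptions made for illustration, not the paper's actual representation.

```python
# Hedged sketch: seeding a DE population for RBF-network design from k-means clustering.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import load_iris

rng = np.random.default_rng(1)
X, _ = load_iris(return_X_y=True)
pop_size, max_centers = 15, 8

def encode(centers, widths):
    # individual = flattened centre coordinates followed by one width per centre
    return np.concatenate([centers.ravel(), widths])

population = []
for _ in range(pop_size):
    k = rng.integers(2, max_centers + 1)            # varying number of hidden units
    km = KMeans(n_clusters=k, n_init=5,
                random_state=int(rng.integers(0, 10**6))).fit(X)
    centers = km.cluster_centers_
    # width heuristic: mean distance of points to their assigned centre
    widths = np.array([
        np.linalg.norm(X[km.labels_ == j] - centers[j], axis=1).mean() + 1e-9
        for j in range(k)
    ])
    population.append(encode(centers, widths))

print(len(population), "individuals, example encoding length:", population[0].size)
```

Individuals obtained this way differ in length, reflecting networks with different numbers of hidden units, which is one simple way the count of decision variables can vary during the search.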
Advances in Electrical and Computer Engineering | 2012
Goran Martinović; Dražen Bajer
The traveling salesman problem is one of the most famous problems in combinatorial optimization. The paper presents an algorithm based on the elitist ant system for solving the traveling salesman problem. 2-opt local search is incorporated into the elitist ant system and is used to improve a given number of solutions previously constructed by the artificial ants. A simple mechanism for avoiding premature stagnation of the search is also proposed. It is based on depositing strong pheromones on the solution edges of randomly selected ants, called random elitist ants, with the aim of encouraging exploration of a greater area of the solution space. Experimental analysis shows that higher-quality solutions can be achieved by using the considered algorithm instead of the usual elitist ant system with incorporated 2-opt local search.
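A small sketch of the 2-opt local search applied to constructed tours is given below; the random points and distance matrix are placeholders, and the ant-system part (pheromone trails, elitist deposits) is omitted.

```python
# Minimal 2-opt local-search sketch of the kind applied to ant-constructed tours.
import numpy as np

rng = np.random.default_rng(2)
n = 30
pts = rng.random((n, 2))
dist = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)

def tour_length(tour):
    return sum(dist[tour[i], tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def two_opt(tour):
    tour, improved = list(tour), True
    while improved:
        improved = False
        for i in range(1, n - 1):
            for j in range(i + 1, n):
                a, b = tour[i - 1], tour[i]
                c, d = tour[j], tour[(j + 1) % n]
                # reverse the segment tour[i..j] if it shortens the closed tour
                if dist[a, c] + dist[b, d] < dist[a, b] + dist[c, d] - 1e-12:
                    tour[i:j + 1] = reversed(tour[i:j + 1])
                    improved = True
    return tour

tour = list(rng.permutation(n))
print("before:", round(tour_length(tour), 3), "after:", round(tour_length(two_opt(tour)), 3))
```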
Expert Systems with Applications | 2016
Dražen Bajer; Goran Martinović; Janez Brest
The initial population of an evolutionary algorithm is an important factor affecting the convergence rate and, ultimately, its ability to find high-quality or at least satisfactory solutions. If composed of good individuals, it may bias the search towards promising regions of the search space right from the beginning. However, if no knowledge about the problem at hand is available, the initial population is most often generated completely at random, so no such behaviour can be expected. This paper proposes a method for initializing the population that attempts to identify, i.e., get close to, promising parts of the search space and to generate (relatively) good solutions in their proximity. The method is based on clustering and a simple Cauchy mutation. The results obtained on a broad set of standard benchmark functions suggest that the proposed method succeeds in this, which is most noticeable as an increase in convergence rate compared to the usual initialization approach and to a method from the literature. The results obtained on higher-dimensional problem instances also provide some insight into the usefulness of advanced initialization methods in higher-dimensional search spaces; the proposed method is beneficial in such spaces as well. Moreover, results on several very high-dimensional problem instances suggest that the proposed method is able to provide a good starting position for the search.
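The initialization idea can be sketched roughly as follows: oversample the search space, cluster the better samples, and perturb the resulting centroids with Cauchy noise. The sphere objective, the oversampling factor, the k-means step, and the Cauchy scale are all assumptions made for illustration, not the method's actual parameters.

```python
# Hedged sketch of a clustering-plus-Cauchy-mutation population initialization.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
dim, pop_size, low, high = 10, 40, -100.0, 100.0
sphere = lambda x: np.sum(x ** 2, axis=-1)          # placeholder objective

# 1) oversample the search space and keep the better half
sample = rng.uniform(low, high, size=(4 * pop_size, dim))
sample = sample[np.argsort(sphere(sample))][:2 * pop_size]

# 2) cluster the retained points to locate promising regions
k = 5
centers = KMeans(n_clusters=k, n_init=5, random_state=3).fit(sample).cluster_centers_

# 3) generate the population around the centroids via Cauchy perturbations
scale = 0.05 * (high - low)
population = np.clip(
    centers[rng.integers(0, k, pop_size)] + scale * rng.standard_cauchy((pop_size, dim)),
    low, high)
print("best initial objective:", sphere(population).min())
```

The heavy tails of the Cauchy distribution occasionally place individuals far from a centroid, which keeps some exploratory spread in the otherwise concentrated initial population.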
Swarm, Evolutionary, and Memetic Computing | 2013
Goran Martinović; Dražen Bajer
Data clustering is one of the fundamental tools in data mining and requires grouping a dataset into a specified number of nonempty and disjoint subsets. Besides the usual partitional and hierarchical methods, evolutionary algorithms are employed for clustering as well. They are able to find good-quality partitions of the dataset and successfully overcome some of the shortcomings exhibited by k-means, one of the most popular partitional algorithms. This paper proposes a differential evolution algorithm that includes macromutations as an additional exploration mechanism. The application probability and the intensity of the macromutations are dynamically adjusted during runtime. The proposed algorithm was compared to four variants of differential evolution and one particle swarm optimization algorithm. The experimental analysis conducted on a number of real datasets showed that the proposed algorithm is stable and manages to find high-quality solutions.
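A hedged sketch of a macromutation operator of this general kind is shown below; the linear decay schedules for the application probability and the intensity, as well as the encoding of cluster centres, are assumptions made for illustration rather than the schedules used in the paper.

```python
# Illustrative macromutation operator of the kind added to DE for clustering.
import numpy as np

rng = np.random.default_rng(4)

def macromutate(vec, gen, max_gen, low, high):
    """Replace a random subset of genes with uniform random values.

    Application probability and the number of replaced genes both shrink as the
    run progresses, shifting the search from exploration towards exploitation."""
    prob = 0.3 * (1.0 - gen / max_gen)          # decaying application probability
    if rng.random() >= prob:
        return vec
    intensity = max(1, int(len(vec) * 0.5 * (1.0 - gen / max_gen)))
    idx = rng.choice(len(vec), size=intensity, replace=False)
    out = vec.copy()
    out[idx] = rng.uniform(low, high, size=intensity)
    return out

# example: an individual encoding 3 cluster centres in 2-D (flattened)
individual = rng.uniform(0.0, 1.0, size=6)
print(macromutate(individual, gen=10, max_gen=100, low=0.0, high=1.0))
```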
Expert Systems with Applications | 2016
Rudolf Scitovski; Ivan Vidović; Dražen Bajer
Highlights: A new fast fuzzy partitioning algorithm is proposed. The algorithm is able to find a fuzzy globally optimal partition. The algorithm is able to estimate the most appropriate number of clusters. In this paper, a new fast incremental fuzzy partitioning algorithm is proposed that is able to find either a fuzzy globally optimal partition of the set A ⊂ R^n or a fuzzy locally optimal partition close to the global one. This is the main contribution of the paper, and it could play an important role in applied research. Since fuzzy k-optimal partitions with k = 2, 3, ..., k_max clusters are determined successively in the algorithm, it is possible to calculate the corresponding validity indices for every obtained partition. The number k_max is defined in such a way that the objective function value of the optimal partition with k_max clusters is relatively very close to the objective function value of the optimal partition with (k_max - 1) clusters. Before clustering, the data are normalized, and afterwards several validity indices are applied to partitions of the normalized data. Very simple relationships between the validity indices on normalized and original data are given as well. Hence, the proposed algorithm is able to find optimal partitions with the most appropriate number of clusters. The algorithm is tested on numerous synthetic data sets and several real data sets from the UCI data repository.
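For intuition only, the sketch below runs a compact fuzzy c-means for successive k = 2, 3, ... and stops once the objective improvement becomes small, a heavily simplified stand-in for the incremental algorithm; the synthetic data, the 5% stopping threshold, and the plain FCM updates are assumptions, not the paper's method.

```python
# Compact fuzzy c-means run incrementally for k = 2, 3, ... (illustrative stand-in).
import numpy as np

rng = np.random.default_rng(5)
A = np.vstack([rng.normal(c, 0.3, size=(50, 2)) for c in ((0, 0), (3, 3), (0, 3))])

def fcm(X, k, m=2.0, iters=100):
    U = rng.random((len(X), k))
    U /= U.sum(axis=1, keepdims=True)               # random fuzzy memberships
    for _ in range(iters):
        Um = U ** m
        C = (Um.T @ X) / Um.sum(axis=0)[:, None]    # centre update
        d = np.linalg.norm(X[:, None] - C[None], axis=-1) + 1e-12
        U = 1.0 / (d ** (2 / (m - 1)) * np.sum(d ** (-2 / (m - 1)), axis=1, keepdims=True))
    return C, U, float(np.sum((U ** m) * d ** 2))   # objective function value

prev = None
for k in range(2, 8):
    _, _, J = fcm(A, k)
    print(f"k = {k}: objective {J:.2f}")
    if prev is not None and (prev - J) / prev < 0.05:   # k_max-style stopping rule
        break
    prev = J
```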
Swarm, Evolutionary, and Memetic Computing | 2014
Goran Martinović; Dražen Bajer
This paper considers the effect of swapping the vectors used for mutant vector construction during mutation. In classic/canonical differential evolution, three mutually different vectors are picked from the population: one serves as the base vector, and the difference of the remaining two forms the difference vector. Motivated by the fact that there is no selection pressure when selecting the base vector, the effect of setting the best of the three selected vectors as the base vector is investigated. This way, a corresponding selection pressure is introduced and the exploration of the search space is directed more towards better solutions. Additionally, the order of the vectors used for generating the difference vector is considered as well. The experimental analysis, conducted on a fair number of standard benchmark functions of different dimensionalities and properties, indicates that the aforementioned approach performs competitively with or better than canonical differential evolution.
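The considered mutant construction can be sketched as follows; the placeholder objective, parameter values, and function names are illustrative, and minimisation is assumed.

```python
# Sketch of the considered mutant construction: of three randomly chosen vectors,
# the best one serves as the base vector.
import numpy as np

rng = np.random.default_rng(6)
F = 0.5
sphere = lambda x: np.sum(x ** 2)                     # placeholder objective (minimised)

def mutant_best_of_three(pop, fitness, i):
    """DE mutation with selection pressure on the base vector."""
    candidates = rng.choice([j for j in range(len(pop)) if j != i], 3, replace=False)
    # order the three picked indices by fitness (best first)
    r_best, r2, r3 = sorted(candidates, key=lambda j: fitness[j])
    return pop[r_best] + F * (pop[r2] - pop[r3])      # best of the three is the base

pop = rng.uniform(-5, 5, size=(10, 4))
fit = np.array([sphere(p) for p in pop])
print(mutant_best_of_three(pop, fit, i=0))
```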
International Journal of Bio-inspired Computation | 2015
Goran Martinović; Dražen Bajer
The 5th International Conference on Bioinspired Optimization Methods and their Applications (BIOMA 2012) | 2012
Goran Martinović; Dražen Bajer
Croatian Operational Research Review | 2014
Ivan Vidović; Dražen Bajer; Rudolf Scitovski