Jenni Pulkkinen
Tampere University of Technology
Publications
Featured research published by Jenni Pulkkinen.
Expert Systems With Applications | 2011
Serkan Kiranyaz; Jenni Pulkkinen; Moncef Gabbouj
Particle swarm optimization (PSO) was introduced as a population-based stochastic search and optimization process for static environments; however, many real problems are dynamic, meaning that the environment and the characteristics of the global optimum can change over time. Thanks to its stochastic, population-based nature, PSO can avoid being trapped in local optima and find the global optimum. However, this is never guaranteed, and as the complexity of the problem rises, it becomes more probable that the PSO algorithm gets trapped in a local optimum due to premature convergence. In this paper, we propose novel techniques that successfully address several major problems in the field of particle swarm optimization and promise efficient and robust solutions for multi-dimensional and dynamic problems. The first, so-called multi-dimensional (MD) PSO, re-forms the native structure of swarm particles in such a way that they can make inter-dimensional passes with a dedicated dimensional PSO process. Therefore, in a multi-dimensional search space where the optimum dimension is unknown, swarm particles can seek both positional and dimensional optima. This eventually removes the necessity of setting a fixed dimension a priori, which is a common drawback for the family of swarm optimizers. To address the premature convergence problem, we then propose the fractional global best formation (FGBF) technique, which collects all the best dimensional components and fractionally creates an artificial global-best particle (aGB) that has the potential to be a better “guide” than the PSO's native gbest particle. To establish follow-up of (current) local optima, we then introduce a novel multi-swarm algorithm, which enables each swarm to converge to a different optimum and use the FGBF technique distinctively.
We then propose a multi-dimensional extension of the moving peaks benchmark (MPB), which is publicly available for testing optimization algorithms in multi-modal dynamic environments. On this extended benchmark, an extensive set of experiments shows that MD PSO using the FGBF technique with multi-swarms exhibits an impressive performance and tracks the global maximum peak with minimal error.
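For reference, the standard gbest PSO that MD PSO extends can be sketched in a few lines of Python. This is a minimal, fixed-dimension sketch: the inertia and acceleration constants, search bounds, and the sphere test function are illustrative choices, not taken from the paper.

```python
import random

def pso(fitness, dim, n_particles=20, iters=200, w=0.72, c1=1.49, c2=1.49):
    """Minimise `fitness` over [-5, 5]^dim with canonical gbest PSO."""
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                  # personal best positions
    pbest_f = [fitness(p) for p in pos]          # personal best fitness values
    g = min(range(n_particles), key=pbest_f.__getitem__)
    gbest, gbest_f = pbest[g][:], pbest_f[g]     # swarm's global best so far
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            f = fitness(pos[i])
            if f < pbest_f[i]:                   # update personal best ...
                pbest[i], pbest_f[i] = pos[i][:], f
                if f < gbest_f:                  # ... and global best
                    gbest, gbest_f = pos[i][:], f
    return gbest, gbest_f

# Sphere function: a static, unimodal test problem (minimum 0 at the origin).
best, best_f = pso(lambda x: sum(v * v for v in x), dim=3)
```

MD PSO additionally lets each particle carry a dimension component so it can move between search spaces of different dimensionality, and FGBF assembles an artificial global-best particle from the best individual components across the swarm.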
Expert Systems With Applications | 2011
Serkan Kiranyaz; Turker Ince; Jenni Pulkkinen; Moncef Gabbouj
This paper presents a personalized long-term electrocardiogram (ECG) classification framework, which addresses the problem within a long-term ECG signal, known as a Holter register, recorded from an individual patient. Due to the massive number of ECG beats in a Holter register, visual inspection is quite difficult and cumbersome, if not impossible. Therefore, the proposed system helps professionals to quickly and accurately diagnose any latent heart disease by examining only the representative beats (the so-called master key-beats), each of which is automatically extracted from a time frame of homogeneous (similar) beats. We tested the system on a benchmark database where the beats of each Holter register have been manually labeled by cardiologists. The selection of the right master key-beats is the key factor for achieving a highly accurate classification, and thus we used exhaustive K-means clustering in order to find a (near-)optimal number of key-beats as well as the master key-beats themselves. The classification process produced results that were consistent with the manual labels with over 99% average accuracy, demonstrating the efficiency and robustness of the proposed system over massive data (feature) collections in high dimensions.
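The exhaustive K-means step can be illustrated with a toy sketch: sweep K over a range, cluster, and score each candidate K. The deterministic farthest-point initialisation and the simple WCSS-plus-penalty score below are illustrative stand-ins, not the validity criterion used in the paper.

```python
def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def farthest_point_init(points, k):
    """Deterministic seeding: start at the first point, then repeatedly
    take the point farthest from all centroids chosen so far."""
    cents = [points[0]]
    while len(cents) < k:
        cents.append(max(points, key=lambda p: min(dist2(p, c) for c in cents)))
    return cents

def kmeans(points, k, iters=25):
    """Plain Lloyd's algorithm; returns (centroids, labels)."""
    cents = farthest_point_init(points, k)
    labels = [0] * len(points)
    for _ in range(iters):
        labels = [min(range(k), key=lambda c: dist2(p, cents[c])) for p in points]
        for c in range(k):
            members = [p for p, l in zip(points, labels) if l == c]
            if members:
                cents[c] = tuple(sum(x) / len(members) for x in zip(*members))
    return cents, labels

def pick_key_beats(points, k_max=6, penalty=2.0):
    """Exhaustive sweep over K; the centroids of the winning K play the
    role of the 'master key-beats'."""
    best = None
    for k in range(1, k_max + 1):
        cents, labels = kmeans(points, k)
        wcss = sum(dist2(p, cents[l]) for p, l in zip(points, labels))
        score = wcss + penalty * k           # toy model-selection criterion
        if best is None or score < best[0]:
            best = (score, k, cents)
    return best[1], best[2]
```

In the actual system, feature vectors of the ECG beats within a time frame would play the role of `points`, and the beat closest to each winning centroid would be the one shown to the cardiologist.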
Expert Systems With Applications | 2010
Turker Ince; Serkan Kiranyaz; Jenni Pulkkinen; Moncef Gabbouj
In this paper, we investigate the performance of global vs. local techniques applied to the training of neural network classifiers for solving medical diagnosis problems. The methodology involves systematic and exhaustive evaluation of classifier performance over a neural network architecture space and with respect to training depth for a particular problem. In this study, the architecture space is defined over feed-forward, fully connected artificial neural networks (ANNs), which have been widely used in computer-aided decision support systems in the medical domain, and for which two popular training methods are explored: conventional backpropagation (BP) and particle swarm optimization (PSO). The two training techniques are compared in terms of classification performance on three medical diagnosis problems (breast cancer, heart disease, and diabetes) from the Proben1 benchmark dataset, and computational and architectural analyses are performed for an extensive assessment. The results clearly demonstrate that the two algorithms cannot be compared or evaluated over a single network with a fixed set of training parameters, as most earlier work in this field has done, since training and test classification performance vary significantly and depend directly on the network architecture, the training depth, the training method, and the available dataset. We therefore show that an extensive evaluation method such as the one proposed in this paper is needed to obtain a reliable and detailed performance assessment; from it we conclude that the PSO algorithm usually has better generalization ability across the architecture space, whereas BP can occasionally provide better training and/or test classification performance for some network configurations.
Furthermore, the PSO, as a global training algorithm, is capable of achieving minimum test classification errors regardless of the training depth, i.e. shallow or deep, and its average classification performance shows less variation with respect to network architecture. In terms of computational complexity, however, BP is in general superior to PSO over the entire architecture space used.
International Conference on Image Processing | 2010
Serkan Kiranyaz; Moncef Gabbouj; Jenni Pulkkinen; Turker Ince; Kristian Meissner
In this paper, we focus on advanced classification and data retrieval schemes that are instrumental when processing large taxonomical image datasets. With a large number of classes, classifying and efficiently retrieving a particular benthic macroinvertebrate image within a dataset poses a severe problem. To address this, we propose a novel network of evolutionary binary classifiers, which is scalable, dynamically adaptable and highly accurate for the classification and retrieval of large biological species-image datasets. The classification and retrieval results for the macroinvertebrate test data attain a taxonomic accuracy that equals and even surpasses that of an average expert. Our findings are encouraging for aquatic biomonitoring, where the cost intensity of sample analysis currently poses a bottleneck for routine biomonitoring.
International Conference on Innovations in Information Technology | 2008
Serkan Kiranyaz; Jenni Pulkkinen; Moncef Gabbouj
Particle swarm optimization (PSO) was introduced as a population-based stochastic search and optimization process for static environments; however, many real problems are dynamic, meaning that the environment and the characteristics of the global optimum can change over time. Thanks to its stochastic, population-based nature, PSO can avoid being trapped in local optima and find the global optimum. However, this is never guaranteed, and as the complexity of the problem rises, it becomes more probable that the PSO algorithm gets trapped in a local optimum due to premature convergence. In this paper, we propose novel techniques that successfully address several major problems in the field of particle swarm optimization and promise efficient and robust solutions for multi-dimensional and dynamic problems. The first, so-called multi-dimensional (MD) PSO, re-forms the native structure of swarm particles in such a way that they can make inter-dimensional passes with a dedicated dimensional PSO process. Therefore, in a multi-dimensional search space where the optimum dimension is unknown, swarm particles can seek both positional and dimensional optima. This eventually removes the necessity of setting a fixed dimension a priori, which is a common drawback for the family of swarm optimizers. To address the premature convergence problem, we then propose the fractional global best formation (FGBF) technique, which collects all the best dimensional components and fractionally creates an artificial global-best particle (aGB) that has the potential to be a better “guide” than the PSO's native gbest particle. To establish follow-up of (current) local optima, we then introduce a novel multi-swarm algorithm, which enables each swarm to converge to a different optimum and use the FGBF technique distinctively.
We then propose a multi-dimensional extension of the moving peaks benchmark (MPB), which is publicly available for testing optimization algorithms in multi-modal dynamic environments. On this extended benchmark, an extensive set of experiments shows that MD PSO using the FGBF technique with multi-swarms exhibits an impressive performance and tracks the global maximum peak with minimal error.
International Conference on Image Processing | 2011
Serkan Kiranyaz; Jenni Pulkkinen; Turker Ince; Moncef Gabbouj
Low-level features (also called descriptors) play a central role in content-based image retrieval (CBIR) systems. Features are various types of information extracted from the content that represent some of its characteristics or signatures. However, the low-level features that can be extracted automatically usually lack the discrimination power needed for an accurate description of the image content and may lead to poor retrieval performance. To address this problem efficiently, in this paper we propose a multi-dimensional evolutionary feature synthesis technique, which searches for the optimal linear and non-linear operators so as to synthesize a highly discriminative set of features in an optimal dimension. The optimality is sought by the multi-dimensional particle swarm optimization method along with the fractional global-best formation technique. Clustering and CBIR experiments, in which the proposed feature synthesizer is evolved using only a minority of the image database, demonstrate a significant performance improvement and exhibit a major discrimination between the features of different classes.
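The idea of synthesising features through operators can be sketched with a deliberately tiny version: enumerate a handful of candidate linear and non-linear operators over raw feature pairs and keep the one with the best class separability. The operator set and the Fisher-ratio criterion below are illustrative assumptions; the paper instead evolves the operators and the output dimension jointly with MD PSO and FGBF.

```python
import math

# Candidate synthesis operators over a pair of raw features (illustrative).
OPERATORS = {
    "sum":     lambda a, b: a + b,
    "product": lambda a, b: a * b,
    "diff":    lambda a, b: a - b,
    "tanh":    lambda a, b: math.tanh(a - b),
}

def fisher_ratio(pos, neg):
    """1-D class separability: squared mean gap over summed variances."""
    mp, mn = sum(pos) / len(pos), sum(neg) / len(neg)
    vp = sum((x - mp) ** 2 for x in pos) / len(pos)
    vn = sum((x - mn) ** 2 for x in neg) / len(neg)
    return (mp - mn) ** 2 / (vp + vn + 1e-12)

def best_synthesized_feature(class_a, class_b):
    """Exhaustively score each operator; return (name, score) of the best."""
    best = None
    for name, op in OPERATORS.items():
        fa = [op(x, y) for x, y in class_a]
        fb = [op(x, y) for x, y in class_b]
        score = fisher_ratio(fa, fb)
        if best is None or score > best[1]:
            best = (name, score)
    return best
```

For instance, if two classes are separable only by the product of their raw features, this search selects "product" even though neither raw feature alone discriminates between the classes.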
2011 IEEE Workshop on Evolving and Adaptive Intelligent Systems (EAIS) | 2011
Serkan Kiranyaz; Stefan Uhlmann; Jenni Pulkkinen; Moncef Gabbouj; Turker Ince
Content-based image retrieval (CBIR) has been an active research field for which several feature extraction, classification and retrieval techniques have been proposed to date. However, as the database size grows larger, the overall retrieval performance commonly deteriorates significantly. In this paper, we propose the collective network of (evolutionary) binary classifiers (CNBC) framework to achieve a high retrieval performance even when the training (ground truth) data is not entirely present from the beginning and the system can thus only be trained incrementally. The CNBC framework adopts a “Divide and Conquer” approach by allocating several networks of binary classifiers (NBCs) to discriminate each class and performs an evolutionary search to find the optimal binary classifier (BC) in each NBC. In such an evolution session, the CNBC body can further dynamically adapt itself to each new incoming class/feature set without a full-scale re-training or re-configuration. Both visual and numerical performance evaluations of the proposed framework over benchmark image databases demonstrate its scalability, and a significant performance improvement is achieved over traditional retrieval techniques.
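The “Divide and Conquer” structure and its incremental extensibility can be sketched as follows. A nearest-centroid scorer stands in here for the evolved binary classifiers (which the paper finds by evolutionary search); the point of the sketch is only that adding a class adds one new one-vs-rest classifier and leaves the existing ones untouched.

```python
import math

def centroid(points):
    return tuple(sum(xs) / len(xs) for xs in zip(*points))

def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

class BinaryClassifier:
    """Toy stand-in for one evolved BC: scores how 'positive' a sample looks."""
    def fit(self, pos, neg):
        self.p, self.n = centroid(pos), centroid(neg)
        return self

    def score(self, x):
        return dist(x, self.n) - dist(x, self.p)   # larger = more positive

class CNBC:
    """Collective network of binary classifiers: one one-vs-rest BC per class."""
    def __init__(self):
        self.bcs = {}

    def add_class(self, label, pos, neg):
        # Incremental evolution: only the new class's BC is trained;
        # previously trained BCs are left exactly as they are.
        self.bcs[label] = BinaryClassifier().fit(pos, neg)

    def predict(self, x):
        return max(self.bcs, key=lambda label: self.bcs[label].score(x))
```

A new class arriving later is handled by a single `add_class` call, mirroring the paper's claim that the topology adapts to new classes without full-scale re-training.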
International Conference of the IEEE Engineering in Medicine and Biology Society | 2009
Serkan Kiranyaz; Turker Ince; Jenni Pulkkinen; Moncef Gabbouj
In this paper, we present a personalized long-term electrocardiogram (ECG) classification framework, which can be applied to any Holter register recorded from an individual patient. Due to the massive number of ECG beats in a Holter register, visual inspection is quite difficult and cumbersome, if not impossible. Therefore, the proposed system helps professionals to quickly and accurately diagnose any latent heart disease by examining only the representative beats (the so-called master key-beats), each of which is automatically extracted from a time frame of homogeneous (similar) beats. We tested the system on a benchmark database where the beats of each Holter register have been manually labeled by cardiologists. The selection of the right master key-beats is the key factor for achieving a highly accurate classification, and thus we used exhaustive K-means clustering in order to find a (near-)optimal number of key-beats as well as the master key-beats themselves. The classification process produced results that were consistent with the manual labels with over 99% average accuracy, demonstrating the efficiency and robustness of the proposed system over massive data (feature) collections in high dimensions.
International Conference of the IEEE Engineering in Medicine and Biology Society | 2010
Serkan Kiranyaz; Turker Ince; Jenni Pulkkinen; Moncef Gabbouj
In this paper, we address dynamic clustering in high-dimensional data or feature spaces as an optimization problem in which multi-dimensional particle swarm optimization (MD PSO) is used to find the true number of clusters, while fractional global best formation (FGBF) is applied to avoid local optima. Based on these techniques, we then present a novel, personalized long-term ECG classification system, which addresses the problem of labeling the beats within a long-term ECG signal, known as a Holter register, recorded from an individual patient. Due to the massive number of ECG beats in a Holter register, visual inspection is quite difficult and cumbersome, if not impossible. Therefore, the proposed system helps professionals to quickly and accurately diagnose any latent heart disease by examining only the representative beats (the so-called master key-beats), each of which represents a cluster of homogeneous (similar) beats. We tested the system on a benchmark database where the beats of each Holter register have been manually labeled by cardiologists. The selection of the right master key-beats is the key factor for achieving a highly accurate classification, and the proposed systematic approach produced results that were consistent with the manual labels with 99.5% average accuracy, demonstrating the efficiency of the system.
International Conference on Innovations in Information Technology | 2011
Serkan Kiranyaz; Stefan Uhlmann; Jenni Pulkkinen; Turker Ince; Moncef Gabbouj
In this paper, we propose an incremental evolution scheme within the collective network of (evolutionary) binary classifiers (CNBC) framework to address the problem of incremental learning and to achieve a high retrieval performance for content-based image retrieval (CBIR). The proposed CNBC framework can still function even when the training (ground truth) data is not entirely present from the beginning, so the system can only be evolved incrementally. The CNBC framework adopts a “Divide and Conquer” approach by allocating several networks of binary classifiers (NBCs) to discriminate each class and performs an evolutionary search to find the optimal binary classifier (BC) in each NBC. This design further provides scalability: the CNBC can dynamically adapt its internal topology to new features and classes with minimal effort. Both visual and numerical performance evaluations of the proposed framework over benchmark image databases demonstrate its efficiency and accuracy for scalable CBIR and classification.