Publications
Featured research published by Martin Riedel.
Foundations of Computing and Decision Sciences | 2014
Marika Kaden; Mandy Lange; David Nebel; Martin Riedel; Tina Geweniger; Thomas Villmann
Classification is one of the most frequent tasks in machine learning. However, the variety of classification tasks as well as of classifier methods is huge. Thus the question arises: which classifier is suitable for a given problem, or how can a certain classifier model be utilized for different tasks in classification learning? This paper focuses on learning vector quantization classifiers as one of the most intuitive prototype-based classification models. Recent extensions and modifications of the basic learning vector quantization algorithm, proposed over the last years, are highlighted and discussed in relation to particular classification task scenarios such as imbalanced and/or incomplete data, prior data knowledge, classification guarantees, or adaptive data metrics for optimal classification.
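To make the prototype-based learning idea concrete, the following minimal numpy sketch shows a single generalized LVQ (GLVQ) update step, one of the basic schemes this survey builds on; the function name, learning rate, and data layout are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def glvq_step(x, y, prototypes, labels, lr=0.05):
    """One GLVQ update: attract the closest prototype of the correct class,
    repel the closest prototype of a wrong class (minimal sketch)."""
    d = np.sum((prototypes - x) ** 2, axis=1)                 # squared Euclidean distances
    correct = labels == y
    j_plus = np.where(correct)[0][np.argmin(d[correct])]      # best matching correct prototype
    j_minus = np.where(~correct)[0][np.argmin(d[~correct])]   # best matching wrong prototype
    d_plus, d_minus = d[j_plus], d[j_minus]
    denom = (d_plus + d_minus) ** 2
    # gradient factors of the relative distance mu = (d+ - d-) / (d+ + d-)
    prototypes[j_plus] += lr * (2.0 * d_minus / denom) * (x - prototypes[j_plus])
    prototypes[j_minus] -= lr * (2.0 * d_plus / denom) * (x - prototypes[j_minus])
    return prototypes
```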
Soft Computing | 2015
Marika Kaden; Martin Riedel; W. Hermann; Thomas Villmann
Learning vector quantization (LVQ) algorithms, powerful classifier models for class discrimination of vectorial data, belong to the family of prototype-based classifiers with a learning scheme based on Hebbian learning as a widely accepted neuronal learning paradigm. These classifier approaches estimate the class distribution and derive from it a class decision for the vectors to be classified. The estimation can be done by determining class-typical prototypes inside the class distribution area, as in LVQ, or by detecting the class borders for class discrimination, as preferred by support vector machines (SVMs). Both strategies have advantages and disadvantages depending on the given classification task. Whereas LVQ models are very intuitive and usually process the data during learning in the data space, frequently equipped with variants of the Euclidean metric, SVMs implicitly map the data into a high-dimensional kernel-induced feature space for better separation. In this Hilbert space, the inner product is consistent with the kernel. However, this implicit mapping makes a vivid interpretation more difficult. As an alternative, we propose in this paper two modifications of LVQ to make it comparable to SVM: first, border-sensitive learning is introduced to achieve border-responsible prototypes comparable to support vectors in SVM; second, kernel distances for differentiable kernels are considered, such that prototype learning takes place in a metric space isomorphic to the feature mapping space of SVM. Combining both features gives a powerful prototype-based classifier while keeping the easy interpretation and the intuitive Hebbian learning scheme of LVQ.
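The kernel distances mentioned above can be expressed entirely through kernel evaluations. The sketch below illustrates this for a Gaussian kernel; it is a generic illustration of the kernelized distance, not code from the paper.

```python
import numpy as np

def rbf_kernel(a, b, gamma=1.0):
    """Gaussian (RBF) kernel; differentiable, so gradients w.r.t. prototypes exist."""
    return np.exp(-gamma * np.sum((a - b) ** 2))

def kernel_distance(x, w, kernel=rbf_kernel):
    """Distance between sample x and prototype w in the kernel-induced feature
    space, computed purely from kernel evaluations."""
    return np.sqrt(kernel(x, x) - 2.0 * kernel(x, w) + kernel(w, w))
```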
International Conference on Machine Learning and Applications | 2012
Marika Kästner; David Nebel; Martin Riedel; Michael Biehl; Thomas Villmann
In the present paper we investigate the application of differentiable kernels to generalized matrix learning vector quantization as an alternative kernel-based classifier, which additionally provides classification-dependent data visualization. We show that the concept of differentiable kernels allows a prototype description in the data space, but equipped with the kernel metric. Moreover, using the visualization properties of the original matrix learning vector quantization, we are able to optimize the class visualization by learning the inherent visualization mapping also in this new kernel-metric data space.
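For orientation, the adaptive distance underlying generalized matrix LVQ can be sketched as follows; a rectangular projection matrix is what yields the visualization mentioned in the abstract (names and shapes here are illustrative assumptions).

```python
import numpy as np

def gmlvq_distance(x, w, omega):
    """Adaptive squared distance d(x, w) = (x - w)^T Omega^T Omega (x - w).
    A rectangular Omega, e.g. of shape (2, D), simultaneously defines a
    discriminative 2-D projection usable for class visualization."""
    diff = omega @ (x - w)
    return float(diff @ diff)

# hypothetical visualization step: project all data with the learned matrix
# projected = X @ omega.T   # shape (n_samples, 2)
```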
International Conference on Artificial Neural Networks | 2013
Marika Kästner; Martin Riedel; Marc Strickert; W. Hermann; Thomas Villmann
Prototype-based classification approaches are powerful classifiers for class discrimination of vectorial data. Famous examples are learning vector quantization models (LVQ) and support vector machines (SVMs). In this paper we propose the application of kernel distances in LVQ such that the LVQ algorithm can handle the data in a data space topologically equivalent to the feature mapping space of SVMs. Further, we provide strategies to force the LVQ prototypes to be class-border sensitive. In this way, an alternative to SVMs based on Hebbian learning is established. After presenting the theoretical background, we demonstrate the abilities of the model on an illustrative toy example and on the more challenging task of classifying Wilson's disease patients according to their neurophysiological impairments.
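One simple way to picture border sensitivity is to look at the relative distance value of each training sample: values near zero indicate samples close to the decision border. The selection rule below is a hypothetical illustration of that idea, not the exact strategy of the paper.

```python
import numpy as np

def border_candidates(X, y, prototypes, labels, threshold=0.2):
    """Return indices of samples whose relative distance
    mu = (d+ - d-) / (d+ + d-) is close to zero, i.e. samples lying near the
    class border (illustrative selection rule)."""
    selected = []
    for i, (x, yi) in enumerate(zip(X, y)):
        d = np.sum((prototypes - x) ** 2, axis=1)
        d_plus = d[labels == yi].min()
        d_minus = d[labels != yi].min()
        mu = (d_plus - d_minus) / (d_plus + d_minus)
        if abs(mu) < threshold:
            selected.append(i)
    return np.array(selected)
```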
Advances in Self-Organizing Maps and Learning Vector Quantization - Proceedings of the 10th International Workshop, WSOM 2014, Mittweida, Germany, July 2-4, 2014 | 2014
Barbara Hammer; David Nebel; Martin Riedel; Thomas Villmann
Prototype-based models such as learning vector quantization (LVQ) enjoy a wide popularity because they combine excellent classification and generalization ability with an intuitive learning paradigm: models are represented by few characteristic prototypes, the latter often being located at class-typical positions in the data space. In this article we investigate to what extent these expectations are actually met by modern LVQ schemes such as robust soft LVQ and generalized LVQ. We show that the mathematical models do not explicitly optimize the objective of finding representative prototypes, and we demonstrate this fact on a few benchmarks. Further, we investigate the behavior of the models if this objective is explicitly formalized in the mathematical costs. In this way, a smooth transition between the two partially contradictory objectives, discriminative power versus model representativity, can be obtained.
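The smooth transition between the two objectives can be pictured as a convex blend of a discriminative GLVQ-style term and a plain quantization-error term; the cost function below is a sketch of that idea under assumed notation, not the exact cost proposed in the paper.

```python
import numpy as np

def blended_cost(X, y, prototypes, labels, alpha=0.5):
    """alpha = 0: purely discriminative (relative distances as in GLVQ);
    alpha = 1: purely representative (plain vector quantization error)."""
    disc, rep = 0.0, 0.0
    for x, yi in zip(X, y):
        d = np.sum((prototypes - x) ** 2, axis=1)
        d_plus = d[labels == yi].min()
        d_minus = d[labels != yi].min()
        disc += (d_plus - d_minus) / (d_plus + d_minus)  # discriminative term
        rep += d.min()                                   # representativity term
    return (1.0 - alpha) * disc + alpha * rep
```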
International Journal of Applied Mathematics and Computer Science | 2015
Tomasz Zok; Maciej Antczak; Martin Riedel; David Nebel; Thomas Villmann; Piotr Lukasiak; Jacek Blazewicz; Marta Szachniuk
An increasing number of known RNA 3D structures contributes to the recognition of various RNA families and the identification of their features. These tasks are based on an analysis of RNA conformations conducted at different levels of detail. On the other hand, the knowledge of native nucleotide conformations is crucial for structure prediction and the understanding of RNA folding. However, this knowledge is stored in structural databases in a rather distributed form. Therefore, only automated methods for sampling the space of RNA structures can reveal plausible conformational representatives useful for further analysis. Here, we present a machine learning-based approach to inspect the dataset of RNA three-dimensional structures and to create a library of nucleotide conformers. A median neural gas algorithm is applied to cluster nucleotide structures based on their trigonometric description. The clustering procedure is two-stage: (i) backbone-driven and (ii) ribose-driven. We present the resulting library, which contains RNA nucleotide representatives covering the entire data set, and we evaluate its quality by computing normal distribution measures and the average RMSD between the data points and the prototype within each cluster.
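Median neural gas restricts the prototypes to actual data objects, which makes it applicable when only pairwise dissimilarities between nucleotide conformations are available. The sketch below shows the core of such an algorithm on a precomputed dissimilarity matrix; the parameters and annealing schedule are illustrative assumptions, not values from the paper.

```python
import numpy as np

def median_neural_gas(D, n_prototypes=5, n_iter=50, lam0=2.0, seed=0):
    """Minimal median neural gas on a dissimilarity matrix D (n x n):
    prototypes are restricted to indices of data objects."""
    rng = np.random.default_rng(seed)
    n = D.shape[0]
    protos = rng.choice(n, size=n_prototypes, replace=False)
    for t in range(n_iter):
        lam = lam0 * (0.01 / lam0) ** (t / max(n_iter - 1, 1))        # annealed neighborhood range
        ranks = np.argsort(np.argsort(D[:, protos], axis=1), axis=1)  # prototype ranking per sample
        h = np.exp(-ranks / lam)                                      # neighborhood weighting
        for j in range(n_prototypes):
            # generalized median update: the prototype becomes the data object
            # minimizing the weighted sum of dissimilarities to all samples
            protos[j] = np.argmin(h[:, j] @ D)
    return protos
```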
WSOM | 2014
Sven Hellbach; Marian Himstedt; Frank Bahrmann; Martin Riedel; Thomas Villmann; Hans-Joachim Böhme
This paper aims at an approach for labeling places within a grid cell environment. To this end, we propose a method based on non-negative matrix factorization (NMF) to extract environment-specific features from a given occupancy grid map. NMF also computes a description of where on the map these features need to be applied. After certain pre-processing steps, we use this description as input for generalized learning vector quantization (GLVQ) to achieve the classification, or labeling, of the grid cells. Our approach is evaluated on a standard data set from the University of Freiburg, showing very promising results.
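As a rough illustration of the feature-extraction step, the snippet below factorizes flattened occupancy-grid patches with an off-the-shelf NMF implementation and keeps the per-patch activations as classifier input; the patch size, number of components, and dummy data are assumptions, not values from the paper.

```python
import numpy as np
from sklearn.decomposition import NMF

patches = np.random.rand(200, 15 * 15)     # 200 flattened 15x15 occupancy-grid patches (dummy data)
nmf = NMF(n_components=20, init="nndsvda", max_iter=500, random_state=0)
activations = nmf.fit_transform(patches)   # where/how strongly each feature applies per patch
basis = nmf.components_                    # environment-specific, non-negative features
# after pre-processing, 'activations' would serve as input vectors for a GLVQ classifier
```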
WSOM | 2014
Mathias Klingner; Sven Hellbach; Martin Riedel; Marika Kaden; Thomas Villmann; Hans-Joachim Böhme
In this work we propose an online approach to compute a more precise assignment between the parts of an upper human body model and RGBD image data. For this, a Self-Organizing Map (SOM) is computed using a set of features, where each feature is weighted by a relevance factor (RFSOM). These factors are computed using generalized matrix learning vector quantization (GMLVQ) and allow the input dimensions to be scaled according to their relevance. With this scaling it is possible to distinguish between the different body parts of the upper body model. This method leads to a more precise positioning of the SOM in the 2.5D point cloud, a more stable behavior of the single neurons in their specific body regions, and hence a more reliable pose model for further computation. The algorithm was evaluated on different data sets and compared to a Self-Organizing Map trained with the spatial dimensions only, using the same data sets.
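The role of the relevance factors can be summarized in a single weighted distance used to find the best-matching SOM unit; the function below is a generic sketch with illustrative names, not code from the paper.

```python
import numpy as np

def weighted_sqdistance(x, unit, relevance):
    """Squared distance with per-dimension relevance factors, e.g. the diagonal
    relevances learned by GMLVQ, so that informative feature dimensions
    dominate the best-matching-unit search."""
    return float(np.sum(relevance * (x - unit) ** 2))

# hypothetical usage:
# bmu = min(range(len(units)), key=lambda j: weighted_sqdistance(x, units[j], relevance))
```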
Neurocomputing | 2014
Thomas Villmann; Marika Kaden; David Nebel; Martin Riedel
The amount of available functional data, like time series and hyperspectra in remote sensing, is rapidly growing and requires efficient processing that takes into account the knowledge about this special data characteristic. Usually these data are high-dimensional, but with inherent correlations between neighboring vector dimensions reflecting the functional characteristics. Especially for such high-dimensional data, metric adaptation is an important tool in several learning methods for data discrimination and sparse representation. An important group of metric learning approaches are relevance and matrix learning in vector quantization. Functional variants of relevance and matrix learning are considered in this paper. For an efficient learning of these functional relevance and matrix weights, we propose the utilization of spatial neighborhood correlations between the vector dimensions. We show that this efficient enhancement scheme can be seen as a new dissimilarity measure in standard generalized learning vector quantization, emphasizing the functional data aspect, such that theoretical aspects like margin analysis remain valid.
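Exploiting neighborhood correlations between vector dimensions can be pictured as smoothing the relevance profile along the functional (spectral or temporal) axis; the sketch below uses a simple Gaussian window and is an assumption-laden illustration, not the paper's exact scheme.

```python
import numpy as np

def smooth_relevance_profile(relevance, width=5, sigma=1.5):
    """Average each relevance weight with its spatial neighbors along the
    functional axis, reflecting correlations between neighboring dimensions."""
    offsets = np.arange(-(width // 2), width // 2 + 1)
    kernel = np.exp(-0.5 * (offsets / sigma) ** 2)
    kernel /= kernel.sum()
    smoothed = np.convolve(relevance, kernel, mode="same")
    return smoothed / smoothed.sum()   # keep the profile normalized
```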
International Conference on Machine Learning and Applications | 2012
Thomas Villmann; Marika Kästner; David Nebel; Martin Riedel