
Publications


Featured research published by Andreas Weingessel.


Psychometrika | 2002

An Examination of Indexes for Determining the Number of Clusters in Binary Data Sets.

Evgenia Dimitriadou; Sara Dolnicar; Andreas Weingessel

The problem of choosing the correct number of clusters is as old as cluster analysis itself. A number of authors have suggested various indexes to facilitate this crucial decision. One of the most extensive comparative studies of indexes was conducted by Milligan and Cooper (1985). The present work pursues the same goal under different conditions: in contrast to Milligan and Cooper's work, the emphasis here is on high-dimensional empirical binary data. Binary artificial data sets are constructed to reflect features typically encountered in real-world data situations in the field of marketing research. The simulation includes 162 binary data sets that are clustered by two different algorithms, leading to recommendations on the number of clusters for each index under consideration. Index results are evaluated, and their performance is compared and analyzed.
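To make the index-based selection idea concrete, here is a minimal Python sketch that scores candidate cluster numbers on a toy binary data set with a single validity index (average silhouette width over Jaccard distances). The paper itself compares a much larger battery of indexes and different clustering algorithms, so this is an illustration only.

import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(0)
X = rng.random((200, 20)) < 0.3            # toy high-dimensional binary data set

d = pdist(X, metric="jaccard")             # binary distance between observations
Z = linkage(d, method="average")           # one possible clustering algorithm
D = squareform(d)

scores = {}
for k in range(2, 8):                      # candidate numbers of clusters
    labels = fcluster(Z, t=k, criterion="maxclust")
    scores[k] = silhouette_score(D, labels, metric="precomputed")  # one validity index

print(scores, "-> recommended number of clusters:", max(scores, key=scores.get))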


International Journal of Pattern Recognition and Artificial Intelligence | 2002

A Combination Scheme for Fuzzy Clustering

Evgenia Dimitriadou; Andreas Weingessel; Kurt Hornik

In this paper we present a voting scheme for fuzzy cluster algorithms. This voting method allows us to combine several runs of cluster algorithms into a common partition. This helps us to tackle the problem of choosing an appropriate clustering method for a data set about which we have no a priori information. We derive the algorithm mathematically from theoretical considerations. Experiments show that the voting algorithm finds structurally stable results, and several cluster validity indexes show the improvement of the voting result over simple fuzzy clustering.
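A minimal sketch of the general voting idea, assuming a plain fuzzy c-means base algorithm: several runs are aligned to a reference run by matching clusters with maximal membership overlap, and the membership matrices are then averaged into one common fuzzy partition. The helper names (fuzzy_cmeans, vote) are illustrative, and the paper's mathematically derived scheme differs in detail.

import numpy as np
from scipy.optimize import linear_sum_assignment

def fuzzy_cmeans(X, c, m=2.0, iters=100, seed=0):
    """Plain fuzzy c-means; returns an (n, c) membership matrix."""
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(c), size=len(X))
    for _ in range(iters):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        dist = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U = dist ** (-2.0 / (m - 1.0))
        U /= U.sum(axis=1, keepdims=True)
    return U

def vote(memberships):
    """Align each run to the first one and average the membership matrices."""
    ref = memberships[0]
    combined = ref.copy()
    for U in memberships[1:]:
        _, cols = linear_sum_assignment(-(ref.T @ U))  # match clusters by overlap
        combined += U[:, cols]
    return combined / len(memberships)

X = np.random.default_rng(1).normal(size=(300, 2))
runs = [fuzzy_cmeans(X, c=3, seed=s) for s in range(5)]
U_voted = vote(runs)                  # common fuzzy partition of all runs
hard_labels = U_voted.argmax(axis=1)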


IEEE Transactions on Neural Networks | 2000

Local PCA algorithms

Andreas Weingessel; Kurt Hornik

In recent years, various principal component analysis (PCA) algorithms have been proposed. In this paper we use a general framework to describe those PCA algorithms which are based on Hebbian learning. For an important subset of these algorithms, the local algorithms, we fully describe their equilibria, in which all lateral connections are set to zero, and their local stability. We show how the parameters of the PCA algorithms have to be chosen so that the algorithm converges to a stable equilibrium which provides principal component extraction.
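As a concrete instance of the Hebbian PCA family analyzed here, the following sketch runs Oja's single-unit learning rule, which (for a suitable learning rate) converges to the first principal component; it illustrates the kind of algorithm covered by the framework, not the framework itself.

import numpy as np

rng = np.random.default_rng(0)
C = np.array([[3.0, 1.0], [1.0, 2.0]])                    # true covariance matrix
X = rng.multivariate_normal(np.zeros(2), C, size=20000)

w = rng.normal(size=2)
w /= np.linalg.norm(w)
eta = 0.001
for x in X:
    y = w @ x
    w += eta * y * (x - y * w)        # Hebbian term plus implicit weight decay

pc1 = np.linalg.eigh(np.cov(X.T))[1][:, -1]               # leading sample eigenvector
print(abs(w @ pc1))                                        # close to 1 after convergence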


International Conference on Artificial Neural Networks | 2001

Voting-Merging: An Ensemble Method for Clustering

Evgenia Dimitriadou; Andreas Weingessel; Kurt Hornik

In this paper we propose an unsupervised voting-merging scheme that is capable of clustering data sets and of finding the number of clusters they contain. The voting part of the algorithm allows us to combine several runs of clustering algorithms into a common partition. This helps us to overcome instabilities of the clustering algorithms and to improve their ability to find structures in a data set. Moreover, we develop a strategy to understand, analyze and interpret these results. In the second part of the scheme, a merging procedure starts from the clusters resulting from voting in order to find the number of clusters in the data set.
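A rough sketch of the two-stage idea, with a single over-fine k-means partition standing in for the voted partition: close clusters are merged up to a distance threshold, and the number of surviving clusters is reported. The merging criterion used here (single linkage between cluster centers with an ad-hoc threshold) is only a placeholder for the one developed in the paper.

import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(loc=m, scale=0.3, size=(100, 2))
               for m in ([0, 0], [3, 0], [0, 3])])        # three true clusters

fine = KMeans(n_clusters=10, n_init=10, random_state=0).fit(X)   # deliberately too fine
Z = linkage(fine.cluster_centers_, method="single")
merged_of_center = fcluster(Z, t=1.0, criterion="distance")      # merge nearby centers
labels = merged_of_center[fine.labels_]                           # merged cluster per point

print("estimated number of clusters:", len(np.unique(labels)))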


Soft Computing | 2002

A Combination Scheme for Fuzzy Clustering

Evgenia Dimitriadou; Andreas Weingessel; Kurt Hornik

In this paper we present a voting scheme for cluster algorithms. This voting method allows us to combine several runs of cluster algorithms into a common partition. This helps us to tackle the problem of choosing an appropriate clustering method for a data set about which we have no a priori information, and to overcome the problem of choosing an optimal result among different repetitions of the same method. Furthermore, we can improve the ability of a cluster algorithm to find structures in a data set and validate the resulting partition.


Neural Processing Letters | 1997

SVD Algorithms: APEX-like versus Subspace Methods

Andreas Weingessel; Kurt Hornik

We compare several new SVD learning algorithms, which are based on the subspace method in principal component analysis, with the APEX-like algorithm proposed by Diamantaras. It is shown experimentally that the convergence of these algorithms is as fast as that of the APEX-like algorithm.
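To show the kind of algorithm being compared, here is a single-component, Oja-style cross-coupled Hebbian rule that estimates the leading singular vector pair of the cross-covariance matrix E[x y^T] from paired samples; it is a simplified sketch and not any of the paper's specific APEX-like or subspace algorithms.

import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(4, 3))                              # ground-truth coupling
Xs = rng.normal(size=(50000, 3))
Ys = Xs @ A.T + 0.1 * rng.normal(size=(50000, 4))        # paired, noisy signals

w = rng.normal(size=4)                                   # left singular vector estimate
w /= np.linalg.norm(w)
v = rng.normal(size=3)                                   # right singular vector estimate
v /= np.linalg.norm(v)
eta = 0.001
for x, y in zip(Xs, Ys):
    a, b = v @ x, w @ y
    w += eta * a * (y - b * w)                           # coupled Oja-like updates
    v += eta * b * (x - a * v)

U, s, Vt = np.linalg.svd(Ys.T @ Xs / len(Xs))            # batch SVD for comparison
print(abs(w @ U[:, 0]), abs(v @ Vt[0]))                  # both close to 1 when converged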


Archive | 1998

Competitive learning for binary valued data

Friedrich Leisch; Andreas Weingessel; Evgenia Dimitriadou

We propose a new approach for using online competitive learning on binary data. The usual Euclidean distance is replaced by binary distance measures, which take possible asymmetries of binary data into account and therefore provide a “different point of view” for looking at the data. The method is demonstrated on two artificial examples and applied to tourist marketing research data.
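A minimal sketch of online competitive learning in which the winner is chosen by a binary, asymmetry-aware distance (a Jaccard-style measure that ignores shared zeros) rather than the Euclidean distance; the paper considers several binary distance measures, of which this is just one, and the helper jaccard_dist is illustrative.

import numpy as np

def jaccard_dist(x, p):
    """Binary-style distance: shared ones count, shared zeros are ignored."""
    both = np.minimum(x, p).sum()
    either = np.maximum(x, p).sum()
    return 1.0 - both / either if either > 0 else 0.0

rng = np.random.default_rng(0)
dense = rng.choice([0.1, 0.6], size=(500, 1))                    # two row profiles
X = (rng.random((500, 12)) < dense).astype(float)                # toy binary data

k, eta = 2, 0.05
prototypes = X[rng.choice(len(X), size=k, replace=False)].copy()
for epoch in range(10):
    for x in rng.permutation(X):                                 # online presentation
        winner = min(range(k), key=lambda j: jaccard_dist(x, prototypes[j]))
        prototypes[winner] += eta * (x - prototypes[winner])     # move winner towards x

labels = [min(range(k), key=lambda j: jaccard_dist(x, prototypes[j])) for x in X]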


International Conference on Artificial Neural Networks | 2002

A Mixed Ensemble Approach for the Semi-supervised Problem

Evgenia Dimitriadou; Andreas Weingessel; Kurt Hornik

In this paper we introduce a mixed approach for the semi-supervised data problem. Our approach consists of an ensemble unsupervised learning part in which the labeled and unlabeled points are segmented into clusters. We then take advantage of the a priori information of the labeled points to assign classes to these clusters, and use the ensemble method to predict the classes of new incoming points. Thus, new data points are classified according to the segmentation of the whole set and the association of its clusters with the classes.
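A rough sketch of the overall idea, with a single k-means run standing in for the paper's ensemble clustering step: labeled and unlabeled points are clustered together, each cluster inherits the majority class of its labeled members, and new points are classified by the cluster they fall into.

import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(m, 0.4, size=(100, 2)) for m in ([0, 0], [3, 3])])
y = np.array([0] * 100 + [1] * 100)
labeled = rng.choice(len(X), size=10, replace=False)       # only a few labels are known

km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)   # unsupervised segmentation

cluster_class = {}
for c in range(km.n_clusters):                             # majority class of labeled members
    members = [i for i in labeled if km.labels_[i] == c]
    if members:
        vals, counts = np.unique(y[members], return_counts=True)
        cluster_class[c] = vals[counts.argmax()]

X_new = rng.normal([3, 3], 0.4, size=(5, 2))               # incoming points
pred = [cluster_class.get(c) for c in km.predict(X_new)]   # None if cluster has no labeled member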


IEEE Transactions on Neural Networks | 1997

Adaptive combination of PCA and VQ networks

Andreas Weingessel; Horst Bischof; Kurt Hornik; Friedrich Leisch

In this paper we consider principal component analysis (PCA) and vector quantization (VQ) neural networks for image compression. We present a method in which the PCA and VQ steps are adaptively combined, and derive a learning algorithm for this combined network. We demonstrate that this approach can improve on the results of the successive application of the individually optimal methods.
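For orientation, here is a sketch of the successive PCA-then-VQ pipeline that the adaptively combined network is measured against: image patches are projected onto a few principal components and the coefficient vectors are then vector quantized. The joint, adaptive learning rule of the paper is not reproduced here, and the random toy image is a stand-in for real image data.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
image = rng.random((64, 64))                                # stand-in for a real image
patches = np.array([image[i:i + 8, j:j + 8].ravel()         # 8x8 non-overlapping patches
                    for i in range(0, 64, 8) for j in range(0, 64, 8)])

pca = PCA(n_components=6).fit(patches)                      # PCA step
codes = pca.transform(patches)
vq = KMeans(n_clusters=16, n_init=10, random_state=0).fit(codes)   # VQ step on PCA codes

reconstructed = pca.inverse_transform(vq.cluster_centers_[vq.labels_])
print("reconstruction MSE:", np.mean((patches - reconstructed) ** 2))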


International Journal of Neural Systems | 2003

A Robust Subspace Algorithm for Principal Component Analysis

Andreas Weingessel; Kurt Hornik

We present a noise-robust PCA algorithm which is an extension of the Oja subspace algorithm and allows tuning the noise sensitivity. We derive a loss function which is minimized by this algorithm and interpret it in a noisy PCA setting. Results on the local stability analysis of this algorithm are given, and it is shown that the locally stable equilibria are those which minimize the loss function.
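A sketch of the plain Oja subspace rule extended with an ad-hoc down-weighting of samples with large reconstruction error, as a stand-in for tunable noise sensitivity; the paper's algorithm is derived from a specific loss function and differs in detail.

import numpy as np

rng = np.random.default_rng(0)
d, k, eta = 5, 2, 0.01
C = np.diag([5.0, 3.0, 0.5, 0.3, 0.1])                     # two dominant directions
X = rng.multivariate_normal(np.zeros(d), C, size=20000)
X[::200] += rng.normal(scale=10.0, size=X[::200].shape)    # occasional gross noise

W = np.linalg.qr(rng.normal(size=(d, k)))[0]                # d x k subspace estimate
for x in X:
    y = W.T @ x
    r = x - W @ y                                           # reconstruction residual
    weight = 1.0 / (1.0 + r @ r)                            # damp samples that look noisy
    W += eta * weight * (np.outer(x, y) - W @ np.outer(y, y))   # Oja subspace update

top2 = np.linalg.eigh(np.cov(X.T))[1][:, -2:]               # true dominant subspace
print(np.linalg.svd(W.T @ top2)[1])                         # singular values near 1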

Collaboration


Dive into Andreas Weingessel's collaborations.

Top Co-Authors

Kurt Hornik, Vienna University of Economics and Business
Evgenia Dimitriadou, Vienna University of Technology
Sara Dolnicar, University of Queensland
Horst Bischof, Graz University of Technology
David Meyer, Vienna University of Economics and Business
Christian Buchta, Vienna University of Economics and Business