Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Wan-Jui Lee is active.

Publication


Featured research published by Wan-Jui Lee.


International Conference on Multiple Classifier Systems | 2007

Kernel combination versus classifier combination

Wan-Jui Lee; Sergey Verzakov; Robert P. W. Duin

Combining classifiers joins the strengths of different classifiers to improve classification performance. Using rules to combine the outputs of different classifiers is the basic structure of classifier combination. Fusing models from different kernel machine classifiers is another combining strategy, called kernel combination. Although classifier combination and kernel combination are very different strategies for combining classifiers, they aim at the same goal through very similar fundamental concepts. We propose here a compositional method for kernel combination, in which the new composed kernel matrix is an extension and union of the original kernel matrices. Kernel combination approaches generally rely heavily on the training data and have to learn weights indicating the importance of each kernel. Our compositional method avoids learning any weights: the importance of the kernel functions is derived directly in the process of learning the kernel machines. The performance of the proposed kernel combination procedure is illustrated by experiments comparing it with classifier combination based on the same kernels.


SSPR & SPR '08 Proceedings of the 2008 Joint IAPR International Workshop on Structural, Syntactic, and Statistical Pattern Recognition | 2008

An Inexact Graph Comparison Approach in Joint Eigenspace

Wan-Jui Lee; Robert P. W. Duin

In graph comparison, the choice of (dis)similarity measure between graphs is an important topic. In this work, we propose an eigendecomposition-based approach for measuring dissimilarities between graphs in a joint eigenspace (JoEig). We compare our JoEig approach with two other eigendecomposition-based methods that compare graphs in different eigenspaces. To calculate the dissimilarity between graphs of different sizes and thereby perform inexact graph comparison, we further develop three different ways to resize the eigenspectra and study their performance in different situations.
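The JoEig construction itself is not spelled out in this abstract. As a hedged illustration of the general idea, the sketch below compares two graphs through their adjacency eigenspectra, truncating the larger spectrum to the size of the smaller one; truncation is only one of several conceivable resizing schemes, and the function name `spectral_dissimilarity` is ours, not the paper's.

```python
import numpy as np

def spectral_dissimilarity(A1, A2):
    """Compare two graphs through their adjacency eigenspectra.
    The larger spectrum is truncated to the size of the smaller one,
    which is one possible way of resizing eigenspectra when the
    graphs have different numbers of nodes."""
    w1 = np.sort(np.linalg.eigvalsh(A1))[::-1]   # descending eigenvalues
    w2 = np.sort(np.linalg.eigvalsh(A2))[::-1]
    k = min(len(w1), len(w2))
    return float(np.linalg.norm(w1[:k] - w2[:k]))
```

Identical graphs yield a dissimilarity of zero, and graphs of different sizes can still be compared because only the leading part of the larger spectrum is used.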


SSPR & SPR '08 Proceedings of the 2008 Joint IAPR International Workshop on Structural, Syntactic, and Statistical Pattern Recognition | 2008

On Euclidean Corrections for Non-Euclidean Dissimilarities

Robert P. W. Duin; Elzbieta Pekalska; Artsiom Harol; Wan-Jui Lee; Horst Bunke

Non-Euclidean dissimilarity measures can be well suited for building representation spaces that are more beneficial for pattern classification systems than the related Euclidean ones [1,2]. A non-Euclidean representation space is, however, cumbersome for training classifiers, as many statistical techniques rely on the Euclidean inner product that is missing there. In this paper we report our findings on the applicability of corrections that transform a non-Euclidean representation space into a Euclidean one in which similar or better classifiers can be trained. In a case study based on four principally different classifiers we find that standard correction procedures fail to construct an appropriate Euclidean space equivalent to the original non-Euclidean one.
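As a hedged sketch of the kind of correction studied here, assuming the standard "clip" strategy of removing negative eigenvalues from the double-centered Gram matrix (the function names are ours): a dissimilarity matrix is Euclidean exactly when that Gram matrix is positive semi-definite, so clipping its negative eigenvalues yields a Euclidean surrogate.

```python
import numpy as np

def gram_from_dissimilarity(D):
    """Double-center the squared dissimilarity matrix (classical MDS).
    D is Euclidean iff this Gram matrix is positive semi-definite."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    return -0.5 * J @ (D ** 2) @ J

def clip_correction(D):
    """'Clip' correction: drop negative eigenvalues of the Gram matrix
    and rebuild a Euclidean dissimilarity matrix from the result."""
    G = gram_from_dissimilarity(D)
    w, V = np.linalg.eigh(G)
    G_fixed = (V * np.clip(w, 0.0, None)) @ V.T
    g = np.diag(G_fixed)
    D2 = g[:, None] + g[None, :] - 2 * G_fixed
    return np.sqrt(np.maximum(D2, 0.0))
```

For a matrix that is already Euclidean, the correction is a no-op, which makes it easy to sanity-check the implementation.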


SIMBAD'11 Proceedings of the First International Conference on Similarity-Based Pattern Recognition | 2011

Bag dissimilarities for multiple instance learning

David M. J. Tax; Marco Loog; Robert P. W. Duin; Veronika Cheplygina; Wan-Jui Lee

When objects cannot be represented well by single feature vectors, a collection of feature vectors can be used. This is what is done in Multiple Instance Learning, where such a collection is called a bag of instances. By using a bag of instances, an object gains more internal structure than when a single feature vector is used. This improves the expressiveness of the representation, but also adds complexity to the classification of the object. This paper shows that, in situations where no single instance determines the class label of a bag, simple bag dissimilarity measures can significantly outperform standard multiple instance classifiers. In particular, a measure that computes just the average minimum distance between instances, or a measure that uses the Earth Mover's distance, performs very well.
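A minimal sketch of the first of these measures, the average minimum instance distance between two bags. The symmetrisation used here and the name `min_dist_bag_dissimilarity` are our assumptions; the exact definition is in the paper.

```python
import numpy as np

def min_dist_bag_dissimilarity(bag_a, bag_b):
    """Average minimum instance distance between two bags (n_i x d arrays).
    For each instance in bag_a, take the distance to its nearest instance
    in bag_b, and vice versa; average the two directions to symmetrise."""
    D = np.linalg.norm(bag_a[:, None, :] - bag_b[None, :, :], axis=-1)
    return 0.5 * (D.min(axis=1).mean() + D.min(axis=0).mean())
```

The resulting bag-level dissimilarity matrix can then be fed to any dissimilarity-based classifier, sidestepping instance-level labels entirely.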


SSPR & SPR '10 Proceedings of the 2010 Joint IAPR International Conference on Structural, Syntactic, and Statistical Pattern Recognition | 2010

Dissimilarity-based multiple instance learning

Lauge Sørensen; Marco Loog; David M. J. Tax; Wan-Jui Lee; Marleen de Bruijne; Robert P. W. Duin

In this paper, we propose to solve multiple instance learning problems using a dissimilarity representation of the objects. Once the dissimilarity space has been constructed, the problem is turned into a standard supervised learning problem that can be solved with a general-purpose supervised classifier. This approach is less restrictive than kernel-based approaches and therefore allows for the usage of a wider range of proximity measures. Two conceptually different types of dissimilarity measures are considered: one based on point set distance measures and one based on the Earth Mover's distance between distributions of within- and between-set point distances, thereby taking relations within and between sets into account. Experiments on five publicly available data sets show competitive performance in terms of classification accuracy compared to previously published results.
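A hedged sketch of the general recipe: each object (bag) is represented by its vector of dissimilarities to a set of prototype objects, after which any ordinary vector-space classifier applies. The particular point-set distance and the names below are illustrative assumptions, not the paper's definitions.

```python
import numpy as np

def bag_min_distance(bag_a, bag_b):
    """A simple point-set distance: the smallest pairwise instance distance."""
    D = np.linalg.norm(bag_a[:, None, :] - bag_b[None, :, :], axis=-1)
    return float(D.min())

def dissimilarity_space(bags, prototypes, dist=bag_min_distance):
    """Represent every bag as its vector of dissimilarities to the prototypes;
    the result is an ordinary (n_bags x n_prototypes) feature matrix."""
    return np.array([[dist(b, p) for p in prototypes] for b in bags])
```

The returned matrix can be handed directly to any standard classifier, which is exactly what makes the dissimilarity-space view less restrictive than kernel methods: the measure need not be positive definite.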


International Conference on Multiple Classifier Systems | 2009

A Labelled Graph Based Multiple Classifier System

Wan-Jui Lee; Robert P. W. Duin

In general, classifying graphs with labelled nodes (also known as labelled graphs) is a more difficult task than classifying graphs with unlabelled nodes. In this work, we decompose the labelled graphs into unlabelled subgraphs with respect to the labels, and describe these decomposed subgraphs with travelling matrices. By using the travelling matrices to calculate the dissimilarity for all pairs of subgraphs with the JoEig approach [6], we can build a base classifier in the dissimilarity space for each label. By combining these label base classifiers with global-structure base classifiers, built on graph dissimilarities computed from the full adjacency matrices and the full travelling matrices, respectively, we can solve the labelled graph classification problem with a multiple classifier system.


International Journal of Pattern Recognition and Artificial Intelligence | 2012

Bridging Structure and Feature Representations in Graph Matching

Wan-Jui Lee; Veronika Cheplygina; David M. J. Tax; Marco Loog; Robert P. W. Duin

Structures and features are opposite approaches to building representations for object recognition. Bridging the two is an essential problem in pattern recognition, as the two opposite types of information are fundamentally different. As dissimilarities can be computed for both, the dissimilarity representation can be used to combine the two. Attributed graphs contain structural as well as feature-based information. Neglecting the attributes yields a purely structural description. Isolating the features and neglecting the structure represents objects by a bag of features. In this paper we show that weighted combinations of dissimilarities may perform better than these two extremes, indicating that the two types of information are essentially different and strengthen each other. In addition, we present two integrations more advanced than weighted combining and show that these may improve classification performance even further.
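A hedged sketch of the simplest of these combinations, a convex weighted sum of a structural and a feature-based dissimilarity matrix. The per-matrix mean normalisation is our assumption, added only to make the two scales comparable; the paper's exact scheme may differ.

```python
import numpy as np

def combine_dissimilarities(D_struct, D_feat, alpha=0.5):
    """Convex weighted combination of a structural and a feature-based
    dissimilarity matrix; each matrix is scaled by its mean first so
    that neither source dominates purely through its units."""
    Ds = D_struct / D_struct.mean()
    Df = D_feat / D_feat.mean()
    return alpha * Ds + (1 - alpha) * Df
```

Sweeping `alpha` between 0 (features only) and 1 (structure only) is what reveals whether an intermediate mixture beats either extreme.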


International Conference on Pattern Recognition | 2010

A Study on Combining Sets of Differently Measured Dissimilarities

Alessandro Ibba; Robert P. W. Duin; Wan-Jui Lee

The way distances are computed or measured enables different representations of the same objects. In this paper we discuss possible ways of merging different sources of information given by differently measured dissimilarity representations. We compare a simple averaging scheme [1] with dissimilarity forward selection and other techniques based on learning the weights of linear and quadratic forms. Our general conclusion is that, although the more advanced forms of combination do not always lead to better classification accuracies, combining given distance matrices prior to training is always worthwhile. We can thereby suggest which combination schemes are preferable with respect to the problem data.


2010 2nd International Workshop on Cognitive Information Processing | 2010

An experimental study on combining Euclidean distances

Wan-Jui Lee; Robert P. W. Duin; Alessandro Ibba; Marco Loog

Combining different distance matrices or dissimilarity representations can often improve on the performance of the individual ones. In this work, we experimentally study the performance of combining Euclidean distances and its relationship with the non-Euclideanness produced by the combination. The relationship between the degree of non-Euclideanness and the correlations between the combined Euclidean distance matrices is also investigated in the experiments. From the results, we observe that combining dissimilarities computed with Euclidean distances usually performs better than combining dissimilarities computed with squared Euclidean distances. The improvements are also found to be highly related to the degree of non-Euclideanness. Moreover, the degree of non-Euclideanness is relatively large when two highly uncorrelated dissimilarity matrices are combined, and it remains lower when the two combined matrices are more correlated.
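One common way to quantify the degree of non-Euclideanness is the negative eigenfraction: the share of negative eigenvalue mass in the double-centered Gram matrix derived from the dissimilarities. We assume this measure here for illustration; the paper may use a related one.

```python
import numpy as np

def negative_eigenfraction(D):
    """Degree of non-Euclideanness of a dissimilarity matrix D:
    |sum of negative eigenvalues| divided by the total absolute
    eigenvalue mass of the double-centered Gram matrix. Zero means
    D is perfectly Euclidean."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    G = -0.5 * J @ (D ** 2) @ J              # classical MDS Gram matrix
    w = np.linalg.eigvalsh(G)
    return float(np.abs(w[w < 0]).sum() / np.abs(w).sum())
```

A genuinely Euclidean distance matrix scores (numerically) zero; the unit-edge star metric on four points, which cannot be embedded in any Euclidean space, scores a clearly positive value.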


International Conference on Multiple Classifier Systems | 2010

Selecting structural base classifiers for graph-based multiple classifier systems

Wan-Jui Lee; Robert P. W. Duin; Horst Bunke

Selecting a set of good and diverse base classifiers is essential for building multiple classifier systems. However, almost all commonly used procedures for selecting such base classifiers cannot be directly applied to select structural base classifiers. The main reason is that structural data cannot be represented in a vector space. For graph-based multiple classifier systems, only using subgraphs for building structural base classifiers has been considered so far. However, in theory, a full graph preserves more information than its subgraphs. Therefore, in this work, we propose a different procedure which can transform a labelled graph into a new set of unlabelled graphs and preserve all the linkages at the same time. By embedding the label information into edges, we can further ignore the labels. By assigning weights to the edges according to the labels of their linked nodes, the strengths of the connections are altered, but the topology of the graph as a whole is preserved. Since it is very difficult to embed graphs into a vector space, graphs are usually classified based on pairwise graph distances. We adopt the dissimilarity representation and build the structural base classifiers based on labels in the dissimilarity space. By combining these structural base classifiers, we can solve the labelled graph classification problem with a multiple classifier system. The performance of using the subgraphs and full graphs to build multiple classifier systems is compared in a number of experiments.

Collaboration


Dive into Wan-Jui Lee's collaborations.

Top Co-Authors

Robert P. W. Duin (Delft University of Technology)
Marco Loog (Delft University of Technology)
David M. J. Tax (Delft University of Technology)
Alessandro Ibba (Delft University of Technology)
Veronika Cheplygina (Erasmus University Rotterdam)
Artsiom Harol (Delft University of Technology)
Sergey Verzakov (Delft University of Technology)