
Publications


Featured research published by Wojciech W. Siedlecki.


Pattern Recognition Letters | 1989

A note on genetic algorithms for large-scale feature selection

Wojciech W. Siedlecki; Jack Sklansky

We introduce the use of genetic algorithms (GA) for the selection of features in the design of automatic pattern classifiers. Our preliminary results suggest that GA is a powerful means of reducing the time for finding near-optimal subsets of features from large sets.
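
The abstract's idea, encoding each candidate feature subset as a binary chromosome and evolving a population of such subsets, can be sketched in a few lines. This is a minimal illustration with a synthetic fitness function, not the authors' algorithm; every name and parameter value below is invented for the example.

```python
import random

def ga_feature_select(n_features, fitness, pop_size=20, generations=50,
                      crossover_rate=0.8, mutation_rate=0.02, seed=0):
    """Evolve binary masks (1 = feature kept) toward high fitness."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_features)]
           for _ in range(pop_size)]
    best = max(pop, key=fitness)
    for _ in range(generations):
        def select():
            # Tournament selection: keep the fitter of two random picks.
            a, b = rng.choice(pop), rng.choice(pop)
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = select(), select()
            if rng.random() < crossover_rate:   # one-point crossover
                cut = rng.randrange(1, n_features)
                child = p1[:cut] + p2[cut:]
            else:
                child = p1[:]
            # Bit-flip mutation toggles single features in or out.
            child = [bit ^ 1 if rng.random() < mutation_rate else bit
                     for bit in child]
            nxt.append(child)
        pop = nxt
        best = max(pop + [best], key=fitness)   # elitism across generations
    return best

# Synthetic objective: features 0-4 are informative, the rest only add cost.
def toy_fitness(mask):
    return sum(mask[:5]) - 0.2 * sum(mask[5:])

best = ga_feature_select(20, toy_fitness)
```

The appeal for large-scale selection is that the search cost is governed by the population size and generation count rather than by the 2^n size of the subset lattice.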


International Journal of Pattern Recognition and Artificial Intelligence | 1988

ON AUTOMATIC FEATURE SELECTION

Wojciech W. Siedlecki; Jack Sklansky

We review recent research on methods for selecting features for multidimensional pattern classification. These methods include nonmonotonicity-tolerant branch-and-bound search and beam search. We describe the potential benefits of Monte Carlo approaches such as simulated annealing and genetic algorithms. We compare these methods to facilitate the planning of future research on feature selection.
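
Of the Monte Carlo approaches the review compares, simulated annealing is the simplest to sketch: flip one feature in or out per step, and occasionally accept a worse subset so the search can escape local optima. The sketch below uses an invented scoring function and parameter values; it is a generic illustration, not a method from the paper.

```python
import math
import random

def sa_feature_select(n_features, score, steps=2000, t0=1.0,
                      cooling=0.995, seed=1):
    """Anneal over feature masks: flip one bit per step and accept a
    worse mask with probability exp(delta / T), cooling T geometrically."""
    rng = random.Random(seed)
    mask = [rng.randint(0, 1) for _ in range(n_features)]
    cur = score(mask)
    best, best_score = mask[:], cur
    t = t0
    for _ in range(steps):
        cand = mask[:]
        cand[rng.randrange(n_features)] ^= 1    # flip one feature in or out
        delta = score(cand) - cur
        if delta >= 0 or rng.random() < math.exp(delta / t):
            mask, cur = cand, score(cand)
            if cur > best_score:
                best, best_score = mask[:], cur
        t *= cooling
    return best

# Toy score: the first five features help, extra features carry a penalty.
def toy_score(mask):
    return sum(mask[:5]) - 0.2 * sum(mask[5:])

best = sa_feature_select(15, toy_score)
```

Unlike branch-and-bound, this makes no monotonicity assumption about the criterion, which is why the review groups it with the genetic approach.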


Pattern Recognition | 1988

An overview of mapping techniques for exploratory pattern analysis

Wojciech W. Siedlecki; Kinga Siedlecka; Jack Sklansky

We present an extensive review of mapping techniques for exploratory pattern analysis. We place these techniques in eight major groups. Among them is our own innovation—the least squares mapping. We also introduce a method that accelerates and increases the precision of mappings based on the Fisher discriminant. In a sequel, to be published in this journal, we describe experiments that show that mapping techniques can be a versatile and powerful tool for cluster analysis and classifier design.


Pattern Recognition | 1988

Experiments on mapping techniques for exploratory pattern analysis

Wojciech W. Siedlecki; Kinga Siedlecka; Jack Sklansky

We describe two computer-based experiments evaluating the effectiveness of several mapping techniques for exploratory pattern analysis. The first experiment compares various mappings and classical clustering techniques as aids to people whose objective is to find clusters in the data. The second experiment evaluates the effectiveness of two-dimensional displays produced by analytic mappings for people designing linear and piecewise linear classifiers. The performance of the classifiers designed by the people aided by these displays is compared with automatically trained classifiers. Based on these experiments we selected the three best mapping methods. Even the untrained users who took part in our experiments achieved very good results with the aid of these best mappings. In fact, these results were superior by a significant margin to those obtained from renowned classical pattern recognition procedures. Another valuable result of our experiments is that they allowed us to identify the sets of parameters most often used by the participants and, consequently, suggest guidelines for the best use of mapping techniques.


Machine Intelligence and Pattern Recognition | 1988

Mapping Techniques for Exploratory Pattern Analysis

Wojciech W. Siedlecki; Kinga Siedlecka; Jack Sklansky

We describe a versatile collection of mapping methods for computer-aided pattern analysis, and we report the results of two experiments: one for cluster analysis and the second for classifier design. The first experiment involved sixteen human subjects and the second involved fourteen. The collection of mapping methods includes our innovation — the least squares mapping, which combines a squared error criterion with agglomerative hierarchical clustering. In both of these experiments untrained humans aided by the generalized declustering mapping and our least squares mapping outperformed or equaled automatic clustering and classifier design techniques.
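
The least squares mapping pairs a squared-error criterion with agglomerative clustering; the clustering half is beyond a short sketch, but the squared-error half (placing low-dimensional points so that their distances match a given distance matrix) can be illustrated with plain gradient descent. This is a generic stress-minimization sketch in the spirit of that criterion, not the authors' method, and all parameter values are invented.

```python
import math
import random

def squared_error_mapping(dist, dim=2, steps=500, lr=0.01, seed=0):
    """Place len(dist) points in dim-D space so that their Euclidean
    distances approximate the target matrix in the least-squares sense."""
    n = len(dist)
    rng = random.Random(seed)
    pts = [[rng.uniform(-1.0, 1.0) for _ in range(dim)] for _ in range(n)]
    for _ in range(steps):
        for i in range(n):
            for j in range(n):
                if i == j:
                    continue
                d = math.dist(pts[i], pts[j]) or 1e-9
                # Gradient of (d - dist[i][j])**2 with respect to pts[i].
                g = 2.0 * (d - dist[i][j]) / d
                for k in range(dim):
                    pts[i][k] -= lr * g * (pts[i][k] - pts[j][k])
    return pts

# Three items that should map to a unit equilateral triangle in the plane.
target = [[0, 1, 1], [1, 0, 1], [1, 1, 0]]
pts = squared_error_mapping(target)
```

A human analyst would then inspect the 2-D scatter of `pts` for cluster structure, which is the use case both experiments above evaluate.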


Pattern Recognition Letters | 1994

A formula for multi-class distributed classifiers

Wojciech W. Siedlecki

In this letter we describe a mathematical formula for combining two-class classifiers into a single multi-class decision rule. The use of this formula brings two important benefits. First, the component classifiers can be highly optimized with respect to local properties of the feature space. Second, the component classifiers can be assembled into a sparse two-layer network in which they operate in parallel.
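
The letter's exact formula is not reproduced in this listing; a standard way to get the same overall effect is to combine a bank of two-class rules into one multi-class decision by majority vote over class pairs, with the binary rules forming the first layer and the vote the second. All names in this sketch are invented for the example.

```python
from collections import Counter
from itertools import combinations

def combine_pairwise(binary_classifiers, classes, x):
    """Combine two-class decision rules into a multi-class decision
    by majority vote over all class pairs (one-vs-one scheme)."""
    votes = Counter()
    for (a, b) in combinations(classes, 2):
        votes[binary_classifiers[(a, b)](x)] += 1   # each rule returns a or b
    return votes.most_common(1)[0][0]

# Toy component classifiers: classes are regions around scalar centers.
classes = ["low", "mid", "high"]
centers = {"low": 0.0, "mid": 5.0, "high": 10.0}

def make_clf(a, b):
    # Two-class rule: pick whichever class center is nearer to x.
    return lambda x: a if abs(x - centers[a]) <= abs(x - centers[b]) else b

clf = {(a, b): make_clf(a, b) for (a, b) in combinations(classes, 2)}

print(combine_pairwise(clf, classes, 4.2))   # prints "mid"
```

Because each pairwise rule only needs to be accurate on the boundary between its two classes, it can be tuned to local properties of the feature space, which mirrors the first benefit the abstract claims.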


International Conference on Genetic Algorithms | 1989

Constrained Genetic Optimization via Dynamic Reward-Penalty Balancing and Its Use in Pattern Recognition

Wojciech W. Siedlecki; Jack Sklansky


Handbook of Pattern Recognition & Computer Vision | 1993

On automatic feature selection

Wojciech W. Siedlecki; Jack Sklansky


Archive | 1993

LARGE-SCALE FEATURE SELECTION

Jack Sklansky; Wojciech W. Siedlecki


Archive | 1988

Feature selection for large scale problems

Wojciech W. Siedlecki; Jack Sklansky

Collaboration


Dive into Wojciech W. Siedlecki's collaborations.

Top Co-Authors


Jack Sklansky

University of California
