
Publication


Featured research published by Josef Kittler.


Pattern Recognition | 1973

A new approach to feature selection based on the Karhunen-Loeve expansion

Josef Kittler; Peter C. Young

Abstract: After surveying existing feature selection procedures based upon the Karhunen-Loeve (K-L) expansion, the paper describes a new K-L technique that overcomes some of the limitations of the earlier procedures. The new method takes into account information on both the class variances and means, but lays particular emphasis on the classification potential of the latter. The results of a series of experiments concerned with the classification of real vector-electrocardiogram and artificially generated data demonstrate the advantages of the new method. They suggest that it is particularly useful for pattern recognition when combined with classification procedures based upon discriminant functions obtained by recursive least squares analysis.
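Karhunen-Loeve feature selection of the kind surveyed in this paper reduces dimensionality by projecting patterns onto the leading eigenvectors of a covariance matrix. As a rough illustration only (the two-class data and the choice of pooled covariance are hypothetical; this sketches the classical K-L step, not Kittler and Young's class-mean refinement):

```python
import numpy as np

# Hypothetical two-class data: rows are pattern vectors.
rng = np.random.default_rng(0)
class_a = rng.normal(loc=0.0, scale=1.0, size=(100, 5))
class_b = rng.normal(loc=2.0, scale=1.0, size=(100, 5))
data = np.vstack([class_a, class_b])

# Karhunen-Loeve (PCA) step: eigendecompose the covariance of the
# pooled, mean-centred data and keep the leading eigenvectors.
centred = data - data.mean(axis=0)
cov = centred.T @ centred / (len(data) - 1)
eigvals, eigvecs = np.linalg.eigh(cov)      # eigenvalues in ascending order
order = np.argsort(eigvals)[::-1]           # reorder by descending variance
k = 2
features = centred @ eigvecs[:, order[:k]]  # reduced representation

print(features.shape)  # (200, 2)
```

The paper's point is precisely that this pooled-covariance expansion ignores class-mean information; the proposed method reweights the axes to emphasise it.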


International Journal of Human-computer Studies / International Journal of Man-machine Studies | 1975

Mathematical methods of feature selection in pattern recognition

Josef Kittler

In the 15 years of its existence pattern recognition has made considerable progress on both the theoretical and practical fronts. Starting from the original application of pattern recognition techniques to the problem of character recognition at the time when pattern recognition was conceived, these techniques have now penetrated such diverse areas of science as medical diagnosis, remote sensing, fingerprint and speech recognition, image classification, etc. This wide applicability derives from the inherent generality of pattern recognition, which is a direct consequence of the adopted three-stage concept of the pattern recognition process. According to this concept the process of pattern recognition is viewed as a sequence of three independent functions: representation, feature selection and classification (Fig. 1). Among these functions only the representation stage, which transforms the input patterns into a form suitable for computer processing, is problem-dependent. Both the feature selector, the function of which is to reduce the dimensionality of the representation vector, and the classifier, which carries out the actual decision process, work with a vector of measurements which can be considered as an abstract pattern. As a result, the feature selection and classification stages can be implemented using mathematical methods irrespective of the original application. Naturally, this has had a beneficial effect on the progress in the theory of pattern recognition. Although all three stages of the pattern recognition system play an essential role in the process of classifying patterns by machine, the quality of the system's performance depends chiefly on the feature selector. The reasons


Pattern Recognition | 1976

A locally sensitive method for cluster analysis

Josef Kittler

Abstract: In this paper a new method of mode separation is proposed. The method is based on mapping of data points from the N-dimensional space onto a sequence so that the majority of points from each mode become successive elements of the sequence. The intervals of points in the sequence belonging to the respective modes of the p.d.f. are then determined from a function generated on this sequence. The nuclei of the modes formed by the elements of these intervals are then used to obtain separating surfaces between the modes and so to partition the data set with a multimodal probability density function into unimodal subsets.


Biological Cybernetics | 1975

Discriminant function implementation of a minimum risk classifier

Josef Kittler; Peter C. Young

The paper discusses the possibility of implementing a minimum risk classifier using the learning machine approach. Necessary conditions on the choice of pairwise classification costs are imposed so that the minimum risk classifier can be implemented using pairwise class separating functions. Parameters of these functions are obtained using a two-stage algorithm which minimizes a modified least squares criterion of class separation. In comparison to a normal least squares objective function, this criterion increases the sensitivity of the learning scheme near the class separating surface and, consequently, allows for an improvement in the performance of the discriminant function decision making processor. Simplicity of the design procedure is achieved by partitioning the multimodal classes into unimodal subsets, since discriminant functions of unimodal classes can usually be implemented simply and with sufficient accuracy as low-order polynomials. The proposed design approach is tested experimentally on an artificial pattern recognition problem.


Information Processing Letters | 1977

A method for determining class subspaces

Josef Kittler

The subspace approach to the design of a pattern recognition system assumes that pattern classes occupy different subspaces of the pattern representation space. If these subspaces were known, then pattern vectors with unknown class membership could be classified into their categories by simply comparing the magnitudes of the projections of these patterns into the individual class subspaces. A number of methods for determining class subspaces have been suggested in the pattern recognition literature [1,2,3]. In all these methods candidate axes of the class subspaces are acquired using the Karhunen-Loeve expansion [4]. The individual methods then differ in the manner in which these candidate axes are used for the construction of class subspaces. Thus, for instance, while Clafic [1] utilizes the candidate axes directly without any modification, in both the orthogonal subspace method [2] and the nonorthogonal retrenched subspace method [2] the raw class subspaces defined by the candidate axes are amended so that the resulting subspaces satisfy required conditions. Although the latter two methods are considerably more sophisticated than the seminal procedure Clafic, they both have certain limitations. The orthogonal subspace method, for instance, is too restrictive and, consequently, it often fails to yield a solution. The nonorthogonal retrenched subspace method, on the other hand, is difficult to implement even if we resort
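The projection-magnitude decision rule described above can be sketched as follows. The sample data and subspace dimensions are hypothetical, and the per-class second-moment matrix follows the plain Clafic-style formulation rather than any of the amended methods:

```python
import numpy as np

def class_subspace(samples, dim):
    # Candidate axes from the Karhunen-Loeve expansion of the
    # class second-moment matrix; keep the 'dim' leading eigenvectors.
    corr = samples.T @ samples / len(samples)
    eigvals, eigvecs = np.linalg.eigh(corr)
    return eigvecs[:, np.argsort(eigvals)[::-1][:dim]]

def classify(x, subspaces):
    # Assign x to the class whose subspace captures the largest
    # squared projection magnitude.
    energies = [np.sum((basis.T @ x) ** 2) for basis in subspaces]
    return int(np.argmax(energies))

rng = np.random.default_rng(1)
# Hypothetical classes lying near different coordinate axes.
c0 = rng.normal(size=(50, 1)) * np.array([[1.0, 0.0, 0.0]])
c1 = rng.normal(size=(50, 1)) * np.array([[0.0, 1.0, 0.0]])
subspaces = [class_subspace(c, 1) for c in (c0, c1)]

print(classify(np.array([0.9, 0.1, 0.0]), subspaces))  # 0
```

The limitations the paper raises concern how these raw subspaces should be amended; the sketch uses them unmodified.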


Information Sciences | 1975

A nonlinear distance metric criterion for feature selection in the measurement space

Josef Kittler

Abstract: In this paper a nonlinear distance metric criterion for feature selection in the measurement space is proposed. The criterion is not only a more reliable measure of class separability than criteria based on the Euclidean distance metric but also computationally more efficient.


Information Processing Letters | 1975

On the divergence and the Joshi dependence measure in feature selection

Josef Kittler

The majority of methods of feature selection in the measurement space are based on measures of probabilistic distance between the class-conditional probability density functions (p.d.f.) p(x|ω₁) and p(x|ω₂) [1]. These criteria, which, as a rule, are of integral form, are especially appealing because they provide an indication of the separability of the classes ω₁, ω₂ in the feature space under consideration, and some of them even yield directly the probability of misrecognition associated with the feature vector x. Amongst these measures the divergence D₁₂, defined as

D₁₂ = ∫ [p(x|ω₁) − p(x|ω₂)] ln [p(x|ω₁) / p(x|ω₂)] dx
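The divergence between class-conditional densities can be evaluated numerically. A minimal sketch, assuming two hypothetical one-dimensional Gaussian class densities and a simple grid integration (for equal-variance Gaussians the divergence has the known closed form (μ₁ − μ₂)²/σ²):

```python
import numpy as np

# Hypothetical class-conditional densities p(x|w1), p(x|w2):
# two 1-D Gaussians with unit variance, means 0 and 2.
def gaussian(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

x = np.linspace(-10.0, 12.0, 20001)
dx = x[1] - x[0]
p1 = gaussian(x, 0.0, 1.0)
p2 = gaussian(x, 2.0, 1.0)

# Divergence: integral of (p1 - p2) * ln(p1/p2) dx, approximated
# as a Riemann sum on the grid.
divergence = np.sum((p1 - p2) * np.log(p1 / p2)) * dx
print(round(divergence, 2))  # 4.0, matching (mu1 - mu2)^2 / sigma^2
```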


International Journal of Human-computer Studies / International Journal of Man-machine Studies | 1974

Foundations of the Theory of Learning Systems, by Ya. Z. Tsypkin (Book Review).

Paul R. Young; Josef Kittler



5th Conference on Optimization Techniques, Part 1 | 1973

15.00..

Josef Kittler

Collaboration


Dive into Josef Kittler's collaboration.
