Jiří Grim
Academy of Sciences of the Czech Republic
Publications
Featured research published by Jiří Grim.
IEEE Transactions on Image Processing | 2009
Jiří Grim; Petr Somol; Michal Haindl; Jan Daneš
We propose a new approach to the diagnostic evaluation of screening mammograms based on local statistical texture models. The local evaluation tool has the form of a multivariate probability density of gray levels in a suitably chosen search window. First, the density function, in the form of a Gaussian mixture, is estimated from data obtained by scanning the mammogram with the search window. Then we evaluate the estimated mixture at each position and display the corresponding log-likelihood value as a gray level at the window center. The resulting log-likelihood image closely correlates with the structural details of the original mammogram and emphasizes unusual places. We assume that, used in parallel, the log-likelihood image may provide additional information that facilitates the identification of malignant lesions as atypical locations of high novelty.
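A minimal sketch of the log-likelihood image idea described above, using scikit-learn's GaussianMixture. The window size, component count, and diagonal covariances are illustrative assumptions, not the paper's settings.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def log_likelihood_image(image, window=15, n_components=8):
    """Fit a Gaussian mixture to gray levels gathered by a sliding search
    window and map the per-window log-likelihood back to the window centers."""
    h, w = image.shape
    half = window // 2
    # One training vector per window position: the flattened gray levels.
    patches = np.array([
        image[r - half:r + half + 1, c - half:c + half + 1].ravel()
        for r in range(half, h - half)
        for c in range(half, w - half)
    ], dtype=float)
    gmm = GaussianMixture(n_components=n_components, covariance_type="diag",
                          random_state=0).fit(patches)
    # Evaluate the estimated mixture at every position; the log-likelihood
    # value becomes the gray level displayed at the window center.
    return gmm.score_samples(patches).reshape(h - 2 * half, w - 2 * half)
```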
Pattern Analysis and Applications | 2002
Jiří Grim; Josef Kittler; Pavel Pudil; Petr Somol
The main motivation of this paper is to design a statistically well-justified and biologically compatible neural network model and, in particular, to suggest a theoretical interpretation of the well-known high parallelism of biological neural networks. We consider a novel probabilistic approach to neural networks developed in the framework of statistical pattern recognition and refer to a series of theoretical results published earlier. It is shown that the proposed parallel fusion of probabilistic neural networks produces biologically plausible structures and improves the resulting recognition performance. The complete design methodology based on the EM algorithm has been applied to recognise unconstrained handwritten numerals from the database of Concordia University Montreal. We achieved a recognition accuracy of about 95%, which is comparable with other published results.
Computational Statistics & Data Analysis | 2003
Jiří Grim; Michal Haindl
A new method of texture modelling based on discrete distribution mixtures is proposed. Unlike some alternative approaches, the statistical properties of textures are modelled by a discrete distribution mixture of product components. The univariate distributions in the products are represented in full generality by vectors of probabilities without any constraints. The texture analysis is performed in the original quantized grey-level coding. An efficient texture synthesis is based on the easy computation of arbitrary conditional distributions from the model. We include several successful monospectral texture applications of the method to demonstrate the advantages and weak points of the presented approach.
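A hedged sketch of how an arbitrary conditional distribution can be computed from such a discrete mixture of product components, which is what keeps the synthesis step cheap. The parameter layout (`weights`, `comp_probs`) is an assumption made for illustration.

```python
import numpy as np

def conditional_distribution(weights, comp_probs, known, target):
    """weights: (M,) mixture weights.
    comp_probs: (M, N, L) per-component probability vectors over L grey levels
                for each of the N pixels in the window.
    known: dict mapping pixel index -> observed grey level.
    target: index of the pixel whose conditional distribution is wanted."""
    # Component responsibilities given the already known (synthesized) pixels.
    resp = weights.astype(float).copy()
    for n, level in known.items():
        resp *= comp_probs[:, n, level]
    resp /= resp.sum()
    # The conditional is the responsibility-weighted mixture of the target
    # pixel's component distributions.
    return resp @ comp_probs[:, target, :]
```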
international conference on pattern recognition | 2008
Petr Somol; Jana Novovičová; Jiří Grim; Pavel Pudil
We introduce a new feature selection method suitable for non-monotonic criteria, i.e., for wrapper-based feature selection. Inspired by oscillating search, the dynamic oscillating search: (i) is deterministic, (ii) optimizes subset size, (iii) has a built-in preference for smaller subsets, and (iv) has higher optimization performance than other sequential methods. We show that the new algorithm is capable of outperforming older methods not only in criterion maximization ability but, in some cases, also in obtaining subsets that generalize better.
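A simplified greedy sketch of the oscillating principle, alternating removal and addition sweeps driven by a wrapper criterion such as cross-validated accuracy. It only illustrates the idea and is not the dynamic oscillating search itself.

```python
def oscillating_selection(all_features, criterion, subset=None):
    """all_features: iterable of feature indices.
    criterion: callable taking a set of feature indices, higher is better."""
    subset = set(subset or [])
    improved = True
    while improved:
        improved = False
        # Down-swing: drop any feature whose removal does not hurt the criterion
        # (this encodes the built-in preference for smaller subsets).
        for f in list(subset):
            if criterion(subset - {f}) >= criterion(subset):
                subset.discard(f)
                improved = True
        # Up-swing: add any feature that strictly improves the criterion.
        for f in all_features:
            if f not in subset and criterion(subset | {f}) > criterion(subset):
                subset.add(f)
                improved = True
    return subset
```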
international conference on pattern recognition | 1996
Jiří Grim
The design of layered neural networks is posed as a problem of estimating finite mixtures of normal densities in the framework of statistical decision-making. The output units of the network (third layer) correspond to class-conditional mixtures defined as weighted sums of a given set of normal densities, which can be viewed as radial basis functions. It is shown that the resulting classification performance strongly depends on the component densities (second layer) shared by the class-conditional mixtures. To enable a global optimization of layered neural networks, the EM algorithm is modified to compute maximum-likelihood estimates of finite mixtures with shared components.
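A rough numerical sketch of EM for class-conditional mixtures that share one pool of diagonal Gaussian components: the weights are class-specific while the means and variances are pooled across classes. The updates are the standard EM formulas; this illustrates the shared-components idea, not the paper's exact modification.

```python
import numpy as np

def em_shared_components(data_by_class, n_components, n_iter=50, seed=0):
    """data_by_class: list of (N_c, D) arrays, one per class."""
    rng = np.random.default_rng(seed)
    dim = data_by_class[0].shape[1]
    means = rng.normal(size=(n_components, dim))
    variances = np.ones((n_components, dim))
    weights = [np.full(n_components, 1.0 / n_components) for _ in data_by_class]

    def log_gauss(X, m, v):
        # (N, M) matrix of log N(x_i | m_j, diag(v_j)).
        return -0.5 * (((X[:, None, :] - m) ** 2 / v) + np.log(2 * np.pi * v)).sum(-1)

    for _ in range(n_iter):
        sum_r = np.zeros(n_components)
        sum_rx = np.zeros_like(means)
        sum_rx2 = np.zeros_like(variances)
        for c, X in enumerate(data_by_class):
            # E-step: responsibilities of the shared components for class c.
            log_r = np.log(weights[c]) + log_gauss(X, means, variances)
            log_r -= log_r.max(axis=1, keepdims=True)
            r = np.exp(log_r)
            r /= r.sum(axis=1, keepdims=True)
            # M-step: class-specific weights, pooled component statistics.
            weights[c] = r.mean(axis=0)
            sum_r += r.sum(axis=0)
            sum_rx += r.T @ X
            sum_rx2 += r.T @ X ** 2
        means = sum_rx / sum_r[:, None]
        variances = sum_rx2 / sum_r[:, None] - means ** 2 + 1e-6
    return weights, means, variances
```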
multiple classifier systems | 2001
Jiří Grim; Josef Kittler; Pavel Pudil; Petr Somol
We consider a general scheme of parallel classifier combinations in the framework of statistical pattern recognition. Each statistical classifier defines a set of output variables in terms of a posteriori probabilities, i.e., it is used as a feature extractor. Unlike usual combining schemes, the output vectors of the classifiers are combined in parallel. Statistical Shannon information is used as a criterion to compare different combining schemes from the point of view of the theoretically available decision information. By means of relatively simple arguments we derive a theoretical hierarchy between different schemes of classifier fusion in terms of information inequalities.
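As a small illustration of the criterion, the Shannon mutual information between the class variable and a (here discretized) classifier output can be computed from an empirical joint distribution. The discretization is a simplification assumed for the sketch.

```python
import numpy as np

def mutual_information(joint):
    """joint: 2-D array of empirical joint probabilities p(class, output bin)."""
    joint = joint / joint.sum()
    p_class = joint.sum(axis=1, keepdims=True)
    p_out = joint.sum(axis=0, keepdims=True)
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = joint * np.log2(joint / (p_class * p_out))
    # Zero-probability cells contribute nothing to the sum.
    return float(np.nansum(terms))
```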
international conference on pattern recognition | 2006
Jiří Grim; Michal Haindl; Petr Somol; Pavel Pudil
Assuming local and shift-invariant texture properties, we describe the statistical dependencies between pixels by a joint probability density of gray levels within a suitably chosen observation window. We estimate the unknown multivariate density in the form of a Gaussian mixture of product components from data obtained by shifting the observation window. Obviously, the size of the window should be large to capture the low-frequency properties of textures, but, on the other hand, the increasing dimension of the estimated mixture may become prohibitive. By considering a subspace approach based on a structural mixture model, we can increase the size of the observation window while keeping the computational complexity within reasonable bounds.
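A hedged sketch of the subspace idea: each mixture component keeps its own univariate Gaussians only on the variables picked by a binary structural vector and falls back to a shared background density elsewhere, so the effective dimensionality per component stays small. Parameter names and the Gaussian background are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np
from scipy.stats import norm
from scipy.special import logsumexp

def structural_mixture_logpdf(x, weights, means, stds, bg_mean, bg_std, select):
    """x: (D,) sample; weights: (M,); means, stds, select: (M, D);
    select is a 0/1 mask marking each component's informative variables."""
    bg = norm.logpdf(x, bg_mean, bg_std)    # shared background, (D,)
    comp = norm.logpdf(x, means, stds)      # component-specific, (M, D)
    # Each component replaces the background only on its selected variables.
    log_comp = bg.sum() + (select * (comp - bg)).sum(axis=1)
    return logsumexp(np.log(weights) + log_comp)
```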
systems, man and cybernetics | 2011
Petr Somol; Jiří Grim; Pavel Pudil
The paper addresses the problem of making dependency-aware feature selection feasible in pattern recognition problems of very high dimensionality. The idea of individually best ranking is generalized to evaluate the contextual quality of each feature in a series of randomly generated feature subsets. Each random subset is evaluated by a criterion function of arbitrary choice (permitting functions of high complexity). Eventually, the novel dependency-aware feature rank is computed, expressing the average benefit of including a feature into feature subsets. The method is efficient and generalizes well, especially in very-high-dimensional problems where traditional context-aware feature selection methods fail due to prohibitive computational complexity or over-fitting. The method is shown to be capable of outperforming the commonly applied individual ranking, which ignores important contextual information contained in the data.
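A hedged sketch of the ranking scheme: many randomly generated subsets are evaluated with a criterion of arbitrary choice, and each feature is scored by the average benefit of its presence. The exact averaging used below is an illustrative choice, not necessarily the paper's formula.

```python
import numpy as np

def dependency_aware_rank(n_features, criterion, n_subsets=1000, seed=0):
    """criterion: callable on an array of feature indices, higher is better."""
    rng = np.random.default_rng(seed)
    subsets, values = [], []
    for _ in range(n_subsets):
        # Random subset of random size, evaluated by the chosen criterion.
        size = rng.integers(1, n_features + 1)
        subset = rng.choice(n_features, size=size, replace=False)
        subsets.append(set(subset.tolist()))
        values.append(criterion(subset))
    values = np.asarray(values)
    ranks = np.empty(n_features)
    for f in range(n_features):
        contains = np.array([f in s for s in subsets])
        # Average benefit of feature f: mean criterion with f minus mean without.
        ranks[f] = values[contains].mean() - values[~contains].mean()
    return ranks
```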
international conference on pattern recognition | 2004
Michal Haindl; Jiří Grim; Petr Somol; Pavel Pudil; Mineichi Kudo
A new method of colour texture modelling based on Gaussian distribution mixtures is discussed. We estimate the local statistical properties of the monospectral version of the target texture in the form of a Gaussian mixture of product components. The synthesized texture is obtained by means of a step-wise prediction of the texture image. In order to achieve a realistic colour texture image and to avoid a possible loss of high-frequency details, we use optimally chosen pieces of the original colour source texture in the synthesis phase. In this sense, the proposed texture modelling method can be viewed as statistically controlled sampling. By using multispectral or mutually registered BTF texture pieces, the method can easily be extended to these textures as well.
international conference on advances in pattern recognition | 2001
Petr Somol; Pavel Pudil; Jiří Grim
We introduce a novel algorithm for optimal feature selection. As opposed to our recent Fast Branch & Bound (FBB) algorithm [5], the new algorithm is well suited for use with recursive criterion forms. Although the new algorithm does not operate as effectively as FBB, it is able to find the optimum significantly faster than any other Branch & Bound algorithm [1,3].
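For context, a plain Branch & Bound sketch for optimal subset selection under a monotonic criterion (removing features never increases its value); it shows the principle that FBB and the new algorithm accelerate, not the algorithms of the cited papers themselves.

```python
def branch_and_bound(n_features, target_size, criterion):
    """criterion: callable on a set of feature indices, assumed monotonic in the
    sense that removing features never increases its value."""
    best = {"value": -float("inf"), "subset": None}

    def search(subset, removable):
        value = criterion(subset)
        # Bound: further removals cannot raise the value, so prune branches
        # that already fall below the best complete subset found so far.
        if value <= best["value"]:
            return
        if len(subset) == target_size:
            best["value"], best["subset"] = value, set(subset)
            return
        need = len(subset) - target_size          # removals still required
        for i, f in enumerate(removable):
            if len(removable) - i < need:         # too few candidates left
                break
            search(subset - {f}, removable[i + 1:])

    search(set(range(n_features)), list(range(n_features)))
    return best["subset"], best["value"]
```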