
Publication


Featured research published by Jose Oncina.


Pattern Recognition Letters | 1994

A new version of the nearest-neighbour approximating and eliminating search algorithm (AESA) with linear preprocessing time and memory requirements

María Luisa Micó; Jose Oncina; Enrique Vidal

The Approximating and Eliminating Search Algorithm (AESA) is currently one of the most efficient procedures for finding nearest neighbours in metric spaces where distance computation is expensive. One of the major bottlenecks of the AESA, however, is its quadratic preprocessing time and memory requirements, which in practice can severely limit the applicability of the algorithm to large data sets. In this paper a new version of the AESA is introduced which requires only linear preprocessing time and memory. The performance of the new version, referred to as ‘Linear AESA’ (LAESA), is studied through a number of simulation experiments in abstract metric spaces. The results show that LAESA achieves search performance similar to that of the AESA while overcoming the quadratic-cost bottleneck.
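The pruning idea behind LAESA can be summarised in a few lines. The sketch below is illustrative only: plain Euclidean distance stands in for an arbitrary expensive metric, the base-prototype selection is a simple greedy heuristic, and all names are ours, so it is not the authors' implementation. It does show the linear-size table of base distances and the triangle-inequality elimination the abstract refers to.

```python
import math

def dist(a, b):                      # the "expensive" metric (Euclidean here)
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def preprocess(prototypes, n_bases=3):
    # Greedily choose base prototypes that are far from the already chosen ones.
    bases = [0]
    for _ in range(n_bases - 1):
        far = max(range(len(prototypes)),
                  key=lambda i: min(dist(prototypes[i], prototypes[b]) for b in bases))
        bases.append(far)
    # Linear-size table: distance from every base to every prototype.
    table = {b: [dist(prototypes[b], p) for p in prototypes] for b in bases}
    return bases, table

def nn_search(query, prototypes, bases, table):
    d_qb = {b: dist(query, prototypes[b]) for b in bases}     # distances to bases
    best_i, best_d = min(d_qb.items(), key=lambda kv: kv[1])
    computed = len(bases)
    for i, p in enumerate(prototypes):
        if i in d_qb:
            continue
        # Triangle inequality: |d(q, b) - d(p, b)| <= d(q, p) for every base b.
        lower_bound = max(abs(d_qb[b] - table[b][i]) for b in bases)
        if lower_bound >= best_d:
            continue                  # eliminated without computing d(q, p)
        d = dist(query, p)
        computed += 1
        if d < best_d:
            best_i, best_d = i, d
    return best_i, best_d, computed

prototypes = [(0, 0), (1, 1), (5, 5), (6, 5), (9, 1)]
bases, table = preprocess(prototypes)
print(nn_search((5.2, 4.7), prototypes, bases, table))   # index, distance, #distances computed
```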


International Colloquium on Grammatical Inference | 1994

Learning Stochastic Regular Grammars by Means of a State Merging Method

Rafael C. Carrasco; Jose Oncina

We propose a new algorithm that identifies any stochastic deterministic regular language and determines the probabilities of the strings in the language. The algorithm builds the prefix tree acceptor from the sample set and systematically merges equivalent states. Experimentally it proves very fast, and the time needed grows only linearly with the size of the sample set.
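As a rough illustration of the starting point of such a state-merging method, the sketch below builds a frequency prefix tree acceptor from a tiny sample and reads off the empirical end-of-string and transition probabilities at each state. The names and the sample are ours, and the paper's merging loop and compatibility test are omitted.

```python
from collections import defaultdict

class State:
    def __init__(self):
        self.reached = 0                      # how many strings pass through this state
        self.ended = 0                        # how many strings end here
        self.next = {}                        # symbol -> child state
        self.counts = defaultdict(int)        # symbol -> transition count

def build_pta(sample):
    root = State()
    for string in sample:
        state = root
        state.reached += 1
        for symbol in string:
            state.counts[symbol] += 1
            state = state.next.setdefault(symbol, State())
            state.reached += 1
        state.ended += 1
    return root

def empirical_probs(state):
    # End-of-string ('$') and per-symbol probabilities estimated from counts.
    total = state.reached
    probs = {'$': state.ended / total}
    probs.update({a: c / total for a, c in state.counts.items()})
    return probs

pta = build_pta(["ab", "ab", "a", "abb", "b"])
print(empirical_probs(pta))                   # root: P(end), P(a), P(b)
print(empirical_probs(pta.next['a']))         # state reached after reading 'a'
```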


IEEE Transactions on Pattern Analysis and Machine Intelligence | 1993

Learning subsequential transducers for pattern recognition interpretation tasks

Jose Oncina; Pedro García; Enrique Vidal

A formalization of the transducer learning problem and an effective and efficient method for the inductive learning of an important class of transducers, the subsequential transducers, are presented. The capabilities of subsequential transductions are illustrated through a series of experiments that also show the high effectiveness of the proposed learning method in obtaining very accurate and compact transducers for the corresponding tasks.
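One core step in this kind of subsequential transducer learning is easy to illustrate: build a tree transducer from the (input, output) pairs and make it "onward" by pushing output symbols as close to the root as the data allows, before any state merging takes place. The sketch below is our reconstruction of that step under this reading, not the paper's code.

```python
import os

class TNode:
    def __init__(self):
        self.children = {}    # input symbol -> [output string, child TNode]
        self.final = None     # output emitted when the input ends here, or None

def build_tree(pairs):
    # Tree transducer: initially the whole output string is stored at the leaf.
    root = TNode()
    for x, y in pairs:
        node = root
        for a in x:
            if a not in node.children:
                node.children[a] = ["", TNode()]
            node = node.children[a][1]
        node.final = y
    return root

def make_onward(node, is_root=False):
    # Push the longest common prefix of all outputs below `node` onto the edge
    # entering `node`; the root keeps its prefix (it has no incoming edge).
    for edge in node.children.values():
        edge[0] += make_onward(edge[1])
    outs = [edge[0] for edge in node.children.values()]
    if node.final is not None:
        outs.append(node.final)
    prefix = os.path.commonprefix(outs) if outs else ""
    if is_root or not prefix:
        return ""
    for edge in node.children.values():
        edge[0] = edge[0][len(prefix):]
    if node.final is not None:
        node.final = node.final[len(prefix):]
    return prefix

pairs = [("a", "x"), ("ab", "xy"), ("abb", "xyz")]
tree = build_tree(pairs)
make_onward(tree, is_root=True)
# Outputs are now emitted as early as possible along the input:
print(tree.children["a"][0], tree.children["a"][1].children["b"][0])   # 'x' 'y'
```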


Theoretical Informatics and Applications | 1999

Learning deterministic regular grammars from stochastic samples in polynomial time

Rafael C. Carrasco; Jose Oncina

In this paper, the identification of stochastic regular languages is addressed. For this purpose, we propose a class of algorithms which identify the structure of the minimal stochastic automaton generating the language. It is shown that the time needed grows only linearly with the size of the sample set, and a measure of the complexity of the task is provided. Experimentally, our implementation proves very fast for practical purposes.
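Merge decisions in this family of algorithms typically rest on a statistical compatibility test between observed frequencies. The snippet below shows a Hoeffding-style test of the kind used in this line of work; the exact criterion, constants, and example counts are illustrative, not necessarily the paper's.

```python
import math

def compatible(f1, n1, f2, n2, alpha=0.05):
    # Two observed frequencies f1/n1 and f2/n2 are judged compatible when their
    # difference is within a Hoeffding-style confidence bound:
    # |f1/n1 - f2/n2| <= sqrt(0.5 * ln(2/alpha)) * (1/sqrt(n1) + 1/sqrt(n2))
    bound = math.sqrt(0.5 * math.log(2.0 / alpha)) * (1.0 / math.sqrt(n1) + 1.0 / math.sqrt(n2))
    return abs(f1 / n1 - f2 / n2) <= bound

# Two states reached 200 and 50 times that emitted symbol 'a' 120 and 31 times:
print(compatible(120, 200, 31, 50))   # True: 0.60 vs 0.62 is within the bound
print(compatible(120, 200, 5, 50))    # False: 0.60 vs 0.10 is too far apart
```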


Pattern Recognition Letters | 1996

A fast branch & bound nearest neighbour classifier in metric spaces

Luisa Micó; Jose Oncina; Rafael C. Carrasco

The recently introduced LAESA algorithm finds the nearest-neighbour prototype in a metric space. The average number of distances computed by the algorithm does not depend on the number of prototypes, but it has linear space and time complexity. In this paper, a new algorithm (TLAESA) is proposed which has sublinear time complexity while keeping the other features unchanged.
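The branch & bound idea can be pictured as follows: organise the prototypes in a tree whose nodes carry a representative and a covering radius, and prune whole subtrees whose lower bound (distance to the representative minus the radius) cannot beat the best distance found so far. The code below is an illustrative toy in that spirit, not the TLAESA algorithm itself; the tree construction in particular is a naive placeholder.

```python
import math

def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def build(points):
    # Each node stores a representative, a covering radius, and either the
    # points themselves (leaf) or two child subtrees.
    node = {"rep": points[0],
            "radius": max(dist(points[0], p) for p in points),
            "points": points, "children": None}
    if len(points) > 2:
        rest = sorted(points[1:], key=lambda p: dist(points[0], p))
        mid = len(rest) // 2
        node["children"] = [build([points[0]] + rest[:mid]), build(rest[mid:])]
        node["points"] = None
    return node

def search(query, node, best=(None, math.inf)):
    d_rep = dist(query, node["rep"])
    if d_rep < best[1]:
        best = (node["rep"], d_rep)
    if node["children"] is None:                       # leaf: check its points
        for p in node["points"]:
            d = dist(query, p)
            if d < best[1]:
                best = (p, d)
        return best
    # Triangle inequality: every point in a subtree is at least
    # d(query, rep) - radius away, so prune subtrees that cannot improve `best`.
    children = [(dist(query, c["rep"]), c) for c in node["children"]]
    for d_c, child in sorted(children, key=lambda t: t[0]):
        if d_c - child["radius"] < best[1]:
            best = search(query, child, best)
        # otherwise the whole subtree is eliminated without more distance computations
    return best

prototypes = [(0, 0), (1, 1), (5, 5), (6, 5), (9, 1), (2, 8)]
print(search((5.3, 4.6), build(prototypes)))           # nearest prototype and its distance
```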


Pattern Recognition | 2006

Learning stochastic edit distance: Application in handwritten character recognition

Jose Oncina; Marc Sebban

Many pattern recognition algorithms are based on nearest-neighbour search and use the well-known edit distance, for which the primitive edit costs are usually fixed in advance. In this article, we aim to learn an unbiased stochastic edit distance, in the form of a finite-state transducer, from a corpus of (input, output) pairs of strings. Contrary to other standard methods, which generally use the Expectation-Maximisation algorithm, our algorithm learns a transducer independently of the marginal probability distribution of the input strings. Such an unbiased approach requires optimising the parameters of a conditional transducer instead of a joint one. We apply our new model in the context of handwritten digit recognition and show, through a large series of experiments, that it always outperforms the standard edit distance.
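To make the learned object concrete, the sketch below evaluates a memoryless stochastic edit distance once its primitive edit probabilities are given: p(x, y) sums over all edit scripts turning x into y, and -log p(x, y) plays the role of a distance. The probability values here are made up for illustration; the paper's contribution is learning them as a conditional transducer, which this sketch does not do.

```python
import math
from functools import lru_cache

# Toy parameters (assumed, not learned here): probability of each primitive operation.
P_INS = 0.05      # insert a given symbol of y
P_DEL = 0.05      # delete a given symbol of x
P_SUB = {("a", "a"): 0.6, ("b", "b"): 0.6, ("a", "b"): 0.05, ("b", "a"): 0.05}
P_END = 0.2       # probability of stopping

def p_edit(x, y):
    @lru_cache(maxsize=None)
    def f(i, j):                       # probability of generating the suffixes x[i:], y[j:]
        if i == len(x) and j == len(y):
            return P_END
        total = 0.0
        if i < len(x) and j < len(y):
            total += P_SUB.get((x[i], y[j]), 0.0) * f(i + 1, j + 1)
        if i < len(x):
            total += P_DEL * f(i + 1, j)
        if j < len(y):
            total += P_INS * f(i, j + 1)
        return total
    return f(0, 0)

def stochastic_edit_distance(x, y):
    return -math.log(p_edit(x, y))

print(stochastic_edit_distance("ab", "ab"))   # small: a likely transformation
print(stochastic_edit_distance("ab", "ba"))   # larger: a less likely one
```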


International Colloquium on Grammatical Inference | 1996

Identification of DFA: data-dependent vs data-independent algorithms

Colin de la Higuera; Jose Oncina; Enrique Vidal

Algorithms that infer deterministic finite automata from given data and comply with the identification-in-the-limit condition have been thoroughly tested and are in practice often preferred to elaborate heuristics. Even when identification from the available data cannot be guaranteed, the existence of associated characteristic sets means that these algorithms converge towards the correct solution. In this paper we construct a framework for algorithms with this property and consider algorithms that use the quantity of available information to direct their strategy. These data-dependent algorithms still identify in the limit but may require an exponential characteristic set to do so. Nevertheless, preliminary practical evidence suggests that they could perform better.
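The contrast between the two families can be made concrete with a toy example: both consider the same candidate states (prefixes of the positive sample), but a data-independent learner processes them in a fixed canonical order, whereas a data-dependent, evidence-driven learner ranks them by how much data supports them. The snippet below is our illustration of that distinction, not the paper's algorithms.

```python
from collections import defaultdict

def prefix_counts(sample):
    # Every prefix of a positive string names a state of the prefix tree acceptor;
    # its count is the number of sample strings passing through that state.
    counts = defaultdict(int)
    for s in sample:
        for i in range(len(s) + 1):
            counts[s[:i]] += 1
    return counts

sample = ["ab", "abab", "abb", "b"]
counts = prefix_counts(sample)
states = list(counts)

# Data-independent: a fixed canonical (length, then lexicographic) order.
print(sorted(states, key=lambda q: (len(q), q)))
# Data-dependent: the most strongly supported states are considered first.
print(sorted(states, key=lambda q: -counts[q]))
```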


Pattern Recognition Letters | 2003

A modification of the LAESA algorithm for approximated k-NN classification

Francisco Moreno-Seco; Luisa Micó; Jose Oncina

Nearest-neighbour (NN) and k-nearest-neighbour (k-NN) techniques are widely used in many pattern recognition classification tasks. The linear approximating and eliminating search algorithm (LAESA) is a fast NN algorithm which does not assume that the prototypes are defined in a vector space; it only makes use of some properties of the distance (mainly the triangle inequality) in order to avoid distance computations. In this work we propose an improvement of LAESA that uses k neighbours in order to approach the accuracy of a k-NN classifier, while computing the same number of distances as LAESA and keeping the time and space complexity independent of k.
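One plausible reading of that classification rule in code, with made-up distances for illustration: no extra distances are computed; the classifier simply takes the k smallest distances the LAESA-style search already happened to compute and lets them vote.

```python
from collections import Counter
import heapq

def approx_knn_label(computed, k=3):
    """computed: list of (distance, label) pairs produced during the NN search."""
    k_best = heapq.nsmallest(k, computed)            # k closest among those actually examined
    votes = Counter(label for _, label in k_best)
    return votes.most_common(1)[0][0]

# Distances that a pruned search might have computed (hypothetical values):
computed = [(0.9, "A"), (1.1, "A"), (1.3, "B"), (4.2, "B"), (5.0, "A")]
print(approx_knn_label(computed, k=3))               # 'A' by 2 votes to 1
```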


International Colloquium on Grammatical Inference | 1998

Stochastic Inference of Regular Tree Languages

Rafael C. Carrasco; Jose Oncina; Jorge Calera-Rubio

We generalize a previous algorithm for the identification of regular languages from stochastic samples to the case of tree languages or, equivalently, string languages where structural information is available. We also describe a method to efficiently compute the relative entropy between the target grammar and the inferred one, which is useful for evaluating the inference.
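As a reminder of the quantity involved, the snippet below computes the relative entropy (Kullback-Leibler divergence) between a target distribution P and an inferred distribution Q over an explicit finite support. The paper's contribution is computing it efficiently for grammars, which this toy version does not attempt; the probabilities shown are hypothetical.

```python
import math

def relative_entropy(p, q):
    # D(P || Q) = sum_x P(x) * log(P(x) / Q(x)); requires Q(x) > 0 wherever P(x) > 0.
    return sum(px * math.log(px / q[x]) for x, px in p.items() if px > 0)

p = {"aa": 0.5, "ab": 0.3, "b": 0.2}        # hypothetical target probabilities
q = {"aa": 0.4, "ab": 0.4, "b": 0.2}        # hypothetical inferred probabilities
print(relative_entropy(p, q))               # about 0.025 nats
```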


International Colloquium on Grammatical Inference | 1996

Using domain information during the learning of a subsequential transducer

Jose Oncina; Miguel Ángel Varó

The recently proposed OSTIA algorithm allows for the identification of subsequential functions from input-output pairs. However, if the target is a partial function, convergence is not guaranteed. In this work we extend the algorithm to allow the identification of any partial subsequential function, provided that either a negative sample or a description of the domain by means of a deterministic finite automaton is available.
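The kind of domain information the extension can exploit is easy to picture: a deterministic finite automaton describing on which input strings the partial function is defined. The sketch below only checks membership in such a (hypothetical) domain; how the learner actually uses it to constrain state merging is not reproduced here.

```python
def accepts(dfa, string):
    """dfa = (initial, finals, delta) with delta[(state, symbol)] -> state."""
    initial, finals, delta = dfa
    state = initial
    for symbol in string:
        if (state, symbol) not in delta:
            return False
        state = delta[(state, symbol)]
    return state in finals

# Hypothetical domain: strings over {a, b} that end with 'b'.
domain = (0, {1}, {(0, "a"): 0, (0, "b"): 1, (1, "a"): 0, (1, "b"): 1})
print(accepts(domain, "aab"))    # True: inside the domain
print(accepts(domain, "aba"))    # False: outside, so no output should be defined
```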

Collaboration


Dive into Jose Oncina's collaborations.

Top Co-Authors

Luisa Micó
University of Alicante

Enrique Vidal
Polytechnic University of Valencia

Pedro García
Polytechnic University of Valencia