Peter Sussner
State University of Campinas
Publication
Featured research published by Peter Sussner.
IEEE Transactions on Neural Networks | 1998
Gerhard X. Ritter; Peter Sussner; J.L. Díaz-de-León
The theory of artificial neural networks has been successfully applied to a wide variety of pattern recognition problems. In this theory, the first step in computing the next state of a neuron or in performing the next layer neural network computation involves the linear operation of multiplying neural values by their synaptic strengths and adding the results. A nonlinear activation function usually follows the linear operation in order to provide for nonlinearity of the network and set the next state of the neuron. In this paper we introduce a novel class of artificial neural networks, called morphological neural networks, in which the operations of multiplication and addition are replaced by addition and maximum (or minimum), respectively. By taking the maximum (or minimum) of sums instead of the sum of products, morphological network computation is nonlinear before possible application of a nonlinear activation function. As a consequence, the properties of morphological neural networks are drastically different from those of traditional neural network models. The main emphasis of the research presented here is on morphological associative memories. We examine the computing and storage capabilities of morphological associative memories and discuss differences between morphological models and traditional semilinear models such as the Hopfield net.
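As a rough illustration (ours, not code from the paper), the NumPy sketch below contrasts the linear sum-of-products step with the morphological max-of-sums step and implements the recording/recall scheme the abstract describes; the pattern matrices and all names are made up for the example.

```python
import numpy as np

def maxplus(A, B):
    # Morphological "max product": C[i, j] = max_k (A[i, k] + B[k, j]).
    # Compare the linear step C[i, j] = sum_k A[i, k] * B[k, j].
    return np.max(A[:, :, None] + B[None, :, :], axis=1)

def record_W(X, Y):
    # Recording: w_ij = min over stored pairs of (y_i - x_j).
    # Columns of X (n x k) and Y (m x k) are the k associated pattern pairs.
    return np.min(Y[:, None, :] - X[None, :, :], axis=2)

# Hypothetical example: store two pattern pairs, then recall the first one.
X = np.array([[0., 1.], [2., 0.], [1., 3.]])   # inputs x^1, x^2 as columns
Y = np.array([[1., 0.], [0., 2.]])             # outputs y^1, y^2 as columns
W = record_W(X, Y)
y1 = maxplus(W, X[:, [0]])                     # max-of-sums recall; recovers y^1 here
```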
Neural Networks | 1999
Gerhard X. Ritter; Juan Luis Díaz-de-León S.; Peter Sussner
The theory of artificial neural networks has been successfully applied to a wide variety of pattern recognition problems. In this theory, the first step in computing the next state of a neuron or in performing the next layer neural network computation involves the linear operation of multiplying neural values by their synaptic strengths and adding the results. Thresholding usually follows the linear operation in order to provide for nonlinearity of the network. In this paper we discuss a novel class of artificial neural networks, called morphological neural networks, in which the operations of multiplication and addition are replaced by addition and maximum (or minimum), respectively. By taking the maximum (or minimum) of sums instead of the sum of products, morphological network computation is nonlinear before thresholding. As a consequence, the properties of morphological neural networks are drastically different from those of traditional neural network models. The main emphasis of the research presented here is on morphological bidirectional associative memories (MBAMs). In particular, we establish a mathematical theory for MBAMs and provide conditions that guarantee perfect bidirectional recall for corrupted patterns. Some examples that illustrate performance differences between the morphological model and the traditional semilinear model are also given.
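Schematically, and in the max-product notation \((W \vee \mathbf{x})_i = \bigvee_j (w_{ij} + x_j)\), bidirectional recall alternates morphological recall in the two directions until the pattern pair stabilizes; this sketch only conveys the iteration, not the paper's exact choice of memory pair:

\[
\mathbf{y}^{(t+1)} \;=\; W_{XY} \vee \mathbf{x}^{(t)}, \qquad
\mathbf{x}^{(t+1)} \;=\; W_{YX} \vee \mathbf{y}^{(t+1)}.
\]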
International Conference on Pattern Recognition | 1996
Gerhard X. Ritter; Peter Sussner
The theory of artificial neural networks has been successfully applied to a wide variety of pattern recognition problems. In this paper we introduce a novel class of neural networks, called morphological neural networks, in which the operations of multiplication and addition are replaced by addition and maximum (or minimum), respectively. By taking the maximum (or minimum) of sums instead of the sum of products, morphological network computation is nonlinear before thresholding. As a consequence, the properties of morphological neural networks are drastically different from those of traditional neural network models. In this paper we consider some of these differences and examine the computing capabilities of morphological neural networks. As particular examples of a morphological neural network we discuss morphological associative memories and morphological perceptrons.
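A single morphological neuron with hard-limiting output, in the spirit of the perceptron models the abstract mentions, can be sketched as follows (the weights and threshold are invented for illustration):

```python
import numpy as np

def morph_neuron(x, w, kind="max"):
    # Morphological neuron: an additive max (or min) replaces the weighted sum.
    s = np.max(x + w) if kind == "max" else np.min(x + w)
    return 1 if s >= 0 else 0   # hard-limiting activation

# Hypothetical two-input example with made-up weights.
x = np.array([0.5, -1.0])
w = np.array([-0.2, 0.3])
print(morph_neuron(x, w))       # max(0.3, -0.7) = 0.3 >= 0, so the output is 1
```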
IEEE Transactions on Neural Networks | 2006
Peter Sussner; Marcos Eduardo Valle
Neural models of associative memories are usually concerned with the storage and the retrieval of binary or bipolar patterns. Thus far, the emphasis in research on morphological associative memory systems has been on binary models, although a number of notable features of autoassociative morphological memories (AMMs) such as optimal absolute storage capacity and one-step convergence have been shown to hold in the general, gray-scale setting. In this paper, we make extensive use of minimax algebra to analyze gray-scale autoassociative morphological memories. Specifically, we provide a complete characterization of the fixed points and basins of attraction, which allows us to describe the storage and recall mechanisms of gray-scale AMMs. Computer simulations using gray-scale images illustrate our rigorous mathematical results on the storage capacity and the noise tolerance of gray-scale morphological associative memories (MAMs). Finally, we introduce a modified gray-scale AMM model that yields a fixed point which is closest to the input pattern with respect to the Chebyshev distance and show how gray-scale AMMs can be used as classifiers.
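A small NumPy sketch (ours, not the paper's code) of the gray-scale autoassociative scheme: every stored pattern is a fixed point, recall settles in one step, and mildly eroded probes are corrected.

```python
import numpy as np

def amm_record(X):
    # Autoassociative recording: w_ij = min over stored patterns of (x_i - x_j).
    return np.min(X[:, None, :] - X[None, :, :], axis=2)

def amm_recall(W, x):
    # Max-of-sums recall: y_i = max_j (w_ij + x_j); one application reaches a fixed point.
    return np.max(W + x[None, :], axis=1)

# Hypothetical gray-scale patterns as the columns of X.
X = np.array([[3., 7.], [5., 2.], [9., 4.]])
W = amm_record(X)
probe = X[:, 0] - np.array([0., 1., 0.])   # erosive (negative) noise on one entry
y = amm_recall(W, probe)                   # recovers X[:, 0]; recalling y again leaves it unchanged
cheb = np.max(np.abs(y - X[:, 0]))         # Chebyshev distance to the stored pattern (0 here)
```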
IEEE Transactions on Fuzzy Systems | 2006
Peter Sussner; Marcos Eduardo Valle
Associative neural memories are models of biological phenomena that allow for the storage of pattern associations and the retrieval of the desired output pattern upon presentation of a possibly noisy or incomplete version of an input pattern. In this paper, we introduce implicative fuzzy associative memories (IFAMs), a class of associative neural memories based on fuzzy set theory. An IFAM consists of a network of completely interconnected Pedrycz logic neurons with threshold, whose connection weights are determined by the minimum of implications of presynaptic and postsynaptic activations. We present a series of results for autoassociative models, including one-pass convergence, unlimited storage capacity, and tolerance with respect to eroded patterns. Finally, we present some results on fixed points and discuss the relationship between implicative fuzzy associative memories and morphological associative memories.
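A minimal sketch of one member of the family, assuming the Lukasiewicz implication/conjunction pair (the function names and the toy patterns are ours; perfect recall holds only under the conditions given in the paper):

```python
import numpy as np

def luk_imp(a, b):    # Lukasiewicz implication: I(a, b) = min(1, 1 - a + b)
    return np.minimum(1.0, 1.0 - a + b)

def luk_conj(a, b):   # Lukasiewicz conjunction (t-norm): C(a, b) = max(0, a + b - 1)
    return np.maximum(0.0, a + b - 1.0)

def ifam_record(X, Y):
    # w_ij = min over stored pairs of I(x_j, y_i); threshold theta_i = min over pairs of y_i.
    W = np.min(luk_imp(X[None, :, :], Y[:, None, :]), axis=2)
    theta = np.min(Y, axis=1)
    return W, theta

def ifam_recall(W, theta, x):
    # y_i = max( max_j C(w_ij, x_j), theta_i ): max-C composition plus threshold.
    return np.maximum(np.max(luk_conj(W, x[None, :]), axis=1), theta)

# Hypothetical fuzzy patterns in [0, 1], one associated pair per column.
X = np.array([[0.2, 0.9], [0.8, 0.1]])
Y = np.array([[1.0, 0.3], [0.4, 0.7]])
W, theta = ifam_record(X, Y)
y = ifam_recall(W, theta, X[:, 0])   # recall from the first input pattern
```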
Neurocomputing | 2000
Peter Sussner
The ability of human beings to retrieve information on the basis of associated cues continues to elicit great interest among researchers. Investigations of how the brain is capable of making such associations from partial information have led to a variety of theoretical neural network models that act as associative memories. Several researchers have had significant success in retrieving complete stored patterns from noisy or incomplete input pattern keys by using morphological associative memories. Thus far, morphological associative memories have been employed in two different ways: a direct approach, which is suitable for input patterns containing either dilative or erosive noise, and an indirect one for arbitrarily corrupted input patterns, which is based on kernel vectors. In a recent paper (P. Sussner, in: Proceedings of the International ICSA/IFAC Symposium on Neural Computation, Vienna, September 1998), we suggested how to select these kernel vectors and we deduced exact statements on the amount of noise which is permissible for perfect recall. In this paper, we establish the proofs for all our claims made about the choice of kernel vectors and perfect recall in kernel method applications. Moreover, we provide arguments for the success of both approaches beyond the experimental results presented up to this point.
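In the notation of the companion morphological associative memory papers, with \(W\) and \(M\) the min- and max-recorded memories and \(\vee\) the max product, the two-stage kernel recall can be summarized as follows (a schematic only; the precise definition of the kernel \(Z\) and the recall conditions are established in the paper):

\[
\tilde{\mathbf{y}} \;=\; W_{ZY} \vee \bigl( M_{ZZ} \vee \tilde{\mathbf{x}} \bigr),
\]

where the inner step is intended to recover the kernel vector from the arbitrarily corrupted input \(\tilde{\mathbf{x}}\), and the outer step dilates the kernel vector to the complete output pattern.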
Information Sciences | 2011
Peter Sussner; Estevão Laureano Esmi
A morphological neural network is generally defined as a type of artificial neural network that performs an elementary operation of mathematical morphology at every node, possibly followed by the application of an activation function. The underlying framework of mathematical morphology can be found in lattice theory. With the advent of granular computing, lattice-based neurocomputing models such as morphological neural networks and fuzzy lattice neurocomputing models are becoming increasingly important since many information granules such as fuzzy sets and their extensions, intervals, and rough sets are lattice ordered. In this paper, we present the lattice-theoretical background and the learning algorithms for morphological perceptrons with competitive learning which arise by incorporating a winner-take-all output layer into the original morphological perceptron model. Several well-known classification problems that are available on the internet are used to compare our new model with a range of classifiers such as conventional multi-layer perceptrons, fuzzy lattice neurocomputing models, k-nearest neighbors, and decision trees.
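The toy sketch below is far simpler than the paper's training algorithm; it only illustrates the winner-take-all readout on top of morphological (additive max) nodes, with one node per class and all weights invented:

```python
import numpy as np

def morph_outputs(x, W):
    # One morphological node per class: s_c = max_j (x_j + W[c, j]).
    return np.max(W + x[None, :], axis=1)

def wta_classify(x, W, labels):
    # Winner-take-all output layer: the class with the largest node value wins.
    return labels[int(np.argmax(morph_outputs(x, W)))]

# Hypothetical 2-feature, 3-class example with made-up weights.
W = np.array([[0.0, -1.0], [-2.0, 0.5], [1.0, -3.0]])
print(wta_classify(np.array([0.3, 0.7]), W, ["setosa", "versicolor", "virginica"]))
```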
Fuzzy Sets and Systems | 2008
Marcos Eduardo Valle; Peter Sussner
Fuzzy associative memories (FAMs) can be used as a powerful tool for implementing fuzzy rule-based systems. The insight that FAMs are closely related to mathematical morphology (MM) has recently led to the development of new fuzzy morphological associative memories (FMAMs), in particular implicative fuzzy associative memories (IFAMs). As the name FMAM indicates, these models belong to the class of fuzzy morphological neural networks (FMNNs). Thus, each node of an FMAM performs an elementary operation of fuzzy MM. Clarifying several misconceptions about FMAMs that have recently appeared in the literature, we provide a general framework for FMAMs within the class of FMNN. We show that many well-known FAM models fit within this framework and can therefore be classified as FMAMs. Moreover, we employ certain concepts of duality that are defined in the general theory of MM in order to derive a large class of strategies for learning and recall in FMAMs.
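For orientation (our summary, not an excerpt from the paper), the two duality concepts at play can be written for fuzzy erosions and dilations with structuring element \(\mathbf{s}\) as

\[
\varepsilon(\mathbf{x})=\bigwedge_{j} I(s_j, x_j), \qquad
\delta(\mathbf{x})=\bigvee_{j} C(s_j, x_j),
\]

where \((C, I)\) either form an adjunction, \(C(a,b)\le c \iff b \le I(a,c)\), or are dual with respect to a fuzzy negation \(\nu\), \(I(a,b)=\nu\bigl(C(a,\nu(b))\bigr)\); each choice yields a different learning and recall strategy for an FMAM.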
Journal of Mathematical Imaging and Vision | 2003
Peter Sussner
Morphological neural networks (MNNs) are a class of artificial neural networks whose operations can be expressed in the mathematical theory of minimax algebra. In a morphological neural net, the usual sum of weighted inputs is replaced by a maximum or minimum of weighted inputs (in this context, the weighting is performed by summing the weight and the input). We speak of a max product or a min product, respectively. In recent years, a number of different MNN models and applications have emerged. The emphasis of this paper is on morphological associative memories (MAMs), in particular on binary autoassociative morphological memories (AMMs). We give a new set theoretic interpretation of recording and recall in binary AMMs and provide a generalization using fuzzy set theory.
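In minimax-algebra notation, the max and min products described in words above read

\[
(W \vee \mathbf{x})_i=\bigvee_{j=1}^{n}(w_{ij}+x_j), \qquad
(M \wedge \mathbf{x})_i=\bigwedge_{j=1}^{n}(m_{ij}+x_j),
\]

and autoassociative recording of patterns \(\mathbf{x}^{\xi}\) sets \(w_{ij}=\bigwedge_{\xi}\bigl(x_i^{\xi}-x_j^{\xi}\bigr)\) and \(m_{ij}=\bigvee_{\xi}\bigl(x_i^{\xi}-x_j^{\xi}\bigr)\).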
Journal of Mathematical Imaging and Vision | 2008
Peter Sussner; Marcos Eduardo Valle
Mathematical morphology was originally conceived as a set theoretic approach for the processing of binary images. Extensions of classical binary morphology to gray-scale morphology include approaches based on fuzzy set theory. This paper discusses and compares several well-known and new approaches towards gray-scale and fuzzy mathematical morphology. We show in particular that a certain approach to fuzzy mathematical morphology ultimately depends on the choice of a fuzzy inclusion measure and on a notion of duality. This fact gives rise to a clearly defined scheme for classifying fuzzy mathematical morphologies. The umbra and the level set approach, an extension of the threshold approach to gray-scale mathematical morphology, can also be embedded in this scheme since they can be identified with certain fuzzy approaches.
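As one instance of the classification scheme (our choice of formula, not the only one the paper considers), taking the inclusion measure \(\operatorname{Inc}(A,B)=\inf_u I\bigl(A(u),B(u)\bigr)\) induced by a fuzzy implication \(I\) yields the fuzzy erosion

\[
\bigl(\varepsilon_S(A)\bigr)(x)\;=\;\operatorname{Inc}\bigl(S_x, A\bigr)\;=\;\inf_{u} I\bigl(S(u-x),\,A(u)\bigr),
\]

so a different implication, or a different notion of duality for the corresponding dilation, produces a different fuzzy mathematical morphology.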