Publication


Featured research published by Gerhard X. Ritter.


Book | 1996

Handbook of Computer Vision Algorithms in Image Algebra

Gerhard X. Ritter; Joseph N. Wilson

From the Publisher: Handbook of Computer Vision Algorithms in Image Algebra provides engineers, scientists, and students with an introduction to image algebra and presents detailed descriptions of over 80 fundamental computer vision techniques. The book also introduces the portable iac++ library, which supports image algebra programming in the C++ language.


IEEE Transactions on Neural Networks | 1998

Morphological associative memories

Gerhard X. Ritter; Peter Sussner; J. L. Díaz-de-León

The theory of artificial neural networks has been successfully applied to a wide variety of pattern recognition problems. In this theory, the first step in computing the next state of a neuron or in performing the next layer neural network computation involves the linear operation of multiplying neural values by their synaptic strengths and adding the results. A nonlinear activation function usually follows the linear operation in order to provide for nonlinearity of the network and set the next state of the neuron. In this paper we introduce a novel class of artificial neural networks, called morphological neural networks, in which the operations of multiplication and addition are replaced by addition and maximum (or minimum), respectively. By taking the maximum (or minimum) of sums instead of the sum of products, morphological network computation is nonlinear before possible application of a nonlinear activation function. As a consequence, the properties of morphological neural networks are drastically different than those of traditional neural network models. The main emphasis of the research presented here is on morphological associative memories. We examine the computing and storage capabilities of morphological associative memories and discuss differences between morphological models and traditional semilinear models such as the Hopfield net.
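
As a concrete reading of the max-of-sums computation described above, the following NumPy sketch builds the two autoassociative memories W_XX and M_XX for a pair of toy patterns and checks that every stored pattern is recalled exactly. The function names and the example patterns are illustrative choices of mine, not notation from the paper.

```python
import numpy as np

def max_plus(A, x):
    # Morphological max-plus product: result_i = max_j (A[i, j] + x[j]).
    return (A + x[None, :]).max(axis=1)

def min_plus(A, x):
    # Morphological min-plus product: result_i = min_j (A[i, j] + x[j]).
    return (A + x[None, :]).min(axis=1)

def autoassociative_memories(X):
    # X holds k stored patterns as rows.  W[i, j] is the minimum of
    # x_i - x_j over all stored patterns; M[i, j] is the maximum.
    D = X[:, :, None] - X[:, None, :]      # D[xi, i, j] = x_i^xi - x_j^xi
    return D.min(axis=0), D.max(axis=0)    # W_XX, M_XX

X = np.array([[1., 0., 0., 1.],
              [0., 1., 0., 1.]])
W, M = autoassociative_memories(X)

# For these toy patterns, every stored pattern is a fixed point of both
# memories: W under max-plus recall and M under min-plus recall.
for x in X:
    assert np.allclose(max_plus(W, x), x)
    assert np.allclose(min_plus(M, x), x)
```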


Computer Vision, Graphics, and Image Processing | 1990

Image algebra: an overview

Gerhard X. Ritter; Joseph N. Wilson; J. L. Davidson

This paper is the first in a sequence of papers describing an algebraic structure for image processing that has become known as the AFATL Standard Image Algebra. This algebra provides a common mathematical environment for image processing algorithm development and methodologies for algorithm optimization, comparison, and performance evaluation. In addition, the image algebra provides a powerful algebraic language for image processing which, if properly embedded into a high level programming language, will greatly increase a programmer's productivity, as programming tasks are greatly simplified by the replacement of large blocks of code with short algebraic statements. The purpose of this paper is to familiarize the reader with the basic concepts of the algebra and to provide a general overview of its methodology.
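
To convey the flavor of the "short algebraic statements" mentioned above, here is a small Python sketch of two image-template products in the spirit of image algebra: a linear product, which for a translation-invariant template amounts to an ordinary correlation, and an additive-maximum product, which amounts to a gray-scale dilation. The function names, padding convention, and templates are assumptions made for illustration rather than the AFATL notation itself.

```python
import numpy as np

def linear_product(image, template):
    # Linear image-template product with a translation-invariant template:
    # each output pixel is the sum of neighboring pixels weighted by the
    # template, i.e. an ordinary correlation.
    h, w = template.shape
    padded = np.pad(image, ((h // 2,), (w // 2,)), mode="constant")
    out = np.empty_like(image, dtype=float)
    for i in range(image.shape[0]):
        for j in range(image.shape[1]):
            out[i, j] = np.sum(padded[i:i + h, j:j + w] * template)
    return out

def additive_max_product(image, template):
    # Lattice (additive-maximum) image-template product: each output pixel
    # is the maximum of neighboring pixels plus the template weights,
    # i.e. a gray-scale dilation.
    h, w = template.shape
    padded = np.pad(image, ((h // 2,), (w // 2,)), mode="constant",
                    constant_values=-np.inf)
    out = np.empty_like(image, dtype=float)
    for i in range(image.shape[0]):
        for j in range(image.shape[1]):
            out[i, j] = np.max(padded[i:i + h, j:j + w] + template)
    return out

# One statement per operation: a 3x3 mean filter and a flat 3x3 dilation.
a = np.random.rand(8, 8)
smoothed = linear_product(a, np.full((3, 3), 1.0 / 9.0))
dilated = additive_max_product(a, np.zeros((3, 3)))
```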


Journal of Parallel and Distributed Computing | 1987

Image algebra techniques for parallel image processing

Gerhard X. Ritter; Paul D. Gader

This paper applies image algebra to the specification and implementation of image processing algorithms on parallel architectures.


Neural Networks | 1999

Morphological bidirectional associative memories

Gerhard X. Ritter; Juan Luis Díaz-de-León; Peter Sussner

The theory of artificial neural networks has been successfully applied to a wide variety of pattern recognition problems. In this theory, the first step in computing the next state of a neuron or in performing the next layer neural network computation involves the linear operation of multiplying neural values by their synaptic strengths and adding the results. Thresholding usually follows the linear operation in order to provide for nonlinearity of the network. In this paper we discuss a novel class of artificial neural networks, called morphological neural networks, in which the operations of multiplication and addition are replaced by addition and maximum (or minimum), respectively. By taking the maximum (or minimum) of sums instead of the sum of products, morphological network computation is nonlinear before thresholding. As a consequence, the properties of morphological neural networks are drastically different from those of traditional neural network models. The main emphasis of the research presented here is on morphological bidirectional associative memories (MBAMs). In particular, we establish a mathematical theory for MBAMs and provide conditions that guarantee perfect bidirectional recall for corrupted patterns. Some examples that illustrate performance differences between the morphological model and the traditional semilinear model are also given.
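
The following NumPy sketch illustrates the bidirectional idea on a toy set of pattern pairs: a forward memory W_XY recalls y from x with a max-plus product, and a backward memory W_YX recalls x from y. The pairs shown here happen to be recalled perfectly in both directions; the paper establishes the general conditions under which this is guaranteed. Names and data are mine.

```python
import numpy as np

def max_plus(A, x):
    # Morphological max-plus product: result_i = max_j (A[i, j] + x[j]).
    return (A + x[None, :]).max(axis=1)

def mbam_weights(X, Y):
    # Forward memory W_XY[i, j] = min over stored pairs of (y_i - x_j);
    # backward memory W_YX[j, i] = min over stored pairs of (x_j - y_i).
    W_xy = (Y[:, :, None] - X[:, None, :]).min(axis=0)
    W_yx = (X[:, :, None] - Y[:, None, :]).min(axis=0)
    return W_xy, W_yx

X = np.array([[0., 1.], [1., 0.]])
Y = np.array([[1., 0., 1.], [0., 1., 1.]])
W_xy, W_yx = mbam_weights(X, Y)

# For this toy pair set, one forward and one backward pass reproduce the
# stored associations exactly.
for x, y in zip(X, Y):
    assert np.allclose(max_plus(W_xy, x), y)
    assert np.allclose(max_plus(W_yx, y), x)
```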


International Conference on Pattern Recognition | 1996

An introduction to morphological neural networks

Gerhard X. Ritter; Peter Sussner

The theory of artificial neural networks has been successfully applied to a wide variety of pattern recognition problems. In this paper we introduce a novel class of neural networks, called morphological neural networks, in which the operations of multiplication and addition are replaced by addition and maximum (or minimum), respectively. By taking the maximum (or minimum) of sums instead of the sum of products, morphological network computation is nonlinear before thresholding. As a consequence, the properties of morphological neural networks are drastically different than those of traditional neural network models. In this paper we consider some of these differences and examine the computing capabilities of morphological neural networks. As particular examples of a morphological neural network we discuss morphological associative memories and morphological perceptrons.
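
As a minimal sketch of the morphological perceptron mentioned above, the neuron below replaces the weighted sum by a maximum of sums followed by a hard limiter; the weights and inputs are toy values chosen for illustration.

```python
import numpy as np

def morphological_perceptron(x, w, use_max=True):
    # Net input is max_j (x_j + w_j) (or min_j), not a weighted sum,
    # followed by a hard limiter.
    s = np.max(x + w) if use_max else np.min(x + w)
    return 1 if s >= 0 else 0

# With w = (-1, -1) this neuron fires exactly when x1 >= 1 or x2 >= 1,
# a decision region bounded by axis-parallel hyperplanes.
w = np.array([-1.0, -1.0])
print(morphological_perceptron(np.array([1.5, 0.2]), w))  # 1
print(morphological_perceptron(np.array([0.3, 0.4]), w))  # 0
```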


IEEE Transactions on Neural Networks | 2003

Lattice algebra approach to single-neuron computation

Gerhard X. Ritter; Gonzalo Urcid

Recent advances in the biophysics of computation and neurocomputing models have brought to the foreground the importance of dendritic structures in a single neuron cell. Dendritic structures are now viewed as the primary autonomous computational units capable of realizing logical operations. By changing the classic simplified model of a single neuron with a more realistic one that incorporates the dendritic processes, a novel paradigm in artificial neural networks is being established. In this work, we introduce and develop a mathematical model of dendrite computation in a morphological neuron based on lattice algebra. The computational capabilities of this enriched neuron model are demonstrated by means of several illustrative examples and by proving that any single layer morphological perceptron endowed with dendrites and their corresponding input and output synaptic processes is able to approximate any compact region in higher dimensional Euclidean space to within any desired degree of accuracy. Based on this result, we describe a training algorithm for single layer morphological perceptrons and apply it to some well-known nonlinear problems in order to exhibit its performance.
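
A hedged sketch of the dendritic computation described above, written in its hyperbox form: each dendrite responds non-negatively exactly when the input lies in an axis-parallel box, and the neuron fires when the input falls in the union of its dendrites' boxes, which is how unions of hyperboxes can approximate a compact region. Parameterizing dendrites directly by box bounds, rather than by the paper's excitatory and inhibitory weight pairs, is my simplification.

```python
import numpy as np

def dendrite_response(x, low, high):
    # One dendrite in hyperbox form: the response is >= 0 exactly when
    # low <= x <= high componentwise.
    return np.minimum(x - low, high - x).min()

def neuron_output(x, boxes):
    # Single lattice-based neuron: maximum over its dendrites, then a hard
    # limiter.  The neuron fires iff x lies in the union of the boxes.
    value = max(dendrite_response(x, low, high) for low, high in boxes)
    return 1 if value >= 0 else 0

# Two dendrites approximating an L-shaped region in the plane by two boxes.
boxes = [(np.array([0.0, 0.0]), np.array([1.0, 3.0])),
         (np.array([0.0, 0.0]), np.array([3.0, 1.0]))]

print(neuron_output(np.array([0.5, 2.0]), boxes))  # 1: inside the vertical arm
print(neuron_output(np.array([2.0, 2.0]), boxes))  # 0: outside the L-shape
```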


Journal of Mathematical Imaging and Vision | 2003

Reconstruction of Patterns from Noisy Inputs Using Morphological Associative Memories

Gerhard X. Ritter; Gonzalo Urcid; Laurentiu Iancu

Morphological neural networks are based on a new paradigm for neural computing. Instead of adding the products of neural values and corresponding synaptic weights, the basic neural computation in a morphological neuron takes the maximum or minimum of the sums of neural values and their corresponding synaptic weights. By taking the maximum (or minimum) of sums instead of the sum of products, morphological neuron computation is nonlinear before thresholding. As a consequence, the properties of morphological neural networks are drastically different from those of traditional neural network models. In this paper we restrict our attention to morphological associative memories. After a brief review of morphological neural computing and a short discussion about the properties of morphological associative memories, we present new methodologies and associated theorems for retrieving complete stored patterns from noisy or incomplete patterns using morphological associative memories. These methodologies are derived from the notions of morphological independence, strong independence, minimal representations of pattern vectors, and kernels. Several examples are provided in order to illuminate these novel concepts.
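
The toy NumPy run below illustrates the complementary noise behavior that motivates the paper: the memory W_XX can restore a pattern corrupted by erosive noise (entries pushed down), while M_XX can restore one corrupted by dilative noise (entries pushed up). The patterns and corruptions are my own small example, recovery is specific to this example, and the kernel-based methods the paper develops for mixed noise are not shown.

```python
import numpy as np

def max_plus(A, x):
    return (A + x[None, :]).max(axis=1)

def min_plus(A, x):
    return (A + x[None, :]).min(axis=1)

X = np.array([[1., 0., 0., 1.],
              [0., 1., 0., 1.]])
D = X[:, :, None] - X[:, None, :]
W, M = D.min(axis=0), D.max(axis=0)   # autoassociative W_XX and M_XX

# Erosive noise (entries pushed down): W_XX restores the pattern here.
eroded = np.array([1., 0., 0., 0.])   # first pattern with its last entry erased
print(max_plus(W, eroded))            # -> [1. 0. 0. 1.] in this run

# Dilative noise (entries pushed up): M_XX restores the pattern here.
dilated = np.array([0., 1., 1., 1.])  # second pattern with its third entry raised
print(min_plus(M, dilated))           # -> [0. 1. 0. 1.] in this run
```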


OE/LASE '90, 14-19 Jan., Los Angeles, CA | 1990

Theory of morphological neural networks

Jennifer L. Davidson; Gerhard X. Ritter

The theory of classical artificial neural networks has been used to solve pattern recognition problems in image processing in ways that differ from traditional pattern recognition approaches. In standard neural network theory, the first step in performing a neural network calculation involves the linear operation of multiplying neural values by their synaptic strengths and adding the results. Thresholding usually follows the linear operation in order to provide for non-linearity of the network. This paper presents the fundamental theory for a morphological neural network which, instead of multiplication and summation, uses the non-linear operations of addition and maximum. Several basic applications which are distinctly different from pattern recognition techniques are given, including a net which performs a sieving algorithm.


Book | 2007

Computational Intelligence Based on Lattice Theory

Vassilis G. Kaburlasos; Gerhard X. Ritter

The emergence of lattice theory within the field of computational intelligence (CI) is partially due to its proven effectiveness in neural computation. Moreover, lattice theory has the potential to unify a number of diverse concepts and aid in the cross-fertilization of both tools and ideas within the numerous subfields of CI. The compilation of this eighteen-chapter book is an initiative towards proliferating established knowledge in the hope of further expanding it. This edited book is a balanced synthesis of four parts emphasizing, in turn, neural computation, mathematical morphology, machine learning, and (fuzzy) inference/logic. The articles here demonstrate how lattice theory may suggest viable alternatives in practical clustering, classification, pattern analysis, and regression applications.

Collaboration


Dive into Gerhard X. Ritter's collaborations.

Top Co-Authors

Peter Sussner

State University of Campinas

Frank M. Caimi

Florida Atlantic University
