Publication


Featured research published by Michael A. Lehr.


Proceedings of the IEEE | 1990

30 years of adaptive neural networks: perceptron, Madaline, and backpropagation

Bernard Widrow; Michael A. Lehr

Fundamental developments in feedforward artificial neural networks from the past thirty years are reviewed. The history, origination, operating characteristics, and basic theory of several supervised neural-network training algorithms (including the perceptron rule, the least-mean-square algorithm, three Madaline rules, and the backpropagation technique) are described. The concept underlying these iterative adaptation algorithms is the minimal disturbance principle, which suggests that during training it is advisable to inject new information into a network in a manner that disturbs stored information to the smallest extent possible. The two principal kinds of online rules that have been developed for altering the weights of a network are examined for both single-threshold elements and multielement networks: error-correction rules, which alter the weights to correct the error in the output response to the present input pattern, and gradient rules, which alter the weights during each pattern presentation by gradient descent with the objective of reducing mean-square error averaged over all training patterns.
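
As a concrete illustration of the minimal disturbance principle described above, the NumPy sketch below implements the alpha-LMS update (the toy data, the constant alpha, and the function name are illustrative assumptions, not the authors' code). Each step removes a fraction alpha of the present error using the minimum-norm weight change, so responses stored for earlier patterns are disturbed as little as possible.

import numpy as np

def alpha_lms(X, d, alpha=0.1, epochs=20):
    # alpha-LMS: for each pattern, correct a fraction alpha of the
    # present error with the minimum-norm weight change (minimal
    # disturbance to responses learned for earlier patterns).
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for x, target in zip(X, d):
            err = target - w @ x            # error on the present pattern
            w += alpha * err * x / (x @ x)  # minimal-disturbance update
    return w

# toy usage (assumed data): learn d = 2*x0 - x1
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 2))
d = 2 * X[:, 0] - X[:, 1]
print(alpha_lms(X, d))  # converges near [2, -1]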


Communications of the ACM | 1994

Neural networks: applications in industry, business and science

Bernard Widrow; David E. Rumelhart; Michael A. Lehr

Just four years ago, the only widely reported commercial application of neural network technology outside the financial industry was the airport baggage explosive detection system [27] developed at Science Applications International Corporation (SAIC). Since that time, scores of industrial and commercial applications have come into use, though the details of most of these systems are shrouded in corporate secrecy. This accelerating trend is due in part to the availability of an increasingly wide array of dedicated neural network hardware, either in the form of accelerator cards for PCs and workstations or of integrated circuits implementing digital and analog neural networks, many of which are currently available or in the final stages of design.


Communications of the ACM | 1994

The basic ideas in neural networks

David E. Rumelhart; Bernard Widrow; Michael A. Lehr

Interest in the study of neural networks has grown remarkably in the last several years. This effort has been characterized in a variety of ways: as the study of brain-style computation, connectionist architectures, parallel distributed-processing systems, neuromorphic computation, and artificial neural systems. The common theme of these efforts has been an interest in looking at the brain as a model of a parallel computational device very different from a traditional serial computer.


International Journal of Intelligent Systems | 1993

Adaptive neural networks and their applications

Bernard Widrow; Michael A. Lehr

Fundamental developments in feedforward artificial neural networks from the past 30 years are reviewed. The central theme of this article is a description of the history, origination, operating characteristics, and basic theory of several supervised neural network training algorithms, including the Perceptron rule, the LMS algorithm, three Madaline rules, and the backpropagation technique. These methods were developed independently, but with the perspective of history they can all be related to each other. The concept which underlies these algorithms is the “minimal disturbance principle,” which suggests that during training it is advisable to inject new information into a network in a manner which disturbs stored information to the smallest extent possible. In the utilization of present-day rule-based expert systems, decision rules must always be known for the application of interest. Sometimes, however, there are no rules: they are either not explicit or they simply do not exist. For such applications, trainable expert systems might be usable. Rather than working with decision rules, an adaptive expert system might observe the decisions made by a human expert. Looking over the expert's shoulder, an adaptive system can learn to make decisions similar to those of the human. Trainable expert systems have been used in the laboratory for real-time control of a “broom-balancing system.”
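
To make the Perceptron error-correction rule named above concrete, here is a minimal sketch (the toy AND task, the function name, and the training loop are illustrative assumptions, not taken from the article). Unlike a gradient rule, it changes the weights only when the thresholded output misclassifies the present pattern.

import numpy as np

def perceptron_train(X, t, epochs=50):
    # Perceptron rule: adjust weights only when the hard-threshold
    # output misclassifies the present input pattern.
    w = np.zeros(X.shape[1] + 1)               # weights plus bias
    Xb = np.hstack([X, np.ones((len(X), 1))])  # append constant bias input
    for _ in range(epochs):
        for x, target in zip(Xb, t):
            y = 1 if w @ x > 0 else 0          # threshold element output
            w += (target - y) * x              # error-correction step
    return w

# toy usage: a linearly separable AND problem
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
t = np.array([0, 0, 0, 1])
print(perceptron_train(X, t))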


International Symposium on Neural Networks | 1993

Learning algorithms for adaptive processing and control

Bernard Widrow; Michael A. Lehr; Françoise Beaufays; Eric A. Wan; Michel Bilello

Linear and nonlinear adaptive filtering algorithms are described, along with applications to signal processing and control problems. Specific topics addressed include adaptive least mean square (LMS) filtering, adaptive filtering with discrete cosine transform LMS (DCT/LMS), adaptive noise cancelling, fetal electrocardiography, adaptive echo cancelling, inverse plant modeling, adaptive inverse control, adaptive equalization, adaptive linear prediction, and nonlinear filtering and prediction.
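
As a hedged sketch of the adaptive noise cancelling structure listed above (the filter length, step size, and synthetic signals are assumptions for illustration): an LMS filter learns to predict the interference from a correlated reference input, and the subtraction error becomes the cleaned output.

import numpy as np

rng = np.random.default_rng(1)
n = 5000
signal = np.sin(2 * np.pi * 0.01 * np.arange(n))  # desired signal
noise = rng.standard_normal(n)                    # interference source

# interference reaches the primary sensor through an assumed 2-tap path
interference = 0.8 * noise
interference[1:] -= 0.4 * noise[:-1]
primary = signal + interference                   # corrupted measurement
reference = noise                                 # correlated noise reference

L, mu = 8, 0.01                                   # assumed length / step size
w = np.zeros(L)
cleaned = np.zeros(n)
for k in range(L - 1, n):
    x = reference[k - L + 1:k + 1][::-1]          # current and past taps
    cleaned[k] = primary[k] - w @ x               # subtract noise estimate
    w += mu * cleaned[k] * x                      # LMS minimizes output power

print(np.mean((cleaned[-1000:] - signal[-1000:]) ** 2))  # small residual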


Neurobionics: An Interdisciplinary Approach to Substitute Impaired Functions of the Human Nervous System | 1993

Artificial neural networks of the perceptron, Madaline, and backpropagation family

Bernard Widrow; Michael A. Lehr

Fundamental developments in feedforward artificial neural networks from the past thirty years are reviewed. The central theme of this paper is a description of the history, origination, operating characteristics, and basic theory of several supervised neural network training algorithms, including the Perceptron rule, the LMS algorithm, three Madaline rules, and the backpropagation technique. These methods were developed independently, but with the perspective of history they can all be related to each other. The concept which underlies these algorithms is the “minimal disturbance principle,” which suggests that during training it is advisable to inject new information into a network in a manner which disturbs stored information to the smallest extent possible. Learning algorithms used in artificial neural networks are probably not representative of learning processes in living neural systems. However, study of these algorithms may give neurobiologists and psychologists some clues about what to look for when studying cognition, pattern classification, and locomotion.
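
For readers who want the backpropagation technique in executable form, a minimal one-hidden-layer sketch follows (the XOR task, network size, learning rate, and iteration count are assumptions, not from the chapter). The backward pass propagates the output error toward the input, layer by layer, to obtain gradient-descent weight updates.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# assumed toy task: XOR, which a single-layer net cannot solve
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(2)
W1 = rng.standard_normal((2, 4))    # input -> hidden weights
b1 = np.zeros(4)
W2 = rng.standard_normal((4, 1))    # hidden -> output weights
b2 = np.zeros(1)
lr = 0.5                            # assumed learning rate

for _ in range(5000):
    h = sigmoid(X @ W1 + b1)        # forward pass, hidden layer
    y = sigmoid(h @ W2 + b2)        # forward pass, output layer
    dy = (y - T) * y * (1 - y)      # output delta (squared-error loss)
    dh = (dy @ W2.T) * h * (1 - h)  # error propagated back to hidden layer
    W2 -= lr * h.T @ dy             # gradient-descent updates
    b2 -= lr * dy.sum(axis=0)
    W1 -= lr * X.T @ dh
    b1 -= lr * dh.sum(axis=0)

print(y.round(2))  # typically approaches [[0], [1], [1], [0]]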


Proceedings of SPIE | 1993

Stanford neural network research

Bernard Widrow; Michael A. Lehr; Françoise Beaufays; Eric A. Wan; Michel Bilello

Linear and nonlinear adaptive filtering algorithms are described, along with applications to signal processing and control problems such as prediction, modeling, inverse modeling, equalization, echo cancelling, noise cancelling, and inverse control.


Journal of the Acoustical Society of America | 1999

Directional hearing system

Michael A. Lehr; Bernard Widrow


The Handbook of Brain Theory and Neural Networks | 1998

Perceptrons, Adalines, and backpropagation

Bernard Widrow; Michael A. Lehr


The Handbook of Brain Theory and Neural Networks | 1998

Noise canceling and channel equalization

Bernard Widrow; Michael A. Lehr
