
Publications


Featured research published by Eugene Walach.


IEEE Transactions on Information Theory | 1984

The least mean fourth (LMF) adaptive algorithm and its family

Eugene Walach; Bernard Widrow

New steepest descent algorithms for adaptive filtering have been devised that allow error minimization in the mean fourth, mean sixth, etc., sense. During adaptation, the weights undergo exponential relaxation toward their optimal solutions. Time constants have been derived, and surprisingly they turn out to be proportional to the time constants that would have been obtained if the steepest descent least mean square (LMS) algorithm of Widrow and Hoff had been used. The new gradient algorithms are insignificantly more complicated to program and to compute than the LMS algorithm. Their general form is \(W_{j+1} = W_j + 2\mu K \epsilon_j^{2K-1} X_j\), where \(W_j\) is the present weight vector, \(W_{j+1}\) is the next weight vector, \(\epsilon_j\) is the present error, \(X_j\) is the present input vector, \(\mu\) is a constant controlling stability and rate of convergence, and \(2K\) is the exponent of the error being minimized. Conditions have been derived for convergence of the mean and of the variance of the weight vector for the new gradient algorithms. The behavior of the least mean fourth (LMF) algorithm is of special interest. When this algorithm and the LMS algorithm are set to have exactly the same time constants for the weight relaxation process, the LMF algorithm will, under some circumstances, have substantially lower weight noise than the LMS algorithm. It is possible, therefore, that a minimum mean fourth error algorithm can do a better job of least squares estimation than a mean square error algorithm. This intriguing concept has implications for all forms of adaptive algorithms, whether they are based on steepest descent or otherwise.
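The update rule in the abstract can be sketched as a small adaptive FIR filter. This is a minimal illustration, not the authors' code; the 3-tap plant, signal lengths, noise level, and step size below are assumptions chosen only to show the algorithm converging.

```python
import numpy as np

def lm2k_filter(x, d, n_weights, mu, K):
    """Steepest-descent filter minimizing the mean 2K-th power of the error.

    Implements W_{j+1} = W_j + 2*mu*K * eps_j**(2K-1) * X_j.
    K = 1 gives the familiar LMS algorithm; K = 2 gives LMF.
    """
    w = np.zeros(n_weights)
    for j in range(n_weights - 1, len(x)):
        X = x[j - n_weights + 1:j + 1][::-1]       # present input vector X_j
        eps = d[j] - w @ X                          # present error eps_j
        w = w + 2 * mu * K * eps ** (2 * K - 1) * X
    return w

# Illustrative use: identify an unknown 3-tap plant from noisy observations.
rng = np.random.default_rng(0)
x = rng.standard_normal(20000)
plant = np.array([0.5, -0.3, 0.2])
d = np.convolve(x, plant)[:len(x)] + 0.01 * rng.standard_normal(len(x))
w_lmf = lm2k_filter(x, d, 3, mu=0.005, K=2)  # LMF: minimizes mean fourth error
```

Note how the error enters the update raised to the power \(2K-1\): for LMF the correction is cubic in the error, so large errors adapt aggressively while small residual errors contribute little weight noise.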


IEEE Transactions on Information Theory | 1984

On the statistical efficiency of the LMS algorithm with nonstationary inputs

Bernard Widrow; Eugene Walach

A fundamental relationship exists between the quality of an adaptive solution and the amount of data used in obtaining it. Quality is defined here in terms of misadjustment, the ratio of the excess mean square error (mse) in an adaptive solution to the minimum possible mse. The higher the misadjustment, the lower the quality. The quality of the exact least squares solution is compared with the quality of the solutions obtained by the orthogonalized and the conventional least mean square (LMS) algorithms with stationary and nonstationary input data. When adapting with noisy observations, a filter trained with a finite data sample using an exact least squares algorithm will have a misadjustment given by \(M = \frac{n}{N} = \frac{\text{number of weights}}{\text{number of training samples}}\). If the same adaptive filter were trained with a steady flow of data using an ideal orthogonalized LMS algorithm, the misadjustment would be \(M = \frac{n}{4\tau_{\text{mse}}}\). Thus, for a given time constant \(\tau_{\text{mse}}\) of the learning process, the ideal orthogonalized LMS algorithm will have about as low a misadjustment as can be achieved, since this algorithm performs essentially as an exact least squares algorithm with exponential data weighting. It is well known that when rapid convergence with stationary data is required, exact least squares algorithms can in certain cases outperform the conventional Widrow-Hoff LMS algorithm. It is shown here, however, that for an important class of nonstationary problems, the misadjustment of conventional LMS is the same as that of orthogonalized LMS, which in the stationary case is shown to perform essentially as an exact least squares algorithm.
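The two misadjustment formulas above are simple enough to check numerically. The weight count, sample count, and time constant below are illustrative assumptions, not values from the paper; they are chosen so that \(N = 4\tau_{\text{mse}}\), which makes the two expressions agree.

```python
# Misadjustment of an exact least squares solution trained on N samples
# versus the ideal orthogonalized LMS bound for time constant tau_mse.

def misadjustment_exact_ls(n_weights, n_samples):
    # M = n / N
    return n_weights / n_samples

def misadjustment_orthogonal_lms(n_weights, tau_mse):
    # M = n / (4 * tau_mse)
    return n_weights / (4 * tau_mse)

n = 16      # number of adaptive weights (illustrative)
N = 640     # number of training samples (illustrative)
tau = 160   # mse learning time constant in samples, so that N = 4 * tau

M_ls = misadjustment_exact_ls(n, N)            # 0.025, i.e. 2.5% excess mse
M_lms = misadjustment_orthogonal_lms(n, tau)   # 0.025, same quality
```

With \(N = 4\tau_{\text{mse}}\) the exponentially weighted adaptive filter achieves the same misadjustment as the block least squares fit, which is the sense in which ideal orthogonalized LMS "performs essentially as an exact least squares algorithm."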


International Conference on Acoustics, Speech, and Signal Processing | 1984

Adaptive signal processing for adaptive control

Bernard Widrow; Eugene Walach

A few of the well-established methods of adaptive signal processing are modified and extended for application to adaptive control. An unknown plant will track an input command signal if the plant is preceded by a controller whose transfer function approximates the inverse of the plant transfer function. An adaptive inverse modeling process can be used to obtain a stable controller, whether the plant is minimum or non-minimum phase. No direct feedback is involved. However, the system output is monitored and utilized in order to adjust the parameters of the controller. The proposed method is a promising new approach to the design of adaptive control systems.
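The adaptive inverse modeling idea can be sketched in a few lines: an LMS filter driven by the plant output is trained to reproduce a delayed copy of the plant input, and the result is a delayed inverse that could then precede the plant as a feedforward controller. This is an assumed minimal setup, not the authors' simulations; the 2-tap minimum-phase plant, filter length, delay, and step size are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
plant = np.array([1.0, 0.5])      # simple minimum-phase FIR plant (illustrative)
n_inv, delay, mu = 8, 2, 0.01     # inverse filter length, modeling delay, LMS step

u = rng.standard_normal(20000)            # excitation applied to the plant
y = np.convolve(u, plant)[:len(u)]        # plant output

w = np.zeros(n_inv)                       # adaptive inverse model
for j in range(n_inv - 1, len(u)):
    X = y[j - n_inv + 1:j + 1][::-1]      # recent plant outputs
    eps = u[j - delay] - w @ X            # error vs. delayed plant input
    w += 2 * mu * eps * X                 # LMS update

# Cascading the learned inverse with the plant should approximate a pure delay:
cascade = np.convolve(w, plant)           # ~ unit impulse at index `delay`
```

The modeling delay is what makes the scheme workable for non-minimum-phase plants as well: there the exact inverse is noncausal, but a delayed approximate inverse can still be realized causally.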


IFAC Proceedings Volumes | 1983

Adaptive Signal Processing for Adaptive Control

Bernard Widrow; Eugene Walach

A few of the well-established methods of adaptive signal processing theory are modified and extended in order to address some of the basic issues of adaptive control. An unknown plant will track an input command signal if the plant is preceded by a controller whose transfer function approximates the inverse of the plant transfer function. An adaptive inverse modeling process can be used to obtain a stable controller, whether the plant is minimum or non-minimum phase. A model-reference version of this idea allows system dynamics to closely approximate desired reference model dynamics. No direct feedback is involved. However, the system output is monitored and utilized in order to adjust the parameters of the controller. The proposed method performs very well in computer simulations of a wide range of stable plants, and it seems to be a promising alternative approach to the design of adaptive control systems.


Archive | 2008

Appendix C: A Comparison of the Self-Tuning Regulator of Åström and Wittenmark with the Techniques of Adaptive Inverse Control

Bernard Widrow; Eugene Walach

This chapter contains sections titled: Designing a Self-Tuning Regulator to Behave Like an Adaptive Inverse Control System; Some Examples; Summary; Bibliography.


Archive | 2014

Adaptive Inverse Control: A Signal Processing Approach

Bernard Widrow; Eugene Walach


Archive | 2008

Appendix A: Stability and Misadjustment of the LMS Adaptive Filter

Bernard Widrow; Eugene Walach


Archive | 2008

Inverse Plant Modeling

Bernard Widrow; Eugene Walach


Archive | 2008

Appendix G: Thirty Years of Adaptive Neural Networks: Perceptron, Madaline, and Backpropagation

Bernard Widrow; Eugene Walach


Archive | 2008

Adaptive LMS Filters

Bernard Widrow; Eugene Walach
