Edward M. Corwin
South Dakota School of Mines and Technology
Publications
Featured research published by Edward M. Corwin.
Applied Soft Computing | 2011
Enkhsaikhan Boldsaikhan; Edward M. Corwin; Antonette M. Logar; William J. Arbegast
This paper introduces a novel real-time approach to detecting wormhole defects in friction stir welding in a nondestructive manner. The approach is to evaluate feedback forces provided by the welding process using the discrete Fourier transform and a multilayer neural network. It is asserted here that the oscillations of the feedback forces are related to the dynamics of the plasticized material flow, so that the frequency spectra of the feedback forces can be used for detecting wormhole defects. A one-hidden-layer neural network trained with the backpropagation algorithm is used to classify the frequency patterns of the feedback forces. The neural network is trained and optimized with a data set of forge-load control welds, and its generality is tested with a novel data set of position control welds. Overall, about 95% classification accuracy is achieved, with no bad welds classified as good. Accordingly, the present paper demonstrates an approach for providing important feedback information about weld quality in real time to a control system for friction stir welding.
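A minimal sketch of the pipeline shape described above (not the authors' code; the window length, hidden-layer size, learning rate, and label convention are illustrative assumptions): take the magnitude spectrum of a window of feedback-force samples with the discrete Fourier transform and classify it with a one-hidden-layer network trained by backpropagation.

    # Illustrative sketch only, not the published implementation.
    import numpy as np

    def force_spectrum(force_window):
        # Magnitude spectrum (DFT) of one window of feedback-force samples.
        return np.abs(np.fft.rfft(force_window * np.hanning(len(force_window))))

    class OneHiddenLayerNet:
        def __init__(self, n_in, n_hidden=10, lr=0.1, seed=0):
            rng = np.random.default_rng(seed)
            self.W1 = rng.normal(0.0, 0.1, (n_hidden, n_in))
            self.W2 = rng.normal(0.0, 0.1, n_hidden)
            self.lr = lr

        @staticmethod
        def _sigmoid(x):
            return 1.0 / (1.0 + np.exp(-x))

        def forward(self, x):
            self.h = self._sigmoid(self.W1 @ x)   # hidden-layer activations
            self.y = self._sigmoid(self.W2 @ self.h)  # single output node
            return self.y

        def backprop(self, x, target):
            y = self.forward(x)
            delta_out = (y - target) * y * (1.0 - y)            # output error term
            delta_hid = delta_out * self.W2 * self.h * (1.0 - self.h)
            self.W2 -= self.lr * delta_out * self.h
            self.W1 -= self.lr * np.outer(delta_hid, x)
            return 0.5 * (y - target) ** 2

    # Usage with a stand-in force window; label 1.0 = defect, 0.0 = good weld (assumed).
    window = np.random.randn(256)
    features = force_spectrum(window)
    net = OneHiddenLayerNet(n_in=features.size)
    loss = net.backprop(features, target=0.0)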
IEEE Transactions on Neural Networks | 1994
Edward M. Corwin; Antonette M. Logar; William J. B. Oldham
Concerns the problem of finding weights for feed-forward networks in which threshold functions replace the more common logistic node output function. The advantage of such weights is that the complexity of the hardware implementation of such networks is greatly reduced. If the task to be learned does not change over time, it may be sufficient to find the correct weights for a threshold function network off-line and to transfer these weights to the hardware implementation. This paper provides a mathematical foundation for training a network with standard logistic function nodes and gradually altering the function to allow a mapping to a threshold unit network. The procedure is analogous to taking the limit of the logistic function as the gain parameter goes to infinity. It is demonstrated that, if the error in a trained network is small, a small change in the gain parameter will cause a small change in the network error. The result is that a network that must be implemented with threshold functions can first be trained by traditional backpropagation using gradient descent, and then further trained with progressively steeper logistic functions. In theory, this process could require many repetitions. In simulations, however, the weights have been successfully mapped to a true threshold network after a modest number of slope changes. It is important to emphasize that this method is only applicable to situations for which off-line learning is appropriate.
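The key observation is easy to illustrate: a logistic node with gain k approaches a hard threshold as k grows, so weights trained at a moderate gain can be refined at progressively steeper gains and finally transferred to a threshold unit. A small sketch follows (the gain schedule is an illustrative assumption, not the paper's procedure):

    # Illustrative sketch: logistic output at increasing gain vs. the threshold limit.
    import numpy as np

    def logistic(net, gain):
        return 1.0 / (1.0 + np.exp(-gain * net))

    def threshold(net):
        return np.where(net >= 0.0, 1.0, 0.0)

    net_values = np.linspace(-1.0, 1.0, 9)
    for gain in (1.0, 4.0, 16.0, 64.0):       # progressively steeper slopes
        print(gain, np.round(logistic(net_values, gain), 3))
    print("limit", threshold(net_values))      # behavior as the gain goes to infinity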
international symposium on neural networks | 1993
Antonette M. Logar; Edward M. Corwin; William J. B. Oldham
Selected recurrent network training algorithms are described, and their performances are compared with respect to speed and accuracy for a given problem. Detailed complexity analyses are presented to allow more accurate comparison between training algorithms for networks with few nodes. Network performance for predicting the Mackey-Glass equation is reported for each of the recurrent networks, as well as for a backpropagation network. Using networks of comparable size, the recurrent networks produce significantly better prediction accuracy. The accuracy of the backpropagation network is improved by increasing the size of the network, but the recurrent networks continue to produce better results for the large prediction distances. Of the recurrent networks considered, Pearlmutter's off-line training algorithm produces the best results.
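The Mackey-Glass benchmark mentioned above is the delay differential equation dx/dt = a*x(t - tau) / (1 + x(t - tau)^10) - b*x(t). A small sketch of generating the series with the common chaotic parameter choice a = 0.2, b = 0.1, tau = 17 and simple Euler integration (the paper's exact settings and prediction distances are not reproduced here):

    # Illustrative Mackey-Glass generator; constant initial history is a simplification.
    import numpy as np

    def mackey_glass(n_samples, a=0.2, b=0.1, tau=17, dt=1.0, x0=1.2):
        history = int(tau / dt)
        x = np.full(n_samples + history, x0)
        for t in range(history, n_samples + history - 1):
            x_tau = x[t - history]                      # delayed value x(t - tau)
            x[t + 1] = x[t] + dt * (a * x_tau / (1.0 + x_tau ** 10) - b * x[t])
        return x[history:]

    series = mackey_glass(1000)
    # A prediction task pairs x(t) (and a few past values) with x(t + distance).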
international symposium on neural networks | 1996
Edward M. Corwin; Antonette M. Logar; William J. B. Oldham
The network defined by Hayashi (1994), like many purely recurrent networks, has proven very difficult to train to arbitrary time series. Many recurrent architectures are best suited for producing specific cyclic behaviors. As a result, a hybrid network has been developed to allow for training to more general sequences. The network used here is a combination of standard feedforward nodes and Hayashi oscillator pairs. A learning rule, developed using a discrete mathematics approach, is presented for the hybrid network. Significant improvements in prediction accuracy were produced compared to a pure Hayashi network and a backpropagation network. Data sets used for testing the effectiveness of this approach include Mackey-Glass, sunspot, and ECG data. The hybrid models reduced training and testing error in each case by at least 34%.
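A rough sketch of the hybrid idea only (the Hayashi oscillator dynamics and the published learning rule are not reproduced; plain sin/cos pairs stand in for the oscillator pairs, and all sizes, rates, and frequencies are illustrative assumptions): the output mixes feedforward sigmoid nodes with oscillatory components, and gradient descent adjusts the output weights.

    # Generic hybrid feedforward + oscillator-pair sketch, not the Hayashi formulation.
    import numpy as np

    rng = np.random.default_rng(0)

    def hybrid_output(x, t, W_ff, w_out_ff, w_out_osc, freqs):
        ff = 1.0 / (1.0 + np.exp(-(W_ff @ x)))                        # feedforward nodes
        osc = np.concatenate([np.sin(freqs * t), np.cos(freqs * t)])  # oscillator pairs
        return w_out_ff @ ff + w_out_osc @ osc, ff, osc

    # Tiny example: fit one target value at one time step by gradient descent
    # on the output weights (squared-error loss).
    x, t, target = rng.normal(size=3), 5.0, 0.7
    W_ff = rng.normal(0.0, 0.5, (4, 3))
    w_out_ff, w_out_osc = np.zeros(4), np.zeros(4)
    freqs = np.array([0.5, 1.0])
    lr = 0.05
    for _ in range(200):
        y, ff, osc = hybrid_output(x, t, W_ff, w_out_ff, w_out_osc, freqs)
        err = y - target
        w_out_ff -= lr * err * ff
        w_out_osc -= lr * err * osc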
international symposium on neural networks | 1996
Edward M. Corwin; Antonette M. Logar; William J. B. Oldham
A variety of recurrent network architectures have been developed and applied to the problem of time series prediction. One particularly interesting network was developed by Hayashi (1994). Hayashi presented a network of coupled oscillators and a training rule for the network. His derivation was based on continuous mathematics and provided a mechanism for updating the weights into the output nodes. The work presented here gives an alternative derivation of Hayashi's learning rule based on discrete mathematics, as well as an extension to the learning rule that allows updating of all weights in the network.
Journal of Computing Sciences in Colleges | 2004
Edward M. Corwin; Antonette M. Logar
acm symposium on applied computing | 1994
Antonette M. Logar; Edward M. Corwin; Samuel Watters; Ronald C. Weger; Ronald M. Welch
Satellite Remote Sensing of Clouds and the Atmosphere II | 1997
Chris Konvalin; Antonette M. Logar; David Lloyd; Edward M. Corwin; Manuel Penaloza; Rand E. Feind; Ronald M. Welch
Archive | 2010
Enkhsaikhan Boldsaikhan; Antonette M. Logar; Edward M. Corwin