
Publication


Featured research published by W. Hubbard.


Neural Computation | 1989

Backpropagation applied to handwritten zip code recognition

Yann LeCun; Bernhard E. Boser; John S. Denker; D. Henderson; R. E. Howard; W. Hubbard; Lawrence D. Jackel

The ability of learning networks to generalize can be greatly enhanced by providing constraints from the task domain. This paper demonstrates how such constraints can be integrated into a backpropagation network through the architecture of the network. This approach has been successfully applied to the recognition of handwritten zip code digits provided by the U.S. Postal Service. A single network learns the entire recognition operation, going from the normalized image of the character to the final classification.
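The architectural constraint described in this abstract amounts to local receptive fields with shared weights, i.e. a convolution. Below is a minimal, illustrative sketch of that idea in NumPy; the image size, kernel, and function names are assumptions for demonstration, not the paper's actual network.

```python
import numpy as np

def conv2d_valid(image, kernel):
    """Slide one shared kernel over the image ('valid' convolution).

    Because the same kernel is reused at every position, the layer has
    far fewer free parameters than a fully connected layer would.
    """
    h, w = image.shape
    kh, kw = kernel.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

image = np.arange(16.0).reshape(4, 4)      # stand-in for a normalized digit image
kernel = np.ones((3, 3))                   # one shared 3x3 feature detector (a summing kernel)
feature_map = conv2d_valid(image, kernel)  # 2x2 map; only 9 weights in total
```

The point of the constraint is visible in the parameter count: a fully connected map from a 4x4 input to a 2x2 output would need 64 weights, while the shared kernel needs 9 regardless of image size.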


Pattern Recognition | 1991

Design of a neural network character recognizer for a touch terminal

Isabelle Guyon; P. Albrecht; Y. Le Cun; John S. Denker; W. Hubbard

We describe a system which can recognize digits and uppercase letters handprinted on a touch terminal. A character is input as a sequence of [ x(t), y(t) ] coordinates, subjected to very simple preprocessing, and then classified by a trainable neural network. The classifier is analogous to “time delay neural networks” previously applied to speech recognition. The network was trained on a set of 12,000 digits and uppercase letters, from approximately 250 different writers, and tested on 2500 such characters from other writers. Classification accuracy exceeded 96% on the test examples.
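The "time delay" idea mentioned in the abstract applies one small set of shared weights at every time step of the [ x(t), y(t) ] trajectory, like a 1-D convolution over the pen coordinates. The sketch below is illustrative only; the window size, weights, and sample trajectory are assumptions, not the paper's configuration.

```python
import numpy as np

def time_delay_layer(seq, weights):
    """Apply shared weights to every window of consecutive samples.

    seq:     (T, 2) array of (x, y) pen coordinates
    weights: (delay, 2) weights, reused at each time step
    """
    delay = weights.shape[0]
    T = seq.shape[0]
    return np.array([np.sum(seq[t:t + delay] * weights)
                     for t in range(T - delay + 1)])

seq = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 0.0], [3.0, 1.0]])
weights = np.ones((2, 2)) * 0.25          # one shared unit looking 2 steps back
out = time_delay_layer(seq, weights)      # 3 outputs from 4 input samples
```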


IEEE Communications Magazine | 1989

Handwritten digit recognition: applications of neural network chips and automatic learning

Y. Le Cun; Lawrence D. Jackel; Bernhard E. Boser; John S. Denker; Hans Peter Graf; I. Guyon; D. Henderson; R. E. Howard; W. Hubbard

Two novel methods for achieving handwritten digit recognition are described. The first method is based on a neural network chip that performs line thinning and feature extraction using local template matching. The second method is implemented on a digital signal processor and makes extensive use of constrained automatic learning. Experimental results obtained using isolated handwritten digits taken from postal zip codes, a rather difficult data set, are reported and discussed.


International Conference on Pattern Recognition | 1990

Handwritten zip code recognition with multilayer networks

Y. Le Cun; Ofer Matan; Bernhard E. Boser; John S. Denker; D. Henderson; R. E. Howard; W. Hubbard; L. D. Jackel; Henry S. Baird

An application of back-propagation networks to handwritten zip code recognition is presented. Minimal preprocessing of the data is required, but the architecture of the network is highly constrained and specifically designed for the task. The input of the network consists of size-normalized images of isolated digits. The performance on zip code digits provided by the US Postal Service is 92% recognition, 1% substitution, and 7% rejects. Structured neural networks can be viewed as statistical methods with structure which bridge the gap between purely statistical and purely structural methods.


Archive | 1990

Optical Character Recognition and Neural-Net Chips

Y. Le Cun; Lawrence D. Jackel; Hans Peter Graf; Bernhard E. Boser; John S. Denker; Isabelle Guyon; D. Henderson; R. E. Howard; W. Hubbard; Sara A. Solla

Neural Network research has always interested hardware designers, theoreticians, and application engineers. But until recently, the common ground between these groups was limited: the neural-net chips were too small to implement any full-size application, and the algorithms were too complicated (or the applications not interesting enough) to be implemented on a chip. The merging of these efforts is now made possible by the simultaneous emergence of powerful chips and successful, real-world applications of neural networks. Here, we discuss how the compute-intensive part of a handwritten digit recognizer, based on a highly structured backpropagation network, can be implemented on a general purpose neural-network chip containing 32k binary synapses. Using techniques based on the second-order properties of the error function, we show that very little accuracy on the weights and states is required in the first layers of the network. Interestingly, the best digit-recognition network is also the easiest to implement on a chip.


Applied Physics Letters | 1987

Dynamics of microfabricated electronic neural networks

Daniel B. Schwartz; R. E. Howard; John S. Denker; R. W. Epworth; Hans Peter Graf; W. Hubbard; Lawrence D. Jackel; B. L. Straughn; D. M. Tennant

We have studied the dynamics of a totally interconnected network of nonlinear amplifiers by building model electronic circuits using dense arrays of resistors and discrete amplifiers. Such models have been discussed recently in the context of spin glasses and neural networks. Even without optimization for speed, these circuits easily reproduce and extend the results of computer simulations in considerably less time.
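The circuits described above are fully interconnected networks of nonlinear amplifiers, whose discrete analogue is a Hopfield-style network relaxed by threshold updates. The following sketch simulates that discrete analogue; it is an illustration of the class of dynamics, not the authors' hardware, and the stored patterns are toy values.

```python
import numpy as np

def hebbian_weights(patterns):
    """Store +-1 patterns with the Hebb rule; zero diagonal (no self-coupling)."""
    n = patterns.shape[1]
    w = patterns.T @ patterns / n
    np.fill_diagonal(w, 0.0)
    return w

def recall(w, state, sweeps=5):
    """Asynchronously update each 'amplifier' (neuron) until the state settles."""
    state = state.copy()
    for _ in range(sweeps):
        for i in range(len(state)):
            state[i] = 1 if w[i] @ state >= 0 else -1
    return state

patterns = np.array([[1, -1, 1, -1, 1, -1, 1, -1],
                     [1, 1, 1, 1, -1, -1, -1, -1]])
w = hebbian_weights(patterns)   # symmetric, as in the physical resistor array
noisy = patterns[0].copy()
noisy[0] = -noisy[0]            # corrupt one 'neuron'
restored = recall(w, noisy)     # dynamics relax back to the stored pattern
```

With symmetric weights the update dynamics descend an energy function, which is the spin-glass connection the abstract refers to.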


Applied Optics | 1987

Building a hierarchy with neural networks: an example—image vector quantization

Lawrence D. Jackel; R. E. Howard; John S. Denker; W. Hubbard; Sara A. Solla

Electronic neural networks can perform the function of associative memory. Given an input pattern, the network searches through its stored memories to find which of them best matches the input. Thus the network does a combination of content-addressable search and error correction. The number of random memories that a network can store is limited to a fraction of the number of electronic neurons in the circuit. We propose a method for building a hierarchy of networks that allows the fast parallel search through a list of memories that is too large to store in a single network. We have demonstrated the principle of this approach by an example in image vector quantization.
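The hierarchical search proposed above can be sketched as a two-level best-match lookup: a small top-level codebook picks a cluster, and only that cluster's codewords are searched at the second level. The sketch below uses plain nearest-neighbor search in place of the networks, and the codebooks are toy values, not data from the paper.

```python
import numpy as np

coarse = np.array([[0.0, 0.0], [10.0, 10.0]])    # level-1 cluster centroids
fine = {0: np.array([[0.0, 1.0], [1.0, 0.0]]),   # level-2 codewords per cluster
        1: np.array([[9.0, 10.0], [11.0, 10.0]])}

def nearest(codebook, x):
    """Content-addressable search: index of the closest codeword."""
    return int(np.argmin(np.sum((codebook - x) ** 2, axis=1)))

def hierarchical_quantize(x):
    c = nearest(coarse, x)    # level 1: pick a cluster
    k = nearest(fine[c], x)   # level 2: search only within that cluster
    return fine[c][k]

code = hierarchical_quantize(np.array([9.2, 9.9]))
```

Each level only needs to hold a list short enough for a single network, so the hierarchy can address a memory set far larger than any one network could store.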


International Symposium on Neural Networks | 1990

Hardware requirements for neural-net optical character recognition

L.D. Jackel; Bernhard E. Boser; J.S. Denker; H.P. Graf; Y. Le Cun; I. Guyon; D. Henderson; R. E. Howard; W. Hubbard; Sara A. Solla

Hardware architectures for character recognition are discussed, and choices for possible circuits are outlined. An advanced (and working) reconfigurable neural-net chip that mixes analog and digital processing is described. It is found that different approaches to image recognition often lead to neural-net architectures that have limited connectivity and repeated use of the same set of weights. This architecture is ideal for time-multiplexing (a combined parallel-series processing) on hardware systems that would be too small to evaluate the entire network in parallel. To make this process efficient, a chip needs to have shift registers to format the input data and additional registers to store intermediate results. Within this framework, it is possible to design chips that have broad utility, large connection capacity, and high speed. This was demonstrated by a new chip with 32000 reconfigurable connections.


International Symposium on Circuits and Systems | 1990

Optical character recognition: a technology driver for neural networks

R. E. Howard; Bernhard E. Boser; John S. Denker; Hans Peter Graf; D. Henderson; W. Hubbard; Lawrence D. Jackel; Y. Le Cun; Henry S. Baird

It is shown that a neural net can perform handwritten digit recognition with state-of-the-art accuracy. The solution required automatic learning and generalization from thousands of training examples and also required designing into the system considerable knowledge about the task; neither engineering nor learning from examples alone would have sufficed. The resulting network is well suited for implementation on workstations or PCs and can take advantage of digital signal processors (DSPs) or custom VLSI.


International Conference of the IEEE Engineering in Medicine and Biology Society | 1988

Neural network chips

Lawrence D. Jackel; R. E. Howard; Hans Peter Graf; W. Hubbard; John S. Denker; D. Henderson

Early results from exploring alternative computer architectures based on hints from neurobiology suggest that networks of highly interconnected, simple, low-precision processors can provide tools for tackling problems that have been hard or impossible to solve on standard computers. The authors describe an electronic neural model and show how this model is readily adapted for use in pattern-recognition tasks. They also describe a chip, implementing this model, that is used for handwritten digit recognition.
