Publication


Featured research published by Mark Watson.


Archive | 1991

Adaptive Neural Networks

Mark Watson

The neural networks studied in previous chapters have had a simple flow of information from input to hidden to output neurons through connections without loops or feedback. They were trained by supervised learning techniques, in which the desired system output is provided during training. Another interesting type of neural network performs unsupervised learning, in which the system is shown inputs with no desired outputs. The system searches for similar features in the training inputs and groups them into categories whose members share common features. These networks are called adaptive resonance theory (ART) networks (Carpenter and Grossberg, 1987). Carpenter and Grossberg have extended their theory to process inputs with a continuous dynamic range. The sample program listed in this chapter uses this newer gray-scale adaptive resonance theory (ART2), which allows inputs to take any value between 0.0 and 1.0 (thus the name gray-scale). The program follows the variable naming conventions used by Gail Carpenter and Stephen Grossberg in their paper introducing ART2 (Carpenter and Grossberg, 1987). The real-time dynamic behavior of ART2 networks is fascinating to watch; they exhibit interesting behavior when they have few output neurons and are shown a large number of input patterns: the network will reshuffle existing categorizations when necessary.
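
As a rough illustration of the unsupervised-categorization idea described above (not the ART2 equations or the chapter's program), the sketch below matches each gray-scale input against stored prototype vectors and either merges it into the best-matching category or starts a new one. The similarity measure, the vigilance threshold, and all names are assumptions made for illustration.

;;; Schematic ART-style categorization, NOT the full ART2 dynamics:
;;; inputs in [0.0, 1.0] are compared to stored prototypes and either
;;; absorbed by the best match (if it clears a vigilance threshold) or
;;; used to start a new category.

(defparameter *vigilance* 0.9
  "Hypothetical match threshold; higher values create more categories.")

(defun match-score (input prototype)
  "Simple similarity measure: 1 minus the normalized L1 distance."
  (- 1.0 (/ (reduce #'+ (map 'list (lambda (x p) (abs (- x p))) input prototype))
            (length input))))

(defun categorize (input prototypes)
  "Return the updated list of prototypes after absorbing INPUT."
  (let ((best (when prototypes
                (reduce (lambda (a b)
                          (if (> (match-score input a) (match-score input b)) a b))
                        prototypes))))
    (if (and best (>= (match-score input best) *vigilance*))
        ;; Resonance: nudge the winning prototype toward the input.
        (progn (map-into best (lambda (p x) (+ (* 0.8 p) (* 0.2 x))) best input)
               prototypes)
        ;; No category is close enough: allocate a new one.
        (cons (copy-seq input) prototypes))))

;; Example: three gray-scale inputs; the first two share a category.
(let ((prototypes '()))
  (dolist (input (list #(0.9 0.1 0.1) #(0.85 0.15 0.1) #(0.1 0.9 0.8)))
    (setf prototypes (categorize input prototypes)))
  (format t "~D categories formed~%" (length prototypes)))   ; prints 2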


Archive | 1991

A Chess-Playing Program

Mark Watson

Computer chess programs have long been a testing ground for developing search techniques. Chess programs use some form of lookahead search (prediction of future moves) combined with static evaluation (judging the relative worth of a chess position without lookahead, that is, by counting the material worth of the black and white pieces and comparing their relative mobility). In this chapter we will develop a chess-playing program that relies mostly on static evaluation using heuristic chess knowledge. The program will usually give the impression of strong positional play, although it plays a poor tactical game. The advantages of this heuristic approach (as compared to brute-force search) are faster execution in a LISP environment, easier addition and testing of new chess heuristics, and a simpler program. The static evaluation function, in calculating which pieces are subject to capture, effectively performs a two-ply lookahead search (a ply is a half move, that is, a single move by either white or black).
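
A minimal sketch of the static-evaluation idea only, not the chapter's program: material values dominate the score, with a small mobility term as a tie-breaker. The piece values, the mobility weight, and all names are illustrative assumptions.

;;; Static evaluation sketch: positive scores favor white.

(defparameter *piece-values*
  '((:pawn . 1) (:knight . 3) (:bishop . 3) (:rook . 5) (:queen . 9) (:king . 0)))

(defun material (pieces)
  "Sum the nominal values of a list of piece keywords."
  (reduce #'+ pieces
          :key (lambda (p) (cdr (assoc p *piece-values*)))
          :initial-value 0))

(defun static-evaluation (white-pieces black-pieces white-moves black-moves)
  "Material difference plus a small bonus for the side with more legal moves."
  (+ (- (material white-pieces) (material black-pieces))
     (* 0.1 (- white-moves black-moves))))

;; Example: equal material, but white has more mobility.
(static-evaluation '(:king :rook :pawn :pawn)
                   '(:king :rook :pawn :pawn)
                   31 24)   ; => approximately 0.7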


Archive | 1991

Recognition of Handwritten Characters

Mark Watson

In this chapter, we will use a neural network pattern matcher to recognize handwritten characters. We will discuss required preprocessing of input data, present the sample program, show sample program output, and compare the neural network approach to a more conventional software approach to solving this problem.


Archive | 1991

Introduction to Chaos Theory

Mark Watson

This chapter provides a short overview of chaos theory. We will see how even simple systems modeled with nonlinear equations (that is, equations containing polynomial or exponential terms) can show surprising behavior. A simple population growth model will demonstrate a chaotic system.
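
The population growth model referred to above is typically the logistic map, x(n+1) = r * x(n) * (1 - x(n)); whether the chapter uses exactly this form is an assumption here. A short sketch:

;;; Logistic map: stable for small growth rates r, chaotic near r = 4.0.

(defun logistic-orbit (r x0 n)
  "Return a list of N successive population values starting from X0."
  (loop for x = x0 then (* r x (- 1.0 x))
        repeat n
        collect x))

;; Stable behavior: converges toward the fixed point 1 - 1/r (about 0.655).
(logistic-orbit 2.9 0.5 10)

;; Chaotic behavior: nearby starting points diverge rapidly.
(logistic-orbit 3.99 0.5 10)
(logistic-orbit 3.99 0.5001 10)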


Archive | 1991

The Substrates of Intelligence, a Neural Network Primer

Mark Watson

What is an artificial neural network? How do artificial neural networks compare with conventional computers and traditional massively parallel computers, and when are they more useful? What are possible applications of neural network technology? These questions will be answered at the beginning of this chapter, followed by the presentation of an engineering model based on equations that characterize the behavior of one popular class of neural networks for supervised learning. Supervised learning uses both input training patterns and desired system output patterns for neural network training. The chapter then provides a simple program demonstrating how to write and run artificial neural network simulators, followed by a listing of a complete, production-capable neural network simulator. Examples show how to set up training data for and run this complete simulator (which is used for speech recognition in chapter 5 and for reading handwritten characters in chapter 6). The chapter ends with suggested projects and hints for their solution.
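
As a stripped-down illustration of supervised learning from (input, desired output) pairs, and not the chapter's multi-layer simulator, the sketch below trains a single sigmoid neuron with a simple delta-style weight update (error times input). The learning rate, epoch count, and names are assumptions.

;;; One-neuron supervised learning sketch.

(defun sigmoid (x) (/ 1.0 (+ 1.0 (exp (- x)))))

(defun train-neuron (patterns &key (rate 0.5) (epochs 5000))
  "PATTERNS is a list of (inputs . desired) conses.  Returns the learned
   weights: one per input plus a trailing bias weight."
  (let ((weights (make-list (1+ (length (car (first patterns)))) :initial-element 0.0)))
    (loop repeat epochs do
      (dolist (pattern patterns)
        (let* ((inputs (append (car pattern) (list 1.0)))              ; constant bias input
               (output (sigmoid (reduce #'+ (mapcar #'* inputs weights))))
               (delta  (* rate (- (cdr pattern) output))))             ; learning rate times error
          (setf weights (mapcar (lambda (w x) (+ w (* delta x))) weights inputs)))))
    weights))

;; Example: learn logical AND from four training pairs, then test input (1 1).
(let ((weights (train-neuron '(((0.0 0.0) . 0.0) ((0.0 1.0) . 0.0)
                               ((1.0 0.0) . 0.0) ((1.0 1.0) . 1.0)))))
  (sigmoid (reduce #'+ (mapcar #'* (list 1.0 1.0 1.0) weights))))   ; => a value near 1.0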


Archive | 1991

Basic Software Tools: Machine-Independent Graphics

Mark Watson

One of the best uses for a high-level language like Common LISP is the rapid prototyping of window-based user interfaces, now an accepted part of the software development process. Common LISP is especially appropriate for developing user interfaces when augmented with one of the current object-oriented programming (OOP) packages: Object Lisp, the Common Lisp Object System (CLOS), or Portable Common Loops (PCL).
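
A small CLOS sketch of the object-oriented style mentioned above; it drives no real window system (DRAW simply prints), and the class and slot names are illustrative rather than the book's graphics interface.

(defclass graphics-object ()
  ((x :initarg :x :accessor x)
   (y :initarg :y :accessor y)))

(defclass circle (graphics-object)
  ((radius :initarg :radius :accessor radius)))

(defgeneric draw (object)
  (:documentation "Render OBJECT on the current output device."))

(defmethod draw ((object circle))
  (format t "circle at (~A, ~A) with radius ~A~%" (x object) (y object) (radius object)))

;; A machine-independent interface would simply call DRAW and let CLOS
;; dispatch on the object's class.
(draw (make-instance 'circle :x 10 :y 20 :radius 5))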


Archive | 1991

Representing Natural Language as LISP Data Structures and LISP Code

Mark Watson

Natural language processing (NLP) systems allow interaction with a computer using natural language. NLP systems are practical to build when the system vocabulary is small and the domain of discourse is limited. In section 8.1, we will develop the basic techniques for parsing natural language (NL) text using syntactic analysis, starting with a simple parser that recognizes noun phrases. We will use this first simple noun-phrase parser to introduce the notation for augmented transition networks (ATNs).
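
A minimal sketch of the kind of noun-phrase recognizer the chapter starts from: an optional determiner, any number of adjectives, then a noun. The tiny word lists and function names are assumptions; the chapter goes on to recast this kind of parser in ATN notation.

;;; Noun-phrase recognizer sketch.

(defparameter *determiners* '(a an the))
(defparameter *adjectives*  '(big red old))
(defparameter *nouns*       '(dog ball house))

(defun parse-noun-phrase (words)
  "Return the words remaining after a leading noun phrase, or NIL if WORDS
   does not begin with one."
  (let ((remaining words))
    (when (member (first remaining) *determiners*)      ; optional determiner
      (pop remaining))
    (loop while (member (first remaining) *adjectives*) ; zero or more adjectives
          do (pop remaining))
    (when (member (first remaining) *nouns*)            ; required head noun
      (rest remaining))))

;; Examples:
(parse-noun-phrase '(the big red dog barked))   ; => (BARKED)
(parse-noun-phrase '(red the dog))              ; => NIL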


Archive | 1991

Heuristic Network Search Algorithm

Mark Watson

Many AI applications require a search for a best (or shortest) path through a network. Applications for network searches include routing problems (such as aircraft or sales route planning and terrain traversal for robotic vehicles).
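
A generic best-first shortest-path sketch in the spirit of the chapter, not necessarily its exact algorithm: the cheapest known path is always expanded first, and a heuristic estimate of remaining cost could be added to the sort key. The graph representation and names are assumptions.

;;; Best-first shortest-path search over a small weighted graph.

(defparameter *graph*
  ;; node -> list of (neighbor . distance)
  '((a (b . 2) (c . 5))
    (b (c . 1) (d . 4))
    (c (d . 1))
    (d)))

(defun shortest-path (start goal)
  "Return (path . cost) for the cheapest path from START to GOAL, or NIL."
  (let ((frontier (list (cons (list start) 0)))   ; entries are (reversed-path . cost-so-far)
        (visited '()))
    (loop while frontier do
      (setf frontier (sort frontier #'< :key #'cdr))   ; cheapest partial path first
      (let* ((entry (pop frontier))
             (path (car entry))
             (cost (cdr entry))
             (node (first path)))
        (cond ((eq node goal)
               (return (cons (reverse path) cost)))
              ((member node visited))                  ; already expanded via a cheaper path
              (t (push node visited)
                 (dolist (edge (cdr (assoc node *graph*)))
                   (push (cons (cons (car edge) path)
                               (+ cost (cdr edge)))
                         frontier))))))))

;; Example:
(shortest-path 'a 'd)   ; => ((A B C D) . 4)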


Archive | 1991

Speech Recognition Using Neural Networks

Mark Watson

It is generally agreed that natural language will become the most common method for interacting with computers, increasing our reliance on natural language processing, the task of understanding written human language, and speech recognition, the task of converting human speech into written form. This chapter will focus on speech recognition: what it is, what speech features are used for recognition, its associated problems, and its possible applications. We will also review a sample program for speech recognition, examine what it does and how it works, show sample data, create the training data for a neural network based speech recognition system, and consider ideas for expanding the system.


Archive | 1991

Pattern Recognition Using Hopfield Neural Networks

Mark Watson

In the last chapter, we saw how a generalized delta rule (backward error propagation) network could slowly learn to recognize a series of patterns. The delta rule network adapted slowly while training repetitively on a set of examples (sometimes a set of training examples must pass through the network over 100,000 times!).
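
To contrast with the slow, repetitive delta-rule training described above, the sketch below builds Hopfield weights in a single pass (Hebbian outer products over +1/-1 patterns) and recalls a stored pattern by repeatedly updating each unit from the weighted sum of the others. The pattern sizes and names are illustrative, not the chapter's program.

;;; Hopfield network sketch: one-shot storage, iterative recall.

(defun hopfield-weights (patterns)
  "Build a symmetric weight matrix with zero diagonal from bipolar PATTERNS."
  (let* ((n (length (first patterns)))
         (w (make-array (list n n) :initial-element 0)))
    (dolist (p patterns w)
      (dotimes (i n)
        (dotimes (j n)
          (unless (= i j)
            (incf (aref w i j) (* (elt p i) (elt p j)))))))))

(defun recall (weights state &optional (sweeps 5))
  "Repeatedly update STATE (a vector of +1/-1 values) toward a stored pattern."
  (let ((n (length state)))
    (dotimes (sweep sweeps state)
      (dotimes (i n)
        (let ((net (loop for j below n sum (* (aref weights i j) (elt state j)))))
          (setf (elt state i) (if (minusp net) -1 1)))))))

;; Example: store one pattern, then recover it from a corrupted copy.
(let ((w (hopfield-weights '(#(1 -1 1 -1 1 -1)))))
  (recall w (vector 1 -1 1 1 1 -1)))   ; => #(1 -1 1 -1 1 -1)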
