Publication


Featured research published by David M. W. Powers.


Archive | 1989

Computer Science and Artificial Intelligence

David M. W. Powers; Christopher C. R. Turk

This chapter takes a step back to place our problem of Natural Language Learning in the historical and theoretical perspectives of Artificial Intelligence. For this purpose we distinguish this engineering tradition of AI from the empirical approaches of Cognitive Science. This then provides a bridge to those techniques and theoretical stances which are not grounded in Linguistics, Psychology or Neurology. The Turing Test provides a foundational metric for AI which focuses our understanding of the term intelligence on this language and learning facility. However, the methodological focus of engineered intelligence is the application of heuristics in a search paradigm. Since virtually any problem can be set up as a problem of choosing an appropriate path through a decision tree, this is an appropriate base for AI. Expert Systems and Natural Language systems traditionally have programmed deterministic rules which largely eliminate the search component and reduce the generality of the systems. But the addition of a general Problem Solving or Machine Learning component returns such systems to the fold of heuristic search. Such systems differ in the way they are read, taught and criticized, how they obtain and make use of positive and negative examples, and in the extent to which they are expected to be error-free and intuitive.
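
The heuristic-search framing above can be made concrete. The sketch below is not from the chapter; it is a minimal Python illustration of generic best-first search, where the goal test, successor function and heuristic are hypothetical placeholders that the problem domain would supply.

```python
import heapq

def best_first_search(start, is_goal, successors, heuristic):
    """Generic best-first search: always expand the state the heuristic ranks cheapest.

    `successors(state)` yields neighbouring states and `heuristic(state)` estimates
    remaining cost; both are problem-specific placeholders in this sketch.
    """
    frontier = [(heuristic(start), start)]
    seen = {start}
    while frontier:
        _, state = heapq.heappop(frontier)
        if is_goal(state):
            return state
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                heapq.heappush(frontier, (heuristic(nxt), nxt))
    return None

# Toy usage: states are integers, the goal is 10, moves add 1 or 2,
# and the heuristic is simply the remaining distance to the goal.
print(best_first_search(0, lambda s: s == 10,
                        lambda s: (s + 1, s + 2),
                        lambda s: abs(10 - s)))
```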


Intelligent Robots and Computer Vision VII | 1989

A PROLOG Simulator For Studying Visual Learning

David M. W. Powers

PROLOG has proven useful as a language for research in Machine Learning, Natural Language and, more recently, as a control language for image processing and visual control of robots. A simulator has been written in PROLOG which allows simulated objects to be manipulated using the same image processing operations as are used in image processing systems. VISISIM allows examination of appropriate heuristics for learning to distinguish significant features with a level of control and a speed beyond real-time real-vision systems. It is designed such that once an acceptable performance has been obtained with the simulator, it may be replaced with another module (e.g. PROVISION) for evaluation in real time on real data. The simulator uses a convexity-based representation which allows explicit control over detail and noise, and parameterization of the image processing operators. VISISIM provides for learnt sequences to be available to the learning system as macros, whilst the learning program itself maintains heuristics about the utility of operators at the level of problem domain and context. VISISIM has been used to simulate a playing card recognition problem and find satisfactory variations of a hand-crafted solution used with AUTOVIEW. This, however, was achieved using overly restrictive and unrealistic heuristics. Therefore it is proposed that a Knowledge Engineering approach be taken to the development of more realistic heuristics.
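
The paper describes a PROLOG system and gives no code here; the Python sketch below is only a rough illustration of the two ideas named in the abstract, keeping per-operator utility heuristics and replaying successful operator sequences as macros. All identifiers are invented for the example and do not come from VISISIM.

```python
import random

# Hypothetical illustration of the operator-utility idea: candidate operators are
# weighted by how often they have helped, and successful sequences are kept for
# reuse as macros. None of these names are taken from VISISIM itself.
class OperatorLearner:
    def __init__(self, operators):
        self.operators = list(operators)               # available operator names
        self.utility = {op: 1.0 for op in operators}   # learned usefulness per operator
        self.macros = []                               # remembered successful sequences

    def choose(self):
        # Prefer operators with higher learned utility.
        weights = [self.utility[op] for op in self.operators]
        return random.choices(self.operators, weights=weights, k=1)[0]

    def record(self, sequence, success):
        # Reinforce (or penalize) every operator in the attempted sequence.
        for op in sequence:
            self.utility[op] *= 1.5 if success else 0.8
        if success:
            self.macros.append(tuple(sequence))        # make the sequence reusable as a macro

learner = OperatorLearner(["threshold", "edge_detect", "convex_hull"])
attempt = [learner.choose() for _ in range(3)]
learner.record(attempt, success=True)
print(learner.utility, learner.macros)
```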


Archive | 1989

Neurology and Neurolinguistics

David M. W. Powers; Christopher C. R. Turk

We come now to an overview of the anatomical and physiological evidence, and the neurological models which underlie and motivate the research we described earlier. We start with the brain, its organization and connection. The most accessible parts of the brain for physiological investigation are those which are directly sensory-motor related. These therefore form the focus of our brief neuroanatomical tour.


Archive | 1989

Metaphor as a Cognitive Process

David M. W. Powers; Christopher C. R. Turk

Throughout this chapter the concept of metaphor is a constant theme, although the reader will find many diversions into discussion of related cognitive processes. In this first section, though, we look at the fundamental nature of metaphor, and the way this relates to the typically human classification systems and ontologies. Metaphor is possibly the most fundamental concept in the understanding of language and its acquisition, but is seldom regarded as more than a literary “device” used for illustrative or poetic ends. The common view is that some similarity between the present situation and something foreign to it is used to convey a “truth” about the situation. We hesitate to use the word “artificial” here, since the measure of a metaphor or analogy is its “aptness” or “naturalness”. But these explicatory or literary “devices” or “artifices” imply conscious and deliberate construction.


Archive | 1989

Postulates, Claims and Hypotheses

David M. W. Powers; Christopher C. R. Turk

This chapter pulls together all the threads we have been considering in this volume and presents a consolidated set of practical conclusions from our study of Natural Language Learning. The proposals collected here are intended to provide the basis for a programme of experimental work. We emphasize the fundamental importance of ontology and relationships from the sensory motor level up. We consider that the essential prerequisites for relationships to be learned are constancy, consistency and usefulness. These correspond to roles in learning for reinforcement, prediction and bootstrapping — in a mixed metaphor based on behaviourist, scientific and engineering methodologies.


Archive | 1989

Computer Modelling Experiments

David M. W. Powers; Christopher C. R. Turk

This chapter presents the experimental framework and research which the authors have pursued to explore the theory and methodology outlined in the preceding chapters. It is deemed essential that language be presented in a full context so that the interhierarchical relationships can be learnt, and can provide cues for intrahierarchical learning. The ideal compromise for this work is a toy world in which the learner can act and be acted upon. It can then learn language in a rich context which avoids the need for the experimenter to laboriously construct semantic sequences to accompany text, and obviates the need for full pattern recognition and robotic dexterity in the prototype.
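
As a purely hypothetical illustration of the toy-world idea (not the authors' system), the sketch below pairs each action the learner performs with the utterance that describes it, so language always arrives together with its sensory-motor context.

```python
# Minimal illustrative toy world: every action is paired with an utterance,
# so language is presented together with the situation it describes.
MOVES = {"go north": (0, -1), "go south": (0, 1),
         "go west": (-1, 0), "go east": (1, 0)}

class ToyWorld:
    def __init__(self, size=5):
        self.size = size
        self.learner = (2, 2)            # learner position on a small grid

    def act(self, utterance):
        dx, dy = MOVES[utterance]
        x, y = self.learner
        # Clamp to the grid, so the world also constrains ("acts upon") the learner.
        self.learner = (min(max(x + dx, 0), self.size - 1),
                        min(max(y + dy, 0), self.size - 1))
        return utterance, self.learner   # paired language + resulting context

world = ToyWorld()
for command in ["go north", "go east", "go east"]:
    print(world.act(command))
```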


Archive | 1989

The Mechanics of Language

David M. W. Powers; Christopher C. R. Turk

This chapter moves from a descriptive characterization of language to examine attempts to provide a predictive explanation. We note the important distinction between phoneme and phone, according to whether a distinction was recognizable in the entire potential human speech repertoire and significant or not in a particular actual language. Phonology provided a basis for analyzing the sound systems of language, building from this distinction. Tagmemics extends this analysis to encompass the entirety of linguistically significant human behaviour, expanded and refocussed to include each level, hierarchy and perspective. Context, contrast and equivalence not only have roles in analysis of language but have potential functionality in language learning and development. The same features which help us distinguish between languages as analysts and learners of second languages would seem obvious candidates for a similar role as learners of a first language. The analysis traditionally applied to adult language can also be applied to child language, and proves consonant with the idea of consistent adherence to child grammars rather than just incorrect usage of adult grammar. The simplest of these supposed grammars can add insights to our view of language. We conclude the chapter with a discussion of the generalized and relatively dynamic tagmemic features introduced by Pike, and an illustrative consideration of the basic units of the phonological, grammatical and orthographical hierarchies.


Archive | 1989

Heuristics and Analytic Intransigence

David M. W. Powers; Christopher C. R. Turk

This chapter looks at some of the particular problems associated with the computational demands of AI. The frame problem is a serious pragmatic problem characteristic of AI and NL, in which we must consider tradeoffs in time and space efficiency relating to storage and recall of information. More generally, we consider the question of efficiency and just what can and cannot be achieved in a given time frame. This is contrasted with the intransigent problems for which no efficacious algorithm can guarantee a solution in any time frame. Heuristics may be used to trade a fast probable solution against the possibility of failure in both these cases.
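
The tradeoff between a guaranteed but expensive search and a fast but fallible heuristic can be made concrete with subset-sum, a classically intractable problem. The sketch below is ours, not the chapter's: the exhaustive search always finds an answer if one exists but takes exponential time, while the greedy heuristic answers quickly but may miss a solution.

```python
from itertools import combinations

def exhaustive_subset_sum(values, target):
    # Guaranteed but exponential: try every subset until one sums to the target.
    for r in range(len(values) + 1):
        for combo in combinations(values, r):
            if sum(combo) == target:
                return combo
    return None

def greedy_subset_sum(values, target):
    # Fast heuristic: repeatedly take the largest value that still fits.
    picked, total = [], 0
    for v in sorted(values, reverse=True):
        if total + v <= target:
            picked.append(v)
            total += v
    return tuple(picked) if total == target else None  # may fail even when a solution exists

values = [7, 5, 4, 3, 1]
print(exhaustive_subset_sum(values, 9))  # finds (5, 4)
print(greedy_subset_sum(values, 9))      # greedily takes 7 then 1 and fails -> None
```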


Archive | 1989

The Ubiquity of the Sentence

David M. W. Powers; Christopher C. R. Turk

This chapter considers the larger, broader functions of language and grammar. The sentence is both the highest point of grammatical explanation to which traditional grammars aspire, and in its complexity and variety the stumbling block for elaborations based on any theory. The relationships between the major components of one or more sentences, and the possible variations and subtleties admitted by a language at this level, provide the basis for the enrichment of our linguistic theories. We look here at some of the hidden agenda which lies beneath the sentential surface, the expectation and anticipation which lie beneath our pronoun system, including the other understood omissions of anaphora and elision, and the class of humour represented by riddles involving puns and scenarios involving misinformation. We examine whether our grammatical assumptions in areas such as recursion and transformation are adequate, warranted or just simplistic. Our conclusions are that these approaches gloss over certain semantic overtones and cohesive constraints. We end up questioning the validity of even trying to find a complete syntax and semantics which is invariant across both individual and time, and assert the importance of a dynamic model of grammar.


Archive | 1989

Art, Science and Engineering

David M. W. Powers; Christopher C. R. Turk

Our subject is the acquisition of Natural Language (NL) by computers. NL is not, in our view, a surface expression, or epiphenomenon, of a deeper, underlying cognitive process in the human brain. It is rather fundamental to, and pervasive of, cognition itself. For this reason we think that language is not the sole preserve of linguists, but is pivotal in all our interactions with the world, in our science, and in our thought.
