Publication


Featured research published by Derek H. Sleeman.


Cognitive Science | 1984

An attempt to understand students' understanding of basic algebra

Derek H. Sleeman

This paper reports the results obtained with a group of 24 14-year-old students when presented with a set of algebra tasks by the Leeds Modelling System, LMS. The same students were given a comparable paper-and-pencil test and detailed interviews some four months later. The latter studies uncovered several kinds of student misunderstanding that LMS had not detected. Some students had profound misunderstandings of algebraic notation; others used strategies such as substituting numbers for variables until the equation balanced. Additionally, it appears that student errors fall into several distinct classes: namely, manipulative, parsing, clerical, and “random.” LMS and its rule database have been enhanced as a result of this experiment, and LMS is now able to diagnose the majority of the errors encountered. Finally, the paper gives a process-oriented explanation for student errors, and re-examines related work in cognitive modelling in the light of the types of student errors reported here. Misgeneralization is suggested as a mechanism to explain some of the mal-rules noted in this study.
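The diagnostic idea described here, matching a student's answer against predictions from correct rules and mal-rules, can be sketched briefly. This is a minimal illustration, not LMS's actual representation: the equation encoding and the rule names below are invented for the example.

```python
# Sketch of mal-rule diagnosis: apply each candidate rule to a task and
# see which one reproduces the student's answer. Equations are encoded
# as (a, b, c) meaning a*x + b = c; rule names are illustrative only.

def correct_solve(a, b, c):
    # Correct manipulation: subtract b from both sides, then divide by a.
    return (c - b) / a

def malrule_sign(a, b, c):
    # Mal-rule: moves b across without flipping its sign.
    return (c + b) / a

RULES = {"correct": correct_solve, "M.SIGN": malrule_sign}

def diagnose(task, student_answer):
    """Return the names of rules whose prediction matches the student."""
    a, b, c = task
    return [name for name, rule in RULES.items()
            if rule(a, b, c) == student_answer]
```

For 2x + 3 = 9, a student answering 6 matches the sign mal-rule, while an answer of 3 matches the correct rule; adding more rules to the table extends the diagnostic coverage in the same way.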


International Journal of Human-Computer Studies / International Journal of Man-Machine Studies | 1973

Towards more intelligent teaching systems

J.R. Hartley; Derek H. Sleeman

This paper suggests criteria against which the “intelligence” of a teaching machine can be judged. With the electronic computer in mind, distinctions are made between pre-structured, generative, adaptive, and self-improving teaching systems. From the work which has been carried out at Leeds, examples are taken which illustrate the characteristics and intelligence of these systems and the requirements for their implementation.


IEEE Intelligent Systems | 2001

Better knowledge management through knowledge engineering

Alun David Preece; Alan Flett; Derek H. Sleeman; David A. Curry; Nigel Meany; Phil Perry

The authors believe that current knowledge management practice significantly under-utilizes knowledge engineering technology, despite recent efforts to promote its use. They focus on two knowledge engineering processes: using knowledge acquisition processes to capture structured knowledge systematically, and using knowledge representation technology to store the knowledge, preserving important relationships that are far richer than those possible in conventional databases. To demonstrate the usefulness of these processes, they present a case study in which the drilling optimization group of a large oil and gas service company uses knowledge engineering practices to support the three facets of the knowledge management task: knowledge capture, knowledge storage, and knowledge deployment.


International Journal of Human-Computer Studies / International Journal of Man-Machine Studies | 1985

UMFE: a user modelling front-end subsystem

Derek H. Sleeman

The paper argues that user models are an essential component of any system which attempts to be “user friendly”, and that expert systems should tailor explanations to their users, be they super-experts or novices. In particular, this paper discusses a data-driven user modelling front-end subsystem, UMFE, which assumes that the user has asked a question of the main system (e.g. an expert system or intelligent tutoring system), and that the system provides a response which is passed to UMFE. UMFE determines the user's level of sophistication by asking as few questions as possible, and then presents a response in terms of concepts which UMFE believes the user understands. Investigator-defined inference rules are then used to suggest additional concepts the user may or may not know, given the concepts the user indicated he or she knew in earlier questioning. Several techniques are discussed for detecting and removing inconsistencies in the user model. Additionally, UMFE modifies its inference rules for individual users when it detects certain types of inconsistencies. UMFE is a portable, domain-independent implementation of a system which infers overlay models for users. UMFE has been used in conjunction with NEOMYCIN, and the paper contains several protocols which demonstrate its principal features. The paper concludes with a critique of UMFE and suggestions for enhancing the current system.
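The overlay-model inference described here, asking few questions and letting inference rules suggest further concepts, can be sketched as a closure computation. The concept names and rules below are invented for illustration and are not UMFE's actual rule base.

```python
# Sketch of overlay-model inference: from a few concepts the user
# confirms knowing, investigator-defined rules suggest further known
# concepts, so fewer questions need to be asked.

INFERENCE_RULES = [
    # (premise, inferred): knowing the premise suggests the user
    # also knows the inferred concept. Illustrative rules only.
    ("differential diagnosis", "diagnosis"),
    ("diagnosis", "symptom"),
]

def infer_known(confirmed):
    """Close the set of known concepts under the inference rules."""
    known = set(confirmed)
    changed = True
    while changed:
        changed = False
        for premise, inferred in INFERENCE_RULES:
            if premise in known and inferred not in known:
                known.add(inferred)
                changed = True
    return known
```

A user who confirms knowing "differential diagnosis" is credited with "diagnosis" and "symptom" without being asked; detecting a contradiction (the user denies an inferred concept) is what would trigger the rule modification the abstract mentions.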


International Conference on Innovative Techniques and Applications of Artificial Intelligence | 2004

OntoSearch: An Ontology Search Engine

Yi Zhang; Wamberto Weber Vasconcelos; Derek H. Sleeman

Reuse of knowledge bases and the semantic web are two promising areas in knowledge technologies. Given some user requirements, finding suitable ontologies is an important task in both areas. This paper discusses our work on OntoSearch, a kind of “ontology Google”, which can help users find ontologies on the Internet. OntoSearch combines Google Web APIs with a hierarchy visualization technique. It allows the user to perform keyword searches on certain types of “ontology” files, and to visually inspect the files to check their relevance. The OntoSearch system is based on Java, JSP, Jena and JBoss technologies.
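The filtering step described here, keyword search restricted to ontology-type files, can be sketched briefly. The real system delegates web search to the Google Web APIs; the file extensions, URLs, and snippet texts below are illustrative assumptions.

```python
# Sketch of keyword filtering over search hits: keep only results that
# look like ontology files and whose text mentions the keyword.

ONTOLOGY_EXTENSIONS = (".rdfs", ".owl", ".daml")

def filter_candidates(results, keyword):
    """From (url, snippet) pairs, keep ontology-file hits matching keyword."""
    keyword = keyword.lower()
    return [url for url, text in results
            if url.lower().endswith(ONTOLOGY_EXTENSIONS)
            and keyword in text.lower()]

hits = [
    ("http://example.org/wine.owl", "An ontology of wines and regions"),
    ("http://example.org/wine.html", "A page about wine ontology tools"),
]
```

Here filter_candidates(hits, "wine") keeps only the .owl file; the hierarchy visualization then lets the user judge the surviving candidates by inspection.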


Artificial Intelligence | 1981

Modelling student's problem solving

Derek H. Sleeman; M. J. Smith

The task of modelling a student's problem solving is intrinsically one of induction. In this paper, a further instance of induction formulated as a search is reported. A formulation of this search problem which ‘contains’ the combinatorics is discussed at some length, and several heuristics which further reduce the size of the space defined by the domain's rules and mal-rules are presented. The production-rule representation of (some) arithmetic operators is given, and the behaviour of generated models on examples is included. The paper concludes with a review of the basic assumptions made by the modelling system, LMS, and suggestions for some extensions.
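The search formulation described here, enumerating models built from rules and mal-rules and keeping those consistent with the student's answers, can be sketched briefly. The two-step solving procedure and the rule variants below are invented for illustration, not LMS's actual production rules.

```python
# Sketch of student modelling as search: each candidate model picks,
# per solving step, either the correct rule or a mal-rule, and a model
# survives only if it reproduces every observed answer.

from itertools import product

# Step 1 of solving a*x + b = c: move b across
# (the correct rule flips its sign; the mal-rule does not).
MOVE = {"move-ok": lambda b, c: c - b, "move-sign": lambda b, c: c + b}
# Step 2: divide by a (the mal-rule swaps the operands).
DIV = {"div-ok": lambda a, r: r / a, "div-swap": lambda a, r: a / r}

def consistent_models(observations):
    """All (move, div) rule pairs reproducing every (a, b, c, answer)."""
    models = []
    for m, d in product(MOVE, DIV):
        if all(DIV[d](a, MOVE[m](b, c)) == ans
               for a, b, c, ans in observations):
            models.append((m, d))
    return models
```

With two variants per step the space has 2 x 2 models; with realistic rule sets the space explodes, which is why the paper's heuristics for pruning it matter.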


Journal of Educational Computing Research | 1986

Pascal and High School Students: A Study of Errors

Derek H. Sleeman; Ralph T. Putnam; Juliet Baxter; Laiani Kuspa

A screening test was given to three classes of high school students, who were just completing introductory semester-long courses in Pascal. These tests were graded, and subsequently thirty-five students were given detailed clinical interviews. These interviews showed that errors were made with essentially every Pascal construct. Over half the students were classified as having major difficulties—fewer than 10 percent had no difficulties. The errors noted are discussed in detail in this article. A major finding is that the students attribute to the computer the reasoning power of an average person. The article also speculates about how difficult it might be to remediate the errors found, and concludes with an outline of future work.


Artificial Intelligence | 1997

Scientific discovery and simplicity of method

Herbert A. Simon; Raúl E. Valdés-Pérez; Derek H. Sleeman



Proceedings of SPIE, the International Society for Optical Engineering | 2008

Matching sensors to missions using a knowledge-based approach

Alun David Preece; Mario Gómez; Geeth de Mel; Wamberto Weber Vasconcelos; Derek H. Sleeman; Stuart Colley; Gavin Pearson; Tien Pham; Thomas F. La Porta

Making decisions on how best to utilise limited intelligence, surveillance and reconnaissance (ISR) resources is a key issue in mission planning. This requires judgements about which kinds of available sensors are more or less appropriate for specific ISR tasks in a mission. A methodological approach to addressing this kind of decision problem in the military context is the Missions and Means Framework (MMF), which provides a structured way to analyse a mission in terms of tasks, and assess the effectiveness of various means for accomplishing those tasks. Moreover, the problem can be defined as knowledge-based matchmaking: matching the ISR requirements of tasks to the ISR-providing capabilities of available sensors. In this paper we show how the MMF can be represented formally as an ontology (that is, a specification of a conceptualisation); we also represent knowledge about ISR requirements and sensors, and then use automated reasoning to solve the matchmaking problem. We adopt the Semantic Web approach and the Web Ontology Language (OWL), allowing us to import elements of existing sensor knowledge bases. Our core ontologies use the description logic subset of OWL, providing efficient reasoning. We describe a prototype tool as a proof-of-concept for our approach. We discuss the various kinds of possible sensor-mission matches, both exact and inexact, and how the tool helps mission planners consider alternative choices of sensors.
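The exact-versus-inexact matchmaking described here can be sketched with plain set operations. The paper does this with OWL and description-logic reasoning; the sensor names and capability labels below are illustrative assumptions.

```python
# Sketch of sensor-mission matchmaking: a sensor exactly matches a task
# if it provides every required capability, and partially (inexactly)
# matches if it covers only some of them.

SENSORS = {
    "acoustic-array": {"acoustic"},
    "uav-camera": {"imagery", "motion-detection"},
}

def match(task_requirements):
    """Classify each sensor as an exact or partial match for the task."""
    exact, partial = [], []
    for name, caps in SENSORS.items():
        if task_requirements <= caps:       # subset: all needs covered
            exact.append(name)
        elif task_requirements & caps:      # overlap: some needs covered
            partial.append(name)
    return exact, partial
```

A task needing only imagery matches the camera exactly; a task needing both imagery and acoustics gets only partial matches, which is where a planner would consider combining sensors. Subsumption in the OWL setting generalizes this subset test to hierarchies of capability classes.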


Machine Learning | 1990

Extending Domain Theories: Two Case Studies in Student Modeling

Derek H. Sleeman; Haym Hirsh; Ian Ellery; In-Yung Kim

By its very nature, artificial intelligence is concerned with investigating topics that are ill-defined and ill-understood. This paper describes two approaches to expanding a good but incomplete theory of a domain. The first uses the domain theory as far as possible and fills in specific gaps in the reasoning process, generalizing the suggested missing steps and adding them to the domain theory. The second takes existing operators of the domain theory and applies perturbations to form new plausible operators for the theory. The specific domain to which these techniques have been applied is high-school algebra problems. The domain theory is represented as operators corresponding to algebraic manipulations, and the problem of expanding the domain theory becomes one of discovering new algebraic operators. The general framework used is one of generate and test—generating new operators for the domain and using tests to filter out unreasonable ones. The paper compares two algorithms, INFER and MALGEN, examining their performance on actual data collected in two Scottish schools and concluding with a critical discussion of the two methods.
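The second approach described above, perturbing existing operators to form new plausible ones and filtering them with data, can be sketched briefly. The operator and the perturbations below are invented for illustration and are not MALGEN's actual transformations.

```python
# Sketch of a MALGEN-style generate-and-test: take a correct operator,
# apply systematic perturbations, and keep only perturbed operators
# that explain observed (input, output) pairs.

def subtract(c, b):           # the correct "move across" operator
    return c - b

PERTURBATIONS = {
    "swap-args": lambda f: (lambda c, b: f(b, c)),
    "flip-sign": lambda f: (lambda c, b: -f(c, b)),
    "drop-second": lambda f: (lambda c, b: f(c, 0)),
}

def plausible_operators(observed):
    """Perturbation names consistent with every (c, b, result) triple."""
    return [name for name, perturb in PERTURBATIONS.items()
            if all(perturb(subtract)(c, b) == r for c, b, r in observed)]
```

The generate step is cheap; the test step against student data does the real filtering, which mirrors the paper's point that unconstrained perturbation produces many implausible operators needing empirical pruning.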

Collaboration


Derek H. Sleeman's top co-authors.

Susan Craw

Robert Gordon University


Suraj Ajit

University of Aberdeen
