
Publication


Featured research published by Malcolm Bauer.


International Journal of Testing | 2004

Introduction to Evidence Centered Design and Lessons Learned From Its Application in a Global E-Learning Program

John T. Behrens; Robert J. Mislevy; Malcolm Bauer; David M. Williamson; Roy Levy

This article introduces the assessment and deployment contexts of the Networking Performance Skill System (NetPASS) project and the articles in this section that report on findings from this endeavor. First, the educational context of the Cisco Networking Academy Program is described. Second, the basic outline of Evidence Centered Design is presented. The third section describes the intersection of these two activities in the NetPASS project and introduces the subsequent articles.


International Journal of Testing | 2004

Design Rationale for a Complex Performance Assessment

David M. Williamson; Malcolm Bauer; Linda S. Steinberg; Robert J. Mislevy; John T. Behrens; Sarah F. Demark

In computer-based interactive environments meant to support learning, students must bring a wide range of relevant knowledge, skills, and abilities to bear jointly as they solve meaningful problems in a learning domain. To function effectively as an assessment, a computer system must additionally be able to evoke and interpret observable evidence about targeted knowledge in a manner that is principled, defensible, and suited to the purpose at hand (e.g., licensure, achievement testing, coached practice). This article describes the foundations for the design of an interactive computer-based assessment of design, implementation, and troubleshooting in the domain of computer networking. The application is a prototype for assessing these skills as part of an instructional program, as interim practice tests and as chapter or end-of-course assessments. An Evidence Centered Design (ECD) framework was used to guide the work. An important part of this work is a cognitive task analysis designed (a) to tap the knowledge computer network specialists and students use when they design and troubleshoot networks and (b) to elicit behaviors that manifest this knowledge. After summarizing its results, we discuss implications of this analysis, as well as information gathered through other methods of domain analysis, for designing psychometric models, automated scoring algorithms, and task frameworks and for the capabilities required for the delivery of this example of a complex computer-based interactive assessment.


Archive | 2015

An Application of Exploratory Data Analysis in the Development of Game-Based Assessments

Kristen E. DiCerbo; Maria Bertling; Shonté Stephenson; Yue Jia; Robert J. Mislevy; Malcolm Bauer; G. Tanner Jackson

While the richness of data from games holds promise for making inferences about players’ knowledge, skills, and attributes (KSAs), standard methods for scoring and analysis do not exist. A key to serious game analytics that measure player KSAs is the identification of player actions that can serve as evidence in scoring models. While game-based assessments may be designed with hypotheses about this evidence, the open nature of game play requires exploration of records of player actions to understand the data obtained and to generate new hypotheses. This chapter demonstrates the use of the 4R’s of Exploratory Data Analysis (EDA): revelation, resistance, re-expression, and residuals to gain close familiarity with data, avoid being fooled, and uncover unexpected patterns. The interactive and iterative nature of EDA allows for the generation of hypotheses about the processes that generated the observed data. Through this framework, possible evidence pieces emerge and the chapter concludes with an explanation of how these can be combined in a measurement model using Bayesian Networks.
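The abstract above describes combining identified evidence pieces in a measurement model using Bayesian Networks. As a minimal illustrative sketch (not from the chapter itself, and with hypothetical numbers), the simplest such model is a single latent skill node with conditionally independent binary evidence nodes, where each observed player action updates the belief about the skill via Bayes' rule:

```python
# Illustrative sketch: a naive Bayesian network fragment that combines
# binary in-game evidence about one latent two-state skill.

def posterior_skill(prior, likelihoods, observations):
    """Return P(skill = mastered | observations).

    prior:        P(skill = mastered) before seeing any evidence
    likelihoods:  list of (P(obs=1 | mastered), P(obs=1 | not mastered)),
                  one pair per evidence node
    observations: list of 0/1 observed action outcomes, same order
    """
    p1 = prior          # unnormalized joint for skill = mastered
    p0 = 1.0 - prior    # unnormalized joint for skill = not mastered
    for (l1, l0), obs in zip(likelihoods, observations):
        p1 *= l1 if obs else (1.0 - l1)
        p0 *= l0 if obs else (1.0 - l0)
    return p1 / (p1 + p0)

# Two successful actions, each more likely for players who have the skill:
belief = posterior_skill(
    prior=0.5,
    likelihoods=[(0.8, 0.3), (0.7, 0.4)],
    observations=[1, 1],
)
```

Each evidence piece surfaced by exploratory data analysis would correspond to one likelihood pair; real game-based assessment models typically use richer network structures than this conditionally independent fragment.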


Interactive Technology and Smart Education | 2009

Combining Learning and Assessment in Assessment-Based Gaming Environments: A Case Study from a New York City School

Diego Zapata-Rivera; Waverely VanWinkle; Bryan Doyle; Alyssa Buteux; Malcolm Bauer

Purpose – The purpose of this paper is to propose and demonstrate an evidence‐based scenario design framework for assessment‐based computer games.

Design/methodology/approach – The evidence‐based scenario design framework is presented and demonstrated by using BELLA, a new assessment‐based gaming environment aimed at supporting student learning of vocabulary and math. BELLA integrates assessment and learning into an interactive gaming system that includes written conversations, math activities, oral and written feedback in both English and Spanish, and a visible psychometric model that is used to adaptively select activities as well as feedback levels. This paper also reports on a usability study carried out in a public middle school in New York City.

Findings – The evidence‐based scenario design framework proves to be instrumental in helping combine game and assessment requirements. BELLA demonstrates how advances in artificial intelligence in education, cognitive science, educational measurement, and vid...


Artificial Intelligence in Education | 2017

New Directions in Formative Feedback in Interactive Learning Environments

Ilya M. Goldin; Susanne Narciss; Peter W. Foltz; Malcolm Bauer

Formative feedback is well known as a key factor in influencing learning. Modern interactive learning environments provide a broad range of ways to provide feedback to students as well as new tools to understand feedback and its relation to various learning outcomes. This issue focuses on the role of formative feedback through a lens of how technologies both support student learning and enhance our understanding of the mechanisms of feedback. The papers in the issue span a variety of feedback strategies, instructional domains, AI techniques, and educational use cases in order to improve and understand formative feedback in interactive learning environments. The issue encompasses three primary themes critical to understanding formative feedback: 1) the role of human information processing and individual learner characteristics for feedback efficiency, 2) how to deliver meaningful feedback to learners in domains of study where student work is difficult to assess, and 3) how human feedback sources (e.g., peer students) can be supported by user interfaces and technology-generated feedback.


Machine Learning | 1989

Conceptual Clustering, Categorization, and Polymorphy

Stephen José Hanson; Malcolm Bauer


Artificial Intelligence in Education | 2007

Evidence-based Approach to Interacting with Open Student Models

Diego Zapata-Rivera; Eric G. Hansen; Valerie J. Shute; Jody S. Underwood; Malcolm Bauer


Artificial Intelligence in Education | 2007

English ABLE

Diego Zapata-Rivera; Waverely VanWinkle; Valerie J. Shute; Jody S. Underwood; Malcolm Bauer


ETS Research Report Series | 2008

Monitoring and Fostering Learning Through Games and Embedded Assessments

Valerie J. Shute; Matthew Ventura; Malcolm Bauer; Diego Zapata-Rivera


E-Learn: World Conference on E-Learning in Corporate, Government, Healthcare, and Higher Education | 2003

Using Evidence-Centered Design to Develop Advanced Simulation-Based Assessment and Training

Malcolm Bauer; David M. Williamson; Robert J. Mislevy; John T. Behrens
