Publication


Featured research published by Kenneth A. Yates.


Journal of Surgical Education | 2011

The Effectiveness of a Cognitive Task Analysis Informed Curriculum to Increase Self-Efficacy and Improve Performance for an Open Cricothyrotomy

Julia Campbell; Leslie Tirapelle; Kenneth A. Yates; Richard E. Clark; Kenji Inaba; Donald J. Green; David Plurad; Lydia Lam; Andrew Tang; Roman Cestero; Maura E. Sullivan

OBJECTIVE: This study explored the effects of a cognitive task analysis (CTA)-informed curriculum to increase surgical skills performance and self-efficacy beliefs for medical students and postgraduate surgical residents learning how to perform an open cricothyrotomy.

METHODS: Third-year medical students and postgraduate year 2 and 3 surgery residents were assigned randomly to either the CTA group (n = 12) or the control group (n = 14). The CTA group learned the open cricothyrotomy procedure using the CTA curriculum. The control group received the traditional curriculum.

RESULTS: The CTA group significantly outperformed the control group based on a 19-point checklist score (CTA mean score: 17.75, standard deviation [SD] = 2.34; control mean score: 15.14, SD = 2.48; p = 0.006). The CTA group also reported significantly higher self-efficacy scores based on a 140-point self-appraisal inventory (CTA mean score: 126.10, SD = 16.90; control: 110.67, SD = 16.8; p = 0.029).

CONCLUSIONS: The CTA curriculum was effective in increasing the performance and self-efficacy scores for postgraduate surgical residents and medical students performing an open cricothyrotomy.
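As a rough illustration of the group comparison reported above, the summary statistics from the abstract can be plugged into a two-sample t-test. This is only a sketch, not the study's actual analysis: the pooled-variance test and the two-sided alternative are assumptions, so the resulting p-value need not match the published one exactly.

```python
# Sketch: reproduce the style of comparison reported in the abstract
# (CTA vs. control on the 19-point checklist) from summary statistics.
# Group sizes are taken from the abstract (n = 12 CTA, n = 14 control);
# the pooled-variance, two-sided test is an assumption of this sketch.
from scipy import stats

t, p = stats.ttest_ind_from_stats(
    mean1=17.75, std1=2.34, nobs1=12,   # CTA group checklist score
    mean2=15.14, std2=2.48, nobs2=14,   # control group checklist score
    equal_var=True,                     # pooled-variance (Student's) t-test
)
print(f"t = {t:.2f}, two-tailed p = {p:.3f}")
```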


Academic Medicine | 2014

The use of cognitive task analysis to reveal the instructional limitations of experts in the teaching of procedural skills.

Maura E. Sullivan; Kenneth A. Yates; Kenji Inaba; Lydia Lam; Richard E. Clark

Purpose: Because of the automated nature of knowledge, experts tend to omit information when describing a task. A potential solution is cognitive task analysis (CTA). The authors investigated the percentage of knowledge experts omitted when teaching a cricothyrotomy to determine the percentage of additional knowledge gained during a CTA interview.

Method: Three experts were videotaped teaching a cricothyrotomy in 2010 at the University of Southern California. After transcription, they participated in CTA interviews for the same procedure. Three additional surgeons were recruited to perform a CTA for the procedure, and a “gold standard” task list was created. Transcriptions from the teaching sessions were compared with the task list to identify omitted steps (both “what” and “how” to do). Transcripts from the CTA interviews were compared against the task list to determine the percentage of knowledge articulated by each expert during the initial “free recall” (unprompted) phase of the CTA interview versus the amount of knowledge gained by using CTA elicitation techniques (prompted).

Results: Experts omitted an average of 71% (10/14) of clinical knowledge steps, 51% (14/27) of action steps, and 73% (3.6/5) of decision steps. For action steps, experts described “how to do it” only 13% (3.6/27) of the time. The average number of steps that were described increased from 44% (20/46) when unprompted to 66% (31/46) when prompted.

Conclusions: This study supports previous research that experts unintentionally omit knowledge when describing a procedure. CTA is a useful method to extract automated knowledge and augment expert knowledge recall during teaching.
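The coverage percentages above come from comparing what each expert articulated against the gold-standard task list. A minimal sketch of that calculation follows; the step names are hypothetical placeholders, not the study's actual task list.

```python
# Hedged sketch of the coverage calculation: the fraction of gold-standard
# task-list steps an expert articulated, unprompted vs. after CTA probes.
# All step names below are invented for illustration.
gold_standard = {"identify cricothyroid membrane", "stabilize larynx",
                 "incise skin vertically", "incise membrane horizontally",
                 "insert tracheal hook", "dilate opening", "insert tube"}

unprompted   = {"identify cricothyroid membrane", "incise skin vertically",
                "insert tube"}
after_probes = unprompted | {"stabilize larynx", "incise membrane horizontally"}

def coverage(described, gold):
    """Percentage of gold-standard steps that were described."""
    return 100 * len(described & gold) / len(gold)

print(f"unprompted: {coverage(unprompted, gold_standard):.0f}% of steps")
print(f"prompted:   {coverage(after_probes, gold_standard):.0f}% of steps")
```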


IEEE Aerospace Conference | 2011

Developing INOTS to support interpersonal skills practice

Julia Campbell; Mark G. Core; Ron Artstein; Lindsay Armstrong; Arno Hartholt; Cyrus A. Wilson; Kallirroi Georgila; Fabrizio Morbini; Edward Haynes; Dave Gomboc; Mike Birch; Jonathan Bobrow; H. Chad Lane; Jillian Gerten; Anton Leuski; David R. Traum; Matthew Trimmer; Rich DiNinni; Matthew Bosack; Timothy Jones; Richard E. Clark; Kenneth A. Yates

The Immersive Naval Officer Training System (INOTS) is a blended learning environment that merges traditional classroom instruction with a mixed reality training setting. INOTS supports the instruction, practice and assessment of interpersonal communication skills. The goal of INOTS is to provide a consistent training experience to supplement interpersonal skills instruction for Naval officer candidates without sacrificing trainee throughput and instructor control. We developed an instructional design from cognitive task analysis interviews with experts to serve as a framework for system development. We also leveraged commercial student response technology and research technologies including natural language recognition, virtual humans, realistic graphics, intelligent tutoring and automated instructor support tools. In this paper, we describe our methodologies for developing a blended learning environment, and the challenges of adding mixed reality and virtual human technologies to a traditional classroom to support interpersonal skills training.


Theoretical Issues in Ergonomics Science | 2011

Advancing the practice of cognitive task analysis: a call for taxonomic research

Kenneth A. Yates; David F. Feldon

Cognitive task analysis (CTA) captures the unobservable cognitive processes, decisions and judgments underlying expert performance. Over 100 different CTA methods are identified in prior literature. However, existing classifications typically sort techniques by process rather than outcome, application or causal mechanism. Therefore, techniques can be misapplied and comparative analysis of methods is made difficult. Based on the frequency distribution of CTA methods in 1065 studies, a subsample representing 60% of the most frequently published methods was coded according to elicitation and analysis techniques. Consistency of the resulting applications was assessed. Inconsistent matching of CTA methods and subsequent applications indicates that CTA is currently more craft than technology. Therefore, there is no robust basis for selecting one method over another for research or practice. Specific challenges include comparing the reliability and validity of individual methods and optimising the selection of techniques for intended applications. Developing a causal taxonomy may facilitate such advancements.


Archive | 2010

An Analysis of the Failure of Electronic Media and Discovery‐Based Learning

Richard E. Clark; Kenneth A. Yates; Sean Early; Kathrine Moulton

The project or effort described here has been partially sponsored by the U.S. Army Research, Development, and Engineering Command (RDECOM). Statements and opinions expressed do not necessarily reflect the position or the policy of the United States Government, and no official endorsement should be inferred. The authors wish to acknowledge that some sections of this chapter have been drawn from previously published manuscripts or technical reports and were used by permission, and we wish to acknowledge our debt to Dr. David Feldon for some of the material in the discussion of adaptable learning.


EJISDC: The Electronic Journal on Information Systems in Developing Countries | 2010

Establishing a Sustainable Legal Information System in a Developing Country: A Practical Guide

Kenneth A. Yates; Charles E. Shapiro

In this paper, a practical systems analysis approach is described for the planning, development and implementation of the information technology required for a sustainable legal information system in a developing country. The considerations involved in creating, compiling and distributing the country's governing laws in electronic form are described. Alternative database search and retrieval options are discussed, as well as issues relating to distribution of the database online, on local media, or on both. Based on a reasonable set of assumptions and general requirements for a developing country, a model legal information system is then presented. By using the approach suggested in this paper, a developing country can fully evaluate the cost-benefit tradeoffs, as well as all other tradeoffs, in determining the most appropriate information technology to use for the creation, compilation, and distribution of its laws in electronic form.
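One concrete way to realize the kind of search-and-retrieval option the paper weighs, sketched purely as an illustration: a single-file SQLite database with full-text search can be hosted online or copied onto local media unchanged. The table name, fields, and sample documents below are assumptions of this sketch, not the paper's design, and FTS5 must be enabled in the local SQLite build (it usually is in Python's bundled sqlite3).

```python
# Hedged sketch: full-text search over a small corpus of laws using SQLite
# FTS5. A single database file suits both online hosting and distribution
# on local media. All names and sample texts here are illustrative.
import sqlite3

conn = sqlite3.connect("laws.db")
conn.execute("CREATE VIRTUAL TABLE IF NOT EXISTS laws USING fts5(title, body)")
conn.executemany(
    "INSERT INTO laws (title, body) VALUES (?, ?)",
    [
        ("Law on Land Registration", "Procedures for registering title to land ..."),
        ("Civil Code, Chapter 12",   "Contracts, obligations, and remedies ..."),
    ],
)
conn.commit()

# Rank matching laws by relevance for a simple keyword query.
for (title,) in conn.execute(
    "SELECT title FROM laws WHERE laws MATCH ? ORDER BY rank",
    ("land AND registration",),
):
    print(title)
```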


Computers in Human Behavior | 2007

Increasing validity in the evaluation of new distance learning technologies

David F. Feldon; Kenneth A. Yates

In the development of distance learning, advances in cognitive science merge with new technology to deliver instruction worldwide. However, one major difficulty in evaluating the efficacy of these tools is determining which elements of instruction truly lead to observed changes in student performance. The purpose of this paper is to briefly review the current use of various research methods for evaluating instructional technologies, to discuss previous solutions to balancing the conflicting demands of internal and external validity, and then to propose a new research design that achieves this balance in a manner compatible with many instructional technology applications. The design, called a Strand of Pearls design, leverages the practice of delivering instruction in sequential modules to generate robust findings for which claims of internal validity, external validity, and maximal generalizability can be made.


EJISDC: The Electronic Journal on Information Systems in Developing Countries | 2011

Establishing a Sustainable Legal Information System in Azerbaijan: A Case Study

Charles E. Shapiro; Kenneth A. Yates

In this case study we describe the application of the principles set forth in the Practical Guide (Yates & Shapiro, 2010) to establish a sustainable legal information system in Azerbaijan. The Azerbaijan system was developed and implemented from 2004–2006 pursuant to a USAID-funded project. The initial goal of the project was to assist the Azerbaijan Ministry of Justice in creating and maintaining a sustainable legal information system, enabling public access to the country's governing laws on a current, complete, and accurate basis 24 hours a day, 7 days a week via the Internet and on CD-ROM. The various “on the ground” factors that contributed to the design of the database containing Azerbaijan's laws, and those that resulted in deviations from the original plan, are discussed in detail, as are recommendations based on lessons learned during the project. Using human performance research as a framework, we conclude with a discussion of the key individual and team performance issues that must be addressed to successfully sustain a legal information system in a developing country.


Archive | 2009

An Analysis of the Failure of Electronic Media and Discovery-Based Learning: Evidence for the Performance Benefits of Guided Training Methods

Herbert H. Bell; Dee H. Andrews; Wallace H. Wulfeck; Steven W. Villachica; Deborah L. Stone; Richard E. Clark; Kenneth A. Yates; Sean Early; Kathrine Moulton; Richard E. Mayer; Ruth Colvin Clark


Journal of Surgical Research | 2012

The Use of Cognitive Task Analysis to Improve Instructional Descriptions of Procedures

Richard E. Clark; Carla M. Pugh; Kenneth A. Yates; Kenji Inaba; Donald J. Green; Maura E. Sullivan

Collaboration


Top co-authors of Kenneth A. Yates and their affiliations.

Richard E. Clark | University of Southern California
Maura E. Sullivan | University of Southern California
Kathrine Moulton | University of Southern California
Kenji Inaba | University of Southern California
Sean Early | University of Southern California
David F. Feldon | University of South Carolina
Julia Campbell | University of Southern California
Lydia Lam | University of Southern California