Publication


Featured research published by Ivonne J. Figueroa.


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2011

Reactive Task-Set Switching Ability, Not Working Memory Capacity, Predicts Change Blindness Sensitivity

Robert J. Youmans; Ivonne J. Figueroa; Olga Kramarova

Individual differences in working memory capacity (WMC) have been shown to predict how well people perform tasks that require directed attention, but the individual differences responsible for task-set switching and noticing behaviors are less well understood. In this study, 86 undergraduate students from California State University, Northridge completed a measure of WMC, a measure of cognitive flexibility, and attempted to identify disappearing objects in change-blindness slides. WMC was not related to our measure of cognitive flexibility or change detection, but cognitive flexibility was directly correlated with the ability to notice change. The findings suggest that the ability to notice sudden changes in an environment, an ability that is of paramount importance for the safe operation of complex machinery and systems, may be supported by individual differences that are independent of WMC.


International Conference on Engineering Psychology and Cognitive Ergonomics | 2013

A New Behavioral Measure of Cognitive Flexibility

Christian A. Gonzalez; Ivonne J. Figueroa; Brooke G. Bellows; Dustin Rhodes; Robert J. Youmans

Individual differences in cognitive flexibility may underlie a variety of different user behaviors, but a lack of effective measurement tools has limited the predictive and descriptive potential of cognitive flexibility in human-computer interaction applications. This study presents a new computerized measure of cognitive flexibility, and then provides evidence for convergent validity. Our findings indicate moderate to strong correlations with the Trail Making Task, and in particular, those aspects of the task most closely associated with cognitive flexibility. Results of this study provide support for the validity of a new measure of cognitive flexibility. We conclude by discussing the measure's potential applicability in the field of HCI.


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2011

Developing an Easy-to-Administer, Objective, and Valid Assessment of Cognitive Flexibility

Ivonne J. Figueroa; Robert J. Youmans

Cognitive flexibility, the ability to abandon an active cognitive strategy in favor of another, may predict successful performance in tasks that require divided attention, but measuring cognitive flexibility is challenging. Here, two studies assessed cognitive flexibility using Grant and Berg’s (1948) Wisconsin Card Sorting Task (WCST) and an easy-to-administer puzzle task under development. Three variables of flexibility in the WCST were hypothesized to predict puzzle performance. In Study 1, undergraduate students (n = 88) from California State University, Northridge completed both the WCST and the puzzle. Results indicated that only the variable ‘trials to complete first category’ reliably correlated with puzzle performance; therefore, a revised puzzle was created. In Study 2, undergraduate students (n = 40) from the same university repeated the experiment, resulting in a stronger relationship between ‘trials to complete first category’ and the modified puzzle. The results suggest that cognitive flexibility can be measured using puzzles that require frequent strategy shifts like those reported here.


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2013

Failure to Maintain Set: A Measure of Distractibility or Cognitive Flexibility?

Ivonne J. Figueroa; Robert J. Youmans

The Wisconsin Card Sorting Test (WCST) is a general behavioral assessment that contains a myriad of dependent variables, each contributing to the overall assessment of executive function. In this paper, the authors explore the underlying ability that is measured by the variable failure to maintain set (FMS). Two opposing theories, cognitive flexibility and distractibility, are presented to determine what cognitive processes underlie failures to maintain set, and two analyses of archival data are presented. In analysis one, we analyzed data from a study where the WCST predicted creativity in participant constructions of Haiku poetry, but the analysis was not able to distinguish whether FMS was predicting cognitive flexibility or distractibility. In analysis two, we analyzed data from a separate study where the WCST was used to predict vigilance in a divided attention task, and we detected that FMS inversely predicted vigilant performance. Our overall analysis suggests that FMS is an assessment of distractibility, not cognitive flexibility. We end with a discussion of the implications of our findings, and directions for future research.


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2012

Individual differences in cognitive flexibility predict performance in vigilance tasks

Ivonne J. Figueroa; Robert J. Youmans

‘Real-world’ vigilance tasks are difficult to perform because they require sustained and divided attention. The present study investigated whether individual differences in a person’s cognitive flexibility, the ability to abandon one cognitive strategy in favor of another, can predict performance on a vigilance task. Sixty-one undergraduate students from California State University, Northridge participated in this study. The Wisconsin Card Sorting Task was used to measure participants’ level of cognitive flexibility. Vigilance was examined using a multi-screened Clock Task; participants performed either a nine-minute Static or Dynamic version of the task. Two variables of cognitive flexibility were found to predict signal detection. Cognitive flexibility may eventually become a useful individual difference measure that can help provide insight for vigilance training strategies.


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2014

Cognitive Flexibility and Sustained Attention: See something, say something (even when it's not there)

Ivonne J. Figueroa; Robert J. Youmans; Tyler H. Shaw

Researchers investigating the relationship between individual differences and performance on sustained attention tasks have not clearly identified traits and abilities that predict vigilant performance. Yet this important research is applicable to tasks like driving, TSA monitoring, Air Traffic Control, and even to the Department of Homeland Security’s civilian campaign, “See something, say something.” In this paper, we take an individual differences approach to uncover the relationship between cognitive flexibility and sustained attention. Twenty-nine undergraduate students from George Mason University participated in this study for course credit. The Youmans Cognitive Flexibility Puzzle (Gonzalez, Figueroa, Bellows, Rhodes, & Youmans, 2013) was used to assess cognitive flexibility, and a modified version of the Air Traffic Controller (ATC) task (Hitchcock, Warm, Matthews, Dember, Shear, Tripp et al., 2003) measured sustained attention. Mixed ANOVAs were used to analyze performance on the ATC task (hits, false alarms, reaction times). Highly flexible individuals were faster to respond despite missing signals and committing errors. Implications are discussed.


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2013

The Effects of Task-Set Switching On Concurrent Verbal Protocol

Robert J. Youmans; Christian A. Gonzalez; Ivonne J. Figueroa; Brooke G. Bellows

Concurrent verbal protocol (CVP) is a common usability testing and analysis technique that requires people to continuously vocalize their thoughts as they complete a task. Given the widespread use of concurrent verbal protocols in applied domains, it is surprising how little is known regarding concurrent verbal protocol’s effect on task performance. In the current series of studies, we examined how concurrent verbal protocols affected performance on two tasks that required users to frequently switch between cognitive strategies. Data revealed that CVP slowed down participants in comparison with participants who completed tasks in silence. The number of strategy changes that were required to complete a task did not affect this performance decrement. We conclude by discussing the limitations of the experiments reported here, and with practical advice for usability experts who use CVP in their own work.


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2012

Creating a Computerized Assessment of Cognitive Flexibility with a User-Friendly Participant and Experimenter Interface

Christian A. Gonzalez; Stephanie M. Pratt; William Benson; Ivonne J. Figueroa; Dustin Rhodes; Robert J. Youmans

Researchers are often faced with practical hurdles to data collection stemming from poorly designed research tools. In this set of studies, we utilized an iterative design process to develop a new assessment of individual differences in cognitive flexibility. The development cycle began with paper prototypes of the cognitive flexibility assessment and ended with a computerized prototype research tool. Here we outline our development process, report results from user testing, and demonstrate how human factors methodology, often used in product design, can also be successfully utilized to test and improve the user friendliness of psychology assessment tools.


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2017

Determining Practice Effects on a Cognitive Flexibility Assessment

Jasmine S. Dang; Ivonne J. Figueroa; William S. Helton

The Youmans Cognitive Flexibility Assessment (YCFA) was created as an easy-to-administer and entertaining assessment of cognitive flexibility. Although a validation and reliability study of the YCFA was conducted, practice effects were not assessed. The present study aims to uncover whether practice effects occur for the YCFA. Thirty-six undergraduate university students completed four rounds of the YCFA (six trials per round). Practice effects occurred in the first and second rounds, but task performance stabilized after the third round. Implications are discussed.


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2016

The Usability of Academic Advising Forms

Yasaman Diederiks; Ivonne J. Figueroa

Universities provide students with a myriad of resources intended to help them understand their degree progress, including the opportunity to attend academic advising. Yet students often forget the information that they receive soon after their appointment, even when provided with an advising form. In the present study, George Mason University’s Department of Psychology contracted the authors to uncover why students have difficulty understanding the current advising form and to identify the top issues. Following pilot testing, a larger usability test was developed that tested students’ understanding of the seven most common questions that a typical advising meeting addresses. Twenty-five students completed a series of usability tasks and pre- and post-surveys in order to gauge their level of understanding of the current and a redesigned advising form. Differences were found between the current and redesigned forms across the questions tested. Implications for the redesigned form are discussed and recommendations are provided.

Collaboration

Top co-authors of Ivonne J. Figueroa:

Dustin Rhodes
University of California

Olga Kramarova
California State University