R. Calen Walshe
University of Edinburgh
Publications
Featured research published by R. Calen Walshe.
Journal of Experimental Psychology: Learning, Memory, and Cognition | 2009
Mark R. Blair; Marcus R. Watson; R. Calen Walshe; Fillip Maj
Humans have an extremely flexible ability to categorize regularities in their environment, in part because of attentional systems that allow them to focus on important perceptual information. In formal theories of categorization, attention is typically modeled with weights that selectively bias the processing of stimulus features. These theories make differing predictions about the degree of flexibility with which attention can be deployed in response to stimulus properties. Results from 2 eye-tracking studies show that humans can rapidly learn to allocate attention differently to members of different categories. These results provide the first unequivocal demonstration of stimulus-responsive attention in a categorization task. Furthermore, the authors found clear temporal patterns in the shifting of attention within trials that follow from the informativeness of particular stimulus features. These data provide new insights into the attention processes involved in categorization.
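One common way such attention weights enter formal categorization theories is as multipliers on feature distances in exemplar models (e.g., GCM-style similarity). The sketch below is illustrative only, not taken from the paper; the stimulus values and parameter c are hypothetical. It shows how shifting weight onto a diagnostic feature biases similarity, and hence the predicted category response, toward that feature.

```python
import numpy as np

def similarity(x, exemplar, weights, c=1.0):
    """Attention-weighted exemplar similarity (GCM-style):
    s = exp(-c * sum_m w_m * |x_m - y_m|), with weights summing to 1."""
    d = np.sum(weights * np.abs(x - exemplar))  # weighted city-block distance
    return np.exp(-c * d)

# Hypothetical two-feature stimuli: feature 0 is diagnostic, feature 1 is not.
probe      = np.array([1.0, 0.0])
exemplar_A = np.array([1.0, 1.0])   # matches the probe on the diagnostic feature
exemplar_B = np.array([0.0, 0.0])   # matches the probe on the irrelevant feature

for w_diag in (0.5, 0.9):           # shift attention toward the diagnostic feature
    w = np.array([w_diag, 1.0 - w_diag])
    s_A = similarity(probe, exemplar_A, w)
    s_B = similarity(probe, exemplar_B, w)
    print(f"w_diag={w_diag}: P(respond A) = {s_A / (s_A + s_B):.2f}")
```

With equal weights the probe is equally similar to both exemplars; raising the weight on the diagnostic feature pushes the choice probability toward category A, which is the sense in which attention weights "selectively bias" feature processing in these models.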
Vision Research | 2014
R. Calen Walshe; Antje Nuthmann
In two experiments we investigated the control of fixation durations in naturalistic scene viewing. Empirical evidence from the scene onset delay paradigm and numerical simulations of such data with the CRISP model [Psychological Review 117 (2010) 382-405] have suggested that processing-related difficulties may lead to prolonged fixation durations. Here, we ask whether processing-related facilitation may lead to comparable decreases in fixation durations. Research in visual search and reading has reported only unidirectional shifts. To address the question of unidirectional (slow down) as opposed to bidirectional (slow down and speed up) adjustment of fixation durations in the context of scene viewing, we used a saccade-contingent display change method to either reduce or increase the luminance of the scene during prespecified critical fixations. Degrading the stimulus by shifting luminance down resulted in an immediate increase in fixation durations. However, clarifying the stimulus by shifting luminance upwards did not result in a comparable decrease in fixation durations. These results suggest that the control of fixation durations in scene viewing is asymmetric, as has been reported for visual search and reading.
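The CRISP-style account referenced above treats fixation durations as the output of a stochastic saccade timer whose rate can be slowed when processing is difficult. The following is a minimal simplification of that idea, written for illustration only (it is not the published CRISP implementation, and the rates, step counts, and time step are assumed values): a random-walk timer accumulates steps toward a threshold, and lowering its step rate, as degraded input might, lengthens the simulated fixation durations.

```python
import random

def simulate_fixation(rate_hz=100.0, n_steps=50, dt_ms=1.0, seed=None):
    """Random-walk saccade timer: each millisecond a step occurs with
    probability rate_hz * dt, and a saccade is triggered (ending the
    fixation) once n_steps have accumulated."""
    rng = random.Random(seed)
    p_step = rate_hz * dt_ms / 1000.0      # per-millisecond step probability
    steps, t = 0, 0.0
    while steps < n_steps:
        if rng.random() < p_step:
            steps += 1
        t += dt_ms
    return t                                # fixation duration in ms

def mean_duration(rate_hz, n=2000):
    return sum(simulate_fixation(rate_hz, seed=i) for i in range(n)) / n

# Slowing the timer (as degraded, low-luminance input might) prolongs the
# simulated fixation durations; note that in the data above the empirical
# effect was asymmetric, with no comparable speed-up for clarified input.
print("baseline rate:", round(mean_duration(100.0)), "ms")
print("slowed rate:  ", round(mean_duration(70.0)), "ms")
```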
Journal of Vision | 2015
R. Calen Walshe; Antje Nuthmann
Saccadic eye movements are the primary vehicle by which human gaze is brought into alignment with vital visual information present in naturalistic scenes. Although numerous studies using the double-step paradigm have demonstrated that saccade preparation is subject to modification under certain conditions, this has yet to be studied directly within a naturalistic scene-viewing context. To reveal characteristic properties of saccade programming during naturalistic scene viewing, we contrasted behavior across three conditions. In the Static condition of the main experiment, double-step targets were presented following a period of stable fixation on a central cross. In a Scene condition, targets were presented while participants actively explored a naturalistic scene. During a Noise condition, targets were presented during active exploration of a 1/f noise-filtered scene. In Experiment 2, we measured saccadic responses in three Static conditions (Uniform, Scene, and Noise) in which the backgrounds were the same as in Experiment 1 but scene exploration was no longer permitted. We find that the mechanisms underlying saccade modification generalize to both dynamic conditions. However, we show that a property of saccade programming known as the saccadic dead time (SDT), the interval prior to saccade onset during which a saccade may not be canceled or modified, is lower in the Static task than it is in the dynamic tasks. We also find a trend toward longer SDT in the Scene condition as compared with the Noise condition. We discuss the implications of these results for computational models of scene viewing, reading, and visual search tasks.
PLOS ONE | 2014
Caitlyn McColeman; Jordan I. Barnes; Lihan Chen; Kimberly Meier; R. Calen Walshe; Mark R. Blair
Learning how to allocate attention properly is essential for success at many categorization tasks. Advances in our understanding of learned attention are stymied by a chicken-and-egg problem: there are no theoretical accounts of learned attention that predict patterns of eye movements, making data collection difficult to justify, and there are not enough datasets to support the development of a rich theory of learned attention. The present work addresses this by reporting five measures relating to the overt allocation of attention across 10 category learning experiments: accuracy, probability of fixating irrelevant information, number of fixations to category features, the amount of change in the allocation of attention (using a new measure called Time Proportion Shift, or TIPS), and a measure of the relationship between attention change and erroneous responses. Using these measures, the data suggest that eye movements are not substantially connected to error in most cases and that aggregate trial-by-trial attention change is generally stable across a number of changing task variables. The data presented here provide a target for computational models that aim to account for changes in overt attentional behaviors across learning.
Cognitive Science | 2013
R. Calen Walshe; Antje Nuthmann
Perception | 2014
R. Calen Walshe; Antje Nuthmann
Cognitive Science | 2014
Jordan I. Barnes; Caitlyn McColeman; Ekaterina R. Stepanova; Mark R. Blair; R. Calen Walshe
Cognitive Science | 2015
Jordan I. Barnes; Mark R. Blair; Paul F. Tupper; R. Calen Walshe
Cognitive Science | 2013
Jordan I. Barnes; Mark R. Blair; Paul F. Tupper; R. Calen Walshe
17th European Conference on Eye Movements | 2013
R. Calen Walshe; Antje Nuthmann