Publication


Featured research published by Zachary J. J. Roper.


Journal of Experimental Psychology: Human Perception and Performance | 2013

Perceptual load corresponds with factors known to influence visual search

Zachary J. J. Roper; Joshua D. Cosman; Shaun P. Vecera

One account of the early versus late selection debate in attention proposes that perceptual load determines the locus of selection. Attention selects stimuli at a late processing level under low-load conditions but selects stimuli at an early level under high-load conditions. Despite the successes of perceptual load theory, a noncircular definition of perceptual load remains elusive. We investigated the factors that influence perceptual load by using manipulations that have been studied extensively in visual search, namely target-distractor similarity and distractor-distractor similarity. Consistent with previous work, search was most efficient when targets and distractors were dissimilar and the displays contained homogeneous distractors; search became less efficient when target-distractor similarity increased irrespective of display heterogeneity. Importantly, we used these same stimuli in a typical perceptual load task that measured attentional spillover to a task-irrelevant flanker. We found a strong correspondence between search efficiency and perceptual load; stimuli that generated efficient searches produced flanker interference effects, suggesting that such displays involved low perceptual load. Flanker interference effects were reduced in displays that produced less efficient searches. Furthermore, our results demonstrate that search difficulty, as measured by search intercept, has little bearing on perceptual load. We conclude that rather than be arbitrarily defined, perceptual load might be defined by well-characterized, continuous factors that influence visual search.
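
To make the measures in this abstract concrete, the following minimal Python sketch shows how search efficiency (the slope of the RT-by-set-size function), search difficulty (its intercept), and flanker interference are typically computed; the response times and set sizes are hypothetical placeholders, not data from the study.

    import numpy as np

    # Hypothetical mean correct RTs (ms) at three display set sizes.
    set_sizes = np.array([4, 8, 12])
    mean_rts = np.array([620.0, 660.0, 700.0])

    # Search efficiency: slope of the RT-by-set-size function (ms/item).
    # Search difficulty: the intercept of that function (ms).
    slope, intercept = np.polyfit(set_sizes, mean_rts, deg=1)

    # Flanker interference: the RT cost when a task-irrelevant flanker is
    # response-incompatible with the target. A sizable positive value is the
    # signature of attentional "spill-over", i.e., low perceptual load.
    rt_incongruent = 655.0  # hypothetical mean RT, incongruent flanker
    rt_congruent = 630.0    # hypothetical mean RT, congruent flanker
    flanker_effect = rt_incongruent - rt_congruent

    print(f"search slope: {slope:.1f} ms/item, intercept: {intercept:.0f} ms")
    print(f"flanker effect: {flanker_effect:.0f} ms")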


Psychological Science | 2014

Value-Driven Attentional Capture in Adolescence

Zachary J. J. Roper; Shaun P. Vecera; Jatin G. Vaidya

Adolescence has been characterized as a period of both opportunity and vulnerability. Numerous clinical conditions, including substance-use disorders, often emerge during adolescence. These maladaptive behaviors have been linked to problems with cognitive control, yet few studies have investigated how rewards differentially modulate attentional processes in adolescents versus adults. Here, we trained adults and adolescents on a visual task to establish stimulus-reward associations. Later, we assessed learning in an extinction task in which previously rewarded stimuli periodically appeared as distractors. Both age groups initially demonstrated value-driven attentional capture; however, the effect persisted longer in adolescents than in adults. The results could not be explained by developmental differences in visual working memory. Given the importance of attentional control to daily behaviors and clinical conditions such as attention-deficit/hyperactivity disorder, these results reveal that cognitive control failures in adolescence may be linked to a value-based attentional-capture effect.
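
As a rough illustration of how the persistence of value-driven capture can be quantified during an extinction phase, this sketch computes a capture effect per test block and its decline across blocks; all response times and block counts are hypothetical, not the study's data.

    import numpy as np

    # Hypothetical mean RTs (ms) per test block: distractor absent vs. a
    # previously rewarded distractor present, for one age group.
    blocks = np.arange(1, 6)
    rt_absent = np.array([580.0, 575.0, 572.0, 570.0, 569.0])
    rt_reward_distractor = np.array([612.0, 600.0, 590.0, 580.0, 572.0])

    # Capture effect per block: slowing caused by the formerly rewarded item.
    capture_effect = rt_reward_distractor - rt_absent

    # Persistence can be summarized by how quickly the capture effect declines
    # across blocks; a shallower decline implies longer-lasting capture.
    decline_slope, _ = np.polyfit(blocks, capture_effect, deg=1)

    for block, effect in zip(blocks, capture_effect):
        print(f"block {block}: capture effect = {effect:.0f} ms")
    print(f"decline of capture effect: {decline_slope:.1f} ms per block")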


Psychology of Learning and Motivation | 2014

Chapter Eight – The Control of Visual Attention: Toward a Unified Account

Shaun P. Vecera; Joshua D. Cosman; Daniel B. Vatterott; Zachary J. J. Roper

Visual attention is deployed through visual scenes to find behaviorally relevant targets. This attentional deployment—or attentional control—can be based on either stimulus factors, such as the salience of an object or region, or goal relevance, such as the match between an object and the target being searched for. Decades of research have measured attentional control by examining attentional interruption by a completely irrelevant distracting object, which may or may not capture attention. Based on the results of attentional capture tasks, the literature has distilled two alternative views of attentional control and capture: one focused on stimulus-driven factors and the other based on goal-driven factors. In the current paper, we propose an alternative in which stimulus-driven control and goal-driven control are not mutually exclusive but instead related through task dynamics, specifically experience. Attentional control is initially stimulus-driven. However, as participants gain experience with all aspects of a task, attentional control rapidly becomes increasingly goal-driven. We present four experiments that examine this experience-dependent attentional tuning. We show that to resist capture and be highly selective based on target properties, attention must be configured to aspects of a task through experience.


Psychonomic Bulletin & Review | 2012

Searching for two things at once: Establishment of multiple attentional control settings on a trial-by-trial basis

Zachary J. J. Roper; Shaun P. Vecera

Recent work has demonstrated that attention can be configured to multiple potential targets in spatial search. However, this previous work relied on a fixed set of targets across multiple trials, allowing observers to offload attentional control settings to longer-term representations. In the present experiments, we demonstrate multiple attentional control settings that operate independently of space (Experiments 1 and 2). More important, we show that observers can be cued to different control settings on a trial-by-trial basis (Experiment 3). The latter result suggests that observers were capable of maintaining multiple control settings when the demands of the task required an attentional search for specific feature values. Attention can be configured to extract multiple feature values in a goal-directed manner, and this configuration can be dynamically engaged on a trial-by-trial basis. These results support recent findings that reveal the high precision, complexity, and flexibility of attentional control settings.
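
A minimal sketch of what trial-by-trial cueing of feature-based control settings could look like; the cue labels, colors, and trial structure below are hypothetical placeholders rather than the actual stimuli or procedure.

    import random

    # Hypothetical feature-based control settings an observer might be cued to
    # adopt on a given trial; the specific colors are illustrative only.
    control_settings = {
        "cue_A": {"red", "green"},    # search for either of these two colors
        "cue_B": {"blue", "yellow"},
    }

    def generate_trial():
        """Pick a cue, then build a display whose target matches that cue."""
        cue = random.choice(list(control_settings))
        relevant_colors = control_settings[cue]
        target_color = random.choice(sorted(relevant_colors))
        other_colors = [c for colors in control_settings.values()
                        for c in colors if c != target_color]
        return {"cue": cue,
                "target": target_color,
                "distractors": random.sample(other_colors, 2)}

    for _ in range(3):
        print(generate_trial())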


Psychonomic Bulletin & Review | 2014

Visual short-term memory load strengthens selective attention

Zachary J. J. Roper; Shaun P. Vecera

Perceptual load theory accounts for many attentional phenomena; however, its mechanism remains elusive because it invokes underspecified attentional resources. Recent dual-task evidence has revealed that a concurrent visual short-term memory (VSTM) load slows visual search and reduces contrast sensitivity, but it is unknown whether a VSTM load also constricts attention in a canonical perceptual load task. If attentional selection draws upon VSTM resources, then distraction effects—which measure attentional “spill-over”—will be reduced as competition for resources increases. Observers performed a low perceptual load flanker task during the delay period of a VSTM change detection task. We observed a reduction of the flanker effect in the perceptual load task as a function of increasing concurrent VSTM load. These findings were not due to perceptual-level interactions between the physical displays of the two tasks. Our findings suggest that perceptual representations of distractor stimuli compete with the maintenance of visual representations held in memory. We conclude that access to VSTM determines the degree of attentional selectivity; when VSTM is not completely taxed, it is more likely for task-irrelevant items to be consolidated and, consequently, affect responses. The “resources” hypothesized by load theory are at least partly mnemonic in nature, due to the strong correspondence they share with VSTM capacity.
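
The dual-task logic can be illustrated with a short sketch in which the flanker effect is computed separately at each concurrent VSTM load; the load levels and response times below are hypothetical, chosen only to show the shrinking flanker effect the abstract describes.

    # Hypothetical mean RTs (ms) in the low perceptual load flanker task,
    # split by the number of items held in VSTM during the trial.
    vstm_loads = [0, 2, 4]
    rt_incongruent = {0: 640.0, 2: 628.0, 4: 615.0}
    rt_congruent = {0: 600.0, 2: 602.0, 4: 605.0}

    # A shrinking incongruent-minus-congruent difference with increasing VSTM
    # load is the signature reported above: memory load strengthens selection.
    for load in vstm_loads:
        flanker_effect = rt_incongruent[load] - rt_congruent[load]
        print(f"VSTM load {load}: flanker effect = {flanker_effect:.0f} ms")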


Frontiers in Psychology | 2013

Response terminated displays unload selective attention

Zachary J. J. Roper; Shaun P. Vecera

Perceptual load theory successfully replaced the early vs. late selection debate by appealing to adaptive control over the efficiency of selective attention. Early selection is observed unless perceptual load (p-Load) is sufficiently low to grant attentional “spill-over” to task-irrelevant stimuli. Many studies exploring load theory have used limited display durations that perhaps impose artificial limits on encoding processes. We extended the exposure duration in a classic p-Load task to alleviate temporal encoding demands that may otherwise tax mnemonic consolidation processes. If the load effect arises from perceptual demands alone, then freeing up available mnemonic resources by extending the exposure duration should have little effect. The results of Experiment 1 falsify this prediction. We observed a reliable flanker effect under high p-Load with response-terminated displays. Next, we orthogonally manipulated exposure duration and task relevance. Counter-intuitively, we found that the likelihood of observing the flanker effect under high p-Load resides with the duration of the task-relevant array, not the flanker itself. We propose that stimulus and encoding demands interact to produce the load effect. Our account clarifies how task parameters differentially impinge upon cognitive processes to produce attentional “spill-over” by appealing to visual short-term memory as an additional processing bottleneck when stimuli are briefly presented.
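
The orthogonal manipulation of exposure duration and task relevance can be sketched as a 2 x 2 design in which a flanker effect is computed within each cell; the condition labels and response times below are hypothetical, arranged to mirror the pattern reported in the abstract (the effect tracks the duration of the task-relevant array).

    from itertools import product

    # Factor levels of the hypothetical 2 x 2 design.
    target_array_duration = ["brief", "response-terminated"]
    flanker_duration = ["brief", "response-terminated"]

    # Hypothetical mean RTs (ms) per cell: (incongruent, congruent).
    mean_rts = {
        ("brief", "brief"): (612.0, 608.0),
        ("brief", "response-terminated"): (615.0, 610.0),
        ("response-terminated", "brief"): (648.0, 620.0),
        ("response-terminated", "response-terminated"): (650.0, 618.0),
    }

    # A flanker effect appears only when the task-relevant target array is
    # response-terminated, regardless of how long the flanker itself remains.
    for target_dur, flanker_dur in product(target_array_duration, flanker_duration):
        incongruent, congruent = mean_rts[(target_dur, flanker_dur)]
        print(f"target array: {target_dur}, flanker: {flanker_dur}, "
              f"flanker effect = {incongruent - congruent:.0f} ms")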


Visual Cognition | 2015

Rewards shape attentional search modes

Zachary J. J. Roper; Shaun P. Vecera

Visual attention can be configured for specific stimulus features (feature search mode) or it can be non-specifically set for salient pop-outs (singleton detection mode). Additionally, monetary rewards have been shown to bias attention toward specific features, but it is unknown whether secondary reinforcers (images of US currency) can shape global attention via search modes. In a between-group study, we trained participants to value one search mode over the other. In a testing phase, a salient distractor captured the attention of the value-singleton group; however, the value-feature group was completely unaffected. This suggests that rewards automatically bias global attention mechanisms and potentially mediate the handoff between stimulus-driven and goal-directed attentional control.


Journal of Experimental Psychology: Human Perception and Performance | 2014

Location-Specific Effects of Attention During Visual Short-Term Memory Maintenance

Michi Matsukura; Joshua D. Cosman; Zachary J. J. Roper; Daniel B. Vatterott; Shaun P. Vecera


Archive | 2014

The Control of Visual Attention

Shaun P. Vecera; Joshua D. Cosman; Daniel B. Vatterott; Zachary J. J. Roper


Journal of Vision | 2011

Perceptual load effect is determined by resource demand and data limitation

Zachary J. J. Roper; Joshua D. Cosman; J. Toby Mordkoff; Shaun P. Vecera

