Erik Van der Burg
University of Sydney
Publications
Featured research published by Erik Van der Burg.
Journal of Vision | 2009
Mark Nieuwenstein; Erik Van der Burg; Jan Theeuwes; Brad Wyble; Mary C. Potter
The attentional blink (AB) refers to the finding that observers often miss the second of two masked visual targets (T1 and T2, e.g., letters) appearing within 200-500 ms. Although the presence of a T1 mask is thought to be required for this effect, we recently found that an AB deficit can be observed even in the absence of a T1 mask if T2 is shown very briefly and followed by a pattern mask (M. R. Nieuwenstein, M. C. Potter, & J. Theeuwes, 2009). Using such a sensitive T2 task, the present study sought to determine the minimum requirements for eliciting an AB deficit. To this end, we examined if the occurrence of an AB depends on T1 exposure duration, the requirement to perform a task for T1, and awareness of T1. The results showed that an AB deficit occurs regardless of the presentation duration of T1, and regardless of whether there is a T1 task. A boundary condition for the occurrence of an AB was found in conscious awareness of T1. With a near-threshold detection task for T1, attention blinked when T1 was seen, but not when T1 was missed. Accordingly, we conclude that the minimum requirement for an AB deficit is T1 awareness.
The Journal of Neuroscience | 2013
Erik Van der Burg; David Alais; John Cass
To combine information from different sensory modalities, the brain must deal with considerable temporal uncertainty. In natural environments, an external event may produce simultaneous auditory and visual signals yet they will invariably activate the brain asynchronously due to different propagation speeds for light and sound, and different neural response latencies once the signals reach the receptors. One strategy the brain uses to deal with audiovisual timing variation is to adapt to a prevailing asynchrony to help realign the signals. Here, using psychophysical methods in human subjects, we investigate audiovisual recalibration and show that it takes place extremely rapidly without explicit periods of adaptation. Our results demonstrate that exposure to a single, brief asynchrony is sufficient to produce strong recalibration effects. Recalibration occurs regardless of whether the preceding trial was perceived as synchronous, and regardless of whether a response was required. We propose that this rapid recalibration is a fast-acting sensory effect, rather than a higher-level cognitive process. An account in terms of response bias is unlikely due to a strong asymmetry whereby stimuli with vision leading produce bigger recalibrations than audition leading. A fast-acting recalibration mechanism provides a means for overcoming inevitable audiovisual timing variation and serves to rapidly realign signals at onset to maximize the perceptual benefits of audiovisual integration.
NeuroImage | 2011
Erik Van der Burg; Durk Talsma; Christian N. L. Olivers; Clayton Hickey; Jan Theeuwes
In dynamic cluttered environments, audition and vision may benefit from each other in determining what deserves further attention and what does not. We investigated the underlying neural mechanisms responsible for attentional guidance by audiovisual stimuli in such an environment. Event-related potentials (ERPs) were measured during visual search through dynamic displays consisting of line elements that randomly changed orientation. Search accuracy improved when a target orientation change was synchronized with an auditory signal as compared to when the auditory signal was absent or synchronized with a distractor orientation change. The ERP data show that behavioral benefits were related to an early multisensory interaction over left parieto-occipital cortex (50-60 ms post-stimulus onset), which was followed by an early positive modulation (80-100 ms) over occipital and temporal areas contralateral to the audiovisual event, an enhanced N2pc (210-250 ms), and a contralateral negative slow wave (CNSW). The early multisensory interaction was correlated with behavioral search benefits, indicating that participants with a strong multisensory interaction benefited the most from the synchronized auditory signal. We suggest that an auditory signal enhances the neural response to a synchronized visual event, which increases the chances of selection in a multiple object environment.
Journal of Experimental Psychology: Human Perception and Performance | 2007
Jan Theeuwes; Erik Van der Burg
Even though it is undisputed that prior information regarding the location of a target affects visual selection, the issue of whether information regarding nonspatial features, such as color and shape, has similar effects has been a matter of debate since the early 1980s. In the study described in this article, measures derived from signal detection theory were used to show that perceptual sensitivity is affected by a top-down set for spatial information but not by a top-down set for nonspatial information. This indicates that knowing where the target singleton is affects perceptual selectivity but that knowing what it is does not help selectivity. Furthermore, perceptual sensitivity can be enhanced by nonspatial features, but only through a process related to bottom-up priming. These findings have important implications for models of visual selection.
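The perceptual sensitivity measure referred to above is the signal detection theory index d′, computed from hit and false-alarm rates. As a minimal illustrative sketch (the rates below are hypothetical, not data from the study):

```python
from statistics import NormalDist

def d_prime(hit_rate: float, fa_rate: float) -> float:
    """Perceptual sensitivity d' = z(hit rate) - z(false-alarm rate)."""
    z = NormalDist().inv_cdf  # inverse standard-normal CDF
    return z(hit_rate) - z(fa_rate)

# Hypothetical example: a cue that raises hits while lowering false alarms
# yields higher d', i.e., genuinely better perceptual sensitivity rather
# than a mere shift in response criterion.
print(d_prime(0.85, 0.15))
```

Because d′ separates sensitivity from response bias, it lets a study like this one distinguish cues that truly sharpen perception (spatial cues, per the abstract) from cues that only shift the observer's willingness to respond.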
PLOS ONE | 2010
Erik Van der Burg; John Cass; Christian N. L. Olivers; Jan Theeuwes; David Alais
Background
A prevailing view is that audiovisual integration requires temporally coincident signals. However, a recent study failed to find any evidence for audiovisual integration in visual search even when using synchronized audiovisual events. An important question is what information is critical to observe audiovisual integration.
Methodology/Principal Findings
Here we demonstrate that temporal coincidence (i.e., synchrony) of auditory and visual components can trigger audiovisual interaction in cluttered displays and consequently produce very fast and efficient target identification. In visual search experiments, subjects found a modulating visual target vastly more efficiently when it was paired with a synchronous auditory signal. By manipulating the kind of temporal modulation (sine wave vs. square wave vs. difference wave; harmonic sine-wave synthesis; gradient of onset/offset ramps) we show that abrupt visual events are required for this search efficiency to occur, and that sinusoidal audiovisual modulations do not support efficient search.
Conclusions/Significance
Thus, audiovisual temporal alignment will only lead to benefits in visual search if the changes in the component signals are both synchronized and transient. We propose that transient signals are necessary in synchrony-driven binding to avoid spurious interactions with unrelated signals when these occur close together in time.
Brain Research | 2008
Christian N. L. Olivers; Erik Van der Burg
The second of two targets (T2) presented in a rapid visual stream is often missed when presented shortly after the first (T1). This phenomenon has been referred to as the attentional blink. Here we show that the presentation of a synchronous sound enables T2 to escape the attentional blink, to the extent that performance was back at the level of T1. The sound also improved T1 identification, with little evidence for a trade-off between T1 and T2. The improvement was found even when the sound coincided with a distractor on 82% of the trials, suggesting an automatic component. Sounds that preceded the targets had little to no effect on T2, suggesting that the enhancement was not due to alerting. These findings replicate and extend earlier work on audition-driven perceptual enhancement of single visual targets. They also have implications for theories of the attentional blink.
Attention Perception & Psychophysics | 2011
Jan Theeuwes; Erik Van der Burg
In the present study, observers viewed displays in which two equally salient color singletons were simultaneously present. Before each trial, observers received a word cue (e.g., the word red, or green) or a symbolic cue (a circle colored red or green) telling them which color singleton to select on the upcoming trial. Even though many theories of visual search predict that observers should be able to selectively attend the target color singleton, the results of the present study show that observers could not select the target singleton without interference from the irrelevant color singleton. The results indicate that the irrelevant color singleton captured attention. Only when the color of the target singleton remained the same from one trial to the next was selection perfect—an effect that is thought to be the result of passive automatic intertrial priming. The results of the present study demonstrate the limits of top-down attentional control.
Experimental Brain Research | 2015
Erik Van der Burg; Emily Orchard-Mills; David Alais
Following prolonged exposure to asynchronous multisensory signals, the brain adapts to reduce the perceived asynchrony. Here, in three separate experiments, participants performed a synchrony judgment task on audiovisual, audiotactile or visuotactile stimuli and we used inter-trial analyses to examine whether temporal recalibration occurs rapidly on the basis of a single asynchronous trial. Even though all combinations used the same subjects, task and design, temporal recalibration occurred for audiovisual stimuli (i.e., the point of subjective simultaneity depended on the preceding trial’s modality order), but none occurred when the same auditory or visual event was combined with a tactile event. Contrary to findings from prolonged adaptation studies showing recalibration for all three combinations, we show that rapid, inter-trial recalibration is unique to audiovisual stimuli. We conclude that recalibration occurs at two different timescales for audiovisual stimuli (fast and slow), but only on a slow timescale for audiotactile and visuotactile stimuli.
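The inter-trial analysis described above can be sketched as follows: group each trial by the preceding trial's modality order (sign of its stimulus onset asynchrony, SOA) and estimate the point of subjective simultaneity (PSS) separately for each group. This is an illustrative toy version with fabricated data, not the authors' actual fitting procedure (which estimates the PSS from psychometric functions):

```python
from statistics import fmean

def pss_by_previous_order(trials):
    """Toy inter-trial analysis.

    trials: list of (soa_ms, judged_synchronous) tuples, where positive
    SOA means vision led and negative means audition led (0 not used).
    Each trial is grouped by the SIGN of the preceding trial's SOA, and
    the PSS per group is crudely estimated as the mean SOA of trials
    judged synchronous.
    """
    groups = {"prev_vision_led": [], "prev_audio_led": []}
    for prev, (soa, synced) in zip(trials, trials[1:]):
        if synced:
            key = "prev_vision_led" if prev[0] > 0 else "prev_audio_led"
            groups[key].append(soa)
    return {k: fmean(v) for k, v in groups.items() if v}

# Fabricated trial sequence for illustration only.
trials = [(+80, False), (+20, True), (+80, False), (+20, True),
          (-80, False), (-20, True), (-80, False), (-20, True)]
# In this toy data the PSS sits at +20 ms after vision-led trials and
# at -20 ms after audio-led trials: it shifts toward the preceding
# trial's modality order, the signature of rapid recalibration.
print(pss_by_previous_order(trials))
```

A dependence of the PSS on the previous trial's modality order, as in this toy output, is the pattern the study reports for audiovisual pairs but not for audiotactile or visuotactile pairs.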
Psychological Science | 2013
Erik Van der Burg; Edward Awh; Christian N. L. Olivers
The human visual attention system is geared toward detecting the most salient and relevant events in an overwhelming stream of information. There has been great interest in measuring how many visual events can be processed at a time, and most of the work has suggested that the limit is three to four. However, attention to a visual stimulus can also be driven by a synchronous auditory event. The present work indicates that a fundamentally different limit applies to audiovisual processing, such that at most only a single audiovisual event can be processed at a time. This limited capacity is not due to a limitation in visual selection; participants were able to process about four visual objects simultaneously. Instead, we propose that audiovisual orienting is subject to a fundamentally different capacity limit than pure visual selection is.
Psychonomic Bulletin & Review | 2008
Jan Theeuwes; Erik Van der Burg; Artem V. Belopolsky
It has been claimed that the detection of a feature singleton can be based on activity in a feature map that coarsely codes the presence of something unique in the visual field. In the present study, participants detected the presence or absence of a color singleton. Even though the letter form of the color singleton was task-irrelevant, we showed that repeating the letter form of the singleton resulted in repetition priming on the next trial. Such repetition priming was not found when a nonsingleton letter was repeated as the singleton. Since the letter form of the color singleton could only be picked up by focal attention, the repetition priming effect indicates that focal attention is allocated to the feature singleton even in the simplest present-absent feature detection tasks. We showed that this effect is equally strong in conditions of low and high perceptual load. These results are inconsistent with theories holding that a feature singleton can be detected without directing some form of attention to its location.