David Alais
University of Sydney
Publication
Featured research published by David Alais.
Seeing and Perceiving | 2010
David Alais; Fiona N. Newell; Pascal Mamassian
Research in multisensory processes has exploded over the last decade. Tremendous advances have been made in a variety of fields from single-unit neural recordings and functional brain imaging through to behaviour, perception and cognition. These diverse approaches have highlighted how the senses work together to produce a coherent multimodal representation of the external world that enables us to function better by exploiting the redundancies and complementarities provided by multiple sensory modalities. With large numbers of new students and researchers being attracted to multisensory research, and the multi-disciplinary nature of the work, our aim in this review is to provide an overview of multisensory processing that includes all fields in a single review. Our intention is to provide a comprehensive source for those interested in learning about multisensory processes, covering a variety of sensory combinations and methodologies, and tracing the path from single-unit neurophysiology through to perception and cognitive functions such as attention and speech.
The Journal of Neuroscience | 2009
Raymond van Ee; Jeroen J. A. van Boxtel; Amanda Parker; David Alais
The neural mechanisms underlying attentional selection of competing neural signals for awareness remain an unresolved issue. We studied attentional selection, using perceptually ambiguous stimuli in a novel multisensory paradigm that combined competing auditory and competing visual stimuli. We demonstrate that the ability to select, and attentively hold, one of the competing alternatives in either sensory modality is greatly enhanced when there is a matching cross-modal stimulus. Intriguingly, this multimodal enhancement of attentional selection seems to require a conscious act of attention, as passively experiencing the multisensory stimuli did not enhance control over the stimulus. We also demonstrate that congruent auditory or tactile information, and combined auditory–tactile information, aids attentional control over competing visual stimuli and vice versa. Our data suggest a functional role for recently found neurons that combine voluntarily initiated attentional functions across sensory modalities. We argue that these units provide a mechanism for structuring multisensory inputs that are then used to selectively modulate early (unimodal) cortical processing, boosting the gain of task-relevant features for willful control over perceptual awareness.
Proceedings of the Royal Society of London B: Biological Sciences | 2006
David Alais; Concetta Morrone; David C. Burr
Current models of attention typically claim that vision and audition are limited by a common attentional resource, which means that visual performance should be adversely affected by a concurrent auditory task and vice versa. Here, we test this implication by measuring auditory (pitch) and visual (contrast) thresholds in conjunction with cross-modal secondary tasks and find that no such interference occurs. Visual contrast-discrimination thresholds were unaffected by a concurrent chord- or pitch-discrimination task, and pitch-discrimination thresholds were virtually unaffected by a concurrent visual search or contrast-discrimination task. However, if the dual tasks were presented within the same modality, thresholds were raised by a factor of between two (for visual discrimination) and four (for auditory discrimination). These results suggest that, at least for low-level tasks such as discriminations of pitch and contrast, each sensory modality is under separate attentional control, rather than being limited by a supramodal attentional resource. This has implications for current theories of attention as well as for the use of multisensory media for efficient information transmission.
Psychological Science | 2006
Chris L. E. Paffen; David Alais; Frans A. J. Verstraten
During binocular rivalry, incompatible images presented dichoptically compete for perceptual dominance. It has long been debated whether binocular rivalry can be controlled by attention. Most studies have shown that voluntary control over binocular rivalry is limited. We sought to remove attention from binocular rivalry by presenting a concurrent task. Diverting attention slowed the rivalry alternation rate, and did so in proportion to the difficulty of the concurrent task. Even a very demanding distractor task, however, did not arrest rivalry alternations completely. Given that diverting attention was equivalent to lowering the contrast of the rival stimuli, the ability of attention to speed binocular rivalry is most likely due to an increase in the effective contrast of the stimuli through boosting the gain of the cortical response. This increase in effective contrast will ultimately lead to a perceptual switch, thereby limiting voluntary control. Thus, attention speeds rivalry alternations, but has no inherent control over the rivalry process.
Current Biology | 2010
David Alais; John Cass; Robert P O'Shea; Randolph Blake
When viewing a different stimulus with each eye, we experience the remarkable phenomenon of binocular rivalry: alternations in consciousness between the stimuli [1, 2]. According to a popular theory first proposed in 1901, neurons encoding the two stimuli engage in reciprocal inhibition [3-8] so that those processing one stimulus inhibit those processing the other, yielding consciousness of one dominant stimulus at any moment and suppressing the other. Also according to the theory, neurons encoding the dominant stimulus adapt, weakening their activity and the inhibition they can exert, whereas neurons encoding the suppressed stimulus recover from adaptation until the balance of activity reverses, triggering an alternation in consciousness. Despite its popularity, this theory has one glaring inconsistency with data: during an episode of suppression, visual sensitivity to brief probe stimuli in the dominant eye should decrease over time and should increase in the suppressed eye, yet sensitivity appears to be constant [9, 10]. Using more appropriate probe stimuli (experiment 1) in conjunction with a new method (experiment 2), we found that sensitivities in dominance and suppression do show the predicted complementary changes.
Nature Neuroscience | 1999
David Alais; Randolph Blake
Single-cell and neuroimaging studies reveal that attention focused on a visual object markedly amplifies neural activity produced by features of the attended object. In a psychophysical study, we found that visual attention could modulate the strength of weak motion signals to the point that the perceived direction of motion, putatively registered early in visual processing, was powerfully altered. This strong influence of attention on early motion processing, besides complementing neurophysiological evidence for attentional modulation early in the visual pathway, can be measured in terms of equivalent motion energy, and thus provides a useful metric for quantifying attention's effects.
Journal of Vision | 2010
Susan G. Wardle; John Cass; Kevin R. Brooks; David Alais
To study the effect of blur adaptation on accommodative variability, accommodative responses and pupil diameters in myopes (n = 22) and emmetropes (n = 19) were continuously measured before, during, and after exposure to defocus blur. Accommodative and pupillary response measurements were made by an autorefractor during a monocular reading exercise. The text was presented on a computer screen at a 33 cm viewing distance using a rapid serial visual presentation paradigm. After baseline testing and a 5-min rest, blur was induced by wearing either an optimally refractive lens, a +1.0 DS defocus lens, or a +3.0 DS defocus lens. Responses were continuously measured during a 5-min period of adaptation. The lens was then removed, and measurements were again made during a 5-min post-adaptation period. After a second 5-min rest, a final post-adaptation period was measured. No significant change in baseline accommodative responses was found after the 5-min period of adaptation to the blurring lenses (p > 0.05). Compared to the pre-adaptation level, both refractive groups showed similar and significant increases in accommodative variability immediately after blur adaptation to both defocus lenses. After the second rest period, the accommodative variability in both groups returned to the pre-adaptation level. The results indicate that blur adaptation has a short-term destabilizing effect on the accommodative response. Mechanisms underlying the increase in accommodative variability with blur adaptation, and possible influences of accommodation stability on myopia development, are discussed.
Vision Research | 2003
Vincent A. Nguyen; Alan W. Freeman; David Alais
Binocular rivalry refers to the alternating perception that occurs when the two eyes are presented with incompatible stimuli: one monocular image is seen exclusively for several seconds before disappearing as the other image comes into view. The unseen stimulus is physically present but is not perceived because the sensory signals it elicits are suppressed. The neural site of this binocular rivalry suppression is a source of continuing controversy. We psychophysically tested human subjects, using test probes designed to selectively activate the visual system at a variety of processing stages. The results, which apply to both form and motion judgements, show that the sensitivity loss during suppression increases as the subject's task becomes more sophisticated. We conclude that binocular rivalry suppression is present at a number of stages along two visual cortical pathways, and that suppression deepens as the visual signal progresses along these pathways.
Progress in Brain Research | 2006
David C. Burr; David Alais
Robust perception requires that information from our five different senses be combined at some central level to produce a single unified percept of the world. Recent theory and evidence from many laboratories suggest that this combination does not occur in a rigid, hardwired fashion, but follows flexible, situation-dependent rules that allow information to be combined with maximal efficiency. In this review we discuss recent evidence from our laboratories investigating how information from the auditory and visual modalities is combined. The results support the notion of Bayesian combination. We also examine temporal alignment of auditory and visual signals, and show that perceived simultaneity does not depend solely on neural latencies, but involves active processes that compensate, for example, for the physical delay introduced by the relatively slow speed of sound. Finally, we show that although visual and auditory information is combined to maximize efficiency, attentional resources for the two modalities are largely independent.
Nature Neuroscience | 2001
Jean Lorenceau; David Alais
Visual analyses of form and motion proceed along parallel streams. Unified perception of moving forms requires interactions between these streams, although whether the interactions occur early or late in cortical processing remains unresolved. Using rotating outlined shapes sampled through apertures, we showed that binding local motions into global object motion depends strongly on spatial configuration. Identical local motion components are perceived coherently when they define closed configurations, but usually not when they define open configurations. Our experiments show this influence arises in early cortical levels and operates as a form-based veto of motion integration in the absence of closure.