Emily Orchard-Mills
University of Sydney
Publications
Featured research published by Emily Orchard-Mills.
Experimental Brain Research | 2015
Erik Van der Burg; Emily Orchard-Mills; David Alais
Following prolonged exposure to asynchronous multisensory signals, the brain adapts to reduce the perceived asynchrony. Here, in three separate experiments, participants performed a synchrony judgment task on audiovisual, audiotactile or visuotactile stimuli and we used inter-trial analyses to examine whether temporal recalibration occurs rapidly on the basis of a single asynchronous trial. Even though all combinations used the same subjects, task and design, temporal recalibration occurred for audiovisual stimuli (i.e., the point of subjective simultaneity depended on the preceding trial’s modality order), but none occurred when the same auditory or visual event was combined with a tactile event. Contrary to findings from prolonged adaptation studies showing recalibration for all three combinations, we show that rapid, inter-trial recalibration is unique to audiovisual stimuli. We conclude that recalibration occurs at two different timescales for audiovisual stimuli (fast and slow), but only on a slow timescale for audiotactile and visuotactile stimuli.
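A minimal sketch of this kind of inter-trial analysis, assuming a Gaussian model of the synchrony-judgment curve and hypothetical column names ('soa', 'simultaneous', 'prev_order') rather than the authors' actual pipeline: responses are split by the preceding trial's modality order, and the fitted peak of each split gives its PSS.

```python
import numpy as np
import pandas as pd
from scipy.optimize import curve_fit

def sj_curve(soa, amp, pss, sigma):
    """Gaussian model of the proportion of 'simultaneous' responses vs. SOA."""
    return amp * np.exp(-(soa - pss) ** 2 / (2 * sigma ** 2))

def pss_by_previous_trial(trials: pd.DataFrame) -> dict:
    """Estimate the PSS separately for each preceding-trial modality order.

    Expects columns: 'soa' (ms, signed by modality order), 'simultaneous'
    (0/1 response) and 'prev_order' (e.g. 'audio_first' / 'visual_first').
    """
    estimates = {}
    for prev_order, group in trials.groupby("prev_order"):
        props = group.groupby("soa")["simultaneous"].mean()
        (amp, pss, sigma), _ = curve_fit(sj_curve, props.index.values,
                                         props.values, p0=[1.0, 0.0, 100.0])
        estimates[prev_order] = pss  # fitted peak = point of subjective simultaneity
    return estimates

# Rapid recalibration shows up as the PSS shifting toward the previous trial's
# modality order, e.g. estimates['audio_first'] != estimates['visual_first'].
```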
Scientific Reports | 2015
Jean-Paul Noel; Mark T. Wallace; Emily Orchard-Mills; David Alais; Erik Van der Burg
Perception and behavior are fundamentally shaped by the integration of different sensory modalities into unique multisensory representations, a process governed by spatio-temporal correspondence. Prior work has characterized temporal perception using the point in time at which subjects are most likely to judge multisensory stimuli to be simultaneous (PSS) and the temporal binding window (TBW) over which participants are likely to do so. Here we examine the relationship between the PSS and the TBW within and between individuals, and within and between three sensory combinations: audiovisual, audiotactile and visuotactile. We demonstrate that TBWs correlate within individuals and across multisensory pairings, but PSSs do not. Further, we reveal that while the audiotactile and audiovisual pairings show tightly related TBWs, they also exhibit a differential relationship with respect to true and perceived multisensory synchrony. Thus, audiotactile and audiovisual temporal processing share mechanistic features yet are respectively functionally linked to objective and subjective synchrony.
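One common way to obtain the two measures is to fit a Gaussian to each observer's simultaneity-judgment data, taking the fitted peak as the PSS and the fitted width (for example, the full width at half maximum) as the TBW, then correlating the per-observer TBWs across pairings. A minimal sketch under those assumptions, with placeholder values used only to show the correlation step:

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import spearmanr

def fit_pss_tbw(soas, p_simultaneous):
    """Fit a Gaussian to simultaneity-judgment proportions.

    Returns (pss, tbw): the fitted peak and its full width at half maximum,
    one common summary of the temporal binding window.
    """
    model = lambda soa, amp, pss, sigma: amp * np.exp(-(soa - pss) ** 2 / (2 * sigma ** 2))
    (amp, pss, sigma), _ = curve_fit(model, soas, p_simultaneous, p0=[1.0, 0.0, 100.0])
    return pss, 2.355 * abs(sigma)  # FWHM = 2*sqrt(2*ln 2)*sigma

# Hypothetical per-observer TBWs (ms) for two pairings, purely illustrative.
tbw_audiovisual = np.array([210.0, 260.0, 180.0, 320.0, 240.0])
tbw_audiotactile = np.array([190.0, 250.0, 200.0, 300.0, 230.0])
rho, p = spearmanr(tbw_audiovisual, tbw_audiotactile)
print(f"TBW correlation across pairings: rho = {rho:.2f}, p = {p:.3f}")
```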
Journal of Vision | 2013
Emily Orchard-Mills; Erik Van der Burg; David Alais
A recent study by Guzman-Martinez, Ortega, Grabowecky, Mossbridge, and Suzuki (2012) showed that participants match the frequency of an amplitude-modulated auditory stimulus to visual spatial frequency with a linear relationship and suggested this crossmodal matching guided attention to specific spatial frequencies. Here, we replicated this matching relationship and used the visual search paradigm to investigate whether auditory signals guide attention to matched visual spatial frequencies. Participants were presented with a search display of Gabors, all with different spatial frequencies. When the auditory signal was informative, improved search efficiency occurred for some spatial frequencies. However, when uninformative, a matched auditory signal produced no effect on visual search performance whatsoever. Moreover, search benefits were also observed when the auditory signal was informative, but did not match the spatial frequency. Together, these findings suggest that an amplitude-modulated auditory signal can influence visual selection of a matched spatial frequency, but the effect is due to top-down knowledge rather than resulting from automatic attentional capture derived from low-level mapping.
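The matching relationship itself can be summarised with a simple linear regression of each participant's matched auditory amplitude-modulation rate against visual spatial frequency. The values below are placeholders chosen only to illustrate the analysis, not data from the study.

```python
import numpy as np
from scipy.stats import linregress

# Hypothetical matches: visual spatial frequencies (cycles/deg) and the auditory
# amplitude-modulation rates (Hz) a participant judged to correspond to them.
spatial_freq = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
matched_am_rate = np.array([1.1, 2.0, 4.2, 7.9, 16.3])

fit = linregress(spatial_freq, matched_am_rate)
print(f"slope = {fit.slope:.2f} Hz per c/deg, r^2 = {fit.rvalue ** 2:.2f}")
```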
Attention Perception & Psychophysics | 2013
Emily Orchard-Mills; David Alais; Erik Van der Burg
Recently, Guzman-Martinez, Ortega, Grabowecky, Mossbridge, and Suzuki (Current Biology : CB, 22(5), 383–388, 2012) reported that observers could systematically match auditory amplitude modulations and tactile amplitude modulations to visual spatial frequencies, proposing that these cross-modal matches produced automatic attentional effects. Using a series of visual search tasks, we investigated whether informative auditory, tactile, or bimodal cues can guide attention toward a visual Gabor of matched spatial frequency (among others with different spatial frequencies). These cues improved visual search for some but not all frequencies. Auditory cues improved search only for the lowest and highest spatial frequencies, whereas tactile cues were more effective and frequency specific, although less effective than visual cues. Importantly, although tactile cues could produce efficient search when informative, they had no effect when uninformative. This suggests that cross-modal frequency matching occurs at a cognitive rather than sensory level and, therefore, influences visual search through voluntary, goal-directed behavior, rather than automatic attentional capture.
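A rough way to quantify the cueing benefits described here is to compare mean reaction times on informative-cue trials against an uninformative-cue baseline, separately for each target spatial frequency. The sketch below assumes a trial table with hypothetical column names; it is not the authors' analysis code.

```python
import pandas as pd

def cue_benefit_by_frequency(trials: pd.DataFrame) -> pd.Series:
    """Reaction-time benefit of an informative cue at each target spatial frequency.

    Expects columns: 'target_sf' (cycles/deg), 'cue' ('informative' or
    'uninformative') and 'rt' (correct-trial reaction time, ms).
    """
    mean_rt = trials.groupby(["target_sf", "cue"])["rt"].mean().unstack("cue")
    # Positive values mean the informative cue sped up search at that frequency.
    return mean_rt["uninformative"] - mean_rt["informative"]
```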
PLOS ONE | 2014
Thomas Charles Augustus Freeman; Johahn Leung; Ella Wufong; Emily Orchard-Mills; Simon Carlile; David Alais
Evidence that the auditory system contains specialised motion detectors is mixed. Many psychophysical studies confound speed cues with distance and duration cues and present sound sources that do not appear to move in external space. Here we use the ‘discrimination contours’ technique to probe the probabilistic combination of speed, distance and duration for stimuli moving in a horizontal arc around the listener in virtual auditory space. The technique produces a set of motion discrimination thresholds that define a contour in the distance-duration plane for different combinations of the three cues, based on a 3-interval oddity task. The orientation of the contour (typically elliptical in shape) reveals which cue or combination of cues dominates. If the auditory system contains specialised motion detectors, stimuli moving over different distances and durations but defining the same speed should be more difficult to discriminate. The resulting discrimination contours should therefore be oriented obliquely along iso-speed lines within the distance-duration plane. However, we found that over a wide range of speeds, distances and durations, the ellipses aligned with distance-duration axes and were stretched vertically, suggesting that listeners were most sensitive to duration. A second experiment showed that listeners were able to make speed judgements when distance and duration cues were degraded by noise, but that performance was worse. Our results therefore suggest that speed is not a primary cue to motion in the auditory system, but that listeners are able to use speed to make discrimination judgements when distance and duration cues are unreliable.
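The logic of the discrimination-contour analysis can be illustrated by estimating the principal orientation of the threshold points in the distance-duration plane and comparing it with the iso-speed direction. This is a simplified PCA-style sketch with hypothetical inputs, not the ellipse-fitting procedure used in the paper.

```python
import numpy as np

def contour_orientation(thresholds_xy: np.ndarray) -> float:
    """Orientation (degrees) of the major axis of a set of discrimination
    thresholds, given as (distance, duration) offsets from the standard stimulus."""
    centred = thresholds_xy - thresholds_xy.mean(axis=0)
    cov = np.cov(centred, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    major = eigvecs[:, np.argmax(eigvals)]  # eigenvector of the largest eigenvalue
    return np.degrees(np.arctan2(major[1], major[0]))

# If a speed code dominated, the major axis should lie along the iso-speed line;
# alignment with the duration axis instead indicates duration dominates, as reported.
```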
Perception | 2016
Emily Orchard-Mills; Erik Van der Burg; David Alais
Temporal ventriloquism is the shift in perceived timing of a visual stimulus that occurs when an auditory stimulus is presented close in time. This study investigated whether crossmodal correspondence between auditory pitch and visual elevation modulates temporal ventriloquism. Participants were presented with two visual stimuli (above and below fixation) across a range of stimulus onset asynchronies and were asked to judge the order of the events. A task-irrelevant auditory click was presented shortly before the first and another shortly after the second visual stimulus. Two pitches were used (low and high) and the congruency between the auditory and visual stimuli was manipulated. The results show that incongruent pairings between pitch and elevation abolish temporal ventriloquism. In contrast, the crossmodal correspondence effect was absent when the direction of the pitch change was fixed within sessions, reducing the saliency of the pitch change. The results support previous studies suggesting that in addition to spatial and temporal factors, crossmodal correspondences can influence binding of information across the senses, although these effects are likely to be dependent on the saliency of the crossmodal mapping.
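Temporal ventriloquism in this kind of paradigm is usually measured as a change in temporal-order-judgment sensitivity: a cumulative Gaussian is fitted to the proportion of, say, 'upper first' responses as a function of SOA, and a smaller just-noticeable difference (JND) with flanking clicks than without indicates that the clicks are binding to, and pulling apart, the visual events. A minimal sketch under that assumption, not the authors' exact fitting code:

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def fit_toj(soas, p_upper_first):
    """Fit a cumulative Gaussian to temporal-order-judgment data.

    soas: stimulus onset asynchronies (ms, positive = upper stimulus first).
    p_upper_first: proportion of 'upper first' responses at each SOA.
    Returns (pse, jnd); comparing JNDs across click conditions indexes
    temporal ventriloquism.
    """
    model = lambda soa, pse, sigma: norm.cdf(soa, loc=pse, scale=sigma)
    (pse, sigma), _ = curve_fit(model, soas, p_upper_first, p0=[0.0, 50.0])
    return pse, abs(sigma)
```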
Multisensory Research | 2013
Emily Orchard-Mills; Erik Van der Burg; David Alais
A recent study (Guzman-Martinez et al., 2012) showed that participants match the frequency of an amplitude-modulated auditory stimulus to visual spatial frequency with a linear relationship and suggested this crossmodal mapping automatically guided attention to specific spatial frequencies. We replicated the reported matching relationship and also performed matching between tactile and visual spatial frequency. We then used the visual search paradigm to investigate whether auditory or tactile cues can guide attention to matched visual spatial frequencies. Participants were presented with a search display containing multiple Gabors, all with different spatial frequencies. When the auditory or tactile cue was informative, improved search efficiency occurred for some matched spatial frequencies, with the specificity of the effect being greater for touch than audition. However, when uninformative, neither auditory nor tactile cues produced any effect on visual search performance. Furthermore, when informative, unmatched auditory cues (shifted substantially from the reported match, but still matched in relative position) improved search performance. Taken together, these findings suggest that although auditory and tactile cues can influence visual selection of a matched spatial frequency, the effects are due to top-down attentional control rather than automatic attentional capture derived from low-level mapping.
Multisensory Research | 2015
Hao Tam Ho; Emily Orchard-Mills; David Alais
Following prolonged exposure to audiovisual asynchrony, an observer's point of subjective simultaneity (PSS) shifts in the direction of the leading modality. It has been debated whether other sensory pairings, such as vision and touch, lead to a similar temporal recalibration, and if so, whether the internal timing mechanism underlying visuotactile lag adaptation is centralised or distributed. To address these questions, we adapted observers to vision- and tactile-leading visuotactile asynchrony on either their left or right hand side in different blocks. In one test condition, participants performed a simultaneity judgment on the adapted side (unilateral) and in another they performed a simultaneity judgment on the non-adapted side (contralateral). In a third condition, participants adapted concurrently to equal and opposite asynchronies on each side and were tested randomly on either hand (bilateral opposed). Results from the first two conditions show that observers recalibrate to visuotactile asynchronies, and that the recalibration transfers to the non-adapted side. These findings suggest a centralised recalibration mechanism not linked to the adapted side and predict no recalibration for the bilateral opposed condition, assuming the adapted effects were equal on each side. This was confirmed in the group of participants that adapted to vision- and tactile-leading asynchrony on the right and left hand side, respectively. However, the other group (vision-leading on the left and tactile-leading on the right) did show a recalibration effect, suggesting a distributed mechanism. We discuss these findings in terms of a hybrid model that assumes the co-existence of a centralised and distributed timing mechanism.
Journal of Experimental Psychology: Human Perception and Performance | 2016
Emily Orchard-Mills; Erik Van der Burg; David Alais
The brain integrates signals from multiple modalities to provide a reliable estimate of environmental events. A temporally cluttered environment presents a challenge for sensory integration because of the risk of misbinding, yet it also provides scope for cross-modal binding to greatly enhance performance by highlighting multimodal events. We present a tactile search task in which fingertips received pulsed vibrations and participants identified which finger was stimulated in synchrony with an auditory signal. Results showed that performance for identifying the target finger was impaired when other fingers were stimulated, even though all fingers were stimulated sequentially. When the number of fingers vibrated was fixed, we found that both spatial and temporal factors constrained performance, because events occurring close to the target vibration in either space or time reduced accuracy. When tactile search was compared with visual search, we found overall performance was lower in touch than in vision, although the cost of reducing temporal separation between stimuli or increasing the presentation rate was similar for both target modalities. Audiotactile performance benefitted from increasing spatial separation between target and distractors, with a particularly strong benefit for locating the target on a different hand to the distractors, whereas the spatial manipulations did not affect audiovisual performance. The similar trends in performance for temporal manipulations across vision and touch suggest a common supramodal binding mechanism that, when combining audition and touch, is limited by the poor resolution of the underlying unisensory representation of touch in cluttered settings.
Attention Perception & Psychophysics | 2015
David Alais; Emily Orchard-Mills; Erik Van der Burg