Publication


Featured research published by Warrick Roseboom.


Journal of Vision | 2013

Direction of visual apparent motion driven by perceptual organization of cross-modal signals

Warrick Roseboom; Takahiro Kawabe; Shin'ya Nishida

A critical function of the human brain is to determine the relationship between sensory signals. In the case of signals originating from different sensory modalities, such as audition and vision, several processes have been proposed that may facilitate perception of correspondence between two signals despite any temporal discrepancies in physical or neural transmission. One proposal, temporal ventriloquism, suggests that audio-visual temporal discrepancies can be resolved with a capture of visual event timing by that of nearby auditory events. Such an account implies a fundamental change in the timing representations of the involved events. Here we examine if such changes are necessary to account for a recently demonstrated effect, the modulation of visual apparent motion direction by audition. By contrast, we propose that the effect is driven by segmentation of the visual sequence on the basis of perceptual organization in the cross-modal sequence. Using different sequences of cross-modal (auditory and tactile) events, we found that the direction of visual apparent motion was not consistent with a temporal capture explanation. Rather, reports of visual apparent motion direction were dictated by perceptual organization within cross-modal sequences, determined on the basis of apparent relatedness. This result adds to the growing literature indicating the importance of apparent relatedness and sequence segmentation in apparent timing. Moreover, it demonstrates that, contrary to previous findings, cross-modal interaction can play a critical role in determining organization of signals within a single sensory modality.


Scientific Reports | 2013

The cross-modal double flash illusion depends on featural similarity between cross-modal inducers

Warrick Roseboom; Takahiro Kawabe; Shin'ya Nishida

Despite extensive evidence of the possible interactions between multisensory signals, it remains unclear at what level of sensory processing these interactions take place. When two identical auditory beeps (inducers) are presented in quick succession accompanied by a single visual flash, observers often report seeing two visual flashes rather than the physical one: the double flash illusion. This compelling illusion has often been considered to reflect direct interactions between neural activations in different primary sensory cortices. Against this simple account, here we show that simply changing the inducer signals to featurally distinct signals (e.g. high- and low-pitch beeps) abolishes the illusory double flash. This result suggests that a critical component underlying the illusion is perceptual grouping of the inducer signals, consistent with the notion that multisensory combination is preceded by determination of whether the relevant signals share a common source of origin.


Frontiers in Psychology | 2013

Audio-visual temporal recalibration can be constrained by content cues regardless of spatial overlap

Warrick Roseboom; Takahiro Kawabe; Shin'ya Nishida

It has now been well established that the point of subjective synchrony for audio and visual events can be shifted following exposure to asynchronous audio-visual presentations, an effect often referred to as temporal recalibration. Recently it was further demonstrated that it is possible to concurrently maintain two such recalibrated estimates of audio-visual temporal synchrony. However, it remains unclear precisely what defines a given audio-visual pair such that it is possible to maintain a temporal relationship distinct from other pairs. It has been suggested that spatial separation of the different audio-visual pairs is necessary to achieve multiple distinct audio-visual synchrony estimates. Here we investigated whether this is necessarily true. Specifically, we examined whether it is possible to obtain two distinct temporal recalibrations for stimuli that differed only in featural content. Using both complex (audio-visual speech; see Experiment 1) and simple stimuli (high and low pitch audio matched with either vertically or horizontally oriented Gabors; see Experiment 2), we found concurrent, and opposite, recalibrations despite there being no spatial difference in presentation location at any point throughout the experiment. This result supports the notion that the content of an audio-visual pair alone can be used to constrain distinct audio-visual synchrony estimates regardless of spatial overlap.


Current Opinion in Behavioral Sciences | 2016

Adaptation for multisensory relative timing

Daniel Linares; Ignasi Cos; Warrick Roseboom

Perception of relative timing for signals arising from different sensory modalities depends on the recent history of experienced asynchrony between the signals. Recent findings suggest that the changes in perceived relative timing following asynchrony exposure parallel the perceptual changes caused by adaptation to non-temporal attributes. In both cases, previous sensory stimulation changes discriminability, and briefly presented adaptors are sufficient to produce perceptual changes that, functionally, can be consistent with repulsion and recalibration. Furthermore, a new class of after-effects, in which reports are biased in the direction of the adaptor, also occurs for both temporal and non-temporal attributes. Computationally, the effects of previous sensory stimulation on behavior have been assessed using Bayesian and population code models.
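The Bayesian modelling approach mentioned above can be illustrated with a minimal sketch: treating temporal recalibration as a Gaussian prior over audio-visual lag that shifts with recent asynchrony exposure, the perceived asynchrony becomes a precision-weighted average of the current measurement and that prior. All numerical values below are hypothetical illustrations, not parameters from the paper.

```python
# Minimal Gaussian-Bayesian sketch of audio-visual temporal
# recalibration. All numbers are illustrative assumptions only.

def posterior_mean(measured_ms, sigma_meas, prior_ms, sigma_prior):
    """Combine a noisy asynchrony measurement with a Gaussian prior
    over audio-visual lag; the posterior mean is a precision-weighted
    average of the two."""
    w_meas = 1.0 / sigma_meas**2
    w_prior = 1.0 / sigma_prior**2
    return (w_meas * measured_ms + w_prior * prior_ms) / (w_meas + w_prior)

# Before adaptation: prior centred on 0 ms (audio and vision in sync).
baseline = posterior_mean(measured_ms=30.0, sigma_meas=40.0,
                          prior_ms=0.0, sigma_prior=40.0)

# After exposure to audio-lagging presentations, the prior shifts to
# +50 ms, so the same physical measurement is now judged differently.
adapted = posterior_mean(measured_ms=30.0, sigma_meas=40.0,
                         prior_ms=50.0, sigma_prior=40.0)

print(round(baseline, 1))  # 15.0
print(round(adapted, 1))   # 40.0
```

With equal measurement and prior noise the estimate sits halfway between the two, so shifting the prior by exposure shifts perceived synchrony, which is the qualitative signature of recalibration.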


Scientific Reports | 2017

A deep-dream virtual reality platform for studying altered perceptual phenomenology

Keisuke Suzuki; Warrick Roseboom; David J. Schwartzman; Anil K. Seth

Altered states of consciousness, such as psychotic or pharmacologically-induced hallucinations, provide a unique opportunity to examine the mechanisms underlying conscious perception. However, the phenomenological properties of these states are difficult to isolate experimentally from other, more general physiological and cognitive effects of psychoactive substances or psychopathological conditions. Thus, simulating phenomenological aspects of altered states in the absence of these other more general effects provides an important experimental tool for consciousness science and psychiatry. Here we describe such a tool, which we call the Hallucination Machine. It comprises a novel combination of two powerful technologies: deep convolutional neural networks (DCNNs) and panoramic videos of natural scenes, viewed immersively through a head-mounted display (panoramic VR). By doing this, we are able to simulate visual hallucinatory experiences in a biologically plausible and ecologically valid way. Two experiments illustrate potential applications of the Hallucination Machine. First, we show that the system induces visual phenomenology qualitatively similar to classical psychedelics. In a second experiment, we find that simulated hallucinations do not evoke the temporal distortion commonly associated with altered states. Overall, the Hallucination Machine offers a valuable new technique for simulating altered phenomenology without directly altering the underlying neurophysiology.






bioRxiv | 2018

A Sensory Processing Hierarchy for Thermal Touch: Thermal Adaptation Occurs Prior to Thermal-Tactile Integration

Hsin-Ni Ho; Hiu Mei Chow; Sayaka Tsunokake; Warrick Roseboom

The brain consistently faces the challenge of whether and how to combine available information sources to estimate the properties of an object explored by hand. Thermal referral (TR) is a phenomenon that demonstrates how the thermal and tactile modalities coordinate to resolve inconsistencies in spatial and thermal information. When the middle three fingers of one hand are thermally stimulated, but only the outer two fingers are heated (or cooled), thermal uniformity is perceived across all three fingers. This illusory experience of thermal uniformity in TR compensates for the discontinuity in thermal sensation across the sites in contact. The neural locus of TR is unclear. While TR reflects the diffuse nature of the thermoceptive system, its similarities to perceptual filling-in and its facilitative role in object perception also suggest that TR might involve inference processes associated with object perception. To clarify the position of this thermo-tactile interaction in the sensory processing hierarchy, we used perceptual adaptation and Bayesian decision modelling techniques. Our results indicate that TR adaptation takes place at a peripheral stage, where information about temperature inputs is still preserved for each finger, and that the thermal-tactile interaction occurs after this stage. We also show that the temperature integration across the three fingers in TR is consistent with a precision-weighted averaging effect, i.e., Bayesian cue combination. Altogether, our findings suggest that in the sensory processing hierarchy of thermal touch, thermal adaptation occurs prior to thermo-tactile integration, which combines thermal and tactile information into a unified percept that facilitates object recognition.

Significance Statement: Thermal touch refers to the perception of the temperature of objects in contact with the skin and is key to object recognition based on thermal cues. While object perception is an inference process involving multisensory inputs, thermal referral (TR) is an illusion demonstrating how the brain's interpretation of object temperature can deviate from physical reality. Here we used TR to explore the processing hierarchy of thermal touch. We show that adaptation of thermal perception occurs prior to integration of thermal information across tactile locations. Further, we show that TR results from simple averaging of thermal sensation across locations. Our results illuminate the flexibility of the processing that underlies thermal-tactile interactions and facilitates object exploration and identification in our complicated natural environment.
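The precision-weighted averaging (Bayesian cue combination) described in this abstract can be sketched in a few lines; the temperatures and noise levels below are illustrative assumptions, not data or parameters from the study.

```python
# Precision-weighted averaging across fingers, as in Bayesian cue
# combination: each finger's temperature estimate is weighted by its
# reliability (inverse variance). All values are illustrative only.

def combine(estimates, sigmas):
    """Return the precision-weighted average of noisy estimates."""
    weights = [1.0 / s**2 for s in sigmas]
    total = sum(weights)
    return sum(w * x for w, x in zip(weights, estimates)) / total

# Hypothetical scenario: outer two fingers warmed to 38 C, middle
# finger physically neutral at 33 C.
temps = [38.0, 33.0, 38.0]
# Assume the middle finger's thermal signal is less reliable.
sigmas = [1.0, 2.0, 1.0]

print(round(combine(temps, sigmas), 2))  # 37.44
```

Because the less reliable middle-finger signal gets a smaller weight, the combined estimate sits close to the outer fingers' temperature, qualitatively matching the uniform warmth reported across all three fingers in TR.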


Journal of Vision | 2018

Serial dependence in the perception of visual variance

Marta Suárez-Pinilla; Anil K. Seth; Warrick Roseboom

The recent history of perceptual experience has been shown to influence subsequent perception. Classically, this dependence on perceptual history has been examined in sensory-adaptation paradigms, wherein prolonged exposure to a particular stimulus (e.g., a vertically oriented grating) produces changes in perception of subsequently presented stimuli (e.g., the tilt aftereffect). More recently, several studies have investigated the influence of shorter perceptual exposure with effects, referred to as serial dependence, being described for a variety of low- and high-level perceptual dimensions. In this study, we examined serial dependence in the processing of dispersion statistics, namely variance—a key descriptor of the environment and indicative of the precision and reliability of ensemble representations. We found two opposite serial dependences operating at different timescales, and likely originating at different processing levels: A positive, Bayesian-like bias was driven by the most recent exposures, dependent on feature-specific decision making and appearing only when high confidence was placed in that decision; and a longer lasting negative bias—akin to an adaptation aftereffect—becoming manifest as the positive bias declined. Both effects were independent of spatial presentation location and the similarity of other close traits, such as mean direction of the visual variance stimulus. These findings suggest that visual variance processing occurs in high-level areas but is also subject to a combination of multilevel mechanisms balancing perceptual stability and sensitivity, as with many different perceptual dimensions.
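The two opposing biases reported above can be given a minimal sketch, assuming a fast attractive pull toward the most recent stimuli and a slower, adaptation-like repulsion from the longer-term average; the weights and timescales are illustrative assumptions, not fits to the data.

```python
# Toy model of two opposing serial dependences on a perceived quantity
# (here, stimulus variance): a fast positive pull toward the most
# recent stimuli and a slower negative, adaptation-like repulsion from
# the longer-term average. Weights/timescales are illustrative only.

def biased_percept(history, current, w_pos=0.3, w_neg=0.1,
                   fast_n=1, slow_n=10):
    """Return the biased percept of `current` given stimulus history."""
    fast = sum(history[-fast_n:]) / min(fast_n, len(history))
    slow = sum(history[-slow_n:]) / min(slow_n, len(history))
    # Attraction toward recent stimuli, repulsion from the long-term mean.
    return current + w_pos * (fast - current) - w_neg * (slow - current)

history = [1.0] * 10  # ten trials of high-variance stimuli
print(round(biased_percept(history, 0.5), 2))  # 0.6
```

With this parameterisation the fast attractive term dominates on short timescales, while a long run of similar stimuli builds up the repulsive term, mirroring the positive bias giving way to a negative after-effect as described above.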


I-perception | 2018

The Illusion of Uniformity Does Not Depend on the Primary Visual Cortex: Evidence From Sensory Adaptation

Marta Suárez-Pinilla; Anil K. Seth; Warrick Roseboom

Visual experience appears richly detailed despite the poor resolution of the majority of the visual field, thanks to foveal-peripheral integration. The recently described uniformity illusion (UI), wherein peripheral elements of a pattern take on the appearance of foveal elements, may shed light on this integration. We examined the basis of UI by generating adaptation to a pattern of Gabors suitable for producing UI on orientation. After removing the pattern, participants reported the tilt of a single peripheral Gabor. The tilt aftereffect followed the physical adapting orientation rather than the global orientation perceived under UI, even when the illusion had been reported for a long time. Conversely, a control experiment replacing illusory uniformity with a physically uniform Gabor pattern for the same durations did produce an aftereffect to the global orientation. Results indicate that UI is not associated with changes in sensory encoding at V1 but likely depends on higher level processes.

Collaboration


Dive into Warrick Roseboom's collaborations.

Top Co-Authors


Shin'ya Nishida

Nippon Telegraph and Telephone
