Souta Hidaka
Rikkyo University
Publication
Featured research published by Souta Hidaka.
PLOS ONE | 2009
Souta Hidaka; Yuko Manaka; Wataru Teramoto; Yoichi Sugita; Ryota Miyauchi; Jiro Gyoba; Yôiti Suzuki; Yukio Iwaya
Background: Although vision may provide the most salient information, audition provides important cues regarding stimulus motion. It has been reported that a sound of fixed intensity tends to be judged as decreasing in intensity after adaptation to looming visual stimuli, or as increasing in intensity after adaptation to receding visual stimuli. This audiovisual interaction in motion aftereffects indicates that there are multimodal contributions to motion perception at early levels of sensory processing. However, there has been no report that sounds can induce the perception of visual motion. Methodology/Principal Findings: A visual stimulus blinking at a fixed location was perceived to be moving laterally when the flash onset was synchronized to an alternating left-right sound source. This illusory visual motion was strengthened with increasing retinal eccentricity (2.5 deg to 20 deg) and occurred more frequently when the onsets of the auditory and visual stimuli were synchronized. Conclusions/Significance: We clearly demonstrated that the alternation of sound location induces illusory visual motion when vision cannot provide accurate spatial information. The present findings strongly suggest that the neural representations of auditory and visual motion processing can bias each other, yielding the best estimates of external events in a complementary manner.
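To make the paradigm concrete, the sketch below builds a trial schedule for this kind of sound-induced visual motion experiment: a flash blinks at a fixed eccentricity while a sound alternates between left and right, with audio-visual onset asynchrony as a factor. It is an illustrative reconstruction, not the authors' stimulus code; the SOA levels, flash interval, and all names are assumptions (only the eccentricities are taken from the abstract).

```python
# Illustrative sketch of a sound-induced visual motion trial schedule.
# All timing values and names are assumptions, not the authors' parameters.
from dataclasses import dataclass
from itertools import product

@dataclass
class Trial:
    eccentricity_deg: float  # retinal eccentricity of the blinking flash
    soa_ms: float            # audio-visual onset asynchrony (0 = synchronized)

ECCENTRICITIES = [2.5, 5.0, 10.0, 20.0]  # deg, range from the abstract
SOAS = [0.0, 150.0, 300.0]               # ms, hypothetical asynchrony levels

def build_schedule(n_flashes: int = 10, flash_interval_ms: float = 500.0):
    """Return (trial, events) pairs; each event is (time_ms, kind, side)."""
    schedule = []
    for ecc, soa in product(ECCENTRICITIES, SOAS):
        events = []
        for i in range(n_flashes):
            t = i * flash_interval_ms
            side = "left" if i % 2 == 0 else "right"  # alternating sound source
            events.append((t, "flash", "center"))     # flash never moves
            events.append((t + soa, "sound", side))   # sound alternates L/R
        schedule.append((Trial(ecc, soa), events))
    return schedule

if __name__ == "__main__":
    for trial, events in build_schedule(n_flashes=4)[:2]:
        print(trial, events[:4])
```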
PLOS ONE | 2010
Wataru Teramoto; Souta Hidaka; Yoichi Sugita
Background: Vision provides the most salient information with regard to stimulus motion, but audition can also provide important cues that affect visual motion perception. Here, we show that sounds containing no motion or positional cues can induce illusory visual motion perception for static visual objects. Methodology/Principal Findings: Two circles placed side by side were presented in alternation, producing apparent motion perception, and each onset was accompanied by a tone burst of a specific and unique frequency. After exposure to this visual apparent motion with tones for a few minutes, the tones became drivers for illusory motion perception. When the flash onset was synchronized to tones of alternating frequencies, a circle blinking at a fixed location was perceived as moving laterally in the same direction as the previously exposed apparent motion. Furthermore, the effect lasted for at least a few days and was observed most clearly at the retinal position that had previously been exposed to apparent motion with tone bursts. Conclusions/Significance: The present results indicate that a strong association between a sound sequence and visual motion is easily formed within a short period and that, once the association is formed, sounds are able to trigger visual motion perception for a static visual object.
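A minimal sketch of the auditory side of such an exposure phase is given below: two tone bursts of unique frequencies alternate at a fixed interval, each marking the onset of one of the two circle positions. The frequencies, durations, and sample rate are hypothetical; the abstract specifies only that each onset was paired with a tone of a specific and unique frequency.

```python
# Sketch of the exposure-phase sound track: tones of two unique
# frequencies alternate, one per circle position. All values are
# illustrative assumptions.
import numpy as np

FS = 44100                       # sample rate (Hz), assumed
TONE_MS = 100                    # tone-burst duration, assumed
ISI_MS = 400                     # inter-onset interval, assumed
F_LEFT, F_RIGHT = 400.0, 800.0   # hypothetical tone frequencies

def tone_burst(freq_hz, dur_ms, fs=FS):
    t = np.arange(int(fs * dur_ms / 1000)) / fs
    burst = np.sin(2 * np.pi * freq_hz * t)
    ramp = np.minimum(1.0, np.linspace(0, 10, burst.size))  # ~10% onset ramp
    return burst * ramp * ramp[::-1]                        # onset/offset ramps

def exposure_track(n_pairs=300):  # ~4 minutes, matching "a few minutes"
    gap = np.zeros(int(FS * (ISI_MS - TONE_MS) / 1000))
    one_cycle = np.concatenate([tone_burst(F_LEFT, TONE_MS), gap,
                                tone_burst(F_RIGHT, TONE_MS), gap])
    return np.tile(one_cycle, n_pairs)
```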
Neuroscience Letters | 2010
Wataru Teramoto; Yuko Manaka; Souta Hidaka; Yoichi Sugita; Ryota Miyauchi; Shuichi Sakamoto; Jiro Gyoba; Yukio Iwaya; Yôiti Suzuki
The alternation of sounds in the left and right ears induces motion perception of a static visual stimulus (SIVM: sound-induced visual motion). In this case, binaural cues are of considerable benefit in perceiving the locations and movements of the sounds. The present study investigated how a spectral cue, another important cue for sound localization and motion perception, contributes to the SIVM. In the experiments, two alternating sound sources aligned in the vertical plane were presented in synchrony with a static visual stimulus. We found that the proportion of SIVM reports and the magnitude of the perceived movements of the static visual stimulus increased with increasing retinal eccentricity (1.875 to 30 degrees), indicating the influence of the spectral cue on the SIVM. These findings suggest that the SIVM generalizes to the whole two-dimensional audio-visual space, and strongly imply that there are common neural substrates for auditory and visual motion perception in the brain.
PLOS ONE | 2011
Souta Hidaka; Wataru Teramoto; Yoichi Sugita; Yuko Manaka; Shuichi Sakamoto; Yôiti Suzuki
Background: Vision provides the most salient information with regard to stimulus motion. However, it has recently been demonstrated that static visual stimuli can be perceived as moving laterally when paired with alternating left-right sound sources. The underlying mechanism of this phenomenon remains unclear; it has not yet been determined whether auditory motion signals, rather than auditory positional signals, can directly contribute to visual motion perception. Methodology/Principal Findings: Static visual flashes were presented at retinal locations outside the fovea, together with lateral auditory motion provided by a virtual stereo noise source smoothly shifting in the horizontal plane. The flash appeared to move with the auditory motion when the spatiotemporal position of the flashes was in the middle of the auditory motion trajectory. Furthermore, the lateral auditory motion altered visual motion perception in a global motion display, in which localized motion signals from multiple visual stimuli are combined to produce a coherent visual motion percept. Conclusions/Significance: These findings suggest that direct interactions exist between auditory and visual motion signals, and that there might be common neural substrates for auditory and visual motion processing.
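The following sketch shows one simple way to approximate a "virtual stereo noise source smoothly shifting in the horizontal plane": constant-power panning of a noise signal, which produces a time-varying interaural level difference. Experiments of this kind typically synthesize auditory motion with head-related transfer functions, so this simplification, and every parameter value, is an assumption for illustration only.

```python
# Minimal approximation of smoothly moving auditory noise via
# constant-power stereo panning. All values are assumptions.
import numpy as np

FS = 44100        # sample rate (Hz), assumed
DUR_S = 2.0       # one left-to-right sweep, assumed

def moving_noise(duration_s=DUR_S, fs=FS):
    n = int(fs * duration_s)
    noise = np.random.randn(n)
    # Pan position sweeps linearly from -1 (full left) to +1 (full right).
    pan = np.linspace(-1.0, 1.0, n)
    # Constant-power panning: left/right gains trade off smoothly.
    theta = (pan + 1.0) * np.pi / 4.0
    left = noise * np.cos(theta)
    right = noise * np.sin(theta)
    stereo = np.stack([left, right], axis=1)
    return stereo / np.max(np.abs(stereo))  # normalize to avoid clipping

# A flash presented while `pan` is near 0 sits at the spatiotemporal
# middle of the auditory trajectory, the condition under which the
# flash appeared to move with the sound.
```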
Neuroscience Research | 2012
Souta Hidaka; Hiroshi Shibata; Michiyo Kurihara; Akihiro Tanaka; Akitsugu Konno; Suguru Maruyama; Jiro Gyoba; Hiroko Hagiwara; Masatoshi Koizumi
We investigated brain activity in 3- to 5-year-old preschoolers as they listened to connected speech stimuli in Japanese (first language), English (second language), and Chinese (a rarely encountered foreign language) using near-infrared spectroscopy. Unlike the younger preschoolers, who had been exposed to English for almost one year, the older preschoolers, who had been exposed to English for almost two years, showed higher activity in the bilateral frontal regions for Japanese and English speech stimuli than for Chinese. This tendency was similar to that observed in adults who had studied English for some years. These results indicate that exposure to a second language affects preschoolers' brain activity in response to language stimuli.
Journal of Vision | 2011
Souta Hidaka; Masayoshi Nagai; Allison B. Sekuler; Patrick J. Bennett; Jiro Gyoba
Letter discrimination performance is degraded when a letter is presented within the apparent motion (AM) trajectory of a spot. This finding suggests that the internal representation of AM stimuli can perceptually interact with other stimuli. In this study, we demonstrated that AM interference also occurs for pattern detection: target (Gabor patch) detection performance was degraded within an AM trajectory. Further, this AM interference weakened as the difference in orientation between the AM stimuli and the target became greater. We also found that, when the inducers changed orientation during AM, interference occurred for targets whose orientations were spatiotemporally intermediate between those of the inducers. In contrast, phase differences among the stimuli did not affect the occurrence of AM interference. These findings suggest that AM stimuli and their internal representations affect the lower-level visual processes involved in detecting a pattern within the AM trajectory, and that the internal object representation of an AM stimulus selectively reflects and maintains the stimulus attributes.
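For concreteness, here is a standard way to generate the kind of Gabor-patch target used in such a detection task: a sinusoidal grating windowed by a Gaussian envelope, with orientation as a free parameter. The size, spatial frequency, and envelope values below are illustrative assumptions, not the study's parameters.

```python
# Gabor patch: a sinusoidal grating under a Gaussian envelope.
# Parameter values are illustrative assumptions.
import numpy as np

def gabor(size_px=128, sf_cpp=0.05, theta_deg=0.0, sigma_px=20.0, phase=0.0):
    """Return a luminance image (values in roughly [-1, 1])."""
    half = size_px // 2
    y, x = np.mgrid[-half:half, -half:half]
    theta = np.deg2rad(theta_deg)
    xr = x * np.cos(theta) + y * np.sin(theta)          # rotate grating axis
    grating = np.cos(2 * np.pi * sf_cpp * xr + phase)   # carrier
    envelope = np.exp(-(x**2 + y**2) / (2 * sigma_px**2))  # Gaussian window
    return grating * envelope

# Interference was strongest when AM inducers and target shared
# orientation; comparing gabor(theta_deg=0) with gabor(theta_deg=90)
# instantiates the orientation-difference manipulation.
```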
BMC Neuroscience | 2011
Souta Hidaka; Wataru Teramoto; Maori Kobayashi; Yoichi Sugita
Background: After prolonged exposure to a paired presentation of different types of signals (e.g., color and motion), one of the signals (color) becomes a driver for the other signal (motion). This phenomenon, known as the contingent motion aftereffect, indicates that the brain can establish new neural representations even in the adult brain. However, the contingent motion aftereffect has been reported only within the visual or auditory domain. Here, we demonstrate that a visual motion aftereffect can be contingent on a specific sound. Results: Dynamic random dots moving in an alternating right or left direction were presented to the participants, and each direction of motion was accompanied by an auditory tone of a unique and specific frequency. After a 3-minute exposure, the tones began to exert a marked influence on visual motion perception: the percentage of coherently moving dots required to trigger motion perception changed systematically depending on the tone. Furthermore, this effect lasted for at least two days. Conclusions: These results indicate that a new neural representation can be rapidly established between the auditory and visual modalities.
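The display here is a dynamic random-dot kinematogram, in which motion coherence (the percentage of dots moving in a common direction) is the measured quantity. A minimal sketch follows, with assumed dot counts, speed, and field size; it is not the authors' stimulus code.

```python
# Random-dot kinematogram: a fraction of dots drift coherently while
# the rest are repositioned at random each frame. Values are assumptions.
import numpy as np

rng = np.random.default_rng(0)
N_DOTS = 200
FIELD = 10.0   # deg, square dot field, assumed
SPEED = 0.1    # deg per frame, assumed

def step_dots(xy, coherence, direction):
    """Advance one frame; `coherence` in [0, 1], `direction` is +1 or -1."""
    n_coh = int(coherence * len(xy))
    idx = rng.permutation(len(xy))
    coh, rnd = idx[:n_coh], idx[n_coh:]
    xy[coh, 0] += direction * SPEED                   # signal dots drift
    xy[rnd] = rng.uniform(0, FIELD, (len(rnd), 2))    # noise dots reposition
    xy[:, 0] %= FIELD                                 # wrap horizontally
    return xy

# The tone-contingent aftereffect is measured as a shift in the
# coherence needed to report motion, depending on which tone plays.
xy = rng.uniform(0, FIELD, (N_DOTS, 2))
for _ in range(60):
    xy = step_dots(xy, coherence=0.3, direction=+1)
```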
Journal of Vision | 2012
Wataru Teramoto; Souta Hidaka; Yoichi Sugita; Shuichi Sakamoto; Jiro Gyoba; Yukio Iwaya; Yôiti Suzuki
Auditory temporal or semantic information often modulates visual motion events. However, the effects of auditory spatial information on visual motion perception have been reported to be absent, or small, at the perceptual level. This could reflect the superiority of vision over hearing in the reliability of motion information. Here, we manipulated the retinal eccentricity of visual motion and challenged these previous findings. Visual apparent motion stimuli were presented in conjunction with a sound delivered alternately from two horizontally or vertically aligned loudspeakers; the direction of visual apparent motion was always perpendicular to the direction in which the sound alternated. We found that the perceived direction of visual motion could be consistent with the direction in which the sound alternated, or could lie between this direction and that of the actual visual motion. Deviations of the perceived direction of motion from the actual direction were more likely at larger retinal eccentricities. These findings suggest that the auditory and visual modalities mutually influence one another in motion processing, so that the brain obtains the best estimates of external events.
Frontiers in Integrative Neuroscience | 2015
Souta Hidaka; Wataru Teramoto; Yoichi Sugita
Research on crossmodal interactions has garnered much interest in the last few decades. A variety of studies have demonstrated that information from multiple senses (vision, audition, touch, and so on) can interact perceptually in the spatial and temporal domains. Findings regarding crossmodal interactions in the spatiotemporal domain (i.e., motion processing) have also been reported, with new results emerging in the last few years. In this review, we summarize past and recent findings on spatiotemporal processing in crossmodal interactions regarding perception of the external world. A traditional view of crossmodal interactions holds that vision is superior to audition in spatial processing, whereas audition dominates vision in temporal processing. Similarly, vision is considered to dominate the other sensory modalities in spatiotemporal processing (i.e., visual capture). However, recent findings demonstrate that sound can have a driving effect on visual motion perception. Moreover, studies of perceptual associative learning have reported that, after an association is established between a sound sequence without spatial information and visual motion, the sound sequence alone can trigger visual motion perception. Other sensory information, such as motor action or smell, has exhibited similar driving effects on visual motion perception. Additionally, recent brain imaging studies demonstrate that spatiotemporal information from different sensory modalities can evoke similar activation patterns in several brain areas, including the motion processing areas. Based on these findings, we suggest that multimodal information interacts mutually in spatiotemporal processing of the external world, and that common perceptual and neural mechanisms underlie this processing.
Scientific Reports | 2013
Masakazu Ide; Souta Hidaka
An input to a sensory modality (e.g., the sound of an airplane taking off) can suppress the percept of another input to the same modality (e.g., the voices of neighbors talking). This perceptual suppression effect is evidence that neural responses to different inputs interact closely with each other in the brain. While recent studies suggest that close interactions also occur across sensory modalities, a crossmodal perceptual suppression effect had not yet been reported. Here, we demonstrate that tactile stimulation can suppress the percept of visual stimuli: visual orientation discrimination performance was degraded when a tactile vibration was applied to the observer's index finger. We also demonstrated that this tactile suppression of visual perception occurred primarily when the tactile and visual information were spatially and temporally consistent. These findings indicate that neural signals can interact closely and directly, sufficiently to induce perceptual suppression, even across sensory modalities.
Collaboration
National Institute of Advanced Industrial Science and Technology