
Publication


Featured research published by Norimichi Kitagawa.


Nature | 2002

Hearing visual motion in depth

Norimichi Kitagawa; Shigeru Ichihara

Auditory spatial perception is strongly affected by visual cues. For example, if auditory and visual stimuli are presented synchronously but from different positions, the auditory event is mislocated towards the locus of the visual stimulus—the ventriloquism effect. This ‘visual capture’ also occurs in motion perception in which a static auditory stimulus appears to move with the visual moving object. We investigated how the human perceptual system coordinates complementary inputs from auditory and visual senses. Here we show that an auditory aftereffect occurs from adaptation to visual motion in depth. After a few minutes of viewing a square moving in depth, a steady sound was perceived as changing loudness in the opposite direction. Adaptation to a combination of auditory and visual stimuli changing in a compatible direction increased the aftereffect and the effect of visual adaptation almost disappeared when the directions were opposite. On the other hand, listening to a sound changing in intensity did not affect the visual changing-size aftereffect. The results provide psychophysical evidence that, for processing of motion in depth, the auditory system responds to both auditory changing intensity and visual motion in depth.


International Journal of Psychophysiology | 2003

Audio–visual integration in temporal perception

Yuji Wada; Norimichi Kitagawa; Kaoru Noguchi

In situations of audio-visual interaction, research has generally found that audition prevails over vision in temporal perception, while vision is dominant over audition for spatial perception. Modality appropriateness to a given task generally determines the direction of this inter-modality effect. However, we found a reverse effect in some situations where a change in the frequency of visual stimuli was associated with a perceived change in the frequency of auditory stimuli. In our experiment, 12 participants were asked to judge the change in the frequency of visual and auditory stimuli using visual flicker and auditory flutter stimuli. In some conditions either the auditory or the visual information was ambiguous. In addition to confirming the expected finding that a change in the frequency of the auditory stimuli induced a perceived change in the frequency of the visual stimuli, we found a new phenomenon. When ambiguous auditory temporal cues were presented, the change in the frequency of the visual stimuli was associated with a perceived change in the frequency of the auditory stimuli. This suggests that cross-modal asymmetry effects are influenced by the reliability of visual and auditory information as well as modality appropriateness.


Current Biology | 2012

Action sounds recalibrate perceived tactile distance

Ana Tajadura-Jiménez; Aleksander Väljamäe; Iwaki Toshima; Toshitaka Kimura; Norimichi Kitagawa

Almost every bodily movement, from the most complex to the most mundane, such as walking, can generate impact sounds that contain 360° spatial information of high temporal resolution. Given the strong connection of auditory cues to our body actions, and the dependency of body-awareness on the interaction between peripheral sensory inputs and mental body-representations, one could assume that audition plays a specific role in this interaction. Despite the conclusive evidence for the role that the integration of vision, touch and proprioception plays in updating body-representations [1,2], hardly any study has looked at the contribution of audition. We show that the representation of a key property of one's body, such as its length, is affected by the sound of one's actions. Participants tapped on a surface while progressively extending their right arm sideways, and in synchrony with each tap participants listened to a tapping sound. In the critical condition, the sound originated at double the distance at which participants actually tapped. After exposure to this condition, tactile distances on the test right arm, as compared to distances on the reference left arm, felt bigger than those before the exposure. No evidence of changes in tactile distance reports was found at the quadruple tapping sound distance or in the asynchronous auditory feedback conditions. Our results suggest that tactile perception is referenced to an implicit body-representation which is informed by auditory feedback. This is the first evidence of the contribution of self-produced sounds to body-representation, addressing the auditory-dependent plasticity of body-representation and its spatial boundaries.


Neuropsychologia | 2009

Auditory-somatosensory multisensory interactions are spatially modulated by stimulated body surface and acoustic spectra

Ana Tajadura-Jiménez; Norimichi Kitagawa; Aleksander Väljamäe; Massimiliano Zampini; Micah M. Murray; Charles Spence

Previous research has provided inconsistent results regarding the spatial modulation of auditory-somatosensory interactions. The present study reports three experiments designed to investigate the nature of these interactions in the space close to the head. Human participants made speeded detection responses to unimodal auditory, somatosensory, or simultaneous auditory-somatosensory stimuli. In Experiment 1, electrocutaneous stimuli were presented to either earlobe, while auditory stimuli were presented from the same versus opposite sides, and from one of two distances (20 vs. 70 cm) from the participants' head. The results demonstrated a spatial modulation of auditory-somatosensory interactions when auditory stimuli were presented from close to the head. In Experiment 2, electrocutaneous stimuli were delivered to the hands, which were placed either close to or far from the head, while the auditory stimuli were again presented at one of two distances. The results revealed that the spatial modulation observed in Experiment 1 was specific to the particular body part stimulated (head) rather than to the region of space (i.e. around the head) where the stimuli were presented. The results of Experiment 3 demonstrated that sounds that contain high-frequency components are particularly effective in eliciting this auditory-somatosensory spatial effect. Taken together, these findings help to resolve inconsistencies in the previous literature and suggest that auditory-somatosensory multisensory integration is modulated by the stimulated body surface and the acoustic spectra of the stimuli presented.


Experimental Brain Research | 2005

Investigating the effect of a transparent barrier on the crossmodal congruency effect

Norimichi Kitagawa; Charles Spence

A transparent barrier, such as a window, protects us from approaching objects (such as flies), although we can still see the objects coming toward us through the occluder. In the present study, we examined whether this potential dissociation between tactile and visual experience introduced by the presence of a transparent barrier also affects visual–tactile interactions in normal participants, as indexed by performance in the crossmodal congruency task. Participants discriminated the elevation of vibrotactile target stimuli (upper vs lower) presented to the left or right hand while trying to ignore visual distractor lights that could independently be presented from upper or lower locations on either the same or the opposite side. A transparent occluder was placed between the vibrotactile targets and the visual stimuli (the barrier occluded the vibrotactile targets in Experiment 1 and the visual distractors in Experiment 2). Vibrotactile elevation discrimination performance was slower and less accurate when the distractor lights were presented from incongruent locations relative to the target (e.g., lower tactile target with upper distractor light). However, there were no significant differences between performance with and without the transparent occluder present. This pattern of results was replicated in Experiment 3 under conditions where the participants were periodically required to reach around the transparent occluder in order to press buttons placed near to the visual distractors. Taken together, these results support recent neuropsychological evidence [Farnè et al. (2003) Int J Psychophysiol 50:51–61] suggesting that visual–tactile interactions in peripersonal space are not necessarily modulated by conscious awareness of the impossibility of our hand being touched by the visual stimuli.


Perception | 2007

Contrast and Depth Perception: Effects of Texture Contrast and Area Contrast

Shigeru Ichihara; Norimichi Kitagawa; Hiromi Akutsu

Many objects in natural scenes have textures on their surfaces. The contrast of a textured surface (the texture contrast) decreases as viewing distance increases. Similarly, the contrast between an object's surface and the background (the area contrast) decreases as viewing distance increases. The texture contrast and the area contrast were defined by the contrast between random dots, and by the contrast between the average luminance of the dot pattern and the luminance of the background, respectively. To examine how these two types of contrast influence depth perception, we ran two experiments. In both experiments two areas of random-dot patterns were presented against a uniform background, and participants rated the relative depth between the two areas. We found that the rated depth of the patterned areas increased with increases in texture contrast. Furthermore, the effect of the texture contrast on depth judgment increased when the area contrast became low.


Journal of Experimental Psychology: Human Perception and Performance | 2009

The Tactile Continuity Illusion

Norimichi Kitagawa; Yuka Igarashi; Makio Kashino

We can perceive the continuity of an object or event by integrating spatially/temporally discrete sensory inputs. The mechanism underlying this perception of continuity has intrigued many researchers and has been well documented in both the visual and auditory modalities. The present study shows, for the first time to our knowledge, that an illusion of continuity also occurs with vibrotactile stimulation. We found that when brief temporal gaps inserted into a vibrotactile target were filled with vibrotactile noise, the target vibration was perceived to continue through the noise if the target vibration was sufficiently weak relative to the noise. Importantly, the illusory continuity of the vibration could not be distinguished from a physically continuous vibration. These results therefore suggest that the continuity illusion is common to multiple sensory modalities and that it reflects a fundamental principle of perception.


PLOS ONE | 2013

Speech Misperception: Speaking and Seeing Interfere Differently with Hearing

Takemi Mochida; Toshitaka Kimura; Sadao Hiroya; Norimichi Kitagawa; Hiroaki Gomi; Tadahisa Kondo

Speech perception is thought to be linked to speech motor production. This linkage is considered to mediate multimodal aspects of speech perception, such as audio-visual and audio-tactile integration. However, direct coupling between articulatory movement and auditory perception has been little studied. The present study reveals a clear dissociation between the effects of a listener’s own speech action and the effects of viewing another’s speech movements on the perception of auditory phonemes. We assessed the intelligibility of the syllables [pa], [ta], and [ka] when listeners silently and simultaneously articulated syllables that were congruent/incongruent with the syllables they heard. The intelligibility was compared with a condition where the listeners simultaneously watched another’s mouth producing congruent/incongruent syllables, but did not articulate. The intelligibility of [ta] and [ka] were degraded by articulating [ka] and [ta] respectively, which are associated with the same primary articulator (tongue) as the heard syllables. But they were not affected by articulating [pa], which is associated with a different primary articulator (lips) from the heard syllables. In contrast, the intelligibility of [ta] and [ka] was degraded by watching the production of [pa]. These results indicate that the articulatory-induced distortion of speech perception occurs in an articulator-specific manner while visually induced distortion does not. The articulator-specific nature of the auditory-motor interaction in speech perception suggests that speech motor processing directly contributes to our ability to hear speech.


Experimental Brain Research | 2010

Influence of the body on crossmodal interference effects between tactile and two-dimensional visual stimuli

Yuka Igarashi; Norimichi Kitagawa; Shigeru Ichihara

We investigated how irrelevant two-dimensional visual stimuli interfere with tactile discrimination performance using the crossmodal interference task. Participants made speeded discrimination responses to the location of vibrotactile targets presented to either the tip or the base of their forefinger, while trying to ignore visual distractors simultaneously presented to either side of the central fixation on a front display. The array of visual distractors was presented at four different angles, and the participants rested their stimulated hand on a desk in either a forward-pointing or inward-pointing posture. Although there was apparently no specific spatial relationship between the tactile and two-dimensional visual stimulus arrays, and the spatial response requirement was controlled, visuotactile interference effects occurred between them. Moreover, we found that the spatial relationships between the arrays depended on the potential range of movement and the current posture of the vibrotactile-stimulated hand, and possibly on the stored orientation of our hand representation, even without any explicit cue referring to hands. Our results suggest that visuotactile spatial interactions involve multiple mechanisms related to our bodily perception and our internal body representation.


Frontiers in Psychology | 2016

Action sounds modulate arm reaching movements

Ana Tajadura-Jiménez; Torsten Marquardt; David Swapp; Norimichi Kitagawa; Nadia Bianchi-Berthouze

Our mental representations of our body are continuously updated through multisensory bodily feedback as we move and interact with our environment. Although it is often assumed that these internal models of body-representation are used to successfully act upon the environment, only a few studies have actually looked at how body-representation changes influence goal-directed actions, and none have looked at this in relation to body-representation changes induced by sound. The present work examines this question for the first time. Participants reached for a target object before and after adaptation periods during which the sounds produced by their hand tapping a surface were spatially manipulated to induce a representation of an elongated arm. After adaptation, participants’ reaching movements were performed in a way consistent with having a longer arm, in that their reaching velocities were reduced. These kinematic changes suggest auditory-driven recalibration of the somatosensory representation of the arm morphology. These results provide support to the hypothesis that one’s represented body size is used as a perceptual ruler to measure objects’ distances and to accordingly guide bodily actions.

Collaboration


Top co-authors of Norimichi Kitagawa:

Makio Kashino (Tokyo Institute of Technology)
Shigeru Ichihara (Tokyo Metropolitan University)
Aleksander Väljamäe (Chalmers University of Technology)
Yuka Igarashi (Tokyo Metropolitan University)