Takanori Kochiyama
Primate Research Institute
Publication
Featured research published by Takanori Kochiyama.
NeuroImage | 2004
Mitsuo Suzuki; Ichiro Miyai; Takeshi Ono; Ichiro Oda; Ikuo Konishi; Takanori Kochiyama; Kisou Kubota
We investigated changes in regional activation of the frontal cortices, as assessed by changes in hemoglobin oxygenation, during walking at 3 and 5 km/h and running at 9 km/h on a treadmill, using a near-infrared spectroscopic (NIRS) imaging technique. During the acceleration periods immediately preceding the attainment of steady walking or running speed, levels of oxygenated hemoglobin (oxyHb) in the frontal cortices increased, whereas those of deoxygenated hemoglobin (deoxyHb) did not. The changes were greater at higher locomotor speeds in the bilateral prefrontal and premotor cortices, whereas speed-associated changes in the sensorimotor cortices were smaller. Medial prefrontal activation was most prominent during the running task. These results indicate that the prefrontal and premotor cortices are involved in adapting to locomotor speed on the treadmill, and that these areas may participate predominantly in the control of running rather than walking.
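For readers unfamiliar with the technique: NIRS recovers concentration changes of oxyHb and deoxyHb from optical-density changes at two wavelengths via the modified Beer-Lambert law. A minimal sketch of that conversion follows; the extinction coefficients, pathlength, and differential pathlength factor below are illustrative placeholders, not values from this study.

```python
import numpy as np

# Modified Beer-Lambert law:
#   delta_OD(lambda) = (eps_oxy(lambda) * d[oxyHb]
#                       + eps_deoxy(lambda) * d[deoxyHb]) * L * DPF
# Measuring at two wavelengths yields a 2x2 linear system.

# Illustrative extinction coefficients [1/(mM*cm)] (placeholders; real
# analyses use tabulated values for the instrument's wavelengths).
E = np.array([[1.49, 3.84],   # eps_oxy, eps_deoxy at ~760 nm
              [2.32, 1.79]])  # eps_oxy, eps_deoxy at ~840 nm

def hb_changes(delta_od, L_cm=3.0, dpf=6.0):
    """Solve for (d[oxyHb], d[deoxyHb]) in mM from optical-density
    changes at the two wavelengths."""
    return np.linalg.solve(E * L_cm * dpf, delta_od)

# Example: OD changes consistent with rising oxyHb during acceleration.
d_oxy, d_deoxy = hb_changes(np.array([0.010, 0.020]))
print(f"d[oxyHb] = {d_oxy:.4f} mM, d[deoxyHb] = {d_deoxy:.4f} mM")
```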
The Journal of Neuroscience | 2002
Eiichi Naito; Takanori Kochiyama; Ryo Kitada; Satoshi Nakamura; Michikazu Matsumura; Yoshiharu Yonekura; Norihiro Sadato
It has been proposed that motor imagery contains an element of sensory experience (kinesthetic sensation), which substitutes for the sensory feedback that would normally arise from the overt action. However, no evidence has shown whether kinesthetic sensation is centrally simulated during motor imagery. We psychophysically tested whether motor imagery of palmar flexion or dorsiflexion of the right wrist would influence the sensation of illusory palmar flexion elicited by tendon vibration. We also tested whether motor imagery of wrist movement shares neural substrates with the illusory sensation elicited by the peripheral stimuli. Regional cerebral blood flow was measured with H₂¹⁵O and positron emission tomography in 10 right-handed subjects. The tendon of the right wrist extensor was vibrated at 83 Hz (“illusion”) or at 12.5 Hz, which produces no illusion (“vibration”). Subjects imagined wrist movements of alternating palmar flexion and dorsiflexion at the same speed as the experienced illusory movements (“imagery”). A “rest” condition with eyes closed was also included. We identified fields of activation common to the contrasts of imagery versus rest and illusion versus vibration. Motor imagery of palmar flexion psychophysically enhanced the experienced illusory angles of palmar flexion, whereas dorsiflexion imagery reduced them, in the absence of overt movement. Motor imagery and the illusory sensation commonly activated the contralateral cingulate motor areas, supplementary motor area, dorsal premotor cortex, and ipsilateral cerebellum. We conclude that the kinesthetic sensation associated with imagined movement is internally simulated during motor imagery by recruiting multiple motor areas.
Neuroreport | 2001
Wataru Sato; Takanori Kochiyama; Sakiko Yoshikawa; Michikazu Matsumura
To investigate the hypothesis that early visual processing of stimuli might be boosted by signals of emotionality, we analyzed event-related potentials (ERPs) of twelve right-handed normal subjects. Gray-scale still images of faces with emotional (fearful and happy) or neutral expressions were presented randomly while the subjects performed gender discrimination of the faces. The results demonstrated that faces with emotion (both fear and happiness) elicited a larger negative peak at about 270 ms (N270) over the posterior temporal areas, covering a broad range of posterior visual areas. Independent component analysis (ICA) of the ERP data suggested that this posterior N270 was accompanied by synchronized positive activity at the frontal–midline electrode. These findings confirm that emotional signals boost early visual processing of the stimuli. This enhanced activity might be implemented by amygdalar re-entrant projections.
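The ICA step can be illustrated with a minimal sketch: decomposing a channels × time ERP matrix into components, each with a time course and a channel topography. This uses scikit-learn's FastICA on synthetic data; the paper does not specify this particular implementation.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n_channels, n_times, fs = 32, 500, 1000        # synthetic ERP, 1 kHz sampling
t = np.arange(n_times) / fs
erp = 0.5 * rng.standard_normal((n_channels, n_times))
# Embed one component: a negative peak near 270 ms with a fixed topography.
topo = rng.standard_normal(n_channels)
erp += np.outer(topo, -np.exp(-((t - 0.27) ** 2) / (2 * 0.02 ** 2)))

# FastICA expects (samples, features); using time points as samples yields
# per-component time courses, with channel topographies in the mixing matrix.
ica = FastICA(n_components=10, random_state=0)
time_courses = ica.fit_transform(erp.T)        # (n_times, n_components)
topographies = ica.mixing_                     # (n_channels, n_components)

# A component whose time course peaks near 270 ms and whose topography loads
# on posterior-temporal plus frontal-midline channels would capture the
# synchronized N270 activity described above.
peak_ms = t[np.abs(time_courses).argmax(axis=0)] * 1000
print(peak_ms)
```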
NeuroImage | 2004
Wataru Sato; Sakiko Yoshikawa; Takanori Kochiyama; Michikazu Matsumura
Neuroimaging studies have shown activity in the amygdala in response to facial expressions of emotion, but the specific role of the amygdala remains unknown. We hypothesized that the amygdala is involved in emotional, but not basic sensory, processing of facial expressions. To test this hypothesis, we manipulated the direction of emotional faces presented in the unilateral visual fields; this manipulation alters the emotional significance of a facial expression for the observer without affecting its physical features. We presented angry/neutral expressions looking toward/away from the subject and measured brain activity using fMRI. After the image acquisitions, the subjects' experience of negative emotion when perceiving each stimulus was also assessed. The left amygdala showed an interaction between emotional expression and face direction, with higher activity for angry expressions looking toward the subjects than for angry expressions looking away from them. Experienced emotion showed the corresponding interaction. Regression analysis showed a positive relation between left amygdala activity and experienced emotion. These results suggest that the amygdala is involved in emotional rather than visuoperceptual processing of emotional facial expressions, specifically including the decoding of emotional significance and the elicitation of one's own emotions corresponding to that significance.
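The critical statistic here is the expression × direction interaction. A minimal sketch of that contrast on per-condition mean responses (the numbers are synthetic illustrations, not the study's data):

```python
import numpy as np

# Mean amygdala responses per condition (synthetic illustration):
# rows = expression (angry, neutral), cols = gaze direction (toward, away)
means = np.array([[0.80, 0.35],
                  [0.30, 0.28]])

# Interaction contrast:
# (angry_toward - angry_away) - (neutral_toward - neutral_away)
interaction = (means[0, 0] - means[0, 1]) - (means[1, 0] - means[1, 1])
print(f"interaction = {interaction:.2f}")  # > 0: direction matters for anger only
```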
The Journal of Neuroscience | 2006
Ryo Kitada; Tomonori Kito; Daisuke N. Saito; Takanori Kochiyama; Michikazu Matsumura; Norihiro Sadato; Susan J. Lederman
Humans can judge grating orientation by touch. Previous studies indicate that the extrastriate cortex is involved in tactile orientation judgments, suggesting that this area is related to visual imagery. However, it has been unclear which neural mechanisms are crucial for the tactile processing of orientation, because visual imagery is not always required for tactile spatial tasks. We expect that such neural mechanisms involve multisensory areas, because our perception of space is highly integrated across modalities. The current study uses functional magnetic resonance imaging during the classification of grating orientations to evaluate the neural substrates responsible for the multisensory spatial processing of orientation. We hypothesized that a region within the intraparietal sulcus (IPS) would be engaged in orientation processing, regardless of the sensory modality. Sixteen human subjects classified the orientations of passively touched gratings and performed two control tasks with both the right and left hands. Tactile orientation classification activated regions around the right postcentral sulcus and IPS, regardless of the hand used, when contrasted with roughness classification of the same stimuli. Right-lateralized activation was confirmed in these regions by evaluating the hemispheric effects of tactile spatial processing with both hands. In contrast, visual orientation classification activated the left middle occipital gyrus when contrasted with color classification of the same stimuli. Furthermore, visual orientation classification activated a part of the right IPS that was also activated by the tactile orientation task. Thus, we suggest that a part of the right IPS is engaged in the multisensory spatial processing of grating orientation.
NeuroImage | 2009
Wataru Sato; Takanori Kochiyama; Shota Uono; Sakiko Yoshikawa
Eye gaze, hand-pointing gestures, and arrows automatically trigger attentional shifts. Although it has been suggested that common neural mechanisms underlie these three types of attentional shifts, the issue remains unsettled. To investigate it, we measured brain activity using fMRI while participants observed directional and non-directional stimuli, including eyes, hands, and arrows. Conjunction analyses revealed that the posterior superior temporal sulcus (STS), the inferior parietal lobule, the inferior frontal gyrus, and the occipital cortices in the right hemisphere were commonly more active in response to directional than to non-directional stimuli. These results suggest commonalities in the neurocognitive mechanisms underlying the automatic attentional shifts triggered by gaze, gestures, and symbols.
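A common way to implement a conjunction analysis is the minimum-statistic approach: a voxel counts as commonly active only if it exceeds the threshold in every contrast. A minimal sketch on synthetic t-maps (grid size and threshold are illustrative, and this is not necessarily the exact test used in the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
shape = (4, 4, 4)  # toy voxel grid
# t-maps for the directional-vs-non-directional contrast of each stimulus type
t_eyes, t_hands, t_arrows = (rng.standard_normal(shape) + 1.0 for _ in range(3))

t_thresh = 3.1  # illustrative voxel-wise threshold
# Minimum-statistic conjunction: a voxel survives only if it exceeds the
# threshold in all three contrasts simultaneously.
conjunction = np.minimum.reduce([t_eyes, t_hands, t_arrows]) > t_thresh
print(f"{conjunction.sum()} voxels active in common")
```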
Journal of Cognitive Neuroscience | 2009
Ryo Kitada; Ingrid S. Johnsrude; Takanori Kochiyama; Susan J. Lederman
Humans can recognize common objects by touch extremely well whenever vision is unavailable. Despite its importance to a thorough understanding of human object recognition, the neuroscientific study of this topic has been relatively neglected. To date, the few published studies have addressed the haptic recognition of nonbiological objects. We now focus on haptic recognition of the human body, a particularly salient object category for touch. Neuroimaging studies demonstrate that regions of the occipito-temporal cortex are specialized for visual perception of faces (fusiform face area, FFA) and other body parts (extrastriate body area, EBA). Are the same category-sensitive regions activated when these components of the body are recognized haptically? Here, we use fMRI to compare brain organization for haptic and visual recognition of human body parts. Sixteen subjects identified exemplars of faces, hands, feet, and nonbiological control objects using vision and haptics separately. We identified two discrete regions within the fusiform gyrus (FFA and the haptic face region) that were each sensitive to both haptically and visually presented faces; however, these two regions differed significantly in their response patterns. Similarly, two regions within the lateral occipito-temporal area (EBA and the haptic body region) were each sensitive to body parts in both modalities, although the response patterns differed. Thus, although the fusiform gyrus and the lateral occipito-temporal cortex appear to exhibit modality-independent, category-sensitive activity, our results also indicate a degree of functional specialization related to sensory modality within these structures.
NeuroImage | 2005
Ryo Kitada; Toshihiro Hashimoto; Takanori Kochiyama; Tomonori Kito; Tomohisa Okada; Michikazu Matsumura; Susan J. Lederman; Norihiro Sadato
Human subjects can tactually estimate the magnitude of surface roughness. Although many psychophysical and neurophysiological experiments have elucidated the peripheral neural mechanisms underlying tactile roughness estimation, the associated cortical mechanisms are not well understood. To identify the brain regions responsible for the tactile estimation of surface roughness, we used functional magnetic resonance imaging (fMRI). We utilized a combination of categorical (subtraction) and parametric factorial approaches, wherein roughness was varied during both the task and its control. Fourteen human subjects performed a tactile roughness-estimation task and received identical tactile stimulation without estimation (no-estimation task). The bilateral parietal operculum (PO), insula, and right lateral prefrontal cortex showed roughness-related activation. The bilateral PO and insula were activated during the no-estimation task, and hence might support the sensory-based processing underlying roughness estimation. By contrast, the right prefrontal cortex appears more related to cognitive processing: it was activated during the estimation task relative to the no-estimation task, but showed little activation during the no-estimation task relative to rest. The lateral prefrontal area might thus play an important cognitive role in the tactile estimation of surface roughness, whereas the PO and insula might be involved in the sensory processing that is important for estimating surface roughness.
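Combining a categorical task regressor with a parametric roughness modulator is a standard GLM construction. A minimal sketch of such a design matrix, with the modulator mean-centered so it is orthogonal to the task regressor (block timings and roughness levels are invented for illustration, not taken from the study):

```python
import numpy as np

n_scans = 120
task = np.zeros(n_scans)          # 1 during estimation blocks, 0 otherwise
task[10:30] = task[60:80] = 1.0

roughness = np.zeros(n_scans)     # per-block roughness level (illustrative)
roughness[10:30], roughness[60:80] = 2.0, 4.0

# Parametric modulator: mean-center roughness within task periods so the
# modulator is orthogonal to the categorical task regressor.
mod = np.where(task > 0, roughness - roughness[task > 0].mean(), 0.0)

X = np.column_stack([task, mod, np.ones(n_scans)])  # design matrix
y = np.random.default_rng(2).standard_normal(n_scans)  # stand-in voxel signal
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta)  # beta[1]: roughness-related activation over and above task-on
```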
NeuroImage | 2005
Takanori Kochiyama; Tomoyo Morita; Tomohisa Okada; Yoshiharu Yonekura; Michikazu Matsumura; Norihiro Sadato
Task-related motion is a major source of noise in functional magnetic resonance imaging (fMRI) time series, and its effect usually persists even after perfect spatial realignment. Here, we propose a new method to remove a certain type of task-related motion effect that persists after realignment. The procedure consists of three steps: decomposition of the realigned time-series data into spatially independent components using independent component analysis (ICA); automatic classification and rejection of the ICs reflecting task-related residual motion effects; and reconstruction of the data without them. To classify the ICs, we utilized the associated task-related changes in signal intensity and variance. The effectiveness of the method was verified using an fMRI experiment that explicitly included head motion as a main effect. The results indicate that our ICA-based method removed the task-related motion effects more effectively than the conventional voxel-wise regression-based method.
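The three-step procedure can be sketched as follows. This is a simplified illustration on synthetic data: spatial ICA via scikit-learn's FastICA, with a crude task-correlation criterion standing in for the authors' task-related intensity and variance measures.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(3)
n_scans, n_voxels = 100, 2000
task = np.tile([0.0] * 10 + [1.0] * 10, 5)           # boxcar task regressor
data = rng.standard_normal((n_scans, n_voxels))
# Inject a task-locked "residual motion" spatial pattern into the data.
data += np.outer(task, rng.standard_normal(n_voxels)) * 2.0

# 1) Decompose into spatially independent components: voxels are treated as
#    samples, so each IC has a spatial map plus a time course.
ica = FastICA(n_components=20, random_state=0)
maps = ica.fit_transform(data.T)                     # (n_voxels, n_comps)
tcs = ica.mixing_                                    # (n_scans, n_comps)

# 2) Flag ICs whose time courses track the task regressor (a crude stand-in
#    for the authors' task-related intensity/variance classification).
r = np.array([abs(np.corrcoef(tcs[:, k], task)[0, 1])
              for k in range(tcs.shape[1])])
bad = r > 0.5

# 3) Reconstruct the time series without the rejected components.
clean = (maps[:, ~bad] @ tcs[:, ~bad].T + ica.mean_).T  # (n_scans, n_voxels)
print(f"rejected {bad.sum()} of {bad.size} ICs; cleaned shape {clean.shape}")
```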
Neuropsychologia | 2011
Wataru Sato; Takanori Kochiyama; Shota Uono; Kazumi Matsuda; Keiko Usui; Yushi Inoue; Motomi Toichi
Neuroimaging studies have reported greater activation of the human amygdala in response to emotional facial expressions, especially fearful ones. However, little is known about how fast this activation occurs. We investigated this issue by recording intracranial field potentials from the amygdala in subjects undergoing pre-neurosurgical assessment (n = 6). The subjects observed fearful, happy, and neutral facial expressions. Time-frequency statistical parametric mapping analyses revealed that the amygdala showed greater gamma-band activity in response to fearful than to neutral facial expressions at 50–150 ms, with a peak at 135 ms. These results indicate that the human amygdala is able to rapidly process fearful facial expressions.
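Gamma-band activity in field potentials is typically quantified with a time-frequency decomposition. A minimal sketch using complex Morlet wavelet convolution on a synthetic signal (band limits and wavelet parameters are illustrative, not the study's exact time-frequency SPM pipeline):

```python
import numpy as np

fs = 1000                                 # sampling rate (Hz)
t = np.arange(-0.2, 0.5, 1 / fs)          # peristimulus time (s)
rng = np.random.default_rng(4)
# Synthetic field potential: noise plus a 60 Hz gamma burst around 100 ms.
lfp = rng.standard_normal(t.size)
lfp += np.exp(-((t - 0.1) ** 2) / (2 * 0.02 ** 2)) * np.sin(2 * np.pi * 60 * t)

def morlet_power(signal, freq, fs, n_cycles=7):
    """Power time course at one frequency via complex Morlet convolution."""
    sd = n_cycles / (2 * np.pi * freq)    # Gaussian SD in seconds
    wt = np.arange(-4 * sd, 4 * sd, 1 / fs)
    wavelet = np.exp(2j * np.pi * freq * wt) * np.exp(-wt**2 / (2 * sd**2))
    wavelet /= np.abs(wavelet).sum()
    return np.abs(np.convolve(signal, wavelet, mode="same")) ** 2

# Average power across gamma frequencies (30-80 Hz, illustrative band).
gamma = np.mean([morlet_power(lfp, f, fs) for f in range(30, 81, 5)], axis=0)
print(f"gamma power peaks at {t[np.argmax(gamma)] * 1000:.0f} ms")
```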