Takahiko Koike
National Institute of Information and Communications Technology
Publication
Featured research published by Takahiko Koike.
Experimental Brain Research | 2009
Satoru Miyauchi; Masaya Misaki; Shigeyuki Kan; Takahide Fukunaga; Takahiko Koike
To identify the neural substrate of rapid eye movements (REMs) during REM sleep in humans, we conducted simultaneous functional magnetic resonance imaging (fMRI) and polysomnographic recording during REM sleep. Event-related fMRI analysis time-locked to the occurrence of REMs revealed that the pontine tegmentum, ventroposterior thalamus, primary visual cortex, putamen and limbic areas (the anterior cingulate, parahippocampal gyrus and amygdala) were activated in association with REMs. A control experiment during which subjects made self-paced saccades in total darkness showed no activation in the visual cortex. The REM-related activation of the primary visual cortex without visual input from the retina provides neural evidence for the existence of human ponto-geniculo-occipital waves (PGO waves) and a link between REMs and dreaming. Furthermore, the time-course analysis of blood oxygenation level-dependent responses indicated that the activation of the pontine tegmentum, ventroposterior thalamus and primary visual cortex started before the occurrence of REMs. On the other hand, the activation of the putamen and limbic areas accompanied REMs. The activation of the parahippocampal gyrus and amygdala simultaneously with REMs suggests that REMs and/or their generating mechanism are not merely an epiphenomenon of PGO waves, but may be linked to the triggering activation of these areas.
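As an illustration of the event-related analysis described above, the following sketch shows the core computation under simplifying assumptions (a canonical double-gamma HRF, hypothetical REM onset times, and placeholder BOLD data); it is not the authors' pipeline.

```python
# Minimal sketch: event-related GLM time-locked to hypothetical REM onsets.
import numpy as np
from scipy.stats import gamma

TR = 2.0                                              # assumed repetition time (s)
n_scans = 300
rem_onsets = np.array([30.0, 95.0, 150.0, 210.0, 260.0])   # hypothetical REM times (s)

# Canonical double-gamma HRF sampled at the TR.
t = np.arange(0, 32, TR)
hrf = gamma.pdf(t, 6) - (1.0 / 6.0) * gamma.pdf(t, 16)
hrf /= hrf.sum()

# Stick function of REM events convolved with the HRF -> REM regressor.
events = np.zeros(n_scans)
events[(rem_onsets / TR).astype(int)] = 1.0
regressor = np.convolve(events, hrf)[:n_scans]

# Design matrix: REM regressor plus an intercept; fit by ordinary least squares.
X = np.column_stack([regressor, np.ones(n_scans)])
rng = np.random.default_rng(0)
Y = rng.standard_normal((n_scans, 1000))              # placeholder BOLD data (scans x voxels)
betas, *_ = np.linalg.lstsq(X, Y, rcond=None)
rem_effect = betas[0]                                 # REM-locked effect size per voxel
```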
Neuroscience Research | 2011
Takahiko Koike; Shigeyuki Kan; Masaya Misaki; Satoru Miyauchi
Recent studies have compared default-mode network (DMN) connectivity across different arousal levels to investigate the relationship between consciousness and the DMN. Comparing the DMN in rapid eye movement (REM) sleep with that in non-REM (NREM) sleep is useful for revealing the relationship between arousal level and the DMN, because arousal is at its lowest during deep NREM sleep, whereas during REM sleep it is as high as during wakefulness. Functional magnetic resonance imaging (fMRI) and polysomnogram data were acquired from participants in REM, deep NREM, and light NREM sleep, and the DMN was compared using functional connectivity analysis. Our analysis revealed that functional connectivity among the DMN core regions - the posterior cingulate cortex, rostral anterior cingulate cortex, and inferior parietal lobule - remained consistent across sleep states. In contrast, connectivity involving the DMN subsystems differed between REM and NREM sleep, and this change accounts well for the characteristics of REM sleep. Our results suggest that neither the DMN core regions nor the subsystems relate to the maintenance of arousal. The DMN core network and subsystems may serve, respectively, to integrate brain regions and to perform functions specific to each level of arousal.
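A minimal sketch of the kind of functional connectivity comparison described here, assuming hypothetical ROI time courses for each sleep stage (the stage labels, ROI indices, and data are placeholders, not taken from the study):

```python
# Minimal sketch: ROI-ROI functional connectivity compared across sleep stages.
import numpy as np

def connectivity_matrix(roi_ts: np.ndarray) -> np.ndarray:
    """roi_ts: time x ROI array of BOLD time courses for one sleep stage."""
    r = np.corrcoef(roi_ts, rowvar=False)             # ROI x ROI correlation matrix
    np.fill_diagonal(r, 0.0)
    return np.arctanh(r)                              # Fisher z-transform

rng = np.random.default_rng(1)
stages = {s: rng.standard_normal((200, 4)) for s in ("REM", "light_NREM", "deep_NREM")}
z = {s: connectivity_matrix(ts) for s, ts in stages.items()}

# Example contrast: coupling between two core DMN ROIs (indices 0 and 3 here)
# in REM versus deep NREM sleep.
print(z["REM"][0, 3] - z["deep_NREM"][0, 3])
```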
Frontiers in Human Neuroscience | 2012
Hiroki C. Tanabe; Hirotaka Kosaka; Daisuke N. Saito; Takahiko Koike; Masamichi J. Hayashi; Keise Izuma; Hidetsugu Komeda; Makoto Ishitobi; Masao Omori; Toshio Munesue; Hidehiko Okazawa; Yuji Wada; Norihiro Sadato
Persons with autism spectrum disorder (ASD) are known to have difficulty with eye contact (EC), which can also make face-to-face communication difficult for their partners. To elucidate the neural substrates of live inter-subject interaction between ASD patients and normal subjects, we conducted hyperscanning functional MRI with 21 subjects with ASD paired with typically developed (normal) subjects, and with 19 pairs of normal subjects as controls. Baseline EC was maintained while subjects performed a real-time joint-attention task. The task-related effects were modeled out, and inter-individual correlation analysis was performed on the residual time-course data. ASD–Normal pairs were less accurate at detecting gaze direction than Normal–Normal pairs, and performance was impaired both in ASD subjects and in their normal partners. Activation of the left occipital pole (OP) by gaze processing was reduced in ASD subjects, suggesting that the deterioration of eye-cue detection in ASD is related to impaired early visual processing of gaze. Their normal partners, on the other hand, showed greater activity in the bilateral occipital cortex and the right prefrontal area, indicating a compensatory workload. The inter-brain coherence in the right inferior frontal gyrus (IFG) observed in Normal–Normal pairs during EC (Saito et al., 2010) was diminished in ASD–Normal pairs. Intra-brain functional connectivity between the right IFG and the right superior temporal sulcus (STS) in normal subjects paired with ASD subjects was reduced compared with that in Normal–Normal pairs, and this functional connectivity was positively correlated with the normal partners' performance on eye-cue detection. Considering the integrative role of the right STS in gaze processing, inter-subject synchronization during EC may be a prerequisite for eye-cue detection by the normal partner.
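The inter-individual correlation analysis mentioned above can be illustrated with the following sketch, which regresses a task model out of each partner's time course and then correlates the residuals; the data, the task regressor, and the region label are all hypothetical.

```python
# Minimal sketch: inter-brain correlation of residual time courses after
# modeling out task-related effects.
import numpy as np

def residualize(y: np.ndarray, X: np.ndarray) -> np.ndarray:
    """Remove the task model X (scans x regressors) from time course y."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ beta

rng = np.random.default_rng(2)
n = 240
task = np.column_stack([rng.standard_normal(n), np.ones(n)])  # task regressor + intercept
subj_a = residualize(rng.standard_normal(n), task)            # e.g., right IFG, subject A
subj_b = residualize(rng.standard_normal(n), task)            # right IFG, paired subject B

inter_brain_r = np.corrcoef(subj_a, subj_b)[0, 1]
print(f"inter-brain residual correlation: {inter_brain_r:.3f}")
```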
Neuroscience Research | 2015
Takahiko Koike; Hiroki C. Tanabe; Norihiro Sadato
Using a technique for measuring brain activity simultaneously from two people, known as hyperscanning, we can calculate inter-brain neural effects that appear only in interactions between individuals. Hyperscanning studies using fMRI are advantageous in that they can precisely determine the region(s) involved in inter-brain effects; however, it is almost impossible to record inter-brain effects in daily life with fMRI. By contrast, hyperscanning EEG studies have high temporal resolution and can capture moment-to-moment interactions. In addition, EEG instrumentation is portable and easy to wear, offering the opportunity to record inter-brain effects during daily-life interactions; the disadvantage of this approach is that it is difficult to localize the source of the inter-brain effect. Functional near-infrared spectroscopy (fNIRS) has better temporal resolution and portability than fMRI, but limited spatial resolution and a limited ability to record from deep brain structures. Future studies should employ hyperscanning EEG-fMRI, because this approach combines the high temporal resolution of EEG with the high spatial resolution of fMRI. Hyperscanning EEG-fMRI would allow us to use inter-brain effects as neuromarkers of the properties of social interactions in daily life. We also wish to emphasize the need to develop a mathematical model explaining how two brains can exhibit synchronized activity.
NeuroImage | 2016
Takahiko Koike; Hiroki C. Tanabe; Shuntaro Okazaki; Eri Nakagawa; Akihiro T. Sasaki; Koji Shimada; Sho K. Sugawara; Haruka K. Takahashi; Kazufumi Yoshihara; Jorge Bosch-Bayard; Norihiro Sadato
During a dyadic social interaction, two individuals can share visual attention through gaze, directed to each other (mutual gaze) or to a third person or an object (joint attention). Shared attention is fundamental to dyadic face-to-face interaction, but how attention is shared, retained, and neurally represented in a pair-specific manner has not been well studied. Here, we conducted a two-day hyperscanning functional magnetic resonance imaging study in which pairs of participants performed a real-time mutual gaze task followed by a joint attention task on the first day, and mutual gaze tasks several days later. The joint attention task enhanced eye-blink synchronization, which is believed to be a behavioral index of shared attention. When the same participant pairs underwent mutual gaze without joint attention on the second day, enhanced eye-blink synchronization persisted, and this was positively correlated with inter-individual neural synchronization within the right inferior frontal gyrus. Neural synchronization was also positively correlated with enhanced eye-blink synchronization during the previous joint attention task session. Consistent with the Hebbian association hypothesis, the right inferior frontal gyrus had been activated by both initiating and responding to joint attention. These results indicate that shared attention is represented and retained by pair-specific neural synchronization that cannot be reduced to the individual level.
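As a rough illustration of an eye-blink synchronization index (the exact metric and windowing used in the study may differ), the sketch below computes the peak cross-correlation of two binary blink trains and compares it against a shuffled surrogate; the blink data are simulated.

```python
# Minimal sketch: eye-blink synchronization as peak cross-correlation of
# binary blink trains, compared against a shuffled surrogate.
import numpy as np

def blink_sync(blinks_a: np.ndarray, blinks_b: np.ndarray, max_lag: int = 10) -> float:
    """Peak normalized (circular) cross-correlation within +/- max_lag samples."""
    a = blinks_a - blinks_a.mean()
    b = blinks_b - blinks_b.mean()
    denom = np.sqrt((a @ a) * (b @ b))
    xcorr = [np.dot(a, np.roll(b, lag)) / denom for lag in range(-max_lag, max_lag + 1)]
    return max(xcorr)

rng = np.random.default_rng(4)
a = (rng.random(600) < 0.05).astype(float)            # simulated blink train, partner A
b = (rng.random(600) < 0.05).astype(float)            # simulated blink train, partner B
print(blink_sync(a, b), blink_sync(a, rng.permutation(b)))  # observed vs. surrogate
```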
NeuroImage | 2014
Takamitsu Watanabe; Shigeyuki Kan; Takahiko Koike; Masaya Misaki; Seiki Konishi; Satoru Miyauchi; Yasushi Miyashita; Naoki Masuda
Brain activity changes dynamically even during sleep. A line of neuroimaging studies has reported changes in functional connectivity and regional activity across different sleep stages such as slow-wave sleep (SWS) and rapid-eye-movement (REM) sleep. However, it remains unclear whether and how the large-scale network activity of the human brain changes within a given sleep stage. Here, we investigated the modulation of network activity within sleep stages by applying the pairwise maximum entropy model to brain activity obtained by functional magnetic resonance imaging from sleeping healthy subjects. We found that the activity of individual brain regions and the functional interactions between pairs of regions in the default-mode network significantly increased during SWS and decreased during REM sleep. In contrast, the network activity of the fronto-parietal and sensory-motor networks showed the opposite pattern. Furthermore, in the three networks, the amount of activity change throughout REM sleep was negatively correlated with that throughout SWS. The present findings suggest that brain activity is dynamically modulated even within a sleep stage and that the pattern of modulation depends on the type of large-scale brain network.
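A toy sketch of fitting a pairwise maximum entropy (Ising) model to binarized activity follows, with exact enumeration over all states (feasible only for a handful of ROIs); the data are random placeholders and the optimizer is deliberately simple, so this illustrates the model class rather than the paper's fitting procedure.

```python
# Minimal sketch: pairwise maximum entropy (Ising) model fit by moment matching.
import numpy as np
from itertools import product

def fit_pairwise_mem(binary_ts: np.ndarray, n_iter: int = 2000, lr: float = 0.1):
    """binary_ts: time x ROI array of 0/1 activity. Returns local fields h and couplings J."""
    s = 2 * binary_ts - 1                              # map 0/1 to -1/+1 spins
    T, N = s.shape
    emp_m = s.mean(0)                                  # empirical first moments
    emp_c = (s.T @ s) / T                              # empirical second moments
    states = np.array(list(product([-1, 1], repeat=N)))
    h, J = np.zeros(N), np.zeros((N, N))
    for _ in range(n_iter):
        energy = states @ h + 0.5 * np.einsum('ki,ij,kj->k', states, J, states)
        p = np.exp(energy)
        p /= p.sum()                                   # model distribution over all 2^N states
        model_m = p @ states
        model_c = (states * p[:, None]).T @ states
        h += lr * (emp_m - model_m)                    # gradient ascent on the log-likelihood
        J += lr * (emp_c - model_c)
        np.fill_diagonal(J, 0.0)
    return h, J

rng = np.random.default_rng(5)
h, J = fit_pairwise_mem((rng.random((500, 5)) > 0.5).astype(int))  # 5 placeholder ROIs
```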
NeuroImage | 2016
Masahiro Matsunaga; Hiroaki Kawamichi; Takahiko Koike; Kazufumi Yoshihara; Yumiko Yoshida; Haruka K. Takahashi; Eri Nakagawa; Norihiro Sadato
Happiness is one of the most fundamental human goals, which has led researchers to examine the source of individual happiness. Happiness is usually discussed in terms of two aspects (a temporary positive emotion and a trait-like long-term sense of being happy) that are interrelated; for example, individuals with a high level of trait-like subjective happiness tend to rate events as more pleasant. In this study, we hypothesized that the interaction between the two aspects of happiness could be explained by the interaction between structure and function in certain brain regions. Thus, we first assessed the association between the gray matter density (GMD) of healthy participants and trait-like subjective happiness using voxel-based morphometry (VBM). Further, to assess the association between GMD and brain function, we conducted functional magnetic resonance imaging (fMRI) using a positive emotion induction task (imagining several emotional life events). VBM indicated that subjective happiness was positively correlated with the GMD of the rostral anterior cingulate cortex (rACC). fMRI demonstrated that experimentally induced temporary happy feelings were positively correlated with subjective happiness level and with rACC activity. The rACC response to positive events was also positively correlated with its GMD. These results provide convergent structural and functional evidence that the rACC is related to happiness and suggest that the interaction between structure and function in the rACC may explain the trait-state interaction in happiness.
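The core VBM computation, a voxel-wise correlation between gray matter density and the trait happiness score, can be sketched as follows; the arrays, sample size, and voxel count are placeholders, and no spatial preprocessing or multiple-comparison correction is shown.

```python
# Minimal sketch: voxel-wise correlation between gray matter density maps and
# a trait subjective-happiness score.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
n_subjects, n_voxels = 30, 5000
gmd = rng.standard_normal((n_subjects, n_voxels))      # placeholder GM density maps
happiness = rng.standard_normal(n_subjects)            # placeholder happiness scores

r = np.array([stats.pearsonr(gmd[:, v], happiness)[0] for v in range(n_voxels)])
peak_voxel = int(np.argmax(np.abs(r)))                 # voxel with the strongest association
print(peak_voxel, r[peak_voxel])
```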
Neuropsychologia | 2016
Tomoko Matsui; Tagiru Nakamura; Akira Utsumi; Akihiro T. Sasaki; Takahiko Koike; Yumiko Yoshida; Tokiko Harada; Hiroki C. Tanabe; Norihiro Sadato
A hearer's perception of an utterance as sarcastic depends on integration of the heard statement, the discourse context, and the prosody of the utterance, as well as evaluation of the incongruity among these aspects. The effect of prosody on sarcasm comprehension is evident in everyday conversation, but little is known about its underlying mechanism or neural substrates. To elucidate the neural underpinnings of sarcasm comprehension in the auditory modality, we conducted a functional MRI experiment with 21 adult participants. The participants were provided with a short vignette in which a child had done either a good or a bad deed, about which a parent made a positive comment. The participants were required to judge the degree of sarcasm in the parent's positive comment (praise), which was accompanied by either positive or negative affective prosody. The behavioral data revealed that an incongruent combination of utterance and context (i.e., the parent's positive comment on a bad deed by the child) induced the perception of sarcasm. There was a significant interaction between context and prosody: sarcasm perception was enhanced when positive prosody was used in the context of a bad deed or, vice versa, when negative prosody was used in the context of a good deed. The corresponding interaction effect was observed in the rostro-ventral portion of the left inferior frontal gyrus, corresponding to Brodmann's Area (BA) 47. Negative prosody incongruent with a positive utterance (praise) activated the bilateral insula extending to the right inferior frontal gyrus, the anterior cingulate cortex, and the brainstem. Our findings provide evidence that the left inferior frontal gyrus, particularly BA 47, is involved in integrating discourse context and utterance with affective prosody in the comprehension of sarcasm.
PLOS ONE | 2015
Shuntaro Okazaki; Masako Hirotani; Takahiko Koike; Jorge Bosch-Bayard; Haruka K. Takahashi; Maho Hashiguchi; Norihiro Sadato
People's behaviors synchronize. It is difficult, however, to determine whether synchronized behaviors occur in a mutual direction (two individuals influencing one another) or in one direction (one individual leading the other), and what the underlying mechanism of synchronization is. To answer these questions, we hypothesized a non-leader-follower postural sway synchronization, caused by a reciprocal visuo-postural feedback system operating on pairs of individuals, and tested that hypothesis both experimentally and via simulation. In the behavioral experiment, 22 participant pairs stood face to face either 20 or 70 cm away from each other, wearing glasses with or without vision-blocking lenses. The existence and direction of visual information exchanged between pairs of participants were thus systematically manipulated. The time-series data for the postural sway of these pairs were recorded and analyzed with cross-correlation and causality analyses. The cross-correlation results showed that the postural sway of paired participants was synchronized, with a shorter time lag when participant pairs could see one another's head motion than when one of the participants was blindfolded. In addition, the time lag of the observed synchronization was smaller when the distance between participant pairs was smaller. For the causality analysis, the noise contribution ratio (NCR), a measure of influence based on a multivariate autoregressive model, was computed to identify the degree to which one participant's postural sway is explained by the other's and how visual information (sighted vs. blindfolded) interacts with paired participants' postural sway. We found that for synchronization to take place, it is crucial that paired participants be sighted and exert equal influence on one another by simultaneously exchanging visual information. Furthermore, a simulation of the proposed system with a wider range of visual input showed a pattern of results similar to the behavioral results.
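A minimal sketch of the first analysis step, estimating the synchronization time lag as the lag of the peak cross-correlation between two sway traces (the MVAR/NCR step is omitted); the simulated signals, sampling, and lag are hypothetical.

```python
# Minimal sketch: time-lag estimation from the peak cross-correlation of
# two postural sway time series.
import numpy as np

def peak_lag(x: np.ndarray, y: np.ndarray, max_lag: int = 200) -> int:
    """Lag (in samples) at which the cross-correlation of x and y peaks; positive means y lags x."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    lags = np.arange(-max_lag, max_lag + 1)
    xcorr = [np.mean(x[max(0, -l):len(x) - max(0, l)] * y[max(0, l):len(y) - max(0, -l)])
             for l in lags]
    return int(lags[int(np.argmax(xcorr))])

rng = np.random.default_rng(7)
sway_a = np.zeros(2000)
noise = rng.standard_normal(2000)
for t in range(1, 2000):
    sway_a[t] = 0.95 * sway_a[t - 1] + noise[t]        # AR(1): smooth, sway-like signal (person A)
sway_b = np.roll(sway_a, 15) + 0.1 * rng.standard_normal(2000)  # person B follows A by 15 samples
print(peak_lag(sway_a, sway_b))                        # expected near +15
```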
Neuroreport | 2008
Shigeyuki Kan; Masaya Misaki; Takahiko Koike; Satoru Miyauchi
Studies of saccadic eye movements in humans and animals have reported decreased cortical activation accompanying saccades in the visual motion-sensitive area MT+/V5, implying that this region is the neural basis of saccadic suppression. This, however, conflicts with findings that MT+/V5 is activated by saccades. Because MT+/V5 can be subdivided into the middle temporal (MT) and medial superior temporal (MST) areas, these regions may have distinct functional roles that cause the discrepancy. To test this hypothesis, we compared the activation of MT with that of MST during exploratory saccades and visually guided saccades. MST was activated only during visually guided saccades, whereas MT was not activated by either type of saccade. These findings support our hypothesis and suggest that the activity of these regions is differentially modulated depending on extraretinal information.