Goh Matsuda
University of Tokyo
Publications
Featured research published by Goh Matsuda.
PLOS ONE | 2008
Ari Ueno; Satoshi Hirata; Kohki Fuwa; Keiko Sugama; Kiyo Kusunoki; Goh Matsuda; Hirokata Fukushima; Kazuo Hiraki; Masaki Tomonaga; Toshikazu Hasegawa
Background: For decades, the chimpanzee, phylogenetically closest to humans, has been analyzed intensively in comparative cognitive studies. Other than the accumulation of behavioral data, the neural basis for cognitive processing in the chimpanzee remains to be clarified. To increase our knowledge on the evolutionary and neural basis of human cognition, comparative neurophysiological studies exploring endogenous neural activities in the awake state are needed. However, to date, such studies have rarely been reported in non-human hominid species, due to the practical difficulties in conducting non-invasive measurements on awake individuals. Methodology/Principal Findings: We measured auditory event-related potentials (ERPs) of a fully awake chimpanzee, with reference to a well-documented component of human studies, namely mismatch negativity (MMN). In response to infrequent, deviant tones that were delivered in a uniform sound stream, a comparable ERP component could be detected as negative deflections in early latencies. Conclusions/Significance: The present study reports the MMN-like component in a chimpanzee for the first time. In human studies, various ERP components, including MMN, are well-documented indicators of cognitive and neural processing. The results of the present study validate the use of non-invasive ERP measurements for studies on cognitive and neural processing in chimpanzees, and open the way for future studies comparing endogenous neural activities between humans and chimpanzees. This signifies an essential step in hominid cognitive neurosciences.
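The oddball paradigm described above is conventionally analyzed by epoching the EEG around each tone, baseline-correcting, averaging within condition, and reading the MMN off the deviant-minus-standard difference wave. The paper does not publish its analysis code; the following is a minimal sketch with synthetic data, where the sampling rate, amplitudes, and the helper names `make_epochs` and `erp` are illustrative assumptions, not the authors' pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 250  # Hz, assumed sampling rate
t = np.arange(-0.1, 0.5, 1 / fs)  # epoch: -100 ms to +500 ms

# Synthetic single-trial epochs (trials x samples); deviants carry an
# extra negativity around 200 ms to mimic an MMN-like deflection.
def make_epochs(n_trials, mmn_amp=0.0):
    noise = rng.normal(0.0, 5.0, (n_trials, t.size))
    component = mmn_amp * np.exp(-((t - 0.2) ** 2) / (2 * 0.04 ** 2))
    return noise + component

standards = make_epochs(400)
deviants = make_epochs(80, mmn_amp=-3.0)

# Baseline-correct each trial against the pre-stimulus interval,
# then average within condition to obtain the ERP.
def erp(epochs):
    baseline = epochs[:, t < 0].mean(axis=1, keepdims=True)
    return (epochs - baseline).mean(axis=0)

# MMN is read off the deviant-minus-standard difference wave,
# typically peaking in the 100-300 ms window.
diff = erp(deviants) - erp(standards)
window = (t >= 0.1) & (t <= 0.3)
print(f"peak difference in 100-300 ms window: {diff[window].min():.2f}")
```

Averaging across many trials is what lets the small, time-locked deflection emerge from ongoing EEG noise, which is why the deviant condition needs enough repetitions despite being infrequent.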
Biology Letters | 2010
Ari Ueno; Satoshi Hirata; Kohki Fuwa; Keiko Sugama; Kiyo Kusunoki; Goh Matsuda; Hirokata Fukushima; Kazuo Hiraki; Masaki Tomonaga; Toshikazu Hasegawa
The brain activity of a fully awake chimpanzee being presented with her name was investigated. Event-related potentials (ERPs) were measured for each of the following auditory stimuli: the vocal sound of the subject's own name (SON), the vocal sound of a familiar name of another group member, the vocal sound of an unfamiliar name and a non-vocal sound. Some differences in ERP waveforms were detected among the stimulus types at latencies at which P3 and Nc components are typically observed in humans. Following stimulus onset, an Nc-like negative shift at approximately 500 ms latency was observed, particularly in response to SON. Such specific ERP patterns suggest that the chimpanzee processes her name differently from other sounds.
Frontiers in Human Neuroscience | 2014
Sachiyo Ozawa; Goh Matsuda; Kazuo Hiraki
This study investigated the neural processing underlying the cognitive control of emotions induced by the presentation of task-irrelevant emotional pictures before a working memory task. Previous studies have suggested that the cognitive control of emotion involves the prefrontal regions. Therefore, we measured the hemodynamic responses that occurred in the prefrontal region with a 16-channel near-infrared spectroscopy (NIRS) system. In our experiment, participants observed two negative or two neutral pictures in succession immediately before a 1-back or 3-back task. Pictures were selected from the International Affective Picture System (IAPS). We measured the changes in the concentration of oxygenated hemoglobin (oxyHb) during picture presentation and during the n-back task. The emotional valence of the picture affected the oxyHb changes in anterior parts of the medial prefrontal cortex (MPFC) (located in the left and right superior frontal gyrus) and left inferior frontal gyrus during the n-back task; the oxyHb changes during the task were significantly greater following negative rather than neutral stimulation. Consistent with a number of previous studies, and with the time courses of the oxyHb changes in our study, activation in these locations is likely driven by the cognitive control of emotion, although we cannot rule out that it simply reflects emotional responses. There were no effects of emotion on oxyHb changes during picture presentation or on n-back task performance. Although further studies are necessary to confirm this interpretation, our findings suggest that NIRS can be used to investigate neural processing during emotional control.
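The oxyHb concentration changes that NIRS systems report are standardly derived from raw optical-density changes at two wavelengths via the modified Beer-Lambert law. As a sketch of that conversion only (the extinction coefficients, pathlength, and the helper name `hb_changes` below are illustrative placeholders, not values from this study or its NIRS device):

```python
import numpy as np

# Modified Beer-Lambert law:
#   dOD(wavelength) = (e_HbO2 * dHbO2 + e_HbR * dHbR) * L * DPF
# Measuring dOD at two wavelengths gives a 2x2 linear system whose
# solution is the pair of concentration changes.
# Coefficients below are illustrative placeholders, not calibrated values;
# HbR absorbs more strongly at ~760 nm, HbO2 at ~850 nm.
E = np.array([
    [0.69, 1.67],   # 760 nm: [e_HbO2, e_HbR]
    [1.36, 0.80],   # 850 nm: [e_HbO2, e_HbR]
])
L = 3.0    # source-detector separation in cm (assumed)
DPF = 6.0  # differential pathlength factor (assumed)

def hb_changes(d_od_760, d_od_850):
    """Return (delta_oxyHb, delta_deoxyHb) from optical-density changes."""
    d_od = np.array([d_od_760, d_od_850])
    return np.linalg.solve(E * L * DPF, d_od)

d_oxy, d_deoxy = hb_changes(0.012, 0.018)
print(d_oxy, d_deoxy)
```

In practice the device vendor applies this conversion internally per channel; the point is simply that "oxyHb change" is a linear reconstruction from multi-wavelength light attenuation, not a direct measurement.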
Archive | 2005
Goh Matsuda; Kazuo Hiraki
Wide deactivation in the DPFC during video game play was observed in experiment 1. In addition, the results of experiment 2 indicated that similar regions were also deactivated during video game observation. Because simple finger action activated these same regions, we conclude that DPFC deactivation during video games is mainly due to the visual aspects of video games and, further, that it depends more on a player's positive attention to the game than on its visual attributes.
PLOS ONE | 2010
Hirokata Fukushima; Satoshi Hirata; Ari Ueno; Goh Matsuda; Kohki Fuwa; Keiko Sugama; Kiyo Kusunoki; Masahiro Hirai; Kazuo Hiraki; Masaki Tomonaga; Toshikazu Hasegawa
Background: The neural system of our closest living relative, the chimpanzee, is a topic of increasing research interest. However, electrophysiological examinations of neural activity during visual processing in awake chimpanzees are currently lacking. Methodology/Principal Findings: In the present report, skin-surface event-related brain potentials (ERPs) were measured while a fully awake chimpanzee observed photographs of faces and objects in two experiments. In Experiment 1, human faces and stimuli composed of scrambled face images were displayed. In Experiment 2, three types of pictures (faces, flowers, and cars) were presented. The waveforms evoked by face stimuli were distinguished from other stimulus types, as reflected by an enhanced early positivity appearing before 200 ms post-stimulus, and an enhanced late negativity after 200 ms, around posterior and occipito-temporal sites. Face-sensitive activity was clearly observed in both experiments. However, in contrast to the robustly observed face-evoked N170 component in humans, we found that faces did not elicit a peak in the latency range of 150–200 ms in either experiment. Conclusions/Significance: Although this pilot study examined a single subject and requires further examination, the observed scalp voltage patterns suggest that selective processing of faces in the chimpanzee brain can be detected by recording surface ERPs. In addition, this non-invasive method for examining an awake chimpanzee can be used to extend our knowledge of the characteristics of visual cognition in other primate species.
Scientific Reports | 2013
Satoshi Hirata; Goh Matsuda; Ari Ueno; Hirokata Fukushima; Koki Fuwa; Keiko Sugama; Kiyo Kusunoki; Masaki Tomonaga; Kazuo Hiraki; Toshikazu Hasegawa
Advancement of non-invasive brain imaging techniques has allowed us to examine details of neural activities involved in affective processing in humans; however, no comparative data are available for chimpanzees, the closest living relatives of humans. In the present study, we measured event-related brain potentials in a fully awake adult chimpanzee as she looked at affective and neutral pictures. The results revealed a differential brain potential appearing 210 ms after presentation of an affective picture, a pattern similar to that in humans. This suggests that at least a part of the affective process is similar between humans and chimpanzees. The results have implications for the evolutionary foundations of emotional phenomena, such as emotional contagion and empathy.
Communicative & Integrative Biology | 2011
Satoshi Hirata; Goh Matsuda; Ari Ueno; Koki Fuwa; Keiko Sugama; Kiyo Kusunoki; Hirokata Fukushima; Kazuo Hiraki; Masaki Tomonaga; Toshikazu Hasegawa
The sound of one’s own name is one of the most salient auditory environmental stimuli. Several studies of human brain potentials have revealed some characteristic waveforms when we hear our own names. In a recent work, we investigated event-related potentials (ERPs) in a female chimpanzee and demonstrated that the ERP pattern generated when she heard her own name differed from that generated when she heard other sounds. However, her ERPs did not exhibit a prominent positive shift around 300 ms (P3) in response to her own name, as has been repeatedly shown in studies of human ERPs. The present study collected comparative data for adult humans using basically the same procedure as that used in our previous study of the chimpanzee. These results also revealed no prominent P3 to the human subjects’ own names. The lack of increased P3 is therefore likely due to our experimental protocol, in which we presented the subject’s own name relatively frequently. In contrast, our results revealed prominent negativity to the subject’s own name at around 500 ms in the chimpanzee and around 200 ms in human subjects. This may indicate that initial orientation to the sound of one’s own name is delayed in the chimpanzee.
Nature Human Behaviour | 2017
Yasuhiro Kanakogi; Yasuyuki Inoue; Goh Matsuda; David Butler; Kazuo Hiraki; Masako Myowa-Yamakoshi
Protective interventions by a third party on behalf of others are generally admired, and as such are associated with our notions of morality, justice and heroism [1–4]. Indeed, stories involving such third-party interventions have pervaded popular culture throughout recorded human history, in myths, books and movies. The current developmental picture is that we begin to engage in this type of intervention by preschool age. For instance, 3-year-old children intervene in harmful interactions to protect victims from bullies [5], and furthermore, not only punish wrongdoers but also give priority to helping the victim [6]. It remains unknown, however, when we begin to affirm such interventions performed by others. Here we reveal these developmental origins in 6- and 10-month-old infants (N = 132). After watching aggressive interactions involving a third-party agent who either interfered or did not, 6-month-old infants preferred the former. Subsequent experiments confirmed the psychological processes underlying such choices: 6-month-olds regarded the interfering agent as protecting the victim from the aggressor, but only older infants affirmed such an intervention after considering the intentions of the interfering agent. These findings shed light upon the developmental trajectory of perceiving, understanding and performing protective third-party interventions, suggesting that our admiration for and emphasis upon such acts, so prevalent in thousands of stories across human cultures, is rooted within the preverbal infant's mind.
PeerJ | 2013
Hirokata Fukushima; Satoshi Hirata; Goh Matsuda; Ari Ueno; Kohki Fuwa; Keiko Sugama; Kiyo Kusunoki; Kazuo Hiraki; Masaki Tomonaga; Toshikazu Hasegawa
Evaluating the familiarity of faces is critical for social animals as it is the basis of individual recognition. In the present study, we examined how face familiarity is reflected in neural activities in our closest living relative, the chimpanzee. Skin-surface event-related brain potentials (ERPs) were measured while a fully awake chimpanzee observed photographs of familiar and unfamiliar chimpanzee faces (Experiment 1) and human faces (Experiment 2). The ERPs evoked by chimpanzee faces differentiated unfamiliar individuals from familiar ones around midline areas centered on vertex sites at approximately 200 ms after the stimulus onset. In addition, the ERP response to the image of the subject’s own face did not significantly diverge from those evoked by familiar chimpanzees, suggesting that the subject’s brain at a minimum remembered the image of her own face. The ERPs evoked by human faces were not influenced by the familiarity of target individuals. These results indicate that chimpanzee neural representations are more sensitive to the familiarity of conspecific than allospecific faces.
Archive | 2018
Goh Matsuda; Kazuo Hiraki; Hiroshi Ishiguro
We evaluate the human-likeness of humanoid robots using electroencephalography (EEG). Because activity of the human mirror-neuron system (MNS) is believed to reflect the human-likeness of observed agents, we compare the MNS activity of 17 participants while they observe certain actions performed by a human, an extremely humanlike android, and a machine-like humanoid. We find the MNS to be significantly activated only when the participants observe actions performed by the human; however, direct comparisons of MNS activity across the three agents reveal no significant differences, even though the participants rate the android's appearance as more humanlike than the robot's. These findings suggest that appearance alone does not crucially affect MNS activity, and that other factors, such as motion, should be targeted to improve the human-likeness of humanoid robots.
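EEG studies of MNS engagement during action observation commonly quantify it as suppression of the mu rhythm (roughly 8-13 Hz over sensorimotor sites) relative to a baseline period. The chapter does not publish its analysis; the following is a minimal sketch of that standard index on synthetic signals, where the sampling rate, amplitudes, and the helper names `mu_power` and `mu_suppression` are illustrative assumptions.

```python
import numpy as np

fs = 256  # Hz, assumed sampling rate

def mu_power(signal, fs, band=(8.0, 13.0)):
    """Band power in the mu range via a simple FFT periodogram."""
    freqs = np.fft.rfftfreq(signal.size, 1 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / signal.size
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].sum()

def mu_suppression(observe, baseline, fs):
    """Log power ratio; negative values indicate mu suppression."""
    return np.log(mu_power(observe, fs) / mu_power(baseline, fs))

# Synthetic demo: baseline carries a strong 10 Hz mu rhythm that is
# attenuated during action observation, as expected for MNS engagement.
rng = np.random.default_rng(1)
t = np.arange(0.0, 2.0, 1 / fs)
baseline = np.sin(2 * np.pi * 10 * t) + 0.3 * rng.normal(size=t.size)
observe = 0.4 * np.sin(2 * np.pi * 10 * t) + 0.3 * rng.normal(size=t.size)
print(mu_suppression(observe, baseline, fs))  # negative value indicates suppression
```

The log-ratio form makes conditions comparable across participants with different absolute EEG power, which is why it is preferred over raw power differences.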