Hideaki Touyama
Toyama Prefectural University
Publication
Featured research published by Hideaki Touyama.
Advances in Computer Entertainment Technology | 2009
Fabien Lotte; Junya Fujisawa; Hideaki Touyama; Rika Ito; Michitaka Hirose; Anatole Lécuyer
Brain-Computer Interfaces (BCI) are communication systems that enable users to interact with computers using only brain activity. This activity is generally measured by ElectroEncephaloGraphy (EEG). A major limitation of BCI is the electrical sensitivity of EEG, which causes severe deterioration of the signals when the user is moving. This constrains current EEG-based BCI to seated, motionless subjects, limiting the use of BCI for applications such as video games. In this paper, we present a feasibility study to determine whether a BCI system, here based on the P300 brain signal, could be used with a moving subject. We recorded EEG signals from 5 users in 3 conditions: sitting, standing, and walking. Analysis of the recorded signals suggested that, despite the noise generated by the users' motion, it was still possible to detect the P300 in each of the three conditions. This opens new perspectives for applications using a wearable P300-based BCI as an input device, e.g., for entertainment and video games.
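The P300 detection described above rests on averaging stimulus-locked epochs so that the evoked response stands out from motion noise. The following is a minimal Python sketch of that idea; the sampling rate, the 300–500 ms window, and the synthetic epochs are illustrative assumptions, not parameters taken from the paper.

```python
# Minimal sketch of P300 detection by epoch averaging.
# Shapes, sampling rate and the 300-500 ms window are assumptions for illustration.
import numpy as np

FS = 256  # assumed sampling rate (Hz)

def average_erp(epochs):
    """Average epochs (n_trials x n_samples) so the P300 rises above the noise."""
    return epochs.mean(axis=0)

def p300_amplitude(erp, t_min=0.3, t_max=0.5):
    """Mean amplitude in an assumed 300-500 ms post-stimulus window."""
    return erp[int(t_min * FS):int(t_max * FS)].mean()

# Illustrative comparison across the three conditions (sitting, standing, walking).
rng = np.random.default_rng(0)
for condition in ("sitting", "standing", "walking"):
    target = rng.normal(0, 5, size=(30, FS))      # placeholder target epochs
    nontarget = rng.normal(0, 5, size=(150, FS))  # placeholder non-target epochs
    diff = p300_amplitude(average_erp(target)) - p300_amplitude(average_erp(nontarget))
    print(f"{condition}: target minus non-target window amplitude = {diff:.2f} uV")
```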
International Conference of the IEEE Engineering in Medicine and Biology Society | 2008
Hideaki Touyama; Michitaka Hirose
Research on biometry based on human brain activity has recently been emerging. In this study, we investigate the feasibility of personal identification using a one-channel electroencephalogram recorded during photo retrieval in an oddball paradigm. The use of non-target photo images was examined to improve identification performance. Nine photo images were randomly presented one after another to five subjects. Principal Component Analysis and Linear Discriminant Analysis were applied for signal processing. Using EEG activity during both target and non-target photo retrieval, the algorithm successfully improved the identification rates. The rates were 87.2, 95.0, and 97.6% with 5-, 10-, and 20-time averaging, respectively. The performance with EEG only during target photo retrieval was lower by 5–13%. This study reveals the future possibility of photo retrieval tasks for personal identification using human brain activity, which will yield rich machine controls for users of brain-computer interfaces.
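The pipeline outlined above, N-time averaging of one-channel epochs followed by Principal Component Analysis and Linear Discriminant Analysis, can be sketched as below. All data, feature dimensions, and component counts are assumptions for illustration; only the overall PCA + LDA structure follows the abstract.

```python
# Hedged sketch of a PCA + LDA identification pipeline on averaged one-channel epochs.
# Synthetic data, epoch length and component counts are assumptions, not paper values.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_subjects, epochs_per_subject, n_samples, n_avg = 5, 40, 200, 5

X, y = [], []
for subject in range(n_subjects):
    template = rng.normal(0, 1, n_samples)           # stand-in for a subject-specific ERP
    for _ in range(epochs_per_subject):
        trials = template + rng.normal(0, 2, (n_avg, n_samples))
        X.append(trials.mean(axis=0))                 # N-time averaging before classification
        y.append(subject)
X, y = np.array(X), np.array(y)

clf = make_pipeline(PCA(n_components=20), LinearDiscriminantAnalysis())
scores = cross_val_score(clf, X, y, cv=5)
print(f"identification accuracy (synthetic data): {scores.mean():.3f}")
```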
International Conference on Universal Access in Human-Computer Interaction | 2007
Hideaki Touyama; Michitaka Hirose
The human brain activity of steady-state visual evoked potentials, induced by a virtual panorama and two objects, was recorded for two subjects in an immersive virtual environment. Linear discriminant analysis applied to 1.0-second single-trial EEG data yielded an average recognition rate of 74.2% in inferring three gaze directions. The possibility of online interaction with 3D images in CAVE will be addressed for walking applications or remote control of a robotic camera.
Conference on Human Interface | 2007
Hideaki Touyama; Michitaka Hirose
Electroencephalogram signals of steady-state visual evoked potentials were recorded for three subjects in an immersive virtual environment. A support vector machine applied to 1.0-second single-trial EEG data yielded an average recognition rate of 92.1% in selecting one of two virtual buttons. Online demonstrations in CAVE showed good performance in position control of a simple 3D object.
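A hedged sketch of the classification step described above: a linear support vector machine applied to features from 1.0-second single-trial SSVEP data for a two-button selection. The flicker frequencies and the band-power features are assumptions; the abstract itself only specifies the SVM on single-trial EEG.

```python
# Illustrative sketch: linear SVM on single-trial (1.0 s) SSVEP band-power features
# for a two-button selection. Flicker frequencies and data are assumptions.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

FS, DUR = 256, 1.0
F1, F2 = 6.0, 8.0  # assumed flicker frequencies of the two virtual buttons

def band_power(trial, freq):
    """Power at a target frequency from the FFT of a single 1.0 s trial."""
    spectrum = np.abs(np.fft.rfft(trial)) ** 2
    freqs = np.fft.rfftfreq(trial.size, d=1.0 / FS)
    return spectrum[np.argmin(np.abs(freqs - freq))]

rng = np.random.default_rng(2)
t = np.arange(int(FS * DUR)) / FS
X, y = [], []
for label, f in enumerate((F1, F2)):
    for _ in range(60):
        trial = np.sin(2 * np.pi * f * t) + rng.normal(0, 1.0, t.size)  # synthetic SSVEP
        X.append([band_power(trial, F1), band_power(trial, F2)])
        y.append(label)

scores = cross_val_score(SVC(kernel="linear"), np.array(X), np.array(y), cv=5)
print(f"two-class selection accuracy (synthetic data): {scores.mean():.3f}")
```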
IEEE Virtual Reality Conference | 2008
Junya Fujisawa; Hideaki Touyama; Michitaka Hirose
In this paper, we focus on non-invasive brain-computer interfacing (BCI) in an immersive virtual environment for navigation with no subject training. Common spatial patterns were applied for the first time to an EEG-based BCI system in a CAVE with a five-screen configuration. The interfacing system was based on two brain states of left- and right-hand phantom movements. The results suggest that navigation can be demonstrated with no subject training in an immersive virtual environment.
International Conference of the IEEE Engineering in Medicine and Biology Society | 2008
Junya Fujisawa; Hideaki Touyama; Michitaka Hirose
In this paper, we focus on alpha band modulation during visual spatial attention without visual stimuli. Visual spatial attention is expected to provide a new channel for non-invasive independent brain-computer interfaces (BCI), but little work has been done on this interfacing method. The flickering stimuli used in previous work reduce independence and are difficult to use in practice. We therefore investigated whether visual spatial attention could be detected without such stimuli. Furthermore, common spatial patterns (CSP) were applied for the first time to brain states during visual spatial attention. The performance evaluation was based on three brain states: attention to the left, right, and center directions. Thirty-channel scalp electroencephalographic (EEG) signals over the occipital cortex were recorded from five subjects. Without CSP, the analyses achieved an average classification performance of 66.44% (range 55.42 to 72.27%) in discriminating the left and right attention classes. With CSP, the average classification accuracy was 75.39% (range 63.75 to 86.13%). These results suggest that CSP is useful in the context of visual spatial attention, and that alpha band modulation during visual spatial attention without flickering stimuli could serve as a new channel for independent BCI alongside motor imagery.
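Common spatial patterns, as applied above, learn spatial filters that maximize the variance of one class relative to the other. Below is a minimal sketch of the standard CSP computation on two sets of 30-channel epochs; the synthetic data and the omitted alpha-band filtering are assumptions for illustration.

```python
# Minimal sketch of Common Spatial Patterns (CSP) for two attention classes.
# In practice the EEG would first be band-pass filtered to the alpha band;
# the synthetic data below are assumptions for illustration only.
import numpy as np
from scipy.linalg import eigh

def csp_filters(epochs_a, epochs_b, n_filters=4):
    """epochs_*: (n_trials, n_channels, n_samples). Returns spatial filters."""
    def mean_cov(epochs):
        return np.mean([np.cov(e) for e in epochs], axis=0)
    ca, cb = mean_cov(epochs_a), mean_cov(epochs_b)
    # Generalized eigenvalue problem: filters maximize the variance of one
    # class relative to the sum of both class covariances.
    eigvals, eigvecs = eigh(ca, ca + cb)
    order = np.argsort(eigvals)
    picks = np.r_[order[:n_filters // 2], order[-n_filters // 2:]]
    return eigvecs[:, picks].T

def csp_features(epochs, filters):
    """Log-variance of spatially filtered epochs, the usual CSP feature."""
    projected = np.einsum("fc,ncs->nfs", filters, epochs)
    return np.log(projected.var(axis=2))

# Synthetic 30-channel example, matching the channel count reported above.
rng = np.random.default_rng(3)
left = rng.normal(0, 1, (40, 30, 256))
right = rng.normal(0, 1, (40, 30, 256))
W = csp_filters(left, right)
print(csp_features(left, W).shape)  # (40, 4)
```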
Journal of Advanced Computational Intelligence and Intelligent Informatics | 2016
Junwei Fan; Hideaki Touyama
Brain signals can be applied to human-computer interaction; from them, attention can be detected. Based on the event-related potential P300, a brain-machine interface enables users to select desired letters by means of attention alone. Previous studies have reported the feasibility of using P300 signals from a single subject to realize a novel form of information retrieval. In recent years, collaborative EEG from multiple subjects has been studied, with which the classification performance in detecting attention was remarkably improved. In this paper, we propose emotional face retrieval using the P300 signals of multiple subjects. The number of subjects in the multiple-subjects condition was 12. The F-measure value was 0.618 ± 0.046 (standard deviation) in the single-subject condition and 0.832 in the multiple-subjects condition. In conclusion, the classification performance of emotional face retrieval can be improved with collaborative P300 signals from multiple subjects. This technique might be applied in the future to life logging, computer-supported cooperative work, and neuromarketing.
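The collaborative idea above can be illustrated by combining per-subject classifier scores before thresholding and comparing the resulting F-measure with the single-subject case. The score model below is entirely synthetic; only the 12-subject combination and the F-measure metric follow the abstract.

```python
# Sketch of collaborative P300 scoring: average per-subject classifier scores,
# then threshold and evaluate with the F-measure. Scores are synthetic.
import numpy as np
from sklearn.metrics import f1_score

rng = np.random.default_rng(4)
n_subjects, n_stimuli = 12, 200
labels = rng.integers(0, 2, n_stimuli)  # 1 = attended (target) face, 0 = other

# Assume each subject's classifier outputs a noisy score correlated with the label.
scores = labels + rng.normal(0, 1.5, (n_subjects, n_stimuli))

single = (scores[0] > 0.5).astype(int)                    # one subject alone
collaborative = (scores.mean(axis=0) > 0.5).astype(int)   # 12 subjects combined

print("single subject F-measure:     ", round(f1_score(labels, single), 3))
print("collaborative (12) F-measure: ", round(f1_score(labels, collaborative), 3))
```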
Systems, Man and Cybernetics | 2013
Hideaki Touyama; Kazuya Maeda
Brain-Computer Interfaces (BCI) using the ElectroEncephaloGram (EEG) have been investigated more and more in recent years. Usually, BCIs are applied to subjects in sitting conditions to obtain high-quality EEG. This limitation has prevented users from using such interfacing systems in their ordinary lives. In this paper, toward life log applications, we developed a prototype of a wearable BCI system that can be operated in ambulatory conditions in an outdoor environment. Using auditory stimulation in an oddball paradigm, EEG was recorded in a learning phase, and classification of P300 signals was performed in the subsequent testing phase. The users wore the system and walked outdoors to perform online life log indexing. The results of this preliminary study suggest that the BCI system could successfully record the target scenes in real time from the users' brain activity alone.
International Conference on Human-Computer Interaction | 2011
Hideaki Touyama; Kazuya Maeda
In this paper, we studied the electroencephalogram under ambulatory conditions in an outdoor environment. Five healthy subjects participated in the experiment. The task of the self-paced walking subjects was to count the number of appearances of the target auditory stimulus in an oddball paradigm. We observed P300 evoked potentials in ambulatory conditions in the outdoor environment as well as in sitting conditions in an indoor environment. Our results are encouraging and point toward promising novel applications of ambulatory BCIs.
International Conference of the IEEE Engineering in Medicine and Biology Society | 2010
Hideaki Touyama
In this paper, we investigated the quality of ElectroEncephaloGraphic (EEG) signals during physical movement. Using a portable EEG device, the Steady-State Visual Evoked Potential (SSVEP) was recorded at parietal and occipital locations. The SSVEP induced by flickering stimuli was successfully observed in self-paced mimic walking conditions as well as in sitting conditions. To examine how temporal and spatial filters affect the potential performance of Brain-Computer Interfaces (BCI), we applied Principal Component Analysis and Linear Discriminant Analysis. The pattern recognition performance in inferring the subjects' eye gaze directions from the EEG signals could be perfect even in the self-paced mimic walking conditions. We found that three electrodes at parieto-occipital and occipital locations were essential to achieve perfect performance. From these results, we conclude that applications using SSVEP-based BCI can be realized even in physically moving contexts.
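The channel-subset observation above can be illustrated by classifying SSVEP band-power features with PCA and LDA, using either all channels or only a small parieto-occipital subset. The channel names, flicker frequencies, and synthetic signals below are hypothetical; only the PCA + LDA structure and the three-electrode comparison follow the abstract.

```python
# Hedged sketch: PCA + LDA on SSVEP band-power features, comparing all channels
# against a hypothetical three-electrode parieto-occipital subset. All names,
# frequencies and data are illustrative assumptions.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

FS = 256
FREQS = (6.0, 7.5, 10.0)           # assumed flicker frequencies, one per gaze target
CHANNELS = ["P3", "Pz", "P4", "PO3", "POz", "PO4", "O1", "Oz", "O2"]
SUBSET = ["POz", "O1", "O2"]       # hypothetical "essential" three electrodes

def band_powers(trial):
    """Per-channel power at each flicker frequency (trial: channels x samples)."""
    spectrum = np.abs(np.fft.rfft(trial, axis=1)) ** 2
    freqs = np.fft.rfftfreq(trial.shape[1], d=1.0 / FS)
    idx = [np.argmin(np.abs(freqs - f)) for f in FREQS]
    return spectrum[:, idx].ravel()

rng = np.random.default_rng(5)
t = np.arange(FS) / FS
X, y = [], []
for label, f in enumerate(FREQS):
    for _ in range(40):
        ssvep = np.sin(2 * np.pi * f * t)
        trial = rng.normal(0, 1.0, (len(CHANNELS), FS))
        for ch in SUBSET:                       # pretend only these channels carry the SSVEP
            trial[CHANNELS.index(ch)] += ssvep
        X.append(band_powers(trial))
        y.append(label)
X, y = np.array(X), np.array(y)

clf = make_pipeline(PCA(n_components=5), LinearDiscriminantAnalysis())
subset_cols = [CHANNELS.index(ch) * len(FREQS) + k for ch in SUBSET for k in range(len(FREQS))]
print("accuracy, all channels (synthetic):   ", cross_val_score(clf, X, y, cv=5).mean())
print("accuracy, 3-channel subset (synthetic):", cross_val_score(clf, X[:, subset_cols], y, cv=5).mean())
```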