Masatsugu Sakajiri
University of Tsukuba
Publication
Featured research published by Masatsugu Sakajiri.
Systems, Man and Cybernetics | 2012
Takahiro Miura; Haruo Matsuzaka; Masatsugu Sakajiri; Tsukasa Ono
With the spread of touchscreen computers, the accessibility of touchscreen interfaces must be ensured for visually impaired people. Although some visually impaired people are interested in touchscreen computers, and some own them, their usage conditions and needs have not been investigated. Specifically, the following questions about touchscreen computers remain open: what inconveniences do they encounter in use, what functions do they need, and why do some avoid these devices despite their interest? To propose and popularize user-friendly interfaces for visually impaired people, it is necessary not only to research accessible touchscreen computers but also to investigate the current situation surrounding them. In this paper, we aim to identify the needs and usage patterns of touchscreen interfaces, such as smartphones and tablet computers, among people with visual impairment. These were investigated through a questionnaire administered to Japanese visually impaired people. The results indicated that most respondents would like to use a touchscreen computer, that most owners use one by means of a screen reader regardless of their visual condition, and that dynamic, and even stable, tactile feedback can be effective.
Systems, Man and Cybernetics | 2011
Takahiro Miura; Yuka Ebihara; Masatsugu Sakajiri; Tohru Ifukube
Visually impaired people have difficulty perceiving the size of a space and the objects within it because they cannot rely on visual information. Auditory-trained visually impaired people, however, can recognize 3-D spatial information from environmental sounds. No systematic auditory training method has been established for people with acquired visual impairment, because this ability is usually gained through the individual's own experience in real environments, and such people lack sufficient information about moving through those environments. In this paper, the authors aim to identify the mobility situations and needs of visually impaired people: for example, which acoustic cues can be used in everyday situations, in which environments they find it difficult to perceive silent objects, and what kinds of aid they need. The results indicated the following: totally visually impaired people tend to obtain more spatial information from audition than people who are not totally visually impaired. Regarding available auditory cues, rotating the head to listen carefully to environmental sounds, and striking the floor harder with a white cane or the feet to amplify reflected or reverberant sounds, were frequently selected by the totally visually impaired participants who can perceive obstacles from auditory information. Regarding conventional devices, most totally visually impaired participants selected an electronic sounding device that reports obstacle distance, while most participants who were not totally visually impaired selected a vibrating device that reports the distance of walls or doors and a tactile-stimulation device that reports obstacle distance.
Conference on Computers and Accessibility | 2013
Takahiro Miura; Ken-ichiro Yabu; Masatsugu Sakajiri; Mari Ueda; Junya Suzuki; Atsushi Hiyama; Michitaka Hirose; Tohru Ifukube
Accessibility information can allow disabled people to identify suitable pathways to their destinations, but it is difficult to obtain new accessible-pathway information rapidly because of limited disclosure of local information. Thus, it is necessary to develop a comprehensive system that acquires barrier-free information from various sources and makes it available in an intuitive form. In this study, we aimed to develop a social platform that obtains and presents appropriate information depending on the user's situation, such as the user's disabilities and location, and that shares the barrier-free information provided by other users.
International Conference on Computers Helping People with Special Needs | 2016
Masaki Matsuo; Takahiro Miura; Masatsugu Sakajiri; Junji Onishi; Tsukasa Ono
Although computer games have recently diversified, considerable effort and ingenuity are needed to produce games that persons with total visual impairment can enjoy. Some games for visually impaired persons have been developed, but games that use only auditory information present challenges for sighted persons. Moreover, it is still difficult for visually impaired persons to play the same game as sighted persons, and for sighted and visually impaired persons to share a common topic. To solve this problem, we developed a barrier-free game that both sighted and visually impaired persons can play using their dominant senses, whether visual, auditory, or tactile. We also developed a map editor for blind game developers, providing them with an integrated game development environment. In this paper, we describe the development of the barrier-free game and the map editor and reflect on the process.
International Conference on Computers for Handicapped Persons | 2014
Takahiro Miura; Masatsugu Sakajiri; Haruo Matsuzaka; Murtada Eljailani; Kazuki Kudo; Naoya Kitamura; Junji Onishi; Tsukasa Ono
This paper examines the usage of touchscreen interfaces in the Japanese visually impaired population by means of questionnaire surveys conducted in 2011, 2012, and 2013. The 2011 and 2013 surveys covered usage situations of touchscreens and the reasons why some people did not use them. The 2012 and 2013 surveys included questionnaire items on specific manipulation situations of touchscreens. Some of the results indicate that an increasing number of visually impaired people used, or wanted to use, touchscreen computers; that some did not want to use them because they were satisfied with conventional cell phones or were waiting for a device with tactile feedback; that touchscreen users with both total and partial visual impairment mainly select buttons and objects by double-tapping after tracing; and that the appropriate use and manipulation of smartphones and tablet computers depend mainly on application usability and screen size, respectively.
2014 IEEE Symposium on Computational Intelligence in Robotic Rehabilitation and Assistive Technologies (CIR2AT) | 2014
Masatsugu Sakajiri; Shigeki Miyoshi; Junji Onishi; Tsukasa Ono; Tohru Ifukube
Deafblind and hearing impaired persons cannot perceive their own voice pitch and thus have difficulty controlling it. While singing, voice pitch must be controlled to maintain a stable tone. To address this problem, a tactile voice pitch control system was developed to assist such people in singing. In a previous study, two deafblind subjects used the proposed system to control their voice pitch with accuracy comparable to that of hearing children. In the present study, we investigate proprioceptive pitch control and the effect of the proposed voice pitch control system on normal-hearing people under added noise. The results show that the overall mean pitch deviation without tactile feedback is 405.6 cents (SD: 42.4), whereas with tactile feedback it is 57.5 cents (SD: 12.2).
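The pitch deviations above are reported on the cent scale, where 100 cents correspond to one equal-tempered semitone. As a reference, the following is a minimal sketch of how deviation in cents is computed from a sung fundamental frequency and its target; the formula is standard, but the function name is ours, not from the paper.

```python
import math

def cents_deviation(f_hz: float, target_hz: float) -> float:
    """Deviation of a sung pitch from its target, in cents.

    100 cents = one equal-tempered semitone; 1200 cents = one octave.
    """
    return 1200.0 * math.log2(f_hz / target_hz)

# Example: singing B-flat4 (about 466.16 Hz) against an A4 (440 Hz)
# target is roughly +100 cents, i.e. one semitone sharp.
```

On this scale, the reported 405.6-cent deviation without tactile feedback corresponds to roughly four semitones, while 57.5 cents with feedback is just over half a semitone.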
Systems, Man and Cybernetics | 2012
Masatsugu Sakajiri; Kenryu Nakamura; Satoshi Fukushima; Shigeki Miyoshi; Tohru Ifukube
It is difficult for deafblind or hearing impaired people to control the pitch of their voice because they cannot perceive it. In particular, when singing, it is very difficult for them to control their voice pitch because they need to maintain a stable tone. We have developed a voice pitch control system that assists such people in singing by means of a tactile display. Using this tactile feedback system, we verified in a previous study that two deafblind subjects were able to control the pitch of their voices as accurately as hearing children. With the system, the correspondence between the musical scale and proprioceptive sensation (muscular sensation and so on) of the two subjects was restored to pre-hearing-loss levels: they sang using not only tactile feedback but also proprioceptive feedback. In this paper, we investigate the effect of voice pitch training, without auditory feedback, using our tactile feedback display system. Eight hearing participants were examined under two conditions, "without tactile feedback" and "with tactile feedback", to ascertain their ability to control their pitch while subjected to masking noise. The results indicated that the subjects could sing with more accurate pitch using only proprioceptive feedback after tactile-feedback pitch control training than before it.
Systems, Man and Cybernetics | 2011
Masatsugu Sakajiri; Shigeki Miyoshi; Kenryu Nakamura; Satoshi Fukushima; Tohru Ifukube
It is difficult for deafblind or hearing impaired people to control the pitch of their voice because they cannot perceive it. In particular, when singing, it is very difficult for them to control their voice pitch because they need to maintain a stable tone. We have developed a voice pitch control system that assists their singing by means of a tactile display. Using this tactile feedback system, we verified in a previous study that two deafblind subjects were able to control the pitch of their voices as accurately as hearing children. With the system, the correspondence between the musical scale and proprioceptive sensation (muscular sensation and so on) of the two subjects was restored to pre-hearing-loss levels: they sang using not only tactile feedback but also proprioceptive feedback. In this paper, we investigate the ability of hearing subjects to control the pitch of their voice, without auditory feedback, using our tactile feedback system. Seven hearing participants were examined under two conditions, "without tactile feedback" and "with tactile feedback", to ascertain their ability to control their pitch while subjected to masking noise. The results indicated that hearing subjects could not sing with accurate pitch using only proprioceptive feedback (the "without tactile feedback" condition).
Augmented Human International Conference | 2018
Takahiro Miura; Shimpei Soga; Masaki Matsuo; Masatsugu Sakajiri; Junji Onishi; Tsukasa Ono
Goalball, an official Paralympic event, is popular with visually impaired people all over the world. The objective of goalball is to throw a specialized ball, with bells inside it, across the opponents' goal line as many times as possible, while defenders try to block the thrown ball with their bodies. Since goalball players cannot rely on visual information, they must grasp the game situation with their auditory sense. However, it is hard, especially for beginners, to perceive the direction and distance of the thrown ball. In addition, players generally tend to be afraid of the approaching ball because, without visual information, they can be hit by a high-speed ball. In this paper, our goal is to develop an application called GoalBaural (Goalball + aural) that enables goalball players to improve their recognition of the direction and distance of a thrown ball without going onto the court and playing goalball. The evaluation results indicated that our application can improve the speed and accuracy of locating thrown balls.
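The abstract does not specify how GoalBaural renders the ball sounds, but a common way to convey direction and distance over stereo headphones is constant-power panning combined with inverse-distance attenuation. The sketch below illustrates that general technique only; the function names, the azimuth convention, and the parameters are our assumptions, not details from the application.

```python
import math

def pan_gains(azimuth_deg: float) -> tuple:
    """Constant-power stereo gains (left, right) for a source azimuth.

    Assumed convention: -45 deg = full left, +45 deg = full right.
    gl**2 + gr**2 == 1, so perceived loudness stays roughly constant
    as the virtual ball moves across the court.
    """
    phi = (azimuth_deg + 45.0) / 90.0 * (math.pi / 2.0)
    return math.cos(phi), math.sin(phi)

def distance_gain(distance_m: float, ref_m: float = 1.0) -> float:
    """Inverse-distance amplitude attenuation (free-field assumption)."""
    return ref_m / max(distance_m, ref_m)
```

A ball 10 m away and to the right would then be rendered by scaling the bell sample by `distance_gain(10.0)` and splitting it across channels with `pan_gains(30.0)`; a full binaural renderer would also add interaural time differences and HRTF filtering.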
International Conference on Computers for Handicapped Persons | 2014
Takahiro Miura; Masatsugu Sakajiri; Murtada Eljailani; Haruo Matsuzaka; Junji Onishi; Tsukasa Ono
Despite improvements in accessibility functions, people with visual impairments still have problems using touchscreen computers. Although the accessible object size may differ for visually impaired users, because manipulation under a screen reader differs from manipulation without one, the characteristics of suitable objects and useful gestures on touchscreen computers for the visually impaired remain unclear. In this paper, our objective is to clarify accessible single-button characteristics and preferable gestures for visually impaired users of touchscreen computers. We studied these characteristics by evaluating single-button interaction on touchscreen interfaces for visually impaired people under a screen reader condition. As a result, the task completion time for selecting a single button decreased as the button size became larger; the gestures were ranked, in descending order, as double-tapping after flicking, double-tapping after tracing, and split-tapping after tracing.