Network


Latest external collaborations at the country level. Dive into details by clicking on the dots.

Hotspot


Dive into the research topics where Junji Onishi is active.

Publication


Featured research published by Junji Onishi.


Systems, Man and Cybernetics | 2011

Contour pattern recognition through auditory labels of Freeman chain codes for people with visual impairments

Junji Onishi; Tsukasa Ono

Tactile devices are commonly used for pattern recognition learning by people with visual impairments. However, it is often difficult for such persons to recognize the detailed and mathematical shape of objects when using these devices. The primary difficulty is gaining a sufficiently accurate understanding of the shape of an object to permit its correct reconstruction based on tactile device feedback. Furthermore, people with visual impairments have less opportunity to discern visual images and must therefore build their own understanding of an object's shape from tactile experience; thus, they are often unsure whether that understanding reflects the actual shape. To solve this problem, we propose a pattern recognition technique utilizing auditory labels based on a contour chain code. Using a Freeman chain code, an object's shape is represented as a sequence of steps, each in one of eight directions, with each step designated by an integer from 0 to 7. Our proposed method takes advantage of Freeman chain codes to present the detailed shape of objects. This report describes our proposed method and presents experimental results that verify its effectiveness in support of mathematical education for visually impaired persons.
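As a rough illustration of the encoding described in this abstract, the Python sketch below converts an ordered, 8-connected contour into a Freeman chain code and maps each code to a spoken direction word. This is a minimal sketch, not the authors' implementation, and the direction words used as "auditory labels" are illustrative assumptions.

```python
# Minimal sketch of Freeman chain coding with spoken direction labels.
# Not the paper's implementation; the label words are illustrative only.

# Freeman's 8 directions: step (dx, dy) -> code, with y increasing downwards.
DIRECTIONS = {
    (1, 0): 0,    # east
    (1, -1): 1,   # north-east
    (0, -1): 2,   # north
    (-1, -1): 3,  # north-west
    (-1, 0): 4,   # west
    (-1, 1): 5,   # south-west
    (0, 1): 6,    # south
    (1, 1): 7,    # south-east
}

LABELS = ["east", "north-east", "north", "north-west",
          "west", "south-west", "south", "south-east"]

def freeman_chain_code(contour):
    """Encode an ordered, 8-connected contour (list of (x, y) points)
    as a list of Freeman chain codes (integers 0-7)."""
    codes = []
    for (x0, y0), (x1, y1) in zip(contour, contour[1:]):
        codes.append(DIRECTIONS[(x1 - x0, y1 - y0)])
    return codes

def auditory_labels(codes):
    """Map chain codes to words that a screen reader could speak."""
    return [LABELS[c] for c in codes]

# Example: a unit square traversed clockwise (y grows downward).
square = [(0, 0), (1, 0), (1, 1), (0, 1), (0, 0)]
codes = freeman_chain_code(square)
print(codes)                   # [0, 6, 4, 2]
print(auditory_labels(codes))  # ['east', 'south', 'west', 'north']
```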


International Conference on Computers Helping People with Special Needs | 2016

Audible Mapper & ShadowRine: Development of Map Editor Using only Sound in Accessible Game for Blind Users, and Accessible Action RPG for Visually Impaired Gamers

Masaki Matsuo; Takahiro Miura; Masatsugu Sakajiri; Junji Onishi; Tsukasa Ono

Although computer games have recently become highly diversified, considerable effort and ingenuity are needed to produce games that persons with total visual impairment can enjoy. Some games for visually impaired persons have been developed, but games that use only auditory information are challenging for sighted persons. Moreover, it is still difficult for visually impaired persons to play the same games as sighted persons, and for sighted and visually impaired persons to share a common topic of conversation. To solve this problem, we developed a barrier-free game that both sighted and visually impaired persons can play using their dominant senses, whether visual, auditory, or tactile. We also developed a map editor for blind game developers, providing them with an integrated game development environment. In this paper, we describe the development of the barrier-free game and the map editor and reflect on the process.


International Conference on Computers for Handicapped Persons | 2014

Usage Situation Changes of Touchscreen Computers in Japanese Visually Impaired People: Questionnaire Surveys in 2011-2013

Takahiro Miura; Masatsugu Sakajiri; Haruo Matsuzaka; Murtada Eljailani; Kazuki Kudo; Naoya Kitamura; Junji Onishi; Tsukasa Ono

This paper reports on the use of touchscreen interfaces among visually impaired people in Japan, based on questionnaire surveys conducted in 2011, 2012, and 2013. In 2011 and 2013, we asked how touchscreens were being used and why some respondents did not use them. The 2012 and 2013 surveys included items on specific touchscreen manipulation situations. The results indicate that an increasing number of visually impaired people used, or needed to use, touchscreen computers; that some did not want to use them because they were satisfied with conventional cell phones or were waiting for devices offering tactile feedback; that users with total or partial visual impairments mainly selected buttons and objects by double-tapping after tracing; and that the choice between smartphones and tablet computers depended mainly on application usability and screen size, respectively.


2014 IEEE Symposium on Computational Intelligence in Robotic Rehabilitation and Assistive Technologies (CIR2AT) | 2014

Tactile pitch feedback system for deafblind or hearing-impaired persons: singing accuracy of hearing persons under conditions of added noise

Masatsugu Sakajiri; Shigeki Miyoshi; Junji Onishi; Tsukasa Ono; Tohru Ifukube

Deafblind and hearing-impaired persons cannot perceive their own voice pitch and thus have difficulty controlling it. While singing, the voice pitch must be controlled to maintain a stable tone. To address this problem, a tactile voice pitch control system was developed to assist such people in singing. In a previous study, two deafblind subjects used the proposed system to control their voice pitch with accuracy comparable to that of hearing children. In the present study, we investigate proprioceptive pitch control and the effect of the proposed voice pitch control system on normal-hearing people under conditions of added noise. The results show that the total average mean deviation without tactile feedback is 405.6 cents (SD: 42.4), whereas with tactile feedback it is 57.5 cents (SD: 12.2).
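For context on the deviations reported above, pitch error in cents is conventionally computed as 1200 * log2(f_sung / f_target), where 100 cents equals one equal-tempered semitone. The snippet below is a minimal sketch of that calculation; the sample frequencies and statistics are made up for illustration and are not the study's data or measurement pipeline.

```python
# Minimal sketch of pitch deviation in cents; illustrative data only.
import math
from statistics import mean, stdev

def cents_deviation(f_sung_hz, f_target_hz):
    """Signed deviation of a sung pitch from a target pitch, in cents
    (100 cents = one equal-tempered semitone)."""
    return 1200.0 * math.log2(f_sung_hz / f_target_hz)

# Example: target A4 = 440 Hz, a few hypothetical sung frames.
target = 440.0
sung = [438.0, 452.0, 445.0, 430.0]
deviations = [abs(cents_deviation(f, target)) for f in sung]
print(f"mean |deviation| = {mean(deviations):.1f} cents, SD = {stdev(deviations):.1f}")
```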


Augmented Human International Conference | 2018

GoalBaural: A Training Application for Goalball-related Aural Sense

Takahiro Miura; Shimpei Soga; Masaki Matsuo; Masatsugu Sakajiri; Junji Onishi; Tsukasa Ono

Goalball, one of the official Paralympic events, is popular with visually impaired people all over the world. The object of goalball is to throw a specialized ball containing bells across the opponents' goal line as many times as possible, while defenders try to block the thrown ball with their bodies. Since goalball players cannot rely on visual information, they need to grasp the game situation using their auditory sense. However, it is hard, especially for beginners, to perceive the direction and distance of the thrown ball. In addition, they tend to be afraid of the approaching ball because, without visual information, they could be hit by a high-speed ball. In this paper, our goal is to develop an application called GoalBaural (Goalball + aural) that enables goalball players to improve their recognition of the direction and distance of a thrown ball without going onto the court and playing goalball. The evaluation results indicate that our application is effective in improving the speed and accuracy of locating thrown balls.
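The abstract does not detail GoalBaural's audio rendering. As a hedged sketch of one common way direction and distance cues can be generated for this kind of training application, the Python below combines constant-power stereo panning with inverse-distance attenuation; all parameter choices (pan law, reference distance) are assumptions for illustration, not the app's actual implementation.

```python
# Minimal sketch: stereo gains cueing a ball's direction and distance.
# Not GoalBaural's audio engine; parameters are illustrative assumptions.
import math

def stereo_gains(azimuth_deg, distance_m, ref_distance_m=1.0):
    """Return (left_gain, right_gain) for a sound source.

    azimuth_deg: -90 (far left) .. 0 (front) .. +90 (far right)
    distance_m:  distance to the listener; gain falls off as ref/distance.
    """
    # Constant-power pan: map azimuth to an angle in [0, pi/2].
    pan = (azimuth_deg + 90.0) / 180.0          # 0.0 = left, 1.0 = right
    theta = pan * math.pi / 2.0
    left, right = math.cos(theta), math.sin(theta)

    # Simple inverse-distance attenuation, clamped so gain never exceeds 1.
    attenuation = min(1.0, ref_distance_m / max(distance_m, ref_distance_m))
    return left * attenuation, right * attenuation

# Example: a ball approaching from the right, first at 6 m, then at 2 m.
print(stereo_gains(azimuth_deg=45.0, distance_m=6.0))
print(stereo_gains(azimuth_deg=45.0, distance_m=2.0))
```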


International Conference on Computers for Handicapped Persons | 2014

Accessible Single Button Characteristics of Touchscreen Interfaces under Screen Readers in People with Visual Impairments

Takahiro Miura; Masatsugu Sakajiri; Murtada Eljailani; Haruo Matsuzaka; Junji Onishi; Tsukasa Ono

Despite improvements in accessibility functions, people with visual impairments still have problems using touchscreen computers. Although the appropriate size of accessible objects may differ for visually impaired users, because manipulation under a screen reader differs from manipulation without one, the characteristics of suitable objects and useful gestures on touchscreen computers for visually impaired users remain unclear. In this paper, our objective is to clarify the accessible single-button characteristics and preferable gestures for visually impaired users of touchscreen computers. We studied these characteristics by evaluating single-button interaction on touchscreen interfaces for visually impaired people under a screen reader. As a result, the task completion time for selecting a single button decreased as the button size increased; the gestures, in descending order of performance, were double-tapping after flicking, double-tapping after tracing, and split-tapping after tracing.


Systems, Man and Cybernetics | 2013

Fundamental Study on Tactile Cognition through Haptic Feedback Touchscreen

Junji Onishi; Masatsugu Sakajiri; Takahiro Miura; Tsukasa Ono

Touch screens are now used for a wide variety of purposes, since they can implement software-defined switches easily and cheaply while presenting visual images at the same time. A traditional switch has a physical shape and tactile feel, which makes it possible to find and operate it by touch; a touch-screen switch, however, has a uniform texture, like the surface of glass, so one has to look at the image to operate it. This presents a huge obstacle to visually impaired or elderly persons who cannot easily access visual information. On the other hand, technology for creating virtual reality through haptic feedback is attracting attention, and touch screens with haptic feedback functions are gradually becoming viable. In this paper, on the assumption that accessibility will increasingly be secured through haptic perception, we examine how accurately humans can differentiate and remember virtual haptic information using touch screens with a haptic feedback function. The main focus is on the results of a fundamental study carried out with visually impaired persons.


International Conference on Computers Helping People with Special Needs | 2018

AcouSTTic: A Training Application of Acoustic Sense on Sound Table Tennis (STT)

Takahiro Miura; Masaya Fujito; Masaki Matsuo; Masatsugu Sakajiri; Junji Onishi; Tsukasa Ono

Sound table tennis (also known as vision-impaired table tennis, and abbreviated as STT) is one of the most internationally popular sports for visually impaired people. Since STT players cannot rely on visual information, they must grasp the game situation using their auditory sense. However, it is difficult, especially for STT beginners, to perceive the state of the game, including the positions of the opponents and the direction and distance of the ball. Therefore, in this paper, we aim to develop an application that enables STT players, especially beginners, to train their acoustic sense so that they can instantaneously recognize the direction and distance of a ball served by an opponent, without going to the gym. We implemented the application, named AcouSTTic (Acoustic + STT), and evaluated its training effectiveness.


International Conference on Computers Helping People with Special Needs | 2018

Virtual Museum for People with Low Vision: Comparison of the Experience on Flat and Head-Mounted Displays

Takahiro Miura; Gentaro Ando; Junji Onishi; Masaki Matsuo; Masatsugu Sakajiri; Tsukasa Ono

An increasing number of virtual museums (VMs) are used as educational materials because a VM lets users experience and learn from virtual hands-on exhibitions without being limited to a particular place or time. However, most VMs are not fully accessible to people with low vision because of limited functions, including hard-to-find annotations, restrictive zooming controls, and incompatibility with various accessibility functions. In this paper, our objective is to identify the issues VMs pose for these users and possible solutions. We first developed a prototype VM for people with low vision and then experimentally evaluated it to identify the requirements that allow such users to control a VM easily and learn from the experience. The results show that users with a strong immersive tendency prefer the VM on a head-mounted display (HMD).


International Conference on Human-Computer Interaction | 2017

Inclusive Side-Scrolling Action Game Securing Accessibility for Visually Impaired People

Masaki Matsuo; Takahiro Miura; Masatsugu Sakajiri; Junji Onishi; Tsukasa Ono

Though many computer games have recently become accessible to gamers with visual impairments, these players still face difficulty in manipulating game characters and acquiring visual information. Although an increasing number of games for visually impaired people, called audio games, are being developed, many of them cannot satisfy players' basic needs because of a shortage of content, and they are difficult for sighted people to play because they provide no visual information. Given this situation, we have been developing accessible games for visually impaired people that feature enriched materials and multimodal information presentation. However, the need for real-time action in accessible games remains unaddressed. In this article, our objective is to develop an inclusive side-scrolling game with high real-time performance and accessibility functions for visually impaired people that can also be played by more than one person, including sighted players.

Collaboration


Dive into Junji Onishi's collaborations.
