
Publication


Featured research published by Kaoruko Ohtani.


2012 IEEE International Conference on Emerging Signal Processing Applications | 2012

Augmentative and alternative communication with digital assistant for autistic children

Ippei Torii; Kaoruko Ohtani; Takahito Niwa; Ayuki Yamamoto; Naohiro Ishii

“Let’s Talk!” is a new AAC (Augmentative and Alternative Communication) application on a personal digital assistant (PDA) for autistic children. This application has many advantages compared to existing AAC tools; we focused especially on easy and simple manipulation. By tapping a symbol on the PDA screen, a user can easily show his/her thoughts to others with pictures and sounds. The application includes 120 symbols based on daily life, and a user can also create an original page with new icons made from pictures or sounds. Operation is simple: when a user chooses one of 12 categories on the top page, more specific symbols appear, and a voice plays when a symbol is touched. For example, if a user chooses the “Eat” category and the “bread” symbol, the voice says “I want to eat bread.” Two modes can be switched depending on the user’s situation: in “Supportive Mode,” a supporter shows the application to a user; in “Self-use Mode,” a user tells what he/she wants directly with categories and symbols. Original icons can be made with the PDA’s camera or voice recorder, and a user can customize an original page by arranging self-made icons or existing symbols.


Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing | 2013

Development and Study of Support Applications for Autistic Children

Ippei Torii; Kaoruko Ohtani; Takahito Niwa; Naohiro Ishii

We developed a new VOCA (Voice Output Communication Aid) application for personal digital assistants (PDAs), “Let’s Talk!”, for autistic children. The application has many advantages compared to existing VOCA tools; we focused especially on easy and simple manipulation. By tapping a symbol on the PDA screen, a user can easily show his/her thoughts to others with pictures and sounds. Two modes can be switched depending on the user’s situation. The application includes 120 symbols based on daily life, and a user can also create an original page with new icons made from pictures or sounds, customizing the page by arranging self-made icons or existing symbols. In the newest version, we added a Task Schedule System to motivate children to do things by themselves. In the last part of this study, we present case studies: we introduced the application to students in a school for handicapped children and collected data.


International Conference on Tools with Artificial Intelligence | 2013

Study and Development of Support Tool with Blinks for Physically Handicapped Children

Ippei Torii; Kaoruko Ohtani; Takahito Niwa; Naohiro Ishii

In this study, we develop a new application that lets physically handicapped children communicate with others by blinking. Because of limited body movements and mental disorders, many of them cannot communicate with their families or caregivers. If they can operate a smartphone application by blinking, it will be a big help for telling caregivers what they really need or want to say. First, we detect the eye area using OpenCV. Then we develop a way to detect the opening and closing of the eyes, combining a saturation-based method with an image-complexity method to detect blinks more accurately. Because the level of disability varies widely among children, we aim to make the application customizable to each user's situation. We will also work to reduce blink-detection errors and pursue high precision in the eye-tracking program.
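The abstract does not detail the saturation and image-complexity measures it combines. As a rough illustration of the general idea only, a blink candidate can be flagged whenever consecutive eye-region frames change sharply. A minimal sketch under that assumption, with hypothetical `eye_frames` input (already-cropped grayscale eye regions) and a hypothetical `threshold`, not the authors' actual algorithm:

```python
import numpy as np

def blink_events(eye_frames, threshold=20.0):
    """Flag frame indices where the mean absolute inter-frame
    difference spikes, a crude stand-in for a blink detector that
    combines intensity and image-complexity cues."""
    events = []
    prev = eye_frames[0].astype(np.float32)
    for i, frame in enumerate(eye_frames[1:], start=1):
        cur = frame.astype(np.float32)
        # A sudden large change in the eye region suggests a lid closure.
        if np.mean(np.abs(cur - prev)) > threshold:
            events.append(i)
        prev = cur
    return events
```

In practice the eye region would first be located with a detector such as OpenCV's Haar cascades, and the threshold tuned per user, as the abstract's note on per-user customization suggests.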


International Conference on Computational Science | 2015

Detecting Eye-Direction Using Afterimage and Ocular Movement

Ippei Torii; Kaoruko Ohtani; Naohiro Ishii

A support application that lets physically handicapped children communicate with others by blinking is studied in this paper. OpenCV was used to detect the eye area. A saturation-based method and an image-complexity method are combined to detect blinks more accurately. The technique is developed into a communication application with an accurate, high-precision blink determination system for selecting letters; the device can also convert the words into sound. This blink detection method is used to develop a non-contact communication support tool that judges eye direction from ocular movement. Combining blink determination with gaze direction detection made it possible to choose letters remarkably fast and accurately. Furthermore, the vibration of the center point of the eyes is digitized by comparison with an afterimage, and the midpoint of the amount of change is plotted on a scatter diagram to distinguish the user's state. In future study, neural networks may reveal associations between the blurring of left- and right-eye vibration and fatigue, lack of sleep, and degree of concentration.


International Conference on Human-Computer Interaction | 2013

Development of Support Applications for Elderly and Handicapped People with ICT Infrastructure

Ippei Torii; Kaoruko Ohtani; Takahito Niwa; Naohiro Ishii

We work on studying and developing support applications for elderly and handicapped people. First, we developed a new communication assistant application for autistic children, “Let’s Talk!”, focusing especially on easy and simple manipulation. We also developed a to-do application for kids, “Hanamaru”, and a scheduler application for elderly people, “Anshin”. We used ICT infrastructure, especially computer network systems such as SNS (Twitter, Facebook), e-mail, Skype, LINE, and a message board on the web site, to collect users' requests and opinions, and fed them back into improving the applications.


Procedia Computer Science | 2014

Study and Application of Detecting Blinks for Communication Assistant Tool

Ippei Torii; Kaoruko Ohtani; Naohiro Ishii

We develop a support application that lets physically handicapped children communicate with others by blinking. Because of limited body movements and mental disorders, many of them cannot communicate with their families or caregivers. If they can use smartphone applications by blinking, they will be able to tell what they really need or want. First, we detect the eye area using OpenCV. Then we develop a way to detect the opening and closing of the eyes, combining a saturation-based method with an image-complexity method to detect blinks more accurately. We then develop the technique into a communication application with an accurate, high-precision blink determination system that selects letters and converts them into sound.


Archive | 2012

Voice Communication Aid with Personal Digital Assistant for Autistic Children

Ippei Torii; Kaoruko Ohtani; Nahoko Shirahama; Takahito Niwa; Naohiro Ishii

“Let’s Talk!” is a new AAC (Augmentative and Alternative Communication) application on a personal digital assistant (PDA) for autistic children. This application has many advantages compared to existing AAC tools; we focused especially on easy and simple manipulation. By tapping a symbol on the PDA screen, a user can easily show his/her thoughts to others with pictures and sounds. The application includes 120 symbols based on daily life, and a user can also create an original page with new icons made from pictures or sounds. The operation of “Let’s Talk!” is simple and easy: when a user chooses one of 12 categories on the top page, more specific symbols appear, and a voice plays when a symbol is touched. For example, if a user chooses the “Eat” category and the “bread” symbol, the voice says “I want to eat bread.” Two modes can be switched depending on the user’s situation: in “Supportive Mode,” a supporter shows the application to a user; in “Self-use Mode,” a user tells what he/she wants directly with categories and symbols. Original icons can be made with the PDA’s camera or voice recorder, and a user can customize an original page by arranging self-made icons or existing symbols.


International Conference on Information Intelligence Systems and Applications | 2015

System and method for detecting gaze direction

Ippei Torii; Takahito Niwa; Kaoruko Ohtani; Naohiro Ishii

We develop a support application that lets physically handicapped children communicate with others by blinking. First, we detect the eye area using OpenCV. Then we develop a way to detect the opening and closing of the eyes, combining a saturation-based method with an image-complexity method to detect blinks more accurately. We then develop the technique into a communication application with an accurate, high-precision blink determination system that selects letters and converts them into sound. We applied this highly precise blink detection method to a non-contact communication support tool that judges eye direction from ocular movement, combining eye direction and blinks to choose a letter. This combination of blink determination and gaze direction detection made it possible to choose letters remarkably fast. Furthermore, we digitized the blurring of the vibration of the center points of the right and left eyes by comparison with an afterimage, plotted the midpoint of the amount of change on a scatter diagram, and distinguished the subject's state. In future work, we will use neural networks to find associations between the blurring of left- and right-eye vibration and fatigue, lack of sleep, and degree of concentration.
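The afterimage comparison is not specified in detail in these abstracts. One plausible reading is to keep a decaying average of past pupil centers (the "afterimage") and measure each new center's deviation from it, summing the left- and right-eye deviations per frame. A minimal sketch under that assumption; the `decay` factor and the (x, y) center format are hypothetical, not taken from the papers:

```python
import numpy as np

def eye_vibration_series(left_centers, right_centers, decay=0.8):
    """Digitize pupil-center 'vibration': for each frame, measure how far
    each eye's center has moved from a decaying afterimage of its past
    positions, and return the summed left+right deviation per frame."""
    after_l = np.asarray(left_centers[0], dtype=float)
    after_r = np.asarray(right_centers[0], dtype=float)
    totals = []
    for l, r in zip(left_centers[1:], right_centers[1:]):
        l = np.asarray(l, dtype=float)
        r = np.asarray(r, dtype=float)
        # Deviation of each eye from its afterimage, summed over both eyes.
        totals.append(np.linalg.norm(l - after_l) + np.linalg.norm(r - after_r))
        # Update the afterimages toward the new centers.
        after_l = decay * after_l + (1 - decay) * l
        after_r = decay * after_r + (1 - decay) * r
    return totals
```

The resulting per-frame totals could then be plotted as the scatter diagram the abstract mentions, with steady gaze producing values near zero and vibration producing larger ones.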


International Conference on Universal Access in Human-Computer Interaction | 2016

Development of Assessment Tool Judging Autism by Ocular Movement Measurement

Ippei Torii; Kaoruko Ohtani; Takahito Niwa; Naohiro Ishii

In this study, we discuss the development of an objective index for diagnosing children who have Kanner syndrome, with its lack of communication ability, and for evaluating curative effects, using ocular movement measurement. In a past study, we developed the communication applications “Eye Talk” and “Eye Tell”, based on our blink determination system, for people who have difficulty with conversation and writing, such as children with physical disabilities, ALS patients, and elderly people. The team of Dr. Kitazawa at the Graduate School of Frontier Biosciences, Osaka University, performed a clinical application that distinguishes the Kanner syndrome group by measuring “where and when” a subject looks using a Tobii eye tracker. Our study makes the judgment by ocular movement measurement. We developed an image processing technique based on the afterimage used in blink determination. First, the eye area is captured by the front camera of a laptop PC. Second, we extract the pupil pixels at 30–40 fps and digitize the eyeball movements. We convert the difference in eyeball movements between the right and left eyes into a graph and define it as a multidimensional measure. We measured how far the subject's eyes ran off the track relative to the afterimage, then summed the amount of change of the right and left eyes. After correcting the data, we set the identification border using the density function of the distribution, the cumulative frequency function, and the ROC curve. With this, we established an objective index to classify subjects as Kanner syndrome, normal, false positive, or false negative. Furthermore, after analyzing the data in two-dimensional coordinates, the difference between the autistic group and the typically developing group became clear. In the fixation task, there were few differences between children on the borderline between autistic and non-autistic and typically developing children; in the pursuit task, however, the identification border could be detected definitely.
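The abstracts set the "identification border" with an ROC curve but do not state the selection rule. A common choice is the threshold maximizing the true-positive rate minus the false-positive rate (Youden's J statistic). A minimal sketch of that standard technique, assuming per-subject scalar deviation scores and binary labels; this is an illustration of ROC thresholding in general, not necessarily the authors' exact procedure:

```python
import numpy as np

def roc_identification_border(scores, labels):
    """Pick the score threshold that maximizes TPR - FPR (Youden's J)
    over the empirical ROC curve; subjects scoring at or above the
    returned border would be classified as positive."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels, dtype=bool)
    pos, neg = labels.sum(), (~labels).sum()
    best_t, best_j = scores.min(), -1.0
    for t in np.unique(scores):
        pred = scores >= t
        tpr = (pred & labels).sum() / pos    # true-positive rate
        fpr = (pred & ~labels).sum() / neg   # false-positive rate
        if tpr - fpr > best_j:
            best_j, best_t = tpr - fpr, t
    return best_t
```

Scores above the border map to positive (here, the autistic group), and the four outcomes relative to the true labels give the positive, negative, false-positive, and false-negative classes the abstract describes.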


IEEE International Conference on Cloud Computing Technology and Science | 2016

Measurement of Ocular Movement Abnormality in Pursuit Eye Movement (PEM) of Autism Spectrum Children with Disability

Ippei Torii; Kaoruko Ohtani; Naohiro Ishii

In this study, we discuss the development of an objective index for diagnosing children who have autism spectrum disorder and for evaluating curative effects using ocular movement measurement. The image processing technique was developed from the afterimage used in blink determination. We measured how far the subject's eyes ran off the track relative to the afterimage, then summed the amount of change of the right and left eyes. After correcting the data, the identification border was set using the density function of the distribution, the cumulative frequency function, and the ROC curve. With this, an objective index to classify subjects as autistic, normal, false positive, or false negative was established. Furthermore, after analyzing the data in two-dimensional coordinates, the difference between the autism group and the typically developing group became clear. In the fixation task, there were few differences between children on the borderline between autistic and non-autistic and typically developing children; in the pursuit task, however, the identification border could be detected definitely. This inspection technique, which captures eyeball movements by afterimage, was shown to detect disorders of sociability clearly and easily. In the future, educational institutions can use this method to evaluate learning and curative effects.

Collaboration


Dive into Kaoruko Ohtani's collaborations.

Top Co-Authors

Ippei Torii (Aichi Institute of Technology)

Naohiro Ishii (Aichi Institute of Technology)

Takahito Niwa (Aichi Institute of Technology)

Shunki Takami (Aichi Institute of Technology)

Ayuki Yamamoto (Aichi Institute of Technology)