
Publication


Featured research published by Takahito Niwa.


2012 IEEE International Conference on Emerging Signal Processing Applications | 2012

Augmentative and alternative communication with digital assistant for autistic children

Ippei Torii; Kaoruko Ohtani; Takahito Niwa; Ayuki Yamamoto; Naohiro Ishii

“Let's Talk!” is a new AAC (Augmentative and Alternative Communication) application on a personal digital assistant (PDA) for autistic children. This application has many advantages compared to existing AAC tools. We especially focused on easy and simple manipulation. By tapping a symbol on the PDA's screen, a user can easily show his/her thoughts to others with pictures and sounds. The application has 120 symbols based on daily life, and a user can also create an original page with new icons made from pictures or sounds. The operation of “Let's Talk!” is simple and easy: when a user chooses one of 12 categories on the top page, more specific symbols appear, and a voice sounds when a symbol is touched. For example, if a user chooses the “Eat” category and the “bread” symbol, the voice says “I want to eat bread.” There are 2 modes which can be switched depending on the user's situation: in “Supportive Mode”, a supporter shows the application to a user; in “Self-use Mode”, a user tells what he/she wants directly with categories and symbols. It is possible to make original icons with the PDA's camera or voice recorder, and a user can customize an original page by arranging self-made icons or existing symbols.
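The category-to-symbol-to-sentence behavior described above can be pictured as a simple lookup. The sketch below is a hypothetical illustration of that interaction model, not the app's actual code; only the “Eat”/“bread” example comes from the abstract, and the other entries are made up.

```python
# Hypothetical category -> symbol -> spoken-sentence table.
# Only the "Eat"/"bread" pair is taken from the abstract; the rest
# are invented placeholders for illustration.
SYMBOLS = {
    "Eat": {"bread": "I want to eat bread.",
            "rice": "I want to eat rice."},
    "Drink": {"water": "I want to drink water."},
}

def speak(category, symbol):
    """Return the sentence the app would voice for a tapped symbol."""
    return SYMBOLS[category][symbol]

print(speak("Eat", "bread"))  # I want to eat bread.
```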


Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing | 2013

Development and Study of Support Applications for Autistic Children

Ippei Torii; Kaoruko Ohtani; Takahito Niwa; Naohiro Ishii

We developed a new VOCA (Voice Output Communication Aid) application for personal digital assistants (PDAs), “Let's Talk!”, for autistic children. This application has many advantages compared to existing VOCA tools. We especially focused on easy and simple manipulation. By tapping a symbol on the PDA's screen, a user can easily show his/her thoughts to others with pictures and sounds. There are 2 modes that can be switched depending on the user's situation. The application has 120 symbols based on daily life, and a user can also create an original page with new icons made from pictures or sounds, customizing it by arranging self-made icons or existing symbols. In the newest version of this application, we added a Task Schedule System to stimulate children's motivation to do things by themselves. In the last part of this study, we present some case studies: we introduced the application to students in a school for handicapped children and collected data.


International Conference on Tools with Artificial Intelligence | 2013

Study and Development of Support Tool with Blinks for Physically Handicapped Children

Ippei Torii; Kaoruko Ohtani; Takahito Niwa; Naohiro Ishii

In this study, we try to develop a new application that lets physically handicapped children communicate with others by blinking. Because of limited body movements and mental disorders, many of them cannot communicate with their families or caregivers. If they can operate a smartphone application by blinking, it will be a big help for them to tell caregivers what they really need or want to say. First, we detect the eye area using OpenCV. Then we develop a way to detect the opening and closing of the eyes, combining a method based on saturation with one based on image complexity to detect blinks more accurately. The level of disability varies widely among children, so we will make the application customizable to the user's situation. We will also try to reduce blink-detection errors and pursue high precision in the eye-tracking program.
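As a rough illustration of the image-complexity idea mentioned above, here is a minimal sketch, not the authors' implementation: it classifies a toy grayscale "eye image" as open or closed by counting light/dark transitions along each row, on the assumption that an open eye (dark iris against a bright sclera) produces more edges than a nearly uniform closed lid. The threshold values are arbitrary.

```python
def row_transitions(image, threshold=128):
    """Count bright/dark transitions across each row of a grayscale image.

    An open eye (dark iris, bright sclera, lashes) tends to produce
    more transitions per row than a closed, nearly uniform eyelid.
    """
    total = 0
    for row in image:
        binary = [1 if px >= threshold else 0 for px in row]
        total += sum(1 for a, b in zip(binary, binary[1:]) if a != b)
    return total

def is_eye_open(image, complexity_threshold=4):
    """Classify the eye state from the image-complexity measure."""
    return row_transitions(image) >= complexity_threshold

# Toy 4x6 "images": an open eye with a dark iris, and a closed lid.
open_eye = [
    [200, 200, 200, 200, 200, 200],
    [200,  40,  40,  40, 200, 200],
    [200,  40,  40,  40, 200, 200],
    [200, 200, 200, 200, 200, 200],
]
closed_eye = [[180] * 6 for _ in range(4)]

print(is_eye_open(open_eye))    # True: several light/dark transitions
print(is_eye_open(closed_eye))  # False: almost uniform lid
```

In a real system the eye region would first be located (e.g. with an OpenCV face/eye detector) before any such per-frame complexity measure is applied.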


International Conference on Human-Computer Interaction | 2013

Development of Support Applications for Elderly and Handicapped People with ICT Infrastructure

Ippei Torii; Kaoruko Ohtani; Takahito Niwa; Naohiro Ishii

We work on studying and developing support applications for elderly and handicapped people. First, we developed a new communication assistant application for autistic children, “Let's Talk!”, with a special focus on easy and simple manipulation. We also developed a to-do application for kids, “Hanamaru”, and a scheduler application for elderly people, “Anshin”. We used ICT infrastructure, especially computer network systems such as SNS (Twitter, Facebook), e-mail, Skype, Line, and a message board on the website, to collect users' requests and opinions, and fed them back to improve the applications.


Archive | 2012

Voice Communication Aid with Personal Digital Assistant for Autistic Children

Ippei Torii; Kaoruko Ohtani; Nahoko Shirahama; Takahito Niwa; Naohiro Ishii

“Let's Talk!” is a new AAC (Augmentative and Alternative Communication) application on a personal digital assistant (PDA) for autistic children. This application has many advantages compared to existing AAC tools. We especially focused on easy and simple manipulation. By tapping a symbol on the PDA's screen, a user can easily show his/her thoughts to others with pictures and sounds. The application has 120 symbols based on daily life, and a user can also create an original page with new icons made from pictures or sounds. The operation of “Let's Talk!” is simple and easy: when a user chooses one of 12 categories on the top page, more specific symbols appear, and a voice sounds when a symbol is touched. For example, if a user chooses the “Eat” category and the “bread” symbol, the voice says “I want to eat bread.” There are 2 modes which can be switched depending on the user's situation: in “Supportive Mode”, a supporter shows the application to a user; in “Self-use Mode”, a user tells what he/she wants directly with categories and symbols. It is possible to make original icons with the PDA's camera or voice recorder, and a user can customize an original page by arranging self-made icons or existing symbols.


International Conference on Information Intelligence Systems and Applications | 2015

System and method for detecting gaze direction

Ippei Torii; Takahito Niwa; Kaoruko Ohtani; Naohiro Ishii

We try to develop a support application that lets physically handicapped children communicate with others by blinking. First, we detect the eye area using OpenCV. Then we develop a way to detect the opening and closing of the eyes, combining a method based on saturation with one based on image complexity to detect blinks more accurately. We then develop the technique into a communication application with an accurate, high-precision blink determination system that selects letters and puts them into sound. We applied this highly precise blink detection method to develop a non-contact communication support tool that judges eye direction from ocular movement; combining eye direction with blinks to choose a letter made letter selection remarkably fast. Furthermore, we digitized the blurring of the vibration of the center points of the right and left eyes by comparison with an afterimage, plotted the midpoint of the amount of change on a scatter diagram, and distinguished the state of the subject. In the future, we will use a neural network to find associations between this blurring of left and right eyeball vibration and fatigue level, sleep deprivation, and degree of concentration.
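The "center point" of each eye mentioned above can be computed as a pixel center of gravity. The sketch below is an assumption about how such a measure might work, not the authors' code: it finds the centroid of dark (pupil) pixels in a toy binarized frame, so a shift of the centroid between frames indicates the direction of eye movement.

```python
def pupil_centroid(image, dark_threshold=60):
    """Center of gravity of dark (pupil) pixels in a grayscale image.

    Returns (row, col) as floats, or None if no dark pixel is found.
    """
    rows = cols = count = 0
    for r, row in enumerate(image):
        for c, px in enumerate(row):
            if px < dark_threshold:
                rows += r
                cols += c
                count += 1
    if count == 0:
        return None
    return rows / count, cols / count

def gaze_shift(prev_image, cur_image):
    """Vertical/horizontal centroid displacement between two frames."""
    (r0, c0), (r1, c1) = pupil_centroid(prev_image), pupil_centroid(cur_image)
    return r1 - r0, c1 - c0

# Toy 3x5 frames: the dark pupil moves one column to the right.
frame_a = [[200,  30, 200, 200, 200],
           [200,  30, 200, 200, 200],
           [200, 200, 200, 200, 200]]
frame_b = [[200, 200,  30, 200, 200],
           [200, 200,  30, 200, 200],
           [200, 200, 200, 200, 200]]
print(gaze_shift(frame_a, frame_b))  # (0.0, 1.0): pupil moved right
```

Frame-to-frame jitter in this centroid is exactly the kind of "blurring of the vibration of the center point" that the abstract proposes to analyze.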


International Conference on Intelligent Decision Technologies | 2018

Measurement of Abnormality in Eye Movement with Autism and Application for Detect Fatigue Level

Ippei Torii; Takahito Niwa; Naohiro Ishii

The authors aimed to establish an objective diagnostic criterion for children with autism spectrum disorder. To this end, they conducted eye-tracking tests on children with autism and neurotypical children. They obtained the pixel number variation (a numerical value) in gaze direction based on the center of mass of pixels associated with the pupil. The results were then plotted onto a two-dimensional graph, and distributions based on probability density function and receiver operating characteristic (ROC) curve analysis were ascertained. This analysis yielded a decision boundary clearly demarcating the autism and neurotypical distributions, confirming the reliability of the method. This finding suggests that this technique of measuring abnormality in pixel number is effective for distinguishing individuals with autism from those with typical development and can thus serve as an objective criterion for the diagnosis of autism. On the other hand, in modern society many people are under stress; when stress accumulates, the autonomic nervous system is disturbed, which leads to fatigue and causes mental disorders such as depression and autonomic imbalance. Fatigue had previously been evaluated only subjectively, using a visual analog scale (VAS) and questionnaires; there was no objective way to evaluate it. In this study, we apply the system to judge fatigue level by image processing and verify the relevance of fatigue level to eye movement.
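The decision-boundary procedure described above (ROC curve over a one-dimensional feature) can be sketched as follows. This is an illustrative reconstruction on made-up scores, not the authors' data or code: it sweeps thresholds over a hypothetical "pixel number variation" feature, computes true/false positive rates, and picks the threshold maximizing Youden's J statistic (TPR − FPR).

```python
def roc_points(positives, negatives):
    """(threshold, TPR, FPR) for each candidate threshold.

    A sample is classified positive when its score >= threshold.
    """
    points = []
    for t in sorted(set(positives) | set(negatives)):
        tpr = sum(1 for s in positives if s >= t) / len(positives)
        fpr = sum(1 for s in negatives if s >= t) / len(negatives)
        points.append((t, tpr, fpr))
    return points

def best_threshold(positives, negatives):
    """Decision boundary maximizing Youden's J = TPR - FPR."""
    return max(roc_points(positives, negatives),
               key=lambda p: p[1] - p[2])[0]

# Made-up 1-D feature scores (hypothetical, not the study's data):
autism_scores = [0.82, 0.74, 0.91, 0.68, 0.77]
typical_scores = [0.31, 0.45, 0.52, 0.38, 0.60]
boundary = best_threshold(autism_scores, typical_scores)
print(boundary)  # 0.68: separates the two toy distributions perfectly
```

On these toy scores the boundary achieves TPR = 1.0 and FPR = 0.0; on real overlapping distributions the ROC curve also quantifies the false-positive/false-negative trade-off at the chosen boundary.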


International Conference on Intelligent Decision Technologies | 2018

Measurement of Line-of-Sight Detection Using Pixel Quantity Variation for Autistic Disorder

Takahito Niwa; Ippei Torii; Naohiro Ishii

In this study, we develop a tool to support physically disabled people's communication and an assessment tool that measures the intelligence index of autistic children, using eye movements with image processing. To measure eye movements, we newly developed a pixel center-of-gravity method that detects the direction of eye movement from the point to which the weight of the black pixels has moved. This method differs from conventional black-eye detection or ellipse detection, and it enables accurate detection even when used by a physically handicapped person. The assessment tool that measures the intelligence index of autistic children, on the other hand, uses dedicated goggles that combine light-emitting diodes and near-infrared cameras, and measures differences between left and right eye movements.


International Conference on Universal Access in Human-Computer Interaction | 2016

Development of Assessment Tool Judging Autism by Ocular Movement Measurement

Ippei Torii; Kaoruko Ohtani; Takahito Niwa; Naohiro Ishii

In this study, we discuss the development of an objective index for diagnosing children with Kanner syndrome, who lack communication ability, and an evaluation of the curative effect using ocular movement measurement. In a past study, we developed the communication applications “Eye Talk” and “Eye Tell”, based on a blink determination system, for people who have difficulty with conversation and writing, such as children with physical disabilities, ALS patients, or elderly people. The team of Dr. Kitazawa at the Graduate School of Frontier Biosciences, Osaka University, performed a clinical application that distinguishes the Kanner syndrome group by measuring “where and when” a subject looks using a Tobii eye tracker. Our study makes the judgment by ocular movement measurement. We developed an image processing technique based on the afterimage used in blink determination. First, the eye area is captured by the front camera of a laptop PC. Second, we extract the pixels of the pupils at 30–40 fps and digitize the eyeball movements. We convert the difference in eyeball movements between the right and left eyes into a graph and define it as a multidimensional measure. We measure how far the subject's eyes run off the track based on the afterimage, then sum the amounts of change of the right and left eyes. After correcting the data, we set the identification border with the density function of the distribution, the cumulative frequency function, and the ROC curve. With this, we established an objective index to determine Kanner syndrome, normal, false positive, and false negative. Furthermore, after analyzing the data in two-dimensional coordinates, the difference between the autistic group and the typical development group became clear. When we validated with fixation, there were few differences between children on the borderline between autistic and non-autistic and typically developing children; however, the identification border could be detected definitely in pursuit.


IEEE International Conference on Cloud Computing Technology and Science | 2016

Development of Evacuation Guiding System for Earthquake Disaster

Takahito Niwa; Ippei Torii; Naohiro Ishii

It is predicted that a Nankai Trough earthquake will occur in Japan in the near future. Based on this prediction, we developed a disaster simulation using projection mapping. Our study aims to promote disaster information visualization, disaster reduction education, and voluntary, active individual behavior for disaster reduction. We mapped video onto a diorama of a three-dimensional map model of Aichi, Japan, made with a 3D printer. This makes it possible to present the hazard map, which could previously only be shown in a plane, in three dimensions or as an animation. In addition, the Nagoya Municipal Minato Disaster Prevention Center purchased this device on January 20, 2015, and set it up as a permanent exhibition; it is very popular and used by many children. We extended this study and ported the simulation equipment to a smartphone application, utilizing data from disaster drills carried out in the Minamichita and Utsumi areas of Aichi, Japan, to develop a location identification system and a refuge navigation system.

Collaboration


Dive into Takahito Niwa's collaborations.

Top Co-Authors

Ippei Torii (Aichi Institute of Technology)
Naohiro Ishii (Aichi Institute of Technology)
Kaoruko Ohtani (Aichi Institute of Technology)
Manabu Onogi (Aichi Institute of Technology)
Yousuke Okada (Aichi Institute of Technology)
Ayuki Yamamoto (Aichi Institute of Technology)
Shota Murayama (Aichi Institute of Technology)