Publication


Featured research published by Kota Akehi.


International Conference on Human Interface and the Management of Information | 2015

Computer Input System Using Eye Glances

Shogo Matsuno; Kota Akehi; Naoaki Itakura; Tota Mizuno; Kazuyuki Mito

We have developed a real-time Eye Glance input interface that uses a Web camera to capture eye movements as inputs. In previous studies, an eye-controlled input interface was developed using an electro-oculograph (EOG) amplified by AC coupling. Our earlier Eye Gesture input interface used combinations of eye movements and, unlike conventional eye-gaze input methods, did not require head movement to be restricted. However, it required an input start operation before capturing could commence. This led us to propose the Eye Glance input method, which uses pairs of contradirectional eye movements as inputs and avoids the need for a start operation. That method still required electrodes, which were uncomfortable to attach. We therefore switched to a camera-based interface that records eye movements from facial images, realizing a noncontact, low-restraint interface. The Eye Glance input method measures the direction of eye movement and the time required for the eye to move a specified distance, using optical flow computed with OpenCV. In this study, we analyzed the waveforms obtained from eye movements using a purpose-built detection algorithm and examined the causes of detection failures.
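
The optical-flow step described in this abstract can be illustrated with a short sketch. The following Python/OpenCV code estimates the dominant direction of eye movement within a fixed region of interest using dense optical flow; the camera index, ROI coordinates, and thresholds are illustrative assumptions, not the parameters of the published detection algorithm.

```python
import cv2
import numpy as np

cap = cv2.VideoCapture(0)            # default web camera (assumed index)
roi = (100, 200, 150, 260)           # (y0, y1, x0, x1) around the eyes; illustrative values
prev_gray = None

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    eye = gray[roi[0]:roi[1], roi[2]:roi[3]]
    if prev_gray is not None:
        # Dense optical flow between consecutive frames of the eye region
        flow = cv2.calcOpticalFlowFarneback(prev_gray, eye, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        dx, dy = float(flow[..., 0].mean()), float(flow[..., 1].mean())
        if abs(dx) > 1.0 or abs(dy) > 1.0:      # illustrative movement threshold
            if abs(dx) > abs(dy):
                direction = "right" if dx > 0 else "left"
            else:
                direction = "down" if dy > 0 else "up"
            print(direction, round(dx, 2), round(dy, 2))
    prev_gray = eye

cap.release()
```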


Innovative Mobile and Internet Services in Ubiquitous Computing | 2018

Discrimination of Eye Blinks and Eye Movements as Features for Image Analysis of the Around Ocular Region for Use as an Input Interface

Shogo Matsuno; Masatoshi Tanaka; Keisuke Yoshida; Kota Akehi; Naoaki Itakura; Tota Mizuno; Kazuyuki Mito

This paper examines an input method based on ocular analysis that combines eye-motion and eye-blink features to realize an eye-controlled input interface that functions independently of gaze-position measurement. This is achieved by analyzing visible-light images captured without special equipment. We propose applying two methods: one detects eye motions using optical flow, and the other classifies voluntary eye blinks. The experimental evaluation assessed both identification algorithms simultaneously, and both were examined for their applicability to an input interface. The results are consolidated and evaluated, and the paper concludes by considering future directions for this work.
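
As a rough illustration of how the two analyses might be combined into a single decision per frame window, the sketch below merges a hypothetical motion classifier and blink classifier; the function names, priority rule, and thresholds are assumptions for illustration and do not reproduce the authors' algorithms.

```python
from typing import Optional

def classify_motion(dx: float, dy: float, thresh: float = 1.0) -> Optional[str]:
    """Map a mean optical-flow vector over the ocular region to a coarse direction."""
    if abs(dx) < thresh and abs(dy) < thresh:
        return None
    if abs(dx) > abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"

def classify_blink(closure_integral: float, thresh: float = 30.0) -> Optional[str]:
    """Treat a large integral of the eye-closure waveform as a voluntary blink."""
    return "voluntary_blink" if closure_integral > thresh else None

def merge(dx: float, dy: float, closure_integral: float) -> Optional[str]:
    # A blink also perturbs the optical flow in the ocular region, so the blink
    # decision is given priority over the motion decision for the same window.
    return classify_blink(closure_integral) or classify_motion(dx, dy)

print(merge(0.2, 0.1, 42.0))    # -> voluntary_blink
print(merge(2.3, -0.4, 5.0))    # -> right
```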


International Conference on Human-Computer Interaction | 2017

Automatic Classification of Eye Blinks and Eye Movements for an Input Interface Using Eye Motion

Shogo Matsuno; Masatoshi Tanaka; Keisuke Yoshida; Kota Akehi; Naoaki Itakura; Tota Mizuno; Kazuyuki Mito

The objective of this study is to develop a multi-gesture input interface that uses several eye motions simultaneously. We propose a new method for automatically classifying eye blinks and eye movements from moving images captured by a web camera installed on an information device. Eye motions are classified using two image-analysis methods: one classifies the direction of movement based on optical flow, and the other detects voluntary blinks based on the integral of the eye-blink waveform, recorded as the change in the eye-opening area. We developed an algorithm that runs the two methods simultaneously, built a classification system based on the proposed method, and conducted an experimental evaluation in which the average classification rate was 79.33%. This indicates that multiple eye movements can be distinguished using an ordinary video camera.
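
A minimal sketch of the blink side of this method is given below: the eye-opening area is tracked per frame, each closure dip is integrated, and a large integral is taken as a voluntary blink. The area-extraction rule and all threshold values are illustrative assumptions rather than the published parameters.

```python
import numpy as np

def eye_open_area(eye_roi_gray: np.ndarray, dark_thresh: int = 60) -> int:
    """Approximate the eye-opening area as the number of dark (iris/pupil) pixels in the eye ROI."""
    return int(np.count_nonzero(eye_roi_gray < dark_thresh))

def closure_integrals(area_waveform: np.ndarray, open_baseline: float) -> list:
    """Integrate each contiguous dip of the area waveform below half the open-eye baseline."""
    closed = area_waveform < 0.5 * open_baseline
    integrals, acc = [], 0.0
    for is_closed, area in zip(closed, area_waveform):
        if is_closed:
            acc += open_baseline - area
        elif acc > 0.0:
            integrals.append(acc)
            acc = 0.0
    if acc > 0.0:
        integrals.append(acc)
    return integrals

def is_voluntary(integral: float, thresh: float = 500.0) -> bool:
    # Voluntary blinks tend to be longer and deeper than spontaneous ones,
    # so their closure integral is larger (threshold value is illustrative).
    return integral > thresh
```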


International Conference on Human-Computer Interaction | 2017

Development of Device for Measurement of Skin Potential by Grasping of the Device

Tota Mizuno; Shogo Matsuno; Kota Akehi; Kazuyuki Mito; Naoaki Itakura; Hirotoshi Asano

In this study, we developed a device that measures skin potential activity while the subject simply grasps it. Skin potential activity is an established indicator for evaluating Mental Work-Load (MWL): when a person experiences mental stress such as tension or excitement, emotional sweating appears at skin sites such as the palm and sole, and the skin potential at these sites varies accordingly. At present, skin potential activity of the hand is measured with electrodes attached along the whole arm. If, instead, skin potential activity (and thus emotional sweating) could be measured with an electrode placed only on the palm, it would be feasible to build a portable burden-evaluation interface that measures MWL while the subject holds it. We therefore investigated a prototype portable load-evaluation interface for its capacity to measure skin potential activity while held in the subject's hand. In this interface the electrode is attached to the device rather than directly to the hand, so measurement is possible while the subject grips the device; furthermore, because the electrode is attached laterally rather than longitudinally, a touch at any point on the sides of the device enables measurement. The electrodes used in this study were tin-foil tapes. In the experiment, subjects held the interface while it measured their MWL. The amplitude of skin potential activity (which reflects the strength of the stimulus administered to the subjects) obtained with the proposed method was lower than that obtained with the conventional method. Nonetheless, because the sweat response to stimulation could still be quantified, the study demonstrated the feasibility of load measurement using only the palm.
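
The following sketch illustrates, under assumed sampling parameters, one simple way to quantify the amplitude of a skin potential response from a sampled palm signal: take the peak-to-peak change in a short window after a stimulus marker. It is not the authors' measurement circuit or analysis; the sampling rate, window length, and synthetic data are invented for illustration.

```python
import numpy as np

def spr_amplitude(signal_uv: np.ndarray, fs: float, stim_index: int,
                  window_s: float = 3.0) -> float:
    """Peak-to-peak skin potential change (same units as signal_uv) within
    window_s seconds after the stimulus sample index."""
    stop = min(len(signal_uv), stim_index + int(window_s * fs))
    segment = signal_uv[stim_index:stop] - signal_uv[stim_index]  # reference to stimulus onset
    return float(segment.max() - segment.min())

# Synthetic example: a 1 mV transient riding on baseline noise, stimulus at t = 4 s.
fs = 200.0
t = np.arange(0, 10, 1 / fs)
sig = 20 * np.random.randn(t.size)                          # baseline noise, microvolts
sig[int(4 * fs):int(5 * fs)] += 1000 * np.hanning(int(fs))  # simulated sweat response
print(round(spr_amplitude(sig, fs, stim_index=int(4 * fs)), 1), "microvolts peak-to-peak")
```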


IEEE Region 10 Conference | 2016

Eye-movement measurement for operating a smart device: A small-screen line-of-sight input system

Shogo Matsuno; Saitoh Sorao; Chida Susumu; Kota Akehi; Naoaki Itakura; Tota Mizuno; Kazuyuki Mito

A real-time eye-glance input interface was developed for a camera-equipped smartphone. Eye-glance input is one of several line-of-sight input methods that capture eye movements over a relatively small screen. In previous studies, a quasi-eye-controlled input interface was developed using the eye-gaze method, which uses gaze position as an input trigger. This method allows intuitive and accurate input to information devices, but it has certain problems: (1) measurement accuracy requires careful calibration and a fixed positional relationship between the user and the system; (2) deciding the input position by eye-gaze dwell time slows down input; and (3) orientation information must be presented during input. Put differently, these requirements mean that the accuracy demanded of the eye-gaze measuring device increases as the screen becomes smaller. The eye-gaze method has traditionally needed a relatively wide screen, which has made eye-controlled input difficult on a smartphone. Our proposed method solves this problem because the required input accuracy is independent of screen size. We report a prototype input interface based on the eye-glance input method for a smartphone, with an experimentally measured line-of-sight input accuracy of approximately 70%.
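
To illustrate why the required accuracy can be independent of screen size, the sketch below shows a possible decision layer in which only the pair of contradirectional movement directions, not a calibrated gaze point, determines the input. The direction labels and command names are hypothetical placeholders, not part of the published system.

```python
from typing import Optional

# An "eye glance" is an outward movement followed immediately by the
# contradirectional return movement, so only the direction pair matters.
GLANCE_COMMANDS = {
    ("up_right", "down_left"): "select_top_right",
    ("up_left", "down_right"): "select_top_left",
    ("down_right", "up_left"): "select_bottom_right",
    ("down_left", "up_right"): "select_bottom_left",
}

def decode_glance(first: str, second: str) -> Optional[str]:
    """Return a command only for a valid contradirectional pair of eye movements."""
    return GLANCE_COMMANDS.get((first, second))

print(decode_glance("up_right", "down_left"))   # -> select_top_right
print(decode_glance("up_right", "up_left"))     # -> None (not contradirectional)
```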


The International Conference on Electronics and Software Science (ICESS2015) | 2015

Improvement in Eye Glance Input Interface Using OpenCV

Kota Akehi; Shogo Matuno; Naoaki Itakura; Tota Mizuno; Kazuyuki Mito


IEEJ Transactions on Electronics, Information and Systems | 2017

A Multiple-choice Input Interface using Slanting Eye Glance

Shogo Matsuno; Yuta Ito; Kota Akehi; Naoaki Itakura; Tota Mizuno; Kazuyuki Mito


Advances in Computer-Human Interaction | 2016

Autonomic Nervous Activity Estimation Algorithm with Facial Skin Thermal Image

Tota Mizuno; Shusuke Kawazura; Kota Akehi; Shogo Matsuno; Hirotoshi Asano; Kazuyuki Mito; Naoaki Itakura


IEEJ Transactions on Electronics, Information and Systems | 2016

Measuring Facial Skin Temperature Changes Caused by Mental Work-Load with Infrared Thermography

Tota Mizuno; Takeru Sakai; Shunsuke Kawazura; Hirotoshi Asano; Kota Akehi; Shogo Matsuno; Kazuyuki Mito; Yuichiro Kume; Naoaki Itakura


Journal of Signal Processing | 2017

Non-contact Eye-Glance Input Interface Using Video Camera

Kota Akehi; Shogo Matuno; Naoaki Itakura; Tota Mizuno; Kazuyuki Mito

Collaboration


Dive into Kota Akehi's collaborations.

Top Co-Authors

Kazuyuki Mito, University of Electro-Communications
Naoaki Itakura, University of Electro-Communications
Tota Mizuno, University of Electro-Communications
Shogo Matsuno, University of Electro-Communications
Chida Susumu, University of Electro-Communications
Saitoh Sorao, University of Electro-Communications
Shogo Matuno, University of Electro-Communications