Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Kazuyuki Mito is active.

Publication


Featured research published by Kazuyuki Mito.


Journal of Physiological Sciences | 2008

Histological skeletal muscle damage and surface EMG relationships following eccentric contractions.

Yutaka Kano; Kazumi Masuda; H Furukawa; Mizuki Sudo; Kazuyuki Mito; Kazuyoshi Sakamoto

This study examined the effects of different numbers of eccentric contractions (ECs) on histological characteristics, surface electromyogram (EMG) parameters (integral EMG, iEMG; muscle fiber conduction velocity, MFCV; and action potential waveform), and isometric peak torque using the rat EC model. Male Wistar rats (n = 40) were anesthetized, and ECs were initiated in the tibialis anterior muscle via electrical stimulation while the muscle was being stretched by an electromotor. The rats were grouped according to the number of ECs (EC1, EC5, EC10, EC20, EC30, EC40, and EC100). Three days after the ECs, surface EMG signals and isometric peak torque were measured during evoked twitch contractions via electrical stimulation of the peroneal nerve. Muscle damage was evaluated from hematoxylin-eosin (HE) stained cross sections as the number of damaged fibers relative to intact fibers. Intense histological muscle damage (approximately 50% to 70% of the fibers), loss of isometric peak torque, disturbance of the action potential waveform, and depression of iEMG (approximately -60% to -70%) were observed at EC20, EC30, EC40, and EC100. On the other hand, the MFCV did not change in any EC group. Although muscle damage and pathological surface EMG signals were not found at EC10, isometric peak torque was reduced significantly. In conclusion, the extent of histological muscle damage is not proportionally related to the number of ECs. Muscle damage was reflected by iEMG and action potential waveforms, but not by MFCV, which remained unaffected even though approximately 50% to 70% of the fibers demonstrated injury.
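The two EMG parameters named above are simple to compute from the recorded signal. A minimal sketch (function names, sample values, and units are ours, not the paper's): iEMG is the time integral of the rectified EMG, and MFCV is the inter-electrode distance divided by the propagation delay of the action potential between two surface-electrode sites.

```python
def integrated_emg(signal, dt):
    """iEMG: integral of the rectified EMG signal (amplitude units x s).
    `signal` is a list of samples; `dt` is the sampling interval in seconds."""
    return sum(abs(s) for s in signal) * dt

def conduction_velocity(distance_m, delay_s):
    """MFCV (m/s): inter-electrode distance divided by the delay of the
    action potential between the two surface-electrode sites."""
    return distance_m / delay_s
```

For example, a 10 mm electrode spacing and a 2 ms propagation delay give an MFCV of 5 m/s.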


international conference on human interface and management of information | 2015

Computer Input System Using Eye Glances

Shogo Matsuno; Kota Akehi; Naoaki Itakura; Tota Mizuno; Kazuyuki Mito

We have developed a real-time Eye Glance input interface that uses a Web camera to capture eye gaze inputs. In previous studies, an eye-controlled input interface was developed using an electro-oculograph (EOG) amplified by AC coupling. Our earlier Eye Gesture input interface used combinations of eye movements and, unlike conventional eye gaze input methods, did not require the restriction of head movement; however, it required an input start operation before capturing could commence. This led us to propose the Eye Glance input method, which uses combinations of contradirectional eye movements as inputs and avoids the need for start operations. That method required electrodes, which were uncomfortable to attach, so the interface was changed to a camera that records eye movements from facial images, realizing an improved noncontact and low-restraint interface. The Eye Glance input method measures the direction of movement and the time required by the eye to move a specified distance, using optical flow with OpenCV from Intel. In this study, we analyzed the waveform obtained from eye movements using a purpose-built detection algorithm, and examined the causes of failure in cases where eye movements were not detected from the waveform.
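The abstract does not give the detection algorithm itself. As a rough sketch under our own assumptions (direction labels, thresholds, and function names are illustrative, and the per-frame mean flow vectors would come from an optical-flow routine such as OpenCV's `calcOpticalFlowFarneback`), an Eye Glance can be modeled as an oblique excursion followed by the opposite return stroke:

```python
import math

def classify_glance(dx, dy):
    """Classify a mean optical-flow vector (dx, dy) into one of four
    oblique glance directions. Image coordinates: y grows downward."""
    if math.hypot(dx, dy) < 1.0:          # illustrative magnitude threshold
        return None                        # movement too small: not a glance
    if dx > 0 and dy < 0:
        return "upper-right"
    if dx < 0 and dy < 0:
        return "upper-left"
    if dx > 0 and dy > 0:
        return "lower-right"
    return "lower-left"

def detect_eye_glance(flows):
    """An Eye Glance is a reciprocating movement: an oblique excursion
    immediately followed by the return stroke in the opposite direction.
    `flows` is a time-ordered list of mean flow vectors, one per frame."""
    directions = [d for d in (classify_glance(dx, dy) for dx, dy in flows) if d]
    opposite = {"upper-right": "lower-left", "upper-left": "lower-right",
                "lower-right": "upper-left", "lower-left": "upper-right"}
    for i in range(len(directions) - 1):
        if opposite[directions[i]] == directions[i + 1]:
            return directions[i]           # report the outbound direction
    return None
```

With this sketch, a flow sequence that moves to the upper right, pauses, and returns to the lower left is reported as an upper-right glance, while a one-way drift is rejected.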


international conference on human-computer interaction | 2013

Study of eye-glance input interface

Dekun Gao; Naoaki Itakura; Tota Mizuno; Kazuyuki Mito

Optical measurement devices for eye movements are generally expensive, and it is often necessary to restrict user head movements when various eye-gaze input interfaces are used. Previously, we proposed a novel eye-gesture input interface that utilized electrooculography amplified via AC coupling and does not require a head-mounted display [1]. Instead, combinations of eye-gaze displacement directions were used as the selection criteria. This interface showed a success rate of approximately 97.2%, but it was necessary for the user to declare his or her intention to perform an eye gesture by blinking or pressing an enter key. In this paper, we propose a novel eye-glance input interface that can consistently recognize glance behavior without a prior declaration, and provide a decision algorithm that we believe is suitable for eye-glance input on small screens such as those of smartphones. In experiments using our improved eye-glance input interface, we achieved a detection rate of approximately 93% and a direction determination success rate of approximately 79.3%. A smartphone screen design for use with the eye-glance input interface is also proposed.


international conference on human-computer interaction | 2016

Physiological and Psychological Evaluation by Skin Potential Activity Measurement Using Steering Wheel While Driving

Shogo Matsuno; Takahiro Terasaki; Shogo Aizawa; Tota Mizuno; Kazuyuki Mito; Naoaki Itakura

This paper proposes a new method for practical skin potential activity (SPA) measurement while driving a car, by installing electrodes on the outer periphery of the steering wheel. Evaluating the psychophysiological state of the driver while driving is important for accident prevention. We investigated whether the physiological and psychological state of the driver can be evaluated by measuring SPA while driving. To this end, we devised a way to measure SPA by installing electrodes in the steering wheel. The electrodes are made of tin foil and are placed along the outer periphery of the wheel, considering that hand positions while driving are not fixed. The potential difference is increased by changing the impedance through the width of the electrodes. An experiment to investigate the possibility of measuring SPA using the conventional and the proposed methods was conducted with five healthy adult males, with a physical stimulus applied to the forearm of the subjects. It was found that the proposed method could measure SPA, even though the result was slightly smaller than that of the conventional method of affixing electrodes directly on the hands.


innovative mobile and internet services in ubiquitous computing | 2018

Discrimination of Eye Blinks and Eye Movements as Features for Image Analysis of the Around Ocular Region for Use as an Input Interface

Shogo Matsuno; Masatoshi Tanaka; Keisuke Yoshida; Kota Akehi; Naoaki Itakura; Tota Mizuno; Kazuyuki Mito

This paper examines an input method for ocular analysis that incorporates eye-motion and eye-blink features to enable an eye-controlled input interface that functions independently of gaze-position measurement. This was achieved by analyzing visible-light images captured without special equipment. We propose applying two methods: one detects eye motions using optical flow, and the other classifies voluntary eye blinks. The experimental evaluations assessed both identification algorithms simultaneously, and both were examined for applicability in an input interface. The results are consolidated and evaluated, and the paper concludes by considering the future of this topic.


Artificial Life and Robotics | 2018

Estimating autonomic nerve activity using variance of thermal face images

Shogo Matsuno; Tota Mizuno; Hirotoshi Asano; Kazuyuki Mito; Naoaki Itakura

In this paper, we propose a novel method for evaluating mental workload (MWL) using variances in facial temperature. Our method aims to evaluate autonomic nerve activity using a single facial thermal image. The autonomic nervous system is active under MWL. In previous studies, temperature differences between the nasal and forehead portions of the face were used in MWL evaluation and estimation; nasal skin temperature (NST) is therefore considered a reliable indicator of autonomic nerve activity. In addition, autonomic nerve activity has little effect on forehead temperature; thus, the temperature difference between the nasal and forehead portions of the face has also been demonstrated to be a good indicator of autonomic nerve activity (along with other physiological indicators such as EEG and heart rate). However, these approaches have not considered temperature changes in other parts of the face. We therefore propose a novel method using variances in temperature across the entire face. The proposed method captures other parts of the face for temperature monitoring, thereby increasing evaluation and estimation accuracy at higher sensitivity than conventional methods. Finally, we also examined whether further high-precision evaluation and estimation is feasible. Our results showed that the proposed method is highly accurate compared with previous studies using NST.
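The core statistic here is simple to state. As a hedged sketch (the paper's exact pipeline is not given; the toy temperatures below are ours), the indicator is the variance of temperature over all facial pixels, which widens when the nasal region cools under load while the forehead stays stable:

```python
from statistics import pvariance

def facial_temperature_variance(face_pixels):
    """Population variance of temperature (deg C) over all facial pixels.
    `face_pixels` is a flat list of facial-pixel temperatures, taken after
    the face region has been extracted from the thermal image."""
    return pvariance(face_pixels)

# Toy values: under mental workload the nasal skin cools while the
# forehead stays stable, widening the spread of facial temperatures.
rest = [34.0, 34.1, 34.2, 34.0]
load = [34.0, 34.1, 32.5, 34.0]   # one nasal pixel has cooled
```

With these toy values, the variance under load exceeds the variance at rest, which is the direction of change the method relies on.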


international conference on human-computer interaction | 2017

Automatic Classification of Eye Blinks and Eye Movements for an Input Interface Using Eye Motion

Shogo Matsuno; Masatoshi Tanaka; Keisuke Yoshida; Kota Akehi; Naoaki Itakura; Tota Mizuno; Kazuyuki Mito

The objective of this study is to develop a multi-gesture input interface using several eye motions simultaneously. We propose a new automatic classification method for eye blinks and eye movements from moving images captured using a web camera installed on an information device. Eye motions were classified using two methods of image analysis: one classifies the direction of movement based on optical flow, and the other detects voluntary blinks based on the integral value of the eye-blink waveform, recorded as changes in the eye-opening area. We developed an algorithm to run the two methods simultaneously, built a classification system based on the proposed method, and conducted an experimental evaluation in which the average classification rate was 79.33%. This indicates that it is possible to distinguish multiple eye movements using a general video camera.
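The blink half of the method can be sketched briefly. Under our own assumptions (threshold, baseline, and function names are illustrative, not the paper's values), a voluntary blink is longer and deeper than a spontaneous one, so the integral of the drop in eye-opening area below baseline separates the two:

```python
def blink_integral(open_area, baseline):
    """Sum over frames of the drop in eye-opening area below baseline.
    Voluntary blinks are longer and deeper, so their integral is larger."""
    return sum(max(baseline - a, 0.0) for a in open_area)

def is_voluntary_blink(open_area, baseline=1.0, threshold=2.0):
    """Classify a blink as voluntary when the integral of the area-drop
    waveform exceeds a threshold (all values illustrative)."""
    return blink_integral(open_area, baseline) > threshold
```

A brief dip in the area waveform stays under the threshold (spontaneous blink), while a sustained closure accumulates a large integral (voluntary blink).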


international conference on human-computer interaction | 2017

Development of Device for Measurement of Skin Potential by Grasping of the Device

Tota Mizuno; Shogo Matsuno; Kota Akehi; Kazuyuki Mito; Naoaki Itakura; Hirotoshi Asano

In this study, we developed a device for measuring skin potential activity that requires the subject only to grasp it. Skin potential activity is an established indicator for evaluating mental workload (MWL): when a human experiences mental stress, such as tension or excitement, emotional sweating appears at skin sites such as the palm and sole, and the skin potential at these sites varies concomitantly. At present, skin potential activity of the hand is measured by electrodes attached to the whole arm. If a method could instead measure skin potential activity (and in turn emotional sweating) with an electrode placed on the palm only, it would be feasible to develop a novel portable burden-evaluation interface that measures MWL while the subject simply holds it. In this study, a prototype portable load-evaluation interface was investigated for its capacity to measure skin potential activity while held in the subject's hand. This interface, in which an electrode is attached to the device rather than directly to the hand, can take measurements while the subject grips the device. Moreover, by attaching the electrode laterally rather than longitudinally, a touch at any point on the sides of the device enables measurement. The electrodes used in this study were tin foil tapes. In the experiment, subjects held the interface while it measured their MWL. The amplitude of skin potential activity (which reflects the strength of the stimulus administered to the subjects) obtained by the proposed method was lower than that obtained by the conventional method. Nonetheless, because the sweat response due to stimulation could be quantified with the proposed method, the study demonstrated the possibility of load measurement using only the palm.


international conference on human-computer interaction | 2017

Investigation of Facial Region Extraction Algorithm Focusing on Temperature Distribution Characteristics of Facial Thermal Images

Tomoyuki Murata; Shogo Matsuno; Kazuyuki Mito; Naoaki Itakura; Tota Mizuno

In our previous research, we expanded the range to be analyzed to the entire face, because there were regions in the mouth, in addition to the nose, where the temperature fluctuated according to the mental workload (MWL), and we evaluated the MWL with high accuracy by this method. However, previous studies have clarified that the edge portion of the face, which is viewed at a shallow angle by the thermographic camera, exhibits reduced measured emissivity owing to reflection and the like, and as a result the accuracy of the temperature data decreases. In this study, we aim to automatically extract the target facial region from the thermal image by focusing on the temperature distribution of the facial thermal image, and to examine automating the evaluation. Evaluating whether the analysis range could be automatically extracted from 80 facial images, we succeeded in an automatic extraction usable for analysis in about 90% of the images.
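The extraction step exploits the fact that facial skin is markedly warmer than the background in a thermal image. A minimal sketch of that idea (the threshold value and names are ours; the paper's actual algorithm, which also handles the unreliable cool edge band, is not given here):

```python
def facial_mask(thermal_image, t_skin=30.0):
    """Boolean mask of pixels warm enough to be facial skin.
    `thermal_image` is a 2-D list of temperatures in deg C; `t_skin` is
    an illustrative skin-temperature threshold, not the paper's value."""
    return [[t >= t_skin for t in row] for row in thermal_image]

# Toy 2x3 frame: warm face pixels surrounded by cooler background.
frame = [[22.0, 34.0, 22.5],
         [23.0, 33.5, 21.0]]
```

Applied to the toy frame, only the two warm central pixels are marked as facial; a real pipeline would then restrict analysis to that region.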


international conference on computers helping people with special needs | 2016

A Study of an Intention Communication Assisting System Using Eye Movement

Shogo Matsuno; Yuta Ito; Naoaki Itakura; Tota Mizuno; Kazuyuki Mito

In this paper, we propose a new intention communication assisting system that uses eye movement. The proposed method solves the problems associated with the conventional eye gaze input method. Hands-free input methods that use the behavior of the eye, including blinking and line of sight, have been used to assist the intention communication of people with severe physical disabilities. In particular, line-of-sight input devices based on eye gaze have been used extensively because of their intuitive operation, and can be used by almost any patient, except those with weak eyesight. However, the eye gaze method has disadvantages: a certain input time is required to determine each eye gaze input, and fixation targets must be presented when performing input. To solve these problems, we propose a new line-of-sight input method, the eye glance input method. Eye glance input can be performed in four directions by detecting reciprocating movements (eye glances) in the oblique directions. Using the proposed method, it is possible to perform rapid environmental control with simple measurements. We developed an evaluation system based on the proposed method using electrooculography and experimentally evaluated the input accuracy with 10 subjects. An average accuracy of approximately 84.82% was obtained, which confirms the effectiveness of the proposed method. In addition, we examined the application of the proposed method to actual intention communication assisting systems.

Collaboration


Dive into Kazuyuki Mito's collaborations.

Top Co-Authors

Naoaki Itakura
University of Electro-Communications

Tota Mizuno
University of Electro-Communications

Kazuyoshi Sakamoto
University of Electro-Communications

Shogo Matsuno
University of Electro-Communications

Kota Akehi
University of Electro-Communications

Kenichi Kaneko
University of Electro-Communications

Hitoshi Makabe
University of Electro-Communications