
Publication


Featured research published by Takashi Nagamatsu.


human factors in computing systems | 2010

MobiGaze: development of a gaze interface for handheld mobile devices

Takashi Nagamatsu; Michiya Yamamoto; Hiroshi Sato

Handheld mobile devices with touch screens are widely used but are awkward to operate with one hand. To solve this problem, we propose MobiGaze, a user interface that uses one's gaze (a gaze interface) to operate a handheld mobile device. Using stereo cameras, the user's line of sight is detected in 3D, enabling the user to interact with a mobile device by means of his/her gaze. We have constructed a prototype system of MobiGaze that consists of two cameras with IR-LEDs, a Windows-based notebook PC, and an iPod touch. Moreover, we have developed several applications for MobiGaze.


eye tracking research & application | 2010

Gaze estimation method based on an aspherical model of the cornea: surface of revolution about the optical axis of the eye

Takashi Nagamatsu; Yukina Iwamoto; Junzo Kamahara; Naoki Tanaka; Michiya Yamamoto

A novel gaze estimation method based on an aspherical model of the cornea is proposed in this paper. The model is a surface of revolution about the optical axis of the eye. The calculation method is explained on the basis of this model. A prototype system for estimating the point of gaze (POG) has been developed using this method. The proposed method has been found to be more accurate than the gaze estimation method based on a spherical model of the cornea.
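A common way to describe such a rotationally symmetric aspheric surface is the conic sag equation from optics. The sketch below is illustrative only; the paper's exact surface parameters are not reproduced here, and the apex radius and conic constant are assumed typical values for a human cornea.

```python
import math

def sag(r, R=7.8, k=-0.25):
    """Axial height z(r) of a conic surface of revolution.

    R: apex radius of curvature in mm (a typical cornea is ~7.8 mm)
    k: conic constant; k = 0 gives a sphere, k < 0 an ellipsoid
       that flattens toward the periphery like a real cornea.
    """
    return r * r / (R * (1.0 + math.sqrt(1.0 - (1.0 + k) * r * r / (R * R))))

def surface_normal(r, R=7.8, k=-0.25):
    """Unit normal (radial, axial components) at radial distance r."""
    # dz/dr for the conic sag: r / sqrt(R^2 - (1+k) r^2)
    dzdr = r / (R * math.sqrt(1.0 - (1.0 + k) * r * r / (R * R)))
    length = math.hypot(dzdr, 1.0)
    return (-dzdr / length, 1.0 / length)
```

With k = 0 the model reduces to the spherical cornea that the paper compares against, so the spherical method is the special case of this surface of revolution.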


eye tracking research & application | 2008

One-point calibration gaze tracking based on eyeball kinematics using stereo cameras

Takashi Nagamatsu; Junzo Kamahara; Takumi Iko; Naoki Tanaka

This paper presents a one-point calibration gaze tracking method based on eyeball kinematics using stereo cameras. By using two cameras and two light sources, the optic axis of the eye can be estimated. One-point calibration is required to estimate the angle of the visual axis from the optic axis. The eyeball rotates with the optic and visual axes according to eyeball kinematics (Listing's law). Therefore, we introduced eyeball kinematics into the one-point calibration process in order to properly estimate the visual axis. A prototype system was developed, and its accuracy was found to be under 1° around the center and bottom of the display.
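The one-point calibration step can be pictured as estimating a fixed rotation from the measured optic axis to the visual axis while the user fixates a single known target. The sketch below is only an illustration of that basic offset idea via the Rodrigues formula; the paper's actual formulation additionally uses Listing's law to keep the offset consistent under ocular torsion, which this sketch does not model.

```python
import numpy as np

def rotation_from_calibration(optic_axis, gaze_dir):
    """Rotation matrix taking the measured optic axis onto the known
    gaze direction of a single calibration fixation (Rodrigues formula).
    Assumes the two directions are not anti-parallel."""
    a = optic_axis / np.linalg.norm(optic_axis)
    b = gaze_dir / np.linalg.norm(gaze_dir)
    v = np.cross(a, b)                       # rotation axis (unnormalized)
    c = float(np.dot(a, b))                  # cosine of the offset angle
    K = np.array([[0, -v[2], v[1]],
                  [v[2], 0, -v[0]],
                  [-v[1], v[0], 0]])
    return np.eye(3) + K + K @ K / (1.0 + c)

def visual_axis(optic_axis, R):
    """Apply the calibrated offset to a new optic-axis estimate."""
    a = optic_axis / np.linalg.norm(optic_axis)
    return R @ a
```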


robot and human interactive communication | 2008

3D gaze tracking with easy calibration using stereo cameras for robot and human communication

Takashi Nagamatsu; Junzo Kamahara; Naoki Tanaka

This paper presents a method to estimate the optic and visual axes of an eye and the Point Of Gaze (POG) on the basis of Listing's law using stereo cameras for three-dimensional (3D) gaze tracking. By using two cameras and two light sources, the optic axis of the eye can be estimated on the basis of a spherical model of the cornea. A one-point calibration is required to estimate the angle of the visual axis from the optic axis. However, a real cornea has an aspheric shape, and it is therefore difficult to estimate the POG accurately in all directions. We have used three light sources to improve the estimation of the POG. The two light sources nearest the center of the pupil in the camera image are used for estimating the POG. The experimental results show that the accuracy of the estimation of the visual axis is under about 1°.


human factors in computing systems | 2009

Calibration-free gaze tracking using a binocular 3D eye model

Takashi Nagamatsu; Junzo Kamahara; Naoki Tanaka

This paper presents a calibration-free method for estimating the point of gaze (POG) on a display by using two pairs of stereo cameras. By using one pair of cameras and two light sources, the optical axis of the eye and the position of the center of the cornea can be estimated. This estimation is carried out by using a spherical model of the cornea. One pair of cameras is used for the estimation of the optical axis of the left eye, and the other pair is used for the estimation of the optical axis of the right eye. The point of intersection of the optical axis with the display is termed the point of the optical axis (POA). The POG is approximately estimated as the midpoint of the line segment joining the POAs of both eyes on the display. We have developed a prototype system based on this method and demonstrated that the midpoint of the POAs was closer to the fiducial point that the user gazed at than either POA alone.
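The geometry described above, intersecting each eye's optical axis with the display plane and averaging the two intersection points, might be sketched as follows. Variable names are illustrative; the display is modeled as a plane given by a point and a normal.

```python
import numpy as np

def poa(cornea_center, optical_axis, plane_point, plane_normal):
    """Point of the optical axis (POA): intersection of the
    optical-axis ray from the corneal center with the display plane."""
    d = optical_axis / np.linalg.norm(optical_axis)
    n = plane_normal / np.linalg.norm(plane_normal)
    t = np.dot(plane_point - cornea_center, n) / np.dot(d, n)
    return cornea_center + t * d

def pog(poa_left, poa_right):
    """Calibration-free POG estimate: midpoint of the two POAs."""
    return 0.5 * (poa_left + poa_right)
```

Averaging the two POAs works because the individual angular offsets between each eye's optical and visual axes tend to be roughly symmetric, so their intersection errors on the display partially cancel.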


conference on human interface | 2007

Development of a skill acquisition support system using expert's eye movement

Takashi Nagamatsu; Yohei Kaieda; Junzo Kamahara; Hiroyuki Shimada

In Japan, the number of experts in various fields, such as manufacturing and traditional arts and crafts, is decreasing because of retirement. In the field of nuclear power generation, this has also become a problem, especially in maintenance. The goal of this study was to develop an advanced instruction video system to support skill acquisition. We proposed using eye movement as one form of an expert's skill information, and developed a system that supports recording a skill (the scene video and the expert's point of regard), creating teaching materials in XML format, and learning the skill through a special user interface. An experimental evaluation will be necessary to confirm the effectiveness of the training method.


eye tracking research & application | 2010

Development of eye-tracking pen display based on stereo bright pupil technique

Michiya Yamamoto; Takashi Nagamatsu; Tomio Watanabe

Intuitive user interfaces for PCs and PDAs, such as pen displays and touch panels, have become widely used in recent times. In this study, we have developed an eye-tracking pen display based on the stereo bright pupil technique. First, a bright pupil camera was developed by examining the arrangement of cameras and LEDs for the pen display. Next, a gaze estimation method that enables one-point calibration was proposed for the stereo bright pupil camera. Then, a prototype of the eye-tracking pen display was developed. The accuracy of the system was approximately 0.7° on average, which is sufficient for human interaction support. We also developed an eye-tracking tabletop as an application of the proposed stereo bright pupil technique.


acm multimedia | 2012

Conjunctive ranking function using geographic distance and image distance for geotagged image retrieval

Junzo Kamahara; Takashi Nagamatsu; Naoki Tanaka

Nowadays, an enormous number of photographic images are uploaded on the Internet by casual users. In this study, we consider the concept of embedding geographical identification of locations as geotags in images. We attempt to retrieve images having certain similarities (or identical objects) from a geotagged image dataset. We then define the images having identical objects as orthologous images. Using content-based image retrieval (CBIR), we propose a ranking function--orthologous identity function (OIF)--to estimate the degree to which two images contain similarities in the form of identical objects; OIF is a similarity rating function that uses the geographic distance and image distance of photographs. Further, we evaluate the OIF as a ranking function by calculating the mean reciprocal rank (MRR) using our experimental dataset. The results reveal that the OIF can improve the efficiency of retrieving orthologous images as compared to using only geographic distance or image distance.
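The paper's exact OIF definition is not reproduced here, but a generic conjunctive combination of geographic distance (haversine between geotags) and an image-feature distance, together with the MRR metric used for evaluation, might look like this sketch; the weights `alpha` and `beta` are hypothetical.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two geotags, in km."""
    r1, r2 = math.radians(lat1), math.radians(lat2)
    dlat, dlon = r2 - r1, math.radians(lon2 - lon1)
    a = math.sin(dlat / 2) ** 2 + math.cos(r1) * math.cos(r2) * math.sin(dlon / 2) ** 2
    return 2 * 6371.0 * math.asin(math.sqrt(a))

def oif_score(geo_dist, img_dist, alpha=1.0, beta=1.0):
    """Hypothetical conjunctive score (smaller = better match).
    The actual OIF may weight or normalize the two distances differently."""
    return alpha * geo_dist + beta * img_dist

def mean_reciprocal_rank(ranked_lists, relevant):
    """MRR over queries: ranked_lists[i] is the ranking for query i,
    relevant[i] the id of its correct (orthologous) image."""
    total = sum(1.0 / (ranking.index(rel) + 1)
                for ranking, rel in zip(ranked_lists, relevant))
    return total / len(ranked_lists)
```

Ranking candidates by `oif_score` and reporting `mean_reciprocal_rank` against the known orthologous image mirrors the evaluation setup the abstract describes.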


eye tracking research & application | 2012

Mathematical model for wide range gaze tracking system based on corneal reflections and pupil using stereo cameras

Takashi Nagamatsu; Michiya Yamamoto; Ryuichi Sugano; Junzo Kamahara

In this paper, we propose a mathematical model for a wide-range gaze tracking system based on corneal reflections and the pupil, using calibrated stereo cameras and light sources. We demonstrate a general calculation method for estimating the optical axis of the eye for a combination of non-coaxial and coaxial configurations of many cameras and light sources. Gaze estimation is possible only when light is reflected from the spherical surface of the cornea. Moreover, we provide a method for calculating the eye rotation range in which gaze tracking can be achieved, which is useful for positioning cameras and light sources in real-world applications.


international symposium on pervasive displays | 2014

Development of Corneal Reflection-based Gaze Tracking System for Public Use

Takashi Nagamatsu; Kaoruko Fukuda; Michiya Yamamoto

In this paper, we describe a corneal reflection-based gaze tracking system for public use. Two problems must be solved when gaze tracking technologies are employed as the interaction mechanism for a public display: realizing a personal calibration-free system, and extending the tracking range. We developed a personal calibration-free gaze tracking system at the Maritime Museum of Kobe University by using two calibrated cameras and multiple light sources. Furthermore, we are developing a system that dynamically changes which LEDs are lit in order to estimate the gaze over a wide range.

Collaboration


Dive into Takashi Nagamatsu's collaborations.

Top Co-Authors

Tomio Watanabe

Okayama Prefectural University


Hidekazu Yoshikawa

Harbin Engineering University


Yutaka Ishii

Okayama Prefectural University


Hiroshi Sato

Kwansei Gakuin University