
Publication


Featured research published by Yasuyuki Saguchi.


International Conference on Pattern Recognition | 2006

Proposal of recordable pointer: Pointed position measurement by projecting interference concentric circle pattern with a pointing device

Yasuji Seko; Yoshinori Yamaguchi; Yasuyuki Saguchi; Jun Miyazaki; Hiroyasu Koshimizu

We propose a new pointing device that measures pointed positions by processing the interference concentric circles it projects. The device has a donut-shaped lens designed both to split the laser source into two virtual sources that form an optical interference pattern and to project the concentric circle pattern over a wide area. Two image sensors on the projected side capture small parts of the concentric circles, and the center coordinate of the pattern, which is the pointed position of the device, is calculated from two lines normal to the arcs of the circles. In practice, we succeeded in measuring the pointed position accurately by processing the widely projected concentric circle patterns in real time. We demonstrated mouse cursor operation on a large screen with the pointing device and also used it as a real-object-based user interface that shows information related to real objects when they are pointed at.
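The geometric core of the abstract above is recovering the circle center, i.e. the pointed position, from two lines normal to the captured arcs: every normal to a circular arc passes through the circle's center, so two normals suffice. A minimal sketch of that intersection step (the point and direction values are illustrative, not from the paper):

```python
import numpy as np

def intersect_normals(p1, d1, p2, d2):
    """Intersect two lines, each given as a point on an arc plus the arc's
    normal direction. Solves p1 + t*d1 = p2 + s*d2 for t; the intersection
    is the common center of the concentric circles."""
    A = np.column_stack([d1, -d2])      # t*d1 - s*d2 = p2 - p1
    t, _ = np.linalg.solve(A, p2 - p1)
    return p1 + t * d1

# Two arc fragments of circles centered at (3, 2): one normal is horizontal,
# the other vertical, so their intersection recovers the center.
center = intersect_normals(np.array([5.0, 2.0]), np.array([1.0, 0.0]),
                           np.array([3.0, 6.0]), np.array([0.0, 1.0]))
```

In practice the normal directions would come from fitting each captured arc fragment; near-parallel normals make the system ill-conditioned, which is presumably why the two sensors view well-separated parts of the pattern.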


International Conference on Pattern Recognition | 2006

Firefly capturing method: Motion capturing by monocular camera with large spherical aberration of lens and Hough-transform-based image processing

Yasuji Seko; Yasuyuki Saguchi; Hiroyuki Hotta; Jun Miyazaki; Hiroyasu Koshimizu

We demonstrate a new motion capture method that uses a monocular camera whose lens has large spherical aberration to measure, in real time and without any sequential lighting, the 3D positions of point light sources attached to an object. The lens's large spherical aberration transforms each point light source into a circle pattern. The diameter and center position of the circle pattern give the distance and direction to the light source, yielding its 3D position. The circle patterns are extracted by Hough-transform-based video image processing even when they overlap each other. We track the circle patterns by predicting their next positions with a Kalman filter that includes the acceleration of movement. By combining these processing techniques, we succeeded in demonstrating real-time motion capture of several LEDs, shown in 3D graphics.
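The mapping the abstract describes, from a detected ring's center and diameter to a 3D position, might be sketched as below. The inverse-diameter distance law `z = cal_k / diameter` and the pinhole direction model are assumptions for illustration only; the paper states merely that diameter encodes distance and center encodes direction, and the real relation would come from calibration:

```python
import numpy as np

def ring_to_point(cx, cy, diameter, focal_px, cal_k):
    """Hypothetical conversion of a detected ring to a 3D point.
    cx, cy: ring center in pixels, measured from the principal point.
    diameter: ring diameter in pixels.
    focal_px: focal length in pixels (pinhole model, assumed).
    cal_k: calibration constant for the assumed z = cal_k / diameter law."""
    z = cal_k / diameter        # distance grows as the ring shrinks (assumed)
    x = z * cx / focal_px       # direction from the ring center, pinhole model
    y = z * cy / focal_px
    return np.array([x, y, z])

p = ring_to_point(cx=100.0, cy=50.0, diameter=20.0,
                  focal_px=500.0, cal_k=10000.0)
```

Whatever the true distance law, the key property is that it is monotonic in the diameter, so a per-camera lookup table fitted at known distances would serve the same role as `cal_k` here.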


Proceedings of SPIE, the International Society for Optical Engineering | 2005

Simultaneous 3D position sensing by means of large-scale spherical aberration of lens and Hough transform technique

Yasuji Seko; Yasuyuki Saguchi; Yoshinori Yamaguchi; Hiroyuki Hotta; Kazumasa Murai; Jun Miyazaki; Hiroyasu Koshimizu

We demonstrate real-time 3D position sensing of multiple light sources by capturing the ring images into which they are transformed by a lens system with large spherical aberration. The diameter of each ring image varies with the distance to its light source, and the ring's center position determines the direction toward it; the 3D positions of the light sources are therefore calculated by detecting the diameters and center positions of the circles. Here we succeeded in measuring the 3D positions of multiple light sources simultaneously in real time by extracting and tracking the circle patterns individually. Each circle is extracted by a Hough transform technique that uses three edge points that are not closely distributed to search for primary votes above a threshold, and is tracked by predicting its successive positions with a Kalman filter. These processes make it possible to measure the 3D positions of the light sources even when several circles overlap. In the experiment, we tracked several circle patterns, measuring their center positions and diameters and thus the 3D positions of the LEDs in real space. The measurement error of the 3D position of an LED averaged 6.8 mm over 150 sampling points at distances ranging from 450 mm to 950 mm.
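The extraction step above seeds the Hough voting with three edge points that are not closely spaced: three non-collinear points determine a unique candidate circle. That sub-step (not the full voting scheme) can be computed by intersecting the perpendicular bisectors of two chords:

```python
import numpy as np

def circle_from_three_points(p1, p2, p3):
    """Center and radius of the circle through three non-collinear points.
    Each perpendicular bisector satisfies 2*(pj - pi) . c = |pj|^2 - |pi|^2,
    giving a 2x2 linear system for the center c."""
    A = 2.0 * np.array([p2 - p1, p3 - p1])
    b = np.array([p2 @ p2 - p1 @ p1, p3 @ p3 - p1 @ p1])
    center = np.linalg.solve(A, b)
    return center, np.linalg.norm(p1 - center)

# Three edge points on a circle of center (1, 2) and radius 5.
center, radius = circle_from_three_points(np.array([6.0, 2.0]),
                                          np.array([1.0, 7.0]),
                                          np.array([-4.0, 2.0]))
```

Requiring the three points to be well separated, as the abstract does, keeps this system well-conditioned; nearly coincident edge points make the bisectors almost parallel and the candidate circle unstable.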


Archive | 2006

Position measurement system

Yasuji Seko; Yoshinori Yamaguchi; Yasuyuki Saguchi


Archive | 2009

Position measuring system, position measuring method and computer readable medium

Yasuji Seko; Hiroyuki Hotta; Yasuyuki Saguchi


Archive | 2008

Position measurement system, position measurement method and computer readable medium

Yasuji Seko; Hiroyuki Hotta; Yasuyuki Saguchi


Archive | 2009

Operation determining system, operation determining device and computer readable medium

Kazutoshi Yatsuda; Hiroyuki Hotta; Yasuyuki Saguchi


Archive | 2008

Position measuring apparatus, object to be recognized, and program

Hiroyuki Hotta; Yasuyuki Saguchi; Yasuji Seko


Archive | 2011

Image processing apparatus, image processing system, image processing method, and computer readable medium

Tetsuya Kimura; Kengo Shinozaki; Shoji Sakamoto; Yasuyuki Saguchi; Hideki Baba


Archive | 2010

Position measuring system, processing device for position measurement, and processing method for position measurement

Hiroyuki Hotta; Kazutoshi Yatsuda; Yasuyuki Saguchi
