
Publication


Featured research published by Chul Woo Cho.


IEEE Transactions on Consumer Electronics | 2010

Gaze tracking system at a distance for controlling IPTV

Hyeon Chang Lee; Duc Thien Luong; Chul Woo Cho; Eui Chul Lee; Kang Ryoung Park

Gaze tracking is used for detecting the position that a user is looking at. In this research, a new gaze-tracking system and method are proposed to control a large-screen TV at a distance. This research is novel in the following four ways compared to previous work: First, it is the first system for gaze tracking on a large-screen TV at a distance. Second, to increase convenience, the user's eye is captured by a remote gaze-tracking camera, so the user is not required to wear any device. Third, without complicated calibrations among the screen, camera, and eye coordinates, the gaze position on the TV screen is obtained by a simple 2-D method based on a geometric transform of the pupil center and four corneal specular reflections. Fourth, by using a near-infrared (NIR) passing filter on the camera and NIR illuminators, the pupil region becomes distinctive in the input image irrespective of changes in the environmental visible light. Experimental results showed that the proposed system could be used as a new interface for controlling a TV with a 60-inch (16:9) screen.
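
The 2-D mapping step described above can be sketched as a projective (homography) transform: the four corneal specular reflections define a quadrilateral in the eye image whose corners correspond to the four screen corners, and the pupil center is mapped through that transform. This is a minimal sketch of the general technique, not the authors' exact implementation; all coordinates below are illustrative.

```python
import numpy as np

def homography(src, dst):
    """Solve the 3x3 projective transform mapping four src points to dst."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y])
        b.extend([u, v])
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def map_point(H, p):
    """Apply the homography to a 2-D point (homogeneous divide)."""
    x, y, w = H @ np.array([p[0], p[1], 1.0])
    return float(x / w), float(y / w)

# Four corneal reflections in the eye image (illustrative pixel coords),
# paired with the four corners of a 1920x1080 screen.
sr_corners = [(310, 205), (350, 205), (350, 235), (310, 235)]
screen_corners = [(0, 0), (1920, 0), (1920, 1080), (0, 1080)]
H = homography(sr_corners, screen_corners)
pupil = (330, 220)  # pupil center, here exactly mid-quadrilateral
print(map_point(H, pupil))
```

Because the pupil center sits midway between the reflections in this example, it maps to the screen center.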


Optical Engineering | 2009

Robust gaze-tracking method by using frontal-viewing and eye-tracking cameras

Chul Woo Cho; Ji Woo Lee; Eui Chul Lee; Kang Ryoung Park

Gaze-tracking technology is used to obtain the position of a user's viewpoint. Here, a new gaze-tracking method is proposed based on a wearable goggle-type device that includes an eye-tracking camera and a frontal-viewing camera. The proposed method is novel in five ways compared to previous research. First, it can track the user's gazing position while allowing natural facial and eye movements, by using the frontal-viewing and eye-tracking cameras. Second, an eye gaze position is calculated using a geometric transform based on the mapping function among three rectangular regions: the rectangle defined by the four pupil centers detected when a user gazes at the four corners of a monitor, the distorted monitor region observed by the frontal-viewing camera, and the actual monitor region. Third, a facial gaze position is estimated based on the geometric center and the four internal angles of the monitor region detected by the frontal-viewing camera. Fourth, a final gaze position is obtained using a weighted summation of the eye and facial gazing positions. Fifth, since a simple 2-D method is used to obtain the gazing position instead of a complicated 3-D method, the proposed method can operate at real-time speeds. Experimental results show that the root-mean-square (RMS) error of gaze estimation is less than 1 deg.


Sensors | 2013

Remote Gaze Tracking System on a Large Display

Hyeon Chang Lee; Won Oh Lee; Chul Woo Cho; Su Yeong Gwon; Kang Ryoung Park; Hee-Kyung Lee; Jihun Cha

We propose a new remote gaze tracking system as an intelligent TV interface. Our research is novel in the following three ways. First, because a user can sit at various positions in front of a large display, the capture volume of the gaze tracking system must be large, so the proposed system includes two cameras that can be moved simultaneously by panning and tilting mechanisms: a wide view camera (WVC) for detecting eye position and an auto-focusing narrow view camera (NVC) for capturing enlarged eye images. Second, to remove the complicated calibration between the WVC and NVC and to enhance the capture speed of the NVC, the two cameras are combined in a parallel structure. Third, the auto-focusing of the NVC is achieved on the basis of both the user's facial width in the WVC image and a focus score calculated on the eye image of the NVC. Experimental results showed that the proposed system can operate with a gaze tracking accuracy of ±0.737°∼±0.775° and a speed of 5∼10 frames/s.
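
A focus score of the kind mentioned above can be computed, for example, as the variance of a Laplacian response over the eye image; this is a common sharpness measure, offered here only as a plausible stand-in for the paper's unspecified focus score.

```python
import numpy as np

def focus_score(gray):
    """Sharpness as the variance of a 4-neighbour Laplacian response.

    Well-focused images keep their high-frequency detail, so a larger
    score means a sharper image.
    """
    g = np.asarray(gray, dtype=float)
    lap = (-4.0 * g[1:-1, 1:-1]
           + g[:-2, 1:-1] + g[2:, 1:-1]
           + g[1:-1, :-2] + g[1:-1, 2:])
    return lap.var()

sharp = (np.indices((8, 8)).sum(0) % 2) * 255.0  # checkerboard: high detail
flat = np.full((8, 8), 128.0)                    # defocused extreme: no detail
print(focus_score(sharp) > focus_score(flat))    # → True
```

An auto-focus loop would then drive the lens toward the position that maximizes this score on the NVC's eye crop.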


International Journal of Advanced Robotic Systems | 2013

Robust Eye and Pupil Detection Method for Gaze Tracking

Su Yeong Gwon; Chul Woo Cho; Hyeon Chang Lee; Won Oh Lee; Kang Ryoung Park

Robust and accurate pupil detection is a prerequisite for gaze detection. Hence, we propose a new eye/pupil detection method for gaze detection on a large display. The novelty of our research can be summarized by the following four points. First, in order to overcome the performance limitations of conventional methods of eye detection, such as adaptive boosting (Adaboost) and continuously adaptive mean shift (CAMShift) algorithms, we propose adaptive selection of the Adaboost and CAMShift methods. Second, this adaptive selection is based on two parameters: pixel differences in successive images and matching values determined by CAMShift. Third, a support vector machine (SVM)-based classifier is used with these two parameters as the input, which improves the eye detection performance. Fourth, the center of the pupil within the detected eye region is accurately located by means of circular edge detection, binarization and calculation of the geometric center. The experimental results show that the proposed method can detect the center of the pupil at a speed of approximately 19.4 frames/s with an RMS error of approximately 5.75 pixels, which is superior to the performance of conventional detection methods.
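
The final localization step (binarization plus geometric center) can be sketched as below. The threshold value is an assumption; under NIR illumination the pupil appears as the darkest blob, so dark-pixel thresholding is a reasonable first pass. The paper refines the result further with circular edge detection, which is omitted here.

```python
import numpy as np

def pupil_center(gray, threshold=40):
    """Locate the pupil center as the centroid of the dark pixels.

    Binarize the image (pupil pixels fall below the threshold) and
    return the geometric center of the below-threshold region.
    """
    mask = np.asarray(gray) < threshold
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None  # no dark region found
    return float(xs.mean()), float(ys.mean())

# Synthetic eye image: bright background with a dark pupil disk at (60, 40).
img = np.full((100, 100), 255.0)
yy, xx = np.mgrid[:100, :100]
img[(xx - 60) ** 2 + (yy - 40) ** 2 <= 100] = 0.0
print(pupil_center(img))  # centroid close to (60.0, 40.0)
```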


Sensors | 2014

Gaze Tracking System for User Wearing Glasses

Su Yeong Gwon; Chul Woo Cho; Hyeon Chang Lee; Won Oh Lee; Kang Ryoung Park

Conventional gaze tracking systems are limited when the user is wearing glasses, because the glasses usually produce noise due to reflections caused by the gaze tracker's lights. This makes it difficult to locate the pupil and the specular reflections (SRs) from the cornea of the user's eye. These difficulties increase the likelihood of gaze detection errors, because the gaze position is estimated from the location of the pupil center and the positions of the corneal SRs. To overcome these problems, we propose a new gaze tracking method that can be used by subjects who are wearing glasses. Our research is novel in the following four ways: first, we construct a new control device for the illuminator, which includes four illuminators positioned at the four corners of a monitor. Second, our system automatically determines whether a user is wearing glasses in the initial stage by counting the number of white pixels in an image captured using the low exposure setting on the camera. Third, if the user is wearing glasses, the four illuminators are turned on and off sequentially to obtain an image with a minimal amount of noise from reflections off the glasses. As a result, it is possible to avoid the reflections and accurately locate the pupil center and the positions of the four corneal SRs. Fourth, because one of the four illuminators is turned off, only three corneal SRs exist in the captured image. Since the proposed gaze detection method requires four corneal SRs for calculating the gaze position, the unseen SR position is estimated based on the parallelogram shape defined by the three SR positions, and the gaze position is then calculated. Experimental results showed that the average gaze detection error over 20 persons was about 0.70° and the processing time was 63.72 ms per frame.
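
The parallelogram completion used for the unseen reflection is a one-line vector identity: the missing corner equals the sum of its two neighbours minus the corner diagonally opposite it. A minimal sketch (coordinates illustrative):

```python
def estimate_missing_sr(neighbour_a, opposite, neighbour_b):
    """Complete a parallelogram from three known corners.

    neighbour_a and neighbour_b are the corners adjacent to the
    missing one; opposite is the corner diagonally across from it.
    """
    return (neighbour_a[0] + neighbour_b[0] - opposite[0],
            neighbour_a[1] + neighbour_b[1] - opposite[1])

# Three visible corneal reflections; the fourth is recovered.
print(estimate_missing_sr((310, 205), (350, 205), (350, 235)))  # → (310, 235)
```

With all four SR positions available, the usual four-point gaze mapping can proceed unchanged.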


Optical Engineering | 2014

Binocular gaze detection method using a fuzzy algorithm based on quality measurements

Chul Woo Cho; Hyeon Chang Lee; Su Yeong Gwon; Jong Man Lee; Dongwook Jung; Kang Ryoung Park; Hyun-Cheol Kim; Jihun Cha

Due to the limitations of gaze detection based on one eye, binocular gaze detection using the gaze positions of both eyes has been researched. Most previous binocular gaze detection research calculated the gaze position as the simple average of the detected gaze points of both eyes. To improve on this approach, we propose a new binocular gaze detection method using a fuzzy algorithm with quality measurements of both eyes. The proposed method is novel in the following three ways. First, to combine the gaze points of the left and right eyes, we measure four qualities for each eye: distortion by an eyelid, distortion by specular reflection (SR), the level of circularity of the pupil, and the distance between the pupil boundary and the SR center. Second, to obtain a more accurate pupil boundary, we compensate for the pupil boundary distorted by an eyelid based on information from the lower half-circle of the pupil. Third, the final gaze position is calculated using a fuzzy algorithm based on the four quality scores. Experimental results show that the root-mean-square error of gaze estimation by the proposed method is approximately 0.67518 deg.
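
The combination step can be illustrated with a plain quality-weighted average; the paper itself uses fuzzy inference over the four quality scores, so treat this as a simplified stand-in, with all names and scores hypothetical.

```python
def combine_gaze(gaze_left, gaze_right, q_left, q_right):
    """Blend both eyes' gaze points by their quality scores.

    q_left / q_right are aggregate quality scores in [0, 1]; an eye
    distorted by an eyelid or a specular reflection gets a low score
    and therefore less influence on the final gaze point.
    """
    total = q_left + q_right
    w = 0.5 if total == 0 else q_left / total  # fall back to plain average
    return (w * gaze_left[0] + (1 - w) * gaze_right[0],
            w * gaze_left[1] + (1 - w) * gaze_right[1])

# A fully reliable left eye dominates an unreliable right eye.
print(combine_gaze((0, 0), (10, 10), 1.0, 0.0))  # → (0.0, 0.0)
```

When both eyes score equally, this reduces to the simple average used by the earlier work the paper improves on.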


Optical Engineering | 2012

Auto-focusing method for remote gaze tracking camera

Won Oh Lee; Hyeon Chang Lee; Chul Woo Cho; Su Yeong Gwon; Kang Ryoung Park; Hee-Kyung Lee; Jihun Cha

Gaze tracking determines what a user is looking at; the key challenge is to obtain well-focused eye images. This is not easy because the human eye is very small, whereas the resolution of the captured image must be large enough for accurate detection of the pupil center. In addition, capturing a user's eye image with a remote gaze tracking system over a large working volume at a long Z distance requires a panning/tilting mechanism with a zoom lens, which makes it even more difficult to acquire focused eye images. To solve this problem, a new auto-focusing method for remote gaze tracking is proposed. The proposed approach is novel in the following four ways. First, it is the first research on an auto-focusing method for a remote gaze tracking system. Second, by using user-dependent calibration at the initial stage, it overcomes the weakness of previous methods that estimate the Z distance between a user and the camera from the facial width in the captured image, since facial width varies from person to person. Third, the parameters of the modeled formula for estimating the Z distance are adaptively updated using the least squares regression method, so the focus becomes more accurate over time. Fourth, the relationship between the parameters and the face width is fitted locally according to the Z distance instead of by global fitting, which enhances the accuracy of Z distance estimation. The results of an experiment with 10,000 images of 10 persons showed that the mean absolute error between the ground-truth Z distance measured by a Polhemus Patriot device and that estimated by the proposed method was 4.84 cm. A total of 95.61% of the images obtained by the proposed method were focused and could be used for gaze detection.
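
The least-squares update of the Z-distance model can be sketched under a pinhole-camera assumption, where the face width in pixels shrinks roughly as 1/Z. The model form Z ≈ a/w + b and the data below are illustrative, not the paper's exact formula.

```python
import numpy as np

def fit_z_model(face_widths, z_distances):
    """Fit Z ≈ a / w + b by linear least squares.

    Regressing Z on 1/w makes the model linear in (a, b), so the
    parameters can be re-estimated cheaply as new observations of
    (face width, measured Z distance) arrive.
    """
    w = np.asarray(face_widths, dtype=float)
    A = np.column_stack([1.0 / w, np.ones_like(w)])
    coef, *_ = np.linalg.lstsq(A, np.asarray(z_distances, float), rcond=None)
    return coef  # (a, b)

def predict_z(coef, face_width):
    a, b = coef
    return a / face_width + b

# Synthetic calibration pairs generated from Z = 20000 / w + 5.
widths = [100.0, 150.0, 200.0, 250.0]
zs = [20000.0 / w + 5.0 for w in widths]
coef = fit_z_model(widths, zs)
print(predict_z(coef, 120.0))
```

Refitting as more pairs are observed mirrors the paper's idea that the focus estimate improves over time.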


Sensors | 2015

A New Gaze Estimation Method Considering External Light

Jong Man Lee; Hyeon Chang Lee; Su Yeong Gwon; Dongwook Jung; Weiyuan Pan; Chul Woo Cho; Kang Ryoung Park; Hyun-Cheol Kim; Jihun Cha

Gaze tracking systems usually utilize near-infrared (NIR) lights and NIR cameras, and the performance of such systems is mainly affected by external light sources that include NIR components. External light produces additional (imposter) corneal specular reflections (SRs), which make it difficult to discriminate the correct SR, caused by the NIR illuminator of the gaze tracking system, from the imposter SR. To overcome this problem, a new method is proposed for determining the correct SR in the presence of external light, based on the relationship between the corneal SR and the pupil movable area, together with the relative position of the pupil and the corneal SR. The experimental results showed that the proposed method makes the gaze tracking system robust to external light.


Optics and Lasers in Engineering | 2012

3D gaze tracking method using Purkinje images on eye optical model and pupil

Ji Woo Lee; Chul Woo Cho; Kwang Yong Shin; Eui Chul Lee; Kang Ryoung Park


Etri Journal | 2012

Gaze Detection by Wearable Eye-Tracking and NIR LED-Based Head-Tracking Device Based on SVR

Chul Woo Cho; Ji Woo Lee; Kwang Yong Shin; Eui Chul Lee; Kang Ryoung Park; Hee-Kyung Lee; Jihun Cha

Collaboration


Dive into Chul Woo Cho's collaborations.

Top Co-Authors

Jihun Cha

Electronics and Telecommunications Research Institute


Hee-Kyung Lee

Electronics and Telecommunications Research Institute


Hyun-Cheol Kim

Electronics and Telecommunications Research Institute
