Pyojin Kim
Seoul National University
Publications
Featured research published by Pyojin Kim.
Intelligent Robots and Systems | 2015
Pyojin Kim; Hyon Lim; H. Jin Kim
Sensitivity to illumination conditions poses a challenge when utilizing visual odometry (VO) in various applications. To make VO robust to illumination conditions, they must be considered explicitly. In this paper, we propose a direct visual odometry method that handles illumination changes by considering an affine illumination model to compensate for abrupt, local light variations during the direct motion estimation process. The core of our proposed method is to estimate the relative camera pose and the parameters of the illumination changes by minimizing the sum of squared photometric errors with efficient second-order minimization. We evaluate the performance of the proposed algorithm on synthetic and real RGB-D datasets with ground truth. Our results show that the proposed method successfully estimates the 6-DoF pose under significant illumination changes, whereas existing direct visual odometry methods either fail or lose accuracy.
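As a rough sketch of the objective described above, the affine model scales and offsets the warped intensities before comparison. The notation below is ours, and the paper may apply the affine parameters locally (per region) rather than globally:

```latex
% Sketch of the photometric cost with an affine illumination model.
% \xi: relative 6-DoF camera motion; (\alpha, \beta): affine illumination
% parameters; I_1, I_2: consecutive images; w(\mathbf{x}; \xi): pixel warp.
\min_{\xi,\,\alpha,\,\beta} \; \sum_{\mathbf{x} \in \Omega}
  \Big( \alpha\, I_2\big(w(\mathbf{x}; \xi)\big) + \beta - I_1(\mathbf{x}) \Big)^2
```

Efficient second-order minimization (ESM) then solves this least-squares problem using the average of the reference and current image Jacobians, obtaining second-order convergence without computing Hessians.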
Systems, Man and Cybernetics | 2014
Pyojin Kim; Hyon Lim; H. Jin Kim
In this paper, we propose a new 6-DoF velocity estimation algorithm using RGB and depth images. Autonomous control of mobile robots requires velocity information, and numerous studies have addressed velocity estimation and measurement. However, approaches based on vision sensors and depth images remain underexplored. In this work, we propose a velocity estimation algorithm for an RGB-D sensor based on the image Jacobian matrix commonly used in image-based visual servoing. We validate the performance of the proposed estimation algorithm in various environments using an RGB-D benchmark dataset. The results show that the estimated 6-DoF velocity closely matches the ground-truth velocity.
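The image Jacobian (interaction matrix) mentioned above relates the image-plane motion of tracked points to the camera's 6-DoF spatial velocity; stacking it over many points yields a least-squares problem for the velocity. A minimal sketch under that standard IBVS formulation (the function names and solver choice are ours, not the paper's):

```python
import numpy as np

def interaction_matrix(x, y, Z):
    """Standard IBVS interaction matrix for a normalized image point (x, y)
    at depth Z; maps camera velocity (vx, vy, vz, wx, wy, wz) to the
    point's image-plane velocity."""
    return np.array([
        [-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x * x), y],
        [0.0, -1.0 / Z, y / Z, 1.0 + y * y, -x * y, -x],
    ])

def estimate_velocity(points, depths, point_velocities):
    """Least-squares 6-DoF camera velocity from tracked points.

    points:           (N, 2) normalized image coordinates
    depths:           (N,)   per-point depth from the RGB-D sensor
    point_velocities: (N, 2) observed image-plane velocities

    A minimal sketch of the image-Jacobian idea in the abstract,
    not the authors' implementation.
    """
    L = np.vstack([interaction_matrix(x, y, Z)
                   for (x, y), Z in zip(points, depths)])
    s_dot = point_velocities.reshape(-1)
    v, *_ = np.linalg.lstsq(L, s_dot, rcond=None)
    return v
```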
European Conference on Computer Vision | 2018
Pyojin Kim; Brian Coltin; H. Jin Kim
We propose a new formulation for including orthogonal planar features as a global model in a linear SLAM approach based on sequential Bayesian filtering. Previous planar SLAM algorithms estimate the camera poses and multiple landmark planes in a pose graph optimization. However, since this is formulated as a high-dimensional nonlinear optimization problem, there is no guarantee the algorithm will converge to the global optimum. To overcome these limitations, we present a new SLAM method that jointly estimates the camera position and planar landmarks in the map within a linear Kalman filter framework. It is rotation that makes the SLAM problem highly nonlinear; therefore, we solve for the rotational motion of the camera using structural regularities in the Manhattan world (MW), resulting in a linear SLAM formulation. We test our algorithm on standard RGB-D benchmarks as well as additional large indoor environments, demonstrating performance comparable to other state-of-the-art SLAM methods without the use of expensive nonlinear optimization.
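Once the rotation is fixed by the Manhattan-world estimate, the remaining position-and-plane estimation is linear, so a textbook Kalman filter applies. A minimal generic sketch (the state layout and the matrices F, Q, H, R stand in for the paper's actual formulation):

```python
import numpy as np

def kf_predict(x, P, F, Q):
    # Linear prediction: x' = F x, P' = F P F^T + Q.
    return F @ x, F @ P @ F.T + Q

def kf_update(x, P, z, H, R):
    # Linear measurement update with measurement model z = H x + noise.
    S = H @ P @ H.T + R              # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)   # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P
```

Because both steps are linear, the filter converges without the local-minimum risk of the nonlinear pose-graph formulations discussed above.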
International Conference on Ubiquitous Robots and Ambient Intelligence | 2017
Changhyeon Kim; Sang-Il Lee; Pyojin Kim; H. Jin Kim
This paper presents a fast RGB-D dense visual odometry method that estimates 12-DoF state information, including the 3D motion and 6-DoF spatial velocity of a camera-strapdown system. To reduce computational load, we extract informative pixels through zero-crossing difference of Gaussians (DoG) and non-maximum gradient pixel extraction. For the extracted regions, the 3D motion is estimated with the inverse compositional algorithm, and the motion estimate is then used to calculate the 6-DoF spatial velocity of the camera. Additionally, we suppress noise in the raw velocity using a Kalman filter. We validate the proposed algorithm on the TUM RGB-D datasets and report simulation results. Our algorithm not only matches the performance of the popular dense visual odometry method DVO, but also runs up to two times faster.
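A rough sketch of the DoG zero-crossing pixel selection described above (the sigma values and the neighbor test are assumptions on our part, not the paper's exact settings):

```python
import cv2
import numpy as np

def dog_zero_crossing_mask(gray, sigma1=1.0, sigma2=2.0):
    """Select informative pixels near zero crossings of a difference of
    Gaussians (DoG). A sketch of the pixel-selection idea in the abstract."""
    g1 = cv2.GaussianBlur(gray.astype(np.float32), (0, 0), sigma1)
    g2 = cv2.GaussianBlur(gray.astype(np.float32), (0, 0), sigma2)
    dog = g1 - g2
    s = np.sign(dog)
    # Mark a pixel when its DoG sign differs from a right or lower neighbor.
    zc = np.zeros(s.shape, dtype=bool)
    zc[:, :-1] |= s[:, :-1] != s[:, 1:]
    zc[:-1, :] |= s[:-1, :] != s[1:, :]
    return zc
```

Restricting the inverse compositional alignment to this sparse mask, rather than every pixel, is what buys the speedup over dense methods.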
International Conference on Robotics and Automation | 2017
Pyojin Kim; Brian Coltin; Oleg Alexandrov; H. Jin Kim
We present an illumination-robust visual localization algorithm for Astrobee, a free-flying robot designed to navigate autonomously on the International Space Station (ISS). Astrobee localizes with a monocular camera and a pre-built sparse map composed of natural visual features. Astrobee must perform tasks not only during the day but also at night, when the ISS lights are dimmed. However, localization performance degrades when the observed lighting conditions differ from those under which the sparse map was built. We investigate and quantify the effect of lighting variations on visual feature-based localization systems, and discover that maps built in darker conditions remain effective in bright conditions, but the reverse is not true. We extend Astrobee's localization algorithm to make it more robust to changing lighting conditions on the ISS by automatically recognizing the current illumination level and selecting an appropriate map and camera exposure time. We extensively evaluate the proposed algorithm through experiments on Astrobee.
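A hypothetical sketch of the illumination-aware selection step (the brightness thresholds, level names, and lookup structure are illustrative assumptions, not Astrobee's actual code):

```python
import numpy as np

def select_map_and_exposure(gray_image, maps_by_level, exposure_by_level):
    """Pick a pre-built sparse map and camera exposure time based on the
    observed illumination level, as the abstract describes."""
    brightness = float(np.mean(gray_image))
    if brightness < 60:
        level = "dark"
    elif brightness < 140:
        level = "medium"
    else:
        level = "bright"
    return maps_by_level[level], exposure_by_level[level]
```

The asymmetry reported above suggests biasing such a scheme toward maps built in darker conditions, since those remain usable when the lights come up.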
International Conference on Robotics and Automation | 2018
Pyojin Kim; Brian Coltin; H. Jin Kim
Computer Vision and Pattern Recognition | 2018
Pyojin Kim; Brian Coltin; H. Jin Kim
International Journal of Control, Automation and Systems | 2018
Pyojin Kim; Hyon Lim; H. Jin Kim
International Conference of the IEEE Engineering in Medicine and Biology Society | 2017
Hyun Kyung Kim; Pyojin Kim; Jong-Mo Seo
British Machine Vision Conference | 2017
Pyojin Kim; Brian Coltin; Hyoun Jin Kim