Philip Fong
Evolution Robotics
Publications
Featured research published by Philip Fong.
Intelligent Robots and Systems | 2010
Ethan Eade; Philip Fong; Mario E. Munich
We present a graph-based SLAM approach, using monocular vision and odometry, designed to operate on computationally constrained platforms. When computation and memory are limited, visual tracking becomes difficult or impossible, and map representation and update costs must remain low. Our system constructs a map of structured views using only weak temporal assumptions, and performs recognition and relative pose estimation over the set of views. Visual observations are fused with differential sensors in an incrementally optimized graph representation. Using variable elimination and constraint pruning, graph complexity and storage are kept linear in explored space rather than in time. We evaluate performance on sequences with ground truth, and compare against a standard graph SLAM approach.
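To give a flavor of how variable elimination can keep a pose graph from growing with time, here is a minimal sketch for the simplified case of a chain-structured 2D pose graph; the data structures and the first-order covariance handling are assumptions for illustration, not the paper's implementation.

```python
# Hedged sketch: pruning a chain-structured 2D pose graph by eliminating a node.
# Not the paper's implementation; Node/edge layout and eliminate_node are assumed.
# Poses are (x, y, theta); edges hold a relative pose and a covariance, and
# eliminating node k fuses its two adjacent edges into one edge between its neighbours.
import numpy as np

def compose(a, b):
    """Compose two relative 2D poses a (+) b."""
    xa, ya, ta = a
    xb, yb, tb = b
    c, s = np.cos(ta), np.sin(ta)
    return np.array([xa + c * xb - s * yb,
                     ya + s * xb + c * yb,
                     ta + tb])

def eliminate_node(edges, k):
    """Remove node k from a chain of edges {(i, j): (rel_pose, cov)}.

    Edges (i, k) and (k, j) are replaced by a single edge (i, j) whose relative
    pose is the composition and whose covariance is approximated as the sum of
    the two covariances (first-order, rotation effects ignored).
    """
    (i, _), (pose_ik, cov_ik) = next((key, val) for key, val in edges.items() if key[1] == k)
    (_, j), (pose_kj, cov_kj) = next((key, val) for key, val in edges.items() if key[0] == k)
    del edges[(i, k)], edges[(k, j)]
    edges[(i, j)] = (compose(pose_ik, pose_kj), cov_ik + cov_kj)
    return edges

edges = {(0, 1): (np.array([1.0, 0.0, 0.0]), np.eye(3) * 0.01),
         (1, 2): (np.array([1.0, 0.0, 0.1]), np.eye(3) * 0.01)}
print(eliminate_node(edges, 1))   # one edge (0, 2) remains
```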
Robotics: Science and Systems | 2010
Jens-Steffen Gutmann; Ethan Eade; Philip Fong; Mario E. Munich
The constraints of a low-cost consumer product pose a major challenge for designing a localization system. In previous work, we introduced Vector Field SLAM [5], a system for simultaneously estimating robot pose and a vector field induced by stationary signal sources present in the environment. In this paper we show how this method can be realized on a low-cost embedded processing unit by applying the concepts of the Exactly Sparse Extended Information Filter [15]. By restricting the set of active features to the four nodes of the current cell, the map size grows linearly with the area explored by the robot, while the time for updating the state can be held constant under certain approximations. We report results from running our method on an ARM7 embedded board with 64 kB of RAM controlling a Roomba 510 vacuum cleaner in a standard test environment.
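The following sketch illustrates, under simplified assumptions, why restricting the active features to the current cell keeps the information-form measurement update local: the Jacobian is non-zero only for the robot pose and the four cell nodes, so the additive update touches only that sub-block of the information matrix. The grid size, indices, and numeric values are placeholders, not taken from the paper.

```python
# Hedged sketch of a local information-form measurement update.
# State: 3 pose entries + 16 grid nodes with 2 signal values each (assumed layout).
import numpy as np

n_state = 3 + 2 * 16
Lambda = np.zeros((n_state, n_state))      # information matrix

def information_update(Lambda, H, R):
    """Additive measurement update in information form."""
    return Lambda + H.T @ np.linalg.inv(R) @ H

# Active columns: the 3 pose entries plus the 4 corner nodes of the current cell
# (nodes 5, 6, 9, 10 of an assumed 4x4 grid), 2 signal values per node.
active = [0, 1, 2] + [3 + 2 * k + d for k in (5, 6, 9, 10) for d in (0, 1)]
H = np.zeros((2, n_state))
H[:, active] = 0.1                         # placeholder Jacobian entries
R = np.eye(2) * 0.05                       # placeholder measurement noise

Lambda = information_update(Lambda, H, R)
touched = np.argwhere(np.abs(Lambda) > 0)
assert set(touched.flatten()) <= set(active)   # only active rows/columns changed
```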
International Conference on Robotics and Automation | 2010
Jens-Steffen Gutmann; Gabriel Brisson; Ethan Eade; Philip Fong; Mario E. Munich
Localization in unknown environments using low-cost sensors remains a challenge. This paper presents a new localization approach that learns the spatial variation of an observed continuous signal. We model the signal as a piecewise-linear function and estimate its parameters using a simultaneous localization and mapping (SLAM) approach. We apply our framework to a sensor measuring bearing to active beacons, where measurements are systematically distorted by occlusion and signal reflections off walls and other objects present in the environment. Experimental results from running GraphSLAM and EKF-SLAM on manually collected sensor measurements, as well as on data recorded on a vacuum-cleaner robot, validate our model.
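A minimal sketch of a piecewise-linear signal model of this kind is shown below, assuming the signal at the robot position is bilinearly interpolated from the values stored at the four corner nodes of the surrounding grid cell; the function name, cell layout, and node ordering are assumptions for illustration.

```python
# Hedged sketch: bilinear interpolation of a continuous signal over one grid cell.
import numpy as np

def expected_signal(pos, cell_origin, cell_size, node_values):
    """Interpolate the signal at `pos` inside one cell.

    node_values: 4 corner values ordered [lower-left, lower-right,
                 upper-left, upper-right].
    Returns the prediction and its Jacobian w.r.t. the 4 node values,
    which are the only map parameters a SLAM update needs to touch here.
    """
    u = (pos[0] - cell_origin[0]) / cell_size
    v = (pos[1] - cell_origin[1]) / cell_size
    w = np.array([(1 - u) * (1 - v), u * (1 - v), (1 - u) * v, u * v])
    return w @ node_values, w   # prediction, Jacobian (the 4 bilinear weights)

h, J = expected_signal(pos=np.array([1.5, 0.5]),
                       cell_origin=np.array([1.0, 0.0]),
                       cell_size=2.0,
                       node_values=np.array([0.2, 0.4, 0.1, 0.3]))
```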
Intelligent Robots and Systems | 2012
Jens-Steffen Gutmann; Philip Fong; Mario E. Munich
Localization using continuous signals such as WiFi or active beacons is a cost-effective approach for enabling systematic navigation of robots. In our previous work we showed how localization maps of such signals, represented as regular grids, can be learned through application of vector field SLAM [1]. In this paper we describe a method that, given such a localization map, finds the pose of a mobile robot from observations of the signals. Our method first generates pose hypotheses by searching the localization map for places that best fit a measurement taken by the robot. A localization filter based on an extended Kalman filter (EKF) then verifies one pose hypothesis by tracking the pose over a short distance. In experiments carried out in a standard test environment equipped with active beacons, we obtain an average position accuracy of 10 to 35 cm with a localization success rate of 96 to 99%. The proposed method enables a robot mapping an environment using vector field SLAM to recover from kidnapping and resume its navigation.
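A hedged sketch of the hypothesis-generation step follows: scan candidate positions in the learned signal map and keep those whose predicted signal best matches the current measurement, each candidate then seeding an EKF that verifies it. The grid layout, cell-centre prediction (equal bilinear weights of 1/4), and scoring are simplifying assumptions, not the paper's exact procedure.

```python
# Hedged sketch: generating pose hypotheses from a learned signal map.
import numpy as np

def generate_hypotheses(measurement, node_values, grid_shape, cell_size, top_k=3):
    """Return the top_k candidate (x, y) positions ranked by squared signal error."""
    node_values = np.asarray(node_values)        # shape: (rows * cols, signal_dim)
    rows, cols = grid_shape
    candidates = []
    for r in range(rows - 1):
        for c in range(cols - 1):
            # predict the signal at the centre of cell (r, c): all weights are 1/4
            corners = node_values[[r * cols + c, r * cols + c + 1,
                                   (r + 1) * cols + c, (r + 1) * cols + c + 1]]
            prediction = corners.mean(axis=0)
            error = np.sum((measurement - prediction) ** 2)
            centre = np.array([(c + 0.5) * cell_size, (r + 0.5) * cell_size])
            candidates.append((error, centre))
    candidates.sort(key=lambda e: e[0])
    return [pos for _, pos in candidates[:top_k]]   # each seeds an EKF for verification
```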
IAS (2) | 2012
Jens-Steffen Gutmann; Dhiraj Goel; Mario E. Munich; Philip Fong
Vector field SLAM is a framework for localizing a mobile robot in an unknown environment by learning the spatial distribution of continuous signals such as those emitted by WiFi or active beacons. In our previous work we showed that this approach is capable of keeping a robot localized in small to medium-sized areas, e.g. a living room, where four continuous signals of an active beacon are measured. In this paper we extend the method to larger environments, up to the size of a complete home, by deploying additional signal sources to cover the expanded area. We first analyze the complexity of vector field SLAM with respect to area size and number of signals, and then describe an approximation that divides the localization map into decoupled sub-maps to keep memory and run-time requirements low. Experimental results from running the system in houses of up to 125 m² demonstrate the performance of our approach. The presented methods are suitable for commercial low-cost products, including robots for autonomous and systematic floor cleaning.
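As a rough illustration of the decoupled sub-map idea, the sketch below keeps one small state per sub-map tile and only updates the tile containing the robot; the tiling rule, class names, and switching logic are assumptions, not the paper's exact scheme.

```python
# Hedged sketch of sub-map bookkeeping for keeping memory and run-time bounded.
import numpy as np

class SubMap:
    def __init__(self, origin, n_nodes, signal_dim=2):
        self.origin = np.asarray(origin)                 # sub-map frame in world coordinates
        self.state = np.zeros(3 + n_nodes * signal_dim)  # robot pose + local node values
        self.cov = np.eye(len(self.state)) * 1e3         # large initial uncertainty

class SubMapSLAM:
    def __init__(self, submap_size):
        self.submap_size = submap_size
        self.maps = {}                                   # keyed by integer tile index (i, j)
        self.active_key = None

    def submap_for(self, position):
        """Return (creating if needed) the sub-map tile containing `position`."""
        key = tuple((np.asarray(position) // self.submap_size).astype(int))
        if key not in self.maps:
            self.maps[key] = SubMap(origin=np.array(key) * self.submap_size, n_nodes=16)
        if key != self.active_key:
            self.active_key = key                        # only this tile is updated from now on
        return self.maps[key]
```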
IEEE Transactions on Robotics | 2012
Jens-Steffen Gutmann; Ethan Eade; Philip Fong; Mario E. Munich
Archive | 2012
Jens-Steffen Gutmann; Philip Fong; Mario E. Munich
Archive | 2013
Dhiraj Goel; Ethan Eade; Philip Fong; Mario E. Munich
Archive | 2013
Ethan Eade; Philip Fong; Mario E. Munich
Archive | 2013
Philip Fong; Ethan Eade; Mario E. Munich