Publications


Featured research published by Anastasios I. Mourikis.


International Conference on Robotics and Automation | 2007

A Multi-State Constraint Kalman Filter for Vision-aided Inertial Navigation

Anastasios I. Mourikis; Stergios I. Roumeliotis

In this paper, we present an extended Kalman filter (EKF)-based algorithm for real-time vision-aided inertial navigation. The primary contribution of this work is the derivation of a measurement model that is able to express the geometric constraints that arise when a static feature is observed from multiple camera poses. This measurement model does not require including the 3D feature position in the state vector of the EKF and is optimal, up to linearization errors. The vision-aided inertial navigation algorithm we propose has computational complexity only linear in the number of features, and is capable of high-precision pose estimation in large-scale real-world environments. The performance of the algorithm is demonstrated in extensive experimental results, involving a camera/IMU system localizing within an urban area.
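
As a purely illustrative sketch (not code from the paper), the following numpy snippet shows the kind of null-space projection that removes the feature-position error from the linearized measurement equations, assuming a stacked residual r and Jacobians H_x (with respect to the camera-pose states) and H_f (with respect to the 3D feature position, with full column rank):

    import numpy as np

    def project_out_feature(r, H_x, H_f):
        # Basis of the left null space of H_f (H_f is 2M x 3 for M observations).
        U, _, _ = np.linalg.svd(H_f, full_matrices=True)
        A = U[:, 3:]
        # The projected residual and Jacobian no longer depend on the feature
        # position, so the EKF update involves only the camera-pose states.
        return A.T @ r, A.T @ H_x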


IEEE Transactions on Robotics | 2009

Vision-Aided Inertial Navigation for Spacecraft Entry, Descent, and Landing

Anastasios I. Mourikis; Nikolas Trawny; Stergios I. Roumeliotis; Andrew Edie Johnson; Adnan Ansar; Larry H. Matthies

In this paper, we present the vision-aided inertial navigation (VISINAV) algorithm that enables precision planetary landing. The vision front-end of the VISINAV system extracts 2-D-to-3-D correspondences between descent images and a surface map (mapped landmarks), as well as 2-D-to-2-D feature tracks through a sequence of descent images (opportunistic features). An extended Kalman filter (EKF) tightly integrates both types of visual feature observations with measurements from an inertial measurement unit. The filter computes accurate estimates of the lander's terrain-relative position, attitude, and velocity, in a resource-adaptive and hence real-time capable fashion. In addition to the technical analysis of the algorithm, the paper presents validation results from a sounding-rocket test flight, showing estimation errors of only 0.16 m/s for velocity and 6.4 m for position at touchdown. These results vastly improve on the current state of the art for terminal descent navigation without visual updates, and meet the requirements of future planetary exploration missions.
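
For intuition only (the function names and the pinhole model below are assumptions, not the paper's implementation), a 2-D-to-3-D mapped-landmark observation can be turned into an EKF residual by projecting the known landmark position through the current pose estimate:

    import numpy as np

    def mapped_landmark_residual(z, p_landmark_map, R_cam_from_map, p_cam_in_map, K):
        # Known 3D landmark expressed in the camera frame of the current pose estimate.
        p_c = R_cam_from_map @ (p_landmark_map - p_cam_in_map)
        # Pinhole projection (K holds fx, fy, cx, cy).
        u = K["fx"] * p_c[0] / p_c[2] + K["cx"]
        v = K["fy"] * p_c[1] / p_c[2] + K["cy"]
        # Residual between the detected image feature z and the prediction.
        return z - np.array([u, v])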


The International Journal of Robotics Research | 2013

High-precision, consistent EKF-based visual-inertial odometry

Mingyang Li; Anastasios I. Mourikis

In this paper, we focus on the problem of motion tracking in unknown environments using visual and inertial sensors. We term this estimation task visual–inertial odometry (VIO), in analogy to the well-known visual-odometry problem. We present a detailed study of extended Kalman filter (EKF)-based VIO algorithms, by comparing both their theoretical properties and empirical performance. We show that an EKF formulation where the state vector comprises a sliding window of poses (the multi-state-constraint Kalman filter (MSCKF)) attains better accuracy, consistency, and computational efficiency than the simultaneous localization and mapping (SLAM) formulation of the EKF, in which the state vector contains the current pose and the features seen by the camera. Moreover, we prove that both types of EKF approaches are inconsistent, due to the way in which Jacobians are computed. Specifically, we show that the observability properties of the EKF’s linearized system models do not match those of the underlying system, which causes the filters to underestimate the uncertainty in the state estimates. Based on our analysis, we propose a novel, real-time EKF-based VIO algorithm, which achieves consistent estimation by (i) ensuring the correct observability properties of its linearized system model, and (ii) performing online estimation of the camera-to-inertial measurement unit (IMU) calibration parameters. This algorithm, which we term MSCKF 2.0, is shown to achieve accuracy and consistency higher than even an iterative, sliding-window fixed-lag smoother, in both Monte Carlo simulations and real-world testing.
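
As a rough illustration of the state vector the abstract describes (a sliding window of poses plus online camera-to-IMU calibration), one could organize the estimator state along the following lines; the field names are hypothetical:

    from dataclasses import dataclass, field
    import numpy as np

    @dataclass
    class SlidingWindowVioState:
        q_imu: np.ndarray                 # current IMU orientation (unit quaternion)
        p_imu: np.ndarray                 # current IMU position
        v_imu: np.ndarray                 # current IMU velocity
        bg: np.ndarray                    # gyroscope bias
        ba: np.ndarray                    # accelerometer bias
        q_cam_imu: np.ndarray             # camera-to-IMU rotation, estimated online
        p_cam_imu: np.ndarray             # camera-to-IMU translation, estimated online
        cloned_poses: list = field(default_factory=list)  # sliding window of past camera poses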


Journal of Field Robotics | 2007

Vision-aided inertial navigation for pin-point landing using observations of mapped landmarks

Nikolas Trawny; Anastasios I. Mourikis; Stergios I. Roumeliotis; Andrew Edie Johnson; James F. Montgomery

In this paper we describe an extended Kalman filter algorithm for estimating the pose and velocity of a spacecraft during entry, descent, and landing. The proposed estimator combines measurements of rotational velocity and acceleration from an inertial measurement unit (IMU) with observations of a priori mapped landmarks, such as craters or other visual features, that exist on the surface of a planet. The tight coupling of inertial sensory information with visual cues results in accurate, robust state estimates available at a high bandwidth. The dimensions of the landing uncertainty ellipses achieved by the proposed algorithm are three orders of magnitude smaller than those possible when relying exclusively on IMU integration. Extensive experimental and simulation results are presented, which demonstrate the applicability of the algorithm on real-world data and analyze the dependence of its accuracy on several system design parameters.
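
For reference, and only as a first-order sketch with simplified conventions (biases and noise omitted, gravity assumed along -z in the world frame), the IMU propagation step that runs between landmark updates looks roughly like this:

    import numpy as np

    GRAVITY = np.array([0.0, 0.0, -9.81])     # assumed world-frame gravity

    def so3_exp(phi):
        # Rodrigues' formula for the SO(3) exponential map.
        theta = np.linalg.norm(phi)
        if theta < 1e-12:
            return np.eye(3)
        k = phi / theta
        K = np.array([[0, -k[2], k[1]], [k[2], 0, -k[0]], [-k[1], k[0], 0]])
        return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

    def propagate(p, v, R_world_body, accel_body, gyro_body, dt):
        a_world = R_world_body @ accel_body + GRAVITY   # specific force -> world acceleration
        p_new = p + v * dt + 0.5 * a_world * dt ** 2
        v_new = v + a_world * dt
        R_new = R_world_body @ so3_exp(gyro_body * dt)  # first-order attitude update
        return p_new, v_new, R_new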


The International Journal of Robotics Research | 2010

Observability-based Rules for Designing Consistent EKF SLAM Estimators

Guoquan Huang; Anastasios I. Mourikis; Stergios I. Roumeliotis

In this work, we study the inconsistency problem of extended Kalman filter (EKF)-based simultaneous localization and mapping (SLAM) from the perspective of observability. We analytically prove that when the Jacobians of the process and measurement models are evaluated at the latest state estimates during every time step, the linearized error-state system employed in the EKF has an observable subspace of dimension higher than that of the actual, non-linear, SLAM system. As a result, the covariance estimates of the EKF undergo reduction in directions of the state space where no information is available, which is a primary cause of the inconsistency. Based on these theoretical results, we propose a general framework for improving the consistency of EKF-based SLAM. In this framework, the EKF linearization points are selected in a way that ensures that the resulting linearized system model has an observable subspace of appropriate dimension. We describe two algorithms that are instances of this paradigm. In the first, termed observability constrained (OC)-EKF, the linearization points are selected so as to minimize their expected errors (i.e. the difference between the linearization point and the true state) under the observability constraints. In the second, the filter Jacobians are calculated using the first-ever available estimates for all state variables. This latter approach is termed first-estimates Jacobian (FEJ)-EKF. The proposed algorithms have been tested both in simulation and experimentally, and are shown to significantly outperform the standard EKF both in terms of accuracy and consistency.
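
A toy sketch of the first-estimates Jacobian rule described above (not the authors' code): each state variable's Jacobians are always evaluated at the first estimate ever computed for it, so later re-estimation cannot inject information along directions that should remain unobservable.

    import numpy as np

    first_estimates = {}   # variable id -> first available estimate

    def fej_linearization_point(var_id, current_estimate):
        # Record the estimate the first time the variable is seen, and keep
        # using that value as the linearization point from then on.
        if var_id not in first_estimates:
            first_estimates[var_id] = np.array(current_estimate, copy=True)
        return first_estimates[var_id]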


International Conference on Robotics and Automation | 2012

Improving the accuracy of EKF-based visual-inertial odometry

Mingyang Li; Anastasios I. Mourikis

In this paper, we perform a rigorous analysis of EKF-based visual-inertial odometry (VIO) and present a method for improving its performance. Specifically, we examine the properties of EKF-based VIO, and show that the standard way of computing Jacobians in the filter inevitably causes inconsistency and loss of accuracy. This result is derived based on an observability analysis of the EKF's linearized system model, which proves that the yaw erroneously appears to be observable. In order to address this problem, we propose modifications to the multi-state constraint Kalman filter (MSCKF) algorithm [1], which ensure the correct observability properties without incurring additional computational cost. Extensive simulation tests and real-world experiments demonstrate that the modified MSCKF algorithm outperforms competing methods, both in terms of consistency and accuracy.
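
For background, and only up to the particular error-state convention used (so treat the exact expression as an assumption rather than a result from the paper): a correctly modelled VIO system has four unobservable directions, global translation and rotation about the gravity vector. With the error state $\tilde{x} = [\,\tilde{\theta}^T\ \tilde{b}_g^T\ \tilde{v}^T\ \tilde{b}_a^T\ \tilde{p}^T\,]^T$ and gravity vector $g$, a basis commonly quoted in this literature is

$$N = \begin{bmatrix} 0_{3\times 3} & C(\bar{q})\,g \\ 0_{3\times 3} & 0_{3\times 1} \\ 0_{3\times 3} & -\lfloor \bar{v}\,\times \rfloor\, g \\ 0_{3\times 3} & 0_{3\times 1} \\ I_{3} & -\lfloor \bar{p}\,\times \rfloor\, g \end{bmatrix},$$

where the first block column spans global translation and the second spans rotation about gravity (yaw). With the standard choice of linearization points, the yaw direction drops out of the null space of the linearized model's observability matrix, so yaw erroneously appears observable; this is the spurious information that the modified MSCKF removes.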


The International Journal of Robotics Research | 2007

Autonomous Stair Climbing for Tracked Vehicles

Anastasios I. Mourikis; Nikolas Trawny; Stergios I. Roumeliotis; Daniel M. Helmick; Larry H. Matthies

In this paper, an algorithm for autonomous stair climbing with a tracked vehicle is presented. The proposed method achieves robust performance under real-world conditions, without assuming prior knowledge of the stair geometry, the dynamics of the vehicle's interaction with the stair surface, or lighting conditions. The approach relies on fast and accurate estimation of the robot's heading and its position relative to the stair boundaries. An extended Kalman filter is used for quaternion-based attitude estimation, fusing rotational velocity measurements from a three-axis gyroscope and measurements of the stair edges acquired with an onboard camera. A two-tiered controller, comprising a centering-control and a heading-control module, utilizes the estimates to guide the robot rapidly, safely, and accurately upstairs. Both the theoretical analysis and implementation of the algorithm are presented in detail, and extensive experimental results demonstrating the algorithm's performance are described.
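
As a minimal illustration of the quaternion-based attitude propagation mentioned above (Hamilton [w, x, y, z] convention assumed; this is not the paper's implementation), the gyroscope integration step can be written as:

    import numpy as np

    def quat_mult(q, r):
        # Hamilton product of two quaternions in [w, x, y, z] order.
        w0, x0, y0, z0 = q
        w1, x1, y1, z1 = r
        return np.array([w0*w1 - x0*x1 - y0*y1 - z0*z1,
                         w0*x1 + x0*w1 + y0*z1 - z0*y1,
                         w0*y1 - x0*z1 + y0*w1 + z0*x1,
                         w0*z1 + x0*y1 - y0*x1 + z0*w1])

    def propagate_attitude(q, omega, dt):
        # Zero-order integration of the body angular rate omega (rad/s) over dt.
        angle = np.linalg.norm(omega) * dt
        if angle < 1e-12:
            return q
        axis = omega / np.linalg.norm(omega)
        dq = np.concatenate(([np.cos(angle / 2.0)], np.sin(angle / 2.0) * axis))
        q_new = quat_mult(q, dq)
        return q_new / np.linalg.norm(q_new)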


International Conference on Robotics and Automation | 2013

Real-time motion tracking on a cellphone using inertial sensing and a rolling-shutter camera

Mingyang Li; Byung Hyung Kim; Anastasios I. Mourikis

All existing methods for vision-aided inertial navigation assume a camera with a global shutter, in which all the pixels in an image are captured simultaneously. However, the vast majority of consumer-grade cameras use rolling-shutter sensors, which capture each row of pixels at a slightly different time instant. The effects of the rolling shutter distortion when a camera is in motion can be very significant, and are not modelled by existing visual-inertial motion-tracking methods. In this paper we describe the first, to the best of our knowledge, method for vision-aided inertial navigation using rolling-shutter cameras. Specifically, we present an extended Kalman filter (EKF)-based method for visual-inertial odometry, which fuses the IMU measurements with observations of visual feature tracks provided by the camera. The key contribution of this work is a computationally tractable approach for taking into account the rolling-shutter effect, incurring only minimal approximations. The experimental results from the application of the method show that it is able to track, in real time, the position of a mobile phone moving in an unknown environment with an error accumulation of approximately 0.8% of the distance travelled, over hundreds of meters.
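
To make the row-time effect concrete (a sketch under the assumption of a linear readout, not the paper's actual model), each feature observation gets its own timestamp according to the image row it was detected in, and the filter can then evaluate the camera pose at that instant, e.g. by interpolation:

    def feature_time(t_frame_start, row, image_rows, readout_time):
        # A rolling shutter exposes each row at a slightly different instant;
        # with a linear readout, the row index maps directly to a time offset.
        return t_frame_start + (row / float(image_rows)) * readout_time

    def interpolate_position(t, t0, p0, t1, p1):
        # Minimal sketch: linearly interpolate position between two filter
        # estimates that bracket the row's capture time.
        alpha = (t - t0) / (t1 - t0)
        return (1.0 - alpha) * p0 + alpha * p1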


Computer Vision and Pattern Recognition | 2008

A dual-layer estimator architecture for long-term localization

Anastasios I. Mourikis; Stergios I. Roumeliotis

In this paper, we present a localization algorithm for estimating the 3D position and orientation (pose) of a moving vehicle based on visual and inertial measurements. The main advantage of the proposed method is that it provides precise pose estimates at low computational cost. This is achieved by introducing a two-layer estimation architecture that processes measurements based on their information content. Inertial measurements and feature tracks between consecutive images are processed locally in the first layer (multi-state-constraint Kalman filter) providing estimates for the motion of the vehicle at a high rate. The second layer comprises a bundle adjustment iterative estimator that operates intermittently so as to (i) reduce the effect of the linearization errors, and (ii) update the state estimates every time an area is re-visited and features are re-detected (loop closure). Through this process reliable state estimates are available continuously, while the estimation errors remain bounded during long-term operation. The performance of the developed system is demonstrated in large-scale experiments, involving a vehicle localizing within an urban area.
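
The control flow implied by the abstract could be organized roughly as follows (class and method names are hypothetical, purely to show how the two layers interleave):

    def run(frames, filter_layer, bundle_adjuster, refine_every=100):
        for k, frame in enumerate(frames):
            # Layer 1: high-rate local estimation on every image.
            filter_layer.propagate(frame.imu_measurements)
            filter_layer.update(frame.feature_tracks)
            # Layer 2: intermittent refinement, and on loop closure.
            if k % refine_every == 0 or frame.loop_closure_detected:
                refined = bundle_adjuster.optimize(filter_layer.keyframes())
                filter_layer.apply_correction(refined)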


IEEE Transactions on Robotics | 2007

SC-KF Mobile Robot Localization: A Stochastic Cloning Kalman Filter for Processing Relative-State Measurements

Anastasios I. Mourikis; Stergios I. Roumeliotis; Joel W. Burdick

This paper presents a new method to optimally combine motion measurements provided by proprioceptive sensors, with relative-state estimates inferred from feature-based matching. Two key challenges arise in such pose tracking problems: 1) the displacement estimates relate the state of the robot at two different time instants and 2) the same exteroceptive measurements are often used for computing consecutive displacement estimates, a process that renders the errors in these estimates correlated. We present a novel stochastic cloning Kalman filtering (SC-KF) estimation algorithm that successfully addresses these challenges, while still allowing for efficient calculation of the filter gains and covariances. The proposed algorithm is not intended to compete with simultaneous localization and mapping (SLAM) approaches. Instead, it can be merged with any extended-Kalman-filter-based SLAM algorithm to increase its precision. In this respect, the SC-KF provides a robust framework for leveraging additional motion information extracted from dense point features that most SLAM algorithms do not treat as landmarks. Extensive experimental and simulation results are presented to verify the validity of the proposed method and to demonstrate that its performance is superior to that of alternative position-tracking approaches.
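
A bare-bones numpy sketch of the cloning step at the heart of the SC-KF idea (the handling of the subsequent relative-state update is omitted): the current state is duplicated and the covariance is augmented so that a later displacement measurement can correctly relate the old and the new state.

    import numpy as np

    def clone_state(x, P):
        # Duplicate the state; the augmented covariance carries the full
        # cross-correlation between the clone and the live state, which is what
        # makes correlated relative-state measurements tractable.
        x_aug = np.concatenate([x, x])
        P_aug = np.block([[P, P], [P, P]])
        return x_aug, P_aug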

Collaboration


Dive into Anastasios I. Mourikis's collaborations.

Top Co-Authors

Mingyang Li
University of California

Andrew Edie Johnson
California Institute of Technology

Adnan Ansar
California Institute of Technology

Hongsheng Yu
University of California