Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Srikanth Saripalli is active.

Publication


Featured research published by Srikanth Saripalli.


International Conference on Robotics and Automation | 2003

Visually guided landing of an unmanned aerial vehicle

Srikanth Saripalli; James F. Montgomery; Gaurav S. Sukhatme

We present the design and implementation of a real-time, vision-based landing algorithm for an autonomous helicopter. The landing algorithm is integrated with algorithms for visual acquisition of the target (a helipad) and navigation to the target, from an arbitrary initial position and orientation. We use vision for precise target detection and recognition, and a combination of vision and Global Positioning System for navigation. The helicopter updates its landing target parameters based on vision and uses an onboard behavior-based controller to follow a path to the landing site. We present significant results from flight trials in the field which demonstrate that our detection, recognition, and control algorithms are accurate, robust, and repeatable.
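
The abstract does not spell out how the target is detected, so the following is only a rough, hypothetical illustration of how the vision side of such a pipeline can be structured: threshold a downward-looking camera frame and match contour shapes against a reference helipad template. The function name, the dark-marker-on-lighter-ground assumption, and all thresholds are made up for illustration, not taken from the paper.

```python
# Hypothetical sketch (not the paper's algorithm): helipad candidate detection
# by thresholding and contour shape matching with OpenCV.
import cv2

def detect_helipad(frame_bgr, template_contour, match_thresh=0.15):
    """Return the pixel centroid of the best helipad candidate, or None."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    # Assumes a dark marker on lighter ground; Otsu picks the threshold.
    _, binary = cv2.threshold(gray, 0, 255,
                              cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    best, best_score = None, match_thresh
    for c in contours:
        if cv2.contourArea(c) < 200:                 # ignore small blobs
            continue
        score = cv2.matchShapes(c, template_contour,
                                cv2.CONTOURS_MATCH_I1, 0.0)
        if score < best_score:                       # lower = more similar
            best, best_score = c, score
    if best is None:
        return None
    m = cv2.moments(best)
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])  # centroid (u, v)
```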


International Conference on Robotics and Automation | 2002

Vision-based autonomous landing of an unmanned aerial vehicle

Srikanth Saripalli; James F. Montgomery; Gaurav S. Sukhatme

We present the design and implementation of a real-time, vision-based landing algorithm for an autonomous helicopter. The helicopter is required to navigate from an initial position to a final position in a partially known environment based on GPS and vision, locate a landing target (a helipad of a known shape) and land on it. We use vision for precise target detection and recognition. The helicopter updates its landing target parameters based on vision and uses an on-board behavior-based controller to follow a path to the landing site. We present results from flight trials in the field which demonstrate that our detection, recognition and control algorithms are accurate and repeatable.


International Conference on Robotics and Automation | 2004

Autonomous deployment and repair of a sensor network using an unmanned aerial vehicle

Peter Corke; Stefan Hrabar; Ronald A. Peterson; Daniela Rus; Srikanth Saripalli; Gaurav S. Sukhatme

We describe a sensor network deployment method using autonomous flying robots. Such networks are suitable for tasks such as large-scale environmental monitoring or for command and control in emergency situations. We describe in detail the algorithms used for deployment and for measuring network connectivity, and provide experimental data collected from field trials. A particular focus is on determining gaps in the connectivity of the deployed network and generating a plan for a second, repair pass to complete the connectivity. This project is the result of a collaboration between three robotics labs (CSIRO, USC, and Dartmouth).
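
The abstract names connectivity measurement and repair planning but not the specific algorithms. Below is a minimal, hypothetical sketch under a simple disc radio model: group the deployed nodes into connected components, then propose a drop point for the repair pass between the closest nodes of separated components. All function names and parameters are illustrative assumptions, not the paper's method.

```python
# Hypothetical sketch: find connectivity gaps in a deployed network and
# propose drop points for a second, repair pass (disc radio model assumed).
import math
from itertools import combinations

def components(nodes, radio_range):
    """Group node indices into connected components under a disc radio model."""
    parent = list(range(len(nodes)))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    for i, j in combinations(range(len(nodes)), 2):
        if math.dist(nodes[i], nodes[j]) <= radio_range:
            parent[find(i)] = find(j)
    groups = {}
    for i in range(len(nodes)):
        groups.setdefault(find(i), []).append(i)
    return list(groups.values())

def repair_points(nodes, radio_range):
    """Propose one drop point per pair of disconnected components: the midpoint
    of their closest nodes (enough whenever the gap is under twice the range)."""
    points = []
    for a, b in combinations(components(nodes, radio_range), 2):
        i, j = min(((i, j) for i in a for j in b),
                   key=lambda ij: math.dist(nodes[ij[0]], nodes[ij[1]]))
        points.append(((nodes[i][0] + nodes[j][0]) / 2,
                       (nodes[i][1] + nodes[j][1]) / 2))
    return points

# Two nodes within range of each other plus one isolated node: one repair drop
print(repair_points([(0.0, 0.0), (9.0, 0.0), (26.0, 0.0)], radio_range=10.0))
```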


Journal of Field Robotics | 2006

Visual servoing of an autonomous helicopter in urban areas using feature tracking

Luis Mejias; Srikanth Saripalli; Pascual Campoy; Gaurav S. Sukhatme

We present the design and implementation of a vision-based feature tracking system for an autonomous helicopter. Visual sensing is used to estimate the position and velocity of features in the image plane (urban features such as windows) in order to generate velocity references for the flight control. These vision-based references are then combined with GPS positioning references to navigate towards these features and then track them. We present results from experimental flight trials, performed on two UAV systems and under different conditions, that show the feasibility and robustness of our approach.
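
The abstract describes turning tracked image-plane features into velocity references for the flight controller, but the paper's actual control law is not given here. The sketch below shows only a generic proportional mapping from pixel offset to a saturated velocity command; the gain, saturation limit, and function name are assumptions.

```python
# Hypothetical sketch: map a tracked feature's pixel offset from the image
# centre to a saturated lateral/vertical velocity command (m/s).
def velocity_reference(feature_px, image_size, k_gain=0.4, v_max=1.0):
    u, v = feature_px
    w, h = image_size
    err_x = (u - w / 2) / (w / 2)          # normalised horizontal error, [-1, 1]
    err_y = (v - h / 2) / (h / 2)          # normalised vertical error, [-1, 1]
    vy = max(-v_max, min(v_max, k_gain * err_x))
    vz = max(-v_max, min(v_max, k_gain * err_y))
    return vy, vz

# Feature right of and above centre in a 640x480 image
print(velocity_reference((400, 200), (640, 480)))
```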


IEEE Control Systems Magazine | 2014

Unmanned Aerial Vehicle Path Following: A Survey and Analysis of Algorithms for Fixed-Wing Unmanned Aerial Vehicles

P. B. Sujit; Srikanth Saripalli; João Borges de Sousa

Unmanned aerial vehicles (UAVs) have mainly been used by military and government organizations, but with low-cost sensors, electronics, and airframes there is significant interest in using low-cost UAVs among aircraft hobbyists, academic researchers, and industries. Applications such as mapping, search and rescue, patrol, and surveillance require the UAV to autonomously follow a predefined path at a prescribed height. The most commonly used paths are straight lines and circular orbits. Path-following algorithms ensure that the UAV follows a predefined path, in two or three dimensions, at a constant height. A basic requirement for these algorithms is that they must be accurate and robust to wind disturbances.
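
As a concrete instance of the straight-line path-following laws the survey analyzes, here is a minimal sketch of a saturated cross-track correction (a vector-field-style guidance law). The gains chi_inf and k are illustrative values, not taken from the survey.

```python
# Illustrative straight-line path-following law: steer onto segment a->b by
# blending from an approach angle chi_inf down to the path course as the
# cross-track error shrinks. Gains are assumed values.
import math

def desired_course(p, wp_a, wp_b, chi_inf=math.radians(60), k=0.05):
    """Return a course-angle command (rad) that steers the UAV onto a->b."""
    path_course = math.atan2(wp_b[1] - wp_a[1], wp_b[0] - wp_a[0])
    # Signed cross-track error of position p relative to the path
    dx, dy = p[0] - wp_a[0], p[1] - wp_a[1]
    e = -math.sin(path_course) * dx + math.cos(path_course) * dy
    return path_course - chi_inf * (2.0 / math.pi) * math.atan(k * e)

# 100 m off a due-east path: the command points back toward the line (~ -52 deg)
print(math.degrees(desired_course((0.0, 100.0), (0.0, 0.0), (1000.0, 0.0))))
```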


International Conference on Robotics and Automation | 2007

Flying Fast and Low Among Obstacles

Sebastian Scherer; Sanjiv Singh; Lyle Chamberlain; Srikanth Saripalli

Safe autonomous flight is essential for widespread acceptance of aircraft that must fly close to the ground. We have developed a method of collision avoidance that can be used in three dimensions in much the same way as by autonomous ground vehicles that navigate over unexplored terrain. Safe navigation is accomplished by a combination of online environmental sensing, path planning, and collision avoidance. Here we report results with an autonomous helicopter that operates at low elevations in uncharted environments, some of which are densely populated with obstacles such as buildings, trees, and wires. We have recently completed over 1000 successful runs in which the helicopter traveled between coarsely specified waypoints separated by hundreds of meters, at speeds up to 10 meters/sec and at elevations of 5-10 meters above ground level. The helicopter safely avoids not only large objects such as buildings and trees but also wires as thin as 6 mm. We believe this represents the first time an air vehicle has traveled this fast so close to obstacles. Here we focus on the collision avoidance method, which learns to avoid obstacles by observing the performance of a human operator.
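
The paper's avoidance behavior is learned from a human operator; the sketch below is not that method, only a generic reactive steering rule of the kind such a system might tune, choosing the candidate heading that balances goal-seeking against proximity to sensed obstacles. The weights, candidate set, and function name are made up for illustration.

```python
# Hypothetical reactive steering rule (not the paper's learned controller).
import math

def pick_heading(goal_bearing, obstacle_bearings, w_goal=1.0, w_obst=2.0):
    """Choose the candidate heading (rad) minimising goal error plus a
    distance-weighted penalty near observed obstacle bearings."""
    candidates = [math.radians(d) for d in range(-90, 91, 5)]
    def cost(h):
        c = w_goal * abs(h - goal_bearing)
        for bearing, dist in obstacle_bearings:
            c += w_obst * math.exp(-abs(h - bearing) * 4.0) / max(dist, 1.0)
        return c
    return min(candidates, key=cost)

# Goal straight ahead, obstacle 3 m away at 10 deg right: steers ~5 deg left
print(math.degrees(pick_heading(0.0, [(math.radians(10), 3.0)])))
```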


Field and Service Robotics | 2003

Landing on a Moving Target using an Autonomous Helicopter

Srikanth Saripalli; Gaurav S. Sukhatme

We present a vision-based algorithm designed to enable an autonomous helicopter to land on a moving target. The helicopter is required to identify a target, track it, and land on it while the target is in motion. We use Hu's moments of inertia for precise target recognition and a Kalman filter for target tracking. Based on the output of the tracker, a simple trajectory controller is implemented which (within the given constraints) ensures that the helicopter is able to land on the target. We present results from data collected during manual flights which validate our tracking algorithm. Tests on actual landing with the helicopter UAV are ongoing.
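
The abstract names the two main ingredients, Hu moment matching for recognition and a Kalman filter for tracking. The sketch below shows minimal versions of both (OpenCV for the moments, a constant-velocity filter over the target centroid); the state layout, noise levels, and class names are assumptions, not the paper's values.

```python
# Illustrative sketch of the two ingredients named above; all matrices and
# noise values are assumed.
import cv2
import numpy as np

def hu_match(contour, reference_contour):
    """Shape similarity from Hu moments; smaller = more alike (log-scaled)."""
    h1 = cv2.HuMoments(cv2.moments(contour)).flatten()
    h2 = cv2.HuMoments(cv2.moments(reference_contour)).flatten()
    eps = 1e-12  # guard against log(0)
    return float(np.sum(np.abs(
        np.sign(h1) * np.log10(np.abs(h1) + eps) -
        np.sign(h2) * np.log10(np.abs(h2) + eps))))

class CentroidKF:
    """Constant-velocity Kalman filter over the target centroid (pixels)."""
    def __init__(self, dt=0.1):
        self.x = np.zeros(4)                          # [x, y, vx, vy]
        self.P = np.eye(4) * 10.0                     # state covariance
        self.F = np.eye(4)                            # constant-velocity model
        self.F[0, 2] = dt
        self.F[1, 3] = dt
        self.H = np.array([[1.0, 0.0, 0.0, 0.0],      # position is measured
                           [0.0, 1.0, 0.0, 0.0]])
        self.Q = np.eye(4) * 0.01                     # assumed process noise
        self.R = np.eye(2) * 4.0                      # assumed pixel noise

    def step(self, measured_centroid):
        # Predict forward one frame, then correct with the detected centroid.
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (np.asarray(measured_centroid) - self.H @ self.x)
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2], self.x[2:]                 # estimated position, velocity
```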


Geosphere | 2014

Rapid mapping of ultrafine fault zone topography with structure from motion

Kendra L. Johnson; Edwin Nissen; Srikanth Saripalli; J. Ramon Arrowsmith; Patrick McGarey; K. M. Scharer; Patrick L. Williams; Kimberly Blisniuk

Structure from Motion (SfM) generates high-resolution topography and coregistered texture (color) from an unstructured set of overlapping photographs taken from varying viewpoints, overcoming many of the cost, time, and logistical limitations of Light Detection and Ranging (LiDAR) and other topographic surveying methods. This paper provides the first investigation of SfM as a tool for mapping fault zone topography in areas of sparse or low-lying vegetation. First, we present a simple, affordable SfM workflow, based on an unmanned helium balloon or motorized glider, an inexpensive camera, and semiautomated software. Second, we illustrate the system at two sites on southern California faults covered by existing airborne or terrestrial LiDAR, enabling a comparative assessment of SfM topography resolution and precision. At the first site, an ∼0.1 km² alluvial fan on the San Andreas fault, a colored point cloud of density mostly >700 points/m² and a 3 cm digital elevation model (DEM) and orthophoto were produced from 233 photos collected ∼50 m above ground level. When a few global positioning system ground control points are incorporated, closest point vertical distances to the much sparser (∼4 points/m²) airborne LiDAR point cloud are mostly 530 points/m² and a 2 cm DEM and orthophoto were produced from 450 photos taken from ∼60 m above ground level. Closest point vertical distances to existing terrestrial LiDAR data of comparable density are mostly
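
As a small illustration of the kind of accuracy check quoted above (closest point vertical distances between an SfM cloud and a LiDAR cloud), here is a hypothetical comparison using a k-d tree over the horizontal coordinates. The data layout and the synthetic 5 cm offset are assumptions for the demo, not the paper's data or values.

```python
# Hypothetical accuracy check: vertical offset from each SfM point to the
# horizontally nearest LiDAR point.
import numpy as np
from scipy.spatial import cKDTree

def vertical_offsets(sfm_xyz, lidar_xyz):
    """For each SfM point, vertical offset to the horizontally nearest LiDAR point."""
    tree = cKDTree(lidar_xyz[:, :2])              # index LiDAR by (x, y) only
    _, idx = tree.query(sfm_xyz[:, :2])
    return sfm_xyz[:, 2] - lidar_xyz[idx, 2]

# Synthetic check: a copy of the LiDAR cloud shifted up by an assumed 5 cm
rng = np.random.default_rng(0)
lidar = rng.uniform(0, 100, size=(5000, 3))
sfm = lidar + np.array([0.0, 0.0, 0.05])
print(np.median(np.abs(vertical_offsets(sfm, lidar))))   # ~0.05
```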


Intelligent Robots and Systems | 2003

A tale of two helicopters

Srikanth Saripalli; Jonathan M. Roberts; Peter Corke; Gregg Buskey; Gaurav S. Sukhatme

This paper discusses similarities and differences in autonomous helicopters developed at USC and CSIRO. The most significant differences are in the accuracy and sample rate of the sensor systems used for control. The USC vehicle, like a number of others, makes use of a sensor suite that costs an order of magnitude more than the vehicle. The CSIRO system, by contrast, utilizes low-cost inertial, magnetic, vision and GPS to achieve the same ends. We describe the architecture of both autonomous helicopters, discuss the design issues and present comparative results.


Field and Service Robotics | 2008

Combined Visual and Inertial Navigation for an Unmanned Aerial Vehicle

Jonathan Kelly; Srikanth Saripalli; Gaurav S. Sukhatme

We describe a UAV navigation system which combines stereo visual odometry with inertial measurements from an IMU. Our approach fuses the motion estimates from both sensors in an extended Kalman filter to determine vehicle position and attitude. We present results using data from a robotic helicopter, in which the combined visual and inertial system produced a final position estimate within 1% of the measured GPS position over a flight distance of more than 400 meters. Our results show that the combination of visual and inertial sensing reduced overall positioning error by nearly an order of magnitude compared to visual odometry alone.
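
The filter described above estimates full position and attitude; as a much-reduced illustration of the same fusion structure (IMU samples drive the prediction, lower-rate visual odometry corrects it), here is a hypothetical 1-D linear Kalman filter. The class name and noise values are assumptions, not the paper's.

```python
# Toy 1-D position/velocity filter: IMU predicts, visual odometry corrects.
import numpy as np

class VoImuFilter1D:
    def __init__(self, dt=0.01):
        self.dt = dt
        self.x = np.zeros(2)                       # [position (m), velocity (m/s)]
        self.P = np.eye(2)
        self.F = np.array([[1.0, dt], [0.0, 1.0]])
        self.Q = np.diag([1e-4, 1e-2])             # assumed IMU-driven process noise
        self.H = np.array([[1.0, 0.0]])            # VO observes position
        self.R = np.array([[0.05 ** 2]])           # assumed VO noise (5 cm)

    def predict(self, accel):
        """Propagate with one IMU acceleration sample (high rate)."""
        self.x = self.F @ self.x + np.array([0.5 * accel * self.dt ** 2,
                                             accel * self.dt])
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, vo_position):
        """Correct with a (lower-rate) visual-odometry position estimate."""
        y = vo_position - (self.H @ self.x)[0]
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T / S[0, 0]
        self.x = self.x + K.flatten() * y
        self.P = (np.eye(2) - K @ self.H) @ self.P

# 100 Hz IMU prediction interleaved with a 10 Hz visual-odometry correction
f = VoImuFilter1D()
for _ in range(10):
    f.predict(accel=0.5)
f.update(vo_position=0.003)
print(f.x)
```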

Collaboration


Dive into Srikanth Saripalli's collaborations.

Top Co-Authors

Gaurav S. Sukhatme (University of Southern California)
Yucong Lin (Arizona State University)
Pascual Campoy (Technical University of Madrid)
Sai Vemprala (Arizona State University)
Peter Corke (Queensland University of Technology)
Edwin Nissen (Colorado School of Mines)
Kip V. Hodges (Arizona State University)
P. B. Sujit (Indraprastha Institute of Information Technology)
Daniela Rus (Massachusetts Institute of Technology)