
Publication


Featured research published by Giuseppe Loianno.


International Conference on Robotics and Automation (ICRA) | 2014

Toward image based visual servoing for aerial grasping and perching

Justin Thomas; Giuseppe Loianno; Koushil Sreenath; Vijay Kumar

This paper addresses the dynamics, control, planning, and visual servoing for micro aerial vehicles to perform high-speed aerial grasping tasks. We draw inspiration from agile, fast-moving birds, such as raptors, that detect, locate, and execute high-speed swoop maneuvers to capture prey. Since these grasping maneuvers are predominantly in the sagittal plane, we consider the planar system and present mathematical models and algorithms for motion planning and control, required to incorporate similar capabilities in quadrotors equipped with a monocular camera. In particular, we develop a dynamical model directly in the image space, show that this is a differentially-flat system with the image features serving as flat outputs, outline a method for generating trajectories directly in the image feature space, develop a geometric visual controller that considers the second order dynamics (in contrast to most visual servoing controllers that assume first order dynamics), and present validation of our methods through both simulations and experiments.
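The trajectory-generation step described above (planning directly in image-feature space, exploiting the differential flatness of the image-space dynamics) can be pictured with a smooth polynomial that drives a flat output between boundary values. The quintic profile, function name, and numbers below are illustrative assumptions, not the authors' implementation:

```python
def quintic_feature_trajectory(s0, sf, T):
    """Return s(t): a smooth image-feature trajectory from s0 to sf over [0, T].

    Boundary conditions: zero feature velocity and acceleration at both ends,
    a common requirement for dynamically feasible starts and stops (sketch only).
    """
    ds = sf - s0

    def s(t):
        tau = min(max(t / T, 0.0), 1.0)  # normalized time, clamped to [0, 1]
        # Quintic "smoothstep": 10*tau^3 - 15*tau^4 + 6*tau^5
        return s0 + ds * (10 * tau**3 - 15 * tau**4 + 6 * tau**5)

    return s

# Example: drive a feature's horizontal image coordinate from 320 px to 100 px in 2 s
traj = quintic_feature_trajectory(320.0, 100.0, 2.0)
```

In a flatness-based planner, such a profile for each flat output (here, an image feature) determines the full state and input trajectory of the vehicle.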


Bioinspiration & Biomimetics | 2014

Toward autonomous avian-inspired grasping for micro aerial vehicles.

Justin Thomas; Giuseppe Loianno; Joseph Polin; Koushil Sreenath; Vijay Kumar

Micro aerial vehicles, particularly quadrotors, have been used in a wide range of applications. However, the literature on aerial manipulation and grasping is limited and the work is based on quasi-static models. In this paper, we draw inspiration from agile, fast-moving birds, such as raptors, that are able to capture moving prey on the ground or in water, and develop similar capabilities for quadrotors. We address dynamic grasping, an approach to prehensile grasping in which the dynamics of the robot and its gripper are significant and must be explicitly modeled and controlled for successful execution. Dynamic grasping is relevant for fast pick-and-place operations, transportation and delivery of objects, and placing or retrieving sensors. We show how this capability can be realized (a) using a motion capture system and (b) without external sensors, relying only on onboard sensors. In both cases we describe the dynamic model, and trajectory planning and control algorithms. In particular, we present a methodology for flying and grasping a cylindrical object using feedback from a monocular camera and an inertial measurement unit onboard the aerial robot. This is accomplished by mapping the dynamics of the quadrotor to a level virtual image plane, which in turn enables dynamically-feasible trajectory planning for image features in the image space, and a vision-based controller with guaranteed convergence properties. We also present experimental results obtained with a quadrotor equipped with an articulated gripper to illustrate both approaches.
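One way to picture the "level virtual image plane" idea: rotate each back-projected ray by the inverse of the camera's roll and pitch, taken from the IMU, so the features appear as if seen by a camera held level with the ground. The pinhole model, function name, and rotation ordering below are assumptions for illustration, not the paper's code:

```python
import math

def level_virtual_feature(u, v, fx, fy, cx, cy, roll, pitch):
    """Re-project a pixel (u, v) onto a virtual image plane that is level
    with the ground, using roll and pitch from the IMU (minimal pinhole sketch).

    fx, fy, cx, cy are the camera intrinsics; roll/pitch in radians.
    """
    # Back-project the pixel to a ray in the camera frame
    x, y, z = (u - cx) / fx, (v - cy) / fy, 1.0
    # Undo roll (rotation about x), then pitch (rotation about y)
    cr, sr = math.cos(-roll), math.sin(-roll)
    cp, sp = math.cos(-pitch), math.sin(-pitch)
    y, z = cr * y - sr * z, sr * y + cr * z   # R_x(-roll)
    x, z = cp * x + sp * z, -sp * x + cp * z  # R_y(-pitch)
    # Project onto the level virtual plane
    return fx * x / z + cx, fy * y / z + cy
```

With zero roll and pitch the virtual feature coincides with the measured pixel; during aggressive maneuvers the correction lets the planner treat the features as outputs of a level camera.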


International Conference on Robotics and Automation (ICRA) | 2016

Visual-inertial direct SLAM

Alejo Concha; Giuseppe Loianno; Vijay Kumar; Javier Civera

The so-called direct visual SLAM methods have shown great potential in estimating a semidense or fully dense reconstruction of the scene, in contrast to the sparse reconstructions of traditional feature-based algorithms. In this paper, we propose for the first time a direct, tightly-coupled formulation for the combination of visual and inertial data. Our algorithm runs in real-time on a standard CPU. The processing is split into three threads. The first thread runs at frame rate and estimates the camera motion by a joint non-linear optimization from visual and inertial data, given a semidense map. The second one creates a semidense map of high-gradient areas only, for camera tracking purposes. Finally, the third thread estimates a fully dense reconstruction of the scene at a lower frame rate. We have evaluated our algorithm on several real sequences with ground-truth trajectory data, showing state-of-the-art performance.
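The three-thread split described above can be sketched as a plain producer/consumer skeleton. The thread names, queue layout, and shared-map representation are illustrative assumptions, not the authors' code, and the third (dense-reconstruction) thread is omitted for brevity:

```python
import queue
import threading

frames = queue.Queue()          # incoming camera + IMU packets
semidense_map = {"version": 0}  # shared map the tracker reads
poses = []                      # per-frame pose estimates

def tracking_thread():
    # Runs at frame rate: joint visual-inertial optimization given the map
    while True:
        frame = frames.get()
        if frame is None:       # sentinel: end of stream
            break
        poses.append(("pose_for_frame", frame, semidense_map["version"]))

def mapping_thread(stop):
    # Lower rate: refresh the semidense map of high-gradient areas
    while not stop.is_set():
        semidense_map["version"] += 1
        stop.wait(0.01)

stop = threading.Event()
t1 = threading.Thread(target=tracking_thread)
t2 = threading.Thread(target=mapping_thread, args=(stop,))
t1.start(); t2.start()
for i in range(3):
    frames.put(i)
frames.put(None)
t1.join(); stop.set(); t2.join()
```

The design point the paper makes is exactly this decoupling: tracking must keep up with the frame rate, while mapping and dense reconstruction can lag behind at lower rates.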


International Conference on Robotics and Automation (ICRA) | 2016

Visual Servoing of Quadrotors for Perching by Hanging From Cylindrical Objects

Justin Thomas; Giuseppe Loianno; Kostas Daniilidis; Vijay Kumar

This letter addresses vision-based localization and servoing for quadrotors to enable autonomous perching by hanging from cylindrical structures using only a monocular camera. We focus on the problems of relative pose estimation, control, and trajectory planning for maneuvering a robot relative to cylinders with unknown orientations. We first develop a geometric model that describes the pose of the robot relative to a cylinder. Then, we derive the dynamics of the system, expressed in terms of the image features. Based on the dynamics, we present a controller, which guarantees asymptotic convergence to the desired image space coordinates. Finally, we develop an effective method to plan dynamically feasible trajectories in the image space, and we provide experimental results to demonstrate the proposed method under different operating conditions such as hovering, trajectory tracking, and perching.
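The asymptotic-convergence claim above rests on shaping second-order error dynamics in the image coordinates. A generic sketch (the gains, time step, and critically-damped choice are illustrative, not from the letter) simulates the error dynamics e'' = -kp*e - kd*e' and checks that a feature error decays:

```python
def simulate_image_error(e0, kp=4.0, kd=4.0, dt=0.001, T=5.0):
    """Simulate second-order image-feature error dynamics
    e'' = -kp*e - kd*e' (critically damped when kd = 2*sqrt(kp)).

    Semi-implicit Euler integration; returns the final error.
    """
    e, de = e0, 0.0
    for _ in range(int(T / dt)):
        dde = -kp * e - kd * de
        de += dde * dt
        e += de * dt
    return e
```

A controller that renders the closed-loop feature error into this form drives the features, and hence the vehicle, to the desired image-space coordinates.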


International Conference on Unmanned Aircraft Systems (ICUAS) | 2014

Autonomous deployment of swarms of micro-aerial vehicles in cooperative surveillance

Martin Saska; Jan Chudoba; Libor Precil; Justin Thomas; Giuseppe Loianno; Adam Tresnak; Vojtech Vonasek; Vijay Kumar

An algorithm for the autonomous deployment of groups of Micro Aerial Vehicles (MAVs) in a cooperative surveillance task is presented in this paper. The algorithm finds a proper distribution of all MAVs in surveillance locations, together with feasible and collision-free trajectories from their initial positions. The solution of the MAV-group deployment satisfies the motion constraints of the MAVs, environment constraints (no-fly zones), and constraints imposed by visual onboard relative localization. The onboard relative localization, which is used to stabilize the group flying in a compact formation, acts as an enabling technique for employing MAVs in situations where an external localization system is not available or lacks sufficient precision.
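The core of the deployment problem, assigning MAVs to surveillance locations so that total travel cost is minimized, can be sketched with a brute-force assignment over permutations. This tiny stand-in ignores the motion, no-fly-zone, and relative-localization constraints the paper handles, and is feasible only for small groups:

```python
from itertools import permutations

def assign_mavs(starts, goals):
    """Find the assignment of MAVs (at `starts`) to surveillance locations
    (`goals`) minimizing total squared travel distance (brute-force sketch)."""
    def cost(perm):
        return sum((sx - gx) ** 2 + (sy - gy) ** 2
                   for (sx, sy), (gx, gy) in zip(starts, (goals[i] for i in perm)))
    # perm[k] is the index of the goal assigned to MAV k
    return min(permutations(range(len(goals))), key=cost)

assignment = assign_mavs([(0, 0), (5, 0)], [(6, 1), (0, 1)])
# MAV 0 takes the nearby goal (0, 1); MAV 1 takes the goal (6, 1)
```

In the paper this combinatorial step is coupled with trajectory planning so that the chosen assignment also admits collision-free, dynamically feasible paths.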


International Conference on Robotics and Automation (ICRA) | 2017

Estimation, Control, and Planning for Aggressive Flight With a Small Quadrotor With a Single Camera and IMU

Giuseppe Loianno; Chris Brunner; Gary G. Mcgrath; Vijay Kumar

We address the state estimation, control, and planning for aggressive flight with a 150 cm diameter, 250 g quadrotor equipped only with a single camera and an inertial measurement unit (IMU). The use of smartphone grade hardware and the small scale provides an inexpensive and practical solution for autonomous flight in indoor environments. The key contributions of this paper are: 1) robust state estimation and control using only a monocular camera and an IMU at speeds of 4.5 m/s, accelerations of over 1.5 g, roll and pitch angles of up to 90°, and angular rate of up to 800°/s without requiring any structure in the environment; 2) planning of dynamically feasible three-dimensional trajectories for slalom paths and flights through narrow windows; and 3) extensive experimental results showing aggressive flights through and around obstacles with large rotation angular excursions and accelerations.
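The paper's estimator is a full visual-inertial filter; as a toy stand-in for the fusion idea (high-rate IMU propagation, with camera fixes correcting accumulated drift), a one-dimensional complementary blend can be sketched. The function names, gain, and one-axis simplification are all assumptions for illustration:

```python
def integrate_imu(pos, vel, acc, dt):
    """Euler-integrate position from accelerometer data between camera fixes.
    This propagation is fast but drifts without correction."""
    vel = vel + acc * dt
    pos = pos + vel * dt
    return pos, vel

def complementary_fuse(pos_imu, pos_cam, alpha=0.98):
    """Blend the drifty IMU-propagated position with an absolute camera fix:
    the IMU dominates at high rate, the camera term removes drift."""
    return alpha * pos_imu + (1.0 - alpha) * pos_cam
```

A real visual-inertial estimator replaces the fixed blend with statistically weighted updates and also tracks attitude and sensor biases, which is what makes the aggressive 90-degree-roll flights in the paper possible.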


IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) | 2015

Smartphones power flying robots

Giuseppe Loianno; Yash Mulgaonkar; Chris Brunner; Dheeraj Ahuja; Arvind Ramanandan; Murali Ramaswamy Chari; Serafin Diaz; Vijay Kumar

Consumer grade technology seen in cameras and phones has led to the price/performance ratio of sensors and processors falling dramatically over the last decade. In particular, most devices are packaged with a camera, a gyroscope, and an accelerometer, important sensors for aerial robotics. The low mass and small form factor make them particularly well suited for autonomous flight with small flying robots, especially in GPS-denied environments. In this work, we present the first fully autonomous smartphone-based quadrotor. All computation, sensing, and control run on an off-the-shelf smartphone, with all the software functionality in a smartphone app. We show how quadrotors can be stabilized and controlled to achieve autonomous flight in indoor buildings, with applications to smart homes, search and rescue, construction, and architecture. This work allows any consumer with a smartphone to autonomously fly a quadrotor robot platform, even without GPS, by downloading an app, and to concurrently build 3-D maps.


IEEE Robotics & Automation Magazine | 2015

Flying Smartphones: Automated Flight Enabled by Consumer Electronics

Giuseppe Loianno; Gareth Cross; Chao Qu; Yash Mulgaonkar; Joel A. Hesch; Vijay Kumar

Consumer-grade technology seen in cameras and phones has led to the price-performance ratio falling dramatically over the last decade. We are seeing a similar trend in robots that leverage this technology. A recent development is the interest of companies such as Google, Apple, and Qualcomm in high-end communication devices equipped with such sensors as cameras and inertial measurement units (IMUs) and with significant computational capability. Google, for instance, is developing a customized phone equipped with conventional as well as depth cameras. This article explores the potential for the rapid integration of inexpensive consumer-grade electronics with the off-the-shelf robotics technology for automation in homes and offices. We describe how standard hardware platforms (robots, processors, and smartphones) can be integrated through simple software architecture to build autonomous quadrotors that can navigate and map unknown, indoor environments. We show how the quadrotor can be stabilized and controlled to achieve autonomous flight and the generation of three-dimensional (3-D) maps for exploring and mapping indoor buildings with application to smart homes, search and rescue, and architecture. This opens up the possibility for any consumer to take a commercially available robot platform and a smartphone and automate the process of creating a 3-D map of his/her home or office.


International Conference on Robotics and Automation (ICRA) | 2015

Cooperative localization and mapping of MAVs using RGB-D sensors

Giuseppe Loianno; Justin Thomas; Vijay Kumar

The fusion of IMU and RGB-D sensors presents an interesting combination of information to achieve autonomous localization and mapping using robotic platforms such as ground robots and flying vehicles. In this paper, we present a software framework for cooperative localization and mapping using multiple aerial platforms simultaneously. We employ a monocular visual odometry algorithm to solve the localization task, where the depth data associated with the RGB image are used to estimate the scale factor of the visual information. The current framework enables autonomous onboard control of each vehicle with cooperative localization and mapping. We present a methodology that provides both a sparse map generated by the monocular SLAM and a multi-resolution dense map generated from the associated depth. The localization algorithm and both 3D mapping algorithms work in parallel, improving the system's real-time reliability. We present experimental results showing the effectiveness of the proposed approach using two quadrotor platforms.
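The scale-recovery step described above, aligning the up-to-scale depths of monocular odometry with the metric depths from the RGB-D sensor, can be sketched as a robust ratio estimate. The use of a median is my assumption for illustration, not necessarily the authors' exact choice:

```python
def estimate_scale(mono_depths, sensor_depths):
    """Estimate the metric scale factor of monocular visual odometry by
    comparing its up-to-scale depths with metric depths from the RGB-D
    sensor at the same pixels (median ratio for robustness to outliers)."""
    ratios = sorted(d / m for m, d in zip(mono_depths, sensor_depths) if m > 0)
    mid = len(ratios) // 2
    if len(ratios) % 2:
        return ratios[mid]
    return 0.5 * (ratios[mid - 1] + ratios[mid])
```

Multiplying the monocular trajectory and sparse map by this factor places both in metric units, which is what lets the vehicles share one consistent cooperative map.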


Autonomous Robots | 2017

System for deployment of groups of unmanned micro aerial vehicles in GPS-denied environments using onboard visual relative localization

Martin Saska; Tomas Baca; Justin Thomas; Jan Chudoba; Libor Preucil; Tomas Krajnik; Jan Faigl; Giuseppe Loianno; Vijay Kumar

A complex system for the control of swarms of micro aerial vehicles (MAVs), in the literature also called unmanned aerial vehicles (UAVs) or unmanned aerial systems (UASs), stabilized via onboard visual relative localization, is described in this paper. The main purpose of this work is to verify the possibility of self-stabilization of multi-MAV groups without an external global positioning system. This approach enables the deployment of MAV swarms outside laboratory conditions, and it may be considered an enabling technique for utilizing fleets of MAVs in real-world scenarios. The proposed vision-based stabilization approach has been designed for numerous multi-UAV robotic applications (leader-follower UAV formation stabilization, UAV swarm stabilization and deployment in surveillance scenarios, and cooperative UAV sensory measurement). Deployment of the system in real-world scenarios faithfully verifies its operational constraints, which are given by limited onboard sensing suites and processing capabilities. The performance of the presented approach (MAV control, motion planning, MAV stabilization, and trajectory planning) in multi-MAV applications has been validated by experimental results in indoor as well as challenging outdoor environments (e.g., in windy conditions and in a former pit mine).
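Onboard visual relative localization of the kind described above typically relies on the pinhole relation between a marker's known physical size and its apparent size in pixels. The following minimal sketch captures that relation; the function names and all numbers are illustrative assumptions, not the system's actual implementation:

```python
import math

def relative_distance(marker_diameter_m, apparent_px, focal_px):
    """Pinhole estimate of the distance to a neighboring MAV from the
    apparent pixel size of its onboard marker of known diameter."""
    return focal_px * marker_diameter_m / apparent_px

def relative_bearing(u, cx, focal_px):
    """Horizontal bearing (radians) of the marker center relative to the
    camera's optical axis, from its pixel column u."""
    return math.atan2(u - cx, focal_px)
```

Distance plus bearing to each neighbor is exactly the relative information a formation or swarm controller needs to keep the group compact without any external positioning.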

Collaboration


Dive into Giuseppe Loianno's collaborations.

Top Co-Authors

Vijay Kumar, University of Pennsylvania
Justin Thomas, University of Pennsylvania
Kostas Daniilidis, University of Pennsylvania
Yash Mulgaonkar, University of Pennsylvania
Martin Saska, Czech Technical University in Prague
Adam Cho, University of Pennsylvania
Camillo J. Taylor, University of Pennsylvania
Chao Qu, University of Pennsylvania
Michael Watterson, University of Pennsylvania