
Publication


Featured research published by Yash Mulgaonkar.


Field and Service Robotics | 2012

Collaborative mapping of an earthquake-damaged building via ground and aerial robots

Nathan Michael; Shaojie Shen; Kartik Mohta; Yash Mulgaonkar; Vijay Kumar; Keiji Nagatani; Yoshito Okada; Seiga Kiribayashi; Kazuki Otake; Kazuya Yoshida; Kazunori Ohno; Eijiro Takeuchi; Satoshi Tadokoro

We report recent results from field experiments conducted with a team of ground and aerial robots engaged in the collaborative mapping of an earthquake-damaged building. The goal of the experimental exercise was the generation of three-dimensional maps that capture the layout of a multi-floor environment. The experiments took place in the top three floors of a structurally compromised building at Tohoku University in Sendai, Japan, that was damaged during the 2011 Tohoku earthquake. We provide details of the approach to the collaborative mapping and report results from the experiments in the form of maps generated by the individual robots and as a team. We conclude by discussing observations from the experiments and future research topics.


Robotics: Science and Systems | 2013

Vision-Based State Estimation and Trajectory Control Towards High-Speed Flight with a Quadrotor

Shaojie Shen; Yash Mulgaonkar; Nathan Michael; Vijay Kumar

This paper addresses the development of a lightweight autonomous quadrotor that uses cameras and an inexpensive IMU as its only sensors and onboard processors for estimation and control. We describe a fully functional, integrated system with a focus on robust visual-inertial state estimation, and demonstrate the quadrotor's ability to autonomously fly at speeds of up to 4 m/s with roll and pitch angles exceeding 20°. The performance of the proposed system is demonstrated via challenging experiments in three-dimensional indoor environments.


International Conference on Robotics and Automation | 2013

Vision-based state estimation for autonomous rotorcraft MAVs in complex environments

Shaojie Shen; Yash Mulgaonkar; Nathan Michael; Vijay Kumar

In this paper, we consider the development of a rotorcraft micro aerial vehicle (MAV) system capable of vision-based state estimation in complex environments. We pursue a systems solution for the hardware and software to enable autonomous flight with a small rotorcraft in complex indoor and outdoor environments using only onboard vision and inertial sensors. As rotorcraft frequently operate in hover or near-hover conditions, we propose a vision-based state estimation approach that does not drift when the vehicle remains stationary. The vision-based estimation approach combines the advantages of monocular vision (range, faster processing) with those of stereo vision (availability of scale and depth information), while overcoming several disadvantages of both. Specifically, our system relies on fisheye camera images at 25 Hz and imagery from a second camera at a much lower frequency for metric scale initialization and failure recovery. This estimate is fused with IMU information to yield state estimates at 100 Hz for feedback control. We show indoor experimental results with performance benchmarking and illustrate the autonomous operation of the system in challenging indoor and outdoor environments.
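
The core idea of fusing a low-rate vision estimate with high-rate IMU integration can be illustrated with a toy 1-D sketch (the function name, gain, and rates below are illustrative assumptions, not the authors' filter):

```python
def fuse(imu_accels, vision_positions, dt=0.01, vision_every=4, gain=0.5):
    """Toy 1-D fusion: integrate IMU accelerations at 100 Hz (dt=0.01 s)
    and correct the drifting estimate with a vision position fix arriving
    every 4th step (25 Hz). `gain` blends the estimate toward the fix."""
    x, v = 0.0, 0.0
    for i, a in enumerate(imu_accels):
        v += a * dt                    # acceleration -> velocity
        x += v * dt                    # velocity -> position
        if i % vision_every == vision_every - 1:
            z = vision_positions[i // vision_every]
            x += gain * (z - x)        # pull the estimate toward the fix
    return x
```

The vision correction is what keeps the estimate from drifting while the vehicle hovers, since pure IMU integration accumulates error even at rest.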


International Conference on Robotics and Automation | 2014

Multi-sensor Fusion for Robust Autonomous Flight in Indoor and Outdoor Environments with a Rotorcraft MAV

Shaojie Shen; Yash Mulgaonkar; Nathan Michael; Vijay Kumar

We present a modular and extensible approach to integrate noisy measurements from multiple heterogeneous sensors that yield either absolute or relative observations at different and varying time intervals, and to provide smooth and globally consistent estimates of position in real time for autonomous flight. We describe the development of algorithms and software architecture for a new 1.9 kg MAV platform equipped with an IMU, laser scanner, stereo cameras, pressure altimeter, magnetometer, and a GPS receiver, in which the state estimation and control are performed onboard on a 3rd-generation Intel NUC i3 processor. We illustrate the robustness of our framework in large-scale, indoor-outdoor autonomous aerial navigation experiments involving traversals of over 440 meters at average speeds of 1.5 m/s with winds around 10 mph while entering and exiting buildings.
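
The distinction between absolute observations (e.g. GPS) and relative observations (e.g. visual odometry displacements) can be sketched with a minimal 1-D Kalman-style update (the event encoding and noise values are our illustrative assumptions, not the paper's filter):

```python
def kalman_1d(steps):
    """Toy 1-D filter mixing observation types.
    steps: list of (kind, value), where kind is
      'predict' - commanded displacement (process model),
      'abs'     - absolute position fix (e.g. GPS),
      'rel'     - displacement since the previous 'rel' (e.g. odometry)."""
    x, P = 0.0, 1.0                    # state estimate and variance
    Q, R_abs, R_rel = 0.01, 0.25, 0.04 # process and measurement noises
    anchor = 0.0                       # estimate at the previous 'rel' fix
    for kind, z in steps:
        if kind == 'predict':
            x += z; P += Q             # propagate
        elif kind == 'abs':
            K = P / (P + R_abs)        # Kalman gain for a direct position fix
            x += K * (z - x); P *= (1.0 - K)
        elif kind == 'rel':
            K = P / (P + R_rel)        # relative fix: compare to displacement
            x += K * (z - (x - anchor)); P *= (1.0 - K)
            anchor = x
    return x, P
```

Note the simplification: a rigorous treatment would also track the uncertainty of the anchor state, which is part of what makes handling relative observations nontrivial in practice.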


Conference on Automation Science and Engineering | 2015

Devices, systems, and methods for automated monitoring enabling precision agriculture

Jnaneshwar Das; Gareth Cross; Chao Qu; Anurag Makineni; Pratap Tokekar; Yash Mulgaonkar; Vijay Kumar

Addressing the challenges of feeding the burgeoning world population with limited resources requires innovation in sustainable, efficient farming. The practice of precision agriculture offers many benefits towards addressing these challenges, such as improved yield and efficient use of resources such as water, fertilizer, and pesticides. We describe the design and development of a lightweight, multi-spectral 3D imaging device that can be used for automated monitoring in precision agriculture. The sensor suite consists of a laser range scanner, multi-spectral cameras, a thermal imaging camera, and navigational sensors. We present techniques to extract four key data products (plant morphology, canopy volume, leaf area index, and fruit counts) using the sensor suite. We demonstrate its use with two systems: multi-rotor micro aerial vehicles and a human-carried, shoulder-mounted harness. We show results of field experiments conducted in collaboration with growers and agronomists in vineyards, apple orchards, and orange groves.
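
One of the listed data products, canopy volume, can be approximated from a 3-D point cloud by voxel counting. The sketch below is a hypothetical illustration of that idea, not the paper's pipeline:

```python
import math

def canopy_volume(points, voxel=0.25):
    """Rough canopy volume from a 3-D point cloud: bin points into cubic
    voxels of side `voxel` metres and sum the volume of occupied voxels."""
    occupied = {(math.floor(x / voxel),
                 math.floor(y / voxel),
                 math.floor(z / voxel)) for x, y, z in points}
    return len(occupied) * voxel ** 3
```

The voxel size trades off resolution against robustness to sensor noise; denser scans permit smaller voxels.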


International Symposium on Experimental Robotics | 2016

Initialization-Free Monocular Visual-Inertial State Estimation with Application to Autonomous MAVs

Shaojie Shen; Yash Mulgaonkar; Nathan Michael; Vijay Kumar

The quest to build smaller, more agile micro aerial vehicles has led the research community to adopt cameras and inertial measurement units (IMUs) as the primary sensors for state estimation and autonomy. In this paper we present a monocular visual-inertial system (VINS) for an autonomous quadrotor which relies only on an inexpensive off-the-shelf camera and IMU, and describe a robust state estimator which allows the robot to execute trajectories at 2 m/s with roll and pitch angles of 20 degrees, with accelerations over 4 m/s². The main innovations in the paper are an approach to estimate the vehicle motion without initialization and a method to determine scale and metric state information in real time without encountering degeneracies.
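
A common way to recover metric scale in a monocular VINS is to compare the metric displacement integrated from the IMU with the up-to-scale displacement from vision; their ratio is the scale factor. This toy helper illustrates the ratio idea only, not the paper's degeneracy-free estimator:

```python
import math

def metric_scale(imu_disp, visual_disp):
    """Scale factor as the ratio of IMU-integrated (metric) displacement to
    the up-to-scale visual displacement, over the same time interval."""
    num = math.sqrt(sum(d * d for d in imu_disp))
    den = math.sqrt(sum(d * d for d in visual_disp))
    if den < 1e-9:
        raise ValueError("visual displacement too small to fix scale")
    return num / den
```

The guard hints at the degeneracy the paper must avoid: when the vehicle barely moves, the visual displacement carries no scale information.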


Field and Service Robotics | 2015

Inspection of Penstocks and Featureless Tunnel-like Environments Using Micro UAVs

Tolga Özaslan; Shaojie Shen; Yash Mulgaonkar; Nathan Michael; Vijay Kumar

Micro UAVs are receiving a great deal of attention in many diverse applications. In this paper, we are interested in a unique application, surveillance for maintenance of large infrastructure assets such as dams and penstocks, where the goal is to periodically inspect and map the structure to detect features that might indicate the potential for failures. The availability of architectural drawings of these structures makes the mapping problem easier. However, large structures with featureless geometry pose a significant problem, since it is difficult to design a robust localization algorithm for inspection operations. In this paper we show how a small quadrotor equipped with minimal sensors can be used for inspection of tunnel-like environments such as dam penstocks. Penstocks in particular lack features and do not provide adequate structure for robot localization, especially along the tunnel axis. We develop a Rao-Blackwellized particle-filter-based localization algorithm that uses a variant of iterative closest point (ICP) to integrate laser measurements with IMU data for short-to-medium-range pose estimation. To our knowledge, this is the only study in the literature focusing on localization and autonomous control of a UAV in 3-D, featureless tunnel-like environments. We show the success of our work with results from real experiments.
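
The hard part of the problem, localizing along the tunnel axis, can be illustrated with a toy 1-D particle filter (a simplified stand-in for the Rao-Blackwellized filter; the map function, noise levels, and measurement model are our illustrative assumptions). Here a known cross-section radius profile plays the role of the laser scan that ICP would match:

```python
import math
import random

def tunnel_pf(map_radius, motion, measurements, n=500, sigma=0.02, seed=1):
    """Toy 1-D particle filter along a tunnel axis.
    map_radius(x): expected cross-section radius at axial position x.
    motion: per-step commanded axial displacements.
    measurements: per-step measured cross-section radii."""
    rng = random.Random(seed)
    parts = [rng.uniform(0.0, 10.0) for _ in range(n)]   # uniform prior
    for u, z in zip(motion, measurements):
        # propagate each particle with the motion command plus noise
        parts = [p + u + rng.gauss(0.0, 0.02) for p in parts]
        # weight particles by how well the map explains the measurement
        w = [math.exp(-(map_radius(p) - z) ** 2 / (2 * sigma ** 2))
             for p in parts]
        if sum(w) > 0.0:
            parts = rng.choices(parts, weights=w, k=n)   # resample
    return sum(parts) / n
```

In a perfectly uniform tunnel `map_radius` is constant and the weights carry no axial information, which is exactly the degeneracy the paper addresses.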


Proceedings of SPIE | 2014

Power and weight considerations in small, agile quadrotors

Yash Mulgaonkar; Michael Whitzer; Brian Morgan; Christopher M. Kroninger; Aaron M. Harrington; Vijay Kumar

The development of autonomous Micro Aerial Vehicles (MAVs) is significantly constrained by their size, weight and power consumption. In this paper, we explore the energetics of quadrotor platforms and study the scaling of mass, inertia, lift and drag with their characteristic length. The effects of length scale on masses and inertias associated with various components are also investigated. Additionally, a study of Lithium Polymer battery performance is presented in terms of specific power and specific energy. Finally, we describe the power and energy consumption for different quadrotors and explore the dependence on size and mass for static hover tests as well as representative maneuvers.
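
A standard sanity check for such energetics studies is ideal hover power from momentum theory, P = T^{3/2} / (FoM · sqrt(2ρA)) per rotor. The helper below is our illustrative assumption (including the figure of merit of 0.7), not a result from the paper:

```python
import math

def hover_power(mass_kg, rotor_radius_m, n_rotors=4, rho=1.225, fom=0.7):
    """Estimated hover power (W) for a multirotor from momentum theory.
    rho: air density (kg/m^3); fom: rotor figure of merit (ideal power
    divided by actual power, typically 0.6-0.8 for small rotors)."""
    thrust_per_rotor = mass_kg * 9.81 / n_rotors        # N
    disk_area = math.pi * rotor_radius_m ** 2           # m^2
    p_ideal = thrust_per_rotor ** 1.5 / math.sqrt(2.0 * rho * disk_area)
    return n_rotors * p_ideal / fom                     # W, all rotors
```

The T^{3/2} dependence is why hover power grows faster than mass, a key driver of the size and endurance trade-offs the paper studies.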


Intelligent Robots and Systems | 2015

Smartphones power flying robots

Giuseppe Loianno; Yash Mulgaonkar; Chris Brunner; Dheeraj Ahuja; Arvind Ramanandan; Murali Ramaswamy Chari; Serafin Diaz; Vijay Kumar

Consumer-grade technology seen in cameras and phones has led to the price/performance ratio of sensors and processors falling dramatically over the last decade. In particular, most devices are packaged with a camera, a gyroscope, and an accelerometer, important sensors for aerial robotics. The low mass and small form factor make them particularly well suited for autonomous flight with small flying robots, especially in GPS-denied environments. In this work, we present the first fully autonomous smartphone-based quadrotor. All the computation, sensing, and control runs on an off-the-shelf smartphone, with all the software functionality in a smartphone app. We show how quadrotors can be stabilized and controlled to achieve autonomous flight in indoor buildings with application to smart homes, search and rescue, construction, and architecture. The work allows any consumer with a smartphone to autonomously fly a quadrotor robot platform, even without GPS, by downloading an app, and concurrently build 3-D maps.


IEEE Robotics & Automation Magazine | 2015

Flying Smartphones: Automated Flight Enabled by Consumer Electronics

Giuseppe Loianno; Gareth Cross; Chao Qu; Yash Mulgaonkar; Joel A. Hesch; Vijay Kumar

Consumer-grade technology seen in cameras and phones has led to the price-performance ratio falling dramatically over the last decade. We are seeing a similar trend in robots that leverage this technology. A recent development is the interest of companies such as Google, Apple, and Qualcomm in high-end communication devices equipped with such sensors as cameras and inertial measurement units (IMUs) and with significant computational capability. Google, for instance, is developing a customized phone equipped with conventional as well as depth cameras. This article explores the potential for the rapid integration of inexpensive consumer-grade electronics with off-the-shelf robotics technology for automation in homes and offices. We describe how standard hardware platforms (robots, processors, and smartphones) can be integrated through a simple software architecture to build autonomous quadrotors that can navigate and map unknown indoor environments. We show how the quadrotor can be stabilized and controlled to achieve autonomous flight and the generation of three-dimensional (3-D) maps for exploring and mapping indoor buildings with application to smart homes, search and rescue, and architecture. This opens up the possibility for any consumer to take a commercially available robot platform and a smartphone and automate the process of creating a 3-D map of his/her home or office.

Collaboration


Dive into Yash Mulgaonkar's collaboration network.

Top Co-Authors

Vijay Kumar, University of Pennsylvania
Giuseppe Loianno, University of Pennsylvania
Nathan Michael, Carnegie Mellon University
Shaojie Shen, Hong Kong University of Science and Technology
Anurag Makineni, University of Pennsylvania
Camillo J. Taylor, University of Pennsylvania
Chao Qu, University of Pennsylvania
Kartik Mohta, University of Pennsylvania
Gareth Cross, University of Pennsylvania
Ke Sun, University of Pennsylvania