Network


Latest external collaborations at the country level. Dive into the details by clicking on the dots.

Hotspot


Dive into the research topics where Inkyu Sa is active.

Publications


Featured research published by Inkyu Sa.


Intelligent Robots and Systems | 2017

Multiresolution mapping and informative path planning for UAV-based terrain monitoring

Marija Popovic; Teresa A. Vidal-Calleja; Gregory Hitz; Inkyu Sa; Roland Siegwart; Juan I. Nieto

Unmanned aerial vehicles (UAVs) can offer timely and cost-effective delivery of high-quality sensing data. However, deciding when and where to take measurements in complex environments remains an open challenge. To address this issue, we introduce a new multiresolution mapping approach for informative path planning in terrain monitoring using UAVs. Our strategy exploits the spatial correlation encoded in a Gaussian Process model as a prior for Bayesian data fusion with probabilistic sensors. This allows us to incorporate altitude-dependent sensor models for aerial imaging and perform constant-time measurement updates. The resulting maps are used to plan information-rich trajectories in continuous 3-D space through a combination of grid search and evolutionary optimization. We evaluate our framework on the application of agricultural biomass monitoring. Extensive simulations show that our planner performs better than existing methods, with mean error reductions of up to 45% compared to traditional “lawnmower” coverage. We demonstrate proof of concept using a multirotor to map color in different environments.
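
As a minimal sketch of the constant-time Bayesian map update the abstract describes, the following snippet fuses a single measurement into an independent-Gaussian terrain map with an altitude-dependent sensor noise model. The grid size, noise coefficients, and prior values are illustrative assumptions, not the paper's actual parameters, and the full method additionally maintains spatial correlations through a Gaussian Process prior.

```python
import numpy as np

def sensor_noise(altitude, a=0.05, b=0.002):
    # Assumed altitude-dependent noise model: measurement variance
    # grows with flight altitude (higher flights -> coarser data).
    return a + b * altitude**2

def fuse_measurement(mean, var, z, altitude):
    # Constant-time Gaussian (Kalman-style) fusion of one measurement
    # z into a single map cell with prior (mean, var).
    r = sensor_noise(altitude)
    k = var / (var + r)              # Kalman gain
    return mean + k * (z - mean), (1.0 - k) * var

# Terrain map as a grid of independent Gaussian cells.
grid_mean = np.full((50, 50), 0.5)   # prior biomass estimate
grid_var  = np.full((50, 50), 1.0)   # prior uncertainty

# Fuse a noisy observation of cell (10, 20) taken from 30 m altitude.
grid_mean[10, 20], grid_var[10, 20] = fuse_measurement(
    grid_mean[10, 20], grid_var[10, 20], z=0.8, altitude=30.0)
```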


International Conference on Robotics and Automation | 2017

Online informative path planning for active classification using UAVs

Marija Popovic; Gregory Hitz; Juan I. Nieto; Inkyu Sa; Roland Siegwart; Enric Galceran

In this paper, we introduce an informative path planning (IPP) framework for active classification using unmanned aerial vehicles (UAVs). Our algorithm uses a combination of global viewpoint selection and evolutionary optimization to refine the planned trajectory in continuous 3D space while satisfying dynamic constraints. Our approach is evaluated on the application of weed detection for precision agriculture. We model the presence of weeds on farmland using an occupancy grid and generate adaptive plans according to information-theoretic objectives, enabling the UAV to gather data efficiently. We validate our approach in simulation by comparing against existing methods, and study the effects of different planning strategies. Our results show that the proposed algorithm builds maps with over 50% lower entropy compared to traditional “lawnmower” coverage in the same amount of time. We demonstrate the planning scheme on a multirotor platform with different artificial farmland set-ups.
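
A sketch of the information-theoretic objective mentioned above: the planner favors trajectories that most reduce the Shannon entropy of the weed occupancy grid. The inverse sensor model probabilities below are assumed values for illustration.

```python
import numpy as np

def logodds_update(L, z_hit, p_hit=0.8, p_miss=0.3):
    # Standard log-odds occupancy update for one weed/no-weed cell;
    # p_hit and p_miss are assumed inverse sensor model values.
    p = p_hit if z_hit else p_miss
    return L + np.log(p / (1.0 - p))

def map_entropy(L):
    # Shannon entropy (bits) of an occupancy grid in log-odds form;
    # an informative plan drives this value down as fast as possible.
    p = 1.0 / (1.0 + np.exp(-L))
    p = np.clip(p, 1e-9, 1.0 - 1e-9)
    return float(np.sum(-p * np.log2(p) - (1 - p) * np.log2(1 - p)))

grid = np.zeros((40, 40))                 # log-odds 0 -> fully unknown
print(map_entropy(grid))                  # 1600 bits for a 40x40 map
grid[5, 7] = logodds_update(grid[5, 7], z_hit=True)
print(map_entropy(grid))                  # entropy drops after one measurement
```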


IEEE Robotics & Automation Magazine | 2018

Build Your Own Visual-Inertial Drone: A Cost-Effective and Open-Source Autonomous Drone

Inkyu Sa; Mina Kamel; Michael Burri; Michael Bloesch; Raghav Khanna; Marija Popovic; Juan I. Nieto; Roland Siegwart

This article describes an approach to building a cost-effective, research-grade visual-inertial (VI) odometry-aided vertical takeoff and landing (VTOL) platform. We utilize an off-the-shelf VI sensor, an onboard computer, and a quadrotor platform, all of which are factory calibrated and mass produced, thereby sharing similar hardware and sensor specifications [e.g., mass, dimensions, intrinsics and extrinsics of the camera-inertial measurement unit (IMU) system, and signal-to-noise ratio]. We then perform system calibration and identification, enabling the use of our VI odometry, multisensor fusion (MSF), and model predictive control (MPC) frameworks with off-the-shelf products. This approach partially circumvents the tedious parameter-tuning procedures required to build a full system. The complete system is extensively evaluated both indoors using a motion-capture system and outdoors using a laser tracker while performing hover and step responses and trajectory-following tasks in the presence of external wind disturbances. We achieve root-mean-square (RMS) pose errors of 0.036 m with respect to reference hover trajectories. We also conduct relatively long-distance (>180 m) experiments on a farm site, demonstrating a 0.82% drift error of the total flight distance. This article conveys the insights we acquired about the platform and sensor module and offers open-source code with tutorial documentation to the community.
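
The two headline numbers in this abstract, RMS hover error and drift as a percentage of flight distance, can be computed as below; this is a generic sketch of the metrics, not the authors' evaluation code.

```python
import numpy as np

def rms_pose_error(est, ref):
    # Root-mean-square position error between estimated and reference
    # trajectories, each an (N, 3) array of x, y, z positions.
    return float(np.sqrt(np.mean(np.sum((est - ref) ** 2, axis=1))))

def drift_percent(est, ref):
    # Final-position drift expressed as a percentage of the total
    # distance flown along the reference path.
    path_len = np.sum(np.linalg.norm(np.diff(ref, axis=0), axis=1))
    return 100.0 * float(np.linalg.norm(est[-1] - ref[-1])) / path_len
```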


International Conference on Robotics and Automation | 2017

On field radiometric calibration for multispectral cameras

Raghav Khanna; Inkyu Sa; Juan I. Nieto; Roland Siegwart

Perception systems for outdoor robotics have to deal with varying environmental conditions. Variations in illumination, in particular, are currently the biggest challenge for vision-based perception. In this paper we present an approach for the radiometric characterization of multispectral cameras. To enable spatio-temporal mapping, we also present a procedure for in-situ illumination estimation, resulting in radiometric calibration of the collected images. In contrast to current approaches, ours is purely data-driven and parameter-free, based on maximum likelihood estimation, and can be performed entirely in the field without requiring specialized laboratory equipment. Our routine requires three simple datasets which are easily acquired using most modern multispectral cameras. We evaluate the framework with a cost-effective snapshot multispectral camera. The results show that our method enables the creation of quantitatively accurate relative reflectance images from challenging on-field calibration datasets under a variety of ambient conditions.
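
The paper's data-driven routine is not reproduced here; the following is only a heavily simplified, hypothetical single-band sketch of the underlying idea: estimate an in-situ illumination scale by maximum likelihood (least squares under Gaussian noise) from pixels of a reference panel with known reflectance, then divide it out to obtain relative reflectance.

```python
import numpy as np

def estimate_illumination(dn_panel, reflectance_panel, exposure):
    # ML estimate of a per-band illumination scale s for the assumed
    # linear model dn_i = s * exposure * rho_i (least squares).
    x = np.broadcast_to(exposure * np.asarray(reflectance_panel),
                        np.shape(dn_panel)).ravel()
    d = np.asarray(dn_panel, dtype=float).ravel()
    return float(x @ d / (x @ x))

def to_relative_reflectance(dn_image, exposure, s):
    # Invert the model above to convert raw digital numbers (DNs)
    # into relative reflectance.
    return np.asarray(dn_image, dtype=float) / (s * exposure)

# Hypothetical example: three DNs observed on a 50%-reflectance panel.
s = estimate_illumination([812.0, 805.0, 820.0],
                          reflectance_panel=0.5, exposure=1e-3)
refl = to_relative_reflectance([[400.0, 950.0]], exposure=1e-3, s=s)
```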


Field and Service Robotics | 2018

Dynamic System Identification, and Control for a Cost-Effective and Open-Source Multi-rotor MAV

Inkyu Sa; Mina Kamel; Raghav Khanna; Marija Popovic; Juan I. Nieto; Roland Siegwart

This paper describes dynamic system identification and full control of a cost-effective multi-rotor micro aerial vehicle (MAV). The dynamics of the vehicle and its autopilot controllers are identified using only a built-in IMU and utilized to design a subsequent model predictive controller (MPC). Control performance is evaluated using a motion capture system while performing hover, step-response, and trajectory-following tasks in the presence of external wind disturbances. We achieve root-mean-square (RMS) errors between the reference and actual trajectory of x = 0.021 m, y = 0.016 m, z = 0.029 m, roll = 0.392°, pitch = 0.618°, and yaw = 1.087° while hovering. Although we utilize accurate state estimation provided by a motion capture system in an indoor environment, the proposed method is a non-trivial prerequisite for building any field or service aerial robot. This paper also conveys the insights we have gained about the commercial vehicle and returns them to the community through open-source code and documentation.
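
A minimal sketch of the identification step, under the common assumption that the closed-loop attitude response to each autopilot command behaves like a first-order system; the model structure and data source here are illustrative, not necessarily those used in the paper.

```python
import numpy as np

def fit_first_order(cmd, resp, dt):
    # Least-squares fit of resp[k+1] = a*resp[k] + b*cmd[k] to logged
    # data, then conversion to the continuous-time steady-state gain
    # and time constant that an MPC prediction model would use.
    A = np.column_stack([resp[:-1], cmd[:-1]])
    a, b = np.linalg.lstsq(A, resp[1:], rcond=None)[0]
    return b / (1.0 - a), -dt / np.log(a)   # (k_gain, tau)

# Recover known parameters from simulated log data (sanity check).
rng = np.random.default_rng(0)
dt, tau_true, k_true = 0.01, 0.15, 0.9
a = np.exp(-dt / tau_true)
cmd = rng.standard_normal(500)
resp = np.zeros(500)
for k in range(499):
    resp[k + 1] = a * resp[k] + k_true * (1.0 - a) * cmd[k]
print(fit_first_order(cmd, resp, dt))       # ~ (0.9, 0.15)
```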


Field and Service Robotics | 2018

Improved Tau-Guidance and Vision-Aided Navigation for Robust Autonomous Landing of UAVs

Amedeo Rodi Vetrella; Inkyu Sa; Marija Popovic; Raghav Khanna; Juan I. Nieto; Giancarmine Fasano; Domenico Accardo; Roland Siegwart

In many unmanned aerial vehicle (UAV) applications, flexible trajectory generation algorithms are required to enable high levels of autonomy for critical mission phases, such as take-off, area coverage, and landing. In this paper, we present a guidance approach which uses the improved intrinsic tau guidance theory to create spatio-temporal 4-D trajectories for a desired time-to-contact with a landing platform tracked by a visual sensor. This allows us to perform maneuvers with tunable trajectory profiles, while catering for static or non-static starting and terminating motion states. We validate our method in both simulations and real platform experiments by using rotary-wing UAVs to land on static platforms. Results show that our method achieves smooth landings within 10 cm accuracy, with easily adjustable trajectory parameters.
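
For intuition, here is a sketch of the classical constant tau-dot closure strategy that tau guidance theory builds on; the paper's improved intrinsic tau guidance additionally handles static start and end motion states, which this simplified profile does not.

```python
import numpy as np

def tau_gap_profile(x0, T, k, n=200):
    # Classical constant tau-dot strategy (tau_dot = k): the gap to
    # the platform closes as x(t) = x0 * (1 - t/T)**(1/k).  Choosing
    # k < 0.5 yields zero velocity and deceleration at contact t = T.
    t = np.linspace(0.0, T, n)
    return t, x0 * (1.0 - t / T) ** (1.0 / k)

# Close a 10 m vertical gap to a static platform in 5 s, soft contact.
t, gap = tau_gap_profile(x0=10.0, T=5.0, k=0.4)
```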


Intelligent Robots and Systems | 2017

Only look once, mining distinctive landmarks from ConvNet for visual place recognition

Fabiola Maffra; Inkyu Sa; Margarita Chli


arXiv: Robotics | 2017

Dynamic System Identification, and Control for a cost effective open-source VTOL MAV

Inkyu Sa; Mina Kamel; Raghav Khanna; Marija Popovic; Juan I. Nieto; Roland Siegwart


arXiv: Robotics | 2017

Build your own visual-inertial odometry aided cost-effective and open-source autonomous drone

Inkyu Sa; Mina Kamel; Michael Burri; Michael Bloesch; Raghav Khanna; Marija Popovic; Juan I. Nieto; Roland Siegwart


International Conference on Robotics and Automation | 2018

Introduction to the Special Issue on Precision Agricultural Robotics and Autonomous Farming Technologies

Ho Seok Ahn; Inkyu Sa; Feras Dayoub

Collaboration


Dive into Inkyu Sa's collaborations.
