
Publication


Featured research published by Hyukseong Kwon.


international conference on robotics and automation | 2005

Person Tracking with a Mobile Robot using Two Uncalibrated Independently Moving Cameras

Hyukseong Kwon; Youngrock Yoon; Jae Byung Park; Avinash C. Kak

This paper presents an efficient person tracking algorithm for a vision-based mobile robot using two independently moving cameras each of which is mounted on its own pan/tilt unit. Without calibrating these cameras, the goal of our proposed method is to estimate the distance to a target appearing in the image sequences captured by the cameras. The main contributions of our approach include: 1) establishing the correspondence between the control inputs to the pan/tilt units and the pixel displacement in the image plane without using the intrinsic parameters of the cameras; and 2) derivation of the distance information from the correspondence between the centers of masses of the segmented color-blobs from the left and the right images without stereo camera calibration. Our proposed approach has been successfully tested on a mobile robot for the task of person following in real environments.
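The distance recovery described above can be illustrated with a simpler, hypothetical vergence geometry: if each pan/tilt unit pans until the target is centered in its image, the two pan angles and the baseline triangulate the range. This is a textbook sketch under a symmetric, vertical-axis assumption, not the paper's uncalibrated formulation; all names are illustrative.

```python
import math

def vergence_distance(baseline_m, pan_left_rad, pan_right_rad):
    """Distance to a target centered in both views by panning.

    Assumes both pan axes are vertical, separated by `baseline_m`,
    and pan angles are measured from the forward direction
    (positive toward the other camera). A textbook triangulation
    sketch, not the paper's uncalibrated method.
    """
    # The two optical rays and the baseline form a triangle;
    # apply the law of sines to get the range, then project to depth.
    alpha = math.pi / 2 - pan_left_rad   # interior angle at left camera
    beta = math.pi / 2 - pan_right_rad   # interior angle at right camera
    gamma = math.pi - alpha - beta       # angle at the target
    range_left = baseline_m * math.sin(beta) / math.sin(gamma)
    return range_left * math.sin(alpha)  # perpendicular distance to baseline

# Example: 30 cm baseline, both cameras panned inward by 5 degrees.
d = vergence_distance(0.3, math.radians(5), math.radians(5))
```

The law-of-sines form also covers asymmetric pan angles, which matter when the target sits well off to one side of the robot.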


Robotics and Autonomous Systems | 2013

Building 3D visual maps of interior space with a new hierarchical sensor-fusion architecture

Hyukseong Kwon; Khalil Mustafa Ahmad Yousef; Avinash C. Kak

It is now generally recognized that sensor-fusion is the best approach to the accurate construction of environment maps by a sensor-equipped mobile robot. Typically, range data collected with a range sensor is combined with the reflectance data obtained from one or more cameras mounted on the robot. In much of the past work on sensor fusion in hierarchical approaches to map construction, the fusion was carried out only at the lowest level of the hierarchy. As a result, in those approaches, only the fused data was made available to the higher levels in the hierarchy. This implied that any errors caused by sensor fusion would propagate upwards into the higher level representations of an interior map. Our work, on the other hand, checks for consistency between the data elements produced by the different sensors at all levels of the hierarchy. This consistency checking is carried out with the help of an interval-based representation of uncertainties in the sensor data. In addition to demonstrating that our approach to the fusion of range and image data results in dense 3D maps of the interior space, we also provide validation of our overall framework by presenting a set of loop closure results. These results demonstrate that our overall errors in the maps remain small (within 0.91% of the distance traveled for map construction) even when the robot has to traverse over large loops inside a building.
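The interval-based consistency idea can be sketched in miniature: treat each sensor's estimate of a quantity as an interval, declare the sensors consistent when the intervals overlap, and fuse by intersection. A minimal illustration, not the paper's hierarchical architecture:

```python
def consistent(a, b):
    """Two interval estimates (lo, hi) are consistent if they overlap."""
    return max(a[0], b[0]) <= min(a[1], b[1])

def fuse(a, b):
    """Fuse two consistent intervals by intersection; None if inconsistent."""
    lo, hi = max(a[0], b[0]), min(a[1], b[1])
    return (lo, hi) if lo <= hi else None

# A range reading of 2.0 m +/- 0.1 m against a vision estimate of 2.05 m +/- 0.08 m:
fused = fuse((1.9, 2.1), (1.97, 2.13))   # -> (1.97, 2.1)
```

Because an empty intersection is detectable at any level, an inconsistency can be flagged before it propagates into higher-level map representations.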


international conference on robotics and automation | 2007

A New Approach for Active Stereo Camera Calibration

Hyukseong Kwon; Johnny Park; Avinash C. Kak

By active stereo we mean a stereo vision system that allows for independent panning and tilting for each of the two cameras. One advantage of active stereo in relation to regular stereo is the former's wider effective field of view; if an object is too close to the camera baseline, the depth to the object can still be estimated accurately by panning the cameras appropriately. Another advantage of active stereo is that it can yield a larger number of depth measurements simultaneously for each position of the platform on which the camera system is mounted. Panning and tilting over a large angular range, while being the main reason for the advantages of active stereo, also make it more challenging to calibrate such systems. For a calibration procedure to be effective for active stereo, the estimated parameters must be valid over the entire range of the pan and tilt angles. This paper presents a new approach to the calibration of such vision systems. Our method is based on the rationale that an active stereo calibration procedure must explicitly estimate the locations and the orientations of the pan and tilt rotation axes for the cameras through a closed-form solution. When these estimates for the axes are combined with the homogeneous transform relationships that link the various coordinate frames, we end up with a calibration that is valid over a large variation in the pan and tilt angles.
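The role of the estimated pan/tilt axes can be illustrated with Rodrigues' rotation formula: once an axis's location and direction are known, any camera-fixed point at a given pan angle is predicted by rotating it about that axis in space. A generic sketch (the paper's closed-form axis estimation itself is not reproduced):

```python
import numpy as np

def rotate_about_axis(p, axis_point, axis_dir, theta):
    """Rotate point p by theta about an arbitrary axis in space.

    axis_point: any point on the axis; axis_dir: direction vector.
    Rodrigues' rotation formula applied to the axis-relative vector.
    """
    k = np.asarray(axis_dir, float)
    k = k / np.linalg.norm(k)
    v = np.asarray(p, float) - np.asarray(axis_point, float)
    v_rot = (v * np.cos(theta)
             + np.cross(k, v) * np.sin(theta)
             + k * np.dot(k, v) * (1 - np.cos(theta)))
    return v_rot + np.asarray(axis_point, float)

# A point 1 m in front of a vertical pan axis through the origin, panned 90 degrees:
p = rotate_about_axis([1.0, 0.0, 0.0], [0, 0, 0], [0, 0, 1], np.pi / 2)
```

Chaining such axis rotations with the fixed homogeneous transforms between frames is what lets a single calibration remain valid across the whole pan/tilt range.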


intelligent robots and systems | 2013

Optimal path planning of a target-following fixed-wing UAV using sequential decision processes

Stanley S. Baek; Hyukseong Kwon; Josiah Yoder; Daniel J. Pack

In this work, we consider the optimal path of a fixed-wing unmanned aerial vehicle (UAV) tracking a mobile surface target. One of the limitations of fixed-wing UAVs in tracking mobile targets is the lack of hovering capability: when the target moves much slower than the minimum UAV speed, the UAV must maintain an orbit about the target. In this paper, we propose a method to find the optimal policy for fixed-wing UAVs to minimize the location uncertainty of a mobile target. Using a grid-based Markov Decision Process (MDP), we use an off-line policy iteration algorithm to find an optimal UAV path in a coarsely discretized state space, followed by an on-line policy iteration algorithm that applies a finer-grid MDP to the region of interest to find the final optimal UAV trajectory. We validate the proposed algorithm using computer simulations. Comparing the simulation results with other methods, we show that the proposed method yields up to a 13% decrease in error uncertainty relative to conventional methods.
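The grid-based MDP step can be sketched with textbook policy iteration, here on a toy deterministic grid whose reward is reaching a target cell; the paper's coarse-to-fine refinement and uncertainty-based reward are omitted, and all names are illustrative.

```python
import itertools

def policy_iteration(n, target, gamma=0.9, tol=1e-6):
    """Policy iteration on an n x n grid: steer toward `target`.

    Toy deterministic MDP standing in for the coarse grid;
    reward 1 on entering the target cell, 0 elsewhere.
    """
    states = list(itertools.product(range(n), range(n)))
    actions = [(-1, 0), (1, 0), (0, -1), (0, 1)]

    def step(s, a):
        # Deterministic move, clamped at the grid boundary.
        s2 = (min(max(s[0] + a[0], 0), n - 1),
              min(max(s[1] + a[1], 0), n - 1))
        return s2, 1.0 if s2 == target else 0.0

    policy = {s: actions[0] for s in states}
    V = {s: 0.0 for s in states}
    while True:
        # Policy evaluation: iterative sweeps until values settle.
        while True:
            delta = 0.0
            for s in states:
                s2, r = step(s, policy[s])
                v = r + gamma * V[s2]
                delta = max(delta, abs(v - V[s]))
                V[s] = v
            if delta < tol:
                break
        # Policy improvement: act greedily on the evaluated values.
        stable = True
        for s in states:
            best = max(actions,
                       key=lambda a: step(s, a)[1] + gamma * V[step(s, a)[0]])
            if best != policy[s]:
                policy[s] = best
                stable = False
        if stable:
            return policy, V

policy, V = policy_iteration(5, target=(4, 4))
```

The coarse-to-fine idea then amounts to running the same loop again on a finer grid restricted to the region the coarse policy selects.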


Journal of Intelligent and Robotic Systems | 2012

A Robust Mobile Target Localization Method for Cooperative Unmanned Aerial Vehicles Using Sensor Fusion Quality

Hyukseong Kwon; Daniel J. Pack

One of the current unmanned systems research areas at the US Air Force Academy is finding robust methods to locate ground mobile targets using multiple, cooperative unmanned aerial vehicles (UAVs). In our previous work (Plett et al., Lect Notes Control Inf Sci 369:22–44, 2007), we showed an effective method to search, detect, and localize static ground targets. The current focus of our research is to extend the method to handle mobile ground targets. To that end, we introduced a novel concept called Sensor Fusion Quality (SFQ) in Kwon and Pack (2011). In this paper, we adapt and incorporate the SFQ principle to include both static and mobile ground targets in a modified Out-of-Order Sigma Point Kalman Filtering (O3SPKF) approach (Plett et al., Lect Notes Control Inf Sci 369:22–44, 2007). The proposed method uses augmented covariances of sigma points to compute SFQ values. This approach showed performance superior to that observed when either the SFQ method or the O3SPKF method was used alone. The added benefit of the integrated approach is the reduction of computational complexity associated with the propagation updates of target state uncertainties. We validate the proposed method using both simulation and flight experiment results.
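The sigma-point machinery underlying the O3SPKF can be illustrated with the standard unscented transform, which propagates a Gaussian through a nonlinear function via deterministically chosen sample points; the out-of-order measurement handling and SFQ weighting are beyond this sketch.

```python
import numpy as np

def unscented_transform(mean, cov, f, alpha=1e-3, beta=2.0, kappa=0.0):
    """Propagate a Gaussian (mean, cov) through a nonlinear f via sigma points.

    Standard unscented transform with the usual scaling parameters;
    the O3SPKF additionally handles time-delayed measurements,
    which this sketch omits.
    """
    n = len(mean)
    lam = alpha**2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * cov)   # matrix square root, scaled
    # 2n + 1 sigma points: the mean plus symmetric perturbations.
    pts = ([mean]
           + [mean + S[:, i] for i in range(n)]
           + [mean - S[:, i] for i in range(n)])
    wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))
    wc = wm.copy()
    wm[0] = lam / (n + lam)
    wc[0] = wm[0] + (1 - alpha**2 + beta)
    ys = np.array([f(p) for p in pts])
    y_mean = wm @ ys
    diffs = ys - y_mean
    y_cov = (wc[:, None] * diffs).T @ diffs
    return y_mean, y_cov

# Sanity check: a linear (identity) function preserves mean and covariance.
m, P = unscented_transform(np.array([1.0, 2.0]), np.eye(2), lambda x: x)
```

For linear functions the transform is exact, which is a convenient way to unit-test a sigma-point filter before plugging in a nonlinear measurement model.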


intelligent robots and systems | 2010

Cooperative target localization using heterogeneous unmanned ground and aerial vehicles

Chad Hager; Dimitri Zarzhitsky; Hyukseong Kwon; Daniel J. Pack

This paper describes our on-going efforts toward developing heterogeneous, cooperative systems technologies. In particular, we present the role of unmanned mobile ground systems (robots) in a heterogeneous sensor network, consisting of two unmanned aircraft, a mobile ground robot, and a set of four stationary ground sensors, performing an intelligence, surveillance, and reconnaissance (ISR) mission. The unmanned mobile ground robot is equipped with an infrared (IR) sensor, while the aircraft and the stationary ground sensors use optical cameras and radio frequency (RF) detectors, respectively. The primary responsibility of the mobile ground robot is to verify the identity of a target based on its IR signature. In addition, the mobile ground robot also assists with the sensor network's overall target localization efforts by sharing its IR sensor-based target location measurements with members of the sensor network. Our analysis and field experiments demonstrated the scalability and effectiveness of our approach.


international conference on unmanned aircraft systems | 2013

Maximizing target detection under sunlight reflection on water surfaces with an autonomous unmanned aerial vehicle

Hyukseong Kwon; Josiah Yoder; Stanley S. Baek; Scott Gruber; Daniel J. Pack

Reflected sunlight can significantly impact vision-based object detection and tracking algorithms, especially ones based on an aerial platform operating over a marine environment. Unmanned aerial systems above a water surface may be unable to detect objects on the water surface due to sunlight glitter. Although the area affected by sunlight reflection may be limited, rapid course corrections of unmanned aerial vehicles (UAVs), especially fixed-wing UAVs, are also limited by aerodynamics, making it challenging to determine a reasonable path that avoids sunlight reflection while maximizing chances to capture a target. In this paper, we propose an approach for autonomous UAV path planning that maximizes the accuracy of the estimated target location by minimizing the influence of sunlight reflection.
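The reflection constraint can be sketched with basic specular geometry: glint is strongest where the mirror reflection of the sun direction about the water-surface normal nearly coincides with the camera's viewing ray, so a planner can penalize viewpoints with a small glint angle. A hypothetical helper, not the paper's model:

```python
import numpy as np

def glint_angle(sun_dir, view_dir, normal=(0.0, 0.0, 1.0)):
    """Angle between the specular sun reflection and the camera ray.

    sun_dir: unit vector from the surface toward the sun.
    view_dir: unit vector from the surface toward the camera.
    Small angles mean the camera is looking into the glitter pattern.
    """
    n = np.asarray(normal, float)
    s = np.asarray(sun_dir, float)
    v = np.asarray(view_dir, float)
    r = 2 * np.dot(s, n) * n - s            # sun direction mirrored about the normal
    cos_a = np.clip(np.dot(r, v), -1.0, 1.0)
    return np.arccos(cos_a)

# Sun 45 degrees up in the x-z plane; camera placed exactly on the mirror direction:
sun = np.array([1.0, 0.0, 1.0]) / np.sqrt(2)
cam = np.array([-1.0, 0.0, 1.0]) / np.sqrt(2)
a = glint_angle(sun, cam)   # near zero: the camera stares into the glint
```

A path planner could treat candidate waypoints with a glint angle below some threshold as high-cost, steering the UAV around the affected viewing geometry.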


international conference on robotics and automation | 2012

UAV vision: Feature based accurate ground target localization through propagated initializations and interframe homographies

Kyuseo Han; Chad Aeschliman; Johnny Park; Avinash C. Kak; Hyukseong Kwon; Daniel J. Pack

Our work presents solutions to two related vexing problems in feature-based localization of ground targets in Unmanned Aerial Vehicle (UAV) images: (i) a good initial guess at the pose estimate that would speed up the convergence to the final pose estimate for each image frame in a video sequence; and (ii) time-bounded estimation of the position of the ground target. We address both these problems within the framework of the Iterative Closest Point (ICP) algorithm that now has a rich tradition of usage in computer vision and robotics applications. We solve the first of the two problems by frame-to-frame propagation of the computed pose estimates for the purpose of the initializations needed by ICP. The second problem is solved by terminating the iterative estimation process at the expiration of the available time for each image frame. We show that when frame-to-frame homography is factored into the iterative calculations, the accuracy of the position calculated at the time of bailing out of the iterations is nearly always sufficient for the goals of UAV vision.
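The time-bounded ICP idea can be sketched as an iteration loop that starts from the previous frame's propagated pose and simply bails out when the per-frame budget expires; the homography factoring described above is omitted, and the 2D point-set setting and names are illustrative.

```python
import time
import numpy as np

def icp_2d(src, dst, init=None, deadline_s=0.02, max_iter=50):
    """Time-bounded 2D point-set ICP with a propagated initial pose.

    src, dst: (N, 2) arrays. init: 3x3 homogeneous transform carried
    over from the previous frame (identity if None). Iterations bail
    out at the deadline, returning the best estimate so far.
    """
    T = np.eye(3) if init is None else init.copy()
    start = time.monotonic()
    for _ in range(max_iter):
        if time.monotonic() - start > deadline_s:
            break  # out of time for this frame: return current estimate
        # Apply the current transform and match each source point to
        # its nearest destination point (brute force for clarity).
        moved = src @ T[:2, :2].T + T[:2, 2]
        d2 = ((moved[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
        matched = dst[d2.argmin(axis=1)]
        # Closed-form rigid alignment of the matched pairs (SVD/Procrustes).
        mu_s, mu_d = moved.mean(0), matched.mean(0)
        H = (moved - mu_s).T @ (matched - mu_d)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:          # guard against reflections
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = mu_d - R @ mu_s
        step = np.eye(3)
        step[:2, :2], step[:2, 2] = R, t
        T = step @ T
    return T
```

With a good propagated initialization, the loop typically converges well inside the budget, so the deadline only bites on hard frames.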


Infotech@Aerospace 2012 | 2012

Robust Tracking of Nonlinear Target Motion Using Out-Of-Order Sigma Point Kalman Filters

Hyukseong Kwon; Daniel J. Pack

Over the past decade, the tasks of autonomous localization and tracking of mobile ground targets using cooperative, multiple small unmanned aerial vehicles (UAVs) have been gaining an increasing amount of interest among researchers in the UAV community. Robust solutions have been elusive due to a number of challenges including sudden, unpredictable route changes of targets; inaccurate computation of small platform attitudes; and limited sensor fields of view. In our previous works [1, 2] we demonstrated the merits of the Out-Of-Order Sigma-Point Kalman Filter (O3SPKF) and the concept of Sensor Fusion Quality (SFQ) as multiple UAVs effectively tracked and located mobile ground targets moving in a linear fashion. We showed that the O3SPKF enables UAVs to optimally incorporate randomly time-delayed sensor information, also called out-of-order data, while the SFQ technique removes invalid sensor data, enhancing the target localization accuracy for linearly moving targets. In this paper we present an improved O3SPKF/SFQ method that uses both linear and circular target tracking models to geo-locate mobile targets with non-linear motions.
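The circular target-tracking model mentioned above is commonly written as a constant-turn-rate propagation, with straight-line motion recovered as its zero-turn-rate limit; a generic sketch with illustrative names:

```python
import math

def circular_motion_step(x, y, heading, speed, turn_rate, dt):
    """Propagate a target one time step under a constant-turn-rate model.

    One of the two motion hypotheses (the other being straight-line
    motion, recovered here as the turn_rate -> 0 limit).
    """
    if abs(turn_rate) < 1e-9:
        # Straight-line limit.
        return (x + speed * dt * math.cos(heading),
                y + speed * dt * math.sin(heading),
                heading)
    # Integrate position along a circular arc of radius speed / turn_rate.
    x2 = x + speed / turn_rate * (math.sin(heading + turn_rate * dt) - math.sin(heading))
    y2 = y + speed / turn_rate * (math.cos(heading) - math.cos(heading + turn_rate * dt))
    return x2, y2, heading + turn_rate * dt

# Quarter circle: 1 m/s, turn rate pi/2 rad/s for 1 s, starting east-bound.
x, y, h = circular_motion_step(0.0, 0.0, 0.0, 1.0, math.pi / 2, 1.0)
```

Running both hypotheses through the filter and weighting them by measurement fit is one standard way to cover targets that alternate between straight segments and turns.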


Archive | 2015

UAV handbook: Payload design of small UAVs

Scott Gruber; Hyukseong Kwon; Chad Hager; Rajnikant Sharma; Josiah Yoder; Daniel J. Pack

This chapter discusses the payload design issues for small unmanned aerial vehicles (UAVs). Details of several payload design principles to overcome the stringent weight, power, and volume constraints of small UAVs are discussed. Throughout the chapter, the efficacy of these principles is demonstrated with the help of the payloads for a fixed-wing UAV developed by the Academy Center for Unmanned Aircraft Systems Research at the United States Air Force Academy.

Collaboration


Dive into Hyukseong Kwon's collaborations.

Top Co-Authors

Daniel J. Pack, University of Tennessee at Chattanooga
Josiah Yoder, United States Air Force Academy
Scott Gruber, United States Air Force Academy
Chad Hager, United States Air Force Academy
Dimitri Zarzhitsky, United States Air Force Academy