Paul Y. Oh
Drexel University
Publications
Featured research published by Paul Y. Oh.
international conference on robotics and automation | 2004
William E. Green; Paul Y. Oh; Geoffrey L. Barrows
Near-earth environments are time consuming, labor intensive and possibly dangerous to safeguard. Accomplishing tasks like bomb detection, search-and-rescue and reconnaissance with aerial robots could save resources. This work describes the adoption of insect behavior and flight patterns to develop a MAV sensor suite. A prototype called CQAR: closed quarter aerial robot, which is capable of flying in and around buildings, through tunnels and in and out of caves, is used to validate the effectiveness of such a method when equipped with optic flow microsensors.
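The insect-inspired strategy above can be illustrated with a minimal sketch: optic-flow sensors report image expansion and translation rates, and a balance rule steers away from the side with stronger flow (the nearer wall). Function names, gains, and the steering convention below are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch of optic-flow-based lateral obstacle avoidance
# (illustrative; gains and conventions are assumptions).

def time_to_contact(distance_m: float, closing_speed_mps: float) -> float:
    """Tau = distance / closing speed. An optic-flow sensor estimates
    tau directly from image expansion, without knowing either term."""
    return distance_m / closing_speed_mps

def steer_command(flow_left: float, flow_right: float, gain: float = 1.0) -> float:
    """Balance strategy: turn away from the side with larger flow,
    i.e., the nearer surface. Positive output = steer right."""
    return gain * (flow_left - flow_right)

# A wall closer on the left produces stronger left-side flow,
# so the command is positive (steer right, away from the wall).
cmd = steer_command(flow_left=2.0, flow_right=0.5)
```

The same balance rule generalizes to corridor centering: when both flows are equal, the command is zero and the vehicle holds the middle of the passage.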
international conference on robotics and automation | 2001
Paul Y. Oh; K. Allen
There are many design factors and choices when mounting a vision system for robot control. Such factors may include the kinematic and dynamic characteristics of the robot's degrees of freedom (DOF), which determine what velocities and fields-of-view a camera can achieve. Another factor is that additional motion components (such as pan-tilt units) are often mounted on a robot and introduce synchronization problems. When a task does not require visually servoing every robot DOF, the designer must choose which ones to servo. Questions then arise as to what roles, if any, the remaining DOF play in the task. Without an analytical framework, the designer resorts to intuition and try-and-see implementations. This paper presents a frequency-based framework that identifies the parameters that factor into tracking. This framework gives design insight which was then used to synthesize a control law that exploits the kinematic and dynamic attributes of each DOF. The resulting multi-input multi-output control law, which we call partitioning, defines an underlying joint coupling to servo camera motions. The net effect is that by employing both visual and kinematic feedback loops, a robot can quickly position and orient a camera in a large assembly workcell. Real-time experiments tracking people and robot hands are presented using a 5-DOF hybrid (3-DOF Cartesian gantry plus 2-DOF pan-tilt unit) robot.
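The partitioning idea can be sketched as a frequency split of the tracking error: the slow, long-travel DOF (the gantry) follows a low-pass-filtered component, while the fast, limited-travel DOF (the pan-tilt unit) absorbs the high-frequency residual. The filter constant and the single-axis simplification below are illustrative assumptions, not the paper's control law.

```python
# Hedged sketch of frequency-partitioned servoing on one axis
# (filter constant and structure are illustrative assumptions).

class PartitionedServo:
    def __init__(self, alpha: float = 0.1):
        self.alpha = alpha          # low-pass smoothing factor in (0, 1)
        self.slow_estimate = 0.0    # low-frequency error component

    def step(self, error: float):
        # Exponential low-pass: the gantry chases this slow component.
        self.slow_estimate += self.alpha * (error - self.slow_estimate)
        gantry_cmd = self.slow_estimate
        # The pan-tilt covers whatever the gantry has not yet absorbed.
        pan_tilt_cmd = error - self.slow_estimate
        return gantry_cmd, pan_tilt_cmd

servo = PartitionedServo()
# On a step error the fast pan-tilt takes most of the correction at
# first; the gantry catches up over subsequent iterations.
g, p = servo.step(1.0)
```

The two commands always sum to the full error, so the camera line-of-sight is corrected immediately while the coarse axes converge at their own bandwidth.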
Journal of Intelligent and Robotic Systems | 2013
Matko Orsag; Christopher Korpela; Paul Y. Oh
Compared to autonomous ground vehicles, UAVs (unmanned aerial vehicles) have significant mobility advantages and the potential to operate in otherwise unreachable locations. Micro UAVs still suffer from one major drawback: they do not have the necessary payload capabilities to support high performance arms. This paper, however, investigates the key challenges in controlling a mobile manipulating UAV using a commercially available aircraft and a light-weight prototype 3-arm manipulator. Because of the overall instability of rotorcraft, we use a motion capture system to build an efficient autopilot. Our results indicate that we can accurately model and control our prototype system given significant disturbances when both moving the manipulators and interacting with the ground.
Journal of Intelligent and Robotic Systems | 2012
Christopher Korpela; Todd W. Danko; Paul Y. Oh
Given significant mobility advantages, UAVs have access to many locations that would be impossible for an unmanned ground vehicle to reach, but UAV research has historically focused on avoiding interactions with the environment. Recent advances in UAV size to payload and manipulator weight to payload ratios suggest the possibility of integration in the near future, opening the door to UAVs that can interact with their environment by manipulating objects. Therefore, we seek to investigate and develop the tools that will be necessary to perform manipulation tasks when this becomes a reality. We present our progress and results toward a design and physical system to emulate mobile manipulation by an unmanned aerial vehicle with dexterous arms and end effectors. To emulate the UAV, we utilize a six degree-of-freedom miniature gantry crane that provides the complete range of motion of a rotorcraft as well as ground truth information without the risk associated with free flight. Two four degree-of-freedom manipulators attached to the gantry system perform grasping tasks. Computer vision techniques and force feedback servoing provide target object and manipulator position feedback to the control hardware. To test and simulate our system, we leverage the OpenRAVE virtual environment and ROS software architecture. Because rotorcraft are inherently unstable, introduce ground effects, and experience changing flight dynamics under external loads, we seek to address the difficult task of maintaining a stable UAV platform while interacting with objects using multiple, dexterous arms. As a first step toward that goal, this paper describes the design of a system to emulate a flying, dexterous mobile manipulator.
international conference on advanced intelligent mechatronics | 2005
William E. Green; Paul Y. Oh
Near-earth environments, such as forests, caves, tunnels, and urban structures make reconnaissance, surveillance and search-and-rescue missions difficult and dangerous to accomplish. Micro-air-vehicles (MAVs), equipped with wireless cameras, can assist in such missions by providing real-time situational awareness. This paper describes an additional flight modality enabling fixed-wing MAVs to supplement existing endurance superiority with hovering capabilities. This secondary flight mode can also be used to avoid imminent collisions by quickly transitioning from cruise to hover flight. A sensor suite which allows for autonomous hovering by regulating the aircraft's yaw, pitch and roll angles is also described.
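Regulating the three attitude angles for hover is commonly done with an independent feedback loop per axis; a PID loop is one standard choice. The gains, loop rate, and single-axis structure below are illustrative assumptions, not the paper's flight controller.

```python
# Hedged sketch: one PID attitude loop of the kind that could regulate
# yaw, pitch, or roll in hover (gains and 50 Hz rate are assumptions).

class PID:
    def __init__(self, kp: float, ki: float, kd: float, dt: float):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint: float, measured: float) -> float:
        error = setpoint - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

dt = 0.02  # 50 Hz control loop
roll_pid = PID(kp=2.0, ki=0.1, kd=0.5, dt=dt)

# In hover the roll setpoint is 0 deg; a disturbance to +5 deg yields a
# negative (corrective) control-surface deflection.
cmd = roll_pid.update(setpoint=0.0, measured=5.0)
```

One such loop per axis, fed by the IMU's angle estimates, is enough to hold a tail-sitting fixed-wing aircraft upright against small disturbances.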
IEEE Robotics & Automation Magazine | 2008
William E. Green; Paul Y. Oh
Flying in and around caves, tunnels, and buildings demands more than one sensing modality. This article presented an optic-flow-based approach inspired by flying insects for avoiding lateral collisions. However, there were a few real-world scenarios in which optic flow sensing failed. This occurred when obstacles on approach were directly in front of the aircraft. Here, a simple sonar or infrared sensor can be used to trigger a quick transition into the hovering mode to avoid the otherwise fatal collision. Toward this end, we have demonstrated a fixed-wing prototype capable of manually transitioning from conventional cruise flight into the hovering mode. The prototype was then equipped with an IMU and a flight control system to automate the hovering process. The next step in this research is to automate the transition from cruise to hover flight.
international conference on robotics and automation | 2006
William E. Green; Paul Y. Oh
Recently, there has been a need to acquire intelligence in hostile or dangerous environments such as caves, forests, or urban areas. Rather than risking human life, backpackable, bird-sized aircraft, equipped with a wireless camera, can be rapidly deployed to gather reconnaissance in such environments. However, they first must be designed to fly in tight, cluttered terrain. This paper discusses an additional flight modality for a fixed-wing aircraft, enabling it to supplement existing endurance superiority with hovering capabilities. An inertial measurement sensor and an onboard processing and control unit, used to achieve autonomous hovering, are also described. This is, to the best of our knowledge, the first documented success of hovering a fixed-wing micro air vehicle autonomously.
ASME 2003 International Mechanical Engineering Congress and Exposition | 2003
William E. Green; Paul Y. Oh; Keith W. Sevcik; Geoffrey L. Barrows
Urban environments are time consuming, labor intensive and possibly dangerous to safeguard. Accomplishing tasks like bomb detection, search-and-rescue and reconnaissance with aerial robots could save resources. This paper describes a prototype called CQAR: Closed Quarter Aerial Robot, which is capable of flying in and around buildings. The prototype was analytically designed to fly safely and slowly. An optic flow microsensor for depth perception, which will allow autonomous takeoff and landing and collision avoidance, is also described.
international conference on robotics and automation | 1997
Peter K. Allen; Andrew T. Miller; Paul Y. Oh; Brian S. Leibowitz
Most robotic hands are either sensorless or lack the ability to accurately and robustly report position and force information relating to contact. This paper describes a robotic hand system that uses a limited set of native-joint position and force sensing along with custom-designed tactile sensors and real-time vision modules to accurately compute finger contacts and applied forces for grasping tasks. Three experiments are described: integration of real-time visual trackers in conjunction with internal strain gauge sensing to correctly localize and compute finger forces, determination of contact points on the inner and outer links of a finger through tactile sensing and visual sensing, and determination of vertical displacement by tactile sensing for a grasping task.
international conference on robotics and automation | 2013
Christopher Korpela; Matko Orsag; Miles Pekala; Paul Y. Oh
This paper presents a control scheme to achieve dynamic stability in an aerial vehicle with dual multi-degree of freedom manipulators. Arm movements assist with stability and recovery for ground robots, in particular humanoids and dynamically balancing vehicles. However, there is little work in aerial robotics where the manipulators themselves facilitate flight stability or the load mass is repositioned in flight for added control. We present recent results in arm motions that achieve increased flight stability without and with different load masses attached to the end-effectors. Our test flight results indicate that we can accurately model and control our aerial vehicle when both moving the manipulators and interacting with target objects.
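The load-repositioning idea above rests on simple statics: moving arm masses shifts the vehicle's center of gravity, and the flight controller must supply (or can exploit) the resulting moment. The masses, geometry, and point-mass model below are illustrative assumptions, not the paper's values.

```python
# Hedged sketch: CG shift from repositioned arm masses and the trim
# moment needed to hover level (all numbers are illustrative).

def cg_offset(arm_masses, arm_positions, body_mass):
    """Horizontal CG offset (m) of body + point-mass arms, taking the
    body's own CG as the origin."""
    moment = sum(m * x for m, x in zip(arm_masses, arm_positions))
    total = body_mass + sum(arm_masses)
    return moment / total

def pitch_trim_moment(offset_m, total_mass, g=9.81):
    """Moment (N*m) the rotors must supply to hover level despite the
    CG offset."""
    return total_mass * g * offset_m

# Two 0.2 kg end-effector loads moved 0.1 m forward on a 2 kg craft:
off = cg_offset([0.2, 0.2], [0.1, 0.1], body_mass=2.0)
trim = pitch_trim_moment(off, total_mass=2.4)
```

Run in reverse, the same relation shows how deliberately repositioning the arms generates a control moment, which is the stabilizing effect the paper exploits.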