Publications


Featured research published by David C. Conner.


Journal of Field Robotics | 2015

Human-robot Teaming for Rescue Missions: Team ViGIR's Approach to the 2013 DARPA Robotics Challenge Trials

Stefan Kohlbrecher; Alberto Romay; Alexander Stumpf; Anant Gupta; Oskar von Stryk; Felipe Bacim; Doug A. Bowman; Alex K. Goins; Ravi Balasubramanian; David C. Conner

Team ViGIR entered the 2013 DARPA Robotics Challenge (DRC) with a focus on developing software to enable an operator to guide a humanoid robot through the series of challenge tasks emulating disaster response scenarios. The overarching philosophy was to make our operators full team members and not mere supervisors. We designed our operator control station (OCS) to allow multiple operators to request and share information as needed to maintain situational awareness under bandwidth constraints, while directing the robot to perform tasks with most planning and control taking place onboard the robot. Given the limited development time, we leveraged a number of open source libraries in both our onboard software and our OCS design; this included significant use of the Robot Operating System (ROS) libraries and toolchain. This paper describes the high-level approach, including the OCS design and major onboard components, and presents our DRC Trials results. The paper concludes with a number of lessons learned that are being applied to the final phase of the competition and are useful for related projects as well.


IEEE-RAS International Conference on Humanoid Robots | 2014

Supervised footstep planning for humanoid robots in rough terrain tasks using a black box walking controller

Alexander Stumpf; Stefan Kohlbrecher; David C. Conner; Oskar von Stryk

In recent years, the number of life-size humanoids as well as their mobility capabilities have steadily grown. Stable walking motion and control for humanoid robots are already well-investigated research topics. This raises the question of how navigation problems in complex and unstructured environments can be solved using a given black box walking controller, provided proper perception and modeling of the environment. In this paper we present a complete system for supervised footstep planning, including perception, world modeling, a 3D planner and an operator interface, that enables a humanoid robot to perform sequences of steps to traverse uneven terrain. A height map and surface normal estimates are obtained directly from point cloud data. A search-based planning approach (ARA*) is extended to sequences of footsteps in full 3D space (6 DoF). The planner utilizes a black box walking controller without knowledge of its implementation details. Results are presented for an Atlas humanoid robot during Team ViGIR's participation in the 2013 DARPA Robotics Challenge Trials.
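
As a rough illustration of the search-based planning idea described above, the sketch below runs a plain A* search over a simplified (x, y, yaw) footstep state space; the step set, costs, and terrain check are placeholder assumptions, not the paper's 6-DoF ARA* planner or its height-map model.

```python
# Illustrative A*-style footstep search over (x, y, yaw) states.
# Simplification of the approach above: planar grid plus yaw instead of
# full 6-DoF steps, plain A* instead of ARA*, and a stub terrain check
# standing in for the height-map / surface-normal model.
import heapq
import math

# Relative steps (forward, lateral, yaw change) the "walking controller" accepts.
STEP_SET = [(0.25, 0.0, 0.0), (0.15, 0.0, 0.3), (0.15, 0.0, -0.3), (0.10, 0.0, 0.0)]

def traversable(x, y):
    # Placeholder for the height-map / surface-normal validity test.
    return abs(x) < 5.0 and abs(y) < 5.0

def successors(state):
    x, y, yaw = state
    for dx, dy, dyaw in STEP_SET:
        nx = x + math.cos(yaw) * dx - math.sin(yaw) * dy
        ny = y + math.sin(yaw) * dx + math.cos(yaw) * dy
        if traversable(nx, ny):
            yield (round(nx, 2), round(ny, 2), round(yaw + dyaw, 2)), math.hypot(dx, dy)

def heuristic(state, goal):
    return math.hypot(goal[0] - state[0], goal[1] - state[1])

def plan(start, goal, tol=0.2):
    open_set = [(heuristic(start, goal), 0.0, start, [start])]
    closed = set()
    while open_set:
        _, cost, state, path = heapq.heappop(open_set)
        if heuristic(state, goal) < tol:
            return path                      # sequence of footstep states
        if state in closed:
            continue
        closed.add(state)
        for nxt, step_cost in successors(state):
            if nxt not in closed:
                g = cost + step_cost
                heapq.heappush(open_set, (g + heuristic(nxt, goal), g, nxt, path + [nxt]))
    return None

print(plan((0.0, 0.0, 0.0), (1.0, 0.0, 0.0)))
```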


IEEE-RAS International Conference on Humanoid Robots | 2014

Template-based manipulation in unstructured environments for supervised semi-autonomous humanoid robots

Alberto Romay; Stefan Kohlbrecher; David C. Conner; Alexander Stumpf; Oskar von Stryk

Humanoid robotic manipulation in unstructured environments is a challenging problem. Limited perception, communications and environmental constraints present challenges that prevent fully autonomous or purely teleoperated robots from reliably interacting with their environment. In order to achieve higher reliability in manipulation, we present an approach involving remote human supervision. Strengths of both the human operator and the humanoid robot are leveraged through a user interface that allows the operator to perceive the remote environment through an aggregated world model based on onboard sensing, while the robot can efficiently receive perceptual and semantic information from the operator. A template-based manipulation approach has been successfully applied to the Atlas humanoid robot; we show real-world footage of the results obtained in the DARPA Robotics Challenge Trials 2013.
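
To make the template idea concrete, here is a minimal sketch of what an object template could look like as a data structure: grasp and affordance poses stored in the template frame, so that once the operator aligns the template with the sensed object, world-frame manipulation targets follow from a single pose composition. All names and fields are illustrative assumptions, not Team ViGIR's actual message definitions.

```python
# Illustrative object-template container: grasps and affordances are
# stored in the template frame, so placing the template in the world
# (done by the operator) immediately yields world-frame targets.
from dataclasses import dataclass, field

import numpy as np


@dataclass
class ObjectTemplate:
    name: str                           # e.g. "door_handle", "drill"
    mesh_resource: str                  # path/URI of the mesh shown in the OCS
    grasp_poses: dict = field(default_factory=dict)       # grasp name -> 4x4 pose (template frame)
    affordance_poses: dict = field(default_factory=dict)  # affordance name -> 4x4 pose (template frame)

    def world_pose(self, template_to_world: np.ndarray, local_pose: np.ndarray) -> np.ndarray:
        """Compose the operator-placed template pose with a template-frame pose."""
        return template_to_world @ local_pose


# Usage: the operator aligns the template with the perceived object,
# then the robot requests the world-frame grasp target.
drill = ObjectTemplate("drill", "package://templates/drill.dae")
drill.grasp_poses["side_grasp"] = np.eye(4)
template_in_world = np.eye(4)
template_in_world[:3, 3] = [1.2, 0.4, 0.9]   # where the operator placed it
grasp_target = drill.world_pose(template_in_world, drill.grasp_poses["side_grasp"])
print(grasp_target[:3, 3])
```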


IEEE-RAS International Conference on Humanoid Robots | 2015

Achieving versatile manipulation tasks with unknown objects by supervised humanoid robots based on object templates

Alberto Romay; Stefan Kohlbrecher; David C. Conner; Oskar von Stryk

The investigations of this paper are motivated by the scenario of a supervised semi-autonomous humanoid robot entering a mainly unknown, potentially degraded human environment to perform highly diverse disaster recovery tasks. For this purpose, the robot must be enabled to use any object it can find in the environment as a tool for achieving its current manipulation task. This requires the use of potentially unknown objects as well as known objects for new purposes (e.g. using a drill as a hammer). A recently proposed object template manipulation approach is extended to provide a semi-autonomous humanoid robot, assisted by a remote human supervisor, with the versatility needed to utilize objects in this manner by applying affordances [1] from other previously known objects. For an Atlas humanoid robot, it is demonstrated how a small set of such object templates with well-defined affordances can be used to solve manipulation tasks with new, unknown objects.
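
A hedged sketch of the affordance-reuse idea: an affordance defined on a known template (here, a made-up "strike" motion along an axis in the template frame) is applied to a new object simply by overlaying the template at the new object's pose. Object names, axes, and distances are illustrative only.

```python
# Illustrative affordance reuse: a "strike" affordance defined on a known
# hammer template (motion along an axis in the template frame) is applied
# to a different object (a drill) by overlaying the hammer template on it.
import numpy as np


def affordance_waypoints(template_to_world, axis_in_template, travel, steps=5):
    """End-effector waypoints along a template-frame affordance axis, in world frame."""
    axis_world = template_to_world[:3, :3] @ np.asarray(axis_in_template, dtype=float)
    axis_world /= np.linalg.norm(axis_world)
    origin = template_to_world[:3, 3]
    return [origin + axis_world * travel * (i + 1) / steps for i in range(steps)]


# The operator overlays the hammer template on the drill held in the hand:
hammer_on_drill = np.eye(4)
hammer_on_drill[:3, 3] = [0.8, 0.1, 1.0]
# "strike": push 10 cm along the template's -z axis.
for wp in affordance_waypoints(hammer_on_drill, [0.0, 0.0, -1.0], 0.10):
    print(wp)
```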


International Symposium on Safety, Security, and Rescue Robotics | 2013

Overview of Team ViGIR's approach to the Virtual Robotics Challenge

Stefan Kohlbrecher; David C. Conner; Alberto Romay; Felipe Bacim; Doug A. Bowman; Oskar von Stryk

With the DARPA Robotics Challenge (DRC), a call to an ambitious multi-part competition was sent out to the robotics community. In this paper, we briefly summarize our approach to the Virtual Robotics Challenge (VRC), in which software for control and supervision of a capable humanoid robot had to be developed. Team ViGIR, comprising members from the US and Germany, leveraged previous robotics competition experience and a variety of open source tools to achieve sixth place in the VRC out of 126 registrants, thereby advancing to the next round of the DRC and obtaining an Atlas robot.


International Conference on Robotics and Automation | 2016

Reactive high-level behavior synthesis for an Atlas humanoid robot

Spyros Maniatopoulos; Philipp Schillinger; Vitchyr Pong; David C. Conner; Hadas Kress-Gazit

In this work, we take a step towards bridging the gap between the theory of formal synthesis and its application to real-world, complex robotic systems. In particular, we present an end-to-end approach for the automatic generation of code that implements high-level robot behaviors in a verifiably correct manner, including reaction to the possible failures of low-level actions. We start with a description of the system defined a priori. Thus, a non-expert user need only specify a high-level task. We automatically construct a formal specification, in a fragment of Linear Temporal Logic (LTL), that encodes the system's capabilities and constraints, the task, and the desired reaction to low-level failures. We then synthesize a reactive mission plan that is guaranteed to satisfy the formal specification, i.e., achieve the task's goals or correctly react to failures. Lastly, we automatically generate a state machine that instantiates the synthesized symbolic plan in software. We showcase our approach using Team ViGIR's software and Atlas humanoid robot and present lab experiments, thus demonstrating the application of formal synthesis techniques to complex robotic systems. The proposed approach has been implemented and open-sourced as a collection of Robot Operating System (ROS) packages, which are adaptable to other systems.
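
The sketch below only mirrors the structure of such a synthesized plan: a state machine whose transition table covers both the success and failure outcomes of every low-level action, so every environment response has a specified reaction. The real system generates FlexBE behaviors from an LTL specification; the states and outcomes here are made up for illustration.

```python
# Structural sketch of a reactive mission plan as a finite state machine.
# Every action has transitions for both "done" and "failed" outcomes.
# state -> {outcome -> next state}
TRANSITIONS = {
    "walk_to_door": {"done": "grasp_handle", "failed": "replan_path"},
    "replan_path":  {"done": "walk_to_door", "failed": "report_failure"},
    "grasp_handle": {"done": "open_door",    "failed": "retry_grasp"},
    "retry_grasp":  {"done": "open_door",    "failed": "report_failure"},
    "open_door":    {"done": "mission_done", "failed": "retry_grasp"},
}

def execute_action(state):
    """Stand-in for dispatching a low-level action and observing its outcome."""
    return "done"

def run(start="walk_to_door"):
    state = start
    while state not in ("mission_done", "report_failure"):
        outcome = execute_action(state)
        state = TRANSITIONS[state][outcome]   # reaction is defined for every outcome
    return state

print(run())
```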


IEEE-RAS International Conference on Humanoid Robots | 2015

Modeling, identification and joint impedance control of the Atlas arms

Moritz Schappler; Jonathan Vorndamme; Alexander Tödtheide; David C. Conner; Oskar von Stryk; Sami Haddadin

Compliant manipulation has become central to robots that must act safely in, and interact with, unstructured and only partially known environments. In this paper we equip the hydraulically actuated, usually position-controlled arms of the Atlas robot with model-based joint impedance control, including a suitable damping design, and experimentally verify the proposed algorithm. Our approach, which originates from advances in soft-robotics control, relies on high-performance low-level joint torque control. This makes it independent of whether the actuation technology is hydraulic or electromechanical. This paper describes the approach to accurately modeling the dynamics and designing the optimal excitation trajectory for system identification, which enables the specification of model-based feed-forward controls. In conclusion, the implemented controller enables the robot arm to execute significantly smoother motions, be compliant to external forces, and achieve tracking performance similar to the existing position control scheme. Finally, unknown modeling inaccuracies and contact forces are accurately estimated by a suitable disturbance observer, which could be used in the future to further enhance our controller's performance.
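
For reference, the joint impedance law the abstract refers to has the standard form tau = K (q_des - q) - D q_dot + g(q). The sketch below implements that form with placeholder gains and a dummy gravity model, not the identified Atlas dynamics.

```python
# Standard joint-space impedance law with gravity compensation:
#   tau = K (q_des - q) - D q_dot + g(q)
# Gains and the gravity model below are placeholders for illustration.
import numpy as np


def impedance_torque(q, q_dot, q_des, K, D, gravity_model):
    """Desired joint torques for a model-based joint impedance controller."""
    return K @ (q_des - q) - D @ q_dot + gravity_model(q)


# Example with a 3-joint arm and a zero gravity model (placeholder):
K = np.diag([300.0, 200.0, 100.0])   # joint stiffness [Nm/rad]
D = np.diag([15.0, 10.0, 5.0])       # joint damping   [Nm*s/rad]
tau = impedance_torque(
    q=np.array([0.1, -0.2, 0.3]),
    q_dot=np.zeros(3),
    q_des=np.zeros(3),
    K=K, D=D,
    gravity_model=lambda q: np.zeros(3),
)
print(tau)
```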


IEEE-RAS International Conference on Humanoid Robots | 2016

Open source integrated 3D footstep planning framework for humanoid robots

Alexander Stumpf; Stefan Kohlbrecher; David C. Conner; Oskar von Stryk

Humanoid robots benefit from their anthropomorphic shape when operating in human-made environments. In order to achieve human-like capabilities, robots must be able to perceive, understand and interact with the surrounding world. Humanoid locomotion in uneven terrain is a challenging task, as it requires sophisticated world model generation, motion planning and control algorithms, and their integration. In recent years, much progress in world modeling and motion control has been achieved. This paper presents one of the very first open source frameworks for full 3D footstep planning available for ROS that integrates the perception and locomotion systems of humanoid bipedal robots. The framework is designed to be used with different types of humanoid robots having different perception and locomotion capabilities, with minimal implementation effort. In order to integrate with almost any humanoid walking controller, the system can easily be extended with additional functionality that may be needed by low-level motion algorithms. It also supports sophisticated human-robot interaction that enables the operator to direct the planner towards improved solutions, and it provides monitoring data to the operator and debugging feedback to developers. The provided software package consists of three major blocks that address world model generation, planning, and interfacing with low-level motion algorithms. The framework has been successfully applied to four different full-size humanoid robots.
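
As an illustration of the "minimal implementation effort" integration point, the sketch below shows one plausible shape of a robot-specific adapter between the planner and a black-box walking controller; the class and method names are assumptions, not the actual API of the released ROS packages.

```python
# Illustrative adapter interface: each robot converts planned footsteps
# into commands for its own walking controller. Names are illustrative only.
from abc import ABC, abstractmethod


class StepPlanInterface(ABC):
    """Adapter a robot-specific walking controller must implement."""

    @abstractmethod
    def execute_step_plan(self, steps):
        """Send a list of planned footsteps to the low-level controller."""

    @abstractmethod
    def is_step_feasible(self, step):
        """Let the planner reject steps the controller cannot execute."""


class MyRobotAdapter(StepPlanInterface):
    def execute_step_plan(self, steps):
        for step in steps:
            print("commanding step:", step)        # placeholder controller call

    def is_step_feasible(self, step):
        return abs(step.get("dyaw", 0.0)) < 0.4    # e.g. limited turning per step


MyRobotAdapter().execute_step_plan([{"x": 0.25, "y": 0.0, "dyaw": 0.0}])
```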


SoutheastCon | 2017

Flexible Navigation: Finite state machine-based integrated navigation and control for ROS enabled robots

David C. Conner; Justin Willis

This paper describes the Flexible Navigation system that extends the ROS Navigation stack and compatible libraries to separate computation from decision making, and integrates the system with FlexBE — the Flexible Behavior Engine, which provides intuitive supervision with adjustable autonomy. Although the ROS Navigation plugin model offers some customization, many decisions are internal to move_base. In contrast, the Flexible Navigation system separates global planning from local planning and control, and uses a hierarchical finite state machine to coordinate behaviors. The Flexible Navigation system includes Python-based state implementations and ROS nodes derived from the move_base plugin model to provide compatibility with existing libraries as well as future extensibility. The paper concludes with complete system demonstrations in both simulation and hardware using the iRobot Create and Kobuki-based Turtlebot running under ROS Kinetic. The system supports multiple independent robots.
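
The sketch below mirrors the state-based decomposition in plain Python: global planning and path following become small states with named outcomes that a higher-level machine (FlexBE in the actual system) sequences and supervises. It is a structural illustration only, not the Flexible Navigation package code.

```python
# Structural sketch: navigation decomposed into states with named outcomes,
# sequenced by a higher-level machine. Placeholder logic throughout.
class State:
    """Minimal state with a single execute step, in the spirit of a FlexBE state."""
    def execute(self, userdata):
        raise NotImplementedError


class PlanGlobalPath(State):
    def execute(self, userdata):
        userdata["path"] = ["waypoint_1", "waypoint_2"]   # placeholder global planner
        return "planned"


class FollowPath(State):
    def execute(self, userdata):
        print("following", userdata["path"])              # placeholder local planning/control
        return "reached"


def navigate():
    """Tiny two-state sequence: plan, then follow, with explicit outcomes."""
    userdata = {}
    if PlanGlobalPath().execute(userdata) != "planned":
        return "aborted"
    return FollowPath().execute(userdata)


print(navigate())
```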


IEEE-RAS International Conference on Humanoid Robots | 2015

Open source driving controller concept for humanoid robots: Teams Hector and ViGIR at 2015 DARPA Robotics Challenge Finals

Alberto Romay; Achim Stein; Martin Oehler; Alexander Stumpf; Stefan Kohlbrecher; Oskar von Stryk; David C. Conner

Among the eight tasks of the DARPA Robotics Challenge (DRC), the driving task was one of the most challenging. Obstacles in the course prevented straight driving, and restricted communications limited the situational awareness of the operator. In this video we show how Team Hector and Team ViGIR successfully completed the driving task with different robot platforms, THOR-Mang and Atlas respectively, but using the same software and compliant steering adapter. Our driving user interface presents camera image views to the operator, along with driving aids such as wheel positioning and the turn-radius path of the wheels. The operator uses a standard computer game joystick to command steering wheel angles and gas pedal pressure. Steering wheel angle positions are generated off-line and interpolated on-line on the robot's onboard computer. The compliant steering adapter accommodates end-effector positioning errors. Gas pedal pressure is generated by a binary joint position of the robot's leg. Commands are generated in the operator control station and sent as target positions to the robot. The driving user interface also provides feedback on the current steering wheel position. Video footage with descriptions from the driving interface, the robot's camera and LIDAR perception, and external task monitoring is presented.
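
A rough sketch of the command mapping described above: the joystick steering axis selects a steering-wheel angle that is interpolated between precomputed arm joint configurations, and the gas pedal is a binary leg joint position. All joint values and angles below are placeholders, not the teams' calibrated data.

```python
# Illustrative operator-input mapping for the driving task.
import numpy as np

# Precomputed offline: steering wheel angles [rad] and matching arm joint sets.
WHEEL_ANGLES = np.array([-1.5, 0.0, 1.5])
ARM_CONFIGS = np.array([
    [0.2, -0.5, 1.0],    # arm joints for wheel at -1.5 rad
    [0.0,  0.0, 1.2],    # arm joints for wheel at  0.0 rad
    [-0.2, 0.5, 1.0],    # arm joints for wheel at +1.5 rad
])

def steering_command(joystick_axis):
    """Map a joystick axis in [-1, 1] to interpolated arm joint targets."""
    wheel_angle = joystick_axis * WHEEL_ANGLES[-1]
    return np.array([np.interp(wheel_angle, WHEEL_ANGLES, ARM_CONFIGS[:, j])
                     for j in range(ARM_CONFIGS.shape[1])])

def gas_pedal_command(button_pressed):
    """Binary leg joint position: pedal pressed or released."""
    return 0.35 if button_pressed else 0.0

print(steering_command(0.5), gas_pedal_command(True))
```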

Collaboration


Dive into David C. Conner's collaborations.

Top Co-Authors

Oskar von Stryk (Technische Universität Darmstadt)

Stefan Kohlbrecher (Technische Universität Darmstadt)

Alberto Romay (Technische Universität Darmstadt)

Alexander Stumpf (Technische Universität Darmstadt)