Publication


Featured research published by Paul E. Rybski.


IEEE Robotics & Automation Magazine | 2000

Enlisting rangers and scouts for reconnaissance and surveillance

Paul E. Rybski; Nikolaos Papanikolopoulos; Sascha A. Stoeter; Donald G. Krantz; Kemal Berk Yesin; Maria L. Gini; Richard M. Voyles; Dean F. Hougen; Bradley J. Nelson; Michael D. Erickson

Reconnaissance and surveillance are important activities for both military and civilian organizations, whether for hostage and survivor rescue, drug raids, or response to chemical or toxic waste spills. We have developed a distributed heterogeneous robotic team that is based mainly on a miniature robotic system. Because some operations require covert action, most of the robots are extremely small. This also allows them to be easily transported and allows a greater number to be brought into use for a single operation. This makes them expendable without jeopardizing the overall mission. We call these small robots scouts. Their individual components must all be exceedingly small, and their overall design must make maximum use of all available space. They must make efficient use of resources (e.g., batteries). We meet these challenges with an innovative design and creative use of additional support. We team the scouts with larger ranger robots, which can transport the scouts over distances of several kilometers, deploy them rapidly over a large area, coordinate their behavior, and collect and present the resulting data. We present the scouts and rangers, discuss their capabilities along with the associated software, and describe demonstrations conducted to test the innovative aspects of the system. We also discuss related work, analyze our results, and draw conclusions.


Human-Robot Interaction | 2007

Interactive robot task training through dialog and demonstration

Paul E. Rybski; Kevin Yoon; Jeremy Stolarz; Manuela M. Veloso

Effective human-robot interfaces that mimic how humans interact with one another could ultimately lead to robots being accepted in a wider domain of applications. We present a framework for interactive task training of a mobile robot where the robot learns how to do various tasks while observing a human. In addition to observation, the robot listens to the human's speech and interprets the speech as behaviors that are required to be executed. This is especially important where individual steps of a given task may have contingencies that have to be dealt with depending on the situation. Finally, the context of the location where the task takes place and the people present factor heavily into the robot's interpretation of how to execute the task. In this paper, we describe the task training framework, describe how environmental context and communicative dialog with the human help the robot learn the task, and illustrate the utility of this approach with several experimental case studies.
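
A minimal sketch of the conditional task structure described above, assuming invented names (Task, Step, teach) and a made-up door-state contingency; this is an illustration, not the authors' framework.

    from dataclasses import dataclass, field
    from typing import Callable, Dict, List, Optional

    @dataclass
    class Step:
        """One learned behavior, optionally guarded by a contingency condition."""
        behavior: str
        condition: Optional[Callable[[Dict], bool]] = None

    @dataclass
    class Task:
        name: str
        steps: List[Step] = field(default_factory=list)

        def teach(self, behavior: str, condition=None) -> None:
            """Add a step, e.g. from an observed demonstration or a spoken instruction."""
            self.steps.append(Step(behavior, condition))

        def execute(self, context: Dict) -> List[str]:
            """Run only the steps whose contingencies hold in the current context."""
            return [s.behavior for s in self.steps
                    if s.condition is None or s.condition(context)]

    # Teaching "deliver a message": knock only if the door is closed.
    task = Task("deliver_message")
    task.teach("go_to_office")
    task.teach("knock", condition=lambda ctx: ctx.get("door") == "closed")
    task.teach("speak_message")
    print(task.execute({"door": "open"}))   # ['go_to_office', 'speak_message']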


Human-Robot Interaction | 2009

The snackbot: documenting the design of a robot for long-term human-robot interaction

Min Kyung Lee; Jodi Forlizzi; Paul E. Rybski; Frederick L. Crabbe; Wayne Chung; Josh Finkle; Eric Glaser; Sara Kiesler

We present the design of the Snackbot, a robot that will deliver snacks in our university buildings. The robot is intended to provide a useful, continuing service and to serve as a research platform for long-term Human-Robot Interaction. Our design process, which occurred over 24 months, is documented as a contribution for others in HRI who may be developing social robots that offer services. We describe the phases of the design project, and the design decisions and tradeoffs that led to the current version of the robot.


IEEE Transactions on Intelligent Transportation Systems | 2009

Obstacle Detection and Tracking for the Urban Challenge

Michael Darms; Paul E. Rybski; Christopher R. Baker; Chris Urmson

This paper describes the obstacle detection and tracking algorithms developed for Boss, Carnegie Mellon University's winning entry in the 2007 DARPA Urban Challenge. We describe the tracking subsystem and show how it functions in the context of the larger perception system. The tracking subsystem gives the robot the ability to understand complex scenarios of urban driving and to operate safely in the proximity of other vehicles. The tracking system fuses sensor data from more than a dozen sensors with additional information about the environment to generate a coherent situational model. A novel multiple-model approach is used to track the objects based on the quality of the sensor data. Finally, the architecture of the tracking subsystem explicitly abstracts each of the levels of processing. The subsystem can easily be extended by adding new sensors and validation algorithms.
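
The claim about abstracted processing levels and extensibility can be read through a small sketch in which every sensor adapts to one observation interface and a fusion layer consumes only validated observations. The Sensor, Fusion, and FakeLidar names are hypothetical and do not reproduce the paper's actual interfaces.

    from abc import ABC, abstractmethod
    from typing import List, Tuple

    Observation = Tuple[float, float]   # (x, y) position of a candidate obstacle

    class Sensor(ABC):
        """Common interface each physical sensor adapts to, so new sensors can be
        added without touching the fusion layer."""
        @abstractmethod
        def observations(self) -> List[Observation]: ...

        def validate(self, obs: Observation) -> bool:
            """Sensor-specific plausibility check; the default accepts everything."""
            return True

    class Fusion:
        """Merges validated observations from all registered sensors into one
        coherent list for the tracker."""
        def __init__(self) -> None:
            self.sensors: List[Sensor] = []

        def register(self, sensor: Sensor) -> None:
            self.sensors.append(sensor)

        def fused_observations(self) -> List[Observation]:
            out: List[Observation] = []
            for s in self.sensors:
                out.extend(o for o in s.observations() if s.validate(o))
            return out

    class FakeLidar(Sensor):
        """Stand-in sensor returning one synthetic obstacle detection."""
        def observations(self) -> List[Observation]:
            return [(12.3, -4.5)]

    fusion = Fusion()
    fusion.register(FakeLidar())
    print(fusion.fused_observations())   # [(12.3, -4.5)]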


Human-Robot Interaction | 2010

Gracefully mitigating breakdowns in robotic services

Min Kyung Lee; Sara Kiesler; Jodi Forlizzi; Siddhartha S. Srinivasa; Paul E. Rybski

Robots that operate in the real world will make mistakes. Thus, those who design and build systems will need to understand how best to provide ways for robots to mitigate those mistakes. Building on diverse research literatures, we consider how to mitigate breakdowns in services provided by robots. Expectancy-setting strategies forewarn people of a robot's limitations so people will expect mistakes. Recovery strategies, including apologies, compensation, and options for the user, aim to reduce the negative consequences of breakdowns. We tested these strategies in an online scenario study with 317 participants. A breakdown in robotic service had a severe impact on evaluations of the service and the robot, but forewarning and recovery strategies reduced the negative impact of the breakdown. People's orientation toward services influenced which recovery strategy worked best. Those with a relational orientation responded best to an apology; those with a utilitarian orientation responded best to compensation. We discuss robotic service design to mitigate service problems.


AI Magazine | 2009

Autonomous Driving in Traffic: Boss and the Urban Challenge

Chris Urmson; Christopher R. Baker; John M. Dolan; Paul E. Rybski; Bryan Salesky; Dave Ferguson; Michael Darms

The DARPA Urban Challenge was a competition to develop autonomous vehicles capable of safely, reliably, and robustly driving in traffic. In this article we introduce Boss, the autonomous vehicle that won the challenge. Boss is a complex artificially intelligent software system embodied in a 2007 Chevy Tahoe. To navigate safely, the vehicle builds a model of the world around it in real time. This model is used to generate safe routes and motion plans both on roads and in unstructured zones. An essential part of Boss' success stems from its ability to safely handle both abnormal situations and system glitches.


Intelligent Robots and Systems | 2002

Autonomous stair-hopping with Scout robots

Sascha A. Stoeter; Paul E. Rybski; Maria L. Gini; Nikolaos Papanikolopoulos

Search and rescue operations in large disaster sites require quick gathering of relevant information. Both the knowledge of the location of victims and the environmental/structural conditions must be available to safely and efficiently guide rescue personnel. A major hurdle for robots in such scenarios is stairs. A system for autonomous surmounting of stairs is proposed in which a Scout robot jumps from step to step. The robot's height is only about a quarter of a step. Control of the Scout is accomplished using visual servoing. An external observer, such as another robot, is brought into the control loop to provide the Scout with an estimate of its pose with respect to the stairs. This cooperation is necessary because the Scout must refrain from ill-fated motions that may lead it back down to where it started its ascent. Initial experimental results are presented along with a discussion of the issues involved.
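
A rough sketch, with assumed names and thresholds, of the cooperative loop in which the external observer's pose estimate gates the Scout's jump:

    from dataclasses import dataclass

    @dataclass
    class PoseEstimate:
        """Scout pose relative to the next stair riser, as reported by an
        external observer robot watching the scene."""
        lateral_offset_m: float    # distance from the stair centreline
        heading_error_rad: float   # misalignment with the riser normal

    def align_and_jump(pose: PoseEstimate,
                       max_offset: float = 0.05,
                       max_heading: float = 0.1) -> str:
        """Trigger the jump only when the observer confirms the Scout is squarely
        facing the step; otherwise command a small correction so a mis-aimed hop
        cannot send it back down the stairs."""
        if abs(pose.heading_error_rad) > max_heading:
            return f"turn {-pose.heading_error_rad:.2f} rad"
        if abs(pose.lateral_offset_m) > max_offset:
            return f"translate {-pose.lateral_offset_m:.2f} m"
        return "jump"

    print(align_and_jump(PoseEstimate(lateral_offset_m=0.02, heading_error_rad=0.25)))
    # -> turn -0.25 rad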


Journal of Field Robotics | 2013

Moving object detection with laser scanners

Christoph Mertz; Luis E. Navarro-Serment; Robert A. MacLachlan; Paul E. Rybski; Aaron Steinfeld; Arne Suppé; Chris Urmson; Nicolas Vandapel; Martial Hebert; Charles E. Thorpe; David Duggins; Jay Gowdy

The detection and tracking of moving objects is an essential task in robotics. The CMU-RI Navlab group has developed such a system that uses a laser scanner as its primary sensor. We describe our algorithm and its use in several applications. Our system has worked successfully on indoor and outdoor platforms and with several different kinds and configurations of two-dimensional and three-dimensional laser scanners. The applications range from collision warning systems and people classification to observing human tracks and providing input to a dynamic planner. Several of these systems were evaluated in live field tests and shown to be robust and reliable.
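
As a rough illustration of this kind of laser-scanner pipeline (not the Navlab group's code; the function names, the 0.3 m segmentation gap, and the 0.5 m/s speed threshold are assumptions), a scan already converted to Cartesian points can be segmented into clusters at range discontinuities and each cluster's centroid compared between frames:

    import math
    from typing import List, Tuple

    Point = Tuple[float, float]   # (x, y) in the scanner frame, metres

    def segment(points: List[Point], gap: float = 0.3) -> List[List[Point]]:
        """Split one scan into clusters wherever consecutive returns
        jump apart by more than `gap`."""
        clusters, current = [], [points[0]]
        for p, q in zip(points, points[1:]):
            if math.dist(p, q) > gap:
                clusters.append(current)
                current = []
            current.append(q)
        clusters.append(current)
        return clusters

    def centroid(cluster: List[Point]) -> Point:
        xs, ys = zip(*cluster)
        return (sum(xs) / len(xs), sum(ys) / len(ys))

    # Example: a wall stays put while a pedestrian-sized cluster shifts between frames.
    frame_a = [(0.0, 2.0), (0.1, 2.0), (0.2, 2.0), (1.50, 1.00), (1.55, 1.05)]
    frame_b = [(0.0, 2.0), (0.1, 2.0), (0.2, 2.0), (1.90, 1.00), (1.95, 1.05)]
    dt = 0.1                                               # seconds between scans
    speed = math.dist(centroid(segment(frame_a)[1]),
                      centroid(segment(frame_b)[1])) / dt
    print("moving object" if speed > 0.5 else "static")    # -> moving object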


EURASIP Journal on Advances in Signal Processing | 2009

Prioritized multihypothesis tracking by a robot with limited sensing

Paul E. Rybski; Manuela M. Veloso

To act intelligently in dynamic environments, mobile robots must estimate object positions using information obtained from a variety of sources. We formally describe the problem of estimating the state of objects when a robot can only task its sensors to view one object at a time. We contribute an object tracking method that generates and maintains multiple hypotheses consisting of probabilistic state estimates produced by the individual information sources. These different hypotheses can be generated by the robot's own prediction model and by communicating robot team members. The multiple hypotheses are often spatially disjoint and cannot simultaneously be verified by the robot's limited sensors. Instead, the robot must decide toward which hypothesis its sensors should be tasked by evaluating each hypothesis on its likelihood of containing the object. Our algorithm prioritizes the different hypotheses according to rankings set by the expected uncertainty in the object's motion model, as well as the uncertainties in the sources of information used to track their positions. We describe the algorithm in detail and show extensive empirical results in simulation as well as experiments on actual robots that demonstrate the effectiveness of our approach.
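
The prioritization idea can be sketched as follows; the Hypothesis fields, the constant-rate uncertainty growth, and the scoring rule are assumptions made for illustration, not the paper's actual formulation.

    from dataclasses import dataclass

    @dataclass
    class Hypothesis:
        """One candidate object location reported by some information source
        (the robot's own motion-model prediction, or a teammate's report)."""
        x: float          # estimated position (metres, world frame)
        y: float
        variance: float   # positional uncertainty of the source (m^2)
        age: float        # seconds since the estimate was produced

    def priority(h: Hypothesis, motion_noise: float = 0.5) -> float:
        """Rank a hypothesis: its uncertainty grows with the object's motion
        model over time, and fresher, tighter estimates are checked first."""
        grown = h.variance + motion_noise * h.age   # simple constant-rate growth
        return 1.0 / (1.0 + grown)                  # higher score = verify first

    def select_gaze_target(hypotheses):
        """Task the robot's single limited sensor at the top-ranked hypothesis."""
        return max(hypotheses, key=priority)

    # One hypothesis from the robot's own prediction, one from a teammate.
    own = Hypothesis(x=1.0, y=2.0, variance=0.2, age=0.5)
    teammate = Hypothesis(x=4.0, y=-1.0, variance=0.8, age=3.0)
    target = select_gaze_target([own, teammate])
    print(f"Point camera at ({target.x:.1f}, {target.y:.1f})")   # -> (1.0, 2.0)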


IEEE Intelligent Vehicles Symposium | 2008

Classification and tracking of dynamic objects with multiple sensors for autonomous driving in urban environments

Michael Darms; Paul E. Rybski; Chris Urmson

Future driver assistance systems are likely to use a multisensor approach with heterogeneous sensors for tracking dynamic objects around the vehicle. The quality and type of data available to a data fusion algorithm depend heavily on the sensors detecting an object. This article presents a general framework that allows the use of sensor-specific advantages while abstracting the specific details of a sensor. Different tracking models are used depending on the current set of sensors detecting the object. A sensor-independent algorithm for classifying objects according to their current and past movement state is presented. The described architecture and algorithms have been successfully implemented in Tartan Racing's autonomous vehicle for the Urban Grand Challenge. Results are presented and discussed.
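
A toy sketch of the two ideas above, choosing a tracking model from whichever sensors currently detect the object and classifying the movement state independently of sensor type; the model names, thresholds, and enum values are made up for illustration.

    from enum import Enum, auto

    class SensorType(Enum):
        LIDAR = auto()
        RADAR = auto()

    class MovementState(Enum):
        MOVED = auto()
        NOT_MOVED = auto()
        UNKNOWN = auto()

    def choose_tracking_model(detecting_sensors) -> str:
        """Pick a tracking model from the sensors currently seeing the object:
        shape-capable sensors allow a richer model, otherwise fall back to a point."""
        if SensorType.LIDAR in detecting_sensors:
            return "box"      # position, orientation, and extent
        return "point"        # position and velocity only

    def classify_movement(displacement_m: float, elapsed_s: float,
                          speed_threshold: float = 1.0) -> MovementState:
        """Label the object's movement state from its observed displacement,
        leaving it UNKNOWN until enough evidence has accumulated."""
        if elapsed_s < 0.5:
            return MovementState.UNKNOWN
        speed = displacement_m / elapsed_s
        return MovementState.MOVED if speed > speed_threshold else MovementState.NOT_MOVED

    print(choose_tracking_model({SensorType.RADAR}))             # point
    print(classify_movement(displacement_m=6.0, elapsed_s=2.0))  # MovementState.MOVED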

Collaboration


Dive into Paul E. Rybski's collaboration.

Top co-authors:

Manuela M. Veloso (Carnegie Mellon University)

Chris Urmson (Carnegie Mellon University)

Carlos Vallespi (Carnegie Mellon University)

Brett Browning (Carnegie Mellon University)