Publication


Featured research published by Stephen B. Hughes.


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2005

Validating USARsim for use in HRI Research

Jijun Wang; Michael Lewis; Stephen B. Hughes; Mary Koes; Stefano Carpin

HRI is an excellent candidate for simulator-based research because of the relative simplicity of the systems being modeled, the behavioral fidelity possible with current physics engines, and the capability of modern graphics cards to approximate camera video. In this paper we briefly introduce the USARsim simulation and discuss efforts to validate its behavior for use in Human Robot Interaction (HRI) research.


Journal of Cognitive Engineering and Decision Making | 2007

USARSim: Simulation for the Study of Human-Robot Interaction

Michael Lewis; Jijun Wang; Stephen B. Hughes

The PackBots being used by the U.S. military in Afghanistan and the urban search and rescue (USAR) robots that worked the World Trade Center site are just two recent examples of mobile robots moving from the laboratory to the field. What is significant about these new applications is that they invariably involve some form of human-robot interaction (HRI) rather than the full robot autonomy that has motivated most prior research. Conducting HRI research can be extremely difficult because experimentation with physical robots is expensive and time consuming. Few roboticists have experience or interest in conducting human experimentation, and researchers in human factors or human-computer interaction often lack experience in programming robots or access to robotic platforms. In this paper, we describe a high-fidelity, open-source simulation intended for HRI researchers of varying backgrounds and provide reference tasks and environments to facilitate collaboration and the sharing of results. The architecture and capabilities of the game engine–based USARSim simulation are described. Its use for HRI research is illustrated through case studies describing experiments in camera control for remote viewing and integrated display of attitude information.


Human Factors in Computing Systems | 2004

Robotic camera control for remote exploration

Stephen B. Hughes; Michael Lewis

A video stream from a single camera is often the foundation for situational awareness in teleoperation activities. Poor camera placement, narrow field-of-view and other camera properties can significantly impair the operator's perceptual link to the environment, inviting cognitive mistakes and general disorientation. This paper provides a brief overview of viewpoint control research for 3D virtual environments (VE) to motivate a user study that evaluates the effectiveness of viewpoint controls on a simulated robotic vehicle. Findings suggest that providing a camera that is controlled independently from the orientation of the vehicle may facilitate wayfinding tasks. Moreover, there is evidence to support the use of separate cameras and interfaces for different navigational subtasks.


Systems, Man and Cybernetics | 2003

Camera control and decoupled motion for teleoperation

Stephen B. Hughes; Joseph Manojlovich; Michael Lewis; Jeffrey Gennari

Human judgment is an integral part of the teleoperation process that is often heavily influenced by a single video feed returned from the remote environment. This limitation on the perceptual links to the environment leaves the operator prone to cognitive mistakes and general disorientation. These faults may be enhanced or muted, depending on the camera mountings and control opportunities that are at the disposal of the operator. These issues form the basis for an experiment to assess the effectiveness of existing and potential teleoperation controls. Findings suggest that providing a camera that is controlled independently from the orientation of the vehicle may yield significant benefits.


Systems, Man and Cybernetics | 2005

Task-driven camera operations for robotic exploration

Stephen B. Hughes; Michael Lewis

Human judgment is an integral part of the teleoperation process that is often heavily influenced by a single video feed returned from the remote environment. Poor camera placement, narrow field of view, and other camera properties can significantly impair the operator's perceptual link to the environment, inviting cognitive mistakes and general disorientation. These faults may be enhanced or muted, depending on the camera mountings and control opportunities that are at the disposal of the operator. These issues form the basis for two user studies that assess the effectiveness of existing and potential teleoperation controls. Findings suggest that providing a camera that is controlled independently from the orientation of the vehicle may yield significant benefits. Moreover, there is evidence to support the use of separate cameras for different navigational subtasks. Finally, multiple cameras can also provide assistance without encroaching on the operator's desired threshold for control.


Simulation | 2004

UTSAF: A Multi-Agent-Based Software Bridge for Interoperability between Distributed Military and Commercial Gaming Simulation

Phongsak Prasithsangaree; Joseph Manojlovich; Stephen B. Hughes; Michael Lewis

Rapid advances in consumer electronics have led to the anomaly that consumer off-the-shelf gaming hardware and software provide better interactive graphics than military and other specialized systems costing orders of magnitude more. UTSAF (Unreal Tournament Semi-Automated Force) is bridging software written to take advantage of the power of gaming systems by allowing them to participate in distributed simulations with military simulators. UTSAF illustrates the use of multiagent technology to flexibly interconnect otherwise incompatible systems. This article describes an architectural approach for rapidly constructing middleware by taking advantage of built-in capabilities for processing, communication, and interoperation that a multiagent infrastructure provides. Several software agents based on the Reusable Environment for Task-Structured Intelligent Networked Agents (RETSINA) are used to support interoperability between military simulation nodes based on distributed interactive simulation and Unreal game simulators. Using a multiagent system, UTSAF can be expanded to support several network environments and interact with other agent-based software.


Systems, Man and Cybernetics | 2003

Experiments with attitude: attitude displays for teleoperation

Michael Lewis; Jijun Wang; Stephen B. Hughes; Xiong Liu

Attitude control refers to controlling the pitch and roll of a mobile robot. As environments grow more complex and cues to a robot's pose sparser, it becomes easy for a teleoperator to lose situational awareness. Information from separated attitude displays may be difficult to integrate with an ongoing navigation task and lead to errors. In this paper we report an experiment comparing a gravity-referenced display (GRV) with a standard fixed camera with separated attitude indication. Results show shorter task times and better path choices for users of the GRV.


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2004

Gravity-Referenced Attitude Display for Teleoperation of Mobile Robots

Jijun Wang; Michael Lewis; Stephen B. Hughes

Attitude control refers to controlling the pitch and roll of a mobile robot. As environments grow more complex and cues to a robot's pose sparser, it becomes easy for a teleoperator using an egocentric (camera) display to lose situational awareness. Reported difficulties with teleoperated robots frequently involve rollovers and sometimes even failure to realize that a robot has rolled over. Attitude information has conventionally been displayed separate from the camera view using an artificial horizon graphic or individual indicators for pitch and roll. Information from separate attitude displays may be difficult to integrate with an ongoing navigation task and lead to errors. In this paper we report an experiment in which a simulated robot is maneuvered over rough exterior and interior terrains to compare a gravity-referenced view with a separated attitude indication. Results show shorter task times and better path choices for users of the gravity-referenced view.


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2002

Attentive Interaction Techniques for Searching Virtual Environments

Stephen B. Hughes; Michael Lewis

The ability to manipulate the viewpoint is critical to many tasks performed in virtual environments (VEs). However, viewers in an information-rich VE are particularly susceptible to superfluous data and are easily distracted. Attentive Interaction techniques can address this issue by allowing the viewer to explore freely while allowing the system to suggest optimal locations and orientations. In this study, we examine the effectiveness of two interaction techniques. The first technique, called the Attentive Camera, is characterized by the system automatically aligning the viewpoint with the ideal direction of gaze as the viewer explores the environment. In the second technique known as the Attentive Flashlight, the ideal gaze direction is used to aim a spotlight, illuminating objects of interest. Study participants used these techniques to complete four search tasks in a virtual art gallery.


Systems, Man and Cybernetics | 2000

Attentive camera navigation in virtual environments

Stephen B. Hughes; Michael Lewis

Gaining an accurate mental representation of real environments and realistic virtual environments is a gradual process. Significant aspects of an environment may be obvious to a trained expert, but not to the novice trainee. If a user does not know where to look, he or she may concentrate on irrelevant objects. This detracts from learning the locations of truly prominent landmarks. The paper explores attentive camera navigation, a technique that guides the user to focus on certain objects through automatic gaze redirection. Results of a user study suggest that this technique can help filter out unnecessary objects, and allow users to quickly understand the configuration of a selected subset of landmarks.

Collaboration


Stephen B. Hughes's top co-authors and their affiliations.

Top Co-Authors

Michael Lewis
University of Pittsburgh

Jijun Wang
University of Pittsburgh

Andrew J. Hanson
Indiana University Bloomington

J. Ben Schafer
University of Northern Iowa

Jinlin Chen
University of Pittsburgh