Publication


Featured research published by Derek Brock.


International Journal of Human-Computer Studies / International Journal of Man-Machine Studies | 2003

Preparing to resume an interrupted task: effects of prospective goal encoding and retrospective rehearsal

J. Gregory Trafton; Erik M. Altmann; Derek Brock; Farilee E. Mintz

We examine people's strategic cognitive responses to being interrupted while performing a task. Based on memory theory, we propose that resumption of a task after interruption is facilitated by preparation during the interruption lag, or the interval between an alert to a pending interruption (e.g. the phone ringing) and the interruption proper (the ensuing conversation). To test this proposal, we conducted an experiment in which participants in a Warning condition received an 8-s interruption lag, and participants in an Immediate condition received no interruption lag. Participants in the Warning condition prepared more than participants in the Immediate condition, as measured by verbal reports, and resumed the interrupted task more quickly. However, Immediate participants resumed faster with practice, suggesting that people adapt to particularly disruptive forms of interruption. The results support our task analysis of interruption and our model of memory for goals, and suggest further means for studying operator performance in dynamic task environments.


IEEE Transactions on Systems, Man, and Cybernetics | 2005

Enabling effective human-robot interaction using perspective-taking in robots

J. G. Trafton; Nicholas L. Cassimatis; Magdalena D. Bugajska; Derek Brock; Farilee E. Mintz; Alan C. Schultz

We propose that an important aspect of human-robot interaction is perspective-taking. We show how perspective-taking occurs in a naturalistic environment (astronauts working on a collaborative project) and present a cognitive architecture for performing perspective-taking called Polyscheme. Finally, we show a fully integrated system that instantiates our theoretical framework within a working robot system. Our system successfully solves a series of perspective-taking problems and uses the same frames of reference that astronauts do to facilitate collaborative problem solving with a person.


49th Annual Meeting of the Human Factors and Ergonomics Society, HFES 2005 | 2005

Huh, what was I doing? How people use environmental cues after an interruption

J. Gregory Trafton; Erik M. Altmann; Derek Brock

We examine the effect of environmental cues on being interrupted while performing a task. We conducted an experiment in which participants, after an interruption, received either a blatant environmental cue of their previous action (a red arrow), a subtle environmental cue of their previous action (a cursor that was placed in the same location as their previous action), or no environmental cue at all. We found that participants in the blatant condition resumed their task faster than participants in the other two conditions. Furthermore, a subtle environmental cue was no better than no cue at all. The results support our model of memory for goals.


Archive | 2002

Communicating with Teams of Cooperative Robots

Dennis Perzanowski; Alan C. Schultz; William Adams; Magdalena D. Bugajska; Elaine Marsh; G. Trafton; Derek Brock; Marjorie Skubic; M. Abramson

We are designing and implementing a multi-modal interface to a team of dynamically autonomous robots. For this interface, we have elected to use natural language and gesture. Gestures can be either natural gestures perceived by a vision system installed on the robot, or they can be made by using a stylus on a Personal Digital Assistant. In this paper we describe the integrated modes of input and one of the theoretical constructs that we use to facilitate cooperation and collaboration among members of a team of robots. An integrated context and dialog processing component that incorporates knowledge of spatial relations enables cooperative activity between the multiple agents, both human and robotic.


Human-Robot Interaction | 2007

Improving human-robot interaction through adaptation to the auditory scene

Eric Martinson; Derek Brock

Effective communication with a mobile robot using speech is a difficult problem even when you can control the auditory scene. Robot ego-noise, echoes, and human interference are all common sources of decreased intelligibility. In real-world environments, however, these common problems are supplemented with many different types of background noise sources. For instance, military scenarios might be punctuated by high decibel plane noise and bursts from weaponry that mask parts of the speech output from the robot. Even in non-military settings, however, fans, computers, alarms, and transportation noise can cause enough interference that they might render a traditional speech interface unintelligible. In this work, we seek to overcome these problems by applying robotic advantages of sensing and mobility to a text-to-speech interface. Using perspective taking skills to predict how the human user is being affected by new sound sources, a robot can adjust its speaking patterns and/or reposition itself within the environment to limit the negative impact on intelligibility, making a speech interface easier to use.


Archive | 2005

Cognition and Multi-Agent Interaction: Communicating and Collaborating with Robotic Agents

J. Gregory Trafton; Alan C. Schultz; Nicholas L. Cassimatis; Laura M. Hiatt; Dennis Perzanowski; Derek Brock; Magdalena D. Bugajska; William Adams

Introduction For the last few years, our lab has been attempting to build robots that are similar to humans in a variety of ways. Our goal has been to build systems that think and act like a person rather than look like a person, since the state of the art is not sufficient for a robot to look (even superficially) like a human. We believe that there are at least two reasons to build robots that think and act like a human. First, how an artificial system acts has a profound effect on how people act toward the system. Second, how an artificial system thinks has a profound effect on how people interact with the system.


International Journal of Humanoid Robotics | 2005

COLLABORATING WITH HUMANOID ROBOTS IN SPACE

Donald A. Sofge; Magdalena D. Bugajska; J. Gregory Trafton; Dennis Perzanowski; Scott Thomas; Marjorie Skubic; Samuel Blisard; Nicholas L. Cassimatis; Derek Brock; William Adams; Alan C. Schultz

One of the great challenges of putting humanoid robots into space is developing cognitive capabilities for the robots with an interface that allows human astronauts to collaborate with the robots as naturally and efficiently as they would with other astronauts. In this joint effort with NASA and the entire Robonaut team, we are integrating natural language and gesture understanding, spatial reasoning incorporating such features as human–robot perspective taking, and cognitive model-based understanding to achieve a high level of human–robot interaction. Building greater autonomy into the robot frees the human operator(s) from focusing strictly on the demands of operating the robot, and instead allows the possibility of actively collaborating with the robot to focus on the task at hand. By using shared representations between the human and robot, and enabling the robot to assume the perspectives of the human, the humanoid robot may become a more effective collaborator with a human astronaut for achieving mission objectives in space.


IEEE Transactions on Systems, Man, and Cybernetics | 2013

Auditory Perspective Taking

Eric Martinson; Derek Brock

Effective communication with a mobile robot using speech is a difficult problem even when you can control the auditory scene. Robot self-noise or ego noise, echoes and reverberation, and human interference are all common sources of decreased intelligibility. Moreover, in real-world settings, these problems are routinely aggravated by a variety of sources of background noise. Military scenarios can be punctuated by high decibel noise from materiel and weaponry that would easily overwhelm a robot's normal speaking volume. Moreover, in nonmilitary settings, fans, computers, alarms, and transportation noise can cause enough interference to make a traditional speech interface unusable. This work presents and evaluates a prototype robotic interface that uses perspective taking to estimate the effectiveness of its own speech presentation and takes steps to improve intelligibility for human listeners.
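The core idea of the two auditory perspective-taking papers above, that a robot estimates the speech-to-noise ratio at the listener's position and then either raises its volume or repositions itself, can be sketched roughly as follows. This is a minimal illustration, not the papers' implementation: the function names, the 6 dB-per-doubling propagation model, and all numeric thresholds are assumptions.

```python
import math

def level_at(source_db, source_pos, listener_pos):
    """Estimated sound level at the listener, assuming free-field
    spherical spreading (6 dB loss per doubling of distance)."""
    d = max(0.1, math.dist(source_pos, listener_pos))
    return source_db - 20 * math.log10(d)

def plan_speech(robot_pos, listener_pos, noise_db_at_listener,
                speech_db=70.0, min_snr_db=15.0, max_db=85.0):
    """Pick an action that keeps the estimated speech-to-noise ratio
    at the listener above min_snr_db (illustrative thresholds)."""
    snr = level_at(speech_db, robot_pos, listener_pos) - noise_db_at_listener
    if snr >= min_snr_db:
        return ("speak", speech_db)
    needed = speech_db + (min_snr_db - snr)
    if needed <= max_db:
        return ("raise_volume", needed)
    # Volume alone cannot restore intelligibility; approach the listener.
    return ("move_closer", needed)
```

For example, with the robot 2 m away, quiet background noise yields a plain "speak" action, moderate noise triggers a volume increase, and very loud noise (as in the military scenarios the abstract mentions) forces repositioning.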


Applied Imagery Pattern Recognition Workshop | 2009

Persistence and tracking: Putting vehicles and trajectories in context

Robert Pless; Michael Dixon; Nathan Jacobs; Patrick Baker; Nicholas L. Cassimatis; Derek Brock; Ralph Hartley; Dennis Perzanowski

City-scale tracking of all objects visible in a camera network or aerial video surveillance is an important tool in surveillance and traffic monitoring. We propose a framework for human guided tracking based on explicitly considering the context surrounding the urban multi-vehicle tracking problem. This framework is based on a standard (but state of the art) probabilistic tracking model. Our contribution is to explicitly detail where human annotation of the scene (e.g. “this is a lane”), a track (e.g. “this track is bad”), or a pair of tracks (e.g. “these two tracks are confused”) can be naturally integrated within the probabilistic tracking framework. For an early prototype system, we offer results and examples from tracking in a dense urban traffic camera network, querying data with thousands of vehicles over 30 minutes.


Journal of the Acoustical Society of America | 2006

Aural classification of impulsive‐source active sonar echoes

James W. Pitton; Scott Philips; Les E. Atlas; James A. Ballas; Derek Brock; Brian McClimens; Maxwell H. Miller

The goal of this effort is to develop automatic target classification technology for active sonar systems by exploiting knowledge of signal processing methods and human auditory processing. Using impulsive‐source active sonar data, formal listening experiments were conducted to determine if and how human subjects can discriminate between sonar target and clutter echoes using aural cues alone. Both trained sonar operators and naive listeners at APL‐UW were examined to determine a baseline performance level. This level was found to be well above chance for multiple subjects in both groups, validating the accepted wisdom that there are inherent aural cues separating targets from clutter. In a subsequent experiment, feedback was provided to the naive listeners and classification performance dramatically improved, demonstrating that naive listeners can be trained to a level on par with experts. Using these trained listeners at APL‐UW, a multidimensional scaling (MDS) listening experiment was designed and conducted.
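The multidimensional scaling step mentioned in the abstract above recovers a low-dimensional perceptual space from listeners' pairwise dissimilarity judgments. A minimal sketch of classical MDS follows; the dissimilarity matrix is invented for illustration (two echoes resembling each other and two others forming a second cluster) and is not data from the study.

```python
import numpy as np

# Hypothetical pairwise dissimilarity judgments among 4 echoes
# (symmetric, zero diagonal; values are illustrative only).
D = np.array([
    [0.0, 1.0, 4.0, 4.2],
    [1.0, 0.0, 4.1, 4.0],
    [4.0, 4.1, 0.0, 1.2],
    [4.2, 4.0, 1.2, 0.0],
])

def classical_mds(D, k=2):
    """Embed n items in k dimensions from an n x n dissimilarity matrix."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    B = -0.5 * J @ (D ** 2) @ J           # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(B)        # eigenvalues in ascending order
    idx = np.argsort(vals)[::-1][:k]      # keep the top-k components
    scale = np.sqrt(np.maximum(vals[idx], 0.0))
    return vecs[:, idx] * scale

coords = classical_mds(D, k=2)
```

In the resulting embedding, the distance between the first two echoes comes out much smaller than the distance across the two clusters, mirroring the structure of the input judgments.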

Collaboration


Dive into Derek Brock's collaborations.

Top Co-Authors

Brian McClimens, United States Naval Research Laboratory

Dennis Perzanowski, United States Naval Research Laboratory

Alan C. Schultz, United States Naval Research Laboratory

James A. Ballas, United States Naval Research Laboratory

William Adams, United States Naval Research Laboratory

Charles F. Gaumond, United States Naval Research Laboratory

J. Gregory Trafton, United States Naval Research Laboratory

Magdalena D. Bugajska, United States Naval Research Laboratory

Nicholas L. Cassimatis, United States Naval Research Laboratory

Christina Wasylyshyn, United States Naval Research Laboratory