Cindy L. Bethel
University of South Florida
Publications
Featured research published by Cindy L. Bethel.
Systems, Man, and Cybernetics | 2008
Cindy L. Bethel; Robin R. Murphy
Non-facial and non-verbal methods of affective expression are essential for naturalistic social interaction in robots that are designed to be functional and lack expressive faces (appearance-constrained) such as those used in search and rescue, law enforcement, and military applications. This correspondence identifies five main methods of non-facial and non-verbal affective expression (body movement, posture, orientation, color, and sound), and ranks their effectiveness for appearance-constrained robots operating within the intimate, personal, and social proximity zones of a human, corresponding to inter-agent distances of approximately 3 m or less. This distance is significant because it encompasses the most common human social interaction distances, the exception being the public distance zone used for formal presentations. The correspondence complements prior, broad surveys of affective expression by reviewing the psychology, computer science, and robotics literature specifically relating to the impact of social interaction in non-anthropomorphic and appearance-constrained robots, and by summarizing robotic implementations that utilize non-facial and non-verbal methods of affective expression as their primary means of expression. The literature is distilled into a set of prescriptive recommendations of the appropriate affective expression methods for each of the three proximity zones of interest. These recommendations serve as design guidelines for retroactively adding affective expression to a robot through software, without physical modifications, or for designing a new robot.
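The prescriptive recommendations described above amount to a per-zone lookup of which expression methods to enable, something that can be added in software without modifying the robot. A minimal sketch of such a structure follows; the five methods and three zone names come from the abstract, but the specific assignments in the table are placeholders, not the paper's actual rankings.

```python
# Hypothetical sketch of encoding per-zone expression guidelines as a lookup
# table that could be layered onto a robot's control software. Which methods
# appear in which zone here is a placeholder assumption, not the paper's ranking.

EXPRESSION_METHODS = {"body_movement", "posture", "orientation", "color", "sound"}

ZONE_GUIDELINES = {                       # placeholder assignments
    "intimate": {"color", "sound"},
    "personal": {"color", "sound", "orientation"},
    "social": {"body_movement", "posture", "orientation"},
}

def methods_for_zone(zone: str) -> set:
    """Return the affective-expression methods enabled for a proximity zone."""
    methods = ZONE_GUIDELINES.get(zone, set())
    assert methods <= EXPRESSION_METHODS   # guard against typos in the table
    return methods

print(methods_for_zone("personal"))        # {'color', 'sound', 'orientation'}
```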
International Journal of Social Robotics | 2010
Cindy L. Bethel; Robin R. Murphy
This article provides an overview on planning, designing, and executing human studies for Human-Robot Interaction (HRI) that leads to ten recommendations for experimental design and study execution. Two improvements are described, using insights from the psychology and social science disciplines. First is to use large sample sizes to better represent the populations being investigated to have a higher probability of obtaining statistically significant results. Second is the application of three or more methods of evaluation to have reliable and accurate results, and convergent validity. Five primary methods of evaluation exist: self-assessments, behavioral observations, psychophysiological measures, interviews, and task performance metrics. The article describes specific tools and procedures for operationalizing these improvements, as well as suggestions for recruiting participants. A recent large-scale, complex, controlled human study in HRI using 128 participants and four methods of evaluation is presented to illustrate planning, design, and execution choices.
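To make the sample-size recommendation concrete, here is an illustrative power calculation for a simple two-group comparison; the effect size, alpha, and power values are conventional assumptions, not numbers taken from the study.

```python
# Illustrative sketch: how many participants a two-group comparison needs to
# detect an assumed medium effect. Values are conventional defaults, not the
# study's parameters.
from statsmodels.stats.power import TTestIndPower

n_per_group = TTestIndPower().solve_power(
    effect_size=0.5,          # assumed Cohen's d (medium effect)
    alpha=0.05,               # significance level
    power=0.8,                # desired statistical power
    alternative="two-sided",
)
print(f"participants needed per group: {n_per_group:.0f}")  # roughly 64 per group
```

With two groups this works out to roughly 128 participants, the same order of magnitude as the study cited above, though the article's own planning may have used different assumptions.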
Human-Robot Interaction | 2006
Cindy L. Bethel; Robin R. Murphy
The use of affective expression and social interaction is an emerging area of importance in robotics, with the focus historically on facial expressions and/or animal mimicry [5], [10]. However, a large number of mobile robots currently in use for applications such as law enforcement, military, and search and rescue are not anthropomorphic, do not have any method of projecting facial expressions, and cannot be reengineered explicitly to support affective expression. This poses significant challenges in how these appearance-constrained robots will support naturalistic human-robot interaction. Fincannon et al. provide an example of how rescue workers conducting breaching expected a small tank-like robot to follow social conventions despite the robot's non-anthropomorphic appearance [9]. Work by Murphy et al. [13] on using man-packable robots as a surrogate presence for doctors tending to trapped victims identifies how the robot will interact with the victim as one of the four major open research issues. They noted that the robots operating within 3 meters of the simulated victims were perceived as “creepy” and not reassuring. In each of these cases, the robots were operating in highly confined spaces, and the addition of faces or other devices might interfere with the critical attribute of mobility. Our work focuses on affective expression in non-anthropomorphic and appearance-constrained robots for human-robot interactions occurring within three meters. Appearance-constrained robots are not engineered to be anthropomorphic and do not have the ability to exhibit facial expressions. Application limitations include mobility,
Collaboration Technologies and Systems | 2007
Cindy L. Bethel; Jennifer L. Burke; Robin R. Murphy; Kristen Salomon
This paper outlines key experimental design issues associated with the use of psychophysiological measures in human-robot interaction (HRI) studies and summarizes related studies. Psychophysiological measurements are one tool for evaluating participants' reactions to a robot with which they are interacting. A brief review of psychophysiology is provided which includes: physiological activities and response tendencies; common psychophysiological measures; and advantages/issues related to psychophysiological measures. Psychophysiological experimental design recommendations are given for information required from participants before the psychophysiological measures are performed; a method to reduce habituation; post-testing assessment process; determining adequate sample sizes; and testing methods commonly used in HRI studies with recommended electrode placements. Psychophysiological measures should be utilized as part of a multi-faceted approach to experimental design including self-assessments, participant interviews, and/or video-recorded data collection methods over the course of an experimental study. Two or more methods of measurement should be utilized for convergent validity. Although psychophysiological measures may not be appropriate for all HRI studies, they can provide a valuable evaluation tool of participants' responses when properly incorporated into a multi-faceted experimental design.
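As a small illustration of the convergent-validity point, the sketch below correlates a self-assessment score with a psychophysiological measure on synthetic data; the data and the specific test are assumptions for illustration, not an analysis from the paper.

```python
# Minimal sketch of checking convergent validity between two evaluation methods,
# e.g. a self-reported anxiety rating and mean heart rate. The data are synthetic.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
self_report = rng.normal(3.0, 1.0, size=30)                     # e.g. 1-5 anxiety ratings
heart_rate = 70 + 5 * self_report + rng.normal(0, 4, size=30)   # correlated by construction

r, p = pearsonr(self_report, heart_rate)
print(f"r = {r:.2f}, p = {p:.3f}")  # measures of the same construct should correlate
```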
Human-Robot Interaction | 2007
Cindy L. Bethel; Robin R. Murphy
This work applies a previously developed set of heuristics for determining when to use non-facial/non-verbal methods of affective expression to the domain of a robot being used for victim assessment in the aftermath of a disaster. Robot-assisted victim assessment places a robot approximately three meters or less from a victim, and the path of the robot traverses three proximity zones (intimate (contact - 0.46 m), personal (0.46 - 1.22 m), and social (1.22 - 3.66 m)). Robot- and victim's-eye views of an Inuktun robot were collected as it followed a path around the victim. The path was derived from observations of a prior robot-assisted medical reachback study. The victim's-eye views of the robot from seven points of interest on the path illustrate the appropriateness of each of the five primary non-facial/non-verbal methods of affective expression (body movement, posture, orientation, illuminated color, and sound), offering support for the heuristics as a design aid. In addition to supporting the heuristics, the investigation identified three open research questions on acceptable motions and the impact of the surroundings on robot affect.
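The zone boundaries quoted above translate directly into a small classifier, sketched below; the function is illustrative rather than code from the study.

```python
# Sketch of mapping an inter-agent distance onto the proximity zones quoted above:
# intimate (contact - 0.46 m), personal (0.46 - 1.22 m), social (1.22 - 3.66 m).

def proximity_zone(distance_m: float) -> str:
    """Classify a robot-to-victim distance (meters) into a proximity zone."""
    if distance_m < 0:
        raise ValueError("distance must be non-negative")
    if distance_m <= 0.46:
        return "intimate"
    if distance_m <= 1.22:
        return "personal"
    if distance_m <= 3.66:
        return "social"
    return "public"  # beyond 3.66 m, outside the zones considered here

for d in (0.2, 0.8, 2.5, 5.0):
    print(d, proximity_zone(d))   # intimate, personal, social, public
```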
Paladyn | 2010
Cindy L. Bethel; Robin R. Murphy
Non-facial and non-verbal methods of affective expression are essential for social interaction in appearance-constrained robots such as those used in search and rescue, law enforcement, and military applications. This research identified five main methods of non-facial and non-verbal affective expression (body movements, postures, orientation, color, and sound). Based on an extensive review of the literature, prescriptive design recommendations were developed for the appropriate non-facial and non-verbal affective expression methods for three proximity zones of interest (intimate, personal, and social). These design recommendations serve as guidelines for retroactively adding affective expression through software, with minimal or no physical modification to a robot. A large-scale, complex human-robot interaction study was conducted to validate these design recommendations using 128 participants and four methods of evaluation. The study was conducted in a high-fidelity, confined-space simulated disaster site with all robot interactions performed in the dark. Statistically significant results indicated that participants felt the robots that exhibited affective expressions were more calming, friendly, and attentive, which improved the social human-robot interactions.
International Conference on Pattern Recognition | 2006
Cindy L. Bethel; Lawrence O. Hall; Dmitry B. Goldgof
Accruing patients for clinical trials has been a tedious and time-consuming task for clinicians. It requires extensive knowledge of the specific criteria for all available clinical trials. Through interviews with clinicians, implications were discovered which reduced the number of required questions/answers to determine eligibility. After gathering and recording data on past breast cancer patients, the answers to the questions asked by an expert system were extracted. An association rule learner was used to generate implication rules such as: male => not pregnant. It was determined that all current implication rules could be recovered with 100% confidence. Further searching for additional rules resulted in the discovery of several that improved the clinical ease of use of the Web-based clinical trial assignment expert system.
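As a toy illustration of recovering implication-style rules such as male => not pregnant, the sketch below brute-forces single-antecedent rules with 100% confidence over a made-up boolean patient table; the paper does not specify which rule learner it used, so this is an assumption, not its method.

```python
# Illustrative brute-force search for implication rules (confidence = 1.0) over
# boolean patient attributes. Data are invented; the paper's rule learner is unnamed.
import itertools
import pandas as pd

records = pd.DataFrame(
    {
        "male": [True, True, False, False, True],
        "not_pregnant": [True, True, False, True, True],
        "over_50": [False, True, True, False, True],
    }
)

min_support = 0.3  # antecedent must hold in at least 30% of records

for antecedent, consequent in itertools.permutations(records.columns, 2):
    support = records[antecedent].mean()
    if support < min_support:
        continue
    confidence = records.loc[records[antecedent], consequent].mean()  # P(consequent | antecedent)
    if confidence == 1.0:
        print(f"{antecedent} => {consequent} (support={support:.2f})")
```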
Human-Robot Interaction | 2009
Cindy L. Bethel; Kristen Salomon; Robin R. Murphy
This paper describes preliminary results of a large-scale, complex human study in HRI in which results show that participants were calmer interacting with non-anthropomorphic robots operated in an emotive mode versus a standard, non-emotive mode.
International Symposium on Safety, Security, and Rescue Robotics | 2008
Brian Day; Cindy L. Bethel; Robin R. Murphy; Jennifer L. Burke
This paper describes a visual display that provides the depth of objects to be grasped and was developed at the request of a local bomb squad for use with a bomb disposal robot. The display provides four key functions: (1) it allows the operator to extract the distance between the robot's grasper and the object represented by each pixel, (2) it cues the operator when the object is within a predefined distance from the robot's grasper, (3) it can track the object in the video display, and (4) it can continuously display the distance from the robot's grasper to the selected object. The display was designed specifically for the Canesta EP200 mounted on a Remotec mini-max robot, but the display functionality is expected to be useful for any robot grasper used in conjunction with a 3D sensor. While the usability of the visual display and its impact on grasper-related performance have not been formally evaluated, the informal feedback from the subject matter experts is that this display meets their requirements.
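A rough sketch of functions (2) and (4) is shown below, using a plain depth image in place of the Canesta sensor's output; the array, pixel selection, and threshold are stand-ins, not code from the deployed display.

```python
# Sketch of distance read-out and proximity cueing from a depth image (meters).
# A real implementation would read frames from the 3D sensor and track the pixel.
import numpy as np

def distance_at(depth_m: np.ndarray, pixel: tuple) -> float:
    """Distance (m) from the sensor to the object at a selected pixel."""
    row, col = pixel
    return float(depth_m[row, col])

def grasp_cue(depth_m: np.ndarray, pixel: tuple, threshold_m: float = 0.15) -> bool:
    """True when the selected object is within the predefined grasp distance."""
    return distance_at(depth_m, pixel) <= threshold_m

depth = np.full((64, 64), 1.0)   # fake 64x64 depth frame, everything 1 m away
depth[30:34, 30:34] = 0.12       # a nearby object in front of the grasper
print(distance_at(depth, (32, 32)), grasp_cue(depth, (32, 32)))   # 0.12 True
```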
North American Fuzzy Information Processing Society | 2002
Scott Dick; Cindy L. Bethel; Abraham Kandel
We report on an experimental investigation of software reliability data. Our hypothesis in this investigation is that software failures are the result of a fundamentally deterministic process, rather than being realizations of a stochastic process as is commonly assumed. Using the techniques of nonlinear time series analysis, we examine three software reliability datasets for the signatures of deterministic, and possibly chaotic, behavior. In these datasets, we have found firm evidence of deterministic behavior, and hints of chaotic behavior. However, the latter are too limited to permit a definitive conclusion about the presence or absence of chaotic behavior.
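The first step in this kind of analysis is typically a time-delay embedding of the inter-failure-time series; the sketch below builds one on synthetic data (the paper's reliability datasets are not reproduced here), after which measures such as correlation dimension or Lyapunov exponents would be estimated.

```python
# Minimal sketch of a time-delay embedding, the standard starting point for
# nonlinear time-series analysis. The series is synthetic, and the delay and
# embedding dimension are illustrative choices.
import numpy as np

def delay_embed(series: np.ndarray, dim: int, tau: int) -> np.ndarray:
    """Return an (n_points x dim) delay-embedding matrix of a 1-D series."""
    n = len(series) - (dim - 1) * tau
    return np.column_stack([series[i * tau : i * tau + n] for i in range(dim)])

rng = np.random.default_rng(1)
failure_times = np.cumsum(rng.exponential(scale=10.0, size=200))  # stand-in failure log
inter_failure = np.diff(failure_times)                            # times between failures

embedded = delay_embed(inter_failure, dim=3, tau=2)
print(embedded.shape)  # (195, 3): points in a 3-D reconstructed state space
```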