Network


Latest external collaborations at the country level. Dive into the details by clicking on the dots.

Hotspot


Dive into the research topics where James Everett Young is active.

Publication


Featured research published by James Everett Young.


interactive tabletops and surfaces | 2009

The Haptic Tabletop Puck: tactile feedback for interactive tabletops

Nicolai Marquardt; Miguel A. Nacenta; James Everett Young; M. Sheelagh T. Carpendale; Saul Greenberg; Ehud Sharlin

In everyday life, our interactions with objects on real tables include how our fingertips feel those objects. In comparison, current digital interactive tables present a uniform touch surface that feels the same regardless of what it presents visually. In this paper, we explore how tactile interaction can be used with digital tabletop surfaces. We present a simple and inexpensive device -- the Haptic Tabletop Puck -- that incorporates dynamic, interactive haptics into tabletop interaction. We created several applications that explore tactile feedback in the areas of haptic information visualization, haptic graphical interfaces, and computer-supported collaboration. In particular, we focus on how a person may interact with the friction, height, texture, and malleability of digital objects.
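The abstract's idea of letting data "feel" different under the puck can be illustrated with a minimal sketch. This is not the authors' implementation: the value range, device parameters, and function names below are assumptions, showing only the general pattern of mapping a data value to the puck's height and friction channels.

```python
# Illustrative sketch (hypothetical API, not the Haptic Tabletop Puck's code):
# map a normalized data value under the puck to a rod height and a brake
# friction level, so that "hot" regions of a visualization feel tall and sticky.

def haptic_output(value, max_rod_height_mm=20.0, max_friction=1.0):
    """Map a data value in [0, 1] to (rod height in mm, brake friction)."""
    v = min(max(value, 0.0), 1.0)      # clamp to [0, 1]
    height_mm = v * max_rod_height_mm  # taller rod = larger value
    friction = v * max_friction        # more brake = "stickier" region
    return height_mm, friction

# e.g. a hot spot in a haptic heat-map:
print(haptic_output(0.75))  # (15.0, 0.75)
```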


human-robot interaction | 2013

Communicating affect via flight path: exploring use of the laban effort system for designing affective locomotion paths

Megha Sharma; Dale Hildebrandt; Gem Newman; James Everett Young; M. Rasit Eskicioglu

People and animals use various kinds of motion in a multitude of ways to communicate their ideas and affective state, such as their moods or emotions. Further, people attribute affect and personality to the movements of even non-lifelike entities based solely on the style of their motions; e.g., the locomotion style of a geometric shape (how it moves about) can be interpreted as shy, aggressive, etc. We investigate how robots can leverage this locomotion-style communication channel for communicating with people. Specifically, our work deals with designing stylistic flying-robot locomotion paths for communicating affective state. To author and unpack the parameters of affect-oriented flying-robot locomotion styles, we employ the Laban Effort System, a standard method for interpreting human motion commonly used in the performing arts. This paper describes our adaptation of the Laban Effort System to author motions for flying robots, and the results of a formal experiment that investigated how various Laban Effort System parameters influence people's perception of the resulting robotic motions. We conclude with a set of guidelines to aid designers in using the Laban Effort System to author flying-robot motions that elicit desired affective responses.
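The mapping from Laban Effort factors to flight-path style can be sketched roughly as follows. The paper's actual parameterization is not given in the abstract, so the four factors, their poles, and the path parameters below are an assumed, simplified reading of the Laban Effort System for illustration only.

```python
# Hypothetical sketch: translate the four Laban Effort factors, each scored
# in [-1, 1] between its two poles, into flight-path style parameters.
# The specific mapping below is an assumption, not the paper's design.

def effort_to_path(space, weight, time, flow):
    """space: indirect(-1)..direct(+1); weight: light..strong;
    time: sustained..sudden; flow: free..bound."""
    return {
        "curviness":     (1 - space) / 2,   # indirect paths wander more
        "altitude_drop": (weight + 1) / 2,  # strong = heavy, flies lower
        "speed":         1.0 + time,        # sudden = faster bursts
        "jerk_limit":    (flow + 1) / 2,    # bound = more constrained motion
    }

# e.g. an indirect, sudden, free motion (read as playful or erratic):
print(effort_to_path(space=-1, weight=0, time=1, flow=-1))
# {'curviness': 1.0, 'altitude_drop': 0.5, 'speed': 2.0, 'jerk_limit': 0.0}
```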


human-robot interaction | 2007

Robot expressionism through cartooning

James Everett Young; Min Xin; Ehud Sharlin

We present a new technique for human-robot interaction called robot expressionism through cartooning. We suggest that robots utilise cartoon-art techniques such as simplified and exaggerated facial expressions, stylised text, and icons for intuitive social interaction with humans. We discuss practical mixed-reality solutions that allow robots to augment themselves or their surroundings with cartoon-art content. Our effort is part of what we call robot expressionism, a conceptual approach to the design and analysis of robotic interfaces that focuses on providing intuitive insight into robotic states as well as the artistic quality of interaction. Our paper discusses a variety of ways that allow robots to use cartoon art and details a test bed design, implementation, and exploratory evaluation. We describe our test bed, Jeeves, which uses a Roomba (iRobot's vacuum-cleaning robot) and a mixed-reality system as a platform for rapid prototyping of cartoon-art interfaces. Finally, we present a set of interaction content scenarios that use the Jeeves prototype: trash Roomba, the recycle police, and clean tracks, as well as an initial exploratory evaluation of our approach.


human factors in computing systems | 2009

Touch and toys: new techniques for interaction with a remote group of robots

Cheng Guo; James Everett Young; Ehud Sharlin

Interaction with a remote team of robots in real time is a difficult human-robot interaction (HRI) problem exacerbated by the complications of unpredictable real-world environments, with solutions often resorting to a larger-than-desirable ratio of operators to robots. We present two innovative interfaces that allow a single operator to interact with a group of remote robots. Using a tabletop computer, the user can configure and manipulate groups of robots directly by either using their fingers (touch) or by manipulating a set of physical toys (tangible user interfaces). We recruited participants to partake in a user study that required them to interact with a small group of remote robots in simple tasks, and present our findings as a set of design considerations.


symposium on computer animation | 2008

Puppet Master: designing reactive character behavior by demonstration

James Everett Young; Takeo Igarashi; Ehud Sharlin

Puppet Master is a system that enables designers to rapidly create interactive and autonomous animated character behaviors that react to a main character controlled by an end-user. The behavior is designed by demonstration, allowing non-technical artists to intuitively design the style, personality, and emotion of the character, traits that are very difficult to design using conventional programming. During training, designers demonstrate paired behavior between the main and reacting characters. During run time, the end-user controls the main character and the system synthesizes the motion of the reacting character using the given training data. The algorithm is an extension of Image Analogies [HJO*01], modified to synthesize dynamic character behavior instead of an image. We introduce non-trivial extensions to the algorithm such as our selection of features, dynamic balancing between similarity metrics, and separate treatment of path trajectory and high-frequency motion texture. We implemented a prototype system using physical pucks tracked by a motion-capture system and conducted a user study demonstrating that novice users can easily and successfully design character personality and emotion using our system and that the resulting behaviors are meaningful and engaging.
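The synthesis-by-demonstration idea can be reduced to a toy example: record pairs of (main-character state, reacting-character motion) during training, then at run time replay the motion recorded for the most similar demonstrated state. Puppet Master's actual algorithm extends Image Analogies with richer features and dynamic metric balancing; the 1-nearest-neighbor lookup, feature choice, and distance below are deliberate simplifications.

```python
# Illustrative sketch only: behavior-by-demonstration as a 1-nearest-neighbor
# lookup over demonstrated (main state, reactor move) pairs. Not the paper's
# algorithm, which builds on Image Analogies with more sophisticated matching.
import math

def synthesize(training_pairs, main_state):
    """training_pairs: list of ((x, y) main position, (dx, dy) reactor move).
    Return the reactor move recorded for the most similar demonstrated state."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    _, move = min(training_pairs, key=lambda pair: dist(pair[0], main_state))
    return move

demos = [((0, 0), (1, 0)),    # when the main puck was here, the reactor advanced
         ((5, 5), (-1, -1))]  # ... and here, it retreated
print(synthesize(demos, (4, 6)))  # nearest demo is (5, 5) -> (-1, -1)
```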


human factors in computing systems | 2014

Involving children in content control: a collaborative and education-oriented content filtering approach

Yasmeen Hashish; Andrea Bunt; James Everett Young

We present an approach to content control in which parents and children collaboratively configure restrictions and filters, an approach that focuses on education rather than simple rule setting. We conducted an initial exploratory qualitative study, with results highlighting the importance that parents place on avoiding inappropriate content. Building on these findings, we designed an initial prototype that allows parents and children to work together to select appropriate applications, providing an opportunity for parents to educate their children on what is appropriate. A second qualitative study with parents and children aged six to eight revealed a favorable response to this approach. Our results suggest that parents felt this approach helped facilitate discussions with their children and made the education more enjoyable and approachable, and that children may have also learned from the interaction. In addition, the approach provided some parents with insights into their children's interests and their notions of appropriate and inappropriate content.


robot and human interactive communication | 2011

How to walk a robot: A dog-leash human-robot interface

James Everett Young; Yoichi Kamiyama; Juliane Reichenbach; Takeo Igarashi; Ehud Sharlin

Human-robot interaction (HRI) tasks in everyday environments will require people to direct or lead a robot as they walk in close proximity to it. Tasks that exemplify this interaction include a robotic porter, carrying heavy suitcases, or a robot carrying groceries. As many users may not be robotics experts, we argue that such interaction schemes must be accessible, easy to use and understand. In this paper, we present a dog-leash interface that enables a person to lead a robot simply by holding the leash, following a dog-leash interaction metaphor. We introduce variants on dog-leash robotic interaction, present our original interface implementation, and detail our formal qualitative evaluation, exploring how users perceive and accept the dog leash robotic interaction.
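The leash metaphor implies a simple control loop: steer toward the leash's pull direction and speed up with tension. The sensor inputs, gains, and function below are hypothetical, sketched only to illustrate the interaction idea, not the paper's implementation.

```python
# Hypothetical sketch of a dog-leash control loop. The robot follows the
# person by aligning with the leash direction and scaling speed by pull
# tension. All names and gains here are assumptions, not the paper's design.
import math

def leash_command(tension, angle_rad, max_speed=1.0, gain=0.5):
    """tension: 0..1 normalized pull; angle_rad: leash direction in the
    robot's frame (0 = straight ahead). Returns (forward_speed, turn_rate)."""
    speed = min(tension, 1.0) * max_speed * math.cos(angle_rad)
    turn = gain * angle_rad          # steer to align with the leash
    return max(speed, 0.0), turn     # never drive backward

print(leash_command(tension=0.8, angle_rad=0.0))  # (0.8, 0.0): pulled straight ahead
```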


human-robot interaction | 2012

Animal-inspired human-robot interaction: a robotic tail for communicating state

Ashish Singh; James Everett Young

We present a robotic tail interface for enabling a robot to communicate its state to people. Our interface design follows an animal-inspired methodology in which we map the robot's state to its tail output, leveraging people's existing knowledge of and experiences with animals for human-robot interaction. In this paper we detail our robotic-tail design approach and our prototype implementations, and outline our future steps.


human-robot interaction | 2012

The Roomba mood ring: an ambient-display robot

Daniel J. Rea; James Everett Young; Pourang Irani

We present a robot augmented with an ambient display that communicates using a multi-color halo. We use this robot in a public café-style setting where people vote on which colors the robot will display: we ask people to select a color which “best represents their mood.” People can vote from a mobile device (e.g., smart phone or laptop) through a web interface. Thus, the robot's display is an abstract aggregate of the current mood of the room. Our research investigates how a robot with an ambient display may integrate into a space. For example, how will the robot alter how people use or perceive the environment, and how will people interact with the robot itself? In this paper we describe our initial prototype, an iRobot Roomba augmented with lights, and highlight the research questions driving our exploration, including our initial study design.
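The "abstract aggregate of the current mood" can be sketched as a vote-pooling step. The abstract does not say how votes are combined, so the mean-of-RGB scheme and idle color below are assumptions made purely for illustration.

```python
# Hypothetical sketch of mood-vote aggregation for an ambient halo display:
# average the submitted RGB votes into one display color. The real mood
# ring's aggregation scheme is not specified in the abstract.

def aggregate_mood(votes):
    """votes: list of (r, g, b) tuples in 0..255. Return the average color."""
    if not votes:
        return (255, 255, 255)  # assumed idle default: plain white halo
    n = len(votes)
    return tuple(sum(c[i] for c in votes) // n for i in range(3))

votes = [(255, 0, 0), (0, 0, 255), (255, 0, 255)]  # two reddish votes, one blue
print(aggregate_mood(votes))  # (170, 0, 170)
```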


human-robot interaction | 2015

Poor Thing! Would You Feel Sorry for a Simulated Robot?: A comparison of empathy toward a physical and a simulated robot

Stela H. Seo; Denise Geiskkovitch; Masayuki Nakane; Corey King; James Everett Young

In designing and evaluating human-robot interactions and interfaces, researchers often use a simulated robot due to the high cost of robots and the time required to program them. However, it is important to consider how interaction with a simulated robot differs from a real robot; that is, do simulated robots provide authentic interaction? We contribute to a growing body of work that explores this question and maps out simulated-versus-real differences by explicitly investigating empathy: how people empathize with a physical or simulated robot when something bad happens to it. Our results suggest that people may empathize more with a physical robot than a simulated one, a finding that has important implications for the generalizability and applicability of simulated HRI work. Empathy is particularly relevant to social HRI and is integral to, for example, companion and care robots. Our contribution additionally includes an original and reproducible HRI experimental design to induce empathy toward robots in laboratory settings, and an experimentally validated empathy-measuring instrument from psychology for use in HRI.

Collaboration


Dive into James Everett Young's collaborations.

Top Co-Authors

Andrea Bunt

University of Manitoba
