Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Jeffrey S. Shell is active.

Publication


Featured research published by Jeffrey S. Shell.


Communications of the ACM | 2003

Interacting with groups of computers

Jeffrey S. Shell; Ted Selker; Roel Vertegaal

Attentive user interfaces (AUIs) recognize human attention in order to respect and react to how users distribute their attention in technology-laden environments.


Computers in Human Behavior | 2006

Designing for augmented attention: Towards a framework for attentive user interfaces

Roel Vertegaal; Jeffrey S. Shell; Daniel Chen; Aadil Mamuji

Attentive user interfaces are user interfaces that aim to support the user’s attentional capacities. By sensing the users’ attention for objects and people in their everyday environment, and by treating user attention as a limited resource, these interfaces avoid today’s ubiquitous patterns of interruption. Focusing upon attention as a central interaction channel allows development of more sociable methods of communication and repair with ubiquitous devices. Our methods are analogous to human turn taking in group communication. Turn taking improves the user’s ability to conduct foreground processing of conversations. Attentive user interfaces bridge the gap between the foreground and periphery of user activity in a similar fashion, allowing users to move smoothly in between. We present a framework for augmenting user attention through attentive user interfaces. We propose five key properties of attentive systems: (i) to sense attention; (ii) to reason about attention; (iii) to regulate interactions; (iv) to communicate attention; and (v) to augment attention.
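
The five properties listed in this abstract suggest a natural programmatic decomposition, one responsibility per property. The following is only a rough sketch of that idea; the paper does not define an API, and every class and method name below is hypothetical.

```python
# Hypothetical sketch of the five key properties of an attentive system,
# one abstract method per property. Names are illustrative only.
from abc import ABC, abstractmethod
from dataclasses import dataclass


@dataclass
class AttentionState:
    target: str        # what the user currently appears to attend to
    confidence: float  # sensor's confidence in that estimate, 0.0-1.0


class AttentiveSystem(ABC):
    @abstractmethod
    def sense_attention(self) -> AttentionState:
        """(i) Sense where the user's attention is directed."""

    @abstractmethod
    def reason_about_attention(self, state: AttentionState) -> bool:
        """(ii) Reason about attention, e.g. decide whether the user is interruptible."""

    @abstractmethod
    def regulate_interaction(self, interruptible: bool, message: str) -> None:
        """(iii) Regulate interactions: deliver now or defer to the periphery."""

    @abstractmethod
    def communicate_attention(self) -> None:
        """(iv) Communicate the system's own attentional state back to the user."""

    @abstractmethod
    def augment_attention(self) -> None:
        """(v) Augment the user's limited attentional resources."""
```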


Human Factors in Computing Systems | 2003

EyePliances: attention-seeking devices that respond to visual attention

Jeffrey S. Shell; Roel Vertegaal; Alexander W. Skaburskis

We present EyePliances: appliances and devices that detect and respond to human visual attention using eye contact sensors. EyePliances receive implicit input from users, in the form of eye gaze, and respond by opening communication channels. By allowing devices to recognize the attentional cues people already provide, requests for explicit input from users can be reduced. Further, eye contact sensing gives devices a mechanism to determine whether a user is available for interruption, and can provide the missing environmental context to improve speech recognition.
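
As a rough illustration of the interaction pattern described above, where implicit gaze input gates explicit channels and interruptions, the sketch below shows a device that listens for speech commands and delivers deferred notifications only while an eye contact sensor reports that the user is looking at it. The sensor interface and all names are assumptions, not the authors' implementation.

```python
# Hypothetical EyePliance-style gating: the speech channel opens and
# deferred notifications are delivered only while an eye contact sensor
# reports that the user is looking at the device. The sensor class is a
# stand-in for real hardware.
import random
import time


class EyeContactSensor:
    """Simulated eye contact sensor; returns True when a user appears
    to be making eye contact with the device."""

    def user_is_looking(self) -> bool:
        return random.random() < 0.3  # random stand-in for real detections


class EyePliance:
    def __init__(self, name: str, sensor: EyeContactSensor) -> None:
        self.name = name
        self.sensor = sensor
        self.pending: list[str] = []  # notifications deferred until attended

    def notify(self, message: str) -> None:
        self.pending.append(message)

    def step(self) -> None:
        if self.sensor.user_is_looking():
            # The user has granted attention: open the explicit channel
            # and deliver anything that was held back.
            print(f"[{self.name}] listening for speech commands")
            while self.pending:
                print(f"[{self.name}] {self.pending.pop(0)}")
        # Otherwise stay quiet rather than interrupting.


if __name__ == "__main__":
    lamp = EyePliance("lamp", EyeContactSensor())
    lamp.notify("bulb is nearing end of life")
    for _ in range(10):
        lamp.step()
        time.sleep(0.1)
```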


ACM Workshop on Continuous Archival and Retrieval of Personal Experiences | 2004

Augmenting and sharing memory with eyeBlog

Connor Dickie; Roel Vertegaal; David Fono; Changuk Sohn; Daniel Chen; Daniel Cheng; Jeffrey S. Shell; Omar Aoudeh

eyeBlog is an automatic personal video recording and publishing system. It consists of ECSGlasses [1], which are a pair of glasses augmented with a wireless eye contact and glyph sensing camera, and a web application that visualizes the video from the ECSGlasses camera as chronologically delineated blog entries. The blog format allows for easy annotation, grading, cataloging and searching of video segments by the wearer or anyone else with internet access. eyeBlog reduces the editing effort of video bloggers by recording video only when something of interest is registered by the camera. Interest is determined by a combination of independent methods. For example, recording can automatically be triggered upon detection of eye contact towards the wearer of the glasses, allowing all face-to-face interactions to be recorded. Recording can also be triggered by the detection of image patterns such as glyphs in the frame of the camera. This allows the wearer to record their interactions with any object that has an associated unique marker. Finally, by pressing a button the user can manually initiate recording.
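
The recording policy described here (record on eye contact toward the wearer, on recognition of a glyph, or on a manual button press) is essentially a disjunction of independent triggers. A minimal sketch of that policy, with all sensor inputs faked as booleans since the real system derives them from the ECSGlasses camera and a physical button:

```python
# Minimal sketch of the eyeBlog recording policy: record whenever any one
# of the independent interest signals is active. Inputs are plain booleans
# here; the real system derives them from the ECSGlasses camera and a button.
from dataclasses import dataclass


@dataclass
class InterestSignals:
    eye_contact: bool      # someone is making eye contact with the wearer
    glyph_detected: bool   # a known marker glyph is visible in the frame
    button_pressed: bool   # the wearer manually requested recording


def should_record(signals: InterestSignals) -> bool:
    """Recording runs while at least one trigger is active."""
    return signals.eye_contact or signals.glyph_detected or signals.button_pressed


if __name__ == "__main__":
    print(should_record(InterestSignals(True, False, False)))   # True
    print(should_record(InterestSignals(False, False, False)))  # False
```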


Eye Tracking Research & Applications | 2004

ECSGlasses and EyePliances: using attention to open sociable windows of interaction

Jeffrey S. Shell; Roel Vertegaal; Daniel Cheng; Alexander W. Skaburskis; Changuk Sohn; A. James Stewart; Omar Aoudeh; Connor Dickie

We present ECSGlasses: wearable eye contact sensing glasses that detect human eye contact. ECSGlasses report eye contact to digital devices, appliances and EyePliances in the user's attention space. Devices use this attentional cue to engage in a more sociable process of turn taking with users. This has the potential to reduce inappropriate intrusions and limit their disruptiveness. We describe new prototype systems, including the Attentive Messaging Service (AMS), the Attentive Hit Counter, the first-person attentive camcorder eyeBlog, and an updated Attentive Cell Phone. We also discuss the potential of these devices to open new windows of interaction using attention as a communication modality. Further, we present a novel signal-encoding scheme to uniquely identify EyePliances and users wearing ECSGlasses in multiparty scenarios.
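
The abstract mentions a signal-encoding scheme for uniquely identifying EyePliances and ECSGlasses wearers, but does not describe the encoding itself. Purely as a hypothetical illustration of the routing problem such identification solves, the sketch below tags each eye-contact event with a device identifier and delivers it to the matching handler; the identifier format and all names are invented and should not be read as the paper's scheme.

```python
# Hypothetical routing of eye-contact events in a multiparty setting:
# each event carries a device identifier and a wearer identifier, and is
# delivered to the EyePliance registered under that device identifier.
# The identifier format and all names are invented for illustration.
from typing import Callable, Dict


class AttentionRouter:
    def __init__(self) -> None:
        self.handlers: Dict[str, Callable[[str], None]] = {}

    def register(self, device_id: str, handler: Callable[[str], None]) -> None:
        self.handlers[device_id] = handler

    def on_eye_contact(self, device_id: str, wearer_id: str) -> None:
        handler = self.handlers.get(device_id)
        if handler is not None:
            handler(wearer_id)
        # Unknown identifiers are ignored rather than misrouted.


if __name__ == "__main__":
    router = AttentionRouter()
    router.register("lamp-01", lambda who: print(f"lamp-01 attended by {who}"))
    router.register("phone-02", lambda who: print(f"phone-02 attended by {who}"))
    router.on_eye_contact("lamp-01", "wearer-A")     # lamp-01 attended by wearer-A
    router.on_eye_contact("unknown-99", "wearer-B")  # no handler, ignored
```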


Human Factors in Computing Systems | 2003

Hands on cooking: towards an attentive kitchen

Jeremy S. Bradbury; Jeffrey S. Shell; Craig B. Knowles

To make human-computer interaction more transparent, different modes of communication need to be explored. We present eyeCOOK, a multimodal attentive cookbook to help a non-expert computer user cook a meal. The user communicates using eye-gaze and speech commands, and eyeCOOK responds visually and/or verbally, promoting communication through natural human input channels without physically encumbering the user. Our goal is to improve productivity and user satisfaction without creating additional requirements for user attention. We describe how the user interacts with the eyeCOOK prototype and the role of this system in an Attentive Kitchen.
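
To make the interaction style concrete: a command in a system like this combines what the user is looking at with what they say. The sketch below is a hypothetical dispatcher in that spirit, resolving a spoken command against the currently gazed-at recipe step; it is not the authors' eyeCOOK code, and the recipe data and command names are invented.

```python
# Hypothetical multimodal dispatch in the spirit of eyeCOOK: a spoken
# command is interpreted relative to the recipe element the user is
# currently looking at. Not the authors' implementation.
RECIPE_STEPS = [
    "Dice the onions.",
    "Saute the onions with the garlic.",
    "Simmer the sauce for 20 minutes.",
]


def handle_command(gazed_step: int, spoken_command: str) -> str:
    """Combine the gaze target (a step index) with a speech command."""
    if not 0 <= gazed_step < len(RECIPE_STEPS):
        return "I'm not sure which step you're looking at."
    if spoken_command == "read":
        return RECIPE_STEPS[gazed_step]
    if spoken_command == "next":
        if gazed_step + 1 < len(RECIPE_STEPS):
            return RECIPE_STEPS[gazed_step + 1]
        return "That was the last step."
    return "Sorry, I didn't understand that command."


if __name__ == "__main__":
    print(handle_command(0, "read"))  # Dice the onions.
    print(handle_command(0, "next"))  # Saute the onions with the garlic.
```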


Eye Tracking Research & Applications | 2004

Auramirror: reflections on attention

Alexander W. Skaburskis; Roel Vertegaal; Jeffrey S. Shell

As ubiquitous computing becomes more prevalent, greater consideration will have to be given to how devices interrupt us and vie for our attention. This paper describes Auramirror, an interactive art piece that raises questions of how computers use our attention. By measuring attention and visualizing the results for the audience in real-time, Auramirror brings the subject matter to the forefront of the audience's consideration. Finally, some ways of using the Auramirror system to help in the design of attention-sensitive devices are discussed.


Archive | 2012

Method and apparatus for communication between humans and devices

Roel Vertegaal; Jeffrey S. Shell


Archive | 2004

Method and apparatus for calibration-free eye tracking using multiple glints or surface reflections

Roel Vertegaal; Changuk Sohn; Daniel Cheng; Victor Macfarlane; Jeffrey S. Shell


Human Factors in Computing Systems | 2004

Eye contact sensing glasses for attention-sensitive wearable video blogging

Connor Dickie; Roel Vertegaal; Jeffrey S. Shell; Changuk Sohn; Daniel Cheng; Omar Aoudeh

Collaboration


Dive into Jeffrey S. Shell's collaborations.
