
Publication


Featured research published by Lisa J. Stifelman.


human factors in computing systems | 2001

The audio notebook: paper and pen interaction with structured speech

Lisa J. Stifelman; Barry Arons; Chris Schmandt

This paper addresses the problem that a listener experiences when attempting to capture information presented during a lecture, meeting, or interview. Listeners must divide their attention between the talker and their notetaking activity. We propose a new device, the Audio Notebook, for taking notes and interacting with a speech recording. The Audio Notebook is a combination of a digital audio recorder and paper notebook, all in one device. Audio recordings are structured using two techniques: user structuring based on notetaking activity, and acoustic structuring based on a talker's changes in pitch, pausing, and energy. A field study showed that the interaction techniques enabled a range of usage styles, from detailed review to high-speed skimming. The study motivated the addition of phrase detection and topic suggestions to improve access to the audio recordings. Through these audio interaction techniques, the Audio Notebook defines a new approach for navigation in the audio domain.
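The "acoustic structuring" the abstract mentions can be illustrated with a minimal sketch, not the paper's implementation: segment a recording into phrases by finding long runs of low-energy frames (pauses). The threshold and minimum pause length are hypothetical tuning parameters.

```python
def find_phrase_starts(frame_energies, threshold=0.01, min_pause_frames=30):
    """Return indices of frames where a phrase begins after a long pause.

    frame_energies: per-frame RMS energy values (floats).
    threshold / min_pause_frames: hypothetical tuning parameters.
    """
    # The recording starts a phrase immediately if it opens with speech.
    starts = [0] if frame_energies and frame_energies[0] >= threshold else []
    run = 0  # length of the current low-energy (silent) run
    for i, e in enumerate(frame_energies):
        if e < threshold:
            run += 1
        else:
            if run >= min_pause_frames:
                starts.append(i)  # speech resumes after a long pause
            run = 0
    return starts
```

A skimming interface could jump playback between these phrase starts instead of rewinding by fixed amounts.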


human factors in computing systems | 1996

Augmenting real-world objects: a paper-based audio notebook

Lisa J. Stifelman

The Audio Notebook allows a user to capture and access an audio recording of a lecture or meeting in conjunction with notes written on paper. The audio recording is synchronized with the user’s handwritten notes and page turns. As a user flips through physical pages of notes, the audio scans to the start of each page. Audio is also accessed by pointing with a pen to a location in the notes or using an audio scrollbar. A small observational study of users in real settings was performed. The prototype did not interfere with the user’s normal interactions yet gave reassurance that key ideas could be accessed later. In future work, automatic segmentation of the recorded speech using acoustic cues will be combined with user activity to structure the audio.
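The page-turn synchronization described above amounts to an index from page numbers to recording times: each turn is logged against the audio clock, so flipping back to a page later can scan playback to where that page began. A hypothetical sketch (names and structure are illustrative, not from the prototype):

```python
class PageAudioIndex:
    """Maps physical page numbers to positions in an audio recording."""

    def __init__(self):
        self._page_start = {}  # page number -> recording time in seconds

    def note_page_turn(self, page, time_s):
        # Record only the first time a page is reached while recording,
        # so later re-visits don't overwrite where its audio begins.
        self._page_start.setdefault(page, time_s)

    def seek_time_for(self, page):
        # Where playback should scan to when the user flips to `page`;
        # None if the page was never open during recording.
        return self._page_start.get(page)
```

Pen pointing within a page would need a finer-grained index (e.g. per pen stroke), but the lookup pattern is the same.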


human factors in computing systems | 1993

VoiceNotes: a speech interface for a hand-held voice notetaker

Lisa J. Stifelman; Barry Arons; Chris Schmandt; Eric A. Hulteen

VoiceNotes is an application for a voice-controlled hand-held computer that allows the creation, management, and retrieval of user-authored voice notes—small segments of digitized speech containing thoughts, ideas, reminders, or things to do. Iterative design and user testing helped to refine the initial user interface design. VoiceNotes explores the problem of capturing and retrieving spontaneous ideas, the use of speech as data, and the use of speech input and output in the user interface for a hand-held computer without a visual display. In addition, VoiceNotes serves as a step toward new uses of voice technology and interfaces for future portable devices.


user interface software and technology | 1995

Designing auditory interactions for PDAs

Debby Hindus; Barry Arons; Lisa J. Stifelman; Bill Gaver; Elizabeth D. Mynatt; Maribeth Back

This panel addresses issues in designing audio-based user interactions for small, personal computing devices, or PDAs. One issue is the nature of interacting with an auditory PDA and the interplay of affordances and form factors. Another issue is how both new and traditional metaphors and interaction concepts might be applied to auditory PDAs. The utility and design of nonspeech cues are discussed, as are the aesthetic issues of persona and narrative in designing sounds. Also discussed are commercially available sound and speech components and related hardware tradeoffs. Finally, the social implications of auditory interactions are explored, including privacy, fashion and novel social interactions.


user interface software and technology | 1995

A tool to support speech and non-speech audio feedback generation in audio interfaces

Lisa J. Stifelman

Development of new auditory interfaces requires the integration of text-to-speech synthesis, digitized audio, and non-speech audio output. This paper describes a tool for specifying speech and non-speech audio feedback and its use in the development of a speech interface, Conversational VoiceNotes. Auditory feedback is specified as a context-free grammar, where the basic elements in the grammar can be either words or non-speech sounds. The feedback specification method described here provides the ability to vary the feedback based on the current state of the system, and is flexible enough to allow different feedback for different input modalities (e.g., speech, mouse, buttons). The declarative specification is easily modifiable, supporting an iterative design process.
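A minimal sketch of the kind of specification the abstract describes, assuming a grammar written as a dict of rules: feedback messages are derived from a context-free grammar whose terminals are either spoken words or named non-speech sounds, and state-dependent symbols are filled in from the current system state. The rule names, sound names, and state keys here are hypothetical, not taken from Conversational VoiceNotes.

```python
# Each rule maps a nonterminal to a list of alternative productions.
GRAMMAR = {
    "confirm_delete": [["sound:alert", "note", "deleted"]],
    "note_count":     [["you", "have", "count", "notes"]],
}

def expand(symbol, grammar, state):
    """Expand a symbol into a flat list of word and sound: terminals."""
    if symbol in grammar:
        production = grammar[symbol][0]  # first alternative, for simplicity
        out = []
        for s in production:
            out.extend(expand(s, grammar, state))
        return out
    if symbol in state:
        return [str(state[symbol])]  # state-dependent terminal
    return [symbol]                  # literal word or non-speech sound
```

A renderer would then walk the resulting list, sending `sound:` terminals to digitized-audio playback and the rest to text-to-speech, which is how a single declarative specification can mix the two output types.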


Archive | 1999

Multimedia linking device and method

Barry Arons; Lisa J. Stifelman


Archive | 1991

Not Just Another Voice Mail System

Lisa J. Stifelman


Archive | 1999

Page identification system and method

Barry Arons; Lisa J. Stifelman; Stephen D. Fantone; Kevin M. Sevigny


Archive | 1992

VoiceNotes--an application for a voice-controlled hand-held computer

Lisa J. Stifelman


Archive | 1995

A Discourse Analysis Approach to Structured Speech

Lisa J. Stifelman

Collaboration


Dive into Lisa J. Stifelman's collaborations.

Top Co-Authors

Barry Arons
Massachusetts Institute of Technology

Chris Schmandt
Massachusetts Institute of Technology

Debby Hindus
Interval Research Corporation

Elizabeth D. Mynatt
Georgia Institute of Technology