
Publication


Featured research published by Stephen S. Intille.


International Conference on Pervasive Computing | 2004

Activity recognition from user-annotated acceleration data

Ling Bao; Stephen S. Intille

In this work, algorithms are developed and evaluated to detect physical activities from data acquired using five small biaxial accelerometers worn simultaneously on different parts of the body. Acceleration data was collected from 20 subjects without researcher supervision or observation. Subjects were asked to perform a sequence of everyday tasks but not told specifically where or how to do them. Mean, energy, frequency-domain entropy, and correlation of acceleration data were calculated, and several classifiers using these features were tested. Decision tree classifiers showed the best performance, recognizing everyday activities with an overall accuracy rate of 84%. The results show that although some activities are recognized well with subject-independent training data, others appear to require subject-specific training data. The results suggest that multiple accelerometers aid in recognition because conjunctions in acceleration feature values can effectively discriminate many activities. With just two biaxial accelerometers (thigh and wrist), the recognition performance dropped only slightly. This is the first work to investigate performance of recognition algorithms with multiple, wire-free accelerometers on 20 activities using datasets annotated by the subjects themselves.
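The windowed feature set named in the abstract (per-axis mean, energy, and frequency-domain entropy, plus correlation between axes) feeding a decision tree can be sketched in a few lines. The sketch below is an illustrative reconstruction, not the authors' released code; the window length, the synthetic data, and the scikit-learn classifier are assumptions.

# Illustrative sketch of the windowed features described in the abstract
# (mean, energy, frequency-domain entropy, inter-axis correlation) feeding a
# decision tree classifier. Window size and data are assumed, not the paper's.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def window_features(window):
    # window: array of shape (n_samples, n_axes) from one accelerometer
    feats = []
    for axis in window.T:
        power = np.abs(np.fft.rfft(axis)) ** 2
        p = power / (power.sum() + 1e-12)           # normalized power spectrum
        feats += [axis.mean(),                      # mean
                  (axis ** 2).sum() / len(axis),    # energy
                  -(p * np.log2(p + 1e-12)).sum()]  # frequency-domain entropy
    for i in range(window.shape[1]):                # pairwise axis correlation
        for j in range(i + 1, window.shape[1]):
            feats.append(np.corrcoef(window[:, i], window[:, j])[0, 1])
    return np.array(feats)

rng = np.random.default_rng(0)
windows = rng.normal(size=(200, 128, 2))   # 200 windows, 128 samples, biaxial
labels = rng.integers(0, 5, size=200)      # 5 hypothetical activity classes
X = np.stack([window_features(w) for w in windows])
clf = DecisionTreeClassifier(random_state=0).fit(X, labels)
print(clf.score(X, labels))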


International Conference on Pervasive Computing | 2004

Activity Recognition in the Home Using Simple and Ubiquitous Sensors

Emmanuel Munguia Tapia; Stephen S. Intille; Kent Larson

In this work, a system for recognizing activities in the home setting using a set of small and simple state-change sensors is introduced. The sensors are designed to be “tape on and forget” devices that can be quickly and ubiquitously installed in home environments. The proposed sensing system presents an alternative to sensors that are sometimes perceived as invasive, such as cameras and microphones. Unlike prior work, the system has been deployed in multiple residential environments with non-researcher occupants. Preliminary results on a small dataset show that it is possible to recognize activities of interest to medical professionals such as toileting, bathing, and grooming with detection accuracies ranging from 25% to 89% depending on the evaluation criteria used.
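One simple way to operationalize the sensing architecture described above is to represent each activity episode as a binary vector recording which state-change sensors fired, then train an off-the-shelf classifier. The sensor names, episodes, and the naive Bayes model below are illustrative assumptions, not the classification method reported in the paper.

# Hedged sketch: each activity episode becomes a binary vector of which
# "tape on and forget" state-change sensors fired, classified with a
# Bernoulli naive Bayes. Sensors, episodes, and classifier are assumptions.
import numpy as np
from sklearn.naive_bayes import BernoulliNB

SENSORS = ["bathroom_door", "toilet_flush", "medicine_cabinet",
           "kitchen_drawer", "fridge_door", "shower_faucet"]

def episode_vector(fired_sensors):
    # 1 if the sensor changed state at least once during the episode, else 0
    return np.array([1 if s in fired_sensors else 0 for s in SENSORS])

episodes = [
    ({"toilet_flush", "bathroom_door"},     "toileting"),
    ({"shower_faucet", "bathroom_door"},    "bathing"),
    ({"medicine_cabinet", "bathroom_door"}, "grooming"),
    ({"fridge_door", "kitchen_drawer"},     "preparing_snack"),
    ({"toilet_flush"},                      "toileting"),
]
X = np.stack([episode_vector(f) for f, _ in episodes])
y = [label for _, label in episodes]
model = BernoulliNB().fit(X, y)
print(model.predict([episode_vector({"shower_faucet"})]))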


International Journal of Computer Vision | 1999

Large Occlusion Stereo

Aaron F. Bobick; Stephen S. Intille

A method for solving the stereo matching problem in the presence of large occlusion is presented. A data structure—the disparity space image—is defined to facilitate the description of the effects of occlusion on the stereo matching process and in particular on dynamic programming (DP) solutions that find matches and occlusions simultaneously. We significantly improve upon existing DP stereo matching methods by showing that while some cost must be assigned to unmatched pixels, sensitivity to occlusion-cost and algorithmic complexity can be significantly reduced when highly-reliable matches, or ground control points, are incorporated into the matching process. The use of ground control points eliminates both the need for biasing the process towards a smooth solution and the task of selecting critical prior probabilities describing image formation. Finally, we describe how the detection of intensity edges can be used to bias the recovered solution such that occlusion boundaries will tend to be proposed along such edges, reflecting the observation that occlusion boundaries usually cause intensity discontinuities.
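As a concrete illustration of dynamic-programming stereo with an explicit occlusion cost, the sketch below aligns two scanlines and lets known high-confidence matches (ground control points) attract the optimal path by discounting their match cost. It is a simplified, assumed reconstruction for intuition only; the paper's disparity space image formulation and cost model are richer than this.

# Simplified scanline DP stereo with an explicit occlusion cost and
# "ground control point" (GCP) matches that attract the optimal path.
import numpy as np

def dp_scanline(left, right, occlusion_cost=8.0, gcps=(), gcp_bonus=1e3):
    # left, right: 1-D intensity scanlines; gcps: (i, j) pairs known to match
    n, m = len(left), len(right)
    match_cost = np.abs(left[:, None].astype(float) - right[None, :].astype(float))
    for (i, j) in gcps:                      # reward paths through GCPs
        match_cost[i, j] -= gcp_bonus
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, :] = occlusion_cost * np.arange(m + 1)
    cost[:, 0] = occlusion_cost * np.arange(n + 1)
    back = np.zeros((n + 1, m + 1), dtype=int)   # 0=match, 1=skip left, 2=skip right
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            options = (cost[i - 1, j - 1] + match_cost[i - 1, j - 1],
                       cost[i - 1, j] + occlusion_cost,
                       cost[i, j - 1] + occlusion_cost)
            back[i, j] = int(np.argmin(options))
            cost[i, j] = options[back[i, j]]
    matches, i, j = [], n, m                 # backtrack; occluded pixels are skipped
    while i > 0 and j > 0:
        if back[i, j] == 0:
            matches.append((i - 1, j - 1)); i, j = i - 1, j - 1
        elif back[i, j] == 1:
            i -= 1
        else:
            j -= 1
    return cost[n, m], matches[::-1]

# Toy example: a bright patch shifted by a disparity of 2, with a GCP at (6, 4).
left  = np.array([10, 10, 10, 80, 80, 80, 10, 10, 10, 10])
right = np.array([10, 80, 80, 80, 10, 10, 10, 10, 10, 10])
_, matches = dp_scanline(left, right, gcps=[(6, 4)])
print(matches)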


Teleoperators and Virtual Environments | 1999

The KidsRoom: A Perceptually-Based Interactive and Immersive Story Environment

Aaron F. Bobick; Stephen S. Intille; James W. Davis; Freedom Baird; Claudio S. Pinhanez; Lee W. Campbell; Yuri A. Ivanov; Arjan Schütte; Andrew D. Wilson

The KidsRoom is a perceptually-based, interactive, narrative playspace for children. Images, music, narration, light, and sound effects are used to transform a normal child's bedroom into a fantasy land where children are guided through a reactive adventure story. The fully automated system was designed with the following goals: (1) to keep the focus of user action and interaction in the physical and not virtual space; (2) to permit multiple, collaborating people to simultaneously engage in an interactive experience combining both real and virtual objects; (3) to use computer-vision algorithms to identify activity in the space without requiring the participants to wear any special clothing or devices; (4) to use narrative to constrain the perceptual recognition, and to use perceptual recognition to allow participants to drive the narrative; and (5) to create a truly immersive and interactive room environment. We believe the KidsRoom is the first multi-person, fully-automated, interactive, narrative environment ever constructed using non-encumbering sensors. This paper describes the KidsRoom, the technology that makes it work, and the issues that were raised during the system's development. A demonstration of the project, which complements the material presented here and includes videos, images, and sounds from each part of the story, is available at .


IEEE Pervasive Computing | 2002

Designing a home of the future

Stephen S. Intille

An interdisciplinary team is developing technologies and design strategies that use context-aware sensing to empower people by presenting information at precisely the right time and place. The team is designing a living laboratory to study technology that motivates behavior change in context.


American Journal of Preventive Medicine | 2008

Health and the Mobile Phone

Kevin Patrick; William G. Griswold; Fred Raab; Stephen S. Intille

Within the next 8 years, annual U.S. expenditure on health care is projected to reach $4 trillion/year, or 20% of the gross domestic product.1 Whether resource consumption of this order of magnitude is sustainable is an open question, but at the very least it suggests the need for population-level solutions for everything from the primary prevention of disease to improving end-of-life care. Ours is a society that often views challenges like this as being solved through the application of technology, and one technology in particular is emerging that may become very important to the delivery of health care: mobile phones. By June 2007 there were 239 million users of mobile phones in the U.S. or 79% of the population,2 and users are highly diverse.3 Mobile phones are beginning to replace landline telephones for some, and except for very young children, may ultimately reach an effective penetration of “one phone: one person” as is already the case in some countries such as Finland.4 This paper provides an overview of the implications of this trend for the delivery of healthcare services. In addition to addressing how mobile phones are changing the way health professionals communicate with their patients, a summary is provided of current and projected technologic capabilities of mobile phones that have the potential to render them an increasingly indispensable personal health device. Finally, the health risks of mobile phone use are addressed, as are several unresolved technical and policy-related issues unique to mobile phones. Because these issues may influence how well and how quickly mobile phones are integrated into health care, and how well they serve the needs of the entire population, they deserve the attention of both the healthcare and public health community.


International Symposium on Wearable Computers | 2007

Real-Time Recognition of Physical Activities and Their Intensities Using Wireless Accelerometers and a Heart Rate Monitor

Emmanuel Munguia Tapia; Stephen S. Intille; William L. Haskell; Kent Larson; Julie A. Wright; Abby C. King; Robert H. Friedman

In this paper, we present a real-time algorithm for automatic recognition of not only physical activities, but also, in some cases, their intensities, using five triaxial wireless accelerometers and a wireless heart rate monitor. The algorithm has been evaluated using datasets consisting of 30 physical gymnasium activities collected from a total of 21 people at two different labs. On these activities, we have obtained a recognition accuracy performance of 94.6% using subject-dependent training and 56.3% using subject-independent training. The addition of heart rate data improves subject-dependent recognition accuracy only by 1.2% and subject-independent recognition only by 2.1%. When recognizing activity type without differentiating intensity levels, we obtain a subject-independent performance of 80.6%. We discuss why heart rate data has such little discriminatory power.
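The gap the abstract reports between subject-dependent and subject-independent accuracy comes down to how the data are split: training and testing within each person's own data versus holding out each subject entirely. A hedged sketch of the two protocols on placeholder features is shown below; the classifier, feature dimensions, and synthetic data are assumptions, not the paper's.

# Hedged sketch of the two evaluation protocols behind the reported numbers:
# subject-dependent (train/test within each subject) versus subject-independent
# (leave-one-subject-out). Features, labels, and classifier are placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import LeaveOneGroupOut, train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.normal(size=(600, 20))            # e.g. accelerometer + heart rate features
y = rng.integers(0, 5, size=600)          # activity labels
subjects = rng.integers(0, 21, size=600)  # which of 21 subjects produced each window

# Subject-independent: hold out all windows from one subject at a time.
independent = []
for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups=subjects):
    clf = RandomForestClassifier(random_state=0).fit(X[train_idx], y[train_idx])
    independent.append(accuracy_score(y[test_idx], clf.predict(X[test_idx])))

# Subject-dependent: split within each subject's own data.
dependent = []
for s in np.unique(subjects):
    Xs, ys = X[subjects == s], y[subjects == s]
    Xtr, Xte, ytr, yte = train_test_split(Xs, ys, test_size=0.3, random_state=0)
    clf = RandomForestClassifier(random_state=0).fit(Xtr, ytr)
    dependent.append(accuracy_score(yte, clf.predict(Xte)))

print("subject-independent:", np.mean(independent))
print("subject-dependent:  ", np.mean(dependent))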


Human Factors in Computing Systems | 2005

Using context-aware computing to reduce the perceived burden of interruptions from mobile devices

Joyce Ho; Stephen S. Intille

The potential for sensor-enabled mobile devices to proactively present information when and where users need it ranks among the greatest promises of ubiquitous computing. Unfortunately, mobile phones, PDAs, and other computing devices that compete for the user's attention can contribute to interruption irritability and feelings of information overload. Designers of mobile computing interfaces, therefore, require strategies for minimizing the perceived interruption burden of proactively delivered messages. In this work, a context-aware mobile computing device was developed that automatically detects postural and ambulatory activity transitions in real time using wireless accelerometers. This device was used to experimentally measure the receptivity to interruptions delivered at activity transitions relative to those delivered at random times. Messages delivered at activity transitions were found to be better received, thereby suggesting a viable strategy for context-aware message delivery in sensor-enabled mobile computing devices.
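The core mechanism here, deferring a proactive message until the wearer's detected activity changes rather than delivering it at an arbitrary moment, can be sketched as a small state machine over a stream of inferred activity labels. The labels, queueing policy, and class below are illustrative assumptions rather than the study's implementation.

# Hedged sketch of transition-triggered message delivery: messages queue up and
# are released only when the inferred activity label changes. Labels are assumed.
from collections import deque

class TransitionNotifier:
    def __init__(self):
        self.pending = deque()
        self.last_activity = None

    def queue(self, message):
        # Hold a proactive message until the next activity transition.
        self.pending.append(message)

    def on_activity(self, activity):
        # Feed each newly inferred activity label (e.g. from accelerometers).
        # Returns messages to deliver now, if a transition just occurred.
        delivered = []
        if self.last_activity is not None and activity != self.last_activity:
            while self.pending:
                delivered.append(self.pending.popleft())
        self.last_activity = activity
        return delivered

notifier = TransitionNotifier()
notifier.queue("Reminder: log your lunch")
for label in ["sitting", "sitting", "standing", "walking"]:
    out = notifier.on_activity(label)
    if out:
        print(f"transition to {label!r}: deliver {out}")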


Computer Vision and Pattern Recognition | 1997

Real-time closed-world tracking

Stephen S. Intille; James W. Davis; Aaron F. Bobick

A real-time tracking algorithm that uses contextual information is described. The method is capable of simultaneously tracking multiple, non-rigid objects when erratic movement and object collisions are common. A closed-world assumption is used to adaptively select and weight image features used for correspondence. Results of algorithm testing and the limitations of the method are discussed. The algorithm has been used to track children in an interactive, narrative playspace.
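A hedged sketch of the closed-world idea follows: only the objects known to be inside the closed world are candidates for correspondence, and they are matched to the current frame's detections by a weighted distance over simple features. The feature choice (position plus mean color), the fixed weights, and the Hungarian assignment are illustrative assumptions; the paper adaptively selects and weights features per closed world.

# Hedged sketch of closed-world correspondence: the known objects in a region
# are matched one-to-one to current detections by a weighted feature distance.
import numpy as np
from scipy.optimize import linear_sum_assignment

def match_closed_world(tracks, detections, w_pos=1.0, w_color=0.5):
    # tracks, detections: lists of dicts with 'pos' (x, y) and 'color' (r, g, b)
    cost = np.zeros((len(tracks), len(detections)))
    for i, t in enumerate(tracks):
        for j, d in enumerate(detections):
            pos_dist = np.linalg.norm(np.subtract(t["pos"], d["pos"]))
            color_dist = np.linalg.norm(np.subtract(t["color"], d["color"]))
            cost[i, j] = w_pos * pos_dist + w_color * color_dist
    rows, cols = linear_sum_assignment(cost)   # minimum-cost one-to-one matching
    return list(zip(rows, cols))

# Toy closed world with two known objects (e.g. two children) and two detections.
tracks = [{"pos": (10, 12), "color": (200, 30, 30)},
          {"pos": (40, 45), "color": (30, 30, 200)}]
detections = [{"pos": (42, 44), "color": (35, 28, 190)},
              {"pos": (11, 14), "color": (205, 25, 35)}]
print(match_closed_world(tracks, detections))   # expect [(0, 1), (1, 0)]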


Computer Vision and Image Understanding | 2001

Recognizing Planned, Multiperson Action

Stephen S. Intille; Aaron F. Bobick


Collaboration


Dive into Stephen S. Intille's collaborations.

Top Co-Authors

Genevieve F. Dunton, University of Southern California
Kent Larson, Massachusetts Institute of Technology
Aaron F. Bobick, Georgia Institute of Technology
Fahd Albinali, Massachusetts Institute of Technology
Yue Liao, University of Southern California
Eldin Dzubur, University of Southern California
Jennifer S. Beaudin, Massachusetts Institute of Technology