Network


Latest external collaborations at the country level.

Hotspot


Research topics where Gabe Cohn is active.

Publications


Featured research published by Gabe Cohn.


IEEE Pervasive Computing | 2011

Disaggregated End-Use Energy Sensing for the Smart Grid

Jon E. Froehlich; Eric C. Larson; Sidhant Gupta; Gabe Cohn; Matthew S. Reynolds; Shwetak N. Patel

This article surveys existing and emerging disaggregation techniques for energy-consumption data and highlights signal features that might be used to sense disaggregated data in an easily installed and cost-effective manner.
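The disaggregation techniques surveyed above infer per-appliance usage from a single aggregate signal. As a toy sketch of the event-based family of approaches (the appliance signatures, threshold, and power trace below are invented for illustration and are not from the article), step changes in aggregate power can be matched against known appliance wattages:

```python
# Illustrative event-based load disaggregation (hypothetical signatures,
# not the article's techniques).

# Hypothetical appliance signatures: name -> expected step size in watts.
SIGNATURES = {"fridge": 120.0, "kettle": 1500.0, "toaster": 900.0}

def detect_events(power, threshold=50.0):
    """Return (index, delta) for each step change larger than threshold."""
    events = []
    for i in range(1, len(power)):
        delta = power[i] - power[i - 1]
        if abs(delta) > threshold:
            events.append((i, delta))
    return events

def label_event(delta, signatures=SIGNATURES, tolerance=0.2):
    """Match a step change to the closest signature within a tolerance."""
    best, best_err = None, tolerance
    for name, watts in signatures.items():
        err = abs(abs(delta) - watts) / watts
        if err < best_err:
            best, best_err = name, err
    return best

# Aggregate power trace: kettle switched on at t=2, off at t=5.
trace = [200, 200, 1710, 1705, 1700, 205, 200]
labels = [(i, label_event(d)) for i, d in detect_events(trace)]
print(labels)  # [(2, 'kettle'), (5, 'kettle')]
```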


Human Factors in Computing Systems | 2012

Humantenna: using the body as an antenna for real-time whole-body interaction

Gabe Cohn; Dan Morris; Shwetak N. Patel; Desney S. Tan

Computer vision and inertial measurement have made it possible for people to interact with computers using whole-body gestures. Although there has been rapid growth in the uses and applications of these systems, their ubiquity has been limited by the high cost of heavily instrumenting either the environment or the user. In this paper, we use the human body as an antenna for sensing whole-body gestures. Such an approach requires no instrumentation of the environment and only minimal instrumentation of the user, and thus enables truly mobile applications. We show robust gesture recognition with an average accuracy of 93% across 12 whole-body gestures, and promising results for robust location classification within a building. In addition, we demonstrate a real-time interactive system that allows a user to interact with a computer using whole-body gestures.


Human Factors in Computing Systems | 2011

Your noise is my command: sensing gestures using the body as an antenna

Gabe Cohn; Dan Morris; Shwetak N. Patel; Desney S. Tan

Touch sensing and computer vision have made human-computer interaction possible in environments where keyboards, mice, or other handheld implements are not available or desirable. However, the high cost of instrumenting environments limits the ubiquity of these technologies, particularly in home scenarios where cost constraints dominate installation decisions. Fortunately, home environments frequently offer a signal that is unique to locations and objects within the home: electromagnetic noise. In this work, we use the body as a receiving antenna and leverage this noise for gestural interaction. We demonstrate that it is possible to robustly recognize touched locations on an uninstrumented home wall using no specialized sensors. We conduct a series of experiments to explore the capabilities that this new sensing modality may offer. Specifically, we show robust classification of gestures such as the position of discrete touches around light switches, the particular light switch being touched, which appliances are touched, differentiation between hands, as well as continuous proximity of hand to the switch, among others. We close by discussing opportunities, limitations, and future work.
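Since the technique hinges on picking out characteristic electromagnetic noise received by the body, a single-bin DFT is one minimal way to test whether a known noise frequency dominates a sampled body-antenna voltage. The signal model and sampling rate below are assumptions for illustration, not the paper's processing chain:

```python
# Toy spectral check for a characteristic EMI component (assumed signal
# model; not the authors' pipeline).
import math

def dft_magnitude(signal, freq_hz, sample_rate_hz):
    """Normalized magnitude of the DFT of `signal` at one frequency."""
    n = len(signal)
    re = sum(x * math.cos(2 * math.pi * freq_hz * i / sample_rate_hz)
             for i, x in enumerate(signal))
    im = sum(-x * math.sin(2 * math.pi * freq_hz * i / sample_rate_hz)
             for i, x in enumerate(signal))
    return math.hypot(re, im) / n

# Simulated body-antenna voltage: pure 60 Hz hum sampled at 960 Hz for 1 s.
rate = 960
sig = [math.sin(2 * math.pi * 60 * i / rate) for i in range(rate)]
print(dft_magnitude(sig, 60, rate))   # strong 60 Hz component (~0.5)
print(dft_magnitude(sig, 200, rate))  # negligible off-frequency energy
```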


International Conference on Pervasive Computing | 2010

GasSense: appliance-level, single-point sensing of gas activity in the home

Gabe Cohn; Sidhant Gupta; Jon E. Froehlich; Eric C. Larson; Shwetak N. Patel

This paper presents GasSense, a low-cost, single-point sensing solution for automatically identifying gas use down to its source (e.g., water heater, furnace, fireplace). This work adds a complementary sensing solution to the growing body of work in infrastructure-mediated sensing. GasSense analyzes the acoustic response of a home's government-mandated gas regulator, which provides the unique capability of sensing both the individual appliance at which gas is currently being consumed as well as an estimate of the amount of gas flow. Our approach provides a number of appealing features, including the ability to be easily and safely installed without the need for a professional. We deployed our solution in nine different homes and initial results show that GasSense has an average accuracy of 95.2% in identifying individual appliance usage.
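A minimal sketch of the sensing idea, assuming a simple linear mapping from the regulator's acoustic RMS level to gas flow (GasSense's actual acoustic analysis is more involved; the gain, noise floor, and traces below are invented):

```python
# Toy flow detector on a regulator's acoustic signal. The linear
# RMS-to-flow mapping is an illustrative assumption, not GasSense's model.
import math

def rms(samples):
    """Root-mean-square level of an acoustic sample window."""
    return math.sqrt(sum(x * x for x in samples) / len(samples))

def estimate_flow(samples, gain=10.0, noise_floor=0.05):
    """Map acoustic RMS to a flow estimate; below the floor means no flow."""
    level = rms(samples)
    return 0.0 if level < noise_floor else gain * level

quiet = [0.01, -0.02, 0.01, 0.0]       # regulator idle
hissing = [0.3, -0.28, 0.31, -0.29]    # gas flowing through the regulator
print(estimate_flow(quiet), estimate_flow(hissing))
```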


Ubiquitous Computing | 2012

An ultra-low-power human body motion sensor using static electric field sensing

Gabe Cohn; Sidhant Gupta; Tien Jui Lee; Dan Morris; Joshua R. Smith; Matthew S. Reynolds; Desney S. Tan; Shwetak N. Patel

Wearable sensor systems have been used in the ubiquitous computing community and elsewhere for applications such as activity and gesture recognition, health and wellness monitoring, and elder care. Although the power consumption of accelerometers has already been highly optimized, this work introduces a novel sensing approach which lowers the power requirement for motion sensing by orders of magnitude. We present an ultra-low-power method for passively sensing body motion using static electric fields by measuring the voltage at any single location on the body. We present the feasibility of using this sensing approach to infer the amount and type of body motion anywhere on the body and demonstrate an ultra-low-power motion detector used to wake up more power-hungry sensors. The sensing hardware consumes only 3.3 μW, and wake-up detection is done using an additional 3.3 μW (6.6 μW total).
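The wake-up scheme can be caricatured as a threshold comparison: an ultra-low-power front end watches the static-field voltage and only powers up costlier sensors when it deviates from baseline. The threshold and traces below are illustrative, not from the paper:

```python
# Sketch of a motion wake-up rule: a cheap sensor watches the static
# electric-field voltage and "wakes" a power-hungry sensor only when the
# deviation from baseline exceeds a threshold. Values are illustrative.

def wake_up(voltages, baseline, threshold=0.1):
    """True as soon as any sample deviates from baseline by > threshold."""
    return any(abs(v - baseline) > threshold for v in voltages)

still = [1.00, 1.01, 0.99, 1.00]   # wearer at rest: stay asleep
moving = [1.00, 1.25, 0.70, 1.10]  # body motion: wake the accelerometer
print(wake_up(still, 1.0), wake_up(moving, 1.0))  # False True
```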


Human Factors in Computing Systems | 2011

HeatWave: thermal imaging for surface user interaction

Eric C. Larson; Gabe Cohn; Sidhant Gupta; Xiaofeng Ren; Beverly L. Harrison; Dieter Fox; Shwetak N. Patel

We present HeatWave, a system that uses digital thermal imaging cameras to detect, track, and support user interaction on arbitrary surfaces. Thermal sensing has had limited examination in the HCI research community and is generally under-explored outside of law enforcement and energy auditing applications. We examine the role of thermal imaging as a new sensing solution for enhancing user surface interaction. In particular, we demonstrate how thermal imaging in combination with existing computer vision techniques can make segmentation and detection of routine interaction techniques possible in real-time, and can be used to complement or simplify algorithms for traditional RGB and depth cameras. Example interactions include (1) distinguishing hovering above a surface from touch events, (2) shape-based gestures similar to ink strokes, (3) pressure-based gestures, and (4) multi-finger gestures. We close by discussing the practicality of thermal sensing for naturalistic user interaction and opportunities for future work.
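One of the simplest thermal cues described, heat residue distinguishing touch from hover, can be sketched as a per-pixel temperature threshold: a touch warms the surface, a hover does not. The frame values, ambient temperature, and thresholds below are invented for illustration:

```python
# Toy touch-vs-hover test on a thermal frame: a touch leaves a warm
# residue on the surface; a hover leaves none. Temperatures are invented.

def touched(frame, ambient_c=22.0, delta_c=3.0, min_pixels=2):
    """Count pixels warmer than ambient by delta_c; enough means a touch."""
    hot = sum(1 for row in frame for t in row if t - ambient_c > delta_c)
    return hot >= min_pixels

touch_frame = [[22.1, 26.5], [27.0, 22.3]]   # fingertip heat residue
hover_frame = [[22.2, 22.4], [22.1, 22.3]]   # no contact, no residue
print(touched(touch_frame), touched(hover_frame))  # True False
```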


Ubiquitous Computing | 2010

SNUPI: sensor nodes utilizing powerline infrastructure

Gabe Cohn; Erich P. Stuntebeck; Jagdish Nayayan Pandey; Brian P. Otis; Gregory D. Abowd; Shwetak N. Patel

A persistent concern of wireless sensors is the power consumption required for communication, which presents a significant adoption hurdle for practical ubiquitous computing applications. This work explores the use of the home powerline as a large distributed antenna capable of receiving signals from ultra-low-power wireless sensor nodes and thus allowing nodes to be detected at ranges that are otherwise impractical with traditional over-the-air reception. We present the design and implementation of small ultra-low-power 27 MHz sensor nodes that transmit their data by coupling over the powerline to a single receiver attached to the powerline in the home. We demonstrate the ability of our general purpose wireless sensor nodes to provide whole-home coverage while consuming less than 1 mW of power when transmitting (65 μW consumed in our custom CMOS transmitter). This is the lowest power transmitter to date compared to those found in traditional whole-home wireless systems.
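To see why sub-milliwatt transmission matters, a back-of-the-envelope battery-life estimate helps. The 1 mW transmit figure is from the abstract; the battery capacity, sleep power, and duty cycle below are assumed purely for illustration:

```python
# Back-of-the-envelope battery life for a duty-cycled transmitter.
# The 1 mW transmit power comes from the abstract; the CR2032-class cell
# (~225 mAh, 3 V), 2 uW sleep power, and 1% duty cycle are assumptions.

def battery_life_days(capacity_mah, voltage_v, tx_mw, duty, sleep_uw):
    """Average power = duty-cycled transmit power plus sleep power."""
    avg_mw = tx_mw * duty + (sleep_uw / 1000.0) * (1 - duty)
    energy_mwh = capacity_mah * voltage_v
    return energy_mwh / avg_mw / 24.0

# Transmitting 1% of the time: years of life from a coin cell.
print(round(battery_life_days(225, 3.0, 1.0, 0.01, 2.0), 1))
```

In practice battery shelf life, not average power, would cap lifetimes this long; the point of the arithmetic is that the transmitter is no longer the dominant drain.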


Human Factors in Computing Systems | 2017

Finding Common Ground: A Survey of Capacitive Sensing in Human-Computer Interaction

Tobias Grosse-Puppendahl; Christian Holz; Gabe Cohn; Raphael Wimmer; Oskar Bechtold; Steve Hodges; Matthew S. Reynolds; Joshua R. Smith

For more than two decades, capacitive sensing has played a prominent role in human-computer interaction research. Capacitive sensing has become ubiquitous on mobile, wearable, and stationary devices - enabling fundamentally new interaction techniques on, above, and around them. The research community has also enabled human position estimation and whole-body gestural interaction in instrumented environments. However, the broad field of capacitive sensing research has become fragmented by different approaches and terminology used across the various domains. This paper strives to unify the field by advocating consistent terminology and proposing a new taxonomy to classify capacitive sensing approaches. Our extensive survey provides an analysis and review of past research and identifies challenges for future work. We aim to create a common understanding within the field of human-computer interaction, for researchers and practitioners alike, and to stimulate and facilitate future research in capacitive sensing.
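Among the many measurement approaches a capacitive-sensing survey covers, one classic is timing the charge of an RC node: a touch adds body capacitance and lengthens the charge time. The component values and touch capacitance below are illustrative assumptions, not figures from the paper:

```python
# Illustrative RC charge-time capacitance measurement: the time for an RC
# node to reach a logic threshold grows linearly with C, so a fingertip's
# added capacitance shows up as a longer charge time. Values are assumed.
import math

def charge_time_s(r_ohm, c_farad, vcc=3.3, v_thresh=1.65):
    """t = -R*C*ln(1 - Vth/Vcc) for an RC node charging toward Vcc."""
    return -r_ohm * c_farad * math.log(1 - v_thresh / vcc)

BASE_C = 10e-12   # electrode's free-space capacitance (assumed)
TOUCH_C = 50e-12  # extra capacitance from a fingertip (assumed)
R = 1e6           # charge resistor (assumed)

t_free = charge_time_s(R, BASE_C)
t_touch = charge_time_s(R, BASE_C + TOUCH_C)
print(t_touch / t_free)  # charge time ratio equals the capacitance ratio
```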


Ubiquitous Computing | 2010

WATTR: a method for self-powered wireless sensing of water activity in the home

Tim Campbell; Eric C. Larson; Gabe Cohn; Jon E. Froehlich; Ramses Alcaide; Shwetak N. Patel

We present WATTR, a novel self-powered water activity sensor that utilizes residential water pressure impulses as both a powering and sensing source. Consisting of a power harvesting circuit, piezoelectric sensor, ultra-low-power 16-bit microcontroller, 16-bit analog-to-digital converter (ADC), and a 433 MHz wireless transmitter, WATTR is capable of sampling home water pressure at 33 Hz and transmitting over 3 m when any water fixture in the home is opened or closed. WATTR provides an alternative sensing solution to the power-intensive Bluetooth-based sensor used in the HydroSense project by Froehlich et al. [2] for single-point whole-home water usage. We demonstrate WATTR as a viable self-powered sensor capable of monitoring and transmitting water usage data without the use of a battery. Unlike other water-based power harvesters, WATTR does not waste water to power itself. We discuss the design, implementation, and experimental verification of the WATTR device.
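The sensing side can be caricatured as impulse detection on the 33 Hz pressure stream: fixture open/close events appear as abrupt pressure changes. The threshold and trace below are invented for illustration:

```python
# Toy valve-event detector on a sampled water-pressure trace: an open or
# close event shows up as a sharp pressure impulse. Values are invented.

def pressure_events(samples, threshold=5.0):
    """Indices where the sample-to-sample pressure change is abrupt."""
    return [i for i in range(1, len(samples))
            if abs(samples[i] - samples[i - 1]) > threshold]

# Steady 60 psi, a fixture opens (pressure drop), then closes (recovery).
psi = [60, 60, 60, 48, 47, 47, 47, 59, 60, 60]
print(pressure_events(psi))  # [3, 7]
```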


Human Factors in Computing Systems | 2013

uTouch: sensing touch gestures on unmodified LCDs

Ke-Yu Chen; Gabe Cohn; Sidhant Gupta; Shwetak N. Patel

Current solutions for enabling touch interaction on existing non-touch LCD screens require adding additional sensors to the interaction surface. We present uTouch, a system that detects and classifies touches and hovers without any modification to the display, and without adding any sensors to the user. Our approach utilizes existing signals in an LCD that are amplified when a user brings their hand near or touches the LCD's front panel. These signals are coupled onto the power lines, where they appear as electromagnetic interference (EMI) which can be sensed using a single device connected elsewhere on the power line infrastructure. We validate our approach with an 11-user, 8-LCD study, and demonstrate a real-time system.
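A toy version of the touch/hover decision, reducing it to banding a scalar EMI level; the band edges and the monotonic hand-proximity signal model are assumptions, not the paper's classifier:

```python
# Toy three-way classification of a coupled-EMI level into no-hand /
# hover / touch bands. Band edges are illustrative, not from the paper.

def classify_emi(level, hover_edge=1.2, touch_edge=1.5):
    """Assume coupled EMI rises as the hand approaches, highest at touch."""
    if level >= touch_edge:
        return "touch"
    if level >= hover_edge:
        return "hover"
    return "none"

print([classify_emi(x) for x in (1.0, 1.3, 1.8)])  # ['none', 'hover', 'touch']
```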

Collaboration


Dive into Gabe Cohn's collaborations.

Top Co-Authors

Eric C. Larson

University of Washington

Akash Badshah

Massachusetts Institute of Technology
