Ke-Yu Chen
University of Washington
Publication
Featured research published by Ke-Yu Chen.
user interface software and technology | 2013
Ke-Yu Chen; Kent Lyons; Sean White; Shwetak N. Patel
While much progress has been made in wearable computing in recent years, input techniques remain a key challenge. In this paper, we introduce uTrack, a technique to convert the thumb and fingers into a 3D input system using magnetic field (MF) sensing. A user wears a pair of magnetometers on the back of their fingers and a permanent magnet affixed to the back of the thumb. By moving the thumb across the fingers, we obtain a continuous input stream that can be used for 3D pointing. Specifically, our novel algorithm calculates the magnet's 3D position and tilt angle directly from the sensor readings. We evaluated uTrack as an input device, showing an average tracking accuracy of 4.84 mm in 3D space, sufficient for subtle interaction. We also demonstrate a real-time prototype and example applications allowing users to interact with the computer using 3D finger input.
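The core idea of recovering a magnet's position from field readings can be illustrated with a toy one-dimensional case: on the magnet's axis, the dipole field falls off as the cube of distance, so distance can be inverted from a single magnitude reading. A minimal sketch under that assumption; the moment value and on-axis geometry are illustrative, not the paper's actual multi-sensor algorithm:

```python
import math

MU0 = 4 * math.pi * 1e-7  # vacuum permeability (T*m/A)

def on_axis_field(moment, r):
    """On-axis flux density of a magnetic dipole at distance r (meters)."""
    return MU0 * moment / (2 * math.pi * r ** 3)

def distance_from_field(moment, b):
    """Invert the on-axis dipole model to recover distance from field strength."""
    return (MU0 * moment / (2 * math.pi * b)) ** (1 / 3)

# Round trip: a hypothetical 0.1 A*m^2 magnet sensed at 3 cm
b = on_axis_field(0.1, 0.03)
r = distance_from_field(0.1, b)
```

Full 3D tracking, as in the paper, requires multiple sensors and the magnet's tilt, but the cube-law inversion above is the geometric ingredient that makes magnitude readings informative about position.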
user interface software and technology | 2014
Chen Zhao; Ke-Yu Chen; Tanvir Islam Aumi; Shwetak N. Patel; Matthew S. Reynolds
Current smartphone inputs are limited to physical buttons, touchscreens, cameras or built-in sensors. These approaches either require a dedicated surface or line-of-sight for interaction. We introduce SideSwipe, a novel system that enables in-air gestures both above and around a mobile device. Our system leverages the actual GSM signal to detect hand gestures around the device. We developed an algorithm to convert the discrete and bursty GSM pulses to a continuous wave that can be used for gesture recognition. Specifically, when a user waves their hand near the phone, the hand movement disturbs the signal propagation between the phone's transmitter and added receiving antennas. Our system captures this variation and uses it for gesture recognition. To evaluate our system, we conducted a study with 10 participants and present robust gesture recognition with an average accuracy of 87.2% across 14 hand gestures.
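The pulse-to-continuous-wave step can be approximated by envelope extraction: take the peak amplitude within each burst window and interpolate between bursts to recover the slow modulation caused by the hand. A hypothetical sketch with invented burst lengths and signal parameters, not the authors' actual algorithm:

```python
import math

def burst_envelope(samples, burst_len):
    """Collapse a bursty signal into one peak amplitude per burst window."""
    peaks = []
    for i in range(0, len(samples), burst_len):
        window = samples[i:i + burst_len]
        peaks.append(max(abs(s) for s in window))
    return peaks

def interpolate(peaks, factor):
    """Linearly interpolate burst peaks back into a continuous waveform."""
    out = []
    for a, b in zip(peaks, peaks[1:]):
        for k in range(factor):
            out.append(a + (b - a) * k / factor)
    out.append(peaks[-1])
    return out

# Bursty carrier whose amplitude is slowly modulated by a simulated hand wave
sig = [math.sin(2 * math.pi * 0.3 * (i // 20)) * math.sin(2 * math.pi * 0.25 * i)
      for i in range(200)]
peaks = burst_envelope(sig, 20)
env = interpolate(peaks, 20)
```

The resulting envelope varies at the hand-motion rate rather than the burst rate, which is what makes it usable as a feature stream for gesture recognition.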
ieee international conference on pervasive computing and communications | 2015
Ruth Ravichandran; Elliot Saba; Ke-Yu Chen; Mayank Goel; Sidhant Gupta; Shwetak N. Patel
Sensing respiration rate has many applications in monitoring various health conditions, such as sleep apnea and chronic obstructive pulmonary disease. In this paper, we present WiBreathe, a wireless, high-fidelity and non-invasive breathing monitor that leverages wireless signals at 2.4 GHz to estimate an individual's respiration rate. Our work extends past approaches of using wireless signals for respiratory monitoring by using only a single transmitter-receiver pair, at the same frequency range as commodity Wi-Fi signals, to estimate the respiratory rate of an individual. This is done irrespective of whether the person is in line of sight (e.g., through walls). Furthermore, we demonstrate the capability of WiBreathe in detecting multiple people and, by extension, their respiration rates. We evaluate our approach in various natural environments and show that we can track breathing with an accuracy of 1.54 breaths per minute when compared to a clinical respiratory chest band.
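Turning such a chest-movement signal into a respiration rate typically reduces to finding the dominant frequency in the human breathing band, roughly 0.1 to 0.5 Hz. A sketch using a brute-force single-bin DFT scan; the band limits, sampling rate, and synthetic signal are assumptions for illustration, not details from the paper:

```python
import math

def respiration_rate(signal, fs, lo=0.1, hi=0.5):
    """Estimate breaths per minute as the dominant DFT frequency in [lo, hi] Hz."""
    n = len(signal)
    best_f, best_mag = 0.0, -1.0
    for k in range(1, n // 2):
        f = k * fs / n
        if lo <= f <= hi:
            re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
            im = sum(signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
            mag = math.hypot(re, im)
            if mag > best_mag:
                best_f, best_mag = f, mag
    return best_f * 60  # convert Hz to breaths per minute

# Synthetic chest movement at 0.25 Hz (15 breaths/min), sampled at 10 Hz for 40 s
sig = [math.sin(2 * math.pi * 0.25 * t / 10) for t in range(400)]
bpm = respiration_rate(sig, 10)
```

Restricting the search to the breathing band is what makes the estimate robust to higher-frequency motion and noise in the received signal.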
ubiquitous computing | 2011
Sidhant Gupta; Ke-Yu Chen; Matthew S. Reynolds; Shwetak N. Patel
In this paper, we describe LightWave, a sensing approach that turns ordinary compact fluorescent light (CFL) bulbs into sensors of human proximity. Unmodified CFL bulbs are shown to be sensitive proximity transducers when they are illuminated. This approach utilizes predictable variations in electromagnetic noise resulting from the change in impedance due to the proximity of a human body to the bulb. The electromagnetic noise can be sensed from any point along a home's electrical wiring. This allows users to perform gestures near any CFL lighting fixture, even when multiple lamps are operational. Gestures can be sensed using a single interface device plugged into any electrical outlet. We experimentally show that we can reliably detect hover gestures (waving a hand close to a lamp), touches on lampshades, and touches on the glass part of the bulb itself. Additionally, we show that touches anywhere along the body of a metal lamp can be detected. These basic detectable signals can then be combined to form complex gesture sequences for a variety of applications. We also show that CFLs can function as more general-purpose sensors for distributed human motion detection and ambient temperature sensing.
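The detection idea can be caricatured as baseline deviation: record the EMI amplitude with no hand present, then flag samples that drift beyond a threshold. All readings and the threshold below are made up for illustration; the actual system distinguishes richer event classes than this:

```python
def detect_hover(amplitudes, baseline, threshold=0.05):
    """Flag samples whose EMI amplitude deviates from the hands-free baseline."""
    return [abs(a - baseline) > threshold for a in amplitudes]

# Hypothetical EMI amplitudes: steady baseline, then a hover that shifts them
readings = [1.00, 1.01, 0.99, 1.12, 1.15, 1.13, 1.00]
events = detect_hover(readings, baseline=1.0)
```

Runs of flagged samples like the middle three above are the raw events that would then be grouped into hovers, touches, and longer gesture sequences.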
human factors in computing systems | 2013
Ke-Yu Chen; Gabe Cohn; Sidhant Gupta; Shwetak N. Patel
Current solutions for enabling touch interaction on existing non-touch LCD screens require adding additional sensors to the interaction surface. We present uTouch, a system that detects and classifies touches and hovers without any modification to the display, and without adding any sensors to the user. Our approach utilizes existing signals in an LCD that are amplified when a user brings their hand near or touches the LCD's front panel. These signals are coupled onto the power lines, where they appear as electromagnetic interference (EMI) which can be sensed using a single device connected elsewhere on the power line infrastructure. We validate our approach with an 11-user, 8-LCD study, and demonstrate a real-time system.
human factors in computing systems | 2016
Ke-Yu Chen; Shwetak N. Patel; Sean Keller
With the resurgence of head-mounted displays for virtual reality, users need new input devices that can accurately track their hands and fingers in motion. We introduce Finexus, a multipoint tracking system using magnetic field sensing. By instrumenting the fingertips with electromagnets, the system can track fine fingertip movements in real time using only four magnetic sensors. To keep the system robust to noise, we operate each electromagnet at a different frequency and leverage bandpass filters to distinguish signals attributed to individual sensing points. We develop a novel algorithm to efficiently calculate the 3D positions of multiple electromagnets from corresponding field strengths. In our evaluation, we report an average accuracy of 1.33 mm, as compared to results from an optical tracker. Our real-time implementation shows Finexus is applicable to a wide variety of human input tasks, such as writing in the air.
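The frequency-division idea, driving each electromagnet at its own carrier so one sensor stream can be unmixed, can be sketched with a single-bin DFT standing in for the paper's bandpass filters. The carrier frequencies and amplitudes below are invented for the example:

```python
import math

def tone_magnitude(samples, fs, freq):
    """Single-bin DFT magnitude at freq -- a stand-in for a bandpass filter."""
    n = len(samples)
    re = sum(s * math.cos(2 * math.pi * freq * t / fs) for t, s in enumerate(samples))
    im = sum(s * math.sin(2 * math.pi * freq * t / fs) for t, s in enumerate(samples))
    return 2 * math.hypot(re, im) / n

# Two electromagnets driven at distinct carriers; one sensor sees their sum
fs = 1000
mix = [0.8 * math.sin(2 * math.pi * 100 * t / fs) +
      0.3 * math.sin(2 * math.pi * 150 * t / fs) for t in range(1000)]
```

Because the carriers are orthogonal over the window, each per-frequency magnitude recovers one electromagnet's field strength at the sensor, which is the quantity the position solver then consumes.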
ieee international conference on pervasive computing and communications | 2015
Ke-Yu Chen; Sidhant Gupta; Eric C. Larson; Shwetak N. Patel
Electricity and appliance usage information can often reveal the nature of human activities in a home. For instance, sensing the use of a vacuum cleaner, a microwave oven, and kitchen appliances can give insights into a person's current activities. Instead of putting a sensor on each appliance, our technique is based on the idea that appliance usage can be sensed through its manifestations in an environment's existing electrical infrastructure. Prior approaches using this technique could only detect an appliance's on-off states; that is, they only sense "what" is being used, but not "how" it is used. In this paper, we introduce DOSE, a significant advancement for inferring operating states of electronic devices from a single sensing point in a home. When an electronic device is in operation, it generates time-varying electromagnetic interference (EMI) based upon its operating states (e.g., vacuuming on a rug vs. a hardwood floor). This EMI noise is coupled to the power line and can be picked up by a single sensing device attached to a wall outlet in a house. Unlike prior data-driven approaches, we employ domain knowledge of the devices' circuitry for semi-supervised model training to avoid a tedious labeling process. We evaluated DOSE in a residential house for 2 months and found that operating states for 16 appliances could be estimated with an average accuracy of 93.8%. These fine-grained electrical characteristics afford rich feature sets of electrical events and have the potential to support various applications such as in-home activity inference, energy disaggregation and device failure detection.
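State inference from EMI can be caricatured as nearest-frequency matching: a switched-mode device's operating state shifts its characteristic switching frequency, so an observed EMI peak can be assigned to the closest known state. The state names and frequencies below are hypothetical, and this is far simpler than DOSE's semi-supervised modeling:

```python
def classify_state(emi_peak_khz, state_freqs):
    """Map an observed EMI peak frequency to the nearest known operating state."""
    return min(state_freqs, key=lambda s: abs(state_freqs[s] - emi_peak_khz))

# Hypothetical switching frequencies (kHz) for two vacuum-cleaner states
state_freqs = {"rug": 52.0, "hardwood": 48.5}
label = classify_state(51.2, state_freqs)
```

In this toy model the per-state frequencies would come from circuit knowledge rather than labeled training data, which mirrors the motivation for the paper's semi-supervised approach.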
ubiquitous computing | 2014
Ke-Yu Chen; Daniel Ashbrook; Mayank Goel; Sung-hyuck Lee; Shwetak N. Patel
international conference on body area networks | 2013
Ke-Yu Chen; Mark Harniss; Justin Haowei Lim; Youngjun Han; Kurt L. Johnson; Shwetak N. Patel
Archive | 2015
Matthew S. Reynolds; Chen Zhao; Ke-Yu Chen; Shwetak N. Patel