
Publication


Featured research published by Liwei Chan.


user interface software and technology | 2013

FingerPad: private and subtle interaction using fingertips

Liwei Chan; Rong-Hao Liang; Ming-Chang Tsai; Kai-Yin Cheng; Chao-Huai Su; Mike Y. Chen; Wen-Huang Cheng; Bing-Yu Chen

We present FingerPad, a nail-mounted device that turns the tip of the index finger into a touchpad, allowing private and subtle interaction while on the move. FingerPad enables touch input using magnetic tracking, by adding a Hall-sensor grid on the index fingernail and a magnet on the thumbnail. Since it permits input through the pinch gesture, FingerPad is suitable for private use because the movements of the fingers in a pinch are subtle and are naturally hidden by the hand. Functionally, FingerPad resembles a touchpad and also allows for eyes-free use. Additionally, since the necessary devices are attached to the nails, FingerPad preserves natural haptic feedback without affecting the native function of the fingertips. Through a user study, we analyze three design factors, namely posture, commitment method, and target size, to assess the design of FingerPad. Though the results show some trade-offs among the factors, participants generally achieved 93% accuracy for very small targets (1.2 mm wide) in the seated condition, and 92% accuracy for 2.5 mm-wide targets in the walking condition.
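The magnetic tracking described above can be illustrated with a toy sketch. The weighted-centroid scheme and all names below are illustrative assumptions, not taken from the paper: given field-strength readings from a small Hall-sensor grid, the thumb magnet's position can be estimated as the intensity-weighted centroid of the readings.

```python
import numpy as np

def estimate_touch_position(readings: np.ndarray, pitch_mm: float = 1.0):
    """Estimate the magnet position over a Hall-sensor grid as the
    field-strength-weighted centroid (illustrative only; the actual
    FingerPad tracking method may differ).

    readings: 2-D array of absolute field magnitudes, one per sensor.
    pitch_mm: physical spacing between neighboring sensors.
    """
    rows, cols = readings.shape
    ys, xs = np.mgrid[0:rows, 0:cols]
    weights = readings / readings.sum()
    x_mm = float((weights * xs).sum()) * pitch_mm
    y_mm = float((weights * ys).sum()) * pitch_mm
    return (x_mm, y_mm)

# A magnet centered over the middle sensor of a 3x3 grid:
grid = np.array([[1.0, 2.0, 1.0],
                 [2.0, 8.0, 2.0],
                 [1.0, 2.0, 1.0]])
print(estimate_touch_position(grid))  # centroid lands near (1.0, 1.0)
```

A real device would also need per-sensor calibration and a mapping from field strength to distance; the centroid is only a first approximation.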


human factors in computing systems | 2012

CapStones and ZebraWidgets: sensing stacks of building blocks, dials and sliders on capacitive touch screens

Liwei Chan; Stefanie Müller; Anne Roudaut; Patrick Baudisch

Recent research proposes augmenting capacitive touch pads with tangible objects, enabling a new generation of mobile applications enhanced with tangibles, such as game pieces and tangible controllers. In this paper, we extend the concept to capacitive tangibles consisting of multiple parts, such as stackable gaming pieces and tangible widgets with moving parts. We achieve this using a system of wires and connectors inside each block that causes the capacitance of the bottom-most block to reflect the entire assembly. We demonstrate three types of tangibles, called CapStones, Zebra Dials, and Zebra Sliders, that work with current consumer hardware, and investigate what designs may become possible as touchscreen hardware evolves.


international conference on pervasive computing | 2006

Collaborative localization: enhancing WiFi-based position estimation with neighborhood links in clusters

Liwei Chan; Ji-Rung Chiang; Yi-Chao Chen; Chia-nan Ke; Jane Yung-jen Hsu; Hao-Hua Chu

Location-aware services can benefit from accurate and reliable indoor location tracking. The widespread adoption of 802.11x wireless LANs as network infrastructure creates the opportunity to deploy WiFi-based location services with little additional hardware cost. While recent research has demonstrated adequate performance, localization error increases significantly in crowded and dynamic situations due to electromagnetic interference. This paper proposes collaborative localization as an approach to enhance position estimation by leveraging more accurate location information from nearby neighbors within the same cluster. The current implementation utilizes ZigBee radio as the neighbor-detection sensor. This paper introduces the basic model and algorithm for collaborative localization. We also report experiments that evaluate its performance under a variety of clustering scenarios. Our results show a 28.2-56% accuracy improvement over the baseline system, Ekahau, a commercial WiFi localization system.
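The core idea of collaborative localization, pulling a noisy estimate toward more confident estimates from neighbors detected over short-range links, can be sketched as a confidence-weighted average. This is a minimal stand-in for the paper's model, not its actual algorithm; the function name and weighting scheme are assumptions.

```python
import numpy as np

def collaborative_refine(own_xy, own_conf, neighbors):
    """Refine a WiFi position estimate using nearby neighbors
    (confidence-weighted average; an illustrative stand-in for the
    paper's collaborative localization model, not its actual algorithm).

    own_xy:    (x, y) estimate from WiFi fingerprinting.
    own_conf:  confidence weight in (0, 1].
    neighbors: list of ((x, y), confidence) for devices detected
               in the same cluster via short-range (e.g. ZigBee) links.
    """
    points = [np.asarray(own_xy, float)]
    weights = [own_conf]
    for xy, conf in neighbors:
        points.append(np.asarray(xy, float))
        weights.append(conf)
    weights = np.asarray(weights)
    refined = (np.stack(points) * weights[:, None]).sum(axis=0) / weights.sum()
    return (float(refined[0]), float(refined[1]))

# A low-confidence estimate at (10, 0) pulled toward two confident
# neighbors near the origin (result lands around x = 1.9):
print(collaborative_refine((10.0, 0.0), 0.2,
                           [((0.0, 0.0), 0.9), ((2.0, 0.0), 0.9)]))
```

In practice the neighbor link itself (here, a ZigBee detection) only asserts proximity, so a full model would bound, rather than average, the positions; the sketch shows only the fusion step.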


human factors in computing systems | 2011

TUIC: enabling tangible interaction on capacitive multi-touch displays

Neng-Hao Yu; Liwei Chan; Seng-Yong Lau; Sung-Sheng Tsai; I-Chun Hsiao; Dian-Je Tsai; Fang-I Hsiao; Lung-Pan Cheng; Mike Y. Chen; Polly Huang; Yi-Ping Hung

We present TUIC, a technology that enables tangible interaction on capacitive multi-touch devices, such as the iPad, iPhone, and 3M's multi-touch displays, without requiring any hardware modifications. TUIC simulates finger touches on capacitive displays using passive materials and active modulation circuits embedded inside tangible objects, and can be used simultaneously with multi-touch gestures. TUIC consists of three approaches to sensing and tracking objects: spatial, frequency, and hybrid (spatial plus frequency). The spatial approach, also known as 2D markers, uses geometric, multi-point touch patterns to encode object IDs. Spatial tags are straightforward to construct and are easily tracked when moved, but require sufficient spacing between the multiple touch points. The frequency approach uses modulation circuits to generate high-frequency touches to encode object IDs in the time domain. It requires fewer touch points and allows smaller tags to be built. The hybrid approach combines spatial and frequency tags to construct small tags that can be reliably tracked when moved and rotated. We show three applications demonstrating the above approaches on iPads and 3M's multi-touch displays.
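The frequency approach above can be illustrated with a toy decoder: if a tag encodes its ID as the rate at which its simulated touch toggles on and off, the host can estimate that rate by counting rising edges in the touch-presence signal. The sampling rate, frequencies, and ID table below are invented for illustration; TUIC's real modulation scheme is not specified here.

```python
def decode_frequency_tag(touch_samples, sample_rate_hz, id_table):
    """Decode a TUIC-style frequency tag from a boolean touch-presence
    signal (toy decoder; the modulation parameters are assumptions,
    not taken from the paper).

    touch_samples:  sequence of booleans, True while the simulated
                    touch is present, sampled at sample_rate_hz.
    id_table:       maps a nominal toggle frequency (Hz) to an object ID.
    """
    # Count rising edges (off -> on transitions) to estimate frequency.
    edges = sum(1 for prev, cur in zip(touch_samples, touch_samples[1:])
                if cur and not prev)
    duration_s = len(touch_samples) / sample_rate_hz
    freq = edges / duration_s
    # Snap to the nearest known tag frequency.
    nominal = min(id_table, key=lambda f: abs(f - freq))
    return id_table[nominal]

# A touch toggling 5 times over 1 second, sampled at 100 Hz:
signal = ([True] * 10 + [False] * 10) * 5
print(decode_frequency_tag(signal, 100, {5.0: "pawn", 10.0: "dial"}))  # -> pawn
```

A robust implementation would debounce the signal and integrate over a longer window, since real capacitive controllers report touches at a limited and jittery rate.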


human factors in computing systems | 2010

Touching the void: direct-touch interaction for intangible displays

Liwei Chan; HuiShan Kao; Mike Y. Chen; Ming-Sui Lee; Jane Yung-jen Hsu; Yi-Ping Hung

In this paper, we explore the challenges of applying direct-touch interaction to intangible displays and investigate methodologies to improve it. Direct-touch interaction simplifies object manipulation because it combines the input and display into a single integrated interface. While direct-touch technology on traditional tangible displays is commonplace, similar direct-touch interaction within an intangible display paradigm presents many challenges. Given the lack of tactile feedback, direct-touch interaction on an intangible display may perform poorly even on the simplest target acquisition tasks. To study this problem, we created a prototype of an intangible display. In the initial study, we collected user discrepancy data corresponding to the interpretation of the 3D location of targets shown on our intangible display. The results showed that participants performed poorly in determining the z-coordinate of the targets and were imprecise in their execution of screen touches within the system. Thirty percent of positioning operations showed errors larger than 30 mm from the actual surface. This finding prompted us to design a second study, in which we quantified task time in the presence of visual and audio feedback. The pseudo-shadow visual feedback was shown to be helpful in improving both user performance and satisfaction.


user interface software and technology | 2015

CyclopsRing: Enabling Whole-Hand and Context-Aware Interactions Through a Fisheye Ring

Liwei Chan; Yi-Ling Chen; Chi-Hao Hsieh; Rong-Hao Liang; Bing-Yu Chen

This paper presents CyclopsRing, a ring-style fisheye imaging wearable device that can be worn on hand webbings to enable whole-hand and context-aware interactions. Observing from a central position on the hand through a fisheye perspective, CyclopsRing sees not only the operating hand but also the environmental contexts involved in the hand-based interactions. Since CyclopsRing is a finger-worn device, it also allows users to fully preserve the skin feedback of their hands. This paper demonstrates a proof-of-concept device, reports its hand-gesture recognition performance using the random decision forest (RDF) method, and, on top of the gesture recognizer, presents a set of interaction techniques including on-finger pinch-and-slide input, in-air pinch-and-motion input, palm-writing input, and their interactions with the environmental contexts. The experiment obtained an 84.75% recognition rate for hand gesture input on a database of seven hand gestures collected from 15 participants. To our knowledge, CyclopsRing is the first ring-wearable device that supports whole-hand and context-aware interactions.
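A random decision forest of the kind used by the gesture recognizer can be sketched minimally as an ensemble of single-split trees (stumps) trained on bootstrap samples and random features, voting on the class. This is a generic RDF illustration; CyclopsRing's actual image features and forest structure are not described here, and the two-dimensional features below are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_stump(X, y, f):
    """Best single threshold on feature f; majority label on each side."""
    best = None
    for t in np.unique(X[:, f]):
        mask = X[:, f] <= t
        left, right = y[mask], y[~mask]
        if len(left) == 0 or len(right) == 0:
            continue
        l_lab, r_lab = np.bincount(left).argmax(), np.bincount(right).argmax()
        acc = ((left == l_lab).sum() + (right == r_lab).sum()) / len(y)
        if best is None or acc > best[0]:
            best = (acc, t, l_lab, r_lab)
    if best is None:  # degenerate bootstrap: constant feature value
        lab = np.bincount(y).argmax()
        return f, X[0, f], lab, lab
    return f, best[1], best[2], best[3]

def train_forest(X, y, n_trees=15):
    """Each stump sees a bootstrap sample and one random feature."""
    forest = []
    for _ in range(n_trees):
        idx = rng.integers(0, len(X), len(X))
        f = int(rng.integers(0, X.shape[1]))
        forest.append(train_stump(X[idx], y[idx], f))
    return forest

def predict(forest, x):
    votes = [l if x[f] <= t else r for f, t, l, r in forest]
    return int(np.bincount(votes).argmax())

# Toy data: 2 features per frame standing in for hand-image descriptors,
# two well-separated gesture classes.
X0 = rng.normal(0.0, 0.3, size=(20, 2))   # gesture 0
X1 = rng.normal(2.0, 0.3, size=(20, 2))   # gesture 1
X, y = np.vstack([X0, X1]), np.array([0] * 20 + [1] * 20)
forest = train_forest(X, y)
print(predict(forest, np.array([1.9, 2.1])))  # -> 1
```

Real RDF gesture recognizers use deeper trees and many image-derived features; the stump ensemble keeps only the bootstrap-plus-random-feature idea.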


user interface software and technology | 2010

Enabling beyond-surface interactions for interactive surface with an invisible projection

Liwei Chan; Hsiang-Tao Wu; HuiShan Kao; Ju-Chun Ko; Home-Ru Lin; Mike Y. Chen; Jane Yung-jen Hsu; Yi-Ping Hung

This paper presents a programmable infrared (IR) technique that utilizes invisible, programmable markers to support interaction beyond the surface of a diffused-illumination (DI) multi-touch system. We combine an IR projector and a standard color projector to simultaneously project visible content and invisible markers. Mobile devices outfitted with IR cameras can compute their 3D positions based on the markers perceived. Markers are selectively turned off to support multi-touch and direct on-surface tangible input. The proposed techniques enable a collaborative multi-display multi-touch tabletop system. We also present three interactive tools: i-m-View, i-m-Lamp, and i-m-Flashlight, which consist of a mobile tablet and projectors that users can freely interact with beyond the main display surface. Early user feedback shows that these interactive devices, combined with a large interactive display, allow more intuitive navigation and are reportedly enjoyable to use.
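The position computation from perceived markers can be sketched with a plane-to-image homography: the invisible markers lie on the flat tabletop, so four detected markers with known table coordinates determine the mapping to camera pixels. The direct linear transform (DLT) below is a standard technique, not the paper's stated method, and all coordinates are synthetic; recovering full 3D device pose would additionally require the camera intrinsics and a homography decomposition.

```python
import numpy as np

def fit_homography(src, dst):
    """DLT estimate of the 3x3 homography H mapping src points
    (tabletop marker coordinates) to dst points (IR-camera pixels).
    Illustrative sketch only, with synthetic correspondences.
    """
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography is the null vector of A (smallest singular vector).
    _, _, Vt = np.linalg.svd(np.asarray(A, float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def project(H, xy):
    p = H @ np.array([xy[0], xy[1], 1.0])
    return (p[0] / p[2], p[1] / p[2])

# Four invisible markers at known table positions, and where the
# mobile device's IR camera observed them (made-up numbers):
table = [(0, 0), (1, 0), (1, 1), (0, 1)]
image = [(100, 100), (300, 110), (290, 310), (90, 300)]
H = fit_homography(table, image)
print(project(H, (0.5, 0.5)))  # an interior point of the observed quad
```

With the markers selectively toggled (as the abstract describes), the device can keep refitting H, and hence its pose, while on-surface touch input proceeds underneath.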


human factors in computing systems | 2016

DigitSpace: Designing Thumb-to-Fingers Touch Interfaces for One-Handed and Eyes-Free Interactions

Da-Yuan Huang; Liwei Chan; Shuo Yang; Fan Wang; Rong-Hao Liang; De-Nian Yang; Yi-Ping Hung; Bing-Yu Chen

Thumb-to-fingers interfaces augment touch widgets on fingers, which are manipulated by the thumb. Such interfaces are ideal for one-handed, eyes-free input, since touch widgets on the fingers are easily accessed by the thumb acting as a stylus. This study presents DigitSpace, a thumb-to-fingers interface that addresses two ergonomic factors: hand anatomy and touch precision. Hand anatomy restricts the possible movements of the thumb, which in turn influences physical comfort during the interactions. Touch precision is a human factor that determines how precisely users can manipulate touch widgets set on fingers, which determines effective layouts of the widgets. Buttons and touchpads were considered in our studies to enable discrete and continuous input in an eyes-free manner. The first study explores the regions of the fingers where the interactions can be comfortably performed. Based on these comfort regions, the second and third studies explore effective layouts for button and touchpad widgets. The experimental results indicate that participants could discriminate at least 16 buttons on their fingers. For the touchpad, participants were asked to perform unistrokes. Our results revealed that, since each participant exhibited a coherent writing behavior, personalized recognizers could offer 92% accuracy on a cross-finger touchpad. A series of design guidelines are proposed for designers, and a DigitSpace prototype that uses magnetic-tracking methods is demonstrated.


human factors in computing systems | 2014

GaussBricks: magnetic building blocks for constructive tangible interactions on portable displays

Rong-Hao Liang; Liwei Chan; Hung-Yu Tseng; Han-Chih Kuo; Da-Yuan Huang; De-Nian Yang; Bing-Yu Chen

This work describes GaussBricks, a novel building block system for tangible interaction design that enables real-time constructive tangible interactions on portable displays. Given its simplicity, the mechanical design of the magnetic building blocks facilitates the construction of configurable forms. A form constructed from the magnetic building blocks, which are connected by magnetic joints, can be manipulated stably by users, with various elastic force-feedback mechanisms. With an analog Hall-sensor grid mounted on its back, a portable display determines the geometrical configuration and detects various user interactions in real time. This work also introduces several methods to enable shape changing, multi-touch input, and display capabilities in the construction. The proposed building block system enriches how individuals physically interact with portable displays.


human factors in computing systems | 2013

GaussBits: magnetic tangible bits for portable and occlusion-free near-surface interactions

Rong-Hao Liang; Kai-Yin Cheng; Liwei Chan; Chuan-Xhyuan Peng; Mike Y. Chen; Rung-Huei Liang; De-Nian Yang; Bing-Yu Chen

Collaboration


Dive into Liwei Chan's collaboration.

Top Co-Authors:

Yi-Ping Hung (National Taiwan University)
Bing-Yu Chen (National Taiwan University)
Rong-Hao Liang (National Taiwan University)
Jane Yung-jen Hsu (National Taiwan University)
Mike Y. Chen (National Taiwan University)
Da-Yuan Huang (National Taiwan University)
Neng-Hao Yu (National Taiwan University)
Kai-Yin Cheng (National Taiwan University)
Yi-Chi Liao (National Taiwan University)