Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Rong-Hao Liang is active.

Publication


Featured research published by Rong-Hao Liang.


user interface software and technology | 2013

FingerPad: private and subtle interaction using fingertips

Liwei Chan; Rong-Hao Liang; Ming-Chang Tsai; Kai-Yin Cheng; Chao-Huai Su; Mike Y. Chen; Wen-Huang Cheng; Bing-Yu Chen

We present FingerPad, a nail-mounted device that turns the tip of the index finger into a touchpad, allowing private and subtle interaction while on the move. FingerPad enables touch input using magnetic tracking, by adding a Hall-sensor grid on the index fingernail and a magnet on the thumbnail. Since it permits input through the pinch gesture, FingerPad is suitable for private use because the movements of the fingers in a pinch are subtle and are naturally hidden by the hand. Functionally, FingerPad resembles a touchpad and also allows for eyes-free use. Additionally, since the necessary devices are attached to the nails, FingerPad preserves natural haptic feedback without affecting the native function of the fingertips. Through a user study, we analyze three design factors, namely posture, commitment method, and target size, to assess the design of FingerPad. Though the results show some trade-offs among the factors, participants generally achieved 93% accuracy for very small targets (1.2 mm wide) in the seated condition and 92% accuracy for 2.5 mm-wide targets in the walking condition.
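
The magnetic tracking described above (a Hall-sensor grid on the index fingernail sensing a magnet on the thumbnail) boils down to estimating the magnet's position from a small grid of field readings. The paper's own algorithm is not reproduced here; the following Python sketch is only a minimal illustration of one simple approach, a weighted centroid over hypothetical sensor coordinates and readings.

```python
import numpy as np

def estimate_touch_position(readings, sensor_xy):
    """Estimate the 2D position of a magnet over a Hall-sensor grid.

    readings  : (N,) array of absolute field intensities from N Hall sensors
    sensor_xy : (N, 2) array of the sensors' known grid coordinates (mm)

    A simple weighted centroid: sensors reading a stronger field pull the
    estimate toward themselves. A real tracker would calibrate per-sensor
    offsets and fit a dipole model instead.
    """
    readings = np.asarray(readings, dtype=float)
    weights = np.clip(readings - readings.min(), 0, None)  # remove baseline
    if weights.sum() == 0:
        return None  # no magnet detected
    return (weights[:, None] * sensor_xy).sum(axis=0) / weights.sum()

# Hypothetical 3x3 grid spaced 2 mm apart, with the magnet near (2, 4)
grid = np.array([(x, y) for y in (0, 2, 4) for x in (0, 2, 4)], dtype=float)
sample = np.array([1, 2, 1,  3, 6, 3,  5, 9, 5], dtype=float)
print(estimate_touch_position(sample, grid))
```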


user interface software and technology | 2015

CyclopsRing: Enabling Whole-Hand and Context-Aware Interactions Through a Fisheye Ring

Liwei Chan; Yi-Ling Chen; Chi-Hao Hsieh; Rong-Hao Liang; Bing-Yu Chen

This paper presents CyclopsRing, a ring-style fisheye imaging wearable device that can be worn on the hand webbing to enable whole-hand and context-aware interactions. Observing from a central position of the hand through a fisheye perspective, CyclopsRing sees not only the operating hand but also the environmental contexts involved in the hand-based interactions. Since CyclopsRing is a finger-worn device, it also allows users to fully preserve the skin feedback of the hands. This paper demonstrates a proof-of-concept device, reports its performance in hand-gesture recognition using a random decision forest (RDF) method, and, on top of the gesture recognizer, presents a set of interaction techniques including on-finger pinch-and-slide input, in-air pinch-and-motion input, palm-writing input, and their interactions with the environmental contexts. The experiment obtained an 84.75% recognition rate of hand-gesture input from a database of seven hand gestures collected from 15 participants. To our knowledge, CyclopsRing is the first ring-wearable device that supports whole-hand and context-aware interactions.
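
The gesture recognizer mentioned above is a random decision forest. As a rough, hypothetical illustration of such a pipeline (not the authors' implementation, and with placeholder features and synthetic data), the sketch below trains a scikit-learn random forest on flattened, downsampled frames and reports held-out accuracy.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Placeholder data: 700 fake 32x32 grayscale fisheye frames, 7 gesture classes
rng = np.random.default_rng(0)
frames = rng.random((700, 32, 32))
labels = rng.integers(0, 7, size=700)

# Flattened pixels as features; the actual system likely uses richer features
X = frames.reshape(len(frames), -1)
X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.2, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```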


human factors in computing systems | 2010

iCon: utilizing everyday objects as additional, auxiliary and instant tabletop controllers

Kai-Yin Cheng; Rong-Hao Liang; Bing-Yu Chen; Rung-Huei Liang; Sy-Yen Kuo

This work describes a novel approach to utilizing users' everyday objects as additional, auxiliary, and instant tabletop controllers. Based on this approach, a prototype platform, called iCon, is developed to explore the possible designs. Field studies and user studies reveal that utilizing everyday objects as auxiliary input devices might be appropriate under multi-task scenarios. User studies further demonstrate that daily objects are generally applicable in circumstances of low precision, low engagement with the selected objects, and medium-to-high frequency of use. The proposed approach allows users to interact with computers without altering their original work environments.


human factors in computing systems | 2016

DigitSpace: Designing Thumb-to-Fingers Touch Interfaces for One-Handed and Eyes-Free Interactions

Da-Yuan Huang; Liwei Chan; Shuo Yang; Fan Wang; Rong-Hao Liang; De-Nian Yang; Yi-Ping Hung; Bing-Yu Chen

Thumb-to-fingers interfaces augment touch widgets on the fingers, which are manipulated by the thumb. Such interfaces are ideal for one-handed, eyes-free input since touch widgets on the fingers enable easy access by the stylus thumb. This study presents DigitSpace, a thumb-to-fingers interface that addresses two ergonomic factors: hand anatomy and touch precision. Hand anatomy restricts the possible movements of the thumb, which further influences physical comfort during the interactions. Touch precision is a human factor that determines how precisely users can manipulate touch widgets set on the fingers, which in turn determines effective layouts of the widgets. Buttons and touchpads were considered in our studies to enable discrete and continuous input in an eyes-free manner. The first study explores the regions of the fingers where the interactions can be comfortably performed. Based on these comfort regions, the second and third studies explore effective layouts for button and touchpad widgets. The experimental results indicate that participants could discriminate at least 16 buttons on their fingers. For the touchpad, participants were asked to perform unistrokes. Our results revealed that, since individual participants performed coherent writing behaviors, personalized recognizers could offer 92% accuracy on a cross-finger touchpad. A series of design guidelines are proposed for designers, and a DigitSpace prototype that uses magnetic-tracking methods is demonstrated.
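
The unistroke result above suggests a per-user, template-based recognizer. The paper's recognizer is not specified here, so the sketch below shows a generic nearest-template unistroke matcher (resample, normalize, compare); the template strokes and labels are invented for the example.

```python
import numpy as np

def resample(points, n=32):
    """Resample a stroke (list of (x, y)) to n evenly spaced points."""
    pts = np.asarray(points, dtype=float)
    seg = np.linalg.norm(np.diff(pts, axis=0), axis=1)
    t = np.concatenate([[0], np.cumsum(seg)])
    t /= t[-1]
    ts = np.linspace(0, 1, n)
    return np.column_stack([np.interp(ts, t, pts[:, 0]),
                            np.interp(ts, t, pts[:, 1])])

def normalize(stroke):
    """Translate to the centroid and scale to unit size."""
    s = stroke - stroke.mean(axis=0)
    scale = np.abs(s).max()
    return s / scale if scale > 0 else s

def classify(stroke, templates):
    """Return the label of the nearest stored (per-user) template."""
    q = normalize(resample(stroke))
    dists = {label: np.linalg.norm(q - normalize(resample(t)))
             for label, t in templates.items()}
    return min(dists, key=dists.get)

# Hypothetical per-user templates for two unistroke letters
templates = {
    "L": [(0, 0), (0, 2), (1, 2)],
    "V": [(0, 0), (1, 2), (2, 0)],
}
print(classify([(0, 0.1), (0.1, 2.1), (1.1, 2.0)], templates))
```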


human factors in computing systems | 2014

GaussBricks: magnetic building blocks for constructive tangible interactions on portable displays

Rong-Hao Liang; Liwei Chan; Hung-Yu Tseng; Han-Chih Kuo; Da-Yuan Huang; De-Nian Yang; Bing-Yu Chen

This work describes a novel building block system for tangible interaction design, GaussBricks, which enables real-time constructive tangible interactions on portable displays. Given its simplicity, the mechanical design of the magnetic building blocks facilitates the construction of configurable forms. The forms constructed from the magnetic building blocks, which are connected by magnetic joints, allow users to manipulate them stably with various elastic force-feedback mechanisms. With an analog Hall-sensor grid mounted on its back, a portable display determines the geometric configuration and detects various user interactions in real time. This work also introduces several methods to enable shape-changing, multi-touch input, and display capabilities in the construction. The proposed building block system enriches how individuals physically interact with portable displays.
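
One part of the sensing problem sketched above is recovering the geometric configuration of the joined blocks from the Hall-sensor grid. This is not the paper's algorithm; as a loose illustration under simplifying assumptions, the following sketch finds magnetic joints as local field maxima and greedily links nearby joints into a chain.

```python
import numpy as np
from scipy.ndimage import maximum_filter

def joint_positions(field, min_strength=0.5):
    """Find candidate magnetic-joint locations as local maxima of |field|."""
    mag = np.abs(field)
    peaks = (mag == maximum_filter(mag, size=3)) & (mag > min_strength)
    return np.argwhere(peaks)  # (row, col) grid indices

def link_chain(joints, max_gap=3.0):
    """Greedily link nearby joints into a chain approximating the built form."""
    joints = [tuple(j) for j in joints]
    chain = [joints.pop(0)]
    while joints:
        last = np.array(chain[-1])
        dists = [np.linalg.norm(last - np.array(j)) for j in joints]
        i = int(np.argmin(dists))
        if dists[i] > max_gap:
            break
        chain.append(joints.pop(i))
    return chain

# Synthetic field map with three joints roughly in a line
field = np.zeros((16, 16))
for r, c in [(4, 3), (6, 5), (8, 7)]:
    field[r, c] = 1.0
print(link_chain(joint_positions(field)))
```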


human factors in computing systems | 2013

GaussBits: magnetic tangible bits for portable and occlusion-free near-surface interactions

Rong-Hao Liang; Kai-Yin Cheng; Liwei Chan; Chuan-Xhyuan Peng; Mike Y. Chen; Rung-Huei Liang; De-Nian Yang; Bing-Yu Chen

We present GaussBits, a system of passive magnetic tangible designs that enables 3D tangible interactions in the near-surface space of portable displays. When a thin magnetic sensor grid is attached to the back of the display, the 3D position and partial 3D orientation of the GaussBits can be resolved by the proposed bi-polar magnetic field tracking technique. This portable platform can therefore enrich tangible interactions by extending the design space to the near-surface space. Since non-ferrous materials, such as the user's hand, do not occlude the magnetic field, interaction designers can freely incorporate a magnetic unit into an appropriately shaped non-ferrous object to exploit the metaphors of real-world tasks, and users can freely manipulate the GaussBits by hand or with other non-ferrous tools without causing interference. The presented example applications and the feedback collected from an explorative workshop revealed that this new approach is widely applicable.
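
The bi-polar magnetic field tracking mentioned above resolves a tangible's 3D position from the analog Hall-sensor grid. The paper's technique is not reproduced here; as a generic stand-in, the sketch below fits a point-dipole model to synthetic grid readings with SciPy least squares, using a hypothetical 8x8 sensor layout.

```python
import numpy as np
from scipy.optimize import least_squares

MU = 1.0  # arbitrary dipole strength for the synthetic example

def dipole_bz(pos, sensors):
    """Vertical field component of a z-oriented point dipole at `pos`,
    measured at each sensor location on the z = 0 plane (arbitrary units)."""
    r = sensors - pos                      # vectors from dipole to sensors
    d = np.linalg.norm(r, axis=1)
    return MU * (3 * r[:, 2] ** 2 - d ** 2) / d ** 5

# Hypothetical 8x8 sensor grid on the display back, 5 mm pitch, at z = 0
xs, ys = np.meshgrid(np.arange(8) * 5.0, np.arange(8) * 5.0)
sensors = np.column_stack([xs.ravel(), ys.ravel(), np.zeros(64)])

true_pos = np.array([17.0, 22.0, 12.0])            # magnet hovering 12 mm above
readings = dipole_bz(true_pos, sensors)
readings += np.random.default_rng(1).normal(0, 1e-6, readings.shape)  # noise

# Recover the 3D position by least-squares fitting of the dipole model
fit = least_squares(lambda p: dipole_bz(p, sensors) - readings,
                    x0=np.array([20.0, 20.0, 10.0]))
print("estimated position:", fit.x.round(2))
```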


user interface software and technology | 2014

GaussStones: shielded magnetic tangibles for multi-token interactions on portable displays

Rong-Hao Liang; Han-Chih Kuo; Liwei Chan; De-Nian Yang; Bing-Yu Chen

This work presents GaussStones, a system of shielded magnetic tangible designs for supporting multi-token interactions on portable displays. Unlike prior work on sensing magnetic tangibles on portable displays, the proposed tangible design applies magnetic shielding by using an inexpensive galvanized steel case, which eliminates interference between magnetic tangibles. An analog Hall-sensor grid can recognize the identity of each shielded magnetic unit, since each unit generates a magnetic field with a specific intensity distribution and/or polarization. Combining multiple units as a knob further allows for resolving additional identities and their orientations. Enabling these features improves support for applications involving multiple tokens, turning prevalent portable displays into generic platforms for tangible interaction design.
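
Each shielded unit above is identified from the intensity distribution and polarization of the field it leaves on the sensor grid. As a deliberately simplified, hypothetical illustration (not the paper's classifier), the sketch below matches the signed peak reading under a token against calibrated reference signatures.

```python
def identify_token(grid_readings, signatures, tolerance=0.15):
    """Identify a magnetic token from its signed peak field value.

    grid_readings : iterable of signed Hall readings under the token's footprint
                    (the sign encodes polarity)
    signatures    : dict mapping token id -> expected signed peak value,
                    obtained from a one-off calibration pass
    Returns the best-matching id, or None if nothing is within tolerance.
    """
    peak = max(grid_readings, key=abs)          # strongest signed reading
    best_id, best_err = None, float("inf")
    for token_id, ref in signatures.items():
        err = abs(peak - ref) / abs(ref)        # relative mismatch
        if err < best_err:
            best_id, best_err = token_id, err
    return best_id if best_err <= tolerance else None

# Hypothetical calibration: three token types with distinct strengths/polarities
signatures = {"A": 120.0, "B": 240.0, "C": -120.0}
print(identify_token([5, 18, 112, 40, 9], signatures))   # -> "A"
print(identify_token([-3, -110, -47], signatures))       # -> "C"
```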


human factors in computing systems | 2015

Cyclops: Wearable and Single-Piece Full-Body Gesture Input Devices

Liwei Chan; Chi-Hao Hsieh; Yi-Ling Chen; Shuo Yang; Da-Yuan Huang; Rong-Hao Liang; Bing-Yu Chen

This paper presents Cyclops, a single-piece wearable device that sees its user's whole-body postures through an ego-centric view obtained by a fisheye lens at the center of the user's body, allowing it to see only the user's limbs and interpret body postures effectively. Unlike currently available body-gesture input systems that depend on external cameras or motion sensors distributed across the user's body, Cyclops is a single-piece wearable device that is worn as a pendant or a badge. The main idea proposed in this paper is the observation of the limbs from a central location on the body. Owing to the ego-centric view, Cyclops turns posture recognition into a highly controllable computer-vision problem. This paper demonstrates a proof-of-concept device and an algorithm for recognizing static and moving bodily gestures based on motion history images (MHI) and a random decision forest (RDF). Four example applications are presented: an interactive bodily workout, a mobile racing game that involves hands and feet, a full-body virtual reality system, and interaction with a tangible toy. The experiment on the bodily workout demonstrates that, from a database of 20 body-workout gestures collected from 20 participants, Cyclops achieved a recognition rate of 79% using MHI and simple template matching, which increased to 92% with the more advanced machine learning approach of RDF.
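
The moving-gesture recognition above builds on motion history images (MHI) followed by template matching or an RDF classifier. MHI is a standard computer-vision technique rather than the authors' code; the sketch below computes one with NumPy and OpenCV over synthetic frames, with thresholds and frame sizes chosen arbitrarily.

```python
import numpy as np
import cv2

def motion_history(frames, tau=15, thresh=30):
    """Compute a motion history image over a list of grayscale frames.

    Pixels that changed recently are bright; older motion fades away.
    tau    : number of frames over which motion persists
    thresh : per-pixel difference threshold for 'motion'
    """
    mhi = np.zeros(frames[0].shape, dtype=np.float32)
    prev = frames[0]
    for t, frame in enumerate(frames[1:], start=1):
        moving = cv2.absdiff(frame, prev) > thresh
        mhi[moving] = t                      # stamp the newest motion
        mhi[~moving & (mhi < t - tau)] = 0   # forget motion older than tau
        prev = frame
    # Normalize to [0, 1] for template matching / feature extraction
    t_last = len(frames) - 1
    return np.clip((mhi - (t_last - tau)) / tau, 0, 1)

# Placeholder input: 30 synthetic 64x64 frames with a square sliding right
rng = np.random.default_rng(2)
frames = []
for t in range(30):
    f = (rng.random((64, 64)) * 20).astype(np.uint8)
    f[20:30, t:t + 10] = 200
    frames.append(f)

mhi = motion_history(frames)
print(mhi.shape, float(mhi.max()))
```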


international conference on computer graphics and interactive techniques | 2011

SonarWatch: appropriating the forearm as a slider bar

Rong-Hao Liang; Shu-Yang Lin; Chao-Huai Su; Kai-Yin Cheng; Bing-Yu Chen; De-Nian Yang

The human body has recently become an emerging type of human-computer interface. Not only is our skin a surface that is always available and highly accessible, but our sense of how our body is configured in space also allows us to interact with our bodies accurately in an eyes-free manner. Hence, this input method is suitable for extending the interaction space of mobile devices [Harrison et al. 2010] or for providing more degrees of freedom to enhance gaming experiences, as with the Kinect. Nevertheless, since the additional gesture detector may be obtrusive or not portable enough for users, this approach can hardly be applied in everyday life.


user interface software and technology | 2015

FlexiBend: Enabling Interactivity of Multi-Part, Deformable Fabrications Using Single Shape-Sensing Strip

Chin-yu Chien; Rong-Hao Liang; Long-Fei Lin; Liwei Chan; Bing-Yu Chen


Collaboration


Dive into Rong-Hao Liang's collaborations.

Top Co-Authors

Bing-Yu Chen, National Taiwan University
Liwei Chan, National Chiao Tung University
Kai-Yin Cheng, National Taiwan University
Han-Chih Kuo, National Taiwan University
Chao-Huai Su, National Taiwan University
Da-Yuan Huang, National Taiwan University
Chien-Ting Weng, National Taiwan University
Hung-Yu Tseng, National Taiwan University
Mike Y. Chen, National Taiwan University