Da-Yuan Huang
National Taiwan University
Publication
Featured research published by Da-Yuan Huang.
human factors in computing systems | 2016
Da-Yuan Huang; Liwei Chan; Shuo Yang; Fan Wang; Rong-Hao Liang; De-Nian Yang; Yi-Ping Hung; Bing-Yu Chen
Thumb-to-fingers interfaces augment touch widgets on fingers, which are manipulated by the thumb. Such interfaces are ideal for one-handed eyes-free input since touch widgets on the fingers enable easy access by the stylus thumb. This study presents DigitSpace, a thumb-to-fingers interface that addresses two ergonomic factors: hand anatomy and touch precision. Hand anatomy restricts the possible movements of the thumb, which further influences physical comfort during the interactions. Touch precision is a human factor that determines how precisely users can manipulate touch widgets set on fingers, which in turn determines effective layouts of the widgets. Buttons and touchpads were considered in our studies to enable discrete and continuous input in an eyes-free manner. The first study explores the regions of the fingers where interactions can be comfortably performed. Based on these comfort regions, the second and third studies explore effective layouts for button and touchpad widgets. The experimental results indicate that participants could discriminate at least 16 buttons on their fingers. For the touchpad, participants were asked to perform unistrokes. Our results revealed that, since each participant exhibited consistent writing behavior, personalized $1 recognizers could offer 92% accuracy on a cross-finger touchpad. A series of design guidelines are proposed for designers, and a DigitSpace prototype that uses magnetic-tracking methods is demonstrated.
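The personalized unistroke recognition above refers to the $1 recognizer, a standard template matcher in HCI. Below is a minimal Python sketch of its core pipeline (resample, rotate, scale, nearest-template match); the matcher skips $1's golden-section search over candidate rotations, and the template set and stroke data are illustrative, not from the paper.

```python
import math

N = 64  # points per resampled stroke (standard $1 choice)

def resample(pts, n=N):
    """Step 1: resample a stroke into n equally spaced points."""
    total = sum(math.dist(a, b) for a, b in zip(pts, pts[1:]))
    interval, out, acc, prev = total / (n - 1), [pts[0]], 0.0, pts[0]
    for p in pts[1:]:
        d = math.dist(prev, p)
        while d > 0 and acc + d >= interval:
            t = (interval - acc) / d
            q = (prev[0] + t * (p[0] - prev[0]),
                 prev[1] + t * (p[1] - prev[1]))
            out.append(q)
            d -= interval - acc
            acc, prev = 0.0, q
        acc += d
        prev = p
    while len(out) < n:  # guard against floating-point shortfall
        out.append(pts[-1])
    return out[:n]

def normalize(pts):
    """Steps 2-4: rotate so the centroid-to-first-point line is at 0
    radians, scale to a unit box, translate the centroid to the origin."""
    cx = sum(x for x, _ in pts) / len(pts)
    cy = sum(y for _, y in pts) / len(pts)
    theta = -math.atan2(pts[0][1] - cy, pts[0][0] - cx)
    cos, sin = math.cos(theta), math.sin(theta)
    rot = [((x - cx) * cos - (y - cy) * sin,
            (x - cx) * sin + (y - cy) * cos) for x, y in pts]
    xs, ys = [p[0] for p in rot], [p[1] for p in rot]
    w = (max(xs) - min(xs)) or 1.0
    h = max(ys) - min(ys)
    if h < 1e-6:  # nearly 1-D strokes: avoid amplifying noise
        h = 1.0
    scaled = [(x / w, y / h) for x, y in rot]
    cx = sum(x for x, _ in scaled) / len(scaled)
    cy = sum(y for _, y in scaled) / len(scaled)
    return [(x - cx, y - cy) for x, y in scaled]

def recognize(stroke, templates):
    """Step 5: pick the template with the smallest mean point distance."""
    cand = normalize(resample(stroke))
    return min(templates, key=lambda name: sum(
        math.dist(a, b) for a, b in zip(cand, templates[name])) / N)

# Personalization: one user-provided example per symbol becomes a template.
line = [(i / 10, 0.0) for i in range(11)]
vee = [(i / 10, abs(i / 10 - 0.5)) for i in range(11)]
templates = {"line": normalize(resample(line)), "vee": normalize(resample(vee))}
stroke = [(0.0, 0.52), (0.26, 0.24), (0.5, 0.0), (0.74, 0.26), (1.0, 0.5)]
print(recognize(stroke, templates))  # -> vee
```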
human factors in computing systems | 2014
Rong-Hao Liang; Liwei Chan; Hung-Yu Tseng; Han-Chih Kuo; Da-Yuan Huang; De-Nian Yang; Bing-Yu Chen
This work describes GaussBricks, a novel building-block system for tangible interaction design that enables real-time constructive tangible interaction on portable displays. Given its simplicity, the mechanical design of the magnetic building blocks facilitates the construction of configurable forms. A form constructed from the magnetic building blocks, which are connected by magnetic joints, can be manipulated stably and offers various elastic force-feedback mechanisms. With an analog Hall-sensor grid mounted on its back, a portable display determines the geometrical configuration of the construction and detects various user interactions in real time. This work also introduces several methods to enable shape changing, multi-touch input, and display capabilities in the construction. The proposed building-block system enriches how individuals physically interact with portable displays.
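As a rough illustration of how an analog Hall-sensor grid could localize magnetic joints, the sketch below thresholds a grid snapshot and keeps local maxima. The grid size, units, and threshold are assumptions for illustration, not the paper's actual sensing pipeline.

```python
import numpy as np

def detect_magnets(field, threshold=0.25):
    """Find local field-strength peaks on an analog Hall-sensor grid.

    `field` is a 2D array of sensor readings (arbitrary units); returns
    the (row, col) cells that exceed `threshold` and dominate their
    4-neighbourhood, i.e. likely magnetic-joint locations.
    """
    peaks = []
    rows, cols = field.shape
    for r in range(rows):
        for c in range(cols):
            v = field[r, c]
            if v < threshold:
                continue
            neigh = [field[rr, cc]
                     for rr, cc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1))
                     if 0 <= rr < rows and 0 <= cc < cols]
            if all(v >= nv for nv in neigh):
                peaks.append((r, c))
    return peaks

# Example: an 8x8 snapshot with two magnetic joints near the surface
snapshot = np.zeros((8, 8))
snapshot[2, 3] = 0.9  # joint 1
snapshot[5, 6] = 0.7  # joint 2
print(detect_magnets(snapshot))  # -> [(2, 3), (5, 6)]
```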
human computer interaction with mobile devices and services | 2013
Neng-Hao Yu; Da-Yuan Huang; Jia-Jyun Hsu; Yi-Ping Hung
Current touch-based UIs commonly employ regions near the corners and/or edges of the display to accommodate essential functions. As the screen size of mobile phones keeps increasing, such regions become relatively distant from the thumb and hard to reach in single-handed use. In this paper, we present two techniques, CornerSpace and BezelSpace, designed to give quick access to screen targets outside the thumb's normal interactive range. Our techniques automatically determine the thumb's physical comfort zone and require only minimal thumb movement to reach distant targets on the edge of the screen. A controlled experiment shows that BezelSpace is significantly faster and more accurate. Moreover, both techniques are application-independent and instantly accommodate either hand, left or right.
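A toy sketch of the underlying idea: a touch inside an automatically determined comfort zone is remapped across the full screen, so distant edge targets need only small thumb motion. The linear transfer function and the rectangle values are assumptions, not the paper's actual mapping.

```python
def remap_to_edge(touch, comfort, screen):
    """Map a thumb position inside the comfort zone to a point anywhere
    on screen, so small thumb motion can reach distant edge targets.

    `comfort` and `screen` are (x, y, w, h) rectangles in pixels.
    """
    cx, cy, cw, ch = comfort
    sx, sy, sw, sh = screen
    u = (touch[0] - cx) / cw           # 0..1 across the comfort zone
    v = (touch[1] - cy) / ch
    return (sx + u * sw, sy + v * sh)  # same fractions across the screen

# Hypothetical 1080x2280 phone; comfort zone near the lower-right corner
screen = (0, 0, 1080, 2280)
comfort = (540, 1700, 500, 500)
print(remap_to_edge((590, 1750), comfort, screen))  # -> (108.0, 228.0)
```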
human factors in computing systems | 2015
Liwei Chan; Chi-Hao Hsieh; Yi-Ling Chen; Shuo Yang; Da-Yuan Huang; Rong-Hao Liang; Bing-Yu Chen
This paper presents Cyclops, a single-piece wearable device that sees its user's whole-body postures through an ego-centric view obtained from a fisheye lens at the center of the user's body, allowing it to see only the user's limbs and interpret body postures effectively. Unlike currently available body-gesture input systems that depend on external cameras or motion sensors distributed across the user's body, Cyclops is a single-piece wearable device that is worn as a pendant or a badge. The main idea proposed in this paper is the observation of limbs from a central location on the body. Owing to the ego-centric view, Cyclops turns posture recognition into a highly controllable computer vision problem. This paper demonstrates a proof-of-concept device and an algorithm for recognizing static and moving bodily gestures based on motion history images (MHI) and a random decision forest (RDF). Four example applications are presented: interactive bodily workout, a mobile racing game that involves hands and feet, a full-body virtual reality system, and interaction with a tangible toy. The bodily-workout experiment demonstrates that, on a database of 20 body-workout gestures collected from 20 participants, Cyclops achieved a recognition rate of 79% using MHI and simple template matching, which increased to 92% with the more advanced machine learning approach of RDF.
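A minimal sketch of the motion history image stage mentioned above: pixels that change between frames are stamped with the current time, and stale traces decay after a fixed window. The nearest-template matcher here stands in for the paper's template-matching and RDF classifiers; the frame size, thresholds, and gesture names are illustrative assumptions.

```python
import numpy as np

TAU = 15    # frames a motion trace persists in the MHI
DELTA = 30  # per-pixel intensity change treated as motion

def update_mhi(mhi, prev_frame, frame, t):
    """Classic MHI update: moving pixels get timestamp t; traces older
    than TAU frames are cleared."""
    motion = np.abs(frame.astype(int) - prev_frame.astype(int)) > DELTA
    mhi[motion] = t
    mhi[~motion & (mhi < t - TAU)] = 0
    return mhi

def match(mhi, templates):
    """Nearest-template gesture label by L2 distance between normalized
    MHI vectors (a stand-in for the paper's classifiers)."""
    v = mhi.ravel() / (np.linalg.norm(mhi) + 1e-9)
    def dist(t):
        return np.linalg.norm(v - t / (np.linalg.norm(t) + 1e-9))
    return min(templates, key=lambda k: dist(templates[k]))

# Toy 64x64 grayscale stream standing in for the fisheye camera feed
rng = np.random.default_rng(0)
frames = rng.integers(0, 256, size=(30, 64, 64), dtype=np.uint8)
mhi = np.zeros((64, 64))
for t, (a, b) in enumerate(zip(frames, frames[1:]), start=1):
    mhi = update_mhi(mhi, a, b, t)
templates = {"jumping-jack": rng.random(64 * 64), "squat": rng.random(64 * 64)}
print(match(mhi, templates))
```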
mobile and ubiquitous multimedia | 2012
Da-Yuan Huang; Chien-Pang Lin; Yi-Ping Hung; Tzuwen Chang; Neng-Hao Yu; Min-Lun Tsai; Mike Y. Chen
Most mobile games are designed for users to focus only on their own screens, and thus lack face-to-face interaction even when users are sitting together. Prior work shows that a shared information space created by multiple mobile devices can encourage users to communicate with each other naturally. The aim of this work is to provide a fluent view-stitching technique that lets mobile phone users establish a shared view. We present MagMobile, a new spatial interaction technique that allows users to stitch views by simply putting multiple mobile devices close to each other. We describe the design of the spatial-aware sensor module, which is low-cost and easy to integrate into phones. We also propose two collaborative games that encourage social interaction in co-located settings.
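One plausible reading of a low-cost spatial-aware sensor module is magnetic proximity sensing. The sketch below guesses which screen edge a neighboring device is docked against from an in-plane magnetometer reading; the magnet placement, units, and threshold are assumptions and may differ from the actual MagMobile hardware.

```python
import math

def docked_edge(mag_x, mag_y, baseline=30.0):
    """Guess which screen edge a neighbouring device sits against from
    the in-plane magnetic field (microtesla). Returns None when the
    field is too weak to indicate a nearby device."""
    strength = math.hypot(mag_x, mag_y)
    if strength < baseline:
        return None  # no device nearby
    angle = math.degrees(math.atan2(mag_y, mag_x)) % 360
    for edge, center in (("right", 0), ("top", 90), ("left", 180), ("bottom", 270)):
        if min(abs(angle - center), 360 - abs(angle - center)) <= 45:
            return edge

print(docked_edge(55.0, 8.0))  # -> right
print(docked_edge(4.0, 3.0))   # -> None
```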
user interface software and technology | 2017
Teng Han; Qian Han; Michelle Annett; Fraser Anderson; Da-Yuan Huang; Xing-Dong Yang
Smart rings have a unique form factor suitable for many applications; however, they offer little opportunity to provide the user with natural output. We propose passive kinesthetic force feedback as a novel output method for rotational input on smart rings. With this new output channel, friction force profiles can be designed, programmed, and felt by a user when they rotate the ring. This modality enables new interactions for ring form factors. We demonstrate the potential of this new haptic output method through Frictio, a prototype smart ring. In a controlled experiment, we determined the recognizability of six force profiles: Hard Stop, Ramp-Up, Ramp-Down, Resistant Force, Bump, and No Force. The results showed that participants could distinguish between the force profiles with 94% accuracy. We conclude by presenting a set of novel interaction techniques that Frictio enables, and discuss insights and directions for future research.
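The six force profiles lend themselves to a simple angle-to-brake mapping. Below is a hypothetical helper that returns a normalized brake level for a given rotation angle; only the profile names come from the abstract, while the curve shapes and the 180-degree travel limit are assumptions.

```python
def friction(profile, angle, limit=180.0):
    """Return a normalized brake level (0 = free, 1 = fully locked) for
    a ring rotation angle in degrees, per force profile."""
    x = min(max(angle / limit, 0.0), 1.0)  # progress through the turn
    if profile == "no_force":
        return 0.0
    if profile == "resistant":
        return 0.5  # constant drag throughout the rotation
    if profile == "ramp_up":
        return x
    if profile == "ramp_down":
        return 1.0 - x
    if profile == "bump":
        return 1.0 if 0.45 <= x <= 0.55 else 0.0
    if profile == "hard_stop":
        return 1.0 if x >= 1.0 else 0.0
    raise ValueError(profile)

for a in (0, 90, 180):
    print(a, friction("ramp_up", a), friction("bump", a))
```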
international conference on computer graphics and interactive techniques | 2015
Ping-Hsuan Han; Da-Yuan Huang; Hsin-Ruey Tsai; Po-Chang Chen; Chen-Hsin Hsieh; Kuan-Ying Lu; De-Nian Yang; Yi-Ping Hung
With recent advances in wearable I/O devices, designers of immersive VR systems can provide users with many different ways to explore virtual space. For example, Birdly [Rheiner 2014] is a flying simulator combining visual, auditory, and smell feedback that gives the user a compelling experience of flying in the sky. SpiderVision adopts a non-see-through head-mounted display (HMD) and two cameras facing opposite directions to give the user front-and-back vision [Fan et al. 2014]. Although HMDs have recently become quite popular, moving around in a virtual space is not as easy as looking around in one, mainly because position tracking is more complicated than orientation tracking with state-of-the-art technologies. Our goal is to provide the user with the first-person perspective and experience of moving around in 3D space like a superhuman: jumping high, gliding, flying with a rope, teleporting, and so on, even without position-tracking technologies.
human computer interaction with mobile devices and services | 2016
Min-Chieh Hsiu; Chiuan Wang; Da-Yuan Huang; Jhe-Wei Lin; Yu-Chih Lin; De-Nian Yang; Yi-Ping Hung; Mike Y. Chen
Force sensing has been widely used to extend touch input from binary to multiple states, creating new capabilities for surface interaction. However, previously proposed force-sensing techniques mainly focus on enabling force-applied gestures on specific devices. This paper presents Nail+, a technique that uses fingernail deformation to enable force-touch interactions on everyday rigid surfaces. Our prototype, a 3x3 array of 0.2 mm strain sensors mounted on a fingernail, was implemented and evaluated in a 12-participant study on the feasibility of this sensing approach. Results showed that normal and force-applied tapping and swiping can be sensed with an average accuracy of 84.67%. We conclude with two example applications that use the Nail+ prototype to control the interfaces of head-mounted display (HMD) devices and remote screens.
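As an illustration of the sensing idea, the sketch below reduces a 3x3 fingernail strain snapshot to a single aggregate feature and thresholds it into normal versus force touch. The feature, threshold, and sample data are assumptions standing in for the paper's actual classification pipeline, which the abstract does not detail.

```python
import numpy as np

FORCE_THRESHOLD = 0.6  # illustrative; would be calibrated per user

def classify_touch(strain):
    """Label a tap as 'force' or 'normal' from a 3x3 fingernail strain
    snapshot (readings normalized to 0..1). A blend of mean and peak
    deformation serves as a one-number feature."""
    strain = np.asarray(strain, dtype=float)
    feature = 0.5 * strain.mean() + 0.5 * strain.max()
    return "force" if feature > FORCE_THRESHOLD else "normal"

light_tap = [[0.1, 0.2, 0.1], [0.2, 0.3, 0.2], [0.1, 0.2, 0.1]]
hard_press = [[0.5, 0.7, 0.5], [0.7, 0.9, 0.7], [0.5, 0.7, 0.5]]
print(classify_touch(light_tap))   # -> normal
print(classify_touch(hard_press))  # -> force
```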
user interface software and technology | 2017
Da-Yuan Huang; Ruizhen Guo; Jun Gong; Jingxian Wang; John M. Graham; De-Nian Yang; Xing-Dong Yang
The small screen size of a smartwatch limits the user experience when watching or interacting with media. We propose a supplementary tactile feedback system to enhance the user experience with a method unique to the smartwatch form factor. Our system has a deformable surface on the back of the watch face, allowing the visual scene on screen to extend into 2.5D physical space. This lets the user watch and feel virtual objects, such as experiencing a ball bouncing against the wrist. We devised two controlled experiments to analyze the influence of tactile display resolution on the illusion of virtual object presence. Our first study revealed that, on average, a taxel can render virtual objects between 70% and 138% of its own size without shattering the illusion. From the second study, we found that visual and haptic feedback can be separated by 4.5 mm to 16.2 mm for the tested taxels. Based on these results, we developed a prototype (called RetroShape) with 4x4 10 mm taxels driven by micro servo motors, and demonstrated its unique capability through a set of tactile-enhanced games and videos. A preliminary user evaluation showed that participants welcomed RetroShape as a useful addition to existing smartwatch output.
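A small sketch of how a 4x4 taxel display might render a virtual ball as per-taxel heights on the back of the watch: taxels under the ball rise following a spherical-cap profile. The 10 mm taxel size comes from the abstract; the 5 mm travel and the height mapping are assumptions, not the paper's renderer.

```python
import math

GRID, PITCH, MAX_MM = 4, 10.0, 5.0  # 4x4 taxels, 10 mm pitch; travel assumed

def taxel_heights(ball_x, ball_y, radius):
    """Height in mm for each taxel so a virtual ball is felt on the
    wrist. Coordinates are in mm over the 40x40 mm back surface."""
    heights = [[0.0] * GRID for _ in range(GRID)]
    for r in range(GRID):
        for c in range(GRID):
            tx, ty = (c + 0.5) * PITCH, (r + 0.5) * PITCH  # taxel centre
            d = math.hypot(tx - ball_x, ty - ball_y)
            if d < radius:  # taxel lies under the ball
                cap = math.sqrt(radius**2 - d**2) / radius  # 0..1 dome
                heights[r][c] = round(cap * MAX_MM, 2)
    return heights

# Ball centred on the display back, radius 14 mm
for row in taxel_heights(ball_x=20.0, ball_y=20.0, radius=14.0):
    print(row)
```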
human factors in computing systems | 2018
Meng-Ju Hsieh; Rong-Hao Liang; Da-Yuan Huang; Jheng-You Ke; Bing-Yu Chen