Network


Latest external collaborations at the country level. Dive into the details by clicking on the dots.

Hotspot


Dive into the research topics where Thuong N. Hoang is active.

Publication


Featured research published by Thuong N. Hoang.


Australasian Computer-Human Interaction Conference | 2015

In Bed with Technology: Challenges and Opportunities for Sleep Tracking

Wanyu Liu; Bernd Ploderer; Thuong N. Hoang

In recent years, a variety of mobile apps, wearable technologies, and embedded systems have emerged that allow individuals to track the amount and quality of their sleep in their own beds. Despite the widespread adoption of these technologies, little is known about the challenges that current users face in tracking and analysing their sleep. Hence, we conducted a qualitative study to examine the practices of current users of sleep-tracking technologies and to identify challenges in current practice. Based on data collected from 5 online forums for users of sleep-tracking technologies, we identified 22 different challenges under 4 themes: tracking continuity, trust, data manipulation, and data interpretation. Based on these results, we propose 6 design opportunities to assist researchers and practitioners in designing sleep-tracking technologies.


International Symposium on Wearable Computers | 2010

Augmented Viewport: An action at a distance technique for outdoor AR using distant and zoom lens cameras

Thuong N. Hoang; Bruce H. Thomas

In this paper, we describe the Augmented Viewport, a new action-at-a-distance technique for outdoor augmented reality using wearable computers. The Augmented Viewport is inspired by the virtual viewport window widely used in virtual reality systems. Our technique utilizes a physical camera, which is either placed at a distant location or can zoom closer to one, to provide the physical-world imagery for the viewport window. Using the augmented viewport, the user can perform selection and affine transformations of distant virtual objects with techniques that are more effective at close range, such as ray casting or image-plane techniques. We conducted a user study whose results show that the augmented viewport technique enhances the precision of object manipulations and reduces time and effort.


Designing Interactive Systems | 2016

Doctor, Can You See My Squats?: Understanding Bodily Communication in Video Consultations for Physiotherapy

Deepti Aggarwal; Bernd Ploderer; Frank Vetere; Mark Bradford; Thuong N. Hoang

This paper investigates the challenges of bodily communication during video-based clinical consultations. While previous work describes the lack of eye contact and gestures over video, it is unclear how these limitations affect the course of a clinical consultation, particularly in a domain like physiotherapy where the focus is on improving body movements and functioning. To contribute to this understanding, we conducted observations of 10 naturally occurring video and face-to-face consultations for physiotherapy. We found that clinicians rely on a variety of incidental bodily cues and fine details of body movements to assess and examine the patient. These bodily cues were noticeable during face-to-face consultations; over video, however, many of them were missed. Consequently, video consultations became conversational, with clinicians relying on verbal conduct to gain a fair understanding of the patient's health. To guide the design of future video consultation systems, we distill our understanding into 4 design sensitivities: Visual Acuity, Field-of-View, Clinical Asymmetries, and Time Sequence.


International Symposium on Mixed and Augmented Reality | 2013

Passive Deformable Haptic glove to support 3D interactions in mobile augmented reality environments

Thuong N. Hoang; Ross T. Smith; Bruce H. Thomas

We present a passive deformable haptic (PDH) glove to enhance mobile immersive augmented reality manipulation with a sense of computer-captured touch, responding to objects in the physical environment. We extend our existing pinch glove design with a Digital Foam sensor placed under the palm of the hand. The novel glove input device supports a range of touch-activated, precise, direct manipulation modeling techniques with tactile feedback, including hole punching, trench cutting, and chamfer creation. The PDH glove improves a user's task performance time, decreases error rates and erroneous hand movements, and reduces fatigue.


International Symposium on Mixed and Augmented Reality | 2012

Distance-based modeling and manipulation techniques using ultrasonic gloves

Thuong N. Hoang; Bruce H. Thomas

We present a set of distance-based interaction techniques for modeling and manipulation, enabled by a new input device called the ultrasonic gloves. The ultrasonic gloves build upon the original pinch glove design for virtual reality systems, adding a tilt sensor and a pair of ultrasonic transducers in the palms of the gloves. The transducers are distance-ranging sensors that allow the user to specify a range of distances through natural gestures, such as facing the palms towards each other or towards other surfaces. The user can create virtual models of physical objects by specifying their dimensions with hand gestures. We combine the reported distance with the tilt orientation data to construct virtual models. We also map the distance data to a set of affine transformation techniques, including relative and fixed scaling, translation, and rotation. Our techniques can be generalized to different sensor technologies.
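As background on how such distance-ranging sensors work (a generic sketch, not the authors' implementation): ultrasonic rangefinders typically derive distance from an echo's round-trip time of flight, and the reported distance could then be mapped to, say, a relative scale factor for a scaling gesture. The function names below are hypothetical.

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def distance_from_echo(time_of_flight_s: float) -> float:
    """Distance to a surface from a round-trip ultrasonic echo time.

    The pulse travels out and back, so the one-way distance is half
    the total path length.
    """
    return SPEED_OF_SOUND * time_of_flight_s / 2.0

def relative_scale(d_start: float, d_end: float) -> float:
    """Map a change in palm separation to a relative scale factor,
    as one might for a relative scaling technique."""
    return d_end / d_start

# A round trip of ~5.83 ms corresponds to roughly 1 m between the palms.
d = distance_from_echo(0.00583)
```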


Human Factors in Computing Systems | 2017

Augmented Studio: Projection Mapping on Moving Body for Physiotherapy Education

Thuong N. Hoang; Martin Reinoso; Zaher Joukhadar; Frank Vetere; David Kelly

Physiotherapy students often struggle to translate anatomical knowledge from textbooks into a dynamic understanding of the mechanics of body movements in real-life patients. We present Augmented Studio, an augmented reality system that uses body tracking to project anatomical structures and annotations over moving bodies for physiotherapy education. Through a user- and learner-centered design approach, we established that augmented reality technology can enhance physiotherapy education through augmentation and annotation. Augmented Studio uses projection mapping to display anatomical information, such as muscles and the skeleton, on the body in real time as it moves. We also created an annotation technique that projects hand-drawn sketches onto the moving body, enabling explicit communication of the teacher's clinical reasoning strategies to the students. Findings from our pilot usability study demonstrate a more engaging learning and teaching experience and increased communication between teacher and students when using Augmented Studio.
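The core projection-mapping step can be sketched generically: a tracked 3D joint position is reprojected into projector pixel coordinates through a calibrated pinhole model. The intrinsic matrix `K` and the identity extrinsics below are hypothetical placeholders for illustration, not Augmented Studio's actual calibration.

```python
import numpy as np

# Hypothetical projector intrinsics: focal length 1000 px,
# principal point at the center of a 1280x720 image.
K = np.array([[1000.0,    0.0, 640.0],
              [   0.0, 1000.0, 360.0],
              [   0.0,    0.0,   1.0]])

# Hypothetical extrinsics: projector frame coincides with the
# tracker frame (identity rotation, zero translation).
Rt = np.hstack([np.eye(3), np.zeros((3, 1))])
P = K @ Rt  # 3x4 projection matrix

def project(P: np.ndarray, xyz: np.ndarray) -> np.ndarray:
    """Project a 3D point (meters, tracker frame) to projector pixels."""
    uvw = P @ np.append(xyz, 1.0)  # homogeneous coordinates
    return uvw[:2] / uvw[2]       # perspective divide

# A joint at (0.1, 0.2, 2.0) m lands at pixel (690, 460).
uv = project(P, np.array([0.1, 0.2, 2.0]))
```

In a real system the extrinsics come from projector-camera calibration, and the projection is re-evaluated every frame as the tracked skeleton moves.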


International Symposium on Mixed and Augmented Reality | 2013

3D interactions with a passive deformable haptic glove

Thuong N. Hoang; Ross T. Smith; Bruce H. Thomas

This paper explores enhancing mobile immersive augmented reality manipulations by providing a sense of computer-captured touch through a passive deformable haptic glove that responds to objects in the physical environment. The glove extends our existing pinch glove design with a Digital Foam sensor placed under the palm of the hand. The novel glove input device supports a range of touch-activated, precise, direct manipulation modeling techniques with tactile feedback, including hole cutting, trench cutting, and chamfer creation. A user evaluation study comparing an image-plane approach to our passive deformable haptic glove showed that the glove improves a user's task performance time, decreases error rates and erroneous hand movements, and reduces fatigue.


Human Factors in Computing Systems | 2017

SoPhy: A Wearable Technology for Lower Limb Assessment in Video Consultations of Physiotherapy

Deepti Aggarwal; Weiyi Zhang; Thuong N. Hoang; Bernd Ploderer; Frank Vetere; Mark Bradford

Physiotherapists are increasingly using video conferencing tools for their teleconsultations. Yet, the assessment of subtle differences in body movements remains a challenge. To support lower limb assessment in video consultations, we present SoPhy, a wearable technology consisting of a pair of socks with embedded sensors for patients to wear, and a web interface that displays information about weight distribution, foot movement, and foot orientation to physiotherapists in real time. We conducted a laboratory study of 40 video consultations in which postgraduate physiotherapy students assessed lower limb function, comparing assessments with and without SoPhy. Findings show that SoPhy increased confidence in assessing the squat exercise and that fewer repetitions were required to assess patients when using SoPhy. We discuss the significance of SoPhy in addressing the challenges of assessing bodily information over video and present considerations for its integration with clinical practices and tools.


Nordic Conference on Human-Computer Interaction | 2016

Onebody: Remote Posture Guidance System using First Person View in Virtual Environment

Thuong N. Hoang; Martin Reinoso; Frank Vetere; Egemen Tanin

We present Onebody, a virtual reality system for remote posture guidance during sports or physical activity training, such as martial arts, yoga, or dance, using a first-person perspective. The system uses skeletal tracking of the instructor and the students, rendered as virtual avatars. Using a virtual reality headset, the student can visualise the movements of the instructor's avatar, rendered in place of their own body. Onebody provides a first-person perspective of the movement instruction, allowing the student to step into the instructor's body. We conducted a study comparing Onebody, in terms of posture matching accuracy and user preference, with existing techniques for delivering movement instructions, including pre-recorded video, video conferencing, and third-person-view virtual reality. The results indicated that Onebody offers better posture accuracy in delivering movement instructions.
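One generic way to quantify posture matching accuracy with skeletal tracking (a sketch for illustration; the metric and joint data below are hypothetical, not Onebody's actual evaluation method) is the mean per-joint Euclidean distance between the student's and the instructor's skeletons:

```python
import numpy as np

def posture_error(student: np.ndarray, instructor: np.ndarray) -> float:
    """Mean per-joint Euclidean distance between two skeletons.

    Both arrays have shape (n_joints, 3), with matching joint order
    and a shared coordinate frame (e.g. after aligning pelvis origins).
    """
    return float(np.mean(np.linalg.norm(student - instructor, axis=1)))

# Two-joint toy skeletons: one joint offset by 10 cm, one matching exactly.
student_joints = np.array([[0.0, 0.0, 0.0],
                           [1.0, 0.0, 0.0]])
instructor_joints = np.array([[0.0, 0.0, 0.1],
                              [1.0, 0.0, 0.0]])
err = posture_error(student_joints, instructor_joints)  # 5 cm on average
```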


International Symposium on Wearable Computers | 2009

Web 2.0 Meets Wearable Augmented Reality

Thuong N. Hoang; Shane R. Porter; Benjamin Close; Bruce H. Thomas

This paper explores how a wearable computer with an augmented reality interface can provide real-time contextual interactions based on location-aware Web 2.0 social network information.

Collaboration


Dive into Thuong N. Hoang's collaborations.

Top Co-Authors

Bruce H. Thomas
University of South Australia

Bernd Ploderer
Queensland University of Technology

Frank Vetere
University of Melbourne

Mark Bradford
Royal Children's Hospital

Ross T. Smith
University of South Australia

Steven Baker
University of Melbourne