
Publications

Featured research published by Mark Billinghurst.


Annual Symposium on Computer-Human Interaction in Play | 2018

Emotion Sharing and Augmentation in Cooperative Virtual Reality Games

Jonathon D. Hart; Thammathip Piumsomboon; Louise Lawrence; Gun A. Lee; Ross T. Smith; Mark Billinghurst

We present preliminary findings from sharing and augmenting facial expressions in cooperative social Virtual Reality (VR) games. We implemented a prototype system for capturing and sharing facial expressions between VR players through their avatars. We describe our current prototype system and how it could be integrated into a system for enhancing the social VR experience. Two social VR games were created for a preliminary study. We discuss our findings from our pilot studies, potential games for this system, and future directions for this research.


Annual Symposium on Computer-Human Interaction in Play | 2018

Effects of Manipulating Physiological Feedback in Immersive Virtual Environments

Arindam Dey; Hao Chen; Mark Billinghurst; Robert W. Lindeman

Virtual environments have been proven effective at evoking emotions. Earlier research has found that physiological data is a valid measure of a user's emotional state, and that being able to see one's physiological feedback in a virtual environment makes the application more enjoyable. In this paper, we investigate the effects of manipulating the heart rate feedback provided to participants in a single-user immersive virtual environment. Our results show that providing slightly faster or slower real-time heart rate feedback can alter participants' emotions more than providing unmodified feedback. However, altering the feedback does not alter real physiological signals.


Annual Symposium on Computer-Human Interaction in Play | 2018

Demonstrating Emotion Sharing and Augmentation in Cooperative Virtual Reality Games

Jonathon D. Hart; Thammathip Piumsomboon; Louise Lawrence; Gun A. Lee; Ross T. Smith; Mark Billinghurst

For our demonstration, we present a prototype system for sharing and augmenting facial expressions in cooperative social Virtual Reality (VR) games. We created two social VR games, “Bomb Defusal” and “Island Survivor”, to demonstrate our system for capturing and sharing facial expressions between VR players through their avatars.


Computers & Graphics | 2018

Design considerations for combining augmented reality with intelligent tutors

Bradley Herbert; Barrett Ens; Amali Weerasinghe; Mark Billinghurst; Grant Wigley

Augmented Reality (AR) overlays virtual objects on the real world in real time and has the potential to enhance education; however, few AR training systems provide personalised learning support. Combining AR with intelligent tutoring systems (ITSs) has the potential to improve training outcomes by providing personalised learner support, such as feedback on the AR environment. This paper reviews the current state of AR training systems combined with ITSs and proposes a series of requirements for combining the two paradigms. It also identifies a growing need for more research on the design and implementation of adaptive augmented reality tutors (ARATs), including evaluating ARAT user interfaces and identifying domains where an ARAT might be effective.


ICAT-EGVE | 2017

Improving Collaboration in Augmented Video Conference using Mutually Shared Gaze

Gun A. Lee; Seungwon Kim; Youngho Lee; Arindam Dey; Thammathip Piumsomboon; Mitchell Norman; Mark Billinghurst

To improve remote collaboration in video conferencing systems, researchers have been investigating augmenting visual cues onto a shared live video stream. In such systems, a person wearing a head-mounted display (HMD) and camera can share her view of the surrounding real world with a remote collaborator to receive assistance on a real-world task. While this concept of augmented video conferencing (AVC) has been actively investigated, there has been little research on how sharing gaze cues might affect collaboration in video conferencing. This paper investigates how sharing gaze in both directions between a local worker and a remote helper in an AVC system affects collaboration and communication. Using a prototype AVC system that shares the eye gaze of both users, we conducted a user study that compares four conditions with different combinations of eye gaze sharing between the two users. The results showed that sharing each other's gaze significantly improved collaboration and communication.


ICAT-EGVE | 2017

An Augmented Reality and Virtual Reality Pillar for Exhibitions: A Subjective Exploration

Zi Siang See; Mohd Shahrizal Sunar; Mark Billinghurst; Arindam Dey; Delas Santano; Human Esmaeili; Harold Thwaites

This paper presents the development of an Augmented Reality (AR) and Virtual Reality (VR) pillar, a novel approach for showing AR and VR content in a public setting. A pillar in a public exhibition venue was converted into a four-sided AR and VR showcase featuring a cultural heritage exhibit, “Boatbuilders of Pangkor”. Multimedia tablets and mobile AR head-mounted displays (HMDs) were provided for visitors to experience multisensory AR and VR content demonstrated on the pillar. The content included AR-based videos, maps, images, and text, along with VR experiences that allowed visitors to view reconstructed 3D subjects and remote locations in a 360° virtual environment. In this paper, we describe the prototype system, a user evaluation study, and directions for future work.


ICAT-EGVE | 2017

Exploring Pupil Dilation in Emotional Virtual Reality Environments

Hao Chen; Arindam Dey; Mark Billinghurst; Robert W. Lindeman

Previous investigations have shown that pupil dilation can be affected by emotive pictures, audio clips, and videos. In this paper, we explore how emotive Virtual Reality (VR) content can also cause pupil dilation. VR has been shown to evoke negative and positive arousal in users immersed in different virtual scenes. In our research, VR scenes were used as emotional triggers. Five emotional VR scenes were designed for our study, and each scene had five emotion segments: happiness, fear, anxiety, sadness, and disgust. While participants experienced the VR scenes, their pupil dilation and the brightness in the headset were captured. We found that both the negative and positive emotion segments produced pupil dilation in the VR environments. We also explored whether showing heart beat cues to users could cause differences in pupil dilation. In our study, three different heart beat cues were shown to users using a combination of three channels: haptic, audio, and visual. The results showed that the haptic-visual cue caused the most significant pupil dilation change from the baseline.


IEEE Access | 2018

Robust Tracking Through the Design of High Quality Fiducial Markers: An Optimization Tool for ARToolKit

Dawar Khan; Sehat Ullah; Dong-Ming Yan; Ihsan Rabbi; Paul Richard; Thuong N. Hoang; Mark Billinghurst; Xiaopeng Zhang


ICAT-EGVE | 2017

Real-time Visual Representations for Mixed Reality Remote Collaboration

Lei Gao; Huidong Bai; Thammathip Piumsomboon; Gun A. Lee; Robert W. Lindeman; Mark Billinghurst


IEEE Transactions on Visualization and Computer Graphics | 2018

Superman vs Giant: A Study on Spatial Perception for a Multi-Scale Mixed Reality Flying Telepresence Interface

Thammathip Piumsomboon; Gun A. Lee; Barrett Ens; Bruce H. Thomas; Mark Billinghurst

Collaboration


Dive into Mark Billinghurst's collaborations.

Top Co-Authors

Gun A. Lee, University of South Australia
Arindam Dey, University of South Australia
Jonathon D. Hart, University of South Australia
Louise Lawrence, University of South Australia
Ross T. Smith, University of South Australia
Hao Chen, University of Canterbury
Huidong Bai, University of Canterbury