
Publications


Featured research published by Adrian J. Clark.


Foundations and Trends in Human-Computer Interaction | 2015

A Survey of Augmented Reality

Mark Billinghurst; Adrian J. Clark; Gun A. Lee

This survey summarizes almost 50 years of research and development in the field of Augmented Reality (AR). From early research in the 1960s until widespread availability by the 2010s, there has been steady progress towards the goal of seamlessly combining real and virtual worlds. We provide an overview of the common definitions of AR, and show how AR fits into taxonomies of other related technologies. A history of important milestones in Augmented Reality is followed by sections on the key enabling technologies of tracking, display and input devices. We also review design guidelines and provide some examples of successful AR applications. Finally, we conclude with a summary of directions for future work and a review of some of the areas that are currently being researched.


Human Factors in Computing Systems | 2013

User-defined gestures for augmented reality

Thammathip Piumsomboon; Adrian J. Clark; Mark Billinghurst; Andy Cockburn

Recently there has been an increase in research on hand gestures for interaction in Augmented Reality (AR). However, this research has focused on developer-designed gestures, and little is known about user preference and behavior for gestures in AR. In this paper, we present the results of a guessability study focused on hand gestures in AR. A total of 800 gestures were elicited for 40 selected tasks from 20 participants. Using the agreement found among gestures, a user-defined gesture set was created to guide designers in achieving consistent, user-centered gestures in AR.
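
The paper reports agreement among elicited gestures; in Wobbrock-style guessability studies this is typically quantified with an agreement score per task. A minimal Python sketch of that standard formula follows; the example proposals are hypothetical, not data from the paper.

    from collections import Counter

    def agreement_score(proposals):
        """Agreement score for one task, as in Wobbrock et al.'s
        guessability methodology: the sum over groups of identical
        gesture proposals of (group size / total proposals) squared."""
        total = len(proposals)
        return sum((n / total) ** 2 for n in Counter(proposals).values())

    # Hypothetical proposals for one task from 20 participants.
    proposals = ["drag"] * 12 + ["push"] * 5 + ["point-and-flick"] * 3
    print(agreement_score(proposals))  # 0.445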


International Symposium on Mixed and Augmented Reality | 2014

Grasp-Shell vs Gesture-Speech: A comparison of direct and indirect natural interaction techniques in augmented reality

Thammathip Piumsomboon; David Altimira; Hyungon Kim; Adrian J. Clark; Gun A. Lee; Mark Billinghurst

In order for natural interaction in Augmented Reality (AR) to become widely adopted, the techniques used need to be shown to support precise interaction, and the gestures used proven to be easy to understand and perform. Recent research has explored free-hand gesture interaction with AR interfaces, but there have been few formal evaluations conducted with such systems. In this paper we introduce and evaluate two natural interaction techniques: the free-hand gesture-based Grasp-Shell, which provides direct physical manipulation of virtual content; and the multi-modal Gesture-Speech, which combines speech and gesture for indirect natural interaction. These techniques support object selection, six-degree-of-freedom movement, and uniform scaling, as well as physics-based interaction such as pushing and flinging. We conducted a study evaluating and comparing Grasp-Shell and Gesture-Speech for fundamental manipulation tasks. The results show that Grasp-Shell outperforms Gesture-Speech in both efficiency and user preference for translation and rotation tasks, while Gesture-Speech is better for uniform scaling. They could be good complementary interaction methods in a physics-enabled AR environment, as this combination potentially provides both control and interactivity in one interface. We conclude by discussing implications and future directions of this research.
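
A common way to realize indirect multimodal commands like Gesture-Speech is to pair a recognized speech command with the object currently indicated by a pointing gesture, provided the two events fall within a short time window. The sketch below illustrates that put-that-there-style pairing; the event types and the 0.5 s window are illustrative assumptions, not details from the paper.

    from dataclasses import dataclass

    @dataclass
    class SpeechEvent:
        command: str      # e.g. "move", "scale"
        timestamp: float  # seconds

    @dataclass
    class GestureEvent:
        target_id: int    # id of the object being pointed at
        timestamp: float

    FUSION_WINDOW = 0.5   # seconds; illustrative tolerance

    def fuse(speech, gesture):
        """Pair a speech command with a pointing target when the two
        events are close enough in time; otherwise report no command."""
        if abs(speech.timestamp - gesture.timestamp) <= FUSION_WINDOW:
            return speech.command, gesture.target_id
        return None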


Australasian Computer-Human Interaction Conference | 2011

Seamless interaction in space

Adrian J. Clark; Andreas Dünser; Mark Billinghurst; Thammathip Piumsomboon; David Altimira

As more electronic devices enter the living room, there is a need to explore new ways to provide seamless interaction with them over a range of different distances. In this paper we describe a proximity-based interface that allows users to interact with screen content both within arm's length and at a distance. To support such a wide interaction range we combine speech and gesture input with a secondary display, and have the interface change dynamically depending on user proximity. We conducted a user evaluation of our prototype system and found that users were impressed with the away-from-screen interfaces, and believed that changing the interface based on proximity would be useful for larger displays. We present the lessons learned and discuss directions for future research.
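
One way to implement the proximity-dependent interface switching described above is a distance threshold with hysteresis, so the interface does not flicker when the user stands near the boundary. A minimal sketch, with thresholds chosen for illustration rather than taken from the paper:

    def interface_mode(distance_m, current_mode, near=1.0, far=1.5):
        """Switch between a close-range interface (direct touch and the
        secondary display) and a distant one (speech and gesture). The
        gap between the two thresholds provides hysteresis."""
        if current_mode == "near" and distance_m > far:
            return "far"
        if current_mode == "far" and distance_m < near:
            return "near"
        return current_mode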


Image and Vision Computing New Zealand | 2008

Perspective correction for improved visual registration using natural features

Adrian J. Clark; Richard D. Green; Robert N. Grant

This research proposes a perspective-invariant registration algorithm which improves on popular registration algorithms such as SIFT and SURF by correcting for perspective distortion using optical flow. A novel addition to the natural-feature-based registration process is proposed, which uses orientation information from previously correctly registered frames to attempt perspective correction; this process is evaluated when applied to the natural feature algorithms described. The approach addresses registration failures caused by perspective distortion in natural feature tracking, seeking a better solution than simply pruning invalid matches. The results show that the proposed algorithm improved registration for two prominent natural-feature-based registration algorithms, SIFT and SURF.
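
The core idea, rectifying the incoming frame with pose information from the last correctly registered frame before matching features, can be sketched with OpenCV. This is a simplified illustration of the general approach (the paper additionally uses optical flow); H_prev is assumed to be the reference-to-frame homography from the previous correctly registered frame.

    import cv2
    import numpy as np

    sift = cv2.SIFT_create()

    def register(frame, reference, H_prev):
        """Undo the previously observed perspective distortion, then
        match SIFT features against the reference image."""
        h, w = frame.shape[:2]
        rectified = cv2.warpPerspective(frame, np.linalg.inv(H_prev), (w, h))
        kp_f, des_f = sift.detectAndCompute(rectified, None)
        kp_r, des_r = sift.detectAndCompute(reference, None)
        matches = cv2.BFMatcher(cv2.NORM_L2).knnMatch(des_f, des_r, k=2)
        good = [m for m, n in matches if m.distance < 0.75 * n.distance]
        src = np.float32([kp_f[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
        dst = np.float32([kp_r[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
        H_rect, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
        # Compose with the rectifying warp to get the frame-to-reference
        # homography for the original (unwarped) frame.
        return H_rect @ np.linalg.inv(H_prev)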


International Symposium on Mixed and Augmented Reality | 2011

An interactive augmented reality coloring book

Adrian J. Clark; Andreas Dünser; Raphael Grasset

Creating entertaining and educational books requires not only visually stimulating content but also the means for students to interact, create, and express themselves. In this paper we present a new type of mixed-reality book experience, which augments an educational coloring book with user-generated three-dimensional content. We explore a “pop-up book” metaphor and describe a process by which children's drawing and coloring is used as input to generate and change the appearance of the book content. Our system is based on natural feature tracking and image processing techniques that can be easily exploited for other AR publishing applications.
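
Once the page is tracked with natural features, the child's coloring can be lifted from the camera image by warping the tracked page into a canonical view and using the result as the model texture. A minimal sketch, assuming H_page is the page-to-image homography produced by the tracker (names are illustrative):

    import cv2
    import numpy as np

    def extract_texture(camera_frame, H_page, texture_size=(512, 512)):
        """Rectify the tracked book page into a canonical view so the
        coloring can be applied as a texture to the 3D content."""
        return cv2.warpPerspective(camera_frame, np.linalg.inv(H_page),
                                   texture_size)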


Symposium on 3D User Interfaces | 2012

Poster: Physically-based natural hand and tangible AR interaction for face-to-face collaboration on a tabletop

Thammathip Piumsomboon; Adrian J. Clark; Atsushi Umakatsu; Mark Billinghurst

In this paper, we present an AR framework that allows natural hand and tangible AR interaction for physically-based interaction and environment awareness, supporting face-to-face collaboration using the Microsoft Kinect. Our framework comprises six major components: (1) marker tracking, (2) depth acquisition, (3) image processing, (4) physics simulation, (5) communication, and (6) rendering. The resulting augmented environment supports occlusion, shadows, and physically-based interaction between real and virtual objects. We propose three methods of natural hand representation: mesh-based, direct sphere substitution, and variational optical flow. A tabletop racing game and an AR sandbox application were created with this framework, demonstrating its possibilities.
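
Of the three hand representations, direct sphere substitution is the simplest to sketch: back-project a subsampled grid of hand-mask depth pixels into 3D and give the physics engine a small collision sphere at each point. The camera intrinsics and sphere radius below are illustrative, not values from the paper.

    def hand_to_spheres(depth_m, hand_mask, fx, fy, cx, cy,
                        stride=8, radius=0.01):
        """Back-project depth pixels inside the hand mask (depth_m is a
        numpy depth image in metres) and return (centre, radius) spheres
        to serve as physics proxies for the hand."""
        spheres = []
        height, width = depth_m.shape
        for v in range(0, height, stride):
            for u in range(0, width, stride):
                z = depth_m[v, u]
                if hand_mask[v, u] and z > 0:
                    x = (u - cx) * z / fx
                    y = (v - cy) * z / fy
                    spheres.append(((x, y, z), radius))
        return spheres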


IEEE Virtual Reality Conference | 2013

An advanced interaction framework for augmented reality based exposure treatment

Sam Corbett-Davies; Andreas Dünser; Richard D. Green; Adrian J. Clark

In this paper we present a novel interaction framework for augmented reality, and demonstrate its application in an interactive AR exposure treatment system for the fear of spiders. We use data from the Microsoft Kinect to track and model real world objects in the AR environment, enabling realistic interaction between them and virtual content. Objects are tracked in three dimensions using the Iterative Closest Point algorithm, and a point cloud model of each object is incrementally developed. The approximate motion and shape of each object in the scene serve as inputs to the AR application. Very few restrictions are placed on the types of objects that can be used; in particular, we do not require objects to be marked in a certain way in order to be recognized, facilitating natural interaction. To demonstrate our interaction framework we present an AR exposure treatment system where virtual spiders can walk up, around, or behind real objects and can be carried, prodded and occluded by the user. We also discuss improvements we are making to the interaction framework and its potential for use in other applications.
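
A single tracking step of this kind of pipeline can be sketched with the Open3D library (an assumption for illustration; the paper predates Open3D and used its own implementation): align the incrementally built object model to the current Kinect point cloud with ICP, seeded by the previous pose.

    import open3d as o3d

    def track_step(model_pcd, frame_pcd, T_prev):
        """Estimate the object's new pose by registering the model
        point cloud to the current frame with point-to-point ICP."""
        result = o3d.pipelines.registration.registration_icp(
            model_pcd, frame_pcd,
            0.02,    # max correspondence distance in metres
            T_prev,  # previous pose as the initial guess
            o3d.pipelines.registration.TransformationEstimationPointToPoint())
        return result.transformation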


Virtual Reality Continuum and Its Applications in Industry | 2011

A realistic augmented reality racing game using a depth-sensing camera

Adrian J. Clark; Thammathip Piumsomboon

As augmented reality (AR) applications become more common, users are expecting increasingly sophisticated experiences combining impressive visuals, interaction, and awareness of the environment. Existing technology capable of understanding user interaction and the environment is often expensive or restrictive. However, the newly released Microsoft Kinect provides researchers with a low-cost and widely available real-time depth sensor. In this paper, we investigate using the Kinect as a means to give AR applications an understanding of the three-dimensional (3D) environment they are operating in, and to support new ways for the user to interact with virtual content in a natural and intuitive way.
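
One concrete benefit of per-pixel depth in AR is occlusion handling: a virtual pixel is drawn only where it is closer to the camera than the real surface the Kinect measured. A minimal numpy sketch of that standard depth test (an illustration of the idea, not necessarily the paper's implementation):

    import numpy as np

    def occlusion_mask(real_depth_m, virtual_depth_m):
        """True where the virtual pixel should be drawn. Pixels with no
        Kinect reading (depth 0) default to showing virtual content."""
        valid = real_depth_m > 0
        return ~valid | (virtual_depth_m < real_depth_m)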


International Symposium on Mixed and Augmented Reality | 2012

An interactive Augmented Reality system for exposure treatment

Sam Corbett-Davies; Andreas Dünser; Adrian J. Clark

In this poster we describe an Augmented Reality (AR) system we are developing for exposure treatment. AR has great potential for phobia treatment because virtual fear stimuli can be shown in the real world and the client can see their own body and interact naturally with the stimuli. However, advanced natural interactivity has so far not been fully implemented in AR-based exposure therapy systems. The novelty of our approach lies in better integrating the real environment and the user into the system, and in recognising natural user actions as system input. Using the Microsoft Kinect device, we create a model of the therapy environment and the user's body. This information is used in conjunction with a physics simulation engine to create a virtual spider that reacts to the real environment in a realistic manner. The virtual spider can walk up, around, or behind real objects and can be carried, prodded and occluded by the user. We present the most recent prototype of the system and discuss the improvements we continue to make.
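
Making the spider walk up, around, or behind real objects amounts to constraining its motion to the reconstructed surface. A highly simplified sketch of surface-following locomotion; surface_height is a hypothetical query against the Kinect-derived environment model, not a function from the paper.

    def spider_step(x, y, heading, surface_height, step=0.01):
        """Advance the spider along its heading in the ground plane,
        then snap it to the reconstructed real-world surface."""
        x += step * heading[0]
        y += step * heading[1]
        return x, y, surface_height(x, y)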

Collaboration


Adrian J. Clark's top co-authors and their affiliations.

Top Co-Authors

Mark Billinghurst, University of South Australia

Peng Guo, University of Canterbury

Yeong Shiong Chiew, Monash University Malaysia Campus

Andy Cockburn, University of Canterbury