Chris Gunn
Commonwealth Scientific and Industrial Research Organisation
Publications
Featured research published by Chris Gunn.
Laryngoscope | 2008
Stephen O'Leary; Matthew A. Hutchins; Duncan Stevenson; Chris Gunn; Alexander Krumpholz; Gregor Kennedy; Michael Tykocinski; Marcus Dahm; B. C. Pyman
Objectives: To assess the content validity and concurrent validity of a haptically (force feedback) rendered virtual reality simulation of temporal bone surgery.
symposium on haptic interfaces for virtual environment and teleoperator systems | 2005
Chris Gunn; Matthew A. Hutchins; Duncan Stevenson; Matt Adcock; Patricia Youngblood
We describe the design and trial of a remotely conducted surgical master class, using a haptic virtual environment. The instructor was located in the United States and the class was in Australia. The responses of the audience and participants are presented.
Virtual Reality | 2006
Matthew A. Hutchins; Duncan Stevenson; Chris Gunn; Alexander Krumpholz; Tony Adriaansen; B. C. Pyman; Stephen O’Leary
Networked virtual environments using haptic interfaces can be used for surgical training and support both a simulation component and a communication component. We present such an environment for training in surgery of the temporal bone, which emphasises communication between an instructor and a student. We give an overview of the learning requirements for surgeons in this area and present the details of our implementation with a focus on the way communication is supported. We describe a training trial that was undertaken with a group of surgical trainees and carry out a qualitative analysis of transcripts from the teaching sessions. We conclude that the virtual environment supports a rich dialogue between the instructor and student, allowing them to ground their conversation in the shared model. Haptic interfaces are an important enabling technology for the simulation and communication and are used in conjunction with other modes and media to support situated learning.
Presence: Teleoperators & Virtual Environments | 2005
Chris Gunn; Matthew A. Hutchins; Matt Adcock
Haptic (force) feedback is increasingly being used in surgical-training simulators. The addition of touch provides extra information that adds another dimension to the realism of the experience. Progress in networking these systems together over long distances has been held back, principally because the latency of the network can induce severe instability in any dynamic objects in the scene. This paper describes techniques allowing long-distance sharing of haptic-enabled, dynamic scenes. At the CSIRO Virtual Environments Laboratory, we have successfully used this system to connect a prototype of a surgical-simulation application between participants on opposite sides of the world in Sweden and Australia, over a standard Internet connection spanning 3 continents and 2 oceans. The users were able to simultaneously manipulate pliable objects in a shared workspace, as well as guide each other's hands (and shake hands!) over 22,000 km (13,620 miles) of Internet connection. The main obstacle to overcome was the latency-induced instability in the system, caused by the delays and jitter inherent in the network. Our system involved a combination of an event-collection mechanism, a network event-forwarding mechanism and a pseudophysics mechanism. We found that the resulting behavior of the interconnected body organs, under simultaneous-user manipulation, was sufficiently convincing to be considered for training surgical procedures.
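The abstract names three mechanisms but does not detail them. Purely as an illustration of the general pattern (all names hypothetical, not the paper's code), the sketch below coalesces high-rate haptic events so only the latest state per node crosses the network each tick, and blends incoming remote state toward gradually rather than snapping to it, so delay and jitter do not inject force spikes into the haptic loop. Positions are shown as scalars for brevity; the same blending applies per axis.

    # Illustrative sketch only; not the paper's implementation.
    class EventCollector:
        """Coalesces ~1 kHz haptic events so only the latest state per
        node is forwarded on each (much slower) network tick."""
        def __init__(self):
            self.pending = {}

        def record(self, node_id, state):
            self.pending[node_id] = state   # later events overwrite earlier

        def flush(self):
            batch, self.pending = self.pending, {}
            return batch                    # hand to the event forwarder

    class SharedNodeState:
        """Pseudophysics-style blending: the local copy is pulled toward
        the last remote state instead of jumping to it, keeping rendered
        forces smooth despite network delay and jitter."""
        def __init__(self, position, pull_rate=8.0):
            self.local = position           # state rendered and felt here
            self.target = position          # last authoritative remote state
            self.pull_rate = pull_rate      # blend aggressiveness (1/s)

        def on_remote_event(self, position):
            self.target = position          # arrives late and irregularly

        def step(self, dt):
            # Move a bounded fraction toward the target each servo tick.
            alpha = min(1.0, self.pull_rate * dt)
            self.local = self.local + alpha * (self.target - self.local)
            return self.local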
australasian computer-human interaction conference | 2007
Doug Palmer; Matt Adcock; Jocelyn Smith; Matthew A. Hutchins; Chris Gunn; Duncan Stevenson; Ken Taylor
This paper presents a system that supports remote guidance collaboration, in which a local expert guides a remotely located assistant to perform physical, three-dimensional tasks. The system supports this remote guidance by allowing the expert to annotate, point at and draw upon objects in the remote location using a pen and tablet-based interface to control a laser projection device. The specific design criteria for this system are drawn from a tele-health scenario involving remote medical examination of patients, and the paper presents the software architecture and implementation details of the associated hardware. In particular, the algorithm for aligning the representation of the laser projection over the video display of the remote scene is described. Early evaluations by medical specialists are presented, the usability of the system in laboratory experiments is discussed, and ideas for future developments are outlined.
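The paper's own alignment algorithm is not reproduced here. A common way to achieve this kind of image-to-pointer alignment, sketched below under the assumption of a roughly planar work surface, is to fit a homography from a few calibration correspondences between video-image coordinates and laser pan/tilt commands; function names are illustrative.

    import numpy as np

    def fit_homography(img_pts, laser_pts):
        """Direct linear transform: least-squares 3x3 homography mapping
        video-image points to laser pan/tilt, from >= 4 correspondences."""
        rows = []
        for (x, y), (u, v) in zip(img_pts, laser_pts):
            rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
            rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
        # The singular vector with the smallest singular value minimises
        # |A h| subject to |h| = 1.
        _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
        return vt[-1].reshape(3, 3)

    def image_to_laser(H, x, y):
        """Map a pixel the expert drew on to a laser pan/tilt command."""
        p = H @ np.array([x, y, 1.0])
        return p[0] / p[2], p[1] / p[2]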
Virtual Reality | 2006
Chris Gunn
This paper introduces a haptic virtual environment in which two users can collaboratively sculpt a virtual clay model, working from different physical locations connected by the internet. They view their virtual sculpting tools and the clay model in 3D, feel the tool’s pressure on the clay as they work, and have their hands co-located with the view of the tool and model. Since the sculptors have independent views of the same logical environment, they can work at different zoom levels, and be in different coordinate systems, even spinning ones, at the same time. This provides them with the capability to explore new styles of collaborative creativity, working off each other’s initiative where appropriate. The system was designed to allow unrestrained, asynchronous behaviour by the collaborating sculptors. The paper describes the hardware as well as the algorithms behind the deformability of the clay surface and the communications model enabling the distance collaboration. It gives an explanation of the simple conflict resolution mechanism that haptic feedback facilitates and also reports on the results of a qualitative study into the creativity benefits of such a collaborative system.
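The deformation and communication algorithms are given in the paper itself; purely as an illustration of the kind of operation involved, the sketch below pushes mesh vertices near the tool contact along the tool direction with a smooth falloff. Parameter names and values are assumptions, not the paper's.

    import numpy as np

    def deform(vertices, contact, push_dir, radius=0.02, depth=0.005):
        """vertices: (N, 3) mesh positions; contact: tool tip (3,);
        push_dir: unit vector of tool pressure. Returns a displaced copy."""
        dist = np.linalg.norm(vertices - contact, axis=1)
        w = np.exp(-(dist / radius) ** 2)   # Gaussian falloff from contact
        w[dist > 3.0 * radius] = 0.0        # far vertices are untouched
        return vertices + depth * w[:, None] * push_dir

A model like this also hints at how haptic feedback can ease editing conflicts: each sculptor feels the surface as the other reshapes it, so simultaneous edits can be negotiated through force rather than through explicit locking.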
2003 IEEE International Augmented Reality Toolkit Workshop | 2003
Matt Adcock; Matthew A. Hutchins; Chris Gunn
This paper describes the integration of the ARToolKit with the Reachin Core Technology API. The result is a system capable of providing a coherent mix of real-world video, computer haptics and computer graphics. A key feature is that new applications can be developed rapidly. Ultimately, this system is used to support rich, object-based collaboration between face-to-face and remote participants.
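Neither library's API is reproduced here; the sketch below, with a stand-in scene-graph node, only illustrates the co-location idea: one marker-derived transform drives both the graphic and the haptic representation of an object, so what the user sees over the live video is also what the stylus feels.

    import numpy as np

    class Node:
        """Stand-in for a scene-graph node; not the Reachin API."""
        def __init__(self):
            self.transform = np.eye(4)

        def set_transform(self, T):
            self.transform = T

    def update_object(marker_pose_cam, cam_to_world, graphic, haptic):
        """marker_pose_cam: 4x4 marker pose from the video tracker."""
        world_T = cam_to_world @ marker_pose_cam
        graphic.set_transform(world_T)      # what the user sees...
        haptic.set_transform(world_T)       # ...is also what they feel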
field and service robotics | 2010
Elliot S. Duff; Con Caris; Adrian Bonchis; Ken Taylor; Chris Gunn; Matt Adcock
This paper describes the development of a tele-robotic rock breaker deployed at a mine over 1,000 km from the remote operations centre. This distance introduces a number of technical and cognitive challenges to the design of the system, which have been addressed with the development of shared autonomy in the control system and a mixed-reality user interface. A number of trials were conducted, culminating in a production field trial, which demonstrated that the system is safe, productive (sometimes faster) and integrates seamlessly with mine operations.
electronic imaging | 1999
Duncan Stevenson; Kevin A. Smith; John P. McLaughlin; Chris Gunn; J. P. Veldkamp; Mark J. Dixon
The Haptic Workbench combines stereo images, co-located force feedback and 3D audio to produce a small-scale, hands-in virtual environment system. This paper presents the Haptic Workbench, the HCI issues that arose, and its deployment in prototype industrial applications. The problems associated with combining global graphic and local haptic rendering in an efficient and generalized manner are described. The benefits and difficulties associated with this class of virtual environment system are discussed, the experience gained in applying it to industrial applications is described, and conclusions are drawn about the appropriate use of co-located multi-sensory technologies in Virtual Environments.
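The "global graphic, local haptic" tension arises because force rendering must run near 1 kHz while full-scene rendering cannot. A widely used pattern, sketched below with illustrative names (not the Haptic Workbench API), is to servo forces against a small local approximation of the surface that a slower global loop keeps refreshing.

    import threading
    import time

    class LocalPatch:
        """Plane approximation (point, normal) of the surface nearest
        the probe; small enough to collide against at 1 kHz."""
        def __init__(self):
            self.point = (0.0, 0.0, 0.0)
            self.normal = (0.0, 1.0, 0.0)
            self.lock = threading.Lock()

    def haptic_loop(patch, get_probe, send_force, stiffness=800.0):
        # Local haptic rendering: ~1 kHz servo against the cached plane.
        while True:
            px, py, pz = get_probe()
            with patch.lock:
                (ox, oy, oz), (nx, ny, nz) = patch.point, patch.normal
            # Penetration depth below the plane (normal points outward).
            depth = -((px - ox) * nx + (py - oy) * ny + (pz - oz) * nz)
            force = stiffness * max(0.0, depth)
            send_force((force * nx, force * ny, force * nz))
            time.sleep(0.001)

    def graphics_loop(patch, scene, get_probe):
        # Global graphic rendering: full-scene query at display rate only.
        while True:
            point, normal = scene.nearest_surface(get_probe())
            with patch.lock:
                patch.point, patch.normal = point, normal
            time.sleep(1.0 / 60.0)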
australasian computer-human interaction conference | 2011
Chris Gunn; Matt Adcock
A worker performing a physical task may need to ask for advice and guidance from an expert. This can be a problem if the expert is in some distant location. We describe a system that allows the expert to see the workplace from the worker's point of view, and to draw annotations directly into that workplace using a pico projector. Since the system is worn by the worker, these projected annotations would otherwise move with the worker's movements. We describe two methods of anchoring these annotations to their original positions, thereby compensating for the movement of the worker.
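The paper's two stabilisation methods are not reproduced here. As one hedged illustration of the underlying idea: if an annotation's anchor is kept in world coordinates and a head tracker supplies the current camera pose, the anchor can be reprojected each frame before driving the projector. A real system would also need the projector's own calibration and its offset from the camera, both omitted below.

    import numpy as np

    def project_anchor(anchor_world, R, t, fx, fy, cx, cy):
        """Reproject a fixed world-space anchor through the current
        head-worn camera pose (R, t) with a pinhole model, yielding the
        pixel at which the projector should redraw the annotation."""
        Xc = R @ anchor_world + t           # world -> camera coordinates
        u = fx * Xc[0] / Xc[2] + cx         # perspective projection
        v = fy * Xc[1] / Xc[2] + cy
        return u, v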