Publication


Featured research published by Matthew A. Hutchins.


Laryngoscope | 2008

Validation of a Networked Virtual Reality Simulation of Temporal Bone Surgery

Stephen O'Leary; Matthew A. Hutchins; Duncan Stevenson; Chris Gunn; Alexander Krumpholz; Gregor Kennedy; Michael Tykocinski; Marcus Dahm; B. C. Pyman

Objectives: To assess the content validity and concurrent validity of a haptically (force-feedback) rendered virtual reality simulation of temporal bone surgery.


Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems | 2005

Using collaborative haptics in remote surgical training

Chris Gunn; Matthew A. Hutchins; Duncan Stevenson; Matt Adcock; Patricia Youngblood

We describe the design and trial of a remotely conducted surgical master class, using a haptic virtual environment. The instructor was located in the United States and the class was in Australia. The responses of the audience and participants are presented.


Presence: Teleoperators & Virtual Environments | 2005

Combating latency in haptic collaborative virtual environments

Chris Gunn; Matthew A. Hutchins; Matt Adcock

Haptic (force) feedback is increasingly being used in surgical-training simulators. The addition of touch provides important extra information that can add another dimension to the realism of the experience. Progress in networking these systems together over long distances has been held back, principally because the latency of the network can induce severe instability in any dynamic objects in the scene. This paper describes techniques allowing long-distance sharing of haptic-enabled, dynamic scenes. At the CSIRO Virtual Environments Laboratory, we have successfully used this system to connect a prototype of a surgical-simulation application between participants on opposite sides of the world, in Sweden and Australia, over a standard Internet connection spanning 3 continents and 2 oceans. The users were able to simultaneously manipulate pliable objects in a shared workspace, as well as guide each other's hands (and shake hands!) over 22,000 km (13,620 miles) of Internet connection. The main obstacle to overcome was the latency-induced instability in the system, caused by the delays and jitter inherent in the network. Our system involved a combination of an event-collection mechanism, a network event-forwarding mechanism and a pseudophysics mechanism. We found that the resulting behavior of the interconnected body organs, under simultaneous-user manipulation, was sufficiently convincing to be considered for training surgical procedures.
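The abstract names three ingredients: event collection, network event forwarding, and a pseudophysics mechanism. The sketch below is a hypothetical illustration of those ideas (batching high-rate haptic events per network tick, and easing local copies toward the last received remote state instead of simulating latency-sensitive dynamics); it is not the paper's actual implementation, and all names and parameters are assumptions.

```python
# Hypothetical sketch of latency-tolerant state sharing for a haptic scene.
# The batching and "pseudophysics" smoothing below are assumptions inspired
# by the abstract, not the paper's algorithm.

import time
from dataclasses import dataclass, field

@dataclass
class SceneEvent:
    node_id: str          # which shared object the event applies to
    position: tuple       # new position reported by the local haptic loop
    timestamp: float = field(default_factory=time.time)

class EventCollector:
    """Collects high-rate haptic events and forwards one batch per network tick."""
    def __init__(self, send_fn, interval=0.05):
        self.send_fn = send_fn        # e.g. a UDP/TCP send callback
        self.interval = interval      # forward every 50 ms instead of every 1 ms
        self._pending = {}

    def record(self, event: SceneEvent):
        # Keep only the latest event per node; older ones are superseded.
        self._pending[event.node_id] = event

    def flush(self):
        if self._pending:
            self.send_fn(list(self._pending.values()))
            self._pending.clear()

def pseudophysics_step(local_pos, remote_target, stiffness=0.2):
    """Ease the local copy toward the last received remote state.

    Rather than integrating true dynamics (which network delay destabilises),
    move a fixed fraction of the remaining distance each frame.
    """
    return tuple(l + stiffness * (r - l) for l, r in zip(local_pos, remote_target))
```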


Australasian Computer-Human Interaction Conference | 2007

Annotating with light for remote guidance

Doug Palmer; Matt Adcock; Jocelyn Smith; Matthew A. Hutchins; Chris Gunn; Duncan Stevenson; Ken Taylor

This paper presents a system that supports remote guidance collaboration, in which a local expert guides a remotely located assistant to perform physical, three-dimensional tasks. The system supports this remote guidance by allowing the expert to annotate, point at and draw upon objects in the remote location, using a pen and tablet-based interface to control a laser projection device. The specific design criteria for this system are drawn from a tele-health scenario involving remote medical examination of patients, and the paper presents the software architecture and implementation details of the associated hardware. In particular, the algorithm for aligning the representation of the laser projection over the video display of the remote scene is described. Early evaluations by medical specialists are presented, the usability of the system in laboratory experiments is discussed, and ideas for future developments are outlined.
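The alignment algorithm itself is only named in the abstract, not specified. As a generic illustration of one way such an alignment can be done, the sketch below estimates a planar homography from calibration correspondences and maps annotation strokes from video-image coordinates into laser-projector coordinates; the homography approach and all function names are assumptions, not the paper's method.

```python
# Hypothetical alignment sketch: fit a planar homography from calibration
# correspondences, then map annotation strokes through it.

import numpy as np

def estimate_homography(image_pts, laser_pts):
    """Direct linear transform from >= 4 point correspondences."""
    rows = []
    for (x, y), (u, v) in zip(image_pts, laser_pts):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    return vt[-1].reshape(3, 3)      # smallest singular vector, as a 3x3 matrix

def map_stroke(H, stroke):
    """Map tablet/image stroke points into laser coordinates."""
    pts = np.hstack([np.asarray(stroke, dtype=float), np.ones((len(stroke), 1))])
    projected = pts @ H.T
    return projected[:, :2] / projected[:, 2:3]

# Example: four calibration correspondences, then map a drawn annotation.
H = estimate_homography([(0, 0), (640, 0), (640, 480), (0, 480)],
                        [(10, 5), (620, 12), (630, 470), (8, 465)])
print(map_stroke(H, [(100, 100), (200, 150)]))
```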


IEEE International Augmented Reality Toolkit Workshop | 2003

Augmented reality haptics: using ARToolKit for display of haptic applications

Matt Adcock; Matthew A. Hutchins; Chris Gunn

This paper describes the integration of the ARToolKit with the Reachin Core Technology API. The result is a system capable of providing a coherent mix of real-world video, computer haptics and computer graphics. A key feature is that new applications can be developed rapidly. Ultimately, this system is used to support rich object-based collaboration between face-to-face and remote participants.
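As a hedged sketch of the registration idea behind such an integration, the snippet below composes a tracked marker pose with an object's offset so that the graphical and haptic representations of a virtual object share one camera-space transform. The tracked pose and node layout here are hypothetical stand-ins; this is not the ARToolKit or Reachin API.

```python
# Hypothetical registration step: a fiducial marker's pose places both the
# graphics mesh and the haptic shape of a virtual object in camera space.

import numpy as np

def place_object(marker_pose_4x4, object_offset_4x4):
    """Compose the tracked marker pose with the object's offset from the marker."""
    return marker_pose_4x4 @ object_offset_4x4

# Per frame: read the marker pose from the tracker (stand-in value here), then
# update the shared transform that parents both the visual and haptic nodes,
# keeping video, graphics and force feedback registered.
marker_pose = np.eye(4)                      # stand-in for a tracked pose
offset = np.eye(4); offset[2, 3] = 0.05      # object sits 5 cm above the marker
scene_transform = place_object(marker_pose, offset)
```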


International Journal of Human-Computer Interaction | 2010

Human-Centered Evaluation for Broadband Tertiary Outpatient Telehealth: A Case Study

Duncan Stevenson; Matthew A. Hutchins; Jocelyn Smith

We present a pilot trial of a broadband telehealth system for tertiary outpatient consultations and use it as a case study to explore issues that arise in designing and evaluating broadband telehealth at a tertiary level of health care. The trial used outpatient consultations for pediatric surgical patients; these consultations involve high levels of interpersonal communication, multiple participants, and the need to share interactive access to large patient data sets. We used a human-centered evaluation approach applied at the level of the health care application (in the hospital setting, using actual clinical consultations). The results from the case study indicate that this is the appropriate evaluation approach for early-stage trials, rather than traditional randomized controlled trials. The different groups of participants (specialists, patients and parents, supporting clinicians) had different perspectives on the telehealth consultations and different criteria for success, and future telehealth evaluations need to take these multiple points of view into account.


International Conference on Computer Graphics and Interactive Techniques | 2004

Haptic collaboration with augmented reality

Matt Adcock; Matthew A. Hutchins; Chris Gunn

We describe a (face-to-face) collaborative environment that provides a coherent mix of real world video, computer haptics, graphics and audio. This system is a test-bed for investigating new collaborative affordances and behaviours.


Computers & Graphics | 1994

Mapping data into colour gamuts: Using interaction to increase usability and reduce complexity

Philip K. Robertson; Matthew A. Hutchins; Duncan Stevenson; Stephen Barrass; Chris Gunn; Dione Smith

An interactive system for mapping scientific data into the perceptual gamuts of colour devices is presented. Interaction provides the critical feedback that allows the user to concentrate on the problem domain, rather than be distracted by the combined complexities of the data mapping and realisation problems. By careful interface design, the combined problems of choosing appropriate colour representations, and of achieving colour-consistent visualisation and reproduction, are reduced to an intuitively straightforward set of operations. The interaction draws the user into understanding the limitations of specific colour devices within a device-independent representation, a spatially-represented perceptually uniform colour space. Colour-corrected feedback allows the user to see directly the implications of data mapping choices and strategies for dealing with problems imposed by device limitations, such as out-of-gamut points.
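One common strategy for the out-of-gamut points mentioned in the closing sentence is to reduce chroma at constant lightness until the colour becomes reproducible. The toy sketch below illustrates that idea in a CIELAB-like space; it is an assumption-labelled illustration, not the interactive system described in the paper, and the gamut test is a stand-in for a real device gamut model.

```python
# Hypothetical out-of-gamut handling: pull a colour toward the neutral (grey)
# axis in a perceptually uniform space until it fits the device gamut.

def desaturate_into_gamut(lab, in_gamut, steps=50):
    """Reduce chroma at constant lightness until the colour is inside the gamut.

    lab      -- (L, a, b) candidate colour in a CIELAB-like space
    in_gamut -- predicate returning True if an (L, a, b) colour is reproducible
    """
    L, a, b = lab
    for i in range(steps + 1):
        scale = 1.0 - i / steps          # walk from full chroma toward grey
        candidate = (L, a * scale, b * scale)
        if in_gamut(candidate):
            return candidate
    return (L, 0.0, 0.0)                 # worst case: grey of the same lightness

# Example with a toy gamut that only admits chroma below 40:
toy_gamut = lambda c: (c[1] ** 2 + c[2] ** 2) ** 0.5 < 40
print(desaturate_into_gamut((70.0, 60.0, 20.0), toy_gamut))
```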


Electronic Imaging | 2000

Software components for haptic constraints

Matthew A. Hutchins

This paper discusses the software engineering of a class library for supporting haptic rendering of interaction constraints within a hand-immersive virtual environment. The design of interaction and navigation paradigms is a significant issue in the usability of virtual environments. The careful application of constraints in the interaction can help the user focus on their specific task. Interaction constraints can be usefully implemented using a haptic, or force-feedback, device. Haptic programming is difficult, so we are designing and implementing a class library to provide reusable components for programming haptic constraints. The library extends the Magma multi-sensory scenegraph API, providing a constrained proxy to serve as a new interaction point for the application, and an abstract constraint definition that can be realized by a variety of constraint types. The paper illustrates the constraint definition by describing a number of geometric constraints, and also describes techniques for combining and modifying constraints to create new ones. These techniques are used to construct constraints tailored to specific application requirements. The haptic constraints library is still a work in progress, and we have identified a number of areas where improvements can be made. One of the major challenges is to provide software components that can be reused to support a broad selection of different approaches to programming interaction in a haptically enabled virtual environment.
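As a rough sketch of the kind of class structure described (an abstract constraint that yields a constrained proxy position, concrete geometric constraints, and a way of combining them), the snippet below uses hypothetical class names; the actual library extends the Magma scenegraph API and is not reproduced here.

```python
# Hypothetical class structure for haptic interaction constraints: each
# constraint maps the raw device position to a constrained proxy position.

from abc import ABC, abstractmethod
import numpy as np

class Constraint(ABC):
    @abstractmethod
    def project(self, point: np.ndarray) -> np.ndarray:
        """Return the constrained proxy position for a raw device position."""

class PlaneConstraint(Constraint):
    def __init__(self, origin, normal):
        self.origin = np.asarray(origin, float)
        self.normal = np.asarray(normal, float) / np.linalg.norm(normal)

    def project(self, point):
        # Remove the component of the offset along the plane normal.
        offset = np.asarray(point, float) - self.origin
        return self.origin + offset - self.normal * (offset @ self.normal)

class LineConstraint(Constraint):
    def __init__(self, origin, direction):
        self.origin = np.asarray(origin, float)
        self.direction = np.asarray(direction, float) / np.linalg.norm(direction)

    def project(self, point):
        # Keep only the component of the offset along the line direction.
        offset = np.asarray(point, float) - self.origin
        return self.origin + self.direction * (offset @ self.direction)

class Chained(Constraint):
    """Combine constraints by projecting through them in sequence."""
    def __init__(self, *constraints):
        self.constraints = constraints

    def project(self, point):
        for c in self.constraints:
            point = c.project(point)
        return point

# The proxy position would drive a spring force toward the device position in
# the haptic loop; only the geometric projection is shown here.
snap_to_axis = Chained(PlaneConstraint((0, 0, 0), (0, 0, 1)),
                       LineConstraint((0, 0, 0), (1, 0, 0)))
print(snap_to_axis.project(np.array([0.3, 0.4, 0.5])))  # -> [0.3, 0.0, 0.0]
```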


International Conference on Software Maintenance | 1998

Improving visual impact analysis

Matthew A. Hutchins; Keith Gallagher

Collaboration


Matthew A. Hutchins's most frequent co-authors and their affiliations.

Top Co-Authors

Chris Gunn | Commonwealth Scientific and Industrial Research Organisation
Duncan Stevenson | Australian National University
Matt Adcock | Commonwealth Scientific and Industrial Research Organisation
Jocelyn Smith | Australian National University
Philip K. Robertson | Commonwealth Scientific and Industrial Research Organisation
B. C. Pyman | University of Melbourne
Dione Smith | Commonwealth Scientific and Industrial Research Organisation
Doug Palmer | Commonwealth Scientific and Industrial Research Organisation