
Publication

Featured research published by Aaron Kotranza.


IEEE Virtual Reality Conference | 2008

Virtual Human + Tangible Interface = Mixed Reality Human: An Initial Exploration with a Virtual Breast Exam Patient

Aaron Kotranza; Benjamin Lok

Virtual human (VH) experiences are receiving increased attention for training real-world interpersonal scenarios. Communication in interpersonal scenarios consists not only of speech and gestures, but also relies heavily on haptic interaction - interpersonal touch. By adding haptic interaction to VH experiences, the bandwidth of human-VH communication can be increased to approach that of human-human communication. To afford haptic interaction, a new species of embodied agent is proposed - the mixed reality human (MRH). An MRH is a virtual human embodied by a tangible interface that shares the same registered space. The tangible interface affords the haptic interaction that is critical to effective simulation of interpersonal scenarios. We applied MRHs to simulate a virtual patient requiring a breast cancer screening (medical interview and physical exam). The design of the MRH patient is presented. This paper also presents the results of a pilot study in which eight (n = 8) physician-assistant students performed a clinical breast exam on the MRH patient. Results show that when afforded haptic interaction with an MRH patient, users demonstrated interpersonal touch and social engagement similar to that seen when interacting with a human patient.


American Journal of Surgery | 2009

A pilot study to integrate an immersive virtual patient with a breast complaint and breast examination simulator into a surgery clerkship

Adeline M. Deladisma; Mamta Gupta; Aaron Kotranza; James G. Bittner; Toufic Imam; Dayna Swinson; Angela L. Gucwa; Robert R. Nesbit; Benjamin Lok; Carla M. Pugh; D. Scott Lind

BACKGROUND: We aimed to determine if an immersive virtual patient (VP) with a breast complaint and a breast mannequin could prepare third-year medical students for history-taking (HT) and clinical breast examination (CBE) on a real patient. METHODS: After standardized instruction in breast HT and CBE, students (n = 21) were randomized to either an interaction with a VP (experimental) or to no VP interaction (control) before seeing a real patient with a breast complaint. Participants completed baseline and exit surveys to assess confidence regarding their HT and CBE skills. RESULTS: Students reported greater confidence in their HT (Δ = 1.05 ± 1.28, P < .05) and CBE skills (Δ = 1.14 ± 0.91, P < .05) and less anxiety when performing a CBE (Δ = -0.76 ± 1.10, P < .05). The VP intervention group had a significantly higher mean HT confidence than the control group at the conclusion of the study (4.27 ± 0.47 vs 3.50 ± 0.71, respectively; P < .05). CONCLUSIONS: A single interaction with a VP with a breast complaint and breast mannequin improves student confidence in breast HT during a surgery clerkship.


International Symposium on Mixed and Augmented Reality | 2009

Real-time in-situ visual feedback of task performance in mixed environments for learning joint psychomotor-cognitive tasks

Aaron Kotranza; D. Scott Lind; Carla M. Pugh; Benjamin Lok

This paper proposes an approach to mixed-environment training of manual tasks requiring concurrent use of psychomotor and cognitive skills. To train concurrent use of both skill sets, the learner is provided visual feedback of her performance that is generated in real time and presented in situ. This feedback provides reinforcement and correction of psychomotor skills concurrently with guidance in developing cognitive models of the task.


Symposium on 3D User Interfaces | 2009

Virtual multi-tools for hand and tool-based interaction with life-size virtual human agents

Aaron Kotranza; Kyle Johnsen; Juan C. Cendan; Bayard Miller; D. Scott Lind; Benjamin Lok

A common approach when simulating face-to-face interpersonal scenarios with virtual humans is to afford users only verbal interaction while providing rich verbal and non-verbal interaction from the virtual human. This is due to the difficulty of robustly recognizing user non-verbal behavior and interpreting these behaviors within the context of the verbal interaction between user and virtual human. To afford robust hand- and tool-based non-verbal interaction with life-sized virtual humans, we propose virtual multi-tools. A single hand-held, tracked interaction device acts as a surrogate for the virtual multi-tools: the user's hand, multiple tools, and other objects. By combining six-degree-of-freedom, high-update-rate tracking with the extra degrees of freedom provided by buttons and triggers, a commodity device, the Nintendo Wii Remote, provides the kinesthetic and haptic feedback necessary for a high-fidelity estimation of the natural, unencumbered interaction provided by one's hands and physical hand-held tools. These qualities allow virtual multi-tools to be a less error-prone interface for social and task-oriented non-verbal interaction with a life-sized virtual human. This paper discusses the implementation of virtual multi-tools for hand- and tool-based interaction with life-sized virtual humans, and provides an initial evaluation of their usability in the medical education scenario of conducting a neurological exam of a virtual human.


International Symposium on Mixed and Augmented Reality | 2005

A pipeline for rapidly incorporating real objects into a mixed environment

X. Wang; Aaron Kotranza; John Quarles; Benjamin Lok; B.D. Allen

A method is presented to rapidly incorporate real objects into virtual environments using laser-scanned 3D models with color-based marker tracking. Both the real objects and their geometric models are put into a mixed environment (ME). In the ME, users can manipulate the scanned, articulated real objects, such as tools, parts, and physical correlates to complex computer-aided design (CAD) models. Our aim is to allow engineering teams to effectively conduct hands-on assembly design verification. This task would be simulated at a high degree of fidelity, and would benefit from the natural interaction afforded by an ME with many specific real objects.


IEEE Virtual Reality Conference | 2009

Virtual Humans That Touch Back: Enhancing Nonverbal Communication with Virtual Humans through Bidirectional Touch

Aaron Kotranza; Benjamin Lok; Carla M. Pugh; D. Scott Lind

Touch is a powerful component of human communication, yet has been largely absent in communication between humans and virtual humans (VHs). This paper expands on recent work which allowed unidirectional touch from human to VH, by evaluating bidirectional touch as a new channel for nonverbal communication. A VH augmented with a haptic interface is able to touch her interaction partner using a pseudo-haptic touch or an active-haptic touch from a co-located mechanical arm. Within the context of a simulated doctor-patient interaction, two user studies (n = 54) investigate how touch can be used by both human and VH to communicate. Results show that human-to-VH touch is used for the same communication purposes as human-to-human touch, and that VH-to-human touch (pseudo-haptic and active-haptic) allows the VH to communicate with its human interaction partner. The enhanced nonverbal communication provided by bidirectional touch has the potential to solve difficult problems in VH research, such as disambiguating user speech, enforcing social norms, and achieving rapport with VHs.


IEEE Virtual Reality Conference | 2009

Virtual Experiences for Social Perspective-Taking

Andrew Raij; Aaron Kotranza; D. Scott Lind; Benjamin Lok

This paper proposes virtual social perspective-taking (VSP). In VSP, users are immersed in an experience of another person to aid in understanding that person's perspective. Users are immersed by 1) providing input to users' senses from logs of the target person's senses, 2) instructing users to act and interact like the target, and 3) reminding users that they are playing the role of the target. These guidelines are applied to a scenario where taking the perspective of others is crucial - the medical interview. A pilot study (n = 16) using this scenario indicates that VSP elicits reflection on the perspectives of others and changes behavior in future, similar social interactions. By encouraging reflection and change, VSP advances the state-of-the-art in training social interactions with virtual experiences.


IEEE Transactions on Visualization and Computer Graphics | 2012

Real-Time Evaluation and Visualization of Learner Performance in a Mixed-Reality Environment for Clinical Breast Examination

Aaron Kotranza; David Scott Lind; Benjamin Lok

We investigate the efficacy of incorporating real-time feedback of user performance within mixed-reality environments (MREs) for training real-world tasks with tightly coupled cognitive and psychomotor components. This paper presents an approach to providing real-time evaluation and visual feedback of learner performance in an MRE for training clinical breast examination (CBE). In a user study of experienced and novice CBE practitioners (n = 69), novices receiving real-time feedback performed equivalently or better than more experienced practitioners in the completeness and correctness of the exam. A second user study (n = 8) followed novices through repeated practice of CBE in the MRE. Results indicate that skills improvement in the MRE transfers to the real-world task of CBE of human patients. This initial case study demonstrates the efficacy of MREs incorporating real-time feedback for training real-world cognitive-psychomotor tasks.


IEEE Virtual Reality Conference | 2011

An initial exploration of conversational errors as a novel method for evaluating virtual human experiences

Richard Skarbez; Aaron Kotranza; Frederick P. Brooks; Benjamin Lok

We present a new method for evaluating user experience in interactions with virtual humans (VHs). We code the conversational errors made by the VH. These errors, in addition to the duration of the interaction and the numbers of statements made by the participant and the VH, provide objective, quantitative data about the virtual social interaction. We applied this method to a set of previously collected interactions between medical students and VH patients and present preliminary results. The error metrics do not correlate with traditional measures of the quality of a virtual experience, e.g., presence and copresence questionnaires. The error metrics were, however, significantly correlated with scores on the Maastricht Assessment of Simulated Patients (MaSP), a scenario-appropriate measure of simulation quality, suggesting that further investigation is warranted.


Virtual Reality Software and Technology | 2006

Mixed reality: are two hands better than one?

Aaron Kotranza; John Quarles; Benjamin Lok

For simulating hands-on tasks, the ease of enabling two-handed interaction with virtual objects gives Mixed Reality (MR) an expected advantage over Virtual Reality (VR). A user study examined whether two-handed interaction is critical for simulating hands-on tasks in MR. The study explored the effect of one- and two-handed interaction on task performance in an MR assembly task. When presented with an MR system, most users chose to interact with two hands. This choice was not affected by a user's past VR experience or the quantity and complexity of the real objects with which users interacted. Although two-handed interaction did not yield a significant performance improvement, two hands allowed subjects to perform the virtual assembly task similarly to the real-world task. Subjects using only one hand performed the task fundamentally differently, showing that affording two-handed interaction is critical for training systems.

Collaboration

Top co-authors of Aaron Kotranza:

D. Scott Lind (Georgia Regents University)
Angela L. Gucwa (Georgia Regents University)