

Publications


Featured research published by Lauren Cairco Dukes.


Intelligent User Interfaces | 2013

SIDNIE: scaffolded interviews developed by nurses in education

Lauren Cairco Dukes; Toni Bloodworth Pence; Larry F. Hodges; Nancy Meehan; Arlene Johnson

One of the most common clinical education methods for teaching patient interaction skills to nursing students is role-playing established scenarios with their classmates. Unfortunately, this is far from simulating the real-world experiences they will soon face, and it does not provide the immediate, impartial feedback necessary for developing interviewing skills. We present a system for Scaffolded Interviews Developed by Nurses In Education (SIDNIE) that supports baccalaureate nursing education by providing multiple guided interview practice sessions with virtual characters. Our scenario depicts a mother who has brought her five-year-old child to the clinic. In this paper we describe our system and report on a preliminary usability evaluation conducted with nursing students.


IEEE International Conference on Healthcare Informatics | 2013

The Effects of Interaction and Visual Fidelity on Learning Outcomes for a Virtual Pediatric Patient System

Toni Bloodworth Pence; Lauren Cairco Dukes; Larry F. Hodges; Nancy Meehan; Arlene Johnson

One of the most common clinical education methods for teaching patient interaction skills to nursing students is role-playing established scenarios with their classmates. Unfortunately, this is far from simulating the real-world experiences they will soon face, and it does not provide the immediate, impartial feedback necessary for developing interviewing skills. We developed a system for Scaffolded Interviews Developed by Nurses In Education (SIDNIE) that supports baccalaureate nursing education by providing multiple guided interview practice sessions with virtual characters. During the development and evaluation of SIDNIE we recognized the importance of determining the visual and interaction fidelity requirements necessary for effective learning. In this paper we report on two fidelity studies conducted with nursing students. The goal of the visual fidelity study was to determine whether virtual characters with life-like animations would affect learning, or whether a stationary image of our virtual environment would produce the same effect. The second study focused on the interaction fidelity of our system; its goal was to determine whether the interaction modality affected learning outcomes. In particular, we evaluated the effect of voice input compared to standard mouse-click input for question selection.


IEEE Virtual Reality Conference | 2014

Tablet-based interaction panels for immersive environments

David M. Krum; Thai Phan; Lauren Cairco Dukes; Peter Wang; Mark T. Bolas

With the current widespread interest in head-mounted displays, we perceived a need for devices that support expressive and adaptive interaction in a low-cost, eyes-free manner. Leveraging rapid prototyping techniques for fabrication, we have designed and manufactured a variety of panels that can be overlaid on multi-touch tablets and smartphones. The panels are coupled with an app running on the multi-touch device that exchanges commands and state information over a wireless network with the virtual reality application. Sculpted features of the panels provide tactile disambiguation of control widgets, and an onscreen heads-up display provides interaction state information. A variety of interaction mappings can be provided through software to support several classes of interaction techniques in virtual environments. We foresee additional uses for applications where eyes-free use and adaptable interaction interfaces can be beneficial.
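The abstract does not specify the wire protocol used between the panel app and the VR application. As a rough illustration of the idea of exchanging widget commands and state over a network, here is a minimal sketch using JSON datagrams over UDP, run entirely over loopback; the message fields and widget names are assumptions for the example, not details from the paper.

```python
import json
import socket

# Hypothetical message format: the panel app sends control-widget events as
# small JSON datagrams; a VR host would reply with state updates for the
# heads-up display. All field names here are illustrative assumptions.

def encode_event(widget_id: str, action: str, value: float = 0.0) -> bytes:
    """Serialize one control-widget event for transmission."""
    return json.dumps(
        {"widget": widget_id, "action": action, "value": value}
    ).encode("utf-8")

def decode_event(payload: bytes) -> dict:
    """Deserialize a datagram back into an event dictionary."""
    return json.loads(payload.decode("utf-8"))

# Loopback demonstration: one UDP socket stands in for the VR host,
# another for the tablet panel app.
vr_host = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
vr_host.bind(("127.0.0.1", 0))          # let the OS pick a free port
tablet = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

tablet.sendto(encode_event("slider_1", "drag", 0.75),
              vr_host.getsockname())
payload, _ = vr_host.recvfrom(1024)
received = decode_event(payload)

tablet.close()
vr_host.close()
```

A real system would add acknowledgements or state echoes from the VR host so the panel's heads-up display can reflect current interaction state, as the abstract describes.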


IEEE Virtual Reality Conference | 2014

Development of a scenario builder tool for scaffolded virtual patients

Lauren Cairco Dukes; Larry F. Hodges

Simulation training using virtual patients can provide many benefits for nursing and medical students in learning patient interaction skills, especially since virtual patients can represent a wide range of patient scenarios and demographics that are not typically represented through roleplay or clinical rotations. However, creation of a single virtual patient scenario may take six months or more, so it is difficult to provide many unique scenarios over the course of a nurse's education. In this work, I propose the user-centered design, creation, and evaluation of a scenario-builder tool that enables nursing faculty to create their own scenarios, reducing development costs.


IEEE Virtual Reality Conference | 2014

Automated calibration of display characteristics (ACDC) for head-mounted displays and arbitrary surfaces

J. Adam Jones; Lauren Cairco Dukes; Mark T. Bolas

In this document we present a method for calibrating head-mounted displays and other display surfaces using an automated, low-cost camera system. A unique aspect of this method is that calibration of geometric distortions, field of view, and chromatic aberration is achieved without a priori knowledge of the display system's intrinsic parameters. The method operates by capturing and storing the pixel-space locations of a series of real-world control points. These control points then serve as ground-truth references from which virtual-space transformations can be automatically generated for a display system.
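The core idea of the abstract, observed pixel positions of known control points acting as ground truth for a correction transform, can be illustrated with a toy sketch. The control-point coordinates and the inverse-distance-weighted interpolation below are assumptions chosen for brevity, not the paper's actual method (which also handles field of view and chromatic aberration).

```python
import math

# Illustrative control-point data (an assumption, not from the paper):
# (intended_x, intended_y) -> (observed_x, observed_y) as seen by the camera.
control_points = {
    (100, 100): (104, 97),
    (500, 100): (497, 99),
    (100, 400): (103, 404),
    (500, 400): (495, 402),
}

def correction_at(x, y, power=2.0):
    """Inverse-distance-weighted correction offset (dx, dy) at a pixel."""
    num_x = num_y = denom = 0.0
    for (ix, iy), (ox, oy) in control_points.items():
        dx, dy = ix - ox, iy - oy      # offset that undoes the observed shift
        d = math.hypot(x - ix, y - iy)
        if d == 0.0:
            return (dx, dy)            # exactly on a control point
        w = 1.0 / d ** power
        num_x += w * dx
        num_y += w * dy
        denom += w
    return (num_x / denom, num_y / denom)

def corrected(x, y):
    """Pre-distort a pixel so it lands at its intended location."""
    dx, dy = correction_at(x, y)
    return (x + dx, y + dy)
```

The interpolation scheme is the simplest that fits in a few lines; a real calibration would fit a parametric warp (per color channel, to handle chromatic aberration) rather than interpolate raw offsets.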


Symposium on 3D User Interfaces | 2013

Poster: Comparing usability of a single versus dual interaction metaphor in a multitask healthcare simulation

Lauren Cairco Dukes; Jeffrey W. Bertrand; Manan Gupta; Rowan Armstrong; Tracy Fasolino; Sabarish V. Babu; Larry F. Hodges

We present the results of a user study performed within a multitask healthcare simulation, where nurses are required to care for virtual patients within a 3D virtual environment while recording data in a 2D graphical user interface (GUI) based electronic health record system. We evaluated whether a single interaction metaphor of mouse and keyboard for both virtual and GUI sub-systems of our simulation was superior in terms of user preference and performance to a dual interaction metaphor of using touchscreen for the virtual environment while using mouse and keyboard for the GUI. User preference and performance both indicate that the single interaction metaphor was more usable, although each technique was sufficiently usable for accomplishing simulation goals.


IEEE International Conference on Healthcare Informatics | 2016

Usability Evaluation of a Pediatric Virtual Patient Creation Tool

Lauren Cairco Dukes; Nancy Meehan; Larry F. Hodges

Virtual patients are computer simulations that behave as an actual patient would in a medical context. Because these characters are simulated, they can provide realistic yet repeatable practice in patient interaction: they can represent a wide range of patients, and each scenario can be practiced until the student achieves competency. However, development costs for virtual patients are high, since creation of a single scenario may take up to nine months. In this work, we present a virtual patient platform that reduces development costs. The SIDNIE (Scaffolded Interviews Developed by Nurses in Education) system can adapt a single scenario to multiple levels of learners and supports the selection of multiple learning goals. Previously, we worked with nurse educators in a participatory design process to create a scenario builder tool for SIDNIE [paper in submission]. In this work we detail a usability evaluation of the scenario creation tool. We found that nurse educators were able to use our tool to create a virtual patient scenario in less than two hours.


Collaboration Technologies and Systems | 2015

Correction of geometric distortions and the impact of eye position in virtual reality displays

J. Adam Jones; Lauren Cairco Dukes; David M. Krum; Mark T. Bolas; Larry F. Hodges

A common technology used to present immersive, collaborative environments is the head-mounted virtual reality (VR) display. However, due to engineering limitations, variability in manufacturing, and person-to-person differences in eye position, virtual environments are often geometrically inaccurate. Correcting these inaccuracies typically requires complicated or interactive calibration procedures. In this document we present a method for calibrating head-mounted displays and other display surfaces using an automated, low-cost camera system. A unique aspect of this method is that calibration of geometric distortions, field of view, and chromatic aberration is achieved without a priori knowledge of the display system's intrinsic parameters. Since this calibration method can easily measure display distortions, we further extend our work to measure the effect of eye position on the apparent location of imagery presented in a VR head-mounted display. We tested a range of plausible eye positions that may result from person-to-person variations in display placement and interpupillary distance. We observed that the pattern of geometric distortions introduced by the display's optical system changes substantially as the eye moves from one position to the next. Though many commercial and research VR systems calibrate for interpupillary distance and optical distortions separately, this may be insufficient, as eye position influences distortion characteristics.


Intelligent Virtual Agents | 2014

An Eye Tracking Evaluation of a Virtual Pediatric Patient Training System for Nurses

Toni Bloodworth Pence; Lauren Cairco Dukes; Larry F. Hodges; Nancy Meehan; Arlene Johnson

We report an eye tracking experiment conducted on a virtual pediatric patient system to determine the effect of different visual interface layouts on the amount of visual interaction of student nurses with a virtual mother-child dyad. The results of this experiment provide insight into the tradeoffs between using animated or non-animated virtual patients and the relative advantages and disadvantages of interacting with the system through a tablet interface.


IEEE Virtual Reality Conference | 2014

A demonstration of tablet-based interaction panels for immersive environments

David M. Krum; Thai Phan; Lauren Cairco Dukes; Peter Wang; Mark T. Bolas

Our demo addresses the need in immersive virtual reality for devices that support expressive and adaptive interaction in a low-cost, eyes-free manner. Leveraging rapid prototyping techniques for fabrication, we have developed a variety of panels that can be overlaid on multi-touch tablets and smartphones. The panels are coupled with an app running on the multi-touch device that exchanges commands and state information over a wireless network with the virtual reality application. Sculpted features of the panels provide tactile disambiguation of control widgets, and an onscreen heads-up display provides interaction state information. A variety of interaction mappings can be provided through software to support several classes of interaction techniques in virtual environments. We foresee additional uses for applications where eyes-free use and adaptable interaction interfaces can be beneficial.

Collaboration


Dive into Lauren Cairco Dukes's collaborations.

Top Co-Authors

- Mark T. Bolas, University of Southern California
- David M. Krum, University of Southern California
- Thai Phan, University of Southern California