Network


Latest external collaborations at the country level. Dive into details by clicking on the dots.

Hotspot


Dive into the research topics where Greg S. Ruthenbeck is active.

Publications


Featured research published by Greg S. Ruthenbeck.


Journal of Simulation | 2015

Virtual reality for medical training: the state-of-the-art

Greg S. Ruthenbeck; Karen J. Reynolds

Virtual reality (VR) medical simulations deliver a tailored learning experience that can be standardized, and can cater to different learning styles in ways that cannot be matched by traditional teaching. These simulations also facilitate self-directed learning, allow trainees to develop skills at their own pace and allow unlimited repetition of specific scenarios that enable them to remedy skills deficiencies in a safe environment. A number of simulators have been validated and have shown clear benefits to medical training. However, while graphical realism is high, realistic haptic feedback and interactive tissues are limited for many simulators. This paper reviews the current status and benefits of haptic VR simulation-based medical training for bone and dental surgery, intubation procedures, eye surgery, and minimally invasive and endoscopic surgery.


American Journal of Rhinology & Allergy | 2013

Toward photorealism in endoscopic sinus surgery simulation

Greg S. Ruthenbeck; Jonathan C. Hobson; A. Simon Carney; Steve Sloan; Raymond Sacks; Karen J. Reynolds

Background: Endoscopic sinus surgery (ESS) is the surgical standard treatment for chronic rhinitis/rhinosinusitis and nasal polyposis. There is a reported complication rate of 5–10% associated with this type of surgery. Simulation has been advocated as a means to improve surgical training and minimize the rates of complication and medical error. This study aimed to show how a virtual reality ESS simulator was developed, with particular emphasis on achieving satisfactory photorealism and surgical verisimilitude.

Methods: Sinus computed tomography scans were processed to create a triangle-based three-dimensional mesh model; this was incorporated into a spring-damper model of thousands of interconnected nodes, which is allowed to deform in response to user interactions. Dual haptic handpiece devices were programmed to simulate an endoscope and various surgical instruments. Textures and lighting effects were added to the mesh model to provide an accurate representation of the surgical field. Effects such as vasoconstriction in response to “virtual” decongestant were added.

Results: The final simulated endoscopic view of the sinuses accurately simulates the moist and glossy appearance of the sinuses. The interactive tissue simulation system enables the user to interactively cut and remove tissue while receiving accurate haptic feedback. A working prototype of the simulator has been developed that leverages recent advances in computer hardware to deliver a realistic user experience, both visually and haptically.

Conclusion: This new computer-based training tool for practicing ESS provides a risk-free environment for surgical trainees to practice and develop core skills. The novel use of customized precision force feedback (haptic) devices enables trainees to use movements during training that closely mimic those used during the actual procedure, which we anticipate will improve learning, retention, and recall.
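
The abstract above describes tissue modelled as a spring-damper network of interconnected nodes that deforms in response to user interaction. As an illustration of that general technique only (not the simulator's actual implementation), a minimal sketch with illustrative stiffness, damping, and time-step values:

```python
# Minimal spring-damper tissue sketch: nodes connected by springs,
# integrated with explicit Euler. All parameter values are illustrative.
import numpy as np

def step(pos, vel, springs, rest_len, k=50.0, c=2.0, dt=0.01):
    """Advance node positions/velocities by one time step.
    pos, vel: (N, 3) arrays; springs: list of (i, j) node-index pairs."""
    force = np.zeros_like(pos)
    for (i, j), L0 in zip(springs, rest_len):
        d = pos[j] - pos[i]
        length = np.linalg.norm(d)
        if length < 1e-9:
            continue
        direction = d / length
        # Hooke spring force plus damping along the spring axis
        f = k * (length - L0) * direction
        f += c * np.dot(vel[j] - vel[i], direction) * direction
        force[i] += f
        force[j] -= f
    vel = vel + dt * force   # unit node mass assumed
    pos = pos + dt * vel
    return pos, vel
```

For example, two nodes stretched beyond their rest length pull back toward each other on the next step; a full simulator would run thousands of such nodes per frame and add user-interaction forces from the haptic device.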


Journal of Simulation | 2013

Virtual reality surgical simulator software development tools

Greg S. Ruthenbeck; Karen J. Reynolds

Virtual reality (VR) surgical simulations are among the most difficult software applications to develop, mainly because of the type of user interactions that they must support. Surgery typically includes precise cutting of often intricate structures. Modelling these structures and accurately simulating their response to user interaction requires many software components to work effectively in unison. Some of these components are readily available but are tailored to more common applications such as computer games or open-world simulations such as flight simulators. This article explores the software libraries that are currently available to developers of VR surgical simulation software. Like computer games and other VR simulations, VR surgical simulations require real-time lighting and rendering systems and physics-based interactions. However, in addition they require haptic interaction with cuttable and deformable soft tissue, a key requirement that is not supported by the majority of the available tools. In this article, we introduce currently available software development tools and the specific benefits and limitations that can be encountered when using them to develop VR surgical simulations. We also provide a detailed review of collision detection libraries that are central to achieving reliable haptic rendering.


ANZ Journal of Surgery | 2016

Comparing surgical experience with performance on a sinus surgery simulator.

Laura Diment; Greg S. Ruthenbeck; Nuwan Dharmawardana; A. Simon Carney; Charmaine M. Woods; Eng Hooi Ooi; Karen J. Reynolds

This study evaluates whether surgical experience influences technical competence using the Flinders sinus surgery simulator, a virtual environment designed to teach nasal endoscopic surgical skills.


IEEE Haptics Symposium | 2014

Hapteo: Sharing visual-haptic experiences from virtual environments

Yongyao Yan; Greg S. Ruthenbeck; Karen J. Reynolds

For over two decades, haptics has provided people with new computer interaction styles across a range of applications. However, it is difficult to share haptic experiences and compare haptic rendering algorithms. In this paper, we introduce a new system called Hapteo that enables “haptic videos” (videos with haptic recordings) to be recorded and replayed without requiring haptic virtual environment software to be deployed. Our system enables users to capture visual-haptic information easily and share it using the Hapteo Player. In order to provide a more realistic and interactive experience, contact surfaces and haptic forces are modeled with radial basis functions, and videos are replayed in a haptic-driven style.
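
The abstract states that contact surfaces and haptic forces are modelled with radial basis functions so playback does not need the original virtual environment. A sketch of that general idea, assuming a Gaussian kernel and exact interpolation (the paper's actual kernel and fitting choices may differ):

```python
# Sketch of radial-basis-function force reconstruction: fit recorded
# (stylus position, force) samples, then query forces at new positions.
# The Gaussian kernel and eps value are assumptions for illustration.
import numpy as np

def rbf_fit(centers, forces, eps=1.0):
    """Solve for per-center weights so the RBF reproduces the samples.
    centers: (N, 3) recorded positions; forces: (N, 3) recorded forces."""
    d = np.linalg.norm(centers[:, None, :] - centers[None, :, :], axis=-1)
    phi = np.exp(-(eps * d) ** 2)          # Gaussian basis matrix
    return np.linalg.solve(phi, forces)    # (N, 3) weight vectors

def rbf_eval(query, centers, weights, eps=1.0):
    """Interpolated force at a query stylus position."""
    d = np.linalg.norm(centers - query, axis=-1)
    return np.exp(-(eps * d) ** 2) @ weights
```

By construction the model reproduces the recorded force exactly at each sample position and interpolates smoothly between samples, which is what allows a "haptic video" to be replayed interactively.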


Computer Methods in Biomechanics and Biomedical Engineering | 2015

Real-time interactive isosurfacing: a new method for improving marching isosurfacing algorithm output and efficiency

Greg S. Ruthenbeck; Fabian S. Lim; Karen J. Reynolds

Efficient rendering of a changing volumetric data-set is central to the development of effective medical simulations that incorporate haptic feedback. A new method referred to as real-time interactive isosurfacing (RTII) is described in this paper. RTII is an algorithm that can be applied to output from Marching Cubes-like algorithms to improve performance for real-time applications. The approach minimises processing by re-evaluating the isosurface around changing sub-volumes resulting from user interactions. It includes innovations that significantly reduce mesh complexity and improve mesh quality as triangles are created from the Marching Tetrahedra isosurfacing algorithm. Rendering efficiency is further improved over other marching isosurfacing algorithm outputs by maintaining an indexed triangle representation of the mesh. The effectiveness of RTII is discussed within the context of an endoscopic sinus surgery simulation currently being developed by the authors.
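
The core idea described above is to avoid re-running the full marching pass: cache the per-cell surface output and re-extract only cells inside the changed sub-volume. A conceptual sketch of that caching scheme, where `extract_cell` is a toy stand-in for a Marching Cubes/Tetrahedra case table:

```python
# Conceptual sketch of incremental isosurfacing: only cells whose
# indices fall inside a changed sub-volume are re-evaluated.
# extract_cell is a simplified stand-in for a full marching case table.
import numpy as np

def extract_cell(field, i, j, k, iso=0.0):
    """Toy stand-in: report whether the isosurface crosses this cell."""
    corners = field[i:i+2, j:j+2, k:k+2]
    return bool(corners.min() < iso < corners.max())

def update_region(field, cache, lo, hi, iso=0.0):
    """Re-extract only cells with indices in the half-open box [lo, hi)."""
    for i in range(lo[0], hi[0]):
        for j in range(lo[1], hi[1]):
            for k in range(lo[2], hi[2]):
                cache[(i, j, k)] = extract_cell(field, i, j, k, iso)
    return cache
```

In a real simulator each cache entry would hold indexed triangles rather than a flag, but the dirty-region bookkeeping is the same: a user "cut" modifies a small sub-volume of the scalar field, and only that region's cells are revisited.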


Simulation | 2014

Edge concealment in a combined surface mesh and scalar-field tissue model for surgical simulations

Greg S. Ruthenbeck; Karen J. Reynolds

This article describes a new approach for producing highly realistic visualizations that are interactively cuttable by utilizing the programmability of the graphics rendering pipeline. It combines interactively changing scalar-field-derived mesh geometry with static mesh geometry that contains additional lighting terms created offline using three-dimensional modeling software packages. This improves the visual realism of surgical simulations whilst enabling more efficient surface representations for interactive areas of the same model, in this case the newly formed surface created when interactively cutting a model. The boundary between the interactively cut surface (generated from the scalar field) and the remaining surface triangles of the static model is jagged and unrealistic when unenhanced. Here we describe a method for blending the two models using a simple bleeding effect along the cut edge. This allows the cut edge and the internal cut surface to blend, thereby concealing unrealistic and distracting jagged cut edges. The bloodied edge is more realistic than an unmodified hard edge, which improves the overall quality of the simulation. Moreover, as available processing power increases, the achievable resolution will also increase, which should allow this method to be extended to slice-cutting simulation.
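
The blending idea above amounts to tinting geometry near the cut boundary so the seam between the two meshes is visually masked. A CPU-side sketch of that effect (the paper does this in the GPU rendering pipeline; the falloff width and colours here are illustrative assumptions):

```python
# Sketch of the edge-concealment idea: blend each vertex colour toward
# a blood tint by proximity to the cut edge, masking the jagged seam.
# The falloff width and colour values are illustrative only.
import numpy as np

def bleed_tint(vertices, edge_points, base_rgb,
               blood_rgb=(0.55, 0.05, 0.05), width=0.1):
    """Return per-vertex RGB blended toward blood_rgb near the cut edge."""
    verts = np.asarray(vertices, float)
    edges = np.asarray(edge_points, float)
    # Distance from each vertex to its nearest cut-edge point
    d = np.linalg.norm(verts[:, None, :] - edges[None, :, :], axis=-1).min(axis=1)
    t = np.clip(1.0 - d / width, 0.0, 1.0)[:, None]  # 1 at the edge, 0 beyond width
    return (1 - t) * np.asarray(base_rgb, float) + t * np.asarray(blood_rgb, float)
```

Vertices on the cut edge take the full blood colour while vertices farther than the falloff width keep the base material colour, so the hard geometric seam reads as a continuous bloodied margin.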


International Conference on Human Haptic Sensing and Touch Enabled Computer Applications | 2014

A Genetic Algorithm Approach to Identify Virtual Object Properties for Sharing the Feel from Virtual Environments

Yongyao Yan; Greg S. Ruthenbeck; Karen J. Reynolds

Haptics has provided people with new computer interaction styles across a range of applications. However, it is difficult to share haptic experiences from haptic virtual environments (HVEs). In this paper, we introduce a genetic algorithm (GA) approach, which is used to identify the virtual object’s properties (e.g. stiffness, friction coefficient and geometry parameters) based on haptic recordings, so that the haptic rendering can be reproduced without requiring the original HVE software to be deployed.
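
The abstract describes evolving candidate object properties until a forward model reproduces the haptic recording. A minimal sketch of that loop, assuming a simple linear contact model (force = stiffness × penetration) and generic GA settings, none of which are claimed to match the paper's configuration:

```python
# Minimal genetic-algorithm sketch: recover an object's stiffness from
# a haptic force recording by evolving candidates against a forward
# contact model. The f = k * penetration model and all GA settings
# (population, mutation sigma, generations) are illustrative assumptions.
import random

def simulate(k, penetrations):
    return [k * p for p in penetrations]

def fitness(k, penetrations, recorded):
    """Negative squared error between simulated and recorded forces."""
    return -sum((f - r) ** 2 for f, r in zip(simulate(k, penetrations), recorded))

def evolve(penetrations, recorded, pop_size=30, generations=60, seed=0):
    rng = random.Random(seed)
    pop = [rng.uniform(0.0, 1000.0) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda k: fitness(k, penetrations, recorded), reverse=True)
        elite = pop[: pop_size // 2]
        # Refill the population with mutated copies of the better half
        pop = elite + [k + rng.gauss(0.0, 10.0) for k in elite]
    return max(pop, key=lambda k: fitness(k, penetrations, recorded))
```

A multi-parameter version would evolve vectors (stiffness, friction coefficient, geometry parameters) with crossover as well as mutation, but the fitness signal is the same: how closely the re-simulated forces match the recording.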


International Conference on Human Haptic Sensing and Touch Enabled Computer Applications | 2016

Does Haptic Feedback Improve Learning and Recall of Spatial Information? A Study Using a Virtual Reality Nasendoscopy Simulation

Greg S. Ruthenbeck; Michael Tlauka; Andria Tan

In the literature, haptic training has long been regarded as an effective means of acquiring skills that involve force feedback. This is relevant in the context of haptic virtual reality applications, where it is argued that the addition of haptics increases the effectiveness of the training system. Here we describe an experimental investigation which examines whether haptic feedback increases people's spatial knowledge of a simulation. In particular, we address the following question: Is visuo-haptic interaction a more effective way of learning spatial information than purely visual interaction? A comparison of two groups of participants (visual versus visuo-haptic) revealed no significant differences in their spatial knowledge of the simulation. The findings are discussed with reference to potential variables which may affect spatial learning, such as cognitive load.


International Symposium on Multimedia | 2014

Haptic-Driven Visual Playback

Yongyao Yan; Greg S. Ruthenbeck; Karen J. Reynolds

Haptics refers to tactile feedback technology that allows users to touch and feel objects in virtual environments. However, the visual-haptic experience is hard to share because it depends heavily on the computer hardware and the haptic virtual environment software. In our previous work, we proposed a system that records and plays back visual-haptic information from haptic virtual environments. In this paper, we focus on the system's haptic-driven visual playback, which enables users to replay visual information based on the spatial position of the haptic stylus.
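
The abstract's central idea, driving playback by stylus position rather than by timestamp, can be sketched as a nearest-neighbour lookup over the recorded stylus trajectory. A minimal illustration of that general mechanism (function and variable names are assumptions, not the paper's API):

```python
# Sketch of haptic-driven playback: instead of advancing video frames
# by time, pick the recorded frame whose stylus position is closest to
# the user's live stylus position. Names are illustrative.
import numpy as np

def nearest_frame(stylus_pos, recorded_positions):
    """Index of the recorded frame nearest to the live stylus position.
    recorded_positions: (N, 3) stylus positions, one per video frame."""
    d = np.linalg.norm(np.asarray(recorded_positions, float)
                       - np.asarray(stylus_pos, float), axis=1)
    return int(np.argmin(d))
```

As the user moves the stylus along the recorded path, the lookup scrubs the video forwards and backwards in step with the motion, which is what makes the replay feel interactive rather than passive.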

Collaboration


Dive into Greg S. Ruthenbeck's collaborations.

Top Co-Authors


Eng Hooi Ooi

Flinders Medical Centre
