Publication


Featured research published by Kenneth Salisbury.


Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems | 2007

Haptic Feedback Enhances Force Skill Learning

Dan Morris; Hong Z. Tan; Federico Barbagli; Timothy Chang; Kenneth Salisbury

This paper explores the use of haptic feedback to teach an abstract motor skill that requires recalling a sequence of forces. Participants are guided along a trajectory and are asked to learn a sequence of one-dimensional forces via three paradigms: haptic training, visual training, or combined visuohaptic training. The extent of learning is measured by accuracy of force recall. We find that recall following visuohaptic training is significantly more accurate than recall following visual or haptic training alone, although haptic training alone is inferior to visual training alone. This suggests that in conjunction with visual feedback, haptic training may be an effective tool for teaching sensorimotor skills that have a force-sensitive component, such as surgery. We also present a dynamic programming paradigm to align and compare spatiotemporal haptic trajectories.
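The dynamic-programming alignment mentioned in the closing sentence can be illustrated with a standard dynamic time warping recurrence. This is a sketch under assumptions: the function name and the absolute-difference cost are illustrative, not the paper's exact formulation.

```python
def dtw_distance(a, b):
    """Align two 1-D force sequences with dynamic time warping and
    return the cumulative alignment cost (0.0 for identical sequences)."""
    n, m = len(a), len(b)
    inf = float("inf")
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])      # local mismatch cost
            cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                 cost[i][j - 1],      # deletion
                                 cost[i - 1][j - 1])  # match
    return cost[n][m]
```

Because the warp path may stretch or compress time, this comparison tolerates the timing variation that a fixed sample-by-sample difference would penalize.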


IEEE Computer Graphics and Applications | 2006

Visuohaptic simulation of bone surgery for training and evaluation

Dan Morris; Christopher Sewell; Federico Barbagli; Kenneth Salisbury; Nikolas H. Blevins; Sabine Girod

Visual and haptic simulation of bone surgery can support and extend current surgical training techniques. The authors present a system for simulating surgeries involving bone manipulation, such as temporal bone surgery and mandibular surgery, and discuss the automatic computation of surgical performance metrics. Experimental results confirm the system's construct validity.


ACM Transactions on Applied Perception | 2006

Haptic discrimination of force direction and the influence of visual information

Federico Barbagli; Kenneth Salisbury; Cristy Ho; Charles Spence; Hong Z. Tan

Despite a wealth of literature on discrimination thresholds for displacement, force magnitude, stiffness, and viscosity, there is currently a lack of data on our ability to discriminate force directions. Such data are needed in designing haptic rendering algorithms where force direction, as well as force magnitude, is used to encode information such as surface topography. Given that haptic information is typically presented in addition to visual information in a data perceptualization system, it is also important to investigate the extent to which the congruency of visual information affects force-direction discrimination. In this article, the authors report an experiment on the discrimination threshold of force directions under the three display conditions of haptics alone (H), haptics plus congruent vision (HVcong), and haptics plus incongruent vision (HVincong). Average force-direction discrimination thresholds were found to be 18.4°, 25.6°, and 31.9° for the HVcong, H, and HVincong conditions, respectively. The results show that the congruency of visual information significantly affected haptic discrimination of force directions, and that the force-direction discrimination thresholds did not seem to depend on the reference force direction. The implications of the results for designing haptic virtual environments, especially when the numbers of sensors and actuators in a haptic display do not match, are discussed.


Computer Aided Surgery | 2008

Providing metrics and performance feedback in a surgical simulator.

Christopher Sewell; Dan Morris; Nikolas H. Blevins; Sanjeev Dutta; Sumit Agrawal; Federico Barbagli; Kenneth Salisbury

One of the most important advantages of computer simulators for surgical training is the opportunity they afford for independent learning. However, if the simulator does not provide useful instructional feedback to the user, this advantage is significantly blunted by the need for an instructor to supervise and tutor the trainee while using the simulator. Thus, the incorporation of relevant, intuitive metrics is essential to the development of efficient simulators. Equally important is the presentation of such metrics to the user so as to provide constructive feedback that facilitates independent learning and improvement. This paper presents a number of novel metrics for the automated evaluation of surgical technique. The general approach was to take criteria that are intuitive to surgeons and develop ways to quantify them in a simulator. Although many of the concepts behind these metrics have wide application throughout surgery, they have been implemented specifically in the context of a simulation of mastoidectomy. First, the visuohaptic simulator itself is described, followed by the details of a wide variety of metrics designed to assess the user's performance. We present mechanisms for presenting visualizations and other feedback based on these metrics during a virtual procedure. We further describe a novel performance evaluation console that displays metric-based information during an automated debriefing session. Finally, the results of several user studies are reported, providing some preliminary validation of the simulator, the metrics, and the feedback mechanisms. Several machine learning algorithms, including Hidden Markov Models and a Naïve Bayes Classifier, are applied to our simulator data to automatically differentiate users' expertise levels.
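As an illustration of the final sentence, a Gaussian Naive Bayes classifier over per-trial metric vectors might look like the following minimal sketch. The feature values, class labels, and function names here are invented for the example and are not the paper's implementation.

```python
import math

def train(samples_by_class):
    """samples_by_class: {label: [feature_vector, ...]}.
    Returns per-class (mean, variance) statistics for each feature."""
    model = {}
    for label, rows in samples_by_class.items():
        stats = []
        for col in zip(*rows):
            mu = sum(col) / len(col)
            var = sum((x - mu) ** 2 for x in col) / len(col) + 1e-9
            stats.append((mu, var))
        model[label] = stats
    return model

def classify(model, x):
    """Return the label maximizing the Gaussian log-likelihood of x."""
    best, best_lp = None, -math.inf
    for label, stats in model.items():
        lp = 0.0
        for xi, (mu, var) in zip(x, stats):
            lp += -0.5 * math.log(2 * math.pi * var) - (xi - mu) ** 2 / (2 * var)
        if lp > best_lp:
            best, best_lp = label, lp
    return best
```

Trained on labeled novice and expert trials (hypothetical metrics such as average drill force or bone-removal rate), the classifier assigns a new trial to whichever expertise level makes its metrics most probable.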


Medical Image Computing and Computer-Assisted Intervention | 2004

A Collaborative Virtual Environment for the Simulation of Temporal Bone Surgery

Dan Morris; Christopher Sewell; Nikolas H. Blevins; Federico Barbagli; Kenneth Salisbury

We describe a framework for training-oriented simulation of temporal bone surgery. Bone dissection is simulated visually and haptically, using a hybrid data representation that allows smooth surfaces to be maintained for graphic rendering while volumetric data is used for haptic feedback. Novel sources of feedback are incorporated into the simulation platform, including synthetic drill sounds based on experimental data and simulated monitoring of virtual nerve bundles. Realistic behavior is modeled for a variety of surgical drill burrs, rendering the environment suitable for training low-level drilling skills. The system allows two users to independently observe and manipulate a common model, and allows one user to experience the forces generated by the other’s contact with the bone surface. This permits an instructor to remotely observe a trainee and provide real-time feedback and demonstration.


Neurosurgery | 2013

Virtual reality simulation in neurosurgery: technologies and evolution.

Sonny Chan; Francois Conti; Kenneth Salisbury; Nikolas H. Blevins

Neurosurgeons are faced with the challenge of learning, planning, and performing increasingly complex surgical procedures in which there is little room for error. With improvements in computational power and advances in visual and haptic display technologies, virtual surgical environments can now offer potential benefits for surgical training, planning, and rehearsal in a safe, simulated setting. This article introduces the various classes of surgical simulators and their respective purposes through a brief survey of representative simulation systems in the context of neurosurgery. Many technical challenges currently limit the application of virtual surgical environments. Although we cannot yet expect a digital patient to be indistinguishable from reality, new developments in computational methods and related technology bring us closer every day. We recognize that the design and implementation of an immersive virtual reality surgical simulator require expert knowledge from many disciplines. This article highlights a selection of recent developments in research areas related to virtual reality simulation, including anatomic modeling, computer graphics and visualization, haptics, and physics simulation, and discusses their implication for the simulation of neurosurgery.


International Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems | 2004

Simulating human fingers: a soft finger proxy model and algorithm

Federico Barbagli; Antonio Frisoli; Kenneth Salisbury; Massimo Bergamasco

This paper presents models and algorithms that can be used to simulate contact between one or more fingertips and a virtual object. First, the paper presents various models for rotational friction obtained from in-vivo fingertip models previously proposed in the robotics and biomechanics communities. Then the paper describes two sets of experiments that were performed on in-vivo fingertips in order to understand which of the models presented fits best with the real rotational friction properties of the human fingertips. Finally, an extension of the god-object/proxy algorithm is proposed which allows the simulation of soft finger contact, i.e., a point-contact with friction capable of supporting moments (up to a torsional friction limit) about the contact normal. The resulting algorithm is computationally efficient, being point-based, while retaining a good level of realism.
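The soft-finger extension can be sketched in one dimension: the proxy stays on the surface, the penalty force pushes the device toward it, and the moment about the contact normal saturates at a torsional friction limit proportional to the normal force. The gains, names, and 1-DoF reduction below are assumptions for illustration; the paper's actual algorithm operates in 3-D.

```python
def soft_finger_step(device, surface, torque_cmd, k=500.0, mu_spin=0.002):
    """One proxy update for a soft-finger contact (1-DoF sketch).

    device:     haptic device position (may penetrate the surface)
    surface:    surface height; the proxy is constrained to stay on/above it
    torque_cmd: torque the user tries to transmit about the contact normal
    Returns (proxy_position, normal_force, transmitted_torque)."""
    proxy = max(device, surface)          # god-object: proxy never penetrates
    fn = k * (proxy - device)             # spring force toward the proxy
    t_limit = mu_spin * fn                # torsional friction limit
    torque = max(-t_limit, min(t_limit, torque_cmd))
    return proxy, fn, torque
```

When the commanded torque exceeds the limit, the contact spins: only the saturated torque is transmitted, which is exactly the soft-finger behavior the abstract describes.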


Medical Physics | 2010

Telerobotic system concept for real‐time soft‐tissue imaging during radiotherapy beam delivery

Jeffrey Schlosser; Kenneth Salisbury; Dimitre Hristov

PURPOSE The curative potential of external beam radiation therapy is critically dependent on having the ability to accurately aim radiation beams at intended targets while avoiding surrounding healthy tissues. However, existing technologies are incapable of real-time, volumetric, soft-tissue imaging during radiation beam delivery, when accurate target tracking is most critical. The authors address this challenge in the development and evaluation of a novel, minimally interfering, telerobotic ultrasound (U.S.) imaging system that can be integrated with existing medical linear accelerators (LINACs) for therapy guidance. METHODS A customized human-safe robotic manipulator was designed and built to control the pressure and pitch of an abdominal U.S. transducer while avoiding LINAC gantry collisions. A haptic device was integrated to remotely control the robotic manipulator motion and U.S. image acquisition outside the LINAC room. The ability of the system to continuously maintain high quality prostate images was evaluated in volunteers over extended time periods. Treatment feasibility was assessed by comparing a clinically deployed prostate treatment plan to an alternative plan in which beam directions were restricted to sectors that did not interfere with the transabdominal U.S. transducer. To demonstrate imaging capability concurrent with delivery, robot performance and U.S. target tracking in a phantom were tested with a 15 MV radiation beam active. RESULTS Remote image acquisition and maintenance of image quality with the haptic interface were successfully demonstrated over 10-minute periods in representative treatment setups of volunteers. Furthermore, the robot's ability to maintain a constant probe force and desired pitch angle was unaffected by the LINAC beam. For a representative prostate patient, the dose-volume histogram (DVH) for a plan with restricted sectors remained virtually identical to the DVH of a clinically deployed plan. With reduced margins, as would be enabled by real-time imaging, gross tumor volume coverage was identical while notable reductions of bladder and rectal volumes exposed to large doses were possible. The quality of U.S. images obtained during beam operation was not appreciably degraded by radiofrequency interference, and 2D tracking of a phantom object in U.S. images obtained with the beam on/off yielded no significant differences. CONCLUSIONS Remotely controlled robotic U.S. imaging is feasible in the radiotherapy environment and for the first time may offer real-time volumetric soft-tissue guidance concurrent with radiotherapy delivery.


The International Journal of Robotics Research | 2005

A Multirate Approach to Haptic Interaction with Deformable Objects: Single and Multipoint Contacts

Federico Barbagli; Domenico Prattichizzo; Kenneth Salisbury

In this paper we describe a new solution for stable haptic interaction with deformable object simulations featuring low servo rates and computational delays. The solution presented is a combination of the local model and the virtual coupling concepts proposed in the past. By varying the local model impedance depending on the local stiffness of the deformable object, the interaction between local model and simulation can always be made stable independently of low servo rates or computational delays. Moreover, by using more complex local impedances that feature an integral term, we are able to control the steady-state error between the device and the surface of the deformable object. This allows us to maximize the Z-width of the simulation, while obtaining overall stable behavior without using any added damping. The local model is always computed using the current deformable object surface, thus allowing for multi-point contact interaction, i.e., allowing multiple users to feel each other's influence on the object. The proposed solution is presented and analyzed in a multirate setting. Experimental results employing a Phantom haptic interface are presented.
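The multirate idea with an integral term can be sketched as a proportional-integral local model: the slow simulation refreshes the surface position and stiffness, while the fast haptic loop renders force between updates. Class names, gains, and rates below are assumptions for illustration, not the paper's values.

```python
class LocalModel:
    """PI-style local model rendered at the haptic rate (e.g. 1 kHz),
    refreshed by a slow deformable-object simulation (e.g. 30 Hz)."""

    def __init__(self, kp=200.0, ki=50.0, dt=0.001):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.surface = 0.0      # surface position from the slow simulation
        self.err_int = 0.0      # integral of the penetration error

    def update_from_simulation(self, surface_pos, local_stiffness):
        # Slow-rate callback: track the object's current surface and match
        # the local model impedance to the object's local stiffness.
        self.surface = surface_pos
        self.kp = local_stiffness

    def force(self, device_pos):
        # Fast-rate callback: render force from the local model only, so the
        # haptic loop never waits on the deformable simulation.
        penetration = self.surface - device_pos
        if penetration <= 0.0:
            self.err_int = 0.0      # free motion: reset the integrator
            return 0.0
        self.err_int += penetration * self.dt
        return self.kp * penetration + self.ki * self.err_int
```

The integral term drives the steady-state error between device and surface toward zero, which is what lets the coupling stay stiff (large Z-width) without added damping.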


Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems | 2006

Haptically Annotated Movies: Reaching Out and Touching the Silver Screen

Derek Gaw; Dan Morris; Kenneth Salisbury

The film industry consistently strives to make the movie-going experience more immersive and more captivating, through larger screens, higher-quality images, and increasingly sophisticated speaker systems. Currently, however, presentation of movies is limited to the visual and auditory senses. Haptics provides significant potential for augmenting the theater experience beyond those sensory modalities. In this paper and the accompanying demo, we present a system for recording and annotating haptic information that is time-referenced to a movie, then replaying the recorded haptic information to a user. We discuss several user interface issues that we addressed and several scenarios that are augmented by this system.
