Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where David Black is active.

Publication


Featured research published by David Black.


International Journal of Medical Robotics and Computer Assisted Surgery | 2013

Auditory support for resection guidance in navigated liver surgery.

Christian Hansen; David Black; Christoph Lange; F Rieber; W Lamadé; Marcello Donati; Karl J. Oldhafer; Horst K. Hahn

An alternative mode of interaction with navigation systems for open liver surgery is needed: surgeons who use such systems are impeded by having to switch constantly between viewing the navigation system screen and the patient during an operation.


Computer Assisted Radiology and Surgery | 2017

A survey of auditory display in image-guided interventions

David Black; Christian Hansen; Arya Nabavi; Ron Kikinis; Horst K. Hahn

Purpose: This article investigates the current state of the art of auditory display in image-guided medical interventions. Auditory display is a means of conveying information using sound; we review its use to support navigated interventions, discuss the benefits and drawbacks of published systems, and outline directions for future investigation.

Methods: We undertook a review of scientific articles on auditory rendering in image-guided intervention, including methods for avoiding risk structures and for instrument placement and manipulation. The review did not include auditory display for status monitoring, for instance in anesthesia.

Results: We identified 15 publications in the course of the search. Most of the literature (60%) investigates the use of auditory display to convey the distance of a tracked instrument to an object using proximity or safety margins; the remainder discusses continuous guidance for navigated instrument placement. Four of the articles present clinical evaluations, 11 present laboratory evaluations, and 3 present informal evaluations (2 present both laboratory and clinical evaluations).

Conclusion: Auditory display is a growing field that has been largely neglected in research on image-guided intervention. Despite the benefits of auditory displays reported in both the reviewed literature and non-medical fields, adoption in medicine has been slow. Future challenges include increasing interdisciplinary cooperation with auditory display investigators to develop more meaningful auditory display designs, and comprehensive evaluations that target the benefits and drawbacks of auditory display in image guidance.


Computer Assisted Radiology and Surgery | 2017

Instrument-mounted displays for reducing cognitive load during surgical navigation

Marc Herrlich; Parnian Tavakol; David Black; Dirk Wenig; Christian Rieder; Rainer Malaka; Ron Kikinis

Purpose: Surgical navigation systems rely on a monitor placed in the operating room to relay information. Optimal monitor placement can be challenging in crowded rooms, and it is often not possible to place the monitor directly beside the situs; the operator must split attention between the navigation system and the situs. We present an approach for needle-based interventions that provides navigational feedback directly on the instrument and close to the situs by mounting a small display onto the needle.

Methods: By mounting a small, lightweight smartwatch display directly onto the instrument, we are able to provide navigational guidance close to the situs and directly in the operator's field of view, reducing the need to switch the focus of view between the situs and the navigation system. We devise a variant of the established crosshair metaphor suited to the very limited screen space, and conduct an empirical user study comparing our approach to using a monitor and to a combination of both.

Results: The empirical user study shows significant benefits in cognitive load, user preference, and general usability for the instrument-mounted display, while achieving the same level of performance in terms of time and accuracy as using a monitor.

Conclusion: We successfully demonstrate the feasibility of our approach and its potential benefits. With ongoing technological advancements, instrument-mounted displays might complement standard monitor setups for surgical navigation, lowering cognitive demands and improving the usability of such systems.


Computer Assisted Radiology and Surgery | 2017

Auditory feedback to support image-guided medical needle placement

David Black; Julian Hettig; Maria Luz; Christian Hansen; Ron Kikinis; Horst K. Hahn

Purpose: During medical needle placement using image-guided navigation systems, the clinician must concentrate on a screen. To reduce the clinician's visual reliance on the screen, this work proposes an auditory feedback method, usable stand-alone or in support of visual feedback, for placing the navigated medical instrument, in this case a needle.

Methods: An auditory synthesis model using pitch comparison and stereo-panning parameter mapping was developed to augment or replace visual feedback for navigated needle placement. In contrast to existing approaches, which augment but still require a visual display, this method allows view-free needle placement. An evaluation with 12 novice participants compared both auditory and combined audiovisual feedback against existing visual methods.

Results: Using the combined audiovisual display, participants show similar task completion times and report similar subjective workload and accuracy while viewing the screen less than with the conventional visual method. Auditory-only feedback leads to higher task completion times and subjective workload than both combined and visual feedback.

Conclusion: Audiovisual feedback shows promising results and establishes a basis for applying auditory feedback as a supplement to visual information in other navigated interventions, especially those for which viewing the patient is beneficial or necessary.
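The abstract's pitch-comparison and stereo-panning mapping can be sketched as a small pure function. This is an illustrative reconstruction, not the paper's implementation: the frequency range, the 50 mm normalization distance, and the function name are all assumptions.

```python
import math

def needle_feedback(dx_mm: float, dy_mm: float,
                    f_target: float = 440.0, f_max: float = 880.0,
                    range_mm: float = 50.0) -> tuple:
    """Map a 2-D needle-tip error to (frequency_hz, pan).

    Pitch comparison: a guidance tone falls toward a reference pitch
    (f_target) as the tip approaches the target. Stereo panning: the
    left/right error is mapped to pan position (-1 = hard left,
    +1 = hard right). All ranges here are illustrative placeholders.
    """
    # Normalized distance to target, clamped to [0, 1]
    dist = min(math.hypot(dx_mm, dy_mm), range_mm) / range_mm
    freq = f_target + dist * (f_max - f_target)  # on target -> f_target
    pan = max(-1.0, min(1.0, dx_mm / range_mm))  # clamp to [-1, 1]
    return freq, pan
```

A synthesizer would then update an oscillator's frequency and a stereo panner from these two values on every tracking update.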


Computer Assisted Radiology and Surgery | 2018

Auditory display as feedback for a novel eye-tracking system for sterile operating room interaction

David Black; Michael Unger; Nele M. Fischer; Ron Kikinis; Horst K. Hahn; Thomas Neumuth; Bernhard Glaser

Purpose: The growing number of technical systems in the operating room has increased attention on developing touchless interaction methods for sterile conditions. However, touchless interaction paradigms lack the tactile feedback found in common input devices such as mice and keyboards. We propose a novel touchless eye-tracking interaction system with auditory display as a feedback method for completing typical operating room tasks. The auditory display provides feedback on the input selected in the eye-tracking system as well as confirmation of the system response.

Methods: An eye-tracking system with a novel auditory display using both earcons and parameter-mapping sonification was developed to allow touchless interaction for six typical scrub-nurse tasks. An evaluation with novice participants compared auditory display with visual display with respect to reaction time and a series of subjective measures.

Results: When auditory display substitutes for the lost tactile feedback during eye-tracking interaction, participants exhibit reduced reaction times compared to a visual-only display. In addition, the auditory feedback led to lower subjective workload and higher usefulness and system-acceptance ratings.

Conclusion: Given the absence of tactile feedback in eye tracking and other touchless interaction methods, auditory display is shown to be a useful and necessary addition to new interaction concepts for the sterile operating room, reducing reaction times while improving subjective measures, including usefulness, user satisfaction, and cognitive workload.


Computer Assisted Radiology and Surgery | 2018

Auditory display for fluorescence-guided open brain tumor surgery

David Black; Horst K. Hahn; Ron Kikinis; Karin Wårdell; Neda Haj-Hosseini

Purpose: Protoporphyrin IX (PpIX) fluorescence allows discrimination of tumor and normal brain tissue during neurosurgery. A handheld fluorescence (HHF) probe can be used for spectroscopic measurement of 5-ALA-induced PpIX, enabling objective detection compared to visual evaluation of fluorescence. However, current technology requires that the surgeon either view the measured values on a screen or employ an assistant to verbally relay them. An auditory feedback system was developed and evaluated for communicating measured fluorescence intensity values directly to the surgeon.

Methods: The auditory display was programmed to map the values measured by the HHF probe to the playback of tones representing three fluorescence intensity ranges and one error signal. Ten participants with no previous knowledge of the application took part in a laboratory evaluation. After a brief training period, participants performed measurements on a tray of 96 wells of liquid fluorescence phantom and verbally stated the perceived measurement value for each well. The latency and accuracy of the verbal responses were recorded. Long-term memorization of the sound mapping was evaluated in a second set of 10 participants 2–3 and 7–12 days after training.

Results: Participants identified the played tone accurately for 98% of measurements after training. The median response time to verbally identify the played tones was 2 pulses. No correlation was found between the latency and accuracy of the responses, and no significant correlation with the participants' musical proficiency was observed. Responses in the memory test were 100% accurate.

Conclusion: The auditory display was shown to be intuitive, easy to learn and remember, fast to recognize, and accurate in providing users with measurements of fluorescence intensity or an error signal. These results establish a basis for implementing and further evaluating auditory displays in clinical scenarios involving fluorescence guidance and other areas for which a categorized auditory display could be useful.
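The categorized display described above, three intensity ranges plus an error signal, amounts to a simple threshold classifier feeding a tone player. The sketch below is a hedged illustration only; the thresholds and category names are placeholders, not the clinical values used in the study.

```python
def fluorescence_tone(intensity: float,
                      low: float = 10.0, high: float = 100.0) -> str:
    """Map a measured PpIX fluorescence intensity (arbitrary units)
    to one of three tone categories or an error signal.

    Thresholds `low` and `high` are illustrative placeholders; a real
    system would calibrate them against the probe and phantom.
    """
    if intensity < 0:          # invalid reading, e.g. probe error
        return "error"
    if intensity < low:
        return "low"           # weak or no fluorescence
    if intensity < high:
        return "medium"
    return "high"              # strong fluorescence
```

Each returned category would trigger a distinct, easily distinguishable tone, so the surgeon never has to look away from the operative field.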


International Journal of Medical Robotics and Computer Assisted Surgery | 2018

Design and evaluation of an eye tracking support system for the scrub nurse

Michael Unger; David Black; Nele M. Fischer; Thomas Neumuth; Bernhard Glaser

The availability of an increasing number of medical devices in the digital operating room has increased the interaction demands on the surgical staff. To counteract the risk of bacterial contamination induced by device interactions, touchless interaction techniques are required.


Mensch & Computer | 2017

Auditory Display for Improving Free-hand Gesture Interaction

David Black; Bastian Ganze; Julian Hettig; Christian Hansen

Free-hand gesture recognition technologies allow touchless interaction with a range of applications. However, touchless interaction concepts usually provide only primary, visual feedback on a screen. The lack of secondary tactile feedback, such as that of pressing a key or clicking a mouse, is one reason that free-hand gesture techniques have not been adopted as a standard means of input. This work explores the use of auditory display to improve free-hand gesture interaction. Gestures using a Leap Motion controller were augmented with auditory icons and continuous, model-based sonification. Three concepts were generated and evaluated using a sphere-selection task and a video frame selection task. The user experience of the participants was evaluated using NASA TLX and QUESI questionnaires. Results show that the combination of auditory and visual display outperforms both purely auditory and purely visual displays in terms of subjective workload and performance measures.


Journal of the Acoustical Society of America | 2017

Psychoacoustic sonification for tracked medical instrument guidance

Tim Ziemer; David Black

In image-guided surgery, displays show a tracked instrument relative to a patient's anatomy, helping the surgeon follow a predefined path with a scalpel or avoid risk structures. A psychoacoustically motivated sonification design is presented to assist surgeons in navigating a tracked instrument to a target location in two-dimensional space. This is achieved by mapping spatial dimensions to audio parameters that affect the magnitude of different perceptual sound qualities. Horizontal distance and direction are mapped to the glissando speed and direction of a Shepard tone. The vertical dimension is divided into two regions: below the target, the vertical distance controls the LFO speed of an amplitude modulation to create a regular beating well below the threshold of roughness sensation; above the target elevation, the vertical deflection controls the depth of frequency modulation to gradually increase the number and amplitudes of sidebands, affecting perceived noisiness and roughness. This redun...
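The mapping described in this abstract can be summarized as a function from a 2-D offset to synthesis parameters. The sketch below follows the stated structure (Shepard-tone glissando for the horizontal axis, AM beating below the target, FM depth above it), but every numeric range and parameter name is an assumption, since the abstract gives none.

```python
def sonification_params(dx: float, dy: float) -> dict:
    """Illustrative parameter mapping in the spirit of the described
    psychoacoustic sonification (ranges assumed, not from the paper).

    - Horizontal error dx sets the glissando speed and direction of a
      Shepard tone: sign gives direction, magnitude gives speed.
    - Below the target (dy < 0): |dy| drives the LFO rate of an
      amplitude modulation (slow beating, below roughness).
    - Above the target (dy > 0): dy drives frequency-modulation depth,
      adding sidebands (perceived noisiness/roughness).
    """
    params = {
        "gliss_speed": min(abs(dx), 1.0),  # normalized 0..1
        "gliss_direction": "up" if dx > 0 else ("down" if dx < 0 else "none"),
        "am_lfo_hz": 0.0,
        "fm_depth": 0.0,
    }
    if dy < 0:
        params["am_lfo_hz"] = min(abs(dy), 1.0) * 4.0  # up to 4 Hz beating
    elif dy > 0:
        params["fm_depth"] = min(dy, 1.0)              # 0..1 mod index
    return params
```

Keeping the two vertical cues (AM vs. FM) mutually exclusive is what lets a listener hear at a glance whether the instrument is below or above the target elevation.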


172nd Meeting of the Acoustical Society of America | 2017

Psychoacoustic sonification design for navigation in surgical interventions

Tim Ziemer; David Black; Holger Schultheis

A psychoacoustically motivated sonification design to guide clinicians in two-dimensional space is presented, e.g., to navigate a scalpel toward a target resection trajectory, an ablation needle toward a pre-planned insertion point, or a drill toward a target burr hole for craniotomy. Navigation is achieved by mapping spatial dimensions to audio synthesis parameters that affect the magnitude of different perceptual sound qualities; orthogonal spatial dimensions are mapped to orthogonal auditory qualities. In a preliminary laboratory study, non-expert users successfully identified the target field out of 16 possible fields in 41% of all trials, and the correct cardinal direction in 84% of the trials. Based on these findings and further psychoacoustic considerations, the mapping range is optimized and the implementation of an additional depth dimension is discussed.

Collaboration


Dive into David Black's collaborations.

Top Co-Authors

Horst K. Hahn (Jacobs University Bremen)
Jörn Loviscach (Bremen University of Applied Sciences)
Christian Hansen (Otto-von-Guericke University Magdeburg)
Ron Kikinis (Brigham and Women's Hospital)
Christian Rieder (University of Koblenz and Landau)
Julian Hettig (Otto-von-Guericke University Magdeburg)
Michael Hlatky (Bremen University of Applied Sciences)