Publication


Featured research published by Don Stredney.


Genome Biology | 2001

A draft annotation and overview of the human genome

Fred A. Wright; William J. Lemon; Wei D. Zhao; Russell Sears; Degen Zhuo; Jian Ping Wang; Hee-Yung Yang; Troy Baer; Don Stredney; Joe Spitzner; Al Stutz; Ralf Krahe; Bo Yuan

Background: The recent draft assembly of the human genome provides a unified basis for describing genomic structure and function. The draft is sufficiently accurate to provide useful annotation, enabling direct observations of previously inferred biological phenomena. Results: We report here a functionally annotated human gene index placed directly on the genome. The index is based on the integration of public transcript, protein, and mapping information, supplemented with computational prediction. We describe numerous global features of the genome and examine the relationship of various genetic maps with the assembly. In addition, initial sequence analysis reveals highly ordered chromosomal landscapes associated with paralogous gene clusters and distinct functional compartments. Finally, these annotation data were synthesized to produce observations of gene density and number that accord well with historical estimates. Such a global approach had previously been described only for chromosomes 21 and 22, which together account for 2.2% of the genome. Conclusions: We estimate that the genome contains 65,000-75,000 transcriptional units, with exon sequences comprising 4%. The creation of a comprehensive gene index requires the synthesis of all available computational and experimental evidence.
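
As a quick sanity check on the figures quoted above, the sketch below works through the implied exon content. The assembly length is not stated in the abstract, so a value of roughly 3.1 Gb is assumed here.

```python
# Back-of-envelope check of the abstract's figures.
# Assumption (not stated above): draft assembly length of ~3.1 Gb.
GENOME_BP = 3.1e9                        # assumed assembly size, base pairs
EXON_FRACTION = 0.04                     # "exon sequences comprising 4%"
UNITS_LOW, UNITS_HIGH = 65_000, 75_000   # estimated transcriptional units

exon_bp = GENOME_BP * EXON_FRACTION
print(f"Total exon sequence: ~{exon_bp / 1e6:.0f} Mb")
for units in (UNITS_LOW, UNITS_HIGH):
    print(f"{units:,} units -> ~{exon_bp / units / 1e3:.1f} kb of exon per unit")
```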


Otolaryngology-Head and Neck Surgery | 2002

Virtual temporal bone dissection: An interactive surgical simulator

Gregory J. Wiet; Don Stredney; Dennis Sessanna; Bryan J; D. Bradley Welling; Petra Schmalbrock

OBJECTIVE: Our goal was to integrate current and emerging technology in virtual systems to provide a temporal bone dissection simulator that allows the user interactivity and realism similar to the cadaver laboratory. STUDY DESIGN: Iterative design and validation of a virtual environment for simulating temporal bone dissection. SETTING: University otolaryngology training program with interdisciplinary interaction in a high-performance computer facility. RESULTS: The system provides visual, force feedback (haptic), and aural interfaces. Unlike previous “fly through” virtual systems, this environment provides a richer emulation of surgical experience. CONCLUSION: The system provides a high level of functional utility and, through initial evaluations, demonstrates promise in adding to traditional training methods. SIGNIFICANCE: The system provides an environment to learn temporal bone surgery in a way similar to the experience with cadaver material where the subject is able to interact with the data without constraints (nondeterministic). Eventually, it may provide the “front end” to a large repository of various temporal bone pathologies that can be accessed through the Internet.


Laryngoscope | 1998

Functional endoscopic sinus surgery training simulator

David T. Rudman; Don Stredney; Sessanna D; Roni Yagel; Roger Crawfis; David Heskamp; Charles V. Edmond; Gregory J. Wiet

Objective/Hypothesis: To determine the efficacy of a haptic (force feedback) device and to compare isosurface and volumetric models of a functional endoscopic sinus surgery (FESS) training simulator. Study Design: A pilot study involving faculty and residents from the Department of Otolaryngology at The Ohio State University. Methods: Objective trials evaluated the haptic device's ability to perceive three-dimensional shapes (stereognosis) without the aid of image visualization. Ethmoidectomy tasks were performed with both isosurface and volumetric FESS simulators, and surveys compared the two models. Results: The haptic device was 77% effective for stereognosis tasks. There was a preference toward the isosurface model over the volumetric model in terms of visual representation, comfort, haptic-visual fidelity, and overall performance. Conclusions: The FESS simulator uses both visual and haptic feedback to create a virtual reality environment to teach paranasal sinus anatomy and basic endoscopic sinus surgery techniques to ear, nose, and throat residents. The results of the current study showed that the haptic device was accurate in and of itself, within its current physical limitations, and that the isosurface-based simulator was preferred. Laryngoscope, 108:1643-1647, 1998


IEEE Computer Graphics and Applications | 2005

Illustration motifs for effective medical volume illustration

Nikolai A. Svakhine; David S. Ebert; Don Stredney

The enormous amount of 3D data generated by modern scientific experiments, simulations, and scanners exacerbates the tasks of effectively exploring, analyzing, and communicating the essential information from these data sets. The expanding field of biomedicine creates data sets that challenge current techniques to effectively communicate information for use in diagnosis, staging, simulation, and training. In contrast, medical illustration succinctly represents essential anatomical structures in a clear way and is used extensively in the medical field for communicative and illustrative purposes. Thus, the idea of rendering real medical data sets using traditional medical illustrative styles inspired work in volume illustration. The main goal of the volume illustration approach is to enhance the expressiveness of volume rendering by highlighting important features within a volume while subjugating insignificant details, and rendering the result in a way that resembles an illustration. Recent approaches have been extended to interactive volume illustration by using PC graphics hardware volume rendering to accelerate the enhanced rendering, resulting in nearly interactive rates.
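
To make the "highlight important features, subjugate insignificant details" idea concrete, here is a minimal sketch of one widely used volume-illustration enhancement, gradient-based boundary emphasis. The function and parameter names are illustrative and are not taken from the paper.

```python
import numpy as np

def boundary_enhanced_opacity(volume, base_opacity, k_base=1.0, k_grad=1.0):
    """Scale per-voxel opacity by local gradient magnitude so that material
    boundaries (important features) stand out while homogeneous interiors
    recede. A sketch of one common volume-illustration motif, not the
    paper's exact formulation."""
    gx, gy, gz = np.gradient(volume.astype(np.float32))
    grad_mag = np.sqrt(gx**2 + gy**2 + gz**2)
    grad_mag /= grad_mag.max() + 1e-8                  # normalize to [0, 1]
    return np.clip(base_opacity * (k_base + k_grad * grad_mag), 0.0, 1.0)
```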


Otolaryngology-Head and Neck Surgery | 2005

Use of ultra-high-resolution data for temporal bone dissection simulation

Gregory J. Wiet; Petra Schmalbrock; Kimerly A. Powell; Don Stredney

OBJECTIVES: For the past 5 years, our group has been developing a virtual temporal bone dissection environment for training otologic surgeons. Throughout the course of our development, a recurring challenge is the acquisition of high-resolution, multimodal, and multi-scale data sets that are used for the visual as well as haptic (sense of touch) display. This study presents several new techniques in temporal bone imaging and their use as data for surgical simulation. METHODS: At our institution (OSU), we are fortunate to have a high-field (8 Tesla) magnetic resonance imaging (MRI) research magnet that provides an order of magnitude higher resolution compared to clinical 1.5T MRI scanners. Magnetic resonance imaging has traditionally been superb at delineating soft tissue structure, and certainly, the 8T unit does indeed do this at a resolution of 100-200 μm³. To delineate the bony structure of the mastoid and middle ear, computed tomography (CT) has traditionally been used because of the high signal-to-noise ratio delineating bone signal from air and soft tissue. We have partnered with researchers at other institutions (CCF) to make use of a “microCT” that provides a resolution of 214 × 214 × 390 micrometers of bony structure. RESULTS: This report provides a description of the 2 methodologies and presentation of the striking image data capable of being generated. See images presented. CONCLUSIONS: Using these 2 new and innovative imaging modalities, we provide an order of magnitude greater resolution to the visual and haptic display in our temporal bone dissection simulation environment.


Otolaryngologic Clinics of North America | 2011

Training and Simulation in Otolaryngology

Gregory J. Wiet; Don Stredney; Dinah Wan

This article focuses on key issues surrounding the needs and application of simulation technologies for technical skills training in otolaryngology. The discussion includes an overview of key topics in training and learning, the application of these issues in simulation environments, and the subsequent applications of these simulation environments to otolaryngology. Examples of past applications are presented, with discussion of how the interplay of cultural changes in surgical training, along with rapid advancements in technology, has shaped and influenced their adoption and adaptation. The authors conclude with emerging trends and the potential influence that advanced simulation and training will have on technical skills training in otolaryngology.


Computers & Graphics | 1996

Building a virtual environment for endoscopic sinus surgery simulation

Roni Yagel; Don Stredney; Gregory J. Wiet; Petra Schmalbrock; Louis B. Rosenberg; Sessanna D; Yair Kurzion

Advanced display technologies have made the virtual exploration of relatively complex models feasible in many applications. Unfortunately, only a few human interfaces allow natural interaction with the environment. Moreover, in surgical applications, such realistic interaction requires real-time rendering of volumetric data, placing an overwhelming performance burden on the system. We report on our advances towards developing a virtual reality system that provides intuitive interaction with complex volume data by employing real-time realistic volume rendering and convincing force feedback (haptic) sensations. We describe our methods for real-time volume rendering, model deformation, interaction, and the haptic devices, and demonstrate the utilization of this system in the real-world application of Endoscopic Sinus Surgery (ESS) simulation.
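
As a rough illustration of how force feedback can be driven directly by volume data (a simplified stand-in, not the system described above), the sketch below returns a penalty-style force that pushes a tool tip back along the local density gradient once it enters tissue. All names and parameter values are hypothetical.

```python
import numpy as np

def haptic_force_from_volume(volume, spacing, tip_pos,
                             stiffness=0.5, threshold=0.3):
    """Penalty-style haptic force derived from a scalar density volume.
    `spacing` is the voxel size; `tip_pos` is the tool tip in world units.
    Nearest-voxel lookup and a whole-volume gradient keep the sketch short;
    a real-time system would precompute and interpolate."""
    idx = tuple(np.clip(np.round(tip_pos / spacing).astype(int),
                        0, np.array(volume.shape) - 1))
    density = float(volume[idx])
    if density < threshold:
        return np.zeros(3)                    # tool tip is in free space
    gx, gy, gz = np.gradient(volume.astype(np.float32))
    g = np.array([gx[idx], gy[idx], gz[idx]])
    normal = -g / (np.linalg.norm(g) + 1e-8)  # outward normal: down the gradient
    return stiffness * (density - threshold) * normal
```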


Laryngoscope | 2012

Virtual temporal bone dissection system: OSU virtual temporal bone system: development and testing.

Gregory J. Wiet; Don Stredney; Thomas Kerwin; Bradley Hittle; Soledad Fernandez; Mahmoud Abdel-Rasoul; Welling Db

The objective of this project was to develop a virtual temporal bone dissection system that would provide an enhanced educational experience for the training of otologic surgeons.


IEEE Transactions on Visualization and Computer Graphics | 2009

Enhancing Realism of Wet Surfaces in Temporal Bone Surgical Simulation

Thomas Kerwin; Han-Wei Shen; Don Stredney

We present techniques to improve visual realism in an interactive surgical simulation application: a mastoidectomy simulator that offers a training environment for medical residents as a complement to using a cadaver. As well as displaying the mastoid bone through volume rendering, the simulation allows users to experience haptic feedback and appropriate sound cues while controlling a virtual bone drill and suction/irrigation device. The techniques employed to improve realism consist of a fluid simulator and a shading model. The former allows for deformable boundaries based on volumetric bone data, while the latter gives a wet look to the rendered bone to emulate more closely the appearance of the bone in a surgical environment. The fluid rendering includes bleeding effects, meniscus rendering, and refraction. We incorporate a planar computational fluid dynamics simulation into our three-dimensional rendering to effect realistic blood diffusion. Maintaining real-time performance while drilling away bone in the simulation is critical for engagement with the system.
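
The blood-diffusion component can be approximated, in a greatly simplified form, by an explicit finite-difference diffusion step on a 2D concentration grid. The paper's planar computational fluid dynamics simulation is richer than this, and the parameter values below are illustrative only.

```python
import numpy as np

def diffuse_blood(concentration, diffusivity=0.2, dt=0.1, steps=10):
    """Explicit forward-Euler diffusion of a 2D blood-concentration field
    (periodic boundaries via np.roll). dt * diffusivity must stay below
    0.25 for stability on a unit grid."""
    c = concentration.astype(np.float32).copy()
    for _ in range(steps):
        lap = (np.roll(c, 1, 0) + np.roll(c, -1, 0) +
               np.roll(c, 1, 1) + np.roll(c, -1, 1) - 4.0 * c)
        c += dt * diffusivity * lap          # diffusion term only
    return c
```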


Medicine Meets Virtual Reality | 2002

Elastically Deformable 3D Organs for Haptic Surgical Simulators

R. W. Webster; R. Haluck; B. Mohler; R. Ravenscroft; E. Crouthamel; T. Frack; S. Terlecki; J. Shaeffer; Westwood; H. Miller Hoffman; Richard A. Robb; Don Stredney

This paper describes a technique for incorporating real-time elastically deformable 3D organs in haptic surgical simulators. Our system is a physically based particle model utilizing mass-spring-damper connectivity with an implicit predictor to speed up calculations during each time step. The solution involves repeated application of Newton's second law of motion, F = ma, using an implicit solver to numerically integrate the differential equations.
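
For concreteness, the sketch below integrates a single damped spring with semi-implicit (symplectic) Euler. This is simpler than the implicit predictor described above, and the parameter values are illustrative.

```python
import numpy as np

def step_mass_spring_damper(x, v, x_rest, mass=1.0, k=50.0, c=2.0, dt=0.01):
    """One semi-implicit Euler step for a particle on a damped spring:
    compute the force, apply Newton's second law (a = F / m), update the
    velocity, then advance the position with the new velocity."""
    f = -k * (x - x_rest) - c * v     # spring force plus viscous damping
    a = f / mass                      # Newton's second law
    v = v + dt * a
    x = x + dt * v
    return x, v

# Example: release a particle 0.1 units from rest and let it settle.
x, v = np.array([0.1, 0.0, 0.0]), np.zeros(3)
for _ in range(500):
    x, v = step_mass_spring_damper(x, v, x_rest=np.zeros(3))
```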

Collaboration


Dive into Don Stredney's collaboration.

Top Co-Authors

Thomas Kerwin, Ohio Supercomputer Center
Bradley Hittle, Ohio Supercomputer Center
Sessanna D, Ohio Supercomputer Center
Bryan J, Ohio State University