Dean A. Scribner
United States Naval Research Laboratory
Publications
Featured research published by Dean A. Scribner.
Survey of Ophthalmology | 2002
Eyal Margalit; Mauricio Maia; James D. Weiland; Robert J. Greenberg; G.Y. Fujii; Gustavo Torres; Duke V. Piyathaisere; T.M. O'Hearn; Wentai Liu; Gianluca Lazzi; Gislin Dagnelie; Dean A. Scribner; Eugene de Juan; Mark S. Humayun
Most current concepts for a visual prosthesis are based on neuronal electrical stimulation at different locations along the visual pathways within the central nervous system. The different designs of visual prostheses are named according to their locations (i.e., cortical, optic nerve, subretinal, and epiretinal). Visual loss caused by outer retinal degeneration in diseases such as retinitis pigmentosa or age-related macular degeneration can be reversed by electrical stimulation of the retina or the optic nerve (retinal or optic nerve prostheses, respectively). On the other hand, visual loss caused by inner or full-thickness retinal diseases, eye loss, optic nerve diseases (tumors, ischemia, inflammatory processes, etc.), or diseases of the central nervous system (not including diseases of the primary and secondary visual cortices) can be reversed by a cortical visual prosthesis. The intent of this article is to provide an overview of current and future concepts of retinal and optic nerve prostheses. The article begins with general considerations that apply to all or most visual prostheses and then concentrates on the retinal and optic nerve designs. The authors believe that the field has grown beyond the scope of a single article, so cortical prostheses are described only insofar as they have directly influenced the concept and technical development of the other prostheses, and then from a more general and historical perspective.
Proceedings of the IEEE | 1991
Dean A. Scribner; Melvin R. Kruer; J. M. Killiany
Requirements for infrared focal plane arrays (IRFPAs) for advanced infrared imaging systems are discussed, and an overview is given of different IRFPA architectures. Important IR detector structures, including photoconductive, photovoltaic, metal-insulator-semiconductor (MIS), and Schottky barrier, are reviewed. Infrared detector materials and related crystal-growth techniques are discussed, emphasizing applicability to IRFPA designs and performance. Three types of input circuits used to couple the detector to the readout circuitry are discussed, namely, direct injection, buffered direct injection, and gate modulation. An overview is given of several readout techniques, including the CCD, MOSFET switch, CID, and CIM. Also discussed are related on-chip signal processing topics as well as questions regarding producibility and array implementation.
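As a rough illustration of the direct-injection input circuit mentioned above, the short Python sketch below evaluates the standard first-order injection-efficiency expression η = g_m·R_det / (1 + g_m·R_det); the numerical values are hypothetical and are not taken from the paper.

```python
# Illustrative sketch (assumed values): fraction of detector photocurrent
# coupled into the readout by a direct-injection input circuit.
g_m = 2e-6    # input MOSFET transconductance in siemens (assumed)
R_det = 5e7   # detector dynamic resistance in ohms (assumed)

eta = g_m * R_det / (1.0 + g_m * R_det)
print(f"direct-injection efficiency ~ {eta:.3f}")  # ~0.990 for these values
```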
International Symposium on Neural Networks | 1993
Dean A. Scribner; Kenneth A. Sarkady; Melvin R. Kruer; John T. Caulfield; J. D. Hunt; M. Colbert; M. Descour
Retina-like processing techniques to perform nonuniformity correction of infrared focal plane arrays (IRFPAs) are discussed. The objective is to design and test electronic structures based on early vision processes in the vertebrate retina. Two adaptive techniques are successfully developed. Both are stable and convergent and are used to process existing infrared images in non-real time on the Naval Research Laboratory (NRL) Connection Machine. They are also implemented on a real-time image array processor interfaced to an IRFPA in the laboratory. Conceptual designs for an analog chip are prepared and partially simulated.
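A minimal NumPy sketch of this style of retina-like adaptive correction is given below, assuming a per-pixel gain/offset model updated by an LMS rule so that each corrected pixel tracks the local spatial average of its neighbors (a stand-in for the lateral interconnects of early retinal processing). The function name, learning rate, and 3x3 neighborhood are illustrative assumptions, not details from the paper.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def retina_like_nuc(frames, mu=1e-6):
    """Adaptive NUC sketch: per-pixel gain/offset adapted by an LMS rule so
    each corrected pixel tracks the average of its 3x3 neighborhood."""
    gain = np.ones_like(frames[0], dtype=np.float64)
    offset = np.zeros_like(frames[0], dtype=np.float64)
    corrected = []
    for x in frames:
        y = gain * x + offset            # corrected frame
        d = uniform_filter(y, size=3)    # local spatial average ("lateral interconnects")
        e = d - y                        # error driving the adaptation
        gain += mu * e * x               # LMS gain update
        offset += mu * e                 # LMS offset update
        corrected.append(y)
    return np.stack(corrected)
```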
Applications of Artificial Neural Networks | 1990
Dean A. Scribner; Kenneth A. Sarkady; John T. Caulfield; Melvin R. Kruer; G. Katz; C. J. Gridley; Charles Herman
Scene-based nonuniformity correction (NUC) was performed via two techniques, and the results are presented in terms of their ability to increase sensitivity and minimize scene degradation. The two techniques perform recalibration in real time based on the radiance levels of the scene being viewed. Scene-based NUC was performed with an algorithm based on a temporal high-pass filter and with one based on an artificial neural network. Recorded data from an MWIR staring array viewing different scenes were used for the experiments. The advantages of the two scene-based NUC techniques are summarized and compared to those of the traditional calibration technique. The potential for eliminating spatial noise, and thereby achieving BLIP (background-limited) performance in high-quantum-efficiency FPAs, by employing real-time signal processors is emphasized.
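Below is a minimal sketch of the temporal high-pass approach, assuming an offset-only correction in which a per-pixel recursive temporal mean (the low-pass estimate of the fixed pattern plus the mean scene level) is subtracted from each frame; the smoothing constant and function name are illustrative, and the neural-network variant is sketched under the 1993 entry above.

```python
import numpy as np

def temporal_highpass_nuc(frames, alpha=0.05):
    """Scene-based, offset-only NUC sketch: subtract a per-pixel recursive
    temporal mean from each frame, keeping only the high-pass (scene) part.
    Relies on scene motion to decorrelate the scene from the fixed pattern."""
    mean = np.asarray(frames[0], dtype=np.float64)
    out = []
    for x in frames:
        x = np.asarray(x, dtype=np.float64)
        mean = (1 - alpha) * mean + alpha * x   # recursive low-pass estimate
        out.append(x - mean)                    # high-pass output
    return np.stack(out)
```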
Proceedings of SPIE | 1991
Dean A. Scribner; Kenneth A. Sarkady; Melvin R. Kruer; John T. Caulfield; J. D. Hunt; Charles Herman
With rapid advancements in infrared focal plane array (IRFPA) technology, greater demands are being placed on nonuniformity correction (NUC) techniques to provide near-BLIP performance over a wide dynamic range. Standard NUC techniques involve calibrating each detector against reference temperature sources before imaging with the IRFPA. Usually the correction must be recalibrated after a short period of time due to IRFPA drift or to adjust for changes in the level of background flux. Adaptive NUC techniques eliminate the need for calibration by continuously updating the correction coefficients based on the radiance levels of the scene being viewed. In this manner, compensation can be applied continuously and adaptively for individual detector non-idealities and background changes. Two adaptive NUC techniques are discussed; one is a temporal high-pass filter and the other involves a neural network with lateral interconnects to nearest-neighbor pixels. Both have similarities to biological retinal processing. Questions of implementation and stability are discussed, and performance results are given for several test image sequences obtained from an MWIR HgCdTe array and a HIDAD uncooled array. We conclude that adaptive techniques will be very useful in future IRFPA sensors, primarily because of their ability to adapt over a wide range of background flux without calibration sources, but also because they can offer improved sensitivity under most operating conditions.
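For contrast with the adaptive approaches sketched above, here is a minimal sketch of the standard two-point, reference-source calibration that the adaptive methods are meant to replace; the use of the array-mean response as the correction target and the variable names are illustrative assumptions.

```python
import numpy as np

def two_point_nuc(flat_cold, flat_hot):
    """Reference-based NUC sketch: derive per-pixel gain/offset from two
    uniform (blackbody) exposures so every pixel maps onto the array-mean
    response. Must be redone whenever the array drifts."""
    t_cold, t_hot = flat_cold.mean(), flat_hot.mean()   # calibration targets
    gain = (t_hot - t_cold) / (flat_hot - flat_cold)    # per-pixel gain
    offset = t_cold - gain * flat_cold                  # per-pixel offset
    return lambda frame: gain * frame + offset          # correction function
```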
IEEE Transactions on Geoscience and Remote Sensing | 2003
Rulon Mayer; Frank Bucholtz; Dean A. Scribner
Changes in atmosphere, ground conditions, scene temperature, solar illumination, and sensor response can significantly affect the detected multispectral and hyperspectral data. Using uncorrected target spectral signatures in spectral matched-filter searches therefore results in target detection with high false-alarm rates when the imagery changes between collects. This letter introduces the use of the whitening/dewhitening (WD) transform to help correct target spectral signatures under varying conditions. An important feature of this transform is that it does not require subpixel registration between images collected at two distinct times. The transform was tested on images taken from two very different data collects using different sensors, targets, and backgrounds. In one dataset, the transform was applied to hyperspectral images from an airborne longwave infrared sensor, binned to 30 bands; the other data collect used images of a variety of tanks, trucks, and calibration panels acquired with bore-sighted broadband visible, shortwave infrared, midwave infrared, and longwave infrared staring array sensors. Target spectral signatures were transformed using imagery of spatially overlapping regions from datasets collected at different times, processed with the whitening transform and then the dewhitening transform (the inverse of a whitening transform). Use of the WD transform yielded a large target-to-clutter ratio (TCR), which was compared to the TCR derived from other transforms that approximate the cross-covariance matrix. In addition, the WD-transformed signatures applied in a matched-filter search found targets (some concealed behind vegetative foliage or underneath camouflage) with low false-alarm rates, as shown in a receiver operating characteristic curve. This letter demonstrates that the WD transform enhances searches for concealed targets in multisensor and hyperspectral data.
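A minimal NumPy sketch of the whitening/dewhitening idea applied to a target signature, followed by a conventional spectral matched filter, is given below. The symmetric matrix square roots, the use of spatially overlapping pixel sets to estimate the statistics, and all function names are illustrative assumptions; the exact normalization used in the letter may differ.

```python
import numpy as np

def _sqrtm_sym(C, inverse=False, eps=1e-10):
    """Symmetric (inverse) square root of a covariance matrix."""
    vals, vecs = np.linalg.eigh(C)
    vals = np.clip(vals, eps, None)
    d = vals ** (-0.5 if inverse else 0.5)
    return (vecs * d) @ vecs.T

def wd_transform(signature, pixels_ref, pixels_new):
    """Whiten a signature with the statistics of the reference collect, then
    dewhiten with the statistics of the new collect, so it can be used in a
    matched-filter search of the new image. pixels_* are (num_pixels, num_bands)
    samples from spatially overlapping regions of the two collects."""
    m_ref, m_new = pixels_ref.mean(0), pixels_new.mean(0)
    C_ref = np.cov(pixels_ref, rowvar=False)
    C_new = np.cov(pixels_new, rowvar=False)
    T = _sqrtm_sym(C_new) @ _sqrtm_sym(C_ref, inverse=True)  # dewhiten after whiten
    return T @ (signature - m_ref) + m_new

def matched_filter_scores(cube, signature):
    """Spectral matched filter over a (num_pixels, num_bands) image cube."""
    m = cube.mean(0)
    C_inv = np.linalg.inv(np.cov(cube, rowvar=False))
    w = C_inv @ (signature - m)
    return (cube - m) @ w   # high scores flag candidate target pixels
```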
Machine Vision Applications | 1999
Dean A. Scribner; Penny R. Warren; Jonathan M. Schuler
There is rapid growth in sensor technology in regions of the electromagnetic spectrum beyond the visible. In parallel, the growth of powerful, inexpensive computers and digital electronics has made many new imaging applications possible. Although there are many approaches to sensor fusion, this paper discusses relevant infrared phenomenology and attempts to apply known methods of human color vision to achieve image fusion. Two specific topics of importance are color contrast enhancement and color constancy.
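A minimal sketch of one way such a color-vision-inspired mapping can be realized, assuming two co-registered infrared bands driven into an achromatic channel and a red/green opponent channel; the channel assignments, normalization, and function name are illustrative choices rather than the mapping used in the paper.

```python
import numpy as np

def false_color_fusion(band_a, band_b):
    """Two-band false-color fusion sketch: shared luminance plus a red/green
    opponent axis driven by the band difference (illustrative mapping)."""
    a = (band_a - band_a.min()) / (np.ptp(band_a) + 1e-12)  # per-band normalization,
    b = (band_b - band_b.min()) / (np.ptp(band_b) + 1e-12)  # a crude nod to color constancy
    lum = 0.5 * (a + b)                                     # achromatic channel
    opp = a - b                                             # opponent (band-contrast) channel
    return np.stack([np.clip(lum + 0.5 * opp, 0, 1),        # red: band A dominant
                     np.clip(lum - 0.5 * opp, 0, 1),        # green: band B dominant
                     np.clip(lum, 0, 1)], axis=-1)          # blue: shared luminance
```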
Optics & Photonics News | 1998
Dean A. Scribner; Penny R. Warren; Jon Schuler; Michael Satysher; Melvin R. Kruer
Sensor fusion is the science of combining information from multiple sensor responses. One approach, visualization, fuses various images and adds false color to define features that might otherwise be hidden and to aid the viewer in deciphering a scene.
IEEE Transactions on Biomedical Circuits and Systems | 2007
Dean A. Scribner; L. Johnson; P. Skeath; R. Klein; D. Ilg; L. Wasserman; N. Fernandez; W. Freeman; J. Peele; F.K. Perkins; E.J. Friebele; W.E. Bassett; J.G. Howard; W. Krebs
A very large format neural stimulator device, to be used in future retinal prosthesis experiments, has been designed, fabricated, and tested. The device was designed to be positioned against a human retina for short periods in an operating room environment. Demonstrating a very large format, parallel interface between a 2-D microelectronic stimulator array and neural tissue would be an important step in proving the feasibility of a high-resolution retinal prosthesis for the blind. The architecture of the test device combines several novel components, including microwire glass, a microelectronic multiplexer, and a microcable connector. The array format is 80 × 40 pixels with approximately 20 microwire electrodes per pixel. The custom assembly techniques involve indium bump bonding, ribbon bonding, and encapsulation. The design, fabrication, and testing of the device has resolved several important issues regarding the feasibility of a high-resolution retinal prosthesis, namely, that the combination of conventional CMOS electronics and microwire glass provides a viable approach for a high-resolution retinal prosthesis device. Temperature change from power dissipation within the device and maximum electrical output current levels suggest that the device is acceptable for acute human tests.
Advanced Materials | 2002
Guido Zuccarello; Dean A. Scribner; Randall Sands; Leonard J. Buckley
Bio-inspired optical systems have the potential to transform the process of image collection and analysis in the future. Through their simplicity and dynamics, these collection systems may provide multiple functions with an excellent level of reliability. From the graded index of the fish eye to the hyperacuity of the compound eye, nature has provided function with amazing simplicity. This article reviews a select number of biological vision systems and the unique material behaviors that give rise to these characteristics. We introduce the materials chemistry and morphological complexity that enable the biological system behavior.