
Publication


Featured research published by Austin A. Richards.


26th International Congress on High-Speed Photography and Photonics | 2005

Applications for high-speed infrared imaging

Austin A. Richards

The phrase high-speed imaging is generally associated with short exposure times, fast frame rates, or both. Supersonic projectiles, for example, are often impossible to see with the unaided eye and require strobe photography to stop their apparent motion. It is often necessary to image high-speed objects in the infrared region of the spectrum, either to detect them or to measure their surface temperature. Conventional infrared cameras have time constants similar to the human eye, so they, too, are often at a loss when it comes to photographing fast-moving hot targets. Other types of targets or scenes, such as explosions, change very rapidly with time. Visualizing those changes requires an extremely high frame rate combined with short exposure times in order to slow down a dynamic event so that it can be studied and quantified. Recent advances in infrared sensor technology and computing power have pushed the envelope of what is possible to achieve with commercial IR camera systems.


European Symposium on Optics and Photonics for Defence and Security | 2004

Superframing: scene dynamic range extension of infrared cameras

Austin A. Richards; Brian K. Cromwell

Infrared cameras are often used to capture high-speed digital video of scenes with enormous ranges in in-band brightness. A simple example of this would be a man standing next to a hot fire. Under normal operating conditions, it can be next to impossible to fully span a scene like this with the brightness dynamic range of an infrared camera. The brightest or hottest parts of the image will often be saturated, while at the same time the darkest or coldest parts of the scene may be buried in the noise floor of the camera and appear black in the image. Varying the exposure by changing the integration time is necessary to maximize the useful information recorded by the camera, but sometimes a single integration time is not enough to fully encompass a scene's variations. The technique of superframing consists of varying the integration time of the camera from frame to frame in a cyclic manner, then combining the resulting subframes into single superframes with greatly extended dynamic ranges. The technique and some sample data are described in this paper.
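The subframe-combination step described in the abstract can be sketched numerically as follows. This is a minimal illustration, not the authors' implementation: the per-pixel selection rule (keep the longest unsaturated integration time), the saturation threshold, and all names are assumptions for the sake of the example.

```python
import numpy as np

def build_superframe(subframes, int_times, saturation=15000):
    """Combine subframes taken at different integration times into a
    single extended-dynamic-range superframe.

    subframes : list of 2-D arrays of raw counts, ordered longest
                integration time first
    int_times : matching list of integration times
    Returns counts per unit integration time (a common radiometric scale).
    """
    frames = np.stack([np.asarray(f, dtype=float) for f in subframes])
    times = np.asarray(int_times, dtype=float)[:, None, None]
    # Normalize each subframe by its integration time so all subframes
    # share one radiometric scale.
    scaled = frames / times
    # Per pixel, pick the longest unsaturated integration time: long
    # integrations give better SNR on dim pixels, short ones keep the
    # bright pixels out of saturation.
    unsaturated = frames < saturation
    choice = np.where(unsaturated.any(axis=0),
                      unsaturated.argmax(axis=0),
                      len(int_times) - 1)  # all saturated: use shortest
    return np.take_along_axis(scaled, choice[None, :, :], axis=0)[0]
```

With, say, a 10:1 ratio between two integration times, a pixel saturated in the long subframe is filled from the short one, while dim pixels keep the higher-SNR long-integration value.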


Defense and Security | 2005

Radiometric calibration of infrared cameras accounting for atmospheric path effects

Austin A. Richards; Greg Johnson

Radiometric infrared camera systems are most often used to characterize the IR signature of targets (often an aircraft or rocket) through significant air paths that reduce the received signal. Tactical targets can be imaged at standoff distances up to 1000 km or more, but there are many cases where the target is within 1 km range, as is the case with a close-in flyby at a test range. This paper compares experimental radiometric data to a theoretical model of the atmosphere. The radiometric data were collected in the 3-5 micron band using an indium antimonide staring-array camera and long focal length lens combined with radiometric analysis software. The system was calibrated to measure target radiances, but can also be used to estimate target temperatures in cases where the in-band target emissivity is well understood. The radiometric data are compared to a model built on MODTRAN code, with conclusions about the attenuation introduced by the atmosphere for standard medium-range imaging systems in “typical” observing conditions.
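The path correction underlying such a calibration can be sketched with a two-parameter model: the camera sees L_apparent = τ·L_target + L_path, where τ is the band-averaged atmospheric transmittance and L_path the in-band path radiance. This is a simplification for illustration only; MODTRAN computes both quantities spectrally, and the numbers below are arbitrary, not model output.

```python
def correct_for_path(l_apparent, tau, l_path):
    """Invert L_apparent = tau * L_target + L_path to recover the
    target's in-band radiance (arbitrary radiance units)."""
    return (l_apparent - l_path) / tau

# Example: 70% band-averaged transmittance, unit path radiance.
print(correct_for_path(l_apparent=8.0, tau=0.7, l_path=1.0))  # 10.0
```

The correction grows quickly as τ falls, which is why long standoff ranges demand an accurate atmospheric model rather than a single scalar guess.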


Defense and Security | 2005

Extending IR camera scene radiance dynamic range with cyclic integration times

Austin A. Richards; Brian K. Cromwell

Infrared cameras are often used to capture high-speed digital video of scenes with enormous ranges in in-band brightness. A simple example of this is a rocket launch, a scene which can consist of a cold rocket hardbody and an extremely hot exhaust plume. It can be next to impossible to fully span a scene like this with the brightness dynamic range of an infrared camera (typically ~12-14 bits) at a single exposure value. The brightest or hottest parts of the image will often be saturated, while at the same time the darkest or coldest parts of the scene may be buried in the noise floor of the camera and appear black in the image. Varying the exposure by adjusting the camera to an optimal shutter speed or integration time is necessary to maximize the useful information recorded by the camera. Sometimes, however, a single integration time is not enough to fully encompass a scene’s brightness (temperature) variations. The technique of superframing gets around this problem by exploiting the capabilities of high frame-rate IR cameras. The technique involves cycling a camera through a set of integration times on a frame-by-frame basis, then combining the resulting “subframes” into single “superframes” with greatly extended dynamic ranges. If the frame rate is sufficiently high, the scene will not change appreciably from one subframe to the next. The technique and some sample data are described in this paper.


Proceedings of SPIE | 2017

Advantages of strained-layer superlattice detectors for high-speed thermal events

Austin A. Richards

Type II Strained-Layer Superlattice detectors are currently being incorporated into hybridized infrared camera focal plane arrays for commercial applications. The detectors offer significant advantages over InSb and MCT detectors for certain application spaces, particularly high-speed imaging for industrial purposes and military test ranges. The advantage over MWIR InSb sensors is driven by blackbody physics, which results in much higher emitted photon radiance values for target temperatures around ambient, as well as increased temperature dynamic range as a result of the lower thermal derivative in the LWIR band relative to the MWIR band.
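The blackbody-physics argument can be checked directly by integrating the Planck spectral photon radiance over each band. The band edges below (3-5 μm for MWIR, 8-12 μm for LWIR) are illustrative assumptions; actual SLS cutoffs vary by design.

```python
import numpy as np

# Physical constants (SI units)
H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def band_photon_radiance(lo_um, hi_um, temp_k, n=2000):
    """Planck spectral photon radiance integrated over a wavelength band.

    Returns photons / (s * m^2 * sr)."""
    lam = np.linspace(lo_um, hi_um, n) * 1e-6   # wavelength in metres
    spectral = (2.0 * C / lam**4) / np.expm1(H * C / (lam * KB * temp_k))
    # trapezoid-rule integral over the band
    return np.sum(0.5 * (spectral[1:] + spectral[:-1]) * np.diff(lam))

mwir = band_photon_radiance(3.0, 5.0, 300.0)    # 3-5 um band
lwir = band_photon_radiance(8.0, 12.0, 300.0)   # 8-12 um band
print(lwir / mwir)   # tens of times more LWIR photons at ~300 K
```

For a 300 K scene the LWIR band collects photons at a rate a few tens of times higher than the MWIR band, which is the effect the abstract attributes to blackbody physics.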


Proceedings of SPIE, the International Society for Optical Engineering | 2006

A novel NIR camera with extended dynamic range

Austin A. Richards; Shariff D'Souza

We have constructed a novel filter wheel camera that allows filters to be rapidly and sequentially introduced into the optical path of a high-performance NIR (near-infrared) camera based on a staring focal-plane array (FPA) made with indium gallium arsenide (InGaAs) detectors. The filter wheel is populated with neutral density filters ranging from a transmission of 0.97 (essentially no attenuation) to ~10^-5 (an ND5 filter stack). The camera acquires images with increasing attenuation of signal in cycles of six images called subframes. Those images are collapsed into a single radiometrically-calibrated image (called a superframe) with a greatly extended dynamic range. In the current configuration of the system, the radiance dynamic range is about 4×10^6, which is equivalent to 22 bits, a significant enhancement over the nominal 14-bit dynamic range of the camera core. This extended range makes it possible to make radiometric measurements on low ambient light scenes with tremendous variability of temperature or radiance, such as rocket launches, laser beams and intense flames. It is also possible to image scenes with high ambient near-infrared light levels, such as landscape on bright, sunny days without having to dynamically adjust exposure. Since the wheel rotates at high speed (15 Hz), the resulting dataset of six-frame cycles can be reduced to a superframe movie sequence with 15 Hz frame rate, making it possible to image spatially-changing scenes such as rocket launches with good image registration between subframes in a given cycle.
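The dynamic-range figures quoted in the abstract are consistent with each other, as a quick check shows:

```python
import math

# A radiance dynamic range of ~4e6 expressed in bits: log2(4e6) ~ 21.9,
# i.e. about 22 bits versus the camera core's nominal 14 bits.
extended_bits = math.log2(4e6)
print(round(extended_bits, 1))       # 21.9
print(round(extended_bits - 14, 1))  # ~7.9 bits of added dynamic range
```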


Infrared Imaging Systems: Design, Analysis, Modeling, and Testing XXIX | 2018

Measurements of SWIR backgrounds using the swux unit of measure

Austin A. Richards; Martin Hübner; Michael Vollmer

The SWIR waveband between 0.8μm-1.8μm is increasingly exploited by imaging systems in a variety of different applications, including persistent imaging for security and surveillance of high-value assets, handheld tactical imagers, range-gated imaging systems and imaging LADAR for driverless vehicles. The vast majority of these applications utilize lattice-matched InGaAs detectors in their imaging sensors, and these sensors are rapidly falling in price, leading to their widening adoption. As these sensors are used in novel applications and locations, it is important that ambient SWIR backgrounds be understood and characterized for a variety of different field conditions, primarily for the purposes of system performance modeling of SNR and range metrics. SWIR irradiance backgrounds do not consistently track visible-light illumination. There is currently little of this type of information in the open literature, particularly measurements of SWIR backgrounds in urban areas, natural areas, or indoors. This paper presents field measurements done with an InGaAs detector calibrated in the swux unit of InGaAs-band-specific irradiance proposed by two of the authors in 2017. Simultaneous measurements of illuminance levels (in lux) at these sites are presented, as well as visible and InGaAs camera images of the scenery at some of these measurement sites. The swux and lux measurement hardware is described, along with the methods used to calibrate it. Finally, the swux levels during the partial and total phases of the total solar eclipse of 2017 are presented, along with curves fitted to the data from a theoretical model, based on obscuration of the sun by the moon. The apparent differences between photometric and swux measurements will be discussed.


Proceedings of SPIE | 2017

A new radiometric unit of measure to characterize SWIR illumination

Austin A. Richards; M. Hübner

We propose a new radiometric unit of measure we call the ‘swux’ to unambiguously characterize scene illumination in the SWIR spectral band between 0.8μm-1.8μm, where most of the ever-increasing numbers of deployed SWIR cameras (based on standard InGaAs focal plane arrays) are sensitive. Both military and surveillance applications in the SWIR currently suffer from a lack of a standardized SWIR radiometric unit of measure that can be used to definitively compare or predict SWIR camera performance with respect to SNR and range metrics. We propose a unit comparable to the photometric illuminance lux unit; see Ref. [1]. The lack of a SWIR radiometric unit becomes even more critical if one uses lux levels to describe SWIR sensor performance at twilight or even in low-light conditions, since in clear, no-moon conditions in rural areas, the naturally-occurring SWIR radiation from nightglow produces a much higher irradiance than visible starlight. Thus, even well-intentioned efforts to characterize a test site’s ambient illumination levels in the SWIR band may fail based on photometric instruments that only measure visible light. A study of this by one of the authors in Ref. [2] showed that the correspondence between lux values and total SWIR irradiance in typical illumination conditions can vary by more than two orders of magnitude, depending on the spectrum of the ambient background. In analogy to the photometric lux definition, we propose the SWIR irradiance equivalent ‘swux’ level, derived by integration over the scene SWIR spectral irradiance weighted by a spectral sensitivity function S(λ), a SWIR analog of the V(λ) photopic response function.
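The defining integral, a SWIR analog of lux = 683·∫E(λ)V(λ)dλ, can be sketched numerically. The triangular S(λ) below and the absence of a normalization constant are illustrative assumptions only; the actual sensitivity function and scale proposed for the swux are defined in the paper and its references, not here.

```python
import numpy as np

# Wavelength grid over the InGaAs band, in micrometres.
lam = np.linspace(0.8, 1.8, 501)
# Toy triangular sensitivity function peaking at 1.3 um; this stands in
# for the real S(lambda) purely for illustration.
s = np.clip(1.0 - np.abs(lam - 1.3) / 0.5, 0.0, 1.0)

def band_weighted_irradiance(e_spectral):
    """Trapezoid-rule integral of E(lambda) * S(lambda) over 0.8-1.8 um,
    i.e. a swux-style band-weighted irradiance (up to a scale factor)."""
    w = e_spectral * s
    return np.sum(0.5 * (w[1:] + w[:-1]) * np.diff(lam))

# A flat 1 W m^-2 um^-1 spectral irradiance integrates to the area
# under S(lambda), which for this triangle is 0.5.
flat = np.ones_like(lam)
print(band_weighted_irradiance(flat))  # ~0.5
```

As with lux, two scenes with equal broadband irradiance but different spectra generally produce different band-weighted values, which is exactly why a visible-band photometer cannot substitute for a SWIR-band measurement.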


Archive | 2006

Infrared and near-infrared camera hyperframing

Austin A. Richards; Shariff D'Souza


Archive | 2009

Infrared camera systems and methods for dual sensor applications

Nicholas Högasten; Jeffrey S. Scott; Patrick B. Richardson; Jeffrey D. Frank; Austin A. Richards; James T. Woolaway
