Breck A. Sieglinger
Georgia Tech Research Institute
Publications
Featured research published by Breck A. Sieglinger.
Technologies for Synthetic Environments: Hardware-in-the-Loop Testing VIII | 2003
Breck A. Sieglinger; James D. Norman; William M. Meshell; David S. Flynn; Rhoe A. Thompson; George C. Goldsmith
For many types of infrared scene projectors, differences in the outputs of individual elements are one source of error in projecting a desired radiance scene. This is particularly true of resistor-array based infrared projectors. Depending on the sensor and application, the desired response uniformity may prove difficult to achieve. The properties of the sensor used to measure the projector outputs critically affect the procedures that can be used for nonuniformity correction (NUC) of the projector, as well as the final accuracy achievable by the NUC. In this paper we present a description of recent efforts to perform NUC of an infrared projector under “adverse” circumstances. For example, the NUC sensor may have some undesirable properties, including: significant random noise, large residual response nonuniformity, temporal drift in bias or gain response, vibration, and bad pixels. We present a procedure for reliably determining the output versus input response of each individual emitter of a resistor array projector. This NUC procedure has been demonstrated in several projection systems at the Kinetic Kill Vehicle Hardware-In-the-Loop Simulator (KHILS) including those within the KHILS cryogenic chamber. The NUC procedure has proven to be generally robust to various sensor artifacts.
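The core of a per-emitter NUC procedure like the one described is measuring each emitter's output at several drive levels, fitting an output-versus-input response curve, and inverting it to find the drive that produces a desired radiance. The following is a minimal sketch of that idea for a single emitter; the drive levels, measured values, and quadratic response model are invented for illustration and are not the paper's actual procedure, which must also contend with sensor noise, drift, and bad pixels.

```python
import numpy as np

# Hypothetical single-emitter NUC sketch. A real procedure would average
# many sensor frames per drive level and reject bad pixels, as the paper
# discusses; the numbers below are invented example measurements.
drive_levels = np.linspace(0.0, 1.0, 8)                          # commanded inputs
measured = np.array([0.0, 0.9, 2.1, 3.4, 4.8, 6.5, 8.3, 10.2])  # example radiances

# Fit a low-order polynomial response: radiance = f(drive).
coeffs = np.polyfit(drive_levels, measured, deg=2)

def drive_for_radiance(target, coeffs):
    """Invert the fitted response: find the drive level producing 'target'."""
    c = coeffs.copy()
    c[-1] -= target            # solve f(drive) - target = 0
    roots = np.roots(c)
    real = roots[np.isreal(roots)].real
    valid = real[(real >= 0.0) & (real <= 1.0)]   # keep in-range drive
    return valid[0] if valid.size else np.nan

d = drive_for_radiance(5.0, coeffs)
```

Repeating this fit independently for every emitter in the array yields a per-pixel lookup from desired radiance to drive, which is what makes the projected scene uniform.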
Technologies for Synthetic Environments: Hardware-in-the-Loop Testing VIII | 2003
David S. Flynn; Richard Bryan Sisko; Breck A. Sieglinger; Rhoe A. Thompson
Infrared projection systems based on resistor arrays typically produce radiometric outputs with wavelengths that range from less than 3 microns to more than 12 microns. This makes it possible to test infrared sensors with spectral responsivity anywhere in this range. Two resistor-array projectors optically folded together can stimulate the two bands of a 2-color sensor. If the wavebands of the sensor are separated well enough, it is possible to fold the projected images together with a dichroic beam combiner (perhaps also using spectral filters in front of each resistor array) so that each resistor array independently stimulates one band of the sensor. If the wavebands are independently stimulated, it is simple to perform radiometric calibrations of both projector wavebands. In some sensors, the wavebands are strongly overlapping, and driving one of the resistor arrays stimulates both bands of the unit-under-test (UUT). This “coupling” of the two bands causes errors in the radiance levels measured by the sensor if the projector bands are calibrated one at a time. If the coupling between the bands is known, it is possible to preprocess the driving images to effectively decouple the bands. This requires transformations that read both driving images (one in each of the two bands) and judiciously adjust both projectors to give the desired radiance in both bands. With this transformation included, the projection system acts as if the bands were decoupled: varying one input radiance at a time produces a change only in the corresponding band of the sensor. This paper describes techniques that have been developed to perform radiometric calibrations of spectrally coupled, 2-color projector/sensor systems. Also presented are results of tests performed to demonstrate the performance of the calibration techniques, along with possible hardware and algorithms for performing the transformation in real time.
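If the band coupling is assumed linear, the decoupling transformation described above reduces, per pixel, to inverting a 2x2 coupling matrix. The sketch below illustrates this under that linearity assumption; the matrix entries are invented example values, not measured couplings.

```python
import numpy as np

# Assumed linear coupling model: sensed = C @ projected, where
# C[i, j] is the response of sensor band i to projector array j.
# The off-diagonal entries (the band coupling) are invented examples.
C = np.array([[1.00, 0.25],
              [0.15, 1.00]])

def decouple(desired_band1, desired_band2, C):
    """Solve for the two projector drive radiances that yield the
    desired radiance in each (spectrally coupled) sensor band."""
    desired = np.array([desired_band1, desired_band2])
    return np.linalg.solve(C, desired)

p = decouple(3.0, 1.0, C)
sensed = C @ p   # driving with p reproduces the desired band radiances
```

Because the same 2x2 solve is applied at every pixel of the driving images, this is the kind of transformation that lends itself to the real-time hardware implementations the paper mentions.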
Technologies for Synthetic Environments: Hardware-in-the-Loop Testing II | 1997
Breck A. Sieglinger; David S. Flynn; Charles F. Coker
This paper presents an analysis of spatial blurring and sampling effects for a sensor viewing a pixelized scene projector. It addresses the ability of a projector to simulate an arbitrary continuous radiance scene using a field of discrete elements. The spatial fidelity of the projector as seen by an imaging sensor is shown to depend critically on the width of the sensor MTF or spatial response function and on the angular spacing between projector pixels. Quantitative results are presented based on a simulation that compares the output of a sensor viewing a reference scene to the output of the sensor viewing a projector display of the reference scene. The dependence on sensor and projector blur, on scene content, and on the alignment of scene features and sensor samples with the projector pixel locations is addressed. We attempt to determine the projector characteristics required to perform hardware-in-the-loop testing with adequate spatial realism to evaluate seeker functions such as autonomous detection, measurement of radiant intensities and angular positions of unresolved objects, and autonomous recognition and aimpoint selection for resolved objects.
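The comparison described can be sketched in one dimension: blur a reference scene with a sensor PSF and sample it, then do the same for a piecewise-constant projector rendering of that scene, and compare the two sensor outputs. All parameters below (Gaussian PSF, sigma, pitches, the test scene) are invented for illustration, not the paper's simulation settings.

```python
import numpy as np

# 1-D illustration (assumed parameters throughout).
x = np.linspace(0, 10, 2001)                  # fine spatial grid
scene = np.exp(-(x - 5.0) ** 2 / 0.5)         # reference radiance scene

def sensor_view(radiance, x, psf_sigma=0.3, sample_pitch=0.5):
    """Blur with a Gaussian sensor PSF, then sample at the sensor pitch."""
    dx = x[1] - x[0]
    k = np.exp(-0.5 * (np.arange(-500, 501) * dx / psf_sigma) ** 2)
    k /= k.sum()
    blurred = np.convolve(radiance, k, mode="same")
    idx = np.round(np.arange(0, 10, sample_pitch) / dx).astype(int)
    return blurred[idx]

# Projector rendering: the scene averaged over discrete projector pixels,
# producing a piecewise-constant radiance field.
pitch = 0.25                                   # projector pixel pitch
pix = (x // pitch).astype(int)
proj = np.array([scene[pix == i].mean() for i in range(pix.max() + 1)])[pix]

direct = sensor_view(scene, x)                 # sensor views the scene
through_projector = sensor_view(proj, x)       # sensor views the projector
err = np.max(np.abs(direct - through_projector))
```

With the projector pitch comfortably smaller than the sensor blur, as here, the two outputs agree closely; widening the pitch relative to the PSF is what drives the fidelity loss the paper quantifies.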
Technologies for Synthetic Environments: Hardware-in-the-Loop Testing XI | 2006
Breck A. Sieglinger; Steven Arthur Marlow; Richard Bryan Sisko; Rhoe A. Thompson
Testing of two-color imaging sensors often requires precise spatial alignment, including correction of distortion in the optical paths, beyond what can be achieved mechanically. Testing, in many cases, also demands careful radiometric calibration, which may be complicated by overlap in the spectral responses of the two sensor bands. In this paper, we describe calibration procedures used at the Air Force Research Laboratory hardware-in-the-loop (HWIL) facility at Eglin AFB, and present some results of recent two-color testing in a cryo-vacuum test chamber.
Technologies for Synthetic Environments: Hardware-in-the-Loop Testing XI | 2006
Owen M. Williams; Leszek Swierkowski; Breck A. Sieglinger; George C. Goldsmith
Resistor array infrared projectors offer the unique potential of simultaneously covering a wide apparent temperature range and providing fine temperature resolution at low output levels. The temperature resolution capability may not be realized, however, if the projector error sources are not controlled: for example, residual nonuniformity after nonuniformity correction (NUC) procedures have been applied, temporal noise in analog drive voltages, and quantization at several points in the projection system may all introduce errors larger than the desired resolution. In this paper the temperature resolution limits are assessed in general. In particular, the quantization errors are assessed, and the post-NUC residual nonuniformity levels required for achieving fine temperature resolution are calculated.
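A back-of-envelope version of the quantization assessment is to convert one DAC code (one LSB) of the drive into an apparent-temperature step via the local slope of the in-band radiance versus temperature curve. The numbers below are invented placeholders, not the paper's values.

```python
# Back-of-envelope quantization sketch (all numbers are assumed examples).
n_bits = 16
L_min, L_max = 0.0, 100.0                     # projected radiance span, arb. units
lsb = (L_max - L_min) / (2 ** n_bits - 1)     # radiance step per DAC code

# Linearized conversion: if dL/dT is the local slope of in-band radiance
# vs apparent temperature, one LSB corresponds to this temperature step.
dL_dT = 0.8                                   # assumed local slope, units/K
dT_per_lsb = lsb / dL_dT
```

Because dL/dT shrinks at low apparent temperatures, the same LSB maps to a coarser temperature step at the bottom of the range, which is exactly where fine resolution is wanted.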
Proceedings of SPIE | 1996
Breck A. Sieglinger; David S. Flynn; Charles F. Coker
Hardware-in-the-loop (HWIL) simulation combines functional hardware with digital models. This technique has proven useful for test and evaluation of guided missile seekers. In a nominal configuration, the seeker is stimulated by synthetic image data. Seeker outputs are passed to a simulation control computer that simulates the guidance, navigation, control, and airframe response of the missile. The seeker can be stimulated either by a projector or by direct signal injection (DSI). Despite recent advancements in scene projection technology, there are practical limits to the scenes produced by a scene projector, so the test method of choice is often DSI. In this mode, sensor hardware is not used; instead, scene signature data is computed, sensor measurement effects are simulated, and the result is provided directly to the seeker signal processor. The computed images include sensor effects such as blurring, sampling, detector response characteristics, and noise. This paper discusses DSI methods for HWIL, with specific applications at the Air Force Kinetic Kill Vehicle Hardware-in-the-Loop Simulator (KHILS) facility.
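A DSI-style pipeline of the kind described (blur, detector sampling, response, noise) can be sketched as a short chain of image operations. Everything below is an illustrative stand-in with invented parameters, not the KHILS implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic radiance scene on a fine grid, with a small bright target
# (parameters invented for illustration).
scene = np.zeros((128, 128))
scene[60:68, 60:68] = 50.0

# 1. Optical blur: separable Gaussian PSF.
def gaussian_blur(img, sigma=1.5):
    n = int(4 * sigma)
    k = np.exp(-0.5 * (np.arange(-n, n + 1) / sigma) ** 2)
    k /= k.sum()
    img = np.apply_along_axis(lambda r: np.convolve(r, k, "same"), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, k, "same"), 0, img)

blurred = gaussian_blur(scene)

# 2. Detector sampling: average over 2x2 blocks of the fine grid.
sampled = blurred.reshape(64, 2, 64, 2).mean(axis=(1, 3))

# 3. Detector response (gain/offset) plus temporal noise; this is the
#    image that would be injected into the seeker signal processor.
gain, offset = 0.9, 2.0
counts = gain * sampled + offset + rng.normal(0.0, 0.5, sampled.shape)
```

The point of DSI is that this chain replaces the physical sensor front end, so each stage must be modeled with enough fidelity for the seeker algorithms under test.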
Technologies for Synthetic Environments: Hardware-in-the-Loop Testing VII | 2002
Wayne Keen; David S. Flynn; Thomas P. Bergin; Breck A. Sieglinger; Rhoe A. Thompson
As discussed in a previous paper presented to this forum, optical components such as collimators that are part of many infrared projection systems can introduce significant distortions in the sensed position of projected objects relative to their true position. The previous paper discussed the removal of these distortions in a single waveband through a polynomial correction process, applied during post-processing of the data from the infrared camera-under-test. This paper extends the correction technique to two-color infrared projection. The extension allows the distortions in the individual bands to be corrected and provides for alignment of the two color channels at the aperture of the camera-under-test. The co-alignment of the two color channels is obtained by applying the distortion removal function to the object position data prior to projection.
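A polynomial distortion correction of this kind can be sketched by fitting a low-order 2-D polynomial mapping from sensed positions back to true positions over a calibration grid; applying that fitted map to object positions before projection pre-cancels the distortion. The distortion model, grid, and polynomial order below are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
true_xy = rng.uniform(-1, 1, (200, 2))        # calibration grid points

def distort(xy):
    """Stand-in optical distortion: a mild radial term (invented)."""
    r2 = (xy ** 2).sum(axis=1, keepdims=True)
    return xy * (1.0 + 0.02 * r2)

sensed_xy = distort(true_xy)

def design(xy):
    """Third-order 2-D polynomial basis."""
    x, y = xy[:, 0], xy[:, 1]
    return np.stack([np.ones_like(x), x, y,
                     x * y, x ** 2, y ** 2,
                     x ** 3, x ** 2 * y, x * y ** 2, y ** 3], axis=1)

# Fit the reverse mapping sensed -> true by least squares. Applying
# 'correct' to desired object positions before projection cancels the
# distortion seen by the camera-under-test.
A = design(sensed_xy)
coeffs, *_ = np.linalg.lstsq(A, true_xy, rcond=None)

def correct(xy):
    return design(xy) @ coeffs

resid = np.abs(correct(sensed_xy) - true_xy).max()
```

For two-color projection, fitting one such polynomial per band against a common reference grid both removes the per-band distortion and co-aligns the two channels, which is the extension the paper describes.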
Technologies for Synthetic Environments: Hardware-in-the-Loop Testing II | 1997
David S. Flynn; Breck A. Sieglinger; Robert Lee Murrer; Lawrence E. Jones; Eric M. Olson; Allen R. Andrews; James A. Gordon
In a series of measurements made to characterize the performance of a Wideband Infrared Scene Projector (WISP) system, timing artifacts were observed in one set of tests in which the projector update was synchronized with the camera readout. The projector was driven with images that varied from frame to frame, and the measured images were examined to determine if they varied from frame to frame in a corresponding manner. It was found that regardless of the relative time delay between the projector update and sensor readout, each output image was a result of two input images. By analyzing the timing characteristics of the camera integration scheme and the WISP update scheme it was possible to understand effects in the measured images and simulate images with the same effects. This paper describes the measurements and the analyses. Although the effects were due to the unique camera integration and readout scheme, the effects could show up when testing other sensors. Thus also presented in this paper are techniques for testing with resistive array projectors, so that the timing artifacts observed with various kinds of cameras are minimized or eliminated.
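The observation that each output image was a blend of two input images can be captured with a simple mixing model: when the camera integration window straddles a projector update, the measured frame is a weighted sum of the frames displayed before and after the update. This is an assumed illustration of that effect, not the paper's detailed timing analysis.

```python
import numpy as np

def measured_frame(frame_a, frame_b, t_update, t_int):
    """frame_a is displayed until t_update (within [0, t_int]) and
    frame_b afterward; the camera integrates over the whole window,
    so the measured frame is a weighted blend of the two inputs."""
    alpha = np.clip(t_update / t_int, 0.0, 1.0)
    return alpha * frame_a + (1.0 - alpha) * frame_b

a = np.full((4, 4), 10.0)                     # frame before the update
b = np.full((4, 4), 30.0)                     # frame after the update
m = measured_frame(a, b, t_update=0.25, t_int=1.0)   # 25% of a, 75% of b
```

The mitigation techniques the paper presents amount to arranging the update and integration timing so that alpha is driven to 0 or 1 (or the blend is otherwise made benign) for the camera at hand.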
Proceedings of SPIE | 2012
Joe LaVeigne; Breck A. Sieglinger
Achieving very high apparent temperatures is a persistent goal in infrared scene projector (IRSP) design. Several programs are currently under way to develop technologies for producing high apparent temperatures. Producing a useful system capable of reproducing high-fidelity scenes across a large range of apparent temperatures requires more than just a high temperature source: the entire scene projection system must support the extended dynamic range of the desired scenarios, and supporting this extended range places requirements on the rest of the system. System resolution and non-uniformity correction (NUC) are two areas of concern in the development of a high dynamic range IRSP. We report the results of initial investigations into the resolution required for acceptable system performance and into the effects that moving to a higher dynamic range may have on existing NUC procedures.
Technologies for Synthetic Environments: Hardware-in-the-Loop Testing XIII | 2008
Jon Geske; Kevin Sparkman; Jim Oleson; Joe LaVeigne; Breck A. Sieglinger; Steve Marlow; Heard S. Lowry; James Burns
Polarization is increasingly being considered as a method of discrimination in passive sensing applications. In this paper the degree of polarization of the thermal emission from the emitter arrays of two new Santa Barbara Infrared (SBIR) micro-bolometer resistor array scene projectors was characterized at ambient temperature and at 77 K. The emitter arrays characterized were from the Large Format Resistive Array (LFRA) and the Optimized Arrays for Space-Background Infrared Simulation (OASIS) scene projectors. This paper reports the results of this testing.
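A standard way to characterize the degree of linear polarization of emission like this is a Stokes-based reduction of intensities measured through a rotating linear polarizer at 0, 45, 90, and 135 degrees. The sketch below shows that reduction; the intensity values are invented for illustration and are not the SBIR measurement results.

```python
import numpy as np

# Example intensities through a linear polarizer at four angles
# (invented numbers, not measured data from the paper).
I0, I45, I90, I135 = 1.00, 0.80, 0.60, 0.80

# Linear Stokes parameters from the four measurements.
S0 = 0.5 * (I0 + I45 + I90 + I135)   # total intensity
S1 = I0 - I90                        # 0/90-degree preference
S2 = I45 - I135                      # 45/135-degree preference

# Degree of linear polarization.
dolp = np.hypot(S1, S2) / S0
```

A DoLP near zero indicates essentially unpolarized emission; a significant DoLP from an emitter array would matter for testing sensors that use polarization as a discriminant.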