Jeremy P. Bos
Michigan Technological University
Publications
Featured research published by Jeremy P. Bos.
Optical Engineering | 2012
Jeremy P. Bos; Michael C. Roggemann
The presence of turbulence over horizontal imaging paths severely reduces the resolution available to imaging systems and introduces anisoplanatic distortions in the image frame. A variety of image processing techniques that can mitigate these effects are being developed, and sequences of high-fidelity images created via simulation are useful in their development. In this paper, we describe a simplification of the split-step wave propagation method that employs a series of uniformly spaced phase screens to accurately simulate turbulence effects on imaging over horizontal paths. Employing this method, a series of 1000 image frames was generated for each of three turbulence conditions. The mean squared error in intensity per pixel is also evaluated for each frame in comparison to a diffraction-limited reference. Examination of the per-frame intensity error statistics indicates that these errors are log-normally distributed about a mean value that increases approximately linearly with turbulence strength.
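As a rough illustration of the uniformly spaced split-step idea described above, the sketch below alternates an FFT-generated Kolmogorov phase screen with a Fresnel propagation step. It is not the authors' simulation code: the grid size, wavelength, path length, and r0 values are placeholders, and the subharmonic low-frequency correction is omitted.

```python
import numpy as np

def kolmogorov_screen(n, dx, r0, rng):
    """Single FFT-based Kolmogorov phase screen; subharmonic low-frequency
    correction is omitted for brevity."""
    df = 1.0 / (n * dx)                                # frequency grid spacing [1/m]
    fx = np.fft.fftfreq(n, d=dx)
    fx, fy = np.meshgrid(fx, fx)
    f = np.sqrt(fx**2 + fy**2)
    f[0, 0] = np.inf                                   # suppress the undefined DC term
    psd = 0.023 * r0**(-5.0 / 3.0) * f**(-11.0 / 3.0)  # Kolmogorov phase PSD
    cn = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    return np.real(np.fft.ifft2(cn * np.sqrt(psd) * df) * n * n)

def fresnel_step(u, wvl, dx, dz):
    """Paraxial (Fresnel) transfer-function propagation over distance dz;
    the constant carrier phase exp(i k dz) is dropped."""
    fx = np.fft.fftfreq(u.shape[0], d=dx)
    fx, fy = np.meshgrid(fx, fx)
    H = np.exp(-1j * np.pi * wvl * dz * (fx**2 + fy**2))
    return np.fft.ifft2(np.fft.fft2(u) * H)

# Illustrative (not the paper's) parameters: 1 km path, 10 uniformly spaced screens.
n, dx, wvl = 512, 5e-3, 1e-6
path_len, n_screens = 1000.0, 10
dz = path_len / n_screens
r0_total = 0.05
r0_screen = r0_total * n_screens**(3.0 / 5.0)          # per-screen r0 so N screens combine to r0_total
rng = np.random.default_rng(1)

u = np.ones((n, n), dtype=complex)                     # unit-amplitude plane wave
for _ in range(n_screens):
    u = u * np.exp(1j * kolmogorov_screen(n, dx, r0_screen, rng))
    u = fresnel_step(u, wvl, dx, dz)
```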
Applied Optics | 2015
Jeremy P. Bos; Michael C. Roggemann; V. S. Rao Gudimetla
We describe a modification to fast Fourier transform (FFT)-based, subharmonic, phase screen generation techniques that accounts for non-Kolmogorov and anisotropic turbulence. Our model also allows the angle of anisotropy to vary in the plane orthogonal to the direction of propagation. In addition, turbulence strength in our model is specified via a characteristic length equivalent to the Fried parameter in isotropic, Kolmogorov turbulence. Incorporating this feature enables comparison of propagation scenarios with differing anisotropies and power-law exponents against the standard isotropic, Kolmogorov model. We show that the accuracy of this technique is comparable to that of other FFT-based subharmonic methods for three-dimensional spectral power-law exponents up to approximately 3.9.
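A minimal sketch of how anisotropy and a general power-law exponent might enter an FFT-based screen generator is shown below. It is illustrative only: the normalization constant is the Kolmogorov one rather than an exponent-dependent form, subharmonics are again omitted, and all parameter values are placeholders.

```python
import numpy as np

def aniso_screen(n, dx, r0_eff, alpha, mu, theta, rng):
    """FFT phase screen with power-law exponent alpha, anisotropy ratio mu,
    and anisotropy angle theta in the plane orthogonal to propagation.
    Normalization is schematic and subharmonics are omitted."""
    df = 1.0 / (n * dx)
    fx = np.fft.fftfreq(n, d=dx)
    fx, fy = np.meshgrid(fx, fx)
    # Rotate the frequency grid by theta, then stretch one axis by mu.
    fxp = fx * np.cos(theta) + fy * np.sin(theta)
    fyp = -fx * np.sin(theta) + fy * np.cos(theta)
    f = np.sqrt((mu * fxp)**2 + fyp**2)
    f[0, 0] = np.inf
    # Generalized power-law PSD; the Kolmogorov constant 0.023 is kept for
    # simplicity, so r0_eff only matches the Fried parameter when alpha = 11/3.
    psd = 0.023 * r0_eff**(-5.0 / 3.0) * f**(-alpha)
    cn = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    return np.real(np.fft.ifft2(cn * np.sqrt(psd) * df) * n * n)

# Example: alpha = 3.7 (non-Kolmogorov), 2:1 anisotropy rotated by 30 degrees.
rng = np.random.default_rng(0)
phz = aniso_screen(512, 5e-3, 0.05, 3.7, 2.0, np.radians(30.0), rng)
```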
Proceedings of SPIE | 2012
Jeremy P. Bos; Aleksandr Sergeyev; Michael C. Roggemann
To explore anisoplanatism over horizontal paths, an LCS experiment was developed. The experiment operated in real-world conditions over a 3.2 km path, partly over water. To compare the results obtained from the experimental data with established theory, we modeled the experimental path via simulation using a finite number of phase screens. The scale and location of the phase screens in the simulation were varied to account for different turbulence conditions along the propagation path. Preliminary comparison of our experimental data and simulations shows that adjacent PSFs are significantly correlated at angles much larger than the predicted theoretical isoplanatic angle.
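The kind of comparison described, correlating PSFs at increasing angular separations against the predicted isoplanatic angle, could be sketched as follows. The PSF stack, the assumption of equal angular spacing, and the pixel scale here are placeholders, not the experiment's actual sampling.

```python
import numpy as np

def psf_correlation(psfs, angular_step_rad):
    """Pearson correlation between the first PSF and PSFs at increasing field
    angles; psfs has shape (n_angles, ny, nx) and is assumed to be sampled at
    equal angular increments relative to the first PSF."""
    ref = psfs[0].ravel()
    ref = (ref - ref.mean()) / ref.std()
    corr = []
    for k in range(1, psfs.shape[0]):
        p = psfs[k].ravel()
        p = (p - p.mean()) / p.std()
        corr.append(np.mean(ref * p))
    separations = np.arange(1, psfs.shape[0]) * angular_step_rad
    return separations, np.array(corr)

# Placeholder data: a random PSF stack just to show the call signature.
rng = np.random.default_rng(2)
seps, rho = psf_correlation(rng.random((10, 64, 64)), 5e-6)
```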
Optical Engineering | 2012
Jeremy P. Bos
Image reconstruction methods employed in horizontal imaging scenarios must provide the best possible reconstructions over a broad range of conditions. In addition to variations in turbulence conditions, these systems must perform when operated by novice users who may lack knowledge of the underlying reconstruction algorithms. Systems that maintain their performance under these conditions and are otherwise immune to factors that cause variability in performance are considered robust. We parametrically evaluate the effect of varying design parameters such as the number of input frames, the inverse filter parameters, and the number of phase estimates. We find that, for most conditions, it is possible to achieve performance near the limit available from these estimators using as few as 15 input frames, the average of 4 estimates of the object phase, and only approximate knowledge of the imaging conditions. When operated under these conditions, speckle imagers can be considered robust. In addition to this analysis, the bispectrum and Knox-Thompson methods were compared as alternative phase estimation techniques. We find that the Knox-Thompson phase estimation method is preferred in situations where minimal computational complexity is desired.
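For context, a heavily simplified Knox-Thompson estimator (single frequency offset, single integration axis, no SNR weighting) might look like the sketch below; the frame stack is a placeholder and the full 2-D estimator evaluated in the paper is not reproduced.

```python
import numpy as np

def knox_thompson_phase(frames, offset=(0, 1)):
    """Accumulate the Knox-Thompson cross spectrum over a stack of short-exposure
    frames and integrate the resulting phase differences along one axis. This is
    a single-offset, single-axis simplification of the full estimator."""
    F = np.fft.fft2(frames, axes=(-2, -1))
    # Cross spectrum <I(f) I*(f + df)> averaged over frames.
    shifted = np.roll(F, shift=(-offset[0], -offset[1]), axis=(-2, -1))
    cross = np.mean(F * np.conj(shifted), axis=0)
    dphi = np.angle(cross)            # phase difference between neighboring frequencies
    # Integrate the differences along the offset direction; the sign flip accounts
    # for dphi being phi(f) - phi(f + df).
    return -np.cumsum(dphi, axis=1)

# Placeholder stack of 15 short-exposure frames.
rng = np.random.default_rng(3)
obj_phase = knox_thompson_phase(rng.random((15, 128, 128)))
```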
IEEE Aerospace Conference | 2014
Casey J. Pellizzari; Jeremy P. Bos; Mark F. Spencer; Skip Williams; Stacie Williams; Brandoch Calef; Daniel C. Senft
Phase Gradient Autofocus (PGA) is an effective algorithm for estimating and removing piston-phase errors from spotlight-mode synthetic aperture radar (SAR) data. For target scenes dominated by a point source, the algorithm has been shown to be optimal in the sense that it approaches the Cramér-Rao bound for carrier-to-noise ratios (CNRs) as low as -5 dB. In this paper, we explore PGA's effectiveness against ground-based inverse synthetic aperture LADAR (ISAL) observations of spacecraft, where the target characteristics and phase errors are quite different from those in the SAR case. At optical wavelengths, the power spectrum of the piston-phase errors will be dominated less by platform motion and more by atmospheric variations. In addition, space objects will have fewer range-resolution cells across them than would a typical extended SAR scene. This research characterizes the performance limitations of PGA for an ISAL system as a function of CNR and the number of range-resolution cells across the scene. A high-fidelity wave-optics simulation is used to generate representative test data for input to the PGA algorithm. Emphasis is placed on finding the lower limits of performance for which image reconstruction is possible.
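A compressed sketch of one PGA iteration is shown below, with the windowing step and iteration control omitted and placeholder data standing in for ISAL phase history; it illustrates the general algorithm, not the authors' implementation.

```python
import numpy as np

def pga_iteration(img):
    """One Phase Gradient Autofocus iteration on a complex image
    (rows = range bins, columns = cross-range). Windowing is omitted and the
    gradient estimate uses the simple maximum-likelihood kernel."""
    n_rng, n_x = img.shape
    # 1. Circularly shift the brightest scatterer in each range bin to center.
    shifts = n_x // 2 - np.argmax(np.abs(img), axis=1)
    centered = np.array([np.roll(row, s) for row, s in zip(img, shifts)])
    # 2. Transform each range bin back to the phase-history (slow-time) domain.
    g = np.fft.ifft(centered, axis=1)
    # 3. Maximum-likelihood estimate of the phase-error gradient across pulses.
    dg = np.diff(g, axis=1)
    num = np.sum(np.imag(np.conj(g[:, :-1]) * dg), axis=0)
    den = np.sum(np.abs(g[:, :-1])**2, axis=0)
    dphi = num / den
    # 4. Integrate the gradient and remove the error from the original phase history.
    phi = np.concatenate(([0.0], np.cumsum(dphi)))
    g_corr = np.fft.ifft(img, axis=1) * np.exp(-1j * phi)[None, :]
    return np.fft.fft(g_corr, axis=1), phi

# Placeholder complex image: random speckle standing in for ISAL data.
rng = np.random.default_rng(4)
img = rng.standard_normal((64, 256)) + 1j * rng.standard_normal((64, 256))
focused, phase_err = pga_iteration(img)
```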
Optical Engineering | 2014
Glen Archer; Jeremy P. Bos; Michael C. Roggemann
The potential benefits of real-time, or near-real-time, image processing hardware to correct turbulence-induced image defects for long-range surveillance and weapons targeting are sufficient to motivate significant resource commitment to their development. Quantitative comparisons between potential candidates are necessary to decide on a preferred processing algorithm. We begin by comparing the mean-square-error (MSE) performance of speckle imaging (SI) methods and multiframe blind deconvolution (MFBD) applied to long-path horizontal imaging of a static scene under anisoplanatic seeing conditions. Both methods are used to reconstruct a scene from three sets of 1000 simulated images featuring low, moderate, and severe turbulence-induced aberrations. The comparison shows that SI techniques can reduce the MSE by up to 47% using 15 input frames under daytime conditions. The MFBD method provides up to 40% improvement in MSE under the same conditions. The performance comparison is repeated under three diminishing light conditions (30, 15, and 8 photons per pixel on average), where improvements of up to 39% can be achieved using SI methods with 25 input frames, and up to 38% for the MFBD method using 150 input frames. The MFBD estimator is applied to three sets of field data and representative results are presented. Finally, the performance of a hybrid bispectrum-MFBD estimator, which uses a rapid bispectrum estimate as the starting point for the MFBD image reconstruction algorithm, is examined.
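The MSE improvement figures above could be computed with a comparison of the kind sketched below; the percent-improvement convention (reduction relative to the raw degraded imagery, measured against the diffraction-limited reference) is an assumption made for illustration, and the arrays are placeholders.

```python
import numpy as np

def mse_improvement(recon, degraded, reference):
    """Percent reduction in per-pixel mean-squared intensity error of a
    reconstruction relative to the raw degraded imagery, both measured against
    a diffraction-limited reference."""
    mse_recon = np.mean((recon - reference)**2)
    mse_degraded = np.mean((degraded - reference)**2)
    return 100.0 * (1.0 - mse_recon / mse_degraded)

# Placeholder arrays standing in for a reconstruction, a degraded frame, and
# the diffraction-limited reference.
rng = np.random.default_rng(5)
ref = rng.random((256, 256))
print(mse_improvement(ref + 0.05 * rng.random((256, 256)),
                      ref + 0.20 * rng.random((256, 256)), ref))
```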
Optical Engineering | 2012
Jeremy P. Bos; Michael C. Roggemann
We propose using certain blind image quality metrics to tune the inverse filter used for amplitude recovery in speckle imaging systems. The inverse filter in these systems requires knowledge of the blurring function. When imaging through turbulence over long horizontal paths near the ground, an estimate of the blurring function can be obtained from theoretical models incorporating an estimate of the integrated turbulence strength along the imaging path. Estimates provided by the user in these scenarios are likely to be inaccurate, resulting in suboptimal reconstructions. In this work, we use two blind image quality metrics, one based on image sharpness and the other on anisotropy in image entropy, to tune the value of integrated turbulence strength in the long-exposure atmospheric blurring model, with the goal of providing images equivalent to the minimum mean squared error (MMSE) image available. We find that both blind metrics are capable of choosing images that differ from the MMSE image by less than 4% using simulated data. Using the image sharpness metric, it was possible to produce images within 1% of the MMSE image on average. Both metrics were also able to produce high-quality reconstructions from field data.
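A schematic of the sharpness-driven tuning loop is given below: candidate turbulence strengths are swept through a long-exposure atmospheric MTF and a regularized inverse filter, and the reconstruction with the best blind score is kept. The MTF form, sharpness metric, and constants are illustrative stand-ins, and the entropy-anisotropy metric from the paper is not reproduced.

```python
import numpy as np

def sharpness(img):
    """Blind sharpness metric: sum of squared normalized intensities."""
    p = img / img.sum()
    return np.sum(p**2)

def long_exposure_mtf(n, dx, wvl, focal, r0):
    """Long-exposure atmospheric MTF exp[-3.44 (lambda * F * f / r0)^(5/3)]
    sampled on the image-plane frequency grid."""
    fx = np.fft.fftfreq(n, d=dx)
    fx, fy = np.meshgrid(fx, fx)
    f = np.sqrt(fx**2 + fy**2)
    return np.exp(-3.44 * (wvl * focal * f / r0)**(5.0 / 3.0))

def tune_inverse_filter(avg_img, r0_candidates, dx, wvl, focal, eps=1e-3):
    """Apply a regularized (Wiener-like) inverse filter for each candidate r0
    and return the reconstruction with the highest blind sharpness score."""
    n = avg_img.shape[0]
    I = np.fft.fft2(avg_img)
    best = None
    for r0 in r0_candidates:
        H = long_exposure_mtf(n, dx, wvl, focal, r0)
        est = np.real(np.fft.ifft2(I * H / (H**2 + eps)))
        score = sharpness(np.clip(est, 0, None))
        if best is None or score > best[0]:
            best = (score, r0, est)
    return best

# Placeholder average image; pixel pitch, wavelength, and focal length are illustrative.
rng = np.random.default_rng(6)
score, r0_hat, recon = tune_inverse_filter(rng.random((256, 256)),
                                           np.linspace(0.01, 0.10, 10),
                                           5e-6, 1e-6, 1.0)
```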
Proceedings of SPIE | 2011
Jeremy P. Bos; Michael C. Roggemann
The problem of horizontal imaging through the atmospheric boundary layer is common in defense, surveillance, and remote sensing applications. Like all earth-bound imaging systems, systems operating over these paths are limited in resolving capability by atmospheric turbulence. Using speckle imaging techniques it is often possible to overcome these effects and recover images with resolution approaching the diffraction limit. We examine the performance of a bispectrum-based speckle imaging technique applied to imaging scenarios near the ground. Computer simulations were used to generate three sets of 70 turbulence-degraded images with varied turbulence strength. Early results indicate the bispectrum to be a robust estimator for images corrupted by the anisoplanatic turbulence encountered when imaging horizontally. Bispectrum-reconstructed image frames show an improvement of nearly 60% in mean squared error (MSE) on average over the examined turbulence strengths. The improvement in MSE was found to increase as additional input frames were used for image reconstruction, though using as few as 10 input frames provided a 50% improvement in MSE on average over the examined turbulence strengths.
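As a pointer to how the bispectrum estimator works, the 1-D sketch below averages the bispectrum over a stack of frames and recovers the Fourier phase with the standard recursion; the 2-D estimator used for actual image reconstruction follows the same pattern over vector frequencies. The input stack is a placeholder.

```python
import numpy as np

def bispectrum_phase_1d(frames):
    """Average the bispectrum of a stack of 1-D signals and recover the Fourier
    phase with the recursion phi(p+q) = phi(p) + phi(q) - angle(B(p, q))."""
    F = np.fft.fft(frames, axis=1)
    n = F.shape[1]
    m = n // 2
    # Accumulate B(p, q) = <F(p) F(q) F*(p+q)> over frames.
    B = np.zeros((m, m), dtype=complex)
    for p in range(m):
        for q in range(m):
            B[p, q] = np.mean(F[:, p] * F[:, q] * np.conj(F[:, (p + q) % n]))
    beta = np.angle(B)
    # Recursive phase recovery; phi(0) = phi(1) = 0 fixes piston and tilt.
    phi = np.zeros(m)
    for r in range(2, m):
        terms = [phi[p] + phi[r - p] - beta[p, r - p] for p in range(1, r // 2 + 1)]
        phi[r] = np.angle(np.mean(np.exp(1j * np.array(terms))))  # unit-amplitude average
    return phi

# Placeholder stack of 70 1-D "frames".
rng = np.random.default_rng(7)
phase = bispectrum_phase_1d(rng.random((70, 128)))
```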
Proceedings of SPIE | 2016
Jeffrey Beck; Celina Bekins; Jeremy P. Bos
Phase screen-based wave optics simulations are a fundamental tool for researchers seeking to understand the effect of atmospheric turbulence on laser beam propagation and imaging. Most wave optics packages are either themselves proprietary or rely on expensive, proprietary software. We have developed WavePy, a wave optics package built in the open-source Python programming environment. Using WavePy, we have been able to produce turbulence-corrupted imagery similar to that observed by ground-based telescopes imaging space objects. We have also simulated plane-wave and spherical-wave propagation through uniform turbulence volumes. In both cases, we found the execution times of the WavePy and MATLAB versions to be comparable under certain circumstances.
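WavePy's own API is not reproduced here, but the kind of execution-time comparison mentioned can be sketched with a stand-in propagation step timed via Python's timeit; the array size and kernel below are placeholders.

```python
import timeit
import numpy as np

# Stand-in for a single split-step propagation update (FFT, transfer function,
# inverse FFT); this is not WavePy's actual API.
n = 1024
rng = np.random.default_rng(8)
u = np.exp(1j * rng.random((n, n)))
H = np.exp(-1j * rng.random((n, n)))

def step():
    return np.fft.ifft2(np.fft.fft2(u) * H)

t = timeit.timeit(step, number=20) / 20
print(f"mean time per {n}x{n} propagation step: {t * 1e3:.1f} ms")
```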
Applied Optics | 2015
Solmaz Hajmohammadi; Saeid Nooshabadi; Jeremy P. Bos
This paper presents a massively parallel method for the phase reconstruction of an object from its bispectrum phase. Our aim is to recover an enhanced version of a turbulence-corrupted image by developing an efficient and fast parallel image-restoration algorithm. The proposed massively parallel bispectrum algorithm relies on multiple-block parallelization. Further, within each block, we employ wavefront processing through strength reduction to parallelize an iterative algorithm. Results are presented and compared with the existing iterative bispectrum method. We report a speed-up factor of 85.94 with respect to a sequential implementation of the same algorithm for an image size of 1024×1024.
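A toy sketch of the wavefront ordering idea follows: in the bispectrum phase recursion, every frequency sample on an anti-diagonal of constant index sum depends only on earlier anti-diagonals, so each sweep can be updated in parallel (here with numpy vectorization standing in for GPU thread blocks). The single-decomposition recursion and array sizes are illustrative; the paper's block decomposition and strength-reduction details are not reproduced.

```python
import numpy as np

def recover_phase_wavefront(beta):
    """Recover a 2-D Fourier phase from a bispectrum-phase array
    beta[px, py, qx, qy] by sweeping anti-diagonal 'wavefronts': every frequency
    (ux, uy) with ux + uy = s can be updated independently once all smaller
    index sums are known. A single decomposition u = p + q with q a unit step
    is used for brevity; phi(0,0), phi(1,0), phi(0,1) = 0 fix piston and tilt."""
    m = beta.shape[0]
    phi = np.zeros((m, m))
    for s in range(2, 2 * m - 1):
        ux = np.arange(max(0, s - m + 1), min(s, m - 1) + 1)
        uy = s - ux
        # All points on this wavefront depend only on wavefront s - 1,
        # so they can be computed together (vectorized or in parallel threads).
        use_x = ux > 0
        px = np.where(use_x, ux - 1, ux)
        py = np.where(use_x, uy, uy - 1)
        qx = np.where(use_x, 1, 0)
        qy = np.where(use_x, 0, 1)
        phi[ux, uy] = phi[px, py] + phi[qx, qy] - beta[px, py, qx, qy]
    return phi

# Tiny placeholder bispectrum-phase array just to exercise the sweep.
m = 16
phi = recover_phase_wavefront(np.zeros((m, m, m, m)))
```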