
Publication


Featured research published by Michael J. Cree.


Proceedings of SPIE | 2011

Separating true range measurements from multi-path and scattering interference in commercial range cameras

Adrian A. Dorrington; John Peter Godbaz; Michael J. Cree; Andrew D. Payne; Lee V. Streeter

Time-of-flight range cameras acquire a three-dimensional image of a scene simultaneously for all pixels from a single viewing location. Attempts to use range cameras for metrology applications have been hampered by the multi-path problem, which causes range distortions when stray light interferes with the range measurement in a given pixel. Correcting multi-path distortions by post-processing the three-dimensional measurement data has been investigated, but enjoys limited success because the interference is highly scene-dependent. An alternative approach based on separating the strongest and weaker sources of light returned to each pixel, prior to range decoding, is more successful, but has only been demonstrated on custom-built range cameras, and has not been suitable for general metrology applications. In this paper we demonstrate an algorithm applied to two unmodified off-the-shelf range cameras, the Mesa Imaging SR-4000 and the Canesta Inc. XZ-422 Demonstrator. Additional raw images are acquired and processed using an optimization approach, rather than relying on the processing provided by the manufacturer, to determine the individual component returns in each pixel. Substantial improvements in accuracy are observed, especially in the darker regions of the scene.
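The paper's optimization operates on raw camera frames; as a loose illustration of the underlying two-return idea (not the authors' implementation), the sketch below fits the amplitudes and ranges of two component returns to complex measurements taken at several modulation frequencies using nonlinear least squares. The frequencies, bounds and initial guess are assumptions chosen for the example.

```python
import numpy as np
from scipy.optimize import least_squares

C = 3e8  # speed of light (m/s)

def two_return_model(params, freqs):
    # Each return j has amplitude a_j and range d_j, contributing the phasor
    # a_j * exp(1j * 4*pi*f*d_j / c) at modulation frequency f.
    a1, d1, a2, d2 = params
    return (a1 * np.exp(1j * 4 * np.pi * freqs * d1 / C)
            + a2 * np.exp(1j * 4 * np.pi * freqs * d2 / C))

def residuals(params, freqs, meas):
    err = two_return_model(params, freqs) - meas
    return np.concatenate([err.real, err.imag])

def separate_returns(meas, freqs, init=(1.0, 1.0, 0.5, 3.0)):
    """Fit the amplitudes and ranges of two component returns to complex
    measurements taken at several modulation frequencies."""
    fit = least_squares(residuals, init, args=(freqs, meas),
                        bounds=([0, 0, 0, 0], [np.inf, 20, np.inf, 20]))
    return fit.x

# Example: a pixel containing a strong return at 2.0 m plus scattered light
# appearing at 4.5 m, measured at four (assumed) modulation frequencies.
freqs = np.array([20e6, 30e6, 40e6, 50e6])
meas = two_return_model((1.0, 2.0, 0.3, 4.5), freqs)
print(separate_returns(meas, freqs))
```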


Proceedings of SPIE | 2012

Closed-form Inverses for the Mixed Pixel/Multipath Interference Problem in AMCW Lidar

John Peter Godbaz; Michael J. Cree; Adrian A. Dorrington

We present two new closed-form methods for mixed pixel/multipath interference separation in AMCW lidar systems. The mixed pixel/multipath interference problem arises from the violation of a standard range-imaging assumption that each pixel integrates over only a single, discrete backscattering source. While a numerical inversion method has previously been proposed, no closed-form inverses have previously been posited. The first new method models reflectivity as a Cauchy distribution over range and uses four measurements at different modulation frequencies to determine the amplitude, phase and reflectivity distribution of up to two component returns within each pixel. The second new method uses attenuation ratios to determine the amplitude and phase of up to two component returns within each pixel. The methods are tested on both simulated and real data and shown to produce a significant improvement in overall error. While this paper focusses on the AMCW mixed pixel/multipath interference problem, the algorithms contained herein have applicability to the reconstruction of a sparse one-dimensional signal from an extremely limited number of discrete samples of its Fourier transform.
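The paper's inverses are built on a Cauchy reflectivity model and on attenuation ratios; the hedged sketch below is not that derivation, but it shows one classical way a closed-form two-return inversion can be obtained from measurements at harmonically related modulation frequencies, namely the Prony (linear prediction) construction. The base frequency and example returns are assumptions.

```python
import numpy as np

C = 3e8  # speed of light (m/s)

def prony_two_returns(m, f0):
    """Closed-form recovery of two component returns from complex measurements
    m[0..3] at harmonically related modulation frequencies f0, 2*f0, 3*f0, 4*f0.
    Each return j contributes a_j * eta_j**k with eta_j = exp(1j*4*pi*f0*d_j/c)."""
    m1, m2, m3, m4 = m
    # Linear-prediction step: eta_1 and eta_2 are the roots of z^2 + p*z + q,
    # where p and q satisfy m3 + p*m2 + q*m1 = 0 and m4 + p*m3 + q*m2 = 0.
    A = np.array([[m2, m1], [m3, m2]])
    p, q = np.linalg.solve(A, -np.array([m3, m4]))
    eta = np.roots([1.0, p, q])
    # Amplitudes from a linear fit of the two exponentials to the data.
    V = np.vander(eta, 5, increasing=True).T[1:]   # rows: eta^1 .. eta^4
    a, *_ = np.linalg.lstsq(V, np.asarray(m), rcond=None)
    d = np.angle(eta) * C / (4 * np.pi * f0) % (C / (2 * f0))  # ranges, mod ambiguity
    return a, d

# Example: a pixel mixing returns at 1.5 m and 4.0 m, base frequency 20 MHz.
f0 = 20e6
d_true, a_true = np.array([1.5, 4.0]), np.array([1.0, 0.4])
eta = np.exp(1j * 4 * np.pi * f0 * d_true / C)
m = [np.sum(a_true * eta**k) for k in (1, 2, 3, 4)]
print(prony_two_returns(m, f0))
```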


Remote Sensing | 2011

Understanding and ameliorating non-linear phase and amplitude responses in AMCW Lidar

John Peter Godbaz; Michael J. Cree; Adrian A. Dorrington

Amplitude modulated continuous wave (AMCW) lidar systems commonly suffer from non-linear phase and amplitude responses due to a number of known factors such as aliasing and multipath interference. In order to produce useful range and intensity information it is necessary to remove these perturbations from the measurements. We review the known causes of non-linearity, namely aliasing, temporal variation in correlation waveform shape and mixed pixels/multipath interference. We also introduce other sources of non-linearity, including crosstalk, modulation waveform envelope decay and non-circularly symmetric noise statistics, that have been ignored in the literature. An experimental study is conducted to evaluate techniques for mitigation of non-linearity, and it is found that harmonic cancellation provides a significant improvement in phase and amplitude linearity.
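The harmonic-cancellation technique evaluated in the paper modifies the measurement process itself and is not reproduced here; the toy simulation below (a sketch under assumed harmonic amplitudes, not the authors' method) only illustrates the aliasing effect it targets: odd harmonics in the correlation waveform alias onto the fundamental and bend the Fourier phase estimate, and sampling more equally spaced phase steps removes the offending harmonics from the estimator.

```python
import numpy as np

def correlation_waveform(theta, harmonics={1: 1.0, 3: 0.15, 5: 0.05}):
    """Toy correlation waveform with odd harmonics, as arises when
    square-ish modulation signals are mixed (amplitudes are assumptions)."""
    return sum(a * np.cos(k * theta) for k, a in harmonics.items())

def phase_estimate(true_phase, n_steps):
    """Estimate phase from n equally spaced samples of the correlation
    waveform using the standard Fourier (DFT bin-1) estimator."""
    steps = 2 * np.pi * np.arange(n_steps) / n_steps
    samples = correlation_waveform(steps - true_phase)
    bin1 = np.sum(samples * np.exp(-1j * steps))
    return -np.angle(bin1) % (2 * np.pi)

phases = np.linspace(0, 2 * np.pi, 361)
for n in (4, 6, 12):
    err = [(phase_estimate(p, n) - p + np.pi) % (2 * np.pi) - np.pi for p in phases]
    print(f"{n} phase steps: max |phase error| = {np.max(np.abs(err)):.4f} rad")
```

With 4 phase steps both the 3rd and 5th harmonics alias onto the fundamental, with 6 steps only the 5th does, and with 12 steps neither does, so the reported phase error shrinks accordingly.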


Archive | 2013

Understanding and Ameliorating Mixed Pixels and Multipath Interference in AMCW Lidar

John Peter Godbaz; Adrian A. Dorrington; Michael J. Cree

Amplitude modulated continuous wave (AMCW) lidar systems suffer from significant systematic errors due to mixed pixels and multipath interference. Commercial systems can achieve centimetre precision; however, accuracy is typically an order of magnitude worse, limiting the practical use of these devices. In this chapter the authors address AMCW measurement formation and the causes of mixed pixels and multipath interference. A comprehensive review of the literature is given, from the first reports of mixed pixels in point-scanning AMCW systems through to the gamut of research over the past two decades into mixed pixels and multipath interference. An overview is presented of a variety of detection and mitigation techniques, including deconvolution-based intra-camera scattering reduction, modelling of intra-scene scattering, correlation waveform deconvolution techniques/multifrequency sampling and standard removal approaches, all of which can be applied to range data from standard commercial cameras. The chapter concludes with comments on future work for better detection and correction of systematic errors in full-field AMCW lidar.


Asian Conference on Computer Vision | 2010

Extending AMCW lidar depth-of-field using a coded aperture

John Peter Godbaz; Michael J. Cree; Adrian A. Dorrington

By augmenting a high resolution full-field Amplitude Modulated Continuous Wave lidar system with a coded aperture, we show that depth-of-field can be extended using explicit, albeit blurred, range data to determine PSF scale. Because complex domain range-images contain explicit range information, the aperture design is unconstrained by the necessity for range determination by depth-from-defocus. The coded aperture design is shown to improve restoration quality over a circular aperture. A proof-of-concept algorithm using dynamic PSF determination and spatially variant Landweber iterations is developed and, using an empirically sampled point spread function, is shown to work in cases without serious multipath interference or high phase complexity.
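As a rough sketch of the restoration machinery mentioned here, the following shows a plain Landweber iteration for deconvolution with a single Gaussian PSF. The paper instead uses an empirically sampled coded-aperture PSF and varies the PSF scale per pixel using the range data, so treat this only as a minimal illustration under those simplifying assumptions.

```python
import numpy as np
from scipy.signal import fftconvolve

def landweber_deconvolve(blurred, psf, n_iter=50, relax=1.0):
    """Plain (spatially invariant) Landweber deconvolution:
    x_{k+1} = x_k + relax * P^T (y - P x_k), with P the blur operator.
    A single PSF is assumed here; the paper's spatially variant version
    selects the PSF scale per pixel from the (blurred) range data."""
    psf_flip = psf[::-1, ::-1]          # adjoint of convolution with the PSF
    x = blurred.copy()
    for _ in range(n_iter):
        resid = blurred - fftconvolve(x, psf, mode="same")
        x = x + relax * fftconvolve(resid, psf_flip, mode="same")
    return x

def gaussian_psf(sigma, size=15):
    # Hypothetical Gaussian defocus PSF, normalised to unit sum.
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    psf = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
    return psf / psf.sum()

# Example on synthetic data.
rng = np.random.default_rng(0)
sharp = rng.random((64, 64))
psf = gaussian_psf(2.0)
blurred = fftconvolve(sharp, psf, mode="same")
restored = landweber_deconvolve(blurred, psf)
print("RMS before:", np.sqrt(np.mean((blurred - sharp) ** 2)),
      "after:", np.sqrt(np.mean((restored - sharp) ** 2)))
```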


Image and Vision Computing New Zealand | 2009

Undue influence: Mitigating range-intensity coupling in AMCW ‘flash’ lidar using scene texture

John Peter Godbaz; Michael J. Cree; Adrian A. Dorrington

We present a new algorithm for mitigating range-intensity coupling caused by scattered light in full-field amplitude modulated continuous wave lidar systems using scene texture. Full-field lidar uses the time-of-flight principle to measure the range to thousands of points in a scene simultaneously. Mixed pixels are erroneous range measurements caused by pixels integrating light from more than one object at a time. Conventional optics suffer from internal reflections and light scattering, which can result in every pixel being mixed with scattered light. This causes erroneous range measurements and range-intensity coupling. By measuring how range changes with intensity over local regions it is possible to determine the phase and intensity of the scattered light without the complex calibration inherent in deconvolution-based restoration. The new method is shown to produce a substantial improvement in range image quality. An additional range-from-texture method, which is resistant to scattered light, is also demonstrated. Variations of the algorithms are tested with and without segmentation; the variant without segmentation is faster, but causes erroneous ranges around the edges of objects that are not present in the segmented algorithm.
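The texture-based mitigation algorithm itself is not reproduced here; the short sketch below only demonstrates the range-intensity coupling phenomenon the paper addresses, using an assumed modulation frequency and a hypothetical scattered-light phasor: a fixed amount of scattered light perturbs the phase (and hence measured range) of dim pixels far more than bright ones, so measured range becomes a function of intensity.

```python
import numpy as np

# Demonstrate range-intensity coupling: a fixed scattered-light phasor perturbs
# dim pixels much more than bright ones, so measured range depends on intensity
# even for a surface at constant true range.
C, f = 3e8, 30e6                        # speed of light, assumed modulation frequency
true_range = 2.5
true_phase = 4 * np.pi * f * true_range / C
scatter = 0.3 * np.exp(1j * 1.0)        # hypothetical scattered-light phasor

for amplitude in (10.0, 1.0, 0.5):
    measured = amplitude * np.exp(1j * true_phase) + scatter
    measured_range = (np.angle(measured) % (2 * np.pi)) * C / (4 * np.pi * f)
    print(f"amplitude {amplitude:4.1f}: measured range {measured_range:.3f} m "
          f"(true {true_range} m)")
```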


Proceedings of SPIE | 2010

Blind deconvolution of depth-of-field limited full-field lidar data by determination of focal parameters

John Peter Godbaz; Michael J. Cree; Adrian A. Dorrington

We present a new two-stage method for parametric spatially variant blind deconvolution of full-field Amplitude Modulated Continuous Wave lidar image pairs taken at different aperture settings subject to limited depth of field. A Maximum Likelihood based focal parameter determination algorithm uses range information to reblur the image taken with a smaller aperture size to match the large aperture image. This allows estimation of focal parameters without prior calibration of the optical setup and produces blur estimates which have better spatial resolution and less noise than previous depth from defocus (DFD) blur measurement algorithms. We compare blur estimates from the focal parameter determination method to those from Pentland's DFD method, Subbarao's S-Transform method and estimates from range data/the sampled point spread function. In a second stage the estimated focal parameters are applied to deconvolution of total integrated intensity lidar images, improving depth of field. We give an example of application to complex domain lidar images and discuss the trade-off between recovered amplitude texture and sharp range estimates.
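To make the reblur-and-match idea concrete, here is a minimal sketch assuming a single global Gaussian blur and a simple grid search (least squares, i.e. maximum likelihood under Gaussian noise). The paper instead estimates focal parameters so that the blur varies per pixel with range, and does not necessarily use a Gaussian PSF; everything in this example is an illustrative assumption.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def estimate_relative_blur(img_small_ap, img_large_ap,
                           sigmas=np.linspace(0.1, 5.0, 50)):
    """Reblur the (sharper) small-aperture image with trial Gaussian PSFs and
    pick the blur that best matches the large-aperture image in the
    least-squares sense. A single global blur is assumed here."""
    errs = [np.mean((gaussian_filter(img_small_ap, s) - img_large_ap) ** 2)
            for s in sigmas]
    return sigmas[int(np.argmin(errs))]

# Synthetic example: the "large aperture" image is the small-aperture image
# blurred by an unknown sigma that we then try to recover.
rng = np.random.default_rng(2)
small_ap = rng.random((64, 64))
large_ap = gaussian_filter(small_ap, 1.8)
print(estimate_relative_blur(small_ap, large_ap))   # close to 1.8
```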


IEEE Sensors | 2009

A fast Maximum Likelihood method for improving AMCW lidar precision using waveform shape

John Peter Godbaz; Michael J. Cree; Adrian A. Dorrington; Andrew D. Payne

Amplitude Modulated Continuous Wave imaging lidar systems use the time-of-flight principle to determine the range to objects in a scene. Typical systems use modulated illumination of a scene and a modulated sensor or image intensifier. By changing the relative phase of the two modulation signals it is possible to measure the phase shift induced in the illumination signal, and thus the range to the scene. In practical systems, the resultant correlation waveform contains harmonics that typically result in a non-linear range response. Nevertheless, these harmonics can be used to improve range precision. We model a waveform continuously variable in phase and intensity as a linear interpolation. By casting this as a Maximum Likelihood problem, an analytic solution is derived that enables an entire range image to be processed in a few seconds. A substantial improvement in overall RMS error and precision over the standard Fourier phase analysis approach results.
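For context, the baseline the paper improves on is the standard Fourier (DFT bin-1) phase analysis of the phase-stepped frames; a minimal sketch follows, with the modulation frequency, phase-step sign convention and toy scene chosen as assumptions.

```python
import numpy as np

C = 3e8  # speed of light (m/s)

def range_from_phase_steps(frames, mod_freq):
    """Standard Fourier phase analysis: given N image frames acquired at equally
    spaced relative phase offsets, the bin-1 DFT across the frame axis gives the
    phase shift of the returned modulation, hence the range. With four frames
    (under this sign convention) it reduces to atan2(f3 - f1, f0 - f2)."""
    n = len(frames)
    steps = 2 * np.pi * np.arange(n) / n
    bin1 = np.sum(np.asarray(frames) * np.exp(-1j * steps)[:, None, None], axis=0)
    phase = np.angle(bin1) % (2 * np.pi)
    amplitude = 2 * np.abs(bin1) / n
    return phase * C / (4 * np.pi * mod_freq), amplitude

# Toy example: a 2x2 scene at 3.0 m and 6.0 m, four phase steps, 20 MHz
# modulation (unambiguous range 7.5 m). All values are assumptions.
mod_freq = 20e6
true_ranges = np.array([[3.0, 3.0], [6.0, 6.0]])
phi = 4 * np.pi * mod_freq * true_ranges / C
frames = [1.0 + 0.5 * np.cos(2 * np.pi * k / 4 + phi) for k in range(4)]
est_range, est_amp = range_from_phase_steps(frames, mod_freq)
print(est_range)  # approximately [[3, 3], [6, 6]]
```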


Archive | 2011

Effect of the time characteristics of the Compton camera on its performance

Chibueze Zimuzo Uche; W. Howell Round; Michael J. Cree

EPSM-ABEC 2011 Conference, 14–18 August 2011, Darwin Convention Centre, Northern Territory, Australia. Australasian College of Physical Scientists and Engineers in Medicine, 2011.


Archive | 2011

A Monte Carlo study of two Compton cameras' first plane detectors

Chibueze Zimuzo Uche; W. Howell Round; Michael J. Cree

EPSM-ABEC 2011 Conference, 14–18 August 2011, Darwin Convention Centre, Northern Territory, Australia. Australasian College of Physical Scientists and Engineers in Medicine, 2011.
