
Publication


Featured research published by Refael Whyte.


International Conference on Computer Graphics and Interactive Techniques | 2013

Coded time of flight cameras: sparse deconvolution to address multipath interference and recover time profiles

Achuta Kadambi; Refael Whyte; Ayush Bhandari; Lee V. Streeter; Christopher Barsi; Adrian A. Dorrington; Ramesh Raskar

Time of flight cameras produce real-time range maps at a relatively low cost using continuous wave amplitude modulation and demodulation. However, they are geared to measure range (or phase) for a single reflected bounce of light and suffer from systematic errors due to multipath interference. We re-purpose the conventional time of flight device for a new goal: to recover per-pixel sparse time profiles expressed as a sequence of impulses. With this modification, we show that we can not only address multipath interference but also enable new applications such as recovering depth of near-transparent surfaces, looking through diffusers and creating time-profile movies of sweeping light. Our key idea is to formulate the forward amplitude modulated light propagation as a convolution with custom codes, record samples by introducing a simple sequence of electronic time delays, and perform sparse deconvolution to recover sequences of Diracs that correspond to multipath returns. Applications to computer vision include ranging of near-transparent objects and subsurface imaging through diffusers. Our low cost prototype may lead to new insights regarding forward and inverse problems in light transport.
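The pipeline described above can be sketched numerically: a sparse sequence of impulses is circularly convolved with a broadband code, and a sparse solver recovers the impulse locations. The code length, the random ±1 code, and the orthogonal matching pursuit solver below are illustrative stand-ins, not the paper's hardware codes or solver.

```python
import numpy as np

rng = np.random.default_rng(0)

# Broadband +/-1 code emitted by the modified camera (illustrative;
# the paper uses hardware-specific custom codes).
N = 64
code = 2.0 * rng.integers(0, 2, N) - 1.0

# Ground-truth sparse time profile: two multipath returns (Diracs).
x_true = np.zeros(N)
x_true[10] = 1.0    # direct bounce
x_true[25] = 0.6    # secondary multipath return

# Forward model: circular convolution of the time profile with the code.
A = np.stack([np.roll(code, k) for k in range(N)], axis=1)   # circulant
y = A @ x_true + 0.01 * rng.standard_normal(N)               # noisy samples

# Sparse deconvolution by orthogonal matching pursuit (a stand-in for
# the paper's sparse solver).
def omp(A, y, k):
    residual, support = y.copy(), []
    for _ in range(k):
        support.append(int(np.argmax(np.abs(A.T @ residual))))
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x

x_hat = omp(A, y, k=2)
print(sorted(np.nonzero(x_hat)[0]))   # impulse locations of the two returns
```

Shifts of a random ±1 code are nearly uncorrelated, which is what lets the two returns be pulled apart even though they overlap in the raw samples.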


Optics Letters | 2014

Resolving multipath interference in time-of-flight imaging via modulation frequency diversity and sparse regularization

Ayush Bhandari; Achuta Kadambi; Refael Whyte; Christopher Barsi; Micha Feigin; Adrian A. Dorrington; Ramesh Raskar

Time-of-flight (ToF) cameras calculate depth maps by reconstructing phase shifts of amplitude-modulated signals. For broad illumination of transparent objects, reflections from multiple scene points can illuminate a given pixel, giving rise to an erroneous depth map. We report here a sparsity-regularized solution that separates K interfering components using multiple modulation frequency measurements. The method maps ToF imaging to the general framework of spectral estimation theory and has applications in improving depth profiles and exploiting multiple scattering.
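The mapping to spectral estimation theory can be illustrated with a toy K = 2 example: each modulation frequency yields a complex phasor that is a sum of K exponentials in the path delays, so a classical line-spectrum method such as Prony's recovers the delays. The frequencies, depths, and amplitudes are invented for illustration, and Prony's method is a noiseless stand-in for the paper's sparsity-regularized solver.

```python
import numpy as np

# Synthetic multi-frequency measurements for K = 2 interfering returns.
# At modulation frequency m*f0 the complex measurement is
#   z[m] = sum_k a_k * exp(-2j*pi * m*f0 * t_k)
# (a simplified, noiseless form of the model; all values illustrative).
c = 3e8
f0 = 10e6                        # base modulation frequency, 10 MHz
depths = np.array([1.5, 2.4])    # metres
delays = 2 * depths / c          # round-trip times t_k
amps = np.array([1.0, 0.5])

M = 8                            # number of modulation frequencies
m = np.arange(1, M + 1)
z = (amps * np.exp(-2j * np.pi * np.outer(m, f0 * delays))).sum(axis=1)

# Prony's method: z obeys a K-term linear recurrence whose characteristic
# roots encode the delays.
K = 2
H = np.stack([z[i:i + K] for i in range(M - K)])   # Hankel system
p = np.linalg.lstsq(H, -z[K:M], rcond=None)[0]
roots = np.roots(np.r_[1.0, p[::-1]])
t_hat = np.sort(-np.angle(roots) / (2 * np.pi * f0))
print(t_hat * c / 2)             # recovered depths in metres
```

With noise, this least-squares step is where the paper's sparse regularization earns its keep; the noiseless version recovers both depths exactly.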


International Conference on Computational Photography | 2014

Demultiplexing illumination via low cost sensing and nanosecond coding

Achuta Kadambi; Ayush Bhandari; Refael Whyte; Adrian A. Dorrington; Ramesh Raskar

Several computer vision algorithms require a sequence of photographs taken in different illumination conditions, which has spurred development in the area of illumination multiplexing. Various techniques for optimizing the multiplexing process already exist, but are geared toward regular or high speed cameras. Such cameras are fast, but code on the order of milliseconds. In this paper we propose a fusion of two popular contexts, time of flight range cameras and illumination multiplexing. Time of flight cameras are a low cost, consumer-oriented technology capable of acquiring range maps at 30 frames per second. Such cameras have a natural connection to conventional illumination multiplexing strategies as both paradigms rely on the capture of multiple shots and synchronized illumination. While previous work on illumination multiplexing has exploited coding at millisecond intervals, we repurpose sensors that are ordinarily used in time of flight imaging to demultiplex via nanosecond coding strategies.
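The demultiplexing idea itself can be sketched independently of the nanosecond hardware: each light source flashes according to a row of an orthogonal, zero-mean ±1 code, and correlating the captured frames against the codes separates the per-source contributions. The 4-chip codes and intensities below are illustrative, not the paper's coding scheme.

```python
import numpy as np

# Each light source flashes according to a row of an orthogonal, zero-mean
# +/-1 code (illustrative 4-chip codes; the paper's nanosecond hardware
# codes differ).
codes = np.array([[1.0, 1.0, -1.0, -1.0],
                  [1.0, -1.0, 1.0, -1.0]])

# Unknown per-source contributions at one pixel.
l_true = np.array([0.7, 0.2])

# A chip of +1 means "source on"; intensities stay non-negative.
on = (1 + codes) / 2
frames = on.T @ l_true            # one captured frame per chip slot

# Demultiplex: correlate the frames against the zero-mean codes.
l_hat = 2 * (codes @ frames) / codes.shape[1]
print(l_hat)                      # recovers the per-source contributions
```

The orthogonality of the code rows is what makes the correlation step exact; the paper's contribution is running this kind of coding at nanosecond rather than millisecond chip times.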


Machine Vision Applications | 2010

Multiple range imaging camera operation with minimal performance impact

Refael Whyte; Andrew D. Payne; Adrian A. Dorrington; Michael J. Cree

Time-of-flight range imaging cameras operate by illuminating a scene with amplitude modulated light and measuring the phase shift of the modulation envelope between the emitted and reflected light. Object distance can then be calculated from this phase measurement. This approach does not work in multiple camera environments as the measured phase is corrupted by the illumination from other cameras. To minimize inaccuracies in multiple camera environments, replacing the traditional cyclic modulation with pseudo-noise amplitude modulation has previously been demonstrated. However, this technique effectively reduces the modulation frequency, thereby decreasing the distance measurement precision (which is proportional to the modulation frequency). A new modulation scheme is presented in which maximum length pseudo-random sequences are binary phase encoded onto the existing cyclic amplitude modulation. The effective modulation frequency therefore remains unchanged, providing range measurements with high precision. The effectiveness of the new modulation scheme was verified using a custom time-of-flight camera based on the PMD19-K2 range imaging sensor. The new pseudo-noise modulation has no significant performance decrease in a single camera environment. In a two camera environment, the precision is reduced only by the increased photon shot noise from the second illumination source.
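The property the scheme relies on can be shown in a few lines: maximum length (m-)sequences have a sharp two-valued circular autocorrelation, so a camera correlating against its own code sees a strong peak while another camera's code averages toward zero. The LFSR tap sets below are illustrative primitive trinomials for n = 7; the paper's sequences and hardware encoding differ.

```python
import numpy as np

# Fibonacci LFSR generating a maximum-length sequence (m-sequence),
# period 2**n - 1, mapped to {-1, +1} chips.
def mseq(taps, n):
    state = [1] * n
    out = []
    for _ in range(2 ** n - 1):
        out.append(state[-1])
        fb = 0
        for t in taps:
            fb ^= state[t - 1]
        state = [fb] + state[:-1]
    return np.array(out) * 2 - 1

# Two cameras use m-sequences from different primitive polynomials
# (tap sets are illustrative primitive trinomials for n = 7).
code_a = mseq([7, 6], 7)
code_b = mseq([7, 3], 7)

# Circular correlation: what a camera's correlator effectively computes
# over one code period.
def ccorr(a, b):
    return np.array([np.dot(a, np.roll(b, k)) for k in range(len(a))])

auto = ccorr(code_a, code_a)
cross = ccorr(code_a, code_b)
print(auto[0], auto[1:].max())    # sharp peak: 127 at lag 0, -1 elsewhere
print(np.abs(cross).max())        # much smaller: other camera decorrelates
```

Because the sequence is phase encoded onto the existing cyclic modulation rather than replacing it, this decorrelation comes without lowering the effective modulation frequency.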


Applied Optics | 2015

Application of lidar techniques to time-of-flight range imaging

Refael Whyte; Lee V. Streeter; Michael J. Cree; Adrian A. Dorrington

Amplitude-modulated continuous wave (AMCW) time-of-flight (ToF) range imaging cameras measure distance by illuminating the scene with amplitude-modulated light and measuring the phase difference between the transmitted and reflected modulation envelope. This method of optical range measurement suffers from errors caused by multiple propagation paths, motion, phase wrapping, and nonideal amplitude modulation. In this paper a ToF camera is modified to operate in modes analogous to continuous wave (CW) and stepped frequency continuous wave (SFCW) lidar. In CW operation the velocity of objects can be measured; CW measurement of velocity was linear with true velocity (R² = 0.9969). Qualitative analysis of a complex scene confirms that range measured by SFCW is resilient to errors caused by multiple propagation paths, phase wrapping, and nonideal amplitude modulation, which plague AMCW operation. In viewing a complicated scene through a translucent sheet, quantitative comparison of AMCW with SFCW demonstrated a reduction in the median error from -1.3 m to -0.06 m, with the interquartile range of the error reduced from 4.0 m to 0.18 m.
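The SFCW mode can be sketched as follows: the camera records one complex phasor per stepped modulation frequency, and an inverse FFT turns the sweep into a range profile whose peak gives the distance. The sweep parameters and the single-return scene below are illustrative assumptions, not the paper's hardware configuration.

```python
import numpy as np

# Stepped frequency continuous wave (SFCW) sketch: record one complex
# phasor per modulation frequency, then IFFT into a range profile.
c = 3e8
f_start, f_step, n_steps = 10e6, 1e6, 100
freqs = f_start + f_step * np.arange(n_steps)

d_true = 7.5                                   # metres, single return
tau = 2 * d_true / c                           # round-trip time
phasors = np.exp(-2j * np.pi * freqs * tau)    # per-frequency measurements

profile = np.abs(np.fft.ifft(phasors))         # range profile
bin_size = c / (2 * n_steps * f_step)          # range resolution, 1.5 m here
d_hat = np.argmax(profile) * bin_size
print(d_hat)                                   # peak bin converts to range
```

Multipath returns would simply appear as additional peaks in the profile rather than corrupting a single phase estimate, which is the source of SFCW's resilience described above.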


Rundbrief der GI-Fachgruppe 5.10 Informationssystem-Architekturen | 2014

Coded Time-of-Flight Imaging for Calibration Free Fluorescence Lifetime Estimation

Ayush Bhandari; Christopher Barsi; Refael Whyte; Achuta Kadambi; Anshuman J. Das; Adrian A. Dorrington; Ramesh Raskar

We present a novel, single-shot, calibration-free framework within which Time-of-Flight cameras can be used to estimate the lifetimes of fluorescent samples. Our technique relaxes the high time resolution or multi-frequency measurement requirements of conventional systems.


Optical Engineering | 2015

Resolving multiple propagation paths in time of flight range cameras using direct and global separation methods

Refael Whyte; Lee V. Streeter; Michael J. Cree; Adrian A. Dorrington

Time of flight (ToF) range cameras illuminate the scene with an amplitude-modulated continuous wave light source and measure the returning modulation envelope: phase and amplitude. The phase change of the modulation envelope encodes the distance travelled. This technology suffers from measurement errors caused by multiple propagation paths from the light source to the receiving pixel. The multiple paths can be represented as the summation of a direct return, which is the return from the shortest path length, and a global return, which includes all other returns. We develop the use of a sinusoidal pattern from which a closed form solution for the direct and global returns can be computed in nine frames, with the constraint that the global return is of spatially lower frequency than the illuminated pattern. In a demonstration on a scene constructed to have strong multipath interference, we find the direct return is not significantly different from the ground truth in 33/136 pixels tested, whereas for the full-field measurement it is significantly different for every pixel tested. The variance in the estimated direct phase and amplitude increases by a factor of eight compared with the standard time of flight range camera technique.
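The separation principle can be illustrated with a simplified intensity-domain analogue using three phase-shifted sinusoidal patterns, in the spirit of Nayar et al.'s direct/global separation rather than the paper's nine-frame ToF closed form. All values, and the assumption that the global return only sees the pattern's mean, are illustrative.

```python
import numpy as np

# Simplified intensity-domain direct/global separation with three
# phase-shifted sinusoidal patterns (NOT the paper's nine-frame ToF
# formulation). All values illustrative.
d_true, g_true = 0.8, 0.3     # direct and global components at one pixel
theta = 1.1                   # local phase of the projected pattern

# Three sinusoidal patterns with mean 1/2, shifted by 120 degrees.
k = np.arange(3)
pattern = (1 + np.cos(theta + 2 * np.pi * k / 3)) / 2

# Assumption: the pattern is of higher spatial frequency than the global
# return, so the global term sees only the pattern's mean of 1/2.
obs = d_true * pattern + g_true / 2

# Closed-form separation from the three frames.
phasor = np.sum(obs * np.exp(-2j * np.pi * k / 3))
d_hat = 4 * np.abs(phasor) / 3
g_hat = 2 * obs.mean() - d_hat
print(d_hat, g_hat)
```

The paper's nine-frame scheme plays the same game, but each "frame" is itself a complex ToF phase/amplitude measurement, which is why the closed form needs more samples.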


International Conference on Computer Graphics and Interactive Techniques | 2013

Multifrequency time of flight in the context of transient renderings

Ayush Bhandari; Achuta Kadambi; Refael Whyte; Lee V. Streeter; Christopher Barsi; Adrian A. Dorrington; Ramesh Raskar

The emergence of commercial time of flight (ToF) cameras for realtime depth images has motivated extensive study of exploitation of ToF information. In principle, a ToF camera is an active sensor that emits an amplitude modulated near-infrared (NIR) signal, which illuminates a given scene. The per-pixel phase difference of the modulation between reflected light and a reference signal determines the path length, and hence a depth map, of the scene.
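The phase-to-depth principle underlying all of these papers fits in a few lines: the standard four-bucket estimator recovers the modulation phase from four samples of the correlation waveform, and the phase converts to distance through the modulation frequency. The modulation frequency, depth, and waveform amplitudes below are illustrative.

```python
import numpy as np

# Standard four-bucket AMCW phase estimation (textbook form; modulation
# frequency and depth are illustrative).
c = 3e8
f_mod = 30e6                   # modulation frequency
d_true = 2.0                   # metres (within the 5 m unambiguous range)

# Ideal correlation samples at phase offsets 0, 90, 180, 270 degrees.
phi = 4 * np.pi * f_mod * d_true / c           # round-trip phase shift
offsets = np.array([0, 0.5, 1.0, 1.5]) * np.pi
a = 1.0 + 0.5 * np.cos(phi - offsets)

# Four-phase estimator: phase from the samples, then depth from phase.
phi_hat = np.arctan2(a[1] - a[3], a[0] - a[2]) % (2 * np.pi)
d_hat = c * phi_hat / (4 * np.pi * f_mod)
print(d_hat)
```

Everything beyond `2 * pi` of round-trip phase wraps, which is why the unambiguous range is c / (2 · f_mod) and why several of the papers above treat phase wrapping as an error source.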


IEEE Sensors | 2014

Review of methods for resolving multi-path interference in Time-of-Flight range cameras

Refael Whyte; Lee V. Streeter; Michael J. Cree; Adrian A. Dorrington

Time-of-Flight range cameras measure the distance from the camera to the objects in the field of view. This is achieved by illuminating the scene with amplitude modulated light and measuring the phase change in the modulation envelope between the transmitted and received light, simultaneously for each pixel. A major cause of accuracy errors is multiple propagation paths between the light source and pixel. This error is highly scene dependent and can be substantial. In this paper we review the state of the art in multi-path interference correction. Two mathematical models dominate the literature: a sparse formulation from discrete paths and a Lambertian inter-reflection model. The recovery methods for the sparse formulation are effective at looking through a translucent sheet, while Lambertian model based techniques can reduce the errors due to inter-reflections in a corner.


Image and Vision Computing New Zealand | 2014

Time Frequency Duality of Time-of-Flight Range Cameras for Resolving Multi-path Interference

Refael Whyte; Ayush Bhandari; Lee V. Streeter; Michael J. Cree; Adrian A. Dorrington

Time-of-Flight (ToF) range cameras measure the depth from the camera to the objects in the field of view. This is achieved by illuminating the scene with amplitude modulated light and measuring the phase shift in the modulation envelope between the outgoing and reflected light. ToF cameras suffer from measurement errors when multiple propagation paths exist from the light source to the pixel. Previous work has resolved this error by taking measurements at different modulation frequencies; other work has used the properties of a binary sequence as the modulation waveform to resolve the multiple propagation paths. In this work the advantages of each measurement method are investigated: binary sequences offer improved jitter performance and measurement linearity, while a frequency sweep gives decreased variance in the phase. We show that sampling with a binary sequence and sampling with multiple frequencies are equivalent, and present the transform between them. The results of resolving multi-path interference are compared between each method and its transform.

Collaboration

Top co-authors of Refael Whyte:

- Ayush Bhandari (Massachusetts Institute of Technology)
- Ramesh Raskar (Massachusetts Institute of Technology)
- Achuta Kadambi (Massachusetts Institute of Technology)
- Christopher Barsi (Massachusetts Institute of Technology)
- Anshuman J. Das (Massachusetts Institute of Technology)
- Micha Feigin (Massachusetts Institute of Technology)