Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where David Rees is active.

Publication


Featured research published by David Rees.


International Conference on Multisensor Fusion and Integration for Intelligent Systems | 2012

Bayesian fusion of thermal and visible spectra camera data for region based tracking with rapid background adaptation

Rustam Stolkin; David Rees; Mohammed Talha; Ionut Florescu

This paper presents a method for optimally combining pixel information from an infra-red thermal imaging camera and a conventional visible spectrum colour camera for tracking a moving target. The tracking algorithm rapidly re-learns its background models for each camera modality from scratch at every frame. This enables, firstly, automatic adjustment of the relative importance of thermal and visible information in decision making, and, secondly, a degree of "camouflage target" tracking by continuously re-weighting the importance of those parts of the target model that are most distinct from the present background at each frame. Furthermore, this very rapid background adaptation ensures robustness to large, sudden and arbitrary camera motion, and thus makes this method a useful tool for robotics, for example, visual servoing of a pan-tilt turret mounted on a moving robot vehicle. The method can be used to track any kind of arbitrarily shaped or deforming object; however, the combination of thermal and visible information proves particularly useful for enabling robots to track people. The method is also important in that it can be readily extended for data fusion of an arbitrary number of statistically independent features from one or arbitrarily many imaging modalities.
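The fusion step the abstract describes can be sketched as follows. Since the modalities are treated as statistically independent, per-pixel likelihoods from each camera's foreground/background models multiply under Bayes' rule. The function below is an illustrative sketch, not the authors' implementation; the histogram-derived likelihood arrays and the `prior_fg` parameter are assumptions for the example.

```python
import numpy as np

def fuse_foreground_posterior(p_vis_fg, p_vis_bg, p_th_fg, p_th_bg, prior_fg=0.5):
    """Combine per-pixel likelihoods from visible and thermal cameras.

    Each argument is an H x W array of likelihoods P(pixel value | class)
    under that modality's foreground or background model. The two
    modalities are assumed statistically independent, so their
    likelihoods multiply before normalization.
    """
    num = p_vis_fg * p_th_fg * prior_fg
    den = num + p_vis_bg * p_th_bg * (1.0 - prior_fg)
    # Posterior P(foreground | both cameras), guarded against division by zero.
    return num / np.maximum(den, 1e-12)
```

Because the background models are re-learned every frame, a modality whose background momentarily resembles the target contributes likelihood ratios near 1 and is automatically discounted, which is how the relative importance of thermal and visible data adjusts itself.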


IEEE Sensors | 2012

Bayesian fusion of thermal and visible spectra camera data for mean shift tracking with rapid background adaptation

Rustam Stolkin; David Rees; Mohammed Talha; Ionut Florescu

This paper presents a method for optimally combining pixel information from thermal imaging and visible spectrum colour cameras, for tracking an arbitrarily shaped deformable moving target. The tracking algorithm rapidly re-learns its background models for each camera modality from scratch at every frame. This enables, firstly, automatic adjustment of the relative importance of thermal and visible information in decision making, and, secondly, a degree of “camouflage target” tracking by continuously re-weighting the importance of those parts of the target model that are most distinct from the present background at each frame. Furthermore, this very rapid background adaptation ensures robustness to rapid camera motion. The combination of thermal and visible information is applicable to any target, but particularly useful for people tracking. The method is also important in that it can be readily extended for fusion of data from arbitrarily many imaging modalities.
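The "camouflage target" re-weighting mentioned in both abstracts can be illustrated with the standard background-weighted-histogram idea from mean shift tracking: target-model bins that are common in the freshly re-learned background get low weight, so matching relies on the target's currently most distinctive features. This is a sketch of that general technique under assumed histogram inputs, not necessarily the authors' exact formulation.

```python
import numpy as np

def reweight_target_model(target_hist, background_hist, eps=1e-12):
    """Down-weight target-model bins that resemble the current background.

    target_hist and background_hist are 1-D feature histograms. Bins with
    large background mass receive small weights; the result is renormalized
    so it can be used directly as a mean shift target model.
    """
    bg = background_hist / max(background_hist.sum(), eps)
    # Smallest non-zero background bin sets the reference weight.
    bg_min = bg[bg > 0].min() if np.any(bg > 0) else 1.0
    weights = np.where(bg > 0, bg_min / np.maximum(bg, eps), 1.0)
    reweighted = target_hist * weights
    return reweighted / max(reweighted.sum(), eps)
```

Recomputing `background_hist` from scratch every frame is what lets the weighting track sudden camera motion: the weights always reflect the background actually behind the target right now.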

Collaboration


Dive into David Rees's collaborations.

Top Co-Authors

Mohammed Talha (University of Birmingham)
Rustam Stolkin (University of Birmingham)
Ionut Florescu (Stevens Institute of Technology)