Kristopher Ellis
National Research Council
Publications
Featured research published by Kristopher Ellis.
International Conference on Unmanned Aircraft Systems | 2014
Dan Tulpan; Nabil Belacel; Fazel Famili; Kristopher Ellis
Feature detection for Unmanned Aircraft Systems (UAS) sense-and-avoid scenarios is a crucial preliminary step for target detection. Its importance is greatest when distant (pixel-size) targets representing incoming aircraft are considered. This paper presents an experimental evaluation of four popular feature detection methods using flight test data, based on evaluation criteria such as first detection distance and the percentage of frames with detected target features. Our results show that for close-range targets all four methods have similar performance, while for distant (pixel-size) targets the Shi and Tomasi method outperforms the other three (Harris-Stephens-Plessey, SUSAN and FAST).
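The Shi and Tomasi method scores each pixel by the smaller eigenvalue of the local gradient structure tensor, which is one plausible reason for its advantage on pixel-size targets: even a tiny blob produces gradient energy in both directions. A minimal NumPy sketch of that response on a synthetic frame (an illustration under assumed parameters, not the paper's flight-test pipeline):

```python
import numpy as np

def shi_tomasi_response(img, r=1):
    """Smaller eigenvalue of the 2x2 structure tensor, accumulated over a
    (2r+1)x(2r+1) window around each pixel (Shi-Tomasi 'good features')."""
    Iy, Ix = np.gradient(img.astype(float))

    def box_sum(a):
        p = np.pad(a, r)
        out = np.zeros_like(a)
        for dy in range(2 * r + 1):
            for dx in range(2 * r + 1):
                out += p[dy:dy + a.shape[0], dx:dx + a.shape[1]]
        return out

    sxx, syy, sxy = box_sum(Ix * Ix), box_sum(Iy * Iy), box_sum(Ix * Iy)
    # Closed-form smaller eigenvalue of [[sxx, sxy], [sxy, syy]]
    return 0.5 * (sxx + syy - np.sqrt((sxx - syy) ** 2 + 4 * sxy ** 2))

# Synthetic frame: a dim 2x2 'intruder' against an empty sky
frame = np.zeros((32, 32))
frame[15:17, 15:17] = 1.0
resp = shi_tomasi_response(frame)
peak = np.unravel_index(np.argmax(resp), resp.shape)  # lands on the target
```

In practice one would threshold this response map and keep local maxima as candidate target features.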
IEEE Sensors | 2010
Cyrus Minwalla; Mussie Tekeste; Kyle Watters; Paul S. Thomas; Richard Hornsey; Kristopher Ellis; Sion Jennings
Sense and avoid systems for civilian unmanned air vehicles (UAVs) are essential in controlled airspace under visual flight rules (VFR). A prototype optical sensor accomplishes the task with attractive performance specifications. Key requirements include long-range detection (up to 10 km), wide field of view, discrimination of small threats against the background and tolerance of direct solar illumination. We demonstrate a prototype system based on a network of independent camera modules equipped with local processing. Availability of a fly-by-wire helicopter configured as a UAV emulator allows for realistic field tests with consumer components. Aspects of the design, implementation and evaluation of the prototype sensor are presented here, as are preliminary measurements to clarify the roles of platform motion, the system's optical point-spread function, noise, direct sunlight and target highlighting.
Canadian Conference on Electrical and Computer Engineering | 2011
Cyrus Minwalla; Kyle Watters; Paul J. Thomas; Richard Hornsey; Kristopher Ellis; Sion Jennings
Extraction and utilization of the horizon contour is presented within the context of an optical collision avoidance instrument, composed of individual nodes configured in a fixed-topology distributed network. The algorithm iterates between learning and application stages. Pixel neighbourhoods were classified into ground and sky regions using a vector of feature descriptors. The clusters in feature space were separated by a learnt minimal-error threshold, which was subsequently applied to the entire image, or to a pre-selected region-of-interest aided by scenario-dependent constraints. Morphological operations reduced spurious clusters. The resultant contour was parametrically fitted to a polynomial for comparison to ground-truth. Adaptive operation allows inputs from prior measurements and external attitude information.
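The learn-then-apply loop can be sketched on a synthetic frame: a minimal-error intensity threshold separates sky from ground, and the resulting contour is fitted to a polynomial. This is a pure-NumPy sketch with assumed intensities; the paper's feature-descriptor vector and morphological cleanup are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)
H, W = 60, 80
true_horizon = 25 + (np.arange(W) * 0.1).astype(int)  # gently sloped skyline

# Synthetic frame: bright sky above the horizon, dark ground below
img = np.empty((H, W))
for c in range(W):
    img[:true_horizon[c], c] = 0.8
    img[true_horizon[c]:, c] = 0.2
img += rng.normal(0.0, 0.02, img.shape)

def learn_threshold(pixels, n_cand=64):
    """Scan candidate thresholds and keep the one minimizing the
    total within-class variance (a minimal-error split)."""
    cands = np.linspace(pixels.min(), pixels.max(), n_cand)[1:-1]
    scores = []
    for t in cands:
        lo, hi = pixels[pixels <= t], pixels[pixels > t]
        scores.append(lo.var() * lo.size + hi.var() * hi.size)
    return cands[int(np.argmin(scores))]

t = learn_threshold(img.ravel())
sky = img > t  # bright class = sky

# Horizon contour: first non-sky row in each column, fitted to a line
rows = np.array([np.argmax(~sky[:, c]) for c in range(W)])
coeffs = np.polyfit(np.arange(W), rows.astype(float), 1)
```

The fitted coefficients recover the synthetic skyline's slope and offset; in the adaptive scheme described above, the learnt threshold and fit would seed the next frame's region-of-interest.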
Unmanned Systems | 2016
Cyrus Minwalla; Dan Tulpan; Nabil Belacel; Fazel Famili; Kristopher Ellis
Detecting collision-course targets in aerial scenes from purely passive optical images is challenging for a vision-based sense-and-avoid (SAA) system. Proposed herein is a processing pipeline for detecting and evaluating collision course targets from airborne imagery using machine vision techniques. The evaluation of eight feature detectors and three spatio-temporal visual cues is presented. Performance metrics for comparing feature detectors include the percentage of detected targets (PDT), percentage of false positives (PFP) and the range at earliest detection (Rdet). Contrast and motion-based visual cues are evaluated against standard models and expected spatio-temporal behavior. The analysis is conducted on a multi-year database of captured imagery from actual airborne collision course flights flown at the National Research Council of Canada. Datasets from two different intruder aircraft, a Bell 206 rotorcraft and a Harvard Mark IV trainer fixed-wing aircraft, were compared for accuracy and robustness. Results indicate that the features from accelerated segment test (FAST) feature detector shows the most promise as it maximizes the range at earliest detection and minimizes false positives. Temporal trends from visual cues analyzed on the same datasets are indicative of collision-course behavior. Robustness of the cues was established across collision geometry, intruder aircraft types, illumination conditions, seasonal environmental variations and scene clutter.
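The three detector metrics can be made concrete with a toy scorer. The definitions below are illustrative readings of the abstract (detection percentage per frame, false positives over all detections, maximum range among frames with a hit), not the paper's exact formulas:

```python
def saa_metrics(frames):
    """frames: per-frame records with the intruder's true range in metres,
    whether any detected feature fell on the target ('hit'), and the counts
    of all detections ('n_det') and false positives ('n_fp') in that frame."""
    pdt = 100.0 * sum(f["hit"] for f in frames) / len(frames)
    n_det = sum(f["n_det"] for f in frames)
    pfp = 100.0 * sum(f["n_fp"] for f in frames) / n_det if n_det else 0.0
    r_det = max((f["range_m"] for f in frames if f["hit"]), default=0.0)
    return pdt, pfp, r_det

# Toy four-frame encounter: the target is first picked up at 8 km
frames = [
    {"range_m": 9500, "hit": False, "n_det": 3, "n_fp": 3},
    {"range_m": 8000, "hit": True,  "n_det": 4, "n_fp": 2},
    {"range_m": 6000, "hit": True,  "n_det": 2, "n_fp": 1},
    {"range_m": 4000, "hit": True,  "n_det": 1, "n_fp": 0},
]
pdt, pfp, r_det = saa_metrics(frames)  # 75.0, 60.0, 8000
```

Under these definitions, a detector "shows the most promise" when it jointly raises Rdet and lowers PFP, as reported for FAST above.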
Canadian Conference on Electrical and Computer Engineering | 2011
Kyle Watters; Cyrus Minwalla; Michael Liscombe; Hou In Lio; Paul J. Thomas; Richard Hornsey; Kristopher Ellis; Sion Jennings
The characterization of a laboratory prototype collision avoidance sensor based on a distributed network of smart cameras is presented. Choices for the computer, optics and image sensor are characterized with a combination of laboratory and field experiments. Intra- and inter-camera calibrations are performed via a custom in-house laser-scanner facility. Field tests measured the impact of the scene dynamic range and the camera point-spread function on the range at first detection.
Proceedings of SPIE | 2011
Sven Schmerwitz; Hans-Ullrich Doehler; Kristopher Ellis; Sion Jennings
The DLR project ALLFlight (Assisted Low Level Flight and Landing on Unprepared Landing Sites) is devoted to demonstrating and evaluating the characteristics of sensors for helicopter operations in degraded visual environments. Millimeter wave radar is one of the many sensors considered for use in brown-out. It delivers a lower angular resolution compared to other sensors; however, it may provide the best dust penetration capabilities. In cooperation with the NRC, flight tests on a Bell 205 were conducted to gather sensor data from a 35 GHz pencil beam radar for terrain mapping, obstacle detection and dust penetration. In this paper, preliminary results from the flight trials at NRC are presented, together with a description of the radar's general capabilities. Furthermore, insight is provided into the concept of multi-sensor fusion as attempted in the ALLFlight project.
International Conference on Unmanned Aircraft Systems | 2017
Dan Tulpan; Cajetan Bouchard; Kristopher Ellis; Cyrus Minwalla
Unmanned aircraft flying beyond line of sight in uncontrolled airspace need to maintain adequate separation from local inclement weather patterns for regulatory compliance and operational safety. Although commercial ‘weather avoidance’ solutions exist, they are tailored to manned aviation and as such either lack the required accuracy or exceed the size, weight, and power (SWaP) constraints of small unmanned aerial systems (UAS). Detection and ranging to the cloud ceiling is a key component of weather avoidance. Proposed herein is a computer vision approach to cloud detection consisting of feature extraction and machine learning. Six image moments on local texture regions were extracted and fused within a classification algorithm for discrimination of cloud pixels. Three popular classifiers were evaluated for efficacy. Two publicly available datasets of all-sky images were used for training and testing. The proposed approach was compared to five well-known thresholding techniques via quantitative analysis. Results indicate that our method consistently outperformed the popular thresholding methods across all tested images. Comparison between the classification techniques indicated that random forests possess the highest training accuracy, while multilayer perceptrons showed better prediction accuracy on the test dataset. Upon extending the method to realistic images including background clutter, the random forest classifier demonstrated the best training accuracy of 100% and the best prediction accuracy of 96%. Although computationally more expensive, the random forest classifier also produced the fewest false positives. A sensitivity analysis over window sizes is presented for robust validation of the chosen approach; detection accuracy improved in proportion to window size at the expense of computation time.
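The feature-extraction stage can be sketched as grey-level moments computed over a local window. The six-moment set below is an assumed, commonly used texture-descriptor set for illustration; the paper's exact moments and trained random forest are not reproduced here.

```python
import numpy as np

def texture_moments(patch, bins=32):
    """Six grey-level moments over a local window (illustrative set):
    mean, std, skewness, smoothness, uniformity, entropy."""
    p, _ = np.histogram(patch, bins=bins, range=(0.0, 1.0))
    p = p / p.sum()
    levels = (np.arange(bins) + 0.5) / bins          # bin centres in [0, 1]
    mean = (levels * p).sum()
    var = (((levels - mean) ** 2) * p).sum()
    std = np.sqrt(var)
    skew = (((levels - mean) ** 3) * p).sum() / (std ** 3 + 1e-12)
    smoothness = 1.0 - 1.0 / (1.0 + var)             # 0 for flat patches
    uniformity = (p ** 2).sum()                      # 1 for single-level patches
    entropy = -(p[p > 0] * np.log2(p[p > 0])).sum()  # 0 for single-level patches
    return np.array([mean, std, skew, smoothness, uniformity, entropy])

rng = np.random.default_rng(1)
cloud = np.full((16, 16), 0.9)          # bright, homogeneous patch
sky = rng.uniform(0.0, 0.5, (16, 16))   # darker, textured patch
f_cloud, f_sky = texture_moments(cloud), texture_moments(sky)
# A per-window vector like this would then be fed to a classifier
# (e.g. a random forest) for cloud / not-cloud discrimination.
```

Here the cloud patch scores higher mean and uniformity and lower entropy than the textured sky patch, which is the kind of separation in feature space the classifiers exploit.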
Archive | 2009
Kristopher Ellis; Arthur W. Gubbels
Proceedings of SPIE | 2012
Cyrus Minwalla; Paul J. Thomas; Kristopher Ellis; Richard Hornsey; Sion Jennings
AIAA Information Systems-AIAA Infotech @ Aerospace | 2017
Cyrus Minwalla; Kristopher Ellis