Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Sion Jennings is active.

Publication


Featured research published by Sion Jennings.


Sensors | 2016

Airborne Optical and Thermal Remote Sensing for Wildfire Detection and Monitoring.

Robert S. Allison; Joshua Johnston; Gregory Craig; Sion Jennings

For decades, detection and monitoring of forest and other wildland fires have relied heavily on aircraft (and satellites). Technical advances and improved affordability of both sensors and sensor platforms promise to revolutionize the way aircraft detect, monitor and help suppress wildfires. Sensor systems such as hyperspectral cameras, image intensifiers and thermal cameras that were previously limited in use due to cost or technology considerations are now becoming widely available and affordable. Similarly, new airborne sensor platforms, particularly small unmanned aircraft or drones, are enabling new applications for airborne fire sensing. In this review we outline the state of the art in direct, semi-automated and automated fire detection from both manned and unmanned aerial platforms. We discuss the operational constraints and opportunities provided by these sensor systems, including the objective evaluation of these systems in a realistic context.
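
As a rough illustration of the automated fire-detection approaches this review surveys, the sketch below flags unusually hot pixels in an airborne thermal frame and groups them into candidate fire regions. The 330 K threshold and the synthetic frame are illustrative assumptions, not values from the paper.

```python
# Minimal sketch of automated hotspot detection from an airborne thermal frame.
# The 330 K threshold and the synthetic image are illustrative assumptions only.
import numpy as np
from scipy import ndimage

def detect_hotspots(thermal_frame_kelvin, threshold_k=330.0):
    """Return the number of candidate fire regions and their pixel centroids."""
    mask = thermal_frame_kelvin > threshold_k          # flag unusually hot pixels
    labels, n_regions = ndimage.label(mask)            # group them into regions
    centroids = ndimage.center_of_mass(thermal_frame_kelvin, labels,
                                       range(1, n_regions + 1))
    return n_regions, centroids

# Synthetic 100x100 background near ambient temperature with one hot patch.
frame = np.full((100, 100), 290.0) + np.random.normal(0, 1.0, (100, 100))
frame[40:44, 60:65] += 80.0                            # simulated fire front
print(detect_hotspots(frame))
```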


Journal of Aircraft | 2004

Time Delays in Visually Coupled Systems During Flight Test and Simulation

Sion Jennings; Lloyd D. Reid; Gregory Craig; Ronald V. Kruk

Increasing control system time delays have well-documented detrimental effects on pilot-in-the-loop performance. With increased use of helmet-mounted displays in aircraft, pilots may soon be exposed to both control system and visual display system time delays. They may be sensitive to both the magnitude and source of the time delay, that is, some pilots may be more sensitive to visual delays than to control delays, or vice versa. In the current study, control and visual delays were examined in two experiments, the first conducted in a helicopter and the second conducted in a flight simulator. A helmet-mounted display was used to present external imagery and symbology in both experiments. Standardized low-level maneuvering tasks were used to examine changes in system handling qualities ratings as a function of time delays in the control and visual display processing loops. The addition of delays in both the control and the visual loops impaired the system handling qualities and increased the magnitude of position maintenance error. Differences between control and visual delays were evident in reports of motion sickness symptoms, which were more frequent for visual delay conditions. Motion sickness symptoms and related physiological effects induced by delays may increase pilot fatigue. Therefore, determination of acceptable latency criteria for design and implementation in systems with visually coupled components is critical.
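
The sketch below illustrates the underlying point with a toy pilot-in-the-loop model: adding transport delay to the feedback path of a simple tracking loop increases tracking error. The pure-gain pilot, integrator plant and delay values are illustrative assumptions, not the models or conditions used in the flight test or simulator experiments.

```python
# Minimal sketch of how added loop delay degrades pilot-in-the-loop tracking.
# The gain-type "pilot", integrator plant and delay values are illustrative
# assumptions; they are not the models or numbers used in the flight test.
import numpy as np

def track_step(delay_s, gain=2.0, dt=0.02, duration=10.0):
    """Simulate a gain-type pilot closing a position loop through a transport delay."""
    n = int(duration / dt)
    delay_samples = int(round(delay_s / dt))
    x = 0.0                               # aircraft position
    history = [0.0] * (delay_samples + 1) # what the pilot sees, delayed
    errors = []
    for _ in range(n):
        seen = history[0]                 # delayed view of the state
        u = gain * (1.0 - seen)           # pilot command toward a unit step target
        x += u * dt                       # integrator plant
        history = history[1:] + [x]
        errors.append(1.0 - x)
    return np.sqrt(np.mean(np.square(errors)))

for delay in (0.0, 0.1, 0.2, 0.4):        # seconds of visual/control delay
    print(f"delay {delay:.1f} s -> RMS tracking error {track_step(delay):.3f}")
```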


Helmet- and Head-Mounted Displays IX: Technologies and Applications | 2004

Detection of motion-defined form under simulated night vision conditions

Todd Macuda; Robert S. Allison; Paul J. Thomas; Gregory Craig; Sion Jennings

The influence of Night Vision Goggle-produced noise on the perception of motion-defined form was investigated using synthetic imagery and standard psychophysical procedures. Synthetic image sequences incorporating synthetic noise were generated using a software model developed by our research group. This model is based on the physical properties of the Aviator Night Vision Imaging System (ANVIS-9) image intensification tube. The image sequences either depicted a target that moved at a different speed than the background, or only depicted the background. For each trial, subjects were shown a pair of image sequences and required to indicate which sequence contained the target stimulus. We tested subjects at a series of target speeds at several realistic noise levels resulting from varying simulated illumination. The results showed that subjects had increased difficulty detecting the target with increased noise levels, particularly at slower target speeds. This study suggests that the capacity to detect motion-defined form is degraded at low levels of illumination. Our findings are consistent with anecdotal reports of impaired motion perception in NVGs. Perception of motion-defined form is important in operational tasks such as search and rescue and camouflage breaking. These degradations in performance should be considered in operational planning.
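
The sketch below illustrates the general shape of such a trial: two image sequences, one containing a region defined only by its motion relative to the background, with noise that grows as simulated illumination drops. The noise model and all parameter values are illustrative assumptions, not the authors' ANVIS-9 model.

```python
# Minimal sketch of a two-alternative forced-choice trial with simulated
# intensifier noise. The noise model (Gaussian, scaled by 1/illumination) and
# all parameter values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def make_sequence(has_target, target_speed, bg_speed=1.0, illumination=0.5,
                  n_frames=30, size=64):
    """Moving-dot background; optionally a square region drifting at another speed."""
    dots = rng.random((size, size)) < 0.1
    frames = []
    for t in range(n_frames):
        frame = np.roll(dots, int(round(bg_speed * t)), axis=1).astype(float)
        if has_target:
            patch = np.roll(dots, int(round(target_speed * t)), axis=1)
            frame[24:40, 24:40] = patch[24:40, 24:40]   # target defined only by motion
        frame += rng.normal(0.0, 0.2 / illumination, frame.shape)  # darker -> noisier
        frames.append(frame)
    return np.stack(frames)

# One trial: a target interval and a noise-only interval, in random order.
first_has_target = bool(rng.integers(0, 2))
interval_1 = make_sequence(first_has_target, target_speed=2.0)
interval_2 = make_sequence(not first_has_target, target_speed=2.0)
```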


Human Factors | 2008

The Impact of Night Vision Goggles on Way-Finding Performance and the Acquisition of Spatial Knowledge

Michelle Gauthier; Avi Parush; Todd Macuda; Denis Tang; Gregory Craig; Sion Jennings

Objective: This study examined the effects of night vision goggles (NVGs) on navigation and way-finding performance and on the acquisition of spatial knowledge. Background: Although numerous studies have examined the effects of NVGs on visual perception, few have examined the effects of using NVGs on the acquisition and expression of spatial cognition. Method: Participants learned the environment through active navigation and way finding, searching for targets within a life-sized maze with or without NVGs. Knowledge of the environment was then tested with two spatial memory tests. Results: Findings show that navigation and way finding with NVGs are harder than without them, as indicated by longer navigation times and additional, unnecessary turns. Moreover, the change in navigation performance over the course of the way-finding trials varied as a function of group assignment, indicating that NVGs influenced the learning process: NVG users demonstrated a significant decrease in navigation times earlier, as well as significant decreases in navigational legs, compared with the control group. In judging the positions of objects relative to target objects in different rooms of the maze, performance was better for participants without NVGs than for those with NVGs. In a map-drawing task, participants in the NVG group were more likely to position objects incorrectly and to receive worse scores than the controls. Conclusion: These results demonstrate that NVGs affected not only spatial navigation and way-finding performance but also the acquisition of spatial knowledge. Application: These degradations in spatial knowledge should be considered in operational planning and NVG training programs.


Proceedings of SPIE, the International Society for Optical Engineering | 2005

Detection of motion-defined form using night vision goggles

Todd Macuda; Greg Craig; Robert S. Allison; Pearl S. Guterman; Paul S. Thomas; Sion Jennings

Perception of motion-defined form is important in operational tasks such as search and rescue and camouflage breaking. Previously, we used synthetic Aviator Night Vision Imaging System (ANVIS-9) imagery to demonstrate that the capacity to detect motion-defined form was degraded at low levels of illumination (see Macuda et al., 2004; Thomas et al., 2004). To validate our simulated NVG results, the current study evaluated observers' ability to detect motion-defined form through a real ANVIS-9 system. The image sequences consisted of either a target (square) that moved at a different speed than the background or the moving background alone. For each trial, subjects were shown a pair of image sequences and were required to indicate which sequence contained the target stimulus. Mean illumination, and hence image noise level, was varied by means of neutral density (ND) filters placed in front of the NVG objectives. At each noise level, we tested subjects at a series of target speeds. With both real and simulated NVG imagery, subjects had increased difficulty detecting the target with increased noise levels, at both slower and faster target speeds. These degradations in performance should be considered in operational planning. Further research is necessary to expand our understanding of the impact of NVG-produced noise on visual mechanisms.
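
As a small worked example of how the ND filters control the stimulus, transmittance falls off as T = 10^(-ND); the starting illumination below is an illustrative assumption, not a condition from the experiment.

```python
# Small worked example of how neutral-density filters step down scene
# illumination: transmittance T = 10**(-ND). The starting illumination value
# is an illustrative assumption, not a condition from the experiment.
base_illumination_lux = 1e-2          # assumed night-sky-level scene illumination
for nd in (0.0, 0.6, 1.0, 2.0):       # representative ND filter strengths
    print(f"ND {nd:.1f}: {base_illumination_lux * 10 ** -nd:.1e} lux reaches the NVG")
```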


Proceedings of SPIE, the International Society for Optical Engineering | 2000

Effects of field-of-view on pilot performance in night vision goggles flight trials: preliminary findings

Sion Jennings; Gregory Craig

Night vision goggles (NVGs) allow pilots to see and navigate under minimal levels of illumination. While NVGs allow the user to see more than they typically could at these levels of illumination, the visual information provided by NVGs has a limited field of view. The size of the field of view can diminish the pilot's spatial orientation ability in the night flying environment. We examined pilot performance in low-level helicopter flight while the pilots were using NVGs with 40-degree and 52-degree fields of view. The pilots flew a standardized ADS-33D hover maneuver in a Bell 206 helicopter equipped with an accurate position measurement system. The tests were conducted in simulated night conditions, and both subjective and objective measures of task performance were obtained. Pilot Cooper-Harper ratings increased from Level 1 baseline ratings to Level 2 ratings when the NVGs were used, indicating worse performance with the NVGs. Small rating differences were noticed between the 52-degree and 40-degree field-of-view conditions. Similar trends were noticed in the objective data for altitude and for lateral and longitudinal station-keeping errors.
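
The sketch below shows one generic way to compute the objective station-keeping measures mentioned above from a logged position trace; the synthetic hover log and the RMS metric are illustrative assumptions rather than the exact processing used in the trials.

```python
# Minimal sketch of an objective station-keeping measure: RMS longitudinal,
# lateral and altitude errors of a hover relative to the commanded hover point.
# The position log is synthetic and the metric is a generic choice.
import numpy as np

def station_keeping_rms(positions_m, hover_point_m):
    """RMS error per axis (longitudinal, lateral, vertical) over the maneuver."""
    err = np.asarray(positions_m) - np.asarray(hover_point_m)
    return np.sqrt(np.mean(err ** 2, axis=0))

# Synthetic 60 s hover log at 10 Hz drifting around the commanded point.
rng = np.random.default_rng(1)
log = np.array([0.0, 0.0, 3.0]) + np.cumsum(rng.normal(0, 0.02, (600, 3)), axis=0)
lon, lat, alt = station_keeping_rms(log, hover_point_m=[0.0, 0.0, 3.0])
print(f"RMS errors: longitudinal {lon:.2f} m, lateral {lat:.2f} m, altitude {alt:.2f} m")
```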


Head-Mounted Displays Conference | 1997

Helmet-mounted display research activity on the NRC Bell 205 airborne simulator

Carl P. Swail; Arthur W. Gubbels; Sion Jennings

The National Research Council (NRC) of Canada, in conjunction with the Canadian Department of National Defence, is investigating the use of helmet-mounted displays as an aid to improving pilot situational awareness in all-weather search and rescue helicopter operations. For over 30 years, the NRC Bell 205 Airborne Simulator has been an integral part of valuable research programs. Equipped with a full-authority fly-by-wire control system, the Bell 205 has variable stability characteristics, which makes the airborne simulator the ideal platform for the integrated flight testing of helmet-mounted displays in a simulated operational environment. This paper will describe the test facility in detail, including a description of the airborne simulator, the fiber-optic helmet-mounted display hardware, the camera system, the head tracking system, and the sensor platform. In addition, the paper will provide a complete description of the symbology overlaid on the camera image that was developed for use with the helmet-mounted display. The paper will conclude with a description of a planned preliminary handling-qualities investigation into the effects of using helmet-mounted displays on the pilot's workload and performance while performing standard maneuvers.


Enhanced and Synthetic Vision Conference | 2002

Hybrid enhanced and synthetic vision system architecture for rotorcraft operations

Norah K. Link; Ronald V. Kruk; David McKay; Sion Jennings; Greg Craig

The ability to conduct rotorcraft search and rescue (SAR) operations can be limited by environmental conditions that affect visibility. Poor visibility compromises transit to the search area, the search for the target, descent to the site and departure from the search area. In a collaborative program funded by the Canadian Department of National Defence, CAE and CMC Electronics designed, and, together with the Flight Research Laboratory of the National Research Council of Canada, integrated and flight-tested an enhanced and synthetic vision system (ESVS) to examine the potential of the concept for SAR operations. The key element of the ESVS was a wide field-of-view helmet-mounted display, which provided a continuous field of regard over a large range of pilot head movements. The central portion of the display consisted of a head-slaved sensor image, which was fused with a larger computer-generated image of the terrain. The combination of sensor and synthetic imagery into a hybrid system allows the accurate detection of obstacles with the sensor, while the synthetic image provides a continuous high-quality image regardless of environmental conditions. This paper presents the architecture and component technologies of the ESVS 2000 TD, as well as lessons learned and future applications for the hybrid approach.
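
The sketch below illustrates the hybrid idea in its simplest form: a narrow head-slaved sensor image pasted into a wider synthetic terrain image at the gaze direction reported by the head tracker. Field-of-view sizes and resolutions are illustrative assumptions, not ESVS 2000 TD specifications.

```python
# Minimal sketch of hybrid compositing: overwrite part of a wide synthetic
# image with a narrow sensor image centred on the tracked gaze direction.
# Field-of-view sizes and resolutions are illustrative assumptions.
import numpy as np

def composite(synthetic, sensor, az_deg, el_deg, synth_fov_deg=(80.0, 60.0)):
    """Overwrite the synthetic image with the sensor patch centred on (az, el)."""
    h, w = synthetic.shape
    sh, sw = sensor.shape
    # Convert gaze angles to pixel coordinates in the synthetic image.
    col = int((az_deg / synth_fov_deg[0] + 0.5) * w)
    row = int((-el_deg / synth_fov_deg[1] + 0.5) * h)
    r0, c0 = max(row - sh // 2, 0), max(col - sw // 2, 0)
    r1, c1 = min(r0 + sh, h), min(c0 + sw, w)
    out = synthetic.copy()
    out[r0:r1, c0:c1] = sensor[: r1 - r0, : c1 - c0]
    return out

synthetic_frame = np.zeros((600, 800))          # wide computer-generated terrain
sensor_frame = np.ones((240, 320))              # narrow head-slaved sensor image
fused = composite(synthetic_frame, sensor_frame, az_deg=10.0, el_deg=-5.0)
```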


IEEE Sensors | 2010

Modeling a prototype optical collision avoidance sensor for unmanned aerial vehicles

Cyrus Minwalla; Mussie Tekeste; Kyle Watters; Paul S. Thomas; Richard Hornsey; Kristopher Ellis; Sion Jennings

Sense and avoid systems for civilian unmanned air vehicles (UAVs) are essential in controlled airspace under visual flight rules (VFR). A prototype optical sensor accomplishes the task with attractive performance specifications. Key requirements include long-range detection (up to 10 km), a wide field of view, discrimination of small threats against the background and tolerance of direct solar illumination. We demonstrate a prototype system based on a network of independent camera modules equipped with local processing. The availability of a fly-by-wire helicopter configured as a UAV emulator allows for realistic field tests with consumer components. Aspects of the design, implementation and evaluation of the prototype sensor are presented here, as are preliminary measurements to clarify the roles of platform motion, the system's optical point-spread function, noise, direct sunlight and target highlighting.
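
A back-of-the-envelope sketch of the stated detection requirement follows: the angular size of a small aircraft at 10 km and the angular resolution needed to cover it with a few pixels. The wingspan and pixel budget are illustrative assumptions, not the sensor's actual specifications.

```python
# Rough sizing of the long-range detection requirement quoted in the abstract.
# Wingspan and pixels-on-target are illustrative assumptions.
import math

wingspan_m = 10.0          # assumed light-aircraft wingspan
range_m = 10_000.0         # detection range quoted in the abstract
pixels_on_target = 2       # assumed minimum pixels for reliable discrimination

angular_size_deg = math.degrees(2 * math.atan(wingspan_m / (2 * range_m)))
required_pix_per_deg = pixels_on_target / angular_size_deg
print(f"target subtends {angular_size_deg * 3600:.0f} arcsec "
      f"-> roughly {required_pix_per_deg:.0f} pixels per degree needed")
```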


Proceedings of SPIE | 2009

Flight performance using a hyperstereo helmet-mounted display: aircraft handling

Sion Jennings; Gregory Craig; Geoffrey W. Stuart; Melvyn E. Kalich; Clarence E. Rash; Thomas H. Harding

A flight study was conducted to assess the impact of hyperstereopsis on helicopter handling proficiency, workload and pilot acceptance. Three pilots with varying levels of night vision goggle and hyperstereo helmet-mounted display experience participated in the test. The pilots carried out a series of flights consisting of low-level maneuvers over a period of two weeks. Four of the test maneuvers (the turn around the tail, the hard-surface landing, hover height estimation and tree-line following) were analysed in detail. At the end of the testing period, no significant difference was observed in the performance data between maneuvers performed with the TopOwl helmet and maneuvers performed with the standard night vision goggle. This study addressed only the image intensification display aspects of the TopOwl helmet system. The tests did not assess the added benefits of overlaid symbology or head-slaved infrared camera imagery. These capabilities need to be taken into account when assessing the overall usefulness of the TopOwl system. Even so, this test showed that pilots can use the image intensification imagery displayed on the TopOwl to perform benign night flying tasks to a level equivalent to that of pilots using ANVIS. The study should be extended to investigate more dynamic and aggressive low-level flying, slope landings and ship deck landings. While there may be concerns regarding the effect of hyperstereopsis on piloting, this initial study suggests that pilots can either adapt to or compensate for hyperstereo effects with sufficient exposure and training. Further analysis and testing are required to determine the extent of training needed.
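
The sketch below gives a purely geometric picture of why hyperstereopsis matters: cameras mounted wider apart than the eyes produce convergence cues that suggest objects are closer than they are. The baseline and interpupillary distance are illustrative assumptions, and real perceived depth also depends on other cues, so this simple model overstates the effect.

```python
# Purely geometric sketch of hyperstereopsis: a camera baseline wider than the
# interpupillary distance (IPD) inflates convergence angles, so near objects
# appear closer. Baseline and IPD values are illustrative assumptions.
import math

def apparent_distance(true_distance_m, camera_baseline_m=0.30, ipd_m=0.065):
    """Distance implied by the convergence angle if interpreted with a natural IPD."""
    convergence_rad = camera_baseline_m / true_distance_m   # small-angle approximation
    return ipd_m / convergence_rad

for d in (2.0, 5.0, 10.0):   # metres, e.g. hover-height judgements near the ground
    print(f"object at {d:.0f} m appears at roughly {apparent_distance(d):.1f} m")
```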

Collaboration


Dive into Sion Jennings's collaborations.

Top Co-Authors

Todd Macuda, National Research Council
Greg Craig, National Research Council
Gregory Craig, National Research Council
Paul S. Thomas, University of New South Wales