Ryan J. Decker
Picatinny Arsenal
Publication
Featured research published by Ryan J. Decker.
23rd AIAA Aerodynamic Decelerator Systems Technology Conference | 2015
Ryan J. Decker; Oleg A. Yakimenko
This paper advocates the use of automated computer vision and image-processing techniques for the characterization of parachute cargo systems. It considers two applications in which test video can be used to extract useful information about the behavior of these systems. The first application identifies the relative motion of the decelerator canopy with respect to the payload using a simple upward-facing camera mounted on top of the payload. The second application employs similar algorithms but uses multiple ground-based cameras to recover six-degree-of-freedom trajectory histories of parachute- and parafoil-payload systems.
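The upward-facing-camera idea can be illustrated with a minimal sketch (not the authors' algorithm): once the canopy centroid has been located in a frame, its pixel offset from the image center converts to an apparent swing angle through the camera's focal length in pixels. All names and the pinhole-camera assumption here are illustrative.

```python
import math

def swing_angle_deg(cx: float, cy: float,
                    img_center: tuple, focal_px: float) -> float:
    """Apparent canopy swing angle, in degrees, from the offset of the
    detected canopy centroid (cx, cy) relative to the image center of an
    upward-facing camera with focal length `focal_px` (pinhole model)."""
    dx = cx - img_center[0]
    dy = cy - img_center[1]
    # angle between the optical axis and the ray through the centroid
    return math.degrees(math.atan2(math.hypot(dx, dy), focal_px))

# A centroid 20 px off-center with an 800 px focal length is a ~1.43° swing.
a = swing_angle_deg(660.0, 360.0, (640.0, 360.0), 800.0)
```

Tracking this angle frame by frame gives the canopy-relative-to-payload motion history the abstract describes.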
Journal of Testing and Evaluation | 2014
Ryan J. Decker; Mathias Kölsch; Oleg A. Yakimenko
This paper describes a cost-effective automated methodology for analyzing launch video of artillery projectiles. Image processing and computer vision techniques are used to segment and classify the projectile shape in each video frame. Within minutes of a shot, the initial position, velocity, and orientation history of the projectile in three-dimensional space are determined at the gun site. An overview of several standard methods used by the Army to characterize the pitching and yawing motion of projectiles is included, along with a discussion of the limitations of these methods. Results from real artillery testing using the automated video analysis method are validated through comparisons with results measured using conventional techniques.
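One building block of such a pipeline, sketched here as a minimal illustration rather than the paper's actual method, is recovering a projectile's in-plane orientation from its segmented silhouette: the principal axis of the silhouette's pixel distribution (largest-eigenvalue eigenvector of the pixel covariance) gives the apparent pitch angle in the image.

```python
import numpy as np

def silhouette_orientation(mask: np.ndarray) -> float:
    """In-plane orientation (radians) of a binary silhouette, taken as
    the principal axis of its pixel distribution."""
    ys, xs = np.nonzero(mask)
    pts = np.stack([xs, ys], axis=1).astype(float)
    pts -= pts.mean(axis=0)                 # center the pixel cloud
    cov = pts.T @ pts / len(pts)            # 2x2 covariance of pixels
    eigvals, eigvecs = np.linalg.eigh(cov)
    major = eigvecs[:, np.argmax(eigvals)]  # axis of largest variance
    return float(np.arctan2(major[1], major[0]))

# Synthetic check: a thin horizontal bar should be oriented along x.
mask = np.zeros((50, 100), dtype=bool)
mask[24:27, 10:90] = True
angle = silhouette_orientation(mask)        # 0 or pi (sign is ambiguous)
```

Applied per frame, and combined across calibrated cameras, such orientation measurements yield the pitch/yaw histories the abstract refers to.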
Journal of Testing and Evaluation | 2014
Ryan J. Decker; Mathias Kölsch; Oleg A. Yakimenko
High-speed video has long served ballistics engineers as a cost-effective technique for measuring the spin rates of both spin-stabilized and fin-stabilized artillery projectiles. At test ranges, state-of-the-art video systems are often used to verify critical launch events following muzzle exit. From manual analysis of these videos, important performance metrics such as velocity, pitch angle, and spin rate can be estimated: operators step through the video frames and record the times at which certain fiducial markings or numbered stripes are observed on the projectile as it rotates. The methods evaluated in this paper are automated processes for calculating the muzzle-exit spin rate from launch videos of spin-stabilized artillery projectiles painted with stripes. Image processing and computer vision techniques are employed to segment the shape of the projectile and extract the stripe pattern in each video frame. The most accurate algorithm's estimates are validated to within 0.02 % for both laboratory and computer-simulated flight video, and to within 0.13 % of manual analysis of real flight video. The sensitivities of the methods to image resolution, video frame rate, and the length of flight captured are evaluated. Areas of continued research and recommendations for increasing measurement accuracy in future tests are also identified.
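The core arithmetic behind stripe-based spin-rate measurement can be sketched in a few lines (this is a simplified illustration, not the paper's algorithms): if a detector records the times at which successive equally spaced stripes cross a fixed reference mark, the slope of a stripe-index-versus-time fit gives stripes per second, and dividing by the stripe count gives revolutions per second.

```python
import numpy as np

def spin_rate_hz(crossing_times: np.ndarray, n_stripes: int) -> float:
    """Spin rate in revolutions per second from the timestamps at which
    successive stripes cross a fixed reference mark, assuming the
    projectile carries `n_stripes` equally spaced stripes."""
    k = np.arange(len(crossing_times))            # stripe index 0,1,2,...
    # least-squares slope of index vs. time = stripes per second
    stripes_per_sec = np.polyfit(crossing_times, k, 1)[0]
    return stripes_per_sec / n_stripes

# Synthetic check: 300 Hz spin with 4 stripes -> 1200 crossings per second.
t = np.arange(20) / 1200.0
rate = spin_rate_hz(t, 4)
```

A least-squares fit over many crossings, rather than a single pair of frames, is what makes the estimate robust to per-frame timing jitter.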
21st AIAA Aerodynamic Decelerator Systems Technology Conference and Seminar | 2011
Ryan J. Decker; Oleg A. Yakimenko; Michael S. Hollis; Patrick J. Sweeney
This paper investigates a prospective avionics-suite rescue kit intended to salvage some of the state-of-the-art electronics in the data-collecting fuze system employed on an artillery projectile. The Army currently uses a single-use data-collection fuze that relays sensor measurements for the purpose of characterizing the flight of an artillery projectile. The goal of the present study is to develop a
ieee aerospace conference | 2017
Oleg A. Yakimenko; Ryan J. Decker
Today's aerial vehicles are equipped with relatively inexpensive vision sensors that can contribute to the navigation solution by providing high-rate position, velocity, and attitude fixes. This paper focuses on preliminary results of computer simulations of a monocular vision-based system based on an image-matching technique. Specifically, the system relies on the high-definition satellite imagery that has become available in recent years. The paper presents the overall concept, discusses the imagery data that was available for testing the suggested concept, and shows preliminary results exhibiting a satisfactory match between the estimated and true states.
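The essence of image-matching navigation can be shown with a brute-force normalized cross-correlation sketch (a toy stand-in for whatever matcher the paper actually simulates): slide the camera frame over the georeferenced satellite image, score each position, and take the best-scoring offset as the position fix. The function name and data here are illustrative.

```python
import numpy as np

def locate_patch(reference: np.ndarray, patch: np.ndarray) -> tuple:
    """Return the (row, col) in `reference` where `patch` best matches,
    scored by normalized cross-correlation over every offset."""
    ph, pw = patch.shape
    p = patch - patch.mean()
    best_score, best_rc = -np.inf, (0, 0)
    for r in range(reference.shape[0] - ph + 1):
        for c in range(reference.shape[1] - pw + 1):
            win = reference[r:r + ph, c:c + pw]
            w = win - win.mean()
            denom = np.linalg.norm(w) * np.linalg.norm(p)
            score = (w * p).sum() / denom if denom else 0.0
            if score > best_score:
                best_score, best_rc = score, (r, c)
    return best_rc

rng = np.random.default_rng(0)
ref = rng.random((40, 40))          # stand-in for the satellite map
patch = ref[12:20, 5:13]            # "camera frame" cut from the map
loc = locate_patch(ref, patch)      # best match at (row, col) = (12, 5)
```

Mapping the recovered pixel offset through the imagery's georeferencing then yields the position fix that feeds the navigation solution.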
AIAA Modeling and Simulation Technologies Conference | 2012
Ryan J. Decker; Oleg A. Yakimenko; Christopher Stout
This paper presents a high-fidelity trajectory model and a parametric study to determine the critical distance from a moving target at which a precision-guided, indirect-fire artillery round must begin its guidance maneuver. Determining the critical target acquisition proximity at various levels of maneuver authority is essential to the development of the projectile's airframe and the selection of its target-seeking instruments. The paper concentrates on simulating the aeroballistic free-flight, rocket-assist, and guidance phases of the weapon's trajectory to determine the critical target acquisition proximity at which this generic mortar airframe can strike a ground vehicle traveling in any direction. For a given baseline of aerodynamic properties, the model quantifies the relationship between target acquisition proximity and maneuver authority.
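As a much-reduced illustration of the free-flight phase of such a model (not the paper's high-fidelity simulation), a point-mass trajectory with a lumped velocity-squared drag term can be stepped with semi-implicit Euler integration; the drag parameter `cd_over_m` is an illustrative stand-in for a full aerodynamic model.

```python
import math

def range_point_mass(v0: float, angle_deg: float,
                     cd_over_m: float = 0.0,
                     dt: float = 0.001, g: float = 9.81) -> float:
    """Down-range distance of a point-mass trajectory with a simple
    velocity-squared drag term, integrated with semi-implicit Euler."""
    vx = v0 * math.cos(math.radians(angle_deg))
    vy = v0 * math.sin(math.radians(angle_deg))
    x = y = 0.0
    while y >= 0.0:
        v = math.hypot(vx, vy)
        vx -= cd_over_m * v * vx * dt        # drag decelerates both axes
        vy -= (g + cd_over_m * v * vy) * dt  # gravity plus drag
        x += vx * dt
        y += vy * dt
    return x

# Vacuum sanity check: 45 deg launch at 100 m/s gives range ~ v0^2/g.
r = range_point_mass(100.0, 45.0)
```

A full model would add the rocket-assist thrust phase and guidance-commanded accelerations on top of this loop; sweeping initial conditions and maneuver authority over many such runs is what produces the critical-proximity map described in the abstract.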
Archive | 2017
Oleg A. Yakimenko; Mathias Kölsch; Ryan J. Decker
Defence Technology | 2017
Ryan J. Decker; Marco Duca; Shawn Spickert-Fulton
Defence Technology | 2016
Ryan J. Decker; Joseph Donini; William Gardner; Jobin John; Walter Koenig
Archive | 2011
Ryan J. Decker; Matthew Ledyard; Boris Flyash; Dominic Cantatore; Michael Hollis