Dmitry Bratanov
Queensland University of Technology
Publications
Featured research published by Dmitry Bratanov.
Sensors | 2018
Fernando Vanegas; Dmitry Bratanov; K. S. Powell; John Weiss; Felipe Gonzalez
Recent advances in remotely sensed imagery and geospatial image processing using unmanned aerial vehicles (UAVs) have enabled the rapid and ongoing development of monitoring tools for crop management and the detection/surveillance of insect pests. This paper describes a UAV remote-sensing-based methodology to increase the efficiency of existing surveillance practices (human inspectors and insect traps) for detecting pest infestations (e.g., grape phylloxera in vineyards). The methodology uses a UAV integrated with advanced digital hyperspectral, multispectral, and RGB sensors. We implemented the methodology to develop a predictive model for phylloxera detection, exploring the combination of airborne RGB, multispectral, and hyperspectral imagery with ground-based data at two separate time periods and under different levels of phylloxera infestation. We describe the technology used (the sensors, the UAV, and the flight operations), the processing workflow for the datasets from each imagery type, and the methods for combining multiple airborne datasets with ground-based datasets. Finally, we present relevant correlation results between the different processed datasets. The objective of this research is to develop a novel methodology for collecting, processing, analysing, and integrating multispectral, hyperspectral, ground, and spatial data to remotely sense different variables in applications such as, in this case, plant pest surveillance. Such a methodology would provide researchers, agronomists, and UAV practitioners with reliable data collection protocols and methods to achieve faster processing and to integrate multiple sources of data in diverse remote sensing applications.
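The core of the analysis described above is correlating airborne spectral indices with ground-based infestation observations. A minimal sketch, assuming hypothetical per-vine reflectance values and infestation scores (the paper's actual indices and data are not reproduced here), might look like:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index from NIR and red reflectance."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + 1e-9)  # small epsilon avoids divide-by-zero

# Hypothetical per-vine reflectance and ground-truth infestation scores
nir = np.array([0.52, 0.48, 0.40, 0.35, 0.30])
red = np.array([0.08, 0.10, 0.14, 0.18, 0.22])
infestation = np.array([0.1, 0.2, 0.5, 0.7, 0.9])  # 0 = healthy, 1 = severe

index = ndvi(nir, red)
# Pearson correlation between the airborne index and the ground survey:
# phylloxera stress reduces vigour, so a strong negative correlation is expected.
r = np.corrcoef(index, infestation)[0, 1]
```

In practice the airborne index would be extracted per vine from orthorectified multispectral or hyperspectral mosaics before correlating against the ground survey.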
IEEE Aerospace and Electronic Systems Magazine | 2016
Michael Wilson; Daniel Ryan; Dmitry Bratanov; Alexander Wainwright; Jason J. Ford; Lennon Cork; Michael Brouckaert
For over 100 years, manned aviation has been based on pilots seeing and avoiding other aircraft. During this time, aviation has evolved to the point where 37.4 million commercial flights were scheduled in 2014. The national airspace system (NAS) of each country is a complex system-of-systems involving air traffic control, a network of navigation and communication facilities, airports, controlled and uncontrolled airspace, and the associated rules and regulations for each part of this system. It is into this system that we are now introducing unmanned aircraft systems (UAS) for commercial and civilian applications.
Journal of Intelligent and Robotic Systems | 2017
Franz Andert; Nikolaus Alexander Ammann; Stefan Krause; Sven Lorenz; Dmitry Bratanov; Luis Mejias
This paper presents an optical-aided navigation method for automatic flights where satellite navigation might be disturbed. The proposed solution follows common approaches in which satellite position updates are replaced with measurements from environment sensors such as a camera, lidar, or radar as required. The alternative positioning is determined by a simultaneous localization and mapping (SLAM) algorithm that handles 2D feature inputs from monocular camera images as well as 3D inputs from camera images augmented by range measurements. The method requires neither known landmarks nor globally flat terrain. Besides the visual SLAM algorithm, the paper describes how to generate 3D feature inputs from lidar and radar sources and how to benefit from both monocular triangulation and 3D features. Regarding state estimation, the approach decouples visual SLAM from the filter updates. This allows software and hardware separation, i.e. visual SLAM computations on powerful hardware while the main filter can be installed on real-time hardware with possibly lower capabilities. The localization quality in case of satellite dropouts is tested with datasets from manned and unmanned flights with different sensors while keeping all parameters constant. The tests show the applicability of this method in flat and hilly terrain and with different path lengths from a few hundred meters to many kilometers. The relative navigation achieves an accumulated error of 1–6% of distance traveled, depending on the flight scenario. In addition to the flights, the paper discusses flight profile limitations when optical navigation methods are used.
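The decoupled architecture described above means the main navigation filter only consumes position fixes, whatever their origin. A minimal sketch, assuming a 1-D constant-velocity Kalman filter and hypothetical noise values (not the paper's actual filter design), could illustrate switching from GNSS to SLAM-derived fixes after a dropout:

```python
import numpy as np

class PositionFilter:
    """1-D constant-velocity Kalman filter consuming position fixes only."""
    def __init__(self, dt=0.1):
        self.x = np.zeros(2)                          # state: [position, velocity]
        self.P = np.eye(2)                            # state covariance
        self.F = np.array([[1.0, dt], [0.0, 1.0]])    # constant-velocity model
        self.Q = 0.01 * np.eye(2)                     # process noise
        self.H = np.array([[1.0, 0.0]])               # we observe position only

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, z, r):
        """Fuse a position fix z with measurement variance r (GNSS or SLAM)."""
        S = self.H @ self.P @ self.H.T + r            # innovation covariance
        K = self.P @ self.H.T / S                     # Kalman gain
        self.x = self.x + (K * (z - self.H @ self.x)).ravel()
        self.P = (np.eye(2) - K @ self.H) @ self.P

kf = PositionFilter()
for t in range(100):
    kf.predict()
    truth = 0.05 * t                                  # vehicle moving at 0.5 m/s
    if t < 50:
        kf.update(truth + 0.01, r=0.01)               # GNSS fix: low noise
    else:
        kf.update(truth + 0.05, r=0.25)               # SLAM fix after GNSS dropout
```

The point of the decoupling is that the SLAM front end can run on powerful non-real-time hardware while a filter of this simplicity runs on certified real-time hardware.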
international conference on unmanned aircraft systems | 2016
Franz Andert; Sven Lorenz; Luis Mejias; Dmitry Bratanov
For flight automation tolerant to satellite navigation dropouts, this paper presents a simultaneous localization and mapping method based on radar altimeter measurements and monocular camera images. The novelty within the mapping is the combination of radar distance and image triangulation. This approach verifies whether the radar measurement fits a specific horizontal plane in the map, yielding the subset of image features that most probably correspond with the radar measurement. With this map match of the radar altitude, ambiguities in the radar measurement can be resolved. Since unusable radar measurements are suppressed, the method is suitable for positioning in non-flat terrain, e.g. in mountainous areas. For matched data, the method estimates a scale correction factor for the image projection rays in order to remove the scale ambiguity of monocular navigation. Together with mapping, vehicle localization is performed, which is essentially camera resectioning. Localization can be parameterized with the required number of degrees of freedom depending on the availability of additional position sensors. The incremental positioning is tested in kilometer-scale outdoor flights of a 30 kg unmanned airplane as well as in flights with a Cessna 172R equipped with camera and radar sensors. The tests show the benefits of the proposed method in flat and hilly terrain, and demonstrate reduction of accumulated errors down to 2–6% of the distance flown. The method imposes some constraints on the altitude range; however, it generally works on typical flight profiles.
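The scale-correction idea above can be sketched in a heavily simplified form: features triangulated up to an unknown monocular scale are matched against a common horizontal plane, and the radar altitude fixes the metric scale. All values and the plane-matching rule here are hypothetical illustrations, not the paper's actual algorithm:

```python
import numpy as np

def scale_from_radar(feature_heights_unscaled, radar_altitude, tol=0.05):
    """Estimate the metric scale factor for monocular triangulation.

    feature_heights_unscaled: camera-frame heights in arbitrary (unscaled) units.
    radar_altitude: measured range to the ground plane in meters.
    Returns None when no feature subset agrees with a common plane,
    mimicking the suppression of unusable radar measurements.
    """
    h = np.asarray(feature_heights_unscaled, dtype=float)
    # Features on the dominant horizontal plane cluster around the median.
    plane = np.median(h)
    on_plane = h[np.abs(h - plane) < tol * plane]   # inlier subset
    if on_plane.size == 0:
        return None                                  # radar does not match the map
    return radar_altitude / on_plane.mean()          # metric scale factor

# Hypothetical unscaled feature heights, including two off-plane outliers
heights = np.array([1.00, 1.01, 0.99, 1.02, 0.60, 1.60])
scale = scale_from_radar(heights, radar_altitude=120.0)
```

Multiplying all triangulated coordinates by this factor would place the monocular map in metric units; off-plane features (here 0.60 and 1.60) are excluded from the estimate.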
Sensors | 2018
Mark Parsons; Dmitry Bratanov; Kevin J. Gaston; Felipe Gonzalez
Recent advances in unmanned aerial system (UAS) sensed imagery, sensor quality/size, and geospatial image processing can enable UASs to rapidly and continually monitor coral reefs, to determine the type of coral and signs of coral bleaching. This paper describes an unmanned aerial vehicle (UAV) remote sensing methodology to increase the efficiency and accuracy of existing surveillance practices. The methodology uses a UAV integrated with advanced digital hyperspectral and ultra HD colour (RGB) sensors, and machine learning algorithms. This paper describes the combination of airborne RGB and hyperspectral imagery with several types of in-water survey data of coral under diverse levels of bleaching. The paper also describes the technology used, the sensors, the UAS, the flight operations, the processing workflow for the datasets, and the methods for combining multiple airborne and in-water datasets, and finally presents relevant results of material classification. The development of this methodology for the collection and analysis of airborne hyperspectral and RGB imagery would provide coral reef researchers, other scientists, and UAV practitioners with reliable data collection protocols and faster processing techniques to achieve remote sensing objectives.
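The material classification described above can be illustrated with a toy nearest-centroid rule over pixel spectra. The band values, class spectra, and classifier choice here are all hypothetical; the paper's actual machine learning algorithms are not specified in this abstract:

```python
import numpy as np

def classify(pixels, centroids, labels):
    """Assign each pixel spectrum to the label of its nearest class centroid."""
    pixels = np.asarray(pixels, dtype=float)
    # Euclidean distance from every pixel spectrum to every class centroid
    d = np.linalg.norm(pixels[:, None, :] - centroids[None, :, :], axis=2)
    return [labels[i] for i in d.argmin(axis=1)]

# Hypothetical mean reflectance over four bands (blue, green, red, NIR)
centroids = np.array([
    [0.10, 0.25, 0.08, 0.05],   # healthy coral: strong green, low red
    [0.40, 0.45, 0.42, 0.30],   # bleached coral: bright, flat spectrum
])
labels = ["healthy", "bleached"]

pixels = [[0.12, 0.24, 0.09, 0.06], [0.38, 0.44, 0.40, 0.28]]
result = classify(pixels, centroids, labels)
```

A hyperspectral pipeline would apply the same idea per pixel across hundreds of bands, with class spectra derived from the in-water survey data rather than fixed by hand.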
ieee aerospace conference | 2018
Fernando Vanegas; Dmitry Bratanov; John Weiss; K. S. Powell; Felipe Gonzalez
Science & Engineering Faculty | 2017
Dmitry Bratanov; Luis Mejias; Jason J. Ford
Institute for Future Environments; Science & Engineering Faculty | 2016
Michael Wilson; Daniel Ryan; Dmitry Bratanov; Alexander Wainwright; Jason J. Ford; Lennon Cork; Michael Brouckaert
Australian Research Centre for Aerospace Automation; Science & Engineering Faculty | 2010
Dmitry Bratanov; Gerd Boedecker
Australian Research Centre for Aerospace Automation; Science & Engineering Faculty | 2010
Dmitry Bratanov