Remote Sens. | 2021

Crops Fine Classification in Airborne Hyperspectral Imagery Based on Multi-Feature Fusion and Deep Learning


Abstract


Hyperspectral imagery has been widely used in precision agriculture due to its rich spectral characteristics. With the rapid development of remote sensing technology, airborne hyperspectral imagery offers detailed spatial information and temporal flexibility, opening a new way toward accurate agricultural monitoring. To extract crop types from airborne hyperspectral images, we propose a fine classification method based on multi-feature fusion and deep learning. In this research, morphological profiles, gray-level co-occurrence matrix (GLCM) texture, and endmember abundance features are leveraged to exploit the spatial information of the hyperspectral imagery. The multiple spatial features are then fused with the original spectral information, and the classification result is generated with a deep neural network with conditional random field (DNN+CRF) model. Specifically, the deep neural network (DNN) is a deep recognition model that can extract deep features and mine the latent information in the data. As a discriminative model, the conditional random field (CRF) considers both spatial and contextual information to reduce misclassification noise while preserving object boundaries. Moreover, three multi-feature fusion approaches, namely feature stacking, decision fusion, and probability fusion, are taken into account. In the experiments, two airborne hyperspectral remote sensing datasets (the Honghu dataset and the Xiong'an dataset) are used. The experimental results show that the classification performance of the proposed method is satisfactory: salt-and-pepper noise is reduced and ground-object boundaries are well preserved.
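As a rough illustration of the feature-stacking fusion strategy mentioned in the abstract, the sketch below concatenates per-band spatial texture features with the original spectral bands of a hyperspectral cube. It is a minimal, assumed example rather than the paper's implementation: local variance stands in for the GLCM texture and morphological-profile features, the random cube and array shapes are placeholders, and the downstream DNN+CRF classifier is not included.

```python
# Minimal sketch of "feature stacking" fusion: spatial texture features are
# concatenated with the original spectral bands before classification.
# Local variance is only a stand-in for the GLCM / morphological features
# used in the paper; the cube and shapes below are illustrative assumptions.
import numpy as np
from scipy.ndimage import uniform_filter

def local_variance(band, size=5):
    """Per-pixel variance in a size x size window (texture stand-in)."""
    mean = uniform_filter(band, size=size)
    mean_sq = uniform_filter(band * band, size=size)
    return np.maximum(mean_sq - mean * mean, 0.0)

def stack_features(cube, size=5):
    """Concatenate spectral bands with per-band texture features.

    cube: (rows, cols, bands) hyperspectral image.
    Returns an array of shape (rows, cols, 2 * bands).
    """
    texture = np.stack(
        [local_variance(cube[:, :, b], size) for b in range(cube.shape[2])],
        axis=-1,
    )
    return np.concatenate([cube, texture], axis=-1)

if __name__ == "__main__":
    rows, cols, bands = 64, 64, 32                  # illustrative cube size
    cube = np.random.rand(rows, cols, bands).astype(np.float32)
    fused = stack_features(cube)
    # Each pixel now carries a fused spectral + spatial feature vector that
    # a classifier (DNN+CRF in the paper) would take as input.
    print(fused.shape)                              # (64, 64, 64)
```

Feature stacking is the simplest of the three fusion strategies named in the abstract; decision fusion and probability fusion would instead combine the outputs (labels or class probabilities) of classifiers trained on the individual feature sets.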

Volume 13
Article Number 2917
DOI 10.3390/rs13152917
Language English
Journal Remote Sens.
