ArXiv | 2021

No-Reference Quality Assessment for 360-degree Images by Analysis of Multi-frequency Information and Local-global Naturalness


Abstract


360-degree/omnidirectional images (OIs) have received remarkable attention due to the increasing applications of virtual reality (VR). Compared to conventional 2D images, OIs provide more immersive experiences to consumers, benefiting from their higher resolution and wider fields of view (FoVs). Moreover, OIs are usually observed in a head-mounted display (HMD) without reference images. Therefore, an efficient blind quality assessment method specifically designed for 360-degree images is urgently needed. In this paper, motivated by the characteristics of the human visual system (HVS) and the viewing process of VR visual content, we propose a novel and effective no-reference omnidirectional image quality assessment (NR OIQA) algorithm based on Multi-Frequency Information and Local-Global Naturalness (MFILGN). Specifically, inspired by the frequency-dependent property of the visual cortex, we first decompose the equirectangular projection (ERP) maps into wavelet subbands using the discrete Haar wavelet transform (DHWT). The entropy intensities of the low-frequency and high-frequency subbands are then exploited to measure the multi-frequency information of OIs. In addition to the global naturalness of ERP maps, since observers browse OIs through individual FoVs, we extract natural scene statistics (NSS) features from each viewport image as a measure of local naturalness. With the proposed multi-frequency information and local-global naturalness measurements, we employ support vector regression (SVR) as the final quality regressor, mapping quality-related features to human ratings. To our knowledge, the proposed model is the first no-reference quality assessment method for 360-degree images that combines multi-frequency information and image naturalness.
Experimental results on two publicly available OIQA databases demonstrate that our proposed MFILGN outperforms state-of-the-art full-reference (FR) and NR approaches.
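The multi-frequency feature extraction described in the abstract can be illustrated with a minimal sketch (not the authors' implementation): a one-level 2-D Haar decomposition of a grayscale ERP map into LL/LH/HL/HH subbands, followed by the Shannon entropy of each subband's quantized coefficients. The quantization step, the plain-Python data layout, and the function names are illustrative assumptions.

```python
import math
from collections import Counter

def haar_dwt2(img):
    """One-level 2-D Haar DWT (illustrative): returns LL, LH, HL, HH subbands.
    img: 2-D list of grayscale values with even height and width."""
    w = len(img[0])
    # Row transform: pairwise averages (low-pass) and differences (high-pass).
    lo = [[(row[2*j] + row[2*j+1]) / 2 for j in range(w // 2)] for row in img]
    hi = [[(row[2*j] - row[2*j+1]) / 2 for j in range(w // 2)] for row in img]

    def cols(mat):
        # Column transform applied to each half, same averaging/differencing.
        n, m = len(mat) // 2, len(mat[0])
        L = [[(mat[2*i][j] + mat[2*i+1][j]) / 2 for j in range(m)] for i in range(n)]
        H = [[(mat[2*i][j] - mat[2*i+1][j]) / 2 for j in range(m)] for i in range(n)]
        return L, H

    LL, HL = cols(lo)
    LH, HH = cols(hi)
    return LL, LH, HL, HH

def entropy(subband, step=1.0):
    """Shannon entropy (bits) of subband coefficients quantized with bin width `step`."""
    vals = [round(v / step) for row in subband for v in row]
    n = len(vals)
    return -sum(c / n * math.log2(c / n) for c in Counter(vals).values())
```

In the full pipeline, such subband entropies would be concatenated with the local and global NSS features and fed to an SVR regressor trained against human opinion scores.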

arXiv ID: abs/2102.11393
DOI: 10.1109/TCSVT.2021.3081182
Language: English
Journal: ArXiv
