Journal of Ambient Intelligence and Humanized Computing | 2021

SpSAN: Sparse self-attentive network-based aspect-aware model for sentiment analysis


Abstract


Consumer reviews for services and products are an essential measure of how an organization's offerings perform, and they also help prospective consumers understand previous consumers' experiences. This experimental work was carried out using consumer reviews gathered from an online review platform. It evaluates consumer sentiment associated with qualitative content, quantitative ratings, and cultural aspects to predict Consumer Recommendation Decisions (CRDs). Moreover, we extract service aspects from online reviews and fuse them with the word sequences before feeding them into the model, which incorporates aspect representations and position information into the sentence context. Additionally, this study proposes a Sparse Self-Attention Network (SpSAN) model to predict CRDs. The proposed SpSAN improves the fine-tuning performance of the Bidirectional Encoder Representations from Transformers (BERT) model by introducing sparsity into the self-attention procedure. Specifically, this work integrates sparsity into the self-attention mechanism by replacing the softmax function with a controllable sparse transformation during fine-tuning with BERT. This yields sparse attention distributions and a more interpretable representation of the complete input data. Experimental results and their analysis demonstrate the effectiveness of the proposed SpSAN model.
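To illustrate the core idea of replacing softmax with a sparse transformation inside self-attention, the sketch below uses sparsemax (Martins and Astudillo, 2016) as one concrete instance of such a mapping; the paper's controllable transformation may differ (e.g., an alpha-entmax family), so the function names and this particular choice are illustrative assumptions, not the authors' exact implementation.

```python
import torch

def sparsemax(z, dim=-1):
    """Sparsemax: projects scores onto the probability simplex,
    driving many attention weights exactly to zero (illustrative
    stand-in for the paper's controllable sparse transformation)."""
    z_sorted, _ = torch.sort(z, dim=dim, descending=True)
    z_cumsum = z_sorted.cumsum(dim=dim)
    k = torch.arange(1, z.size(dim) + 1, device=z.device, dtype=z.dtype)
    view = [1] * z.dim()
    view[dim] = -1
    k = k.view(view)
    # Entries belonging to the support of the sparse distribution
    support = 1 + k * z_sorted > z_cumsum
    k_support = support.sum(dim=dim, keepdim=True)
    # Threshold tau such that the clipped scores sum to one
    tau = (torch.gather(z_cumsum, dim, k_support - 1) - 1) / k_support.to(z.dtype)
    return torch.clamp(z - tau, min=0)

def sparse_self_attention(q, k, v):
    """Scaled dot-product attention with sparsemax in place of softmax,
    a minimal sketch of the sparsity mechanism described in the abstract."""
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / d_k ** 0.5
    attn = sparsemax(scores, dim=-1)  # sparse attention weights
    return attn @ v

# Usage example with random tensors (batch=2, heads=4, length=8, dim=16)
q = torch.randn(2, 4, 8, 16)
out = sparse_self_attention(q, q, q)
print(out.shape)  # torch.Size([2, 4, 8, 16])
```

Because sparsemax assigns exactly zero weight to irrelevant tokens, the resulting attention maps are easier to interpret than dense softmax distributions, which is the intuition behind introducing sparsity during BERT fine-tuning.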

DOI 10.1007/s12652-021-03436-x
Language English
Journal Journal of Ambient Intelligence and Humanized Computing
