Knowledge-Based Systems | 2021

Leveraging bilingual-view parallel translation for code-switched emotion detection with adversarial dual-channel encoder

Abstract


Code-switched emotion detection, the task of analyzing emotion in code-switched texts, has gained increasing research attention in recent years. Prior works utilize various neural models with sophisticated features to pursue high performance on the task, yet they still overlook some crucial characteristics of code-switched texts. In this work, we present an innovative approach for improving code-switched emotion detection. We first consider a bilingual-view parallel translation for code-switched text enhancement, i.e., translating each code-switched text into two languages. Then we propose an adversarial dual-channel encoder architecture, where two private encoders take as inputs the parallel texts in the two languages, respectively. The private encoders and the shared encoder work collaboratively and effectively retrieve features from monolingual and bilingual perspectives under adversarial training. We conduct extensive experiments on five code-switched benchmark datasets. Results show that our model outperforms strongly-performing baselines that leverage external code-switched or bilingual word embeddings by over 1.5% F1 score on the Chinese-English, Spanish-English, and Hindi-English code-mixed data, becoming the new state-of-the-art system. Further analyses, including ablation, qualitative, and error studies, demonstrate the effectiveness of our proposed encoder for code-switched texts, as well as of the bilingual-view parallel translation strategy.
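
The abstract describes the architecture only at a high level. The sketch below is a minimal, illustrative PyTorch reading of an adversarial dual-channel encoder: two private encoders for the two translated views, a shared encoder trained adversarially against a view discriminator via gradient reversal, and an emotion classifier over the combined features. The module names, dimensions, and the BiLSTM/gradient-reversal choices are assumptions for illustration, not the authors' implementation.

```python
import torch
import torch.nn as nn


class GradReverse(torch.autograd.Function):
    """Gradient reversal layer used for adversarial training of the shared encoder."""

    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        # Flip the gradient sign so the shared encoder learns to fool the discriminator.
        return grad_output.neg() * ctx.lambd, None


class DualChannelEncoder(nn.Module):
    """Two private BiLSTM encoders (one per translated view) plus a shared BiLSTM encoder.

    The shared encoder feeds a view discriminator through gradient reversal, pushing it
    toward view-invariant features, while the private encoders keep view-specific ones;
    emotion classification uses both kinds of features.
    """

    def __init__(self, emb_dim=300, hid=128, num_classes=2):
        super().__init__()
        self.private_l1 = nn.LSTM(emb_dim, hid, batch_first=True, bidirectional=True)
        self.private_l2 = nn.LSTM(emb_dim, hid, batch_first=True, bidirectional=True)
        self.shared = nn.LSTM(emb_dim, hid, batch_first=True, bidirectional=True)
        self.discriminator = nn.Linear(2 * hid, 2)         # which translated view?
        self.classifier = nn.Linear(6 * hid, num_classes)  # emotion label

    @staticmethod
    def encode(rnn, x):
        out, _ = rnn(x)          # (batch, time, 2 * hid)
        return out.mean(dim=1)   # mean-pool over time

    def forward(self, x_l1, x_l2, lambd=1.0):
        # x_l1 / x_l2: embedded parallel translations, shape (batch, time, emb_dim)
        p1 = self.encode(self.private_l1, x_l1)   # private features, view 1
        p2 = self.encode(self.private_l2, x_l2)   # private features, view 2
        s1 = self.encode(self.shared, x_l1)       # shared features from view 1
        s2 = self.encode(self.shared, x_l2)       # shared features from view 2
        # Adversarial branch: the discriminator classifies the view of each shared
        # representation; the reversed gradient makes the shared encoder view-invariant.
        adv_logits = self.discriminator(
            GradReverse.apply(torch.cat([s1, s2], dim=0), lambd)
        )
        emo_logits = self.classifier(torch.cat([p1, p2, (s1 + s2) / 2], dim=-1))
        return emo_logits, adv_logits
```

Under this reading, a training loop would combine a cross-entropy emotion loss on emo_logits with a view-discrimination loss on adv_logits (labels 0/1 for the two translated views), so that adversarial training shapes the shared features while the emotion objective drives the full model.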

DOI 10.1016/j.knosys.2021.107436
Language English
Journal Knowledge-Based Systems
