Knowl. Based Syst. | 2021

Enhancing Transformer-based language models with commonsense representations for knowledge-driven machine comprehension


Abstract


Compared to traditional machine reading comprehension (MRC), which is limited to the information in a passage, knowledge-driven MRC tasks aim to enable models to answer questions using both the text and related commonsense knowledge. Although pre-trained Transformer-based language models (TrLMs) such as BERT and RoBERTa have shown strong performance on MRC, external knowledge such as unspoken commonsense and world knowledge still cannot be used or explained explicitly. In this work, we present three simple yet effective injection methods, integrated into the structure of TrLMs, to fine-tune downstream knowledge-driven MRC tasks with off-the-shelf commonsense representations. Moreover, we introduce a mask mechanism for token-level multi-hop relationship searching to filter external knowledge. We conducted extensive experiments on DREAM and CosmosQA, two prevalent knowledge-driven datasets. Experimental results indicate that the incremental TrLMs outperform the baseline systems by 1%-4.1% at a lower computational cost. Further analysis shows the effectiveness of the proposed methods and the robustness of the incremental model when the training set is incomplete.
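Since the abstract does not spell out the fusion architecture, the sketch below shows one plausible form of such an injection method, assuming token-aligned commonsense vectors (e.g., 300-dimensional ConceptNet Numberbatch embeddings) fused with the TrLM's hidden states through a gated residual connection; the module name, gating design, and dimensions are illustrative assumptions rather than the authors' implementation.

import torch
import torch.nn as nn

class GatedKnowledgeFusion(nn.Module):
    # Mix token-aligned commonsense vectors into TrLM hidden states.
    # A per-token sigmoid gate decides how much external knowledge to
    # inject; tokens without a matched concept keep the original state.
    def __init__(self, hidden_size: int, kg_dim: int):
        super().__init__()
        self.kg_proj = nn.Linear(kg_dim, hidden_size)   # project KG vectors into TrLM space
        self.gate = nn.Linear(2 * hidden_size, hidden_size)

    def forward(self, hidden_states, kg_embeds, kg_mask):
        # hidden_states: (batch, seq_len, hidden_size) from the TrLM
        # kg_embeds:     (batch, seq_len, kg_dim) retrieved concept vectors
        # kg_mask:       (batch, seq_len), 1 where a concept was matched
        kg = self.kg_proj(kg_embeds)
        gate = torch.sigmoid(self.gate(torch.cat([hidden_states, kg], dim=-1)))
        gate = gate * kg_mask.unsqueeze(-1).to(gate.dtype)  # no injection without a match
        return hidden_states + gate * kg                    # gated residual injection

# Toy usage with random tensors standing in for BERT-base outputs
# (hidden size 768) and 300-d Numberbatch-style concept vectors.
h = torch.randn(2, 8, 768)
k = torch.randn(2, 8, 300)
m = torch.randint(0, 2, (2, 8))
print(GatedKnowledgeFusion(768, 300)(h, k, m).shape)  # torch.Size([2, 8, 768])

Because the gate is computed from both the contextual state and the knowledge vector, the model can learn to suppress irrelevant retrieved concepts instead of injecting them unconditionally.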
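The mask mechanism for token-level multi-hop relationship searching can be pictured as a breadth-first expansion over the knowledge graph: only candidate concepts reachable within a bounded number of hops from concepts mentioned in the passage are admitted, and the resulting 0/1 mask plays the role of kg_mask in the fusion sketch above. The function names and adjacency-dictionary layout are assumptions for illustration, not the paper's exact procedure.

from collections import deque

def reachable_concepts(seed_concepts, adjacency, max_hops=2):
    # Breadth-first search: collect every concept within `max_hops`
    # edges of the concepts matched in the passage.
    admitted = set(seed_concepts)
    frontier = deque((c, 0) for c in seed_concepts)
    while frontier:
        concept, hops = frontier.popleft()
        if hops == max_hops:
            continue
        for neighbour in adjacency.get(concept, ()):
            if neighbour not in admitted:
                admitted.add(neighbour)
                frontier.append((neighbour, hops + 1))
    return admitted

def knowledge_mask(candidate_concepts, admitted):
    # 0/1 mask over token-level candidates: keep retrieved knowledge
    # only if it lies inside the multi-hop neighbourhood.
    return [1 if c is not None and c in admitted else 0
            for c in candidate_concepts]

# Toy usage with a hypothetical three-concept graph.
adjacency = {"dream": ["sleep"], "sleep": ["rest"], "rest": []}
admitted = reachable_concepts({"dream"}, adjacency, max_hops=2)
print(knowledge_mask(["dream", "rest", None, "party"], admitted))  # [1, 1, 0, 0]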

Volume 220
Article 106936
DOI 10.1016/j.knosys.2021.106936
Language English
Journal Knowl. Based Syst.
