IEEE Transactions on Knowledge and Data Engineering | 2021

Multi-hop Reasoning Question Generation and Its Application
Abstract


This article focuses on multi-hop question generation (QG), which aims to generate questions that require multi-hop reasoning skills over a given text. Such questions must be not only syntactically valid but also logically correlated with their answers. Concretely, we first design a basic QG model and customize several techniques to ensure the syntactic validity of the results. To promote logical correlations, we use a reasoning chain extracted from the text to regularize the results. Considering that different samples have their own characteristics in terms of contextual structure, question type, and logical correlation, we propose a new adaptive meta-learner to optimize the basic QG model. Each case, together with its similar samples, is viewed as a pseudo-QG task, and the similar structural contexts within the same task serve as guidance for fine-tuning the model. To measure the similarity of samples' structured inputs, we propose a data-driven multi-level recognizer. Experimental results on two typical datasets from different domains show the effectiveness of the proposed approach. Moreover, we apply the generated results to the task of machine reading comprehension and achieve significant performance improvements, demonstrating the capacity of multi-hop QG to facilitate real-world applications.
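The adaptive meta-learning procedure described above can be sketched in miniature: for each test case, the most similar training samples form a pseudo-QG task, a copy of the base model is briefly fine-tuned on that task, and the adapted copy produces the output. The sketch below is illustrative only; the similarity function, the toy regressor, and all names are assumptions standing in for the paper's multi-level recognizer and QG model.

```python
# Hypothetical sketch of the per-sample adaptation step: build a pseudo-task
# from similar samples, fine-tune a copy of the base parameters on it, then
# predict. The toy linear model replaces the actual QG model.
import math

def cosine(u, v):
    """Similarity of two feature vectors (stand-in for the data-driven
    multi-level recognizer that scores structured-input similarity)."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def grad_step(params, sample, lr=0.1):
    """One SGD step of a toy regressor y = w*x + b on one (x, y) pair."""
    w, b = params
    x, y = sample
    err = (w * x + b) - y
    return (w - lr * err * x, b - lr * err)

def adapt_and_predict(base_params, train_set, query, k=3, inner_steps=5):
    """Form a pseudo-task from the k most similar training samples,
    fine-tune a copy of the base parameters on it, and predict.

    train_set: list of (feature_vector, (x, y)) pairs.
    query:     (feature_vector, x) for the test case.
    """
    feats, x_query = query
    # Rank training samples by structural similarity to the query case.
    support = sorted(train_set, key=lambda s: -cosine(s[0], feats))[:k]
    params = base_params  # tuples are immutable: the base model is untouched
    for _ in range(inner_steps):
        for _, sample in support:
            params = grad_step(params, sample)
    w, b = params
    return w * x_query + b
```

The key design point mirrored here is that adaptation happens per test case on a task-specific copy, so the base model is never overwritten and each case benefits from its own structurally similar neighborhood.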

Pages 1-1
DOI 10.1109/TKDE.2021.3073227
Language English
Journal IEEE Transactions on Knowledge and Data Engineering
