
Journal of Information Science and Engineering, Vol. 38 No. 3, pp. 571-589

Question Generation for Reading Comprehension Test Complying with Types of Question

1Ritsumeikan Global Innovation Research Organization
2College of Information Science and Engineering
Ritsumeikan University
Shiga, 525-8577 Japan

3Faculty of Informatics
Kansai University
Osaka, 569-1095 Japan
E-mail: {shan; nisihara}@fc.ritsumei.ac.jp

In this paper, we propose a method to generate two different types of reading comprehension questions, complying with the type of question, for language learning tests using the Transformer model and the seq2seq approach. In recent years, many approaches have shown good results by treating question generation as a seq2seq task. These approaches were implemented with question-answering (QA) datasets; however, few studies have considered a reading comprehension-based dataset. Therefore, this paper proposes a method to generate questions appropriate for reading comprehension tests from articles. Moreover, an analysis of reading comprehension test questions revealed two primary question asking styles: the commonly-used question (CM question) and the directly-related question (DR question). Because the characteristics of the two question types differ, the generation models need to be designed according to the question type. We propose a method to separate the two question types in the dataset and use two models to generate both types, comparing the results with a method that generates the two types of questions together. The positive rate for the proposed method's CM questions was 88% and for its DR questions was 49%, compared to 33% and 24%, respectively, for the comparative method. The evaluation showed that the proposed method generates the two types of reading comprehension questions more effectively, with the positive rate increased by an average of 40%.
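The dataset-separation step described above could be sketched as a simple pattern-based filter. The stems below are purely illustrative assumptions, not the paper's actual separation criteria: commonly-used (CM) questions are taken to begin with frequent, article-independent question stems, and everything else is treated as directly related (DR) to the article content.

```python
# Illustrative sketch only: hypothetical CM-question stems; the paper's
# actual criteria for separating the two question types may differ.
CM_STEMS = (
    "what is the main idea",
    "which of the following",
    "what can be inferred",
    "what is the best title",
)

def classify_question(question: str) -> str:
    """Label a question 'CM' (commonly-used) if it starts with a frequent,
    article-independent stem, otherwise 'DR' (directly-related)."""
    q = question.strip().lower()
    return "CM" if any(q.startswith(stem) for stem in CM_STEMS) else "DR"

# A separator like this would split the training data into two subsets,
# each used to train its own seq2seq generation model.
```

Under this sketch, `classify_question("What is the main idea of the passage?")` yields `"CM"`, while a content-specific question such as `"Why did Tom leave the farm?"` yields `"DR"`.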

Keywords: question generation, reading comprehension tests, language learning, attention mechanism, Transformer model, seq2seq
