The documents distributed by this server have been provided by the contributing authors as a means to ensure timely dissemination of scholarly and technical work on a non-commercial basis. Copyright and all rights therein are maintained by the authors or by other copyright holders, notwithstanding that they have offered their works here electronically. It is understood that all persons copying this information will adhere to the terms and constraints invoked by each author's copyright. These works may not be reposted without the explicit permission of the copyright holder.

Educational Automatic Question Generation Improves Reading Comprehension in Non-native Speakers: A Learner-Centric Case Study

Authors: Tim Steuer, Anna Filighera, Thomas Tregel, André Miede
Date: June 2022
Kind: Article
Journal: Frontiers in Artificial Intelligence
Keywords: automatic question generation, self-assessment, natural language processing, reading comprehension, education
Abstract: Background: Asking learners manually authored questions about their readings improves their text comprehension. Yet, not all reading materials include sufficiently many questions, and many informal reading materials do not contain any. Therefore, automatic question generation has great potential in education, as it may alleviate the lack of questions. However, there is currently insufficient evidence on whether such automatically generated questions benefit learners' understanding in reading comprehension scenarios. Objectives: We investigate the positive and negative effects of automatically generated short-answer questions on learning outcomes in a reading comprehension scenario. Methods: A learner-centric, between-groups, quasi-experimental reading comprehension case study with 48 college students is conducted. We test two hypotheses concerning positive and negative effects on learning outcomes during the comprehension of science texts and descriptively explore how the generated questions influenced learners. Results: The results show a positive effect of the generated questions on the participants' learning outcomes. However, we cannot entirely exclude question-induced adverse side effects on the learning of non-questioned information. Interestingly, questions identified as computer-generated by learners nevertheless seemed to benefit their understanding. Take Away: Automatic question generation positively impacts reading comprehension in the given scenario. In the reported case study, even questions recognized as computer-generated supported reading comprehension.
Full paper (pdf)