

Poster in Workshop: How Far Are We From AGI

Exploring Iterative Enhancement for Improving Learnersourced Multiple-Choice Question Explanations with Large Language Models

Qiming Bao · Juho Leinonen · Alex Peng · Wanjun Zhong · Gaël Gendron · Timothy Pistotti · Alice Huang · Paul Denny · Michael Witbrock · Jiamou Liu

Keywords: [ Learnersourcing ] [ Automated Explanation Evaluation ] [ Multiple-Choice Question ] [ MCQs ] [ Automated Explanation Generation ] [ Explanations ] [ Large Language Models in Education ] [ LLMs ]


Abstract:

Large language models exhibit superior capabilities in processing and understanding language, yet their applications in educational contexts remain underexplored. Learnersourcing enhances learning by engaging students in creating their own educational content. When learnersourcing multiple-choice questions, writing an explanation for the solution is a crucial step: it helps other students understand the solution and promotes a deeper understanding of the related concepts. However, students often find it difficult to craft effective solution explanations due to limited subject understanding. To help scaffold automated explanation generation, we present and evaluate a framework called "ILearner-LLM" that iteratively enhances the explanations generated for a given question with large language models. The framework comprises an explanation generation model and an explanation evaluation model, and it produces high-quality, student-aligned explanations by iteratively feeding the quality rating score from the evaluation model back into the instruction prompt of the generation model. Experimental results show that ILearner-LLM, applied to LLaMA2-13B and GPT-4, generates higher-quality explanations that are closer to those written by students on five PeerWise datasets. Our findings represent a promising path toward enriching the learnersourcing experience for students and enhancing the capabilities of large language models for educational applications.
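
The abstract describes a generate-evaluate loop in which the evaluation model's rating is fed back into the generation prompt. The sketch below illustrates one way such a loop could look; the function names (`generate`, `evaluate`), prompt wording, rating scale, and stopping rule are illustrative assumptions, not the authors' actual implementation.

```python
def iterative_explanation(question, options, answer, generate, evaluate,
                          max_iters=3, target_score=4.5):
    """Iteratively refine an explanation for a learnersourced MCQ.

    `generate(prompt) -> str` and `evaluate(question, explanation) -> float`
    are placeholders for the explanation generation and evaluation models
    (e.g. LLaMA2-13B or GPT-4 accessed through an API).
    """
    prompt = (
        f"Question: {question}\nOptions: {options}\nAnswer: {answer}\n"
        "Write a clear explanation of the solution for other students."
    )
    explanation, score = None, None
    for _ in range(max_iters):
        explanation = generate(prompt)
        score = evaluate(question, explanation)  # quality rating score
        if score >= target_score:                # assumed stopping rule
            break
        # Feed the rating back into the instruction prompt for the next round.
        prompt = (
            f"Question: {question}\nOptions: {options}\nAnswer: {answer}\n"
            f"Previous explanation (rated {score:.1f}/5):\n{explanation}\n"
            "Revise the explanation to improve its quality and make it "
            "closer to how a student would explain the solution."
        )
    return explanation, score
```

In this sketch the loop ends either when the assumed score threshold is reached or after a fixed number of iterations; the paper's actual iteration count and rating scheme may differ.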
