Conference paper, 2015

Distractor quality evaluation in Multiple Choice Questions

Abstract

Multiple choice questions are a widely used evaluation mode, yet writing items that properly evaluate student learning is a complex task. Guidelines have been developed for manual item creation, but automatic item quality evaluation would be a helpful tool for teachers. In this paper, we present a method for evaluating option quality, based on Natural Language Processing criteria that assess the syntactic and semantic homogeneity of the options. We evaluate this method on a large MCQ corpus and show that combining several measures makes it possible to validate distractors.
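
The paper itself specifies the actual criteria and their combination; as a rough, hypothetical illustration of the general idea only (not the authors' measures), the sketch below scores a candidate distractor against the correct answer using two assumed criteria: overlap of part-of-speech tags as a syntactic-homogeneity proxy and word-vector similarity as a semantic one, merged by a weighted average. The spaCy model, the weights, and the helper names are all assumptions made for this example.

# Illustrative sketch only: assumed homogeneity criteria, not the measures from the paper.
# Assumes spaCy and a model with word vectors are installed
# (e.g. python -m spacy download en_core_web_md).
import spacy

nlp = spacy.load("en_core_web_md")

def syntactic_homogeneity(option: str, key: str) -> float:
    # Jaccard overlap of part-of-speech tag sets, as a crude syntactic-homogeneity proxy.
    pos_opt = {tok.pos_ for tok in nlp(option)}
    pos_key = {tok.pos_ for tok in nlp(key)}
    union = pos_opt | pos_key
    return len(pos_opt & pos_key) / len(union) if union else 0.0

def semantic_homogeneity(option: str, key: str) -> float:
    # Word-vector cosine similarity between the option and the correct answer.
    return nlp(option).similarity(nlp(key))

def distractor_score(option: str, key: str, w_syn: float = 0.5, w_sem: float = 0.5) -> float:
    # Combine the two measures with an (arbitrary) weighted average.
    return w_syn * syntactic_homogeneity(option, key) + w_sem * semantic_homogeneity(option, key)

if __name__ == "__main__":
    key = "cellular respiration"
    for distractor in ["photosynthesis", "a bicycle", "fermentation"]:
        print(distractor, round(distractor_score(distractor, key), 3))

In a realistic setting, per-criterion thresholds or a supervised combination learned on an annotated MCQ corpus would replace the arbitrary weights used here.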
Main file
aied2015.pdf (232.15 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-01631779, version 1 (13-11-2017)

Identifiers

  • HAL Id: hal-01631779, version 1

Cite

Van-Minh Pho, Anne-Laure Ligozat, Brigitte Grau. Distractor quality evaluation in Multiple Choice Questions. International Conference on Artificial Intelligence in Education, Jan 2015, Madrid, Spain. ⟨hal-01631779⟩
