Modeling Prompt Adherence in Student Essays
Isaac Persing and Vincent Ng.
Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 1534-1543, 2014.
Abstract
Recently, researchers have begun exploring methods of scoring student essays with respect to particular dimensions of quality such as coherence, technical errors, and prompt adherence. The work on modeling prompt adherence, however, has focused mainly on whether individual sentences adhere to the prompt. We present a new annotated corpus of essay-level prompt adherence scores and propose a feature-rich approach to scoring essays along the prompt adherence dimension. Our approach significantly outperforms a knowledge-lean baseline prompt adherence scoring system, yielding improvements of up to 16.6%.
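For readers who want a concrete sense of what a knowledge-lean prompt adherence baseline might look like, the sketch below scores an essay by the lexical similarity between the essay and its prompt. This TF-IDF/cosine setup (using scikit-learn) is an illustrative assumption only; it is not the baseline or the feature set used in the paper.

# Minimal sketch of a knowledge-lean prompt adherence scorer.
# Assumption: adherence is approximated by TF-IDF cosine similarity
# between the essay and its prompt. This is NOT the paper's method.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def prompt_adherence_score(essay: str, prompt: str) -> float:
    """Return a similarity-based adherence score in [0, 1]."""
    vectorizer = TfidfVectorizer(stop_words="english")
    vectors = vectorizer.fit_transform([essay, prompt])
    return float(cosine_similarity(vectors[0], vectors[1])[0, 0])

if __name__ == "__main__":
    prompt = "Some people say that technology makes our lives more complex."
    essay = "Modern technology, such as smartphones, complicates daily life ..."
    print(prompt_adherence_score(essay, prompt))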
Dataset
The human annotations used in this paper are available from this page.
BibTeX entry
@InProceedings{Persing+Ng:14a,
author = {Isaac Persing and Vincent Ng},
title = {Modeling Prompt Adherence in Student Essays},
booktitle = {Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)},
pages = {1534--1543},
year = 2014}