The Impact of Spelling Errors on Trained Raters' Scoring Decisions
Second language (L2) writing assessments seldom allow a spellchecker and often impose a time limit, so test takers frequently submit responses containing spelling errors. However, little is known about whether and how spelling errors in test taker responses affect trained raters' scoring decisions. In this study, we investigated the impact of spelling errors on trained raters' holistic evaluation of response quality. We selected 148 responses to four L2 writing tasks and created error-free versions by correcting the spelling errors in the original responses. Both the original and corrected responses were randomly assigned to trained raters, who scored them using the same holistic scoring rubrics. We compared the resulting scores of the original and corrected responses to gauge the impact of spelling errors, and we examined whether that impact varied across task types and spelling error characteristics. The results showed that correcting spelling errors led to an average score increase of more than half a point. The score gains varied across task types and with the number of errors corrected. These findings have multiple implications, including suggestions for rater training and assessment development.
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.