Automated Essay Scoring Versus Human Scoring: A Correlational Study

J. Wang, South Texas College, United States; M. Stallone Brown, Texas A&M University-Kingsville, United States

CITE Journal Volume 8, Number 4, ISSN 1528-5804 Publisher: Society for Information Technology & Teacher Education, Waynesville, NC USA


The purpose of the current study was to analyze the relationship between automated essay scoring (AES) and human scoring to determine the validity and usefulness of AES for large-scale placement tests. Specifically, a correlational research design was used to examine the correlations between AES performance and human raters' performance. Spearman rank correlation coefficient tests were used for the data analyses. Results showed no statistically significant correlation between the overall holistic scores assigned by the AES tool and the overall holistic scores assigned either by faculty human raters or by human raters who scored another standardized writing test. In contrast, there was a significant correlation between the scores assigned by the two teams of human raters. A significant correlation was also present between AES and faculty human scoring in Dimension 4 (Sentence Structure), but no significant correlations existed in the other dimensions. Findings from the current study do not corroborate previous findings on AES tools. The implication of these findings for English educators is that AES tools have limited capability at this point and that more reliable measures of assessment, such as writing portfolios and conferencing, still need to be part of the methods repertoire.
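The Spearman rank correlation test named in the abstract can be sketched in a few lines: rank each rater's scores (averaging over ties) and compute the Pearson correlation of the ranks. The score lists below are illustrative placeholders, not the study's actual data, and the function names are hypothetical.

```python
# Minimal sketch of a Spearman rank correlation, as used in the study.
# All data here is invented for illustration only.

def ranks(xs):
    """Assign 1-based ranks to xs, averaging the ranks of tied values."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        # Extend j over a run of tied values.
        while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of 1-based positions i+1 .. j+1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman_rho(x, y):
    """Spearman's rho: Pearson correlation computed on the ranks of x and y."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical holistic essay scores (1-6 scale) from an AES tool and human raters.
aes_scores = [4, 3, 5, 2, 4, 6, 3, 5]
human_scores = [3, 4, 5, 2, 3, 5, 4, 6]
rho = spearman_rho(aes_scores, human_scores)
```

Because holistic essay scores are ordinal (a 6-point scale rather than a true interval measure), a rank-based statistic like Spearman's rho is the appropriate choice here over Pearson's r.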


Wang, J., & Stallone Brown, M. (2008). Automated essay scoring versus human scoring: A correlational study. Contemporary Issues in Technology and Teacher Education, 8(4), 310-325. Waynesville, NC: Society for Information Technology & Teacher Education. Retrieved August 25, 2019.


