Validating Automated Essay Scoring for Online Writing Placement

Assessing Writing Volume 18, Number 1, ISSN 1075-2935


In this paper, I describe the design and evaluation of automated essay scoring (AES) models for an institution's writing placement program. Information was gathered on admitted student writing performance at a science and technology research university in the northeastern United States. Under timed conditions, first-year students (N = 879) were assigned to write essays on two persuasive prompts within the "Criterion"[R] Online Writing Evaluation Service at the beginning of the semester. AES models were built and evaluated for a total of four prompts. AES models meeting recommended performance criteria were then compared to standardized admissions measures and locally developed writing measures. Results suggest that there is evidence to support the use of "Criterion" as part of the placement process at the institution. (Contains 12 tables.)


Ramineni, C. (2013). Validating Automated Essay Scoring for Online Writing Placement. Assessing Writing, 18(1), 40-61. Retrieved January 27, 2022.

This record was imported from ERIC on April 18, 2013.

ERIC is sponsored by the Institute of Education Sciences (IES) of the U.S. Department of Education.

Copyright for this record is held by the content creator. For more details see ERIC's copyright policy.


Cited By

  • Impact of AWE Rubrics and Automated Assessment on EFL Writing Instruction

    Jinlan Tang, School of Online and Continuing Education, Beijing Foreign Studies University, Beijing, China; Yi'an Wu, School of English and International Studies, Beijing Foreign Studies University, Beijing, China

    International Journal of Computer-Assisted Language Learning and Teaching Vol. 7, No. 2 (April 2017) pp. 58–74
