
Randomised Items in Computer-Based Tests: Russian Roulette in Assessment?
ARTICLE

Journal of Educational Technology & Society, Volume 11, Number 4, ISSN 1176-3647

Abstract

Computer-based assessments are becoming more commonplace, perhaps as a necessity for faculty to cope with large class sizes. These tests often occur in large computer testing venues in which test security may be compromised. In an attempt to limit the likelihood of cheating in such venues, randomised presentation of items is automatically programmed into testing software, so that neighbouring screens present different items to the test-taker. This article argues that randomisation of test items can disadvantage students who are randomly presented with difficult items first. Such disadvantage would violate the American Psychological Association's published guidelines on testing and assessment, which call for fairness for test-takers across diverse test modes. Because the chance of any one student being randomly assigned difficult items first is small, such disadvantage may be hard to prove. However, even if only one test-taker is affected once during a high-stakes test, the principle of fairness is compromised. This article reports on four instances out of about 400 in which students may have been unfairly advantaged or disadvantaged by being given a series of easy or difficult items at the beginning of the test. Although the results are not statistically significant, we conclude that more research needs to be done before one can ignore what we have named the Item Randomisation Effect. (Contains 1 table and 4 figures.)
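To make the scale of the risk concrete, the following is a minimal Monte Carlo sketch of the scenario the abstract describes. All parameters (test length, number of difficult items, length of the "hard start" run, cohort size, trial count) are illustrative assumptions and are not taken from the study; the cohort size of 400 merely echoes the article's "about 400" tests.

```python
import random

# Illustrative parameters (assumptions, not values from the article)
N_ITEMS = 20      # total items on the test
N_HARD = 5        # items classified as "difficult"
LEAD_RUN = 3      # a "hard start" = first LEAD_RUN items are all difficult
N_STUDENTS = 400  # cohort size, loosely matching the article's ~400 tests
TRIALS = 200_000  # number of simulated randomised item orders

items = ["hard"] * N_HARD + ["easy"] * (N_ITEMS - N_HARD)

def hard_start(order, k=LEAD_RUN):
    """True if the first k items presented are all difficult."""
    return all(item == "hard" for item in order[:k])

# Estimate the per-student probability of a hard start under random ordering
hits = sum(hard_start(random.sample(items, len(items))) for _ in range(TRIALS))
p_student = hits / TRIALS

# Probability that at least one student in the cohort is affected,
# assuming independent shuffles per student
p_cohort = 1 - (1 - p_student) ** N_STUDENTS

print(f"Per-student probability of a hard start: {p_student:.4f}")
print(f"Probability that at least one of {N_STUDENTS} students is affected: {p_cohort:.3f}")
```

With these assumed numbers the per-student chance is below one percent, yet across a cohort of 400 independent shuffles it becomes very likely that at least one test-taker begins with a run of difficult items, which is why an individually rare event can still raise a fairness concern.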

Citation

Marks, A.M. & Cronje, J.C. (2008). Randomised Items in Computer-Based Tests: Russian Roulette in Assessment? Journal of Educational Technology & Society, 11(4), 41-50.


Keywords