
The Effects of Item Preview on Video-Based Multiple-Choice Listening Assessments
ARTICLE

Koyama, D., Sun, A., & Ockey, G. J.

Language Learning & Technology, Volume 20, Number 1, ISSN 1094-3501

Abstract

Multiple-choice formats remain a popular design for assessing listening comprehension, yet no consensus has been reached on how they should be employed. Some researchers argue that test takers must be provided with a preview of the items prior to the input (Buck, 1995; Sherman, 1997); others argue that a preview may decrease the authenticity of the task by changing the way the input is processed (Hughes, 2003). Using stratified random sampling techniques, more and less proficient Japanese university English learners (N = 206) were assigned to one of three test conditions: preview of question stem and answer options (n = 67), preview of question stem only (n = 70), and no preview (n = 69). A two-way ANOVA, with test condition and listening proficiency level as independent variables and score on the multiple-choice listening test as the dependent variable, indicated that the amount of item preview affected test scores but did not affect high- and low-proficiency students' scores differently. Item-level analysis identified items that were harder or easier than expected for one or more of the conditions, and the researchers posit three possible sources for these unexpected findings: 1) frequency of options in the input, 2) location of item focus, and 3) presence of organizational markers.
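As a point of reference, the analysis described above (a two-way ANOVA with test condition and listening proficiency level as between-subjects factors and multiple-choice listening test score as the dependent variable) can be sketched in a few lines of Python. The sketch below uses simulated placeholder data and hypothetical variable names (condition, proficiency, score); it is illustrative only and does not reproduce the authors' data or analysis.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    from statsmodels.formula.api import ols

    # Simulated (not actual) data: 206 learners split across the three preview
    # conditions and two proficiency strata described in the abstract.
    rng = np.random.default_rng(0)
    n = 206
    df = pd.DataFrame({
        "condition": rng.choice(
            ["stem_and_options", "stem_only", "no_preview"], size=n),
        "proficiency": rng.choice(["high", "low"], size=n),
        "score": rng.normal(loc=20, scale=4, size=n),  # placeholder scores
    })

    # Two-way ANOVA: main effects of preview condition and proficiency,
    # plus their interaction.
    model = ols("score ~ C(condition) * C(proficiency)", data=df).fit()
    print(sm.stats.anova_lm(model, typ=2))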

Citation

Koyama, D., Sun, A., & Ockey, G. J. (2016). The Effects of Item Preview on Video-Based Multiple-Choice Listening Assessments. Language Learning & Technology, 20(1), 148-165.

This record was imported from ERIC on January 10, 2019.

ERIC is sponsored by the Institute of Education Sciences (IES) of the U.S. Department of Education.

Copyright for this record is held by the content creator. For more details see ERIC's copyright policy.

Keywords