The Effects of Citing Peer-Generated Questions during an Online Student Test-Construction Learning Task

F.Y. Yu, National Cheng Kung University, Taiwan; J.K. Wei, National Cheng Kung University, Taiwan

EdMedia + Innovate Learning 2017, Washington, DC. ISBN: 978-1-939797-29-2. Publisher: Association for the Advancement of Computing in Education (AACE), Waynesville, NC


The effects of citing peer-generated questions during an online student test-construction learning task on student question-generation performance and cognitive load were examined. A pretest-posttest quasi-experimental design was adopted. Six fifth-grade classes (N=165) were randomly assigned to two treatment groups (i.e., the citing and no-citing groups). An online learning system was used to support the associated tasks for eleven weeks. The data were examined using analysis of covariance, and the results showed significant differences in student question-generation performance between the two treatment groups, with the citing group scoring significantly higher than the no-citing group. Nevertheless, no statistically significant differences in cognitive load were found between the two groups. The significance of this study and suggestions for instructors and online system developers are provided.


Yu, F.Y. & Wei, J.K. (2017). The Effects of Citing Peer-Generated Questions during an Online Student Test-Construction Learning Task. In J. Johnston (Ed.), Proceedings of EdMedia 2017 (pp. 1046-1049). Washington, DC: Association for the Advancement of Computing in Education (AACE). Retrieved December 18, 2018.



