Effective web videoconferencing for proctoring online oral exams: a case study at scale in Brazil
ARTICLE

Alexandra Okada & Peter Scott, The Open University, UK; M. Mendonça, UNISUL

Open Praxis Volume 7, Number 3, ISSN 1369-9997. Publisher: International Council for Open and Distance Education

Abstract

Assessing formal and informal online learning at scale raises various challenges. Many universities that now promote “Massive Open Online Courses” (MOOCs), for instance, focus on relatively informal assessment of participant competence, which is not highly ‘quality assured’. This paper reports best practices on the use of a web videoconferencing application to quality-assure student assignments through online oral examination at scale. In this case study, we examine the use of a simple online conferencing technology, FlashMeeting (FM), by a Brazilian university to provide ‘quality assurance’ in the assessment of twelve online postgraduate courses in Law for 20,000 students. Our research questions address the benefits of, and recommendations for, using FM in online oral exams at scale. Our qualitative and quantitative data analysis centres on 3,462 short-format interviews conducted through FM for this purpose by a group of around fifty assessors from September 2008 to September 2012. The effective use of FM provided quality assurance, recognised by the institution, with respect to students’ identity, their knowledge, and their ownership of the written work. The key benefits identified from the perspective of assessors and students were: reliable examination, credible technology, authentic assessment, an interactive e-Viva, low cost, a scalable process, and practical testing in terms of time, effort and money.

Citation

Okada, A., Scott, P. & Mendonça, M. (2015). Effective web videoconferencing for proctoring online oral exams: a case study at scale in Brazil. Open Praxis, 7(3), 227–242. International Council for Open and Distance Education. Retrieved December 13, 2018.

This record was imported from OpenPraxis on August 2, 2015.


References

  1. Canessa, E., Tenze, L. & Salvatori, E. (2013). Attendance to Massive Open On-line Courses: Towards a Solution to Track on-line Recorded Lectures Viewing. Bulletin of the IEEE Technical Committee on Learning Technology, 15(1), 36–39. Retrieved from http://www.ieeetclt.org/issues/January2013/Canessa.pdf
  2. Gaytan, J. (2005). Effective assessment techniques for online instruction. Information Technology, Learning, and Performance Journal, 23(1), 25–33.
  3. Harmon, O. & Lambrinos, J. (2008). Are Online Exams an Invitation to Cheat? The Journal of Economic Education, 39(2), 116–125. http://dx.doi.org/10.3200/JECE.39.2.116-125
  4. Hollister, K.K. & Berenson, M.L. (2009). Proctored Versus Unproctored Online Exams: Studying the Impact of Exam Environment on Student Performance. Decision Sciences Journal of Innovative Education, 7(1), 271–294. http://dx.doi.org/10.1111/j.1540-4609.2008.00220.x
  5. Hopkins, J. (2011). Assessed real-time language learning tasks online: how do learners prepare? eLC Research Paper Series, 2, 59–72. Retrieved from http://elcrps.uoc.edu/index.php/elcrps/article/view/n2-hopkins/n2-hopkins
  6. Kim, J. & Craig, D.A. (2012). Validation of a videoconferenced speaking test. Computer Assisted Language Learning, 25(3), 257–275. http://dx.doi.org/10.1080/09588221.2011.649482
  7. Okada, A., Tomadaki, E., Buckingham Shum, S. & Scott, P. (2008). Fostering Open Sensemaking Communities by Combining Knowledge Maps and Videoconferencing. UPGRADE, The European Journal for the Informatics Professional, 9(3), 27–36. Retrieved from http://www.cepis.org/Upgrade/files/2008-III-scott.pdf
  8. Perna, L.W., Ruby, A., Boruch, R.F., Wang, N., Scull, J., Ahmad, S. & Evans, C. (2014). Moving through MOOCs: Understanding the progression of users in massive open online courses. Educational Researcher, 43(9), 421–432. http://dx.doi.org/10.3102/0013189X14562423
  9. Picciano, A.G. (2002). Beyond student perceptions: Issues of interaction, presence, and performance in an online course. Journal of Asynchronous Learning Networks, 6(1), 21–40.
  10. Robles, M. & Braathen, S. (2002). Online assessment techniques. The Delta Pi Epsilon Journal, 44(1), 39–49.
  11. Scott, P., Castañeda, L., Quick, K. & Linney, J. (2009). Synchronous symmetrical support: a naturalistic study of live online peer-to-peer learning via software videoconferencing. Interactive Learning Environments, 17(2), 119–134. http://dx.doi.org/10.1080/10494820701794730
  12. Scott, P., Tomadaki, E. & Quick, K. (2007). The Shape of Live On-line Meetings. The International Journal of Technology, Knowledge and Society, 3(4). Retrieved from http://flashmeeting.open.
  13. Wellman, G. (2005). Comparing Learning Style to Performance in On-Line Teaching: Impact of Proctored V. Un-Proctored Testing. Journal of Interactive Online Learning, 4(1), 20–39. Retrieved from http://www.ncolr.org/jiol/issues/pdf/4.1.2.pdf
  14. Wodehouse, A., Breslin, C., Eris, O., Grierson, H., Ion, W., Jung, M., Juster, N., Leifer, L., Mabogunje, A. & Sonalkar, N. (2007). A reflective approach to learning in a global design project. International Conference on Engineering and Product Design Education, 13–14 September 2007, Northumbria University, Newcastle upon Tyne, United Kingdom.
  15. Wynne, L. & Lopes, S. (2006). Implementing Large Scale Assessment Programmes. In M. Danson (ed.), Proceedings of the 10th CAA International Computer Assisted Assessment Conference, 4 & 5 July 2006, Loughborough University (pp. 525–526).
