From Log Files to Assessment Metrics: Measuring Students' Science Inquiry Skills Using Educational Data Mining
ARTICLE
Janice D. Gobert, Michael Sao Pedro, Juelaila Raziuddin, Ryan S. Baker
Journal of the Learning Sciences, Volume 22, Number 4, ISSN 1050-8406
Abstract
We present a method for assessing science inquiry performance, specifically for the inquiry skill of designing and conducting experiments, using educational data mining on students' log data from online microworlds in the Inq-ITS system (Inquiry Intelligent Tutoring System; www.inq-its.org). In our approach, we use a 2-step process: First, we use text replay tagging, a type of rapid protocol analysis in which categories are developed and, in turn, used to hand-score students' log data. In the second step, educational data mining is conducted using a combination of the text replay data and machine-distilled features of student interactions in order to produce an automated means of assessing the inquiry skill in question; this is referred to as a "detector." Once this detector is appropriately validated, it can be applied to students' log files for auto-assessment and, in the future, to drive scaffolding in real time. Furthermore, we present evidence that this detector, developed in 1 scientific domain, phase change, can be used, with no modification or retraining, to effectively detect science inquiry skill in another scientific domain, density.
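To make the 2-step process above concrete, the sketch below is illustrative only; it is not the authors' code, feature set, or classifier. It pairs synthetic, machine-distilled features of student log clips (hypothetical examples: trials run, variables changed between trials, pause before running a trial) with synthetic stand-ins for hand-applied text replay tags, trains a decision-tree "detector" on a phase-change-style training set, and then applies it unchanged to a second synthetic "density" set to mimic the cross-domain transfer check described in the abstract. The feature names, the data generator, and the choice of a scikit-learn decision tree are all assumptions for illustration.

```python
"""Illustrative sketch (not the authors' implementation) of building a
science-inquiry "detector" from hand-tagged log clips and machine-distilled
features, then applying it without retraining to a second domain."""
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score
from sklearn.metrics import cohen_kappa_score, roc_auc_score

rng = np.random.default_rng(0)

def distill_features(n_clips, skilled_rate):
    """Generate synthetic per-clip features standing in for quantities
    distilled from microworld log files, plus synthetic text-replay tags.
    All feature definitions here are assumptions for illustration."""
    skilled = rng.random(n_clips) < skilled_rate
    features = np.column_stack([
        rng.poisson(6, n_clips) + 3 * skilled,       # number of trials run
        rng.poisson(1, n_clips) + 2 * (~skilled),    # variables changed between trials
        rng.normal(20, 5, n_clips) + 5 * skilled,    # mean pause (s) before running a trial
    ])
    labels = skilled.astype(int)  # 1 = clip hand-tagged as showing the inquiry skill
    return features, labels

# Step 1 stand-in: machine-distilled features plus hand-applied text replay
# tags for the training domain (phase change in the article).
X_phase, y_phase = distill_features(n_clips=400, skilled_rate=0.5)

# Step 2: train and cross-validate the detector on the training domain.
detector = DecisionTreeClassifier(max_depth=4, random_state=0)
cv_auc = cross_val_score(detector, X_phase, y_phase, cv=6, scoring="roc_auc")
print(f"Cross-validated AUC, training domain: {cv_auc.mean():.2f}")

detector.fit(X_phase, y_phase)

# Transfer check: apply the detector, with no retraining, to clips from a
# second domain (density in the article) and compare against hand labels.
X_density, y_density = distill_features(n_clips=200, skilled_rate=0.4)
pred = detector.predict(X_density)
prob = detector.predict_proba(X_density)[:, 1]
print(f"Transfer kappa: {cohen_kappa_score(y_density, pred):.2f}")
print(f"Transfer AUC:   {roc_auc_score(y_density, prob):.2f}")
```

In this sketch the decision tree and the scikit-learn evaluation metrics are stand-ins; the article's actual detectors may use different learners, features, and validation procedures.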
Citation
Gobert, J.D., Sao Pedro, M., Raziuddin, J. & Baker, R.S. (2013). From Log Files to Assessment Metrics: Measuring Students' Science Inquiry Skills Using Educational Data Mining. Journal of the Learning Sciences, 22(4), 521-563. Retrieved March 5, 2021 from https://www.learntechlib.org/p/154329/.

Keywords
- Coding
- Construct Validity
- Data Collection
- Educational Technology
- Grade 8
- Inquiry
- Intelligent Tutoring Systems
- Knowledge Level
- Learning Activities
- Middle School Students
- Models
- Performance Based Assessment
- Predictive Validity
- Pretests Posttests
- Reliability
- Science Education
- Science Experiments
- Science Process Skills
- Scoring
- Simulated Environment
- Student Behavior
- Virtual Classrooms
Cited By
- Doleck, T., Poitras, E. & Lajoie, S. (2017). Exploring Case Specificity in Medical Students' Clinical Reasoning. E-Learn: World Conference on E-Learning in Corporate, Government, Healthcare, and Higher Education 2017 (Oct 17, 2017), pp. 572–577.
- Code, J. & Zap, N. (2017). Assessment in Immersive Virtual Environments: Cases for Learning, of Learning, and as Learning. Journal of Interactive Learning Research, 28(3), 235–248.