ERIC Number: ED577318
Record Type: Non-Journal
Publication Date: 2013-Apr
Pages: 5
Abstractor: As Provided
ISBN: N/A
ISSN: N/A
EISSN: N/A
Available Date: N/A
What Different Kinds of Stratification Can Reveal about the Generalizability of Data-Mined Skill Assessment Models
Sao Pedro, Michael A.; Baker, Ryan S. J. d.; Gobert, Janice D.
Grantee Submission, Paper presented at the Conference on Learning Analytics and Knowledge (LAK) (Leuven, Belgium, Apr 8-12, 2013)
When validating assessment models built with data mining, generalization is typically tested at the student level, where models are tested on new students. This approach, however, may miss cases where model performance suffers because other aspects of those cases relevant to prediction are not well represented. We explore this here by testing whether scientific inquiry skill models built and validated for one science topic can predict skill demonstration for new students and a new science topic. Test cases were chosen using two methods: student-level stratification, and stratification based on the number of trials run during students' experimentation. We found that the predictive performance of the models differed on each test set, revealing limitations that would have been missed by student-level validation alone.
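As a rough illustration only (not the authors' code), the two test-set constructions contrasted in the abstract might be sketched in scikit-learn as follows. Everything here is hypothetical: the student count, the placeholder features and labels, and the trial-count bin thresholds are assumptions chosen for the sketch.

    # A minimal sketch, assuming a toy dataset where each row is one
    # inquiry-task attempt with a student ID and a count of trials run.
    import numpy as np
    from sklearn.model_selection import GroupKFold, StratifiedKFold

    rng = np.random.default_rng(0)
    n = 200
    student_ids = rng.integers(0, 40, size=n)   # 40 hypothetical students
    n_trials = rng.integers(1, 12, size=n)      # trials run per attempt
    X = rng.normal(size=(n, 5))                 # placeholder features
    y = rng.integers(0, 2, size=n)              # skill demonstrated (0/1)

    # 1) Student-level stratification: all attempts by a held-out student
    #    go to the test fold, so models are always tested on new students.
    for train_idx, test_idx in GroupKFold(n_splits=5).split(X, y, groups=student_ids):
        assert set(student_ids[train_idx]).isdisjoint(student_ids[test_idx])

    # 2) Stratification on number of trials: bin trial counts so each fold
    #    preserves the mix of low/medium/high experimenters, exposing cases
    #    that a purely student-level split can under-represent.
    trial_bins = np.digitize(n_trials, bins=[4, 8])   # assumed bin edges
    skf = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
    for train_idx, test_idx in skf.split(X, trial_bins):
        pass  # train and evaluate the skill model on each stratified split

The point of the contrast is that the grouped split guarantees unseen students, while the trial-count split guarantees the test set covers the range of experimentation behaviors; the paper's finding is that these two test sets can yield different estimates of model performance.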
Descriptors: Educational Research, Data Collection, Data Analysis, Generalizability Theory, Models, Science Process Skills, Inquiry, Prediction, Experiments, Simulation, Validity, Coding, Performance Based Assessment, Hypothesis Testing, Generalization, Physical Sciences, Scientific Concepts, Grade 8
Publication Type: Reports - Research; Speeches/Meeting Papers
Education Level: Grade 8
Audience: N/A
Language: English
Sponsor: National Science Foundation (NSF); National Center for Education Research (ED)
Authoring Institution: N/A
Identifiers - Location: Massachusetts
IES Funded: Yes
Grant or Contract Numbers: NSFDRL0733286; NSFDRL1008649; NSFDGE0742503; R305A090170; R305A120778
Author Affiliations: N/A