Showing all 10 results
Peer reviewed
PDF on ERIC: Download full text
Wenyi Lu; Joseph Griffin; Troy D. Sadler; James Laffey; Sean P. Goggins – Journal of Learning Analytics, 2025
Game-based learning (GBL) is increasingly recognized as an effective tool for teaching diverse skills, particularly in science education, due to its interactive, engaging, and motivational qualities, along with timely assessments and intelligent feedback. However, more empirical studies are needed to facilitate its wider application in school…
Descriptors: Game Based Learning, Predictor Variables, Evaluation Methods, Educational Games
Peer reviewed
Direct link
Abdu, Rotem; Olsher, Shai; Yerushalmy, Michal – Digital Experiences in Mathematics Education, 2022
This article queries how learning analytics systems can support content-specific group formation to develop students' thinking about a specific mathematical concept. Automated group formation requires identifying personal characteristics, designing tasks to probe students' perceptions, and grouping them to increase individual learning chances.…
Descriptors: Teaching Methods, Grouping (Instructional Purposes), Learning Analytics, Mathematics Education
Peer reviewed
Direct link
Emily K. Toutkoushian; Kihyun Ryoo – Measurement: Interdisciplinary Research and Perspectives, 2024
The Next Generation Science Standards (NGSS) delineate three interrelated dimensions that describe what students should know and how they should engage in science learning. These present significant challenges for assessment because traditional assessments may not be able to capture the ways in which students engage with content. Science…
Descriptors: Middle School Students, Academic Standards, Science Education, Learner Engagement
Peer reviewed
PDF on ERIC: Download full text
Congning Ni; Bhashithe Abeysinghe; Juanita Hicks – International Electronic Journal of Elementary Education, 2025
The National Assessment of Educational Progress (NAEP), often referred to as The Nation's Report Card, offers a window into the state of the U.S. K-12 education system. Since 2017, NAEP has transitioned to digital assessments, opening new research opportunities that were previously impossible. Process data tracks students' interactions with the…
Descriptors: Reaction Time, Multiple Choice Tests, Behavior Change, National Competency Tests
Peer reviewed
PDF on ERIC: Download full text
Erkan Er; Safak Silik; Sergen Cansiz – Journal of Learning Analytics, 2024
E-learning platforms have become increasingly popular in K-8 education to promote student learning and enhance classroom teaching. Student interactions with these platforms produce trace data, which are digital records of learning processes. Although trace data have been effective in identifying learners' engagement profiles in higher education…
Descriptors: Foreign Countries, Elementary Secondary Education, Grade 1, Grade 2
Peer reviewed
Direct link
Jiang, Yang; Gong, Tao; Saldivia, Luis E.; Cayton-Hodges, Gabrielle; Agard, Christopher – Large-scale Assessments in Education, 2021
In 2017, the mathematics assessments that are part of the National Assessment of Educational Progress (NAEP) program underwent a transformation shifting the administration from paper-and-pencil formats to digitally-based assessments (DBA). This shift introduced new interactive item types that bring rich process data and tremendous opportunities to…
Descriptors: Data Use, Learning Analytics, Test Items, Measurement
Peer reviewed
PDF on ERIC: Download full text
Levin, Nathan A. – Journal of Educational Data Mining, 2021
The Big Data for Education Spoke of the NSF Northeast Big Data Innovation Hub and ETS co-sponsored an educational data mining competition in which contestants were asked to predict efficient time use on the NAEP 8th grade mathematics computer-based assessment, based on the log file of a student's actions on a prior portion of the assessment. In…
Descriptors: Learning Analytics, Data Collection, Competition, Prediction
Peer reviewed
PDF on ERIC: Download full text
Bosch, Nigel – Journal of Educational Data Mining, 2021
Automatic machine learning (AutoML) methods automate the time-consuming feature-engineering process so that researchers can produce accurate student models more quickly and easily. In this paper, we compare two AutoML feature engineering methods in the context of the National Assessment of Educational Progress (NAEP) data mining competition. The…
Descriptors: Accuracy, Learning Analytics, Models, National Competency Tests
Peer reviewed
PDF on ERIC: Download full text
Guo, Hongwen; Zhang, Mo; Deane, Paul; Bennett, Randy E. – Journal of Educational Data Mining, 2020
This study investigates the effects of a scenario-based assessment design on students' writing processes. An experimental data set consisting of four design conditions was used in which the number of scenarios (one or two) and the placement of the essay task with respect to the lead-in tasks (first vs. last) were varied. Students' writing…
Descriptors: Instructional Effectiveness, Vignettes, Writing Processes, Learning Analytics
Peer reviewed
Direct link
Tate, Tamara P.; Warschauer, Mark – Technology, Knowledge and Learning, 2019
The quality of students' writing skills continues to concern educators. Because writing is essential to success in both college and career, poor writing can have lifelong consequences. Writing is now primarily done digitally, but students receive limited explicit instruction in digital writing. This lack of instruction means that students fail to…
Descriptors: Writing Tests, Computer Assisted Testing, Writing Skills, Writing Processes