Showing all 8 results
Weiling Li; Aaron Butler; Catherine Oberle; Anabil Munshi; Amy J. Dray – Online Submission, 2023
Edmentum offers a personalized learning platform called Exact Path. This quasi-experimental study, designed to meet ESSA Tier 2 evidence and What Works Clearinghouse standards with reservations, aimed to assess the efficacy of Exact Path in a district from the Midwestern United States. The goal was to provide specific recommendations to educators…
Descriptors: Achievement Tests, Scores, Educational Technology, Reading Achievement
Peer reviewed
Aspiranti, Kathleen B.; Henze, Erin E. C.; Reynolds, Jennifer L. – School Psychology Review, 2020
Curriculum-based measurement (CBM) tools are increasingly administered through technology-based modalities such as computers and tablets. Two studies were conducted to examine whether students perform similarly on paper-based and tablet-based math fact probes. Ten students completed 1-min addition or multiplication math probes using a single-case…
Descriptors: Mathematics Tests, Test Format, Computer Assisted Testing, Handheld Devices
Swain, Matthew; Randel, Bruce; Norman Dvorak, Rebecca – Human Resources Research Organization (HumRRO), 2019
The purpose of this study was to evaluate the impact of using a combination of three Curriculum Associates' mathematics products: (a) "i-Ready® Diagnostic," (b) "i-Ready® Instruction," and (c) "Ready® Mathematics Core Curriculum" over use of "i-Ready Diagnostic" alone at grades K-5. Use of all three products…
Descriptors: Program Effectiveness, Mathematics Instruction, Elementary School Mathematics, Mathematics Achievement
Yue Huang – ProQuest LLC, 2023
Automated writing evaluation (AWE) is a cutting-edge technology-based intervention designed to help teachers meet their challenges in writing classrooms and improve students' writing proficiency. The fast development of AWE systems, along with the encouragement of technology use in the U.S. K-12 education system by the Common Core State Standards…
Descriptors: Computer Assisted Testing, Writing Tests, Automation, Writing Evaluation
Liu, Junhui; Brown, Terran; Chen, Jianshen; Ali, Usama; Hou, Likun; Costanzo, Kate – Partnership for Assessment of Readiness for College and Careers, 2016
The Partnership for Assessment of Readiness for College and Careers (PARCC) is a state-led consortium working to develop next-generation assessments that measure student progress toward college and career readiness more accurately than previous assessments. The PARCC assessments include both English Language Arts/Literacy (ELA/L) and…
Descriptors: Testing, Achievement Tests, Test Items, Test Bias
Peer reviewed
Nelson, Peter M.; Parker, David C.; Zaslofsky, Anne F. – Assessment for Effective Intervention, 2016
The purpose of the current study was to evaluate the importance of growth in math fact skills within the context of overall math proficiency. Data for 1,493 elementary and middle school students were included for analysis. Regression models were fit to examine the relative value of math fact fluency growth, prior state test performance, and a fall…
Descriptors: Mathematics, Mathematics Instruction, Mathematics Skills, Mathematics Achievement
Rogers, Angela – Mathematics Education Research Group of Australasia, 2013
As we move into the 21st century, educationalists are exploring the myriad possibilities associated with Computer Based Assessment (CBA). At first glance this mode of assessment seems to provide many exciting opportunities in the mathematics domain, yet one must question the validity of CBA and whether our school systems, students and teachers…
Descriptors: Mathematics Tests, Student Evaluation, Computer Assisted Testing, Test Validity
Peer reviewed
Randolph, Justus J.; Virnes, Marjo; Jormanainen, Ilkka; Eronen, Pasi J. – Educational Technology & Society, 2006
Although computer-assisted interview tools have much potential, little empirical evidence on the quality and quantity of data generated by these tools has been collected. In this study we compared the effects of using Virre, a computer-assisted self-interview tool, with the effects of using other data collection methods, such as written responding…
Descriptors: Computer Science Education, Effect Size, Data Collection, Computer Assisted Testing