Publication Date
| In 2026 | 0 |
| Since 2025 | 4 |
| Since 2022 (last 5 years) | 10 |
| Since 2017 (last 10 years) | 17 |
| Since 2007 (last 20 years) | 17 |
Descriptor
| Learning Analytics | 17 |
| Test Items | 17 |
| Item Analysis | 9 |
| Computer Assisted Testing | 8 |
| Mathematics Tests | 7 |
| Computer Software | 6 |
| Multiple Choice Tests | 6 |
| Scores | 6 |
| Difficulty Level | 5 |
| Foreign Countries | 4 |
| Decision Making | 3 |
Publication Type
| Reports - Research | 17 |
| Journal Articles | 13 |
| Speeches/Meeting Papers | 2 |
| Numerical/Quantitative Data | 1 |
Education Level
| Secondary Education | 5 |
| Higher Education | 4 |
| Junior High Schools | 4 |
| Middle Schools | 4 |
| Postsecondary Education | 4 |
| Elementary Education | 3 |
| Grade 8 | 2 |
| High Schools | 2 |
| Early Childhood Education | 1 |
| Grade 4 | 1 |
| Intermediate Grades | 1 |
Location
| Africa | 1 |
| Alabama | 1 |
| California | 1 |
| France | 1 |
| Ghana | 1 |
| Iran | 1 |
| Kansas | 1 |
| Maine | 1 |
| Missouri | 1 |
| Nigeria | 1 |
| Pennsylvania | 1 |
Assessments and Surveys
| National Assessment of… | 3 |
Eva de Schipper; Remco Feskens; Franck Salles; Saskia Keskpaik; Reinaldo dos Santos; Bernard Veldkamp; Paul Drijvers – Large-scale Assessments in Education, 2025
Background: Students take many tests and exams during their school career, but they usually receive feedback about their test performance based only on an analysis of the item responses. With the increase in digital assessment, other data have become available for analysis as well, such as log data of student actions in online assessment…
Descriptors: Problem Solving, Mathematics Instruction, Learning Analytics, Identification
Lucia M. Reyes; Michael A. Cook; Steven M. Ross – Center for Research and Reform in Education, 2025
In March of 2025, brightwheel, a San Francisco-based educational technology company, partnered with the Center for Research and Reform in Education (CRRE) at Johns Hopkins University to test brightwheel's product, the Experience Assessment. The assessment was designed to provide early childhood educators with an objective and systematic way to…
Descriptors: Psychometrics, Educational Technology, Early Childhood Education, Young Children
Owen Henkel; Hannah Horne-Robinson; Maria Dyshel; Greg Thompson; Ralph Abboud; Nabil Al Nahin Ch; Baptiste Moreau-Pernet; Kirk Vanacore – Journal of Learning Analytics, 2025
This paper introduces AMMORE, a new dataset of 53,000 math open-response question-answer pairs from Rori, a mathematics learning platform used by middle and high school students in several African countries. Using this dataset, we conducted two experiments to evaluate the use of large language models (LLMs) for grading particularly challenging…
Descriptors: Learning Analytics, Learning Management Systems, Mathematics Instruction, Middle School Students
Marli Crabtree; Kenneth L. Thompson; Ellen M. Robertson – HAPS Educator, 2024
Research has suggested that changing one's answer on multiple-choice examinations is more likely to lead to positive academic outcomes. This study aimed to further understand the relationship between changing answer selections and item attributes, student performance, and time within a population of 158 first-year medical students enrolled in a…
Descriptors: Anatomy, Science Tests, Medical Students, Medical Education
Lozano, José H.; Revuelta, Javier – Educational and Psychological Measurement, 2023
The present paper introduces a general multidimensional model to measure individual differences in learning within a single administration of a test. Learning is assumed to result from practicing the operations involved in solving the items. The model accounts for the possibility that the ability to learn may manifest differently for correct and…
Descriptors: Bayesian Statistics, Learning Processes, Test Items, Item Analysis
Congning Ni; Bhashithe Abeysinghe; Juanita Hicks – International Electronic Journal of Elementary Education, 2025
The National Assessment of Educational Progress (NAEP), often referred to as The Nation's Report Card, offers a window into the state of the U.S. K-12 education system. Since 2017, NAEP has transitioned to digital assessments, opening research opportunities that were previously unavailable. Process data tracks students' interactions with the…
Descriptors: Reaction Time, Multiple Choice Tests, Behavior Change, National Competency Tests
Jiang, Yang; Gong, Tao; Saldivia, Luis E.; Cayton-Hodges, Gabrielle; Agard, Christopher – Large-scale Assessments in Education, 2021
In 2017, the mathematics assessments that are part of the National Assessment of Educational Progress (NAEP) program underwent a transformation shifting the administration from paper-and-pencil formats to digitally-based assessments (DBA). This shift introduced new interactive item types that bring rich process data and tremendous opportunities to…
Descriptors: Data Use, Learning Analytics, Test Items, Measurement
Patel, Nirmal; Sharma, Aditya; Shah, Tirth; Lomas, Derek – Journal of Educational Data Mining, 2021
Process Analysis is an emerging approach to discover meaningful knowledge from temporal educational data. The study presented in this paper shows how we used Process Analysis methods on the National Assessment of Educational Progress (NAEP) test data for modeling and predicting student test-taking behavior. Our process-oriented data exploration…
Descriptors: Learning Analytics, National Competency Tests, Evaluation Methods, Prediction
Azevedo, Jose Manuel; Oliveira, Ema P.; Beites, Patrícia Damas – International Journal of Information and Learning Technology, 2019
Purpose: The purpose of this paper is to find appropriate forms of analysis of multiple-choice questions (MCQ) to obtain an assessment method that is as fair as possible for students. The authors intend to ascertain whether it is possible to control the quality of the MCQ contained in a bank of questions, implemented in Moodle, presenting some evidence…
Descriptors: Learning Analytics, Multiple Choice Tests, Test Theory, Item Response Theory
Paaßen, Benjamin; Dywel, Malwina; Fleckenstein, Melanie; Pinkwart, Niels – International Educational Data Mining Society, 2022
Item response theory (IRT) is a popular method to infer student abilities and item difficulties from observed test responses. However, IRT struggles with two challenges: How to map items to skills if multiple skills are present? And how to infer the ability of new students that have not been part of the training data? Inspired by recent advances…
Descriptors: Item Response Theory, Test Items, Item Analysis, Inferences
Mehri Izadi; Maliheh Izadi; Farrokhlagha Heidari – Education and Information Technologies, 2024
In today's environment of growing class sizes due to the prevalence of online and e-learning systems, providing one-to-one instruction and feedback has become a challenging task for teachers. Nevertheless, the dialectical integration of instruction and assessment into a seamless and dynamic activity can provide a continuous flow of assessment…
Descriptors: Adaptive Testing, Computer Assisted Testing, English (Second Language), Second Language Learning
Kevin Hirschi; Okim Kang – TESOL Quarterly: A Journal for Teachers of English to Speakers of Other Languages and of Standard English as a Second Dialect, 2024
Issues of intelligibility may arise amongst English learners when acquiring new words and phrases in North American academic settings, perhaps in part due to limited linguistic data available to the learner for understanding language use patterns. To this end, this paper examines the effects of Data-Driven Learning for Pronunciation (DDLfP) on…
Descriptors: English for Academic Purposes, Second Language Learning, Second Language Instruction, Phonology
Reddick, Rachel – International Educational Data Mining Society, 2019
One significant challenge in the field of measuring ability is measuring the current ability of a learner while they are learning. Many forms of inference become computationally complex in the presence of time-dependent learner ability, and are not feasible to implement in an online context. In this paper, we demonstrate an approach which can…
Descriptors: Measurement Techniques, Mathematics, Assignments, Learning
Moro, Sérgio; Martins, António; Ramos, Pedro; Esmerado, Joaquim; Costa, Joana Martinho; Almeida, Daniela – Computers in the Schools, 2020
Many university programs include Microsoft Excel courses given their value as a scientific and technical tool. However, evaluating what is effectively learned by students is a challenging task. Considering multiple-choice written exams are a standard evaluation format, this study aimed to uncover the features influencing students' success in…
Descriptors: Multiple Choice Tests, Test Items, Spreadsheets, Computer Software
Pelánek, Radek; Effenberger, Tomáš; Kukucka, Adam – Journal of Educational Data Mining, 2022
We study the automatic identification of educational items worthy of content authors' attention. Based on the results of such analysis, content authors can revise and improve the content of learning environments. We provide an overview of item properties relevant to this task, including difficulty and complexity measures, item discrimination, and…
Descriptors: Item Analysis, Identification, Difficulty Level, Case Studies