Showing all 9 results
Peer reviewed
Lee, Won-Chan; Kim, Stella Y.; Choi, Jiwon; Kang, Yujin – Journal of Educational Measurement, 2020
This article considers psychometric properties of composite raw scores and transformed scale scores on mixed-format tests that consist of a mixture of multiple-choice and free-response items. Test scores on several mixed-format tests are evaluated with respect to conditional and overall standard errors of measurement, score reliability, and…
Descriptors: Raw Scores, Item Response Theory, Test Format, Multiple Choice Tests
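As a rough sense of the quantities this entry mentions, here is a minimal sketch of how an overall standard error of measurement follows from a reliability estimate under classical test theory; the scores and the reliability value are hypothetical, and the article's own IRT-based procedures for conditional SEMs are more involved.

```python
import numpy as np

# Hypothetical composite raw scores for a small group of examinees.
scores = np.array([42, 55, 61, 48, 39, 58, 50, 44, 63, 52], dtype=float)

# Assume a reliability estimate obtained elsewhere (e.g., from an IRT model
# or coefficient alpha); the value here is made up for illustration.
reliability = 0.88

# Classical test theory: SEM = SD * sqrt(1 - reliability).
sd = scores.std(ddof=1)
sem = sd * np.sqrt(1.0 - reliability)
print(f"score SD = {sd:.2f}, overall SEM = {sem:.2f}")
```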
Peer reviewed
Malec, Wojciech; Krzeminska-Adamek, Malgorzata – Practical Assessment, Research & Evaluation, 2020
The main objective of the article is to compare several methods of evaluating multiple-choice options through classical item analysis. The methods subjected to examination include the tabulation of choice distribution, the interpretation of trace lines, the point-biserial correlation, the categorical analysis of trace lines, and the investigation…
Descriptors: Comparative Analysis, Evaluation Methods, Multiple Choice Tests, Item Analysis
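A minimal sketch of the point-biserial approach to option evaluation named in this abstract, computed with scipy on made-up choice data; the item, the key, and the total scores are invented for illustration.

```python
import numpy as np
from scipy.stats import pointbiserialr

# Hypothetical data: which option each examinee chose on one MC item,
# and each examinee's total test score (options A-D; key = 'B').
choices = np.array(list("BBADBCBBDB"))
totals = np.array([71, 68, 52, 45, 74, 50, 66, 69, 48, 72], dtype=float)

# Point-biserial for the keyed option: 1 if the option was chosen, else 0.
key_indicator = (choices == "B").astype(int)
r_key, _ = pointbiserialr(key_indicator, totals)
print(f"keyed option B: r_pb = {r_key:.2f}")

# The same statistic per distractor flags options that attract high scorers
# (a large positive value on a distractor suggests a flawed option or key).
for option in "ACD":
    indicator = (choices == option).astype(int)
    r, _ = pointbiserialr(indicator, totals)
    print(f"distractor {option}: r_pb = {r:.2f}")
```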
Peer reviewed
Fragkouli, Konstantina; Antoniou, Faye; Mouzaki, Angeliki; Ralli, Asimina M.; Kokkali, Vasiliki; Alexoudi, Kariofyllia – Reading & Writing Quarterly, 2022
The development of spelling skill is an intricate process for children with, or at risk of, Specific Learning Disabilities and requires targeted interventions. This problem is exacerbated in the Greek orthographic system owing to its high complexity. The current study presents a novel spelling intervention program for Greek 3rd graders at risk of…
Descriptors: Intervention, Spelling, At Risk Students, Reading Fluency
Peer reviewed
Goncher, Andrea M.; Jayalath, Dhammika; Boles, Wageeh – IEEE Transactions on Education, 2016
Concept inventory tests are one method to evaluate conceptual understanding and identify possible misconceptions. The multiple-choice question format, offering a choice between a correct selection and common misconceptions, can provide an assessment of students' conceptual understanding in various dimensions. Misconceptions of some engineering…
Descriptors: Case Studies, Concept Formation, Teaching Methods, Misconceptions
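As an illustration of how distractor choices can be read as misconception evidence, a small sketch with an invented item whose distractors are mapped to hypothetical misconception labels; neither the item nor the labels come from the article.

```python
from collections import Counter

# Hypothetical concept-inventory item: each distractor is tied to a misconception.
misconception_map = {
    "A": None,                                  # keyed (correct) answer
    "B": "confuses force and velocity",
    "C": "ignores Newton's third law",
    "D": "treats mass and weight as identical",
}

# Hypothetical responses from a class of students.
responses = list("ABBCADABCB")

tally = Counter(misconception_map[r] for r in responses if misconception_map[r])
correct = sum(1 for r in responses if misconception_map[r] is None)

print(f"correct: {correct}/{len(responses)}")
for misconception, count in tally.most_common():
    print(f"{count} students: {misconception}")
```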
Peer reviewed
Sayin, Ayfer – Journal of Education and Training Studies, 2016
In formation education, which is carried out within the scope of undergraduate and non-thesis graduate programs at the same university, different criteria are used to evaluate students' success. In this study, the classification accuracy of letter grades generated to evaluate students' success using relative and absolute criteria and…
Descriptors: Foreign Countries, Student Evaluation, Grading, Evaluation Criteria
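A rough sketch of the contrast between absolute and relative grading criteria discussed in this abstract; the score bands, z-score cutoffs, and letter labels are invented and do not reproduce the study's criteria.

```python
import numpy as np

# Hypothetical final scores for one course section.
scores = np.array([88, 74, 91, 67, 79, 55, 83, 70, 62, 95], dtype=float)

def absolute_grade(score):
    """Absolute criteria: fixed score bands (cutoffs made up for illustration)."""
    for cut, letter in [(90, "AA"), (80, "BA"), (70, "BB"), (60, "CB")]:
        if score >= cut:
            return letter
    return "FF"

def relative_grade(score, mean, sd):
    """Relative criteria: bands defined by distance from the class mean in SD units."""
    z = (score - mean) / sd
    for cut, letter in [(1.5, "AA"), (0.5, "BA"), (-0.5, "BB"), (-1.5, "CB")]:
        if z >= cut:
            return letter
    return "FF"

mean, sd = scores.mean(), scores.std(ddof=1)
for s in scores:
    print(int(s), absolute_grade(s), relative_grade(s, mean, sd))
```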
Peer reviewed
Wang, Wen-Chung; Huang, Sheng-Yun – Educational and Psychological Measurement, 2011
The one-parameter logistic model with ability-based guessing (1PL-AG) has recently been developed to account for the effect of ability on guessing behavior in multiple-choice items. In this study, the authors developed algorithms for computerized classification testing under the 1PL-AG and conducted a series of simulations to evaluate their…
Descriptors: Computer Assisted Testing, Classification, Item Analysis, Probability
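The article concerns classification algorithms under the 1PL-AG; as a simplified stand-in, here is a sequential probability ratio test (SPRT) classification rule under a plain Rasch (1PL) model without the ability-based guessing component, with made-up item parameters and cut points.

```python
import numpy as np

def rasch_p(theta, b):
    """Probability of a correct response under a plain Rasch (1PL) model."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

def sprt_decision(responses, b_params, theta_low, theta_high, alpha=0.05, beta=0.05):
    """SPRT for a pass/fail classification around a cutoff.

    Compares the likelihood of the observed responses at two indifference
    points and returns 'pass', 'fail', or 'continue testing'.
    """
    log_ratio = 0.0
    for x, b in zip(responses, b_params):
        p_hi = rasch_p(theta_high, b)
        p_lo = rasch_p(theta_low, b)
        log_ratio += x * np.log(p_hi / p_lo) + (1 - x) * np.log((1 - p_hi) / (1 - p_lo))
    upper = np.log((1 - beta) / alpha)   # accept the "pass" hypothesis
    lower = np.log(beta / (1 - alpha))   # accept the "fail" hypothesis
    if log_ratio >= upper:
        return "pass"
    if log_ratio <= lower:
        return "fail"
    return "continue testing"

# Hypothetical response string and item difficulties.
responses = [1, 1, 0, 1, 1, 0, 1]
b_params = [-0.5, 0.0, 0.3, -0.2, 0.1, 0.6, -0.1]
print(sprt_decision(responses, b_params, theta_low=-0.3, theta_high=0.3))
```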
Schulz, E. Matthew; Kolen, Michael J.; Nicewander, W. Alan – 1997
This paper compares modified Guttman and item response theory (IRT) based procedures for classifying examinees into ordered levels when each level is represented by several multiple-choice test items. In the modified Guttman procedure, within-level number-correct scores are mapped to binary level mastery scores. Examinees are then assigned to levels…
Descriptors: Classification, Comparative Analysis, Item Response Theory, Mathematics Tests
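A minimal sketch of the modified Guttman idea described in this abstract: within-level number-correct scores are mapped to binary mastery indicators, and an overall level is then assigned. The cutoff and the "highest consecutively mastered level" assignment rule are assumptions here, since the abstract is truncated.

```python
# Hypothetical item-level data: items grouped into ordered levels, one examinee.
levels = {
    1: [1, 1, 1, 0],   # responses to the four Level-1 items
    2: [1, 1, 0, 1],   # responses to the four Level-2 items
    3: [0, 1, 0, 0],   # responses to the four Level-3 items
}
cutoff = 3  # number correct required to count a level as mastered (assumed)

# Map within-level number-correct scores to binary mastery scores.
mastery = {lvl: int(sum(resp) >= cutoff) for lvl, resp in levels.items()}

# Assign the highest level for which this level and all lower levels are mastered
# (a Guttman-consistent pattern); this rule is one plausible reading, not the paper's.
assigned = 0
for lvl in sorted(levels):
    if mastery[lvl]:
        assigned = lvl
    else:
        break

print(mastery, "assigned level:", assigned)
```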
Finch, F. L.; Dost, Marcia A. – 1992
Many state and local entities are developing and using performance assessment programs. Because these initiatives are so diverse, it is very difficult to understand what they are doing or to compare them in any meaningful way. Multiple-choice tests are contrasted with performance assessments, and preliminary classifications are suggested to…
Descriptors: Alternative Assessment, Classification, Comparative Analysis, Constructed Response
Stecher, Brian – 1995
The resources necessary to create, administer, and score performance assessments in science were studied. RAND and the University of California, Santa Barbara (UCSB) designed performance tasks for science in grades five and six as part of a larger study of the feasibility of science performance assessment. Tasks were developed in pairs in task…
Descriptors: Classification, Comparative Analysis, Costs, Educational Assessment