ERIC Number: ED662331
Record Type: Non-Journal
Publication Date: 2024
Pages: 201
Abstractor: As Provided
ISBN: 979-8-3840-9532-3
ISSN: N/A
EISSN: N/A
Available Date: N/A
Evaluating Alternative Item Response Theory Approaches to Account for Missing Data in Children's Word Dictation Responses
Jechun An
ProQuest LLC, Ph.D. Dissertation, University of Minnesota
Students' responses to Word Dictation curriculum-based measurement (CBM) in writing tend to include many missing values, especially items not reached because of the three-minute time limit. A large number of non-ignorable not-reached responses in Word Dictation can be accounted for using alternative item response theory (IRT) approaches. These alternative approaches can also be used to estimate students' writing productivity as well as their accuracy. The purpose of this study was to evaluate the Word Dictation performance of elementary students struggling with writing, using a classical IRT approach that considers writing accuracy only and alternative IRT approaches that consider both writing accuracy and productivity. This study used data from a larger research project that evaluated the effectiveness of a professional development program designed to support elementary teachers in implementing data-based instruction for students struggling with writing. Participants were recruited at two sites in the Midwestern United States. A total of 523 elementary students completed screening tests to determine eligibility for the larger project. The Word Dictation CBM in writing, used for screening, was designed to measure transcription skills at the word level by asking students to write dictated words as accurately as they can. I examined the extent to which students' results differed by comparing the classical IRT approach, the latent regression model (LRM), and the item response tree (IRTree) model. The approaches that account for not-reached items (the IRTree model and the LRM) yielded different ranges of writing ability even for students with identical scores under the classical IRT approach. First, goodness-of-fit, item fit, person fit, and model fit were evaluated for each approach to demonstrate that the models fit well.
In addition, conditional standard errors of measurement (cSEM) across models revealed that the alternative IRT approaches estimated ability more accurately and precisely than the classical IRT approaches. Second, treating not-reached responses as either incorrect or missing changed the ability parameters of students with different levels of productivity, ultimately underestimating the abilities of students performing at different levels. In addition, special education eligibility was a significant factor in rank-order comparisons of several models, but not in the comparison between the classical IRT missing model and the alternative IRT models (the IRTree model and the LRM). Third, most results were consistently replicated across Word Dictation Forms A and B. The major inconsistencies involved English Language Learner (ELL) eligibility, both in the relations between child-level factors and writing productivity and accuracy and in the relations between rank-order differences and child-level factors. Handling missing responses is difficult but is an important procedure for estimating ability parameters more accurately in the context of CBM. Although no single best approach could be identified, it is possible that the abilities of students with particular statuses, including special education eligibility and ELL eligibility, were under- or overestimated when missing data were not considered. A better understanding of students' writing performance as it relates to writing productivity and accuracy could ultimately support teachers in using instructionally meaningful data for individualized instruction. [The dissertation citations contained here are published with the permission of ProQuest LLC. Further reproduction is prohibited without permission. Copies of dissertations may be obtained by telephone (1-800-521-0600).
Web page: http://www.proquest.com/en-US/products/dissertations/individuals.shtml.]
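The abstract's central point, that scoring not-reached items as incorrect versus treating them as missing changes the estimated ability, can be illustrated with a minimal sketch. The code below is not from the dissertation; it fits a simple Rasch model by maximum likelihood for one hypothetical student, with invented item difficulties and responses, and compares the two handling strategies.

```python
import math

def estimate_theta(responses, difficulties, tol=1e-6, max_iter=100):
    """Maximum-likelihood ability estimate under a Rasch model.

    responses: list of 1 (correct), 0 (incorrect), or None (missing);
    None entries (e.g., not-reached items) are skipped entirely.
    difficulties: fixed item difficulty parameters.
    """
    theta = 0.0
    for _ in range(max_iter):
        grad = 0.0   # d log-likelihood / d theta
        hess = 0.0   # second derivative (always negative here)
        for x, b in zip(responses, difficulties):
            if x is None:
                continue  # missing response contributes nothing
            p = 1.0 / (1.0 + math.exp(-(theta - b)))  # P(correct | theta)
            grad += x - p
            hess -= p * (1.0 - p)
        step = grad / hess
        theta -= step  # Newton-Raphson update
        if abs(step) < tol:
            break
    return theta

# Hypothetical 6-item form; the student reaches only the first four items.
difficulties = [-1.0, -0.5, 0.0, 0.5, 1.0, 1.5]
reached = [1, 1, 0, 1]

as_incorrect = reached + [0, 0]        # not-reached scored as wrong
as_missing = reached + [None, None]    # not-reached treated as missing

print("scored incorrect:", estimate_theta(as_incorrect, difficulties))
print("treated missing: ", estimate_theta(as_missing, difficulties))
```

Scoring the unreached items as incorrect pulls the ability estimate down relative to ignoring them, which is the underestimation pattern the study describes for slower (less productive) writers.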
ProQuest LLC. 789 East Eisenhower Parkway, P.O. Box 1346, Ann Arbor, MI 48106. Tel: 800-521-0600; Web site: http://www.proquest.com/en-US/products/dissertations/individuals.shtml
Publication Type: Dissertations/Theses - Doctoral Dissertations
Education Level: Elementary Education
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Grant or Contract Numbers: N/A
Author Affiliations: N/A