Publication Date
| Date Range | Results |
| --- | --- |
| In 2026 | 0 |
| Since 2025 | 0 |
| Since 2022 (last 5 years) | 0 |
| Since 2017 (last 10 years) | 1 |
| Since 2007 (last 20 years) | 2 |
Author
| Author | Results |
| --- | --- |
| Almeda, Mia | 1 |
| Asbell-Clarke, Jodi | 1 |
| Attali, Yigal | 1 |
| Bardar, Erin | 1 |
| Edwards, Teon | 1 |
| Gasca, Santiago | 1 |
| Laitusis, Cara | 1 |
| Rowe, Elizabeth | 1 |
| Shute, Valerie | 1 |
| Stone, Elizabeth | 1 |
| Ventura, Matthew | 1 |
Publication Type
| Publication Type | Results |
| --- | --- |
| Journal Articles | 2 |
| Reports - Research | 2 |
Education Level
| Education Level | Results |
| --- | --- |
| Elementary Education | 2 |
| Junior High Schools | 2 |
| Middle Schools | 2 |
| Secondary Education | 2 |
| Grade 8 | 1 |
Rowe, Elizabeth; Asbell-Clarke, Jodi; Almeda, Mia; Gasca, Santiago; Edwards, Teon; Bardar, Erin; Shute, Valerie; Ventura, Matthew – International Journal of Computer Science Education in Schools, 2021
The Inclusive Assessment of Computational Thinking (CT), designed for accessibility and learner variability, was studied in over 50 classes in US schools (grades 3-8). The validation studies of IACT sampled thousands of students to establish IACT's construct and concurrent validity as well as its test-retest reliability. IACT items for each CT practice…
Descriptors: Puzzles, Logical Thinking, Thinking Skills, Construct Validity
Attali, Yigal; Laitusis, Cara; Stone, Elizabeth – Educational and Psychological Measurement, 2016
There are many reasons to believe that open-ended (OE) and multiple-choice (MC) items place different cognitive demands on students. However, empirical evidence supporting this view is lacking. In this study, we investigated the reactions of test takers to an interactive assessment with immediate feedback and answer-revision opportunities for…
Descriptors: Test Items, Questioning Techniques, Differences, Student Reaction
