Publication Date
| Date Range | Count |
| --- | --- |
| In 2026 | 0 |
| Since 2025 | 0 |
| Since 2022 (last 5 years) | 1 |
| Since 2017 (last 10 years) | 3 |
| Since 2007 (last 20 years) | 6 |
Descriptor
| Descriptor | Count |
| --- | --- |
| Multiple Choice Tests | 9 |
| Racial Differences | 9 |
| Test Reliability | 9 |
| Computer Assisted Testing | 4 |
| Gender Differences | 4 |
| Test Items | 4 |
| Reading Tests | 3 |
| Scores | 3 |
| Scoring | 3 |
| Student Evaluation | 3 |
| Test Validity | 3 |
Source
| Source | Count |
| --- | --- |
| CBE - Life Sciences Education | 1 |
| Grantee Submission | 1 |
| International Journal of Testing | 1 |
| Journal of Applied Testing Technology | 1 |
| Journal of Learning Disabilities | 1 |
| Journal of Science Education and Technology | 1 |
Publication Type
| Publication Type | Count |
| --- | --- |
| Reports - Research | 8 |
| Journal Articles | 5 |
| Information Analyses | 1 |
| Numerical/Quantitative Data | 1 |
| Speeches/Meeting Papers | 1 |
| Tests/Questionnaires | 1 |
Location
| Location | Count |
| --- | --- |
| Maryland | 2 |
| Florida | 1 |
| Indiana | 1 |
| New Jersey | 1 |
Assessments and Surveys
| Assessment | Count |
| --- | --- |
| Advanced Placement… | 1 |
| National Assessment of… | 1 |
Ben Seipel; Sarah E. Carlson; Virginia Clinton-Lisell; Mark L. Davison; Patrick C. Kennedy – Grantee Submission, 2022
Originally designed for students in Grades 3 through 5, MOCCA (formerly the Multiple-choice Online Causal Comprehension Assessment) identifies students who struggle with comprehension and helps uncover why they struggle. There are many reasons why students might not comprehend what they read. They may struggle with decoding, or reading words…
Descriptors: Multiple Choice Tests, Computer Assisted Testing, Diagnostic Tests, Reading Tests
Relkin, Emily; de Ruiter, Laura; Bers, Marina Umaschi – Journal of Science Education and Technology, 2020
There is a need for developmentally appropriate Computational Thinking (CT) assessments that can be implemented in early childhood classrooms. We developed a new instrument called "TechCheck" for assessing CT skills in young children that does not require prior knowledge of computer programming. "TechCheck" is based on…
Descriptors: Developmentally Appropriate Practices, Computation, Thinking Skills, Early Childhood Education
Farrington, Amber L.; Lonigan, Christopher J. – Journal of Learning Disabilities, 2015
Children's emergent literacy skills are highly predictive of later reading abilities. To determine which children have weaker emergent literacy skills and are in need of intervention, it is necessary to assess emergent literacy skills accurately and reliably. In this study, 1,351 children were administered the "Revised Get Ready to…
Descriptors: Emergent Literacy, Preschool Children, Reading Tests, Item Response Theory
McFarland, Jenny L.; Price, Rebecca M.; Wenderoth, Mary Pat; Martinková, Patrícia; Cliff, William; Michael, Joel; Modell, Harold; Wright, Ann – CBE - Life Sciences Education, 2017
We present the Homeostasis Concept Inventory (HCI), a 20-item multiple-choice instrument that assesses how well undergraduates understand this critical physiological concept. We used an iterative process to develop a set of questions based on elements in the Homeostasis Concept Framework. This process involved faculty experts and undergraduate…
Descriptors: Scientific Concepts, Multiple Choice Tests, Science Tests, Test Construction
Ling, Guangming – International Journal of Testing, 2016
To investigate possible iPad-related mode effects, we tested 403 eighth graders in Indiana, Maryland, and New Jersey under three mode conditions through random assignment: a desktop computer, an iPad alone, and an iPad with an external keyboard. All students had used an iPad or computer for six months or longer. The 2-hour test included reading, math,…
Descriptors: Educational Testing, Computer Assisted Testing, Handheld Devices, Computers
Lissitz, Robert W.; Hou, Xiaodong; Slater, Sharon Cadman – Journal of Applied Testing Technology, 2012
This article investigates several questions regarding the impact of different item formats on measurement characteristics. Constructed response (CR) items and multiple choice (MC) items obviously differ in their formats and in the resources needed to score them. As such, they have been the subject of considerable discussion regarding the impact of…
Descriptors: Computer Assisted Testing, Scoring, Evaluation Problems, Psychometrics
Cooper, Peter L. – 1984
Recent information from established testing programs was used to investigate the nature and limitations of essay and multiple-choice tests of writing ability, the statistical relationship between these two test types, the performance of population subgroups on each, the possible need of different disciplines for different tests of composition skill,…
Descriptors: Correlation, Cost Effectiveness, Essay Tests, Evaluation Methods
Mazzeo, John; And Others – 1993
This report describes three exploratory studies of the performance of males and females on the multiple-choice and constructed-response sections of four Advanced Placement Examinations: United States History, Biology, Chemistry, and English Language and Composition. Analyses were carried out for each racial or ethnic group with a sample size of at…
Descriptors: Advanced Placement, College Entrance Examinations, Constructed Response, Ethnic Groups
Saturnelli, Annette Miele; Repa, J. Theodore – 1995
The specific focus of this research was to determine how the outcomes on two alternative forms of assessment (multiple-choice and hands-on/manipulative) for science process skills were related when students were grouped on the basis of sex, race/ethnicity, and poverty level. Subjects were 1,381 fourth graders in a culturally diverse city school…
Descriptors: Academic Achievement, Alternative Assessment, Cultural Differences, Educational Assessment