Showing all 8 results
Peer reviewed
Perrett, Jamis J. – Journal of Statistics Education, 2012
This article demonstrates how textbooks differ in their description of the term "experimental unit". Advanced Placement Statistics teachers and students are often limited in their statistical knowledge by the information presented in their classroom textbook. Definitions and descriptions differ among textbooks as well as among different…
Descriptors: Statistics, Advanced Placement Programs, Textbooks, Mathematics Instruction
Peer reviewed
Kaliski, Pamela K.; Wind, Stefanie A.; Engelhard, George, Jr.; Morgan, Deanna L.; Plake, Barbara S.; Reshetar, Rosemary A. – Educational and Psychological Measurement, 2013
The many-faceted Rasch (MFR) model has been used to evaluate the quality of ratings on constructed response assessments; however, it can also be used to evaluate the quality of judgments from panel-based standard setting procedures. The current study illustrates the use of the MFR model for examining the quality of ratings obtained from a standard…
Descriptors: Item Response Theory, Models, Standard Setting (Scoring), Science Tests
Peer reviewed
Hendrickson, Amy; Huff, Kristen; Luecht, Richard – Applied Measurement in Education, 2010
Evidence-centered assessment design (ECD) explicates a transparent evidentiary argument to warrant the inferences we make from student test performance. This article describes how the vehicles for gathering student evidence--task models and test specifications--are developed. Task models, which are the basis for item development, flow directly…
Descriptors: Evidence, Test Construction, Measurement, Classification
Reshetar, Rosemary; Melican, Gerald J. – College Board, 2010
This paper discusses issues related to the design and psychometric work for mixed-format tests, that is, tests containing both multiple-choice (MC) and constructed-response (CR) items. The issues of validity, fairness, reliability, and score consistency can be addressed, but for mixed-format tests there are many decisions to be made and no examination or…
Descriptors: Psychometrics, Test Construction, Multiple Choice Tests, Test Items
Peer reviewed
Siko, Jason Paul – International Journal of E-Learning & Distance Education, 2014
In this study, the perceptions of parents (n = 14) and students (n = 47) enrolled in a blended learning course, the first of its kind at their school, were examined. Student performance in the blended and in the traditional portion of the course was examined, and the Educational Success Prediction Instrument (ESPRI) was administered to predict…
Descriptors: Blended Learning, Educational Technology, Distance Education, Electronic Learning
College Board, 2011
This catalog lists research reports, research notes, and other publications available from the College Board's website. The catalog briefly describes research publications available free of charge. Introduced in 1981, the Research Report series includes studies and reviews in areas such as college admission, special populations, subgroup…
Descriptors: Research Reports, Publications, Educational Research, College Students
Hendrickson, Amy; Patterson, Brian; Melican, Gerald – College Board, 2008
Presented at the annual meeting of the National Council on Measurement in Education (NCME) in New York in March 2008. This presentation explores how different item weightings can affect the effective weights, validity coefficients, and test reliability of composite scores among test takers.
Descriptors: Multiple Choice Tests, Test Format, Test Validity, Test Reliability
Stricker, Lawrence J. – College Entrance Examination Board, 1998
Steele and Aronson (1995) found that the performance of African-American subjects on test items portrayed as a problem-solving task, in a laboratory experiment, was adversely affected when they were asked about their ethnicity. This outcome was attributed to "stereotype threat". Performance was disrupted by the subjects' concerns about…
Descriptors: Ethnicity, Ethnic Groups, Test Items, Problem Solving