| Publication Date | Count |
| --- | --- |
| In 2026 | 0 |
| Since 2025 | 0 |
| Since 2022 (last 5 years) | 2 |
| Since 2017 (last 10 years) | 2 |
| Since 2007 (last 20 years) | 6 |
| Descriptor | Count |
| --- | --- |
| Construct Validity | 7 |
| Item Response Theory | 7 |
| Test Format | 7 |
| Test Items | 5 |
| Comparative Analysis | 4 |
| Computer Assisted Testing | 3 |
| Item Analysis | 3 |
| Language Tests | 3 |
| Second Language Learning | 3 |
| Test Construction | 3 |
| College Entrance Examinations | 2 |
| Source | Count |
| --- | --- |
| College Board | 1 |
| Educational and Psychological… | 1 |
| International Journal of… | 1 |
| Journal of Vocational Behavior | 1 |
| Language Assessment Quarterly | 1 |
| Language Testing | 1 |
| ProQuest LLC | 1 |
| Author | Count |
| --- | --- |
| Borowski, Andreas | 1 |
| Chapman, Mark | 1 |
| Crabtree, Ashleigh R. | 1 |
| Einarsdottir, Sif | 1 |
| Fischer, Hans E. | 1 |
| Gess-Newsome, Julie | 1 |
| Hendrickson, Amy | 1 |
| Kalender, Ilker | 1 |
| Kaya, Elif | 1 |
| Kim, Ahyoung Alicia | 1 |
| Kirschner, Sophie | 1 |
| Publication Type | Count |
| --- | --- |
| Journal Articles | 5 |
| Reports - Research | 4 |
| Dissertations/Theses -… | 1 |
| Non-Print Media | 1 |
| Reference Materials - General | 1 |
| Reports - Evaluative | 1 |
| Tests/Questionnaires | 1 |
| Education Level | Count |
| --- | --- |
| Higher Education | 3 |
| Postsecondary Education | 3 |
| Secondary Education | 3 |
| Elementary Education | 1 |
| High Schools | 1 |
| Location | Count |
| --- | --- |
| Germany | 1 |
| Iowa | 1 |
| Turkey (Ankara) | 1 |
| Assessments and Surveys | Count |
| --- | --- |
| Myers Briggs Type Indicator | 1 |
| SAT (College Admission Test) | 1 |
| Strong Interest Inventory | 1 |
Kim, Ahyoung Alicia; Tywoniw, Rurik L.; Chapman, Mark – Language Assessment Quarterly, 2022
Technology-enhanced items (TEIs) are innovative, computer-delivered test items that allow test takers to interact with the test environment more fully than traditional multiple-choice items (MCIs) do. The interactive nature of TEIs offers improved construct coverage compared with MCIs, but little research exists regarding students' performance on…
Descriptors: Language Tests, Test Items, Computer Assisted Testing, English (Second Language)
Kaya, Elif; O'Grady, Stefan; Kalender, Ilker – Language Testing, 2022
Language proficiency testing serves an important function of classifying examinees into different categories of ability. However, misclassification is to some extent inevitable and may have important consequences for stakeholders. Recent research suggests that classification efficacy may be enhanced substantially using computerized adaptive…
Descriptors: Item Response Theory, Test Items, Language Tests, Classification
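The Kaya, O'Grady, and Kalender abstract turns on IRT-based classification of examinees against proficiency categories. As a minimal, purely illustrative sketch (not the study's procedure), the snippet below scores a short response pattern under a two-parameter logistic (2PL) model and classifies a grid-based maximum-likelihood ability estimate against a cut score; the item parameters, responses, and cut score are all hypothetical.

```python
import numpy as np

def p_correct(theta, a, b):
    """2PL probability of a correct response given ability theta."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def estimate_theta(responses, a, b, grid=np.linspace(-4, 4, 801)):
    """Grid-search maximum-likelihood estimate of ability."""
    # Log-likelihood of the response pattern at each grid point.
    p = p_correct(grid[:, None], a[None, :], b[None, :])  # shape (grid, items)
    ll = (responses * np.log(p) + (1 - responses) * np.log(1 - p)).sum(axis=1)
    return grid[np.argmax(ll)]

# Hypothetical item parameters and one examinee's responses.
a = np.array([1.2, 0.8, 1.5, 1.0])   # discrimination
b = np.array([-0.5, 0.0, 0.5, 1.0])  # difficulty
responses = np.array([1, 1, 0, 0])   # 1 = correct, 0 = incorrect

theta_hat = estimate_theta(responses, a, b)
cut = 0.0  # hypothetical proficiency cut score on the theta scale
label = "proficient" if theta_hat >= cut else "not proficient"
print(f"theta = {theta_hat:.2f} -> {label}")
```

In an adaptive setting, the same ability estimate would also drive item selection; here it only illustrates the classification step that the abstract says is prone to error near the cut score.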
Crabtree, Ashleigh R. – ProQuest LLC, 2016
The purpose of this research is to provide information about the psychometric properties of technology-enhanced (TE) items and the effects these items have on the content validity of an assessment. Specifically, this research investigates the impact that the inclusion of TE items has on the construct of a mathematics test, the technical properties…
Descriptors: Psychometrics, Computer Assisted Testing, Test Items, Test Format
Kirschner, Sophie; Borowski, Andreas; Fischer, Hans E.; Gess-Newsome, Julie; von Aufschnaiter, Claudia – International Journal of Science Education, 2016
Teachers' professional knowledge is assumed to be a key variable for effective teaching. Because teacher education aims to enhance the professional knowledge of current and future teachers, this knowledge should be described and assessed. Nevertheless, only a limited number of studies quantitatively measure physics teachers' professional…
Descriptors: Evaluation Methods, Tests, Test Format, Science Instruction
Einarsdottir, Sif; Rounds, James – Journal of Vocational Behavior, 2009
Item response theory was used to address gender bias in interest measurement. Differential item functioning (DIF) techniques, SIBTEST and DIMTEST for dimensionality, were applied to the items of the six General Occupational Theme (GOT) and 25 Basic Interest (BI) scales in the Strong Interest Inventory. A sample of 1,860 women and 1,105 men was used.…
Descriptors: Test Format, Females, Vocational Interests, Construct Validity
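Einarsdottir and Rounds used SIBTEST and DIMTEST; as a rough illustration of the same DIF question with a simpler, widely known alternative, the sketch below computes a Mantel-Haenszel common odds ratio for a single item after matching examinees on total score. The data, group coding, and score strata are entirely hypothetical.

```python
import numpy as np

def mantel_haenszel_or(item, total, group):
    """Common odds ratio for one 0/1 item; group: 0 = reference, 1 = focal."""
    num, den = 0.0, 0.0
    for s in np.unique(total):          # stratify on the matching score
        in_s = total == s
        n = in_s.sum()
        a = np.sum(in_s & (group == 0) & (item == 1))  # reference correct
        b = np.sum(in_s & (group == 0) & (item == 0))  # reference incorrect
        c = np.sum(in_s & (group == 1) & (item == 1))  # focal correct
        d = np.sum(in_s & (group == 1) & (item == 0))  # focal incorrect
        num += a * d / n
        den += b * c / n
    return num / den if den > 0 else float("nan")

# Hypothetical data: 0/1 item scores, a matching total score, and group codes.
rng = np.random.default_rng(0)
group = rng.integers(0, 2, 500)
total = rng.integers(0, 10, 500)
item = (rng.random(500) < 0.5 + 0.1 * group).astype(int)  # slight focal advantage

print(f"MH common odds ratio: {mantel_haenszel_or(item, total, group):.2f}")
# Values far from 1.0 flag the item for possible DIF.
```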
Tzeng, Oliver C. S.; And Others – Educational and Psychological Measurement, 1991
Measurement properties of two response formats (bipolar and unipolar ratings) in personality assessment were compared using data from 135 college students taking the Myers-Briggs Type Indicator (MBTI). Factorial validity and construct validity of the MBTI were supported. Reasons why the bipolar method is preferable are discussed. (SLD)
Descriptors: College Students, Comparative Testing, Construct Validity, Factor Analysis
Hendrickson, Amy; Patterson, Brian; Melican, Gerald – College Board, 2008
Presented at the annual meeting of the National Council on Measurement in Education (NCME) in New York in March 2008. This presentation explores how different item weightings can affect the effective weights, validity coefficients, and test reliability of composite scores among test takers.
Descriptors: Multiple Choice Tests, Test Format, Test Validity, Test Reliability
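The Hendrickson, Patterson, and Melican entry concerns how nominal section weights translate into effective weights of a composite score. As a hedged illustration (not the College Board analysis), the sketch below computes effective weights as each weighted section's share of composite-score variance, using simulated section scores.

```python
import numpy as np

def effective_weights(scores, w):
    """scores: (examinees, sections); w: nominal weights per section."""
    composite = scores @ w
    # Covariance of each section score with the composite.
    cov = np.array([np.cov(scores[:, j], composite)[0, 1]
                    for j in range(scores.shape[1])])
    # Effective weight = nominal weight * cov with composite / composite variance.
    return w * cov / composite.var(ddof=1)

# Hypothetical section scores for 1,000 examinees (e.g., MC and constructed response).
rng = np.random.default_rng(1)
mc = rng.normal(50, 10, 1000)
cr = 0.6 * mc + rng.normal(20, 15, 1000)  # correlated with MC, larger spread
scores = np.column_stack([mc, cr])

for w in (np.array([0.5, 0.5]), np.array([0.7, 0.3])):
    print(w, "->", np.round(effective_weights(scores, w), 3))
# Equal nominal weights need not yield equal effective weights when section
# variances and intercorrelations differ.
```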

