Publication Date
| In 2026 | 0 |
| Since 2025 | 0 |
| Since 2022 (last 5 years) | 0 |
| Since 2017 (last 10 years) | 2 |
| Since 2007 (last 20 years) | 6 |
Descriptor
| Comparative Analysis | 10 |
| Reliability | 10 |
| Test Format | 10 |
| Psychometrics | 4 |
| Computer Assisted Instruction | 3 |
| Elementary School Students | 3 |
| Error of Measurement | 3 |
| Test Construction | 3 |
| Test Items | 3 |
| Test Use | 3 |
| Validity | 3 |
Source
| Educational Measurement: Issues and Practice | 1 |
| Educational and Psychological Measurement | 1 |
| Journal of Educational Measurement | 1 |
| Journal of Experimental Psychology: General | 1 |
| Journal of Psychoeducational Assessment | 1 |
| Journal of Research on Educational Effectiveness | 1 |
| Journal of Speech, Language, and Hearing Research | 1 |
Author
| Lee, Won-Chan | 2 |
| Mott, Michael S. | 2 |
| Ardasheva, Yuliya | 1 |
| Choi, Jiwon | 1 |
| Earley, Mark A. | 1 |
| Halpin, Regina | 1 |
| Hao, Tao | 1 |
| Hare, R. Dwight | 1 |
| Harley, Dwight | 1 |
| Hoffman, Lesa | 1 |
| Huber, David E. | 1 |
Publication Type
| Journal Articles | 7 |
| Reports - Research | 7 |
| Speeches/Meeting Papers | 3 |
| Information Analyses | 1 |
| Reports - Descriptive | 1 |
| Reports - Evaluative | 1 |
Education Level
| Higher Education | 2 |
| Early Childhood Education | 1 |
| Elementary Education | 1 |
| Elementary Secondary Education | 1 |
| Postsecondary Education | 1 |
| Preschool Education | 1 |
Location
| California | 1 |
Assessments and Surveys
| Peabody Individual Achievement Test | 1 |
| Peabody Picture Vocabulary Test | 1 |
Lee, Won-Chan; Kim, Stella Y.; Choi, Jiwon; Kang, Yujin – Journal of Educational Measurement, 2020
This article considers psychometric properties of composite raw scores and transformed scale scores on mixed-format tests that consist of a mixture of multiple-choice and free-response items. Test scores on several mixed-format tests are evaluated with respect to conditional and overall standard errors of measurement, score reliability, and…
Descriptors: Raw Scores, Item Response Theory, Test Format, Multiple Choice Tests
Hao, Tao; Wang, Zhe; Ardasheva, Yuliya – Journal of Research on Educational Effectiveness, 2021
This meta-analysis reviewed research published between 2012 and 2018 on technology-assisted second language (L2) vocabulary learning for English as a foreign language (EFL) learners. A total of 45 studies of 2,374 preschool-to-college EFL students contributed effect sizes to this meta-analysis. Compared with traditional instructional methods, the…
Descriptors: Vocabulary Development, Second Language Learning, Second Language Instruction, English (Second Language)
Yarnell, Jordy B.; Pfeiffer, Steven I. – Journal of Psychoeducational Assessment, 2015
The present study examined the psychometric equivalence of administering a computer-based version of the Gifted Rating Scale (GRS) compared with the traditional paper-and-pencil GRS-School Form (GRS-S). The GRS-S is a teacher-completed rating scale used in gifted assessment. The GRS-Electronic Form provides an alternative method of administering…
Descriptors: Gifted, Psychometrics, Rating Scales, Computer Assisted Testing
Hoffman, Lesa; Templin, Jonathan; Rice, Mabel L. – Journal of Speech, Language, and Hearing Research, 2012
Purpose: The present work describes how vocabulary ability as assessed by 3 different forms of the Peabody Picture Vocabulary Test (PPVT; Dunn & Dunn, 1997) can be placed on a common latent metric through item response theory (IRT) modeling, by which valid comparisons of ability between samples or over time can then be made. Method: Responses…
Descriptors: Item Response Theory, Test Format, Vocabulary, Comparative Analysis
Kolen, Michael J.; Lee, Won-Chan – Educational Measurement: Issues and Practice, 2011
This paper illustrates that the psychometric properties of scores and scales that are used with mixed-format educational tests can impact the use and interpretation of the scores that are reported to examinees. Psychometric properties that include reliability and conditional standard errors of measurement are considered in this paper. The focus is…
Descriptors: Test Use, Test Format, Error of Measurement, Raw Scores
Jang, Yoonhee; Wixted, John T.; Huber, David E. – Journal of Experimental Psychology: General, 2009
The current study compared 3 models of recognition memory in their ability to generalize across yes/no and 2-alternative forced-choice (2AFC) testing. The unequal-variance signal-detection model assumes a continuous memory strength process. The dual-process signal-detection model adds a thresholdlike recollection process to a continuous…
Descriptors: Test Format, Familiarity, Testing, Criteria
Mertler, Craig A.; Earley, Mark A. – 2003
A study was conducted to compare the psychometric qualities of two forms of an identical survey: one administered in paper-and-pencil format and the other administered in Web format. The survey addressed college course anxiety and was administered to a sample of 236 undergraduate students. The psychometric qualities investigated included…
Descriptors: Anxiety, Comparative Analysis, Higher Education, Psychometrics
Rogers, W. Todd; Harley, Dwight – Educational and Psychological Measurement, 1999
Examined item-level and test-level characteristics for items in a high-stakes school-leaving mathematics examination. Results from 158 students show that the influence of testwiseness is lessened when three-option items are used. Tests of three-option items are at least equivalent to four-option item tests in terms of internal-consistency score…
Descriptors: Comparative Analysis, High School Students, High Schools, High Stakes Tests
Mott, Michael S.; Halpin, Regina – 1999
This study examined the reliability and the developmental and concurrent validity of the Writing What You Read (WWYR) rubric, originally designed for paper-and-pen writing, when applied to hypermedia-authored narrative productions of students in grades 2 and 3. Sixty students from 4 classrooms produced hypermedia narratives (interactive multimedia presentations) that were rated…
Descriptors: Comparative Analysis, Computer Assisted Instruction, Computer Software, Elementary School Students
Mott, Michael S.; Hare, R. Dwight – 1999
This study investigated the reliability and developmental and concurrent validity of the Writing What You Read (WWYR) rubric, an instrument originally designed for use with paper-and-pen-created narratives, for hypermedia productions of students in grades 2 and 3. Four teachers guided their students in a 3-month-long hypermedia/process writing…
Descriptors: Comparative Analysis, Computer Assisted Instruction, Computer Software, Elementary Education
