| Publication Date | Count |
| --- | --- |
| In 2026 | 0 |
| Since 2025 | 0 |
| Since 2022 (last 5 years) | 2 |
| Since 2017 (last 10 years) | 4 |
| Since 2007 (last 20 years) | 9 |
| Descriptor | Count |
| --- | --- |
| Computer Assisted Testing | 9 |
| Construct Validity | 9 |
| Item Analysis | 9 |
| Foreign Countries | 5 |
| Item Response Theory | 4 |
| Language Tests | 4 |
| Psychometrics | 4 |
| Test Construction | 4 |
| Test Format | 4 |
| English (Second Language) | 3 |
| Test Items | 3 |
| Author | Count |
| --- | --- |
| Bronson Hui | 1 |
| Brünken, Roland | 1 |
| Chapman, Mark | 1 |
| Chen, Shin-Feng | 1 |
| Craig, Daniel A. | 1 |
| Edelsbrunner, Peter | 1 |
| Forster, Natalie | 1 |
| Hsu, Ying-Shao | 1 |
| Huynh, Huynh | 1 |
| Jen, Tsung-Hau | 1 |
| Kalender, Ilker | 1 |
| Publication Type | Count |
| --- | --- |
| Journal Articles | 8 |
| Reports - Research | 7 |
| Dissertations/Theses -… | 1 |
| Reports - Evaluative | 1 |
| Tests/Questionnaires | 1 |
| Education Level | Count |
| --- | --- |
| Elementary Education | 4 |
| Higher Education | 3 |
| Middle Schools | 3 |
| Secondary Education | 3 |
| High Schools | 2 |
| Postsecondary Education | 2 |
| Grade 11 | 1 |
| Grade 3 | 1 |
| Grade 4 | 1 |
| Grade 6 | 1 |
| Grade 8 | 1 |
| Location | Count |
| --- | --- |
| Germany | 1 |
| South Korea | 1 |
| Switzerland | 1 |
| Taiwan | 1 |
| Turkey (Ankara) | 1 |
Bronson Hui – ProQuest LLC, 2021
Vocabulary researchers have started expanding their assessment toolbox by incorporating timed tasks and psycholinguistic instruments (e.g., priming tasks) to gain insights into lexical development (e.g., Elgort, 2011; Godfroid, 2020b; Nakata & Elgort, 2020; Vandenberghe et al., 2021). These time-sensitive and implicit word measures differ…
Descriptors: Measures (Individuals), Construct Validity, Decision Making, Vocabulary Development
Kim, Ahyoung Alicia; Tywoniw, Rurik L.; Chapman, Mark – Language Assessment Quarterly, 2022
Technology-enhanced items (TEIs) are innovative, computer-delivered test items that allow test takers to interact with the test environment more fully than traditional multiple-choice items (MCIs) do. The interactive nature of TEIs offers improved construct coverage compared with MCIs, but little research exists regarding students' performance on…
Descriptors: Language Tests, Test Items, Computer Assisted Testing, English (Second Language)
Küchemann, Stefan; Malone, Sarah; Edelsbrunner, Peter; Lichtenberger, Andreas; Stern, Elsbeth; Schumacher, Ralph; Brünken, Roland; Vaterlaus, Andreas; Kuhn, Jochen – Physical Review Physics Education Research, 2021
Representational competence is essential for the acquisition of conceptual understanding in physics. It enables learners to interpret diagrams, graphs, and mathematical equations, and to relate these to one another as well as to observations and experimental outcomes. In this study, we present the initial validation of a newly developed…
Descriptors: Physics, Science Instruction, Teaching Methods, Concept Formation
Kaya, Elif; O'Grady, Stefan; Kalender, Ilker – Language Testing, 2022
Language proficiency testing serves an important function of classifying examinees into different categories of ability. However, misclassification is to some extent inevitable and may have important consequences for stakeholders. Recent research suggests that classification efficacy may be enhanced substantially using computerized adaptive…
Descriptors: Item Response Theory, Test Items, Language Tests, Classification
Lin, Sheau-Wen; Liu, Yu; Chen, Shin-Feng; Wang, Jing-Ru; Kao, Huey-Lien – International Journal of Science and Mathematics Education, 2015
The purpose of this study was to develop a computer-based assessment for elementary school students' listening comprehension of science talk within an inquiry-oriented environment. The development procedure had three steps: a literature review to define the framework of the test, the collection and identification of key constructs of science talk, and…
Descriptors: Listening Comprehension, Science Education, Computer Assisted Testing, Test Construction
Kuo, Che-Yu; Wu, Hsin-Kai; Jen, Tsung-Hau; Hsu, Ying-Shao – International Journal of Science Education, 2015
The potential of computer-based assessments for capturing complex learning outcomes has been discussed; however, relatively little is understood about how to leverage such potential for summative and accountability purposes. The aim of this study is to develop and validate a multimedia-based assessment of scientific inquiry abilities (MASIA) to…
Descriptors: Multimedia Materials, Program Development, Program Validation, Test Construction
Kim, Jungtae; Craig, Daniel A. – Computer Assisted Language Learning, 2012
Videoconferencing offers new opportunities for language testers to assess speaking ability in low-stakes diagnostic tests. To be considered a trusted testing tool in language testing, a test should be examined employing appropriate validation processes [Chapelle, C.A., Jamieson, J., & Hegelheimer, V. (2003). "Validation of a web-based ESL…
Descriptors: Speech Communication, Testing, Language Tests, Construct Validity
Forster, Natalie; Souvignier, Elmar – Learning Disabilities: A Contemporary Journal, 2011
The purpose of this study was to examine the technical adequacy of a computer-based assessment instrument, based on hierarchical models of text comprehension, for monitoring student reading progress following the Curriculum-Based Measurement (CBM) approach. At two-week intervals, 120 third-grade students completed eight CBM tests. To…
Descriptors: Educational Needs, Intervals, Curriculum Based Assessment, Computer Assisted Testing
Kim, Do-Hong; Huynh, Huynh – Educational and Psychological Measurement, 2008
The current study compared student performance between paper-and-pencil testing (PPT) and computer-based testing (CBT) on a large-scale statewide end-of-course English examination. Analyses were conducted at both the item and test levels. The overall results suggest that scores obtained from PPT and CBT were comparable. However, at the content…
Descriptors: Reading Comprehension, Computer Assisted Testing, Factor Analysis, Comparative Testing
