ERIC Number: ED589516
Record Type: Non-Journal
Publication Date: 2016
Pages: 183
Abstractor: As Provided
ISBN: 978-1-3399-8285-4
ISSN: N/A
EISSN: N/A
Available Date: N/A
Psychometric Properties of Technology-Enhanced Item Formats: An Evaluation of Construct Validity and Technical Characteristics
Crabtree, Ashleigh R.
ProQuest LLC, Ph.D. Dissertation, The University of Iowa
The purpose of this research is to provide information about the psychometric properties of technology-enhanced (TE) items and the effects these items have on the content validity of an assessment. Specifically, this research investigated the impact that the inclusion of TE items has on the construct of a mathematics test, the technical properties of these items, and the influence these item types have on test characteristics. An empirical dataset was used to investigate the impact of including TE items on a multiple-choice (MC) assessment. The test used was the Iowa End-of-Course Algebra I (IEOC-A) assessment. The sample included 3,850 students from the state of Iowa who took the IEOC-A assessment in the spring of 2012. The base form of the Algebra EOC assessment consisted of 30 MC items. Sixty TE items were developed and aligned to the same blueprint as the MC items. These items were appended in sets of five to the base form, resulting in 12 different test forms, which were randomly assigned to students during the spring administration window. Several methods were used to form a more complete understanding of the content characteristics and technical properties of TE items. This research first examined whether adding TE items to an established MC exam had an effect on the construct of the test. The factor analysis confirmed a two-factor model comprising latent factors of MC and TE items, indicating that TE items may add a new dimension to the test. Following these findings, a more thorough analysis of the item pool was conducted, and IRT analyses were performed to investigate item information, test information, and relative efficiency. These analyses indicated that students may perform differently on MC and TE items; within this particular item pool, there is evidence of a difference between the two item types, and this difference may manifest as an additional, perhaps unintended, construct on the exam. Additionally, TE items may perform differently depending on the ability level of the student; specifically, TE items may provide more information and measure the construct more efficiently than MC items at higher ability levels. Finally, the quantity of TE items included on a test has the potential to affect the relative efficiency of the instrument, underscoring the importance of selecting items that reinforce the purpose and uses of the test. [The dissertation citations contained here are published with the permission of ProQuest LLC. Further reproduction is prohibited without permission. Copies of dissertations may be obtained by telephone (1-800-521-0600). Web page: http://www.proquest.com/en-US/products/dissertations/individuals.shtml.]
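For readers unfamiliar with the IRT quantities named in the abstract (item information, test information, relative efficiency), the standard definitions are sketched below under an assumed two-parameter logistic (2PL) model. The record does not state which IRT model the dissertation used, so these formulas are a generic illustration of the technique, not a description of the author's specific analysis.

\[
  P_j(\theta) = \frac{1}{1 + \exp\bigl(-a_j(\theta - b_j)\bigr)}
  \qquad \text{(2PL probability of a correct response to item } j\text{)}
\]
\[
  I_j(\theta) = a_j^2\, P_j(\theta)\,\bigl(1 - P_j(\theta)\bigr)
  \qquad \text{(item information)}
\]
\[
  I(\theta) = \sum_{j=1}^{J} I_j(\theta)
  \qquad \text{(test information: the sum over the } J \text{ items administered)}
\]
\[
  \mathrm{RE}(\theta) = \frac{I_{\mathrm{TE}}(\theta)}{I_{\mathrm{MC}}(\theta)}
  \qquad \text{(relative efficiency of one item set versus another)}
\]

Under these definitions, a relative-efficiency curve above 1.0 at high values of \(\theta\) would correspond to the abstract's finding that TE items may measure the construct more efficiently than MC items at higher ability levels.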
Descriptors: Psychometrics, Computer Assisted Testing, Test Items, Test Format, Construct Validity, Test Construction, Mathematics Tests, Multiple Choice Tests, Algebra, Item Response Theory, Efficiency
ProQuest LLC. 789 East Eisenhower Parkway, P.O. Box 1346, Ann Arbor, MI 48106. Tel: 800-521-0600; Web site: http://www.proquest.com/en-US/products/dissertations/individuals.shtml
Publication Type: Dissertations/Theses - Doctoral Dissertations
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Identifiers - Location: Iowa
Grant or Contract Numbers: N/A
Author Affiliations: N/A