Publication Date
| In 2026 | 0 |
| Since 2025 | 0 |
| Since 2022 (last 5 years) | 0 |
| Since 2017 (last 10 years) | 0 |
| Since 2007 (last 20 years) | 1 |
Descriptor
| Graduate Students | 17 |
| Higher Education | 17 |
| Test Format | 17 |
| Multiple Choice Tests | 8 |
| Test Items | 7 |
| Test Construction | 6 |
| Comparative Analysis | 5 |
| Difficulty Level | 5 |
| Student Attitudes | 4 |
| Undergraduate Students | 4 |
| Comparative Testing | 3 |
Author
| Tollefson, Nona | 3 |
| Tripp, Alice | 2 |
| Benshoff, James M. | 1 |
| Chang, Lei | 1 |
| Chen, Ju Shan | 1 |
| Chissom, Brad | 1 |
| Chukabarah, Prince C. O. | 1 |
| Cleary, T. Anne | 1 |
| Dirkes, M. Ann | 1 |
| Geranpayeh, Ardeshir | 1 |
| Hancock, Gregory R. | 1 |
Publication Type
| Reports - Research | 15 |
| Journal Articles | 8 |
| Speeches/Meeting Papers | 4 |
| Guides - Classroom - Teacher | 1 |
| Reports - Evaluative | 1 |
| Tests/Questionnaires | 1 |
Education Level
| Higher Education | 2 |
Audience
| Practitioners | 1 |
| Researchers | 1 |
Assessments and Surveys
| International English… | 1 |
| Minnesota Multiphasic… | 1 |
| Test of English as a Foreign… | 1 |
| UCLA Loneliness Scale | 1 |
Hanshaw, Larry G. – College Student Journal, 2012
This study sought to determine how students would describe their group-only cooperative testing experiences in terms of key elements of cooperative learning often cited in the literature. Written comments of 159 graduate students were analyzed and 26 related categories of comments were derived from 495 statements of students enrolled in two…
Descriptors: Achievement Gains, Cooperative Learning, Teaching Methods, Graduate Students
Pinsoneault, Terry B. – Computers in Human Behavior, 1996 (peer reviewed)
Computer-assisted and paper-and-pencil-administered formats for the Minnesota Multiphasic Personality Inventories were investigated. Subjects were 32 master's and doctoral-level counseling students. Findings indicated that the two formats were comparable and that students preferred the computer-assisted format. (AEF)
Descriptors: Comparative Analysis, Computer Assisted Testing, Graduate Students, Higher Education
Benshoff, James M.; Thomas, Wayne P. – Counselor Education and Supervision, 1992 (peer reviewed)
Reexamined Counselor Evaluation Rating Scale (CERS) using confirmatory factor analysis. Analyzed 185 self-rated CERs. Findings suggest that, when counselors use CERS to rate themselves, different factors may emerge from those emerging when experienced supervisors use CERS to evaluate supervisee progress and performance. (Author/NB)
Descriptors: Counselor Training, Evaluation Methods, Factor Structure, Graduate Students
Tollefson, Nona; Chen, Ju Shan – 1986
This study compared item difficulty and item discrimination indices for parallel multiple-choice items in three content areas: measurement concepts, statistical terminology, and synonyms. The statistics and measurement items were administered in classes where graduate students taking the test were studying the content. Vocabulary items represented…
Descriptors: Difficulty Level, Graduate Students, Higher Education, Item Analysis
Sinkavich, Frank J. – 1988
The relationship between metamemorial accuracy and student test performance was assessed in a graduate educational psychology class. Metamemory is defined as information about the content of one's own memory. Adult students in two courses (N=67) were asked to select multiple choice answers in two midterms and a final examination and to rate their…
Descriptors: Cognitive Processes, Confidence Testing, Educational Psychology, Graduate Students
Tollefson, Nona; Tripp, Alice – 1986
The item difficulty and item discrimination of three multiple-choice item formats were compared in experimental and non-experimental settings. In the experimental study, 104 graduate students were randomly assigned to complete one of three forms of a multiple-choice test: (1) a complex alternative ("none of the above") as the correct answer; (2) a…
Descriptors: Achievement Tests, Difficulty Level, Discriminant Analysis, Graduate Students
Tollefson, Nona; Tripp, Alice – 1983
This study compared the item difficulty and item discrimination of three multiple-choice item formats: a complex alternative ("none of the above") as the correct answer; a complex alternative as a foil; and the one-correct-answer format. One hundred four graduate students were randomly assigned to complete…
Descriptors: Analysis of Variance, Difficulty Level, Graduate Students, Higher Education
Geranpayeh, Ardeshir – Edinburgh Working Papers in Applied Linguistics, 1994
This paper reports on a study conducted to determine if comparisons between scores on the Test of English as a Foreign Language (TOEFL) and the International English Language Testing Service (IELTS) are justifiable. The test scores of 216 Iranian graduate students who took the TOEFL and IELTS, as well as the Iranian Ministry of Culture and Higher…
Descriptors: Comparative Analysis, English (Second Language), Foreign Countries, Graduate Students
Miller, Timothy R.; Cleary, T. Anne – Educational and Psychological Measurement, 1993 (peer reviewed)
The degree to which statistical item selection reduces direction-of-wording effects in balanced affective measures developed from relatively small item pools was investigated with 171 male and 228 female undergraduate and graduate students at 2 U.S. universities. Clearest direction-of-wording effects result from selection of items with high…
Descriptors: Affective Measures, Correlation, Factor Analysis, Graduate Students
Swartz, Stephen M. – Journal of Education for Business, 2006
The confidence level (information-referenced testing; IRT) design is an attempt to improve upon the multiple choice format by allowing students to express a level of confidence in the answers they choose. In this study, the author evaluated student perceptions of the ease of use and accuracy of and general preference for traditional multiple…
Descriptors: Multiple Choice Tests, Essay Tests, Graduate Students, Student Attitudes
Cognitive Complexity and the Comparability of Multiple-Choice and Constructed-Response Test Formats.
Hancock, Gregory R. – Journal of Experimental Education, 1994 (peer reviewed)
To investigate the ability of multiple-choice tests to assess higher order thinking skills, examinations were constructed as half multiple choice and half constructed response. Results with 90 undergraduate and graduate students indicate that the 2 formats measure similar constructs at different levels of complexity. (SLD)
Descriptors: Cognitive Processes, Comparative Analysis, Constructed Response, Educational Assessment
Chissom, Brad; Chukabarah, Prince C. O. – 1985
The comparative effects of various sequences of test items were examined for over 900 graduate students enrolled in an educational research course at The University of Alabama, Tuscaloosa. The experiment, which was conducted a total of four times using four separate tests, presented three different arrangements of 50 multiple-choice items: (1)…
Descriptors: Analysis of Variance, Comparative Testing, Difficulty Level, Graduate Students
Ory, John C. – 1983
Information on the uses, advantages, and limitations of different types of test items is presented for college faculty, along with guidelines for developing test items. Included is advice on: choosing between objective and subjective test items, when essay tests or objective tests are appropriate, and when either essay or objective tests can be…
Descriptors: Check Lists, College Instruction, Essay Tests, Graduate Students
Chang, Lei – 1993
Equivalence in reliability and validity across 4-point and 6-point scales was assessed by fitting different measurement models through confirmatory factor analysis of a multitrait-multimethod covariance matrix. Responses to nine Likert-type items designed to measure perceived quantitative ability, self-perceived usefulness of quantitative…
Descriptors: Ability, Comparative Testing, Education Majors, Graduate Students
Dirkes, M. Ann – 1984
A divergent thinking (DT) test format, scored for flexible and original thinking, is presented. The DT test format, designed to assess teacher competencies and estimate the transfer of competencies to new situations, was administered to graduate students enrolled in a testing course. The DT format allowed students to list phrases and sentence…
Descriptors: Academic Achievement, Achievement Tests, Cognitive Tests, Divergent Thinking