Showing 1 to 15 of 58 results
Peer reviewed
Sen, Sedat – Creativity Research Journal, 2022
The purpose of this study was to estimate the overall reliability of the scores produced by the Runco Ideational Behavior Scale (RIBS) and to explore the variability of RIBS score reliability across studies. To achieve this, a reliability generalization meta-analysis was carried out using the 86 Cronbach's alpha estimates obtained from 77 studies…
Descriptors: Generalization, Creativity, Meta Analysis, Higher Education
Peer reviewed
Davison, Christopher B.; Dustova, Gandzhina – Journal of Instructional Pedagogies, 2017
This study describes the correlations between student performance and examination format in a higher education teaching and research institution. The researchers employed a quantitative, correlational methodology using linear regression analysis. The data were obtained from undergraduate student test scores over a three-year span…
Descriptors: Statistical Analysis, Performance Based Assessment, Correlation, Higher Education
Jin, Yan – Journal of Pan-Pacific Association of Applied Linguistics, 2011
The College English Test (CET) is an English language test designed for educational purposes, administered on a very large scale, and used for making high-stakes decisions. This paper discusses the key issues facing the CET during the course of its development in the past two decades. It argues that the most fundamental and critical concerns of…
Descriptors: High Stakes Tests, Language Tests, Measures (Individuals), Graduates
Peer reviewed
Holley, Joyce H.; Jenkins, Elizabeth K. – Journal of Education for Business, 1993
The relationship between performance on four test formats (multiple-choice theory, multiple-choice quantitative, open-ended theory, open-ended quantitative) and scores on the Kolb Learning Style Inventory was investigated for 49 accounting students. Learning style was significant for all formats except multiple-choice quantitative. (SK)
Descriptors: Accounting, Cognitive Style, Higher Education, Scores
Peer reviewed
Plake, Barbara S. – Journal of Experimental Education, 1980
Three item orderings and two levels of knowledge of ordering were used to study differences in test results, students' perceptions of the test's fairness and difficulty, and students' estimates of their test performance. No significant order effect was found. (Author/GK)
Descriptors: Difficulty Level, Higher Education, Scores, Test Format
Peer reviewed
Alexander, Melody W.; Bartlett, James E.; Truell, Allen D.; Ouwenga, Karen – Journal of Career and Technical Education, 2001
Students in a computer technology course completed either a paper-and-pencil test (n=40) or an online test in a proctored computer lab (n=43). Test scores were equivalent, but the online group, particularly freshmen, completed the test in less time. Online testing time did not correlate with test score. (Contains 30 references.) (SK)
Descriptors: Academic Achievement, Computer Assisted Testing, Computers, Higher Education
Carstens, Paul W.; McKeag, Robert A. – 1982
This study used a test-retest procedure to investigate the effects of a change in the order, or sequence, of test items on student performance. College juniors (n=102) were given a 50-item multiple-choice and matching-item test on the general subject of educational measurement. The items had no particular sequence, but had simply…
Descriptors: Higher Education, Objective Tests, Performance Factors, Scores
Peer reviewed
Phillips, Fred – Journal of Education for Business, 1999
Accounting students (n=202) had different preferences for learning discrete facts, quick and easy problems, and new and ambiguous situations. On a multiple-choice test and unstructured task completed by 73 students, preference for quick and easy problems distinguished poor and good performers on the task but not on the test. (SK)
Descriptors: Accounting, Business Administration Education, Cognitive Style, Educational Environment
Peer reviewed
Eaves, Ronald C.; Smith, Earl – Journal of Experimental Education, 1986
The effects of examination format and previous experience with microcomputers on the test scores of 96 undergraduate students were investigated. Results indicated no significant differences in the scores obtained on the two types of test administration (microcomputer and traditional paper and pencil). Computer experience was not an important…
Descriptors: College Students, Computer Assisted Testing, Educational Media, Higher Education
Peer reviewed
Mason, B. Jean; Patry, Marc; Berstein, Daniel J. – Journal of Educational Computing Research, 2001
Discussion of adapting traditional paper and pencil tests to electronic formats focuses on a study of undergraduates that examined the equivalence between computer-based and traditional tests when the computer testing provided opportunities comparable to paper testing conditions. Results showed no difference between scores from the two test types.…
Descriptors: Comparative Analysis, Computer Assisted Testing, Higher Education, Intermode Differences
Peer reviewed
Henk, William A. – Journal of Reading Behavior, 1981
Analyzes alternative cloze forms, derived from selected deletion strategies, scoring procedures, and blank conditions, for their respective effects on the cloze test performance of college-level readers. (HOD)
Descriptors: Cloze Procedure, College Students, Higher Education, Reading Research
Peer reviewed
Jenkins, Elizabeth K.; Holley, Joyce H. – Research in Higher Education, 1990
A study examined the interactive effects of accounting students' language background (English as a first or second language) and test item format on test scores. Items on the same material were in four formats: multiple choice quantitative, multiple choice theoretical, open-ended quantitative, and open-ended essay questions. (Author/MSE)
Descriptors: Accounting, Achievement Tests, English (Second Language), Higher Education
Peer reviewed
Laffitte, Rondeau G., Jr. – Teaching of Psychology, 1984
A study involving undergraduate college students enrolled in an introductory psychology course showed that test item arrangement by difficulty or by order of content presentation has no effect on total achievement test score. The data also fail to demonstrate any influence of test item order on student perception of test difficulty. (RM)
Descriptors: Difficulty Level, Educational Research, Higher Education, Psychology
Peer reviewed
Cziko, Gary A. – TESOL Quarterly, 1982
Describes an attempt to construct an ESL dictation test that would: (1) be appropriate for a wide range of ability, (2) be easy and fast to score, (3) consist of set items that would form both a unidimensional and cumulative scale, and (4) yield scores that would be directly interpretable with respect to specified levels of English proficiency.…
Descriptors: Criterion Referenced Tests, English (Second Language), Higher Education, Scores
Peer reviewed
Loo, S. Robert; Thorpe, Karran – Educational and Psychological Measurement, 1999
Used samples of 142 management and 123 nursing undergraduates to evaluate the psychometric properties and factor structure of the newly developed Form S (short form) of the Watson-Glaser Critical Thinking Appraisal (G. Watson and E. Glaser, 1964, 1994). Results provide only limited support for Form S, and further refinement is suggested. (SLD)
Descriptors: Administration, Critical Thinking, Higher Education, Nursing