Publication Date
| In 2026 | 0 |
| Since 2025 | 2 |
| Since 2022 (last 5 years) | 2 |
| Since 2017 (last 10 years) | 4 |
| Since 2007 (last 20 years) | 6 |
Descriptor
| Computer Assisted Testing | 11 |
| Multiple Choice Tests | 11 |
| Problem Solving | 11 |
| Mathematics Tests | 7 |
| Test Items | 5 |
| Test Construction | 4 |
| Algebra | 3 |
| Scores | 3 |
| Student Evaluation | 3 |
| Teaching Methods | 3 |
| Test Format | 3 |
Author
| Andrei Ludu | 1 |
| Bennett, Randy Elliot | 1 |
| Braswell, James S. | 1 |
| Bridgeman, Brent | 1 |
| Chenglu Li | 1 |
| Hai Li | 1 |
| Jackson, Carol A. | 1 |
| Jones, Ian | 1 |
| Krupa, Erin | 1 |
| Kuo, Bor-Chen | 1 |
| Li, Cheng-Hsuan | 1 |
Publication Type
| Journal Articles | 7 |
| Reports - Research | 6 |
| Reports - Evaluative | 3 |
| Reports - Descriptive | 2 |
| Speeches/Meeting Papers | 2 |
Education Level
| Higher Education | 2 |
| Junior High Schools | 2 |
| Middle Schools | 2 |
| Secondary Education | 2 |
| Elementary Secondary Education | 1 |
| Grade 10 | 1 |
| Grade 9 | 1 |
| High Schools | 1 |
| Postsecondary Education | 1 |
Location
| Taiwan | 1 |
| United Kingdom | 1 |
Laws, Policies, & Programs
| No Child Left Behind Act 2001 | 1 |
Assessments and Surveys
| SAT (College Admission Test) | 2 |
| Graduate Record Examinations | 1 |
| Praxis Series | 1 |
| Preliminary Scholastic… | 1 |
| Program for International… | 1 |
Andrei Ludu; Maria Ludu; Teha Cooks – Journal of Computers in Mathematics and Science Teaching, 2025
This paper presents research on computer-based mathematics learning that studies the effectiveness of an open-source teaching platform (Canvas) for computer-assisted instruction. We designed a set of multiple-choice online quizzes as a dynamic flowchart of possible paths to follow while solving a difficult math problem on…
Descriptors: Teaching Methods, Computer Assisted Instruction, Mathematics Education, Engineering Education
Hai Li; Wanli Xing; Chenglu Li; Wangda Zhu; Simon Woodhead – Journal of Learning Analytics, 2025
Knowledge tracing (KT) is a method to evaluate a student's knowledge state (KS) based on their historical problem-solving records by predicting the next answer's binary correctness. Although widely applied to closed-ended questions, it lacks a detailed option tracing (OT) method for assessing multiple-choice questions (MCQs). This paper introduces…
Descriptors: Mathematics Tests, Multiple Choice Tests, Computer Assisted Testing, Problem Solving
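The abstract above describes knowledge tracing as predicting the next answer's binary correctness from a student's response history. A minimal sketch of classic Bayesian Knowledge Tracing illustrates that prediction step (the paper itself proposes an option-tracing extension for MCQs; the parameter values here are illustrative assumptions, not fitted estimates):

```python
# Minimal Bayesian Knowledge Tracing (BKT) sketch: track the probability that
# a student has mastered a skill from a sequence of right/wrong answers.
# All parameter defaults below are illustrative assumptions.

def bkt_trace(observations, p_init=0.2, p_transit=0.15, p_slip=0.1, p_guess=0.25):
    """Return the predicted probability of a correct answer before each response."""
    p_mastery = p_init
    predictions = []
    for correct in observations:
        # Predict the next answer's binary correctness from the current state.
        p_correct = p_mastery * (1 - p_slip) + (1 - p_mastery) * p_guess
        predictions.append(p_correct)
        # Bayesian update of the knowledge state given the observed answer.
        if correct:
            posterior = p_mastery * (1 - p_slip) / p_correct
        else:
            posterior = p_mastery * p_slip / (1 - p_correct)
        # Allow a learning transition between practice opportunities.
        p_mastery = posterior + (1 - posterior) * p_transit
    return predictions

preds = bkt_trace([1, 1, 0, 1])  # 1 = correct, 0 = incorrect
```

Each prediction is computed before the corresponding observation, which is what lets the model be evaluated on next-answer correctness.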
Kuo, Bor-Chen; Liao, Chen-Huei; Pai, Kai-Chih; Shih, Shu-Chuan; Li, Cheng-Hsuan; Mok, Magdalena Mo Ching – Educational Psychology, 2020
The current study explores students' collaboration and problem solving (CPS) abilities using a human-to-agent (H-A) computer-based collaborative problem solving assessment. Five CPS assessment units with 76 conversation-based items were constructed using the PISA 2015 CPS framework. In the experiment, 53,855 ninth and tenth graders in Taiwan were…
Descriptors: Computer Assisted Testing, Cooperative Learning, Problem Solving, Item Response Theory
Sangwin, Christopher J.; Jones, Ian – Educational Studies in Mathematics, 2017
In this paper we report the results of an experiment designed to test the hypothesis that when faced with a question involving the inverse direction of a reversible mathematical process, students solve a multiple-choice version by verifying the answers presented to them by the direct method, not by undertaking the actual inverse calculation.…
Descriptors: Mathematics Achievement, Mathematics Tests, Multiple Choice Tests, Computer Assisted Testing
Krupa, Erin; Webel, Corey; McManus, Jason – North American Chapter of the International Group for the Psychology of Mathematics Education, 2013
We share results from a quasi-experimental study in which we compared achievement between traditional lecture-based and computer-based sections of college algebra on a common multiple choice exam as well as performance on problem solving items. Students in the computer-based group performed better on the final exam and were also more likely to…
Descriptors: Algebra, Mathematics Instruction, Quasiexperimental Design, Comparative Analysis
Tucker, Bill – Educational Leadership, 2009
New technology-enabled assessments offer the potential to understand more than just whether a student answered a test question right or wrong. Using multiple forms of media that enable both visual and graphical representations, these assessments present complex, multistep problems for students to solve and collect detailed information about an…
Descriptors: Research and Development, Problem Solving, Student Characteristics, Information Technology
Singley, Mark K.; Bennett, Randy Elliot – 1995
One of the main limitations of the current generation of computer-based tests is its dependency on the multiple-choice item. This research was aimed at extending computer-based testing by bringing limited forms of performance assessment to it in the domain of mathematics. This endeavor involves not only building task types that better reflect…
Descriptors: Computer Assisted Testing, Item Analysis, Mathematics Tests, Multiple Choice Tests
Sheehan, Kathleen; Mislevy, Robert J. – 1994
The operating characteristics of 114 mathematics pretest items from the Praxis I: Computer Based Test were analyzed in terms of item attributes and test developers' judgments of item difficulty. Item operating characteristics were defined as the difficulty, discrimination, and asymptote parameters of a three parameter logistic item response theory…
Descriptors: Basic Skills, Computer Assisted Testing, Difficulty Level, Educational Assessment
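The difficulty, discrimination, and asymptote parameters named in this abstract are those of the three-parameter logistic (3PL) IRT model; a standard formulation (using common IRT notation, not necessarily the paper's own symbols) is:

$$
P_i(\theta) = c_i + (1 - c_i)\,\frac{1}{1 + e^{-a_i(\theta - b_i)}}
$$

where $P_i(\theta)$ is the probability that an examinee with ability $\theta$ answers item $i$ correctly, $a_i$ is the discrimination, $b_i$ the difficulty, and $c_i$ the lower asymptote (pseudo-guessing) parameter.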
Braswell, James S.; Jackson, Carol A. – 1995
A new free-response item type for mathematics tests is described. The item type, referred to as the Student-Produced Response (SPR), was first introduced into the Preliminary Scholastic Aptitude Test/National Merit Scholarship Qualifying Test in 1993 and into the Scholastic Aptitude Test in 1994. Students solve a problem and record the answer by…
Descriptors: Computer Assisted Testing, Educational Assessment, Guessing (Tests), Mathematics Tests
Bridgeman, Brent – Journal of Educational Measurement, 1992 (peer reviewed)
Examinees in a regular administration of the quantitative portion of the Graduate Record Examination responded to particular items in a machine-scannable multiple-choice format. Volunteers (n=364) used a computer to answer open-ended counterparts of these items. Scores for both formats demonstrated similar correlational patterns. (SLD)
Descriptors: Answer Sheets, College Entrance Examinations, College Students, Comparative Testing
O'Neill, Paula N. – Journal of Dental Education, 1998 (peer reviewed)
Examines various methods for assessing dental students' learning in a problem-based curriculum, including objective structured clinical examination; clinical proficiency testing; triple jump evaluation (identifying facts, developing hypotheses, establishing learning needs to further evaluate the problem, solving the learning needs, presenting…
Descriptors: Allied Health Occupations Education, Clinical Teaching (Health Professions), Computer Assisted Testing, Curriculum Design

