Publication Date
In 2025: 0
Since 2024: 0
Since 2021 (last 5 years): 1
Since 2016 (last 10 years): 1
Since 2006 (last 20 years): 2
Source
Journal of Educational Computing Research: 3
Author
Bodmann, Shawn M.: 1
Chen, Cheng-Huan: 1
Clariana, Roy B.: 1
Hu, Yue: 1
Prestera, Gustavo E.: 1
Robinson, Daniel H.: 1
Su, Chien-Yuan: 1
Publication Type
Journal Articles: 3
Information Analyses: 1
Numerical/Quantitative Data: 1
Reports - Evaluative: 1
Reports - Research: 1
Tests/Questionnaires: 1
Education Level
Higher Education: 3
Postsecondary Education: 2
Elementary Education: 1
High Schools: 1
Junior High Schools: 1
Middle Schools: 1
Secondary Education: 1
Location
Asia: 1
Europe: 1
North America: 1
Hu, Yue; Chen, Cheng-Huan; Su, Chien-Yuan – Journal of Educational Computing Research, 2021
Block-based visual programming tools, such as Scratch, Alice, and MIT App Inventor, provide an intuitive, easy-to-use editing interface that supports programming learning for novice students of various ages. However, very little attention has been paid to these tools' overall effects on students' academic achievement…
Descriptors: Instructional Effectiveness, Programming Languages, Computer Science Education, Computer Interfaces
Clariana, Roy B.; Prestera, Gustavo E. – Journal of Educational Computing Research, 2009
This experimental investigation replicates previous studies of how using left-margin screen background color hue to signal lesson sections affects declarative knowledge, and extends them by adding a measure of structural knowledge. Participants (N = 80) were randomly assigned to receive 1 of 4 computer-based lesson treatments…
Descriptors: Memory, Instructional Materials, Replication (Evaluation), Correlation
Bodmann, Shawn M.; Robinson, Daniel H. – Journal of Educational Computing Research, 2004
This study investigated the effect of several different modes of test administration on scores and completion times. In Experiment 1, paper-based assessment was compared to computer-based assessment. Undergraduates completed the computer-based assessment faster than the paper-based assessment, with no difference in scores. Experiment 2 assessed…
Descriptors: Computer Assisted Testing, Higher Education, Undergraduate Students, Evaluation Methods