ERIC Number: ED672932
Record Type: Non-Journal
Publication Date: 2025
Pages: 15
Abstractor: As Provided
ISBN: N/A
ISSN: N/A
EISSN: N/A
Available Date: N/A
A Multimodal Interactive Framework for Science Assessment in the Era of Generative Artificial Intelligence
Yizhu Gao1; Xiaoming Zhai1; Min Li2; Gyeonggeon Lee3; Xiaoxiao Liu4
Grantee Submission
The rapid evolution of generative artificial intelligence (GenAI) is transforming science education by enabling innovative pedagogical paradigms while raising substantial concerns about scholarly integrity. One particularly pressing issue is the growing risk that students will use GenAI tools to outsource assessment tasks, potentially compromising authentic learning and valid evaluation. Addressing these challenges requires reflection on existing assessment practices and features. This position paper advances a conceptual framework for science assessment through the lenses of "multimodality" and "interactivity." Multimodality emphasizes the use of diverse, organized semiotic resources for meaning making, while interactivity characterizes assessment environments in which outcomes are shaped by students' actions. Along these two dimensions, our multimodal interactive framework classifies assessments into four categories with varying degrees of modality and interactivity. We argue that tasks with higher modality and interactivity can potentially mitigate GenAI-related threats to academic integrity. To further articulate this point, we provide concrete assessment examples for each category and explain how the prompt and response affordances in each category help gauge students' understanding of key science constructs and identify tasks that are resistant or susceptible to AI-based outsourcing. We conclude by discussing how the framework serves as a meaningful analytical tool for educational researchers and practitioners. [This is the online first version of an article published in "Journal of Research in Science Teaching."]
Descriptors: Artificial Intelligence, Computer Software, Science Education, Integrity, Risk, Outsourcing, Student Evaluation, Authentic Learning, Evaluation Methods, Semiotics, Barriers, Guidelines, Science Tests, Scientific Concepts, Concept Formation, Learner Engagement, Test Format, Interaction Process Analysis, Academic Standards, Elementary Secondary Education
Publication Type: Reports - Evaluative
Education Level: Elementary Secondary Education
Audience: N/A
Language: English
Sponsor: Institute of Education Sciences (ED)
Authoring Institution: N/A
IES Funded: Yes
Grant or Contract Numbers: R305C240010
Department of Education Funded: Yes
Author Affiliations: 1Department of Mathematics, Science and Social Studies Education, University of Georgia, Athens, Georgia, USA; 2College of Education, University of Washington, Seattle, Washington, USA; 3National Institute of Education, Nanyang Technological University, Jurong West, Singapore; 4Department of Educational Psychology, University of Alberta, Edmonton, Alberta, Canada