Peer reviewed
ERIC Number: EJ1303530
Record Type: Journal
Publication Date: 2021-Sep
Pages: 12
Abstractor: As Provided
ISBN: N/A
ISSN: 0036-8326
EISSN: N/A
Available Date: N/A
Can AI Be Racist? Color-Evasiveness in the Application of Machine Learning to Science Assessments
Science Education, v105 n5 p825-836 Sep 2021
Assessment developers are increasingly using the emerging technology of machine learning to transform how students are assessed in their science learning. I argue that these algorithmic models further embed the structures of inequality pervasive in the development of science assessments by legitimizing certain language practices that protect the hierarchical standing of status quo interests. My argument is situated within the broader emerging ethical challenges around this new technology. I apply a raciolinguistic equity analysis framework to critique the "new black box" that reinforces structural forms of discrimination against the linguistic repertoires of racially marginalized student populations. The article ends with a set of tactical shifts that can be deployed to build a more equitable and socially just field of machine learning-enhanced science assessments.
Wiley. Available from: John Wiley & Sons, Inc. 111 River Street, Hoboken, NJ 07030. Tel: 800-835-6770; e-mail: cs-journals@wiley.com; Web site: https://www-wiley-com.bibliotheek.ehb.be/en-us
Publication Type: Journal Articles; Reports - Evaluative
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Grant or Contract Numbers: N/A
Author Affiliations: N/A