Investigating the promise of automated writing evaluation for supporting formative writing assessment at scale

Date
2022-01-23
Journal Title
Assessment in Education: Principles, Policy & Practice
Publisher
Taylor & Francis
Abstract
We investigated the promise of a novel approach to formative writing assessment at scale involving an automated writing evaluation (AWE) system called MI Write. Specifically, we examined elementary teachers’ perceptions and implementation of MI Write, and the changes in students’ writing performance in three genres from Fall to Spring associated with this implementation. Teachers in Grades 3–5 (n = 14) reported that MI Write was usable, acceptable, useful, and desirable; however, they tended to implement MI Write in a limited manner. Multilevel repeated measures analyses indicated that students in Grades 3–5 (n = 570) generally did not improve their performance from Fall to Spring; the exceptions were third graders in all genres and fourth graders in narrative writing. Findings illustrate the importance of educators utilising scalable formative assessments to evaluate and adjust core instruction.
Description
This is an Accepted Manuscript of an article published by Taylor & Francis in Assessment in Education: Principles, Policy and Practice on 01/23/2022, available online: http://www.tandfonline.com/10.1080/0969594X.2022.2025762. This article will be embargoed until 07/23/2023.
Keywords
Writing assessment, elementary, automated essay scoring, automated writing evaluation
Citation
Joshua Wilson, Matthew C. Myers & Andrew Potter (2022) Investigating the promise of automated writing evaluation for supporting formative writing assessment at scale, Assessment in Education: Principles, Policy & Practice, DOI: 10.1080/0969594X.2022.2025762