Assessing L2 English speaking using automated scoring technology: examining automarker reliability

Article Status
Published
Authors/contributors
Xu, J., Jones, E., Laxton, V., & Galaczi, E.
Title
Assessing L2 English speaking using automated scoring technology: examining automarker reliability
Publication
Assessment in Education: Principles, Policy & Practice
Date
2021-07-04
Volume
28
Issue
4
Pages
411-436
Journal Abbr
Assessment in Education: Principles, Policy & Practice
Citation Key
xu2021a
Accessed
14/06/2024, 04:04
ISSN
0969-594X
Short Title
Assessing L2 English speaking using automated scoring technology
Language
en
Library Catalogue
DOI.org (Crossref)
Extra
<Title>: Assessing L2 English speaking using automated scoring technology: examining automarker reliability <AI Smry>: A study that investigated the reliability of an automarker using candidate responses produced in an online oral English test found that the automarker was slightly more lenient than examiner fair average scores, particularly for low-proficiency speakers. Read_Status: New Read_Status_Date: 2026-01-26T11:33:53.531Z
Citation
Xu, J., Jones, E., Laxton, V., & Galaczi, E. (2021). Assessing L2 English speaking using automated scoring technology: examining automarker reliability. Assessment in Education: Principles, Policy & Practice, 28(4), 411–436. https://doi.org/10.1080/0969594X.2021.1979467