Automated language essay scoring systems: a literature review

Article Status
Published
Authors/contributors
Hussein, M. A.; Hassan, H.; Nassef, M.
Title
Automated language essay scoring systems: a literature review
Abstract
Writing composition is a significant factor in measuring test-takers’ ability in any language exam. However, the assessment (scoring) of these writing compositions or essays is a very challenging process in terms of reliability and time. The demand for objective and quick scores has raised the need for computer systems that can automatically grade essay questions targeting specific prompts. Automated Essay Scoring (AES) systems are used to overcome the challenges of scoring writing tasks by using Natural Language Processing (NLP) and machine learning techniques. The purpose of this paper is to review the literature on the AES systems used for grading essay questions.
Publication
PeerJ Computer Science
Volume
5
Pages
e208
Date
2019-8-12
Journal Abbr
PeerJ Comput. Sci.
Language
en
ISSN
2376-5992
Short Title
Automated language essay scoring systems
Accessed
08/11/2024, 16:39
Library Catalogue
DOI.org (Crossref)
Extra
Citation Key: hussein2019 <Title>: Automated language essay scoring systems: a literature review <AI Smry>: The purpose of this paper is to review the literature for the AES systems used for grading the essay questions, and presents a structured literature review of the available Handcrafted Features AES systems.
Citation
Hussein, M. A., Hassan, H., & Nassef, M. (2019). Automated language essay scoring systems: a literature review. PeerJ Computer Science, 5, e208. https://doi.org/10.7717/peerj-cs.208