Automated Essay Scoring in an English as a Second Language Setting

Title: Automated Essay Scoring in an English as a Second Language Setting.
Name(s): Dikli, Semire, author
Hasson, Deborah, professor directing dissertation
Jeong, Allan, outside committee member
Kennell, Patrick, committee member
Wood, Susan, committee member
Department of Middle and Secondary Education, degree granting department
Florida State University, degree granting institution
Type of Resource: text
Genre: Text
Issuance: monographic
Date Issued: 2007
Publisher: Florida State University
Place of Publication: Tallahassee, Florida
Physical Form: computer
online resource
Extent: 1 online resource
Language(s): English
Abstract/Description: The main purpose of this study was to explore how two ESL students who were exposed to AES feedback, as opposed to two who were presented with written teacher feedback (TF), incorporated the type of feedback they received into their drafts. The participants were adult ESL students attending the Intensive English Center at a university in North Florida. A class of 12 students was divided into two groups: approximately half of the students received computerized feedback (AES group), and the other half received written feedback from the teacher (TF group). The focus of this study, however, was four case study students (two from each group). The data were collected from various sources: a) diagnostic essays; b) student essays on five writing prompts (both first and subsequent drafts); c) analytic and/or holistic feedback assigned to the essays either by the MY Access!® program or by the teacher; d) demographic, computer literacy, and opinion surveys; e) student and teacher interviews; and f) classroom observations. The results of the study revealed that the AES feedback and the written feedback differed in nature: while the written TF was shorter and more focused, the AES feedback was quite long and generic. The MY Access!® program provided an extensive number of feedback points on all five traits. The document (essay) analysis revealed that the program suggested twice as many usable feedback points as the teacher provided in written feedback. However, the students who were exposed to the MY Access!® program used only half of the usable feedback points suggested. The results also showed that the feedback points suggested on the five traits were quite similar within pairs and substantially different across pairs.
Furthermore, while the extent to which each pair used the type of feedback they received in their drafts was quite similar within pairs for most traits, it varied dramatically across pairs for all prompts, with the exception of the mechanics and conventions feedback. This study is unique because no research had been published on the use of an AES system in an ESL classroom setting at the time this study was conducted. It is the only study to focus on the feedback capacities of an AES program rather than its scoring ability.
Identifier: FSU_migr_etd-0080 (IID)
Submitted Note: A Dissertation submitted to the Department of Middle and Secondary Education in partial fulfillment of the requirements for the degree of Doctor of Philosophy.
Degree Awarded: Summer Semester, 2007.
Date of Defense: December 8, 2006.
Keywords: Writing, Assessment, Technology, ESL
Bibliography Note: Includes bibliographical references.
Advisory committee: Deborah Hasson, Professor Directing Dissertation; Allan Jeong, Outside Committee Member; Patrick Kennell, Committee Member; Susan Wood, Committee Member.
Subject(s): Education
Persistent Link to This Record:
Host Institution: FSU

Dikli, S. (2007). Automated Essay Scoring in an English as a Second Language Setting. Retrieved from