TY - JOUR
T1 - Designing Academic Writing Analytics for Civil Law Student Self-Assessment
AU - Knight, Simon
AU - Buckingham Shum, Simon
AU - Ryan, Philippa
AU - Sándor, Ágnes
AU - Wang, Xiaolong
N1 - Publisher Copyright:
© 2016, The Author(s).
PY - 2018/3/1
Y1 - 2018/3/1
N2 - Research into the teaching and assessment of student writing shows that many students find academic writing a challenge to learn, with legal writing no exception. Improving the availability and quality of timely formative feedback is an important aim. However, the time-consuming nature of assessing writing makes it impractical for instructors to provide rapid, detailed feedback on hundreds of draft texts which might be improved prior to submission. This paper describes the design of a natural language processing (NLP) tool to provide such support. We report progress in the development of a web application called AWA (Academic Writing Analytics), which has been piloted in a Civil Law degree. We describe: the underlying NLP platform and the participatory design process through which the law academic and analytics team tested and refined an existing rhetorical parser for the discipline; the user interface design and evaluation process; and feedback from students, which was broadly positive but also identified important issues to address. We discuss how our approach is positioned in relation to concerns regarding automated essay grading, and ways in which AWA might provide more actionable feedback to students. We conclude by considering how this design process addresses the challenge of making explicit to learners and educators the underlying mode of action in analytic devices such as our rhetorical parser, which we term algorithmic accountability.
KW - Argumentation
KW - Civil law
KW - Learning analytics
KW - Natural language processing
KW - Participatory design
KW - Rhetoric
KW - Writing analytics
UR - http://www.scopus.com/inward/record.url?scp=85043300510&partnerID=8YFLogxK
DO - 10.1007/s40593-016-0121-0
M3 - Article
SN - 1560-4292
VL - 28
SP - 1
EP - 28
JO - International Journal of Artificial Intelligence in Education
JF - International Journal of Artificial Intelligence in Education
IS - 1
ER -