Out with the Old: Reviewing the Effectiveness of Individualized Feedback in Online Writing Courses

Audience Level: 
All
Institutional Level: 
Higher Ed
Abstract: 

This study proposes that collective feedback in online writing courses (and any courses using writing assessments) may positively impact student performance while providing opportunities for authentic online writing practices. This mixed methods study uses student performance data in conjunction with learning analytics to understand how students engage with collective feedback.

Extended Abstract: 

Online writing instruction (OWI) too often mimics the writing process used in the traditional classroom. By and large, online writing instructors continue to turn to the draft-and-feedback strategies used in traditional classrooms to help students acquire the skills necessary to be effective writers. In the traditional classroom, this process becomes a dialogue between instructor and student, in which the student translates the specific, individualized comments provided by the instructor into content revisions. In the online space, writing instructors use this same process to try to create social presence in the course and to mimic face-to-face instructor-student interactions. However, what would ideally culminate in strategic revisions and metacognitive awareness of writing on the students’ part instead devolves into individualized, lengthy, marginal comments that are time-intensive for instructors yet often go unseen or unacknowledged by the student.

A previous study of collective feedback in online learning revealed that there is room for more research in this area. Gallien and Oomen-Early (2008) studied the quantitative impact of collective feedback versus individualized commenting in an online health course by reviewing final grades for the two sections. While their quantitative data revealed that students who received individualized commenting earned higher final grades, their qualitative data showed a different effect: students reported feeling no more or less connected to the instructor regardless of the method of feedback they received. This study brought to light two important points: (1) instructor feedback may not be as impactful in creating presence as we would like to believe, and (2) more research is needed that takes a more targeted look at the impact collective feedback has on student performance on formative and summative assessments.

Despite studies revealing that individualized commenting may not build the social presence instructors intend, it remains a long-standing practice in online writing assessment pedagogy. Yet, term after term, instructors voice frustrations about whether students use (or even review) the feedback provided to them to improve their writing. Without alternative solutions, instructor burnout, time constraints, and other factors turn this traditional feedback strategy into a rote practice of copying and pasting recurring comments addressing the same errors across multiple student drafts. Essentially, instructors perpetuate the “genre of the end comment” (Smith, 1997) that scholars have long warned against, and feedback eventually becomes meaningless and useless to both student and instructor.

This research seeks to fill the gap within online feedback studies and TPC online pedagogy studies by placing the community of inquiry model and the significance of social presence into conversation with the call for more effective, authentic online writing assessment practices within Technical and Professional Communication service courses (and, more broadly, any courses using writing assessments). This research argues that an effective alternative to individualized commenting does exist, one that serves multiple purposes: to prepare students for authentic, online collaboration practices (Dannel, 2011; Paretti, 2006, 2008), to answer the call for updated OWI pedagogy beyond strategies adapted from traditional classrooms (Dannel, 2000), and to allow online instructors to reallocate time to other, more impactful teaching practices that contribute more specifically to creating a positive social presence within the community of inquiry model (Anderson, 2008; Richardson and Swan, 2003; Garrison, Anderson, and Archer, 2001).

Additionally, this study incorporates learning management system (LMS) data analytics, or learning analytics (LA), to further quantitatively understand how students use and access feedback within an online writing course. LA has been a helpful tool for identifying at-risk student behaviors and for revealing important self-regulated learning practices (Dietz-Uhler & Hurn, 2013; Macfadyen & Dawson, 2010; You, 2016). However, LA research warns against relying too heavily on the data without understanding its strategic impact on course design. This study seeks to fill that space by using LA to understand student behaviors that may help instructors better design their courses to deliver feedback in a way students will use. Ultimately, LA regarding student access to feedback may finally answer the long-standing question: “Do my students read my comments?” This study will review the LA from three online writing courses to understand whether students accessed the collective feedback, how they accessed it, and any available frequency-of-access data.
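To make concrete the kind of question LA can answer here, the sketch below is a minimal, purely illustrative example of summarizing an LMS activity export to see whether, and how often, students opened a collective feedback page. It is not the study's instrument: the file name, the column names (student_id, item, timestamp), the feedback item label, and the section size are all assumptions made only for the example.

```python
import pandas as pd

# Illustrative sketch only: summarize a hypothetical LMS activity export to
# see whether, and how often, students opened a collective feedback page.
# The file name, column names, and item label below are assumptions.

ACCESS_LOG = "lms_activity_export.csv"   # hypothetical LMS export file
FEEDBACK_ITEM = "Collective Feedback"    # hypothetical feedback page title


def summarize_feedback_access(path: str) -> pd.DataFrame:
    """Return per-student access counts and first/last access times."""
    log = pd.read_csv(path, parse_dates=["timestamp"])

    # Keep only the events where the accessed item is the feedback page.
    feedback_events = log[log["item"] == FEEDBACK_ITEM]

    # One row per student: how many times, and when, they viewed the feedback.
    summary = (
        feedback_events.groupby("student_id")["timestamp"]
        .agg(access_count="count", first_access="min", last_access="max")
        .reset_index()
    )
    return summary


if __name__ == "__main__":
    summary = summarize_feedback_access(ACCESS_LOG)
    enrolled = 25  # hypothetical section size
    print(f"{len(summary)} of {enrolled} students opened the feedback at least once")
    print(summary.sort_values("access_count", ascending=False).head())
```

A summary like this, aggregated across the three course sections, is the sort of frequency-of-access evidence the study draws on; the actual analysis depends on the specific LMS and the fields its export provides.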

In the end, this study’s findings suggest that adopting the innovative methodology of collective feedback may improve student performance in the online writing classroom. Additionally, this study uses learning analytics to better understand whether students access instructor feedback in online writing courses. Based on this data and student performance with collective feedback, this study ultimately asks: is it time for out with the old and in with the new?

Conference Track: 
Learning Effectiveness
Session Type: 
Education Session
Intended Audience: 
Faculty
Instructional Support
Researchers