Outcome Assessment of Learning: Content Analysis of Asynchronous Online Discussion

Audience Level: 
All
Session Time Slot(s): 
Institutional Level: 
Higher Ed
Strands (Select 1 top-level strand. Then select as many tags within your strand as apply.): 
Abstract: 

Designing assignments that optimize the educational value of asynchronous discussion tools requires careful consideration of strategies for evaluating learning outcomes. The current study used an a posteriori analysis of students' responses to an online discussion board assignment to identify organically emerging indicators of learning outcomes.

Extended Abstract: 

Use of learning management systems (LMS) has become prevalent across the continuum of education formats (Allen & Seaman, 2013). Specific asynchronous tools such as the discussion board have been shown to be effective for student-instructor as well as student-student communication (Calderon, Ginsberg, & Ciabocchi, 2012; Ke, 2013), constructivist learning (Ke, 2013; Lane, 2014), and critical thinking skills (Williams & Lahman, 2011). However, the literature suggests several challenges in designing, and especially in assessing, online discussion assignments. There is a lack of systematic, comprehensive analysis of online discussion assignments for the purpose of identifying direct learning dimensions that can be objectively evaluated. Current work in this area relies on a priori parameters of critical thinking skills (Lai, 2012); focuses on instrumental dimensions such as the length of communication instances (Brooks & Bippus, 2012); and uses indirect measures of learning, such as analysis of students' reactions to the assignment (Mathews & La Tronica-Herb, 2013) rather than assessment of the learning that occurred as a result of the assignment, or analysis of instructors' perceptions of students' skill development in a course (Klisc, McGill, & Hobbs, 2009). Seeking to fill this gap, and to gain a better understanding of the learning dimensions that emerge in response to an online discussion assignment, the current study used an a posteriori content analysis of students' responses to an online discussion prompt to identify organically emerging indicators of learning, with the goal of creating an evaluation rubric for such assignments. The analysis was guided by the following research question: What are the organically emerging patterns in the content and structure of discussion board posts that can indicate quality of learning?

This presentation describes the assignment, explains the data analysis methodology, and presents a summary of the results. Participants will become familiar with the challenges associated with outcome assessment of online discussion assignments, learn strategies for content analysis of students' posts on a discussion board learning assignment, and explore the relevance of the method described here to their own teaching and to outcome assessment of student learning in an online environment.

Participants

The participants included 49 graduate students enrolled in a psychopathology class at a large private university in New York, USA. The participants (mostly female) represent three cohorts (spring semesters of 2012, 2013, and 2014), all of which were evaluated on the same assignment. The students were required to read a case study (adapted from Morrison, 1994, and identical across the three cohorts) and respond to a prompt from the instructor regarding the proper diagnosis and a treatment plan and/or next steps in addressing the issues presented in the case study. Students were divided into groups and were instructed to respond to each other within their respective groups and to reach a group consensus regarding the diagnosis. Students were given five days to complete the assignment. A total of 83 responses were analyzed. To maintain confidentiality, all responses were de-identified and numerically coded.

The Assignment

Consistent with Ke (2013) and Williams and Lahman (2011), who advocate a group approach to asynchronous online discussion assignments to promote peer-to-peer interaction, the assignment in this study was designed as a group discussion. Each class was randomly divided into three or four groups, typically of three to four students each. Students had access to their own group's asynchronous discussion thread; the instructor had access to all groups' threads. For all three cohorts, students in each group were required to read a weekly case study (this analysis is based on one case study, identical for all groups across the three cohorts) and respond to a prompt from the instructor regarding the proper diagnosis and a treatment plan and/or next steps in addressing the issues presented in the case study. The case study focused on a topic to be discussed in class the following week and required students to complete the weekly assigned readings (as per the syllabus) in order to respond correctly to the prompt. The prompt was identical for all groups and was posted by the instructor six days before the class was scheduled to meet face-to-face. The prompt directed the students to the location on the course LMS dashboard where the case study could be found, and included the question and the due date (24 hours before the class met face-to-face). It is important to note that the prompt in 2014 also included explicit instructions for students to respond to each other's posts. Such language was not included in the previous semesters, although the instructor orally encouraged the students to do so. The instructor read all posts on the due date and provided feedback to each group.

 

Data Analysis

All posts were de-identified and numerically coded by student, by group, and by cohort. The content of students' posts was analyzed using the inductive constant comparison method: identifying units of meaning within each response, grouping units of meaning across responses into categories, and generating themes. The researcher and her assistant conducted an initial joint analysis of a random sample of responses, followed by independent analyses for the purpose of testing inter-rater reliability. The researchers then compared their results, discussed areas of disagreement, and continued the analysis until a point of saturation was reached at which no new categories or themes emerged. Additionally, responses were analyzed for promptness (determined by the response's sequential position on the thread) and for frequency.
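For readers who wish to adapt this approach, the sketch below shows one way such coded data could be organized and summarized. It is a minimal illustration only: the record layout, field names, and toy values are assumptions rather than the study's actual instruments, and post-level agreement is just one of several ways to operationalize inter-rater reliability.

```python
from collections import Counter

# Hypothetical record layout for de-identified posts: numeric codes for
# student, group, and cohort; the post's sequential position on its thread;
# and the category labels each rater assigned to its units of meaning.
posts = [
    {"student": 101, "group": 1, "cohort": 2012, "position": 1,
     "rater_a": {"diagnostic impressions"}, "rater_b": {"diagnostic impressions"}},
    {"student": 102, "group": 1, "cohort": 2012, "position": 2,
     "rater_a": {"constructive criticism"}, "rater_b": {"engaging peers in conversation"}},
    {"student": 101, "group": 1, "cohort": 2012, "position": 3,
     "rater_a": {"treatment plan"}, "rater_b": {"treatment plan"}},
]

# Promptness: a student's earliest sequential position on the thread.
promptness = {}
for p in posts:
    s = p["student"]
    promptness[s] = min(promptness.get(s, p["position"]), p["position"])

# Frequency: number of posts per student.
frequency = Counter(p["student"] for p in posts)

# Inter-rater agreement at the post level: do the raters' category sets match?
matches = [p["rater_a"] == p["rater_b"] for p in posts]
agreement = 100 * sum(matches) / len(matches)

print(promptness)   # {101: 1, 102: 2}
print(frequency)    # Counter({101: 2, 102: 1})
print(f"{agreement:.0f}% inter-rater agreement")  # 67% on this toy sample
```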

Results  

A total of 83 responses were analyzed. The raters disagreed on 23 responses, yielding 72% inter-rater agreement on the content analysis. These disagreements centered on variability in grouping identified units of meaning into categories. There was complete agreement on all generated themes and on all but one of the categories.
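The reported figure follows directly from these counts; as a quick check (plain arithmetic, not part of the study's tooling):

```python
total, disagreements = 83, 23
print(f"{100 * (total - disagreements) / total:.1f}%")  # 72.3%, reported as 72%
```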

Themes and Categories Emerging from the Data

Contextual Communication: Diagnostic impressions; Rationale for diagnostic impressions; Treatment plan

Inter-personal Communication: Positive feedback/encouragement; Constructive criticism; Engaging peers in conversation

Reflective Communication: Reflection on the process of learning; Reflection on the quality of the material; Professional development

Effort and Motivation: Promptness of responses; Frequency of posts

Discussion

The a posteriori themes that emerged from the current analysis reflect the findings of previous a priori analyses (Ke, 2013). However, the current findings reveal that a comprehensive a posteriori analysis yields richer dimensions of online communication via a discussion board. For example, while the current analysis yielded many "agree" or "disagree" responses, about which Lane (2014) cautions, these responses were often accompanied by a rationale for the agreement or disagreement with peers, suggesting that the students engaged in constructivist learning. The results also suggest that students engaged in contextual learning of the material, as evident in responses that addressed diagnostic impressions and their rationale. Of particular interest were results suggesting learning dimensions that are unique to the online format. For example, within the Inter-personal Communication theme, categories of responses included patterns of peer-to-peer communication characterized by positive feedback and encouragement, constructive criticism, and engaging peers in the conversation. In a traditional face-to-face interaction, these patterns of communication tend to emerge in a didactic instructor-student exchange. In the online learning environment, however, students demonstrated learning of complex and collaborative inter-personal communication skills. Similarly, the results suggest that students' responses demonstrate a reflective process that extends beyond contextual learning. For example, students reflected not only on the material in the prompt but also on the process of learning, sharing which aspects of the assignment were enjoyable or challenging, suggesting a spontaneous and independent development of meta-learning skills that, unless directly prompted by the instructor, is typically missing in the time- and space-limited face-to-face environment.

Thus, the results indicate evidence of direct learning that encompasses the development of contextual learning as well as communication and meta-learning skills. The identified themes serve as dimensions of learning, with the respective categories representing specific skills within each dimension, all of which can be ranked to yield indicators of successful learning. These dimensions and skills can be applied deductively to evaluate students' responses to online discussion board assignments across curricular areas for the purpose of direct learning outcome assessment.
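As one way to picture the proposed rubric, the sketch below encodes the themes as dimensions and their categories as skills. The 0-2 scale, the proportional scoring rule, and the function name are illustrative assumptions, not the study's validated instrument.

```python
# Dimensions of learning (themes) and their constituent skills (categories),
# following the grouping shown in the table above.
RUBRIC = {
    "Contextual Communication": {
        "diagnostic impressions",
        "rationale for diagnostic impressions",
        "treatment plan",
    },
    "Inter-personal Communication": {
        "positive feedback/encouragement",
        "constructive criticism",
        "engaging peers in conversation",
    },
    "Reflective Communication": {
        "reflection on the process of learning",
        "reflection on the quality of the material",
        "professional development",
    },
    "Effort and Motivation": {
        "promptness of responses",
        "frequency of posts",
    },
}

def score_response(observed, scale=2):
    """Rate one response on each dimension as the fraction of that
    dimension's skills observed, mapped onto a 0..scale rating."""
    return {
        dim: round(scale * len(observed & skills) / len(skills), 1)
        for dim, skills in RUBRIC.items()
    }

# Example: a post offering a diagnosis with rationale plus constructive criticism.
print(score_response({"diagnostic impressions",
                      "rationale for diagnostic impressions",
                      "constructive criticism"}))
# {'Contextual Communication': 1.3, 'Inter-personal Communication': 0.7,
#  'Reflective Communication': 0.0, 'Effort and Motivation': 0.0}
```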

References

Allen, I. E., & Seaman, J. (2013). Changing Course: Ten Years of Tracking Online Education in the United States. Babson Survey Research Group.

Brooks, C. F., & Bippus, A. M. (2012). Underscoring the Social Nature of Classrooms by Examining the Amount of Virtual Talk across Online and Blended College Courses. European Journal of Open, Distance and E-Learning.

Calderon, O., Ginsberg, P. A., & Ciabocchi, L. (2012). Multidimensional Assessment of Blended Learning: Maximizing Program Effectiveness Based on Student and Faculty Feedback. Journal of Asynchronous Learning Networks, 16(4), 23-37.

Ke, F. (2013). Online Interaction Arrangements on Quality of Online Interactions Performed by Diverse Learners across Disciplines. The Internet and Higher Education, 16, 14-22.

Klisc, C., McGill, T., & Hobbs, V. (2009). The Effect of Assessment on the Outcomes of Asynchronous Online Discussion as Perceived by Instructors. Australasian Journal of Educational Technology, 25(5), 666-682.

Lai, K. (2012). Assessing Participation Skills: Online Discussions with Peers. Assessment & Evaluation in Higher Education, 37(8), 933-947.

Lane, L. M. (2014). Constructing the Past Online: Discussion Board as History Lab. The History Teacher, 47(2), 197-207.

Mathews, A. L., & La Tronica-Herb, A. (2013). Using Blackboard to Increase Student Learning and Assessment Outcomes in a Congressional Simulation. Journal of Political Science Education, 9(2), 168-183.

Williams, L., & Lahman, M. (2011). Online Discussion, Student Engagement, and Critical Thinking. Journal of Political Science Education, 7(2), 143-162.

 

 

Conference Session: 
Concurrent Session 3
Session Type: 
Education Session - Research Highlights