Perfecting the blend of communication during teaching/learning events, whether in face-to-face or online environments, is a challenge. This session will offer a framework for analyzing face-to-face and online discourse in blended or hybrid courses. Participants will practice discourse analysis techniques that can provide insight into engaging learners in both online and f2f environments.
Examining Interaction in Blended Courses: SCOPe of Analysis
Perfecting the blend of communication during teaching/learning events, whether in face-to-face or online environments, is a challenge. How to measure the communication blend has become a critical part of online research in recent years as blended environments have proliferated (Anderson, 2003; Melton, Graf, & Chopak-Foss, 2009).
Session Outcomes
1. Participants will gain a deeper understanding of discourse analysis methodology through the use of Plickers (i.e., paper clickers) to interactively code discourse as a group.
2. Participants will suggest conclusions about the language-in-use of blended course participants.
3. Participants will categorize discourse episodes into a framework for analyzing face-to-face and online discourse in blended courses.
Research Question: How can instructors achieve an optimal blend of communication episodes in blended courses?
Sub-Questions:
a. How do instructors measure the interaction that occurs in online and f2f environments "on the fly"?
b. What do instructors do with the information once they have tabulated the results of online discourse analysis?
c. Why should instructors pay attention to the discourse that occurs in blended courses?
Literature Review
Researchers have long understood that engagement and interaction need to be an intentional part of the instructional design of any successful face-to-face or online course (Author, 2007; Althaus, 1997; King & Doerfert, 1996; Smith, 2005; Spring, Graham, & Ikahihifo, 2018; Zhang, Perris, & Yeung, 2005). In fact, students feel transactional distance (Moore, 1991), or a psychological gap, when they are separated geographically from instructors in online courses. Authentic interactions in online courses can ameliorate this gap and encourage social presence (Moore, 1991; Short, Williams, & Christie, 1976; Stein, Wanstreet, Calvin, Overtoom, & Wheaton, 2005). Indeed, students reported feeling better equipped and more knowledgeable about outcome achievement when interactions were deepened (López-Pérez, Pérez-López, & Rodríguez-Ariza, 2011). But how can instructors design deep, authentic interactions and gauge their students' interactions?
Careful structuring of blended courses can also reduce transactional distance, since students and instructors meet together physically during the course of the semester (Anderson, 2003; Bonk, Olson, Wisher, & Orvis, 2002; Chen & Jones, 2007; Melton, Graf, & Chopak-Foss, 2009). In terms of learning effectiveness, blended courses were found to offer small to moderate positive effect sizes when compared to fully online or fully traditional face-to-face learning environments (Dziuban, Graham, Moskal, Norberg, & Sicilia, 2018).
Research Methodology
Discourse analysis techniques offer one way to understand how participants use language in blended courses (Author, 2012; Gunawardena, Lowe, & Anderson, 1998; Imm & Stylianou, 2012). Traditionally, researchers analyzed face-to-face interactions for the ways that participants used their language, or the moves they made (Britton, 1993; Cazden, 1988; Dillon, 1994). In online courses, discussion boards provide written speech that is easily captured and analyzed for moves, or reasons for communication (Althaus, 1997; Gunawardena, Lowe, & Anderson, 1998).
Students who are highly engaged and who participate often in online and face-to-face courses are more successful and achieve higher grades (Author, 2007; 2012). The nexus of high participation (measured by counting the number of words used in discussion board forums or f2f discussions) and high engagement (measured by counting the variety of moves used in discussion board forums or f2f discussions) has been called a Connected Stance (Author, 2012). In several studies of asynchronous online discussion boards, as well as synchronous face-to-face discussions, students who assumed a Connected Stance earned higher grades and reported higher course satisfaction than students who assumed a Disconnected Stance, that is, those who participated less than average and used fewer moves for communicating.
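As a rough illustration of how these two measures can be operationalized, the sketch below tallies per-speaker word counts (participation) and distinct moves (engagement) from an already-coded transcript and labels each speaker relative to the group averages. It is a minimal sketch under assumed data shapes, not the instrument used in the studies cited above; the Turn structure, the averaging thresholds, and the "Mixed" label are assumptions introduced for this example.

```python
# Minimal sketch (assumed data shapes, not the published instrument) for
# profiling participation and engagement from an already-coded transcript.

from dataclasses import dataclass
from statistics import mean


@dataclass
class Turn:
    speaker: str   # student or instructor identifier
    text: str      # the words of the post or utterance
    move: str      # discourse move assigned by a human coder


def stance_profile(turns: list[Turn]) -> dict[str, str]:
    """Label each speaker by word count (participation) and move variety (engagement)."""
    words: dict[str, int] = {}
    moves: dict[str, set[str]] = {}
    for t in turns:
        words[t.speaker] = words.get(t.speaker, 0) + len(t.text.split())
        moves.setdefault(t.speaker, set()).add(t.move)

    avg_words = mean(words.values())
    avg_moves = mean(len(m) for m in moves.values())

    profile = {}
    for speaker in words:
        high_participation = words[speaker] >= avg_words
        high_engagement = len(moves[speaker]) >= avg_moves
        if high_participation and high_engagement:
            profile[speaker] = "Connected Stance"
        elif not high_participation and not high_engagement:
            profile[speaker] = "Disconnected Stance"
        else:
            profile[speaker] = "Mixed"  # assumption: the cited studies may treat these cases differently
    return profile
```

On this sketch, a student who writes many words but repeats a single move would not be labeled Connected, which mirrors the distinction drawn above between participation and engagement.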
In order to determine which students have assumed a Connected Stance, researchers need to analyze language patterns and assign codes to language-in-use. However, face-to-face or synchronous online sessions pose a unique challenge to instructors who wish to analyze the discourse in their own classes.
Research Results and Discussion
In order to meet this challenge, some researchers suggest using meta-coding schemes during communication events in order to capture real-time language-in-use. The potential number of moves, or reasons why people communicate (Cazden, 1988), is vast, and time constraints and content coverage often impede instructors' willingness to analyze discourse, especially during face-to-face learning events. The SCOPe Framework (Author, 2013; 2018) provides one technique that instructors can employ to gather a sample of discourse moves during an interaction event. The framework divides communication events into four categories: Self-referencing language, Content-referencing language, Other-referencing language, and Platform-referencing language. Using these four broad moves, instructors can identify the ways in which students use their language in academic settings. The presentation will demonstrate this technique and reveal results of ongoing research on online and f2f communication events.
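To make the four categories concrete, the following sketch shows one way an instructor might tabulate codes gathered "on the fly" during a discussion. The (speaker, code) data format, the helper function, and the sample exchange are illustrative assumptions rather than part of the published framework.

```python
# Illustrative tally of turns coded "on the fly" into the four SCOPe categories.
# The (speaker, code) pairs and helper function are assumptions for this example.

from collections import Counter

SCOPE_CATEGORIES = {
    "S": "Self-referencing language",
    "C": "Content-referencing language",
    "O": "Other-referencing language",
    "P": "Platform-referencing language",
}


def scope_tally(coded_turns: list[tuple[str, str]]) -> Counter:
    """Count how often each SCOPe category appears in a list of (speaker, code) pairs."""
    return Counter(code for _, code in coded_turns if code in SCOPE_CATEGORIES)


# Example: a short exchange coded live during a face-to-face discussion
sample = [("student_1", "C"), ("student_2", "S"), ("student_1", "O"),
          ("instructor", "C"), ("student_3", "P")]
for code, count in scope_tally(sample).most_common():
    print(f"{SCOPE_CATEGORIES[code]}: {count}")
```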
Presentation Engagement
During the presentation, participants will experience how to analyze discussion board data and oral discourse data. They will also discuss various interpretations of the data, drawing conclusions about which teaching practices the data show to be engaging. Finally, participants will categorize the data into the framework and suggest ways in which the blended course structure might strengthen the communication episodes.
Bibliography
Author (2007)
Author (2012)
Author (2013)
Author (2018)
Althaus, S.L. (1997). Computer-mediated communication in the university classroom: An experiment with online discussions. Communication Education, 46, 158-174.
Anderson, T. (2003). Getting the mix right again: An updated and theoretical rationale for interaction. The International Review of Research in Open and Distance Learning, 4(2). Retrieved from http://www.irrodl.org/index.php/irrodl/article/view/149/230
Bonk, C.J., Olson, T.M., Wisher, R.A., & Orvis, K.L. (2002). Learning from focus groups: An examination of blended learning. Journal of Distance Education, 17(3), 97-118.
Britton, J. (1993). Language and learning. Portsmouth, NH: Boynton/Cook.
Cazden, C. (1988). Classroom discourse. Portsmouth, NH: Heinemann.
Chen, C.C., & Jones, K.T. (2007). Blended learning vs. traditional classroom settings: Assessing effectiveness and student perceptions in an MBA accounting course. The Journal of Educators Online, 4(1), 1-15.
Dillon, J. T. (1994). Using discussion in classrooms. Philadelphia: Open University Press.
Dziuban, C., Graham, C. R., Moskal, P. D., Norberg, A., & Sicilia, N. (2018). Blended learning: the new normal and emerging technologies. International Journal of Educational Technology in Higher Education, 15(1), 3.
Gunawardena, C. N., Lowe, C. A., & Anderson, T. (1998). Transcript analysis of computer-mediated conferences as a tool for testing constructivist and social-constructivist learning theories. In Distance learning ‘98. Proceedings of the annual conference on distance teaching & learning (pp. 139–145). EDRS document ED 422854
Imm, K., & Stylianou, D. A. (2012). Talking mathematically: An analysis of discourse communities. Journal of Mathematical Behavior, 31(1), 130-148.
King, J. C., & Doerfert, D. L. (1996). Interaction in the distance education setting. Retrieved from http://www.ssu.missouri.edu/ssu/AgEd/NAERM/s-e-4.htm
López-Pérez, M., Pérez-López, M. C., & Rodríguez-Ariza, L. (2011). Blended learning in higher education: Students' perceptions and their relation to outcomes. Computers & Education, 56(3), 818-826.
Melton, B., Graf, J., & Chopak-Foss, J. (2009). Achievement and satisfaction in blended learning versus traditional general health course designs. International Journal for the Scholarship of Teaching and Learning, 3(1).
Mondada, L. (2006). Participants’ online analysis and multimodal practices: Projecting the end of the turn and the closing of the sequence. Discourse Studies, 8(1), 117-129.
Moore, M. G. (1991). Distance education theory. The American Journal of Distance Education, 5(3), 1-6.
Short, J., Williams, E., & Christie, B. (1976). The social psychology of telecommunications. London: John Wiley & Sons.
Smith, R. (2005). Working with difference in online collaborative groups. Adult Education Quarterly, 55(3), 182-199.
Spring, K. J., Graham, C. R., & Ikahihifo, T. B. (2018). Learner Engagement in Blended Learning. In Encyclopedia of Information Science and Technology, Fourth Edition (pp. 1487-1498). IGI Global.
Stein, D. S., Wanstreet, C. E., Calvin, J., Overtoom, C., & Wheaton, J. E. (2005). Bridging the transactional distance gap in online learning environments. The American Journal of Distance Education, 19(2), 105.
Zhang, W., Perris, K., & Yeung, L. (2005). Online tutorial support in open and distance learning: students’ perceptions. British Journal of Educational Technology, 36(5), 789-804.