Should You Use Interactive Content? – A Research Story

Audience Level: 
Intermediate
Institutional Level: 
Higher Ed
Streamed: 
Streamed
Abstract: 

Does the addition of interaction and choice to static documents and video lead to better student learning outcomes? Come experience the research design intended to answer that question, and join us in a discussion about the role of student-content interaction in online and blended learning.

Extended Abstract: 
Session Description:

We don’t want session participants just to hear about our research; we want them to experience it. We will kick off our session by dividing the audience into control and experimental groups. During the first ten to fifteen minutes of the presentation, participants will engage in a mini-experiment modeled after our research project. All participants will take a pre-assessment quiz. The control group will then review traditional content (on the topic of fair use), while the experimental group will review an interactive version of the same content. Both groups will take a post-assessment quiz on the material. We use a Google Form to manage the mini-experiment. Later in the presentation, we will review the anonymously collected session data and compare the scores to the actual research project results.

After the first fifteen minutes simulating the research study, we will briefly review Moore’s (1989) and Northrup’s (2001) models of interaction, which formed the basis for our research design, along with other research on interactive content. We will follow that with a brief discussion of our hypothesis and methodology.

The next stage of the presentation includes a thorough demonstration of the interactions we built and used in the research project. We will also briefly review other tools that could be used to create interactions for online or blended courses. Session participants will receive access to these materials, along with the ones used in our mini-simulation.

Next, we will discuss our preliminary results, which surprised us. Then we will review how the results of the session mini-experiment compare to the actual study’s preliminary results. We will share our current conclusions based on the results of the research study before opening the floor to questions.

At the conclusion of our presentation, we want participants to:

  • Use their presentation experience to guide them in future applications of interactive content in online teaching
  • Understand how to apply interactive elements in their own teaching situations
  • Recognize the pedagogical advantages of interactive content

Research Study:

Our presentation is based on the research described here.

The driving question for our research is: Does the method of information delivery have a significant impact on student learning in an online environment? Our research defines interactive information delivery as a format that requires students to make choices about the order in which they interact with the learning materials and to take specific actions, such as additional mouse clicks, to access the material. The interactive elements in the research were built using Articulate Storyline.

Specifically, we are comparing student learning gains from a pre-quiz to a post-quiz. The control group received instruction through traditional text and video. The experimental group received the same content through enhanced interactive delivery methods that divided the elements into smaller parts and required student interaction to continue moving through the materials. We hypothesized that students who encountered the learning content through the interactive information delivery method would demonstrate greater learning gains than students who encountered it through traditional text and video.

The IRB-approved research is taking place at a large, public mid-Atlantic university over the fall and spring semesters of the 2016-2017 and 2017-2018 academic years. Voluntary student participation is solicited via email. To date, four courses have participated: two from the fall 2016 semester and two from the spring 2017 semester. We anticipate the study will run with two more courses in the fall 2017 semester and two in the spring 2018 semester.

Students are assigned to control or experimental groups. At three points during a 15-week semester, students take a pre-quiz, review course-related material, and take a post-quiz. The control group receives a standard PDF and a YouTube video. The experimental group receives the experimental materials, which are organized in smaller chunks and designed to let students choose how they review the information segments. The text, visuals, and video are the same for the control and experimental groups. Pre- and post-quizzes with the same questions measure learning gains for each of the three sets of materials, and those gains are analyzed.
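
A minimal sketch of this kind of gain-score comparison, in Python, appears below. It is illustrative only: the quiz scores are invented, and the use of an independent-samples t-test is an assumption for this example rather than the study’s documented procedure.

    # Hypothetical sketch of the gain-score comparison described above;
    # all quiz scores are invented for illustration.
    from scipy import stats

    # Each tuple is one student's (pre-quiz, post-quiz) percentage score.
    control = [(60, 78), (55, 63), (70, 85), (65, 72), (58, 74)]
    experimental = [(62, 75), (57, 61), (69, 86), (66, 70), (59, 72)]

    # Learning gain = post-quiz score minus pre-quiz score.
    control_gains = [post - pre for pre, post in control]
    experimental_gains = [post - pre for pre, post in experimental]

    # Independent-samples t-test on the gains. With these invented numbers,
    # the control group gains slightly more on average, but the difference
    # is not statistically significant (p > 0.05), echoing the study's
    # preliminary result.
    t_stat, p_value = stats.ttest_ind(control_gains, experimental_gains)
    print(f"mean gain (control): {sum(control_gains) / len(control_gains):.1f}")
    print(f"mean gain (experimental): {sum(experimental_gains) / len(experimental_gains):.1f}")
    print(f"t = {t_stat:.2f}, p = {p_value:.3f}")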

To date, there is no statistically significant difference between the two groups. Average scores for the control group have been higher than those for the experimental group, but not to a statistically significant degree.

Analysis:

Thus far, the data do not support our hypothesis that students who encountered the learning content through the interactive information delivery method would demonstrate greater learning gains than students who encountered it through static text and video. Although both groups demonstrate learning gains, the control groups score slightly, but consistently, higher than the experimental groups.

Our preliminary conclusions consider the following:

Students may already be well acquainted with the concepts of digital citizenship, common knowledge, and interview protocols. Although the experiment runs in a first-year writing sequence, online enrollment has consistently included upperclassmen, who we assume may have prior knowledge of these fundamental concepts.

Prior to the study’s incorporation of interactive items, the course structure already followed Universal Design for Learning principles: the key content was delivered through text, visual, and auditory mechanisms. Since we know this combination helps students, the added interaction may not bring a measurable or significant impact. Therefore, if video and text are already used in combination for course materials, the extra chunking and click interactions may not be worth the additional build time, at least as measured by student learning outcomes.

There may be a learning curve to the interactive items that is initially impeding their benefits; students may be more focused on the clicking and choosing than on the content itself. 

We may not have pushed the interaction capabilities far enough to take advantage of the tools.    

Simply put, interactive items may not add anything of value to the learning process.

Abridged Citations:

Bernard, R., Abrami, P., Borokhovski, E., Wade, C., Tamim, R., Surkes, M., & Bethel, E. (2009). A Meta-Analysis of Three Types of Interaction Treatments in Distance Education. Review of Educational Research, 79(3), 1243-1289. Retrieved from http://www.jstor.org/stable/40469094

Lusk, D. L., Evans, A. D., Jeffrey, T. R., Palmer, K. R., Wikstrom, C. S. and Doolittle, P. E. (2009), Multimedia learning and individual differences: Mediating the effects of working memory capacity with segmentation. British Journal of Educational Technology, 40: 636–651.

Moore, M. G. (1989). Editorial: Three types of interaction. The American Journal of Distance Education, 3 (2), 1-6.

Northrup, P. (2001). A framework for designing interactivity into Web-based instruction. Educational Technology, 41(2), 31-39.

York, A. M., Stubmo, T., & Nordengren, F. R. (2009). Learner Media Preferences in an Evidence-Based Practice Asynchronous Web Module. MERLOT Journal of Online Learning and Teaching, 5(3).


Conference Session: 
Concurrent Session 5
Session Type: 
Express Workshop