Pedagogical Practices in a Fully Online Biology Lab Course: How Do Student Perceptions Correlate with Achievement?

Final Presentation: 
Audience Level: 
All
Session Time Slot(s): 
Institutional Level: 
Higher Ed
Supplemental File: 
Abstract: 

We investigated the extent to which various instructional practices of a fully online undergraduate Biology course contribute to student learning. Out of 10 practices, only guided discussions were found effective at deepening student understanding of the course content.

Extended Abstract: 

Introduction

Online learning is one of the fastest-growing trends for learners and educators. According to the Online Report Card (Seaman, Allen, & Seaman, 2018), online student enrollments have increased for the fourteenth straight year. Online coursework and distance education programs have become increasingly popular at every level of education. In 2016, 3,338 U.S. degree-granting higher education institutions offered distance education programs, and in fall 2014 approximately 3 million students (14% of the total) took all of their courses online (National Center for Education Statistics, 2016).

Given the rapid growth in online courses and in institutions offering fully online degree programs, it is important to establish a strong foundation of research-based guidance on how the design and implementation of online courses affect student learning, and to help online education shed the negative reputation it sometimes carries. As a step toward these goals, the purpose of this study was to investigate how much the different instructional practices used in a fully online Biology course contributed to student learning, in terms of students’ perceived usefulness of these practices. The research question guiding the study is: What is the relative effect of instructional practices on student learning?

Prior research

Empirical evidence shows that online learning is an effective alternative to face-to-face courses for undergraduate students (Means, Toyama, Murphy, Bakia, & Jones, 2010). In online settings in higher education specifically, research on student achievement falls into four main categories, examining the effects of 1) student motivation and self-regulation (Rakes & Dunn, 2010); 2) student behaviors recorded through trace, log, or learning management system (LMS) data (Gašević, Dawson, & Siemens, 2015); 3) student characteristics and learning styles (Loomis, 2000); and 4) students’ perceptions of the various practices and components of an online course, such as interaction, discussions, and lecture videos (Hegeman, 2015; Song, Singleton, Hill, & Koh, 2004).

These four categories of research have identified different factors related to student achievement. For example, student behaviors such as log data, frequency of opening a page, reading time, and number of posts created have been associated with student learning (You, 2016). These behaviors reveal how students interact with each other, when they take quizzes and submit assignments, and how much they participate in online discussions. However, studies have shown that the frequency of such activities, or the amount of time spent on them, is not always related to student performance (Hadwin, Nesbit, Jamieson-Noel, Code, & Winne, 2007).

Similarly, previous research has examined student perceptions and how they relate to best practices in online learning. Understanding what students perceive as helpful is crucial for designing online courses that support better student performance (Lao & Gonzales, 2005; Young & Norgard, 2006). This information has typically been gathered through survey studies that examined both students’ and instructors’ perceptions of the elements that enable students to be successful (Ralston-Berg, 2011). However, these studies did not connect perceptions to student outcomes. In fact, there is little empirical evidence linking students’ perceptions of instructional practices to their learning at the end of an online course (Song et al., 2004). Relating students’ perceptions to their learning can yield insights for instructional design and online teaching, and can also point future studies toward producing higher-quality online courses.

The Context

As demand for online education has grown nationwide, the University of Florida (UF) began offering fully online programs and bachelor’s degrees through a program called UF Online. One of the courses offered in UF Online is Plant Diversity, a 2000-level undergraduate Botany lab course that includes instructional components enabling students to work both independently and collaboratively. The four-credit course has 13 modules and an integrated Botany lab.

The course was designed according to a framework drawn from a meta-analysis by Means et al. (2010). The meta-analysis found that 1) online instruction is effective when it is collaborative and instructor-directed (rather than independent); 2) elements such as videos or online quizzes do not, by themselves, affect student learning; 3) self-control of interaction and prompts for reflection can enhance the online learning experience; 4) adding synchronous communication with peers or the instructor is not a significant moderator of online learning effectiveness; 5) combining text-based materials with other media has a greater impact than text-based materials alone; 6) opportunities for practice, simulations, and feedback are significant practices in online environments; and 7) instructions to guide interaction in group work do not appear to improve learning outcomes.

In brief, the ten instructional practices used in the Plant Diversity course are:

  1. Lectures are professionally produced, instructor-led video materials.

  2. Textbook is a short text covering many of the main concepts in the course.

  3. Formative exams (13) are six-item multiple-choice tests that students take at the end of each module.

  4. Group work assignments (4) are structured collaborative activities in which students work in small groups to accomplish multiple tasks.

  5. Individual assignments (18) are short, quick-check activities in which students explore a Botany concept or process by answering open- and closed-ended questions.

  6. Discussions (6) are implemented in small groups and allow students to debate controversial Botany topics and paradoxes, share their responses to open-ended questions, and reply to posts by other students.

  7. Peer-reviewed activities (5) are short learning activities in which students post their responses to a video or a case. Peer reviewers are expected to raise questions and bounce ideas off each other by providing different perspectives.

  8. Reflections (4) are self-assessments of student learning from a reading, intended to help students rehearse concepts and descriptions.

  9. GoFlag activities (3) are cyberlearning activities developed around socially relevant themes and real-world connections of plants.

  10. Labs (11) provide hands-on learning experiences for students to develop the skills of a botanist, such as observing and drawing.

The course was created in the Canvas LMS and all the practices were implemented through Canvas.

Method

The study used a correlational survey design. There were 105 undergraduate students in the course, which lasted 13 weeks.

Achievement test: This comprehensive test includes 36 multiple-choice questions that assess students’ understanding from a broad perspective. It was created to measure students’ procedural knowledge (acquired by doing) and a posteriori knowledge (which requires reasoning and reflection). Each question is worth 1 point, so total scores range from 0 to 36.

Survey of perceptions: This eleven-item survey asks students about their perceptions of the instructional practices they engaged in during the course. For each instructional practice, one question asks how well that practice deepened students’ understanding of the content. An additional question assesses students’ experience and satisfaction with peer reviews.

The data were analyzed using multiple linear regression in SPSS 23.
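
For illustration only, a minimal sketch of an equivalent analysis in Python (pandas and statsmodels) is shown below. The data file, the column names (p1–p10 for the ten perception items, achievement for the 0–36 test score), and the standardization step are assumptions made for the sketch; the actual analysis in the study was conducted in SPSS 23.

    # Minimal sketch of the regression analysis; the study itself used SPSS 23.
    # Assumed (hypothetical) data layout: one row per student, columns p1..p10
    # holding Likert-coded perception ratings and 'achievement' holding the 0-36 score.
    import pandas as pd
    import statsmodels.api as sm

    df = pd.read_csv("plant_diversity_data.csv")  # hypothetical file name

    predictors = [f"p{i}" for i in range(1, 11)]
    cols = predictors + ["achievement"]

    # Standardize all variables so the estimated coefficients are standardized betas,
    # comparable to a reported coefficient such as the .45 for discussions.
    z = (df[cols] - df[cols].mean()) / df[cols].std()

    X = sm.add_constant(z[predictors])         # add intercept term
    model = sm.OLS(z["achievement"], X).fit()  # ordinary least squares regression
    print(model.summary())                     # coefficients and p-values per practice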

Findings   

Findings show that the only instructional practice with a statistically significant effect on student learning was discussions (ß = .45, p < .05). This can be interpreted as follows: as students’ perception that discussions deepen their understanding of the content becomes more positive (i.e., moving from slightly well toward very well), their learning improves.
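
For reference, if ß is read as a standardized regression coefficient (an assumption, since the abstract does not state whether standardized or unstandardized coefficients are reported), the estimated relationship can be written as

    \hat{z}_{\text{achievement}} = 0.45 \cdot z_{\text{discussions}}

that is, a one standard deviation increase in students’ perceived usefulness of discussions is associated with an expected 0.45 standard deviation increase on the achievement test, holding the other perception items constant.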

Discussion

The finding of this study is consistent with the meta-analysis by Means et al. (2010), which concluded that students’ self-control of interaction and prompts for reflection can enhance the online learning experience.

Our study has implications for practice and design in online courses. The discussions in this Biology course were structured to help students engage with the content. Students took on the role of an introverted thinker when digesting contrasting responses from their peers and of an extroverted thinker when reflecting on and sharing their views about a dilemma or controversial topic. Giving and receiving feedback were encouraged through prompts from the instructor. Discussions were also a gateway to applying knowledge in the hands-on labs: students could interact with their peers before completing the labs, which strengthened their knowledge and prepared them for a more rigorous activity.

The findings of this study also have implications for research. The other nine practices did not yield statistically significant results. This might be due to the use of single items to measure students’ perceptions of each instructional practice. Another reason could be that these practices focused on deepening other types of knowledge; for example, readings and video lectures were designed to promote factual or declarative knowledge.

Future research that replicates the findings of this study is needed. To make confident claims about the effect of instructional practices on student learning, measuring each practice with more items might provide more robust results. Analyses that incorporate additional factors, such as self-regulation variables or student log data, could also offer different perspectives.

Interactivity

At the beginning and end of the session, the audience will be asked to join two Kahoot! quiz challenges, and the presenters will discuss the answers.

Conference Session: 
Concurrent Session 8
Conference Track: 
Research
Session Type: 
Education Session
Intended Audience: 
Administrators
Faculty
Instructional Support
Training Professionals
Technologists
Researchers