In this session, we explore the extent to which college students who chose online courses prior to COVID-19 were more resilient when forced online during spring 2020. We also compare outcomes in courses that were forced to transition online with outcomes in courses that were originally fully online at the beginning of the semester.
Session Goals
Attendees will be able to identify the extent to which taking a fully online course prior to the COVID-19 pandemic correlated with more resilient course outcomes when students were forced into emergency remote teaching. They will also be able to describe differences in student course outcomes by original course mode (originally fully online vs. not originally fully online) and how the observed trends compared to outcomes in prior, non-pandemic years.
Study Motivation
Research on the relationship between online course-taking and college outcomes is conflicting: some studies suggest that online courses are critical to academic progress (Johnson & Mejia, 2014), while others suggest that online course-taking is correlated with poorer outcomes (Jaggars & Xu, 2010; Xu & Jaggars, 2014). Due to ethical and practical concerns, no large, representative randomized controlled trials comparing fully online and face-to-face students have been conducted. However, the spring 2020 term allowed us to leverage the unique circumstances of the COVID-19 pandemic to explore the relationship between online course-taking and college course outcomes for all students, not just those who initially selected the online modality. Specifically, our study explored 1) the extent to which students who chose online courses prior to the onset of the pandemic were more resilient when forced into courses relying on emergency remote teaching (ERT), and 2) because all classes were forced fully online during the spring pandemic term, how outcomes in courses forced to transition online compared to outcomes in courses that were originally fully online at the beginning of the semester.
Theoretical Framework
This research draws on the concept of resilience, defined as “a phenomenon or process reflecting relatively positive adaptation despite experiences of significant adversity or trauma” (Luthar, 2006, p. 6). In this context, we conceptualize resilience as relative: the extent to which a student’s course outcomes improved, stayed the same, or worsened during the pandemic term when all courses were forced online.
Method
This study uses a dataset consisting of all courses taken by students (N = 241,080) enrolled in either fall 2019 or spring 2020 at the City University of New York (CUNY), the third largest university system in the U.S. Courses were classified by instructional mode: a fully online course was any course originally listed as fully online at the beginning of the semester, and all other classes were classified as traditional courses. These classifications describe the semester-long delivery mode for the fall semester; for the spring pandemic semester, they describe only the mode at the start of the term (before the pandemic hit). Course-taking behaviors were also used to classify students. Traditional mode students were students who did not enroll in any courses that were originally fully online in either the fall or the spring semester. Dual mode students were students who enrolled in at least one course section that was originally fully online and at least one traditional course, in both the fall and the spring terms. Our main outcome of interest was successful course completion, defined as course completion with a grade of C- or better. Additional control variables were included: gender, race/ethnicity, age, GPA, first-semester freshman status, the median household income of a student’s zip code, and college level (two-year vs. four-year).
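To make these classification rules concrete, the following is a minimal Stata sketch of the variable construction described above. All variable names (e.g., grade_points, fully_online, student_id, term) are hypothetical placeholders rather than the actual CUNY data fields, and students observed in only one term would need additional handling.

    * Outcome: successful course completion, i.e., a grade of C- or better
    * (assumes a numeric grade-point variable in which C- corresponds to 1.7).
    gen byte success = (grade_points >= 1.7) if !missing(grade_points)

    * Student classification based on course-taking across fall and spring.
    * fully_online = 1 if the section was originally listed as fully online.
    bysort student_id term: egen took_online = max(fully_online)
    bysort student_id term: egen took_trad   = max(1 - fully_online)
    bysort student_id: egen ever_online      = max(took_online)
    bysort student_id: egen dual_both_terms  = min(took_online * took_trad)

    gen byte dual_mode = .
    replace dual_mode = 1 if dual_both_terms == 1   // dual mode students
    replace dual_mode = 0 if ever_online == 0       // traditional mode students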
Stata was used to conduct statistical analyses: mixed for multi-level linear probability models and melogit for multi-level logistic regression models. We report the results of the multi-level linear probability models because our primary interest is in estimating particular parameters rather than forecasting a specific outcome, and because coefficients cannot otherwise be compared across logistic regression models (Buis, 2010). The multi-level models included: (level 1) the individual course record; (level 2) the student; and (level 3) the college where that student took that course. This structure allowed us to control for clustering by student and by college, accounting for the tendency of grades in different classes for a given student, and of grades for classes at the same college, to be more similar to one another. KHB decomposition (Kohler et al., 2011) was used to calculate direct and indirect effects for mediation analysis.
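As a rough illustration (not the study's exact specification), a three-level linear probability model of this kind might be fit in Stata along the following lines. Variable names, the control list, and the mediators are placeholders; khb is the user-written command by Kohler et al. (installable via ssc install khb).

    * Three-level linear probability model: course records (level 1) nested
    * within students (level 2), nested within colleges (level 3).
    mixed success i.dual_mode i.female i.race_eth age gpa i.first_freshman ///
          med_hh_income i.fouryear || college_id: || student_id:

    * Multi-level logistic analogue, estimated for comparison.
    melogit success i.dual_mode i.female i.race_eth age gpa i.first_freshman ///
          med_hh_income i.fouryear || college_id: || student_id: , or

    * KHB decomposition of the effect of dual mode status into direct and
    * indirect (mediated) components, with hypothetical mediators.
    khb logit success dual_mode || gpa med_hh_income, summary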
While it is impossible to make causal inferences based on the observational data collected in this study, it is possible to use the data, along with the conditions surrounding the pandemic, to consider various hypotheses about the relationships among voluntary online course-taking, ERT, and course outcomes. The patterns that we observe may have many different possible explanations, and these will be explored through our results and in conjunction with interactive discussion during the session.
Results and Discussion
- We ran separate models to evaluate course outcomes by term for traditional vs. dual mode students (see the specification sketch after this discussion). Dual mode students had higher overall successful course completion rates than traditional mode students, at both two- and four-year colleges in both terms (). However, the interaction by term was not the same for two-year versus four-year colleges. At four-year colleges, the gap between dual and traditional mode students remained almost the same across terms, increasing slightly. At two-year colleges, however, the gap between dual and traditional mode students grew substantially larger from fall to spring (): traditional mode students did significantly worse in spring than in fall, whereas dual mode students did significantly better in spring than in fall (). These differences in slope for each group were statistically significant ().
This difference by institution type is puzzling. What explanations could account for the observed trends? Come to the session to find out!
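For reference, a difference in fall-to-spring slopes like the one described above could be tested with a term-by-student-mode interaction in the same multi-level framework. The sketch below is illustrative only, with placeholder names (spring, dual_mode, twoyear, and the $controls macro).

    * Placeholder list of the control variables described in the Method section.
    global controls "i.female i.race_eth age gpa i.first_freshman med_hh_income"

    * Term x student-mode interaction, fit separately by institution type
    * (here restricted to two-year colleges).
    mixed success i.spring##i.dual_mode $controls if twoyear == 1 ///
          || college_id: || student_id:

    * Fall-to-spring change in successful completion for each student group;
    * the spring#dual_mode coefficient tests the difference in these slopes.
    margins dual_mode, dydx(spring)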
- We next examined patterns comparing outcomes in fully online vs. traditional mode courses across semesters for dual mode students, which allowed us to consider course mode while holding student characteristics constant (see the specification sketch below). Dual mode students at both two- and four-year colleges did significantly better () in both fully online and traditional courses in the pandemic term than in the pre-pandemic term. However, the gap between fully online and traditional mode courses remained unchanged: in both terms, at both two- and four-year colleges, students were significantly less likely to successfully complete fully online courses than traditional mode courses (), even after all traditional mode courses were moved fully online during the pandemic. There was no significant interaction between term and course mode in 2019-2020. This, too, is puzzling given that all courses moved to a fully online mode early in the spring: we would have expected this gap to disappear once both types of courses were being taught fully online, or even to reverse, given the rushed nature of the transition to an online mode for traditional mode courses and the relative inexperience of most instructors and students in the fully online medium.
So, what could explain this counter-intuitive finding? Come to the session to find out!
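One way to formalize the comparison described above is a term-by-course-mode interaction estimated on dual mode students only. Again, this is a hedged sketch with placeholder names, not the study's exact specification.

    * Placeholder list of the control variables described in the Method section.
    global controls "i.female i.race_eth age gpa i.first_freshman med_hh_income i.fouryear"

    * Term x course-mode interaction among dual mode students only; a null
    * spring#fully_online interaction indicates that the online/traditional
    * gap persisted even after all courses moved online in the spring.
    mixed success i.spring##i.fully_online $controls if dual_mode == 1 ///
          || college_id: || student_id: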
Implications
While findings from this study are not causal, they do indicate that institutions should be particularly cautious about forcing students into a course medium not of their choosing, as well as about overgeneralizing which courses are “good” or “bad” for all students. Until more data are available, the patterns observed in this study suggest that institutions may do better to err on the side of ensuring that students are able to take courses in the mode they feel is best for them. Institutions might want to be especially mindful of indirect ways in which the offering of online and on-campus sections might force students into a mode that they would otherwise not choose. Further, some interesting patterns observed in this study between traditional and fully online courses during the pandemic term call into question assumptions that have been made in the past about online courses. Persistent differences in outcomes between courses that students chose to take on-campus vs. online, even after all courses were shifted fully online, suggest that future research into online course outcomes may be best served by focusing not on the online medium itself, but rather on the characteristics of students who self-select into online courses (e.g., time poverty, stressors, skills, resources) or on the characteristics of the courses themselves (e.g., importance to a student’s major; alignment with student interests; level of synchronicity, interaction, or community).
Interactivity Plan
This session will include an interactive discussion organized around a series of provocative questions grounded in our research results and intended to inspire future research and policy/intervention directions. After an overview of the research, questions showcasing different types of possible explanations for the findings will be posed to participants. Attendee responses will be compared with our supplemental analyses and the existing literature, followed by probing discussion of how assumptions about online course-taking may be shaping policy. Take-aways for future research directions will be elicited, aimed at identifying policies and interventions that could be continued or developed to either support promising pandemic trends or counter new problematic developments. This interactive discussion format will help the online research community, as well as faculty, staff, and administrators who support online students, understand our results and consider how they may be applied in future institutional research and campus support efforts.
[proposal word count not including references: 1482 words]
References
Buis, M. L. (2010). Stata tip 87: Interpretation of interactions in nonlinear models. The Stata Journal, 10(2), 305-308. https://journals.sagepub.com/doi/pdf/10.1177/1536867X1001000211
Jaggars, S. S., & Xu, D. (2010). Online learning in the Virginia Community College System. Community College Research Center (CCRC), Teachers College Columbia University. https://ccrc.tc.columbia.edu/media/k2/attachments/online-learning-virginia.pdf
Johnson, H. P., & Mejia, M. C. (2014). Online learning and student outcomes in California's community colleges. Public Policy Institute of California. https://www.ppic.org/content/pubs/report/R_514HJR.pdf
Kohler, U., Karlson, K. B., & Holm, A. (2011). Comparing coefficients of nested nonlinear probability models. The Stata Journal, 11(3), 420-438.
Luthar, S. S. (2006). Resilience in development: A synthesis of research across five decades. In D. Cicchetti & D. J. Cohen (Eds.), Developmental psychopathology: Risk, disorder, and adaptation (pp. 739–795). John Wiley & Sons. https://doi.org/10.1002/9780470939406.ch20
Xu, D., & Jaggars, S. S. (2014). Performance gaps between online and face-to-face courses: Differences across types of students and academic subject areas. The Journal of Higher Education, 85(5), 633-659. https://doi.org/10.1080/00221546.2014.11777343