Learning Analytics-based Policy Recommendations for Fostering Online Student Success

Audience Level: 
All
Institutional Level: 
Higher Ed
Abstract: 

In this study, the researchers aimed to develop evidence-based recommendations for educational policy targeting non-traditional higher education students. The evidence analyzed was archival (de-identified) data gathered in the United Kingdom from students in an online education setting that bears many similarities to American online higher education: namely, non-traditional student demographics, open enrollment, and a completely online education structure. A series of logistic regressions yielded two findings. First, student demographics such as age, sex, and disability status were not, in most contexts, statistically significant predictors of student success. In these contexts, the number of previous attempts at a course or task was a stronger predictor of student completion than student demographics. Second, the number of credits students had earned prior to taking an online course is a useful predictor of whether a student succeeds the first time they attempt an online learning module. Taken together, these findings suggest that targeted policies, such as support services and academic advising geared toward helping students with less prior academic experience complete a course on their first attempt, may well yield greater results than more diffuse policies targeting all students regardless of previous academic experience.

Extended Abstract: 

Higher education leaders are interested in strategies to help promote student success. Many institutions, particularly those serving online students, grapple with student retention. Most studies show that student attrition rates at online institutions are 3% to 5% higher than those of traditional institutions (U.S. News and World Report 2015). Student success is closely tied to accountability (Eaton 2011) and thus there is growing concern with higher attrition rates for online students. Researchers including Astin (1993), Braxton, Hirschy, and McClendon (2004), Pascarella (1985), Spady (1970) and Tinto (1975, 1993) have dedicated several decades to the study of student attrition, yet the growth of online education adds new complexity and a confounding variable to this issue. An ongoing goal for online educators is to foster student success in learning tasks and efficiency in the number of student attempts required to achieve that success. 

Understanding what factors predict student success in tasks and efficiency in student attempts can help leaders make policy decisions about how to structure online learning in the United States. The researchers in this study sought to contribute toward a comprehensive framework of online educational policy actions by extending interpretations of traditional student learning outcome predictors to include online andragogic factors. The two research questions in this study were: (1) To what extent, if any, do age, gender, and number of attempts at task success predict successful completion of a learning module? and (2) To what extent, if any, do age, gender, and number of previously studied credits predict the number of attempts required by students to complete a learning module?

The purpose of this study was to determine predictors of student success in learning tasks. In addition, the researchers wanted to understand efficiency in the number of task attempts at-risk online students need to achieve success. The study was informed by archival (de-identified) data gathered in the United Kingdom from students in an online education setting that bears many similarities to American online higher education: namely, non-traditional student demographics, open enrollment, and a completely online education structure. Therefore, with some limitations, findings from the dataset can be extended to implications within an American online higher education context.

Student retention and graduation are of utmost significance, not only for student success, but also for university reputation and longevity. Student persistence and completion pose unique challenges in today’s online universities. Online programs have become a fast-growing segment of higher education (Carnegie Foundation for the Advancement of Teaching 2011; Planty et al. 2008; U.S. Department of Education 2011). Yet, online program attrition rates continue to be a concern for researchers and practitioners (Council of Graduate Schools 2007, 2010, 2012).

Gilliam and Kritsonis (2006) recommend that leaders identify problems related to student attrition and retention. Fike and Fike (2008) concluded that it is essential to use data to guide decisions supportive of retention and to provide insight into factors influencing student retention. Researchers note that it is more cost effective to retain students than to replace them (Flegle, Pavone, & Flegle 2009).

Student satisfaction with an institution significantly impacts retention (Freeman, Hall, & Bresciani 2007; Luxe, Luse, & Mennecke 2016). While the impact of student satisfaction is often clear, which factors drive satisfaction, and to what degree, remain debated. Researchers have noted that student satisfaction with educational offerings, services, and faculty predicts student retention (Fike & Fike 2008). DeShields, Kara, and Kaynak (2005) noted that many institutions of higher learning are adopting a more customer-oriented philosophy in the delivery of services to bolster student retention (Kara & DeShields 2004).

Quality of faculty is another factor related to student retention (Cole, Kim, & Priddis 2015).  Qualified and satisfied faculty influence the attrition rates of online students because program delivery is strongly related to student experience. Ambrose, Huston, and Norman (2005) identified variables that shape faculty morale and longevity.  Significant sources of satisfaction were salaries, collegiality, mentoring, reappointment, promotion, and tenure. Schreiner (2009) affirmed there is a positive relationship linking student satisfaction with faculty and retention. Faculty engagement during students’ learning experiences is also critical in the effort to increase retention (Tinto 2007).

Helgesen and Nesset (2007) explored how student satisfaction and the perceived reputation of an educational institution relate to loyalty, testing the links among perceived reputation, student satisfaction, and student loyalty. According to Marzano-Navarro, Pedraja-Iglesias, and Rivera-Torres (2005), the relationship among satisfaction with courses, overall student satisfaction, and loyalty to the institution is significant to student retention.

The publicly available dataset used in this study represents de-identified raw data from the Open University system in the United Kingdom (Kuzilek et al. 2015).  As such, a series of data processing steps were necessary to ensure that only the relevant raw data were being analyzed for the purposes of this study.  The original form of this dataset (Kuzilek et al. 2015) is a series of Comma Separated Values (CSV) files presenting different aspects of the data collected.  For this study, the files containing student demographic information and information on number of attempts at online tasks and student success were combined using the unique student identifier as the common case element.  Once combined using MS Excel®, the dataset was imported into IBM SPSS® Version 20, which was then used to process the dataset. 
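The combination step described above can be sketched in pandas. The file and column names (studentInfo.csv, studentAssessment.csv, id_student) follow the published OULAD description (Kuzilek et al. 2015), but the small in-memory frames below are illustrative stand-ins, not study data; in practice each would come from pd.read_csv().

```python
import pandas as pd

# Toy stand-ins for studentInfo.csv and studentAssessment.csv;
# in practice these would be loaded with pd.read_csv().
student_info = pd.DataFrame({
    "id_student": [11391, 28400, 30268],
    "code_module": ["FFF", "FFF", "BBB"],
    "age_band": ["35-55", "0-35", "35-55"],
})
student_assessment = pd.DataFrame({
    "id_student": [11391, 28400, 30268],
    "score": [78, 62, 55],
})

# Inner join on the shared unique student identifier, mirroring the
# Excel combination step described in the study.
merged = student_info.merge(student_assessment, on="id_student", how="inner")
print(len(merged))  # 3 rows, one per matched student
```

An inner join keeps only students present in both files, which matches using the unique identifier as the common case element.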

The data were originally collected in two academic terms (identified as Term B and Term J) across two years (2013 and 2014). Since the type of academic work required may have differed between the two terms (Kuzilek et al. 2015), only the B-term data were used, to minimize confounding variables arising from significantly different academic work requirements. Within each term, the data were further refined using the code assigned to the learning module. For some modules, parts of the historical academic data were not available when the dataset was created (Kuzilek et al. 2015), and these modules were therefore excluded from the analysis to minimize issues with data reliability. To further minimize the effects of confounding variables between learning modules, a single learning module (module FFF) was selected for the analysis.
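Assuming OULAD-style presentation codes (values such as "2013B" or "2014J" in a code_presentation column, with module codes like "FFF" in code_module; these names follow the dataset description and should be verified against the actual files), the term and module refinement reduces to two boolean filters:

```python
import pandas as pd

# Illustrative records; column names follow the assumed OULAD layout.
records = pd.DataFrame({
    "id_student": [1, 2, 3, 4],
    "code_module": ["FFF", "FFF", "GGG", "FFF"],
    "code_presentation": ["2013B", "2014J", "2013B", "2014B"],
})

# Keep only B-term presentations of module FFF, as in the study's
# confound-minimization step.
subset = records[
    records["code_presentation"].str.endswith("B")
    & (records["code_module"] == "FFF")
]
print(sorted(subset["id_student"]))  # [1, 4]
```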

While student identifiers were unique to each student, a particular student may be represented in more than one case since the raw data captured each attempt a student made at a task or module as a separate case.  To avoid counting students more than once (which could produce artificially elevated significance levels possibly resulting in an increased likelihood of Type I error), only the case with the highest numbered attempt associated with it was retained for each student. All other cases associated with that student’s identifier were deleted from the dataset.  
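The deduplication rule (retain only the highest-numbered attempt per student) can be expressed as a sort followed by a keep-last drop of duplicates. The attempt-counter column name below (num_of_prev_attempts) is an assumption based on the OULAD description, and the rows are toy data:

```python
import pandas as pd

# Each row is one attempt by a student; a student can appear in
# several rows, one per attempt.
attempts = pd.DataFrame({
    "id_student": [10, 10, 10, 20, 30, 30],
    "num_of_prev_attempts": [0, 1, 2, 0, 0, 1],
    "final_result": ["Fail", "Fail", "Pass", "Pass", "Fail", "Pass"],
})

# Retain only the highest-numbered attempt per student, so each
# student contributes exactly one case to the regression and
# repeated students do not inflate significance levels.
deduped = (
    attempts.sort_values("num_of_prev_attempts")
    .drop_duplicates("id_student", keep="last")
    .sort_values("id_student")
)
print(deduped["num_of_prev_attempts"].tolist())  # [2, 0, 1]
```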

These findings suggest two conclusions. First, since the number of attempts is a stronger predictor of student completion than student demographics, understanding what influences how many attempts students need in order to succeed at a module can be more useful for targeted educational policy-making. The findings suggest that students who required only a single attempt are 1.5 times more likely to successfully complete the course than those who required more than one attempt. This finding adds to the previous literature on the relationship between motivation and persistence (Luxe, Luse, & Mennecke 2016; Rovai 2003) by suggesting that success, or lack thereof, in the first attempt at an online course may significantly affect the student's desire or ability to persist. Therefore, institutions of online higher learning might find it valuable to track the number of previous attempts as part of their learning analytics measures, as this variable can ultimately shed light on how and why students complete successfully.
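The "1.5 times more likely" figure is an odds ratio of the kind a logistic regression reports (exp of the coefficient on a binary predictor). With illustrative counts, not the study's data, the same quantity can be computed directly from a 2x2 completion table:

```python
# Illustrative counts only (not the study's data): completion
# outcomes split by whether a student needed a single attempt.
single_pass, single_fail = 60, 40   # one attempt
multi_pass, multi_fail = 50, 50     # more than one attempt

# Odds of completion in each group, and their ratio. For a single
# binary predictor this cross-product ratio equals exp(beta) from
# a logistic regression.
odds_single = single_pass / single_fail
odds_multi = multi_pass / multi_fail
odds_ratio = odds_single / odds_multi
print(round(odds_ratio, 2))  # 1.5
```

An odds ratio of 1.5 reads exactly as in the finding: single-attempt students' odds of completion are 1.5 times those of multi-attempt students.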

The second finding has two implications and a significant limitation. The first implication is that the number of credits students had earned prior to taking an online course is a useful predictor of whether a student succeeds the first time they attempt an online learning module. The relationship may be weak, but it is nevertheless statistically significant. This implication builds upon previous literature suggesting that prior academic performance can be an important predictor of student persistence (Berge & Huang 2004). It is recommended that institutions of online higher learning track the credits students have previously completed as part of their learning analytics measures, alongside any other demographic data they collect. By doing so, institutions can focus retention efforts on students who lack extensive academic experience prior to attempting online courses. Such targeted policies, which may include support services and academic advising geared toward helping students with less prior academic experience complete a course on their first attempt, may well yield greater results than more diffuse policies targeting all students regardless of previous academic experience.

The second implication is that for students who need numerous attempts before succeeding, age appears to be a more statistically significant predictor than any other demographic factor. In fact, the findings suggest that older students requiring multiple attempts might be as much as 14 times more likely not to successfully complete the course. This finding aligns closely with earlier findings suggesting that some demographics play a key role in determining persistence and success in the nontraditional student's academic endeavors (Bean & Metzner 1985). Institutions of online higher learning that currently track demographics may find it useful to focus on collecting more detailed age information.

Session Type: 
Discovery Session