Hear how faculty and administrators at one online institution use both high-tech and low-tech data analytics for multiple purposes: at the class level to identify at-risk students and focus faculty outreach, at the department level to guide faculty development and scheduling, and at the institution level to inform curriculum revisions.
Analytics are a powerful tool, but what good are they if not used to determine the ‘current state’ of a course, student, instructor, and institution, and to impact behavior and outcomes? Individual faculty, departments, and institutions often use data to review faculty, department, or curriculum performance once the semester concludes, or in multi-year departmental and institutional reviews. This presentation focuses on the use of data not just in retrospect but also while a class is in progress, in order to improve both student and faculty performance before the class concludes.
With the COVID-19 pandemic and its restrictions on in-person contact, data can often step in. In the in-person classroom, an instructor can determine immediately where students are struggling, based largely on facial cues and through discussion. In the online classroom, getting a pulse of student understanding is different. Working without facial cues or in-person classroom discussions, online faculty can leverage course-level learning analytics and asynchronous discussion to identify where to direct their instructional efforts and how to customize outreach based on the needs of individual students or the entire cohort, depending on the time of the semester/session.
Department chairs rely on face-to-face classroom observations to evaluate faculty performance and to provide kudos and guidance as needed. Such data also often guides future adjunct scheduling to ensure instructors’ areas of strength are leveraged. For online classes (fully online or hybrid/blended), data analytics can fill the gaps and supplement the information gathered from a virtual course observation.
Institutions that have high-tech data at their disposal can easily drill down to student participation, assignment grades, course completion, and pass rates to identify indicators of student retention and persistence. Similarly, comparing D/F/W rates for the same class taught by multiple faculty members across multiple sessions/semesters can help highlight trends in student behavior or gaps in the curriculum.
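As a rough illustration only (not any particular institution’s reporting tool), the following sketch shows how such a D/F/W comparison might be computed from a hypothetical gradebook export; the file name and column names are assumptions made for the example:

```python
# Minimal sketch: compare D/F/W rates for one course across instructors and terms.
# Assumes a hypothetical CSV export (grades.csv) with illustrative columns:
#   instructor, term, final_grade
import pandas as pd

grades = pd.read_csv("grades.csv")

# Flag D, F, and W outcomes, then compute the rate per instructor and term.
grades["dfw"] = grades["final_grade"].isin(["D", "F", "W"])
dfw_rates = (
    grades.groupby(["instructor", "term"])["dfw"]
    .mean()          # share of students with a D, F, or W
    .mul(100)
    .round(1)
    .rename("dfw_rate_pct")
    .reset_index()
)

print(dfw_rates.sort_values("dfw_rate_pct", ascending=False))
```

A gap that follows one section across terms may point to a scheduling or support issue, while a gap shared by every section points back toward the curriculum itself.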
Dietz-Uhler and Hurn (2013) discuss the use of classroom-level learning analytics to create a more personalized learning experience for students, as well as institutional use of data analytics to attempt to predict student success. Conversely, Bloemer et al. (2017) contend that attempting to predict student success based on D/F/W results may not be as successful as digging deeper into institutional student records to assess actionable items and the stage of a student’s academic career.
The goal of this session is for participants to come away knowing how to leverage institutional data analytics and/or course-level learning analytics to decide where to direct student-success efforts such as outreach, resources, individualized attention, and teaching assignments, with the ultimate aim of improving student engagement, increasing persistence/retention rates, and raising student satisfaction. This presentation will share tips and suggestions that can be implemented by faculty, administrators, instructional support staff, and data technologists regardless of institution type (large, medium, or small; 4-year or 2-year) offering fully face-to-face, blended, or online classes.
Some institutions and departments have access to a comprehensive database that gives faculty and administrators access to data across multiple courses, multiple sessions, and multiple faculty, which can be parsed out as needed. Others, lacking sophisticated databases or working with limited data, must make do with data that is more piecemeal and collected course by course. Nevertheless, whether pulled from a sophisticated database, collected by hand and tracked in an Excel spreadsheet, or somewhere in between, the following data points can prove extremely useful in keeping a finger on the pulse of the classroom, both short term and long term (a small illustrative sketch follows the list):
- Assignment submission rates
- Attendance (real or virtual)
- Time spent in classroom
- D/F/W rates
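Even when these data points live only in a hand-maintained spreadsheet, a few lines of scripting can turn them into a mid-session outreach list. The sketch below is purely illustrative and assumes a hypothetical per-student export with made-up column names and thresholds:

```python
# Illustrative sketch: flag students for outreach from a simple spreadsheet export.
# Assumed (hypothetical) columns in course_pulse.csv: student, submitted_assignments,
# assigned_assignments, sessions_attended, sessions_held, minutes_in_classroom.
import pandas as pd

pulse = pd.read_csv("course_pulse.csv")

pulse["submission_rate"] = pulse["submitted_assignments"] / pulse["assigned_assignments"]
pulse["attendance_rate"] = pulse["sessions_attended"] / pulse["sessions_held"]

# Thresholds are placeholders; each instructor or department would set their own.
at_risk = pulse[
    (pulse["submission_rate"] < 0.7)
    | (pulse["attendance_rate"] < 0.6)
    | (pulse["minutes_in_classroom"] < 60)
]

print(at_risk[["student", "submission_rate", "attendance_rate", "minutes_in_classroom"]])
```

Running something like this week to week, rather than waiting for final grades, is what makes the outreach actionable while the class is still in progress.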
After studying one large university’s bottom-up and top-down process to inform decision-making, Dziuban et al. (2012) suggest knowing what data to collect and actually doing something with the data, while emphasizing that providing a safety net for students to succeed should be the end result of any attempt at analytics.
We will share our institution’s use of both high-tech and low-tech data to drive decision-making and student success. Institutional high-tech data analytics are used at various levels, from strategizing both faculty and advisor efforts at driving engagement across a student’s 180-day life cycle to directing specific efforts at reducing retaker attempts, which in turn inform curriculum updates. Course-level low-tech learning analytics are used by faculty to determine, week to week and session to session, what students need instructionally as a course as well as individually.

Additionally, an end-of-session ‘Course Health’ analysis triangulates both high-tech and low-tech data, quantitative and qualitative, to help us keep track of multiple measures, such as assignment submission rates and grades, grade distribution, failure and withdrawal rates, persistence, student and course survey results, and institutional statistics such as first 180-day retention. This supports long-term planning in terms of faculty teaching assignments, curriculum and sequencing revisions, and broader university initiatives such as collaborations between faculty and advisors.
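As a rough sketch of what a triangulated end-of-session roll-up could look like (the file name, columns, and measures below are placeholders, not our actual Course Health template):

```python
# Rough sketch of an end-of-session "Course Health"-style roll-up per section.
# Assumes a hypothetical section-level export (sections.csv) with placeholder columns:
# section, submission_rate, fail_rate, withdraw_rate, persistence_rate, survey_score.
import pandas as pd

sections = pd.read_csv("sections.csv")

health = sections.assign(
    fw_rate=sections["fail_rate"] + sections["withdraw_rate"]
)[["section", "submission_rate", "fw_rate", "persistence_rate", "survey_score"]]

# One table per session keeps the quantitative measures side by side with
# survey results, making session-over-session comparison straightforward.
print(health.sort_values("fw_rate", ascending=False).to_string(index=False))
```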
References
Bloemer, W., Day, S., & Swan, K. (2017). Gap analysis: An innovative look at gateway courses and student retention. Online Learning, 21(3), 5-14. https://eric.ed.gov/?id=EJ1154312
Dietz-Uhler, B., & Hurn, J. E. (2013). Using learning analytics to predict (and improve) student success: A faculty perspective. Journal of Interactive Online Learning, 12(1), 17-26. http://www.ncolr.org/issues/jiol/v12/n1/using-learning-analytics-to-predict-and-improve-student-success.html
Dziuban, C., Moskal, P., Cavanagh, T., & Watts, A. (2012). Analytics that inform the University: Using data you already have. Journal of Asynchronous Learning Networks, 16(3), 21-38. https://eric.ed.gov/?id=EJ982670