Creating an Activity-Level Measure of Student Engagement and Exploring the Potential of LMS Log Data as a Proxy Measure of Student Engagement

Audience Level: 
All
Higher Ed
Streamed: 
Streamed
Abstract: 

This presentation reports on the outcomes of dissertation research that validated a longitudinal, activity-level measure of student engagement for blended learning contexts and then used the instrument to explore the potential of LMS log data as a proxy measure of student engagement.

Extended Abstract: 

This presentation reports on the outcomes of dissertation research titled Measuring Student Engagement in Technology-Mediated Learning Environments (Henrie, 2016).   

First, we report on the development of a multidimensional, activity-level student engagement instrument. Existing measures do not directly focus on engagement “in the moment”; rather, they attend to a student’s overall experience in a class or school (Fredricks et al., 2011; Henrie, Halverson, & Graham, 2015). Studying activity-level student engagement directly addresses the link between engagement and performance in a learning activity. Using confirmatory factor analysis, we evaluated two short scales measuring students’ emotional and cognitive engagement. These scales were cross-validated across two student samples with good model fit. We found evidence that characteristics of the learner and the learning activity lead to unique pathways of engagement over time, which may affect the quality of achieved outcomes.
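To make the cross-validation step concrete, the sketch below fits the same two-factor measurement model (cognitive and emotional engagement) to two independent samples and compares standard fit indices. It is a minimal illustration only: the item names, file names, and the use of the Python semopy package are assumptions, not the analysis code used in the dissertation.

```python
import pandas as pd
from semopy import Model, calc_stats

# Hypothetical two-factor measurement model; item names are placeholders,
# not the actual survey items from the dissertation.
MODEL_SPEC = """
cognitive =~ cog1 + cog2 + cog3
emotional =~ emo1 + emo2 + emo3
"""

def fit_sample(path):
    """Fit the CFA to one sample and return common fit indices."""
    data = pd.read_csv(path)   # assumed: one column per survey item
    model = Model(MODEL_SPEC)
    model.fit(data)
    return calc_stats(model)[["CFI", "TLI", "RMSEA"]]

# Cross-validation: the same measurement model should fit both samples well.
print(fit_sample("sample_a.csv"))
print(fit_sample("sample_b.csv"))
```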

Second, we used the learner engagement instrument to examine the relationship between LMS log data and self-reported student engagement survey scores from 319 participants in seven courses at two universities. Learning management systems are becoming more common in higher education, making LMS log data an increasingly ubiquitous source of information on student engagement in learning. Computer systems are being developed that extract, analyze, and act upon these data, such as early-alert systems and intelligent tutoring systems (Bienkowski, Feng, & Means, 2012; Ferguson, 2012; Siemens, 2013). Should log data serve as a meaningful proxy for survey scores, they would offer a minimally disruptive and scalable way to quickly identify who needs help, evaluate course design, and personalize instruction. We report multiple approaches to structuring the log data, such as comparing one week’s worth of log data to one day’s worth, as well as aggregating log data at differing levels of granularity, including:

· General Level (total page views and time spent)

· LMS Interaction Type Level (page views and time spent on learning, procedural, and social pages)

· Target Assignment Level (page views and time spent on different types of LMS pages)

The log data variables defined by these alternative structuring approaches were correlated with the student engagement survey scores to study the relationship between the two sources of data. This analysis was done separately for three courses. Statistically significant but small correlations were found in one course (r = 0.23 to 0.33). Overall, log data were not found to be a strong proxy measure for students’ self-reported cognitive and emotional engagement. Our results underscore the complexity of learning and the relationship between observed and self-reported cognitive and emotional states. Future educational research using log data will need to account for the complex factors that help explain trends in student engagement.
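As a rough sketch of how log data might be aggregated and related to the survey scores, the example below computes page views and time spent at the general and interaction-type levels and correlates each log variable with the two self-report scales. Column names, file names, and the use of pandas and SciPy are illustrative assumptions rather than the study’s actual pipeline.

```python
import pandas as pd
from scipy.stats import pearsonr

# Assumed inputs: a raw LMS log (one row per page view) and survey scale scores.
logs = pd.read_csv("lms_log.csv")        # columns: student_id, page_type, seconds
surveys = pd.read_csv("engagement.csv")  # columns: student_id, cognitive, emotional

# General level: total page views and total time spent per student.
general = logs.groupby("student_id").agg(
    page_views=("page_type", "size"),
    time_spent=("seconds", "sum"),
).reset_index()

# Interaction-type level: views and time split by learning, procedural, and social pages.
by_type = logs.pivot_table(
    index="student_id", columns="page_type",
    values="seconds", aggfunc=["count", "sum"], fill_value=0,
)
by_type.columns = [f"{stat}_{ptype}" for stat, ptype in by_type.columns]
by_type = by_type.reset_index()

merged = surveys.merge(general, on="student_id").merge(by_type, on="student_id")

# Correlate each log-derived variable with each self-report engagement scale.
log_vars = [c for c in merged.columns if c not in ("student_id", "cognitive", "emotional")]
for log_var in log_vars:
    for scale in ("cognitive", "emotional"):
        r, p = pearsonr(merged[log_var], merged[scale])
        print(f"{log_var} vs. {scale}: r = {r:.2f}, p = {p:.3f}")
```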

References: 

Bienkowski, M., Feng, M., & Means, B. (2012). Enhancing teaching and learning through educational data mining and learning analytics: An issue brief. Washington, DC: SRI International. Retrieved from http://www.ed.gov/edblogs/technology/files/2012/03/edm-la-brief.pdf

Ferguson, R. (2012). Learning analytics: Drivers, developments and challenges. International Journal of Technology Enhanced Learning, 4(5/6), 304–317. doi:10.1504/IJTEL.2012.051816

Fredricks, J., McColskey, W., Meli, J., Mordica, J., Montrosse, B., & Mooney, K. (2011). Measuring student engagement in upper elementary through high school: A description of 21 instruments (Issues & Answers Report, REL 2011–No. 098). Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Regional Educational Laboratory Southeast. Retrieved April 27, 2015, from http://ies.ed.gov/ncee/edlabs/regions/southeast/pdf/REL_2011098.pdf

Henrie, C. R. (2016). Measuring student engagement in technology-mediated learning environments (Unpublished doctoral dissertation). Brigham Young University, Provo, UT.

Henrie, C. R., Halverson, L. R., & Graham, C. R. (2015). Measuring student engagement in technology-mediated learning: A review. Computers & Education, 90, 36–53. doi:10.1016/j.compedu.2015.09.005

Siemens, G. (2013). Learning analytics: The emergence of a discipline. American Behavioral Scientist, 57, 1380–1400. doi:10.1177/0002764213498851

Conference Session: 
Concurrent Session 2
Session Type: 
Education Session - Research Highlights