Program learning outcomes (PLOs) aligned to assignment rubrics in Canvas courses generate meaningful but unmanageable amounts of assessment data. We teamed up with our analytics colleagues, who helped reduce our analysis time from 2 weeks to 2 minutes. Come see how we set up the PLOs and wrangled the outcomes.
Our unit within the university currently works with 48 collaborative multi-campus online programs. We primarily use the Outcomes tool in Canvas to embed program learning outcome (PLO) rubrics directly into course assignments, which allows faculty to score the PLO rubrics in real time as they review student work.
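For those who prefer to pull the resulting scores programmatically rather than via the admin report, Canvas also exposes them through its REST API. The sketch below uses the documented Outcome Results endpoint; the instance URL, token, and course ID are placeholders you would replace with your own institution's values:

```python
"""Sketch: pull PLO scores for one course via the Canvas REST API.

CANVAS_URL, API_TOKEN, and COURSE_ID are hypothetical placeholders.
"""
import requests

CANVAS_URL = "https://canvas.example.edu"  # placeholder instance URL
API_TOKEN = "<api-token>"                  # placeholder access token
COURSE_ID = 12345                          # placeholder course id


def outcome_results(course_id):
    """Yield OutcomeResult records, following Canvas's paginated Link headers."""
    url = f"{CANVAS_URL}/api/v1/courses/{course_id}/outcome_results"
    params = {"per_page": 100}
    headers = {"Authorization": f"Bearer {API_TOKEN}"}
    while url:
        resp = requests.get(url, params=params, headers=headers, timeout=30)
        resp.raise_for_status()
        yield from resp.json()["outcome_results"]
        url = resp.links.get("next", {}).get("url")  # None when no pages remain
        params = None  # the "next" URL already carries the query string


for result in outcome_results(COURSE_ID):
    # Each result links a student and an outcome to a rubric score.
    print(result["links"]["user"], result["links"]["learning_outcome"], result["score"])
```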
Background on the Problem:
After final grades have been submitted each semester, the Canvas Outcomes Report is generated for the respective term. The data from this massive CSV file must then be sorted and filtered for further analysis. Initially, this work was done manually for each program. To increase efficiency, Excel formulas and calculations were implemented. However, two major limitations to this process remained:
- The inability to identify only those students enrolled in the collaborative online programs.
- The time required to complete the analysis.
The vast majority of courses in the collaborative multi-campus programs have mixed enrollment, meaning that students from a variety of programs (both online and in-person) can register for them. Currently, there is no mechanism within Canvas to easily identify a student's major or degree program; instead, it is necessary to cross-reference the Canvas data with other systems at the university.
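As a minimal illustration of that cross-reference, the pandas sketch below joins the term's Outcomes Report against a student-program extract from the student information system. The file names, column headers, and program codes are hypothetical stand-ins for institution-specific data:

```python
"""Sketch: filter the term's Outcomes Report to online-program students.

File names, column headers, and program codes are hypothetical.
"""
import pandas as pd

# Canvas Outcomes Report for the term (one row per student/outcome score)
outcomes = pd.read_csv("outcomes_report_fall.csv")

# SIS extract mapping each student to a program of study (hypothetical file)
programs = pd.read_csv("sis_student_programs.csv")  # columns: student_sis_id, program_code

ONLINE_PROGRAMS = {"NURS-ONL", "MBA-ONL"}  # hypothetical collaborative program codes

# Cross-reference: keep only rows for students enrolled in the online programs
merged = outcomes.merge(
    programs, left_on="student sis id", right_on="student_sis_id", how="inner"
)
online_only = merged[merged["program_code"].isin(ONLINE_PROGRAMS)]

print(f"{len(online_only)} of {len(outcomes)} outcome rows belong to online-program students")
```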
Sorting through the data will only become more unwieldy as additional programs transition to using Canvas Outcomes. Even using formulas and calculations, it took two full working weeks to complete the analysis and create the necessary tables from learning outcomes assessed in 42 courses. For the 2022-23 academic year, we anticipate data from over 70 courses. Even with the formulas we developed, manual calculation is subject to human error and requires checking and double-checking the final result. As such, our process was not sustainable.
Solution:
We have partnered with our analytics colleagues to automate the creation of the tables needed for annual assessment reports and to address the limitations listed above. Using a tool called Alteryx, we combine the Canvas Outcomes data with several other university data sources. With the click of a button, what took 2 weeks to do manually now takes just a few minutes. The workflow also produces audit files that help us evaluate the data further, along with the Excel and PDF outputs we need. As an added bonus, it generates an extract file that feeds a Tableau dashboard for visualizing the data.
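Alteryx workflows are assembled visually rather than in code, so the sketch below is only a rough Python analogue of the reporting step: it summarizes the filtered scores per program and outcome, writes the report tables to Excel, and saves a row-level extract for Tableau. Column names and the mastery threshold are hypothetical, not our production configuration:

```python
"""Sketch: an automated analogue of the reporting step, in pandas.

Illustrative only; our production workflow is built in Alteryx.
File names, column names, and the cut score are hypothetical.
"""
import pandas as pd

online_only = pd.read_csv("online_only_outcomes.csv")  # output of the filtering step above

MASTERY = 3.0  # hypothetical cut score on a 4-point rubric scale

# One summary row per program and outcome: n scored, mean, % at/above mastery
summary = (
    online_only
    .groupby(["program_code", "outcome name"])["outcome score"]
    .agg(n="count", mean_score="mean",
         pct_mastery=lambda s: (s >= MASTERY).mean() * 100)
    .round(2)
    .reset_index()
)

summary.to_excel("plo_summary_tables.xlsx", index=False)  # tables for the annual report (needs openpyxl)
online_only.to_csv("tableau_extract.csv", index=False)    # row-level extract for Tableau
```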
Next Steps/Further Development:
Further development of the workflow will include longitudinal analysis and reporting. Additionally, dashboards will be created to provide alternate ways to visualize the assessment data.
Level of Participation:
Participants will be asked to share their experiences working with Canvas Outcomes data, the processes they have developed, the partnerships they have fostered, and provide feedback to improve our process. Additionally, participants will have the opportunity to share insights on longitudinal analysis of program learning outcomes and ways to visualize the data.
Session Goals:
- Demonstrate how to align program learning outcomes (PLOs) with specific course assignment rubrics within Canvas LMS.
- Identify challenges and opportunities of analyzing program learning outcomes data extracted from Canvas Outcomes.
- Discuss the development of workflows and automated processes.
- Discuss options for data visualization and longitudinal analysis of program learning outcomes.