Sparty Matters: A Remix of Quality Matters for Internal Peer-to-Peer Course Reviews

Audience Level: 
All
Institutional Level: 
Higher Ed
Streamed: 
Streamed
Abstract: 

This session will highlight a course review process initiated by our Provost in the Spring of 2020, when the novel coronavirus necessitated a rapid shift to remote learning at MSU. We will share background on the process along with what worked, what did not, and planned revisions for the future.

Extended Abstract: 

Background 

This session will highlight a course review process initiated by the Michigan State University Provost in the Spring of 2020, when the novel coronavirus necessitated a rapid shift to remote learning at MSU. We will share background on the process along with what worked, what did not, and planned revisions for the future. Most faculty thrust into some form of online teaching did their best to adapt their courses rapidly to Zoom, D2L, and other online learning technologies, and many were concerned that finishing the Spring semester would be mostly a matter of survival. The institution knew that summer study for undergraduates would need to be delivered online, so a robust faculty development program was assembled to help faculty prepare to teach online. This resulted in synchronous and asynchronous professionally led workshops covering online teaching and learning best practices. We decided to encourage faculty developing online courses for Summer and Fall 2020 to follow up with a structured peer review using the Sparty Matters rubric, which some academic units at MSU had voluntarily developed and used in preceding years. The Provost’s office offered faculty an incentive to complete the full program, and the Associate Provosts worked with Assistant Deans in each college to prioritize and enroll instructors in training cohorts and review cohorts. 

Sparty Matters is based on the widely researched rubric maintained by the Quality Matters organization, of which MSU is a subscribing member, as well as earlier efforts to simplify quality measurement at MSU in collaboration with MSU IT and the MSU College of Arts and Letters. The initiative drew on previously developed MSU rubrics and formative review processes in the colleges, along with the literature on online course quality. Nick Noel and his collaborators synthesized these materials into the Summer 2020 rubric. 

The course review process assigned instructors, along with two facilitators, to a review group and session; whenever possible, we organized instructors by college. Each review session lasted two weeks. During the first week, instructors reviewed two courses, with the facilitators providing guidance and answering questions. During the second week, instructors went over the reviews of their own courses and formulated a revision plan based on the feedback they received. Revision plans were due two weeks after the end of the session. At every phase we emphasized that the reviews were formative feedback, not mandates, and that they should be approached with kindness and understanding. In that vein, we set a low threshold: completing one course review counted as completing the process. In all, 327 instructors went through the process. 

What worked

The general approach of asking instructors to review each other’s courses seems to have been, on the whole, a positive experience. Several instructors said they found it useful to get feedback on their course and to see how other instructors had organized theirs. Many also appreciated the QM rubric sheet as a way to guide reviewers on what to look for. Overall, there are indications that instructors would like to engage with their peers on the practice of teaching and on how best to organize and deliver their courses.  

Facilitators reported that many aspects of the course review process worked well, particularly collaboration and the alignment of learning objectives with learning activities, instructional materials, and evaluation. The rubric and corresponding feedback also heightened awareness of accessibility guidelines and helped educators understand some essential steps they could take to better meet them. Most importantly, it helped them approach their course design from a learner's perspective. One facilitator said, “Our faculty members are actively embracing the challenges of teaching online. They are excited to learn from each other and are enthusiastic about making changes to improve their courses. Building a course is, in effect, a lonely process, and I think it was wonderful that they now have a new shared experience joined with others, across disciplines, who are also working diligently to improve the learning experience. As a facilitator, it was a privilege to watch their excitement.” Working across disciplines was a commonly cited strength, though some suggested that working within the same discipline would have been more beneficial. Many simply noted how advantageous it was for faculty to be able to see how other courses are set up in D2L.   

It was important to recognize and reiterate that faculty in the working groups were at different levels of experience and comfort with online education. Even facilitators had varying degrees of experience. For some, Sparty Matters and the Quality Matters rubric on which it was built were entirely new. One facilitator mentioned that they would continue to use the rubric going forward. Similarly, going through the QM training and review process gave facilitators information they could apply to revising their own courses.  

These peer reviews were timely and helpful because, amid COVID-19, many instructors were moving their courses online for the first time. The process gave instructors a sense of community, new perspectives on designing for active learning, opportunities to share resources, and space to reflect on their own courses and ask questions of their peers. Some teams reported having important discussions during the meetings on topics that are not always addressed in department meetings and trainings. It was a safe environment for asking questions about why and how one goes about teaching, and there are not many places for those kinds of conversations.   

What didn't work

Several areas were identified that could be improved in future iterations. Some instructors did not find the reviews meaningful because Quality Matters, and thus Sparty Matters, focuses only on the design elements of a course rather than the teaching and learning elements. One participant noted that the framework “has nothing to do with the teaching or the learning process.” This was the strongest criticism, but one that may stem from a misunderstanding of the intent of the reviews, which deliberately focus on design only.  

Another participant simply did not find the quality review work helpful, noting, “One thing that I learned during this process is that I don't actually need or want the kind of help that was being offered. It just wasn't useful for me. However, there are other folks that I am certain found it very useful...this should not be seen as a poor reflection on the facilitators. I don't know what circumstances they were working under, and I feel it is important to be generous and grateful for the work they put into making this experience as valuable as possible while meeting university expectations.” 

Recommended revisions for future course reviews 

Facilitators recommended that new facilitators be paired with someone who has done the work before, so they can observe a previous session and get a sense of the duties involved. They also asked for more clarity at the beginning of the process about what the facilitator role entails, and suggested providing more documents, email templates, and timelines. Facilitators who led multiple sessions suggested a more standardized way of conducting the reviews with faculty so the process would not change substantially from session to session.  

Facilitators also discussed specifics to communicate at the beginning of the process, such as the timeline for compensation and the requirements to qualify for it. Instructors should be reminded how important the deadlines are, because missing them holds up the process for many others in the group. There was a need for a tutorial showing instructors how to add reviewers to their courses in D2L. One facilitator suggested that it would be insightful to see faculty comments on the reviews they received and to follow up with a survey asking whether student performance improved, whether complaints went down, whether instructors found the feedback useful, and whether they were able to implement the suggestions. Lastly, it was recommended that more faculty be encouraged to do these peer reviews, which were extremely beneficial, and that the reviews be repeated every three years to ensure curriculum quality and alignment.

Conference Session: 
Concurrent Session 1
Conference Track: 
Process, Problems, and Practices
Session Type: 
Education Session
Intended Audience: 
All Attendees