Visualizing Course Evaluation Data: Survey Research to Assess Student Learning Rather than Faculty Performance

Audience Level: 
All
Session Time Slot(s): 
Institutional Level: 
Higher Ed
Streamed: 
Streamed
Special Session: 
Research
Abstract: 

Visual analytics can showcase trends in student course evaluations. However, building a dynamic dashboard is effective only if the survey is designed with the right questions. We will discuss visualizing course evaluation data to shift thinking from assessing instructors to assessing teaching and learning in online classes overall.

Extended Abstract: 

Given the sudden shift to remote learning during the COVID-19 pandemic, feedback from students is more essential now than ever before. Feedback from a college’s end-of-course evaluation survey is critical for the online teaching community. These data help faculty and academic leadership understand areas of strength and areas for improvement at the college and course levels. Using visualization techniques to interpret survey data allows for targeted improvements to curricula that support student learning and performance. There are several methods for visualizing survey data, but how do you research course evaluation data without isolating individual instructors or classes? More importantly, how do you shift the institutional culture away from assessing instructors and toward assessing student learning?

First, research on course evaluation data that is intentionally focused on student learning and the student experience yields deeper insights into student preparation, course behavior, course design, and instructor presence overall. Visual analytics presents survey data in a friendly, easy-to-understand manner, rather than as the typical table of mean results by course section. Tools such as Tableau can be used to create doughnut charts, divergent stacked bar charts, and comparative views in the aggregate. These views showcase trends for a group of courses, as opposed to a typical printout showing data for just one class section. In addition, redesigning the survey instrument is essential when pursuing targeted research questions on course evaluation, for example, by using items such as:

On a scale of 1 (strongly disagree) to 5 (strongly agree):

  • I can apply the course concepts and principles learned in this course to real-world situations beyond the classroom.
  • The course and my instructor covered what was stated in the course outcomes and the syllabus.
  • Assignments and assessment rubrics helped me to learn.

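The aggregate Likert views described above (such as a divergent stacked bar chart) can also be built outside of Tableau. Below is a minimal sketch in Python with matplotlib; the item labels and response counts are invented for illustration, and each bar is shifted so the midpoint of the neutral responses sits at zero, letting agreement extend right and disagreement extend left:

```python
# Sketch of a divergent stacked bar chart for Likert-scale course
# evaluation data. All counts below are hypothetical examples.
import matplotlib
matplotlib.use("Agg")  # render off-screen (no display needed)
import matplotlib.pyplot as plt
import numpy as np

questions = [
    "Concepts apply beyond the classroom",
    "Course covered stated outcomes",
    "Rubrics helped me learn",
]
# Response counts per item: [Strongly disagree, Disagree, Neutral, Agree, Strongly agree]
counts = np.array([
    [5, 10, 20, 40, 25],
    [3,  7, 15, 45, 30],
    [8, 12, 25, 35, 20],
], dtype=float)

# Convert counts to percentages per item so sections are comparable.
pct = counts / counts.sum(axis=1, keepdims=True) * 100
labels = ["Strongly disagree", "Disagree", "Neutral", "Agree", "Strongly agree"]
colors = ["#ca0020", "#f4a582", "#cccccc", "#92c5de", "#0571b0"]

# Shift each row left so the center of the Neutral segment is at x = 0.
offsets = -(pct[:, 0] + pct[:, 1] + pct[:, 2] / 2)

fig, ax = plt.subplots(figsize=(8, 3))
left = offsets.copy()
for i, (label, color) in enumerate(zip(labels, colors)):
    ax.barh(questions, pct[:, i], left=left, color=color, label=label)
    left += pct[:, i]  # stack the next response category to the right

ax.axvline(0, color="black", linewidth=0.8)  # neutral midline
ax.set_xlabel("Percent of responses")
ax.legend(ncol=5, fontsize=7, loc="upper center", bbox_to_anchor=(0.5, -0.25))
fig.tight_layout()
fig.savefig("likert_divergent.png")
```

Because the chart aggregates across a group of courses rather than printing one table per section, no individual instructor is singled out, which supports the cultural shift the session describes.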
Survey research should be conducted alongside more holistic storytelling with data. Finally, in the age of information overload, data cannot solve everything, and survey dashboarding techniques must be created in partnership with the faculty teaching the courses.

Level of Engagement: 

The session will briefly recount this college’s attempt to visualize survey data, including its failures and lessons learned. Through guiding questions and group discussion, participants will explore ways to shift thinking about course evaluation surveys from assessing individuals to assessing teaching and learning overall. Sample questions to elicit discussion include:

  • What questions do you use on your course evaluation instrument?
  • Would you say your course evaluation questions are instructor-centric or student-centric?
  • Do you require or incentivize students to complete the course evaluation?
  • How do you use the data? Do you use survey results to evaluate faculty performance? Should you?

This session offers the opportunity for lively debate about the value of course evaluation data and for sharing experiences across different institutions.

Session Goals:

Individuals attending this session will:

  • Identify visual analytics techniques for survey research
  • Discuss methods to shift approaches of using survey data to focus on student learning and performance, rather than faculty evaluation
  • Write actionable research-based survey questions for course evaluation data
  • Discuss strategies for using survey data for continuous program and course improvement

Conference Session: 
Concurrent Session 3
Conference Track: 
Research, Evaluation, and Learning Analytics
Session Type: 
Education Session
Intended Audience: 
Administrators
Faculty
Technologists
Researchers