Instructional designers at UCF have conducted nearly 700 online course reviews. The results serve to improve individual courses, but they also contain clues about how faculty development can be refined to encourage effective course design from the beginning. Join this discovery session to identify the design elements most commonly missed and learn how faculty development is being redesigned as a result.
Prompted by the State University System’s 2025 Strategic Plan for Online Education, the Instructional Design team at [Institution] began offering course reviews for eligible faculty teaching in online modalities. These reviews examine course components recognized as best practices in online course design across categories such as orientation, content, assessment, interaction, and accessibility. Two levels of review are offered: the Quality review focuses on basic design elements, such as a clear starting point for students, while the High Quality review explores more sophisticated components, such as self-assessment and depth of learning. Faculty who have participated cite improved navigation, heightened accessibility, and stronger assignments among the benefits. As of this writing, nearly 200 faculty members have earned over 450 online course designations at [Institution].
The data within these 700+ online course reviews are invaluable and reveal interesting trends. In this session, we will highlight the design elements most often present in reviewed courses alongside those most commonly missed. For instance, while most required elements were present in the syllabus and students had opportunities to interact with one another, instructor expectations (when and how instructors would give feedback and respond to messages) and grading criteria were commonly in need of improvement.
These results certainly serve to improve individual online courses, but their real value lies in informing faculty development at large. For instance, a template was created that prompts instructors to include information about their response and feedback times. Since the template was implemented, every faculty member completing the development program has fulfilled that item in their review.
This topic is timely: many institutions have experienced tremendous growth in online courses over the last year and are struggling with quality assurance. Sharing the results of over 700 reviews can illuminate gaps for those looking to start their own review process or improve an existing one.
To make the session interactive, we will take advantage of PlayPosit features, especially poll and multiple-choice questions, as well as open-ended discussion. Participants will be encouraged to reflect on how online courses are assessed for quality at their institutions and, if they conduct reviews, on how the results are applied beyond individual course improvement. Participants will walk away knowing which design elements are most commonly missed and how to address them in a faculty development context.