The Online Learning Potential Rubric engages faculty in formative assessment of their courses, using an approach grounded in research-based learning principles that deepen student learning. Participants will assess a course using three rubric dimensions (prior knowledge, practice & feedback, and self-directed learning) and discuss the rubric as a faculty development tool.
Background:
In recent years we have seen unprecedented growth in online instruction. As Miller (2014) noted in her book Minds Online, we have gone from a few online courses to near ubiquity in one decade.
During a similar period, the interdisciplinary field of the learning sciences has coalesced, and new understandings about cognitive architecture, memory, and cognitive load have begun to find practical application in the design of learning experiences. Texts such as How People Learn: Brain, Mind, Experience, and School (Bransford, Brown, & Cocking, 2000), How Learning Works (Ambrose, Bridges, DiPietro, Lovett, & Norman, 2010), and E-Learning and the Science of Instruction (Clark & Mayer, 2011) have synthesized learning research and made it accessible to educators.
With few exceptions, however, growth in online learning and advances in the learning sciences have had little overlap. Carnegie Mellon’s Open Learning Initiative (oli.cmu.edu) was one of the first to apply learning science research to online course design. Miller’s (2014) book is perhaps the first synthesis of learning science research applied specifically to online learning. Most recently, the final report of MIT’s Online Education Policy Initiative (2016) outlines the potential of online learning as a lever for integrating lessons from learning sciences research into higher education curricula.
The Center for Advancing Teaching and Learning Through Research (CATLR) at Northeastern University grounds all of its work in the learning sciences. As such, we seek to integrate a learning sciences perspective into our faculty development efforts related to online learning. In addition to individual workshops, CATLR’s offerings in this domain include extended workshop series, ongoing Inquiry Groups, individualized consultations, and a year-long Online Course Design Fellows program.
Innovation:
CATLR’s Online Learning Potential Rubric is one component of that programming, informing individual consultation, faculty self-assessment, and research that is intended to deepen the generative application of learning sciences in online course design and facilitation.
The Online Learning Potential Rubric is a formative assessment tool for online course design, intended to be used by faculty individually, with peers, or with a consultant. The rubric draws on the principles defined in How Learning Works (Ambrose et al., 2010) and E-Learning and the Science of Instruction (Clark & Mayer, 2011). Faculty assess their use of 64 individual strategies mapped across 8 principles, and the resulting scores produce a radar graph visualization of areas of relative strength and areas in which a course design could be enhanced.
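To make the scoring and visualization concrete, the minimal sketch below (in Python with matplotlib) shows how per-principle scores could be rendered as a radar graph. It is illustrative only: the principle labels beyond the three named in this proposal, the 0-to-1 scoring scale, and the sample scores are hypothetical placeholders, not the actual rubric content.

```python
# Illustrative sketch only: renders hypothetical per-principle rubric scores
# as a radar (radial) graph. Labels and scores are placeholders, not the
# actual Online Learning Potential Rubric items or data.
import numpy as np
import matplotlib.pyplot as plt

# Three dimensions named in this proposal plus five placeholder principles.
principles = [
    "Prior knowledge", "Practice & feedback", "Self-directed learning",
    "Principle 4", "Principle 5", "Principle 6", "Principle 7", "Principle 8",
]
# Hypothetical scores, e.g., the fraction of strategies under each principle
# that a faculty member marks as present in the course (0.0 to 1.0).
scores = [0.75, 0.875, 0.25, 0.5, 0.625, 0.375, 0.5, 0.625]

# Spread the principles evenly around the circle and close the polygon.
angles = np.linspace(0, 2 * np.pi, len(principles), endpoint=False).tolist()
angles.append(angles[0])
values = scores + scores[:1]

fig, ax = plt.subplots(subplot_kw={"polar": True})
ax.plot(angles, values, linewidth=2)
ax.fill(angles, values, alpha=0.25)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(principles, fontsize=8)
ax.set_ylim(0, 1)
ax.set_title("Sample self-assessment (hypothetical data)")
plt.tight_layout()
plt.show()
```

A chart like this makes uneven profiles easy to see at a glance, which is the point of the visualization: it directs conversation toward principles with room for growth rather than toward a single overall score.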
By focusing on principles of learning, the rubric is designed to help faculty understand why common “best practices” are important. It is not intended as a scoring instrument used by external evaluators; rather, it is meant to begin conversations about iterative course design. At the same time, the rubric serves as an educational tool, helping faculty understand how evidence-based learning principles can be practically applied in online courses. It is generative rather than prescriptive, guiding faculty attention to the learning principles at play in coursework rather than scoring against specified course elements.
Express Workshop Format:
We believe that the best way to understand the value of the Online Learning Potential Rubric is to apply it, which is why we propose an Express Workshop. Typically, we bring faculty through the complete rubric in a 90-minute format. For a 45-minute format, we have selected three dimensions of the rubric that we feel will best demonstrate the unique value of this approach to faculty development. We propose to introduce the purpose of the rubric and then bring participants through an abbreviated version of our workshop, focusing on principles that address prior knowledge, practice & feedback, and self-directed learning. Participants should be prepared to consider a particular course along these three rubric dimensions. We anticipate that participants will:
- Develop familiarity with research-based principles of learning.
- Apply these principles to the design and facilitation of online learning.
- Evaluate a specific course across three research-based learning principles.
- Identify how the Online Learning Potential Rubric can support faculty development in relation to online learning.
Replicability:
Because the Online Learning Potential Rubric is based on research-based learning principles, it can be used across a wide range of contexts. Its applicability is not tied to a particular type of school, discipline, or format of online or blended learning. We have used the rubric with faculty teaching both undergraduate and graduate courses; in professional programs; in both a highly supportive course design program and a one-off workshop; and in disciplines as diverse as English, Business, Linguistics, Nursing, History, and Engineering. We are seeking to pilot the rubric beyond our own institution in order to advance evidence-based approaches to faculty development, increase use of the learning sciences as a lens for designing and improving online courses, and contribute to the body of knowledge about how people learn in online settings.
Impact:
Through both informal feedback and a survey completed after working with the rubric, faculty have indicated that using the Online Learning Potential Rubric is a useful and thought-provoking experience. The evidence section of this proposal provides details on this work. Even experienced faculty have been surprised by the low scores they gave themselves in relation to some learning principles, though this is to be expected because the learning sciences offer a new lens for viewing their work. Nonetheless, we have found that faculty are motivated and energized when they deepen their understanding of how learning works (independent of format), why specific strategies increase learning, and when it could be helpful to activate and apply a specific principle in their teaching.
Evidence:
We have primarily used the Online Learning Potential Rubric in a workshop setting with groups of faculty. In the workshop, we begin by emphasizing the formative nature of the tool. We show radar graphs generated by other online educators, including graphs from our own courses, to emphasize the variety of results and to demonstrate that a perfect circle is unlikely to be achieved. The facilitator then explains each principle represented on the rubric and briefly discusses implications for course design. Participants complete the corresponding section of the rubric, thinking of a particular course, and map the resulting score onto the radar diagram. We proceed in this manner through the eight dimensions of the rubric. When scoring is complete, we lead a discussion on what insights the results give faculty about the design of the course they are evaluating. In what areas are they strong? Where could they make improvements? What might those improvements look like? Did they add any indicators to the rubric?
In the fall of 2015, we conducted a focus-group pilot of the rubric. Participants included faculty from Business, History, Philosophy, and Engineering. Their online teaching experience ranged from 0.5 to 10 years. All were able to identify both areas of strength and opportunities for improvement in their course designs. The more experienced faculty, in particular, were surprised by areas that showed room for improvement, but were eager to get support in developing effective, evidence-based strategies. In a group discussion following the experience, they were able to generate ideas for new strategies for implementing the principles.
Scope:
In the early stages of the development of the Online Learning Potential Rubric, we created a paper-based instrument and worked with faculty in face-to-face environments. We are now developing an online version of the rubric to expand the ways in which we are able to use it with faculty locally, embed links to the learning sciences research that informs the rubric, and increase our ability to share it with the broader community of faculty developers working in online learning. Participants in this workshop will have the opportunity to join a pilot of the online version of the rubric that is in development.
References:
Ambrose, S. A., Bridges, M. W., DiPietro, M., Lovett, M. C., & Norman, M. K. (2010). How learning works: Seven research-based principles for smart teaching (1st ed.). San Francisco: Jossey-Bass.
Bransford, J. D., Brown, A. L., & Cocking, R. R. (Eds.). (2000). Learning and transfer. In How people learn: Brain, mind, experience, and school (Expanded ed., pp. 51–78). Washington, DC: National Academy Press.
Clark, R. C., & Mayer, R. E. (2011). E-learning and the science of instruction: Proven guidelines for consumers and designers of multimedia learning (3rd ed.). San Francisco: Pfeiffer.
Miller, M. D. (2014). Minds online: Teaching effectively with technology. Cambridge, MA: Harvard University Press.