Educational organizations need evidence about educational interventions that is practical, actionable, and relevant. This workshop delivers the knowledge and skills necessary to conceptualize and operationalize rapid cycle evaluations, so that participants can collect and use evidence through the practical application of research.
Workshop Overview
Educational organizations (e.g., schools, districts, state education agencies, networks) need evidence about their educational technologies that is rigorous yet practical. Rapid cycle evaluation (RCE) is a scientifically sound method for generating timely, actionable evidence on educational interventions. This workshop delivers the knowledge and skills necessary to conceptualize and operationalize RCEs, so that participants can collect and use evidence through the practical application of research. The workshop is designed for anyone who stands to benefit from making evidence-based decisions in educational settings, including educators, administrators, data and accountability staff, and technical/technology support staff.
Workshop participants will acquire a fundamental set of skills required for evidence-based practice, such as generating research questions, understanding how to collect data, recognizing and communicating research designs, interpreting the results of analyses, and communicating findings to key stakeholders. Participants will have the opportunity to practice these skills with real data from authentic educational settings. Participants will leave this workshop with the knowledge and skills necessary to build their organizations' capacity to use evidence to make data-driven decisions about teaching and learning with educational technology.
Introduction (~10 minutes)
The workshop will start with introductions: the presenters will introduce themselves and, more importantly, the participants will introduce themselves. This will allow everyone to get acquainted with one another and with the diversity of perspectives represented. Ideally, the workshop will include multiple stakeholder groups (e.g., educators, researchers, entrepreneurs, vendors, funders) who are geographically dispersed and who operate in diverse educational contexts. After introductions, which will include information on interests, area(s) of expertise, and expectations for the workshop, participants will examine the following learning objectives:
- Understand the fundamental principles of RCE. Understand the types of evidence provided by RCE and determine how evidence informs important data-driven educational decisions.
- Understand how to collect and validate data for the purpose of RCE.
- Understand how to plan, design, and conduct analysis for RCE. Interpret the results of an RCE in order to make inferences about the findings and communicate those findings to a variety of stakeholders.
- Conduct RCE in the "real world" with data, and use the results to inform decisions.
Workshop Topics
RCE: Who, What, When, Where, Why? (~10 minutes)
Presenters will discuss the theoretical underpinnings and fundamental principles behind RCE. Participants will discuss their views on evaluation and evidence, including past experiences. The workshop will explore ways in which rapid cycle evaluation enables educational organizations to circumvent the barriers of traditional research, bridge the gap between research and practice, and shift the paradigm of evidence gathering in educational contexts.
Depending on the size of the workshop, participants will either work together in one group or form small groups to engage in collaborative discussion. In the group(s), participants will have the opportunity to share their views on evaluation, explain barriers they have faced when attempting to collect or use evidence on educational interventions, and present ways they could benefit from having actionable evidence to inform their decisions. A key point of discussion will be how actionable evidence can advance the quality of teaching and learning with digital technology.
Datum, Datum, Datum (~15 minutes)
This portion of the workshop will explore the process of data collection, processing, and cleaning. Presenters will lead an interactive demonstration of data cleaning and preparation that allows participants to actively explore why data quality matters. Compelling examples of good data quality will be celebrated, and humorous examples of poor data quality will be offered as well.
Participants will be provided with an artificial dataset generated through data simulation techniques to mimic the type of data one would see in an authentic education setting. The dataset will include variables on usage (e.g., the extent to which students both used and mastered a given digital technology), achievement (e.g., a measure of achievement that can be linked to usage), and covariates (e.g., demographics and other variables that allow the analyst to home in on different populations of students). The dataset will be “messy”; that is, participants will receive the data prior to any cleaning, processing, or engineering. Participants will work collaboratively to clean the data and structure it so that it can be analyzed.
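To make this concrete, the sketch below simulates a small dataset of the kind described above and applies a few standard cleanup steps in Python with pandas. The column names, distributions, and cleaning rules are illustrative assumptions, not the workshop's actual dataset.

```python
import numpy as np
import pandas as pd

# A minimal sketch of "messy" simulated data of the kind described above.
# Column names, distributions, and cleaning rules are hypothetical.
rng = np.random.default_rng(7)
n = 200
df = pd.DataFrame({
    "student_id": np.arange(n),
    "minutes_used": rng.normal(120, 40, n).round(1),       # usage measure
    "achievement": rng.normal(0.5, 0.15, n).round(3),      # linked outcome
    "grade": rng.choice(["3", "4", "5", " 5 ", None], n),  # untidy covariate
})
df.loc[rng.choice(n, 10, replace=False), "minutes_used"] = -999  # sentinel codes

# Typical cleaning steps: normalize category labels, treat sentinel values
# as missing, and keep only rows that can support analysis.
df["grade"] = df["grade"].str.strip()
df["minutes_used"] = df["minutes_used"].where(df["minutes_used"] >= 0)
clean = df.dropna(subset=["minutes_used", "achievement", "grade"])
print(clean.describe(include="all"))
```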
Lastly, presenters will briefly discuss the importance of data privacy and data security. Further, presenters will examine a few frameworks for data interoperability. This portion of the session will give participants the opportunity to discuss their experiences with data sharing and data use, and will highlight standards and best practices.
RCE: Results Are In—Now What? (~20 minutes)
Collecting data and conducting analyses are only part of the evidence-based process. Individuals must also be capable of interpreting results, making sound inferences, and communicating the findings to key stakeholders in a way that provides them with actionable information. In this portion of the workshop, participants will work together to design and conduct an RCE, and to interpret and communicate its results.
First, participants will work together in pairs or small groups to design and conduct an RCE. Participants will be encouraged to design the RCE based on research questions that are relevant to their educational setting. The output from this portion will be a set of reports and dashboards that display the results of their analysis.
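For context, the analytic core of such an RCE is often a regression that relates usage to achievement while adjusting for covariates. The sketch below, which assumes the `clean` dataset from the earlier cleaning example, estimates such a model with statsmodels; the variable names and the simple OLS specification are illustrative assumptions, not a prescribed design.

```python
import statsmodels.formula.api as smf

# A hypothetical analytic core for an RCE: estimate the association between
# usage and achievement, adjusting for grade level. `clean` is the dataset
# produced in the earlier cleaning sketch.
model = smf.ols("achievement ~ minutes_used + C(grade)", data=clean).fit()
print(model.summary())

# A simple "report": the estimated change in achievement per additional
# minute of usage, with a 95% confidence interval.
est = model.params["minutes_used"]
lo, hi = model.conf_int().loc["minutes_used"]
print(f"Estimated effect per minute of usage: {est:.4f} "
      f"(95% CI {lo:.4f} to {hi:.4f})")
```

A full RCE would typically pair a model like this with a comparison-group design (e.g., matching or a staggered rollout); the regression here simply illustrates the kind of estimate a report or dashboard would summarize.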
Next, as a group, presenters and participants will examine the reports in an engaging and interactive manner. Participants will explore different interpretations of the results and learn how to make sound inferences based on the types of evidence available. Participants will have the opportunity to discuss how these types of reports can inform the decisions they make on a daily basis.
Finally, participants will either pair up or form small groups to practice communicating the results of an RCE. In a role-play scenario, some participants will take on the role of different stakeholders while other participants explain the results in compelling and useful ways. Communication strategies will be discussed and participants will learn how to tailor their style to meet the needs of the audience.
Real-World RCE: Ready… Set… Go! (~35 minutes)
Thus far, participants have learned about the theory behind RCE, practiced processing a dataset, and stepped through the process of conducting an RCE and communicating the results, all with the presenters and in pairs or small groups. This portion of the workshop provides participants with the opportunity to apply what they have learned. Ultimately, this hands-on experience should prepare participants to transfer (i.e., maintain and apply) their learning beyond the context of the workshop so that they feel ready and able to use RCE to generate evidence that informs important decisions.
At the end of this portion of the workshop, participants will receive a set of resources to support conducting RCEs in their own educational settings. These resources include a framework for rapid cycle evaluation, a rubric for grading educational technology, a glossary of research terminology, and FAQs on conducting RCEs.
Conclusion
Traditionally, the randomized controlled trial (RCT) has been considered the gold standard for research designs. Although RCTs are appropriate for addressing some research questions, they fail to inform many of the practical questions and decisions that education organizations face throughout any given day or year. Thus, RCEs have been posited as a means for gathering a continuum of evidence in a way that is both practical and rigorous.
Schools, districts, states, and networks have worked with the presenters to conduct over 100 RCEs on dozens of educational technology products. Education organizations continue to leverage the presenters' Framework for Rapid EdTech Evaluation, rubric and grading protocol for collecting educator feedback, edtech management platform, and IMPACT functionality for analyzing data on cost, usage, and impact. Organizations across the US have used the results of these RCEs to inform decisions about instruction, implementation, and budget.