Data Analysis for Instructional Designers – A Step by Step Framework

Abstract: 

The growing importance of data analysis has steadily reshaped job functions in many fields, including instructional design. However, combining data with design is not part of most instructional design training programs. This presentation helps bridge that skills gap with a simple framework for bringing data into the curriculum development process.

Extended Abstract: 

By the end of this presentation, participants will be able to

  • Identify the evolving role of the instructional designer within the data revolution

  • Determine the benefits of leveraging data analysis in instructional design

  • Utilize a framework for incorporating data analysis into their practice

  • Review a case study of how data analysis was used to evaluate a training program

   

Building content and deploying learning experiences: these are the main responsibilities we traditionally assign to the role of instructional designer. But the tasks of instructional designers are no longer so clear cut. The rapidly growing importance of data science has been steadily changing job functions in many fields, including instructional design. Data isn't a passing trend: ninety percent of the world's data has been created in the past few years, and considerable continued growth is expected in this arena.

 

The vast amount of available data, together with the explosion of data collection opportunities and technologies, has touched almost every industry, creating demand for hybrid roles that incorporate data analysis into everyday responsibilities. Instructional designers are not exempt from this data revolution and are increasingly expected to incorporate data into their decision making and design.

 

This new demand for an instructional designer/data analyst hybrid role may seem daunting to those not accustomed to operating in the data science field. However, online learning, which puts data at instructional designers' fingertips, offers a monumental opportunity. The foundational goal of the instructional designer is to create impactful learning experiences. Utilizing data supports this objective, increasing learning efficacy in four key ways:

 

(1) It can tell the instructional designer what content is appropriate for a specific audience without relying on guesswork or “going with your gut,” measures that are imperfect at best.

 

(2) It can help instructional designers utilize an Agile working process to iterate more quickly.

 

(3) It can evolve content in a personalized and efficient way that delivers the most impact for the learner.

 

(4) It can provide both qualitative and quantitative proof of the efficacy of a program or learning experience.

 

While the benefits of leveraging data to inform design are evident, the actual process of incorporating it into the field is not so simple. One of the biggest barriers to using data in instructional design is a lack of training focused on data skills. This new hybrid instructional designer/data analyst role calls for a set of skills that isn't typically taught as a package in higher education programs. Given this weak training ecosystem, it's no surprise that many instructional designers are not ready to take on data work in their roles.

 

The purpose of this presentation is to help instructional designers bridge this skill gap by providing a framework for approaching data and demystifying data collection, analysis, and implementation into a simple step-by-step process that anyone can follow.

 

To contextualize this framework for conference participants, this presentation will illustrate how the instructional design team at General Assembly faced this data challenge and the concrete steps we took to implement data into our course design strategy.

 

General Assembly, a global EdTech startup, works to assess, source, and train the modern workforce in an increasingly technological economy. One of the hallmarks of General Assembly's program offerings is its data science and analytics curriculum. General Assembly's data leadership, however, was not reflected in our instructional design team's regular practice.

 

Our instructional design team recognized this gap and the lost opportunities it represented, and decided to leverage the General Assembly data analysis framework in our course design.

 

Step 1: Set the Goal

 

Our instructional design team set out to examine the data available for the top ten and bottom ten performing asynchronous learning modules in Digital Foundations, one of our flagship online corporate training courses, which helps employees build digital fluency in areas such as the customer journey, digital leadership, digital marketing, user experience design, and data and analytics. The goal was to determine common lesson strengths and weaknesses and to gather insights into how to build more successful content in the future. This specific objective kept the data gathering and analysis process focused and working toward a concrete outcome.
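
 

To make this scoping step concrete, here is a minimal Python/pandas sketch of pulling the top and bottom ten modules by average rating. The file name and column names (lesson_metrics.csv, delivery, avg_rating) are illustrative assumptions, not General Assembly's actual schema.

    import pandas as pd

    # Hypothetical per-lesson metrics export; columns are illustrative.
    lessons = pd.read_csv("lesson_metrics.csv")

    # Rank asynchronous modules by average learner rating.
    ranked = (
        lessons[lessons["delivery"] == "asynchronous"]
        .sort_values("avg_rating", ascending=False)
    )

    top_ten = ranked.head(10)     # strongest performers
    bottom_ten = ranked.tail(10)  # weakest performers
    study_set = pd.concat([top_ten, bottom_ten])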

 

Step 2: Acquire Data

 

Once our instructional design team knew what our goal was, we needed to determine which data sources were available to us. Gathering data is rarely a straightforward step; often there is no single central repository for all product and student data. Our instructional design team collaborated with other members of the company, including product managers and engineers, to determine which data was available and where it was housed. From that collaboration, we were able to gather data from our LMS on quantitative lesson ratings, lesson completion rates, and qualitative lesson feedback. We added to this data pool our own information, such as lesson content type, mode, and age.
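
 

As a rough illustration of consolidating these sources, the sketch below joins a hypothetical LMS export with the team's own lesson metadata; the file and column names (lms_export.csv, lesson_metadata.csv, lesson_id, publish_date) are assumptions for the example, not our actual exports.

    import pandas as pd

    # Hypothetical exports: LMS metrics and the team's own lesson metadata.
    lms = pd.read_csv("lms_export.csv")        # ratings, completions, feedback text
    meta = pd.read_csv("lesson_metadata.csv")  # content type, mode, publish date

    # Join on a shared lesson identifier so every row carries both
    # platform metrics and design attributes.
    data = lms.merge(meta, on="lesson_id", how="inner")

    # Derive lesson age in days from the publish date.
    data["publish_date"] = pd.to_datetime(data["publish_date"])
    data["lesson_age_days"] = (pd.Timestamp.today() - data["publish_date"]).dt.days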

 

Step 3: Clean and Organize Data

 

Data often comes in a messy package. Before we could analyze our data, we had to remove redundancies, improve labeling, and translate thousands of entries into more easily understandable pivot tables. In addition, the team codified hundreds of qualitative feedback inputs provided by students, dividing comments into subcategories that could then be quantified and analyzed. Data cleaning and organization took time, but it was a necessary step toward creating a firm basis for analysis.
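
 

A minimal sketch of this cleaning and codification step, continuing from the merged table above. The keyword rules, category names, and column names are hypothetical; in practice the team categorized comments by hand rather than by keyword matching.

    import pandas as pd

    data = pd.read_csv("merged_lessons.csv")  # hypothetical merged export

    # Remove duplicate submissions and normalize column labels.
    data = data.drop_duplicates(subset=["lesson_id", "student_id"])
    data = data.rename(columns={"rating_1_5": "rating"})

    # Codify free-text feedback into subcategories via simple keyword
    # rules (hypothetical; the real categorization was done manually).
    def categorize(comment):
        text = str(comment).lower()
        if "assessment" in text or "quiz" in text:
            return "assessment_confusion"
        if "too long" in text or "pace" in text:
            return "pacing"
        return "other"

    data["feedback_category"] = data["feedback_text"].apply(categorize)

    # Pivot thousands of rows into a readable summary table.
    summary = pd.pivot_table(
        data,
        index="content_area",
        columns="feedback_category",
        values="rating",
        aggfunc=["mean", "count"],
    )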

 

Step 4: Analyze and Visualize Data

 

At this step, our instructional design team leveraged the cleaned and organized data to create correlation maps between lesson rating and lesson age, content area (digital leadership, user experience design, customer journey, data and analytics), mode (such as full lesson or shortened interview format), and number of completions. In addition, we created and examined visual representations of the codified feedback data to further identify trends in highly rated and poorly rated lessons.
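
 

One way to build such a correlation map, sketched under the same assumed column names as the earlier examples: one-hot encode the categorical attributes, compute a correlation matrix, and render it as a heatmap.

    import pandas as pd
    import matplotlib.pyplot as plt

    data = pd.read_csv("merged_lessons.csv")  # hypothetical cleaned export

    # One-hot encode categorical attributes so they can enter the
    # correlation matrix alongside the numeric metrics.
    numeric = pd.get_dummies(
        data[["rating", "lesson_age_days", "completions", "content_area", "mode"]],
        columns=["content_area", "mode"],
        dtype=float,
    )

    corr = numeric.corr()

    # Render the correlation matrix as a simple heatmap.
    fig, ax = plt.subplots(figsize=(8, 6))
    im = ax.imshow(corr, cmap="coolwarm", vmin=-1, vmax=1)
    ax.set_xticks(range(len(corr.columns)))
    ax.set_xticklabels(corr.columns, rotation=90)
    ax.set_yticks(range(len(corr.columns)))
    ax.set_yticklabels(corr.columns)
    fig.colorbar(im)
    fig.tight_layout()
    plt.show()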

 

As a result of this visualization, our team gained insights such as the following:

  • Poorly rated lessons tended to improve in ratings over time. This showed us that we should wait at least six months before deciding whether a lesson needed to be adjusted.

  • Our most consistently high rated lessons fell into our Digital Leadership content area, and our consistently low rated lessons fell into the User Experience Design area. When we looked further into our qualitative data, we saw that lessons that were more general in nature and could apply to a variety of roles were more successful than those that were very role-specific, such as UX design. Through this, we determined that future lessons for Digital Foundations should be contextualized for a broader audience.

  • Lessons with qualitative comments about confusing assessment questions were consistently rated poorly. This finding let us see the quantitative impact of imperfect instructional design and prioritize making these fixes. (A short sketch of quantifying findings like these follows this list.)
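
As promised above, here is a brief sketch, under the same assumed column names, of how insights like these can be quantified with simple group-bys and comparisons.

    import pandas as pd

    data = pd.read_csv("merged_lessons.csv")  # hypothetical cleaned export

    # Average rating by content area, lowest first.
    print(data.groupby("content_area")["rating"].mean().sort_values())

    # Compare lessons flagged for confusing assessments against the rest.
    flagged = data["feedback_category"] == "assessment_confusion"
    print("Avg rating, flagged lessons:", data.loc[flagged, "rating"].mean())
    print("Avg rating, other lessons:  ", data.loc[~flagged, "rating"].mean())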

 

Step 5: Communicate Data

 

Once we completed our analysis, we applied data storytelling best practices to communicate our findings to the larger product and leadership teams. Armed with data, we made a case for immediate quick fixes to Digital Foundations, longer-term development of the course, and cross-product strategies we would bring to our instructional design practice in general. Based on the story we built around our data collection and analysis, leadership was motivated to integrate our recommendations into the instructional design team's official product roadmap.

 

Step 6: Implement Findings

 

When our instructional design team started our analysis of Digital Foundations, we had only a vague idea of the performance of our product. While we had worked with subject matter experts to create the content, subjected the course to rigorous internal QA, and tested its demand in the market using a pilot group, we did not understand how it was performing in its large-scale implementation.

 

Our data analysis helped us understand not only how the product was doing, but also what concrete steps we could take in the near and long term to improve and iterate on the product and provide learners with a more applicable and impactful learning experience. Before the end of Q4 2016, we will make all assessment question updates as a quick and easy win for course improvement. In Q1 2017, our roadmap includes scoping capacity allocation for refreshes of low performing courses and expansion of course offerings in our Digital Leadership area, our top performing section.

 

Data analysis is a crucial part of the instructional design process. Instructional designers should leverage data during each stage of the build process, from conducting data analysis on current courses to determine product direction, to incorporating data from usability testing to improve design, to making iterations based on data in the post-build stage.

 

Participants will walk away from this presentation armed with a data analysis framework that they can apply to their own work. They will not only understand the important role data plays but also be able to move from understanding to application, building learner-centric content more efficiently and effectively.

Conference Track: 
Challenging Barriers to Innovation
Session Type: 
Education Session
Intended Audience: 
Design Thinkers
Training Professionals
Technologists