Lessons Learned from Implementing an Adaptive Learning Courseware Across Multiple Public Institutions in Maryland

Audience Level: 
All
Institutional Level: 
Higher Ed
Abstract: 

This presentation will give an overview of the Adaptive Learning in Statistics (ALiS) project, a multi-year implementation of adaptive learning courseware in introductory statistics across multiple Maryland public institutions. The key impact findings on learning outcomes and the lessons learned from the first pilot year will be discussed.

Extended Abstract: 

Presentation Goals

The presentation will provide an overview of the Adaptive Learning in Statistics (ALiS) project, a multi-year effort to study the efficacy of adaptive learning courseware in introductory statistics courses across nine public institutions in Maryland. We will discuss the process for implementing the ALiS course as well as the key findings from the impact analysis, which compares student outcomes between the pilot and traditional sections in the first year of course offerings (AY17-18). We will share the lessons learned and challenges faced from both the implementation and research perspectives. The audience will have a chance to engage in a discussion about the opportunities and challenges of adopting adaptive learning courseware at a system-wide level, and possible strategies for overcoming those challenges.

Context

Completion of college-level requirements in mathematics is one of the most common and vexing impediments to college completion, especially for students from lower-income families and underrepresented minorities, whose high-school preparation in math is often insufficient (see Table 104.91 in Snyder, de Brey, & Dillow, 2016, p. 51; also see Bailey & Cho, 2010, and Zeidenberg, Jenkins, & Scott, 2012). For this reason, important efforts to improve student outcomes in higher education have focused on the application of learning science and the use of digital learning platforms to improve the delivery of math content, and on the identification of common learning outcomes and broadly accepted standards in foundational courses – such as introductory statistics.

Ithaka S+R and TPSE Math, as the co-principal investigators of the ALiS project, aimed to test whether an active learning approach, based on a set of standardized learning outcomes and built on a sophisticated adaptive learning platform, can unify content and improve learning outcomes in gateway math courses. The assumption is that this strategy can – with the commitment of system-wide faculty and administrative leadership – improve learning outcomes across a potentially large group of two-year and four-year institutions, particularly for at-risk students.

The ALiS course was co-developed and tested during 2016-17 by the course design working group at the University of Maryland College Park (UMCP) and Montgomery College (MC) under the direction of the ALiS project team,[1] with leadership from Ithaka S+R and the Kirwan Center for Academic Innovation of the University System of Maryland (USM). The project team supported the delivery of the course at the pilot institutions by providing professional development and training for instructors and by facilitating communication and resource-sharing through regular check-in calls and a project-wide virtual learning community.

Building on the experience gained during pre-pilot offerings of the ALiS course during 2016-17 at UMCP and MC, and with significant funding from the Bill & Melinda Gates Foundation, the team was able to scale the delivery of the course and its pedagogical approaches to seven additional institutions in Maryland.[2] The Urban Institute undertook a rigorous evaluation to determine whether the adaptive learning technologies, in combination with a thoughtful pedagogical approach, could measurably improve student learning outcomes without negatively affecting course satisfaction.

Questions

The assessment of the AY17-18 pilot offerings conducted by the Urban Institute research team aimed to answer the following questions:

  • Did the combination of the use of the adaptive learning courseware and active learning pedagogy improve student learning outcomes (i.e., grade points and statistics knowledge), student completion rates, and student satisfaction with the course?

  • What lessons learned can be drawn from the preliminary findings that can inform the research field and other institutions’ efforts to launch similar initiatives?

Methods

The Urban Institute led an independent evaluation of the effectiveness of the ALiS intervention in improving student learning outcomes. The research employed a matched-pair design, with a preference for same-instructor matched pairs, in which the pilot and traditional sections were taught concurrently by the same instructor. Where same-instructor matched pairs were infeasible, the team matched sections taught by different instructors that were offered in the same format and at the same or similar times of day. In Fall 2017, seven institutions were included in the study, with 20 matched pairs – 12 of which were taught by the same instructor. In Spring 2018, two additional institutions joined the pilot, for a total of 30 matched pairs – 21 with the same instructor.

The research team used de-identified student records, data from the Acrobatiq platform, data from baseline and end-of-semester student surveys, and an end-of-semester instructor survey to inform the analysis. The end-of-semester surveys were administered to both instructors (pilot only) and students (pilot and traditional) to better understand the course structure and participant characteristics, as well as their satisfaction with the ALiS intervention. The research team also coordinated among institutions to implement a common final assessment, validated and refined each semester through item response theory (IRT) analysis. The common final assessment was administered to both pilot and traditional students at the end of the semester as part of high-stakes exams, to measure and compare the level of statistics knowledge between the pilot and traditional sections.
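As a rough illustration of the kind of IRT calibration mentioned above, the sketch below fits a one-parameter (Rasch) model to synthetic item responses by joint gradient ascent. The model choice, sample sizes, and tuning are all assumptions made for illustration; they are not the project's actual assessment-validation procedure.

```python
import numpy as np

rng = np.random.default_rng(1)
n_students, n_items = 200, 20  # hypothetical sample sizes

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical synthetic data: P(correct) = sigmoid(ability - difficulty).
ability_true = rng.normal(0.0, 1.0, n_students)
difficulty_true = rng.normal(0.0, 1.0, n_items)
p_true = sigmoid(ability_true[:, None] - difficulty_true[None, :])
responses = (rng.random((n_students, n_items)) < p_true).astype(float)

# Joint gradient ascent on the Rasch log-likelihood.
ability = np.zeros(n_students)
difficulty = np.zeros(n_items)
lr = 0.5
for _ in range(300):
    resid = responses - sigmoid(ability[:, None] - difficulty[None, :])
    ability += lr * resid.mean(axis=1)     # gradient step for each student
    difficulty -= lr * resid.mean(axis=0)  # gradient step for each item
    ability -= ability.mean()              # pin the latent scale's location

# Items with poorly behaved difficulty estimates would be candidates for
# revision in the next semester's version of the assessment.
corr = np.corrcoef(difficulty, difficulty_true)[0, 1]
print(f"correlation with true difficulties: {corr:.2f}")
```

An operational calibration would typically use a dedicated IRT package and also examine discrimination and item-fit statistics, which this sketch omits.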

The analysis employed multivariate regression with matched-pair fixed effects and standard errors clustered by section. The outcomes analyzed were course grade, passing with a C or better, pass rate, statistics knowledge (measured by the common final assessment), and student satisfaction with the course. The analysis was conducted within each institution, by institution type (two-year and four-year), and across all colleges. Because student participation was unequal across institutions, colleges were equally weighted in the aggregate analysis to adjust for over- and underrepresentation in the study sample.
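The matched-pair fixed-effects estimate described above can be sketched on synthetic data as follows. Every name, effect size, and sample size here is hypothetical, chosen only to mirror the design (pilot/traditional sections within matched pairs); this is not the Urban Institute's actual analysis code.

```python
import numpy as np

rng = np.random.default_rng(0)
n_pairs = 30     # hypothetical number of matched section pairs
n_students = 25  # hypothetical students per section

rows = []
for pair in range(n_pairs):
    pair_effect = rng.normal(0, 0.3)        # baseline shared by the matched pair
    for pilot in (0, 1):                    # 0 = traditional, 1 = pilot section
        section_shock = rng.normal(0, 0.1)  # section-level noise (the cluster)
        for _ in range(n_students):
            grade = (2.5 + pair_effect + 0.169 * pilot
                     + section_shock + rng.normal(0, 0.8))
            rows.append((pair, pilot, grade))
arr = np.array(rows)
pair_id, treat, grade = arr[:, 0], arr[:, 1], arr[:, 2]

def demean_within(groups, x):
    """Subtract each group's mean from x (absorbs the group fixed effects)."""
    out = np.empty_like(x)
    for g in np.unique(groups):
        m = groups == g
        out[m] = x[m] - x[m].mean()
    return out

# Pair fixed effects are absorbed by within-pair demeaning; the OLS slope on
# the demeaned data is the matched-pair estimate of the pilot effect.
y = demean_within(pair_id, grade)
t = demean_within(pair_id, treat)
beta = (t @ y) / (t @ t)
print(f"estimated pilot effect on grade: {beta:.3f}")
```

In practice one would also compute the section-clustered standard errors (e.g., via a regression package's cluster-robust covariance option) rather than the point estimate alone.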

Preliminary Results

(Note: We only report partial results from the AY17-18 pilot year here since the data collection for Spring 2018 is still ongoing. The November presentation will cover the full year’s results.)

The estimated impact analysis of the ALiS intervention in Fall 2017, aggregated across all seven colleges, revealed marginal, albeit positive, impacts on student learning outcomes. At the aggregate level, students in the pilot sections earned grades that were 0.169 GPA points higher on average than those in the traditional sections (p<0.10). However, no significant impacts emerged in pass rate, statistics knowledge, or student satisfaction. When the analysis was constrained to same-instructor matched pairs, no significant differences emerged in grade or pass rate, but students in the pilot sections scored 0.161 standard deviations higher on statistics knowledge (p<0.05) while rating their satisfaction 0.622 points lower on a five-point scale (p<0.01). Subgroup impacts were minimal and not significant. The institution-specific results revealed variation in the precise estimates across institutions, but a general pattern emerged at the two-year colleges of statistics knowledge and satisfaction moving in opposite directions: in two colleges neither estimate was significant, while in two others the estimates were significant in opposite directions (higher knowledge with lower satisfaction in one, lower knowledge with higher satisfaction in the other). The four-year colleges generally had positive results on all measures, though only some were significant.

Discussion

The Urban Institute research team is gathering and analyzing the final set of data from AY17-18 to report on the overall impact of the ALiS intervention on student learning and satisfaction. Whether this kind of effort can be sustained in the long run or expanded further within Maryland, and what impact it may have on addressing college cost and transfer-of-credit issues, remains unclear and requires future work and research.

References

Bailey, T., & Cho, S. W. (2010). Developmental education in community colleges. Issue brief prepared for the White House Summit on Community Colleges. Community College Research Center, Teachers College, Columbia University, New York, NY.

Snyder, T. D., de Brey, C., & Dillow, S. A. (2016). Digest of Education Statistics 2014 (NCES 2016-006). National Center for Education Statistics.

Zeidenberg, M., Jenkins, D., & Scott, M. A. (2012). Not just math and English: Courses that pose obstacles to community college completion (CCRC Working Paper No. 52). Community College Research Center, Columbia University, New York, NY.

---

[1] In addition to UMCP and MC, the ALiS project team includes Ithaka S+R as the overall project manager; the Kirwan Center for Academic Innovation of the University System of Maryland (USM) as the project advisor and coordinator; Acrobatiq as the provider of the adaptive learning platform and courseware and of support for course delivery; and the Urban Institute as the project evaluator.

[2] The seven additional institutions are: Anne Arundel Community College, Community College of Baltimore County, Frostburg State University, Harford Community College, Towson University, University of Maryland Baltimore County, and Wor-Wic Community College. Note that Towson University and Wor-Wic Community College were not part of the pilot in Fall 2017.

Conference Track: 
Innovations, Tools, and Technologies
Session Type: 
Education Session
Intended Audience: 
All Attendees