Using advanced adaptive learning technologies, we are exploring whether it is possible to design courses with a dynamic system that serves multiple cognitive levels and an assessment model that enables each student to reach their maximum potential, and to build a framework that facilitates the instructional design process.
Most higher education courses today employ a “one size fits all” teaching philosophy. Students in a course receive the same instructional activities and are assessed with the same benchmarks against learning outcomes, regardless of aptitude or preparedness. While some students will receive higher assessed learning outcomes (grades) than others, and some students might fail, there is very little effort to differentiate instructional activities across students of different skill levels. What if courses could be designed such that the best students are provided opportunities to be challenged more than they would be in a normal class? And what if, at the same time, instructional activities could be designed for weaker students so that they could more easily demonstrate basic competency in course learning outcomes?
Our proposed presentation reports results from an ongoing study that aims to develop a simple and scalable design to improve learning outcomes and student readiness using adaptive, differentiated, tutoring, and universal design learning strategies. Using advanced technologies, we are exploring whether it is possible to design courses with a dynamic system for multiple cognitive levels and an assessment model to enable each student to reach their maximum potential.
This project is innovative, timely, and exciting. Because adaptive learning is identified as one of the top 10 technology trends in higher education in 2016 (Educause, 2016; Gartner, Inc., 2016; New Media Consortium, 2016), we are compelled to explore building a framework for this technology enhancement in an innovative way that accelerates every student’s learning potential. We seek a design framework for adaptive learning that can be generalized across knowledge domains while also scaling to larger numbers of students. The framework is founded upon a conceptualization of student readiness (itself undergoing development and refinement through this project) that integrates design frameworks addressing individual learning needs, such as differentiation, tutoring, universal design, and blended learning.
Broadly speaking, readiness is a central concept within the context of student learning. By readiness, we refer to students’ differing levels of skills, knowledge, or dispositions that influence knowledge acquisition and cognitive processes (see Anderson & Krathwohl, 2001). Readiness may also be contextualized as the acting agent of a PLE (Personalized Learning Environment or Experience; see Dabbagh & Kitsantas, 2012), in that within any learning context each student uses their unique combination of aptitude, experiences, motivations, and dispositions to actively leverage and mash up the resources needed for each activity to meet learning objectives and/or achieve goals.
Our position is that lectures, online instructional course designs, and other approaches suffer from a singular perspective: the design targets a level of skill or knowledge acquisition that often lies between the strongest and the weakest student. We view readiness, whether for a lecture, a homework assignment, or an exam, as an action-related aspect of every student’s personalized learning experience. As such, readiness should be an instructional design consideration, where instructors plan for a range of student readiness in such a way that all students have the opportunity to grow. We aim to refine the readiness construct and, with it, build an instructional design framework for creating learning solutions in an adaptive learning system such as Smart Sparrow.
Exploratory Study Target Questions
- Using adaptive system technologies, is it possible to design an instructional solution that raises competency across all levels of student ability?
- Are adaptive system technologies and instructional designs effective with highly diverse graduate-level STEM students?
- Do we confirm previous research results* using similar technologies and techniques that the overall range of student achievement outcomes decreases?
- Do we confirm and extend previous research results* using similar technologies and techniques that graduate-level STEM students perceive an adaptive system as helping them learn course material better (student and faculty satisfaction)?
- Can we quantify a concept of student readiness to facilitate the design and use of adaptive technologies in instructional contexts?
- How easily can such an adaptive solution be implemented?
* (See Dziuban, Moskal, & Howlin, Fall 2014; Dziuban & Moskal, 2016)
Study Context
The exploratory two-year study is in its first year at Keck Graduate Institute (KGI), a member of the Claremont Colleges Consortium. Students at KGI come from a variety of STEM backgrounds and enroll in interdisciplinary applied science and management coursework designed with the goal of launching careers in the bioscience industries. We selected a course titled “Introduction to the Bioscience Industries” (“ALS 359”), whose learning outcomes emphasize qualitative and quantitative problem solving and oral and written communication skills. Readiness is an issue due to the diversity of undergraduate backgrounds. For example, many students with biology backgrounds have weaker prior experience in quantitative problem solving compared to most engineers, while many students from liberal arts colleges have stronger readiness for communication activities compared to students from larger universities. The project plan includes expansion to collaborators at other institutions and other subject area domains.
Conceptual Framework
Drawing from the literature and from practice guidelines in differentiated design, universal design, tutoring, and adaptive learning, the project intends to integrate these design approaches into a new conceptual framework and then to iteratively assess the framework’s effectiveness. Further, the project aims to explore and refine how the conceptual framework helps an instructor design and implement adaptive technologies in STEM coursework.
Methods
Over the course of two years, the project will leverage multiple sections of the ALS 359 course, using a controlled study format in which the technique is implemented in experimental sections while achievement outcomes are collected from other sections serving as control groups. Ten iterations of the ALS 359 course will be assessed: 6 traditional classroom sections of 40-50 registered students each, of which 4 will be treatment and 2 control, and 4 online sections of about 16 students each, of which 2 will be treatment and 2 control.
In the treatment sections, which include an integrated adaptive learning solution, the design format is a Design, Implement, Measure, Report, Adjust, and Repeat cycle, following a Design and Development Research method (Richey & Klein, 2007) that integrates design improvement into each iteration and looks for differences in a) student academic performance and b) design approach, in order to iteratively improve the adaptive framework.
Data Collection
Student experience data (e.g., selected choices, time spent in solution ‘frames’, scores on assessments) are collected from 1) the adaptive technology system, to explore design effectiveness and instructor perception of students’ range of readiness; and 2) exchanges between students and instructor in both control and treatment course sections, in the form of emails and session notes. A survey instrument is used to collect student perceptions of using the adaptive solution. Reflections on adaptive designs from the instructor and designer(s) are collected during and after design cycles to capture the thought processes and other considerations that go into designing adaptive solutions. Experience data are used to learn how instructors can relate subject matter to system capabilities, informing a readiness construct that facilitates the use of adaptive technologies.
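As a concrete illustration, the experience data described above could be represented with a simple record structure. The field names below are our own assumptions for illustration and not the adaptive platform’s actual export schema:

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative record for one student interaction event; field names
# are assumptions, not the adaptive system's real export format.
@dataclass
class ExperienceEvent:
    student_id: str
    frame_id: str                      # solution 'frame' the student visited
    selected_choice: Optional[str]     # choice made in the frame, if any
    seconds_in_frame: float            # time spent in the frame
    assessment_score: Optional[float] = None  # score, when the frame assesses

events = [
    ExperienceEvent("s01", "intro-1", "B", 42.5),
    ExperienceEvent("s01", "quiz-1", "A", 88.0, assessment_score=0.75),
]

# Example aggregation: total time student "s01" spent across frames
total_time = sum(e.seconds_in_frame for e in events if e.student_id == "s01")
print(total_time)  # 130.5
```

Structuring each interaction as one event record like this makes it straightforward to aggregate by student, by frame, or by course section when comparing control and treatment groups.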
Analysis
The team employs both quantitative and qualitative analysis strategies, such as descriptive statistics comparing readiness indicators and regression analyses of the aggregated dataset across control and treatment groups to build prediction models, as well as collaborating with and leveraging work by Dziuban and Moskal (2014, 2016).
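For illustration, a minimal sketch of one descriptive comparison tied to the target questions (whether the range of achievement outcomes narrows in treatment sections) might look like the following; the scores are invented placeholder data, not study results:

```python
from statistics import mean

# Invented placeholder achievement scores (0-100) for illustration only;
# real data would come from ALS 359 course assessments.
control_scores = [55, 62, 70, 78, 85, 93]
treatment_scores = [68, 72, 75, 79, 82, 86]

def outcome_range(scores):
    """Spread of achievement outcomes: max score minus min score."""
    return max(scores) - min(scores)

# Target question: does the range of outcomes decrease under treatment?
print(round(mean(control_scores), 1), outcome_range(control_scores))      # 73.8 38
print(round(mean(treatment_scores), 1), outcome_range(treatment_scores))  # 77.0 18
print(outcome_range(treatment_scores) < outcome_range(control_scores))    # True
```

In this invented example the treatment group shows a comparable mean with a much smaller spread, the pattern the target questions probe; the actual analyses would test this across all ten course iterations.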
What Session Participants Can Expect
Participants can expect a standard presentation format in which the study team outlines project goals, technology tools, measurement strategies, student profiles (graduate students in STEM programs), brief course descriptions and delivery context, and findings from a study in progress. The session will include opportunities for questions and discussion. Presentation slides will be made available, as well as any design templates (as works in progress) to support designing adaptive learning strategies using our readiness framework.
Presenters
George Bradford, Ph.D., Director of Instructional Design and Development, Keck Graduate Institute, part of the Claremont Colleges Consortium. George is an educational professional with over 30 years of experience leading, supporting, developing, and managing integrative learning opportunities and online support systems for all levels of faculty, students, and business professionals at an international level. His previous research explored the relationship between student satisfaction and cognitive load. LinkedIn Page: https://www.linkedin.com/in/georgerbradford
Steven Casper, Ph.D., Dean, School of Applied Life Sciences, Keck Graduate Institute. Dr. Casper is a social scientist with over two decades of experience as a higher education educator, researcher, and administrator. He is an advocate of active learning techniques within STEM education and has received grants in this area from the Fletcher Jones, Parsons, and Weinberg Foundations. LinkedIn Page: https://www.linkedin.com/in/steven-casper-77155840
Meghana Joshi, Ph.D., Director of the Biocon-KGI program, Keck Graduate Institute. Dr. Joshi is a molecular biologist and a consultant to the biopharmaceutical industry. She has several years of teaching and research experience in both undergraduate and graduate-level STEM programs and institutions in the US as well as in India. She directs and teaches a unique online program for a top biotechnology company in India to help bridge the gap between industry and academia. LinkedIn Page: https://www.linkedin.com/in/joshimeghana
References
Anderson, L. W., Krathwohl, D. R., & Bloom, B. S. (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom's taxonomy of educational objectives. Allyn & Bacon.
Dabbagh, N., & Kitsantas, A. (2012). Personal Learning Environments, social media, and self-regulated learning: A natural formula for connecting formal and informal learning. The Internet and higher education, 15(1), 3-8.
Educause ECAR. (2016). Trend Watch 2016: Which IT Trends Is Higher Education Responding To? Retrieved from: https://library.educause.edu/resources/2016/3/trend-watch-2016-which-it-trends-is-higher-education-responding-to
Gartner Inc. (2016). Learning analytics: passe, panacea, or pathway to success? Presentation by Glenda Morgan.
Johnson, L., Adams Becker, S., Cummins, M., Estrada, V., Freeman, A., & Hall, C. (2016). NMC Horizon Report: 2016 Higher Education Edition. Austin, Texas: The New Media Consortium.
Richey, R. C., & Klein, J. D. (2007). Design and development research: Methods, strategies, and issues. Mahwah, NJ: Lawrence Erlbaum Associates.