As academic technologies are more deeply integrated into higher education, the ‘data exhaust’ they emit can provide insights into student success, technology adoption, and instructional design. What should we do with large- and small-scale research findings about student learning? What barriers prevent further progress, and how can we overcome them?
As academic technologies mature and are more deeply integrated into higher education, the ‘data exhaust’ from these applications can provide surprising insights into student success, technology adoption, and instructional design. In his 2008 dissertation, John Campbell, founder of the Signals project and formerly of Purdue University, asked what academic institutions (and, consequently, the vendors providing them with applications that collect data) ought to do with their analytics findings, which he called the “ethical obligation of knowing”. What results have we seen in the past decade, and what are the ethical implications of this work?
In this presentation, we assemble a group of academic leaders and EdTech analytics gurus to talk about what they’re discovering through their work analyzing data from academic technologies, the surprises and pivots they are taking as they try ideas and learn from practice, and what barriers we need to overcome. We will illustrate conceptual ideas with large-scale empirical research results from using analytics in practice.
Questions and topics we will discuss include:
- Beyond their own observations in class, should faculty be informed of “big data” [institutional] predictive analytics results identifying at-risk students? If so, when and how should this be done, lest they inadvertently become biased against helping students they may feel are destined to fail anyway?
- If we first want to preserve student agency and responsibility for learning, how can we convey what we think we know about their likelihood of success without tainting a teachable moment for them to develop or demonstrate grit, persistence, and a growth mindset?
- If research supports, and we accept, student use of digital learning environments as a proxy for their engagement, how might we use learning analytics findings to identify and inform effective course designs?
- What role, if any, might analytics findings play in informing the work of advisors to raise student awareness and encourage help-seeking behaviors? By default, advisors may have a wider view across all of their advisees’ courses, whereas faculty may have a deeper view of students’ learning within their specific courses.
- Is it possible that an institution’s culture, will, and even conception of human learning is revealed in the types of interventions it does or does not deploy? According to the Hobsons Starfish Intervention Inventory (formerly the Predictive Analytics Reporting (PAR) Student Success Matrix), an intervention is “any program, service, offering, action, intervention or policy that supports or assists students in the successful completion of a given course and/or completion of degree or credential of value in the workplace.” These can be proactive, aimed at preventing issues, or reactive, addressing issues as they arise. More info: Starfish Predictor Definitions
- Finally, in light of abuses by Facebook and Cambridge Analytica, and growing privacy concerns in the European Union, is the window closing on the perception and potential of what learning analytics can achieve? What principles ought to guide institutions in their use of student data? A good place to start may be the 2017 IMS Global “Learning Data and Analytics Key Principles” white paper.
Panelists (note: several speakers did not have profiles in the system, so they are listed here)
· John Fritz, Associate Vice President, Instructional Technology, UMBC
· Daniel Green, Director, Product Analytics, VitalSource
· Jared Stein, Vice President of Higher Education Strategy, Product, Instructure
· Jenn Stringer, Chief Academic Technology Officer & Asst. Vice Chancellor Teaching and Learning, UC Berkeley
· John Whitmer, Director Analytics & Research, Blackboard (panel facilitator)