Test Design Strategies and Automated Proctoring: Practical Tips to Deter Cheating Online and Offline

Audience Level: 
All
Session Time Slot(s): 
Institutional Level: 
Higher Ed
Abstract: 

The use of high-stakes testing in online courses has grown rapidly in higher education, and maintaining integrity and deterring students from cheating has become a pressing concern among faculty. In this session, attendees will explore specific test design strategies and tools and discuss the impact automated proctoring has on teaching and learning.

Extended Abstract: 
Introduction

Maintaining academic integrity is a continuous challenge for faculty teaching fully online, blended, and traditional face-to-face courses. Since Baylor University first began teaching in the online space with the Hankamer School of Business’ online MBA (Master of Business Administration) program and undergraduate courses from the College of Arts & Sciences in 2015, concern among faculty about effectively minimizing cheating on high-stakes exams and assessments has grown. It is a common misconception that cheating is more likely to occur in an online environment simply because faculty and students are physically separated from each other. In fact, studies comparing cheating in online versus face-to-face learning environments have yielded mixed results (Grijalva, Nowell, & Kerkvliet, 2006; Lanier, 2006; LoSchiavo & Shatz, 2011; Stuber-McEwen, Wiseley, & Hoggatt, 2009). The “remoteness” of online learning environments does, however, make it challenging to monitor different types of assessment activities, especially high-stakes tests and exams.

During the 2018-2019 academic year, Baylor University selected Proctorio as its automated proctoring solution, one that integrates with the university’s official learning management system (LMS), Canvas. The institution’s decision to adopt such a solution was heavily driven by (a) the high costs of traditional online live proctoring services and (b) a desire to encourage faculty to re-think what it means to assess and measure student learning beyond traditional testing. Through the university’s instructional design team and faculty development opportunities, Baylor online instructors have:

  • been presented with several test design strategies and tools available through the learning management system that can deter cheating in any learning environment.
  • been introduced to Proctorio, Baylor University’s current automated proctoring solution.
  • discussed alternative forms of assessment.

In this discovery session, attendees will first gain insight into specific test design strategies and LMS features that Baylor faculty have found to be effective alternatives to using any form of proctoring for online tests. Second, attendees will have an opportunity to discuss some faculty members’ decisions to use Proctorio for high-stakes testing in their summer online undergraduate courses or fully online graduate professional education programs. This includes a deeper exploration of Proctorio’s LMS integration and the toolsets that allow faculty to control the level of proctoring, ranging from enabling specific steps in the identity verification process to tracking eye movement and other behaviors during tests. Alternative forms of assessment will also be an underlying theme across the two areas of focus described above.

Test Design Strategies and Tools

Delivering traditional summative assessments (e.g., mid-terms, chapter tests, and final exams) composed of multiple-choice, multiple-answer, true/false, matching, and short-answer items is not the only strategy for measuring student learning. Test design tips include delivering more frequent short tests or quizzes so students stay on track with the subject matter, presenting higher-order mastery questions that require deeper knowledge and application of the material, constructing test banks that mix traditional items with higher-order mastery questions and grow over time, and designing tests that are “open book.”

While not perfect solutions, the following Canvas test features are ones many Baylor faculty have found significantly reduce cheating on tests: randomizing test questions with question groups, shuffling answers, customizing question banks, allowing multiple attempts, controlling whether students can see their test responses and the correct answers when preparing for mid-term and final exams, requiring an access code, setting quiz availability dates and times, and preventing students from viewing quiz scores. Respondus LockDown Browser and Respondus Monitor are additional tools some Baylor faculty have taken advantage of. This session will explore these and several other test design strategies and LMS features faculty may want to consider to minimize or deter cheating on an online test.
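For faculty or instructional designers who prefer to script these settings rather than configure them through the quiz options screen, the sketch below illustrates how a few of the deterrents named above (shuffled answers, an access code, hidden results, a single attempt, and a tight availability window) might be applied through the Canvas REST API. It is a minimal, hedged illustration only: the institution URL, course and quiz IDs, API token, and dates are hypothetical placeholders, and the parameter names should be verified against your institution’s Canvas API documentation before use.

```python
import requests

# A minimal sketch, assuming a hypothetical Canvas instance URL, course/quiz IDs,
# and API token; verify parameter names against your Canvas API documentation.
BASE_URL = "https://your-institution.instructure.com/api/v1"
COURSE_ID = 12345      # hypothetical course ID
QUIZ_ID = 67890        # hypothetical quiz ID
TOKEN = "YOUR_CANVAS_API_TOKEN"

headers = {"Authorization": f"Bearer {TOKEN}"}

# Quiz settings that mirror several of the deterrents described above.
payload = {
    "quiz": {
        "shuffle_answers": True,              # shuffle answer choices
        "access_code": "fall-exam",           # require an access code to begin
        "hide_results": "always",             # keep students from reviewing responses
        "show_correct_answers": False,        # withhold correct answers
        "allowed_attempts": 1,                # single attempt
        "one_question_at_a_time": True,       # present one question at a time
        "time_limit": 75,                     # minutes allowed
        "unlock_at": "2019-10-15T13:00:00Z",  # availability window (UTC)
        "lock_at": "2019-10-15T15:00:00Z",
    }
}

# Update an existing quiz in place.
resp = requests.put(
    f"{BASE_URL}/courses/{COURSE_ID}/quizzes/{QUIZ_ID}",
    headers=headers,
    json=payload,
)
resp.raise_for_status()
print("Updated quiz:", resp.json().get("title"))
```

The same settings can, of course, be applied entirely through the Canvas quiz options interface; the script simply shows how they could be standardized across many course sections at once.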

Automated Proctoring with Proctorio

Proctorio is Baylor’s secure, automated online proctoring service that integrates with Canvas. The solution allows faculty to set basic and advanced controls for each exam with a few mouse clicks, and advanced biometrics monitor and flag suspicious behavior. An instructor dashboard accessed through Canvas provides timestamped flags of when suspicious behavior occurred, access to the recorded (video and audio) test session, and other user analytics. This session will present specific Proctorio features that Baylor faculty have used in recent online courses, including audio and video (screen, web camera, and test environment) recording options; lock-down options such as preventing re-entry and blocking downloads from the web; verification options, including identification card checks and audio/video verification; and basic to advanced behavior settings. In showcasing these toolsets and features, the pedagogical/andragogical benefits and challenges of using an automated proctoring solution will be discussed.

Alternative Assessments: An Underlying Theme

Alternative assessment methods such as case studies, writing assignments, collaborative projects, and debates can avoid several problems often associated with traditional testing in both online and face-to-face learning environments. Palloff and Pratt (2013) suggest selecting assessment methods that are more learner-centered and authentic. Woven throughout the two focal points of test design strategies/tools and automated proctoring, this discovery session will also highlight alternative assessment methods including, but not limited to, electronic portfolios, simulations, debates, interviews, and digital media assignments.

References
  • Grijalva, T., Nowell, C., & Kerkvliet, J. (2006). Academic honesty and online courses. College Student Journal, 40(1), 180-185.
  • Lanier, M. (2006). Academic integrity and distance learning. Journal of Criminal Justice Education, 17(2), 244-261.
  • LoSchiavo, F. M., & Shatz, M. A. (2011). The impact of an honor code on cheating in online courses. MERLOT Journal of Online Learning and Teaching, 7(2).
  • Palloff, R. M., & Pratt, K. (2013). Lessons from the virtual classroom: The realities of online teaching. Jossey-Bass.
  • Stuber-McEwen, D., Wiseley, P., & Hoggatt, S. (2009). Point, click, and cheat: Frequency and type of academic dishonesty in the virtual classroom. Online Journal of Distance Learning Administration, 12(3), 1-10.
Outcomes/Goals
  • Attendees will identify settings found on most learning management systems (LMS) that enhance online test security.
  • Attendees will identify and discuss test design strategies that can reduce occurrences of cheating on online tests and exams.
  • Attendees will explore engaging learner-centered and authentic instructional activities that can be used as alternative assessments to traditional tests both in online and face-to-face learning environments.
  • Attendees will describe common faculty perceptions and misconceptions of cheating in traditional face-to-face vs. online learning environments.
Effective Practice Criteria
  • Innovation:  This discovery session will introduce current and new strategies for reducing and deterring cheating on online tests by exploring test design strategies and learning management system features, discussing the implications of automated online proctoring solutions, and considering alternative assessments, all while maintaining academic integrity.
  • Replicability:  All techniques and strategies presented in this session can be effectively and efficiently implemented at other universities for any online, blended, and face-to-face learning environment.
  • Impact:  The strategies and techniques presented in this session, along with additional ideas attendees identify, can assist with minimizing cheating in online tests.  These strategies and techniques can easily be adapted at other institutions.
  • Supporting Evidence:  The presenter will provide a list of peer-reviewed research sources that support the effectiveness of the strategies discussed in this interactive session.  Authentic examples of these strategies as implemented by Baylor University faculty will also be highlighted.
  • Scope:  All techniques and strategies discussed are applicable to all learning environments.
Materials
  • A laptop will be used to show examples of online courses that use a variety of test feature settings with Baylor University’s instance of the Canvas learning management system. 
  • A handout will be provided listing tips and best practices for effective test design, along with suggestions for deterring students from cheating on online assessments.
Target Audience
  • Higher Education faculty, instructional designers, instructional technologists, and administrators will benefit from this discovery session.
  • All experience levels may benefit from this session.
Audience Active Engagement
  • Attendees will have a hands-on opportunity to implement and preview common online test settings in Canvas sandbox environments from both faculty and student perspectives.
  • Attendees will also be able to discuss the implications for teaching and learning of specific Proctorio settings for automated proctoring.
  • Participants will be asked to share their experiences, suggestions, and challenges in maintaining academic integrity and addressing cheating on online tests and exams.
Position: 
10
Conference Session: 
Concurrent Session 11
Conference Track: 
Tools and Technologies
Session Type: 
Discovery Session
Intended Audience: 
Administrators
Design Thinkers
Faculty
Instructional Support
Students
Technologists
All Attendees