Team four’s Initial Summary of a Learning Event and Approach to Critical Inquiry

The current global pandemic and the resultant restrictions on gatherings have challenged educational institutions to rapidly transition from in-person to remotely delivered courses. Among the challenges this transition presents is how to preserve academic integrity in a remote, uncontrolled setting, particularly where assessments are concerned. Respondus (2020) offers solutions for remote assessment proctoring. For our shared learning experience, our team selected video tutorials created by Respondus (2020) about two of their products: Respondus Monitor and LockDown Browser. After viewing all of the available videos and conducting further research, we have gained an understanding of how this software works and some of the rationale driving institutions to adopt it. Each team member agreed that the products could be a user-friendly and straightforward proctoring solution for institutions and students alike.

Respondus Monitor and LockDown Browser provide “cost effective, scalable, and convenient solutions for protecting the integrity of online exams” (Respondus, 2016, 3:10). Essentially, LockDown Browser works by preventing learners from accessing unauthorized content or resources during their exam, while the companion Monitor feature authenticates a user’s identity via facial recognition and then monitors their behaviours during the exam through a webcam (Respondus, 2020). Teclehaimanot, Hochberg, Franz, Xiao and You (n.d.) noted that student identification and authentication are vital if educators are to prevent academic dishonesty. Both LockDown Browser and Monitor can be used within many popular Learning Management Systems (LMSs), such as Brightspace and Blackboard, which makes these tools accessible solutions that integrate easily into existing LMSs. The Respondus company offers easy-to-understand arguments and pitches for how and why to use its software.

Not all online assessments require protective software measures such as those provided by Respondus. Some even feel that businesses in this industry “are selling a narrative that students can’t be trusted” (Harwell, 2020, para. 9). However, summative assessments that demand high academic standards and integrity are arguably definitive candidates for such programs. These summative assessments must take verification of student identity and technical issues, such as student hardware, software, and bandwidth, into consideration (Benson & Brack, 2010). The tests given must be fair, meaning the test environment and its associated restrictions must apply equally to all students taking the exam. This can be a challenge when students are not co-located in the same classroom. The Respondus Monitor tutorial particularly sought to address these issues of students taking an examination from different locations. The tutorial did an exceptional job of visually and cogently describing how the monitoring software addresses the potential issues of students taking an exam at a distance, and, lending the software credibility, the company was forthright in acknowledging that students may require greater bandwidth and an adequate internet connection in order for the software to be trustworthy (Respondus, 2016).

The clear and concise arguments for how and why a learning provider should use this software, combined with the user-friendly online tutorial environment, make this software a real contender for a wide range of potential learning environments.

References

Benson, R., & Brack, C. (2010). Online assessment. In Online learning assessment in higher education: A planning guide (pp. 107-151). Whitney, UK: Chandos Publishing Oxford. Retrieved from https://ebookcentral-proquest-com.ezproxy.royalroads.ca/lib/royalroads-ebooks/reader.action?docID=1582338&ppg=128

Harwell, D. (2020, April 1). Mass school closures in the wake of coronavirus are driving a new wave of student surveillance. The Washington Post. Retrieved from https://www.washingtonpost.com

Respondus. (2016). Respondus monitor: Protecting the integrity of online exams [Video file]. Retrieved from https://www.youtube.com/watch?time_continue=197&v=hv2L8Q2NpO4&feature=emb_logo

Respondus. (2020, April 16). Retrieved from https://www.respondus.com/products/monitor/

Teclehaimanot, Hochberg, Franz, Xiao, & You. (n.d.). Ensuring academic integrity in online courses: A case analysis in three testing environments. Retrieved from https://members.aect.org/pdf/Proceedings/proceedings17/2017/17_12.pdf


3 thoughts on “Team four’s Initial Summary of a Learning Event and Approach to Critical Inquiry”

  1. Thank you for sharing your summary. I think your topic is fascinating! I asked myself this question when COVID-19 started, and universities/colleges/schools transitioned to online learning: How can we make sure our students won’t cheat?

    I read several perspectives from educators worldwide; some encouraged using such software, while others think that effective online learning applies assessment instruments where traditional cheating doesn’t benefit student grades. Personally, I agree with the latter approach; however, I would love to learn from you. I am looking forward to reading your findings!

    Here is a Twitter thread I liked that discussed this topic https://twitter.com/johnhawks/status/1237582836753764352

    Good luck!
    Tala

  2. Thanks for sharing your group’s experience engaging with the Respondus video tutorials, Lisa!

    This is a very hot topic at my institution right now; the exam period crept up and added a layer of complexity to the design and delivery of remote instruction for faculty muddling through supporting students in meeting the learning outcomes.
    Many faculty have expressed concerns about maintaining academic security and integrity in this new reality, and our Teaching and Learning Commons has taken an empathic approach to this situation. Students (and faculty) are dealing with tremendous stress, and we are asking faculty to approach assessments with flexibility and compassion. As you have stated, some programs “require high academic standards and integrity”; even with the required rigour and elevated standards, I feel that there are strategies that can support academic integrity without using external proctoring software or adding additional pressures:
    • Identifying the learning outcomes still needing to be accomplished (is a final assessment necessary? Or have the outcomes been met?)
    • Consider providing options for completion: can the final assessment be an opportunity for deeper learning by providing agency in the demonstration of outcomes?
    • Ensure your assessment is current
    • Add a question bank and randomize it so the assessments are unique for each student
    • Set time parameters
    • Assume students will use external resources and, as a result, ask higher-level thinking questions
    • Ask students to sign an academic integrity contract
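    Most LMSs implement randomized question banks natively, but the idea behind that bullet can be sketched in a few lines of Python (a hypothetical illustration; the question bank, student IDs, and `build_exam` helper are all invented for the example): seed a random generator with the student's ID so each student receives a different, yet reproducible, draw from the bank.

```python
import random

def build_exam(question_bank, student_id, num_questions=5):
    """Draw a per-student subset of questions, seeded by student ID
    so the same student always regenerates the identical exam."""
    rng = random.Random(student_id)     # deterministic per-student seed
    return rng.sample(question_bank, num_questions)

bank = [f"Question {i}" for i in range(1, 21)]   # a 20-question bank
exam = build_exam(bank, student_id="student-001")
print(exam)  # 5 unique questions drawn for this student
```

    Because the draw is seeded, an instructor can regenerate any student's exam for grading or review, while different students are very unlikely to see the same question set.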
    In addition to implementing the above assessment design recommendations, there are many considerations when using external invigilators or proctoring tools:

    • What are the privacy, security, and data policies and procedures behind the tool you plan to use?
    • What equipment do your students have to access the software? Not all students have a webcam or microphone and could be excluded if they cannot confirm identity. What other hardware is needed? Do they need admin rights on a computer to download software? (some students borrow devices)
    • What connectivity issues may arise? Is there on-the-spot support?
    • How do you plan for accommodations? Do students need assistive tech? Transcription?
    • How will you communicate with the students if they have questions or need support?
    • Can international students access it?

    Some food for thought as you move forward in your research; I look forward to seeing what you discover😊

    Cheers,
    Lisa

  3. Hi Lisa,
    This is such an interesting and timely topic!
    As an academic librarian, I see the cases of plagiarism on campus (students are required to attend a mandatory 1-hour workshop with a librarian if they are accused of plagiarism). On our campus, cases of plagiarism have increased by over 200% since the beginning of the pandemic. This poses so many questions, from the effects of stress and other mental health issues on students’ plagiarism to how prepared faculty are to teach online and construct assessments based on this new reality. It is so interesting that the technology you are discussing assumes that assessment is largely based on closed-book exams and remembering information, or, in the words of Paulo Freire, the “banking model of education”. My critical inquiry looks into cultivating a sense of agency in students from the critical pedagogy perspective, and some authors suggest that they do not grade their students at all (see Jesse Stommel’s blog post here: https://www.jessestommel.com/ungrading-an-faq/). While most institutions probably fall between these two extremes, I think the need for these types of tools questions some underlying dynamics in the way we assess our students.
    I am really looking forward to seeing what you find in your critical inquiry and to your presentation!
    Marta
