Marni's Global Digital Learning Journey

Learning Innovation Critique of Digital Learning Assessments

Image from Canva

While I had previously been exploring learning analytics, digital learning assessments emerged as another impactful learning innovation within that research. Because digital learning assessments use technology to evaluate students’ performance, I am fascinated by the immediate feedback and gratification they provide whenever I use them. This critique therefore focuses on formative digital learning assessments and their relationship with technology.

To begin, digital learning assessments rely on technology to evaluate students’ knowledge and skills. This innovation leverages digital platforms and tools for administering, scoring, and providing feedback on assessments (Luppicini & Haghi, 2013). In my case, digital learning assessments are implemented primarily through my learning management system (LMS). They can be built into courses, allowing automated responses throughout students’ studies on an ongoing formative basis. As Conrad and Openo (2018) described, these formative assessments are “usually not graded” (p. 17). Morris et al. (2021) supported this formative approach in their systematic review of feedback for learning in higher education. Furthermore, Petrović et al. (2017) found that when students use this feedback, its impact on achieving the desired learning outcomes is significant.

In terms of digital tools for developing digital learning assessments, many educational technology vendors aim to provide students with engaging learning experiences. Çekiç and Bakla (2021) reviewed various vendors’ automated digital learning assessment tools. These tools offered opportunities such as incorporating gamification and creating interactive presentations that assess students through polls, surveys, and quizzes. Accessibility matters here, since students need devices such as mobile phones or tablets to enter assessment responses and to view results displayed in real time. Furthermore, Febriani and Abdullah (2018) demonstrated in their study that automated formative digital learning assessments were more flexible and timely than conventional methods. Beyond the benefits for students, “saving more time and reducing effort spent by instructors in the assessment process” was also achieved (Febriani & Abdullah, 2018, p. 37).

Although digital learning assessments offer many benefits, potential usability challenges can undermine their effectiveness and the user experience. It is therefore essential to consider the risks of incorporating digital learning assessments. Kasani et al. (2020) concluded that digital learning assessments within LMSs encountered connectivity and bandwidth limitations, and these technical difficulties during formative assessment practices could impact the quality of learning. Another risk lies in ensuring that students and educators become familiar with new software and platforms. This learning curve can be challenging, especially for technology-averse users (Debuse & Lawley, 2016). Alruwais et al. (2018) reiterated this finding by emphasizing the need for professional development so that teachers can gain confidence in using digital learning assessments with students. When covering potential risks, I would be remiss not to mention security and privacy again, as I did when exploring learning analytics. To further this discussion, Timmis et al. (2015) recommended dialogue between researchers and educators to promote action on assessments and learning analytics: “A series of wide-ranging policy-focused debates should be initiated to involve all stakeholders and directly address the challenge of rethinking assessment purposes and the role of digital technologies in contributing to such changes” (Timmis et al., 2015, p. 18).

Given that recommendation from Timmis et al. (2015), I am hard-pressed to determine the value proposition of incorporating digital learning assessments into course design. Beyond the security and privacy concerns, there are other points to consider: during course development, digital learning assessments should be designed to reduce bias and ensure fairness for all students. To that end, Dron described authentic assessment practices in which “every activity contributes to both individual learning and the learning of others, gives learners control, and is personally relevant and uniquely challenging for every learner” (Dron, 2018, as cited in Conrad & Openo, 2018, p. 177). This guidance reinforces that I still have much to learn about formative assessments, particularly their relationship with technology.

This critique highlights the importance of aligning factors such as accessibility and appropriate technology knowledge to use digital learning assessments effectively. I am immensely interested in this topic because of the potential to engage students with real-time feedback that positively impacts their learning experiences. Stay tuned as I anticipate exploring this learning and technology topic further in the future.

References

Alruwais, N., Willis, G., & Wald, M. (2018). Advantages and challenges of using e-assessment. International Journal of Information and Education Technology, 8(1), 34-37. https://doi.org/10.18178/ijiet.2018.8.1.1008

Çekiç, A., & Bakla, A. (2021). A review of digital formative assessment tools: Features and future directions. International Online Journal of Education and Teaching (IOJET), 8(3), 1459-1485. https://www.researchgate.net/publication/352994478_A_review_of_digital_formative_assessment_tools_Features_and_future_directions

Conrad, D., & Openo, J. (2018). Assessment strategies for online learning: Engagement and authenticity. AU Press.

Debuse, J. C. W., & Lawley, M. (2016). Benefits and drawbacks of computer-based assessment feedback systems: Student and educator perspectives. British Journal of Educational Technology, 47(2), 294–301. https://doi.org/10.1111/bjet.12232

Febriani, I., & Abdullah, M. I. (2018). A systematic review of formative assessment tools in the blended learning environment. International Journal of Engineering & Technology, 7, 33-39.

Kasani, H. A., Mourkani, G. S., Seraji, F., & Abedi, H. (2020). Identifying the weaknesses of formative assessment in the e-learning management system. Journal of Medical Education, 19(2). http://dx.doi.org/10.5812/jme.108533

Luppicini, R., & Haghi, A. K. (2013). Education for a digital world: Present realities and future possibilities. Apple Academic Press.

Morris, R., Perry, T., & Wardle, L. (2021). Formative assessment and feedback for learning in higher education: A systematic review. Review of Education, 1–28. https://doi.org/10.1002/rev3.3292

Petrović, J., Pale, P., & Jeren, B. (2017). Online formative assessments in a digital signal processing course: Effects of feedback type and content difficulty on students learning achievements. Education and Information Technologies. https://doi.org/10.1007/s10639-016-9571-0

Timmis, S., Broadfoot, P., Sutherland, R., & Oldfield, A. (2015). Rethinking assessment in the digital age: Opportunities, challenges and risks. British Educational Research Journal, 42(3), 1–23.

2 Comments

  1. Hi, Marni, thank you for sharing this thoughtful post!

    We agree that digital learning assessments can be great tools to provide frequent and relevant formative feedback when designed well and from the perspective of “assessment for learning” (Wiliam, 2006).

    That said, we appreciated your critique and consideration of some of the potential pitfalls of this approach, and we also agree with the importance of aligning factors “such as accessibility and appropriate technology knowledge to effectively utilize digital learning assessments”. We wondered if you had also considered potential cultural biases and subjective interpretation? While these tools can provide canned feedback, they are limited in personalizing the experience for diverse learners.

    Furthermore, although you discussed some of the technological barriers, there are some concerns about cheating using digital learning assessments (for example, if a student can code, they could potentially alter an H5P or other digital learning activity) – how might you ensure and maintain reliability and validity in the assessments you provide via digital learning assessment tools?

    There is lots of potential for formative assessment and supporting students with this technology, and we look forward to seeing what else you discover!

    Lisa and Leeann

    Reference

    Wiliam, D. (2006). Assessment for learning: Why, what and how. Orbit: OISE/UT’s Magazine for Schools, 36, 2–6.

    • Marni

      Thanks, Leeann and Lisa, for sharing your insights. You gave me a lot to ponder this past week.

      Upon reflecting on cultural biases and subjective interpretations, I brought out a book I recently purchased called Amplify Learner Voice through Culturally Responsive and Sustaining Assessment. The content was impactful for me. For example, the act of questioning is deeply analyzed, and I will now approach my questions with learners more thoughtfully and meaningfully before asking them (Bloomberg et al., 2022). Our class discussion about empathetic questioning for the Design Thinking project reminds me of the same type of deep questioning. It could be a fascinating project to explore building these questions into digital formative assessments.

      Regarding the idea of students cheating through code, I have made this a team-building exercise for my teammates to uncover potential risks and provide solutions. So far, conducting a hackathon with our software developer students has been suggested. I am interested in seeing where we go from here with other recommendations.

      Before closing, I want to share a new article from Hodges and Kirschner (2024) with everyone that highlights the potential effects of artificial intelligence on instructional design and assessments. Happy reading!

      References

      Bloomberg, P. J., Vandas, K., Twyman, I., Dukes, V., Fairchild, R. C., Hamilton, C., & Wells, I. (2022). Amplify learner voice through culturally responsive and sustaining assessment. Mimi and Todd Press.

      Hodges, C. B., & Kirschner, P. A. (2024). Innovation of instructional design and assessment in the age of generative artificial intelligence. TechTrends, 68, 195-199. https://doi.org/10.1007/s11528-023-00926-x
