Assignment 3 – Design Thinking in Action: Reimagining Assessments of Student Learning in Digital Environments
Ashley Breton & Katia Maxwell
MALAT, Royal Roads University
LRNT 524 – Innovation, Design, and Learning Environments
Leeann Waddington & Lisa Gedak
January 2, 2022
“If I had an hour to solve a problem, I’d spend 55 minutes thinking about the problem and 5 minutes thinking about solutions.” – Albert Einstein
Context
The coronavirus disease 2019 (COVID-19) pandemic forced many higher education institutions to pivot face-to-face instruction online. While education is constantly evolving and changing, this shift has created new professional challenges for educators and intensified the need to adapt how they teach and assess to best support student learning in digital environments.
Prior Assumptions
Before beginning the five-phase Design Thinking process (Doorley et al., 2018), our team identified theories from adult education and online learning to provide the framework for our design challenge. We wanted to determine how the instructor intends learners to interact with the content, the instructor, and the other students. Employing Knowles’ (1984) four principles of andragogy, the team assumed:
- Adults need to be involved in the planning and evaluation of their instruction,
- Experience (including mistakes) provides the basis for the learning activities,
- Adults are most interested in learning subjects that have immediate relevance and impact on their job or personal life, and
- Adult learning is problem-centered rather than content-oriented (Kearsley, 2010, as cited in Branco, 2018, p. 7).
Next, we established instructional goals and objectives after considering the main principles of instruction for improving student learning (Bates, 2015a). Drawing again on Knowles’ (1984) andragogy, we agreed that, to increase the effectiveness of online learning, instructors would need to ask students to complete real-world tasks and offer assessments centred on the learning outcomes of an activity (Kearsley, 2010, as cited in Branco, 2018).
The Design Process
Our team infused Design Thinking with the ADDIE model (Bates, 2015a) and the rapid, responsive elements of the Agile approach (Svihla, 2017; Thurston, 2014) throughout this design challenge to improve our problem-solving and decision-making capabilities (Bates, 2015a). We initially saw the design challenge as straightforward: for instructors to evaluate students more appropriately online, instructors’ assessment practices needed to change. However, the opportunity turned out to be much more significant. By the end of the process, we had produced a wholly redesigned online assessment tool for instructors, one that offers more authentic, hands-on learning experiences for students in higher education courses. This dramatic reframing of the opportunity emerged from our team’s human-centred, empathetic approach to the process. We discovered that merely fixing online assessments would not solve the problem; we needed to change the philosophy behind them.
Problem Statement
Our design team (Katia and Ashley) comprises secondary and post-secondary educators who teach television and film production and English as a Second Language (ESL). Working as partners, we followed the experiences of instructors and students to design a solution to this complex challenge, dedicating our efforts to improving online assessment strategies. We used Gabriella, a 25-year-old graduate student, and Ashoka, a 46-year-old college instructor, as extensions of ourselves as we worked through the five phases (Doorley et al., 2018).
Gabriella and Ashoka are anxious because COVID-19 and the constant threats of pandemic-related hardships have made the sudden transition to teaching and learning online challenging. Specifically, Gabriella worries that her skills and abilities were not appropriately evaluated and graded in the spring and fall semesters of 2021, but hopes the winter 2022 term will bring a better learning and assessment experience. Ashoka is looking to make teaching and learning online a more human-centred experience, with assessments that meaningfully connect with her students. From this, we arrived at our problem statement: Instructors, like Ashoka, need an authentic way to support student agency around online assessments because it will help equip students, like Gabriella, with the critical skills they need to be successful, independent, life-long learners. Our team then set out to find a solution that satisfies the needs of both stakeholders.
The Solution
Looking at our problem statement, we employed the authentic assessment model as our design solution. Together we reviewed the ethnographic research and developed insights and design criteria to generate ideas and appraise the overall purpose, process, and use of an authentic assessment (see Figure 1; Weleschuk et al., 2019).
Figure 1. Non-functional prototype sketch

Authentic assessment is based on students’ abilities to perform meaningful tasks that they may confront in the real world, demonstrating their mastery through the production of digital artefacts or participation in project-based learning tasks (DePaul Center of Teaching and Learning, n.d.). In a nutshell, it is a type of assessment that determines students’ learning in a way that goes beyond traditional tests, which tend to assess static, over-simplified elements of activities (Wiggins, 1990). Table 1 summarizes the advantages of authentic assessment over traditional assessment.
Table 1. Authentic assessment versus traditional assessment
| Authentic Assessment | Traditional Assessment |
| --- | --- |
| Requires students to contextualize and apply what they have learned. | Asks students about what they learned out of context and tends to encourage rote memorization (“what do we need to know for the test?”). |
| Forces students to work within the ambiguities and grey areas present in the real world. | Encourages students to think about issues in “right” versus “wrong” terms. |
| Challenges students with the full array of tasks, challenges, and priority-setting required to solve problems in the real world. | Tends to focus on single answers to problems. |
| Looks at students’ abilities to plan, craft, and revise thorough and justifiable arguments, performances, and products. | Rarely provides students opportunities to plan, evaluate, adjust, and revise responses. |
| Often includes ambiguous problems and roles that allow students to practice dealing with the ambiguities of the real world. | Frequently focuses on discrete, static (and often arbitrary) elements of the skills necessary to work on ambiguous challenges. |
Note: Authentic assessment versus traditional assessment [Table] was taken from Non-traditional assessment models (DePaul Center of Teaching and Learning, n.d.) adapted from The case for authentic assessment (Wiggins, 1990). This table summarizes the advantages of authentic assessment over traditional assessment.
The general philosophy of authentic assessment is that if an instructor wants to know how well a student can do something, the best way to assess them is to have them do it (DePaul Center of Teaching and Learning, n.d.). Examples of authentic assessment in an online context include e-portfolios, interviews, role-plays, and simulation activities. Next, our team identified four pillars, or best-practice methods, for creating an authentic assessment:
- STANDARDS – Identify the standards: the knowledge and skills students need in order to be successful in their field after they complete the course,
- AUTHENTIC TASKS – Work with university or college faculty to determine how students might demonstrate their ability to do the task(s) (these become the authentic tasks designed or selected for the course),
- SUCCESS CRITERIA – Identify success criteria to evaluate the task(s), and
- RUBRIC – Evaluate students’ abilities to complete the criteria of the task(s) using a rubric or other scoring guide (DePaul Center of Teaching and Learning, n.d.; Mueller, 2018; Mueller, 2019).
Reflecting upon the d.School Bootcamp Bootleg deck (Doorley et al., 2018), we decided to test one example of authentic assessment rather than developing a complete prototype of authentic assessment. We selected the electronic portfolio (e-portfolio) as our assessment tool, which consists of “a purposeful collection of student work that exhibits the students’ efforts, progress and achievements in one or more areas” (Meyer et al., 1991, p. 60). Through the use of e-portfolios, students would be given an authentic task or desired learning outcome, collaborate with their instructor to decide on the elements to be assessed, what the assignment would look like, and the rubric, and then participate in the grading of the final product (see Figure 2).
Figure 2. E-portfolio model as an authentic assessment tool in higher education

Afterwards, our team set forth to develop an e-portfolio assignment by taking it through the four pillars of authentic assessment creation (DePaul Center of Teaching and Learning, n.d.; Mueller, 2018).
1) Ashoka and Gabriella looked at the standards or course outcomes and co-created educational goals together.
2) Together they agreed to create an authentic task in the form of an e-portfolio titled “My Portfolio: A Memory Book” that Gabriella could take with her after graduation. Gabriella has direct input into the creation of her assessment model and how she chooses to demonstrate deeper learning, while Ashoka offers feedback and loops in peer feedback as well, ensuring the portfolio remains applicable in life after the course. Ashoka also has the opportunity to effectively integrate technology: Gabriella can choose from a number of digital portfolio-building tools, such as Google Sites, Behance, Fabrik, Evernote, Squarespace, Wix, Weebly, and Showcase, which broadens her digital literacy skills, provides opportunities for complex problem-solving, and creates greater chances for collaboration. Finally, Ashoka and Gabriella would go through the portfolio together to ensure Gabriella herself completed it properly and then grade it appropriately.
3) After identifying success criteria for the e-portfolio, Ashoka employs Socratic questioning as a tool to determine the level of performance for each criterion. This provides Gabriella with another opportunity to demonstrate her understanding of course content through open dialogue between student and instructor as they review completed components of the e-portfolio together. For example, Ashoka could ask: How did you decide which elements to include in your portfolio? Why is it important to look at what you included? What are you assuming the person reviewing your portfolio is looking for? How does this portfolio reflect who you are in the field? Can you explain all elements of your portfolio and how they support you in the field? Gabriella would answer and be graded appropriately.
4) Finally, the combination of the criteria and the levels of performance for each criterion becomes the rubric for the collection of artefacts (e.g., digital images, videos, audio, and documents) and problem-based learning tasks included in the e-portfolio. The collection of items acts as a way to measure whether the learning has been transformative.
After the testing phase, our team concluded that using an e-portfolio model to authentically assess students in an online environment offered both of our stakeholders (Gabriella and Ashoka) a collaborative opportunity to create something with purpose that not only builds critical skills for the future, but also promotes deeper relationships and mutual respect between teacher and learner (Morris, 2018). Together, the student and instructor negotiated the design of the assessment as they decided what would go into the e-portfolio, the elements to be evaluated, and the rubrics, and then collaboratively participated in grading the elements showcased (Weleschuk et al., 2019). However, for e-portfolios to be truly valuable assessment tools in digital learning environments in higher education, the long-term commitment involved in documenting a student’s work over time needs to be emphasized above the final product and grade (Mueller, 2019). This kind of authentic assessment empowers students to reflect on their best work and demonstrate learning derived from experiences that connect to various aspects of their life (e.g., personal, school, work, and community) (Mueller, 2018). The items selected for the portfolio act as evidence of a student’s growth, allowing them to become primary players in their own learning.
In conclusion, as a team, we believe that e-portfolios, as a tool of authentic assessment, offer the best way to support student agency around assessments and equip students with the critical skills they need to be successful, independent, life-long learners. Through the five-phase Design Thinking process, we found evidence to suggest that e-portfolios are valuable tools of assessment because they allow students and educators opportunities to “participate fully and meaningfully in [the] technological activities” that make up so many aspects of our lives (Morris, 2018, para. 42). Teaching “in a manner that respects and cares for the souls of students” is “essential if we are to provide the necessary conditions where learning can most deeply and intimately begin” (hooks, 1994, as cited in Specia & Osman, 2015, p. 195). Authentic assessments, like e-portfolios, offer our instructor, Ashoka, a space to empathize with her learners, understand where they are coming from, and witness how their experiences impact their learning (Specia & Osman, 2015). Using this tool of assessment, our student, Gabriella, can demonstrate how she would prefer to apply her knowledge, skills, and abilities to real-world situations, instead of being assessed on what she can recall out of context.
Both members of our design team can apply e-portfolios as a tool of authentic assessment in our unique contexts. Using the four pillars of authentic assessment to navigate the evaluation of assignments would provide a safe, flexible, responsive, collaborative, and empathetic space for learning “owned by the learner, structured by the learner, and told in the learner’s own voice” (Hartnell-Young & Morris, 2007, p. 39).
References
Branco, M. (2018). The adult learning theory–Andragogy. Psycho-Educational and Social Intervention (PESI). http://www.psiwell.eu/images/io3/PESI-manual-for-trainers.pdf
DePaul Center of Teaching and Learning. (n.d.). Non-traditional assessment models. DePaul University. https://offices.depaul.edu/center-teaching-learning/assessment/assessing-learning/Pages/non-traditional-assessment-models.aspx
Hartnell-Young, E., & Morris, M. (2007). Digital portfolios: Powerful tools for promoting professional growth and reflection (2nd ed.). SAGE.
Meyer, C., Paulson, L., & Paulson, P. (1991). What makes a portfolio a portfolio? Eight thoughtful guidelines will help educators encourage self-directed learning. Educational Leadership, 48(5), 60–63. https://files.ascd.org/staticfiles/ascd/pdf/journals/ed_lead/el_199102_paulson.pdf
Morris, S. (2018). Critical digital pedagogy and design. Sean Michael Morris. https://www.seanmichaelmorris.com/critical-digital-pedagogy-and-design/
Mueller, J. (2018). Authentic assessment toolbox: Enhancing student learning through online faculty development. Journal of Online Learning and Teaching. https://jolt.merlot.org/vol1_no1_mueller.htm
Mueller, R. (2019). ePortfolio: Best practices for use in higher education. Taylor Institute for Teaching and Learning. https://elearn.ucalgary.ca/wp-content/uploads/2019/07/e-portfolio-support-document_best-practices-final.pdf
Specia, A., & Osman, A. (2015). Education as a practice of freedom: Reflections on bell hooks. Journal of Education and Practice, 6(17). https://files.eric.ed.gov/fulltext/EJ1079754.pdf
Svihla, V. (2017). Design thinking and Agile design. In R. West (Ed.), Foundations of learning and instructional design technology (1st ed.). https://edtechbooks.org/lidtfoundations
Thurston, T. (2014, March 5). Don’t pick sides, create an ADDIE-Agile mashup [Blog]. eLearning Industry. https://elearningindustry.com/dont-pick-sides-create-an-addie-agile-mashup
Weleschuk, A., Dyjur, P., & Kelly, P. (2019). Online assessment in higher education. In Taylor Institute for Teaching and Learning guide series. Taylor Institute for Teaching and Learning at the University of Calgary. https://taylorinstitute.ucalgary.ca/resources/guides
Wiggins, G. (1990). The case for authentic assessment. ERIC Digest. https://files.eric.ed.gov/fulltext/ED328611.pdf