Design Thinking and Student Engagement

Blog post by Todd Pezer and Dugg Steary

Design thinking is a human-centred, iterative process used to build empathy for others’ perspectives on a complex problem.  The process is deliberately employed before the search for solutions, which improves the odds that the eventual solution meets real needs.  Design thinking has been applied successfully by organizations to improve products, services, processes, and education.  Our team used the learning tools and design thinking process from the Stanford University d.school (Stanford University Institute of Design, 2016) to investigate common challenges within our respective organizations and to develop a prototype solution that would be useful and meaningful for each team member.

Point-of-view

Our team consisted of a pilot educator from within the airline industry and a paramedic educator from a community college.  Despite varied competency-based objectives and standards within each organization, both educators identified common challenges during the early stages of the design thinking process.

Two challenges resonated with the educators in both organizations.  Specifically, is the learning material being delivered to students sufficiently relevant and enticing to maintain their interest, and to what level are students engaged with the artefacts or education resources?  After addressing these key questions, our team hopes to have a better understanding of students’ level of engagement, and will then repeat the design thinking process to develop solutions for improvement.

Prototype Solution

For each delivery model within the respective organizations (online, blended, and in-class), our team hypothesizes that a disconnect exists between students’ actual engagement and the educator’s perception of that engagement.

To identify the level of student engagement, the team will develop a multiple-item Likert scale assessment, used both formatively and summatively (Gliem & Gliem, 2003), and incorporate it within, and at the conclusion of, each learning module.  The assessment will be embedded in the learning materials, for example in a Captivate, Blackboard, Moodle, or other Learning Management System module, and students must complete it before proceeding through larger modules and at the end of each module.  Educator perspectives on student engagement within the module will be measured using a similar assessment tool.
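
To make the scoring concrete, here is a minimal sketch in Python of how such a multiple-item Likert assessment might be scored, including the Cronbach’s alpha reliability check that Gliem and Gliem (2003) recommend for multi-item scales. The item count, the five-point scale, and the sample responses are illustrative assumptions, not our prototype’s final design.

```python
# A minimal, illustrative sketch (not our production tool): scoring a
# five-item, five-point Likert engagement check embedded in a module,
# with the Cronbach's alpha reliability test recommended by
# Gliem & Gliem (2003). All responses below are made up.
import numpy as np

# Each row is one student's responses (1 = strongly disagree ... 5 = strongly agree).
responses = np.array([
    [4, 5, 4, 3, 4],
    [2, 3, 2, 2, 3],
    [5, 5, 4, 5, 4],
    [3, 3, 3, 2, 3],
    [4, 4, 5, 4, 4],
])

def cronbach_alpha(items: np.ndarray) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = items.shape[1]                          # number of items in the scale
    item_variances = items.var(axis=0, ddof=1)  # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of summed scores
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

per_student = responses.mean(axis=1)  # one engagement score per student
print(f"Module engagement (mean of {len(per_student)} students): {per_student.mean():.2f} / 5")
print(f"Cronbach's alpha: {cronbach_alpha(responses):.2f}")
```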

A combination report, consisting of a negotiated score determined from both the educator and student assessments, will be generated for each learning module.  It is anticipated that these reports will provide valuable insight into students’ level of engagement in comparison with the educator’s perception of it.  Additionally, it is proposed that engagement trends will be gleaned from the in-situ assessments; specifically, we anticipate patterns will emerge showing which types and qualities of material produced increased student engagement during the learning module.
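
As a rough illustration of how the combination report might be computed, the following sketch averages hypothetical student ratings per module, compares them with the educator’s own rating, and flags large gaps. The module names, ratings, and the review threshold are invented for demonstration; the actual negotiation of the combined score is still to be designed.

```python
# A hypothetical sketch of the combination report: per-module student
# averages compared with the educator's own engagement estimate.
# Module names, ratings, and the 0.75-point "review" threshold are invented.
from statistics import mean

student_ratings = {                 # embedded-assessment scores, one list per module
    "Module 1": [4.2, 3.8, 4.0, 3.5],
    "Module 2": [2.6, 3.0, 2.4, 2.8],
}
educator_ratings = {"Module 1": 4.0, "Module 2": 4.1}

for module, scores in student_ratings.items():
    student_avg = mean(scores)
    gap = educator_ratings[module] - student_avg
    status = "review" if abs(gap) > 0.75 else "aligned"
    print(f"{module}: students {student_avg:.2f}, educator "
          f"{educator_ratings[module]:.2f}, gap {gap:+.2f} -> {status}")
```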

Possible Challenges

Our team identified that measuring the level of engagement is only the first step in a larger initiative.  Once the initial findings are assessed, we plan to expand the prototype to measure individual students’ engagement and compare it with other education metrics, including attendance, test scores, and competency attainment.  Additional one-on-one interviews about students’ engagement within learning modules are anticipated to be beneficial but are beyond the scope of this prototype.
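
As a rough illustration of that planned comparison, the sketch below correlates per-student engagement scores with test results. Both the data and the choice of a simple Pearson correlation are assumptions for demonstration, not a committed part of the prototype.

```python
# An illustrative comparison of engagement scores with one other metric
# (test scores). The numbers below are fabricated for demonstration.
import numpy as np

engagement = np.array([4.1, 2.8, 3.5, 4.6, 3.0])  # mean Likert score per student
test_score = np.array([82, 61, 70, 91, 66])       # matching test results (%)

r = np.corrcoef(engagement, test_score)[0, 1]     # Pearson correlation coefficient
print(f"Engagement vs. test scores: r = {r:.2f}")
```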

Our team also identified that timely collection and distribution of the combination report will allow educators to self-reflect and adapt subsequent learning modules to improve student engagement.  Training in strategies for improving student engagement would benefit educators in each organization.  In addition, comprehensive training on the use of the formative and summative assessments would need to be provided for each educator and student to ensure compliance and accurate completion.

Thank you for reviewing our design thinking process summary.  We look forward to your feedback and insight.


References

Gliem, J. A., & Gliem, R. R. (2003). Calculating, interpreting, and reporting Cronbach’s alpha reliability coefficient for Likert-type scales. In Midwest Research-to-Practice Conference in Adult, Continuing, and Community Education (pp. 1–88). Columbus, OH. Retrieved from https://scholarworks.iupui.edu/

Mattelmäki, T., Vaajakallio, K., & Koskinen, I. (2014). What happened to empathic design? Design Issues, 30(1), 67–77. https://doi.org/10.1162/DESI_a_00249

Stanford University Institute of Design. (2016). A virtual crash course in design thinking. Retrieved from http://dschool.stanford.edu/dgift/

Stanford University Institute of Design. (2016). The Virtual Crash Course Playbook. Retrieved from http://dschool.stanford.edu/dgift/

Tran, N. (2016). Design thinking playbook. Retrieved from https://dschool.stanford.edu/resources/design-thinking-playbook-from-design-tech-high-school


Image by Karolina Grabowska is licensed under CC BY 4.0 (CC0 license)

8 Replies to “Design Thinking and Student Engagement”

  1. Hi Dugg and Todd,

    Engagement – oh my – you have my interest! I appreciate the topic you have identified, as it is a frequent one among educators. Are we engaging our students? It is an interesting point. I am intrigued by your question “to what level are students engaged with the artefacts or education resources?” When I read your design process, I wonder about two clarifying questions.

    1) How will you determine whether the Likert assessment provides accurate data when it is mandatory to fill out, effectively forcing engagement at that moment? You have recognized that one-to-one interviews would be useful; is there consideration to including some qualitative statements to clarify a position taken on the Likert assessment? Do you have examples in mind of what the formative and summative assessments might resemble?

    2) You mentioned the hope of trends developing in the data. A pattern would be difficult to establish without asking the same questions. Will the same formative/summative assessment (or types of questions) be asked repeatedly to collect that data, or will the questions and ratings differ as the instructors move through the curriculum?

    Thank you for sharing. Engagement is a big idea that can quickly lead anyone astray down a rabbit hole. To this end, as you discover further steps in the design process, do share whether you use this design again! You may be on to something…

    Bobbi

    1. Bobbi – I couldn’t resist. Never underestimate the value of a good rabbit hole! I’ve used design thinking, successfully, to restructure our faculty’s B Ed program; to lead user feedback sessions with East and West African colleagues critiquing the usability of Stanford’s foldscope (https://news.stanford.edu/2016/06/16/foldscope-microscopy-everyone/); and with school districts as an alternative to strategic planning. I know you know this, but I just couldn’t resist commenting! Not sure I’d have a career without the odd rabbit hole …

  2. Hi Todd and Dugg,

    I think it’s a great point you make that, while your disciplines may be different, similar challenges emerge in both settings. We can all learn from each other no matter the background or situation. Your prototype intrigues me, as I have a pre/post survey model in a project I’m involved with as well: students self-assess their technological competency at the beginning of the course and again at the end. I’d like to share some of my challenges. While the data is useful from a researcher’s point of view, it does not affect the students in question, and intervention is typically too late. In your model, I understand students go through various modules, and I suppose tracking can be done if students are taking a number of modules to achieve an end goal (i.e., a credential). I can say that at my organization, tracking adult continuing education students is quite difficult, as many have different goals in mind and do not have a set number of courses that they must take. Another challenge is the timing of the assessments if your modules are short. My courses are 9 weeks long, and if the assessments are not placed with enough time in between, you will not capture any meaningful data.

    Overall I think it’s a prudent plan (one that I’m currently embarking on). I’m curious how the specifics would work in your model (though I understand how restricted we are in 500 words).

    Cheers,
    George

  3. Thanks for this thoughtful overview of your experience with design thinking. You have captured some significant points – the degree to which dialogue and engagement are promoted by the process and the degree to which various prototypes / ideas are generated. I hope you found the process to be lively, noisy, and free flowing.

    You have captured the most important aspect of design thinking – it slows the rush to a solution and has the potential to support thoughtful design finding. This step sometimes frustrates folks who think they NEED to get to a solution ASAP … trust the process … HA!

  4. Hi Todd and Dugg,

    I too am very interested in the degree to which students actually engage with courses, materials, and teaching approaches. In my career, I have tried many different types of surveys to gather data on what students have found most inspiring and which types of lessons have caught their attention the most, but frequently students seem to approach these surveys as just something to fill out. Often the spots where they are asked to elaborate are left blank.

    Might there be a way to gather data during the course in order to add to your results? Perhaps through short questionnaires or elements built into the course to allow students to give feedback in real time?

    Are there other ways to measure engagement? I like how you are making links between attendance and results on tests/activities, as I would anticipate direct links can be made between these and engagement. Have you also considered evaluating different groups or classes with like or different methods at the same time? Or mixing students up between classes to ascertain what effect the complement of students has on engagement? Just an idea.

    I also really enjoyed your idea regarding how this information can be used in professional development. So rarely do educators get a chance to learn from what works for other educators. So often teachers don’t understand what really makes their students tick.

    Stu

  5. Hi Todd and Dugg,
    Thank you for your work on trying to figure out how to gauge learning-engagement in a meaningful way. My first thought was related to the courses we’re taking now. For example, how will our instructors know that we are engaging with the materials and course content beyond our postings, activities and assignments? How will they know which readings we appreciated vs which readings we quoted from because we needed a quote and we’re good at finding ones that fit (I may be taking an intellectual risk here ;-)).

    The idea of asking for input from the learners on each module instead of at the mid- and end-points of a course would allow instructors “to learn and to proactively personalize learning through an intentional process!” (Crichton & Carter, 2017, p. 35).

    In the online courses I teach, I ask learners to post their aha-moments and biggest take-aways from each module. Mine is an informal approach with no ability to track what works consistently except by copying answers into a spreadsheet and looking for commonalities.

    Would your “multiple-item Likert formative and summative scale assessment” be a universal tool, or would it apply only to the courses you currently teach?

  6. Hi Todd and Dugg,

    Interesting topic.

    It makes me ponder: what exactly is engagement, how do you measure it, and how do you know that you’re really measuring engagement?

    How important is engagement? I could design a learning activity that is incredibly engaging that has little educational value. Does improved engagement lead to improved learning outcomes? If so, is there a reason not to just measure learning outcomes?

  7. Hello Dugg and Todd.

    First off, I enjoyed your succinct introduction of the tool and why and how you used the design thinking process in this activity. As I read through your post, I felt supported in understanding your process.

    Your speculation that educators may have different perspectives around student engagement than do students was one that I had not considered that deeply before. I thought your idea to include an engagement assessment throughout and at the end of each module was outstanding, especially when coupled with an on-going review of the educator’s perspectives. I wonder if you have determined the options you’d choose to include in the Likert scale? Teasing out engagement seems challenging to me, so I wonder if the scale options would correlate to satisfaction, value or something entirely different?

    In your possible challenges you note that comprehensive training on the use of the assessments would be required to ensure compliance and accuracy of completion. As participation in the assessments will be mandatory, do you anticipate that your design would provide this training at the beginning of a learning intervention? Should a student provide wholly positive feedback, or feedback that was deemed inconsistent, would it be important for the educator to follow up and provide additional context, with the goal of helping the assessments become more meaningful for the students?

    Thank you for sharing your practical prototype solution and for encouraging me to consider the disconnect between student and educator perspectives of student engagement.
