This post was prepared in collaboration with Wendy Grymaloski
Introduction & Context
Kate and Wendy are both high school teachers in British Columbia (BC) who aim to provide a high-quality education that ensures the development of educated citizens (Province of British Columbia, 2020). Although we teach different grades and subjects in different school settings—one public, one private—many of our core duties, responsibilities, and beliefs are the same. We provide a learner-centred, flexible experience and create situations that foster all students’ success (Province of British Columbia, 2020). Until spring 2020, we were teachers in a traditional face-to-face school environment. However, when COVID-19 struck, we had to pivot quickly to teaching in an online classroom. Although we have been fortunate to remain mostly face-to-face this school year, we continue to present and manage our courses digitally for several reasons:
- Schools could be shut down again at any point, and teachers need to shift smoothly and quickly to remote learning.
- Students may have extended or increased absences due to sickness or personal choice to stay home.
- Establishing predictable routines and procedures helps set students up for success; therefore, gaining experience in the online learning environment now, should either of the above situations take place, is essential.
Shortly before the pandemic, all schools in BC implemented a redesigned and enhanced K-12 curriculum—a framework for doing, understanding, and knowing in a social constructivist context. In March 2020, when we were forced online, teachers were still in the learning phase of curriculum implementation, particularly regarding assessing students in this new context. Although we now work both face-to-face and digitally, we still believe that teaching strategies and instructional design should centre around a social constructivist framework. Assessment in the digital learning environment needs to reflect that.
The prototype presented in this blog post is one assessment tool that may be used for student self-assessment of engagement in a digital learning environment. The tool was developed not only to tackle assessment within the new BC curriculum but also to radically reimagine the design of digital learning environments and realize the potential of critical instructional design in practice. Picciano (2017) argues that self-reflection is a great way to encourage discussion rather than straight lecturing, whether in a face-to-face or online environment. In addition, because learning in the new curriculum is constructed with peers and the teacher, assessment and feedback should be just as flexible online as they are face-to-face (Butt, 2010, Chapter 2).
While the situation we described above presents many challenges, we identified student engagement as our top priority. Our joint problem statement became: Maintaining student engagement in the digital learning environment is a challenge. We asked ourselves three main questions:
- How can we help students become engaged learners in a digital learning environment?
- How can engagement be assessed?
- How can self-assessment help increase students’ engagement?
As our design solution, we decided to create a self-assessment rubric for students to assess their engagement in a digital learning environment.
Focusing on factors that students are more likely to control and remedy, such as effort, rather than on knowledge of content and ability, helps students overcome failure (Dörnyei, 2007). When instructing face-to-face, it is easy to gauge student engagement on an ongoing basis; in a digital learning environment, however, assessment of engagement needs to be more streamlined and transparent for both teachers and students.
We chose a rubric for simplicity, transparency, and effectiveness. In our case, the rubric would be provided to the student at the beginning of the course, outlining the target behaviours for engagement. Students could then reference the rubric to clarify what is expected of them, which is essential in any learning environment where access to the teacher is not always immediate. Clarity and understanding of the expectations set out by rubrics have been shown to improve student performance (Kearns, 2012). Rubrics have also been shown to decrease anxiety around assessment and negative self-regulation tactics in students (Panadero & Romero, 2014).
The choice to use a rubric as a self-assessment tool was also an intentional decision to increase accountability and engagement. Learning to self-monitor actions, thoughts, and feelings in the learning process is crucial to student success and is a skill that needs to be practiced (Panadero & Romero, 2014). Formative self-assessment should be conducted more than once so that students can check in, reflect on their engagement, and revise their behaviour as necessary. This type of self-assessment is not an opportunity for students to formally grade themselves, as that decreases its effectiveness and increases the likelihood of students inflating their self-evaluations (Andrade, Du, & Mycek, 2010). An added benefit of formative self-assessment is that its results can serve as a springboard for teacher feedback.
Self-assessments help develop critical thinking skills and support a constructivist approach to learning (Conrad & Openo, 2018). Although we have provided a framework for the rubric, involving students in constructing the criteria can help increase the rubric’s effectiveness. It can build the relationship between teacher and student and help students understand the importance and relevance of the criteria, which in turn increases buy-in to the learning skills (Panadero & Romero, 2014). This rubric can easily be adapted with student input or made more specific to a course if necessary. It is not meant to capture all learning skills; it is meant to provide a snapshot of engagement in the online learning environment. Teachers could also easily adapt it for their own summative assessment.
References
Andrade, H. L., Wang, X., Du, Y., & Akawi, R. L. (2009). Rubric-referenced self-assessment and self-efficacy for writing. The Journal of Educational Research, 102(4), 287-302. https://scholarsarchive.library.albany.edu/cgi/viewcontent.cgi?article=1012&context=etap_fac_scholar
Butt, G. (2010). Making assessment matter. Bloomsbury Publishing Plc.
Conrad, D., & Openo, J. (2018). A few words on self-assessment. In T. Anderson (Ed.), Assessment strategies for online learning: Engagement and authenticity (pp. 152-158). AU Press, Athabasca University.
Dörnyei, Z. (2007). Creating a motivating classroom environment. In Cummins J., Davison C. (Eds.), International handbook of English language teaching (pp. 719-731). Springer. https://doi.org/10.1007/978-0-387-46301-8_47
Kearns, L. R. (2012). Student assessment in online learning: Challenges and effective practices. Journal of Online Learning and Teaching, 8(3), 198. https://jolt.merlot.org/vol8no3/kearns_0912.pdf
Panadero, E., & Romero, M. (2014). To rubric or not to rubric? The effects of self-assessment on self-regulation, performance and self-efficacy. Assessment in Education: Principles, Policy & Practice, 21(2), 133-148. https://mycourses.aalto.fi/pluginfile.php/927618/mod_resource/content/1/PanaderoRomero2013.pdf
Picciano, A. G. (2017). Theories and frameworks for online education: Seeking an integrated model. Online Learning, 21(3), 166-190. https://doi.org/10.24059/olj.v21i3.1225
Province of British Columbia (2020). Curriculum Overview. https://www.curriculum.gov.bc.ca/curriculum/overview
When I read the outline of this assignment, I immediately thought to myself, “Well, I don’t have experience in instructional design, so I don’t think I have any tools in my toolkit.” Thankfully, after doing the readings and reflecting on what a “tool” is, I realized I have many at my disposal.
As a teacher, you need to have many “tools” at your disposal at any given time in order to create lessons, instruct, pivot, provide feedback, collaborate with your colleagues, etc. I’m sure there are so many instances where “tools” I use have become so reflexive that I don’t even consider them tools anymore. There are many factors that may influence what tool you use, some of which include context, desired outcome, need, availability, time, etc.
In my design practice, I identified Brave, Resilient, Flexible, Organized, and Empathetic as my superpowers.
Brave: I believe you have to be brave to be innovative, tackle design and instructional challenges, and try new things.
Resilient: Things don’t always go as planned, and I know I can bounce back when things go wrong.
Flexible: I can adapt to changing situations and needs.
Organized: Being organized brings me joy.
Empathetic: I always put the needs of my students first.
I put the note *in progress* at the top of my graphic because I am sure I forgot some, and I know that I am always evolving and learning, so I will keep adding to this list.
Exploring Design Models
As a high school teacher, I have often thought about the learning environment I create and how I am going to help my students engage meaningfully with the material. Beyond the course content I’m teaching, I’ve focused on the physical space of the classroom (seating arrangements, engaging and purposeful wall material, etc.) as well as on cultivating a social-emotional environment of respect, caring, openness, risk-taking, and trust. Until now, however, I had not necessarily considered which specific theories informed the decisions I made, or how those theories are used to design a learning environment. In the seven years I have been teaching, there have been massive changes in curriculum and technology use in the classroom. It has felt messy at times, trying to keep up with all of the changes while still trying to be innovative and always put the needs of my students first. It hasn’t always felt like there was time to explain the theory behind the decisions I’ve been making. This week’s activity has allowed me to formalize what I’ve been doing by helping me understand the role of learning theories in selecting a design model, what goes into making design decisions, and which models I find myself drawn to.
When considering which design model to utilize, it is apparent that learning theories play an important role. Ertmer and Newby (2013) discuss three dominant learning theories (behaviorism, cognitivism, and constructivism) and how they inform design by translating an understanding of how people learn into tangible materials, activities, and lessons. I agree with their assertion that an understanding of these theories translates into better instruction, as it gives instructors more purpose behind their actions. This is also supported by Merrill’s (2002) five principles of effective instruction, which apply to almost all design models to some degree.
With all of the massive changes that have taken place in education, especially the need to quickly pivot to online learning due to COVID-19, an understanding of learning theories and effective instruction principles becomes even more important for educators. While some educators may favour one learning theory and design approach over others, I personally see the value in shifting between several. Factors that may influence my decision include course content, mode of learning (in person or online), timing (beginning, middle, or end of the course), students’ needs, assessment requirements, available resources, and personal preferences/teaching style.
One of the design models that stood out to me in these readings was the PIE (plan, implement, evaluate) model, as it seems to sum up my “go-to” design process: not overly complicated, but it hits the important steps. It also emphasizes the use and application of technology in instructional design (Dousay, 2017), which is particularly relevant in today’s educational landscape.
Bates (2015) suggests that there is a need for more “agile” design, due to the rapidly changing nature of education, that will “enable students to develop and practice the skills and acquire the knowledge they will need in a volatile, uncertain, complex and ambiguous world.” After reviewing many of the design models, I agree with many facets of this idea. However, the fact that agile design is a newer model without much research behind it could be problematic. But to me, isn’t that what innovation is meant to do: try new ways of approaching learning, even if they may fail? I believe it is worth exploring, and I certainly have a lot more exploring to do in innovation and design!
References
Bates, T. (2015). Chapter 4.3 The ADDIE Model, Chapter 4.7 ‘Agile’ Design: flexible designs for learning, and Chapter 10 Trends in Open Education. In Teaching in the digital age. BCcampus. http://opentextbc.ca/teachinginadigitalage
Dousay, T. A. (2017). Chapter 22. Instructional Design Models. In R. West (Ed.), Foundations of Learning and Instructional Design Technology (1st ed.). https://edtechbooks.org/lidtfoundations
Ertmer, P., & Newby, T. (2013). Behaviorism, Cognitivism, Constructivism: Comparing critical features from an instructional design perspective. Performance Improvement Quarterly, 26(2), 43-71.
Merrill, M. D. (2002). First principles of instruction. Educational Technology Research and Development, 50(3), 43-59.