Mark's Blog

A MALAT Student Blog

An educational Community of Inquiry (CoI) is a “group of individuals who collaboratively engage in purposeful critical discourse and reflection to construct personal meaning and confirm mutual understanding” (Garrison & Arbaugh, 2007, as cited in Lalonde, 2020, 0:12). This theoretical framework can serve as a scaffold for many learning environments, such as the aerospace control simulator training environment. Air traffic control, a sub-type within the aerospace control community of the Canadian Armed Forces, is highly specialized and unique. Air traffic control training can be defined as a “dynamic training environment where controllers constantly receive a large volume of information from multiple sources to monitor changes in the environment, make decisions, and perform effective actions in a timely manner” (Xing & Manning, 2005, p. 1). Simulator training is multifaceted and stressful for students to complete successfully. CoI principles may help provide a learning environment in which content and simulator experiences are not the only learning variables students encounter. Vaughan, Cleveland-Innes, and Garrison (2013) suggest the result would not simply be blends of content with no learning experiences; applied effectively, CoI would bring facilitation “of both students and instructors, creating a climate, supporting discourse, and monitoring learning such that presence can emerge and inquiry occur” (p. 46). Below is an infographic (Figure 1.1) showing potential strategies supporting CoI elements within an aerospace control simulator learning environment.

Figure 1.1 Infographic: Community of Inquiry Framework as a Basis for Aerospace Control Simulator Training

Within the three CoI elements, the aerospace control training community can provide a multifaceted and collaborative training environment. In terms of social presence, post-simulation debriefs can draw active communication from all students, and each class can be given its own group chat network to foster continued communication and collaboration when face-to-face contact is not feasible. A lessons-learned program can be added to engage the thinking of all students through pre-set scenarios prior to simulation; this helps students think outside the box while being able to voice their thoughts and ideas effectively. Pre- and post-simulation Q&A sessions can be included to enhance and assess students’ comprehension of the material being presented. Teaching presence can be enhanced through instructors sharing their own past failures, a practice that encourages intellectual risk-taking. The infographic above gives a reasonable representation of specific strategies that can be employed within the aerospace community. These ideas are supported by the CoI descriptions noted above and can add constructed personal meaning while confirming mutual understanding within this already multifaceted training environment.



Garrison, D. R., & Arbaugh, J. B. (2007). Researching the community of inquiry framework: Review, issues, and future directions. The Internet and Higher Education, 10(3), 157-172. doi:

Lalonde, C. (2020, August 22). Facilitation in a community of inquiry. Retrieved from

Vaughan, N.D., Cleveland-Innes, M., & Garrison, D.R. (2013). Teaching in blended learning environments: Creating and sustaining communities of inquiry. Athabasca University Press. Retrieved from

Xing, J., & Manning, C.A. (2005). Complexity and automation displays of air traffic control: Literature review and analysis (Report No. DOT/FAA/AM-05/4). Washington, DC: US Department of Transportation Federal Aviation Administration.


Digital facilitation is not a new concept for me, but the more I participate in the online learning environment of this multifaceted topic, the more it leads me to ask: how does an instructor fit into the whole online learning picture? In reference to Bull’s (2013) “Eight Roles of an Effective Online Teacher,” teachers must demonstrate specific characteristics in order to be deemed effective. Instead of speaking about three of the roles I agree with, I will state three different characteristics that I believe make an effective online teacher.


  1. Humility: This virtuous characteristic puts the student ahead of the teacher; it is the antithesis of pride, which puts self before others. I have had many teachers over my years as a student in elementary, secondary, post-secondary, and professional education. A common theme among all my favourite instructors was humility. It is a virtue that has been lost over the years, but one I hope makes a comeback within our westernized culture.
  2. Out-of-the-Box: Creativity and the ability to capture the audience is another characteristic that, although not strictly necessary, I believe makes an effective teacher. I do not mean a teacher who is artistic per se, but one who makes students turn their heads in active engagement: the ability to capture one’s attention in mind and thought.
  3. Empathic: Empathy seems to be a common theme within this program. Many of my great teachers were empathetic in their teaching styles and approach. This is important for getting buy-in and participation from all students, including those who struggle.

Two Questions about Digital Facilitation

  1. How do you effectively facilitate online courses that are regarded as more stressful, or within high-stakes online/digital learning environments?
  2. How do you effectively assess learners’ comprehension through online assessment?

My Final Thought: What is required in abundance for air traffic control students?



Xing and Manning (2005) describe air traffic control as a “dynamic environment where controllers constantly receive a large volume of information from multiple sources to monitor changes in the environment, make decisions, and perform effective actions in a timely manner” (p. 1). Due to COVID-19, pre-course material covering general aviation knowledge within this complex environment must now be taught solely online. Previously, training was done via classroom instruction with subject matter experts. Currently, the Chain of Command (CoC) has dictated that students will self-study via online notes and textbook-style resources. This is no different from simply handing them a textbook and telling them to teach themselves. Although the material covers the basics of aviation, complex topics still exist, and confirmation of understanding is important for fostering confidence in the material prior to the more stressful training environments students will face in simulation. My initial idea was to implement learning videos, made by the subject matter experts, to aid student learning and engagement with the pre-course material. However, to know whether this was a good direction, I completed a short independent-samples t-test to determine whether the videos would be relevant and helpful to students. This lent itself well to my empathic design implementation using a student-led process, as encouraged by Worsham and Roux (2019). I had not tried empathic design principles and methods with student designers at the forefront, but the idea intrigued me enough to give it a try.


Two independent groups of 30 students each were selected to complete a mini-lecture on altimeters. For those who do not know, an altimeter is “an aneroid barometer designed to register changes in atmospheric pressure accompanying changes in altitude” (“Altimeter”, n.d., para. 1). One group completed the lecture via lecture notes only and was then asked to rate the learning tool on a five-point scale: 1 representing unhelpful and 5 representing very helpful. The other group was given both the lecture notes and a learning video on altimeters, then asked the same question on the same rating scale.


Independent Samples T-Test

Null Hypothesis: The mean helpfulness rating is the same for both pre-course learning formats (μ1 = μ2).

Group 1 (lecture notes only):

N1: 30
df1 = N – 1 = 30 – 1 = 29
M1: 2.98
SS1: 22.24
s21 = SS1/(N – 1) = 22.24/(30-1) = 0.77

Alternative Hypothesis: The mean helpfulness rating for the format that includes the online learning video is higher (μ2 > μ1).

Group 2 (lecture notes plus learning video):

N2: 30
df2 = N – 1 = 30 – 1 = 29
M2: 4.1
SS2: 8.7
s22 = SS2/(N – 1) = 8.7/(30-1) = 0.3

T-value Calculation

s2p = ((df1/(df1 + df2)) * s21) + ((df2/(df1 + df2)) * s22) = ((29/58) * 0.77) + ((29/58) * 0.3) = 0.53

s2M1 = s2p/N1 = 0.53/30 = 0.018
s2M2 = s2p/N2 = 0.53/30 = 0.018

t = (M1 – M2)/√(s2M1 + s2M2) = -1.12/√0.036 = -5.9

The t-value is approximately -5.9 (-5.94 when computed from the summary statistics above without intermediate rounding). The one-tailed p-value is < .00001, so the result is significant at p < .05.
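For readers who want to verify the arithmetic, the pooled-variance t-test above can be reproduced from the summary statistics alone. The sketch below is my own illustration in Python (the function name pooled_t is just a label for this post, not from any cited source):

```python
import math

def pooled_t(n1, m1, ss1, n2, m2, ss2):
    """Independent-samples t-test from summary statistics:
    group sizes (n), means (m), and sums of squared deviations (ss)."""
    df1, df2 = n1 - 1, n2 - 1
    # Pooled variance: the df-weighted average of the two sample variances,
    # which simplifies to (SS1 + SS2) / (df1 + df2)
    sp2 = (ss1 + ss2) / (df1 + df2)
    # Estimated standard error of the difference between the two means
    se = math.sqrt(sp2 / n1 + sp2 / n2)
    return (m1 - m2) / se

# Summary figures reported above: notes-only vs. notes-plus-video groups
t = pooled_t(30, 2.98, 22.24, 30, 4.10, 8.7)
print(round(t, 2))  # -5.94
```

Computed this way, t ≈ -5.94, in line with the value reported above once intermediate rounding is accounted for.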


The test indicated rejection of the null hypothesis; the results were statistically significant. This gives me more substantial evidence to pursue a learning video as the prototype for my learning technology. It also shows learners having a direct say in the development of a potential learning technology and its resources, lending itself to learner-led design. It is important to note that the t-test and p-value do not reveal that learning videos are the best choice of learning technology; they simply show that lecture notes combined with a learning video were rated significantly higher than lecture notes alone. More to follow in my design note and video presentation on the nuances of exactly how I will design the prototype. For now, the t-test puts me on good footing to spring into my empathic design process and the justification for my prototype.



Altimeter. (n.d.). In Merriam-Webster's Dictionary. Retrieved from

Worsham, D., & Roux, S. (2019). Foundations in Learner-Centered Design. Retrieved from

Xing, J., & Manning, C.A. (2005). Complexity and automation displays of air traffic control: Literature review and analysis (Report No. DOT/FAA/AM-05/4). Washington, DC: US Department of Transportation Federal Aviation Administration.

My PoP – Activity #2


Posted on Jul 2, 2020

As noted before in my first design notes post, my problem of practice (PoP) is rooted in the widespread effect COVID-19 has had on the Canadian Armed Forces, particularly within Force Generation (FG), the military term for training. Before looking into my PoP, I read Seelig’s article “How Reframing a Problem Unlocks Innovation.” The article was a force multiplier, encouraging me to produce open frames, or “[experiences that are created] to inform and increase the way we think” (Seelig, 2013, para. 2). What does this mean in our context? The short version is that we can innovate by reframing our question, or problem of practice, so that it better represents the specific needs identified while simultaneously encouraging innovation. Asking questions such as “why do we need to make this particular learning technology?” is important for maintaining innovative openness. Combined with empathic design processes, one can tailor a technological learning solution to a problem of practice that is truly traceable through understanding the needs of the whole diverse population affected.

My problem of practice (PoP): Due to COVID-19, the Canadian Armed Forces has required air traffic control pre-course training to be solely done via online learning without the benefit of instructors.

Seelig, T. (2013). How reframing a problem unlocks innovation. Fast Company. Retrieved from


My initial thoughts on critical inquiry were based on the overarching idea that you must delve deep into the topic at hand in order to draw out all the useful information. As long as one paid serious attention to the topic, researched it thoroughly, and added meaningful discussion, one had successfully engaged in critical inquiry. I believe there is truth to my initial thoughts, but perhaps clarity of understanding is needed, particularly in the context of learning technology. Selwyn (2010) proposed that critical study within the field of learning technology has been engrossed, although not completely misplaced, in explaining how effective learning technologies are designed, developed, and implemented. I believe these topics of study are clearly important, but I suggest Selwyn’s case for critical study is not a downgrading of past studies, but rather a call for more personal critical thought and its application to future studies. Selwyn’s (2010) further reflections suggest that learning technology studies need to bring critical study into the social scientific, self-reflective, and self-analytical sphere of inquiry. This is where I believe I am starting to understand a different side of critical inquiry, one that is more personal and socially reflective. For example, in asking the question pertaining to my individual learning plan, “are online summative assessments appropriate for phase-one terminal or tower air traffic control students?”, critical inquiry from a social scientific viewpoint may look at some of the psychological aspects for a student within the air traffic control learning environment. Further critical inquiry might look into aspects of air traffic control culture and the contributions, good or bad, it may make to the air traffic control learning environment.
Whatever the inquiry may entail, Selwyn has given me pause to make sure I include some type of social scientific or cultural research within my paper to give greater substance to my critical inquiry.

Selwyn, N. (2010). Looking beyond learning: Notes towards the critical study of educational technology. Journal of Computer Assisted Learning, 26(1), 65-73. Retrieved from


As I continue to research different journal articles and other reference material for my topic, I find myself surrounded by many great resources. My topic is currently: the feasibility of online assessment in high-stress learning environments, from an air traffic control perspective. As with much of my research, I try to look at what has been done in the past, or at least learn from it. As George Santayana (2020) would say, “those who cannot remember the past are condemned to repeat it” (para. 3). I decided to take a close look at Weller’s (2020) book, 25 Years of Ed Tech, to see whether history can provide insights into my critical inquiry.


Although I was hoping for primary research directly related to my topic, it was difficult to find relevant material within this book. However, instead of finding information directly linked to the potential feasibility of online assessments, I found secondary applications that could aid the effectiveness of online assessments within air traffic control. One such application is the use of digital badges. Digital badges are a “good example of how ed tech [has] evolve[ed] when several other technologies…[have made] the environment favour[ed] for their implementation” (Weller, 2020, p. 151). Air traffic control involves complicated operations that are difficult to become proficient at, due to the increasing complexity of each stage of training. For the student, complexity increases in many areas, such as, but not limited to: knowledge, understanding of air traffic control principles, the ability to maintain situational awareness, communications, visual and auditory acuity, advanced problem-solving, and the ability to perform exceptionally well under numerous pressure variables (time, risk factors, weather, multiple sensory inputs, etc.). Assessments are done routinely, and the standard passing mark for each test usually lies between 85% and 95%; any mark lower than this is an automatic failure. Weller (2020) suggests that digital badges have the potential to be effective within the digital learning environments in which they are employed, particularly by “help[ing] to structure courses into manageable chunks, with convenient awards along the way” (p. 154). Iafrate (2017), a writer for eLearning Industry, stated that “badges have been successfully used to set goals, motivate behaviours, represent achievements and communicate success in many contexts…badges can have a significant impact, and can be used to motivate learning, signify community and signal achievement” (para. 5).
Within a purely military context, the recognition of personnel has long been a significant practice at all levels of the military, regardless of rank. I truly wonder whether digital badges could be a new endeavour to assist in military digital learning environments.


The incorporation of digital badges within specified stages of air traffic control learning, particularly following successful online assessments, deserves further research and thought. I am even contemplating a meeting on the topic with my fellow supervisors. I can definitely see the benefit of looking at the history of ed tech; I am very glad I read Weller’s book. The past is filled with many failures and successes, but sometimes you just need to apply past inventions with some fresh, innovative ideas. Who knows what positive outcomes you will see?



George Santayana. (2020, April 29). In Wikipedia. Retrieved from

Iafrate, M. (2017, November 6). Digital badges: What are they and how are they used. eLearning Industry. Retrieved from

Weller, M. (2020). 25 years of ed tech (G. Veletsianos, Ed.). Edmonton, AB: AU Press. doi:


The topic I am currently pursuing is the feasibility of online assessment in high-stress learning environments, from an air traffic control perspective. This topic is unique to my vocation as Chief Terminal Air Traffic Controller at 19 Wing Comox, and it is difficult to pursue given the lack of current literature on air traffic control training. Within these research log posts, my goal is to give a snapshot of my critical inquiry sojourn into this topic while supplying insights from the particular references I am using. It is my hope that, through these log entries, I will also get help from you, the reader, via comments that further aid my critical inquiry.

One of the constructive criticisms I received in Part 1 of the assignment concerned my overall explanation of critical inquiry. Although I am confident in my understanding of critical inquiry, my own military way of thinking was getting in the way: in this case, my striving to always demonstrate effective outcomes, thus missing the importance of the process. In my Part 1 paper, I referred to my critical inquiry as a reasonable way to ‘demonstrate’ potential effectiveness, when in actuality critical inquiry seeks to ‘interrogate’ potential effectiveness. At first glance, the difference between these word choices seems inconsequential; however, ‘demonstration’ often defines an outcome, whereas ‘interrogation’ often defines a process. The process may in turn lead to an outcome, but the process, hence interrogation, fits the term critical inquiry more appropriately than demonstration.

It is through this interrogative lens that I must pursue my topic, with the hope of demonstrating, or perhaps even suggesting, an outcome to the problem my paper addresses. One unique reference I am currently reading is Ghosts in the Machine, by Owen (2017). Owen (2017) elaborates that “the title ‘Ghosts in the machine’ is used here to draw attention to how organisations comprise people who in turn shape – and are shaped by – their ways of organizing” (p. 2). As I seek to determine the feasibility of online assessments within air traffic control learning environments, I must bear in mind these ‘ghosts in the machine.’ Individuals and organizations are complex and interconnected in many ways. Perhaps part of my sojourn needs to address the underlying presuppositions and culture within the air traffic control community in order to better understand the feasibility of online assessments.



Owen, C. (2017). Ghosts in the machine: Rethinking learning, work and culture in air traffic control. New York, NY: CRC Press. Retrieved from:

