My working ARP title is “Air Traffic Control Training Addressing Student Task Saturation through the Use of Simulator Technologies by the Royal Canadian Air Force.” My main research question: In what ways might technology be used to address concerns of trainee task saturation during air traffic control simulator training?
In approaching this research project, I am currently looking at two theoretical frameworks that may fit the bill. First, the Technology Acceptance Model (TAM) answers the question of why people use technology: under TAM, people use technology because of its perceived ease of use and perceived usefulness (Davis, Bagozzi, & Warshaw, 1989; see also Venkatesh & Davis, 2000). Portz et al. (2019) state that “the theory posits that a person’s intent to use and usage behavior of a technology is predicated by the person’s perceptions of the specific technology’s usefulness” (p. 1). Second, the theory of cognitive apprenticeship will also be used as a theoretical framework. Cognitive apprenticeship is defined as “learning-through-guided-experience on cognitive and metacognitive, rather than physical, skills and processes” (Collins, Brown, & Newman, 1989, p. 456). Cognitive apprenticeship is a model of instruction that works to make thinking visible; it is composed of four main concepts: methods, ways to promote the development of expertise; sequencing, keys to ordering learning activities; sociology, social characteristics of learning environments; and content, types of knowledge required for expertise (Collins, Brown, & Holum, 1991). Much of air traffic control instructional practice is based on the underlying processes of cognitive apprenticeship, due to the complex tasks students must be able to demonstrate consistently throughout their training.
- Is the TAM model an appropriate theoretical framework in reference to my research question?
- What issues, if any, may I have when using the TAM model framework?
- Is cognitive apprenticeship an acceptable theoretical framework, or is it more a way of instructing than a framework itself?
- What positives, if any, do you see in using these frameworks in reference to my research question?
Collins, A., Brown, J.S., & Holum, A. (1991). Cognitive apprenticeship: Making thinking visible. American Educator, 15(3), 6-11. Retrieved from http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.124.8616&rep=rep1&type=pdf
Collins, A., Brown, J.S., & Newman, S.E. (1989). Cognitive apprenticeship: Teaching the craft of reading, writing, and mathematics. In L.B. Resnick (Ed.), Knowing, learning, and instruction: Essays in honor of Robert Glaser. Hillsdale, NJ: Erlbaum.
Davis, F.D., Bagozzi, R.P., & Warshaw, P.R. (1989). User acceptance of computer technology: A comparison of two theoretical models. Management Science, 35(8), 982-1003. https://doi.org/10.1287/mnsc.35.8.982
Portz, J.D., Bayliss, E.A., Bull, S., Boxer, R.S., Bekelman, D.B., Gleason, K., & Czaja, S. (2019). Using the technology acceptance model to explore user experience, intent to use, and use behavior of a patient portal among older adults with multiple chronic conditions: Descriptive qualitative study. Journal of Medical Internet Research, 21(4). https://doi.org/10.2196/11604
Venkatesh, V., & Davis, F.D. (2000). A theoretical extension of the technology acceptance model: Four longitudinal field studies. Management Science, 46(2), 186-204. https://doi.org/10.1287/mnsc.46.2.186.11926
In terms of disseminating my final product, I have already been invited to present my paper and research to the Aerospace Control Advisory Group (ACAG). For those who did not read my Padlet post, I am an Aerospace Control Officer within the Royal Canadian Air Force, and my main specialty is air traffic control. The ACAG is a high-ranking military group of senior Aerospace Control officers whose main purpose is to evaluate and make key decisions about the training and operations carried out by all Aerospace Control Officers within the Canadian Armed Forces. This includes both domestic and international operations, as well as combined and joint operations. One of my senior officers has already said they are interested in hearing my findings in terms of how they can better train air traffic controllers. The military spends significant resources to train just one air traffic controller, yet due to the high stressors within the training and the equally high standards required to pass both knowledge-based and practical assessments, many students are not successful. The Aerospace Control training community has worked hard over the past decade to innovate the training in order to bring down the failure rate within the training programs. With domestic and international operational commitments on the rise, combined with a world that appears to be becoming more unstable, the Canadian Armed Forces cannot afford to fail students of any trade.
One of my goals in taking on a Master’s degree was to aid the aerospace control community in some shape or form. As I thoroughly enjoy teaching air traffic control students in all phases of training, it made sense to do an applied research project that helps the ACAG make training better and students more successful. The ACAG usually meets twice a year, and once I have finished my Master’s I will most likely be invited to present my research project paper in an open forum with the most senior military leadership within my trade. I look forward to making a difference, and to completing this undertaking myself.
This course has been a fantastic learning experience for me. As a senior instructor supervisor in the Canadian Armed Forces, I found it particularly relevant and engaging to hear and analyze the cohort’s many points of view on digital facilitation.
Three takeaways that stood out to me:
- Instruction and facilitation are not equal. There is a dichotomy between instruction and facilitation, though perhaps not a complete one in strict denotation; the application of empathic leadership comes to mind when facilitating a group of students. Instruction is important for imparting knowledge and understanding, but facilitating the learning experience as a whole is equally important.
- Digital or virtual learning environments need to be safe spaces that encourage collaboration from all learning groups and individuals. Race, gender, and other cultural aspects must be acknowledged and woven into the facilitation, instruction, and learning environment as a whole. No one should feel excluded in any way.
- Motivating learners within a digital or virtual environment can be difficult. Instructors and facilitators must find ways to encourage learning and foster motivated learning behaviours both within the individual student and the class as a whole.
Two questions I have about digital facilitation based on my course experience:
- How does digital facilitation change in an online military course?
- What are differences in an online military course that may prove unique in terms of digital facilitation?
A last funny 3-2-1
Group 2 would like to thank you for your deeply engaged participation during this past week’s learning. You were introduced, through asynchronous and synchronous learning tools, to managing and facilitating diverse learning perspectives and conflicting ideas within an online learning environment (Monsell, 2020, para. 1). The week started with a three-day asynchronous discussion to answer two questions:
- If a discussion among learners were to become heated or emotionally charged, should the facilitator jump in to shut down the conversation? Or is it beneficial to the learning to see how it plays out?
- Recognizing that conversation styles, such as argumentative, peacekeeping, collective, passionate, etc., can be culturally based (Ross, 2013), what can a facilitator do to moderate these different communication styles to ensure each student’s authentic voice is still heard?
The discussion was well attended and provided many diverse opinions, whilst respect for each other’s viewpoints was sustained throughout. During the asynchronous discussion, about two-thirds of individuals demonstrated concurrence-seeking discussion behaviour, while only one-third demonstrated clearly diverse or conflicting ideas. The discussion posts revealed that some participants had either not thought of, or not deliberately included, intellectual conflict within their facilitation environments. Many individuals pointed out that conflict, in both behaviour and deeper meaning, is a highly dynamic concept. Moreover, the discussions revealed that conflict can indeed offer beneficial learning opportunities for instructors and facilitators. Johnson and Johnson (2009) stated, “intellectual conflict is that spark that energizes students to seek out new information and study harder and longer” (p. 37). It was always our intention to light a spark within all of you: to demonstrate not only an appreciation for constructive controversy, but to allow you to actively engage in respectful discourse throughout the week.
Although the discussions started with only one-third demonstrating clearly diverse or conflicting ideas, this proportion rose to over fifty percent when everyone participated in the synchronous online session. This is a significant jump in the amount of constructive controversy taking place in the week, perhaps showing that people required time to digest and process the information and research posed to them before consciously or unconsciously sharing diverse or conflicting ideas. Perhaps individuals needed first to feel safe within the online group before opening up; all interesting possibilities for sure.
During the online synchronous learning, you were asked to brainstorm facilitator strategies for authentic conversations that support positive and successful online learning through experiential learning, reflection, and discourse. This was accomplished by splitting everyone into breakout rooms within Collaborate. Please see the infographic below for what you brainstormed as a class.
By the end of the week, everyone was asked to reflect on the learning from the past week and complete a short blog post. The blog posts did not disappoint the Group 2 facilitators, and we encourage everyone to read them to add to your learning and comprehension. The main consensus was that most people found the week thought-provoking and highly interactive. Here are some notable quotes from the blog posts:
“My initial reactions are that conflict is unproductive…reflecting, I see value in constructive controversy”
“This is a lesson that will have a lasting impact on me in terms of inspiring me to promote constructive discourse among my own learners”
“This week has made me realize that [constructive controversy] can be a valuable tool”
I learned “the importance of recognizing personal biases and the notion of building ‘safe, not brave’ learning spaces”.
“I found the Johnson & Johnson (2009) reading to be really valuable in that it gave me a better vocabulary for what is happening in the classroom.”
The end of the week clearly demonstrated that everyone, in one way or another, was able to actively learn and participate in the learning of others. Participants were able to safely share and explain diverse and unique opinions on the topics at hand. Group 2 would like to thank you for your participation, and we hope you all learned something new on the topic of constructive controversy. Perhaps now it is not something you automatically avoid but, with the help of the co-created facilitator strategies learned this week, something you are able to successfully incorporate into your own teaching or unique learning environment.
Johnson, D. & Johnson, R. (2009). Energizing learning: The instructional power of conflict. Educational Researcher, 38(1), 37–51. https://doi.org/10.3102/0013189X08330540
Monsell, C. (2020, September 20). Group 2 – Final facilitation plan [blog post]. Retrieved from https://malat-webspace.royalroads.ca/rru0101/group-2-final-facilitation-plan/
An educational Community of Inquiry (CoI) is a “group of individuals who collaboratively engage in purposeful critical discourse and reflection to construct personal meaning and confirm mutual understanding” (Garrison & Arbaugh, 2007, as cited in Lalonde, 2020, 0:12). This theoretical framework can be a scaffold for many learning environments, such as the aerospace control simulator training environment. Air traffic control, a sub-type within the aerospace control community of the Canadian Armed Forces, is highly specialized and unique. Air traffic control training can be defined as a “dynamic training environment where controllers constantly receive a large volume of information from multiple sources to monitor changes in the environment, make decisions, and perform effective actions in a timely manner” (Xing & Manning, 2005, p. 1). Simulator training is multifaceted in its characteristics and stressful for students to complete successfully. CoI principles may help to provide a learning environment where content and simulator experiences are not the only learning variables experienced by students. Vaughan, Cleveland-Innes, and Garrison (2013) suggest it would not simply be a blend of content with no learning experiences: applied effectively, CoI brings effective facilitation “of both students and instructors, creating a climate, supporting discourse, and monitoring learning such that presence can emerge and inquiry occur” (p. 46). Below is an infographic (Figure 1.1) showing potential strategies supporting CoI elements within an aerospace control simulator learning environment.
Figure 1.1 Infographic: Community of Inquiry framework as Basis for Aerospace Control Simulator Training
Within the three CoI elements, the aerospace control training community can provide a multifaceted and collaborative training environment. In terms of social presence, post-simulation debriefs can provide active communication from all students, and each class can be provided with its own group chat network to foster continued communication and collaboration when face-to-face contact is not feasible. A lessons-learned program can engage the thinking of all students through pre-set scenarios prior to simulation, helping students think outside the box while effectively voicing their thoughts and ideas. Pre- and post-session Q&A can enhance and assess students’ comprehension of the material being presented. Teaching presence can be enhanced through instructors sharing their own past failures, a practice that encourages intellectual risk-taking. The infographic presented above gives a reasonable representation of specific strategies that can be employed within the aerospace community. These ideas are supported by the CoI descriptions noted above and can add constructed personal meaning while confirming mutual understanding within this already multifaceted training environment.
Garrison, D.R., & Arbaugh, J.B. (2007). Researching the community of inquiry framework: Review, issues, and future directions. The Internet and Higher Education, 10, 157-172. https://doi.org/10.1016/j.iheduc.2007.04.001
Lalonde, C. (2020, August 22). Facilitation in a community of inquiry. Retrieved from https://www.youtube.com/watch?v=Nv1bUZv5PLs&feature=youtu.be
Vaughan, N.D., Cleveland-Innes, M., & Garrison, D.R. (2013). Teaching in blended learning environments: Creating and sustaining communities of inquiry. Athabasca University Press. Retrieved from https://www.aupress.ca/books/120229-teaching-in-blended-learning-environments/
Xing, J., & Manning, C.A. (2005). Complexity and automation displays of air traffic control: Literature review and analysis (Report No. DOT/FAA/AM-05/4). Washington, DC: US Department of Transportation Federal Aviation Administration.
Digital facilitation is not a new concept for me, but the more I participate in the online learning environment surrounding this multifaceted topic, the more it leads me to ask how an instructor fits into the whole online learning picture. In reference to Bull’s (2013) “Eight Roles of an Effective Online Teacher,” teachers must demonstrate specific characteristics in order to be deemed effective. Instead of speaking about three of the roles I agree with, I will state three different characteristics I believe make an effective online teacher.
- Humility: This virtuous characteristic puts the student ahead of the teacher’s self; it is the antithesis of pride, which puts self before others. I have had many teachers over my years as a student in elementary, secondary, post-secondary, and professional education. A common theme among all my favourite instructors was humility, a virtue that seems lost in recent years but one I hope makes a comeback within our Westernized culture.
- Out-of-the-box: Creativity and the ability to capture an audience is another characteristic that, although not strictly necessary, I believe makes an effective teacher. I do not mean a teacher who is artistic per se, but one who makes students turn their heads in active engagement: the ability to capture their attention in mind and thought.
- Empathic: Empathy seems to be a common theme within this program. Many of my great teachers were empathetic in their teaching styles and approach. This is important to get buy-in and participation from all students, including those who struggle.
Two Questions about Digital Facilitation
- How do you effectively facilitate online courses that are regarded as more stressful, or within high-stakes online/digital learning environments?
- How do you effectively assess learners’ comprehension through online assessment?
My final thought: What is required in abundance for air traffic control students?
Xing and Manning (2005) describe air traffic control as a “dynamic environment where controllers constantly receive a large volume of information from multiple sources to monitor changes in the environment, make decisions, and perform effective actions in a timely manner” (p. 1). Due to COVID-19, pre-course material on general aviation knowledge within this complex environment must now be taught solely online. Previously, training was delivered via classroom instruction with subject matter experts. Currently, the Chain of Command (CoC) has dictated that students will self-study via online notes and textbook-style resources. This is no different from simply giving them a textbook and telling them to teach themselves. Although the material covers the basics of aviation, complex topics still exist, and confirmation of understanding is important to foster confidence in the material prior to the more stressful training environments students will see in simulation. My initial idea was to implement learning videos, made by the subject matter experts, to aid student learning and engagement with the pre-course material. However, to know whether this was a good direction, I completed a short independent samples t-test to determine whether the videos would be relevant and helpful to students. This lent itself well to my empathic design implementation using a student-led process, as encouraged by Worsham and Roux (2019). I had not tried empathic design principles and methods with student designers at the forefront, but the idea intrigued me enough to give it a try.
Two independent groups of 30 students each were selected to complete a mini-lecture on altimeters. For those who do not know, an altimeter is “an aneroid barometer designed to register changes in atmospheric pressure accompanying changes in altitude” (“Altimeter”, n.d., para. 1). One group completed the lecture via lecture notes only and was then asked to rate the learning tool as helpful or not, with 1 representing unhelpful and 5 representing very helpful. The other group was given both the lecture notes and a learning video on altimeters and asked the same question on the same rating scale.
Independent Samples T-Test
Null hypothesis (H0): μ1 = μ2; the mean helpfulness rating for the pre-course altimeter material is the same with or without the learning video.
Alternative hypothesis (H1): μ2 > μ1; the mean rating is higher when the learning video is included.
Sample means (on the 5.0 scale): M1 = 2.98 (lecture notes only); M2 = 4.10 (lecture notes plus learning video).
Group 1: df1 = N1 – 1 = 30 – 1 = 29; s²1 = SS1/df1 = 22.24/29 ≈ 0.77
Group 2: df2 = N2 – 1 = 30 – 1 = 29; s²2 = SS2/df2 = 8.7/29 = 0.30
Pooled variance: s²p = (df1/(df1 + df2))·s²1 + (df2/(df1 + df2))·s²2 = (29/58)(0.77) + (29/58)(0.30) ≈ 0.53
Variance of each sample mean: s²M1 = s²p/N1 = 0.53/30 ≈ 0.018; s²M2 = s²p/N2 = 0.53/30 ≈ 0.018
t = (M1 – M2)/√(s²M1 + s²M2) = –1.12/√(0.018 + 0.018) ≈ –5.9
The t-value is -5.92122. The p-value is < .00001. The result is significant at p < .05.
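The pooled-variance calculation can be reproduced from the summary statistics alone. The sketch below is my own illustration in Python (the variable names are mine, and M2 = 4.10 is inferred from M1 = 2.98 and the mean difference of –1.12); the result agrees with the reported t-value up to rounding of the intermediate values:

```python
import math

# Summary statistics from the write-up (SS = sum of squared deviations)
N1, N2 = 30, 30            # students per group
SS1, SS2 = 22.24, 8.7      # notes only vs. notes + video
mean_diff = -1.12          # M1 - M2 = 2.98 - 4.10

df1, df2 = N1 - 1, N2 - 1                              # 29 each
s2_1, s2_2 = SS1 / df1, SS2 / df2                      # sample variances ~0.77 and 0.30
s2_pooled = (df1 * s2_1 + df2 * s2_2) / (df1 + df2)    # ~0.53
se_diff = math.sqrt(s2_pooled / N1 + s2_pooled / N2)   # standard error of the difference
t = mean_diff / se_diff                                # ~ -5.94 with these inputs

# The one-tailed critical value at alpha = .05 with df = 58 is roughly 1.672,
# so |t| far exceeds it and the null hypothesis is rejected.
print(f"t = {t:.2f}, df = {df1 + df2}")
```

A statistics package such as SciPy would also report the exact p-value, but the critical-value comparison above is enough to confirm significance at p < .05.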
The test indicated rejection of the null hypothesis, concluding that the results were statistically significant. This gives me more substantial evidence to pursue a learning video as the prototype learning technology to develop. It also shows learners directly having a say in the development of a potential learning technology and its resources, lending itself to learner-led design. It is important to note that the t-test and p-value do not reveal that learning videos are the best choice of learning technology. The test simply shows a statistically significant result suggesting that lecture notes combined with a learning video were possibly more effective than lecture notes alone. More to follow in my design note and video presentation on the nuances of exactly how I will design the prototype. For now, the t-test puts me on good footing to spring into my empathic design process and the justification for my prototype.
Altimeter. (n.d.). In Merriam-Webster’s dictionary. Retrieved from https://www.merriam-webster.com/dictionary/altimeter
Worsham, D., & Roux, S. (2019). Foundations in Learner-Centered Design. Retrieved from https://uclalibrary.github.io/foundations
Xing, J., & Manning, C.A. (2005). Complexity and automation displays of air traffic control: Literature review and analysis (Report No. DOT/FAA/AM-05/4). Washington, DC: US Department of Transportation Federal Aviation Administration.
As noted in my first design notes post, my problem of practice (PoP) is rooted in the widespread effect COVID-19 has had on the Canadian Armed Forces, particularly within Force Generation (FG; the military term for training). Before looking into my PoP, I read Seelig’s article “How Reframing a Problem Unlocks Innovation.” The article was a force multiplier, encouraging me to produce open frames, or “[experiences that are created] to inform and increase the way we think” (Seelig, 2013, para. 2). What does this mean in our context? The short version is the idea that we can innovate by re-framing our question, or problem of practice, so that it better represents the specific needs identified while encouraging innovation at the same time. Asking questions such as “why do we need to make this particular learning technology?” is important to maintaining innovative openness. Combined with empathic design processes, one can tailor a technological learning solution to a problem of practice that can be truly traced to an understanding of the needs of the diverse population affected as a whole.
My problem of practice (PoP): Due to COVID-19, the Canadian Armed Forces has required air traffic control pre-course training to be solely done via online learning without the benefit of instructors.
Seelig, T. (2013). How reframing a problem unlocks innovation. Fast Company. Retrieved from https://www.fastcompany.com/1672354/how-reframing-a-problem-unlocks-innovation