The Role of Chatbots in Education. Assignment 2, Part 2.

Throughout our known history, humans have used other beings to help us complete tasks and accomplish our goals with greater ease. Beginning with domesticating animals as a means of transportation and food, we then created machines to replace them. As our sophistication with machinery has grown, we have progressed to computers and other technologies that continue to expand our capabilities. In the past fifty years we have investigated mechanical agents, creating machines that emulate human characteristics. This essay explores the use of machines, robots, conversational pedagogical agents, social agents, and chatbots in education, drawing on the research of five articles and connecting their findings. For the purpose of this essay, the term 'chatbot' will be used throughout, regardless of the various terms used in the articles. While some may believe that using chatbots in educational settings is not valuable, research shows that the use of chatbots improves the learner's experience. Despite the controversy, chatbots are here to stay.

The authors of each article agree that chatbots and humans can interact and build relationships with each other to achieve learning objectives. Specifically, Riel (2019) suggests that three critical functions are necessary for an educational chatbot to achieve educational objectives in a principled way: conversational functionality (the ability to engage in conversation with a human being); educational goals (a design intended to meet specific educational goals); and a pedagogical role (the chatbot must assume a pedagogical role in its design if it will teach students). Furthermore, he states, "…an educational chatbot must also actively play a role in the learner's education towards achieving the educational goals established by a chatbot's designer (much like the work of a teacher, coach, or tutor)" (p. 3). The idea of chatbots aiding the process of learning by assisting humans is also discussed by Gulz, Haake, Silvervarg, Sjoden, & Veletsianos (2011) through the illustration of a math game in which students "teach their agents to play" (p. 134). In doing so, the agent assumes the role of a Teachable Agent (TA), which "constructs a mathematical model by means of artificial intelligence (AI) algorithms" (p. 134). Further to this, Dale (2016) suggests that chatbots and humans can, and more importantly do, engage in relationships. In fact, he believes we have already reached the point of being unconcerned about whether we are dealing with a real person or not. Brahnam & De Angeli (2008) expand this further, stating that chatbots have the ability to act as social stimuli, which are "…animated, they perceive while they are perceiving, change while inducing change in others, have and elicit intents, motives, desires, and emotions" (p. 1). Bull, Hall, & Kerly (2007) employ the methodology of learner modelling, which creates a model based on individual learners and allows for customization and interaction between the learner and a system that is both collaborative and adaptive.
The insights of each article provide an overarching opinion that chatbots can interact with learners in ways that resemble human interaction with other humans. However, human interactions are not solely positive in nature, and in this regard, relationships between chatbots and humans also reflect some of the negative outcomes of our engagement.

Brahnam & De Angeli (2008) write about the 'dark' side of human and chatbot interaction, raising questions about the potential abuse of chatbots by humans. They argue that relationships developed with chatbots can incorporate emotions such as rage and anger towards chatbots, in the same way humans can experience emotions with one another. They state that chatbots can be abused and misused, citing the examples of "cyberbullying, electronic spam, and frauds" (p. 5). This idea is also discussed by Gulz et al. (2011) in reference to a study conducted in 2008 by Doering, Scharber and Veletsianos, which found significant instances of learners abusing chatbots. The authors' opinion, based on their subsequent study, is that this issue is not as prevalent; however, they acknowledge the variables between the two studies and caution that their findings should be viewed in light of this caveat. The opinions expressed by Brahnam & De Angeli (2008), however, suggest that negative findings in studies are often framed in a more positive manner. Instead, their research determined "…that verbal abuses (e.g., insults, threats, foul language, sexual advances, and pornographic sex-talk) abound in user interactions" (p. 2). Given that violence between humans is a reality of our modern world, it would make sense that this would also be a factor in the relationships between chatbots and humans, and it bears further consideration.

In each of the articles the authors highlight areas and make recommendations for future study and/or development. Riel (2019) expresses the concern that "…algorithms used in automated and personalized educational software disproportionately place underrepresented students into less-productive paths" due to "inherent biases in the system" (p. 10). He recommends that designers be aware of this and account for it in the analysis of the data. Gulz et al. (2011) identify the need for longer-term studies to provide data that examines the effects of chatbot use over time. Bull et al. (2007) state that with "specific scripts" designed for the purpose, "a chatbot can provide the necessary negotiation facilities for an enhanced" learner experience (p. 12). Overall, the authors present positive viewpoints on the use of chatbots and the benefits that could be achieved, while also suggesting further research is required. As Riel (2019) observes, "Such research is timely, as chatbots are poised to provide unique benefits and new possibilities to learning environments that are not achievable in more traditional learning situations" (p. 11).

Chatbots have a role to play in educational environments that will add tremendous value; however, attention needs to be paid to the specifics of that role. Specifically, chatbots should act as assistants, aiding humans rather than replacing them. By examining the nuances of human relationships, we can use this data to create chatbot profiles that meet the needs of learners. Most importantly, given the trend of increasing chatbot use, it is crucial that educators expand their research parameters to comprehensively study both the positive and negative impacts. Failing to do so distorts the full picture and results in a biased and inaccurate view. By contrast, endeavoring to examine the research findings meticulously will enable designers to make adjustments that improve the experience for the learner.


Brahnam, S., & De Angeli, A. (2008). Special issue on the abuse and misuse of social agents. Interacting with Computers, 20(3), 287-291.

Bull, S., Hall, P., & Kerly, A. (2007). Bringing chatbots into education: Towards natural language negotiation of open learner models. In Ellis, R., Allen, T., & Tuson, A. (Eds.), Applications and Innovations in Intelligent Systems XIV (SGAI 2006). London: Springer.

Dale, R. (2016). The return of the chatbots. Natural Language Engineering, 22(5), 811-817.

Gulz, A., Haake, M., Silvervarg, A., Sjoden, B., & Veletsianos, G. (2011). Building a social conversational pedagogical agent: Design challenges and methodological approaches. In Perez-Marin, D., & Pascual-Nieto, I. (Eds.), Conversational Agents and Natural Language Interaction: Techniques and Effective Practices (pp. 128-155). IGI Global.

Riel, J. (2019). Essential features and critical issues with educational chatbots: Toward personalized learning via digital agents. In Khosrow-Pour, M. (Ed.), Encyclopedia of Organizational Knowledge, Administration, and Technologies. Hershey, PA: IGI Global.



2 Replies to “The Role of Chatbots in Education. Assignment 2, Part 2.”

  1. Very interesting essay on chatbots, Sue!

    I haven’t thought about chatbots enough, but will need to… As a K-12 educator, I see any form of edtech, chatbots included, that engages students on new or deeper levels as valuable. Similarly, opportunities to personalize learning, which chatbots can offer, are also of great value to most learners.

    And ABUSE of chatbots is something that I had not ventured to think about, but of course it would exist. What a world we live in. Where people would abuse one another in person, the opportunities and courage to do so online are only magnified, especially if one is “only abusing a machine or distant bot.” Imagine the programmers creating responses to try to teach civility to various ages of online users of chatbots.

    I look forward to working with chatbots more, as part of various learning tools and resources — as an educator with my students.


    1. Thanks Leigh! I was surprised as well. But of course, once we think about it more, the instances of online violence are stark examples. And, for the most part, what we’re witnessing on Facebook and Twitter involves ‘real’ people. So it struck me as an important consideration, particularly for children who are not only acquiring knowledge but also learning behaviour.
      The diversity of the sources I found offered many fascinating ‘rabbit holes’ so it was challenging to stay focussed. It’s definitely an area I would like to explore in more depth.
