{"id":313,"date":"2025-05-29T22:54:24","date_gmt":"2025-05-30T05:54:24","guid":{"rendered":"https:\/\/malat-webspace.royalroads.ca\/rru0282\/?p=313"},"modified":"2025-05-29T22:54:24","modified_gmt":"2025-05-30T05:54:24","slug":"parasocial-bonds-and-algorithmic-influence-on-youtube","status":"publish","type":"post","link":"https:\/\/malat-webspace.royalroads.ca\/rru0282\/parasocial-bonds-and-algorithmic-influence-on-youtube\/","title":{"rendered":"Parasocial Bonds and Algorithmic Influence on YouTube"},"content":{"rendered":"\n<p>As I began this research journey into the intersection of parasocial interaction and algorithmic influence on YouTube-based learning, I expected to uncover nuances in content delivery and learner engagement. What I didn\u2019t anticipate was how quickly the inquiry would expand, raising broader questions about trust, visibility, and equity on platforms like YouTube.<\/p>\n\n\n\n<p>Earlier in my inquiry, I leaned more on foundational texts from Selwyn (2010) and Fawns (2022) which helped me to view platforms like YouTube not just as tools, but as larger systems with its own set of entangled values, ideologies, and commercial operations. Further research led me to studies by Bucher (2018), Selwyn (2022), and Bishop (2019), which deepened my understanding of algorithmic curation as both a technical and political force. These articles unpacked how recommendation systems are not just responsive to user behaviour, but actively shape that behaviour by favouring more engaging or profitable content. Simultaneously, readings on parasocial interactions by Labrecque (2014), Chen (2016) and Beautemps &amp; Bresges (2022) brought to light the emotional dimensions of online learning, showing how perceived attachment with content creators could subtly influence learners&#8217; trust, retention, and motivation. 
<\/p>\n\n\n\n<p>One of the first tensions I encountered was in understanding parasocial interactions not just as social phenomena, but as pedagogical experiences. I questioned whether emotional connections enhanced engagement or potentially distorted perceptions of credibility. Through my readings and content analysis of popular YouTube channels like <em>English with Emma<\/em>, <em>Organic Chem Tutor<\/em>, and <em>Khan Academy<\/em>, I started to see how parasocial presence could serve as a proxy for trust. These highly popular creators, with millions of subscribers, consistently drew higher engagement than smaller creators offering similar content. I wondered whether learners were naturally gravitating toward these creators out of familiarity and trust, or whether YouTube\u2019s algorithm was playing a decisive role in making them more visible. This observation led me to explore the concept of algorithmic agency, examining how algorithms not only curate content but also shape user behaviour and learning pathways. Bucher (2018) describes algorithms not only as technical processes but as agents of power that shape human behaviour in subtle but profound ways. In my research on YouTube, I examined how the recommendation engine reinforces engagement loops that prioritize emotional resonance and production quality over accessibility or pedagogy. Consequently, I found that YouTube is less about helping learners find what they need and more about predicting what will keep them watching. That has real implications for what kind of content gets seen, which voices get heard, and which are left out.<\/p>\n\n\n\n<p>The more emotionally compelling a creator is, the more likely they are to trigger high watch times and return visits. The algorithm reads these signals and promotes their content further. In turn, learners encounter more of this creator\u2019s videos, strengthening that parasocial bond. 
This feedback loop creates a powerful structure in which visibility, trust, and engagement are produced through predictive design. Yet this is where questions about equity deepened. I began to question whether YouTube as a platform is as accessible as it seems and whether it genuinely supports diverse learning. A study by Haroon et al. (2023) was particularly enlightening in this regard. By creating 100,000 simulated user accounts, the researchers discovered that YouTube&#8217;s algorithm tends to recommend content aligning with a user&#8217;s existing ideological leanings. This pattern suggests that YouTube&#8217;s recommendation system may inadvertently create echo chambers, limiting exposure to diverse perspectives and potentially skewing the learning experience. Recognizing this, I began to reflect on whether platforms have a responsibility to ensure that their algorithms promote a more balanced and inclusive educational environment. <\/p>\n\n\n\n<p>Things became more complex when I considered the economic side of YouTube. Many creators earn money through ads, sponsorships, or paid memberships, which means their content isn\u2019t shaped only by what\u2019s helpful to learners, but also by what performs well. Decisions about tone, visuals, and how often videos are released may be influenced by the pressures of keeping an audience and making a living. While monetization isn\u2019t inherently bad, it adds tension around who gets promoted and who stays invisible. Consequently, the system may favour those who are more marketable, not necessarily more educational.<\/p>\n\n\n\n<p>In the end, my journey so far has been less about finding clear answers and more about asking better questions. How do emotional bonds and algorithmic systems work together to shape trust in learning? What happens when engagement becomes the main measure of educational success? 
And how can learners, educators, and designers become more aware of the invisible systems that influence what and how we learn? As I wrap up my research, I aim to shift my critical inquiry toward addressing these questions in my final paper. In particular, I will focus on how trust, visibility, and equity are shaped in digital environments where performance data and predictive analytics guide what content gets seen and valued.<\/p>\n\n\n\n<p><strong>References<\/strong><\/p>\n\n\n\n<p>Beautemps, J., &amp; Bresges, A. (2022). The influence of the parasocial relationship on the learning motivation in self-regulated learning with YouTube videos. <em>Frontiers in Education<\/em>, <em>7<\/em>, 1021798. <a href=\"https:\/\/doi.org\/10.3389\/feduc.2022.1021798\">https:\/\/doi.org\/10.3389\/feduc.2022.1021798<\/a><\/p>\n\n\n\n<p>Bishop, S. (2019). Managing visibility on YouTube through algorithmic gossip. <em>New Media &amp; Society<\/em>, <em>21<\/em>(11\u201312), 2589\u20132606. <a href=\"https:\/\/doi.org\/10.1177\/1461444819854731\">https:\/\/doi.org\/10.1177\/1461444819854731<\/a><\/p>\n\n\n\n<p>Bucher, T. (2018). <em>If&#8230;then: Algorithmic power and politics<\/em>. Oxford University Press. <a href=\"https:\/\/doi.org\/10.1093\/oso\/9780190493028.001.0001\">https:\/\/doi.org\/10.1093\/oso\/9780190493028.001.0001<\/a><\/p>\n\n\n\n<p>Chen, C. (2016). Forming digital self and parasocial relationships on YouTube. <em>Journal of Consumer Culture<\/em>, <em>16<\/em>(1), 232\u2013254. <a href=\"https:\/\/doi.org\/10.1177\/1469540514521081\">https:\/\/doi.org\/10.1177\/1469540514521081<\/a><\/p>\n\n\n\n<p>Fawns, T. (2022). An entangled pedagogy: Looking beyond the pedagogy\u2013technology dichotomy. <em>Postdigital Science and Education<\/em>, <em>4<\/em>(3), 711\u2013728. <a href=\"https:\/\/doi.org\/10.1007\/s42438-022-00302-7\">https:\/\/doi.org\/10.1007\/s42438-022-00302-7<\/a><\/p>\n\n\n\n<p>Haroon, M., et al. (2023). 
Auditing YouTube\u2019s recommendation system for ideologically congenial, extreme, and problematic recommendations. <em>Proceedings of the National Academy of Sciences<\/em>, <em>120<\/em>(10), e2213020120. <a href=\"https:\/\/doi.org\/10.1073\/pnas.2213020120\">https:\/\/doi.org\/10.1073\/pnas.2213020120<\/a><\/p>\n\n\n\n<p>Labrecque, L. (2014). Fostering consumer\u2013brand relationships in social media environments: The role of parasocial interaction. <em>Journal of Interactive Marketing<\/em>, <em>28<\/em>(2), 134\u2013148. <a href=\"https:\/\/doi.org\/10.1016\/j.intmar.2013.12.003\">https:\/\/doi.org\/10.1016\/j.intmar.2013.12.003<\/a><\/p>\n\n\n\n<p>Selwyn, N. (2010). Looking beyond learning: Notes towards the critical study of educational technology. <em>Journal of Computer Assisted Learning<\/em>, <em>26<\/em>(1), 65\u201373. <a href=\"https:\/\/doi.org\/10.1111\/j.1365-2729.2009.00338.x\">https:\/\/doi.org\/10.1111\/j.1365-2729.2009.00338.x<\/a><\/p>\n\n\n\n<p>Selwyn, N. (2022). Making sense of the digital automation of education. <em>Postdigital Science and Education<\/em>, <em>4<\/em>(3), 287\u2013297. <a href=\"https:\/\/doi.org\/10.1007\/s42438-022-00362-9\">https:\/\/doi.org\/10.1007\/s42438-022-00362-9<\/a><\/p>\n\n\n\n<h5 class=\"wp-block-heading\"><em><strong>Photo licensed from Envato.<\/strong><\/em><\/h5>\n\n\n\n<p><\/p>\n","protected":false},"excerpt":{"rendered":"<p>As I began this research journey into the intersection of parasocial interaction and algorithmic influence on YouTube-based learning, I expected to uncover nuances in content delivery and learner engagement. What I didn\u2019t anticipate was how quickly the inquiry would expand, raising broader questions about trust, visibility, and equity on platforms like YouTube. 
Earlier in my [&hellip;]<\/p>\n","protected":false},"author":299,"featured_media":314,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[9],"tags":[],"class_list":["post-313","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-lrnt526"],"_links":{"self":[{"href":"https:\/\/malat-webspace.royalroads.ca\/rru0282\/wp-json\/wp\/v2\/posts\/313","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/malat-webspace.royalroads.ca\/rru0282\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/malat-webspace.royalroads.ca\/rru0282\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/malat-webspace.royalroads.ca\/rru0282\/wp-json\/wp\/v2\/users\/299"}],"replies":[{"embeddable":true,"href":"https:\/\/malat-webspace.royalroads.ca\/rru0282\/wp-json\/wp\/v2\/comments?post=313"}],"version-history":[{"count":1,"href":"https:\/\/malat-webspace.royalroads.ca\/rru0282\/wp-json\/wp\/v2\/posts\/313\/revisions"}],"predecessor-version":[{"id":315,"href":"https:\/\/malat-webspace.royalroads.ca\/rru0282\/wp-json\/wp\/v2\/posts\/313\/revisions\/315"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/malat-webspace.royalroads.ca\/rru0282\/wp-json\/wp\/v2\/media\/314"}],"wp:attachment":[{"href":"https:\/\/malat-webspace.royalroads.ca\/rru0282\/wp-json\/wp\/v2\/media?parent=313"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/malat-webspace.royalroads.ca\/rru0282\/wp-json\/wp\/v2\/categories?post=313"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/malat-webspace.royalroads.ca\/rru0282\/wp-json\/wp\/v2\/tags?post=313"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}