LRNT523: Activity 3

As a management consultant guiding organizations through AI adoption, I see one thing clearly: how people share and acquire knowledge is changing as fast as the technology they’re trying to adopt. Weller’s (2020) insights from 2002-2011 still ring true, offering lessons that are more relevant than ever. Let’s jump into two that stood out to me.
Curate, Don’t Create
Remember when everyone was obsessed with creating content? Blogging, vlogging, and yes, a bit of oversharing. But then came a revelation: why create when you can curate? Tools like RSS feeds and wikis taught us that the real power lies in gathering and curating content from existing sources. For organizations, this is a goldmine: stop writing long, outdated manuals and start curating the best resources from across the web. It’s faster, fresher, and a whole lot less painful.
Connectivism vs. Corporate Control
Now here’s a challenge we still face: connectivism. This theory suggests that learning happens through networks—connections between people, resources, and information. Sounds great, right? Enter corporate control. Most organizations are still clinging to rigid, top-down learning models while touting learning frameworks like the 70-20-10 model (70% on-the-job, 20% social learning, 10% formal training). But in reality, it’s all formal training! Employees need space to explore, innovate, and connect freely—or else, they’re learning at a 1990s pace in a 5G world.
Takeaway: The future of learning is decentralized. Stop controlling—start curating and connecting.
References
Jasper Art. (2024). Photographic image for blog post: From Control to Curate: The Decentralized Learning Revolution. https://app.jasper.ai/art
Weller, M. (2020). 25 years of ed tech. Athabasca University Press. https://read.aupress.ca/read/25-years-of-ed-tech/section/e69021f2-91b6-4ca4-9d0b-81d3e9748707

Great post, Kirsten, and I’m wondering about your perspective on the impact AI will have through rapid generative content creation at scale and how that will affect the ability of organizations to engage in curation activities. What happens when there’s too much content to curate? Is there a role for establishing trustworthy sources of truth or some decentralized method of qualifying the integrity/validity/value of the content?
Hi Chris. You have hit on a topic that I often think about. Given the statistics on data growth that we often hear in the technoverse, even quantum solutions will not be able to find answers while swimming in pools of noise. That said, focusing AI on narrow tasks can prove effective, and while AI proliferates content, with specific prompting it can also turn down the volume and successfully perform the task of curation. Weller discusses certification of information in his chapters on blogging, analytics, and badging. My prediction, like his, is that we will have fun for a while creating far too much content, then realize we need to step back, rethink what problem we are really trying to solve, and re-apply this technology toward generating content effectively. Thanks for your thought-provoking question.
Kirsten, you make a really good argument when you say that switching from creating material to curating it will save time and improve relevance by utilizing already excellent resources. Additionally, I find your parallel between connectivism and corporate control spot on. The obvious problem I can imagine is the conflict between supporting natural, network-based learning and maintaining hierarchical structures. To fully realize the potential of modern learning, businesses should adopt more decentralized techniques. What potential role does AI, in your opinion, have in further decentralizing education without sacrificing quality?
Hi Weri,
Excellent question! I think the role AI will play depends on the acceptance of AI by a corporation. Employees will be using it – whether they like it or not – and counting on it to fill in the blanks for learning that the organization is not providing for them. The organization can save time, employees can access new information to support daily work, and it does not have to usurp the role of structured corporate learning and development programs. BUT, if the organization is reluctant to relinquish any control…well, then AI will be a thorn in their side.
Kirsten
Hi Kirsten and Chris, I love this conversation as I think it does get, in part, to the crux of it – how much is too much, and what do we need to know in order to do “it” (whatever task/job/role) well? The rapid departure of skilled knowledge holders from the workforce has the potential to amplify AI’s ability to curate corporate knowledge that needs to be “passed along” via formal and perhaps informal training. However, at some point someone needs to decide the various information-sharing thresholds – what needs to be passed forward and why, and what can be left behind (and at what risk/cost?). These questions of knowledge management have been around for a long time, and I am curious to see where they go now given the ‘new technology kid on the block’. Kirsten and Chris, how are you seeing this play out in your settings? What guidelines, if any, are being developed to frame out the thresholds? Interested in your thoughts and looking forward to a continued conversation. Ciao, Elizabeth
It is a challenge to get this right. I’ve seen both ends of the spectrum. Working with corporations, I see some that have a very structured learning path, with modules delivered in person over several years – career impacting. Because these programs usually date from the genesis of the company, the idea of incorporating AI in delivering or developing them is off the table, and so the data set of information is well curated/limited. Often these trainings are seen as a “perk” because they usually meant a trip to HQ and meeting colleagues they likely wouldn’t otherwise meet. These trainings have frightening names like “induction” or “leadership series,” and their singular goal is to ensure that everyone hears the same information the same way.

How much of this information really needed to be passed on is a good question, and how much might have come in a different format…well, these days we spend so much time looking at screens that I’d be loath to see this one activity change. Usually this training is for onboarding or management skills. Everything else an employee needed to learn was on the job. And if there weren’t on-the-job opportunities, then they were googling like crazy to figure out how to get something done. AI can either put these desperate searchers into a tailspin or become their best friend. Rather than guidelines for thresholds or information purging, what I have seen is organizations limiting use by blacklisting applications on corporate devices.
Hey Kirsten!
Another thought-provoking write-up.
In regard to your “Connectivism vs. Corporate Control” section:
I hear you; however, the 70-20-10 and 80-20 models are both extremely successful methods for transferring skilled knowledge in areas such as nursing, trades apprenticeships, and technology engineering. I believe there is a great method for every task to be learned, and I agree, we need to be mindful of top-down approaches and strive to create change if they’re not relevant to the sector we are in. Have you been involved with apprenticeship systems? Experiential learning is all the rage currently. I’d love to hear more of your thoughts on this. Cheers!
My experience with apprenticeship systems is limited to my son being in the ACE program in high school. But I am a massive believer in, supporter, enabler, and facilitator of experiential learning. It is my favourite way to learn, so I’m biased! I was so interested to hear how you are using AI coupled with hands-on learning in your classes. Amazing.