Design principles for my case (WIP)

Definitions

design principles – the considerations that shape how you create a learning environment and how those principles can improve learning

innovation: the introduction of something new; a new idea, method, or device 

impact on learning: the direct effect on the knowledge or skill acquired by instruction

reliance on technology: dependence on technical capabilities to accomplish a task; confidence in a manner of accomplishing a task based on experience

usability: the quality or state of being usable; ease of use; capable of being used; convenient and practicable for use

risk: the possibility of loss or injury; someone or something that creates or suggests a hazard; the chance that an investment will lose value

the value proposition of innovation in design: a clear, simple statement of the benefits, both tangible and intangible, that [a new idea, method, or device in [a plan or protocol for accomplishing learning]] will provide, along with the approximate price … for those benefits

Simplicity

I see simplicity as minimalism – include only what is absolutely needed to contribute to learning. If it doesn’t add value for a learner, don’t include it. Simplicity encompasses language, aesthetics, navigation, and all other elements of the user experience.

  • innovation – including only what is essential will be a new idea for my clients who will likely want lots of “bells and whistles”
  • impact on learning – reduces cognitive load for learners
  • reliance on technology – the technology tool needs to match the complexity of the learning needs
  • usability – I want to test how usable the software is for development and for end users; I am not sure whether it will live up to its promise of usability
  • risk – to learners: not being able to use the elearning platform; to me: offering a solution that doesn’t work; to clients: technology fails, learning outcomes not achieved
  • the value proposition of innovation in design – benefits = better learning outcomes

Customization

All content I develop is unique for my clients and based on their situations; it cannot be pulled from off the shelf resources because even if the technology is off the shelf, their business processes aren’t.

I also need to include design elements that provide contextual clues so learners feel like what they are learning is part of the corporate culture and brand. There should be no cognitive dissonance and they shouldn’t be distracted by elements of the design that aren’t on-brand or what they would expect from their company.

  • innovation – all content is new
  • impact on learning – avoid cognitive dissonance with branding
  • reliance on technology – the technology is off the shelf, not customized
  • usability – how much will the software allow for customization? is it a hard or soft technology?
  • risk: to learners – content included is too out of the box to be of value; to me: software doesn’t allow for level of customization required
  • the value proposition of innovation in design – benefits = unique to the company, contextual – link to adult learning best practice

 

Productivity = Effectiveness + Efficiency

End users are busy and have demanding jobs on top of the training they are being asked to take, so it’s important not to waste users’ time. Users should be able to advance as quickly as possible through the training if they can demonstrate they meet the learning outcomes. Learners should also have the choice to take longer on some concepts than others if needed. This ultimately comes down to learner control and self-paced instruction. Anything that doesn’t need to be included shouldn’t be, because it is inefficient and ineffective. This connects to the design principle of simplicity.

  • innovation – allowing users to be self-paced and have learner control will be new
  • impact on learning – self-pacing and autonomy follow adult learning best practices
  • reliance on technology – using the technology to develop the training will allow for faster development, feedback, revisions; productivity will only be achieved if the technology works the way it promises
  • usability – how quickly can you learn the software, as both a developer and a learner?
  • risk: to learners – the software is hard to use and will embarrass them if they don’t know how to use it, or the training will take longer than it would in person; to me: the solution is not cost effective for clients; to clients: they will lose money/time vs. the costs of delivering in person
  • the value proposition of innovation in design – saves money vs. in person time; users see value in taking the training; may be more receptive to future training

Pragmatism

This design principle is less learner focused and more client focused. I believe that learning solutions need to reflect the reality of a situation and not always the ideal or best practice, depending on the constraints you are working with. Sometimes that means learning is compromised, and that’s a trade-off I will advise my clients of so they can make an informed decision against other variables. The technology solution I propose needs to work for my clients. The risk to them needs to be minimal. The solution needs to meet my clients’ needs.

  • innovation – the new authoring software is a new development method for my clients; e-learning delivery in this format will be new to my clients and some end users
  • impact on learning – there may be trade-offs here, and some end users may have a hard time learning the new technology while also learning how to use e-learning
  • reliance on technology – as long as the client can keep the package, and it works, they don’t care what technology is used
  • usability – will using the software be practical for office workers to use?
  • risk – that the solution will not be pragmatic; this solution minimizes the risks posed by other solutions
  • the value proposition of innovation in design – can be delivered on time and on budget and meet needs

Sustainability

This is where I see the biggest opportunity for innovation, because my clients don’t seem to understand the value of creating something that will have some longevity. They are focused only on the here and now, and are less concerned with updates to the content later on. For example, the materials created for online learning can be accessed over and over again as performance support, but an ILT course delivered in the classroom can only be used once. Clients don’t care if a course is responsive, but they will be happy the course can be converted to responsive once that becomes a requirement, without having to redo content. Using subscription-based cloud software also ensures that clients always have access to the latest functionality for future improvements.

  • innovation – the idea of easy iterations and updates will be new to my clients
  • impact on learning – delivery is not one time but can be delivered over time as performance support tool
  • reliance on technology – will have access to the latest technology features going forward
  • usability – how can usability features be built in at the outset to ensure future updates are easy? how easy is it to make revisions based on user feedback?
  • risk to learners – it’s hard to re-access content; to me: clients are unhappy with needing to engage me for changes rather than making them in house; to clients: that they will have spent more on a solution they never intend to sustain
  • the value proposition of innovation in design – future changes will cost less; reduce errors by learners because they can revisit content

Quality

This is a personal value that I hold. I take pride in my work and want to make sure that whatever I deliver to my clients is above average. I define quality as meeting my clients’ needs, being learner focused, following instructional design, graphic design, technology, and multimedia best practices, delivering by deadline and within budget, and making sure every deliverable is one I would be proud to include in my portfolio. It really is about doing good work. I would need to reflect on how this connects to innovation.

  • innovation – new management will not know what to expect, so they will be surprised by the level of quality I deliver; the technology provides superior quality to PPT
  • impact on learning – a focus on quality will ensure learning outcomes
  • reliance on technology – Rise is a quality product and industry recognized
  • usability – if users can’t use it, the software isn’t a quality product
  • risk – that the software is low quality
  • the value proposition of innovation in design – delivers on the need for a prototype and sets a standard of excellence for the rest of the company – makes clients look good; learners want to work for a company that invests in them

References

  • https://en.wikipedia.org/wiki/Value_proposition
  • https://www.merriam-webster.com/dictionary/design
  • https://www.merriam-webster.com/dictionary/risk
  • https://www.merriam-webster.com/dictionary/usable
  • https://www.merriam-webster.com/dictionary/usability
  • https://www.merriam-webster.com/dictionary/innovation
  • https://www.merriam-webster.com/dictionary/learning
  • https://www.merriam-webster.com/dictionary/impact
  • https://www.merriam-webster.com/dictionary/technology
  • https://www.merriam-webster.com/dictionary/relied

Case development

Below is an iterative version of the case I have chosen to develop based on the two instances of new or renewed learning practices in my organization. It will be republished with more content as I develop it. Comments and questions are welcome during the development process.

  • A title – used to introduce the reader to what this case is a case of.
    • Keywords: online, e-learning, Storyline, Articulate, Rise, cloud-based, authoring tool, content, simplicity, cognitive load, technology training, business process training; digital learning environments (DLE)
    • Working title: Using a cloud-based e-learning authoring tool to design online business process and technology training for office workers
    • Working title: Using Rise as a DLE to integrate business process and software training for office workers
  • A quote – positions your case within a broader human experience. Could be a quote taken from a range of sources (i.e., historical figure, common knowledge, participant in the case, etc.)
    • I love Einstein – find a quote from him that might work
    • Everything should be as simple as it can be, but not simpler (https://quoteinvestigator.com/2011/05/13/einstein-simple/)
  • An introduction – several paragraphs that help the reader to understand why the case under study is important and has significance to the organization in which the case is situated.  The introduction also explains the underlying issues inherent in the case and shares any required background information.
    • Why the case is important
      • moving from traditional classroom-based delivery to self-paced online delivery
      • using a cloud-based authoring tool
      • aligns with principles of adult learning
      • may lead to more effective learning solutions
      • removes barriers to e-learning development
      • other organizations can learn from this organization
    • Why it is significant to the organization
      • conservative and traditional corporate culture
      • has relied on the status quo for a long time and is now seeing a cultural shift
      • this is a pilot project
      • if successful, other areas of the company will want to follow suit
      • potential to save the company money – ROI would need to be calculated
      • if it fails, learners will make mistakes with large industrial customers that could have significant negative consequences; a highly regulated industry, so mistakes would be not only financial from customer impacts but potentially legal from fines/penalties
    • Underlying issues
      • time and resource constraints on the project – it must be delivered by mid-February and the estimated hours exceed 400 – not enough time available
      • opportunity – budget is sufficient to do quality work
    • Required background information
      • Has the client used e-learning before? yes, but it has been clunky with more robust software than needed, and more expensive than it should have been
      • Has the client used the proposed authoring tool before? no
      • How has my client delivered previous technology and business process training? in-class, instructor-led training, typically PPT demos but no active scenario-based participation or practice; some e-learning has been used with practice scenarios, but the e-learning software was difficult for learners to use, which reduced the effectiveness of the training
      • vendor-provided software training is available but is off the shelf, not customized to the company’s business processes; client thinks the training materials are of sufficient quality (basically annotated screen captures)
      • a gap analysis has been conducted on what is changing; the expectation is to train to the gap – current vs new
      • previous experience working with this client has been positive but there has been a change in management that could affect client relationship – future work is at stake
  • Case Narrative – shares the story of the case and the evidence.  This section is descriptive and forms the bulk of the case.  It could include charts, pictures, graphics, statistics, etc.
    • Who: learners are all office workers and already have some familiarity with computers; assumption that they are computer literate to level X (need to check Stats Canada levels again); the number of learners taking the training is approximately 25; learners are likely taking the training because they have to, not because they want to
    • What: training is for customer relationship management software; there is a need to integrate the vendor-provided training with business-specific process in a way that is seamless
    • When: training will have a limited shelf life of about 3-6 months before it is outdated and requires refreshing because IT/processes are continually changing
    • Where: training will be delivered online at learners’ computers.
    • Why: the company is moving from multiple IT platforms in multiple business units to one platform across all business units – ensures continuity across the company, reduces risk, and helps manage upgrades/infrastructure long-term
    • How: Learners will use either desktops or laptops. Does not require mobile optimization. Client wants to own the file on its intranet but does not care about how it is created, as long as they own it and it works; proposed authoring tool is Articulate Rise
  • Discussion – analyzes the case narrative and helps the reader to understand the learning environment innovation from either a new or renewed perspective
    • the challenge will be to use an online delivery tool to deliver the training in a way that lets learners focus on the subject matter at hand and not on learning the online tool.
    • The online tool needs to be intuitive, user-friendly, simple, easy to access, navigate, load, stop and return to
    • All design decisions related to use of the tool need to reduce cognitive load
    • Design decisions need to integrate content that I have no say over; it needs to repurpose vendor-supplied materials and client-supplied materials
  • Questions – prompts for the readers to consider or questions for the readers to answer for the case writer to help move the case forward or further develop the situation described in the case
    • How will using Articulate Rise enable or support e-learning?
    • How can I reduce the cognitive load for learners?
    • What can I do to engage uninterested learners?
    • How can I layer content for all computer abilities? Or should I assume everyone is starting at the same baseline?
    • What should I assume is already known? How can I determine that?
    • Even though my client doesn’t care about sustainability now, how can I put measures in place to make updating the content easier and cost-effective later on?
    • How will I measure success? How will end users measure success? How will my clients measure success?
    • What will be different about e-learning that will make it better than classroom training? How can I make sure e-learning is more effective than classroom training?
    • What kind of changes will I need to make to the process of developing content for e-learning vs the classroom?
  • Resources and References – used to support the case and provide additional information
    • Link to articulate rise website

New or renewed learning practices

Below are “two instances of new or renewed learning practices in [my] organizational context.”

Key definitions:

New: “having recently come into existence”; “having been seen, used, or known for a short time”; “being other than the former or old”; “beginning as the resumption or repetition of a previous act or thing”; “different from one of the same category that has existed previously”; “of dissimilar origin and usually of superior quality”; synonyms: recent, modern, novel, unfamiliar, fresh (all quotes from the Merriam-Webster dictionary)

Renew: “to make like new: restore to freshness, vigor, or perfection”; “to restore to existence”; “to make extensive changes in”; “to do again”; “to begin again”; “to grant or obtain an extension of or on”; synonyms: regenerate, revive, rebuild, repeat, resume, replace, replenish (all quotes from the Merriam-Webster dictionary)

Learning practice: something that enables or supports learning (e.g., problem, prototype, method, tool)

Organizational context

Learning practice #1: Read and agree online compliance training

One client and one potential client have both recently complained to me about the current state of their compliance training. In its current form, their compliance training consists of e-learning modules where content is copied and pasted from a Word document into a PowerPoint file. Learners must click through each slide to reach the end of the module, at which point they need to acknowledge, using a digital signature of some kind, that they have been provided with the information and agree to follow the directions within it. Though the e-learning modules cover off organizational compliance requirements, they do not encourage actual learning of the material.

There is an opportunity to improve the learning that occurs during online compliance training. Specifically, the learning practice would be integration of interactivity into the learning modules. The learning practice could be considered new (other than the former way of doing things, different from the same category that exists) or renewed (make extensive changes).

Learning practice #2: Deliver technology training in person

A client recently requested I develop an instructor-led training course for field-based workers, which would show them how to use new software required as part of their job. The instructor-led format led to sub-optimal results for a number of reasons, one of which was that learners kept skipping ahead with their laptops, wanting to go at their own pace rather than follow along at a slower pace with the instructor. Based on this experience, I suggested to my next client that technology training might be better delivered online. I recommended that it would be easier to incorporate the business context with the technology in an e-learning module than attempt to incorporate the technology with the business context in a classroom. Most clients default to classroom-based training because it is what they know.

There is an opportunity to shift technology training from classroom-based to on-demand e-learning, for the benefit of learners. Specifically, the learning practice would be integrating business process and technology training using an online platform. The learning practice would mostly be considered new for my client (recently come into existence, other than former delivery methods, different from former technology training, of dissimilar origin and (hopefully) of superior quality), but it could also be considered renewed, since they have previously delivered technology and process training to their workers, and this would simply be a change in how it is delivered.

Reflection

While I think learning practice #1 would be more manageable for the scope of this assignment, as the focus is solely on interactivity, I would benefit greatly from exploring learning practice #2, because it is a real-world situation I am facing at the moment, and the assignment could be used to strengthen my recommendations and actual deliverables for my clients. I have also already completed an assignment on compliance training so would like to stretch myself with a new topic.

I am concerned the learning practice I identified is a bit fuzzy. Is the learning practice about content integration (i.e., connecting two pieces of disparate content) or about training delivery (i.e., moving offline content online)? Technically it is both, but for the scope of the assignment I will need to narrow it down. I will noodle on that a bit before deciding.

Innovation and change: Changing how we change

Below are my annotations for Innovation and change: Changing how we change by Dron (2014).

Major Theory

  • The terms soft and hard technologies are confusing when compared with software and hardware; I don’t understand why academics insist on creating and using obscure terminology
  • “What makes a technology softer or harder is the degree to which humans are compelled to, may, or should make creative choices” (p. 242). Personal example: my boyfriend and I wanted to create a photobook. I preferred the more restrictive “hard” technology because it enforced graphic design principles, while my boyfriend lamented that its restrictions did not enable his creativity

Open Questions

  • “It becomes increasingly difficult to find the most effective and relevant OERs” (p. 248) – perhaps the educational sector needs a company like Google to help rank and prioritize content. Do ResearchGate or academic journals already fulfil this function?
  • “The use of adaptive hypermedia (AH) in which a single set of resources can be adapted to many different user needs” (p. 249) – repurposing content is done all the time in marketing by adapting content for different media platforms; the argument that AH is difficult to produce can be flipped to argue that repurposing the same content multiple ways is faster than creating new content for each purpose.
  • “It is difficult to improve flexibility without also increasing difficulties or at least complexity for learners” (p. 249). The question for designers should be: what will be most effective for learners? An inflexible authoring tool for designers may create the simplest, easiest solution for learners. There are always trade-offs.
  • “Their [AH] cost effectiveness remains open to question” (p. 249) – Perhaps designers need to consider how user generated content can be used to reduce costs.

Implications for Practice

  • “If it is assumed that change is a good (or at least a necessary) thing, then it is important that an organization designs the processes and procedures to support it” (p. 251); this is a big assumption that not everyone may agree with. While change may be inevitable, not everyone will agree that it is good or necessary. Many organizations are resistant to change and would prefer to stick with the status quo.
  • “even the most well-meaning centralized IT departments are bound by the need to cater for everyone to produce something that is, inevitably, a compromise for some, if not all, who wish to use it” (p. 256). A great example of this is the limitations imposed by RRU on the use of WordPress plugins and other functionality. Additional functionality would benefit learners but could create complications for the IT department.
  • “There are no simple answers to this problem apart from careful adherence to standards (as they emerge) for interfaces, coding, and design” (p. 257-258). I would suggest taking a page from the publishing world. Publishers have been relying on editorial style guides for decades, if not longer, to help ensure consistency and enforce standards.

References

Dron, J. (2014). Innovation and change: Changing how we change. In O. Zawacki-Richter & T. Anderson (Eds.), Online distance education: Towards a research agenda. Athabasca, AB: AU Press.

Assessing d.learning: Capturing the journey of becoming a design thinker

Below are my annotations for Assessing d. learning by Goldman et al. (2012).

Introduction and background

Research question: “How can we understand what is learned in design thinking classes, and how might assessments contribute to that process in authentic and helpful ways?” (p. 14)

From mindsets to mindshifts

Definition of mindshift: “active shifts that students are making …” and “re-synthesis and reorientation of their worldviews, routes, and propensities in problem-solving” (p. 15)

Consider fixed versus growth mindset

  • Question: how is a growth mindset different from a mindshift?
  • Answer: they are very similar; a mindshift is “the active process of developing a mindset” (p. 22)

Four key mindshifts:

  1. Human-centred – “focus on empathy for others” (p. 16); consider needs of other people, not your own needs
  2. Experimental – “everything may be considered a prototype” (p. 17); be prepared to evolve your ideas
  3. Collaborative – collaboration is needed for problem-solving
  4. Metacognitive – being aware of what stage of the design thinking process you are in

Needs

  • Students need subject matter knowledge and skill to innovate
  • 21st century skills = cultural shift, what skills are needed to succeed today
  • Current assessment methods are lacking; new assessment/metrics needed to measure 21st century skills — see Silva (2008) for details

Theoretical perspectives

This section discussed a number of theoretical perspectives but lacked the structure and focus I needed to make sense of them in any meaningful way. Three key points stood out:

  • Design thinking = problem solving – this finally clicked for me
  • “based in experiential and sociocognitive view of learning” (p. 19)
  • Constructionism (Papert) – people learn by creating tangible outputs of ideas

Research methods and analysis

This section described activities taken to iterate the assessment tool prototypes discussed in the next section. It would be worth revisiting if I were ever to create a similar research study, but otherwise could be skipped over.

Assessment tool prototypes

Four assessment tool prototypes were developed:

  1. The reflective assessment rubric – includes mindshifts listed above; three levels of expertise; related skills (interviewing, prototyping, synthesis, persistence, resilience, adaptability, risk-taking, brainstorming, bias toward action, storytelling, process vocabulary, collaboration); outcome: rubric is inappropriate for assessing mindshifts
  2. The windaboolah experiment task – performance based task; students were asked to participate in a design challenge, to design an ambiguous, made-up item with specific criteria; this allowed for assessment of human-centred mindshifts based on level of experience with design thinking
  3. The designing twenty-first century learning spaces task – reflection based task; assesses human-centred mindshifts; requires guidelines for scoring responses
  4. The assessment dashboard – visual online dashboard to document learning progress as it happens and over time, including relevant skills, process stages, portfolio of work

I’m not sure I would use any of these activities or assessment tool prototypes in my own work. They lack the rigor and practicality my clients would expect of an assessment tool.

Summary and discussion

The majority of the paper and therefore this section focused on human-centred mindshifts. I would have liked to learn more about the other mindshifts identified as well.

References

Goldman, S., et al. (2012). Assessing d.learning: Capturing the journey of becoming a design thinker. In H. Plattner, C. Meinel & L. Leifer (Eds.), Design thinking research: Understanding innovation (pp. 13-33). Berlin: Springer.

How to write the case study

Below are my annotations for the article How to write the case study by Monash University (2017)

The article identifies two types of case studies: the analytical approach and the problem-oriented method. The analytical approach could be considered a post-mortem of sorts, while the problem-oriented method is focused on solving a current issue. There are other types of case studies, such as those used for marketing, self-promotion, or applications for awards or professional designations.

The article distinguishes between a case (the situation) and a case study (analysis or problem-solving).

Elements of the case study are identified:

  • Synopsis/Executive Summary
  • Findings
  • Discussion
  • Conclusion
  • Recommendations
  • Implementation
  • References
  • Appendices (if any)

It seems odd the conclusion would be reached prior to the recommendations or implementation section. I wonder why that is?

Question: how long is a case study supposed to be?

The additional sources cited in the article were difficult to navigate and understand.

Levelling Up: An Empathic Design Approach

Tasked with a design challenge in course LRNT524: Innovation, Design, and Learning Environments at Royal Roads University, we (Gavin S. and Amber M.) partnered up to participate in Stanford’s d.school design thinking process (Stanford, 2016). Together, we reached a solution that encourages learners in our respective organizations to take intellectual risks and be engaged in their learning community.

Context

Through design thinking, we discovered our organizations and learners have some crossover.

Our organizations

Both organizations require ongoing, hands-on, skills-based training for learners as well as tracking for regulatory compliance. Both struggle with inconsistent instructor delivery, limited budgets, and difficulty coordinating in-person sessions.

Our learners

Learners hold safety-sensitive positions in both organizations (Amber’s are trades workers; Gavin’s are volunteer firefighters). Learners are geographically dispersed and located in remote communities, have different levels of computer savvy and technological comfort, and work variable hours that make attending scheduled training a challenge.

In-person training is critical for our learners, so we focused on prototyping an online learning community that would drive offline (real life) engagement and intellectual risk-taking.

Empathic Design

Our prototype incorporated four layers of sensitivity found in empathic design (Mattelmäki et al., 2014). Our learners have various skill sets, motivations, and needs we needed to account for (sensitivity toward humans) while ensuring content was relevant, authentic, and problem-based (sensitivity toward design) through real-time and in-person delivery that drove teamwork (sensitivity toward collaboration) and used technology appropriate for all levels of experience (sensitivity toward techniques).

Prototype

One prototype component used was gamification, with an approach similar to consumer rewards credit cards (the more it’s used, the greater the reward). Below are four features we came up with:

Learning target

Organizations assign a target value (e.g., 10,000 virtual points) to each learner’s annual learning plan. To demonstrate they have met their learning plan, learners must reach the target value. They can accumulate points by participating in learning activities. This feature addresses the organizational need for compliance tracking and sets expectations for learners, reflecting Gagné’s second event of instruction, expectancy (Thomas, 2010).

Flexibility of choice

Learners decide how to earn points by choosing which activities to participate in (e.g., attend in-person training events, respond to peer questions in discussion forums, or share lessons learned via blog posts) based on geographic or time availability. This feature uses self-directed learning (Merriam, 2001, as cited in Vann, 2017) and personalization (Bates, 2016) to mitigate obstacles to learner engagement.

Engagement and intellectual risk-taking

Each learning activity is assigned a point value based on the engagement and intellectual risk-taking it requires. Attending a training event might be worth 1000 points, while asking or responding to questions might be worth 250. Based on social constructivism (Anderson, 2016), this feature directly addresses our design challenge.

Motivation

Progress is tracked with points. At key milestones or achievement levels (e.g., 2,500, 5,000, 7,500, and 10,000 points), learners can redeem points for real-life, tangible rewards. Each achievement level offers more reward options, allowing learners to “level up,” and learners can choose the rewards most meaningful to them. This feature draws on motivation theory (The RSA, 2010) and behaviourism (Ertmer & Newby, 1993) to encourage continued learner participation.
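The point mechanics described across these four features can be sketched in a few lines of code. This is a minimal illustration, not part of the prototype itself: the activity names and some point values (e.g., for blog posts) are assumptions made for the example.

```python
# Illustrative sketch of the learning-target and milestone logic.
# Activity names and point values are hypothetical examples.

MILESTONES = [2500, 5000, 7500, 10000]

ACTIVITY_POINTS = {
    "attend_training_event": 1000,  # higher engagement and risk-taking
    "answer_forum_question": 250,
    "write_blog_post": 500,         # assumed value, not from the prototype
}

class LearnerPlan:
    def __init__(self, target=10000):
        self.target = target  # annual learning-plan target (compliance)
        self.points = 0

    def log_activity(self, activity):
        """Award points for a completed learning activity."""
        self.points += ACTIVITY_POINTS[activity]

    def milestones_reached(self):
        """Achievement levels hit so far; each unlocks a reward tier."""
        return [m for m in MILESTONES if self.points >= m]

    def plan_met(self):
        """True once the learner demonstrates the annual target is met."""
        return self.points >= self.target

plan = LearnerPlan()
plan.log_activity("attend_training_event")
plan.log_activity("attend_training_event")
plan.log_activity("answer_forum_question")
print(plan.points)                # 2250
print(plan.milestones_reached())  # []
print(plan.plan_met())            # False
```

In a real implementation the point values would be set by each organization, and redemption would deduct or flag points at each tier; this sketch only shows how a single target value can serve both compliance tracking and flexible learner choice.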

Anticipated Challenges

We empathize with the struggles our learners face and know our prototype does not address all of them. An online community that is device- and platform-agnostic might remove technology barriers, live-streaming or on-demand services might mitigate geographic obstacles, and events in multiple locations might help avoid scheduling conflicts. Even so, we acknowledge our prototype is in its infancy. Do you have suggestions for testing our prototype or improving its features? We welcome your feedback in the comments.

 

References

Anderson, T. (2016). Theories for Learning with Emerging Technologies. In G. Veletsianos (Ed.), Emergence and Innovation in Digital Learning: Foundations and Applications (p. 38). Edmonton, AB: Athabasca University Press. https://doi.org/10.15215/aupress/9781771991490.01

Bates, T. (2016). Choosing and using media in education. In Teaching in a Digital Age: Guidelines for designing teaching and learning (p. 334). Vancouver, BC: Tony Bates Associates.

Ertmer, P. A., & Newby, T. J. (1993). Behaviorism, Cognitivism, Constructivism: Comparing Critical Features From an Instructional Design Perspective. Performance Improvement Quarterly, 6(4), 50–72. doi:10.1111/j.1937-8327.1993.tb00605.x

Mattelmäki, T., Vaajakallio, K., & Koskinen, I. (2014). What Happened to Empathic Design? Design Issues, 30(1), 67–77. https://doi.org/10.1162/DESI_a_00249

Merriam, S. B. (2001). Andragogy and self-directed learning continue to be important to our present-day understanding of adult learning. New Directions for Adult and Continuing Education, 89, 3-13. doi:10.1002/ace.3

Stanford University Institute of Design. (2016). A virtual crash course in design thinking. Retrieved from http://dschool.stanford.edu/dgift/

The RSA. (2010, April 1). Drive: The surprising truth about what motivates us [Video file]. Retrieved from https://youtu.be/u6XAPnuFjJc

Thomas, P. Y. (2010). Learning and Instructional Systems Design, 181–290. Retrieved from http://uir.unisa.ac.za/bitstream/handle/10500/4245/04Chap 3_Learning and instructional systems design.pdf

Vann, L. S. (2017). Demonstrating Empathy: A Phenomenological Study of Instructional Designers Making Instructional Strategy Decisions for Adult Learners. International Journal of Teaching and Learning in Higher Education, 29(2), 233–244. Retrieved from http://www.isetl.org/ijtlhe/

 

Constructivist views of instructional design

Below are some of my annotations and comments on Thomas’ (2010) chapter on learning and instructional systems design as part of a doctoral dissertation.

Introduction

The terms instructivist and instructivism were new to me. A definition explaining how instructivism differs from behaviourism would have been helpful, especially since it was such a key element in the discussion of how instructional design is moving away from instructivism and toward constructivism.

History of Instructional Design

Having already studied the history of instructional design, much of what was discussed was already familiar to me. The list of instructional design models developed by Ryder would have been a useful appendix or chart to include, especially since no reference list was provided.

Conceptual Representation of ID

The key finding in this section is that there is no one agreed-upon definition of instructional design. A definition I disagreed with was by Seels and Glasgow (1998), who stated that “learning should not occur in a haphazard manner” (as cited in Thomas, 2010, p. 186). I believe there needs to be a distinction between learning and training. Learning occurs all the time in real-world situations, while training is a more formal structure for guiding that learning.

Instructional Design Models and Pedagogical Models

Sections 3.4 and 3.5 both discuss instructional design models. One question I had when initially skimming the article was the difference between instructional design models and pedagogical models. The answer is that instructional design focuses on the development process for a learning solution, while pedagogical models focus on the learning strategies that underpin the solution.

I was already familiar with the ADDIE model, rapid prototyping, Gagne’s nine events of instruction, and Merrill’s models, but the other models discussed were new: universal systems design, DC model, MRK model, Smith and Ragan’s model, and dynamic ID model.

Prior to reading this paper, I was unfamiliar with almost all of the pedagogical models discussed: Reigeluth’s Elaboration Theory, the ICARE Model, the ASSURE Model, Mayes’ Pedagogical Framework, the seven principles for good practice in online courses, and the blended learning models. I had at least heard of the ARCS model and project- and problem-based learning.

Summary of Features and Criticisms of ISD Models

The author’s key finding after reviewing the instructional design models was that, “even though each model had some differences, they were all basically similar in their need to provide certain components that are common to instruction” (p. 230). This was disheartening after trying to make sense of all the different models just discussed. The comment that traditional models are often criticized for being more process- than people-focused rings true to me for corporate settings. My experience has been that most corporations are focused on business outcomes and want to align corporate training with business processes and tasks.

Beyond Traditional ISD

This was a short section with only one paragraph. As a reader, I’m not sure what my takeaway was supposed to be.

Core Foundations of the Grounded Design

This section identified five considerations for designing learning, regardless of delivery method: psychological, pedagogical, technological, cultural, and pragmatic. While I agree all of these considerations are important, my approach is largely pragmatic. Whenever I design a learning solution for clients, I ask myself: what is realistic given my client’s needs, their organizational culture, and their resources?

Constructivist Views and Constructivist Design Models

Sections 3.9 and 3.10 both discuss constructivism. In general, constructivism would be problematic for my clients. Corporations are accountable to their shareholders, so they need evidence that their learning and development budgets have been well spent. Statements like “Constructivist strategies are often not efficient, resulting in a trial-and-error approach to the performance in the real world” (Merrill, 1997, as cited in Thomas, 2010, p. 265) make constructivism a hard sell, even when it would be best for learners.

Alternative Instructional Design Approach

After a heavy emphasis on constructivism in the previous section, this section provided a more balanced view that acknowledged cognitivism and behaviourism still have a role to play in learning. The comment, “The problem is in selecting the most appropriate one to apply in a particular real setting” (p. 268), gets at what I believe is the core role of an instructional designer: selecting the right approach for a given situation to maximize learning outcomes.

Blended Learning: Development of Design Criteria

I initially thought this section would be of limited use to me because it focused on blended learning, while my area of interest is fully online learning, but most of it turned out to be relevant. Linking interface design to Gagne’s “gaining attention” event of instruction was an interesting connection, one I likely would not have made on my own.

It was re-affirming to read, “meaningful feedback improves performance” (Driscoll, 2002, as cited in Thomas, 2010, p. 279), as I had received criticism from one of my clients that my instructional design provided too much learner feedback (the client requested I limit my feedback to “correct” or “incorrect”).

Role of Instructional Designers

Most of this section referred to instructional designers in higher education and referenced the role of teachers, students, faculty, lessons, curriculum, etc., so this section was not directly relevant to my professional practice or area of interest. The one statement that jumped out at me was that instructional designers need to fulfill multiple roles, including student, reviewer, tester, and project manager (p. 285).

Success Factors for Technology Integration

Having worked on multiple enterprise resource platform integrations in an organizational change management capacity, I agreed with everything included in this section but did not encounter any new or interesting findings. Most of the content seemed to state the obvious. This is more a reflection of my existing knowledge and experience than a comment on the content the author chose to include. Commentary on higher education was of little interest because it would be speculation as to whether the findings also apply in a corporate setting.

References

Thomas, P. Y. (2010). Learning and instructional systems design. In Towards developing a web-based blended learning environment at the University of Botswana (Doctoral dissertation).

 

Good ol’ ADDIE

Photo credit: Educational Technology

I’m already familiar with the ADDIE model, but I’ve never considered it through the eyes of a learner – until now. In his article “Is the ADDIE model appropriate for teaching in a digital age?”, Bates (2014) provides a quick summary of the ADDIE instructional design model and explores its benefits and limitations. (If you aren’t familiar with ADDIE, it’s a model used to guide instructional designers through creating learning solutions. The acronym stands for Analyze, Design, Develop, Implement, Evaluate.)

Since all of my work is in a business setting, I welcome Bates’ (2014) reference to ADDIE’s connection to corporate e-learning and training and can see how ADDIE would remain popular given its roots in behaviourism. In my experience, corporations have a strong preference for behavioural learning interventions (previous blog post).

I absolutely believe it’s important to address learner needs and characteristics when developing learning solutions, so I am surprised most of the benefits of ADDIE Bates mentions are for corporations or other entities; he doesn’t talk about any benefits for learners. In contrast, I have always thought the model does a great job of reflecting learner needs. The Analyze phase provides the foundation for all other phases, so even though the learner isn’t always mentioned explicitly, consideration for learners is still present.

One shortcoming I do find with the ADDIE model is that there is no link to performance support or follow up for learners after formal training is complete. Learners don’t stop learning once training is done, and learning transfer isn’t explicitly mentioned in ADDIE.

I’ve also found ADDIE is not entirely realistic. As Bates mentions, following ADDIE can be expensive and redundant, and it’s not always possible to be as thorough as the model requires. Budget constraints and time crunches often mean combining steps or taking shortcuts rather than following each phase sequentially. Depending on the size of the project, ADDIE doesn’t always make sense for my professional practice.

It is disappointing that Bates criticizes ADDIE without providing a new model for instructional designers to consider. I expected him to mention the Successive Approximation Model (SAM), at least in passing, because SAM seems to be replacing ADDIE in the business world. SAM focuses on rapid prototyping and development, which addresses Bates’ concern that ADDIE is not flexible enough for modern challenges.

References

Bates, T. (2014, September 9). Is the ADDIE model appropriate for teaching in a digital age? [Blog post]