Learning Designers! Boring Bullets are NOT Your Worst Enemy!

Congratulations! You can now play games at work!

This was the underlying theme of a compliance course that I recently had the opportunity to review. Designed as a highly engaging game, the course set you off on a hunt for clues to find an elusive parrot, traveling to various cities and learning about compliance along the way. As you unearthed a clue, a slice of compliance information was revealed, followed by a short quiz on that information. Your performance in the quiz earned you gadgets that helped you in your search for more clues.

As a game, the piece struck all the right notes. I got involved right from the word ‘go’. I wanted to find more clues. I wanted to collect more gadgets. I wanted to win.


Throughout the experience, I kept thinking “but what about learning?”. Because nowhere on my quest did I ever get to think about compliance. My main focus in answering the quiz questions was to earn gadgets on the way to unearthing my next clue.

This was probably because the core content was still presented as just that – content. The quiz was tarted up to be engaging, as a means to an end, and not an end in itself.

And, though the course told me at the beginning that “finding more clues will give you access to more compliance information”, I kept asking “why?”. Nowhere during the experience could I connect with why the compliance information was important to me, or how it was going to help me do my job better. Neither the content nor the quiz questions had anything to do with application. Written in typical legal-speak, the content portion of the course explained the ‘what’, while entirely missing the ‘why’ and the ‘how’.


Now, this is typically what Cathy Moore calls “putting lipstick on a pig”. If a course is not intrinsically interesting, then adding bells and whistles in the form of games and adventures can only serve to improve its cosmetic value. The core content remains in the same state – boring and ineffective.

I can understand why this happens. The word ‘compliance’ can conjure up images of boring, banal courses that spew out bullet point after bullet point of lackluster content. And, it’s exciting to think about making the course game-like, to add colors and audio and video, and challenges and what not, just to make it more ‘engaging’.


Cammy Bean refers to this as “clicky-clicky bling-bling” – a course that ranks high on the ‘bling’ thing, but fails to address its basic mandate of learning and performance improvement.

Bling is not a bad thing, per se. It’s like great packaging. Who isn’t attracted to products that are packaged well? But when applied to a bad product (one that doesn’t work as intended), it becomes “clicky-clicky bling-bling”.


What can we do to steer clear of “clicky-clicky bling-bling”, and make sure the course addresses the primary need it was commissioned to address? Here are some strategies I think we can follow:

1. Start by defining the goal, or the ‘What’s In It For Me’ (WIIFM) for the business. This is what will help us justify why the course needs to exist in the first place, and align the course design with what the business needs.

2. In addition to the ‘what’, make sure to amply address the ‘why’ and the ‘how’ for learners. This will help them understand that the game (or concept) exists to support the course, and not the other way around.

3. Put the learners in a context that closely mirrors real life, and have them make the decisions that they would have to make in real life.

4. Make each of the choices (the decisions that they would have to make in the scenario / context above) plausible, ensuring that they think before making a choice.

5. Let the scenarios and choices challenge learners the right way. Have them think hard about the situation and the consequences they would face as a result of their decisions, not about parsing the sentence construction to figure out what it means before making a choice. Which leads me to my next point…

6. Write the content, and the quiz, in plain English. For example:

“When you see a violation, please report using the third-party hotline. You can do so confidentially and anonymously, which means you don’t have to fear retaliation.”

…is a lot simpler and sounds way better than:

“No employee will be retaliated against for reporting any known or suspected compliance violation. The organization makes reporting violations easy by utilizing an unbiased third-party vendor who receives Hotline calls. These calls are taken both confidentially and anonymously.”

7. Allow people a way out. I can’t overstate the importance of this. As learning designers, we might find a game or a concept very interesting, but it’s possible that learners consider it a dud. Don’t make them suffer through it. Allow them to take a well-designed assessment, and if they want to review information, let them go through a PDF with all the content. Because at the end of the day, what matters is that they learn and can perform better, not that they win the game.

What other strategies can you think of to avoid clicky-clicky bling-bling?

Beyond Effective E-Learning – Changing Habits, Not Just Behaviors

“We absolutely nailed it! Folks are loving this!” exclaims Terry.

“Reporting rates have increased… incidents have gone down… people are really using strong passwords! Management couldn’t be happier,” chips in Janet.

Terry and Janet, both learning designers in the corporate world, are discussing their freshly-launched security course, and how it has become an overnight sensation. They have every reason to be happy… learners and management have been singing its praise ever since it was ‘released’.

And why not? Terry and Janet have done their homework, diligently working with stakeholders to design a meaningful course. It is chock full of practice activities, and does a good job of both explaining the ‘how’, and convincing learners of the ‘why’. They have also spent considerable effort to ensure that it’s produced well, with all its associated bells and whistles, hence the learner love they are currently basking in.

Let’s fast forward a few months and see what happens. After all, the success of any initiative has to be measured by long-term adoption, and not just immediate outcomes, right?

Six months down the line, the number of security lapses has increased. Drastically. People seem to have reverted to their old ways.

Janet walks by a section of the office she rarely visits, and is dismayed to find passwords written on post-it notes stuck to computer screens. She calls up the IT department contact she was in touch with while developing the course, and he informs her that average password strength has dropped to ‘moderately weak’ from last month’s ‘reasonably strong’.

What just happened?

When the course was newly launched, it was so impactful that it motivated people to immediately go back and make their passwords stronger, and to proactively look for any apparent lapses in security and report them. Hence the initial spike in the number of lapses being proactively reported, and the reduction in security incidents.

This continued for a while, until the effects of the course ‘wore off’. In the absence of a system of checks and balances to keep people exhibiting these behaviors, they slowly reverted to their old habits, purely because they lacked the motivation to continue. It was simply too much effort.

The course, in the form of a single event, was a humongous success in convincing people of the need for better security, and in providing them with the knowledge and skills for the purpose. It was therefore able to get people to demonstrate the desired behaviors. However, commitment faltered in the face of day-to-day work pressures, as happens when the priority assigned to anything ‘non-trivial’ goes down. And since there was no ongoing campaign to convert the new behavior into a long-term habit, the initiative failed in the long run.

So, how does an organization ensure that newly learnt behaviors become sticky enough to turn into habits? Here are a few pointers we can keep in mind:

1. Get started

Experts advise that the first step to habit formation is to just get started. Terry and Janet have already achieved this with a well-designed, engaging course that targets the right behaviors. Employees were motivated enough to strengthen their passwords, and to voluntarily come forward and report what they thought were security lapses.

2. Provide constant reinforcement

This can be done using both intrinsic and extrinsic elements.

Intrinsically, people can be reminded at regular intervals of the need for better security, and how it indirectly impacts them as individuals, and the organization as a whole. Case studies, stories, quizzes, etc. can work well in this regard.

On an extrinsic note, employees can be rewarded for having the ‘strongest password of the month’ or for reporting the ‘highest number of lapses’.

All of this can be done online, offline, or as a combination of both, which should keep people motivated to continue the streak, and keep security top of mind.

3. Use social proofing for validation

Identify secret ‘champions’ of security to further the cause, and give them specific tasks, such as discreetly starting a conversation about security at the watercooler or on the company’s Intranet portal, or sharing the story of a security incident at another company that led to major losses.

Establish an online system where employees can display a ‘Security’ badge, which their colleagues can vote for. The employee who gets the maximum votes will have the strongest badge, and so on.

4. Design the environment for stickiness

Make it impossible, or at least difficult, to have weak passwords. Design the system in such a way that it does not accept passwords below a certain level of strength. Display posters at every work area, or possibly on every desk, constantly reminding people of the need for better security.
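Designing the system to reject weak passwords, as described above, could be sketched in code. This is a minimal, hypothetical strength gate, not an organizational standard; the specific rules and the 12-character threshold are assumptions made purely for illustration:

```python
import re

# A minimal sketch of a strength gate, assuming a hypothetical policy:
# at least 12 characters, with a lower-case letter, an upper-case letter,
# a digit, and a symbol. Rules and threshold are illustrative only.
def is_strong_enough(password: str, min_length: int = 12) -> bool:
    checks = [
        len(password) >= min_length,           # long enough
        re.search(r"[a-z]", password),         # has a lower-case letter
        re.search(r"[A-Z]", password),         # has an upper-case letter
        re.search(r"\d", password),            # has a digit
        re.search(r"[^A-Za-z0-9]", password),  # has a symbol
    ]
    return all(bool(c) for c in checks)
```

A sign-up or password-change form would simply refuse to proceed until this check passes, making the weak-password behavior impossible rather than merely discouraged.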

Ultimately, while a well-designed course can achieve needed behavior change, for the change to sustain over a period of time, what we need is habit change.

What other factors can we consider for achieving habit change in the long run? I welcome your inputs.

My Purpose for Working Out Loud (#WOLWEEK)

Today is Day 1 of the International Work Out Loud Week, and here’s my first ever go at working out loud.

For those of you who are uninitiated, International Work Out Loud Week, or #WOLWEEK, is a week dedicated to the practice of working out loud. It is designed to encourage pros and novices alike to share their work openly and publicly. Sharing work publicly allows people to engage in conversation, contribute to and improve each other’s work, and grow as a community. Working out loud has proved to be a highly effective practice for individual and organizational growth.

The structure of #WOLWEEK is as below.

The first task is to share a purpose on Day 1.

So here’s my purpose – get the hang of amateur, handheld video.

The virtues of video (not the slick, professional variety, but the amateur, handheld, explainer type) are well known indeed. But I’ve always had some trouble or the other with it, less with technology and more with getting in front of the camera.

So, there you go… that is my goal for this week.

Why I Still Love Bloom But Not His Verbs

I have a confession to make.

I’ve sat in scores of meetings with project stakeholders, painfully agonizing over the verbs to use for defining the objectives of a course. Should we use ‘Develop’ or ‘Construct’? Which one is better – ‘Describe’ or ‘Explain’? Would it be more appropriate to say ‘Classify’ or ‘Categorize’?

You get the idea. After all, objectives define the boundaries of a course, and we don’t want to get them wrong, right?

Yes, but…

It is absolutely important to define clear objectives. For designers, they set the boundaries of a learning intervention and decide its level of sophistication. For learners, they help (when presented well) set expectations at the beginning of a course. The catch lies in the phrase “when presented well”.

Benjamin Bloom and his colleagues provided the verbs as a helpful means of deciding on the action to describe while defining objectives. However, we often fall prey to the notion that a verb absolutely has to be chosen from within the confines of the taxonomy level the course represents, failing which it can mislead learners.

For example, take the verb ‘Describe’. It appears on many levels of the taxonomy – Knowledge, Comprehension, and Evaluation. So, by just looking at the verb, you cannot understand at which level of the taxonomy the course belongs.

I agree that the taxonomy is complex. So while on one hand there is a group of learning designers who insist on following the listed verbs to the T, there is another group who shuns the taxonomy altogether. This second group prefers to follow two principles, not necessarily connected with each other:

1. That the taxonomy can, and should, be broadly classified into two levels – knowledge and performance
2. That all learning interventions, irrespective of type and level, should address performance, and not just knowledge

I agree with both the points above. No arguments there.

However, there is a middle path which, when trodden well, can help us deliver courses at the right level and make them challenging and engaging for learners, without getting bogged down by the defined verbs.

The six levels of taxonomy in the cognitive domain are useful for deciding the level at which a course needs to be designed.


There are times when the fine distinctions between the levels are a useful measure on which to base a learning intervention. Let’s look at these (please note that I’m using the revised taxonomy here, while also providing a reference to the original one):

1. Remembering (Original Taxonomy Level – Knowledge):

Typically, no information should be presented at this level alone. It simply signifies rote learning with no understanding of the information being ingested.

For example, a person getting started on the path to becoming a leader reads quotes on leadership, and is able to repeat them.

2. Understanding (Original Taxonomy Level – Comprehension):

A few things can be taught at this level. Examples include:

a. Procedural information, such as the steps of a process which takes place in another department. The learner is not directly connected with the process, but it is something very useful for the learner to understand.

b. Conceptual information, such as how solar technology works. Again, the learner is not directly connected with applying the information in their day-to-day work – for example, someone in the marketing department of a company that produces solar energy, who is not directly involved with production, but should be better informed about the company’s business.

The leader-in-the-making has moved one step higher, and she is now able to explain the meaning of the quotes in her own words. Most e-learning falls in this category; we are all too familiar with “By the end of this course, you would have understood…”.

3. Applying (Original Taxonomy Level – Application):

This level is typically at the center of most e-learning design, and for good reason. We want people to apply their knowledge and skills to their jobs, thereby showing measurable improvements in performance. This level, in a way, can also be considered the holy grail of e-learning, because if a person is able to do their job better as a result of what they learned in the course, then the course can be said to have achieved its goal. But the verbs to be used depend less on the taxonomy and more on the job skills we are trying to impart.

An example of this would be a course that teaches solar technology to engineering students. Here, learners get to apply their understanding of the concepts to build a solar panel or some other equipment.

Back to the leader-in-the-making. She has taken a course on leadership, and she practices by applying the models she learnt in the course in her day-to-day work.

4. Analyzing (Original Taxonomy Level – Analysis):

This goes beyond the application of knowledge in a specific albeit wide set of contexts, and involves breaking down information into parts, or examining it and trying to understand its structure.

Here, the engineering students deconstruct the solar technology that they have learned, and examine its possibilities, applications and limitations.

The new leader is now able to analyze the models she has learnt, deconstruct them, and see the component parts of the whole.

At this point, we get into the realm of Higher Order Thinking, and it is difficult for a standalone e-learning course to transfer skills at this level and above, with the learner bearing more and more responsibility for their own development.

5. Evaluating (Original Taxonomy Level – Evaluation):

At this stage, people are able to validate information or ideas based on a set of criteria. They can present and defend opinions, using evidence as a solid basis for the same.

The leader is in a position to compare and contrast different models, evaluate and make a sound judgement on which ones are better, and for what reasons.

6. Creating (Original Taxonomy Level – Synthesis):

This is the ultimate level of cognition, where people are able to build new structures or patterns on their own based on existing information.

The leader is now an expert in her field, and she can create new models based on her experience and expertise.

While the levels build upon one another, they are not necessarily linear. In fact, many theorists believe that while the first three levels are in sequence, the last three levels exist parallel to one another.


And, a course can be taught at several levels at the same time.

So, while designing courses, make sure to aim for the highest level on the taxonomy ladder that you can possibly reach, without getting mired in the actual verbs to use, and you will have a learning experience that is engaging and interesting.

Learning Design Best Principles – From The Learnnovators & Quinnovation Project


We, at Learnnovators, joined hands with Clark Quinn of Quinnovation to develop a course on ‘Workplace of the Future’, which we recently shared with the learning community free of charge. The links to the course, as well as a series of blog posts by Clark Quinn explaining the underlying process, are provided at the end of this post.

The idea of the project was to develop a course under practical constraints typically faced by learning design and development teams, and show that it is possible to adhere to good principles. So, here they are, the principles that we employed in the development of the course:


The entire course is decision-based. It is full of practice activities, with minimal content that can be pulled by learners if needed.

Learners who enter the course see some initial context-setting content, and then are placed in a scenario where they need to take decisions to move the conversation forward. The ‘content’ is made available only as reference, and can be accessed if needed (not mandatory). However, the scenarios are challenging enough that learners have to access the content to understand the underlying logic of the decisions they are faced with. The decisions also include misconceptions that are likely to have been entrenched in the minds of learners.


We believe that the decision points in the scenarios are neither too easy nor too difficult. Learners need to ponder over them to make a choice, and accessing the reference content makes that decision more informed. This engages learners’ intellectual curiosity, motivating them to interact better with the course.

And, the right answer is not too obvious, to make learners want to access the content, in order to be able to take the scenario forward.


We’ve added intrinsic feedback at the end of the scenarios; by this, we ‘show’ the consequences of the learners’ decisions once they reach an endpoint. We provide this through a simple description of what happens in the organization a few weeks or months after they reach the end, and follow it up with an explanation of why this happened.


Though this course is on a topic of importance, we didn’t want to make the experience overly long. Hence we stuck to about 30 minutes. This is the duration that a learner would have to spend, in order to get the most out of the course.


The course does not have audio. We did dabble with the idea of using audio, either for the dialogs or as ambient sounds, but we dropped the idea since we did not see value in it, and also because we were conscious that the course might be viewed in a public environment.


We have used a graphic novel approach for the visuals – it’s new and fresh, and not used enough in e-learning. Moreover, we ensured that we could get a fair representation of characters from different backgrounds and cultures.


We have taken into account that learners are intelligent, and that they will be able to deduce how to navigate the course and interact with the elements. So, while we’ve provided an initial heads up on the navigational elements, we have refrained from indicating, at every juncture, to ‘Click Next to continue’. While we have left it to the learners’ discretion to sense the availability of the Next button and click on it when they want to move forward, we have presented subtle visual cues for other interactions.

The course, in addition to the characteristic navigational ingredients like Menu and Previous & Next, includes a couple of novel elements. These are:

a) My Chat: Through this, learners can track the discussion in the scenario. It provides the dialog that has taken place so far in the form of a chat transcript.

b) My Path: This is an iconic representation of the complexity of the scenario, with a set of dots that changes color as a learner progresses along. This is to indicate to the learner that their path is not linear, and that there are multiple other paths available.

c) Reference: This is the content of the scenario, and is presented as a scrolling document. Learners who access the reference from a point in the scenario are taken to the section of the document that is relevant to the decision point they are in.

The course follows the open navigation model, wherein learners can move freely between the scenarios and the sections.


This turned out to be one of the most critical components of the course design process. The feedback we received from testers (a representative audience group) was deep and insightful, and we were able to make several improvements to the content as well as the structure as a result of this.

Here’s a very brief overview of the underlying process:

– Very often, we diverged and then converged, taking the best of ideas from all quarters and adding them to the course.

– For every objective, we developed the practice first, and then the associated content. If there were multiple practices, we developed the final, most difficult one first.

– As scenarios got more complex, we made flowcharts in PowerPoint to understand where each link was leading.

– We decided to use Articulate Storyline (the most appropriate tool for development in this case) after careful consideration.

– To ensure that we all had a fair idea of the outcome of a scenario, we decided on a few parameters before starting to write a scenario:

a) The role played by the learner, and who they would be talking to in the scenario
b) The decisions that they would have to make
c) The misconceptions they are likely to have

Here are the links to the blog posts written by Clark Quinn on Learning Solutions Magazine:
– Post 1 – Deeper Design: Working Out Loud
– Post 2 – Deeper Design: Beyond Traditional Instructional Design
– Post 3 – Deeper Design: Tweaking the Media
– Post 4 – Deeper Design: Putting It All Together
– Post 5 – Course Launch: Learnnovators and Quinnovation Launch Demo Course Based on Learning Design Best Principles

And, here’s the link to the course: Workplace of the Future.

My Top Ten Tools for Learning (2016)

For nine years now, Jane Hart has been compiling the list of Top 100 Tools for Learning. This is the 10th year (the list has now extended to the Top 200 Tools), and voting is currently on, where you name the top 10 tools that you think are valuable. The link to the voting page is here. The last day to vote is 23rd September 2016.

Here’s what I voted for:

  1. Google Search: A mobile encyclopedia I carry around with me. Has answers for everything, and eliminates the need to ‘remember’ anything.
  2. Twitter: I recently discovered the joy of Twitter for learning, especially Twitter chats, and have been hooked ever since. While some consider the format (of 140 characters) a constraint, I see it as liberating.
  3. Microsoft Word: Invaluable for writing document drafts, I often use a blank / scrap document to jot down my thoughts. Also, a great tool for storyboarding, especially when you know there will be multiple rounds of reviews and edits. Word can track changes really well, and you’ll know who made what changes in the document.
  4. Microsoft PowerPoint: Great for presenting ideas. Again, I look at this as a storyboarding tool. Works very well for presenting complex ideas and interactions.
  5. Microsoft Excel: The best tool I know for estimation and reporting. All you need to know are a few basic formulae, and Excel can work wonders for you.
  6. Microsoft Project: An extremely powerful project management tool… even at its bare minimum, it can make complex scheduling seem like child’s play.
  7. Diigo: A great tool for social bookmarking, it allows you to quickly save web pages for later reading and referencing. You can even annotate sections of web pages.
  8. Skype: Helps keep in touch, and connect with clients and colleagues in a hassle-free manner.
  9. WhatsApp: Currently used more for social sharing, and less for learning. However, has immense potential for learning in closed groups.
  10. WordPress: Okay, I’m just adding it in so it will serve as a reminder that I should blog more :-).

So, what are your top ten tools?

Is Your Learning Solution a Disney or a Walmart?


It’s the stuff of legends. Right from the train that takes you to the park, to the super-friendly folks at the ticket counter, to the rides, the characters, the shows and the food – personally, I can’t get enough of the magical, fairy-tale-like experience at a Disneyland park. You go there expecting to be surprised, delighted, and overwhelmed.

On the other hand, think of your experience at a Walmart store. You have an agenda, and you want to get in, pick your stuff, checkout, and leave as quickly as possible. We don’t hear anyone complaining about the lack of a good experience at a Walmart store. (Maybe there are complaints about long queues and products not being available, but that’s not the point here.)

I’d say both types of experiences are needed, based on the context. While the first is pricey and high on experience, the second one is low cost but delivers what you need in the quickest, most efficient manner possible.

Now look at this in the context of the learning experiences we design. What is it that is needed? Are your learners expecting to sit back, buckle up, and enjoy the ride? Or are they looking for a quick answer to an urgent question? Worth considering before embarking on your learning design journey.

Forward Design


We design more information-based courses today than we would care to admit. Agreed, these courses can instead be called web pages, cheat sheets, information dumps, knowledge stores, etc. They don’t necessarily have to fall under the ambit of ‘courses’. But whatever we call them, the fact remains that these are designed by learning designers, and we would do well to keep a few good practices in mind before we set upon designing them.

Forward Design: E-learning’s Dirty Little Secret
The best learning programs are designed backwards. This means that you start by ascertaining the goals of a program, and then work backwards to meet those goals. (In case you haven’t heard this term before, here’s the definition of backward design by Wikipedia.) So, if a client comes to you stating the need to design a course for so-and-so topic, you push back a bit and ask them questions, as to what the actual goals are, what learners need to do, and so on. And then for those goals, design learning courses comprising activities and a series of support materials to help learners through the activities.

But many a time, what happens in reality is quite different. I hate to admit it, but we design ‘forward’ (there’s no such term, but I’m using it because what I am talking about is the exact opposite of ‘backward’ design) as much as we do backward, if not more. Consider these situations:

  • Learners are starting out on a new job, and there is a load of information to be covered
  • The subject is voluminous and complex, and learners will benefit from a sense of direction rather than being directly put into the proverbial soup
  • Learning is not directly tied to performance related goals (such as in higher education scenarios)
  • The client doesn’t have the time, or budget, or the inclination to get internal buy-in for a different approach (most likely it is all three!)

In such cases, where the purpose is to disseminate information, and not to change behavior (at least not directly), it is better to start out with what needs to be covered, rather than with the end goal in mind. Now the question is, how do you make the most of this approach? Here are some tips and guidelines:

1. Design your learning into the smallest units possible:

If the idea is for learners to obtain information from this course, make it as easy as possible for them. No one wants to go through a lengthy course that drones on endlessly. There is a lot of research pointing to the fact that learners have a short attention span (well, who doesn’t?), so you might as well keep it short and simple. Moreover, a short topic that addresses one or two learning goals is easier to digest and come back to than a long topic that covers dozens of goals. Which brings me to the next point…

2. Make it easy to search for content

If what you are designing is an information-based course, shouldn’t learners be able to come back to it again and again? Let’s say you are trying to cover insurance-related concepts for the employees of an organization. In that case, it would be safe to assume that learners will not just take the course once, but will come back whenever they have to refer to the concepts therein. There are two things you can do to make the course contents searchable:

  • Make your topic and screen titles simple and straightforward (remember, your learners should be able to look at the title and understand what is covered inside)
  • Enable the search feature inside the course (many authoring tools today allow you to do this; if not, you might have to take the help of course programmers to embed this functionality inside the course)
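The two pointers above work together: if every topic has a simple, descriptive title, even a naive keyword search over titles and body text gets learners back to the right place. Here is a tiny illustrative sketch; the topics and the function are invented for this example, and most authoring tools provide equivalent search natively:

```python
# Hypothetical mini-course content: descriptive titles mapped to body text.
TOPICS = {
    "What is term life insurance?": "Term life insurance covers a fixed period.",
    "How premiums are calculated": "Premiums depend on age, health, and cover.",
    "Filing a claim": "To file a claim, contact the claims desk first.",
}

def search_topics(query: str) -> list:
    """Return titles whose title or body contains the query, case-insensitively."""
    q = query.lower()
    return [title for title, body in TOPICS.items()
            if q in title.lower() or q in body.lower()]
```

Because the titles describe their content plainly, a learner searching for “premium” lands directly on the relevant topic instead of re-reading the whole course.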

3. Do not narrate every screen

It can sometimes be tempting to do this, and many clients might even insist on it. But numerous research reports point to the pitfalls of this approach. Narration without purpose tends to take control away from learners and reduce motivation. I’m not arguing against the use of audio narration. If used well, narration can have a real positive impact on learning outcomes, but the key is knowing when to use it. Use narration only if one or more of the following conditions are satisfied:

  • You are describing a procedure or a complex concept, and you want learners to be able to follow it without having to read text on screen
  • You want to add a bit of emotion to what you are describing (for example, to provide feedback to a learner input)
  • You have one or more characters speaking as part of a scenario

There could be other situations where audio narration lifts the learnability of a course, but the key is to consider those situations carefully and then take a decision.

4. Do not lock navigation

This is one of the big afflictions of modern-day e-learning. In a well-meaning but futile attempt at ‘helping’ learners get the most out of a course, the ubiquitous Next button is locked down completely, and opens up ONLY when the learner has ‘completed’ the content of each screen. Result: Screen after screen, learners have to suffer through the agony of having to go through content that they cannot identify with, that they already know, or plainly are not interested in at that point of time. And if the entire content happens to be narrated, it is agony doubled for learners, since they need to wait for the entire narration to be complete before they can click the Next button. The answer: Do not lock the Next button, or any other button in the course.

To sum it up, my advice for those who want to design an information-based course: keep in mind that your learners are adults, and that they would want to take control of the pace at which they learn. In any case, isn’t that what you are designing for – so that learners can pull your content when they need it, instead of having it pushed to them?

What other tips/techniques/guidelines do you recommend for designing information-based courses? Thoughts?

Do e-learners satisfice?



My friend recently sent me a 750 GB external hard drive as a surprise gift. I was naturally excited, so I tore open the cover and packaging. Out came a shining new hard drive and a USB cord. I plugged in both ends and waited. Nothing happened. Pretty sure that I hadn’t plugged it in properly, I pulled out the cord and pushed it back into the USB slot. Nothing whatsoever. Time to go back into the packaging – I pulled out the ‘other’ cord, to connect the drive to the power source. Of course. How obvious! I should have known.

I was actually taking a satisficing approach as opposed to an optimizing one.

Satisficing, according to Wikipedia, is “a decision-making strategy which attempts to meet criteria for adequacy, rather than to identify an optimal solution”. It can apply to just about any situation where you don’t evaluate your options before making a decision. Typically people satisfice when:

  • They don’t have enough information
  • The stakes are not too high

Now what about learners in a complex scenario? Do they really make intelligent, well thought out choices – follow the optimizing approach? Or do they just satisfice – see what comes up, and then go back and change their decision and see the response and so on?

How do you make sure learners actually follow an optimal decision making process in your interactivities? Some possibilities (each with its own pros and cons):

  • Don’t allow learners to return to the previous step in a scenario
  • Allow them to return, but give them negative scoring/feedback for changed decisions
  • Restrict the number of times they can go back and change their decisions
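The third option above – limiting how many times a decision can be changed – could be sketched like this. The class and its revision budget are hypothetical, purely to illustrate the logic a scenario engine might apply:

```python
# Hypothetical sketch: a decision point that lets learners revise their
# choice, but only a limited number of times, nudging them away from
# pure trial-and-error satisficing. Names are invented for illustration.
class DecisionPoint:
    def __init__(self, max_revisions: int = 2):
        self.max_revisions = max_revisions
        self.revisions = 0
        self.choice = None

    def choose(self, option: str) -> bool:
        """Record a choice; return False once the revision budget is spent."""
        if self.choice is not None:
            if self.revisions >= self.max_revisions:
                return False  # no more changes allowed, choice stands
            self.revisions += 1
        self.choice = option
        return True
```

A budget of one or two revisions still allows recovery from a genuine slip, while making pure click-and-see exploration expensive enough that learners stop and think first.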

What do you think is the right approach?

The learners are probably wondering what the fuss is all about, and telling themselves, “What do I lose by making a wrong decision? It’s a scenario, after all.” And they are probably right – if, even after this trial-and-error method, they really understand what we are trying to tell them, the learning goal is achieved, right?