7 Questions to Ask to Create Effective E-Learning


We’ve all heard this before. I have, at least a few hundred times. A new client reaching out, saying: “We are looking to create an engaging course. Please make it as interactive as possible.”

And every time I hear this, I go “Hey wait! What about effectiveness?” I ruminate over why no one is talking about the effectiveness of a course, when that is the first thing we should be focusing on.

And then, one fine Sunday morning, it struck me. A light bulb moment!

We were talking to an interior designer about doing up our apartment. Before we met him, I had made a PowerPoint presentation and, as is typical of me, detailed every little corner that we wanted shelving in, including the length, width and height of each shelf inside the cupboards. And I had convinced my husband not to think about the style or the colors until we got this, the basics, right.

After all, form should follow function. And to me, this was the right way to do things, the focus on the effectiveness (the ‘livability’ of the house) before the engagement (the colors and the aesthetics).

When we met the designer with the presentation, he was not only stumped and taken aback, but also told me he’d never seen anyone do this before.

People don’t necessarily go by effectiveness. They don’t say “I want my home to be functionally well-designed”. Instead they say something to the effect of, “I like contemporary, but I also like art deco, and I want my home to have elements of both”, or “I love orange, and I want it in my living room”. And beyond outlining a few requirements, they leave it to the designer to figure out the rest.

That got me thinking. Just because the client (or the business head or SME) throws around a few terms, it doesn’t mean they are aware of what makes an effective course. That’s for us learning designers to think about and come up with.

Of course, we know engagement is really important. Only if the learner is engaged does their mind open up, and they become attentive and receptive to what the course is saying. And no matter how well we design the course, if the learner is not going to pay attention, then all our efforts are wasted.

But engagement alone is not enough. Movies, books and games have taught us that. Audiences take up adventures, go on journeys, and laugh and cry with characters, and once done, go back to being the same person they were before they went through the experience. Nothing changes. While this is okay for a work of fiction, it is not okay for a learning experience, because what we ultimately want is behavior change. We want to build the skill or ability for a person to do something they were not able to do before.

So how do we bring effectiveness to a course without having to lecture the client or other stakeholders about it? For starters, we can ask a few questions:

– What can they do after the training that they can’t do now?
– Why aren’t they doing it (or doing it well) now?
– What barriers do they face?
– What mistakes do they make?
– Are there some people in the learner group who are able to do this well now? If so, what are they doing differently?
– How will we know that our course is successful?
– Once they have completed the course, what can we do to:

  • Support them to do the task well
  • Motivate them to do the task well, and continue to do so

Once we’ve asked all of the above questions (and don’t for a moment think that we’ll get all the answers!), here are a few things we can do to nudge the course towards making it effective:

– Drop learners in a realistic setting, and have them ‘do’ the job they would have to do in real life. This could, depending on what the course is about, mean that they:

  • Make split-second decisions on the floor of a bustling hospital
  • Write code in a new program they are just learning
  • Talk to a customer, overcoming objections and trying to sell them a product or service

… or perform any other job that the course is teaching them to perform.

– For each action, show them the consequences of their action, and provide detailed feedback on why that action is right or wrong. And, when they have invested cognitive effort in working out the answer to a tough question, they are truly open to learning from the consequence, as well as the feedback. This is where real learning takes place.

– Create opportunities to support them and motivate them well after the training is over. Because after all, training is just the beginning of learning.

What do you think? What else can we do to make sure that our learning program turns out to be not just engaging, but also effective?

The Role of Knowledge


For a few years now, the term *knowledge* has been getting a bad rap from across the spectrum. From a Learning Design perspective, we say: “In real life, no one will ask you to list the steps to perform first aid, what actually matters is that you’re able to administer first aid when the need arises.”

And even in general, we tend to discourage people from memorizing things. The thinking goes “Why do you need to know something that you can google and find out in a minute?”.

And so, while designing learning solutions, we focus all our energies on the application of knowledge, by designing plenty of practice activities.

This is not a bad thing at all. In fact, it’s very good.

But a lot of times, we fail to understand that knowledge, for knowledge’s sake, has a role to play as well.

There was a reason we learnt math tables by heart. They provide the foundational knowledge we need to perform simple calculations without having to rely on a calculator.

The same is the case with the alphabet and fundamental vocabulary. Without them, we’d be unable to form sentences, unable to communicate our thoughts, and unable to express our feelings.

While these are rudimentary abilities meant for children, let’s look ahead to the kind of skill that we typically try to build for adults – for example, first aid.

Unless the first-aider knows the steps by heart, they wouldn’t be able to automatically administer first aid when the need arises. A good first-aider is one who has internalized the knowledge of the steps so well that they can perform without having to think about the steps.

In other words, their skill is built upon knowledge. And knowledge forms the key building block, or the foundation, on which application rests.

Therefore, while we focus on application and practice, let’s not forget what lies at the root of it all – knowledge.

2 Things to Consider When Defining Business Goals for a Course


Just like good learning outcomes answer the ‘What’s In It For Me’ (WIIFM) for the learner, a sound goal answers the WIIFM for business. It helps explain the need for a training intervention, and sets the direction for the project once it kicks off, course correcting and providing guidance as required.

Having established that it’s important to have a clear business goal at the start of a project, we advocate for learning outcomes which read as follows:

  • Ask questions to probe the customer
  • Uncover the customer’s ‘real’ need
  • Explain the benefits (not features) to the customer
  • Lead the customer towards the sale

And, these learning outcomes contribute towards a larger business goal, which should read like this: “Sales will improve 5% by Q3”.

The understanding is that if learners are able to successfully demonstrate the desired behaviors on the job, then the business goal will take care of itself (considering other environmental factors, of course).

It should, but I have a couple of issues around this:

1. The above approach works well for outcomes that directly contribute to a business goal. It is not difficult to imagine similar learning-outcomes-leading-to-business-goal situations in other contexts, such as:

  • Better hand hygiene (learning outcome) results in fewer infections (business goal)
  • Tighter password security (learning outcome) results in fewer security threats (business goal)
  • Better call handling and resolution (learning outcomes) result in improved customer satisfaction ratings (business goal)
  • Greater use of personal protective equipment (learning outcome) results in fewer safety incidents (business goal)

However, let’s say we need to create a course on ‘E-mail Etiquette’.

The learning outcome would be to write effective e-mails (e-mails that are addressed and copied to the right people, and are clearly worded and structured).

(Aside: There was a real need for a course. E-mails that were poorly worded, as well as those without proper structure or call-to-action were part of the folklore at this organization.)

How do we equate this with a meaningful business goal, a metric that is important to business? We could say that effective e-mails lead to better clarity and less confusion within a team, and therefore might enhance the overall effectiveness of the team.

So, the business goal would be to improve team effectiveness? The goal seems contrived at best to me, and I’m not convinced that effective e-mail alone will contribute significantly to the effectiveness of the team. There are so many factors at play – the culture of the organization, team size and dynamics, the goals and challenges faced by the team, not to mention other forms of communication.

I don’t think we need to force ourselves into that circle. While there is no excuse for poorly formed learning outcomes (actually, they should be performance outcomes; i.e., outcomes that lead to a change in behavior), a business goal is something that can be bypassed, if the outcome doesn’t directly impact a meaningful metric.

2. The second issue that I have relates to the measurability of the business goal. “Sales will improve 5% by Q3”.

Sales will absolutely improve if learners are able to implement the actions they learned in the course. But what about “5%” and “Q3”? Let’s look at a scenario and see where this goes.

Say the sales division has a team of 100. The course is rolled out in January, and all 100 go through the course within a month’s time. If the course is designed well, with plenty of practice, spaced repetition, and post-training performance support, we can reasonably expect that at least 60 will be able to demonstrate the stated behaviors. And given a time gap of four months (March – June) in which to practice and hone their newly learnt skills, they will be more effective, and successful, salespeople than they were before the training.

So, if the team was selling 500 units a month in January, they should ideally be selling at least 560 (60 salespeople selling one unit more each) in July (Q3) – an increase of 12%. Now, we know there are other factors to consider – product pricing, market conditions, competition, etc. – which probably haven’t changed much in the last two quarters.
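Purely as a back-of-the-envelope check, the arithmetic above can be sketched in a few lines. All figures (baseline units, number of effective learners, extra units per learner) are the illustrative assumptions from this scenario, not real data:

```python
# Back-of-the-envelope check of the sales-uplift estimate above.
# Every number here is an illustrative assumption from the scenario.

baseline_units = 500       # units/month the team sold in January
effective_learners = 60    # salespeople expected to apply the training
extra_units_each = 1       # conservative: one extra unit per person/month

projected_units = baseline_units + effective_learners * extra_units_each
increase_pct = 100 * (projected_units - baseline_units) / baseline_units

print(projected_units, increase_pct)  # 560 12.0
```

Plugging in different assumptions (say, only 40 effective learners) quickly shows how sensitive the headline percentage is to factors that have nothing to do with the design of the course.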

Looking at the above, a conservative estimate of “5% by Q3” does seem achievable.

However, my discomfort with assigning such numbers to our lofty business goal stems from the fact that there are too many variables unrelated to the design of the course: the course must be rolled out in January, people should finish taking it by February, and market conditions, competition, etc. should not have changed. All of these are well beyond the control and influence of instructional designers. Considering these factors, assigning such targets seems arbitrary, and a bit frivolous, to me.

Cathy Moore has some good advice for us here. She says “Consider this only a goal, not a guarantee.”

However, I still feel that we would do well to focus on what we can influence (change in behavior), rather than chasing a target which we have no control over. Of course, we want to prove that we are valuable to business. So, the goal can read “Sales will improve as learners are able to…”. If it leads to a 5% improvement, great. If the improvement is 10%, why not?

What am I missing here? How can we improve our business goals in a way that they are meaningful, realistic and achievable?

Duolingo Gets Both Gamification and Learning Design Right


We were collecting examples for a gamification webinar we conducted last September, and I was intrigued to learn about Duolingo, the gamified language learning app. I downloaded the app immediately, but though it listed over 20 languages, there was none that I specifically wanted to learn. The intrigue intensified as I read about its features and the rave reviews it had garnered from users and gamification experts alike, but at that point, I had no time to experience the app by learning a new language. So I went ahead and delivered the webinar, citing Duolingo as an example of gamification done right.

But Duolingo remained at the back of my mind. Recently I came across an article which cited a research finding that the “brain networks [of those who learned a new language] had become better integrated, which means they’re more flexible and allows for faster and more efficient learning”. This caught my attention. Who doesn’t like to become smarter, better integrated etc. ;-)? So I decided to check it out.

I wasn’t particularly interested in any of the languages on offer (by the way, did I mention that Duolingo is entirely free?), so I randomly chose Spanish. English, of course, remained my source language, the language through which I would be learning Spanish.

And here’s what I experienced:

When you start, you get to decide on your goals – the amount of work you want to put in every day – choosing from:
– Casual (10 XP per day)
– Regular (20 XP per day)
– Serious (30 XP per day)
– Insane (50 XP per day)

(XP, or experience points, are awarded to you for completing lessons.) I chose casual, which meant I would have to spend 5-10 minutes a day, completing a single lesson each day.

When you reach your goal a few days in a row, you get a streak, which you have to work to maintain. For example, if you miss a couple of days’ lessons, your streak goes down. Streak basically refers to your knowledge of words in that lesson, and if you don’t keep up, it means you are forgetting those words, therefore your streak weakens to indicate that. Makes perfect sense. Did I hear someone say forgetting curve :-)?

Duolingo sends regular reminders to meet your daily goal, encouraging you to reach longer learning streaks. These reminders are fun and personalized, suggesting what you will be learning next, and motivating you to keep practicing.


As you complete lessons and gain XP, you level up, earning lingots. Lingot is a virtual currency that allows you to buy various things from a store, right from dressing up your owl (the Duolingo mascot) to power-ups and extra lessons. Currently none of these ‘items on display’ sound interesting to me, so I haven’t ventured into any shopping as yet.


Progress indicators are all over the place in Duolingo. The lessons are arranged based on a virtual skill tree, and keep coloring as you advance. Within each lesson, you get to see how many more questions you have to answer, as well as how many you got right in a row.


I’ve practiced for no more than a week now, and here’s what my progress looks like.


Duolingo is highly social. You can comment on and discuss specific questions from lessons, and get answers from others in the community. I’ve found this feature to be especially useful, in the absence of ‘why this works the way it works’ explanations from Duolingo.


You can add friends from your networks, and have a leaderboard comparing your score (XP) with theirs. You can also see what they’ve been up to, including their latest comments, who they are following and who is following them, as well as levels gained.

Cross a certain level, and you get a badge, which you can share on your social network. For the record, I’m 7% fluent in Spanish as I’m writing this post.

A much-touted feature is the Immersion area, which asks you to translate some text in a collaborative space, thereby helping you practice your language skills even more, and awarding you XP incentives for participation and contribution.

The lessons themselves are designed really well. No theory, no grammar rules, just a multi-modal learning system which uses visual and auditory cues to help you learn new words.

You start on the lower end of a skill tree, learning really simple words first, and then naturally and effortlessly progress towards more complex terms and constructs.

Repetition, a huge factor for success of any learning, language learning especially, has been used very effectively in Duolingo. Words and phrases you learned in Lesson 1 reappear in Lesson 3, in newer avatars, and in new constructs. Unconsciously you begin internalizing them.

Having said all that, the app is not without its flaws. My biggest gripe, from a learning design perspective, is the blatant implausibility of distractors. Many a time, the correct answer is a dead giveaway, either due to the construct, or some silly reason like punctuation. Check out the screenshot below.

  1. Looking at the choices, I’m pretty sure the sentence has to start with ‘No’ because that is the only word which is in title case. (In this case, the word is obvious, in others, it’s not.)
  2. When I’ve dragged four words to form a sentence, it’s way too obvious that the fifth word is ‘cook’, since that is the only verb, amongst the remaining options.


But despite such shortcomings, Duolingo is one fantastic language learning experience, neatly wrapped in a ‘gamified’ package. I hope and crave for more learning experiences to be designed so well.

¡Buen trabajo Duolingo! ¡Seguid así!

Rich Learner. Poor Learner.


Evan, Laura, and Allen work in the L&D department of a large company. One morning, their manager Helen calls them into her cabin. She says “Congrats! You’re going to DevLearn in Vegas!”

Celebrations ensue, and the three excitedly get ready for the journey. Before leaving, they individually make plans for the trip. Here’s what each of their plans looks like:


Evan’s plan:

At the end of the trip, I’d like to:

  • Come away with at least three ideas for improving my learning design
  • Connect with people who blog on learning, especially those who talk about social learning and community management
  • Attend at least four sessions on mobile learning (two of them possibly Clark Quinn’s and Nick Floro’s???)
  • Visit Aunt Maurice

Before leaving:

  • Buy formal shoes


Laura’s plan:

At the conference:

  • Attend at least 9 concurrent sessions; squeeze in 12 if possible
  • Make notes and consolidate for later reference

In Vegas:

  • Go shopping


Allen’s plan:

  • 3 days of conference
  • Wed Evening: Dinner with friends
  • Thurs Evening: Relax in the room
  • Friday Evening: Gambling at The Venetian!
  • Saturday: Grand Canyon
  • Sunday: Flight back

After coming back, Helen asks each of them to present their experiences from the trip.

Can you guess what would have happened?

You’re right! Evan had a clear set of takeaways to present from the conference, while Laura, though a bit scattered, did have a few points to talk about. Allen, unfortunately, had nothing substantial to present. What he did learn at the conference had quickly evaporated, thanks to his lack of goals and focus on learning.

Let’s think about this for a moment. Isn’t this something we encounter all the time? Learners, without as much as an explanation of what to expect, being pushed to attend a day-long training event on compliance (or code of conduct, or communication, or some other topic). Or being forced to take a bunch of long and context-less e-learning courses.

So, unless the learner is in ‘receptive’ mode (by that I mean they are emotionally and intellectually ready to receive the content), it is highly unlikely that a learning event is going to be of any benefit to them. This was exactly what we saw happening with Evan, Laura and Allen above. Despite attending the same conference, and probably sitting through the same sessions and meeting the same bunch of people, the amount of learning that each of them got was directly proportional to how ‘receptive’ they were.

So, how do we ensure this? How do we make sure that learners are ‘receptive’ to the learning experience that we’ve so painstakingly put together? Here are a few ideas:

1. Tell them the why and the how

This is the ‘What’s In It For Me’, or WIIFM, for the learner. It answers two key questions:

a. Why is this topic important? Not to the business, not to the organization, but to me, the learner, as an individual.
b. How is it going to help me in my life / work?

WIIFM features prominently in Instructional Design discussions, but in many cases it gets missed out or is improperly implemented.

But get this one right, and we can have learners hanging on to every word in the course.

2. Make an emotional connection

There is a reason that people love stories. And it’s for the same reason that they are addicted to movies and games.

Joy, sorrow, challenge, competition, surprise, suspense, fear, anger, trust… these are just some of the emotions we can draw upon in our courses to keep learners coming back for more.

3. Address a need

Ultimately, the course needs to help the learner get better at something useful. Specifically, it should deliver what it promised to deliver in the WIIFM stage above.

A classic example of this is YouTube videos. It doesn’t matter how good or bad a video is. If it addresses my need of the moment, say ‘how to fix my washing machine’, I will still watch it over and over until I get the information I need.

4. Make it bite-sized

No one has the time or the inclination to go through a long-winded course that covers every little obscure detail of the policy you’re trying to cover. Make it to the point, and learners are much more likely to be receptive to the experience.

5. Make it optional

This has long remained a pet peeve of mine. Forcibly making learners sit through a class and having them switch off their cell phones does NOT equate to their minds being open to what the class is offering. Same goes for locking down the Next button in the hope that they will read and absorb every little piece of information presented on the screen.

In fact, these strategies have the opposite effect. An individual (especially an adult) who does not feel in control of their circumstances is very unlikely to have an open, receptive mind that is conducive for learning.

Explain the benefits, sell them the idea, and leave it to them to decide whether or not to take the course.

So what have I missed? What other ideas can we use to help learners become ready to receive the content? I would love to hear from you.

5 Ways To Avoid Overwhelming Learners

The deluge is upon us! Run for cover!

Well, I’m not talking about an invasion or a natural calamity. I’m talking about the stuff that we are faced with every minute of every day – the torrent of information that keeps hitting us, threatening to sweep us off our feet and drown us, if we’re not careful.

Ah, the curse of social media, which constantly bombards us with information from all directions. Combine this with a heady dose of FOMO (Fear Of Missing Out), and we can be sure to get inundated in the oncoming flood, without retaining much that is useful.

I have a fairly simple practice for handling social media. There are a select few in the industry who I really respect and admire, and I therefore want to listen to their opinions. Over the years, I’ve ruthlessly eliminated any kind of distraction, which basically is any information that is not from this select few. Of course, I keep editing and pruning this list.

My challenge is, this ‘select few’ runs into a few dozen, if not more. So, where many face a barrage of irrelevant information concealing a few precious gems, I have a steady stream of high-quality, valuable content, which of course I don’t want to miss out on (this is real FOMO in action, you see).

Okay, having said that, this post is not about how I handle social media. It’s about how we can ensure that learners don’t get washed away in a similar onslaught of information in the courses we design.

How many times do we end up having to include way more than the average individual can digest in a one-hour course, or in a day’s workshop? Because the SME insisted “they have to know this”. Or the unit manager said “this part is mandatory”. We see their point, so we end up adding that piece of content.

A drop here, and a drop there. And slowly but surely, the drops add up to form the deluge.

We don’t, of course, want to overwhelm learners with too much information, because we know that an overwhelmed mind is ill-equipped for learning. Researchers have shown time and again that cognitive overload (being faced with more information than we can handle) is detrimental to the learning process.

So what can we do to avoid putting learners in such a situation? Here are a few approaches I can think of:

1. Break up the information into smaller, bite-sized pieces

Content chunked into digestible units can go a long way in helping learners absorb the information easily, without feeling overwhelmed. We should, of course, take care to ensure that this is done well, and that the chunks are not too interdependent.

2. Distribute the information over a period of time

If it is not critical for the audience to consume the entire content in a short time (as it often is for instructor-led courses that involve travel), it helps to deliver the content piecemeal over a stretched duration. Referred to as spaced practice, this approach has well-documented benefits for learning and retention.

3. Create information loops

Chunking and spaced practice can only work well if sufficient repetition is built in to allow for absorption into long term memory. Therefore, whatever strategies are adopted, make sure to create these information loops, which are basically about summarizing and repeating content at varied intervals.
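To make the idea concrete, here is a minimal sketch of one way such information loops might be scheduled, with each revisit landing at an expanding interval. The starting interval and growth factor are hypothetical placeholders; a real schedule would be tuned to the content and the audience:

```python
# A minimal, hypothetical sketch of scheduling "information loops":
# revisit a chunk of content at expanding intervals (here, simply
# doubling the gap between reviews each time).

def review_days(first_interval=1, factor=2, reviews=5):
    """Return the day offsets on which a chunk should be revisited."""
    days, interval = [], first_interval
    for _ in range(reviews):
        # Each review lands `interval` days after the previous one.
        days.append(interval if not days else days[-1] + interval)
        interval *= factor
    return days

print(review_days())  # [1, 3, 7, 15, 31]
```

An expanding schedule like this echoes the spacing effect: each summary or repetition arrives roughly as the material is starting to fade, which is when the loop does the most good.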

4. Cover each point in greater depth, or provide context

In a recent project, we were required to help learners recall safety precautions they had to take before undertaking any maintenance work. Learners had gone through in-depth training on these safety precautions, and the client insisted that it would be sufficient if we added a line mentioning this, along with a generic image indicating safety. We did, but in addition, we added a couple of lines containing a super quick overview of the safety precautions, and provided a link to the original course if they wished to review it. Result: the overview and the link helped learners better recall what they had learned in the original training.

5. Increase the duration of the course, if required

Implementing any or all of the above approaches might mean an increase in the duration of the course… which is okay, in my opinion. Ultimately, what counts is that learners have understood the content well enough to translate it into action back in their workplace.

What do you think? What other approaches can we use to make sure that we don’t end up overwhelming learners? I look forward to your comments.

There Ain’t No Better Teacher Than A Mistake

… as long as it leads to a lesson learned.

It was a series of online sessions on instructional design that I was facilitating. The audience was a smart group, comprising mostly people from HR and talent development. Needless to say, the sessions were very interactive, and as I had come to expect during the series, they were questioning me and challenging me at every juncture.

I was extolling the virtues of well-designed multiple choice questions, and to make the point about options which are grammatically incorrect and aren’t parallel with each other, I had used a question from Cathy Moore’s blog post on MCQs, by giving full credit to Cathy, of course. Here’s how the question read, along with the options.

We can confuse learners when we:
a. fail to actually complete the sentence we started in the question.
b. inconsistent grammar in the options.
c. sometimes we veer off into another idea entirely.
d. wombats.

One of the participants asked me what the word ‘wombats’ means, and I confidently replied that the word doesn’t exist. He accepted my response, and we moved on. But I kept thinking about the question, and later that night, I thought I’d google it, just in case, to see if it had any meaning. And, to my surprise, I learned that wombats are short-legged animals native to Australia.

Caught off guard and a bit embarrassed, I nevertheless decided to share my newfound knowledge with participants. Here’s what I wrote in the designated discussion forum:

In Session 3, I had used a Multiple Choice Question borrowed from Cathy Moore’s blog, in which one of the options she had listed was ‘Wombats’. The point that Cathy (and I) was trying to make was that sometimes the options are so obviously incorrect that learners have no trouble guessing the right answer, thereby passing the test without understanding the topic or having to think about it.

One of you asked what is the meaning of ‘Wombats’, and I confidently replied “it’s not a word”. My confidence came from the understanding that Cathy frequently uses fictional names, places and situations. But I wanted to double check this, and when I did, to my surprise I found that Wombats are short-legged animals that are native to Australia.

Apologies for the confusion caused because of this.

Lesson learned: Do not assume anything.

It was a mistake I’d made, and my realization and subsequent sharing of the same with participants sparked a discussion unlike any other thread in the entire forum.

This got me thinking: what can we do to leverage mistakes in the courses we design? How can we make it possible for the learner to ‘stumble upon’ mistakes, and glean lessons from them? After all, mistakes are seldom made randomly; they are manifestations of long-held beliefs or misconceptions. We don’t want to frustrate learners, of course, but their moment of realization can turn out to be a huge learning point for them.

Here are a few possible ideas:

1. Use them as options in scenarios, and provide detailed feedback. While the options can reflect the common misconceptions, extensive feedback against each of the incorrect options can help explain why the option is wrong, and what would work better in that scenario. Note that such feedback against the correct option would also work well, to consolidate the learner’s understanding of why that option is correct.

2. Where possible, have them justify their choice of options. So if a question reads “What beverage has the highest consumption in the world?”, no matter what their choice, have them answer a follow-up question that reads something like “Why do you think so?”. This can help them think through their answer, and possibly even correct their original choice.

3. Provide an option for learners to correct their mistakes and redo the scenario. This can help address any frustrations with early failure.

4. Run a scenario and make learners point out mistakes. This is very similar to the scenario in the first point above, except that here the learner doesn’t make the decisions. Instead, they get to point out the mistakes that another character in the scenario is making, a nice relief from the decision-making scenario, and another great way to learn.

5. Tell failure stories. We often get enamored with success stories, but they don’t always give the full picture, nor do they say anything about the struggles that went into the process. Failure stories, on the other hand, can teach as well as inspire, and give a helping hand to learners who make similar mistakes.

6. And last but not the least, it is not a bad idea to include them as part of your explanation. As in, call out a mistake that’s commonly made, and say “DON’T DO THAT!”.

Finally, it’s important for any learning event to emphasize a growth mindset so that learners do not associate mistakes with shame, and view them instead as learning opportunities. Of course, a single learning event would be hard-pressed to do something like this on its own, but we can always try, right?

What other strategies can you think of to lead learners from their moment of ‘oops’ to ‘ah-a’?

Beyond Effective E-Learning – Changing Habits, Not Just Behaviors

“We absolutely nailed it! Folks are loving this!” exclaims Terry.

“Reporting rates have increased… incidents have gone down… people are really using strong passwords! Management couldn’t be happier,” chips in Janet.

Terry and Janet, both learning designers in the corporate world, are discussing their freshly launched security course, and how it has become an overnight sensation. They have every reason to be happy… learners and management have been singing its praises ever since it was ‘released’.

And why not? Terry and Janet have done their homework, diligently working with stakeholders to design a meaningful course. It is chock full of practice activities, and does a good job of both explaining the ‘how’, and convincing learners of the ‘why’. They have also spent considerable effort to ensure that it’s produced well, with all its associated bells and whistles, hence the learner love they are currently basking in.

Let’s fast forward a few months and see what happens. After all, the success of any initiative has to be measured by long-term adoption, and not just immediate outcomes, right?

Six months down the line, the number of security lapses has increased. Drastically. People seem to have reverted to their old ways.

Janet walks by a section of the office she rarely visits, and is dismayed to find passwords written on post-it notes stuck to computer screens. She calls up the IT department contact she was in touch with while developing the course, and he informs her that average password strength has dropped to ‘moderately weak’ from last month’s ‘reasonably strong’.

What just happened?

When the course was newly launched, it was so impactful that it motivated people to immediately go back and make their passwords stronger. And also to proactively look for any seeming lapses in security and report them. Hence the initial spike in the number of lapses being proactively reported, and the reduction in security incidents.

This continued for a while, until the effects of the course ‘wore off’. In the absence of a system of checks and balances to keep people exhibiting these behaviors, they slowly reverted to their old habits, purely because they lacked the motivation to continue. It was simply too much effort.

The course, as a single event, was a humongous success in convincing people of the need for better security, and in providing them with the knowledge and skills for the purpose. It was therefore able to get people to demonstrate the desired behaviors. However, commitment faltered in the face of day-to-day work pressures, as tends to happen when the priority assigned to anything outside one’s core work goes down. And since there was no ongoing campaign to convert the new behavior into a long-term habit, the initiative failed in the long run.

So, how does an organization ensure that newly learnt behaviors become sticky enough to turn into habits? Here are a few pointers we can keep in mind:

1. Get started

Experts advise that the first step to habit formation is to just get started. Terry and Janet have already achieved this with a well-designed, engaging course that targets the right behaviors. Employees were motivated enough to strengthen their passwords, and to voluntarily come forward and report what they thought were security lapses.

2. Provide constant reinforcement

This can be done using both intrinsic and extrinsic elements.

Intrinsically, people can be reminded at regular intervals of the need for better security, and how it indirectly impacts them as individuals, and the organization as a whole. Case studies, stories, quizzes, etc. can work well in this regard.

On an extrinsic note, employees can be rewarded for having the ‘strongest password of the month’ or for reporting the ‘highest number of lapses’.

All of this can be done online or offline, or a combination of both, which should keep people motivated to continue the streak, and keep security on top of their minds.

3. Use social proofing for validation

Identify secret ‘champions’ of security to further the cause. These champions are given specific tasks, such as discreetly starting a conversation about security at the watercooler or on the company’s Intranet portal, or sharing the story of a security incident at another company that led to major losses.

Establish an online system where employees can display a ‘Security’ badge, which their colleagues can vote for. The employee who gets the most votes has the strongest badge, and so on.

4. Design the environment for stickiness

Make it impossible, or at least difficult, to have weak passwords. Design the system in such a way that it does not accept passwords below a certain level of strength. Display posters at every work area, or possibly on every desk, constantly reminding people of the need for better security.
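One way to make weak passwords impossible is to reject them at the point of entry. Here is a minimal sketch in Python; the scoring rules and threshold are illustrative assumptions, not a recommended policy (a real system should use a vetted strength estimator and follow current guidance such as NIST SP 800-63B).

```python
import re

def password_score(pw: str) -> int:
    """Crude illustrative score: one point per satisfied rule (0-5)."""
    rules = [
        len(pw) >= 12,                   # minimum length
        re.search(r"[a-z]", pw),         # lowercase letter
        re.search(r"[A-Z]", pw),         # uppercase letter
        re.search(r"\d", pw),            # digit
        re.search(r"[^A-Za-z0-9]", pw),  # symbol
    ]
    return sum(1 for r in rules if r)

def accept_password(pw: str, min_score: int = 4) -> bool:
    """The environment does the enforcing: anything below the
    threshold is simply refused, so a weak password never exists."""
    return password_score(pw) >= min_score
```

The design point is that the check lives in the system, not in the learner’s memory: no amount of forgetting the course can reintroduce a weak password.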

Ultimately, while a well-designed course can achieve the needed behavior change, for that change to be sustained over time, what we need is habit change.

What other factors can we consider for achieving habit change in the long run? I welcome your inputs.

Why I Still Love Bloom But Not His Verbs

I have a confession to make.

I’ve sat in scores of meetings with project stakeholders, painfully agonizing over the verbs to use for defining the objectives of a course. Should we use ‘Develop’ or ‘Construct’? Which one is better – ‘Describe’ or ‘Explain’? Would it be more appropriate to say ‘Classify’ or ‘Categorize’?

You get the idea. After all, objectives define the boundaries of a course, and we don’t want to get them wrong, right?

Yes, but…

It is absolutely important to define clear objectives. For designers, they set the boundaries of a learning intervention and determine its level of sophistication. For learners, they help set expectations at the beginning of a course (when presented well). The catch lies in the phrase “when presented well”.

Benjamin Bloom and his colleagues provided the verbs as a helpful means of deciding on the action to describe when defining objectives. However, we often fall prey to the notion that a verb must be drawn strictly from the taxonomy level the course represents, failing which it will mislead learners.

For example, take the verb ‘Describe’. It appears at multiple levels of the taxonomy – Knowledge, Comprehension, and Evaluation. So, by just looking at the verb, you cannot tell at which level of the taxonomy the course sits.

I agree that the taxonomy is complex. So while on one hand there is a group of learning designers who insist on following the listed verbs to a T, there is another group who shuns the taxonomy altogether. This second group holds two positions, not necessarily connected with each other:

1. That the taxonomy can, and should, be broadly classified into two levels – knowledge and performance
2. That all learning interventions, irrespective of type and level, should address performance, and not just knowledge

I agree with both the points above. No arguments there.

However, there is a middle path which, when trodden well, can help us deliver courses at the right level and make them challenging and engaging for learners, without getting bogged down by the defined verbs.

The six levels of taxonomy in the cognitive domain are useful for deciding the level at which a course needs to be designed.


There are times when the fine distinctions between the levels are a useful measure on which to base a learning intervention. Let’s look at these (please note that I’m using the revised taxonomy here, while also providing a reference to the original one):

1. Remembering (Original Taxonomy Level – Knowledge):

Typically, no information should be presented at this level alone. It simply signifies rote learning with no understanding of the information being ingested.

An example of this: a person getting started on the path to becoming a leader reads quotes on leadership, and is able to repeat them.

2. Understanding (Original Taxonomy Level – Comprehension):

A few things can be taught at this level. Examples include:

a. Procedural information, such as the steps of a process that takes place in another department. The learner is not directly connected with the process, but it is very useful for them to understand.

b. Conceptual information, such as how solar technology works. Again, the learner does not directly apply the information in their day-to-day work. Think of someone in the marketing department of a company that produces solar energy: they are not directly involved with production, but should be well informed about the company’s business.

The leader-in-the-making has moved one step higher, and she is now able to explain the meaning of the quotes in her own words. Most e-learning falls in this category; we are all too familiar with “By the end of this course, you would have understood…”.

3. Applying (Original Taxonomy Level – Application):

This level is typically at the center of most e-learning design, and for good reason. We want people to apply their knowledge and skills to their jobs, and thereby show measurable improvements in performance. This level, in a way, can also be considered the holy grail of e-learning, because if a person is able to do their job better as a result of what they learned in the course, then the course can be said to have achieved its goal. But the verbs to use depend less on the taxonomy and more on the job skills we are trying to impart.

An example of this would be a course that teaches solar technology to engineering students. Here, learners get to apply their understanding of the concepts to build a solar panel or some other equipment.

Back to the leader-in-the-making. She has taken a course on leadership, and she practices by applying the models she learnt in the course in her day-to-day work.

4. Analyzing (Original Taxonomy Level – Analysis):

This goes beyond the application of knowledge in a specific albeit wide set of contexts, and involves breaking down information into parts, or examining it and trying to understand its structure.

Here, the engineering students deconstruct the solar technology that they have learned, and examine its possibilities, applications and limitations.

The new leader is now able to analyze the models she has learnt, deconstruct them, and see the component parts of the whole.

At this point, we enter the realm of Higher Order Thinking. It is difficult for a standalone e-learning course to transfer skills at this level and above; the learner bears more and more responsibility for their own development.

5. Evaluating (Original Taxonomy Level – Evaluation):

At this stage, people are able to validate information or ideas based on a set of criteria. They can present and defend opinions, using evidence as a solid basis for the same.

The leader is in a position to compare and contrast different models, evaluate and make a sound judgement on which ones are better, and for what reasons.

6. Creating (Original Taxonomy Level – Synthesis):

This is the ultimate level of cognition, where people are able to build new structures or patterns on their own based on existing information.

The leader is now an expert in her field, and she can create new models based on her experience and expertise.

While the levels build upon one another, they are not necessarily linear. In fact, many theorists believe that while the first three levels are in sequence, the last three levels exist parallel to one another, like this:


And, a course can be taught at several levels at the same time.

So, while designing courses, aim for the highest level of the taxonomy ladder that you can possibly reach, without getting mired in the actual verbs to use, and you will have a learning experience that is engaging and interesting.

Learning Design Best Principles – From The Learnnovators & Quinnovation Project


We, at Learnnovators, joined hands with Clark Quinn of Quinnovation to develop a course on ‘Workplace of the Future’, which we recently shared with the learning community free of charge. The links to the course, as well as a series of blog posts by Clark Quinn explaining the underlying process, are provided at the end of this post.

The idea of the project was to develop a course under practical constraints typically faced by learning design and development teams, and show that it is possible to adhere to good principles. So, here they are, the principles that we employed in the development of the course:


The entire course is decision-based. It is full of practice activities, with minimal content that can be pulled by learners if needed.

Learners who enter the course see some initial context-setting content, and then are placed in a scenario where they need to take decisions to move the conversation forward. The ‘content’ is made available only as reference, and can be accessed if needed (not mandatory). However, the scenarios are challenging enough that learners have to access the content to understand the underlying logic of the decisions they are faced with. The decisions also include misconceptions that are likely to have been entrenched in the minds of learners.


We believe that the decision points in the scenarios are neither too easy nor too difficult. Learners need to ponder over them to make a choice, and accessing the reference content makes that decision more informed. This engages learners’ intellectual curiosity, motivating them to interact better with the course.

And, the right answer is not too obvious, to make learners want to access the content, in order to be able to take the scenario forward.


We’ve added intrinsic feedback at the end of the scenarios; that is, we ‘show’ the consequences of the learners’ decisions once they reach an endpoint. We provide this through a simple description of what happens in the organization a few weeks or months after they reach the end, followed by an explanation of why it happened.


Though this course is on a topic of importance, we didn’t want to make the experience overly long. Hence we stuck to about 30 minutes. This is the duration that a learner would have to spend, in order to get the most out of the course.


The course does not have audio. We did dabble with the idea of using audio, either for the dialogs or as ambient sounds, but we dropped the idea since we did not see value in it, and also because we were conscious that the course might be viewed in a public environment.


We have used a graphic novel approach for the visuals – it’s new and fresh, and not used enough in e-learning. Moreover, we ensured that we could get a fair representation of characters from different backgrounds and cultures.


We have assumed that learners are intelligent, and that they will be able to deduce how to navigate the course and interact with its elements. So, while we’ve provided an initial heads-up on the navigational elements, we have refrained from indicating ‘Click Next to continue’ at every juncture. We have left it to learners’ discretion to notice the Next button and click it when they want to move forward, while presenting subtle visual cues for other interactions.

The course, in addition to standard navigational elements like the Menu and the Previous and Next buttons, includes a couple of novel elements. These are:

a) My Chat: Through this, learners can track the discussion in the scenario. It provides the dialog that has taken place so far in the form of a chat transcript.

b) My Path: This is an iconic representation of the complexity of the scenario, with a set of dots that changes color as a learner progresses along. This is to indicate to the learner that their path is not linear, and that there are multiple other paths available.

c) Reference: This is the content of the scenario, and is presented as a scrolling document. Learners who access the reference from a point in the scenario are taken to the section of the document that is relevant to the decision point they are in.

The course follows the open navigation model, wherein learners can move freely between the scenarios and the sections.
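Aids like ‘My Path’ essentially track a learner’s route through a branching graph of decision points. As a hypothetical illustration (the node names and class below are invented for this post, not taken from the actual course), the state behind such an indicator could be as small as:

```python
class PathTracker:
    """Tracks which decision points a learner has visited in a
    branching scenario. Graph structure is invented for illustration."""

    def __init__(self, graph: dict, start: str):
        self.graph = graph        # node -> list of reachable next nodes
        self.visited = [start]    # ordered route taken so far

    def move_to(self, node: str) -> None:
        """Advance along an edge of the scenario graph."""
        if node not in self.graph[self.visited[-1]]:
            raise ValueError(f"{node} is not reachable from {self.visited[-1]}")
        self.visited.append(node)

    def dots(self) -> str:
        """Render progress: filled dot for visited nodes, open dot otherwise,
        making it visible that other, untaken paths exist."""
        return "".join("●" if n in self.visited else "○" for n in self.graph)
```

The open dots are what communicate non-linearity to the learner: they can see that paths exist which they have not taken.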


This turned out to be one of the most critical components of the course design process. The feedback we received from testers (a representative audience group) was deep and insightful, and we were able to make several improvements to the content as well as the structure as a result of this.

Here’s a very brief overview of the underlying process:

– Very often, we diverged and then converged, taking the best of ideas from all quarters and adding them to the course.

– For every objective, we developed the practice first, and then the associated content. If there were multiple practices, we developed the final, most difficult one first.

– As scenarios got more complex, we made flowcharts in PowerPoint to understand where each link was leading.

– We decided to use Articulate Storyline (the most appropriate tool for development in this case) after careful consideration.

– To ensure that we all had a fair idea of the outcome of a scenario, we decided on a few parameters before starting to write a scenario:

a) The role played by the learner, and who they would be talking to in the scenario
b) The decisions that they would have to make
c) The misconceptions they are likely to have

Here are the links to the blog posts written by Clark Quinn on Learning Solutions Magazine:
– Post 1 – Deeper Design: Working Out Loud
– Post 2 – Deeper Design: Beyond Traditional Instructional Design
– Post 3 – Deeper Design: Tweaking the Media
– Post 4 – Deeper Design: Putting It All Together
– Post 5 – Course Launch: Learnnovators and Quinnovation Launch Demo Course Based on Learning Design Best Principles

And, here’s the link to the course: Workplace of the Future.