Learnlets
Clark Quinn's Learnings about Learning
(The Official Quinnovation blog)

8 February 2009

Monday Broken ID Series: The Introduction

Clark @ 3:52 pm

First Series Post | Next Series Post

This is one in a series of thoughts on some broken areas of ID that I'm posting for Mondays.  The intention is to provide insight into the many ways much of instructional design fails, and some pointers for avoiding the problems. The point is not to say 'bad designer', but instead to point out how to do better design.

One of the first things learners see is the introduction to the content; it's the first place they can be disappointed, and all too often they are.  They're given objectives that don't matter to them, they're told in dull terms what they're going to see; it's all aversive rather than interesting.  Which is a wonderful way to start a learning experience, eh?

What we want to do is bring in the emotion! Almost all of instructional design is about the cognitive part, yet the motivational part is often just as important.  And we’ve got to go beyond simplistic views of what that means.

Even cognitive science recognizes that there's more to the mind than the cognitive aspect, and includes the affective and conative as well.  The affective covers your learning characteristics, your learner styles.  The conative is the interesting bit: the intention to learn, which includes things like motivation to learn, anxiety about learning, etc.

I've gone off before about learning styles, and the short answer is to a) use the right media for the message, and b) provide help for learners.  However, addressing motivation and anxiety is a different, and important, thing.  We want to assist their motivation, which happens by helping the learner connect this experience to themselves and their goals.  And we want to reduce their anxiety to an appropriate level (people perform better under a little pressure) by helping manage their expectations.

To help with motivation, there are a couple of things to do.  We know that learners learn better when we activate relevant information up front (it helps associate the new information with existing information).  I maintain that we want to extend that and open them up emotionally too, and I believe it should be done first.  We need to indicate the consequences of the knowledge: negative consequences of not having the information, or positive consequences of having it.  The consequences can be exaggerated, within bounds, to increase the emotional impact, and it can be done dramatically (see Michael Allen's Flight Safety video) or humorously.  I've used comic strips to begin elearning sections (we don't use comics enough)!

There are nuances here: it has to be specific to the situation, not just an unrelated exaggeration.  Done well, it can incorporate the cognitive association activation as well!  But hook them emotionally, and the information will stick better.  Too often in the learning I see, there's not just little but essentially no attention to why this information is important to the learner, and that's got to be job number one, or we risk wasting the rest of the effort.

Then we come to objectives, and here I nod in the direction of Will Thalheimer, who's said this better than I have: the objectives we show to the learner are not the ones we use to design!  Too often, there's a section in the cookie-cutter template for objectives, and we slap in the ones we're designing to.  Wrong: bad designer, no Twinkie™.  We (should) use objectives [previous post] to align what we're doing to the real need, but the learners don't want to know about our metrics.  The objectives for them need to be rewritten in a WIIFM (What's In It For Me) framework. They should get objectives that let them know what they'll be able to do that they can't do now, and that they care about!

Another thing that helps, and now we're on to anxiety more, is addressing expectations.  Stephanie Burns showed that of people who set out to accomplish a goal, those who succeeded were those who managed their expectations appropriately.  Similarly, when I run workshops, I find I get fewer concerns when I lay out what's going to happen and why, rather than just barging ahead.  If people don't know what to expect, or expect X (e.g. entertainment) and get some Y (e.g. hard work), they get frustrated or concerned by the mismatch.  They can get upset in particular if one aspect is difficult and they feel like they're floundering.  Making sure that expectations are set appropriately helps learners feel like they're in sync with what's happening, and maintains their confidence.

A role that's cognitive as well as motivational, and one we don't play enough, is contextualizing what's happening.  Too often, learning is conducted in a vacuum.  Yet Charles Reigeluth's Elaboration Theory suggests drilling down, and I say contextualize the learning in the larger picture of what's happening in the world. Even if we're learning about some minor medical procedure, we can talk about how health care is a major issue, and getting this procedure right is one of the components that make care effective and efficient.  Or somesuch, but you can quickly connect what they're learning to the real world, and you should.  It'll again help associate relevant knowledge and increase the effectiveness of the message by connecting what's happening now to what's really important.

And, I’ll finally add, no pre-tests, unless it’s to let the learners test out. I’ve talked about that before, so I’ll merely point you to my previous screed.

So, introduce your learners appropriately to the learning, get them cognitively and emotionally ready for the learning experience, and you won't be throwing away all the effort to develop what follows the introduction; you'll be maximizing it.  And that's what you want, in the end: for that learning to stick.

6 February 2009

Jumpstarting

Clark @ 3:57 pm

I’m on the Board of Directors for an educational not-for-profit that has had almost 30 years of successful work with programs in classrooms, nationally and internationally.  However, 5 years ago or so when I joined, they were doing almost nothing with technology.  Since then I’ve been working systematically to get them to the stage where they’re leveraging technology not just for education, but for the organization.

It’s been a slow road. There were several false starts along the way, with two separate groups within the organization having a go, but each withered.  I wrote a vision document, laying out the opportunities, but they just weren’t getting the message; they were already successful.  Several things have helped: the economic uncertainties of funding for the past few years,  an external group that looked to partner for online delivery (which went awry, sadly), and the growing use of technology by their ever-younger employees (and their audience!).

Mainly through persistence, consistently better messaging, and a growing awareness on the part of both Board and organization, I finally managed to get the Board to push for an IT Strategy from the organization, which led to the formation of an IT Committee on the Board.  (For my sins I got to chair it.)  Since then I’ve been working with the organization to start developing a strategy, though I can only advise.

Jumpstarting may seem hardly the right phrase for a process that has taken several years, but it's actually a significant shift and real progress.  They're still having trouble forming a real strategic vision, focusing a bit too much on tactics like a killer website instead of back-end systems and information architecture, but it's within grasp now.  I'll likely be going down and giving the organization's team a more in-depth view, and the Board has asked for an overview of the new technologies and the opportunities.  I'm even going to run a survey to see if we can move to more use of technology for the Board's communications (the number of trees…).

Persistence pays off, even in the most hidebound environments.  Serendipity helps, but you get better at getting the message across.  And the number of examples now available makes it even easier.  Jumper cables, anyone?

2 February 2009

Economic Impact

Clark @ 12:05 pm

The Learning Circuits Blog Big Question of the Month is "What is the impact of the economy on you and your organization? What are you doing as a result?".  Heavy topic, but appropriate for the times.  I'll answer two-fold: for myself, and what I see for clients.

First, the impact: for some organizations it appears mostly an issue of scale, trying to do more with less, but for others it's more cataclysmic: layoffs, no ability to secure capital for activity, no new business.  For me, it's several projects that have gone on hold (extra capability available, call now, operators are standing by).

I've gone off already on the economic times and what I see as valuable steps for organizations: investing in capability.  I very much believe in walking the walk, so I'm investing in my own capability.  I've checked out some non-fiction books from the library that I'm reading to expand my abilities, I've revamped my website (and am continuing to improve it), I'm writing a new article, and I've started a new blog series on ID.

The point is to use down-time to be prepared to capitalize on the up-time.  Fingers crossed for all of us that it occurs soon.

1 February 2009

(New) Monday Broken ID Series: Objectives

Clark @ 7:27 pm

Next series post

This is the first in a series of thoughts on some broken areas of ID that I will be posting for Mondays.  The intention is to provide insight into the many ways much of instructional design fails, and some pointers for avoiding the problems. The point is not to say 'bad designer', but instead to point out how to do better design.

The way I've seen many learning solutions go awry is right at the beginning: focusing on the wrong objective.  Too often the objective is focused on rote knowledge, whether it's facts, procedures, or canned statements.  What we see is a knowledge dump, or as I've heard it called: show up and throw up.  Then the associated assessment is similarly a regurgitation of what you've just heard.  The reasons this happens, and why it doesn't work, are both firmly rooted in the way our brains work.

First, our brains are really bad at rote remembering.  We’re really good at pattern-matching, and extracting underlying meaning.  That’s why we use external aids like calendars.  Heck, if it’s rote knowledge, don’t make them memorize it, let them look it up, or automate it.  OK, in the rare case where they do have to know it, we can address that, but we overuse this approach.  And that’s due to the second reason.

Experts don't know how they do what they do, by and large.  Our brains 'compile' information; expertise implies becoming so practiced that the process is inaccessible to conscious thought (ask an expert concert pianist to describe what they're doing while playing and their performance falls apart).  We found this out in the '80s, when we built so-called 'expert systems' to do what experts said they did.  When the systems didn't work, we went back and looked at what the experts were really doing, and there was essentially zero correlation between what they said they did and what they actually did.

What happens, then, is that our Subject Matter Experts (SMEs) do recall what they studied, and toss that out.  They’ll dump a bunch of relevant knowledge on the designer, and the good little designer will develop a course around what the SME tells them.  So, we see objectives like:

Be able to cite common objections to our product.

What's needed is to focus on more meaningful outcomes.  Dave Ferguson has written a nice post defending Bloom's skill taxonomies, and he's largely right when he says that focusing on what people actually do with the knowledge is critical. However, I find it simpler to distinguish, à la Van Merriënboer, between the knowledge the learner needs and the complex decisions they apply that knowledge to, with the emphasis on the latter.  So, I'd like to see objectives more like:

Be able to counter customer objections to our product.

The nuances may seem subtle, but the difference is important.

How does a designer do that?  SMEs are not the easiest folks to work with in this regard.  I've found it useful to turn the conversation to focus on the things that the learner needs to be able to do after the learning experience.  That is, ask them what decisions learners need to be able to make that they can't make now.  Not what they need to know, but what they need to be able to do.

And, I argue, what will likely make the difference going forward is skills: things that learners can do differently, not just what they know.  I recall a case where an organization was not just looking for the learners to understand the organizational values, but to act in accordance with them (and what that meant).  That's what I'm talking about!

When it comes to capturing objectives, I’m perfectly happy with Mager’s format of specifying who the audience is, what they need to be able to do, and a way to determine that they’re successfully performing.  From there, you can work backwards to the assessment, to the concept, examples, and practice that will develop the skills to pass the assessment.

There's another step, really, before this, and that's determining what decisions learners need to make differently or better to impact the bottom line, i.e. choosing objectives that will affect the organization in important ways, but that's another topic for another day.

Doing good objectives is both a skill that can be learned, and a process that can be supported.  You should be doing both.  Starting from the right objective makes everything else flow well; if you start on the wrong foot, everything else you do is wasted.  Get your objectives right, and get your learning going!

