This is the first in a series of thoughts on some broken areas of ID that I will be posting on Mondays. The intention is to provide insight into the ways much instructional design fails, and some pointers for avoiding the problems. The point is not to say 'bad designer', but to point out how to do better design.
The way I've seen many learning solutions go awry is right at the beginning, focusing on the wrong objective. Too often the objective is focused on rote knowledge, whether it's facts, procedures, or canned statements. What we see is knowledge dump, or as I've heard it called: show up and throw up. Then, the associated assessment is similarly a regurgitation of what you've just heard. The reasons this happens, and why it doesn't work, are both firmly rooted in the way our brains work.
First, our brains are really bad at rote remembering. We're really good at pattern-matching, and extracting underlying meaning. That's why we use external aids like calendars. Heck, if it's rote knowledge, don't make them memorize it; let them look it up, or automate it. OK, in the rare case where they do have to know it, we can address that, but we overuse this approach. And that's due to the second reason.
Experts don't know how they do what they do, by and large. Our brains 'compile' information; expertise implies becoming so practiced that the process is inaccessible to conscious thought (ask an expert concert pianist to describe what they're doing while playing and their performance falls apart). We found this out in the '80s, when we built so-called 'expert systems' to do what experts said they did. When the systems didn't work, we went back and looked at what the experts were really doing, and there was essentially zero correlation between what they said they did and what they actually did.
What happens, then, is that our Subject Matter Experts (SMEs) do recall what they studied, and toss that out. They'll dump a bunch of relevant knowledge on the designer, and the good little designer will develop a course around what the SME tells them. So, we see objectives like:
Be able to cite common objections to our product.
What's needed is to focus on more meaningful outcomes. Dave Ferguson has written a nice post defending Bloom's skill taxonomies, and he's largely right when saying that focusing on what people actually do with the knowledge is critical. However, I find it simpler to distinguish, à la van Merrienboer, between the knowledge the learner needs and the complex decisions they apply that knowledge to, with the emphasis on the latter. So, I'd like to see objectives more like:
Be able to counter customer objections to our product.
The nuances may seem subtle, but the difference is important.
How does a designer do that? SMEs are not the easiest folks to work with in this regard. I've found it useful to turn the conversation to focus on the things that the learner needs to be able to do after the learning experience. That is, ask them what decisions learners need to be able to make that they can't make now. Not what they need to know, but what they need to be able to do.
And, I argue, what will make the difference going forward will likely be skills: things that learners can do differently, not just what they know. I recall a case where an organization was not just looking for the learners to understand the organizational values, but to act in accordance with them (and what that meant). That's what I'm talking about!
When it comes to capturing objectives, I'm perfectly happy with Mager's format of specifying who the audience is, what they need to be able to do, and a way to determine that they're successfully performing. From there, you can work backwards to the assessment, to the concept, examples, and practice that will develop the skills to pass the assessment.
There's another step, really, before this, and that's determining what decisions learners need to make differently or better to impact the bottom line, i.e. choosing objectives that will affect the organization in important ways, but that's another topic for another day.
Writing good objectives is both a skill that can be learned and a process that can be supported. You should be doing both. Starting from the right objective makes everything else flow well; if you start on the wrong foot, everything else you do is wasted. Get your objectives right, and get your learning going!
Virginia Yonkers says
What a great post! This is the biggest problem I have in teaching distance learning to teachers at all levels of education (including, but not limited to, professional). When I ask them to identify objectives at the beginning of an activity, they identify the “standards” which are easily “measurable”.
Many of my students balk at the question, “What do you want them to be able to do at the end of the lesson? How will they use what they learn?” Until we begin to get some better assessment tools (something better than objective before-and-after tests) that are cost-effective and long-term, though, few people will focus on creating objectives the way you have outlined them.
This is why, when I teach distance learning ID, I begin by asking my students what they would like to do better in their classroom. In other words, what are the students not doing that they would like to see them do better (note that it is not what knowledge they have or should acquire). From there we spend a lot of time crafting the learning objectives. The next step is to develop an assessment tool (rather than the activity itself). Often, in the process of developing the assessment tool, the objectives are modified and fine-tuned. This makes the ID much easier to develop.
Rob Bartlett says
Great post
I find that the SMEs I work with become frustrated with me asking, “What are they going to do back on the job, and how much of it are they going to do?”
It takes a lot of trust from the SMEs to follow this process to develop objectives; they have been in learning events before where there was an outline and an info dump. That worked for them, so it should work for everyone.
Changing the point of SMEs’ involvement to performance on the job, not completion of their info dump into a course, is something I am trying to improve.
Does anyone have any success stories?
Rob Bartlett
Dave Ferguson says
Clark, with a little unexpected money, last month I ordered van Merrienboer’s book, which shows the exciting mental life I lead.
I have no disagreement at all with your “situated objective.” (In my post, I was giving examples of how Bloom’s stuff makes sense for someone who thought it had no merit whatsoever.)
Your point about affecting the bottom line connects well with Gilbert’s concept of worthy performance (not that this is news to you): the whole idea of learning at work is ultimately to increase the value of the results that people achieve.
Dave Ferguson says
I had meant to follow Virginia’s comments, which emphasize something often lost in a too-linear approach to instructional design: ITERATION.
People like the idea that you can run once through a process — any process — and emerge with great results. Here on earth, though, early efforts can help us rethink assumptions, means, and even goals.
I once spent a lot of time working on an orientation program for new salespeople at my employer. The biggest breakthrough came about a third of the way through development, when we realized that the overarching metaphor was not “a day in the life” of a sales rep, but “a deal in the life”… because the salespeople always thought in terms of a deal, which by its nature might stretch over weeks.
Clark says
Virginia, I absolutely like that you have them craft objectives and assessments, and make them mimic what they need to do in the world. And aligning assessment to objective in a feedback loop is great. Help them see that one of their objectives needs to be “how to create assessment that learners ‘get’ is valuable to them”.
Rob, it is hard work to get SMEs to move away from knowledge dump. But they’re motivated to become experts, so it works for them; not necessarily for their audiences. It does take more responsibility on the part of the ID to fight for meaningful objectives, not just accept what’s given, but it’s so worth it when the learner has an experience that they realize is valuable. But it can take patience!
Clark says
Dave, your comments came in while I was responding to Virginia and Rob. Iteration is indeed critical; great point. Those breakthroughs are so important (even if painful to go back :). Great feedback, thanks.