I’ve argued before that the differences between learning that is both well designed and well produced, and learning that is merely well produced, are subtle. And, in general, nuances matter. So, in my recent book, the section on misconceptions spent a lot of time unpacking some terms. The goal there was ensuring that the nuances were understood. And a recent event triggered even more reflection on this.
Learnnovators, a company I’ve done a couple of things with (the Deeper eLearning series, and the Workplace of the Future project), interviewed me once quite a while ago. I was impressed then with the depth of their background research and thoughtful questions. They recently asked to interview me on the book and, of course, I agreed. Again they impressed me with the depth of their questions, and I realized in this case there was something specific going on.
In their questions, they were unpacking what common concerns would be about some of the topics. The questions dug into ways in which people might think that the recommendations are contrary to personal experience, and more. They were very specifically looking for ways in which folks might think to reject the findings. And that’s important. I believe I had addressed most of them in the book, but it was worth revisiting them.
And that’s the thing that I think is important about this for our practice. We can’t just do the surface treatment. If we just say, “OK, we need some content, and then let’s write a knowledge test on it,” we’ve let down our stakeholders. If we don’t know the cognitive properties of the media we use, don’t sweat the details about feedback on assessment, don’t align the practice to the needed performance, etc., we’re not doing our job!
And I don’t mean you have to get a Ph.D. in learning science, but you really do need to know what you’re doing. Or, at least, have good checklists and quick reference guides to ensure you’re on track. Ideally, you review your processes and tools for alignment with what’s known. And the tools themselves could have support. (OK, within limits; I’ve seen this taken so far that it handcuffs design.)
Nuances matter, if you care about the outcomes (and if you don’t, why bother? ;). I’ve been working on both a checklist and very specific changes that apply to various places in design processes where folks most often go wrong. These problems are relatively small and easy to fix, and addressing them is designed to yield big improvements. But unless you know what they are, you’re unlikely to have the impact you intend.