Clark Quinn's Learnings about Learning
(The Official Quinnovation blog)

3 January 2018

2018 Trajectories

Clark @ 8:08 AM

Given my reflections on the past year, it’s worth thinking about the implications.  What trajectories can we expect if the trends are extended?  These are not predictions (as has been said, “never predict anything, particularly the future”).  Instead, these are musings, and perhaps wishes for what could (even should) occur.

I mentioned an interest in AR and VR.  I think these are definitely on the upswing. VR may be on a rebound from some early hype (certainly ‘virtual worlds’), but AR is still in the offing.  And the tools are becoming more usable and affordable, which typically presages uptake.

I think the excitement about AI will continue, but I reckon we’re already seeing a bit of a backlash. I think that’s fair enough. And I’m seeing more talk about Intelligence Augmentation, and I think that’s a perspective we continue to need. Informed, of course, by a true understanding of how we think, work, and learn.  We need to design to work with us.  Effectively.

Fortunately, I think there are signs we might see more rationality in L&D overall. Certainly we’re seeing lots of people talking about the need for improvement. I see more interest in evaluation, which is also a good step. In fact, I believe it’s a good first step!

I hope it goes further, of course. The cognitive perspective suggests everything from training & performance support, through facilitating communication and collaboration, to culture. There are many facets that can be fine-tuned to optimize outcomes.

Similarly, I hope to see continuing improvement in learning engineering. That's part of the reason for the Manifesto and the Quinnov 8. How it emerges, however, is less important than that it does. Our learners, and our organizations, deserve nothing less.

Thus, the integration of cognitive science into the design of performance and innovation solutions will continue to be my theme.  When you’re ready to take steps in this direction, I’m happy to help. Let me know; that’s what I do!

2 January 2018

Reflections on 2017

Clark @ 8:07 AM

The end of the calendar year, although arbitrary, becomes a time for reflection. I looked back at my calendar to see what I'd done this past year, and it was an interesting review. Places I've been and things I've done point to some common themes. Such is the nature of reflection.

One of the things I did was speak at a number of events. My messages have been pretty consistent along two core themes: doing learning better, and going beyond the course. Both were presented at TK17, which started the year, and one or the other was reiterated through other ATD and Guild events.

With one exception. For my final ATD event of the year, I spoke on Artificial Intelligence (AI). It was in China, and they’re going big into AI. It’s been a recurrent interest of mine since I was an undergraduate. I’ve been fortunate to experience some seminal moments in the field, and even dabble.  The interest in AI does not seem to be abating.

Another persistent area of interest has been Augmented Reality (AR) and Virtual Reality (VR). I attended an event focused on Realities, and I continue to believe in the learning potential of these approaches. Contextual learning, whether building fake or leveraging real, is a necessary adjunct to our learning.  One AR post of mine even won an award!

My work continues to span both organizational learning and higher education. Interestingly, I spoke to an academic audience about the realities of workplace learning! I also had a strategic engagement with a higher education institution on improving elearning.

I also worked on a couple of projects. One I mentioned last week, a course on better ID. I'm still proud of the eLearning Manifesto (as you can see in the sidebar ;). And I continue to want to help people do better using technology to facilitate learning. I think the Quinnov 8 are a good way.

All in all, I still believe that pursuing better and broader learning and performance is a worthwhile endeavor. Technology is a lovely complement to our thinking, but we have to do it with an understanding of how our brains work. My last project from the year is along these lines, but it’s not yet ready to be announced. Stay tuned!

27 December 2017

Pernicious problems

Clark @ 8:05 AM

I’m using a standard for organizational learning quality in the process of another task.  Why or for whom doesn’t matter. What does matter is that there are two problems in their standard that indicate we still haven’t overcome some pernicious problems.  And we need to!

So, for the first one, this is in their standard for developing learning solutions:

Uses blended models that appeal to a variety of learning styles.

Do you see the problem here?  Learning styles are debunked! There’s no meaningful and valid instrument to measure them, and no evidence that adapting to them is of use.  Appealing to them is a waste of time and effort. Design for the learning instead!  Yet here we’re seeing someone conveying legitimacy by communicating this message.

The second one is also problematic, in their standard for evaluation:

Reports typical L&D metrics such as Kirkpatrick levels, experimental models, pre- and post-tests and utility analyses.

This one's a little harder to see. If you think about it, however, you should recognize that pre- and post-test comparisons aren't good measures. What you're measuring is a delta, and the problem is that you would expect a delta; it doesn't really tell you anything. (And if performance doesn't improve, you shouldn't have bothered in the first place!) What you want instead is to confirm that learners are achieving an objectively set level of performance. Are they now able to perform? Or how many are? Doing the pre-post comparison is like using norm-referenced measurement (e.g. grading on a curve) when you should be using criterion-referenced performance.
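
To make the contrast concrete, here's a small Python sketch. The scores and the mastery criterion are made up for illustration, but they show how a pre/post delta can look like success while criterion-referenced reporting tells a very different story:

```python
# Illustrative sketch: criterion-referenced reporting vs. a pre/post delta.
# All scores and the mastery criterion below are hypothetical.

pre_scores  = [55, 60, 48, 70, 62]   # percent correct before training
post_scores = [72, 75, 66, 88, 70]   # percent correct after training
criterion   = 80                     # objectively set performance bar

# The pre/post delta: almost always positive, so it tells you little.
mean_delta = sum(p - q for p, q in zip(post_scores, pre_scores)) / len(pre_scores)

# The criterion-referenced question: how many can now actually perform?
n_meeting = sum(1 for p in post_scores if p >= criterion)
share_meeting = n_meeting / len(post_scores)

print(f"mean delta: {mean_delta:.1f} points")     # a healthy-looking gain...
print(f"meeting criterion: {share_meeting:.0%}")  # ...yet most still can't perform
```

With these numbers the average gain is over 15 points, yet only one learner in five actually reaches the performance bar — exactly the gap between a delta and a criterion.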

And this is from an organization that purports to communicate L&D quality! These are both from their base level of operation, which means they're considered acceptable. This is evidence that our problems aren't just in practice; they're pernicious, present in the mindset of even the supposed experts. Is it any wonder the industry is having trouble? And I haven't rigorously reviewed the standard; I was merely using it (I wonder what I'd find if I did?).

Maybe I’m being too harsh. Maybe the wording doesn’t imply what I think it does.  But I’ll suggest that we need a bit more rigor, a bit more attention to science in what we do. What have I missed?



21 December 2017


Clark @ 8:01 AM

Expertise is an elusive thing. It comes from years of experience in a field. However, it turns out that it doesn't just accumulate. You need very specific practice and/or useful feedback to develop it. And the more expertise you have, the better you are able to apply it to situations. Which has implications for what you do and when and how you do it.

Expertise is valuable. The properties of expertise include that it’s compiled away to be essentially automatic. Which implies it’s not accessible for conscious introspection. (Which is why experts quite literally cannot tell you what they do!)  On the other hand, their responses to situations in their area of expertise are likely to be as good as you can get.  They apply mental models they’ve developed to solve problems.

If you want to develop expertise as an individual, you need to understand how to practice.  Deliberate practice, as Ericsson details, is the key.  You need to practice at the limits of your ability, and consciously learn from the outcomes.  It’s not just doing the job, it’s pushing the boundaries, and actively reflecting.

If you want to develop expertise as an organization internally, the situation is very much the same.  You need resources to develop people, and stretch assignments with feedback and coaching to optimally develop the expertise.

Of course, you can bring in expertise from outside, as well. The question then becomes one of when and who. You can contract out work, which makes sense when the activity isn't part of your core ability. Outsourcing to technology or external expertise is fine for areas that are well developed.

Otherwise, you can bring in consultants. This is particularly useful when you are moving in a new direction or want to deepen your understanding. A good consultant will work with you not only to help address the situation, but to internally develop your own understanding. The key is working collaboratively and transparently. Yes, I have a vested interest, but I believe these things are true in principle and should be in practice.

Expertise is core to situations you know you need expertise in, but also to those that are new. When you need innovation, you need expertise in the complementary areas you're applying to address the situation, so the only new learning required is about the problem itself. At least, that's my expert opinion. Which, of course, is on tap if needed ;).

19 December 2017


Clark @ 8:03 AM

Sparked by a colleague, I’m reading The Digital Transformation Playbook, by David Rogers. In the chapter on innovation, he talks about two types of experimentation: convergent and divergent. And I was reminded that I think of two types of innovations as well.  So what are they?


He talks about how experimentation is the key to innovation (in fact, the chapter title is Innovate by Rapid Experimentation). His point is that you need to be continually experimenting, rapidly.  And throughout the organization, not just in separate labs. Also, it’s ok to fail, as long as the lesson’s learned.  And then he distinguishes between two types of experimentation.

The first is convergent. Not surprisingly, this is when you're trying to eliminate options and make a decision. This is your classic A/B testing, for example. Here you might try out two or three different solutions to see which one works best. You create the options, and have measures you'll use to determine the answer. You might ask: should we use a realistic video or a cartoon animation? It's a situation where there isn't a principled answer, and you need to make a decision.
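
As a sketch of what that convergent decision looks like in practice, here's a minimal A/B comparison in Python. The scenario (realistic video vs. cartoon animation) comes from above; the pass counts and the use of a two-proportion z-test are my own illustrative assumptions, not anything Rogers prescribes:

```python
# Illustrative convergent experiment (all numbers hypothetical):
# did the realistic video or the cartoon animation lead to more
# learners passing the final scenario?

import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-proportion z-test; returns (z, two-sided p-value)."""
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Normal CDF via the error function; two-sided tail probability.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Variant A: realistic video; Variant B: cartoon animation.
z, p = two_proportion_z(success_a=78, n_a=120, success_b=62, n_b=118)
print(f"z = {z:.2f}, p = {p:.3f}")  # use the result to converge on one variant
```

The point isn't the statistics; it's that you defined the options and the measure up front, and the experiment exists to eliminate all but one.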

Divergent experimentation is, instead, exploratory. Here you give folks some ideas, or a prototype, and see what happens. You don’t know what you’ll get, but you’re eager to learn.  What would a scenario look like here?


These roughly correspond to the two types of innovation I think of. One is the 'we need to solve this' type. I think of this as short-term innovation. Here we are problem-solving or troubleshooting. You bring together a team with the relevant capabilities that is otherwise as diverse as possible. You facilitate the process. And you're likely to try convergent experimentation.

At the other end is the serendipitous, long-term innovation that happens because you create an environment where ideas can gestate.  You’ve got access to the adjacent possible, and the opportunities to explore and share. It’s safe to experiment and fail.  People are supposed to take time to reflect! This is more closely aligned to divergent experimentation.

Note that this is all learning, as you don’t know the answers when you start!  The success of organizational learning, however, is a product of both. You need to solve the problems you know you have, and allow for ideas to generate solutions to problems you didn’t know you had.  Or, more optimistically, to search through idea spaces for opportunities you didn’t know to look for.

Rogers is right that continual experimentation is key.  It has to become baked into how you do what you do.  Individually, and organizationally.  And you can’t really get it unless you start practicing it yourself.  You need to continually challenge yourself, and try things both to fix the problems, and to explore things that are somewhat tangential. Your own innovations will be key to your ability to foster them elsewhere.

Too many orgs are focused only on the short-term. And while that may meet shareholder return expectations, it's not a recipe for longer-term organizational survival. You need both types of innovation. So, the question is whether you can assist your org in making a shift to the serendipitous environment. Are you optimizing your innovation?

13 December 2017

Higher Ed & Job Skills?

Clark @ 8:08 AM

I sat in on a Twitter chat yesterday, #DLNChat, which is a higher ed tech focused group (run by EdSurge). The topic was the link between higher ed and job skills, and I was a wee bit cynical. While I think there are great possibilities, the current state of the art leaves a lot to be desired.

So, I currently don’t think higher ed does a good job of preparation for success in business. Higher ed focuses too much on knowledge, and uses assignments that don’t resemble the job activities.  Frankly, there aren’t too many essays in most jobs!

Worse, I don't think higher ed does a good job of developing meta-cognitive and meta-learning skills. There is little attempt to bridge assignments across courses, so your presentations in psychology 101 and sociology 202 and business 303 aren't steadily tracked and developed. Similarly with research projects, or strategy, or… And there are precious few assignments (read: typically none) where you actually make decisions the way you would on the job.

And, sadly, the use of technology isn’t well stipulated either. You might use a presentation tool, a writing tool, or a spreadsheet, maybe even collaboratively, but it’s not typically tied to external resources and data.

Yes, I know there are exceptions, and it may be changing somewhat, but it still appears to be the case. Research, write a paper, take a test.

Yet the role of developing higher skills is possible and valuable.  We could be providing more meaningful assignments, integrating meta-learning layers, and developing both meaningful skills and meta-skills.

This doesn’t have to be done at the expense of the types of things professors believe are important, but just with a useful twist in the way the knowledge is applied. It might lead to a revision of the curriculum, at least somewhat, but I reckon it’d likely be for the better ;).

Our education system, both K12 and higher ed, isn't doing anywhere near what it could, and should. As Roger Schank says, there are only two things wrong: what we teach, and how we teach it. We can do better. Will we?

6 December 2017

Conceptual Clarity

Clark @ 8:07 AM

Ok, so I can be a bit of a pedant. Blame it on my academic background, but I believe conceptual clarity is important! If we play fast and loose with terminology, we can be convinced of something without truly understanding it. Ultimately, we can waste money chasing unwarranted directions and, worse, perhaps even do wrong by our learners.

Where do the problems arise?  Sometimes, it’s easy to ride a bizbuzz bandwagon.  Hey, the topic is hot, and it sounds good.  Other times, it’s just too hard to spend the effort. Yet getting it wrong ends up meaning you’re wasting resources.

Let’s be clear, I’m not talking myths. Those abound, but here I’m talking about ideas that are being used relatively indiscriminately, but in at least one interpretation there’s real value.  The important thing is to separate the wheat from the chaff.

Some concepts that are running around recently and could use some clarity are the following:

Microlearning.  I tried to be clear about this here. In short, microlearning is about small chunks where the learning aggregates over time.  Aka spaced learning.  But other times, people really mean performance support (just-in-time help to succeed in the moment). What you don’t want is someone pretending it’s so unique that they can trademark it.
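
To show what "small chunks aggregating over time" means operationally, here's a minimal sketch of an expanding review schedule in Python. The specific intervals are a common illustrative pattern, not any vendor's algorithm or a research-validated prescription:

```python
# Illustrative sketch of spaced (micro)learning: schedule short review
# chunks at expanding intervals after the first exposure.
# The interval pattern below is a hypothetical example.

from datetime import date, timedelta

def spaced_schedule(start, intervals_days=(1, 3, 7, 14, 30)):
    """Return review dates at expanding gaps after the first exposure."""
    day = start
    schedule = []
    for gap in intervals_days:
        day = day + timedelta(days=gap)  # each review waits longer than the last
        schedule.append(day)
    return schedule

# First exposure on 3 January 2018; reviews stretch out over two months.
for d in spaced_schedule(date(2018, 1, 3)):
    print(d.isoformat())
```

The design choice that matters is the expansion: each chunk is small, but the retrievals spread out over weeks, which is where the learning aggregates — as opposed to performance support, which is consulted once, in the moment.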

70:20:10.  This is another that some people deride, and others find value in. I’ve also talked about this.   The question is why they differ, and my answer is that the folks who use it as a way to think more clearly about a whole learning experience find value. Those who fret about the label are missing the point.  And I acknowledge that the label is a barrier, but that horse has bolted.

Neuro- (aka brain- ). Yes, our brains are neurologically based. And yes, there are real implications. Some.  Like ‘the neurons that fire together, wire together’.  And yet there’re a whole lot of discussions about neuro that are really at the next higher level: cognitive.  This is just misleading folks to make it sound more scientific.

Unlearning. There’s a lot of talk about unlearning, but in the neurological sense it doesn’t make sense. You don’t unlearn something.  As far as we can tell, it’s still there, just increasingly hard to activate. The only real way to ‘unlearn’ is to learn some other response to the same situation.  You learn ‘over’ the old learning. Or overlearn.  But not unlearn. It’s an unconcept.

Gamification. This is actually the one that triggered this post. In theory, gamification is the application of game mechanics to learning. Interestingly, Raph Koster wrote that what makes games fun is that they are intrinsically about learning! However, there are important nuances. It's not just about adding PBL (points, badges, and leaderboards). These aren't bad things, but they're secondary. Designing the intrinsic action around the decisions learners need to be able to make is a deeper and more meaningful implication. Yet people tend to ignore it because it's 'harder'. But it's really just about good learning design.

There are more, of course, but hopefully these illustrate the problem. (What are yours?)  Please, please, be professional and take the time to get clear about our cognitive architecture enough to ensure that you can make these distinctions on your own. We need the conceptual clarity!  Hopefully then we can reserve excitement for ideas that truly add value.

29 November 2017

Before the Course

Clark @ 8:04 AM

It appears that, too often, people are building courses when they don’t need to (or, more importantly, shouldn’t).  I realize that there are pressures to make a course when one is requested, including expectations and familiarity, but really, you should be doing some initial thinking about what makes sense.  So here’s a rough guide about the thinking you should do before you course.

You begin with a performance problem. Something's not right: calls take too long, sales success rate is too low, there're too many errors in manufacturing. So it must need training, right? Er, no. There's this thing called 'performance consulting' that talks about identifying the gaps that could be preventing the desirable outcomes, and they're not all gaps that training meets. So we need to triage, and see what's broken and what's the priority.

To start, people can simply not know what they’re supposed to do.  That may seem obvious, but it can in fact be the case.  Thus, there’s a need to communicate. Note that this and all of these are more complex than just ‘communicate’. There are the issues about who needs to communicate, and when, and to whom, etc.  But it’s not (at least initially) a training problem.

If they do know, and could do it but aren't, the problem isn't going to be solved by training. As someone once put it, ask whether they "could do it if their life depended on it"; if so, there's something else going on. If they're not following safety procedures because the procedures are too onerous, a course isn't going to fix it. You need to address their motivation.

Now, if they can’t do it, then could they do it if they had the right tools, or more people, or more time? In other words, is it a resource problem?  And, in one way I like to think about it: can we put the solution in the world, instead of in the head?  Will lookup tables, checklists, step-by-step guides or videos solve the problem? Or even connections to other folks! (There are times when it doesn’t make sense to course or even job-aid; e.g. if it’s changing too fast, or too unique, or…)

And, of course, if you don’t have the right people, training still may not work. If they need to meet certain criteria, but don’t, training won’t solve it.  Training can’t fix color-blindness or lack of height, for instance.

Finally, if the prior solutions won’t solve it, and there’s a serious skill gap, then it’s time for training.  And not just knowledge dump, of course, but models and examples and meaningful (and spaced) practice.
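
The triage above can be sketched as a simple decision chain. The questions and their ordering follow the post; the function name, inputs, and intervention labels are my own illustrative shorthand, not a formal performance-consulting model:

```python
# A rough sketch of the pre-course triage described above.
# Each question is checked in order; training is the last resort.

def triage(knows_expectations, can_do, would_resources_fix, meets_prerequisites):
    """Return the intervention to try before (or instead of) building a course."""
    if not knows_expectations:
        # They simply don't know what they're supposed to do.
        return "communicate expectations"
    if can_do:
        # They could do it if their life depended on it: a motivation issue.
        return "address motivation (e.g. simplify onerous procedures)"
    if would_resources_fix:
        # Put the solution in the world, not in the head.
        return "put the solution in the world: job aids, checklists, tools"
    if not meets_prerequisites:
        # Training can't fix color-blindness or lack of height.
        return "fix selection criteria; training won't help"
    # A genuine skill gap: models, examples, and spaced practice.
    return "train: models, examples, meaningful spaced practice"

print(triage(knows_expectations=True, can_do=False,
             would_resources_fix=True, meets_prerequisites=True))
```

Running it with the values above lands on the job-aid branch: the point of the whole chain is how many exits come before "train."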

Again, these are all abbreviated, and this is oversimplified.  There’s more depth to be unpacked, so this is just a surface level way to represent that a course isn’t always the solution.  But before you course, consider the other solutions. Please.

28 November 2017

eLearning Land

Clark @ 8:03 AM

This post is just a bit of elearning silliness, parodying our worst instincts…

Welcome back my friends, to the show that never ends. We’re so glad you could attend. Come inside, come inside! – Emerson, Lake & Palmer: Karn Evil 9, 1st Impression, Part 2.

It’s so good to see you, and I hope you’re ready for fun. Let’s introduce you to the many attractions to be found here.  We’ve got entertainment suitable for all ages, and wallets!  You can find something you like here, and for an attractive cost.

To start, we have the BizBuzz arcade. It's a mirror maze, where all things look alike. Microlearning, contextual performance support, mobile elearning, chunking, just-in-time, it's all there. Shiny objects appear and disappear before your eyes! Conceptual clarity is boring; it's all about the sizzle.

And over here is the Snake Oil Pool.  It’s full of cures for what ails you!  We’ve got potions and lotions and aisles of styles.  It’s slippery, and unctuous; you can’t really get a handle on it, so how can you go wrong?  Apply our special solution, and your pains go away like magic.  Trust us.

Step right up and ride the Hype Balloon!  It’s a quick trip to the heights, held aloft by empty promises based upon the latest trends: neuro/brain-based, millennial/generations, and more.  It doesn’t matter if it holds water, because it’s lighter than air!

Don’t forget the wild Tech Lifecycle ride. You’ll go up, you’ll go down, you’ll take unpredictable twists, followed by a blazing finale. Get in line early!  You’ll leave with a lighter pocketbook, and perhaps a slight touch of nausea, but no worries, it was fun while it lasted.

Come one, come all! We’ll help you feel better, even if when you leave things aren’t any different. You’ll at least have been taken for a ride.  We’ll hope to see you again soon.

This was a jest, this was only a jest. If this were a real emergency, I’d write a book or something. Seriously, we do have to pay attention to the science in what we’re doing, and view things with a healthy skepticism.  We now return you to your regularly scheduled blog, already in progress.  

15 November 2017

#AECT17 Reflections

Clark @ 8:10 AM

Ok, so I was an academic for a brief and remarkably good period of time (a long time ago). Mind you, I've kept my hand in: reviewing journal and conference submissions, writing the occasional book chapter, contributing to some research, even playing a small role in some grant-funded projects. I like academia; it's just that circumstances took me away (and I like consulting too; different, not one better). However, there are a lot of benefits from being engaged, particularly keeping up with the state of the art. At least from one perspective… Hence, I attended the most recent meeting of the Association of Educational Communications & Technology, pretty much the society for academics in instructional technology.

The event features many of your typical components: keynotes, sessions, receptions, and the interstitial social connections. One of the differences is that there’s no vendor exhibition. And there are a lot of concurrent sessions: roughly 27 per time slot!   Now, you have to understand, there are multiple agendas, including giving students and new faculty members opportunities for presentations and feedback. There are also sessions designed for tapping into the wisdom of the elders, and working sessions to progress understandings. This was only my second, so I may have the overall tenor wrong.  Regardless, here are some reflections from the event:

For one, it’s clear that there’s an overall awareness of what could, and should, be happening in education. In the keynotes, the speakers repeatedly conveyed messages about effective learning. What wasn’t effectively addressed was the comprehensive resistance of the education system to meaningful change.  Still, all three keynotes, Driscoll, Cabrera, and Reeves, commented in one way or another on problems and opportunities in education. Given that many of the faculty members come from Departments of Education, this is understandable.

Another repeated emergent theme (at least for me) was the need for meaningful research. What was expressed by Tom Reeves in a separate session was the need for a new approach to research grounded in focusing on real problems. I’ve been a fan of his call for Design-Based Research, and liked what he said: all thesis students should introduce their topics with the statement “the problem I’m looking at is”. The sessions, however, seemed to include too many small studies. (In my most cynical moments, I wonder how many studies have looked at teaching students or teacher professional development and their reflections/use of technology…).

One session I attended was quite exciting. The topic was the use of neuroscience in learning, and the panelists were all people using scans and other neuroscience data to inform learning design. While I generally deride the hype that usually accompanies the topic, here were real researchers talking actual data and the implications, e.g. for dyslexia. While most research results with implications for design are still at the cognitive level, it's important to continue to push the boundaries.

I focused my attendance mostly on the Organizational Training & Performance group, and heard a couple of good talks.  One was a nice survey of mentoring, looking across the research, and identifying what results there were, and where there were still opportunities for research. Another study did a nice job of synthesizing models for human performance technology, though the subsequent validation approach concerned me.

I did a couple of presentations myself that I'll summarize in tomorrow's post. The challenges are different than in corporate learning technology, but there are interesting outcomes worth tracking. All in all, a valuable experience.
