Learnlets
Clark Quinn's Learnings about Learning
(The Official Quinnovation blog)

20 July 2016

The wrong basis

Clark @ 8:08 am

Of late, I’ve been talking about the approach organizations take to learning.  It’s come up in presentations on learning design, measurement, and learning technology strategy.  And the point is simple: we’re not using the right basis.

What we’re supposed to be doing is empirically justifiable:

  • doing investigations into the problem
  • identifying the root cause
  • mapping back to an intervention design
  • determining how we’ll know the intervention is working
  • implementing our intervention
  • testing to see if we’ve achieved the necessary outcome
  • and revising until we do

Instead, what we see is what I’ve begun to refer to as ‘faith-based learning’: if we build a course, it is good!  We:

  • take orders for courses
  • document what the SME tells us
  • design a screen-friendly version of the associated content
  • and add a knowledge test

Which would be well and good except that this approach has a very low likelihood of affecting anything except perhaps our learners’ patience (and of course our available resources). Orders for courses have little relation to the real problems, SMEs can’t tell you what they actually do, content on a screen doesn’t mean learners know how to or will apply it, and a quiz isn’t likely to lead to any meaningful change in behavior (even if it is tarted up with racing cars).

The closer you are to the former, the better; the closer to the latter, the more likely it is that you’re quite literally wasting time and money.

Faith may not be a bad thing for spirituality, but it’s not a particularly good basis for attempting to develop new skills.  I’ve argued that learning design really is rocket science, and we should be taking an engineering approach.  To the extent we’re not – to the extent that we are implicitly accepting that a course is needed and that our linear processes are sufficient – we’re taking an approach that very much is based upon wishful thinking. And that’s not a good basis to run a business on.

It’s time to get serious about your learning.  It’s doable, with less effort than you may think.   And the alternative is really unjustifiable. So let’s get ourselves, and our industry, on a sound basis.  There’s a lot more we can do as well, but we can start by getting this part right.  Please?

19 July 2016

‘Form’ing learning

Clark @ 8:04 am

Last week I ran a workshop for an online university that is working to improve its learning design.  Substantially.  They’re ramping up their staff abilities, and we’d talked about how I could help.  They have ‘content’, but want to improve the learning design around it.  While there are a number of steps to take (including how you work with SMEs, the details you attend to in your content, etc.), their internal vocabulary talks about ‘knowledge checks’, and the goal was to do those better as they migrate existing courses to a new platform with a suite of assessment types.

So, first of all, my focus was on formative evaluation.  If we take activity-based learning seriously, we need to ensure that there are meaningful tasks set that can provide feedback.  They are fans of Make It Stick (mentioned in my Deeper eLearning reading list), so it was easy to help them recognize that good activities require learners to retrieve the information in context, so each formative evaluation should be a situation requiring a decision.
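To make that concrete, here’s a minimal sketch (Python, with made-up content and field names; no particular authoring tool implied) of a formative item framed as a situation requiring a decision, rather than a recall prompt:

```python
# Hypothetical structure for a decision-focused formative question.
# The learner is placed in a context and asked to choose an action,
# not to recall a definition.
from dataclasses import dataclass

@dataclass
class ScenarioQuestion:
    context: str          # the situation the learner is placed in
    decision_prompt: str  # the decision the situation forces
    options: list[str]    # candidate actions, not facts to recall
    correct: int          # index of the best action

question = ScenarioQuestion(
    context=("Completion rates for last year's compliance course are high, "
             "but violations haven't dropped."),
    decision_prompt="As the designer, what do you do first?",
    options=[
        "Rebuild the course with more engaging visuals",
        "Add a harder final quiz",
        "Investigate what on-the-job behavior needs to change",
    ],
    correct=2,
)
```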

Ok, so not every formative evaluation should be such a situation. But for things that need to be known by rote, I recommend tarted-up ‘drill and kill’. And it became clear that they’re fine at developing standard knowledge checks; it’s the more important ones that needed work.
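For the rote side, ‘drill and kill’ is essentially spaced retrieval. A toy Leitner-style scheduler captures the idea; the intervals here are simply assumed for illustration:

```python
# Leitner-box sketch for spaced drill: a correct answer moves an item to a
# less frequent box; a miss sends it back to daily review.
INTERVALS_DAYS = [1, 3, 7, 14]  # assumed spacing per box, purely illustrative

def next_review(box: int, correct: bool) -> tuple[int, int]:
    """Return (new_box, days_until_next_review) after one drill attempt."""
    new_box = min(box + 1, len(INTERVALS_DAYS) - 1) if correct else 0
    return new_box, INTERVALS_DAYS[new_box]

print(next_review(1, True))   # (2, 7): correct in box 1 -> seen again in 7 days
print(next_review(3, False))  # (0, 1): a miss goes back to daily drill
```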

I started out reviewing the principles, not least because they wanted a larger audience to appreciate the background being applied.  Then we moved on to more hands-on work.  First we worked through the different assessment types (moving from true/false to more complex assessments like ‘submit and compare’).  We then reviewed a first pass to understand the overall course requirements and likely important milestone assessments. We concluded by working through some examples of tough challenges they’d submitted and workshopping how to revise them.

There was more behind this, including my coming to understand more of their context and task, but overall it appeared to develop their understanding of how to take formative evaluation and turn it into an opportunity to truly develop learners in ways that will benefit them after the learning experience.

Of course, focusing on decisions was a key component, and we visited and revisited the issues of working with SMEs. This included getting contexts, and how exaggeration is your friend.  The result is that they’re much better equipped to develop ‘knowledge checks’ that go far beyond knowledge, and actually develop skills that are critical to success after graduation.

This is the type of thinking that organizations from K12 through higher ed and workplace learning (whether corporate, not-for-profit, or government) need to adopt if they’re going to move to learning experiences that actually develop meaningful new abilities.  It’s also about good objectives and more, but what the learner actually does, how they are required to use the knowledge, is critical to the outcome. So, are you ready to make learning that works?

13 July 2016

‘Checking’ In

Clark @ 8:03 am

As a personal reflection, the value of checklists and forcing functions can’t be overstated.  As I mentioned, last week I went into the woods for a few days.  And while the trip didn’t live up to our plans, it was a great experience.  However, there was a particular gap that points out our cognitive limitations.

So, I have a backpacking checklist. And I look at it from time to time. What I didn’t do this time was check it before the trip.  And what I found out once I got away from home was that I’d forgotten both my bandana and my towel!  Both are useful, and while I was able to purchase a bandana ($15! but it is microfiber and large, so I’ll keep using it), I had to do without the towel (for which the bandana was a poor but necessary substitute).

We often swim or wade in the river (and did this trip too), and a towel’s handy to get dry before the breeze chills you or the horseflies descend. The bandana, well, it served as a sun cover, mosquito deterrent, towel (see above), and glasses wipe. Amongst other uses.

Let me add that I almost left on today’s overnite biz trip without my sleep clothes!  Fortunately, I had one of those middle-of-the-nite epiphanies, and remedied it this morning.

And this isn’t just a consequence of advancing age (hey, I’m still [barely] < 60!).  It’s a natural consequence of our cognitive architecture, and we have well-established processes and tools to support these gaps.  These include checklists to help us remember things, and forcing functions whereby we place things so they’re hard to forget.

As a consequence, I’m going to do two things going forward. One is to make sure I do check my checklist. I’ll review it for comprehensiveness in the meantime; I developed it in conjunction with another list from an experienced colleague. I have another wilderness trip coming up, and I’ll definitely check it beforehand.  Second, I’ve now put the bandana and a towel in my backpack, so I’d actually have to take them out to forget them!

Here’s to knowing, and applying, tools to help us overcome our cognitive deficits.  What are you doing to help not make mistakes?  And what could you do similarly for your learning design processes?

12 July 2016

Web trust

Clark @ 8:05 am

I get asked to view a lot of things. And sometimes, particularly when there’s a potential tangible relationship, I will actually go visit sites. (BTW, I tend to avoid what comes unsolicited, and instead trust to what comes through my social network.) And one of my strategies often fails, and that, to me, is a warning sign.

When I go to sites (not from familiar companies, but new ones), one of the places I’m very likely to visit is the ‘About Us’ page or equivalent. There’s a reason I do that: I want to know something about who is behind this, and why. They’re linked, but separable.

There’re a couple of reasons to be clear about who’s behind this. One is for authenticity: is there someone willing to put their name to what this is and what it’s about?  And why them?  What background do they have that makes them credible to be the ones behind this endeavor?

And the why is about what motivates them. Are they doing this because of a passion, or because they think it’s a good business opportunity?  Either’s acceptable, but what you want is coherence between the people and what they’re doing.  Ideally, it’s a good story that links them.

There are sites that are clearly out to make money, and some that are out to meet a real need. There are some that have been created by folks who have an idea but not necessarily a clue, and then there are those created by those who should be doing it. And when you get both together, need and clue, you have a site you are willing to investigate further.

It may seem overly harsh or naive, and I’m sure someone could spin a good story and fool me (and has ;), but I think this is a good heuristic, a good reality check, on any site that’s looking to interact with others.  If my search fails to find the requisite information, my antennas start quivering, and my defenses go up.  A personal opinion, of course. Do you agree? Do you have other checks that you like better?  Eager to hear your thoughts.

6 July 2016

Wild thinking

Clark @ 8:13 am

Our everyday lives are decreasingly connected to nature. We’re increasingly separated from the context we evolved in. Is that a good thing?

[Image: Yosemite National Park]

Now, our relationship with nature hasn’t always been one of benevolent protection, as Roderick Nash has let us know. We lived dangerous lives until we developed the means to defend ourselves, and then the wilderness became an opportunity to expand and profit.  Now, however, with wilderness diminishing, and a growing awareness of the value of wildness for serendipitous diversity, we are beginning to view wilderness as a precious resource.

But are there reasons to consider wilderness benefits for our thinking and learning? The evidence appears to say yes. When we’re in wilderness with minimal risks, at least, the proximity to natural sounds and scenes seems to stimulate areas of the brain. It may take just a walk, or three days, but there are apparent benefits to heart and mind.

I’ve tried to get out in the wilderness at least once a year. I like to hike, and in particular to go backpacking, of late with trips to Yosemite National Park.  A friend/colleague/mentor has regularly organized these trips, and several of us will hike off with our tents, stoves, sleeping bags, water filters, bear cans, and everything else for 3-7 days and get above timberline, sweaty, dirty, and happy. It was on just such a trip that ‘Quinnovation’ emerged as a brand!

I’ve taken the family, too, to share my love of the outdoors.  So, I’m off again, and we’ll see whether I come back charged with creativity (or just exhausted ;).  Happy trails!

5 July 2016

The Inaugural Jay Cross Memorial Award winner is…

Clark @ 8:00 am

Reposted from the Internet Time Alliance website:

The Internet Time Alliance Jay Cross Memorial Award is presented to a workplace learning professional who has contributed in positive ways to the field of Real Learning and is reflective of Jay’s lifetime of work. Recipients champion workplace and social learning practices inside their organisation and/or on the wider stage. They share their work in public and often challenge conventional wisdom. The Jay Cross Memorial Award is given to professionals who continuously welcome challenges at the cutting edge of their expertise and are convincing and effective advocates of a humanistic approach to workplace learning and performance.

We are announcing this inaugural award on 5 July, Jay’s birthday. Following his death in November 2015, the partners of the Internet Time Alliance (Jane Hart, Harold Jarche, Charles Jennings, Clark Quinn) resolved to continue Jay’s work. Jay Cross was a deep thinker and a man of many talents, never resting on his past accomplishments, and this award is one way to keep pushing our professional fields and industries to find new and better ways to learn and work.

The Internet Time Alliance Jay Cross Memorial Award for 2016 is presented to Helen Blunden. Helen has been an independent practitioner at Activate Learning since 2014. Her vision is to help people stay current in a constantly changing world of work and do this by working and sharing their work and learning in a generous, open, and authentic manner. Helen started her career within the Royal Australian Navy across two branches (Training Development and Public Relations) as well as working within Service and external to Service (with Air Force and Army and Defence civilians), then with the Reserves. Helen later worked as a Learning and Development Consultant for Omni Asia Pacific, and subsequently with National Australia Bank as a Social Learning Consultant. Helen is an active blogger and is engaged professionally on various social media platforms.

Here is Helen in her own words: “In my observations, it’s not only learning teams in organisations or institutions that need to change and recreate the traditional ways of training into learning experiences. It’s wider than that. I have smaller businesses, some of whom are vendors who offer training products and services to the public or to organisations who are scratching their heads trying to figure out how to get ‘into the 21st century’ as their clients ask for more blended programs – shorter programs – but still achieve the same outcomes. Dare I say it, the tools that Jane Hart offers as tools for professional development are not for learning people alone – they’re for everyone. This is where I’m grappling to understand the enormity of the change and how, for the first time, you’re not only helping a client design and develop the learning experience – but you need to teach them how to use the tools so it becomes part of their social behaviour to build their own business, brand and reputation.”

Helen will be formally presented with the award in her home city of Melbourne by Simon Hann, CEO of DeakinPrime, the corporate education arm of Deakin University.

It is with great pleasure that the partners of the Internet Time Alliance present the first Jay Cross Memorial Award to Helen Blunden.


30 June 2016

Moving forward

Clark @ 8:11 am

So, I was chided that my last post was not helpful in moving people forward, as I was essentially being derogatory to those who weren’t applying the new understandings.  Yet I’ve previously provided lots of ways to think anew about L&D: posts on the topics (both carrot and stick), pointers to readings that are relevant and can help, a group to discuss the issues, and even a book trying to point out the ways to move forward. So I’m not apologetic about also pointing out the gaps (hey, let’s try all levers).  However, I’m happy to weigh in positively as well.

The question may be where to start. And of course that will differ. Different organizations will have different starting situations, and contexts, that will mean a different approach will make sense for them.  But there are some overall guiding principles that will help.

One of the first steps is to move to a performance consulting approach. If you start talking to those who are requesting courses and dig deeper into the real problem, you’re likely to start investing in better solutions.  This is a relatively straightforward step, a small change to what you’re doing, and yet it has the promise of both investing your resources in more relevant ways and starting to demonstrate real contributions to organizational success.

Of course, your elearning should also get serious.  We know what leads to effective learning, and we should be employing that deeper design. The nuances that make better learning aren’t obvious, but the details matter, and they distinguish between learning that has an impact and learning that doesn’t.

Another one is to start thinking about measurement. It’s been said before that “what’s measured, matters”, and this can and should be coupled with the aforementioned approach by looking for measurable improvements that come out of the performance conversation.

This naturally means that the scope of operations also moves beyond just courses to performance support, but again that should be a small stretch from what is already being done: extending from developing course content to also developing job aid content.

One other suggestion is to start looking at the culture picture.  While in the long term this should migrate to an organizational-level concern, I suggest that it could and should start within the L&D organization.  L&D needs to start practicing those elements of valuing diversity and openness, making it safe to share, and experimenting, as a precursor to taking it out to the wider organization.  The notion of starting small and scaling is a proven approach, and provides a chance to understand and leverage it as a basis for both internal improvement and taking it further.

It’s not easy.  But it’s doable, and desirable. There’re lots of ways to get help (hint hint), but it’s past time to get started.  Let’s get this going, and do it together. So, what barriers do you have and what questions can we assist with?

28 June 2016

Organizational Learning Engineering

Clark @ 8:10 am

Organizational learning processes – across L&D, Executive Development, Leadership Development, and more of the roles in HR and talent management – are largely still rooted in both industrial era models and myths. We see practices that don’t make sense, and we’re not aligned with what we now know about how we think, work, and learn. And this is a problem for organizational success. So what are some of the old practices compared with what we now know?  No surprise, I created a diagram (a table in this case) representing just some of the tensions:

[Table: old industrial-era practices vs. new understandings]

I won’t elaborate on all of these, but I want to make two points.  The first is that I could’ve gone on, both in breadth and depth.  That is, each of these unpacks with many implications, and there are more ways organizations are not aligned with what’s known about how people work.  The second point is that there are known ways to address these problems: systemic ways to get the combined benefits of more effective output and more engaged people. Not surprisingly, treating people in ways that reflect their inner nature is more rewarding for them as well as more successful for the organization.

I’ve argued in the past that we should treat learning design seriously, with the depth of rocket science applied as a learning engineering. Similarly, we should be basing our organizational learning designs – our strategies, processes, and policies – on what’s known about people. That’s not being seen often enough.  It’s time for organizational learning to move into the information age, and start performing like professionals.  The action is at the coal face, not in the comfort zone. There’s good work to be done, and it’s time to do it.  Let’s go!


23 June 2016

Ambiguity Denial Syndrome?

Clark @ 11:05 am

I was talking with a colleague at an event one of the past weeks, and I noted down the concept of ambiguity denial syndrome. And I’m retrospectively making up what we were talking about, but it’s an interesting idea to me.

So one of the ways I start out a talk (including later today for a government agency) is to talk about chaos. I use a fractal, and talk about the properties a fractal has.  You know, that it’s a mathematical formulation that paints an image from which patterns emerge, yet at any point you really don’t know where it’s going to go next.
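As an aside for the technically inclined: you don’t need much code to see this property. The logistic map, a standard chaotic system (not the particular fractal I show in talks), gives exactly this pattern-with-unpredictability; a tiny change to the starting point eventually sends the trajectory somewhere entirely different:

```python
# Logistic map: x -> r * x * (1 - x). At r = 4 it is chaotic: trajectories
# stay within a pattern (the unit interval) yet nearby starts diverge.
def trajectory(x0: float, r: float = 4.0, steps: int = 40) -> list[float]:
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = trajectory(0.2)
b = trajectory(0.2000001)  # a one-in-ten-million change to the start
print(abs(a[10] - b[10]))  # still small after 10 steps...
print(abs(a[40] - b[40]))  # ...but by step 40 the paths bear no relation
```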

I use the fractal to explain how our old beliefs in an ability to plan, prepare, and execute were somewhat misguided.  What we did was explain away the few times it didn’t work. But as things move faster, the fact that things are not quite as certain as we’d like to believe means we have to become more agile, because we can tolerate mistakes less.

The point I’m making is that the world increasingly requires an ability to deal with ambiguity and unique situations. And our learning designs, and organization designs, and our cultures, need to recognize this. And yet, in so many ways, they don’t.

At the individual level, we’re not equipping folks with the right tools. We should be providing them with models to use to interpret and adapt to situations (explain and predict). Our learning designs should have them dealing with a wide variety and degrees of certainty in situations.  And we should be testing and refining them, recognizing that learners aren’t as predictable as concrete or steel.  Instead we see one-shot development of information dumps and knowledge tests, which aren’t going to help organizations.

At the interpersonal level, we should be facilitating people to engage productively, facilitating the development of viable processes for working and learning together. We know that the room is smarter than the smartest person in the room (if we manage the process right), and that we’ll get the best results when we empower people and support their success. We need them working out loud, communicating and collaborating, to get the best. Instead, we still see top-down hierarchies and solo work.

In short, we see people denying the increasing complexity that the world is showing us.  Implicitly or explicitly, it’s clear that many folks believe that they can, and must, control things, instead of looking to adapt on the fly.  We have new organizational models for this, and yet we’re not even seeing the exploration yet.  I acknowledge that change is hard, and navigating it successfully is a challenge. But we have lots of guidance here too.

Too many processes I see reflect industrial age thinking, and we’re in an information age. We have greater capacity amongst our people, and greater challenges to address, with less tolerance for mistakes.  We need to address, even embrace ambiguity, if we are to thrive. Because we can, and we should.  It’s the only sensible way to move forward in this increasingly complex world. So, are you ready?

21 June 2016

eLearning Process Survey results!

Clark @ 8:05 am

So, a few weeks ago I ran a survey asking about elearning processes*, and it’s time to look at the results (I’ve closed it).  eLearning process is something I’m suggesting is ripe for change, and I thought it appropriate to see what people thought.  Some caveats: it’s self-selected, it’s limited (23 respondents), and it’s arguably readers of this blog or of the other folks who pointed to it, so it’s a select group.  With those caveats, what did we see?

The first question looked at how we align our efforts with business needs. The alternatives were ‘providing what’s asked for’ (e.g. taking orders), ‘getting from SMEs’, and ‘using a process’.  These are clearly in ascending order of appropriateness. Order taking doesn’t allow for seeing if a course is needed, and SMEs can’t tell you what they actually do. Creating a process to ensure a course is the best solution (as opposed to a job aid or going to the network), and then getting the real performance needs (by triangulating), is optimal.  What we see, however, is that only a bit more than 20% are actually getting this right from the get-go, and almost 80% are failing at one of the two points along the way.

The second question asked about how the assessments were aligned with the need. The options ranged from ‘developing from good sources’, thru ‘we test knowledge’ and ‘they have to get it right’, to ‘sufficient spaced contextualized practice’, e.g. ’til they can’t get it wrong.  The clear need, if we’re bothering to develop learning, is to ensure that they can do it at the end.  Doing it ‘until they get it right’ isn’t sufficient to develop a new ability to do. And we see more than 40% are focusing on using the existing content! Now, the alternatives were not totally orthogonal (e.g. you could have the first response and any of the others), so interpreting this is somewhat problematic.  I assumed people would know to choose the lowest option in the list if they could, and I don’t know that (a flaw in the survey design).  Still, it’s pleasing to see that almost 30% are doing sufficient practice, but that’s only a wee bit ahead of those who say they’re just testing knowledge!  So it’s still a concern.

The third question looked at the feedback provided. The options included ‘right or wrong’, ‘provides the right answer’, and ‘indication for each wrong answer’.  I’ve been railing against one piece of feedback for all the wrong answers for years now, and it’s important. The alternatives to the right answer shouldn’t be random, but instead should represent the ways learners typically get it wrong (based upon misconceptions).  It’s nice (and I admit somewhat surprising) that almost 40% are actually providing feedback that addresses each wrong answer. That’s a very positive outcome.  However, that it’s not even half is still kind of concerning.
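To illustrate, distractor-specific feedback can be as simple as a mapping from each option to the misconception behind it. A sketch (made-up content, not any particular tool’s format):

```python
# Each wrong option gets feedback aimed at the misconception that produced it,
# rather than one generic "Incorrect, try again" message.
feedback_by_option = {
    "A": "Not quite: better visuals address appeal, not the performance gap.",
    "B": "A harder quiz measures recall; it won't change on-the-job behavior.",
    "C": "Right: start from the behavior change the business actually needs.",
}

def feedback_for(choice: str) -> str:
    return feedback_by_option.get(choice, "Please choose A, B, or C.")

print(feedback_for("B"))
```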

The fourth question dug into the issue of examples.  There are nuances of detail about examples, and here I was picking up on a few of these. The options ranged from ‘having’, thru ‘coming from SMEs’ and ‘illustrate the concept and context’, to ‘showing the underlying thinking’.  Again, obviously the latter is the best.  It turns out that experts don’t typically show the underlying cognition, and yet it’s really valuable for the learning.  We see that we are getting the link of concept to context clear, and together with showing thinking we’re nabbing roughly 70% of the examples, so that’s a positive sign.

The fifth question asked about concepts.  Concepts are (or should be) the models that guide performance in the contexts seen across examples and practice (and the basis for the aforementioned feedback). The alternatives ranged from ‘using good content’ and ‘working with SMEs’ to ‘determining the underlying model’.  It’s the latter that is indicated as the basis for making better decisions going forward.  (I suggest that what will help orgs is not the ability to receive knowledge, but to make better decisions.)  And we see over 30% going to those models, but a high percentage still taking the presentations from the SMEs. Which isn’t totally inappropriate, as they do have access to what they learned. I’m somewhat concerned overall that much of ID seems to talk about practice and ‘content’, lumping intros and concepts and examples and closing all together into the latter (without suitable differentiation), so this was better than expected.

The sixth question tapped into the emotional side of learning: engagement. The options were ‘giving learners what they need’, ‘a good look’, ‘gamification’, and ‘tapping into intrinsic motivation’.  I’ve been a big proponent of intrinsic motivation (heck, I effectively wrote a book on it ;), and not gamification. I think an appealing visual design is valuable, but just ‘giving them what they need’ isn’t sufficient for novices: they need the emotional component too. For practitioners, of course, not so much.  I’m pleased that no one talked about gamification (yet the success of companies that sell ‘tart up’ templates suggests that this isn’t the norm). Still, more than a third are going to the intrinsic motivation, which is heartening. There’s a ways to go, but some folks are hearing the message.

The last question got into measurement.  We should be evaluating what we do. Ideally, we start from a business metric we need to address and work backward. That’s typically not seen. The options basically covered the Kirkpatrick model, working from ‘smile sheets’, through ‘testing after the learning experience’ and ‘checking changes in workplace behavior’, to ‘tuning until impacting org metrics’.  I was pleasantly surprised to see over a third doing the latter; my results don’t parallel what I’ve seen elsewhere. I’m dismayed, of course, that over 20% are still just asking learners, which we know in general isn’t of particular use.

This was a set of questions deliberately digging into areas where I think elearning falls down, and (at least with this group of respondents) it’s not as good as I’d hope, but not as bad as I feared.  Still, I’d suggest there’s room for improvement, given the constraints above about who the likely respondents are.  It’s not a representative sample, I’d suspect.
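One caveat worth quantifying: with only 23 respondents, every percentage above carries wide error bars. A quick Wilson score interval (a standard formula; the arithmetic here is my own back-of-envelope) makes the point:

```python
# 95% Wilson score interval for a proportion. With n = 23, an observed
# "about 30%" (7 of 23) is compatible with anything from ~16% to ~51%.
from math import sqrt

def wilson_interval(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    margin = z * sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return center - margin, center + margin

low, high = wilson_interval(7, 23)
print(f"{low:.0%} to {high:.0%}")  # roughly 16% to 51%
```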

Clearly, there are ways to do well, but it’s not trivial. I’m arguing that we can do good elearning without breaking the bank, but it requires an understanding of the inflection points of the design process where small changes can yield important results. And it requires an understanding of the deeper elements to develop the necessary tools and support. I have been working with several organizations to make these improvements, but it’s well past time to get serious about learning, and start having a real impact.

So over to you: do you see this as a realistic assessment of where we are? And do you take the overall results as indicating a healthy industry, or an industry that needs to go beyond haphazard approaches and start practicing Learning Engineering?

*And, let me say, thanks very much to those respondents who took the time to respond.  It was quick, but still, the effort is much appreciated.


