Learnlets
Clark Quinn's Learnings about Learning
(The Official Quinnovation blog)

20 July 2016

The wrong basis

Clark @ 8:08 am

Of late, I’ve been talking about the approach organizations take to learning.  It’s come up in presentations on learning design, measurement, and learning technology strategy.  And the point is simple: we’re not using the right basis.

What we’re supposed to be doing is empirically justifiable:

  • doing investigations into the problem
  • identifying the root cause
  • mapping back to an intervention design
  • determining how we’ll know the intervention is working
  • implementing our intervention
  • testing to see if we’ve achieved the necessary outcome
  • and revising until we do

Instead, what we see is what I’ve begun to refer to as ‘faith-based learning’: if we build a course, it is good!  We:

  • take orders for courses
  • document what the SME tells us
  • design a screen-friendly version of the associated content
  • and add a knowledge test

Which would be well and good except that this approach has a very low likelihood of affecting anything except perhaps our learners’ patience (and of course our available resources). Orders for courses have little relation to the real problems, SMEs can’t tell you what they actually do, content on a screen doesn’t mean learners know how to or will apply it, and a quiz isn’t likely to lead to any meaningful change in behavior (even if it is tarted up with racing cars).

The closer you are to the former, the better; the closer to the latter, the more likely it is that you’re quite literally wasting time and money.

Faith may not be a bad thing for spirituality, but it’s not a particularly good basis for attempting to develop new skills.  I’ve argued that learning design really is rocket science, and we should be taking an engineering approach.  To the extent we’re not – to the extent that we are implicitly accepting that a course is needed and that our linear processes are sufficient – we’re taking an approach that very much is based upon wishful thinking. And that’s not a good basis to run a business on.

It’s time to get serious about your learning.  It’s doable, with less effort than you may think.   And the alternative is really unjustifiable. So let’s get ourselves, and our industry, on a sound basis.  There’s a lot more we can do as well, but we can start by getting this part right.  Please?

12 July 2016

Web trust

Clark @ 8:05 am

I get asked to view a lot of things. And sometimes, particularly when there’s a potential tangible relationship, I will actually go visit sites. (BTW, I tend to avoid what comes unsolicited, and instead trust to what comes through my social network.) And one of my strategies often fails, and that, to me, is a warning sign.

When I go to sites (not from familiar companies, but new ones), one of the places I’m very likely to visit is the ‘About Us’ page or equivalent. There’s a reason I do that: I want to know something about who is behind this, and why. They’re linked, but separable.

There’re a couple of reasons to be clear about who’s behind this. One is for authenticity: is there someone willing to put their name to what this is and what it’s about?  And why them?  What background do they have that makes them credible to be the ones behind this endeavor?

And the ‘why’ is about what motivates them. Are they doing this because of a passion, or because they think it’s a good business opportunity?  Either’s acceptable, but what you want is coherence between the people and what they’re doing.  Ideally, it’s a good story that links them.

There are sites that are clearly out to make money, and some that are out to meet a real need. There are some that have been created by folks who have an idea but not necessarily a clue, and then there are those created by those who should be doing it. And when you get both together, need and clue, you have a site you are willing to investigate further.

It may seem overly harsh or naive, and I’m sure someone could spin a good story and fool me (and has ;), but I think this is a good heuristic, a good reality check, on any site that’s looking to interact with others.  If my search fails to find the requisite information, my antennas start quivering, and my defenses go up.  A personal opinion, of course. Do you agree? Do you have other checks that you like better?  Eager to hear your thoughts.

5 July 2016

The Inaugural Jay Cross Memorial Award winner is…

Clark @ 8:00 am

Reposted from the Internet Time Alliance website:

The Internet Time Alliance Jay Cross Memorial Award is presented to a workplace learning professional who has contributed in positive ways to the field of Real Learning and is reflective of Jay’s lifetime of work. Recipients champion workplace and social learning practices inside their organisation and/or on the wider stage. They share their work in public and often challenge conventional wisdom. The Jay Cross Memorial Award is given to professionals who continuously welcome challenges at the cutting edge of their expertise and are convincing and effective advocates of a humanistic approach to workplace learning and performance.

We are announcing this inaugural award on 5 July, Jay’s birthday. Following his death in November 2015, the partners of the Internet Time Alliance (Jane Hart, Harold Jarche, Charles Jennings, Clark Quinn) resolved to continue Jay’s work. Jay Cross was a deep thinker and a man of many talents, never resting on his past accomplishments, and this award is one way to keep pushing our professional fields and industries to find new and better ways to learn and work.

The Internet Time Alliance Jay Cross Memorial Award for 2016 is presented to Helen Blunden. Helen has been an independent practitioner at Activate Learning since 2014. Her vision is to help people stay current in a constantly changing world of work and do this by working and sharing their work and learning in a generous, open, and authentic manner. Helen started her career within the Royal Australian Navy across two branches (Training Development and Public Relations) as well as working within Service and external to Service (with Air Force and Army and Defence civilians), then with the Reserves. Helen later worked as a Learning and Development Consultant for Omni Asia Pacific, and subsequently with National Australia Bank as a Social Learning Consultant. Helen is an active blogger and is engaged professionally on various social media platforms.

Here is Helen in her own words: “In my observations, it’s not only learning teams in organisations or institutions that need to change and recreate the traditional ways of training into learning experiences. It’s wider than that. I have smaller businesses, some of whom are vendors who offer training products and services to the public or to organisations who are scratching their heads trying to figure out how to get ‘into the 21st century’ as their clients ask for more blended programs – shorter programs – but still achieve the same outcomes. Dare I say it, the tools that Jane Hart offers as tools for professional development are not for learning people alone – they’re for everyone. This is where I’m grappling to understand the enormity of the change and how, for the first time, you’re not only helping a client design and develop the learning experience – but you need to teach them how to use the tools so it becomes part of their social behaviour to build their own business, brand and reputation.”

Helen will be formally presented with the award in her home city of Melbourne by Simon Hann, CEO of DeakinPrime, the corporate education arm of Deakin University.

It is with great pleasure that the partners of the Internet Time Alliance present the first Jay Cross Memorial Award to Helen Blunden.

[Photo: Helen Blunden]

30 June 2016

Moving forward

Clark @ 8:11 am

So, I was chided that my last post was not helpful in moving people forward, as I was essentially being derogatory to those who weren’t applying the new understandings. And I’ve previously provided lots of ways to think anew about L&D: posts on the topics (both carrot and stick), pointers to relevant readings that can help, a group I created to discuss the issues, and even a book trying to point out the ways to move forward. So I’m not apologetic about also trying to point out the gaps (hey, let’s try all levers).  However, I’m happy to weigh in positively as well.

The question may be where to start. And of course that will differ. Different organizations will have different starting situations, and contexts, that will mean a different approach will make sense for them.  But there are some overall guiding principles that will help.

One of the first steps is to move to a performance consulting approach. If you start talking to those who are requesting courses and start digging in deeper into the real problem, you’re likely to start investing in better solutions.  This is a relatively straightforward step that is a small change to what you’re doing and yet has the promise of both investing your resources in more relevant ways, and starting to demonstrate real contributions to organizational success.

Of course, your elearning should also start being serious.  We know what leads to effective learning, and we should be employing that deeper design. The nuances that make better learning aren’t obvious, but the details matter and distinguish between learning that has an impact and learning that doesn’t.

Another one is to start thinking about measurement. It’s been said before that “what’s measured, matters”, and this can and should be coupled with the aforementioned approach by looking for measurable improvements that come out of the performance conversation.

This naturally means that the scope of operations also moves beyond just courses to performance support, but again that should be a small stretch from what is already being done: extending from developing course content to also developing job aid content.

One other suggestion is to start looking at the culture picture.  While in the long term this should migrate to an organizational-level concern, I suggest that it could and should start within the L&D organization.  L&D needs to start practicing those elements of valuing diversity and openness, making it safe to share, and experimenting, as a precursor to taking it out.  The notion of starting small and scaling is a proven approach, and provides a chance to understand and leverage it as a basis both for internal improvement and for taking it further.

It’s not easy.  But it’s doable, and desirable. There’re lots of ways to get help (hint hint), but it’s past time to get started.  Let’s get this going, and do it together. So, what barriers do you have and what questions can we assist with?

28 June 2016

Organizational Learning Engineering

Clark @ 8:10 am

Organizational learning processes – across L&D, Executive Development, Leadership Development, and more of the roles in HR and talent management – are largely still rooted in both industrial era models and myths. We see practices that don’t make sense, and we’re not aligned with what we now know about how we think, work, and learn. And this is a problem for organizational success. So what are some of the old practices compared with what we now know?  No surprise, I created a diagram (a table in this case) representing just some of the tensions:

[Table: old vs. new organizational learning practices]

I won’t elaborate on all of these, but I want to make two points.  The first is that I could’ve gone on, both in breadth and depth.  That is, each of these unpacks with many implications, and there are more ways organizations are not aligned with what’s known about how people work.  The second point is that there are known ways to address these problems: systemic ways to get the combined benefits of more effective output and more engaged people. Not surprisingly, treating people in ways that reflect their inner nature is more rewarding for them as well as more successful for the organization.

I’ve argued in the past that we should treat learning design seriously, with the depth of rocket science applied as a learning engineering. Similarly, we should be basing our organizational learning designs – our strategies, processes, and policies – on what’s known about people. That’s not being seen often enough.  It’s time for organizational learning to move into the information age, and start performing like professionals.  The action is at the coal face, not in the comfort zone. There’s good work to be done, and it’s time to do it.  Let’s go!

 

23 June 2016

Ambiguity Denial Syndrome?

Clark @ 11:05 am

I was talking with a colleague at an event a few weeks back, and I noted down the concept of ambiguity denial syndrome. And I’m retrospectively making up what we were talking about, but it’s an interesting idea to me.

So one of the ways I start out a talk (including later today for a government agency) is to talk about chaos. I use a fractal, and talk about the properties a fractal has.  You know, that it’s a mathematical formulation that paints an image from which patterns emerge, yet at any point you really don’t know where it’s going to go next.

I use this to explain how our old beliefs in an ability to plan, prepare, and execute were somewhat misguided.  What we did was explain away the few times it didn’t work. But as things move faster, the fact that things are not quite as certain as we’d believed means we have to become more agile, because we can tolerate mistakes less.
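To make that unpredictability concrete (a sketch I’m adding here, not something from the talk itself), consider the logistic map: a one-line formula that, like the fractal, follows a simple deterministic rule yet defies long-range prediction. Two starting points a millionth apart soon bear no resemblance to each other:

```python
# Logistic map x -> r*x*(1-x): a textbook chaotic system.
# Two trajectories that start a millionth apart end up wildly different.
def logistic_trajectory(x0, r=4.0, steps=60):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_trajectory(0.2)
b = logistic_trajectory(0.2 + 1e-6)

# Largest gap between the two trajectories over the run
max_gap = max(abs(x - y) for x, y in zip(a, b))
print(f"starting gap: 1e-06, max gap within 60 steps: {max_gap:.3f}")
```

The exponential divergence is the point: when tiny differences in starting conditions swamp any plan, precise prediction fails, and agility beats preparation.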

The point I’m making is that the world increasingly requires an ability to deal with ambiguity and unique situations. And our learning designs, and organization designs, and our cultures, need to recognize this. And yet, in so many ways, they don’t.

At the individual level, we’re not equipping folks with the right tools. We should be providing them with models to use to interpret and adapt to situations (explain and predict). Our learning designs should have them dealing with a wide variety and degrees of certainty in situations.  And we should be testing and refining them, recognizing that learners aren’t as predictable as concrete or steel.  Instead we see one-shot development of information dumps and knowledge tests, which aren’t going to help organizations.

At the interpersonal level, we should be facilitating people to engage productively, facilitating the development of viable processes for working and learning together. We know that the room is smarter than the smartest person in the room (if we manage the process right), and that we’ll get the best results when we empower people and support their success. We need them working out loud, communicating and collaborating, to get the best. Instead, we still see top-down hierarchies and solo work.

In short, we see people denying the increasing complexity that the world is showing us.  Implicitly or explicitly, it’s clear that many folks believe that they can, and must, control things, instead of looking to adapt on the fly.  We have new organizational models for this, and yet we’re not even seeing the exploration yet.  I acknowledge that change is hard, and navigating it successfully is a challenge. But we have lots of guidance here too.

Too many processes I see reflect industrial age thinking, and we’re in an information age. We have greater capacity amongst our people, and greater challenges to address, with less tolerance for mistakes.  We need to address, even embrace ambiguity, if we are to thrive. Because we can, and we should.  It’s the only sensible way to move forward in this increasingly complex world. So, are you ready?

21 June 2016

eLearning Process Survey results!

Clark @ 8:05 am

So, a few weeks ago I ran a survey asking about elearning processes*, and it’s time to look at the results (I’ve closed it).  eLearning process is something I’m suggesting is ripe for change, and I thought it appropriate to see what people thought.  Some caveats: it’s self-selected, it’s limited (23 respondents), and it’s arguably readers of this blog or of the other folks who pointed to it, so it’s a select group.  With those caveats, what did we see?

The first question was looking at how we align our efforts with business needs. The alternatives were ‘providing what’s asked for’ (e.g. taking orders), ‘getting from SMEs’, and ‘using a process’.  These are clearly in ascending order of appropriateness. Order taking doesn’t allow for seeing if a course is needed, and SMEs can’t tell you what they actually do. Creating a process to ensure a course is the best solution (as opposed to a job aid or going to the network), and then getting the real performance needs (by triangulating), is optimal.  What we see, however, is that only a bit more than 20% are actually getting this right from the get-go, and almost 80% are failing at one of the two points along the way.

The second question was asking about how the assessments were aligned with the need. The options ranged from ‘developing from good sources’, thru ‘we test knowledge’ and ‘they have to get it right’ to ‘sufficient spaced contextualized practice’, e.g. ’til they can’t get it wrong.  The clear need, if we’re bothering to develop learning, is to ensure that they can do it at the end.  Doing it ‘until they get it right’ isn’t sufficient to develop a new ability to do. And, we see more than 40% are focusing on using the existing content! Now, the alternatives were not totally orthogonal (e.g. you could have the first response and any of the others), so interpreting this is somewhat problematic.  I assumed people would know to choose the lowest option in the list if they could, and I don’t know that (a flaw in the survey design).  Still, it’s pleasing to see that almost 30% are doing sufficient practice, but that’s only a wee bit ahead of those who say they’re just testing knowledge!  So it’s still a concern.

The third question was looking at the feedback provided. The options included ‘right or wrong’, ‘provides the right answer’, and ‘indication for each wrong answer’.  I’ve been railing against one piece of feedback for all the wrong answers for years now, and it’s important. The alternatives to the wrong answer shouldn’t be random, but instead should represent the ways learners typically get it wrong (based upon misconceptions).  It’s nice (and I admit somewhat surprising) that almost 40% are actually providing feedback that addresses each wrong answer. That’s a very positive outcome.  However, that it’s not even half is still kind of concerning.

The fourth question digs into the issue of examples.  There are nuances of details about examples, and here I was picking up on a few of these. The options ranged from ‘having’, thru ‘coming from SMEs’ and ‘illustrate the concept and context’, to ‘showing the underlying thinking’.  Again, obviously the latter is the best.  It turns out that experts don’t typically show the underlying cognition, and yet it’s really valuable for the learning. We see that we are getting the link of concept to context clear, and together with showing thinking we’re nabbing roughly 70% of the examples, so that’s a positive sign.

The fifth question asks about concepts.  Concepts are (or should be) the models that guide performance in the contexts seen across examples and practice (and the basis for the aforementioned feedback). The alternatives ranged from ‘using good content’ and ‘working with SMEs’ to ‘determining the underlying model’.  It’s the latter that is indicated as the basis for making better decisions, going forward.  (I suggest that what will help orgs is not the ability to receive knowledge, but to make better decisions.)  And we see over 30% going to those models, but a high percentage still taking the presentations from the SMEs. Which isn’t totally inappropriate, as they do have access to what they learned. I’m somewhat concerned overall that much of ID seems to talk about practice and ‘content’, lumping intros and concepts and examples and closing all together into the latter (without suitable differentiation), so this was better than expected.

The sixth question tapped into the emotional side of learning, engagement. The options were ‘giving learners what they need’, ‘a good look’, ‘gamification’, and ‘tapping into intrinsic motivation’.  I’ve been a big proponent of intrinsic motivation (heck, I effectively wrote a book on it ;), and not gamification. I think an appealing visual design helps, but just ‘giving them what they need’ isn’t sufficient for novices: they need the emotional component too. For practitioners, of course, not so much.  I’m pleased that no one talked about gamification (yet the success of companies that sell ‘tart up’ templates suggests that this isn’t the norm). Still, more than a third are going to the intrinsic motivation, which is heartening. There’s a ways to go, but some folks are hearing the message.

The last question gets into measurement.  We should be evaluating what we do. Ideally, we start from a business metric we need to address and work backward. That’s typically not seen. The questions basically covered the Kirkpatrick model, working from ‘smile sheets’, through ‘testing after the learning experience’ and ‘checking changes in workplace behavior’ to ‘tuning until impacting org metrics’.  I was pleasantly surprised to see over a third doing the latter, and my results don’t parallel what I’ve seen elsewhere. I’m dismayed, of course, that over 20% are still just asking learners, which we know in general isn’t of particular use.

This was a set of questions deliberately digging into areas where I think elearning falls down, and (at least with this group of respondents) it’s not as good as I’d hope, but not as bad as I feared.  Still, I’d suggest there’s room for improvement, given the caveats above about who the likely respondents are.  It’s not a representative sample, I’d suspect.

Clearly, there are ways to do well, but it’s not trivial. I’m arguing that we can do good elearning without breaking the bank, but it requires an understanding of the inflection points of the design process where small changes can yield important results. And it requires an understanding of the deeper elements to develop the necessary tools and support. I have been working with several organizations to make these improvements, but it’s well past time to get serious about learning, and start having a real impact.

So over to you: do you see this as a realistic assessment of where we are? And do you take the overall results as indicating a healthy industry, or an industry that needs to go beyond haphazard approaches and start practicing Learning Engineering?

*And, let me say, thanks very much to those respondents who bothered to take the time to respond.  It was quick, but still, the effort was completely appreciated.

 

7 June 2016

The 3 Social Media Things You Ought to Avoid

Clark @ 8:04 am

At least, that is, with me. Frankly, I wonder if you even bothered to read this after a title like that! Or at least you’re highly suspicious at this point. It’s (or should be) just the type of thing you would not expect from me. And there’s a reason for that. There are 3 egregious social media things you shouldn’t do, and the title is related to one of them.

As context, because of this blog, I get occasional emails offering to write guest posts for me. Now, these aren’t really learning folks; these are marketing folks who want to put in links to their site.  This used to happen a lot, so much so that I even wrote a post about it.  And I point people to it when such requests come in (for a number of certain types of requests I’ve made up canned responses I just cut and paste).

So I just got one, and it was nice, because it actually listed the company, pointed to examples of their work, and listed some sample titles.  However, the titles just didn’t sound like me:

The Four Social Media Perversions You Should Capitalize On

7 Tips to Clickbait That Will Guarantee Results

Posts That Generate Revenue: Using the Words You Can’t Say On Television

(Ok, I’m exaggerating a wee bit :).  However, this leads to the first thing to avoid:

1. Don’t offer guest posts that don’t match the tenor of the blog

Now the second case is implied by a bit of the above. Recently I’ve gotten requests about placing links that are much more, er, mysterious. To paraphrase: I work for a client that works in a related area and I’ve written lots of posts and I’d like to do some for you, and there might even be a small bit of money available.  Read: I’m too ashamed to admit who I work for, I won’t show you an example of my work, and I’ll try to entice you with a mention of money.  Somehow these folks haven’t heard about what builds trust on the net (hint: it’s spelled ‘transparency’).  So:

2. Don’t make vague offers with unsubstantiated particulars

I’m more susceptible to people who actually do inquire what it would take to place an ad, but so far I haven’t gone there (I once asked and folks seemed to prefer it without).

Along with this, there are always people who want to show me their product (because it’s in my space) and give them my feedback.  That is, they want me to give them my years of expertise for free. On top of that, they need enough of my time to present their product first.  My response is always “I talk ideas for free, I help someone personally for drinks/dinner*, but if someone’s making a quid, I get a cut”. The point being, I’m not giving my free time and expert opinion (hey, that’s how I feed the family). I’ll offer them my services, and a time or two that’s actually happened. But mostly they plead poverty and move on.

This is a well-known problem. There are other examples as well: “can I pick your brain”, offers of ‘exposure’ in return for speaking, and it’s not on. In fact, it’s ripe for parody.  Thus:

3. Don’t try to get free work

There’re more, I’m sure, but these seem to be the most frequent. It’s really bad social media behavior. If you want something, tell me what it is, and make the value proposition clear.

And let’s be clear: there are offers I do take up, but these are clear about what is required as well as what the benefit is to me and to them, and I can make a conscious evaluation.

So please, feel free to hire me, but don’t expect me to work for free. Fair enough?

(*Sometimes I just request they pay it forward, if they’re a young person, since I benefited so much from intellectual generosity when I was a neophyte.)

1 June 2016

The Quinnovation eLearning Process Survey

Clark @ 8:08 am

In the interests of understanding where the market is, I’m looking to benchmark where organizations are. Sure, there are other data points, but I have my own questions I would like to get answered. So I’ve created a quick survey of seven questions (thanks, SurveyMonkey) I’d love for you to fill out.

My interest is in finding out about the processes used in designing and delivering elearning. While I’ve my own impressions, I thought it would be nice to bolster them with data. So here we are.
 
And I’m not asking what org you’re working for, because I’d appreciate honest answers. Please feel free to respond and circulate to those you know in other organizations (but try to only have one person from your org fill it out).

This is an experiment (hey, that’s what innovation is all about ;), so we’ll see how it goes. I’ll report out what happens when responses start petering out (or when I hit my 100 response cap ;). I welcome your comments or questions as well. Thanks!


25 May 2016

A richer suite of support

Clark @ 8:08 am

While it’s easy to talk about how we need to support the transition from novice to expert, it might help to be a little more detailed.  While it’s easy to say that the role of formal learning wanes, and the role of informal learning ramps up, what are the types of support we might look to?

I expanded a core diagram I’ve been using for quite a while, based upon earlier diagrams from others.  It’s also been used by others, and the core of the diagram is clear, but I wanted to elaborate it. The underlying point is that as individuals gather expertise the value of formal learning drops, and the value of informal learning increases.  Ok, but what does that mean?

It means that courses make sense for novices, who don’t know what they need nor why it’s important. As they start performing, however, their needs change. They start knowing what they need, and why it’s important, and they start just needing those resources.  They can be designed or curated, but they are either performance support in the moment or learning resources that develop understanding or abilities.  For the former, we’re talking about how-to videos, checklists, lookup tables, etc.  For the latter, we might be talking documents, documentaries, diagrams, or more interactive elements such as simulations.

At this stage we also need coaching and/or mentoring, and chances to communicate with our colleagues.  It’s the social work that will play a role in the development of the learner through interactions. Obviously, you can be doing communication in courses as well, and reflecting and collaborating at the practitioner stage too; these are continua, not boxes as portrayed here.  The point, however, is that the nature of the necessary support and the activities change.

And, of course, once an individual advances far enough, there’s little anyone can provide for them; instead they need the ‘creative friction’ of interactions with other experts and ideas to generate the new understandings that will advance the individual and the organization.  Reflecting together, solving problems together, and more, are all part of the activities that individuals undertake.

These activities don’t always happen well, and can be facilitated in many ways.  There are cultural factors as well.  There is a clear need for someone to ensure that these activities happen in optimal ways in a conducive environment. It doesn’t have to be L&D, and it won’t be if all they do is focus on training and courses, but it should be someone who understands a bit about how we think, work, and learn.  And I don’t know another group that is better placed.  Can you?

