Learnlets
Clark Quinn's Learnings about Learning
(The Official Quinnovation blog)

17 September 2014

Learning in 2024 #LRN2024

Clark @ 8:14 am

The eLearning Guild is celebrating its 10th year, and is using the opportunity to reflect on what learning will look like 10 years from now.  While I couldn’t participate in the Twitter chat they held, I optimistically weighed in: “learning in 2024 will look like individualized personal mentoring via augmented reality, AI, and the network”.  However, I thought I would elaborate in line with a series of follow-up posts leveraging the #lrn2024 hashtag.  The Twitter chat had a series of questions, so I’ll address them here (with a caveat that our learning really hasn’t changed: our wetware hasn’t evolved in the past decade and won’t in the next; it’s our support of learning that I’m referring to here):

1. How has learning changed in the last 10 years (from the perspective of the learner)?

I reckon the learner has seen a significant move to more elearning instead of an almost complete dependence on face-to-face events.  And I reckon most learners have begun to use technology in their own ways to get answers, whether via the Google or social networks like Facebook and LinkedIn.  And I expect they’re seeing more media such as videos and animations, and may even be creating their own. I also expect that the elearning they’re seeing is not particularly good, nor improving; if anything, it’s decreasing in quality.  I expect they’re seeing more info dump/knowledge test, more and more ‘click to learn more‘, more tarted-up drill-and-kill.  For which we should apologize!

2. What is the most significant change technology has made to organizational learning in the past decade?

I reckon there are two significant changes that have happened. One is rather subtle as yet, but will be profound, and that is the ability to track more activity, mine more data, and gain more insights. The Experience API coupled with analytics is a huge opportunity.  The other is the rise of social networks.  The ability to stay more tightly coupled with colleagues, sharing information and collaborating, has really become mainstream in our lives, and is going to have a big impact on our organizations.  Working ‘out loud’, showing our work, and working together is a critical inflection point in bringing learning back into the workflow in a natural way and away from the ‘event’ model.
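To make the tracking opportunity concrete, here’s a minimal sketch (in Python, using the requests library) of the kind of statement the Experience API lets you record about activity outside formal courses. This is my own illustration, not any particular system: the LRS endpoint, credentials, and activity identifiers are hypothetical placeholders; the verb URI follows ADL’s published verb list.

import requests

# One xAPI statement: actor / verb / object, describing a workflow activity
# ("working out loud") rather than a course completion.
statement = {
    "actor": {"mbox": "mailto:learner@example.com", "name": "Pat Learner"},
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/shared",
        "display": {"en-US": "shared"},
    },
    "object": {
        "id": "http://example.com/activities/troubleshooting-tip",  # made-up activity id
        "definition": {"name": {"en-US": "Troubleshooting tip posted to the team wiki"}},
    },
}

response = requests.post(
    "https://lrs.example.com/xapi/statements",   # hypothetical LRS endpoint
    json=statement,
    headers={"X-Experience-API-Version": "1.0.3"},
    auth=("lrs_user", "lrs_password"),           # placeholder credentials
)
response.raise_for_status()

An analytics layer would then query the Learning Record Store for patterns across many such statements, which is where the “mine more data, gain more insights” part comes in.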

3. What are the most significant challenges facing organizational learning today?

The most significant challenge is the status quo: the belief that an information-oriented event model has any relationship to meaningful outcomes.  This plays out in so many ways: order-taking for courses, equating information with skills, being concerned with speed and quantity instead of quality of outcomes, not measuring the impact; the list goes on.   We’ve deluded ourselves into thinking that an LMS and a rapid elearning tool mean you’re doing something worthwhile, when that’s profoundly wrong.  L&D needs a revolution.

4. What technologies will have the greatest impact on learning in the next decade? Why?

The short answer is mobile.  Mobile is the catalyst for change. So many other technologies go through the hype cycle: initial over-excitement, crash, and then a gradual resurgence (cf. virtual worlds), but mobile has resisted that pattern for the simple reason that the value proposition is so strong.  The cognitive augmentation that digital technology provides, available whenever and wherever you are, clearly has benefits, and it’s not courses!  It will naturally incorporate augmented reality with the variety of new devices we’re seeing, and be contextualized as well.  We’re seeing a richer picture of how technology can support us in being effective, and L&D can facilitate these other activities as a way to move to a more strategic and valuable role in the organization.  As noted above, new tracking and analysis tools, and social networks, will also have an impact.  I’ll add that simulations/serious games are an opportunity that has yet to really be capitalized on.  (There are reasons I wrote those books :)

5. What new skills will professionals need to develop to support learning in the future?

As I wrote (PDF), the new skills that are necessary fall into two major categories: performance consulting and interaction facilitation.  We need to not design courses until we’ve ascertained that no other approach will work, so we need to get down to the real problems. We should hope that the answer comes from the network when it can, design performance support solutions when it can’t, and reserve courses for only when it absolutely has to be in the head. Getting good outcomes from the network takes facilitation, and I think facilitation is a good model for promoting innovation, supporting coaching and mentoring, and helping individuals develop self-learning skills.  So the ability to get to the root causes of problems, choose between solutions, and measure the impact is key for the first part, and understanding what skills are needed by individuals (whether performers or mentors/coaches/leaders), and how to develop them, are the key new additions.

6. What will learning look like in the year 2024?

Ideally, it would look like an ‘always on’ mentoring solution, so the experience is that of someone always with you to watch your performance and provide just the right guidance to help you perform in the moment and develop you over time. Learning will be layered onto your activities; it will only occasionally require special events, and mostly will be wrapped around your life in a supportive way.  Some of this will be system-delivered, and some will come from the network, but it should feel like you’re being cared for in the most efficacious way.

In closing, I note that, unfortunately, my Revolution book and the Manifesto were both driven by a sense of frustration around the lack of meaningful change in L&D. Hopefully, they’re riding or catalyzing the needed change, but in a cynical mood I might believe that things won’t change nearly as much as I’d hope. I also remember a talk (cleverly titled Predict Anything but the Future :) that said the future tends to come out as an informed basis would predict, but with an unexpected twist, so it’ll be interesting to discover what that twist will be.

16 September 2014

On the Road Fall 2014

Clark @ 8:05 am

Fall always seems to be a busy time, and I reckon it’s worthwhile to let you know where I’ll be in case you might be there too! Coming up are a couple of different events that you might be interested in:

September 28-30 I’ll be at the Future of Talent retreat  at the Marconi Center up the coast from San Francisco. It’s a lovely spot with a limited number of participants who will go deep on what’s coming in the Talent world. I’ll be talking up the Revolution, of course.

October 28-31 I’ll be at the eLearning Guild’s DevLearn in Las Vegas (always a great event; if you’re into elearning you should be there).  I’ll be running a Revolution workshop (I believe there are still a few spots), part of  a mobile panel, and talking about how we are going about addressing the challenges of learning design at the Wadhwani Foundation.

November 12-13 I’ll be part of the mLearnNow event in New Orleans (well, that’s what I call it, they call it LearnNow mobile blah blah blah ;).  Again, there are some slots still available.  I’m honored to be co-presenting with Sarah Gilbert and Nick Floro (with Justin Brusino pulling strings in the background), and we’re working hard to make sure it’s a really great deep dive into mlearning.  (And, New Orleans!)

There may be one more opportunity, so if anyone in Sydney wants to talk, consider Nov 21.

Hope to cross paths with you at one or more of these places!

10 September 2014

Learning Engineering

Clark @ 8:37 am

Last week I had the opportunity to attend the inaugural meeting of the Global Learning Council.  While not really global in either sense (there was little representation from overseas, or from segments other than higher ed), it was a chance to refresh myself in some rigor around learning sciences. And one thing that struck me was folks talking about learning engineering.

If we take the analogy from regular science and engineering, we are talking about taking the research from the learning sciences, and applying it to the design of solutions.  And this sounds like a good thing, with some caveats.  When talking about the Serious eLearning Manifesto, for example, we’re talking about principles that should be embedded in your learning design approach.

While the intention was not to provide coverage of learning science, several points emerged at one point or another as research-based outcomes to be desired. For one, the value of models in learning.  Another was, of course, the value of spacing practice. The list goes on.  The focus of the engineering, however, is different.

While it wasn’t an explicit topic of the talks, it emerged in several side conversations: the focus is on design processes and tools that increase the likelihood of creating effective learning practices.  This includes doing a suitable job of creating aligned outcomes through processes of working with SMEs, identifying misconceptions to be addressed, ensuring activities are designed that have learners appropriately processing and applying information, an appropriate spread of examples, and more.

Of course, developing a rigorous course on any topic is a thorough exercise.  Which is desirable, but not always pragmatic.  While the full rigor of science would go as far as adaptive intelligent tutoring systems, the amount of work to do so can be prohibitive under practical constraints.  It takes a high-importance topic and a large potential audience to justify this for other than research purposes.

In other cases, we use heuristics.  Sometimes we go too far: just dumping information and adding a quiz is all too common, though that’s got little likelihood of having any impact.  Even if we do create appropriate practice, we might only have learners practice until they get it right, not until they can’t get it wrong.

Finding the balance point is an ongoing effort. I reckon that the elements of good design are a starting point, but you need processes that are manageable, repeatable, and scalable.  You need structures to help, including representations that support identifying the key elements and make it difficult to ignore the important ones.  You ideally have aligned tools that make it easy to do the right things.

And if this is what Learning Engineering can be, systematically applying learning science to design, I reckon there’s also a study of learning science engineering, aligning not just the learning, but the design process, with how we think, work, and learn.  And maybe then there’s a learning architecture as well: just as an architect designs the basic look and feel of the halls and rooms and the engineers build them, a learning architect designs the curriculum approach and the pedagogy, and the learning engineers follow through on those principles in developing courses.

Is learning engineering an alternative to instructional design?  I’m wondering if the focus on engineering rather than design (applied science, rather than art) and learning rather than instruction (outcomes, not process), is a better characterization.  What do you think?

26 August 2014

Aspiration trumps trepidation

Clark @ 7:43 am

Last week’s #lrnchat (a Twitter chat on learning that runs Thursday evenings for an hour, 5:30 PT/8:30 ET) was on the topic of fear-mongering in Organizational Learning.  The point is that fear-mongering often happens (by definition always wrongly), but what are the reasons, the impacts, and the ways to avoid it?  And among my responses was one that I like as a quip.

I was, in particular, flashing back on the book Story Wars, which talked about how advertising has changed. This was in the context of fear-mongering as an approach to motivating behavior. In that book, they cited how advertisements in older days were designed to target your concerns. In essence, they made you worry about shortcomings as a motivation to buy remedies, whether to address your personal hygiene or your appearance of success.

What’s changed is that they’ve moved on, finding that tapping into your goals is more motivating.  What are you trying to achieve? Who are you and what reflects your passions?  Then they provide products that can align with your self-image.  Of course, their ability to target your market segment is much more advanced, so they know more about who you are and have more specific means of reaching you.

The same is true in learning. It’s far better to tap into your aspirations to motivate your learning than to drum on your fears. The latter will work somewhat, if you’ve got legitimate concerns (e.g. losing your job), but it’s far better to help you understand how this will help you.

So, when it comes to motivation, I’ll argue that targeting aspiration trumps targeting trepidation. Help people understand why this is valuable or important, rather than instilling fear of the consequences of failure to comply. It’s part of a better culture, and a better workplace.  And that’s something you aspire to, right? ;)

21 August 2014

Rethinking Design: Curriculum

Clark @ 8:13 am

In addition to yesterday’s post about pedagogy, I also mentioned the need to get deeper on curriculum.  The notion is that we need to redefine curriculum as a way to get away from a content base and start moving to an activity base.

[Image: a task-centered curriculum]

The focus is on creating a curriculum that has tasks at its core, and these are tasks like those the learner will be expected to perform in the world.  These tasks can be viewed as competencies: if the learner can perform them, there is evidence that they are prepared to perform.

The goal is to choose tasks, with the final task likely being chosen first and then working backwards (as in Understanding By Design) to determine what needs to be done.  From each chosen ‘task’ will come a suite of activities comprising the pre-, in-, and post-class work, but here we are focusing on the overall curriculum before we get into the individual pedagogy.

Note that the content is subsequently chosen to support successful execution of the tasks, and is not presented to the learner but is, instead, made available to them at the time of the task.  The goal is to have them process the content in service of accomplishing the task, an approach more consonant with our cognitive architecture. We’re more likely to remember information we’ve had to process than information we’ve just been presented with.  Information that a learner can recite is unlikely to be activated at a relevant time (“inert knowledge”) unless it’s been applied, and the focus should be on the application, not the recitation.
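Purely as an illustration of this structure (a sketch of my own, not a description of any actual tool), here’s what a task-centered curriculum might look like as a data structure in Python: tasks at the core, each carrying its pre-, in-, and post-class activities, with content resources attached to the task rather than presented up front. All the names are made up.

from dataclasses import dataclass, field
from typing import List

@dataclass
class Activity:
    phase: str          # "pre", "in", or "post"
    description: str

@dataclass
class Task:
    """A competency-aligned task the learner will actually be expected to perform."""
    name: str
    activities: List[Activity] = field(default_factory=list)
    resources: List[str] = field(default_factory=list)  # content made available at task time, not presented

@dataclass
class Curriculum:
    final_task: Task                    # chosen first; the rest is worked out backwards from it
    supporting_tasks: List[Task] = field(default_factory=list)

curriculum = Curriculum(
    final_task=Task(
        name="Plan a defense against common hacking attacks",
        activities=[
            Activity("pre", "Draft a defense plan using the attack references"),
            Activity("in", "Compare and critique plans with peers"),
            Activity("post", "Refine the plan against a simulated attack"),
        ],
        resources=["Attack-type reference", "Defense plan checklist"],
    ),
)

The point of the structure is simply that the activity, not the content, is the unit of design; the resources hang off the task.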

Again, this turns out to be very much the approach of Roger Schank of Socratic Arts as well, but it emerged from my own thoughts and frustrations before I found out about the Story-Centered Curricula.  Of course, I referenced the Goal-Based Scenarios in my book on learning (game) design, but I hadn’t been aware of his curricular processes at the time of writing that, or of my Reimagining rant pointed to above.

There are more details required to fully flesh out the process here. I reckon it’d be easier to do with an employer than with faculty, but even with faculty, I reckon that if you can get more than one in a room, focus on decisions, and help them understand the power of activity-focused learning, it’ll work. Fingers crossed ;).

20 August 2014

Rethinking Design: Pedagogy

Clark @ 8:01 am

In thinking through how to design courses that lead to both engaging experiences and meaningful outcomes, I’ve been working on the component activities.  As part of that, I’ve been looking at elements such as pedagogy in pre-, in-, and post-class sessions so that there are principled reasons behind the design.

[Image: Pre-, In-, & Post-Class activities]

So here I’m looking for guidance on aligning what happens in all three sections.  In this case, two major types of activities have emerged: more procedural activities, such as using equipment appropriately; and more conceptual activities, such as making the right decisions about what to say and do.  These aren’t clearly discriminated, but it’s a coarse description.

Of course, there’s an introduction that both emotionally and cognitively prepares the learner for the coming learning experience.

So for conceptual tasks, what we’re looking to do is drive learning to content.  In typical approaches, you’d be presenting conceptual information (e.g. ‘click to see more‘) and maybe asking quiz questions.  Here, I’m looking to make the task one of processing the information to generate something, whether a document, presentation, or whatever, such that the processing is close to the way the information will be used.  So they might create a guide for decisions (e.g. a decision tree), or a checklist, or something else that requires them to use the information. (And if the information doesn’t support doing, it’s probably not necessary.)  As support, in a recent conversation I heard that interviewed organizations said making better decisions was the key to better job performance.

In the procedural approach, on the other hand, we really want to give them practice in the task. It may be scaffolded, e.g. simplified, but it’s the same practice that they’ll need to be able to perform after the learning experience. Ideally, they’ll have to explore and use content resources to figure out how to do it appropriately, in a guided-exploration sense, rather than just be given the steps.

In both cases, models are key to helping them determine what needs to happen.  Also in both cases, an instructor should be reviewing their output. In the conceptual case, learners might get feedback on their output, and have a chance to revise their creation.  In the case of the practice, the experience is likely a simulation, and the learner should be getting feedback about their success.  In either case, the instructor has information about how the cohort is doing.  So…

…for in-class learning, the learners should be reflecting on their performances, and the instructor should be facilitating that at the beginning, using the information about what’s working (and what’s not).  Then there should be additional activities that require the learners to interact with the material, processing (conceptual) or applying (procedural) it with each other, followed by facilitated reflection.

Finally, after class the learners should be given elaborative activities.  In the case of the conceptual task, coming up with an elaborated version, or some additional element that helps cement the learning, would be valuable.  The practice or activity should get fleshed out to the point where the learner will be capable of acting appropriately after the learning experience, owing to sufficient practice and appropriate decontextualization. The goal is retention over time and transfer to all appropriate situations.

Am I making sense here?

12 August 2014

Deeper activities

Clark @ 8:04 am

A while ago, I argued for an activity-based curriculum.  The point was to rebel against the usual content-based curriculum, and push us to more meaningful learning. And, of late, I’ve had a chance to reexamine both the curriculum ideas, and the pedagogical implications.

So I’ve been in a situation where I’ve been handed a curriculum already developed, and the content is already being fleshed out. In trying to move beyond good, albeit traditional, elearning, I’ve been working hard on the notion of what a meaningful activity (read: practice, task, etc) would be.  As context, we’re working here within a pre-class, in-class, and post-class model.

As a consequence, I’m pushing an alternative to what would otherwise be content presentation pre-class, practice and group discussion in-class, and simulation and summary assessment post-class.  While this is not too far from traditional blended learning, I’m also trying to get better alignment with what learners will be doing after the learning experience, with sufficient practice.

So for each module, I’m looking for a meaningful practice.  Even for a knowledge-based task, I’m asking learners to develop something that requires them to integrate the knowledge, not just be presented with it and tested, and to revisit the knowledge several times.  So, for example, if learners are looking at types of hacking attacks, I ask them to create a defense plan as a pre-class task.  They’ll get a chance to self-evaluate, and get instructor feedback, before generating a second attempt.  In many ways, it doesn’t matter what they create as long as they’re making a suitably sincere effort; it’s the processing that matters.

A colleague asked whether this meant that they’d always generate a product of learning, and my preliminary answer was yes, and then I realized that there was another way to view it. It could also be the trace of the learner in a simulation, but in some sense that’s a product as well.  The important point is to have learners perform and create an output of that performance as a manifestation of their thinking.  It’s not taking knowledge tests (if it absolutely has to be known cold ‘in the head’, you’ve got the excuse for a tarted-up drill-and-kill, but make sure it absolutely does), but processing information in meaningful ways.

In class, they’ll still be doing reflection, but they should also be processing and/or practicing: for example, discussing and comparing their guides, and maybe then an activity that refreshes their knowledge of the attacks.  The team came up with a game that has one side giving hints about an attack, and the other side trying to guess it.

After class, it’s more elaborated practice. For instance,  they might be implementing their defenses in a sim, or refining their attacks, or…  The activities need to reinforce and build, reactivating and reapplying the knowledge in ways that mimic how it will be used in the performance context so that practice is both meaningful and spaced.

I’m still working within the existing paradigm, but the work I’m inspired by is that of Roger Schank and his team at Socratic Arts, where they’re rethinking the curriculum more comprehensively: they get the subject matter experts (SMEs) to sit down and come up with a series of activities that are the curriculum. This is what I intended, but at this point I’m still working within a curriculum that’s already underway. Even this small change will make for better learning, and we will get to the curriculum as well ;).

5 August 2014

Learning Experience (LX) Drinking Game

Clark @ 8:07 am

Having found a fun site of bad UX design, I was then followed by someone who runs the UX Drinking Game site.  And it made me think maybe we need an LX Drinking Game.  So I started tweeting out some drinking game rules. I encourage you to join in with the hashtag #LXDrinkingGame.

So, my first list of ‘drink’ cues is:

if they ask for a pre-test/post-test design

if the alternatives to the right answer are so silly or obvious that you don’t need to know anything

if the course objectives are ‘know’ or ‘understand’

if someone says ‘use a click to see more’

if someone says ‘use an avatar because people like it’

if someone comes in and says “we need a course for this”

if someone dumps PDFs and/or PPTs on you and expects a course in a few days

if all they’re measuring is cost/time/seat

So, what other rules do we need?  Tweets (again, #LXDrinkingGame) or comments welcome ;).

31 July 2014

Layering on success

Clark @ 8:03 am

In a previous post, I talked about the layers around learning design.  One of the layers that’s increasingly interesting to me is the notion of success skills, or meta-skills.  For example, the SCANS competencies are a decent suite recognizing the general skills for success that cross disciplines.

However, you really can’t focus on such skills in isolation. Like most meta-skills, they need to be applied in a domain.  As a consequence, they really need to be worked on while developing some other skills. That is, when you’re developing a curriculum, you have opportunities to require using those skills, but they need to be explicitly included and, better yet, assessed.

In the field of educational software, there have been many ‘games’ that claimed “develops problem-solving skills”. This wasn’t accurate, as most of them required problem-solving skills, but there was no development. Development would require assessing performance and providing feedback. And that’s what we want to do to develop these skills.

[Image: Competencies across curricula]

So my suggestion is to layer these requirements on across the curriculum, and assess them separately.  The skills, like organizing, problem-solving, communicating, researching, etc., are naturally part of an activity-based curriculum, but need to be deliberately inserted at reasonable rates and tracked.  It’s not hard: you choose this assignment (task/activity/practice) to include a presentation, that one to require research, another to require a design task, etc.  And you assess them across assignments.

So you look at their repeated performance on each skill, each time it’s included.  You can provide support and gradually remove it (as you do for other skill-development practice).
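As a purely illustrative sketch (my own, assuming a simple 1-4 rubric), here’s what tracking a meta-skill across assignments might look like in Python: each assignment that deliberately includes a skill gets a rubric score, and grouping by skill makes the repeated-performance trajectory visible. The skills, assignments, and scores are invented.

from collections import defaultdict

# (assignment, meta-skill, rubric score on a 1-4 scale) -- invented data for illustration
observations = [
    ("Assignment 1", "communicating", 2),
    ("Assignment 2", "researching", 3),
    ("Assignment 3", "communicating", 3),
    ("Assignment 4", "researching", 3),
    ("Assignment 5", "communicating", 4),
]

# Group repeated performances by skill so growth (or its absence) is visible per skill
history = defaultdict(list)
for assignment, skill, score in observations:
    history[skill].append((assignment, score))

for skill, trajectory in history.items():
    print(f"{skill}: " + " -> ".join(f"{a}: {s}" for a, s in trajectory))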

The point is to not only develop the learner’s ability to acquire the curricular skills, but also to acquire the meta-skills.  For instance, if you are helping people acquire job skills, you are also developing their ability to hold the job, and self-improve over time.

Think of it this way.  People acquire a job by their ability to do X, but they will need to know how to work in a job context regardless of whether it’s X, Y, or Z.  Also, X will change to X+ and X++ over time, and the skills to keep up to date and move up require the meta-skills.

I think of this as one of the pillars of a successful educational practice: develop the learners not only in the domain, but as learners.  Develop them as people, not only as practitioners of a competency.  I think this is a practical approach; what do you think?

16 July 2014

Models for learning

Clark @ 8:10 am

In a previous post, I suggested that we should not do the ‘click to learn more’, as it was just about presenting content.  But we do need to present content, so what content makes sense?  Obviously, examples are one thing, but let me make the case that the ‘how to’, the concept, should be in the form of a model.

There’s a problem in that Subject Matter Experts (SMEs) don’t have access to what they do, but they do have access to what they know, so it’s really easy to get a knowledge dump. And it’s sometimes hard work to make sense of it, and easier to just recite it. For example, expertise in many areas requires careful distinctions (e.g. in instructional design, between the elements of learning).  However, it’s hard for learners to acquire all those careful distinctions without the underlying rationale for how they differ.

Similarly, most procedures to do something are guided not by arbitrary reasons, but instead are sequenced because of inherent constraints.  These constraints guide the proper procedures.  There’s a reason you do X before Y, and then a causal relationship that explains what you look for before deciding to do W instead of Z.

Too often, I see someone presenting learners with an arbitrary list of different things when there are conceptual reasons why they differ. Similarly, I’ll see steps presented without a rationale for why. In both cases, learners will remember better, and perform more robustly (particularly in environments that change), if they have the model that explains what to do as well as the information.  While this might seem like more information, it’s really not, as the model minimizes the amount of arbitrary information you present. And it leads to better outcomes, so it would be worth it anyway.

Models give us a couple of useful things: they help us explain what has happened, and predict what will happen (e.g. if we do A, we’ll see B).  That makes us more flexible in our actions, a useful trait.  As an aside, models can also draw upon metaphors to facilitate developing a useful understanding. Whether it’s flows, transformations, or whatever, finding a concrete equivalent in the world can help recollection and application.

The problem, of course, is getting the model. It’s not always there, nor even easily inferable.  Which doesn’t mean you can ignore it.  The designer must be willing to work until they can understand it.  But it’s doable, and valuable.

So, please, model your learning design on the model of good learning with models. (Ok, I went too far there :)

