Learnlets

Clark Quinn’s Learnings about Learning

Search Results for: engagement

Plans for 2010

6 January 2010 by Clark 1 Comment

The Learning Circuits Blog Big Question of the Month is "predictions and plans for 2010", specifically:

  • What are your biggest challenges for this upcoming year?
  • What are your major plans for the year?
  • What predictions do you have for the year?

I’ve already blogged the predictions question, so I’ll just address the first two points.

As a consultant, my big challenge is always finding more people who I can help.   With my colleagues in the Internet Time Alliance, we’re looking for organizations that know they want to leverage the power of social media to develop a collective intelligence infrastructure, but need assistance.   Through Quinnovation, I’m looking to improve organizational learning design, whether through developing immersive learning simulation capability, mobile delivery, performance solutions, adaptive systems, content models, or all of the above as a strategic lever.   I’ve helped lots of folks, and it’s clear there’s more need, so I’m just looking for more opportunities to really improve things, and ways to find those opportunities.

My plans are severalfold.   First, I’ve got to finish the manuscript for my mobile book.   I’m also committed to execute against the contracts I already have to continue to deliver great solutions.   And I intend to continue experimenting, speaking (hope to see you at the Guild’s Learning Solutions conference in March), writing, and of course, consulting.

I’m also intending to elaborate on some recent thoughts on learning experience design.   I think there’s a real opportunity to wrap some definition around the different components that helps systematize the integration of engagement and effective learning.   This is a generalization of Engaging Learning, going broader in areas of application, and across technologies.   I think there’s a need (just look at all the bad elearning still out there), and as we start delivering learning in more distributed ways and in wider contexts, we need a conceptual framework that helps us design in meaningful ways.

Naturally, I welcome your participation and assistance in any of the above!

Blurring boundaries

7 December 2009 by Clark Leave a Comment

I just downloaded a couple of new apps onto my iPhone. Okay, so one was a free trial of a game, but the other was a really interesting offering, and it led to some thoughts about organizational silos and new functionality.

The app was a new release by AT&T called Mark the Spot, which lets you report the occurrence and location of a problem with your coverage. This is a new way to interact with customers, allowing them to serve as agents of "can you hear me now"-style coverage evaluation. Given that AT&T has just turned up as the lowest-rated of the four major carriers here in the US, according to leading consumer champion Consumer Reports, it's a step in the right direction.

Now this is an instance of considering a broader reach of engagement in our conversations, tapping into collective intelligence. As I've been learning with my colleagues in the Internet Time Alliance, tapping into collective intelligence goes beyond internal conversations to include partners and customers. It's also a broader interpretation of learning, in the senses that I argue we need to consider, including problem-solving, innovation, etc. And it's mobile.

So here's the question I pondered: is this tech support? Marketing? And what occurred to me is that it just isn't easy to categorize. It's a dialog with the customer, gathering data about coverage, which could be seen as market research. They could also extend it via a call into an issue-resolution exercise (OK, so the app doesn't actually make the call for you, but it could and should: "click to send the data and be connected to a representative"). You could even bake in some trouble-shooting support as a performance support exercise.

The approach, and the potential, crosses boundaries in terms of the benefits and how it must be supported organizationally. We're beginning to see a new notion of mashup that combines functionalities that might normally be seen in separate organizational areas, but from a customer perspective, they're linked. And we're seeing a hybrid of communication capabilities, linking the data capabilities of an app with voice, and even media files (e.g. some trouble-shooting information).

Around 1999, the CEO of Cisco, John Chambers, opined that elearning was going to be so big that email would seem like a rounding error. I think it's not just about education over the internet, but about the broader picture of learning, including performance support and social learning; and it's not just the desktop internet, but mobile apps and more. The full performance ecosystem isn't just within the organization; it's external as well. It's what your company builds for you, what your 'providers' build for you (device, service, etc.), and, ultimately, how you integrate that into your personal learning network.

The implications are huge. How do organizations realign to make meaningful information environments for their employees, partners, and customers? How do we skill up society to take advantage of and shape this environment for the benefit of all? And how do we develop ourselves to manage and optimize the environment to help us achieve our goals?

I think we are seeing an inflection point that will trump email, but it’s not about education, it’s about the broad intersection between people’s goals and our technology infrastructure.   And our role in that, as designers of learning experiences and performance ecosystems.   We have a fair bit of understanding of cognition and social interaction, and increasing experience with different technology capabilities.   Now it’s time to put that all to work to start creating meaningful new opportunities. Who’s game?

Ignoring Informal

14 October 2009 by Clark 4 Comments

I received in the mail an offer for a 3 book set titled Improving Performance in the Workplace.   It’s associated with ISPI, and greatly reflects their Human Performance Technology approach, which I generally laud as going beyond instructional design.   It’s also by Pfeiffer, who is my own publisher, and they’re pretty good as publishers go.   However, I noticed something that really struck me, based upon the work I’ve been doing with my colleagues in the Internet Time Alliance (formerly TogetherLearn).

The first volume is really about assessing needs and design; it includes behavioral task analysis and cognitive task analysis, and even talks about engagement strategies in simulation and gaming, including video gaming. The second volume covers performance interventions, including elearning, coaching, knowledge management, and more (as well as things like incentives, culture, EPSS, feedback, etc.). The third volume's on measurement and evaluation.

All this is good: these are important topics, and having a definitive handbook about them is a valuable contribution (and priced equivalently, the whole set is bargain-priced at $400).   However, while I don’t have the book to hand to truly evaluate it, it appears that there are some gaps.

In my experience, some issues are not behavioral or cognitive but attitudinal. Consequently, I'd have thought there might be some coverage. There was a chapter in Jonassen's old Handbook on Research in Ed Tech on the topic, and I've derived my own approach from that and some other readings. When they get into tools, they seem to miss virtual worlds, and they seem to repeat the straw-man case against discovery environments (many years ago it was recognized that pure discovery wasn't the way to go, and guided discovery was developed). It bugs me that I can't find the individual authors, but I do recognize the name of one of the editors. But these aren't the biggest misses, to me.

Overall, there seems to be no awareness of the whole thrust of social and informal learning.   Ok, so Jay’s book on Informal Learning is relatively new, and the concrete steps may still be being sorted out, but there’s a lot there.   Or perhaps it’s covered in Knowledge Management (after all, Marc Rosenberg’s been deeply involved in ISPI and wrote the Beyond e-Learning book).   Yet it seems a bit buried and muddled, and here’s why:

I’m working with a client now, and one of my tasks is surveying how they’re using social media.   A group responsible for technical training (and they’re an engineering organization) recognized that they weren’t able to keep up with the increasing quantity and quality of changes that were coming.   Rather than do a performance improvement intervention, they realized that another opportunity would be to start putting up information and inviting others to contribute.   They put up a wiki, and first maintained it internally, and then gradually devolved some of the responsibility out to their ‘customers’.

The point is, how does that fit into the traditional paradigm?   And yet, increasingly, we’re seeing and recommending approaches that go beyond the categories that fit here.   I wonder if their metrics include the outputs of enabling innovation.   I wonder if their interventions include expertise finders and collaboration tools. I wonder if their analyses include the benefits of ‘presence’.

Times are changing, faster and faster. I think these books would've been the ideal thing maybe 5 years ago. Now, I think they're emblematic of a training mindset when a larger perspective is needed. These come into play after you've identified that a formal approach is needed. They use the phrase 'performance landscape', but their picture doesn't seem to include the concepts that Jay includes in his 'learnscape' and I in the 'performance ecosystem'.

Complicit Clients

6 August 2009 by Clark 5 Comments

I regularly rail against cookie-cutter learning design, boring elearning, etc.   I like to blame it on designers who don’t know the depths of learning behind the elements of design, and perhaps also on managers who don’t work to ensure that the learning objectives are tied closely to meaningful business outcome.   And I think that’s true, but of course there’s another culprit as well: clients who just ask for the same old thing!

I regularly work with a couple of partners who use me when there's a need to go to the 'next level', whether it's mobile, pushing the engagement envelope, or working more strategically (that's one of the ways I help clients, too). However, too often they're just asked to turn content into courses, and the clients don't care that the learning objectives in that content are too low-level, too knowledge-focused, completely abstract or de-contextualized, and generally not meaningful. Now, my partners generally push back a bit, trying to help the client realize the value of a deeper design, but many times the client doesn't want to put any more money in, doesn't want to think about it; they just want that course up with a quiz (even with a pre-test! *shudder*). And my partners will go along, because creating elearning is their business and they can't just turn away work.

And I've heard that from in-house departments as well. As one of the attendees at my strategic elearning workshop a couple of months ago said, the managers from other business units say "just do that stuff you do" and don't want any deeper thought put into it. They want it fast, based upon the content, and apparently don't care that it isn't going to lead to any meaningful change. Or don't know the difference. Hey, they learned that way, so it must be OK, right?

However, I think we owe it to the learners, to those clients, and to ourselves to start educating those clients, internal or external, about good learning. You've got to know it yourself first, of course, but once you're doing it anyway, there's really no extra overhead at the first level. But you want to start pushing back: "what's the behavior that needs to change?", or "what decisions do they need to be able to make that they can't make correctly now?" And we need to ask "how will you know that it's changed? What are the metrics that you're trying to impact?" Once you've got them thinking about measurable change, you have the opportunity to start talking about meaningful impact and good design to achieve outcomes.

Frankly, you can’t complain about relevance to the organization if you’re not fighting to achieve better outcomes, ones that matter.   So, educate yourselves, improve your processes, and then fight to be doing more meaningful stuff.   Hey, we’re supposed to be about learning, and marketing our services is really about good customer education! Get them educated, and get to be doing more meaningful and consequently rewarding design.

Making designing good learning easier

30 July 2009 by Clark Leave a Comment

On my last post, I got a comment that really made me think.   The problem was content coming as PPTs from SMEs, and the question was poignant: “Given limited time and resources on a project how can you plan in advance to ensure that your learning is engaging and creates effective outcomes?”   I commented a reply, but I’d like to elaborate on that.

I like the focus on the 'planning' part: what can you do up front to increase the quality of your learning outcomes? It's a recursive design problem: people need to be able to design better, so what training, job aids, tools, and/or social learning can we develop to make this work? Having just done this on a project where a team I was a member of was responsible for generating a whole curriculum around the domain, I can speak with some confidence about how to make this work.

First are the tools. Too often, templates enforce rigor around having the elements, rather than around what makes those elements really work. So, on the project, I guided not only the design of the templates but also the definitions associated with the elements, which helped ensure they accomplished the necessary learning activities. For example, it's no good to have an introduction that doesn't activate the relevant prior experience and knowledge, doesn't help the learner comprehend why this learning is important, or even accomplishes this in an aversive way (can you say "pre-test"? :). This is the performance support component that helps make it easy to do things well and more difficult to do the wrong thing. Similarly with ensuring meaningful activity in the first place, etc.

Next is the understanding. This comes both by creating a shared understanding in the team, and then by refining the process, making the outcome a 'habit'. First, I'd worked with some of the team before, so they shared my design principles; then I presented and co-developed that understanding with the client. Then, as first-draft content came out, I'd critique it and use that to tune the template, and the understanding amongst the content developers.

The involvement in refining the design process took some time, but really paid off: the quality of the resulting output took a steep increase and then stabilized as a good-quality learning experience, yet one reproducible in a cost-effective, sustainable, and manageable way.

As I’ve mentioned before, the nuances between bad elearning and really effective and engaging content are subtle to the untrained eye, but the outcomes are not, both subjectively from the learner’s experience, and objectively from the outcomes.   You should be collecting both those metrics, and reviewing the outcomes, as they both provide useful information about how your design is working (or not) and how to improve it.

If it matters, and it should, you really should be reviewing and tuning your processes to achieve engagement and learning outcomes. It's not more expensive in the long term, though it does take more work. But otherwise it's just a waste of money, and that is expensive! You'll end up in the situation Charles Jennings cites, when "you might as well throw the money spent on these activities out the window." Don't waste money; spend the time ensuring that your learning design processes achieve what they need to. Your organization, and your learners, will thank you.

Getting Revolutionary: LC Big Q

3 April 2009 by Clark 1 Comment

The Learning Circuits Blog Big Question of the Month is whether and how to get 'unstuck' when you've got a lot to offer and it's well beyond what they expect you to do in your job.

This actually resonates with two separate things, some thoughts around ‘being revolutionary’, and a previous post based upon a similar complaint that triggered this month’s question (must be a lot of understandable angst out there).   The previous post was about trying to meet unreasonable expectations, and the individual wasn’t getting the support they needed to do the job the way it should be done.   Similarly the big question was triggered by someone knowing what should be done but feeling trapped.

The thread that emerges, for me, is that training departments can't keep operating in the same old way, even though formal instruction doesn't have to die (just improve). Incrementalism isn't going to be enough: optimal execution is going to be merely what it takes to stay in the game, and the competitive advantage will be the ability to innovate new value to offer. It's just too easy to copy a successful product or service, and the barriers to entry aren't high enough to prevent competition. You never know when a viral or chaotic event will give someone a marketing advantage, so you've got to keep moving.

Trying to keep to the status quo, or to slowly expand your responsibility, is going to fail, as things are moving too quickly. You have to seize the responsibility now to take on the full suite of performance elements: job aids, portals, social learning, content and knowledge management, and more, and start moving. It still has to be staged, but it's a perspective shift that will move you more strategically and systemically towards empowering your organization.

And back to the tactics, what do you do when your clients (internal or external) aren’t pushing you for more and better?   Show them the way.   While I’ve learned that conceptual prototypes don’t always work (some folks can’t get beyond the lack of polish, even when you’re just showing the proof of concept), try and mock up what is on offer, and talk them through it. Help them see why it’s better.   Do a back of the envelope calculation about how it’s better.   Bring in all the factors: outcomes, performance, engagement, learner experience, whatever it takes.

Then, if they don’t want it, do your best within the constraints to do it anyway (write better objectives, practice, etc. even if they won’t appreciate it), and live with what you can do.   And, truly, if you’re capable of more (not more work, better/smarter work), and it’s on offer but continually not accepted, it probably is time to move on.   Don’t give in, keep up the fight for better learning, your learners need it!

Monday Broken ID Series: Process

22 March 2009 by Clark 5 Comments

Previous Series Post

This is the last formal post in a series of thoughts on some broken areas of ID that I’ve been posting for Mondays.   The intention is to provide insight into many ways much of instructional design fails, and some pointers to avoid the problems. The point is not to say ‘bad designer’, but instead to point out how to do better design.

We’ve been talking about lots of ways instructional design can be wrong, but if that’s the case, the process we’re using must be broken too.   If we’re seeing cookie-cutter instructional design, we must not be starting from the right point, and we must be going about it wrong.

Realize that the difference between really good instructional design, and ordinary or worse, is subtle.   Way too often I’ve had the opportunity to view seemingly well-produced elearning that I’ve been able to dismantle systematically and thoroughly.   The folks were trying to do a good job, and companies had paid good money and thought they got their money’s worth.   But they really hadn’t.

It’d be easy to blame the problems on tight budgets and schedules, but that’s a cop-out.   Good instructional design doesn’t come from big budgets or unlimited timeframes, it comes from knowing what you’re doing.   And it’s not following the processes that are widely promoted and taught.

You know what I'm talking about – the A-word, that five-letter epithet – ADDIE. Analysis, Design, Development, Implementation, and Evaluation. A good idea, with good steps, but bad implementation. Let me take the radical extreme: we're better off tossing out the whole thing rather than continuing to allow the abominations committed under that banner.

OK, now what am I really talking about? I was given a chance to look at an organization's documentation of their design process. It was full of taxonomies, and process, and all the ID elements. And it led to boring, bloated content. That's what you get if you follow all the procedures without a deep understanding of the underpinnings that make the elements work, without knowing what can be finessed based upon the audience, and without adding the emotional elements that instructional design largely leaves out (with the grateful exception of Keller's ARCS model).

The problem is that more people are doing design than have sufficient background, as Cammy Bean’s survey noted.   Not that you have to have a degree, but you do have to have the learning background to understand the elements behind the processes.   Folks are asked to become elearning designers and yet haven’t really had the appropriate training.

Blind adherence to ADDIE will, I think, lead to more boring elearning than someone creative taking their best instincts about how to get people to learn.   Again, Cathy Moore’s Action Mapping is a pretty good shortcut that I’ll suggest will lead to better outcomes than ADDIE.

Which isn't to say that following ADDIE when you know what you're doing, and have a concern for the emotional and aesthetic side (or a team with same), won't yield a good result; it will. And following ADDIE likely will yield something that's pretty close to effective, but it's so likely to be undermined by the lack of engagement that there's severe cause for worry.

And, worse, there's little in there to ensure that the real need is met: nothing asking the designer to go beyond what the SME and client tell them and ensure that the behavior change is really what's needed. The Human Performance Improvement model actually does a better job at that, as far as I can tell.

It's not hard to fix the problem. Start by finding out what significant decision-making change will impact the organization or individual, and work backward from there, as the previous posts have indicated. I don't mean to bash ADDIE, as it's conceptually sound from a cognitive perspective; it just doesn't extend far enough pragmatically in terms of focusing on the right thing, and it errs too much on the side of caution instead of focusing on the learner experience. It's not clear to me that ADDIE will even advocate a job aid when that's all that's needed (and I'm willing to be wrong).

Our goal is to make meaningful change, and that’s what we need to do.   I hope this series will enable you to do more meaningful design.   There may be more posts, but I’ve exhausted my initial thoughts, so we’ll see how it goes.

Meeting unreasonable needs

20 March 2009 by Clark 3 Comments

I was contacted yesterday by a relatively new ID person, who was in a tough spot.   This person understood the principles of Tony Karrer’s “Before You Ask” post, as the situation was well laid out.   Some help was asked for (clearly no expectation other than, perhaps, a thoughtful reply; the circumstances were quite clear).

The situation is that this person is the support for an LMS across multiple geographic locations. The ID was hired to do 'training' on the system, but access to SMEs is limited at best, and the uses in the different contexts are different enough that a course model isn't a viable solution; yet this person wasn't clear on what alternative to take: "I am beginning to think that the position is flawed in its design."

For what it’s worth, here’s what I replied (slightly modified for clarity and anonymity):

First, I’d offer a pointer to John Carroll’s minimalist instruction (via “The Nurnberg Funnel”).   He taught a word processing system via a set of cards that trumped the instructionally designed manual by focusing on the learners’ existing knowledge and goals.   It’d be one way to ‘teach photography’ instead of ‘the camera’.

Of course, I also recommend teaching ‘the model’, not the software *nor* the task. That is, what is the LMS’s underlying model, and how does it lead you to predict how to do x, y, and z.   If you can teach the model, and through a couple of examples and practice get them to be able to infer how to do other tasks, you’ve minimized ‘training’ and maximized their long-term success.   Your lack of access to SMEs means you have to become one, however, I reckon.   Doing good ID does mean more responsibility on the designer in any case.   Sorry.

On top of either approach (common tasks, or model-based learning), consider that your role is to put out some basic materials (don't think training, think job aids), and then serve as a 'consultant'. Have them come to you to ask how to do things, and either create FAQs or more job aids, depending on their need and your assessment of the value proposition of each. So don't think your only solution is 'training'.

Also consider gestating a 'community' to surround your wiki, and growing it into a self-help resource that people can get into to the level they can handle. Have a discussion board where people can post questions. You'll be busy at first, but if they find value, it can grow to be self-sustaining. People will often self-help, if it's easy enough.

BTW, another organization had some success many years ago starting with a central office, bringing in and training local ‘champions’ who gradually moved the locus of responsibility back to their unit.   Of course, they got buy-in to do so, but you might try to work with your early adopters and help them become the local resources.

Overall, don’t try to accomplish everything with ‘the course’, but look to the broader range of performance ecosystem components (if you’ve followed my blog, you know I’m talking job aids, ecommunity, etc) and balance your efforts appropriately.

The response was that this was, indeed, helpful. I feel for the person having to fill a particular role when the 'received wisdom' about how to do it is at odds with what really is useful, and who is under-resourced to boot. A too-frequent situation, and probably not decreasing, sigh. But taking the broader performance perspective is a useful framework I also found valuable in another recent engagement, professional development for teachers. Don't just worry about getting them the basics; develop them as practitioners, even into experts, as well. Moreover, help them help themselves!

This is just the type of situation where taking a step back and looking at what is being done can yield ways to rethink, or even just fine-tune the approach.   I typically find that it’s the case that there *are* such opportunities, and it’s an easy path to better outcomes.   Of course, I also find that years of experience and a wealth of relevant frameworks makes that easier ;).   What is your experience in adapting to circumstances and improving situations?

Monday Broken ID Series: Examples

22 February 2009 by Clark 2 Comments

Previous Series Post | Next Series Post

This is one in a series of thoughts on some broken areas of ID that I'm posting for Mondays. I intend to provide insight into many ways much of instructional design fails, and some pointers to avoid the problems. The point is not to say 'bad designer', but instead to point out how to do good design.

I see several reliable problems with examples, and they aren't even the deepest problems. Examples tend to be mixed in with the concept, instead of separate, if they exist at all. Then, when they do exist, too often they're cookie-cutter examples that don't delve into the necessary elements that make examples successful, let alone are intrinsically interesting. Yet we know what these elements are!

Conceptually, examples are applications of the concept in a context. That is, we have a problem in a particular setting, and we want to use the model as a guide to solving the problem. Note that the choice of examples is important. The broader the transfer space, that is, the more general the skills, the more you want examples that differ in many respects. Learners generalize the concept from the examples, and the extent to which they'll generalize to all appropriate situations depends on the breadth of contexts they've seen (across both examples and practice). You need to ensure that the contexts the learner sees are as broadly disparate as possible.

Note that we should also be choosing problems and contexts that are of interest to the audience.   Going beyond just the cognitive role, we should be trying to tap into the motivational and engagement factors.   Factor that into the example design as well!

Now, we know that examples have to show the steps that were taken. They have to have specific steps from beginning to end. And, I add, those steps have to refer back to the concept that guides the presentation. You can't just say "first you do this, then you do this", etc.; you have to say "first, using the model, you do this, and then the model says to do that". You need to show the steps, and the intermediate work products. Annotating them is really important.

And that annotation is not just the steps, but also the underlying thought processes. The problem is, experts don't even have access to their thought processes anymore! Yet their thinking really works along lines like "well, I could've done A, but because of X I thought B was a better approach, and then I could have done C, but because of Y I tried D", etc. The point being, there are a lot of contextual clues that they evaluate that aren't even conscious, yet these clues are really important for learners. (BTW, this is one of the many reasons I recommend comics in elearning; thought bubbles are great for cognitive annotation.)

Another valuable component is showing mistakes and backtracking. This is a hard one to get your mind around, and yet it's powerful both cognitively and emotionally. Typically, experts model the behavior perfectly, and when learners try, they make mistakes, and may turn off emotionally ("I'm having trouble, and it looks so easy; I must not be good at this"). In reality, experts make mistakes all the time, and learners need to know that. It keeps you from losing them altogether!

Cognitively it's valuable, too. When experts show backtracking and repair, they're modeling the meta-skills that are part of the expertise. Unpacking that self-monitoring helps learners internalize the 'check your answer' component that's part of expert performance. This takes more work on the part of the designer, like we had with the concept, but if the content is important (otherwise, why are you building a course?), it's worth doing right.

Finally, I believe it's important to convey the example as a story. Our brains are wired to comprehend stories, and a good narrative has better uptake. Having a protagonist documenting the context and problem, and then solving it with the model to achieve meaningful outcomes, is more interesting, and consequently more memorable. We can use a variety of media to tell stories, from prose through audio (think mobile and podcasts), narrated slideshow, animation, or video. Comics are another channel. Stories are also useful for conveying the underlying thought processes, via thought bubbles or reflective narration ("What was I thinking?…").

So, please do good examples.   Be exemplary!

Strategy, strategically

21 February 2009 by Clark Leave a Comment

In addition to working on the technology plan for my school district, I’ve also been assisting a not-for-profit trying to get strategic about technology.   The struggles are instructive, but looking across these two separate instances as well as the previous organizations I’ve assisted, I’m realizing that there are some common barriers.

The obvious one is time. The old saying about alligators and draining the swamp is all too true, and it's only getting worse. Despite an economic stimulus package in the US and other countries, and (finally) a budget in my own state, things are not likely to get better soon. Even if companies could hire back everyone they've laid off, the transition time would be significant. It's hard to sit back and reflect when you're tackling more work with fewer resources. Yet we must.

The second barrier is more problematic. Strategic thinking isn't easy or obvious, at least not to everyone. For some it probably comes naturally, but I reckon for most it takes a breadth of experience and an ability to abstract from that experience to take a broader perspective. Abstraction, I know from my PhD research on analogy, isn't done well without support. Aligning that perspective with organizational goals adds to the task. Do it while keeping both short- and long-term value in mind, for several different layers of stakeholders, and you're talking serious cognitive overhead.

We do need to take the time to be strategic. As I was just explaining on a call, you don't want to be taking small steps that aren't working together toward a longer-term goal. If you're investing in X, Y, and Z, and they don't build on each other, you're missing an opportunity. If you have alternatives A and B, and A seems more expedient, you might miss that B is the better long-term investment if you haven't looked to the future. If you don't evaluate what else is going on, and leverage those initiatives, because you're just meeting your immediate needs, you're not making the best investment for the organization, and you're putting yourself at risk. You need to find a way to address the strategic picture, at least for a percentage of your time (and that percentage goes up with your level in the organization).

To cope, we use frameworks and tools to help reduce the load, and follow processes to support systematicity and thoroughness. The performance ecosystem framework is one specific to use of technology to improve organizational learning, innovation, and problem-solving, but there are others.   Sometimes we bring in outside expertise to help, as we may be too tightly bound to the context and an external perspective can be more objective.

You can outsource it entirely, to a big consulting company, but I reckon the principle of 'least assistance' holds here too. You want to bring in top thinking in a lightweight way, rather than ending up with a bunch of interns tied to you at the wrist and ankles. What can you do that will provide just the amount of help you need to make progress? I have found that a lightweight approach can work in engagements with clients, so I know it can be done. Whether you do it yourself, work with partners, or bring in outside help, don't abandon the forest for the trees; do take the time. You need to be strategic, so be strategic about it!


