Learnlets


Clark Quinn’s Learnings about Learning

Search Results for: quip

SME Brains

30 June 2015 by Clark 1 Comment

As I push for better learning design, I’m regularly reminded that working with subject matter experts (SMEs) is both critical and problematic. What makes someone an SME creates challenges, but it also offers a uniquely valuable perspective. I want to review some of those challenges and opportunities in one go.

One of the artifacts of how our brains work is that we compile knowledge away. We start off with conscious awareness of what we’re supposed to be doing, and apply it in context. As we practice, however, our expertise becomes chunked, and increasingly automatic. As it does so, some of the elements that get compiled away are no longer available to conscious inspection. As Richard Clark of the Cognitive Technology Lab at USC tells us, about 70% of what SMEs do isn’t available to their conscious minds. To put it another way, they literally can’t tell us what they do!

On the other hand, they have pretty good access to what they know. They can cite all the knowledge they have to hand: they can talk about the facts and the concepts, but not the decisions. And, to be fair, many of them aren’t really good at the concepts either, at least not from the perspective of articulating a model that’s useful in the learning process.

The problem then becomes a combination of finding a good SME and working with them in a useful way, starting with getting meaningful objectives. And while there are quite rigorous methods (e.g. Cognitive Task Analysis), in general we need more heuristic approaches.

My recommendation, grounded in Sid Meier’s statement that “good games are a series of interesting decisions” and the recognition that making better decisions is likely the most valuable outcome of learning, is to focus rabidly on decisions. When SMEs start talking about “they need to know X” and “they need to know Y”, the response is to ask leading questions like “what decisions do they need to be able to make that they don’t make now?” and “how does X or Y actually lead them to make better decisions?”

Your end goal here is to winnow the knowledge away and get to the models that will make a difference to the learner’s ability to act. And when you’re pressed by a certification body to represent everything the SME tells you, you may need to push back. I even advocate anticipating what the models and decisions are likely to be, and getting the SME to criticize and improve them, rather than letting them start with a blank slate. This does require some smarts on the part of the designer, but when it works, it leverages the fact that it’s easier to critique than to generate.

SMEs are also potentially valuable for recognizing where learners go wrong, particularly if they teach. Most of the time, mistakes aren’t random, but are based upon some inappropriate model. Ideally, you have access to these reliable mistakes, and the reasons why they’re made. Your SMEs should be able to help here; they should know the ways in which non-experts fail. Some SMEs aren’t as good as others at this, so again, as with access to the models, you need to be selective.

This is related to one of the two ways SMEs are your ally. Ideally, you’re equipped with stories: great failures and great successes. These form the basis of your examples. An SME should have examples of both that they can spin and you can use to build up an example. This may well be part of your process to get the concepts and practice down, but you need to get these case studies.

There’s one other way that SMEs can help. The fact that they are experts means they somehow find the topic fascinating or rewarding enough to spend the requisite time to acquire expertise. You can, and should, tap into that. Find out what makes this particular field interesting, and use that to communicate the intrinsic interest to learners. Are they playing detective, problem-solver, or protector? Figure out the appeal, and then build it into the practice stories you ask learners to engage in.

Working with SMEs isn’t easy, but it is critical. Understanding what they can do, and where their intrinsic barriers lie, gives you a better handle on getting what you need to help learners perform. Those are some of my tips; what have you found that works?

Making Sense of Research

17 March 2015 by Clark Leave a Comment

A couple of weeks ago, I was riffing on sensors: how mobile devices are getting equipped with all sorts of new sensors, the potential for more, and what they might bring. Part of that discussion was a brief mention of sensor nets, and how aggregating all this data could be of interest too. And lo and behold, a massive example was revealed last week.

The context was the ‘spring forward’ event Apple held, where they announced their new products. The most anticipated was the Apple Watch (part of what drove my post on wearables), the new iConnected device for your wrist. The second major announcement was their new MacBook, a phenomenally thin laptop with some amazing specs on weight and screen display, as well as some challenging tradeoffs.

Less noticed was the announcement of a new research endeavor, but I wonder if it isn’t the most game-changing element of them all. The announcement was ResearchKit, and it’s about sensor nets.

So, smartphones have lots of sensors.  And the watch will have more.  They can already track a number of parameters about you automatically, such as your walking.  There can be more, with apps that can ask about your eating, weight, or other health measurements.  As I pointed out, aggregating data from sensors could do things like identify traffic jams (Google Maps already does this), or collect data like restaurant ratings.

What Apple has done is focus specifically on health data via HealthKit, and partner with research hospitals. What they’re saying to scientists is “we’ll give you anonymized health data; you put it to good use”. A number of research centers are on board, already collecting data about asthma and more. The possibility is to use analytics that combine the power of large numbers with other descriptive data to investigate things at scale. In general, research like this is hard because it’s hard to recruit large numbers of subjects, yet large numbers of subjects are a much better basis for study (for example, the China-Cornell-Oxford Project, which was able to look at a vast breadth of diets to derive innovative insights into nutrition and health).

And this could be just the beginning: collecting data en masse (while successfully addressing privacy concerns) can be a source of great insight if it’s done right. Having devices that are with you and capable of capturing a variety of information gives the opportunity to mine that data for expected, and unexpected, outcomes.
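To make the idea concrete, here’s a hypothetical sketch (not how ResearchKit actually works; the cohort labels and step counts are invented) of the core move: pooling anonymized readings and reporting only group-level summaries, so individual data never surfaces on its own.

```python
from statistics import mean

# Hypothetical anonymized records: no identifiers, just a coarse
# cohort label and a daily step count reported by each device.
readings = [
    {"cohort": "asthma", "steps": 4200},
    {"cohort": "asthma", "steps": 6100},
    {"cohort": "control", "steps": 8300},
    {"cohort": "control", "steps": 7600},
    {"cohort": "asthma", "steps": 3900},
]

def aggregate(readings):
    """Group readings by cohort and summarize at the group level,
    so no individual's data is ever reported on its own."""
    groups = {}
    for r in readings:
        groups.setdefault(r["cohort"], []).append(r["steps"])
    return {
        cohort: {"n": len(vals), "mean": mean(vals)}
        for cohort, vals in groups.items()
    }

print(aggregate(readings))
```

The power of the approach is simply scale: the same summary over millions of devices, rather than five rows, is what makes patterns visible that small studies can’t detect.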

A new iDevice is always cool, and while it’s not the first smartwatch (nor was the iPhone the first smartphone, the iPad the first tablet, nor the iPod the first music player), Apple has a way of making the experience compelling. As with the iPad, I haven’t yet seen the personal value proposition, so I’m on the fence. But the ability to collect data in a massive way that could support ground-breaking insights and innovations in medicine? That has the potential to affect millions of people around the world. Now that is impact.

Shiny objects and real impact

9 January 2015 by Clark 2 Comments

Yesterday I went off about how learning design should be done right, and how it’s not easy. In a conversation two days ago, I was talking to a group that was supporting several initiatives in adaptive learning, and I wondered whether this was a good idea.

Adaptive learning is desirable. If learners come with different initial abilities, learn at different rates, and have different availability, the learning should adapt. It should skip things you already know, work at your pace, and provide extra practice, extending the learning experience, where needed. (And, BTW, I’m not talking learning styles.) And this is worthwhile, if the content you are starting with is good. But even then, is it really necessary? To explain, here’s an analogy:

I have heard it said that the innovations in many of the latest drugs should be unnecessary, along with the extra costs (and profits for the drug companies). The claim is that the new drugs aren’t any more effective than the existing treatments if those treatments were used properly. The point is that people don’t take drugs as prescribed (being irregular, missing doses, not continuing past the point they feel better, etc.), and if they did, the new drugs wouldn’t be needed. (As a side note, it would appear that focusing on improving patient drug-taking protocols, such as with a mobile app, would be a sound strategy.) This isn’t true in all cases, but even in some it makes a point.

The analogy here is that all the fancy capabilities (tarted-up templates for simple questions, 3D virtual worlds, even adaptive learning) might not be needed if we did better learning design! Now, that’s not to say we couldn’t add value by using the right technology at the right points, but as I’ve quipped in the past: if you get the design right, there are lots of ways to implement it. And, as a corollary, if you don’t get the design right, it doesn’t matter how you implement it.

We do need to work on improving our learning design, first, rather than worrying about the latest shiny objects. Don’t get me wrong, I  love  the shiny objects, but that’s with the assumption that we’re getting the basics right.  That was my assumption ’til I hit the real world and found out what’s happening. So let’s please get the basics right, and then worry about leveraging the technology on  top of a strong foundation.

Maybe it is rocket science!

8 January 2015 by Clark 11 Comments

As I’ve been working with the Foundation over the past six months, I’ve had occasion to review a wide variety of elearning, specifically in the vocational and education space, but my experience mirrors that from the corporate space: most of it isn’t very good. I realize that’s a harsh pronouncement, but I fear it’s all too true; most of the elearning I see will have very little impact. And I’m becoming ever more convinced that what I’ve quipped in the past is true:

Quality design is hard to distinguish from well-produced but under-designed content.

And here’s the thing: I’m beginning to think that this is not just a problem with the vendors, tools, etc., but that it’s more fundamental.  Let me elaborate.

There’s a continual problem of bad elearning, and yet I hear people lauding certain examples; awards are granted, tools are touted, and processes promoted. Yet what I see really isn’t that good. Sure, there are exceptions, but that’s the problem: they’re exceptions! And while I (and others, including the instigators of the Serious eLearning Manifesto) try to raise the bar, it seems an uphill fight.

Good learning design is rigorous. There’s significant effort just in getting the right objectives: finding the right SME, working with them without taking what they say verbatim, etc. Then there’s establishing the right model and communicating it, making meaningful practice, and using media correctly. All while successfully fending off the forces of fable (learning styles, generations, etc.).

So, when it comes to the standard tradeoff (fast, cheap, or good: pick two), we’re ignoring ‘good’. And I think a fundamental problem is that everyone ‘knows’ what learning is, so they’re not astute consumers. If it looks good, presents content, has some interaction and some assessment, it’s learning, right? NOT! But stakeholders don’t know, we don’t worry enough about quality in our metrics (quantity per time is not a quality metric), and we don’t invest enough in learning.

I’m reminded of a thesis that says medicos consciously reengineered their status in society. They went from being thought of as ‘quacks’ and ‘sawbones’ to an almost reverential status today by making the process of becoming a doctor quite rigorous. I’m tempted to suggest that we need to do the same thing.

Good learning design is complex. People don’t have predictable properties the way concrete does. Understanding the distinctions necessary to do the right things is complex. Executing the processes to successfully design, refine, and deliver a learning experience that leads to an outcome is a complicated engineering endeavor. Maybe we do have to treat it like rocket science.

Creating learning should be considered a highly valuable outcome: you are helping people achieve their goals.  But if you really aren’t, you’re perpetrating malpractice!  I’m getting stroppy, I realize, but it’s only because I care and I’m concerned.  We have  got to raise our game, and I’m seriously concerned with the perception of our work, our own knowledge, and our associated processes.

If you agree (and if you don’t, please do let me know in the comments), here’s my very serious question, because I’m running out of ideas: how do we get awareness of the nuances of good learning design out there?

 

Aspiration trumps trepidation

26 August 2014 by Clark 2 Comments

Last week’s #lrnchat (a Twitter chat on learning that runs Thursday evenings for an hour, 5:30 PT/8:30 ET) was on the topic of fear-mongering in organizational learning. The point is that fear-mongering often happens (by definition, always wrongly), but what are the reasons, the impacts, and the ways to avoid it? And among my responses was one that I like as a quip.

I was, in particular, flashing back on the book Story Wars, which talks about how advertising has changed. This was in the context of fear-mongering as an approach to motivating behavior. In that book, they cited how advertisements in older days were designed to target your concerns. In essence, they made you worry about shortcomings as a motivation to buy remedies, whether to address your personal hygiene or your appearance of success.

What’s changed is that advertisers have since found that tapping into your goals is more motivating. What are you trying to achieve? Who are you, and what reflects your passions? Then they provide products that align with your self-image. Of course, their ability to target your market segment is much more advanced, so they know more about who you are and have more specific means of reaching you.

This holds in learning, too. It’s far better to tap into learners’ aspirations to motivate learning than to drum on their fears. The latter will work somewhat, if there are legitimate concerns (e.g. losing your job), but it’s far better to help people understand how the learning will help them.

So, when it comes to motivation, I’ll argue that targeting aspiration trumps targeting trepidation. Help people understand why this is valuable or important, rather than making them fear the consequences of failure to comply. It’s part of a better culture, and a better workplace. And that’s something you aspire to, right? ;)

Rethinking Design: Pedagogy

20 August 2014 by Clark 4 Comments

In thinking through how to design courses that lead to both engaging experiences and meaningful outcomes, I’ve been working on the component activities.  As part of that, I’ve been looking at elements such as pedagogy in pre-, in-, and post-class sessions so that there are  principled reasons behind the design.

So, here I’m looking for guidance to align what happens in all three sections: pre-, in-, and post-class activities. In this case, two major types of activities have emerged: more procedural activities, such as using equipment appropriately, and more conceptual activities, such as making the right decisions about what to say and do. These aren’t clearly discriminated, but it’s a coarse description.

Of course, there’s an introduction that both emotionally and cognitively prepares the learner for the coming learning experience.

So for conceptual tasks, what we’re looking to do is drive learning to content. In typical approaches, you’d present conceptual information (e.g. ‘click to see more‘) and maybe ask quiz questions. Here, I’m looking to make the task one of processing the information to generate something, whether a document, presentation, or whatever, with the processing close to the way the information will be used. So learners might create a guide for decisions (e.g. a decision tree), or a checklist, or something else that requires them to use the information. (And if the information doesn’t support doing, it’s probably not necessary.) As support, in a recent conversation I heard that interviewed organizations said making better decisions was the key to better job performance.
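As a sketch of the kind of artifact a learner might generate from content (the scenario, rules, and thresholds here are entirely invented for illustration), a simple decision guide can be captured as a few nested questions:

```python
# A learner-built decision guide, encoded as nested questions.
# The scenario and thresholds are invented purely for illustration.
def escalate_support_ticket(severity, customers_affected):
    """Walk a simple decision tree: should this ticket be escalated?"""
    if severity == "critical":
        return "escalate immediately"
    if severity == "major" and customers_affected > 10:
        return "escalate to on-call engineer"
    return "handle in normal queue"

print(escalate_support_ticket("major", 25))
print(escalate_support_ticket("minor", 100))
```

The point isn’t the code itself; it’s that producing such a guide forces the learner to process the source information in the same form they’ll use it: as decisions.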

Whereas in the procedural approach, we really want to give them practice on the task. It may be scaffolded, e.g. simplified, but it’s the same practice that they’ll need to be able to perform after the learning experience. Ideally, they’ll have to explore and use content resources to figure out how to do it appropriately, in a guided-exploration sense, rather than just being given the steps.

In both  cases, models are key to helping them determine what needs to happen.  Also in both cases, an instructor should be reviewing their output. In the conceptual case, learners might get feedback on their output, and have a chance to revise their creation.  In the case of the practice, the experience is likely a simulation, and the learner should be getting feedback about their success.  In either case, the instructor has information about how the cohort is doing.  So…

…for in-class learning, the learners should be reflecting on their performances, and the instructor should facilitate that at the beginning, using the information about what’s working (and not). Then there should be additional activities that require the learners to interact with the material, processing it (conceptual) or applying it (procedural) with each other, followed by facilitated reflection.

Finally, after class, the learners should be given elaborative activities. In the case of the conceptual task, coming up with an elaborated version or some additional element that helps cement the learning would be valuable. The practice activity should get fleshed out to the point where the learner will be capable of acting appropriately after the learning experience, owing to sufficient practice and appropriate decontextualization. The goal is retention over time and transfer to all appropriate situations.

Am I making sense here?

Resources before courses

3 July 2014 by Clark Leave a Comment

In the course of answering a question in an interview, I realized a third quip to complement two recent ones. The earliest (not including my earlier ‘Quips‘) was “curation trumps creation”, about how you shouldn’t spend the effort to create new resources if you’ve already got them. The second was “from the network, not your work”, about how if your network can have the answer, you should let it. So what’s this new one?

While I’ve previously argued that good learning design shouldn’t take longer, that was assuming real design in the first place: that you did an analysis, designed the concept, examples, presentation, and practice, rather than just dumping a quiz on top of content. Doing real design, good or bad, does take time. And if it’s about knowledge, not skills, a course doesn’t make sense. In short, courses should be reserved for when they are really needed.

Too often, we’re making courses to try to get knowledge into people’s heads, which usually isn’t a good idea, since our brains aren’t good at remembering rote information. There are rare times when it’s necessary (e.g. medical vocabulary), but we resort to that solution too often because course tools are our only hammer. And it’s wrong.

We should be trying to put information in the world, and reserve the hard work of course building for when it’s proprietary skill sets we’re developing. If someone else has done it, don’t feel like you have to use your resources to do it again; use your resources to meet other needs: more performance support, or facilitating cooperation and communication.

So, for both principled and pragmatic reasons, you should be looking to resources as a solution before you turn to courses. On principle, they meet different needs, and you shouldn’t use the course when (most) needs can be met with resources. Pragmatically, it’s a more effective use of  your  resources: staff, time, and money.

#itashare

Changing Culture: Scaling Up Excellence

11 June 2014 by Clark 2 Comments

I’ve found myself    picking up books about how to change culture, as it seems to be the big barrier to a successful revolution.  I’ve finished a quick read of  Scaling Up Excellence, am in the midst of Change the Culture, Change the Game, and have Reinventing Organizations and Organize for Complexity (the latter two recommended by my colleague Harold Jarche) on deck.  Here are my notes on the first.

Scaling Up Excellence is the work of two Stanford professors who have looked for years at what makes organizations succeed, particularly when they need to grow, or seed a transformation.  They’ve had the opportunity to study a wide variety of companies, most as success stories, but they do include some cautionary tales as well.  Fortunately, this doesn’t read like an academic book, and while it’s not equipped  with formulas, there are overarching principles that have been extracted.

The overarching principle is that scaling is “a ground war, not an air war”. What they mean is that you can’t make a high-level decision and expect change to happen. It requires hard work in the trenches. Leaders have to go in, figure out what needs to change, and then lead that change. Using a religious metaphor, they distinguish between Catholic and Buddhist approaches: either you want everyone to follow the same template, or you let them modify it to their unique situation. Some organizations need to replicate a particular customer experience (think fast food), whereas others will need to be more accommodating of unique situations (think high-end retailers).

There are some principles around scaling, such as getting mental buy-in, helping people see the bigger picture and how the near-term necessities tie into it, and recognizing that going slow initially may help things go better. An interesting one, to me, is that accountability is a key factor: you can’t have folks sit on the sidelines, and no slackers (let alone those who undermine).

Another suite of principles includes cutting the cognitive load of getting things done the right way, mixing emotional issues with clever approaches, and connecting people. One important element is, of course, allegiance, where people believe in the organization and it’s clear the organization also believes in the people. No one’s claiming this is easy, but they have lots of examples and guidance.

One really neat idea that I hadn’t heard before was the concept of a pre-mortem: imagining a period some time in the future and asking “why did it go right?” and also “why did it go wrong?”. It’s a nice way to distance oneself from the moment and reflect effectively on a proposed plan. If separate groups do this, the inputs can help address potential risks and emphasize useful actions.

I worry a bit that it’s still ‘old school’ business (more on that after I finish the book I’m currently reading and get to the two ‘new thinking’ books), but they do seem to push the values of doing meaningful work and sharing it. A bit discursive, but overall I found it insightful.

#itashare

Aligning with us

12 March 2014 by Clark Leave a Comment

The main complaint I have about what L&D does isn’t so much that it’s still mired in the industrial age of plan, prepare, and execute, but that it’s just not aligned with how we think, learn, and perform, certainly not for information-age organizations. There are very interesting rethinks in all these areas, and our practices are not aligned with them.

So, for example, the evidence is that our thinking is not the formal, logical thinking that underpins our assumptions about support. Recent work paints a very different picture of how we think: we abstract meaning but don’t handle concrete details well, we have trouble doing complex thinking and focusing attention, and our thinking is very much influenced by context and the tools we use.

This suggests that we should be looking much more at contextual performance support and providing models, saving formal learning for cases when we really need a significant shift in our understanding and how that plays out in practice.

Similarly, we learn better when we’re emotionally engaged, when we’re equipped with explanatory and predictive models, and when we practice in rich contexts.    We learn better when our misunderstandings are understood, when our practice adjusts for how we are performing, and feedback is individual and richly tied to conceptual models.  We also learn better  together, and when our learning to learn skills are also well honed.

Consequently, our learning similarly needs support in attention, rich models, emotional engagement, and deeply contextualized practice with specific feedback. Our learning isn’t the result of a knowledge dump and a test, and yet that’s most of what we see.

And not only do we learn better together, we work better together. The creative side of our work is enhanced significantly when we are paired with diverse others in a culture of support, and when we can run experiments. And it helps if we understand how our work contributes, and we’re empowered to pursue our goals.

This isn’t a hierarchical management model, it’s about leadership, and culture, and infrastructure.  We need bottom-up contributions and support, not top-down imposition of policies and rigid definitions.

Overall, the way organizations need to work requires aligning all the elements to work with us the way our minds operate.  If we want to optimize outcomes, we need to align both performance  and  innovation.  Shall we?

The ‘Role’ of Compliance

11 September 2013 by Clark 3 Comments

I’m not an expert on compliance training. I haven’t suffered through it, and I haven’t been asked to design it. But I know it’s a monkey on the back of the industry, and I know we have to address it. So how? I think there are two main barriers.

The first is the regulatory aspect. Much as I think the problem holding back better for-profit schools is that the accreditation process isn’t informed enough about pedagogy, I think the agencies that oversee required learning don’t really focus on the right thing. When you mandate the requirement by seat time, you’re missing the point. Really, you should have competencies associated with objectives. Compliance decoupled from outcomes is just a legal bulwark, not a meaningful way to prevent bad behavior.

Of course, we could be spending that time doing more than a knowledge dump. I think there are two parts: helping people recognize the situation, and then providing them with skills to address it. Whether it’s ethics, harassment, or some other topic, if you’re just raising awareness you’re not equipping people, and if you’re just providing responses, you’re not helping them understand when they make sense.

I’ve previously addressed the awareness issue, when I talked about shades of grey. The point being that seldom are things black and white, and the best way to help learners understand the situation is to give them scenarios and discuss in groups whether and how a situation qualifies. Having this done in groups, and then having a reflection session facilitated by an expert on the topic would really help learners get value. Even online, having them share their initial thoughts, and then see some other discussion would be valuable to get some of the benefits of social interaction.

So then the question becomes how to equip learners to deal with the situations. There are always mandated policies, but they’re not always as easy to apply as suggested. First of all, I think role-plays make great sense here. You can use scenario tools for asynchronous situations, or just traditional role-play in the classroom. What’s important is that you exercise these processes with problematic examples: for instance, trying to do behavior coaching with a passive-aggressive individual. You might have someone who’s facing such a problem role-play the tough individual, while another member of the class tries to apply the principles. Again, you’re venturing out into the grey, acknowledging that it’s never as clear-cut and easy as it seems.

Of course, the latter pedagogies don’t guarantee anything (learning is probabilistic, after all), and you still have the barrier that there’s little real reason to care given how the requirements are currently structured, but at least you have the opportunity to make the process less onerous for the learner and a greater likelihood of actually accomplishing something meaningful in the workplace. Would someone familiar with compliance like to weigh in on how I’m off-base?
