Learnlets

Clark Quinn’s Learnings about Learning

Learning Science Conference 2024

15 October 2024 by Clark

I believe, quite strongly, that the most important foundation anyone in L&D can have is understanding how learning really works. If you’re going to intervene to improve people’s ability to perform, you ought to know how learning actually happens! Which is why we’ve created the Learning Science Conference 2024.

We have some of the most respected translators of learning science research into practice. Presenters are Ruth Clark, Paul Kirschner, Will Thalheimer, Patti Shank, Nidhi Sachdeva, as well as Matt Richter and myself. They'll be providing a curated curriculum of sessions. These are admittedly some of our advisors to the Learning Development Accelerator, but that's because they've reliably demonstrated the ability to do the research, and then to communicate the results of their own and others' work in terms of the implications for practice. They know what's right and real, and make that clear.

The conference is a hybrid model; we present the necessary concepts asynchronously, starting later this month. Then, from 11–15 November, we'll have live online sessions led by the presenters. These are at two different times to accommodate as much of the globe as we can! In these live sessions we'll discuss the implications and workshop issues raised by attendees. We will record the sessions in case you can't make it. I'll note, however, that participating live is a chance to get your particular questions answered! Of course, we'll have discussion forums too.

We’ve worked hard to make this the most valuable grounding you can get, as we’ve deliberately chosen the topics that we think everyone needs to comprehend. I suggest there’s something there for everyone, regardless of level. We’re covering the research and implications around the foundations of learning, practices for design and evaluation, issues of emotion and motivation, barriers and myths, even informal and social learning. It’s the content you need to do right by your stakeholders.

Our intent is that you’ll leave equipped to be the evidence-based L&D practitioner our industry needs. I hope you’ll take advantage of this opportunity, and hope to see you at the Learning Science Conference 2024.

Simple Models and Complex Problems

8 October 2024 by Clark

I’m a fan of models. Good models that are causal or explanatory can provide guidance for making the right decisions. However, there are some approaches that are, I suggest, less than helpful. What makes a good or bad model? My problem is about distinguishing when to talk about each: simple models and complex problems.

A colleague of ours sent me an issue of a newsletter (it included the phrase 'make it meaningful' ;). In it, the author was touting a four-letter acronym-based model. And, to be fair, there was nothing wrong with what the model stipulated. Chunking, maintaining attention, elaboration, and emotion are all good things. What bothered me was that these elements weren't sufficient! They covered important elements, but only some. If you just took this model's advice, you'd have somewhat more memorable learning, but you'd fall short on the real potential impact. For instance, there wasn't anything there about the importance of contextualized practice or feedback. Nor models, for that matter!

I'm not allergic to n-letter acronym models. For instance, I keep the coaster I was given for Michael Allen's CCAF on my desk. (It's a nice memento.) His Context-Challenge-Activity-Feedback model is pretty comprehensive for the elements that a practice has to have (not surprisingly). However, learning experiences need more than just practice: they need introductions, models, examples, and closings as well. And while the aforementioned elements are necessary, they're not sufficient. Heck, Gagné talked about nine events of instruction.

What I realize as I reflect is that I like models that have the appropriate amount of complexity for the level of description they're addressing. Yet I've seen far too many models that are cute (some actually spell words) and include some important ideas, but aren't comprehensive for what they cover. The problem, of course, is that you need to understand enough to be able to separate the wheat from the chaff. I'll suggest looking to vetted models: ones supported by folks who know, with criticisms and accolades to accompany them. Read the criticisms, and see if they're valid. If they're not, the model may be useful.

Ok, one other thing bothered me. This model supposedly has support from neuroscience. However, as I've expressed before, neuroscience has yet to yield results for learning design that weren't already established by cognitive science research. This, to me, is just marketing, with no real reason to include it except to try to make the model more trendy and appealing. A warning sign, to me at least.

Look, designing for learners is complex. Good models help us handle this complexity well. Bad ones, however, can mislead us into only paying attention to particular bits and create insufficient solutions. When you’re looking at simple models and complex problems, you need to keep an eye out for help, but maybe it needs to be a jaundiced eye.

Short term thinking versus long term benefits

1 October 2024 by Clark

I was thinking about a particular issue, and I realized it's symptomatic of a bigger problem. The issue is that too often I see folks indulging in short-term thinking versus long-term benefits. I understand, but I think it's problematic, regardless. Of course, making a change is also liable to be a struggle. Still, it's worth talking about.

The problem is that organizations are largely structured to meet short-term needs. For instance, there are pressures to return short-term shareholder benefits, at least in publicly traded companies. Even private organizations are liable to want to reward the founder. There are few examples of folks taking a bigger perspective.

And, to be clear, I’m not denying the need for efficiencies. That’s a given. The issue instead, to me, is one of whether those efficiencies generate short-term returns, or instead will yield long-term benefits.

For example, when the pandemic hit, lots of orgs were struggling to find ways to continue operations when suddenly everyone had to stay remote. I argued that if you’re going through a digital transformation, you should start with an organizational transformation. My reasoning was that digitizing an old way of doing things was only going to be a short-term fix. What I saw was that this big upheaval was an opportunity for redesign. Not surprisingly, this wasn’t an effective pitch. People needed to fix things now! Yet, the orgs that survived the pandemic best were the ones that had a good culture to survive the enforced digital operation.

Similarly, I see many orgs focusing on 'leadership development'. That's not a bad thing, mind you. Well, if you get past the Leadership BS (I thought I'd written about this, but I can't find it ;). Yet, most of what we see is expensive and highly interactive. Which sounds great, but it doesn't scale. Our colleague JD Dillon is starting a book for frontline workers, which I laud. Yet there's an intermediate level we're guilty of neglecting. Again, a short-term perspective.

Managers, data says, are the biggest reason people leave. Also, most managers are promoted from the front line, and yet pretty much all of them are novices when it comes to management. Yet, our management training is idiosyncratic. More, our colleague Will Thalheimer recently suggested, in an LDA event, that little in leadership development covers how to facilitate learning for your folks. Yet that's one of the best things to help employees feel their managers actually care about them (cf. Self-Determination Theory). Moreover, there are so many managers that can benefit from training (and increasingly, leadership is viewed as something that needs to be present throughout the organization).

There are problems trying to deliver manager training at scale. We see demand, but it's hard to deliver, particularly cost-effectively. Technology is part of the solution, but to make it work takes (wait for it) a long-term perspective. These are only two examples from the area of learning and development that I largely work and play in. I'd argue that, for instance, the shift to a learning organization would be one of the best investments you could make. Well, for the long term ;). That's the type of transformation that would be greatly augmented by a subsequent digital enablement. But without that initial refocus, the digitization will continue to support hierarchy, lack of transparency, and other factors that interfere with ongoing innovation and success.

I'd welcome hearing that most organizations are working on both the short- and long-term, but I'm skeptical. And more than willing to be wrong! I'll merely reiterate the point the late Jay Cross would make: investing in your people's ability to learn is probably the best investment you can make. In the tradeoff of short-term thinking versus long-term benefits, it seems obvious to me that playing the long game is the right way. That, at least, makes sense to me. What am I missing?

Is “Workflow Learning” a myth?

24 September 2024 by Clark

There's been a lot of talk, of late, about workflow learning. To be fair, Jay Cross was talking about learning in the flow of work way back in the late 1990s, but the idea has recently been co-opted and become current. Yet, the question remains whether it's real or a mislabeling (something I'm kind of anal about; see microlearning). So, I think it's worth unpacking the concept to see what's there (and what may not be). Is workflow learning a myth?

To start, the notion is that it's learning at the moment of need. Which sounds good. Yet, do we really need learning? The idea Jay pointed to in his book Informal Learning drew on Gloria Gery's work on helping people in the moment. Which is good! But is it learning? Gloria was really talking about performance support, where we're looking to overcome our cognitive limitations. In particular, memory: putting the information in the world instead of in the head. Which isn't learning! It's valuable, and we don't do it enough, but it's not learning.

Why? Well, because learning requires action and reflection. The latter can just be thinking about the implications, or, in Harold Jarche's Personal Knowledge Mastery model, it's about experimenting and representing. In formal learning, of course, it's feedback. I've argued we could do that by providing just a thin layer on top of our performance support. However, I've never seen it done! So you're going to do, and then not learn. Okay, if it's biologically primary (something we're wired to learn, like speaking), you're liable to pick it up over time, but if it's biologically secondary (something we've created and aren't tuned for, e.g. reading), I'd suggest it's less likely. Again, performance is the goal. Though learning can be useful to support what we're good at: comprehending context and making complex decisions.

What is problematic is the notion of workflow and reflection in conjunction. Simply, if you’re reflecting, you’re by definition out of the workflow! You’re not performing, you’re stopping and thinking. Which is valuable, but not ‘flow’. Sure, I may be overly focused on workflow being in the ‘zone’, acting instead of thinking, but that, to me, is really the notion. Learning happens when you stop and contemplate and/or collaborate.

So, if you want to define workflow to include the reflection and thoughtful work, then there is such a thing. But I wonder if it's more useful to separate out the reflection as something to value, facilitate, and develop. It's not like we're born with good reflection practices, or we wouldn't need research on the value of concept mapping and sketchnoting, and how they're better than highlighting. So being clear about the phases of work, and how to do them best, seems to me to be worthwhile.

Look, we should use performance support where we can. It's typically cheaper and more effective than trying to put information into the head. We should also consider adding some learning content on top of performance support in cases where knowing why we're doing something matters as much as knowing what to do. Learning should be used when it's the best solution, of course. But we should be clear about what we're doing.

I can see arguments why talking about workflow learning is good. It may be a way to get those not in our field to think about performance support. I can also see why it’s bad, leading us into the mistaken belief that we can learn while we do without breaking up our actions. I don’t have a definitive answer to “is workflow learning a myth” (so this would be an addition to the ‘misconceptions’ section of my myths book ;). What I think is important, however, is to unpack the concepts, so at least we’re clear about what learning is, about what workflow is, and when we should do either. Thoughts?

Marathons and Sprints

3 September 2024 by Clark

[Image: empty lanes on a running track.]

Besides Kahneman's Thinking, Fast and Slow, I've also talked about fast and slow innovation. Fast is where you have a specific problem to solve, or product to design, or thing to research, and you do so. Slow is the innovation that happens because you create opportunities for new ideas to flourish: making it safe, keeping the 'adjacent possible' open, facilitating creative friction, etc. Similarly, in my writing, I use both marathons and sprints. What do I mean?

So, I tend to have reasonably long time-frames for writing. I now blog once a week, and I tend to queue these up a week or two in advance. My books, of course, when I’m working on them, have deadlines months ahead. Presentations, too, are a form of communication. Overall, I tend to have months between proposals and when I have to deliver them. Occasionally, I’m asked for something on a short time frame, but even that’s several days.

And, in my life, I tend to have time (typically, in the morning) to respond to short term requirements, and also time to nick away at the longer term requirements. I’ve become relatively good at leaving projects open to contribute to them as I can. So, largely, this is the ‘marathon’ life. That is, I take care of details, and then take time to polish off the bigger projects. Which, I acknowledge, is a luxury. The tradeoff is that I haven’t had a secure income for most of the past 2.5 decades ;).

What also happens is that, at some point in my nicking away at a project, it comes together. The picture that’s been gestating finally emerges. Then, I tend to suddenly find myself grinding it out. It could be a chapter, a book, a presentation, or just an article, but ultimately it takes shape. That said, for my most recent tome, an iterative process emerged. I kept sending out the latest version to someone else, and rearranging it based upon their feedback. That is, until I realized that the latest rearrangement felt truly right, and I was done!

This varies, of course. Sometimes I'm asked for something short-term, and then I tend to fall back on things I've already thought through. This blog, as I've mentioned in many ways, forces me to think through things (looking to keep it fed and not repeat myself too much). I don't mind this, as it still requires me to rearticulate, which often leads me to rethink, which is a good thing! In my reprocessing, I'm not only cementing my understanding, but frequently deepening it!

Overall, however, this cycle of marathons and sprints works. The longer-term processing provides the basis for the short-term sprints. As it is, I'm usually as productive as anyone else (possibly more), yet it seems like a lot of my time is spent just musing. Percolation (fermentation, incubation, pick your metaphor) is a good thing! As a reflection, this strikes me as right. It also strikes me as a prescription: break things up, ensure you have enough time for the big things, and take time to reflect. It works for me! And, I realize, it's contrary to much of organizational life, which to me says more about organizational life than about how you (should) think.

(BTW, in real life, I was always better the longer I had to run; I was usually the slowest person in my phys ed classes in sprints! At least on land…) 


Changes at Quinnovation

21 August 2024 by Clark

Pretty much nothing stays the same, and that includes my situation. One of the activities that has been taking up my time, which I wrote about as recently as 4 June this year, is ending. As a result, there are some interesting outcomes, and some which are still unresolved. So here’s the rundown on some changes at Quinnovation (the vehicle which I consult through, for which Learnlets is the blog).

Amongst the things I’ve been doing is serving as Chief Learning Strategist for Upside Learning. That was a role where they had me evangelizing learning science in L&D, and working with them internally to deliver on it. It was a good situation; their CEO, Amit Garg, really cares about learning science, and the folks I worked with were really stellar. We did videos, blog posts, ebooks, conference presentations, and demos. I did internal and external webinars as well. Even some client work!

However, in an announcement this morning, they have been acquired (so I can now say this out loud), and my relationship with them ends. The acquisition is a good situation for an organization that can benefit from a boost in funding, and of course for Amit, earning returns on his hard work.

What this means, of course, is that Quinnovation has a bit more bandwidth than I did before. I'm still continuing in my role as co-director of the Learning Development Accelerator, and board advisor to Elevator9. And, I have existing and some pending business with clients through Quinnovation. If there's an org that wants to actively promote (and practice) learning science, I'm happy to hear from them. Otherwise, if your organization has a need for some guidance around the cognitive and learning sciences for L&D and innovation, let's talk! Those are the current changes at Quinnovation (but probably not the last ones ;). Stay curious, my friends!


The Damage Done

20 August 2024 by Clark

There've been recent discussions about misinformation. One question is: what does it hurt? When you consider myths, superstitions, and misconceptions (the breakdown in my book on L&D problems), what can arise? Let's talk about the damage done.

So, let's start with myths. These, I claim, are things that have been shown by empirical research not to have value. There are studies that have examined these claims, and found no data to support them. For instance, accommodating learning styles is a waste. Yes, we know people differ in learning, but we don't have a reliable basis for characterizing those differences. Moreover, people's choices to work with (or against) their style don't make a difference in their learning. Some of the instruments are theoretically flawed as well as psychometrically invalid.

What's the harm? I'll suggest several ways in which myths harm us. For one, they can cause people to spend resources (money & time) addressing them that won't have an impact. It's a waste! We can also characterize people in ways that limit them; for instance, if they think they learn in a particular way, they may avoid a topic or invest effort in an inappropriate way to learn it. Investing in unproven approaches also perpetuates them, propagating the beliefs to others.

Superstitions, as I define them, are beliefs nobody would claim to believe, yet somehow persist in our practices. For instance, few will claim to believe that telling is sufficient to achieve behavior change. Yet we continue to see information presentation and a knowledge test, such as in "awareness" training. Why? This is a waste of effort; there aren't outcomes from these approaches. Typically, they are legacies of expectations from previous decades, yet business practices haven't been updated. Still, to the extent that we continue these practices, even while decrying them, we're again wasting time and money. Maybe we tick boxes and make people happy, but we can (and should) do better.

The final category is misconceptions. These are beliefs that some hold and others decry. They aren't invalid, but they only make sense in certain circumstances. I suggest that those who defy them don't have the need, and those who tout them are in the appropriate circumstance. What matters is understanding when they make sense, and then using them, or not, appropriately. If you avoid them when they make sense, you may make your life harder. If you adopt them when they're not appropriate, you could make mistakes or waste money.

At the end of the day, the damage done is the cost of wasting money and time. Understanding the choices is critical. To do so best, you can and should understand the underlying cognitive and learning sciences. You should also track the recognized translators of research into practice who can guide you without you having to read the original academese. To be professional in our practice, we need to know and use what’s known, and avoid what’s dubious. Please!

Failing right

13 August 2024 by Clark

I’ve been reading Amy Edmondson’s Right Kind of Wrong, and I have to say it’s very worthwhile. I’ve been a fan of hers since her book Teaming introduced me to the notion of psychological safety. It’s an element I’ve incorporated into my thinking about innovation and learning. This new book talks about how we have beliefs about making mistakes, and how we can, and should, be failing right.

In this book, she uses examples to vibrantly talk about failure, and how it’s an important part of life. She goes on to talk about different types of failure, and the situations they can occur in, creating a matrix. This allows us to look at when and how to fail. Along the way, she talks about self, situational, and systemic failure.

One of the important takeaways, which echoes a point Donald Norman made in The Design of Everyday Things, is that failure may not be our fault! Too often, bad design allows failure instead of preventing it. Moreover, she makes the point that we have a bad attitude towards failure, not recognizing that it's not only part of life, but can be valuable! When we make a mistake, and reflect, we can learn.

Of course, there are simple mistakes. I note that there's some randomness in our architecture; to err is human. But also, there can be factors we haven't accounted for, like bad design, or things out of our control. At the most significant level, she talks about complex systems, and how they can react in unpredictable ways. Along the way, what counts as 'intelligent' failure is made clear: some failures are smart, others are not justified.

She also talks about how experiments are necessary to understand new domains. This is, in my mind, about innovation. She also gives prescriptions, at both the personal and org level. Dr. Edmondson talks about the value of persistence, taking 'good enough', but also not taking failure too personally. She also talks about sharing (as Jane Bozarth would say: Show Your Work), both for calling out problems and for sharing failure.

Beyond a minor quibble about the order in which she presents a couple of things, a more prominent miss, to me, is a small shift in focus. She talks about celebrating the 'pivot', where you change direction. However, I'd more specifically celebrate the learning. That is, whether we pivot or not, we're saying that learning something is good. Of course, I'm biased towards learning. Yes, celebrating action is good, but sharing the learning means others can learn from it too. Maybe I'm being too pedantic.

Still, this is another in her series of books exploring organizational improvement and putting useful tools into our hands. We can, and should, expect not to get everything right all the time, and instead should focus on failing right. Recommended.


Emotions

30 July 2024 by Clark

Emotion matters. Yes, emotion is largely a cultural construct, as Lisa Feldman Barrett tells us. Still, emotions can help or hinder learning. When designing games or creating meaningful learning, they matter. But they also affect us in our daily activities.

So, my previous post, on misinformation, is personal. I'm frustrated that family members are buying into some of it. I try to maintain a calm demeanor, but it's challenging. Still, it's a battle I've not yet given up on. Yet I'm also not immune to the larger effects of emotion.

[Image: a curve showing low performance for low and high arousal, with a peak of performance in between.]

What we know, from the Yerkes-Dodson curve, is that a little bit of arousal (read: emotion) can help, but too much can hurt. What isn't clear from my conceptual rendering is what amount is the 'right' amount of arousal for optimal performance. I'll suggest that for learning it's pretty low, as learning is stressful (another synonym for arousal). And I do suggest we manipulate emotions (which I admit is shorthand for motivation, anxiety, and confidence, which aren't the regular definition) to successfully achieve learning outcomes.

However, even general functioning gets difficult when things are stressful. When I look at the design of casinos, for instance (a way to cope with the too many times I have to go to Vegas for conferences), I note that they deliberately create an information-sparse environment: low lighting, no clocks, few useful cues. It is deliberate, so that you're more focused on the enticements. They want you confused, because you're then more vulnerable to their predations.

I fear that there's a bit of this in our culture. For instance, fear sells: more alarmist headlines lead to more engagement. Which is good for the news business, but perhaps bad for us in several ways. For one, there's a vested interest in focusing on the alarming, not the bigger picture. Similarly, twisting stories to get emotional engagement isn't unknown. That can be entertaining, but when the information we depend on is manipulated, it's problematic. Reducing support for education similarly reduces the intelligence people can apply to analysis.

I struggled to find a topic this week, and I realize it's because of the informational turmoil that's currently in play. So, I thought I'd write about it (for better or worse ;). Exaggerating issues for the sake of clicks and sales, I'll suggest, isn't a good thing. I'm willing to be wrong, but I worry that we're over-excited. Our emotions are being played on, for purposes that are not completely benign. That's what's worrying me; what about you?

Misinformation (and the fighting thereof)

23 July 2024 by Clark

One of the banes of our corporate existence is myths. (We seem to be immune to conspiracy theories, at least.) I've been fighting them in myriad ways over the years. Approaches include a book, talks, and more. We also need ways to vet new information for veracity. Here are a few steps taken recently for misinformation and the fighting thereof.

First, at the Learning Development Accelerator (LDA), we created a research checklist (warning: members only, but at the free level). This is intended as a way to vet claims, starting with the practical, but eventually getting into actually evaluating the research. We don't necessarily recommend that last step, by the way. It's probably better to trust research translators unless you're really willing to dive into the details. (Translators: folks who've demonstrated a reliable ability to take research, extract the meaningful principles, and cut through hype.)

Then, Matt Richter, my colleague in the LDA, recommended Alex Edmans's book May Contain Lies. I've read it and found it an accessible and thoughtful treatment of analyzing claims and data (recommended). Matt even prompted the LDA to host a 'meet the author' session with Alex. That's available to view (it may also require free membership).

In it, he reiterated something from the book that I found valuable: a 'ladder' of investigation. Telegraphically, it's this:

  1. Statement is not fact (the statement must be accurate)
  2. Fact is not data (the fact must be representative)
  3. Data is not evidence (the data must be conclusive)
  4. Evidence is not proof (the evidence must be universal)

What is being said here is that there are several steps to evaluate what folks want to tell (sell) you. If someone just quotes a statement, it's not necessarily valid unless it's accurate. Someone could make a claim that's not actually true (as happens). Then, that statement alone is not data unless it's representative of the general tenor of thought. For instance, a few positive anecdotes aren't necessarily indicative of everyone's experience. Then, representative data has to hold up against any other explanations for the same outcome. For instance, finding out that people like something may not be indicative of its actual efficacy. Finally, the evidence has to apply in your situation, not just theirs.

He used some examples: for instance, books that draw inferences from a few successful companies without determining that other companies with the inferred characteristics also succeed. What's nice is that he has boiled down what can be an overwhelming set of rules into a simple framework. Misinformation isn't diminishing; it even seems to be increasing. There's an increasing need to separate bogus claims from legitimate ones. We need to be rallying around misinformation and the fighting thereof. Here are some tools. Good luck!
