Learnlets

Clark Quinn’s Learnings about Learning

Engineering solutions

19 March 2024 by Clark 1 Comment

Every once in a while, I wonder what I’m doing (ok, not so infrequently ;). It’s easy to think it’s about applying what’s known about learning to the design of solutions. However, it’s more: it’s about applying science results to designing improvements, but it’s broader than learning, and not just individual. Here are some reflections on engineering solutions.

As I’ve probably regaled you with before, I was designing and programming educational computer games, and asking questions like “should we use spacebar and return, or number keys to navigate through menus?” (This was a long time ago.) I came across an article that argued for ‘cognitive engineering’, applying what we knew about how we think to the design of systems. Innately I understood that this also applied to the design of learning. I ended up studying with the author of the article, getting a grounding in what was, effectively, ‘applied cognitive science’.

Now, my focus on games has been on them as learning solutions, and that includes scenarios and simulation-driven experiences. But, when looking for solutions, I realize that learning isn’t always the answer. Many times, for instance, we are better off with ‘distributed’ cognition, that is, putting the answer in the world instead of in our heads. This is broader than learning, and invokes cognitive science. Also, quite frankly, many problems are just based in bad interface design! Thus, we can’t stop at learning. We truly are more about performance than learning.

In a sense, we’re engineers: applying learning and cognitive science to the design of solutions (just as chemical engineering is about applying chemistry). Interestingly, the term learning engineering has another definition. This one talks about using the benefits of engineering approaches, such as data and technology-at-scale, to design solutions. For instance, making adaptive systems requires integrating content management, artificial intelligence, learning design, and more.

Historically, our initial efforts in technology-facilitated learning did take teams. The technology wasn’t advanced enough, and it took learning designers, software engineers, interface designers and more to generate solutions like Plato, intelligent tutoring systems, and the like.  I’ve argued that Web 1.0 took the integration of the tech, content design, and more, which usually was more than one person could handle. Now, we’ve created powerful tools that allow anyone to create content. Which may be a problem! The teams used to ensure quality. Hopefully, the shift back comes with a focus on process.

We can apply cognitive science to our own design processes. We’ve evolved many tools to keep us from reliably making mistakes: design processes, checklists, etc. I’ll suggest that the tools that make it easy to produce content haven’t been scaffolded with support to do the right thing. (In fact, good design makes it hard to do bad things, but our authoring tools have been almost the opposite!) There’s some hope that the additional complexity will focus us back on quality instead of being a tool for quantity. I’m not completely optimistic in the short term, but eventually we may find that tools that let us focus on knowledge aren’t the answer.

I’m thinking we will start looking at how we can use tools to help us do good design. You know the old engineering mantra: good, fast, and cheap, pick 2. Well, I am always on about ‘good’. How do we make that an ongoing factor? Can we put in constraints so it’s hard to do bad design? Hmm… An interesting premise that I’ve just now resurrected for myself. (One more reason to blog!) What’re your thoughts?

 

Get the basics right first!

10 October 2023 by Clark Leave a Comment

I’m currently advising several organizations on their approaches to the use of technology to support learning. Moreover, I’ve been doing so for more than two decades, and have seen a lot more such situations besides. One of the things I struggle with is seeing folks get all agog over new technology without getting the design right beforehand. Thus, let me make a simple suggestion: get the basics right first!

So, we know what leads to good learning. Heck, I’ve written a book summarizing what’s known about it, and I’m not the only one. Despite the fact that humans are complex, and increasingly so are our learning goals, there exist robust principles. We know that we should provide a sufficient quantity of appropriately challenging contextualized practice with aligned feedback, for instance. That is, if we actually want to achieve an outcome.

Yet, too often, we don’t see this. We see, instead, information presentation. Sometimes even with a knowledge test! Yet, such an effort is unlikely to lead to any meaningful change. That is, the investment’s wasted!

Worse, too often we see this being done with fancy new tools. Sure, I get as attracted to shiny new objects as anyone. However, I want to understand their core affordances for learning. Anyone had the dubious pleasure of attending a slide presentation in a virtual world? Or maybe being presented with animated presentations of lots of facts? The new tools may have a short-term effect of novelty, but that’s it. The fundamental aspects of how our brains learn are what’s going to make, or break, a learning investment.

On the other hand, if we start with getting the learning right, first, then there may be additional value coming from the tech. Adaptivity, on top of quality learning design, can accelerate the outcomes.  Immersion, at the right time and place, is better than not. Language models, properly used, can have big impacts. However, it comes from knowing the specific capabilities, and matching them to the need.

While I haven’t done the ‘back of the envelope’ calculation (I’m not a financial whiz), I can state with a fair degree of comfort that you’re better off doing simple learning with good design. Bad design with shiny tech is still bad design! You’ll more likely have an impact putting your investment into learning quality than using fancy tech to deliver dreck. Of course, once you’ve done that, the investment in tech can do a lot more!

I’m not against new tech, heck I’ve written on games, mobile, and more! What I’m against is new tech in lieu of good design. And I’m even more enamored of good tech on top of good design.  So, get the basics right first, then add in the shiny objects. That way you’re going to have a good return on your $$, and that’s a good thing. Right?

PS, speaking of basics, we’ll be running a debate tomorrow (11 Oct) discussing the Learning Experience Design (LXD) label. I’m sure we’ll unpack critical issues. Check it out. 

Grounded in practice

16 May 2023 by Clark Leave a Comment

Many years ago, I was accused of not knowing the realities of learning design. It’s true that I’ve been in many ways a theorist, following what research tells us, and having been an academic. I also have designed solutions, designed design processes, and advised orgs. Still, it’s nice to be grounded in practice, and I’ve had the opportunity of late.

So, as you read this, I’m in India (hopefully ;), working with Upside Learning. I joined them around 6 months ago to serve as their Chief Learning Strategist (on top of my work as Quinnovation, as co-director of the Learning Development Accelerator, and as advisor to Elevator9). They have a willingness to pay serious attention to learning science, which as you might imagine, I found attractive!

It’s been a lot of marketing: writing position papers and such. The good news is it’s also been about practice. For one, I’ve been running workshops for their team (such as the Missing LXD workshop with the LDA coming up in Asia-friendly times this summer). We’ve also created some demos (coming soon to a sales preso near you ;). I’ve also learned a bit about their clients and usual expectations.

It’s the latter that’s inspiring. How do we bake learning science into a practical process that clients can comprehend? We’re working on it. So far, it seems like it’s a mix of awareness, policy, and tools. That is, the design team must understand the principles in practice, there need to be policy adjustments to support the necessary steps, and the tools should support the practice. I’m hoping we have a chance to put some serious work into these in my visit.

Still, it’s already been eye-opening to see the realities organizations face in their L&D roles. It only inspires me more to fight for the changes in L&D that can address this. We have lots to offer orgs, but only if we move out of our comfort zone and start making changes. Here’s to the revolution L&D needs to have!

 

Misleading Malarkey

25 April 2023 by Clark 2 Comments

Recently, I saw a claim that was, well, a tad extreme. Worse, I think it was wrong, and possibly harmful. Thus, I feel it’s right to address it, to avoid misleading malarkey.

So, here’s the claim that riled me up:

Short-form edutainment is the most effective teaching method for both children and adults. TikTok and YouTube shorts will ultimately replace high schools and universities. Employment sector will phase out LMS systems and replaced with AI-powered compliance tools. If you are considering instructional design as a career, you may want to become a YouTuber or TikToker instead.

If you’ve tuned in at all, you’ll know that I’m a fan of engagement, properly construed.  Heck, it’s the topic of my most recent book! So, talking about the value of engagement in learning is all to the good. However…

…this claim goes over the top. Most notably, there’s the claim that edutainment is the most effective teaching method. If only! That puts me off, because teaching should yield a learning outcome, and just watching video shorts won’t do that (under most circumstances). Not surprisingly, I asked for research.

The author pointed to a study where mice genetically low on dopamine learned better when given dopamine. Yes, but the study had the mice do more than just watch videos, they performed tasks! I tried to go deeper, saying that engagement may be desirable, but it’s not sufficient. Without practice, watching entertaining and informative material (e.g. edutainment) isn’t a path to learning outcomes.

The conversation was derailed by my comment that edutainment had gotten a bad name from games. In the 80s, in an industry I was in, this was the case! I was accused of having a ‘gamification’ mindset! (Ahem.) I tried steering the conversation back to the point that it’s not about gamification, it’s about engagement combined with practice.

Interestingly, there was an almost parallel conversation about how engagement wasn’t the same as learning (which I pointed to in the exchange). The general take is that engagement is desirable but insufficient. Yes! Yet here we see the claim that engagement is all we need!

I believe in engagement for learning. I just don’t believe that by itself it will lead to learning. Learning science supports both the value of engagement, and the necessity of practice and feedback. That’s all. But claims like the above are misleading malarkey. It may be we’re talking an outrageous marketing claim (infamy is better than not being known at all?), but when it misleads, it’s a problem. Am I missing something?

Missing LXD Workshop

20 April 2023 by Clark Leave a Comment

We interrupt your regularly scheduled reading for this commercial announcement:

What is Learning Experience Design (LXD)? Further, why should you care? Finally, (and arguably most important) what does it mean you should do differently? Those, to me, are important questions. My short answer is that LXD is the elegant integration of learning science and engagement. Which, to me, implies some important nuances on top of what’s traditionally done in instructional design (ID). How to address it? There’s actually quite a lot in LXD, but it’s also a lot of overlap with traditional ID practices and processes. I reckon the easiest (and best) way to address it is to talk about the delta. That is, what’s different between the two. So, in my role for Upside Learning, I developed a missing LXD workshop. We ran it internally to good outcomes, and now, you can take it!

I believe that the difference starts with objectives; you can’t make a meaningful experience if you don’t have learners acquiring relevant new skills (not just an information dump). From there, there are nuances on designing individual practice activities, and then on aggregating those activities into a coherent set of practice. Moving on, we look at the content elements of models and examples, and then the emotional aspects of learning. The workshop closes by looking at a design process that accommodates these. Recognizing that folks don’t want to throw out their whole process and start anew, it works from a generic model.

In the workshop, I cover each of those topics in a week, so it’s a six-week experience. In between, I ask attendees to do some interim processing, both to cement their understanding and to change their practices. Each week we’ll cover underlying concepts, see examples of what we’re talking about, actively process the information, and do a major application task.

To make this available more broadly, Upside’s partnered with the Learning Development Accelerator (LDA) to deliver it. Full disclosure: I’m co-director of the LDA, and Chief Learning Strategist for Upside Learning (in addition to my ongoing role for Quinnovation). (So, it’s all about me! :) Seriously, I think this puts together the tools I believe are necessary to lift our industry.

To be clear, since the advance notice timeframe puts this in summer, we’re offering it in Asia time-frames first (tho’ anyone is welcome!):

Australian Eastern Standard Time: July 7, 14, 21, 28, August 4 and 11 from 12h00 to 14h00 each day
Singapore Time: July 7, 14, 21, 28, August 4 and 11 from 10h00 to 12h00 each day
India Standard Time: July 7, 14, 21, 28, August 4 and 11 from 07h30 to 09h30 each day
New York Time: July 6, 13, 20, 27, August 3 and 10 from 22h00 to 24h00 each day

We’re offering it for US$100 to LDA members, and US$350 to non-members (for only $40 more, you get the full LDA offerings as well).

We’re planning to offer the missing LXD workshop again at a later date at East Coast/Europe friendly times (probably at a steeper price, we’ll have worked the bugs out ;). You can find out more at the LDA site. It’s got learning science and engagement bundled up into a coherent whole, for those who’ve already been doing ID and want to lift their game. I hope you’ll find it worth your while.

We now return you to your regularly scheduled reading until next week at the usual time.

Time is the biggest problem?

21 March 2023 by Clark 1 Comment

In conversations, I’ve begun to suspect that one of the biggest, if not the biggest, problem facing designers wishing to do truly good, deep, design, is client expectations. That is, a belief that if we’re provided with the appropriate information, we can crank out a solution. Why, don’t you just distribute the information across the screen and add a quiz? While there are myriad problems, such as lack of knowledge of how learning works, etc, folks seem to think you can turn around a course in two weeks. Thus, I’m led to ponder if time is the biggest problem.

In the early days of educational technology, it was considered technically difficult. Thus, teams worked on instantiations: instructional designers, media experts, technologists. Moreover, they tested, refined, and retested. Over time, the tools got better. You still had teams, but things could go faster. You could create a draft solution pretty quickly, with rapid tools. However, when people saw the solutions, they were satisfied. It looks like content and quizzes, which is what school is, and that’s learning, right? Without understanding the nuances, it’s hard to tell well-produced learning from well-designed and well-produced learning. Iteration and testing fell away.

Now, folks believe that with a rapid tool and content, you can churn out learning by turning the handle. Put content into the hopper, and out come courses. This was desirable from a cost-efficiency standpoint. It gets worse when we fail to measure impact. If we’re just asking people whether they like it, we don’t really know if it’s working. There’s no basis to iterate! (BTW, the correlation between learners’ assessments of learning quality and the actual quality is essentially zero.)
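That near-zero correlation is easy to check against your own data, once you collect both smile-sheet ratings and an actual impact measure for the same courses. As a minimal sketch (the numbers below are invented for illustration), a plain Pearson correlation is all it takes:

```python
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical data: per-course learner ratings (1-5) alongside a
# measured behavior-change score for the same courses. Swap in real
# data to see whether the ratings predict anything at all.
ratings = [4.5, 3.8, 4.9, 4.2, 3.5, 4.7]
impact = [0.10, 0.40, 0.05, 0.20, 0.30, 0.15]
print(round(pearson(ratings, impact), 2))
```

If ratings tracked impact, the value would approach 1; the claim above is that on real data it hovers near zero, which is exactly why ratings alone give you no basis to iterate.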

For the record, an information dump and knowledge test is highly unlikely to lead to any significant change in behavior (which is what learning is supposed to accomplish). What we need is meaningful practice, and getting that right requires a first draft and fine-tuning. We know this, and yet we struggle to find the time and resources to do it, because of expectations.

These expectations of speed, and unrealistic beliefs in quality, create a barrier to actually achieving meaningful outcomes. If folks aren’t willing to pay for the time and effort to do it right, and they’re not looking at outcomes, they will continue to believe that what they’re spending isn’t a waste.

I’ve argued before that what might make the biggest impact is measurement. That is, we should be looking to address some measurable problem in the org and not stop until we have addressed it. With that, it becomes easier to show that the quick solutions aren’t having the needed impact. We need evidence to support making the change, but I reckon we also need to raise awareness. If we want to change perception, and the situation, we need to ensure that others know time is the biggest problem. Do you agree?

Debating debates

17 January 2023 by Clark Leave a Comment

This is the year, at the LDA, of unpacking thinking (the broader view of my previous ‘exposure’). The idea is to find ways to dig a bit into the underlying rationale for decisions, to show the issues and choices that underlie design decisions. How to do that? Last year we had the You Oughta Know series of interviews with folks who represent some important ideas. This year we’re trying something new, using debates to show tradeoffs. Is this a good idea? Here’s the case, debating debates.

First, showing underlying thinking is helpful. For one, you can look at Alan Schoenfeld’s work on showing his thinking as portrayed in Collins & Brown’s Cognitive Apprenticeship. Similarly, the benefits are clear in the worked examples research of John Sweller. While it’s fine to see the results, if you’re trying to internalize the thinking, having it made explicit is helpful.

Debates are a tried and tested approach to issues. They require folks to explore both sides. Even if there’s already a reconciliation, I feel, it’s worth it to have the debate to unpack the thinking behind the positions. Then, the resolution comes from an informed position.

Moreover, they can be fun! As I recalled here, in an earlier debate, we agreed to that end. Similarly, in some of the debates I had with Will Thalheimer (e.g. here), we deliberately were a bit over-the-top in our discussions. The intent is to continue to pursue the fun as well as exposing thinking. It is part of the brand, after all ;).

As always, we can end up being wrong. However, we believe it’s better to err on the side of principled steps. We’ll find out. So that’s the result of debating debates. What positions would you put up?

Critical ID/LXD Differences?

14 June 2022 by Clark 4 Comments

I’ve argued both that Learning Experience Design (LXD) is an improvement on Instructional Design (ID), and that LXD is the  elegant integration of learning science with engagement. However, that doesn’t really unpack what are the critical ID/LXD differences. I think it’s worth looking at those important distinctions both in principle and practice. Here, I’m talking about the extensions to what’s already probably in place.

Principle

In principle, I think it’s the engagement part that separates the two. True, proper ID shouldn’t ignore it. However, there’s been too little attention. For instance, only one ID theorist, John Keller, has really looked at those elements. Overall, it’s too easy to focus purely on the cognitive. (Worse, of course, is a focus purely on knowledge, which really  isn’t good ID).

I suggest that this manifests in two ways. First, you need an initial emotional ‘hook’ to gain the learner’s commitment to the learning experience, even before we open them up cognitively (though, of course, they’re linked)! Then, we need to manage emotions throughout the experience. We want to do things like keep challenge balanced, keep anxiety low enough not to interfere, build confidence, etc.

We have tools we can use, like story, exaggeration, humor, and more to assist us in these endeavors. At core, however, what we’re focusing on is making it a true ‘experience’, not just an instructional event. Ideally, we’d like to be transformational, leaving learners equipped with new skills and the awareness thereof.

Practice

What does this mean in practice? A number of things. For one, it takes creativity to consider ways in which to address emotions. There are research results and guidance, but you’ll still want to exercise some exploration. Which also means you have to be iterative, with testing. I understand that this is immediately scary, thinking about costs. However, when you stop trying to use courses for everything, you’ll have more resources to do courses right. For that matter, you’ll actually be achieving outcomes, which is a justification for the effort.

Our design process needs to start gathering different information. We need to get performance objectives; what people actually need to do, not just what they need to know. You really can’t develop people if you’re not having them perform and getting feedback. You also need to understand  why this is needed, why it’s important, and why it’s interesting. It is, at least to the subject matter experts who’ve invested the time to  be experts in this…

Your process also needs to have those creative breaks. These are far better if they’re collaborative, at least at the times when you’re ideating. While ideally you have a team working together on an ongoing basis, in many cases that may be problematic. I suggest getting together at least at the ideating stage, and then after testing to review findings.

You’ll also want to be testing against criteria. At the analysis stage, you should design criteria that will determine when you’re ‘done’. When you run out of time and money is not the right answer! Test usability first, then effectiveness, and then engagement. Yes, you want to quantify engagement. It doesn’t have to be ‘adrenaline in the blood’ or even galvanic skin response; subjective evaluations by your learners are just fine. If you’re running out of time and money before you achieve your metrics, you can adjust them, but now you’re doing it consciously, not implicitly.
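To make the testing-against-criteria idea concrete, here’s a minimal sketch. The criteria names and thresholds are invented for illustration; the point is only that ‘done’ is defined by measurements agreed at analysis time, not by the calendar:

```python
# Hypothetical acceptance criteria, set at the analysis stage.
CRITERIA = {
    "usability": 4.0,      # mean learner rating, 1-5 scale
    "effectiveness": 0.8,  # proportion passing the performance assessment
    "engagement": 3.5,     # mean subjective engagement rating, 1-5 scale
}

def unmet(results):
    """Return the criteria this iteration hasn't met; empty means 'done'."""
    return [name for name, threshold in CRITERIA.items()
            if results.get(name, 0.0) < threshold]

# One test iteration: usability and engagement pass, effectiveness
# doesn't, so this design goes back for another revision.
print(unmet({"usability": 4.3, "effectiveness": 0.72, "engagement": 3.8}))
# → ['effectiveness']
```

Note the ordering matches the post: usability is listed (and so flagged) first, and an iteration only ships when the list comes back empty.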

I’m sure there’s more that I’m missing, but these strike me as some critical ID/LXD differences. There are differences in principle, which yield differences in practice. What are your thoughts?

Superstitions for New Practitioners

26 April 2022 by Clark 3 Comments

It’s become obvious (even to me) that there are a host of teachers moving to L&D. There are also a number of initiatives to support them. Naturally, I wondered what I could do to assist. With my reputation as a cynic apparently well-secured, I’m choosing to call out some bad behaviors. So here are some superstitions for new practitioners to watch out for!

As background, these aren’t the myths that I discuss in my book on the topic. That would be too obvious. Instead, I’m drawing on the superstitions from the same tome, that is, things that people practice without necessarily being aware of them, let alone espousing them. These manifest through behaviors and expectations rather than explicit exhortation.

  • Giving people information will lead them to change. While we know this isn’t true, it still seems to be prevalent. I’ve argued before about why I think this exists, but what matters is what it leads to: information dump and knowledge-test courses. What we need instead is not just a rationale, but also practice and then ongoing support for the change.
  • If it looks like school, it’s learning. We’ve all been to school, and thus we all know what learning looks like, right? Except many school practices are only useful for passing tests, not for actually solving real problems and meeting real goals. (Only two things wrong: the curriculum and the pedagogy, otherwise school’s fine.) It, however, creates barriers when you’re trying to create learning that actually works. Have people look at the things they learned outside of school (sports, hobbies, crafts, etc) for clues.
  • People’s opinion is a useful metric for success. Too often, we just ask ‘did you like it’. Or, perhaps, a ‘do you think it was valuable’. While the latter is better than the former, it’s still not good enough. The correlation between people’s evaluation of the learning and the actual impact is essentially zero. At least for novices. You need more rigorous criteria, and then test to achieve.
  • A request for a course is a sufficient rationale to make one. A frequent occurrence is a business unit asking for a course. There’s a performance problem (or just the perception of one), and therefore a course is the answer. The only problem is that there can be many reasons for a performance problem that have nothing to do with knowledge or skill gaps. You should determine what the performance gap is (to the level you’ll know when it’s fixed), and the cause.  Only when the cause is a skill gap does a course really make sense.
  • A course is always the best answer. See above; there are lots of reasons why performance may not be up to scratch: lack of resources, wrong incentives, bad messaging, the list goes on. As Harless famously said, “Inside every fat course there’s a thin job aid crying to get out.” Many times we can put knowledge in the world, which makes sense because it’s actually hard to get information and skills reliably in the head.
  • You can develop meaningful learning in a couple of weeks. The rise of rapid elearning tools and a lack of understanding of learning has led to the situation where someone will be handed a stack of PPTs and PDFs and a rapid authoring tool and expected to turn out a course. Which goes back to the first two problems. While it might take that long to get just a first version, you’re not done. Because…
  • You don’t need to test and tune. There’s this naive expectation in the industry that if you build it, it is good. Yet the variability of people, the uncertainty of the approach, and more, suggest that courses should be trialed, evaluated, and revised until actually achieving the necessary change. Beware the ‘build and release’ approach to learning design, and err on the side of iterative and agile.

This isn’t a definitive list, but hopefully it’ll help address some of the worst practices in the industry. If you’re wary of these superstitions for new practitioners, you’ll likely have a more successful career. Fingers crossed and good luck!

There’s some overlap here with my messages to CXOs 1 and 2, but with a different target.

Tech Thoughts

28 October 2021 by Clark Leave a Comment

I’m late with a post this week, owing to several factors, all relating to technology. I hadn’t quite pulled together a complete lesson, but by writing down these tech thoughts, I got there. (A further argument for the benefits of reflection.)

It started with upgrading our phones. We needed to (I tend to hand mine down, but really we both needed an upgrade this time). Of course there are hiccups, particularly since m’lady’s was so old that it couldn’t do the amazing thing mine had done. What happened with mine was that you just put the old phone and the new phone together and the new one just sucks the data off the old one and then the old one asks if you want to wipe it clean!  That’s some serious user experience. Something we should look more to in our LXD, so that we’re doing proper backwards design and we have the right combination of tools and learning to make performance relatively effortless.

Then another thing was quite interesting. An individual linked to me on the basis of a citation in a book. I didn’t know the book, so he sent me a picture of the page. He also asked if I could read Dutch. Sadly, no. However, I had recently upgraded my OS, and when I opened the picture, I noticed I could click on the text. Of. The. Picture!  I could select all the text (my OS was doing OCR on the picture live!), and then I could paste into Google Translate (another amazing feat) and it recognized it as Dutch and translated it into English. Whoa!

On the flip side, owing to the unusually heavy rain (for California), first our internet went out, and then the power. Fortunately both were working by the next morning. However, after that my backup drives kept dismounting and I couldn’t execute a backup reliably. I thought it might be the laptop, and I did a couple of increasingly difficult remedial measures. Nope. Had the drives been damaged by the power outage? Power outages aren’t quite new around here (we’re a bit up a hillside, and squirrels regularly blow the transformer), yet it hadn’t happened before.

Then I was on a Zoom call, and I started having hiccups in the microphone and camera. Even typing. What? When I switched to the laptop camera, it was all good. All the things, drives, microphone, external monitor, are connected by an external hub. The hub had gone wonky! Instead of having to replace drives, I picked up a new hub last nite, and all’s good now. Phew!

I guess my take-home tech thought is that we’re making true a promise I’ve mentioned when talking mobile: we really do have magic. (Arthur C. Clarke famously said any sufficiently advanced technology is indistinguishable from magic.) We can do magical things like talk at a distance, have demons do tasks on our behalf (picture text transcribing and translation), etc. On the other hand, when it doesn’t work, it can be hard to identify the problem! Overall, it’s a win. Well, when it’s designed right! Which involves testing and tuning. As Dion Hinchcliffe put it: “Seamless #cx is now table stakes.” So let’s get designing, testing, and tuning, and make magical experiences.
