Learnlets


Clark Quinn’s Learnings about Learning

Search Results for: top tools

Game Development Tools

15 August 2008 by Clark 7 Comments

The last topics in our two-day game design workshop for the Guild (great group of attendees, great experience) were evaluation, production, and organizational issues. On the production front, the perennial topic of tools came up. In thinking about it, I realized that we needed a map, so I started coming up with one (a diagram, of course :) ). I ran it past Jeff (Johannigman, my co-conspirator on the workshop) in our taxi to the airport, to his general approval.

[Image: game tool space diagram]

The two dimensions are the complexity of the scenario (covering only branching and model-driven) and the power (i.e., complexity) of the tool. It's a pretty linear map; note that small distances aren't significant (so the clusters are roughly equivalent).

The impossible dream is that tool that everyone wants that makes it easy to develop model-driven interactions.   Sorry, I’m convinced it can’t exist, because to be flexible enough to cover all the different models that we’d want to represent, it’s got to be so general as to be essentially just a programming language.   QED (Quinn Ephemeral Decision).
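To make the branching vs. model-driven distinction concrete, here's a minimal, invented sketch (purely illustrative; the scenario content is made up): a branching scenario is just a data structure, while a model-driven interaction embeds arbitrary rules, i.e., a program, which is why a fully general tool tends to collapse into a programming language.

```python
# A branching scenario is pure data: prompts, choices, next nodes.
branching = {
    "start": {"prompt": "The customer is angry. Do you...",
              "choices": {"apologize": "calm", "argue": "escalate"}},
}

def thermostat_model(state, action):
    """Toy simulation model: any such model is, in the end, a program."""
    temp, setpoint = state
    if action == "raise":
        setpoint += 1
    elif action == "lower":
        setpoint -= 1
    # The room drifts toward the setpoint each tick.
    if temp < setpoint:
        temp += 0.5
    elif temp > setpoint:
        temp -= 0.5
    return (temp, setpoint)

state = (20.0, 20.0)
for act in ["raise", "raise", "wait"]:
    state = thermostat_model(state, act)
print(state)  # (21.5, 22.0)
```

An authoring tool can capture the first shape with forms and flowcharts; capturing the second, in general, means letting authors write something equivalent to the function above.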

This is a first stab, so feedback welcome.   If desired, I can create it in Gliffy and we can collaboratively develop it (though my first effort with that was underwhelming in participation…).   Thoughts?

eLearning Tools?

17 April 2007 by Clark 6 Comments

In my elearning strategy session at the elearning guild, I included the following graphic as a model to think about how tools can help populate a performance ecosystem (aka learnscape):

[Image: Performance Ecosystem diagram]

The point being that different tools fit different spaces, depending on the experience level of who they serve and whether they're more individual or more group-oriented. The desktop/mobile distinction may become less clear over time, but it still makes sense for now.

I've seen folks trying to understand where blogs, wikis, etc. fit into the space of learning tools (and I realize that some of the tools have a broad reach; I've tried to place them at their center of impact, though maybe I need some circles or auras or something indicating reach).

So, do you think I’ve got it right? And, do you think it’s useful?

They’re ripping you off

7 January 2025 by Clark Leave a Comment

Ok, so I am grateful. But there may also be times to rant. (Maybe I’m grateful for getting it off my chest?) But I’m seeing a continual rise in how folks are looking to take advantage of me, and you. And I don’t like it. So, here are some of the ways they’re ripping you off!

So, first, it’s the rise in attempts to defraud you. That can be scams, phishing, or more. As I was creating this post, this was a repost on Bluesky:

Robocalls are seeing a massive increase lately. Keep in mind that efforts to stop caller-ID spoofing have largely had no real effect, because callers now use “throw away” numbers that verify correctly and then are abandoned after days or even hours. In fact, if you get an “unknown caller” on your phone, it’s likely NOT a spam call, because spammers can now so easily not bother spoofing or blocking their numbers, they just keep switching to different “legit” numbers that spam blocks usually don’t detect.

Email phishing is on the rise, and much of it now is bypassing SPF and DKIM checks (that Google and other large mailers started requiring for bulk mailings) due to techniques such as DKIM replay and a range of other methods. Fake PayPal invoices are flooding the Net, and they often are passing those checks meant to block them. It’s reported that many of these are coming from Microsoft’s Outlook, with forged PayPal email addresses. Easiest way to detect these is to look at the phone number they want you to call if you have a question — and if it’s not the legit PayPal customer service number you know it’s not really from PayPal. Getting you to call the scammers on the phone is the basis of the entire scheme.

It's all getting worse, not better. (From Lauren Weinstein, lauren.vortex.com)
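On the phishing point: SPF, DKIM, and DMARC verdicts typically appear in the Authentication-Results header that a receiving mail server adds. As a purely illustrative sketch (the header value below is invented, and real headers can be folded and multi-valued), one can pull the verdicts out with a regex; note that, per the quote above, even "pass" verdicts are no longer proof of legitimacy:

```python
import re

def auth_results(header_value):
    """Extract spf/dkim/dmarc verdicts from an Authentication-Results
    header value (simplified; real headers may be multi-valued)."""
    verdicts = {}
    for method in ("spf", "dkim", "dmarc"):
        m = re.search(rf"\b{method}=(\w+)", header_value)
        if m:
            verdicts[method] = m.group(1)
    return verdicts

# Invented example header. DKIM-replay re-sends a once-legitimately-
# signed message, so dkim=pass can hold on a fraudulent send.
hdr = "mx.example.com; spf=pass smtp.mailfrom=example.net; dkim=pass; dmarc=fail"
print(auth_results(hdr))  # {'spf': 'pass', 'dkim': 'pass', 'dmarc': 'fail'}
```

The practical upshot matches the advice in the quote: treat header checks as one weak signal, and verify contact details (like phone numbers) against the company's own site.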

Another vector is Google Calendar announcements, and recently DocuSign frauds. Plus, of course, the continual fake invoices for McAfee, etc. I don't know about you, but the earlier scam of pretending to be someone on LinkedIn has returned for me. I'm seeing a renewal of folks saying that I have an interesting profile, or that I'd be a good match for their company's new initiative. Without knowing anything about me, of course.

Worse, I'm now seeing at least the former showing up on Bluesky (so I'm keeping Mastodon around; quinnovator on both), and even on Academia.edu! I hear about some attempts to crack down on the factories where they house (and exploit) folks to do this. Which, of course, just drives them to smaller and harder-to-find operations. The tools are getting more powerful, making it easier.

The one that really gets me is the increasing use of our data to train language models. I was first alerted when a tool (no longer freely available) allowed me to check one of the AI engines. Sure enough, this blog was a (minuscule) percentage of it. In the column on the right, you can see I'm ok with my posts being fodder. Er, only if you aren't making money, share alike, and provide attribution! Which isn't the case; I've had no contact, nor seen any remuneration.

This is happening to you, too. As they say, if you're not paying, you're the product. If you use generative AI (e.g. ChatGPT), you're likely having your prompts tracked, and any materials you upload are fair game. Many of the big tools (e.g. Microsoft's) that connect to the internet are also taking your data. Some may make not collecting your data the default, but others don't. In short, your data is being used. Sure, it may be a fair exchange, but how do you know?

In short, they’re ripping you off. They’re ripping us off!  And, we can passively accept it, or fight. I do. I report phishing, I block folks on social media, and I tick every box I can find saying you can’t have my data. Do we need more? I like that the EU has put out a statement on privacy rights. Hopefully, we’ll see more such initiatives. The efforts won’t stop; shareholder returns are at stake after all, but I think we can and should stand up for our rights. What say you?

What L&D resources do we use?

29 October 2024 by Clark 1 Comment

This isn't a rhetorical question. I truly do want to hear your thoughts on the resources necessary to successfully execute our L&D responsibilities. Note that by resources in this particular case, I'm not talking about courses (e.g., skill development), nor community. I'm specifically asking about the information resources, such as overviews, and in particular the tools, we use to do our job. So I'm asking: what L&D resources do we need?

[Image: a diagram with spaces for strategy, analysis, design, development, implementation, and evaluation, as well as topics of interest. Elements that might be included: tools, information resources, overviews, and diagrams. Some examples populate the spaces.]

I'm not going to ask this cold, of course. I've thought about it a bit myself, creating an initial framework (click on the image to see it larger). Ironically, considering my stance, it's based around ADDIE. That's because I believe the elements are right; it's just not a good basis for a design process. However, I do think we may need different tools for the stages of analysis, design, development, implementation, and evaluation, even if we don't invoke them in a waterfall process. I also have categories for overarching strategy, and for specific learning topics. These are spaces in which resources can reside.

There are also several different types of resources I've created categories for. One is overviews of the particular spaces I indicate above. Another is information resources, which drill into a particular approach or more. These can be in any format: typically text or video. Because I'm weird for diagrams, I have them separately, but they'd likely be a type of info resource. Importantly, one category is tools. Here I'm thinking of the performance support tools we use: templates, checklists, decision trees, lookup tables. These are the things I'm a bit focused on.
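As a purely illustrative sketch of the decision-tree sort of performance support tool mentioned above (the questions and recommendations are invented, not from the framework), such a tool can be plain data plus a tiny walker:

```python
# A performance-support decision tree as plain data: each node is
# either a question with yes/no branches, or a recommendation (leaf).
# (Illustrative content only; the node text is invented.)
tree = {
    "q": "Is there a real performance gap?",
    "yes": {
        "q": "Is it a skill gap (vs. missing information)?",
        "yes": "Design practice-centered training.",
        "no": "Build a job aid / performance-support resource.",
    },
    "no": "No intervention needed; check expectations instead.",
}

def recommend(node, answers):
    """Walk the tree with a sequence of 'yes'/'no' answers to a leaf."""
    for ans in answers:
        if isinstance(node, str):  # already at a recommendation
            break
        node = node[ans]
    return node

print(recommend(tree, ["yes", "no"]))
# Build a job aid / performance-support resource.
```

The same shape works for checklists and lookup tables: the value is that the expertise lives in the resource, not in someone's head.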

Of course, this is for evidence-based practices. There are plenty of extant frameworks that are convenient, and cited, but not well-grounded. I'm looking for the tools you trust and use to accomplish meaningful solutions to real problems. The ones that provide support for excellent execution. In addition to the things listed above, how about processes? Frameworks? Models? What enables you to be successful?

Obviously, but importantly, this isn't done! That is, I've put my first best thoughts out there, but I know that there's much more. More will come to me (it already has; I've already revised the diagram a couple of times), but I'm hoping more will come from you too. That includes the types of resources and spaces, as well as particular instances.

The goal is to think about the resources we have and use. I welcome you weighing in, via comments on the blog or wherever you see this post; let me know which ones you find essential to successful execution. I'd really like to know: what L&D resources do we use? Please take a minute or two and weigh in with your top and essential tools. Thanks!

Engineering solutions

19 March 2024 by Clark 1 Comment

Every once in a while, I wonder what I'm doing (ok, not so infrequently ;). And it's easy to think it's about applying what's known about learning to the design of solutions. However, it's more: it's about applying science results to designing improvements, but it's broader than learning, and not just individual. Here are some reflections on engineering solutions.

As I’ve probably regaled you with before, I was designing and programming educational computer games, and asking questions like “should we use spacebar and return, or number keys to navigate through menus?” (This was a long time ago.) I came across an article that argued for ‘cognitive engineering’, applying what we knew about how we think to the design of systems. Innately I understood that this also applied to the design of learning. I ended up studying with the author of the article, getting a grounding in what was, effectively, ‘applied cognitive science’.

Now, my focus on games has been on them as learning solutions, and that includes scenarios and simulation-driven experiences. But, when looking for solutions, I realize that learning isn't always the answer. Many times, for instance, we are better off with 'distributed' cognition; that is, putting the answer in the world instead of in our heads. This is broader than learning, and invokes cognitive science. Also, quite frankly, many problems are just based in bad interface design! Thus, we can't stop at learning. We truly are more about performance than learning.

In a sense, we’re engineers; applying learning and cognitive science to the design of solutions, (just as chemical engineering is about applying chemistry). Interestingly, the term learning engineering has another definition. This one talks about using the benefits of engineering approaches, such as data, and technology-at-scale, to design solutions. For instance, making adaptive systems requires integrating content management, artificial intelligence, learning design, and more.

Historically, our initial efforts in technology-facilitated learning did take teams. The technology wasn’t advanced enough, and it took learning designers, software engineers, interface designers and more to generate solutions like Plato, intelligent tutoring systems, and the like.  I’ve argued that Web 1.0 took the integration of the tech, content design, and more, which usually was more than one person could handle. Now, we’ve created powerful tools that allow anyone to create content. Which may be a problem! The teams used to ensure quality. Hopefully, the shift back comes with a focus on process.

We can apply cognitive science to our own design processes. We've evolved many tools to keep us from making predictable mistakes: design processes, checklists, etc. I'll suggest that the move to tools that make it easy to produce content hasn't been scaffolded with support to do the right thing. (In fact, good design makes it hard to do bad things, but our authoring tools have been almost the opposite!) There's some hope that the additional complexity will focus us back on quality instead of serving as a tool for quantity. I'm not completely optimistic in the short term, but eventually we may find that tools that let us focus on knowledge aren't the answer.

I’m thinking we will start looking at how we can use tools to help us do good design. You know the old engineering mantra: good, fast, and cheap, pick 2. Well, I am always on about ‘good’. How do we make that an ongoing factor? Can we put in constraints so it’s hard to do bad design? Hmm… An interesting premise that I’ve just now resurrected for myself. (One more reason to blog!) What’re your thoughts?


Get the basics right first!

10 October 2023 by Clark Leave a Comment

I'm currently advising several organizations on their approaches to the use of technology to support learning. Moreover, I've been doing so for more than two decades, and have seen a lot more such situations as well. One of the things I struggle with is seeing folks get all agog over new technology without getting the design right beforehand. Thus, let me make a simple suggestion: get the basics right first!

So, we know what leads to good learning. Heck, I've written a book summarizing what's known about it, and I'm not the only one. Despite the fact that humans are complex, and increasingly so are our learning goals, there exist robust principles. We know that we should provide a sufficient quantity of appropriately challenging, contextualized practice with aligned feedback, for instance. That is, if we actually want to achieve an outcome.

Yet, too often, we don’t see this. We see, instead, information presentation. Sometimes even with a knowledge test! Yet, such an effort is unlikely to lead to any meaningful change. That is, the investment’s wasted!

Worse, too often we see this being done with fancy new tools. Sure, I get as attracted to shiny new objects as anyone. However, I want to understand their core affordances for learning. Anyone had the dubious pleasure of attending a slide presentation in a virtual world? Or maybe being presented with animated presentations of lots of facts? The new tools may have a short-term effect of novelty, but that’s it. The fundamental aspects of how our brains learn are what’s going to make, or break, a learning investment.

On the other hand, if we start with getting the learning right, first, then there may be additional value coming from the tech. Adaptivity, on top of quality learning design, can accelerate the outcomes.  Immersion, at the right time and place, is better than not. Language models, properly used, can have big impacts. However, it comes from knowing the specific capabilities, and matching them to the need.

While I haven’t done the ‘back of the envelope’ calculation (I’m not a financial whiz), I can state with a fair degree of comfort that you’re better off doing simple learning with good design. Bad design with shiny tech is still bad design! You’ll more likely have an impact putting your investment into learning quality than using fancy tech to deliver dreck. Of course, once you’ve done that, the investment in tech can do a lot more!

I’m not against new tech, heck I’ve written on games, mobile, and more! What I’m against is new tech in lieu of good design. And I’m even more enamored of good tech on top of good design.  So, get the basics right first, then add in the shiny objects. That way you’re going to have a good return on your $$, and that’s a good thing. Right?

PS, speaking of basics, we’ll be running a debate tomorrow (11 Oct) discussing the Learning Experience Design (LXD) label. I’m sure we’ll unpack critical issues. Check it out. 

Grounded in practice

16 May 2023 by Clark Leave a Comment

Many years ago, I was accused of not knowing the realities of learning design. It’s true that I’ve been in many ways a theorist, following what research tells us, and having been an academic. I also have designed solutions, designed design processes, and advised orgs. Still, it’s nice to be grounded in practice, and I’ve had the opportunity of late.

So, as you read this, I’m in India (hopefully ;), working with Upside Learning. I joined them around 6 months ago to serve as their Chief Learning Strategist (on top of my work as Quinnovation, as co-director of the Learning Development Accelerator, and as advisor to Elevator9). They have a willingness to pay serious attention to learning science, which as you might imagine, I found attractive!

It’s been a lot of marketing: writing position papers and such. The good news is it’s also been about practice. For one, I’ve been running workshops for their team (such as the Missing LXD workshop with the LDA coming up in Asia-friendly times this summer). We’ve also created some demos (coming soon to a sales preso near you ;). I’ve also learned a bit about their clients and usual expectations.

It’s the latter that’s inspiring. How do we bake learning science into a practical process that clients can comprehend? We’re working on it. So far, it seems like it’s a mix of awareness, policy, and tools. That is, the design team must understand the principles in practice, there need to be policy adjustments to support the necessary steps, and the tools should support the practice. I’m hoping we have a chance to put some serious work into these in my visit.

Still, it’s already been eye-opening to see the realities organizations face in their L&D roles. It only inspires me more to fight for the changes in L&D that can address this. We have lots to offer orgs, but only if we move out of our comfort zone and start making changes. Here’s to the revolution L&D needs to have!


Misleading Malarkey

25 April 2023 by Clark 2 Comments

Recently, I saw a claim that was, well, a tad extreme. Worse, I think it was wrong, and possibly harmful. Thus, I feel it’s right to address it, to avoid misleading malarkey.

So, here’s the claim that riled me up:

Short-form edutainment is the most effective teaching method for both children and adults. TikTok and YouTube shorts will ultimately replace high schools and universities. Employment sector will phase out LMS systems and replaced with AI-powered compliance tools. If you are considering instructional design as a career, you may want to become a YouTuber or TikToker instead.

If you’ve tuned in at all, you’ll know that I’m a fan of engagement, properly construed.  Heck, it’s the topic of my most recent book! So, talking about the value of engagement in learning is all to the good. However…

…this claim goes over the top. Most notably, there’s the claim that edutainment is the most effective teaching method. If only! That puts me off, because teaching should yield a learning outcome, and just watching video shorts won’t do that (under most circumstances). Not surprisingly, I asked for research.

The author pointed to a study where mice genetically low on dopamine learned better when given dopamine. Yes, but the study had the mice do more than just watch videos; they performed tasks! I tried to go deeper, saying that engagement may be desirable, but it's not sufficient. Without practice, watching entertaining and informative material (i.e., edutainment) isn't a path to learning outcomes.

The conversation was derailed by my comment that edutainment had gotten a bad name from games. In the 80s, in an industry I was in, this was the case! I was accused of having a 'gamification' mindset! (Ahem.) I tried steering the conversation back to the point: it's not about gamification, it's about engagement combined with practice.

Interestingly, there was an almost parallel conversation about how engagement wasn’t the same as learning (which I pointed to in the exchange). The general take is that engagement is desirable but insufficient. Yes! Yet here we see the claim that engagement is all we need!

I believe in engagement for learning. I just don’t believe that by itself it will lead to learning. Learning science supports both the value of engagement, and the necessity of practice and feedback. That’s all. But claims like the above are misleading malarkey. It may be we’re talking an outrageous marketing claim (infamy is better than not being known at all?), but when it misleads, it’s a problem. Am I missing something?

Missing LXD Workshop

20 April 2023 by Clark Leave a Comment

We interrupt your regularly scheduled reading for this commercial announcement:

What is Learning Experience Design (LXD)? Further, why should you care? Finally (and arguably most important), what does it mean you should do differently? Those, to me, are important questions. My short answer is that LXD is the elegant integration of learning science and engagement. Which, to me, implies some important nuances on top of what's traditionally done in instructional design (ID). How to address it? There's actually quite a lot in LXD, but there's also a lot of overlap with traditional ID practices and processes. I reckon the easiest (and best) way to address it is to talk about the delta; that is, what's different between the two. So, in my role for Upside Learning, I developed a missing LXD workshop. We ran it internally to good outcomes, and now, you can take it!

I believe that the difference starts with objectives; you can't make a meaningful experience if you don't have learners acquiring relevant new skills (not just an information dump). From there, there are nuances on designing individual practice activities, and then aggregating them into practice (that is, putting practice activities together). Moving on, we look at the content elements of models and examples, and then the emotional aspects of learning. The workshop closes by looking at a design process that accommodates these. Recognizing that folks don't want to throw out their whole process and start anew, it works from a generic model.

In the workshop, I cover each of those topics in a week, so it's a six-week experience. In between, I ask attendees to do some interim processing, both to cement their understanding and to change their practices. Each week we'll cover underlying concepts, see examples of what we're talking about, actively process the information, and do a major application task.

To make this available more broadly, Upside’s partnered with the Learning Development Accelerator (LDA) to deliver it. Full disclosure: I’m co-director of the LDA, and Chief Learning Strategist for Upside Learning (in addition to my ongoing role for Quinnovation). (So, it’s all about me! :) Seriously, I think this puts together the tools I believe are necessary to lift our industry.

To be clear, since the advance notice timeframe puts this in summer, we’re offering it in Asia time-frames first (tho’ anyone is welcome!):

Australian Eastern Standard Time: July 7, 14, 21, 28, August 4 and 11 from 12h00 to 14h00 each day
Singapore Time: July 7, 14, 21, 28, August 4 and 11 from 10h00 to 12h00 each day
India Standard Time: July 7, 14, 21, 28, August 4 and 11 from 07h30 to 09h30 each day
New York Time: July 6, 13, 20, 27, August 3 and 10 from 22h00 to 24h00 each day

We’re offering it for US$100 to LDA members, and US$350 to non-members (for only $40 more, you get the full LDA offerings as well).

We’re planning to offer the missing LXD workshop again at a later date at East Coast/Europe friendly times (probably at a steeper price, we’ll have worked the bugs out ;). You can find out more at the LDA site. It’s got learning science and engagement bundled up into a coherent whole, for those who’ve already been doing ID and want to lift their game. I hope you’ll find it worth your while.

We now return you to your regularly scheduled reading until next week at the usual time.

Time is the biggest problem?

21 March 2023 by Clark 1 Comment

In conversations, I’ve begun to suspect that one of the biggest, if not the biggest, problem facing designers wishing to do truly good, deep, design, is client expectations. That is, a belief that if we’re provided with the appropriate information, we can crank out a solution. Why, don’t you just distribute the information across the screen and add a quiz? While there are myriad problems, such as lack of knowledge of how learning works, etc, folks seem to think you can turn around a course in two weeks. Thus, I’m led to ponder if time is the biggest problem.

In the early days of educational technology, it was considered technically difficult. Thus, teams worked on instantiations: instructional designers, media experts, technologists. Moreover, they tested, refined, and retested. Over time, the tools got better. You still had teams, but things could go faster. You could create a draft solution pretty quickly, with rapid tools. However, when people saw the solutions, they were satisfied: it looks like content and quizzes, which is what school is, and that's learning, right? Without understanding the nuances, it's hard to tell well-produced learning from well-designed and well-produced learning. Iteration and testing fell away.

Now, folks believe that with a rapid tool and content, you can churn out learning by turning the handle. Put content into the hopper, and out come courses. This was desirable from a cost-efficiency standpoint. This gets worse when we fail to measure impact. If we're just asking people whether they like it, we don't really know if it's working. There's no basis to iterate! (BTW, the correlation between learner assessments of learning quality and the actual quality is essentially zero.)

For the record, an information dump and knowledge test is highly unlikely to lead to any significant change in behavior (which is what learning is supposed to accomplish). What we need is meaningful practice, and getting that right requires a first draft and fine-tuning. We know this, and yet we struggle to find the time and resources to do it, because of expectations.

These expectations of speed, and unrealistic beliefs in quality, create a barrier to actually achieving meaningful outcomes. If folks aren’t willing to pay for the time and effort to do it right, and they’re not looking at outcomes, they will continue to believe that what they’re spending isn’t a waste.

I’ve argued before that what might make the biggest impact is measurement. That is, we should be looking to address some measurable problem in the org and not stop until we have addressed it. With that, it becomes easier to show that the quick solutions aren’t having the needed impact. We need evidence to support making the change, but I reckon we also need to raise awareness. If we want to change perception, and the situation, we need to ensure that others know time is the biggest problem. Do you agree?
