Learnlets


Clark Quinn’s Learnings about Learning

Search Results for: design

To design is human

5 September 2023 by Clark Leave a Comment

I maintain a fascination with design, for several reasons. As Herb Simon famously said: “The proper study of mankind is the science of design.” My take is to twist the title of Henry Petroski’s book, To Engineer is Human, into ‘to design is human’. To me, design is both a fascinating study in cognition and an area of application. The latter seems to be flourishing!

I’ve talked in the past about various design processes (and design overall, a lot). We’ve moved away from waterfall models like the original ADDIE and shifted to more iterative approaches. So, I’ve mentioned Michael Allen’s SAM, Megan Torrance’s LLAMA, etc.

And I’ve been hit with a few more! Just in the past few days I’ve seen LeaPS and EnABLE. They’re increasingly aware of important issues in learning science. All of this is, to me, good. Whether they’re just learning design approaches, or more performance consulting (that is, starting with a premise that a course may not be the answer), it’s good to think consciously about design.

My interest in design came in a roundabout way. As an undergrad, I designed my own major in Computer-Based Education, and then got a job designing and programming educational computer games. What that didn’t do was teach me much about design as a practice. However, going back to grad school (for several reasons, including knowing that we didn’t have a good enough foundation for those game designs) got me steeped in cognition and design. Of course, what emerges is that they link at the wrists and ankles.

So, my lab was studying designing interfaces. This included understanding how we think, so as to design to match. My twist was to also design for how we learn. However, more implicitly than explicitly perhaps, was also the topic of how to design. Just as we have cognitive limitations as users, we have limitations as designers. Thus, we need to design our design processes, so as to minimize the errors our cognitive architecture will introduce.

Ultimately, what separates us from other creatures is our ability to create solutions to problems, to design. I know there’s now generative AI, but…it’s built on the average. I still think the superlative will come from people. Knowing when and how is important. Design is really what we want people to do, so it’s increasingly the focus of our learning designs. And it’s the process we use to create those solutions. Underpinning both is how we think, work, and learn.

To design is human, and so we need to understand humans to design optimally. Both for the process, and the product. This, I think, makes the case that we do need to understand our cognitive architecture in most everything we do. What do you think?

FWIW, I’ll be talking about the science of learning at DevLearn. Hope to see you there. 

Designing a conference

22 September 2022 by Clark Leave a Comment

When I agreed to join as co-director of the Learning & Development Accelerator, I’d already attended their first two conferences. Those had been designed to reflect the circumstances at the time, e.g. the pandemic. In addition, there was a desire on the part of Matt Richter & Will Thalheimer (the original directors) to reflect certain values. Matt and I are running the event again, but times have changed. That means we have to rethink what’s being done. So here’s my thinking about designing a conference.

First, the values Matt and Will started with included being as global as possible, and being virtual. The former was reflected in having presentations given twice, once early in the US day, and then again later. That supported everything from Europe, Africa, and the Mideast to Asia and Australia. The virtual was, at least partly, a reaction to the lack of desire to travel and meet face to face, but also to provide options for those who might struggle.

We’re definitely still focusing on being virtual. Folks who would find it challenging to arrange travel for whatever reason can attend this event. There’s also the environmental considerations. Yes, technology requires resources, but not as much as collective travel. While there’s also a desire to meet different time needs, we’ve found less demand for multiple times. However, we will be recording sessions that are synchronous, so they can be viewed at convenient times. We also are spreading it over six weeks, so that there’s time to consume as much as you want. Further, faculty can choose when they’re offering ;).

The original design was focused on evidence-based L&D (which remains a key guiding principle for the LDA). Matt & Will solicited their presenters based upon their representation, but the agenda was largely what those folks wanted to present. Which, in many ways, reflects what other conferences do. In this new era, we wondered what would make a compelling proposition when you can travel to F2F events. We decided that we wanted to step away from ‘what we get’, and focus on ‘what the audience needs’.

This event, then, has a curriculum, across two tracks, designed to address specific needs. There’s also a different pedagogy than at most conferences. We also have specific faculty, rather than presenters chosen from submissions. Of course, there are tradeoffs. At least we can share our thinking.

The faculty are folks we know and trust to present evidence-based content. You won’t hear promotion for snake oil, like learning styles. We have a pretty impressive lineup, frankly, of people we think are world-class. This includes folks like Ruth Clark, Mirjam Neelen & Paul Kirschner, Karl Kapp, Julie Dirksen, Kat Koppett, Stella Lee, Nigel Paine, Will Thalheimer, and Thiagi. On top of, of course, Matt and myself. Reality means that a few folks we would’ve liked to have couldn’t commit, but this is a broad and reputable group.

The tracks are basics and advanced. We want to be able to serve multiple audiences. The intent is that the basic track has the core knowledge an L&D person should know. As best we can, as we negotiate with the faculty, of course. Then, the advanced topics are things that are emergent and need addressing. Of course, there’s no commitment that you have to stay in one or another. As with other conferences, you can pick and choose what to view.

We’re also not just having presentations; we’ve asked the faculty to provide development. That is, we’re intending several rounds of content, activity, and feedback, spread out over several days or weeks. We don’t want people to hear good ideas, and maybe take them back. We want folks to take action! We’re also designing in the opportunity for mentoring.

Of course, there’ll be some social events, and other ways to not only hear content and apply it, but to mingle with faculty and other attendees. We want to foster some community. Also, we’re intending to somewhat front load stuff so that we can adapt. If we hear that we need to do something we haven’t planned, we’re looking to have leeway to address it. The nice thing about being small is the ability to be flexible!

None of this is saying you don’t get much of the same from other conferences (except, perhaps, the design). I’ve been on conference program committees, and know conference organizers as well. They typically get more proposals than they can accept, so they can choose a suite that reflects various ranges of experience and covers the important topics. They may not, however, know all the submitters, and so take chances on a few. I laud that, actually, because we can’t know if a new approach or person is worthwhile without experimentation. Still, there is the chance for gaps, and for bad presentations/presenters. They’re also, except for the pre-conference workshops (e.g. my Make It Meaningful one at the upcoming DevLearn), one-off events.

We’re taking a chance on our format, too. We haven’t done it before. It may not work, though we have good reasons to believe it will. So, we hope to see you at the Learning & Development Conference, Oct 10 – Nov 18, if the above thinking about designing a conference makes sense. We think it does; we hope you do, too.

LXD by Design

28 June 2022 by Clark Leave a Comment

Learning Experience Design (LXD), I argue, is the elegant integration of learning science with engagement. All well and good, of course, but how do you introduce it? Specifically, how do we engage people already actively designing? There are a number of ways to cut it. You could talk about the cognitive underpinnings, the implications for the elements of learning, or via the changes in the design process. I do the latter two, with a focus on the engagement side (which I feel is underdeveloped), in my latest book, “Make It Meaningful”. However, what if you’re trying to do both? Here’s a case to visit LXD by design.

It seems pretty safe to say that most people will resist totally throwing out their entire design process. There’s lots of investment. Further, most design processes have a useful basic structure. This suggests looking for the smallest tweaks that will yield the biggest impacts. We’re looking to incorporate the effectiveness of learning science with the emotional appropriateness of engagement. What does this require?

The first change is in the analysis. LXD simply  can’t work without performance objectives. If you’re just trying to make people aware, you’re not really on a transformative journey. You want to be focusing on equipping people so that they’re (meaningfully) changed through the process. You also need some new information: why this is necessary  for the learners, and why experts find it interesting enough to study. There’s more, but this is key.

Then, your design process differs. You are being creative,  given that you’re not just directly practicing. You’re also tuning to get the experience optimized. So, you need to build in some brainstorming, and iteration. In pragmatic ways, of course.

Implementation is also more iterative. You’ll be investing slowly, to allow pivots and to keep the overall costs contained. Postponing programming and preferring paper are components of this.

Even your evaluation is different. You are testing, now, not only effectiveness, but also the experience. Which you may have been doing (*cough* smile sheets *cough*), but you need to test both sequentially, not just one  or the other.

All along, there are small changes that will help integrate learning science elegantly with engagement. Making those critical changes will likely take a bit longer, at least at first. On the other hand, you should be getting real outcomes  and more engaged learners. Which, ultimately, is what we should be doing.

I’ll be covering this in a workshop for the Learning Guild in two half-day sessions prior to their LXD conference at the beginning of August. (Also doing a session during the conf on emotion.) I hope you’ll find LXD by Design to be a practical and useful, even  transformative, experience.

Meta-ethics of learning design

28 September 2021 by Clark 1 Comment

I’ve addressed ethics elsewhere, but I’m looking at it a different way now. I’m thinking from the perspective of situated cognition, and recognizing that there are certain things we can do. For better or worse. Further, these choices have ramifications beyond the initial impact. I think we need to be aware of the possibilities, and then consider the meta-ethics of learning design.

My particular concern is sparked by the notion of how contextually sensitive we are. To make this concrete, think of the research by Beth Loftus. I’ll characterize a whole suite of research with a simply paraphrased experiment. So, she had folks watch a video of an accident. Then, she prompted recall of the amount of damage they witnessed. For one group, she just asked them. For another group, the recall was prompted by “Besides the broken glass,…”. The latter group recalled worse damage. And. There. Was. No. Broken. Glass!

The point here is that context can influence our thinking and memory. Which is what I worry about with videos. They can present a statement as if it’s fact, and then continue on with that as received wisdom. It’s a classic cognitive approach, making a statement as if it’s assumed. The bad part is that there’s a narrative flow, and it’s hard to stop and reflect. Versus, say, reading.

On the other hand, of course, we can build in reflection time. Sure, learners can use the pause button, but there are times they may not. For instance, when the learner is following confirmation bias, that is, looking only at things that align with what they already believe.

We have the option to use coercive techniques for good, of course. However, we can also choose to use legitimate presentation techniques. I believe we should. Part of the development of metacognition is to see the pedagogy being used and internalizing it. If our pedagogy is visible (as it should be), it needs to be scrutable if we want our learners to adopt appropriate skills.  

Our designs, and our meta-designs, need to be ethically designed, both to effectively achieve our ends, and to develop our learners. We need to support the meta-ethics of learning design, as well as the ethics themselves.  

Complexity in Learning Design

21 September 2021 by Clark Leave a Comment

I recently mentioned that one of the problems with research is that things are more interconnected than we think. This is particularly true with cognitive research. While we can make distinctions that simplify things in useful ways (e.g. the human information processing system model*), the underlying picture is of a more interactive system. Which underpins why it makes sense to talk about Learning Experience Design (LXD) and not just instructional design. We need to accommodate complexity in learning design. (* Which I talk about in Chapter 2 of my learning science book, and in my workshops on the same topic through the Allen Academy.)

We’re recognizing that our cognition is more than just in our head. Marcia Conner, in her book Learn More Now, mentioned how neuropeptides pass information around the body. Similarly, Annie Murphy Paul’s The Extended Mind talks about moving cognition (and learning) into the world. In my Make It Meaningful workshops (online or F2F at DevLearn 19 Oct), I focus on how to address the emotional component of learning. In short, learning is about more than just information dump and knowledge test.

Scientifically, we’re finding there are lots of complex interactions between the current context, our prior experience, and our cognitive architecture. We’re much more ‘situated’ in the moment than the rational beings we want to believe. Behavioral economics and Daniel Kahneman’s research have made this abundantly clear. We try to avoid the hard mental work using shortcuts that work sometimes, but not others. (Understanding when is an important component of this).

We get good traction from learning science and instructional design approaches, for sure. There are good prescriptions (that we often ignore, for reasons above) about what to do and how. So, we should follow them. However, we need more. Which is why I tout LXD  Strategy! We need to account for complexity in learning design approaches.

For one, our design processes need to be iterative. We’ll make our best first guess, but it won’t be right, and we’ll need to tune. The incorporation of agile approaches, whether SAM or LLAMA or even just iterative ADDIE, reflects this. We need to evaluate and refine our designs to match the fact that our audience is more complex than we thought.

Our design also needs to address the emotional experience as well as the cognitive experience. We want our design processes to systematically incorporate humor, safety, motivation, and more. Have we tuned the challenge enough, and how will we know? Have we appropriately incorporated story? Are our graphics aligned, or adding to cognitive load? There are lots of elements that factor in.

Our design process has to accommodate SMEs who literally can’t access what they do. Also learner interests, not just knowledge. We need to know what interim deliverables to produce, what processes to use for evaluation, when we shouldn’t be working solo, and which tools we need. Most importantly, we have to do this in a practical way, under real-world resource constraints.

Which is why we need to address this strategically. Too many design processes are carry-overs from industrial approaches: one person, one tool, and a waterfall process. We need to do better. There’s complexity in learning design, both on the part of our learners and of ourselves as designers. Leveraging what we know about cognitive science can provide us with structures and approaches that accommodate these factors. That’s only true, however, if we are aware of it and actively address it. I’m happy to help, but can only do so if you reach out. (You know how to find me. ;) Here’s to effective and engaging learning!

More lessons from bad design

24 August 2021 by Clark 2 Comments

I probably seem like a crank, given the way I take things apart. Yet, I maintain there’s a reason beyond “get off my lawn!” I point out flaws not to complain, but instead to point to how to do it better. (At least, that’s my story and I’m sticking to it. ;) Here’s another example, providing more lessons from bad design.

In this case, I’ll be attending a conference and the providers have developed an application to support attendees. In general, I look forward to these applications. They provide ways to see who’s attending, and peruse sessions to set your calendar. There are also ways to connect to people. However, two major flaws undermine this particular instance.

The first issue is speed. This application is slow! I timed it: 4 seconds to open the list of speakers or attendees. Similarly, clicking on a letter to jump through the list of attendees took anywhere from 4 to 8 seconds. Jumping to the program took 6 seconds.

While that may seem short, compare that to most response times in apps. You essentially can’t time them, they’re so fast. More than a second is an era in mobile responsiveness. I suspect that this app is written as a ‘wrapped’ website, not a dedicated app. Which works sometimes, but not when the database is too big to be responsive. Or it could just be bad coding. Regardless, this is  basically unusable. So test the responsiveness before it’s distributed to make sure it’s acceptable. (And then reengineer it when it isn’t.)

That alone would be sufficient to discount this app, but there’s a second problem. Presumably for revenue reasons, there are ads that scroll across the top. Which might make sense to keep the costs of the app down, but it conflicts with a fundamental property of our visual architecture.

Motion in the periphery of our vision is distracting. That was evolutionarily adaptive, allowing us to detect threats from places that we weren’t focusing on. Yet, when it’s not a threat, and we  are trying to focus on something, it interferes. We learned about this in the days of web pages with animated gifs: you couldn’t process what you were there to consume!

In this app, the scrolling of the ads makes it more difficult to read the schedule, attendee lists, and other information. Thus, the whole purpose of the application is undermined. You could instead have static ads that are randomly attached to the pages you click on. The audience is likely to go to several pages, so all the ads would still get seen. Having them move to ensure that you see them all, however, defeats the very app they’re meant to support.

Oddly enough, there are other usability problems here. On the schedule, there’s a quick jump to times on a particular day. Though it stops at 2PM!?!? (The conference extends beyond that; my session’s at 4PM.) You’d think you could swipe to see later times on that ‘jump’ menu, but that doesn’t work. I can’t explore farther, because the usability makes it too painful; no doubt there are more lessons from bad design that I’m missing.

Our cognitive architecture is powerful, but has limitations. Designing to work in alignment with our brains is a clear win; and this holds true for designing for learning as well as performance support. Heck, I’ve written a whole book  about how our minds work, just to support our ability to design better learning! Conflicting with our mental mechanisms is just bad design. My goal is that with more lessons in bad design, we can learn to do better. Here’s to good design!

Levels of LXD Design

6 April 2021 by Clark Leave a Comment

I stumbled across the Elements of UX diagram again, and happened to wonder if it would map to LXD. Here’s my stab:

And the text, as usual.


In a justifiably well-known image (PDF), Jesse James Garrett (JJG) detailed the elements of (web) user experience. I’ve been involved in the parallel development of UX and ID (and cross-fertilized them), so I wondered what the LXD version would be. So, of course, I took a stab at levels of LXD design.

To start with, JJG’s diagram works from the bottom up. The five levels, in order, are:

  1. The original objectives and user needs.
  2. That leads to content requirements and/or functional specifications.  
  3. The next level is an information architecture or interface design that is structured to meet those needs.  
  4. Those semantic structures are then rendered as an information design with navigation or interface design.
  5. The top level is the visual design, what the user actually sees or experiences.

This systematic breakdown has been well recognized as a useful development framework. The development from need to semantics to implementation syntax suggests a logical development flow. As an aside, no one’s claiming we should develop in a linear manner, and there tends to be more up and down action in actual practice. Drilling down and then working from the bottom up as well is a well-known cycle of design!

The learning equivalent, then, should similarly have a structured flow. We want to go from our needs, through various levels of representation, until we reach the learner experience.  

Given that we should be driven not by the goals for the interface but by learner needs, I’ll suggest we start with the performance objectives. Then, in parallel with user needs, I’ll stipulate that the other top-level definition comes from the user characteristics. These match the initial level stipulated.

At the next level, I’ll suggest that the performance objectives drive assessment specifications, and the other decision at this level is the pedagogical approach. We need to know what learners need to be able to do, and how we’ll get them there.

As an intermediate representation equivalent to UX’s information architecture or interface design, I suggest that from the assessment we determine the necessary practice activities, and these are coupled with the content requirements: models and examples, as well as the introduction and closing. Here we’re still at what’s required, not how it manifests.

The next level is where we start getting concrete. We need to pick an overall theme or look and feel, and the flow of the experience. We’ll also, of course, need to make a consistent interface to support navigation and taking action. We know what we need to have, but we haven’t actually rendered it yet.

Finally, we must render the necessary media. This will be the videos, audios, text, diagrams, images, and more that comprise the experience. This includes the actions to be taken and the associated consequences of each choice.  

That’s the equivalent structure I’m suggesting for the different levels of LXD design. Of course, this is a thought exercise, and so I may well have made some interpretations you could disagree with. For instance, I may have slavishly followed JJG’s levels too closely. Let me know! Also, it’s not clear whether this is a useful representation; so far it’s sort of a ‘because it’s there’ effort ;). You can let me know your thoughts on that, too!

Performance Support and Bad Design

30 March 2021 by Clark Leave a Comment

Here’s a story about where performance support would’ve made a task much easier.

And, as always, the text.


The other day, I had a classic need for performance support. Of course, it didn’t exist. So here’s a cognitive story about when and where a job aid would help.

Our Bosch dishwasher stopped near the beginning of the cycle, and displayed an icon of a water tap. The goal was to get the dishwasher running again. What with the layer of undrained water, we figured there was some sort of problem with the drain: clogged, or the pump broken. M’lady had cleaned the drain, but the icon persisted. What now? Of course we could call a service person, but trying to be handy and frugal (and safe), we wanted to find out if it was something I could deal with. So, off to the manual.

Well, in this case, since I didn’t know where the manual was, I went online. I accessed the site and downloaded the manual. Only to find no guide to what the icons mean. What?!? This violates what we know about our brains, in this case that our memory is limited. The support section of the site did list the error codes, but numerically, not by icon. So, I had an indication I couldn’t map to a problem, let alone a solution.

This is a real flaw! If you’re gonna use icons, provide a guide! Don’t assume they’re interpretable. (This had happened once before with this same appliance, with an impenetrable icon and no clue.) As a result, I had to call the service line. That wait took a while (with more people staying home, they’re using their dishwashers more, and the appliances are therefore breaking down more). Once, the call dropped. The second time, I had to stop because I had an upcoming call. The third time, however, I got through.

And a perfectly nice person listened, asked some questions, and then instructed me through a process. After hitting cancel (which automatically tries to drain everything and reset to zero) by simultaneously pressing two buttons linked by a line on the control panel, I heard noises in the sink like it was draining. After a minute, I was told to go ahead and open it up (yep, drained), turn it off and on, and then try running the cleaning cycle again. And, voila, it worked! (Yay!)

So, what’s wrong with this picture? First of all, there should be a clear explanation of what the icon means, as indicated above. Second, it should be clearly tied to a process to address the problem, including intermediate steps. This is so common, I am quite boggled that the great engineers who made our (very good) dishwasher aren’t complemented by a great technical communications team who write up a useful manual to support it. It. Is. Just. Silly!

Note: this isn’t a learning experience. It’s just fine that I don’t recall what last time’s icon was or what it meant, or, for that matter, what this icon means and what I should do about it. These events should be infrequent enough that it’d be unreasonable for me to have to recall. Instead, I should be able to look it up. Put information in the world! In the long term, this should save them buckets of money, because most people could self-help. Clearly, they’ve gone to numeric codes, but they could’ve just added in the associated icons, or given a mapping from icon to numeric code. Something to help folks who only have the pictures.

This is just bad design, and it’s so obvious how to ameliorate it. People will self-help many times, but only if they can! Just as you shouldn’t be creating a training course when a job aid will do, you can save a help call when a job aid can address most of the problems. Use performance support when it makes sense, and doing so comes from understanding how we actually think, work, and learn. When you do, you can design solutions that meet real needs. And that’s what we want to do, no?

Mythless Learning Design

28 July 2020 by Clark 1 Comment

If I’m going to rail against myths in learning, it makes sense to be clear about what learning design without  myths looks like. Let me lay out a little of what mythless learning design is, or should be.

Learning with myths manifests in many ways. Redundant development to accommodate learning styles, or generations. Content shortened to be appropriate for millennials or the attention span of a goldfish. Using video and images for everything because we supposedly process images 60,000 times faster. Quiz show templates for knowledge test questions because they’re more engaging. And all of these would be wrong.

Instead, mythless design starts with focusing on performance. That is, there are clear learning outcomes that will change what people do in ways that affect the success of the organization. It’s not about knowledge itself; knowledge matters only in service of achieving a better ability to make decisions.

Then, it’s about designing meaningful practice in making those decisions. It’s not about testing knowledge, but ability to apply that knowledge to choose between alternative courses of action. It can be mini-scenarios (better multiple choice), branching, or sims, but it’s about ‘do’, not  know.

We reinforce practice with content that guides performance and provides feedback. It does use multiple media, because we use the right media for the message. Yes, we look to engage multiple senses, but for comprehending and encoding information. And variety. We use visuals to tap into our powerful visual processing system, not because they have any particular metric improvement. We also use audio when appropriate. And while text is visual, we use it as appropriate too. To address learning outcomes, not learner preferences.

Mythless learning design may use small amounts of content, but because minimalism keeps cognitive load in check, not because our attention span has changed. We need appropriate chunking, as our working memory is limited, so we want to make things as small as possible, but no smaller!

We design meaningful active practice not because any generation needs it, but because it’s better aligned with how our brains learn at pretty much any age. There are developmental differences in working memory capacity and experience base, but  everyone benefits from doing things, not passively consuming content.

There are good bases for design. Ones that lead to real outcomes. Starting from a performance focus, and reflecting what’s been demonstrated in learning science research, and tested and refined. Evidence guiding design, not myths.

There are also bad bases for design. Dale’s Cone, shiny object syndrome, the list goes on. Gilded bad design is still bad design. Get the core right. Let’s practice good, mythless learning design. Please.

 

Experimenting with conference design

13 May 2020 by Clark 2 Comments

As part of coping in this time of upheaval, I’m trying different things. Which isn’t new, but there seem to be more innovations to tap into. In addition to teaching a course on mobile learning, I’m one of the speakers at a new online event. And, what’s nice, is that they’re experimenting with conference design, not just moving straight online.

To be fair, the Learning Guild has had a continual practice of trying different things at their conferences, and it’s been good. And so, too, was the most recent TK by ATD. But this is different. Two of my colleagues, Will Thalheimer and Matt Richter, organized it as a response to our ‘new normal’. And their stated goal is changing the way we conference.

The key, of course, is to leverage what’s different, and possible, online. It’s running from June 22 – July 31. That’s not a typo; it’s all of July and the tail end of June. That’s a long time! They’ve recruited a suite of experts from around the world (they’re really trying to do this across boundaries, including time and geography). And, to let you know, I’m one (so take my comments with the appropriate caveats ;).

They’re also tossing out traditional ideas and are open to new ones. Speakers are expected to build an experience that’s spread out over the time. Yet also designed so that you can come in late, or early, and drill into what you want when you want. They’re also planning on having synchronous events – debates, panels, socializing – again using technology.

Note that it’s not free. There are some free conferences being put on, mostly webinars. And those are good. This is different. It’s deeper. It’s a stab at looking afresh. And I’m not sure it could even have come from any existing framework.

And, we won’t know if it all will work. We’re designing this in the time between now and launch. There’re bound to be hiccups. Which, of course, means there’re bound to be learnings. I know I want to talk about Learning Science 101. And something else. Lots I could do (I welcome suggestions). I’m inclined to think it might be Emotion and Learning. But it could also be LXD. (They’re all linked, of course.)

But it’s a high quality group (er, mostly…they did let me in). AND, importantly, it’s focused on evidence-based content. There may be sponsors, or even an exhibit hall, but every presenter is honor-bound not to push anything that’s not legit. Most importantly, there’s enough quality that overall it’s bound to be worth it.

I’m excited, frankly. I have to come up with some different ideas. And I like that. I’m glad that they’re experimenting with conference design. We all win, regardless! It’s part of learning, challenging yourself. So, do yourself a favor. Check it out. It may not be for you, but keep an open mind!

 

 
