Two months ago, I wrote about the L&D Conference we were designing. In all fairness, I reckon I should report on how it went, now that it’s finished. There are some definite lessons we hope to carry forward, both for the conference (should we run it again, which we intend to) and for the activities of the Learning & Development Accelerator (LDA; the sponsoring org, of which I’m co-director with Matt Richter). So here are some thoughts on the conference outcomes.
Our design was to have two tracks (basic and advanced) and a limited but world-class faculty to cover the topics. We were also looking not just to replicate what you get at typical face-to-face conferences (which we like as well), but to do something unique to the medium and our audience. Thus, we weren’t just doing one-off sessions on a topic. Instead, each was an extended experience, with several sessions spread out over days or weeks.
That design seemed to work well. While not everybody who attended one of the sessions on a topic attended all of them, there was good continuity. And the feedback has been quite good; folks appreciate the deep dive with a knowledgeable and articulate expert. This, we figure, is an important result, and one we’re proud of. If someone misses a session, they can always review the video (we’re keeping the contents available for the rest of the year).
Our social events, networking and trivia, didn’t do quite so well. The networking night drew a small attendance, but the trivia night didn’t reach critical mass. We attribute this at least partly to the events being a later addition, and not being promoted from the get-go.
We struggled a bit with scheduling. First, we spread the conference across dates when different countries switch to or from daylight saving time. The platform we used didn’t manage that elegantly, and we owe a lot to a staffer who wrestled it into submission. Still, it led to some problems with folks connecting at the right time. On the other hand, having the courses spread out meant sessions didn’t collide, so you could attend any sessions you wanted (the tracks were indicative, not prescriptive).
The platform also had one place to schedule events, but it was a web page. As a faculty member opined, they wished they could’ve loaded all the sessions into their calendar with one click. I resonate with that, because in moments when I might’ve had spare bandwidth to attend a session, I’m more likely to look at my calendar than at the event page. Not sure there’s an easy solution, of course. Still, folks were able to find and attend sessions.
We also didn’t get the social interaction between sessions we’d hoped for, though there was great interaction during the sessions. Faculty and participants were consistent in that perspective. There was a lot of valuable sharing of experiences, questions, and advice.
One thing I realize, post hoc, is that it really helps to unpack the thinking. The faculty we chose are those who’ve demonstrated an ability to help folks see the underlying thinking. That paid off well! However, we realize that there may be more opportunities. An interesting discussion arose in a closing event about the value of debates, where two folks who generally agree on the science find something to diverge on. Everyone (including the debaters) benefits from that.
We’re going to be looking at how to do more of that unpacking, and how to share the ability to do the necessary critical thinking around claims in our industry. The LDA focuses on evidence-based approaches to L&D. That requires a bit more effort than just accepting the status quo (and the associated myths, snake oil, etc.), but it’s worth it for our professional reputation.
So those are my reflections on the L&D Conference outcomes. Any thoughts on this, from attendees or others?