Apparently, an acquaintance challenged my colleague Harold Jarche’s Personal Knowledge Mastery (PKM) model. He seemed to consider the possibility that it’s a fad. Well, I do argue people should be cautious about claims. So, while I’ve talked about PKM before, I want to elaborate. Here’s my take on the case for PKM.
As context, I think meta-learning, learning to learn, is an important suite of skills to master. As things change faster, with more uncertainty and ambiguity, the ability to continually learn will be a critical differentiator. And you can’t take these skills for granted; they’re not necessarily optimal, and our education systems generally aren’t doing a good job of developing them. (Most school practices are antithetical to self-learning!)
Information is key. To learn, you need access to it, and the chance to apply it. Learning on one’s own is about recognizing a knowledge gap, looking for relevant information, applying what you find to see if it works, and, once it does, consolidating the learning.
Looking at how you deal with information – how you acquire it, how you process it, and how you share your learnings – is an opportunity to reflect. Think of it as double-loop learning, applying your learning to your own learning. We’re often not so meta-reflective, yet that ends up being a critical component of improving.
Having a framework to scaffold this reflection is a great support for improving. Then the question becomes: what is the right, or best, support? There are lots of people who talk about bits and pieces, but what Harold’s done is synthesize them into a coherent whole (not a ‘mashup’). PKM integrates different frameworks and creates a practical approach. It is simple, yet unpacks elegantly.
So what’s the evidence that it’s good? That’s hard to test. The acquaintance was right that university uptake alone isn’t a solid basis (I recently found a renowned MBA program that was still touting the MBTI!). The hard part would be creating a systematic test. Ideally, you’d find an organization that implements it and documents the increase in learning. However, learning in that sense is hard to measure, because it’s personal. You might look for an increase in aggregate measures (more ideas, faster troubleshooting), but these, too, are individual, and depend on outside factors like the culture for learning.
When you don’t have such data, you have to look for triangulating evidence. The fact that multiple university scholars are promoting it isn’t a bad thing. To the contrary, uptake at individual institutions without a corporate marketing program is actually quite the accolade! The fact that workshop attendees tout it as personally valuable is also a benefit. While we know that individual attendees’ reports on the outcomes of a workshop don’t highly correlate with actual impact, that’s not true for people with more expertise. And the continued reports of value are positive.
Finally, a point I made at the end of my aforementioned previous reflection is relevant. I said: “I realize mine is done on sort of a first-principles basis from a cognitive perspective, while his is richer, being grounded in others’ frameworks.” Plus, he’s been improving it over the years, practicing what he preaches. My point, however, is that it’s nicely aligned with what you’d come at from a cognitive perspective. Without empirical data, theoretical justification combined with scholarly recognition and personal affirmations are a pretty good foundation.
There are meta-lessons here as well: how to evaluate programs, and the value of meta-learning. These are worth considering. Note that Harold doesn’t need my support, and he didn’t ask me to do this. As usual, my posts are triggered by what crosses my (admittedly febrile) imagination. This just seemed worth reflecting on. So, reflections on your part?