Way back when we were building the adaptive learning system dubbed Intellectricity™, we were counting on a detailed content model that carved the overall content into discrete elements that could be served up separately to create a unique learning experience. As I detailed in an article, the issues included granularity and tagging vocabulary. While my principle for the right level of granularity is that each element plays a distinct role in the learning experience (e.g. separating a concept presentation from an example from a practice element), my simpler heuristic is to consider “what would a knowledgeable mentor give to one learner versus another”. The goal, of course, is to support the future ability to personalize and customize the learning experience.
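To make that granularity principle concrete, here’s a minimal sketch (my own illustration, not the original Intellectricity model): discrete elements tagged by the single role they play, so different subsets can be served to different learners. The roles and field names are assumptions for the example.

```python
# Illustrative only: discrete content elements, each tagged with the single
# pedagogical role it plays, so they can be sequenced differently per learner.
from dataclasses import dataclass, field
from enum import Enum

class Role(Enum):
    CONCEPT = "concept"      # the core presentation of the idea
    EXAMPLE = "example"      # a worked illustration of the concept
    PRACTICE = "practice"    # an activity the learner performs

@dataclass
class ContentElement:
    topic: str
    role: Role
    body: str
    tags: list[str] = field(default_factory=list)  # controlled-vocabulary terms

# The "knowledgeable mentor" heuristic: different learners get different subsets.
novice_sequence = [Role.CONCEPT, Role.EXAMPLE, Role.PRACTICE]
refresher_sequence = [Role.EXAMPLE, Role.PRACTICE]
```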
Back then, we were thinking of it as a content delivery engine, but our constraints required content produced in a particular format, and we were thinking about how we’d get content produced the way we needed. Today, I still think that producing content in discrete chunks, under a tight model, is a valuable investment of time and energy. Increasingly, I’m seeing publishers taking a similar view, and as new content formats get developed and delivered (e.g. ebooks, mobile web), more careful attention to content makes increasing sense.
The benefits of more careful articulation of content can go further. In the performance ecosystem model (PDF), the greater integration step is specifically about more tightly integrating systems and processes. While this includes coupling the disparate systems into a coherent workbench for individuals, it also includes developing content into a model that accounts for different input sources, output needs, and governance. While this is largely about formal content, it could include community-generated content as well. The important thing is to stop redundant content development. Typically, marketing generates requirements, and engineering develops specifications, which then are fed separately to documentation, sales training, customer training, and support, which all generate content anew from the original materials. Developing into and out of a content model reduces errors and redundancy, and increases flexibility and control. (And this is not incompatible with devolving responsibility to individuals.)
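As a rough sketch of what “developing into and out of a content model” might look like (the fields and channel names here are invented for illustration, not any particular organization’s model): one structured source rendered to several downstream needs, instead of each group rewriting from the original requirements and specs.

```python
# Illustrative single-source model: each downstream group renders from the
# same structured facts rather than re-authoring from raw specs.
from dataclasses import dataclass

@dataclass
class ProductFact:
    feature: str
    behavior: str        # what engineering specified
    benefit: str         # what marketing requires
    limitation: str      # what support needs to know

def render(fact: ProductFact, channel: str) -> str:
    if channel == "documentation":
        return f"{fact.feature}: {fact.behavior} (Note: {fact.limitation})"
    if channel == "sales_training":
        return f"Sell {fact.feature} on this benefit: {fact.benefit}"
    if channel == "customer_training":
        return f"How to use {fact.feature}: {fact.behavior}"
    if channel == "support":
        return f"Known limitation of {fact.feature}: {fact.limitation}"
    raise ValueError(f"unknown channel: {channel}")

fact = ProductFact("auto-sync", "syncs every 5 minutes",
                   "no manual uploads", "requires a network connection")
for channel in ("documentation", "sales_training", "customer_training", "support"):
    print(render(fact, channel))
```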
We’re already seeing the ability to create custom recommendations (e.g. Amazon, Netflix), and companies are already creating custom portals (e.g. IBM). The ability to begin customizing content delivery will be important for customer service, performance support, and slow learning. Whether driven by rules or analytics (or hybrids), semantic tagging is going to be necessary, and that’s a concomitant requirement of content models. But the upside potential is huge, and will eventually be a differentiator.
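As a toy illustration of how semantic tagging might drive rule-based delivery (the tags and the rule are assumptions; a real system would draw them from the content model’s controlled vocabulary and from analytics):

```python
# Toy example: selecting semantically tagged elements with a simple rule.
elements = [
    {"id": "c1", "tags": {"topic:returns", "role:concept", "audience:customer"}},
    {"id": "e1", "tags": {"topic:returns", "role:example", "audience:customer"}},
    {"id": "p1", "tags": {"topic:returns", "role:practice", "audience:support_rep"}},
]

def select(required_tags: set[str]) -> list[str]:
    """Return ids of elements whose tags include everything required."""
    return [e["id"] for e in elements if required_tags <= e["tags"]]

# e.g., what to surface for a customer asking about returns
print(select({"topic:returns", "audience:customer"}))  # ['c1', 'e1']
```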
Learning functions in organizations need to be moving up the strategic ladder, taking responsibility not just for formal learning, but also for performance support and ecommunity. Thinking like advanced publishers can and should mean moving beyond the text, and even beyond content, to the experience. While that could be custom designs (and in some cases it must be, e.g. simulation games), for content curators and providers it also has to be about flexible business models and quality development. I believe it’s a must for other organizations as well. I encourage you to think strategically about content development in richer ways: stop with the one-off development, and start putting some up-front effort into not only templates, but also models with tight definitions and labels.