Unhappy in many ways with the current status of education, particularly here in the US, I’ve been thinking a lot about what would make sense. What’s the role of K12, and then what’s the role of a university? Some thoughts recently coalesced that I thought I’d put out and see what reaction I get.
The issue, to me, covers several things. Now, I talked some time ago about my ongoing search for wisdom, and the notion of a wise curriculum coupled with a wise pedagogy very much permeates my thinking. However, I’m probably going to be a bit more mundane here. I just want to think about what we might want to cover, and how.
Let me start with the premise that what needs to be learned to be a productive member of society needs to be learned before university, as not everyone goes further. If we truly believe (and we should) that 21st Century skills of learning, research, communication, leadership, etc., are skills everyone needs, then those are K12 goals. Naturally, of course, we also include literacy of many sorts (not just reading and writing), and ideally, thinking like a mathematician and scientist (not science and math).
However, if those are accomplished in K12 (I’ve previously argued that learning how to think might be the role of the university, but now think it’s got to happen before then), then what is the role of university? Given that the half-life of knowledge is less than four years, focusing on preparing for a lifetime of performance is out of the question. Similarly, pursuing one fixed course of study won’t make sense anymore, as the fields are beginning to change, and the arbitrary categorizations won’t make sense. So what then?
I’m thinking of going back to the original Oxbridge model. In the old days, you were assigned a tutor (and advisor), and you met with that person regularly. They’d have a discussion with you, recommend some activities (read X, solve Y), and send you on your way. It was a customized solution. Since then, for a variety of reasons (scale, mostly), the model’s turned into a mass-production model. However, we now have the power of technology.
What if we moved to a system where individuals could spend some time exploring particular areas (like the first two years or so of college), and then put together a proposal of what they wanted to do, and how they’d pursue it, and the proposal would be vetted. Once approved, there’d be regular updates. Sure, there’d likely be some templates around for learning, but it’d be more self-directed, customizable, and put the appropriate responsibility on the learner.
I may be biased: I designed my own major as an undergraduate (UCSD’s Muir campus had a mechanism to design your own degree, and they didn’t have a learning technology program…), and as a PhD candidate I proposed my own research. Still, I think there’s a lot to recommend a learner taking responsibility for what they’re going to study and why. Granted, universities don’t do a good enough job of articulating why a program sequence has particular courses in it, but I think it’s even better if a learner at least has to review and defend it, if not choose it themselves.
Naturally, some domain-specific learning skills would emerge, but this would provide a more flexible system to match how dynamically specializations are changing, serve as a model for life, and shift faculty responsibility more toward mentorship and less toward lecturing. It would necessitate a change in pedagogy as well.
I think, in the long term, this sort of model has to be adopted. In the short term, it will wreak havoc with things like accreditation, but that’s not necessarily a bad thing, given the flaws we’re beginning to see in the existing system, both non- and for-profit. I reckon the for-profits might be able to move quicker, but there will be battles. And, of course, changing faculty minds reminds me of the old joke: “How many academics does it take to change a lightbulb?” “Change?” (And I *was* one!)
Naturally, this has implications for K12 too, as many have articulately argued that the pedagogy needs to change there as well, following the learners’ interests. Likewise the notion of educational publishing (where is that iPad replacement for my kid’s texts?). Those are topics for another day.
So, does this make sense? What am I missing?
Virginia Yonkers says
Coming from an interdisciplinary background, both studying and teaching in the university, I see some other institutional barriers today (besides the strengthening of accreditation requirements and the push at the university level to create standardized testing, as there is at the K-12 level) that prohibit the type of flexibility you propose.
First, the current financial model of higher ed is based on the ability of graduates to find a job after college. More and more, businesses are hiring graduates based on finite skills rather than a person’s ability to learn a job. In other words, you need to already know the skill going into the interview. As a result, many colleges and universities will go to the businesses, ask what skills they need in order to get their graduates hired, and integrate those into their curriculum, programs, etc. The student therefore learns what he or she needs to get a job, not what they are interested in.
Related to this is that the programs that generate the most employed graduates get the biggest budgets. Those that can generate outside income (through grants, research funds, etc.) will continue to be able to offer their courses. However, interdisciplinary courses do not fare well under this model, as grants and research funds often require specialization. Faculty are valued based on the specialized research they conduct, which often must fit into a narrow category within a field of study. For example, I am currently doing research on knowledge building in distributed groups. This crosses disciplines: applied psychology, communication, management, writing, education, and sociology. While the study is very narrow, and I know of many who are interested in my work, there is no money out there for the research because it does not fit into a nice, neat category.
So I think to change our current system, it is necessary to start at both ends of the educational system. It is nice to learn something new and pick and choose your major. However, unless you develop a skill that will make you marketable in our society, you should keep the learning limited to your free time.
Clark says
Virginia, you raise important issues. I believe, however, that business will start recognizing that professionals whose expertise comes from passion about a topic, pursued through a study plan vetted by experts in the field (there will likely be committees for cross-disciplinary studies; and of course ‘off-the-shelf’ degrees will still exist), are more valuable than the current “pass ’em thru” calcified degrees. Okay, so maybe it’s a few years out ;).
I expect narrow research to continue, but your sort of cross-disciplinary work is increasingly recognized as the source of more useful innovations (as a complement to the deep discipline work, not as a replacement). Yes, funding needs to follow societal recognition of this.
And you’re spot on that it needs to come from both ends. I note that when I designed my own major, I only preceded the market; it’s not that I contradicted it!
Angela says
Hi Clark,
As always, love your brilliance, and I’m going to require some clarification on a blog statement you made…”thinking like a mathematician and scientist (not science and math)”. If I was following your stream of thought, it seemed that one of the important elements of higher education is to foster one’s ability to learn – learning to learn (which could encompass SDL, LLL, critical thinking, complex problem solving, etc.). Then, the statement about thinking like a scientist came up….and I lost the flow of the thought. In your view, what does a scientist or mathematician think like (’cause I’m thinking…procedural, ‘right’ answer, formulaic)? I’m all ears!
Julie Michener says
It’s something that St. Catherine University leaders have been defining as the University has implemented an academic reorganization. It begins with caring and strategic advising for the student, and a core curriculum that helps students discover where they want to go; from there, they are guided toward options for faculty/student research collaborations, internships, and other experiences that develop their core skills along with communication and critical thinking abilities. It’s liberal arts based, no matter what the major, as students need to learn to learn for a lifetime.
Clark says
Great feedback! Julie, thanks for the example; I think that’s great credit to St. Catherine’s. I agree with the liberal arts focus, though sometimes I wonder if that’s *supposed* to be done in high school (which is what Australia and the UK claim when they have 3 year college degrees without a general ed requirement but are instead very domain-focused).
Angela, I’m referring to what Seymour Papert said: “I don’t want to teach kids mathematics, I want to teach them to think like mathematicians”. Similarly for teaching them to think like scientists, not to just learn science. It’s about systematicity, hypotheses, experimentation, analysis. It’s another take on learning to learn, but it’s meta-cognition in a broader sense. Hope that helps!
John Andrew Williams says
This is a very interesting blog. I would like to add that the effectiveness of the educational system would be greatly enhanced if these ideas were paired with informed students. When my students have learned essential life skills, I have seen them experience a complete turnaround, in education and in life in general.
It is incredible to think about the potential of learners given meaningful skills and a refined educational context.