What is quality when we speak of learning? In this third post in a series on education and evaluation metrics, the question of quality within graduate and professional education is explored, with more questions than answers about the very nature of learning itself.
But what does learning really mean? Do we set the system up to adequately assess whether people actually learn, and whether that learning has any positive impact on what they do in their practice?
What do you mean when you say learning?
The late psychologist Seymour Sarason asked the above question with the aim of provoking discussion and reflection on the nature and possible outcomes of educational reform. Far from being glib, Sarason felt this question exposed the slippery nature of the concept of learning as used in the context of educational programming and policy. It’s a worthwhile question when considering the value of university and professional education programming. What do we mean when we say learners are learning?
The answer to this question exposes the assumptions behind the efforts to provide quality educational experiences to those we call learners. To be a learner one must learn…something.
The Oxford English Dictionary defines learning this way:
the acquisition of knowledge or skills through experience, practice, or study, or by being taught: these children experienced difficulties in learning | [ as modifier ] : an important learning process.
• knowledge acquired in this way: I liked to parade my learning in front of my sisters.
This might sufficiently answer Dr Sarason, except that it gives no sense of what the content is or whether that content is appropriate, sufficient, timely, or well supported with evidence (research- or practice-based): in short, the quality of the learning.
Knowledge translation professionals know that learning through evidence is not achieved through a one-size-fits-all approach and that the match between what professionals need and what is available is rarely clean and simple (if it were, there would be little need for KT). The very premise of knowledge translation is that content alone is not enough; sometimes another process is needed to help people learn from it. This echoes Larry Green’s argument that we need practice-based evidence to get better evidence-based practice.
How do we know when learning is the answer (and what are the questions)?
If our metric of success in education is that those who engage in educational programming learn, how do we know whether what they have learned is of good quality? How do we know what is learned is sufficient or appropriately timed? Who determines what is appropriate and how is that tested? These are all questions pertaining to learning and the answers to them depend greatly on context. Yet, if context matters then the next question might be: what is the scope of this context and how are its parameters set?
Some might choose academic discipline as the boundary condition. To take learning itself as an example, how might we know if learning is a psychology problem or a sociology problem (or something else)? If it is a problem for the field of psychology, when does it become educational psychology, cognitive psychology, community psychology or one of the other subdisciplines looking at the brain, behaviour, or social organization? Successful learning through all of these lenses means something very different across conditions.
Yet, consider the last time you completed some form of assessment on your learning. Did you get asked about the context in which that learning took place? When you were asked questions about what you learned on your post-learning assessment:
- Did it take into account the learning context of delivery, reception, use, and possible ways to scaffold knowledge to other things?
- Did your learner evaluation form ask how you intended to use the material taught? Did you have an answer for that and might that answer change over time?
- Did it ask whether your experience of the learning event matched what the teachers and organizers expected you to gain, and did you know what that really was?
- Did you know at the time of completing the evaluation whether what you were exposed to was relevant to the problems you needed to solve or would need to solve in the future?
- Did you get asked if you were interested in the material presented and did that even matter?
- Was there an assumption that the material you were exposed to could only be thought of in one way, and did you know what that way was prior to the experience? If you didn’t think of the material in the way the instructors intended, did you just prove that the first of these two questions is problematic?
Years of work in post-secondary teaching and continuing professional education suggest to me that your answer to these questions was most likely “no”, except for the very last one.
These many questions are not posed to antagonize educators (or “learners”, for that matter), for there are no single or right answers to any of them. Rather, they are intended to extend Seymour Sarason’s question to the present day and put it in the context of graduate and professional education at a time when both areas are being rethought and rationalized.
Learning to innovate (and being wrong)
A problem with the way much of our graduate and professional education is set up is that it presumes to have the answers to what learning is and seeks to deliver content that fills a gap in knowledge within a very narrow interpretation. This is based on an assumption that what was relevant in the past is still appropriate now and will be in the future, unless we are speaking of a history lesson. However, innovation and discovery, and indeed learning itself, are based on failure, discomfort and not knowing the answers as much as on building on what has come before us. There is no doubt that a certain base level of knowledge is required to do most professional and scientific work and that building a core is important, but it is far from sufficient.
The learning systems we’ve created for ourselves are based on a factory model of education, not for addressing complexity or dynamic systems like we find in most social worlds. We do not have a complex adaptive learning system in place, one that supports innovation (and the failures that produce new learning) because:
If you’re not prepared to be wrong, you’ll never come up with anything original. – Sir Ken Robinson, TED Talk 2006
The above quote comes from education advocate Sir Ken Robinson in a humorous and poignant TED talk delivered in 2006 and then built on further in a second talk in 2010. Robinson lays bare the assumptions behind much of our educational system and how it is structured. He also exposes the problem we face in advancing innovation (my choice of term) because we have designed a system that actively seeks to discourage wide swaths of learning that could support it, particularly with the arts.
Robinson points to the conditions of interdisciplinary learning and creativity that emerge when we free ourselves of the factory model on which much of our education is set up to “produce” educated people. If we are assessing learning and we go outside of our traditional disciplines, how can we assess whether what we teach is “learned” if we have no standard to compare it to? Therein lies the rub with the current models and metrics.
If we are to innovate and create the evidence to support it we need to be wrong. That means creating educational experiences that allow students to be wrong and have that be right. If that is the case, then it means building an education system that draws on the past, but also creates possibilities for new knowledge and learning anchored in experimentation and transcends disciplines when necessary. It also means asking questions about what it means to learn and what quality means in the context of this experimental learning process.
If education is to transform itself and base that transformation on any form of evidence then getting the right metrics to evaluate these changes is imperative and quality of education might just need to be one of them.
Jonah Lehrer is (or was) as big as it gets in science writing, and two weeks ago he proved the adage that the higher one climbs, the farther the fall, after admitting to fabricating some content in his stories. This is bad news for him, but may be much worse for all of us interested in making science and innovation knowledge accessible, for reasons that have as much to do with the audience as with the message and the messenger.
Jonah Lehrer was one of our most prolific and widely read science writers until he admitted fudging some quotes about Bob Dylan in his new book, Imagine, which looks at the process of discovery, creativity and innovation. The discovery by fellow journalist (and fervent Bob Dylan fan) Michael Moynihan set off a wave of reflections and investigations of Lehrer’s work revealing passages in the book (and other pieces) that had been reused from his other writings without proper self-attribution and sparking questions about the integrity of the author’s entire body of work. The “fall of Jonah Lehrer” was big news at a time when the London Olympics were dominating most of the media’s attention.
This case is a testament to the wide appeal that Lehrer’s work had beyond the usual ‘science geeks’ while illustrating the power of the internet to enable the kind of curation and investigation that supports on- and offline fact checking. But what it spoke to most for me is the role of the science writer as synthesizer.
The Writer and his Craft
Much digital type has been spent on the Lehrer incident. Search Google and you’ll find dozens of commentaries looking at how things transpired and how Lehrer ironically succumbed to the cognitive biases he wrote about.
Roxane Gay, writing in Salon, took a gendered approach to the issue and questioned whether our fascination is less with the science and more about the ‘young male genius’. Lehrer’s youth was something she saw as critical to amplifying the fascination with his work. She writes:
When young people display remarkable intelligence or creativity, we are instantly enamored. We want or need geniuses to show us the power and potential of the human mind and we’re so eager to find new people to bestow this title upon that the term and the concept have become quite diluted.
I agree with her on the point about our desire to over-inflate the accomplishments of youth (as if we are *amazed* that any of them could possibly do anything brilliant, which is as offensive to them as it is to older people), although a careful look at Lehrer’s articles and much of the press around his work suggests that he was much less a focus of the attention than his ideas.
Call it “Gladwellization.” It’s not just lucrative, but powerful: your ideas (or rather, the ideas you’ve turned into compelling anecdotes for a popular audience) can influence everything from editorial choices across the publishing world to corporate management and branding strategies.
But with this comes mounting demands to produce, and to recycle. You have to be prolific, churning out longer pieces that give your insights some ballast, and brilliant, bite-sized items. And yet you can’t be too new either: people want to hear what you’re already famous for. In this cauldron of congratulation and pressure for more and more, it’s not hard to see how standards might erode, how the “ideas” might become more important than doing the necessary due diligence to make sure they sync with reality.
‘Snappy Science’ and Synthesis
Innovation is about ‘new’, and there are good reasons why it’s a challenge to get the message out that this ‘new’ can be adapted, small, and unsexy and still make a large difference in the long run, instead of big, bold and transformative right away. We are in an age of selling “snappy science”, and that says more about the media and audiences than about the authors and scientists producing the original work.
This snappy, bite-sized science might sell books and make for great TED talks, but it is a misrepresentation of what we actually know and do as scientists. Rarely does a single finding lead to a solution; rather, it is an amalgam of discoveries small and large, brought together, that gets us closer to answers. Synthesis is the driver of change and synthesis is what journalists do particularly well. Malcolm Gladwell, Steven Johnson and Jonah Lehrer are among the best synthesizers out there, and I would imagine (no pun intended) that they contribute more to public and professional understanding of social innovation than all of the original-sourced scientific knowledge on the subject combined.
When I hear Malcolm Gladwell cited as an original source in serious discussions with colleagues on scientific matters, I realize we have a problem…and an opportunity. Gladwell’s writings popularized the concept of tipping points, but his work is based on a wealth of scientific data on complex systems. They are not his original ideas, but they are his syntheses and (sometimes) his interpretations. This is important work and I am not taking anything from anyone who makes science data digestible and accessible, but it is not the original science.
That Jonah Lehrer is as well known as he is tells me that there is an appetite for science, and I’ll freely admit to using his work (and that of the other authors I’ve mentioned) to inform what I do in a general sense. It is good work; however, I also acknowledge that I have the scientific training to go beyond the initial articles to critically appraise the information and place it in context, and I have the resources to go to the original sources in academic journals. Most people (professionals and lay people alike) do not. This access is going to decrease as resources shrink.
It is for this reason that synthetic work is so important. My Twitter feed is often filled with references to such synthetic work, rather than original works of research, because I aim to fill a role that sits somewhere between journalism and the science of design, systems and psychology. I am not a pure science blogger, nor am I speaking to the lay public, but rather to other professionals seeking to enrich their knowledge base. That is a role I’ve created for myself, largely because there is high demand and low supply.
We have a need for synthesis and a demand for it, but little acknowledgement of the value of this role in professional scientific circles. Yet, when we leave journalists to do the work for us, we allow a different system to take charge. John McQuaid ended his article with this caution:
Book publishers don’t do fact-checks, so there’s no fail-safe, just the conscience of the writer. Reach that point, and all is lost.
Filling the gap, meeting a need and shooting the messenger
Journalists like Johnson, Gladwell and Lehrer fill a gap, which is why I am saddened by the loss of one of them and angry at what has transpired. While there is no doubt that Lehrer made mistakes, they were of a rather minor nature in the grand scheme of things. Synthetic work is designed to provide a big picture overview, not guide microscopic decisions. I would like people to read Lehrer and learn about the creative process and the role of neuroscience in making our lives better, to appreciate systems thinking and decision making because of Malcolm Gladwell, and see innovation, emergence and discovery in new ways because of writers like Steven Johnson.
Yet, when we seek more and more from these authors, we might get less and less. This is what happened to Jonah Lehrer. As more people found themselves drawn to his work, the pressure grew to do more, faster, and to get that ‘snappy science’ out the door. GOOD magazine, in ‘the tyranny of the big idea’, goes further:
The problem is that it’s unreasonable to expect that every new piece of media should upend conventional wisdom or deliver a profound new insight. To think that Jonah Lehrer could expose an amazing new facet of human psychology every week, in 1,000-odd words no less, is ludicrous. There are only so many compelling, counterintuitive, true ideas out there.
But the demand for them doesn’t abate. That’s why you see so many science writers talking about the same handful of studies (the Stanford prison experiment, the rubber hand illusion, Dunbar’s number, the marshmallow test) over and over. That’s why you see pop economists who should know better creating flimsy and irresponsible contrarian arguments about climate change for shock value. That’s why you get influential bloggers confessing they’re only 30 percent convinced of their own arguments but “you gotta write something.” That’s why the #slatepitches meme hits home.
Search Censemaking and you’ll find many of these topics not just because they are punchy, but because they are useful.
I hope we haven’t lost Jonah Lehrer as a voice just as I hope more people stop putting writers like him on a pedestal, where they don’t belong (nor do the scientists who produce the research). Synthesis is about bringing ideas together to produce innovative insights that often lead to bigger conversations about how to socially innovate. Synthesis is bigger than science, but dependent on it. It means paying attention to parts and wholes together and is the epitome of systems thinking in knowledge work.
It also means taking responsibility as knowledge producers and consumers, and being wary of shooting the messengers while asking more from the messages they deliver.
Unless we are prepared to give people time to search, appraise and synthesize research on their own, and to train them to make informed choices, the role of synthesizers (professional, journalistic, or otherwise) will become more important than ever.
Photo from Wikimedia Commons, used under licence.
Complexity, by its very nature, is not a simple concept to communicate, yet it is increasingly becoming one that will define our times and may be key to ensuring human survival and wellbeing in the years to come. If society is to respond to complex challenges, the meaning of complexity needs to be communicated in a manner that is understandable to a wide audience. This is the first in a series of posts looking at the concept of complexity and the challenges and opportunities of marketing it to the world.
Across North America this week, temperatures are vastly exceeding normal levels, into ranges more akin to places like India or East Africa. The climate is changing, and regardless of the causes, the complexities this introduces require changes in our thinking and actions, or human health and wellbeing will be at risk. To follow Einstein’s famous quote:
“We can’t solve problems by using the same kind of thinking we used when we created them”
Many U.S. states are suffering hurricane-like after-effects from a derecho that hit last week, knocking out power at a time when temperatures are in the high 90s and low 100s Fahrenheit. Derechos are fast-moving windstorms associated with bands of severe thunderstorms; they are difficult to predict and can only be anticipated under certain conditions. The heat wave, combined with the lack of air conditioning and supplies, has left 13 dead, maybe more, and is expected to continue throughout the weekend.
But this post is not really about the weather; it is about the challenges with complexity that the weather represents, and how we need to better understand what complexity is and how to work with it if we are to survive and thrive in the years to come.
It’s ironic that this post was delayed by a blackout. I live in Toronto, Canada, where we have a remarkably stable power supply, yet last night and through this morning I was without power due to suspected overheated circuits attributed to high air conditioning use, shutting down my Internet and everything else with it. In many parts of the world this kind of blackout is commonplace, a fact of daily living, but not here…yet. This fortuitous bit of timing illustrates the fragility of many of our systems, given the reliance on power to fuel much of what we do (e.g., cooking, food storage, Internet, traffic signals, lighting).
Virtually all of the infrastructure of modern life (here and increasingly globally) is tied to electricity. If you’re interested in imagining what would happen if it all shut off, I’d highly recommend reading The World Without Us by Alan Weisman. Weisman uses a tool favoured by complexity scientists and futurists, the thought experiment, to craft a book about what New York City would look like if humans suddenly disappeared. The book illustrates how nature might take over, how the underground subways would flood and collapse without the millions of litres of water pumped out of them each day, and how certain human-built structures would decay over time (some far faster than we might hope).
Thought experiments take data from things that have already happened, theories, and conjecture, and project scenarios into the future based on the amalgam of these. They provide a grounded means of anticipating possible futures to guide present action.
From present delays to future/tense
The Guardian asked a number of scientists working on climate whether the current spate of extreme weather events is attributable to global warming. The scientists offered a range of answers that (not surprisingly) lacked a definitive statement about cause and effect, yet the comments hint at a deep concern. These anomalous conditions are moving further toward the tail of the normal curve, meaning they are becoming less and less plausibly attributable to chance. What this means for the weather, for climate, for our economies is not known; all we have are thought experiments and scenarios. But the future is coming, and we may want to be prepared by helping create one we want, not just one we get.
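To make the “tail of the normal curve” point concrete, here is a minimal sketch (in Python, with illustrative sigma values of my own choosing, not drawn from any climate dataset) of how quickly the probability of an observation arising by chance shrinks as it moves further from the mean:

```python
import math

def normal_tail_probability(z: float) -> float:
    """P(Z > z) for a standard normal variable: the chance of observing
    a value at least z standard deviations above the mean by chance."""
    return 0.5 * math.erfc(z / math.sqrt(2))

# An event 1 sigma above the mean is unremarkable; by 4 sigma,
# "it's just chance" becomes a very hard argument to sustain.
for z in (1, 2, 3, 4):
    print(f"{z} sigma above the mean: P = {normal_tail_probability(z):.6f}")
```

The further an observed condition sits from the historical mean, the smaller this probability becomes, which is the statistical sense in which extreme events become less plausibly attributable to chance alone.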
Unfortunately, we cannot wait for the data to confirm that global warming is happening, or to determine that we are contributing to it and to what degree. This is not just a weather issue; the same situation is playing out with issues worldwide, ranging from healthcare funding to immigration policies and migration patterns. Interconnected, interdependent and diverse agents and forms of information are interacting to create emergent patterns of activity.
It is for this reason that weather patterns, despite being among the most monitored and studied phenomena, can’t be accurately predicted more than a few hours in advance, if at all. There is too much information coming together, with air flows, humidity, land forms, physical structures and human interventions (e.g., airplane contrails) interacting simultaneously and dynamically, to create a reliable model from the data. David Orrell’s book Apollo’s Arrow is a terrific read if you want to understand complexity in relation to weather (and more), or see his TEDx talk on YouTube.
Two’s company, three’s complexity (and other analogies)
The above heading is taken from the title of another book on complexity and points to how adding just a little bit of information (another person to a conversation, perhaps) can radically alter an experience from simple or complicated to complex. Just think about planning a night out with two people versus three and you’ll know a little of what this means.
Analogies and metaphors are common ways for complexity scholars to convey how differences in conditions represent varying states of order. Brenda Zimmerman and others write about putting a rocket on the moon as complicated and raising a child as complex. One of my favourites is Dave Snowden‘s video on How to Organize a Children’s Party. One reason we resort to analogies is that we need a narrative that fits with the audience’s experience. All of us were children, and some of us have children of our own, so we can relate to Zimmerman’s and Snowden’s ideas because we’ve experienced them firsthand.
We haven’t experienced anything like what is anticipated from global warming. In the Americas and in parts of Europe and Asia, we are enormously fortunate to have entire generations that don’t know what it’s like to be hungry, to have no healthcare, to be without electricity, or to lack access to safe water and proper sanitation. Stories about children’s parties might not bring these scenarios home. That is why Weisman’s book is so clever: it renders a plausible scenario as fiction.
Science fact as science fiction
The role of fiction might be the key to opening the marketing vault for complexity. Scott Smith and others have been exploring how science fiction helped pave the way for some of today’s modern technologies and innovations. By weaving together fantasy narratives and imaginings of the future, technologists have managed to re-create these tools for current life. Witness the Tricorder Project, which seeks to develop the multifunction health and information tool used by Dr. McCoy on Star Trek.
We are making headway with complex information as witnessed by the popularity of infographics and data visualizations. But there is much more to be done.
Complex problems require complex solutions. Artists, designers, scientists, marketers, journalists and anyone who can communicate well can play a role. Making complexity something that people not only know about, but want to know about is the task at hand. In doing so, we may find people reaching for and advocating for complex solutions rather than stop-gap, band-aid ones like buying a car with better fuel economy as the main strategy to combat carbon emissions.
It’s been done before. Marshall McLuhan wrote about esoteric yet remarkably insightful and complex topics and became a household name, in part due to his appearance in Woody Allen‘s Annie Hall. Our media landscape is far more complex now (no pun intended), too much so to think that a single appearance by any complexity superstar (if one existed) would change public perception of the topic the way McLuhan’s appearance did for his theories on media. Yet Al Gore’s An Inconvenient Truth might have done more to get people talking about the environment than anything else. And while Gore is not known for his witty storytelling, his slide show did a good job.
To begin our journey of marketing complexity, we need to come up with stories we can tell that are pleasant, rather than ones that are less so. If you want one that fits the latter category, I strongly recommend reading Gwynne Dyer’s chilling Climate Wars. Instead, let’s get closer to living what Peter Diamandis and Steven Kotler write about in Abundance.
The future is ours to write.
For more books and resources on complexity, check out the library page on Censemaking.
SEED magazine recently posted an article on the concept of early warning signs in complex systems that I found quite provocative and important.
Science is a creative human enterprise. Discoveries are made in the context of our creations: our models and hypotheses about how the world works. Big failures, however, can be a wake-up call about entrenched views, and nothing produces humility or gains attention faster than an event that blindsides so many so immediately.
There are so many key points in this one passage that are worth discussing at length.
Science is a creative enterprise. For reasons I’ve discussed elsewhere, I think that science needs to embrace its creative side more than ever and embrace design. This isn’t universal, but if we (scientists) approached problems in the multidimensional manner in which designers typically approach them, we might create innovations and discoveries different from the ones we’ve made before. Why is this important (beyond the obvious, to those whose business it is to discover)? Complexity. The problems we are dealing with are now more likely than ever to be complex ones, which require different ways of approaching them and (some) different science and practice.
And as Albert Einstein famously said (or at least many people have attributed this to him — I can’t verify it, but it works nonetheless):
“We can’t solve problems by using the same kind of thinking we used when we created them.”
Discoveries are made in the context of our creations: our models and hypotheses about how the world works. In public health, where I work, the dominant models remain those rooted in reductionist science. We are asked to ‘prove’ the links between certain activities and the outcomes they produce. This works relatively well in areas like sanitation, toxicology, (some) pharmaceutical or vaccination interventions, and injury prevention. It is for these reasons that the top achievements in public health, including massive increases in life expectancy and reductions in premature death, took place in the 20th century. But that was then. The challenges we face now fall into the realm of complexity; and if we fail to support the fundamentals of public health, we will have both simple and complex challenges on our hands. The point here is that our models will only take us so far without some acknowledgement of the complexity of the problems they seek to explain. The context of our creations is complexity.
Big failures, however, can be a wake-up call about entrenched views. The key term here is “entrenched views”. My colleagues Alex Jadad, Murray Enkin, Shalom Glouberman and others once had a group called the Clinamen collaborative that wrote a great piece on the problem of complexity when dealing with entrenched health care practices. Their recommendations, in essence:
1) there are no recipes for universal success,
2) pay attention to local conditions,
3) intervene small and often and then scale,
4) aim for stability first, then change.
In science, we’re failing a lot, and rather than seeing this as a potential positive, I see more conservative approaches to science based on risk aversion. Providing support for smaller, rapid-response scientific studies that are encouraged to fail would do more than the big, unadventurous team projects that provide high-level window dressing for grant funders and avoid making anyone look bad.
…and nothing produces humility or gains attention faster than an event that blindsides so many so immediately. Humility is a word too often absent from my profession. I’m not talking about the kind of humility that comes from acknowledging the limitations of a scientific study or the recommendations of a report. I am speaking of true humility, where one “seeks first to understand, then to be understood”. Indeed, I would argue that we are lousy at both more often than we’re successful. Our understanding comes from a scientific perspective that holds us up as the experts. Once you’ve labelled someone, or yourself, an “expert”, conversations immediately shift. Watch a classroom where the instructor insists on pure lecturing, on being called “Dr.”, and where “right” and “wrong” are regular parts of the conversation. Then watch a classroom where students learn from each other, are encouraged to share their experience and challenge the material, where the professor doesn’t push her or his titles and credentials, and where there is interaction between everyone. You’ll see a very different sense of humility from students and teachers alike.
When I encounter others at a genuine, authentic and intimate level of learning, I never cease to be left in awe. That comes from humility, something I was fortunate to have modelled for me. A retiring professor, leaving on the day I convocated from my undergraduate degree, once told me:
“When I was an undergraduate, I knew everything. Now that I am a retired professor, I realize I know nothing. Every year of learning serves to teach me that I know less and less.”
The SEED article goes on to point to the current problems in science in dealing with complexity and the imperative towards collaboration and cross-disciplinary engagement:
Examples of catastrophic and systemic changes have been gathering in a variety of fields, typically in specialized contexts with little cross-connection. Only recently have we begun to look for generic patterns in the web of linked causes and effects that puts disparate events into a common framework—a framework that operates on a sufficiently high level to include geologic climate shifts, epileptic seizures, market and fishery crashes, and rapid shifts from healthy ecosystems to biological deserts.
The main themes of this framework are twofold: First, they are all complex systems of interconnected and interdependent parts. Second, they are nonlinear, non-equilibrium systems that can undergo rapid and drastic state changes.
Complex systems require the kind of deep attention that science brings, the spirit of engagement and problem solving that designers offer, and a space to bring them together. With their focus on reductionist science and their reluctance to embrace design, universities haven’t been the home for this kind of thinking. But things can change because, after all, this is a complex dynamic system we’re talking about.
Among the most frustrating aspects of being a systems thinker/actor/researcher is the “one thing” question: What is the one thing that we can do to solve this problem?
The answer is almost always: there isn’t one thing you can do; the problem requires a complex response.*
*That isn’t always the case, but in my line of work, it pretty much is true that the problem and its solution fall within the realm of complexity.
When you work in complex systems, the problems are nearly always multifaceted, convoluted and multi-dimensional in their scope and impact. Yet the “one thing” request comes up all the time.
Dave Snowden of Cognitive Edge, in his work on the Cynefin Framework, nicely points to the difference between best, good, and emergent practice. In public health and medicine, the term “best practice” has been so dominant that it is hard for people to let go of the terminology, even if they acknowledge that it sometimes doesn’t fit (the concept of “better” practice is often snuck in as a way to placate those who subscribe to a model of thinking that challenges “best” practice thinking). While this is very useful, the problem that even useful frameworks like Cynefin produce is a tendency for people to put whatever they are doing into boxes.
Looking at Cynefin, you can see the world compartmentalized into four nifty quadrants, making the world simple — or at least complicated, but certainly not complex. Dave Snowden himself has been critical of this tendency in his writings on Cognitive Edge’s blog, yet time and again I see this type of thinking come alive. I am currently teaching a course for graduate students on systems science perspectives in public health, and this is one of the pitfalls that I hope the learners in my class can avoid.
It’s not easy. We humans love to compartmentalize things. Charles Darwin, one of the founders of modern science, famously began his career putting things in literal and metaphorical boxes. Classification is something we do from our earliest years and all the way through school. Indeed, a brilliant and somewhat depressing look at how ingrained this thinking is can be seen in Sir Ken Robinson’s animated TED Talk on the history and possible future of education.
Systems thinking requires spectrum thinking. People must be able to see things on a gradient, rather than in absolute compartments. Students can’t be faulted too much for having a hard time with this when they are graded with letters, where a B+ is a 79 and an A- is one percentage point higher, yet the mere presence of a B (anything) on a transcript can mean the difference between getting an award, an admission, or a job and not.
This is in no way a criticism of Cynefin or other frameworks used to explain or assist in the understanding of complexity, but rather a statement of the problems inherent in our quest to teach others about complexity in ways that are intellectually honest and reasonably accurate, yet also effective in helping people understand the gravity and scope of complexity in practice. I don’t have an answer for this, but I do find using a spectrum useful.
And on a side note, the multi-coloured palette of the spectrum also allows for introducing the concepts of diversity and human relations at the same time as it illustrates a way of thinking about complex systems.