Learning is at the heart of innovation and works best when evaluation is part of the process to spot what’s actually learned.
There are many definitions of innovation out there. After years of experience, observation, and study, I've found them lacking in utility, so I came up with my own:
Learning transformed into value through design
While much of the discourse on innovation focuses on the last part of that statement — value and design — the part about learning is just as important. Learning is far more than knowledge and evidence and data. It's about integrating all of that with our experience, our practice, and our reflection on how it all comes together.
We've all been taught things that we later realized we knew of, but didn't actually learn until much later in life. (Consider all the pearls of wisdom our trusted elders exposed us to as children, only to be ignored until later in life, after we'd realized the folly of our arrogance.)
Traditional approaches to learning tend to focus on two things: recall and performance. Thus, we rely on knowledge-based tests (consider most of what we took in grade school, college, or university) or performance assessments (reviewing accomplishment of a specific task or problem set).
What is often absent from both of these is a consideration of the context and the learner themselves. It’s one thing to be exposed to information or demonstrations of how to do something, yet it’s entirely another thing to actually learn it.
If understanding real learning matters, then so does its assessment, which helps us understand whether learning actually took place and to what effect. This is the role of evaluation.
Assessment of Learning: Questions to Ask
Just as learning is about asking and seeking answers to questions, so is the evaluation of learning in practice. At Cense, we begin by asking the following questions:
- Exposure. What level of exposure did I have to the material or experience? How did this exposure shape my understanding of the problem and its context? What salient information did I attend to during this exposure? How attentive was I (if at all) to what was presented to me, and in what context was it presented?
- Connection. How is what I've been exposed to related to what I've already experienced? In what ways does this new material fit with what I have previously learned? Where does this material or experience fit in shaping my goals, intentions, or general direction for myself or my organization?
- Reinforcement. What activities allow me to apply what I've been exposed to repeatedly, in new and practical ways (including connecting it with what I already know, whether theory or practice-focused material)? What systems are in place to allow me to integrate what I've been exposed to into my current understanding of the situation? What activities have been set out to enable recall and revisiting of materials in the near and longer term?
- Application. What opportunities do I have to apply what I’ve learned into shaping a new understanding of a problem or steps toward addressing it? What opportunities are present to reflect and integrate this application of knowledge or skill into my existing repertoire of practice or understanding?
These questions can help us determine to what degree an activity is tied to integrated versus performative learning — doing things that look like learning but aren't really effective. Taking workshops on topics that are not relevant to your work, or that are set up too distant from the problems you're seeking to tackle, is one example. Filling your days with reading articles, watching webinars, taking courses, listening to podcasts, and viewing videos without a means to organize and integrate that material into your experience is unlikely to generate any real learning.
Engaging in these activities might be entertaining, provide a useful distraction, keep you connected to your peers, or help familiarize you with aspects of your professional work — so it's not without value — but it is likely not substantive learning. You will retain some of what you're exposed to, but probably not enough to warrant your time and energy if your aim is to actually learn.
What's misleading is that this kind of 'filling' of your attention actually decreases the likelihood that you'll be able to use what you've been exposed to, because it saps your mental energy. The more information we expose ourselves to, the more energy is required, and eventually we'll be depleted enough that things won't stick.
Are We Serious About Learning?
I ask this question over and again of myself, my students, and my clients. We want to say 'yes', but without offering up time, creating systems — real structures that allow us to critically engage with material and ourselves — and the means to apply and try out our ideas, the answer is most likely 'no'.
Better learning comes from better systems, by design. Evaluation can tell us whether we've put these systems in place, and the mere act of asking whether we learned and how, and committing ourselves to deliberating on what we find, can be an enormous step forward. We learn about what we pay attention to. Evaluation can help us pay attention to what we're doing.
That old saying from Peter Drucker about what gets measured, gets managed has a ring of truth here. When what we’re doing matters in a manner we can measure — by whatever metrics we choose for ourselves — we manage the conditions to learn better.
And by my definition above, that means we’ll innovate better, too.
This article was based on an earlier one posted at Cense.ca.
If you're looking to learn more and do more with what you have, please contact me and we can help you learn better, by design. This is what we do.