This morning at the American Evaluation Association meeting in San Antonio I attended a session very near and dear to my heart: design thinking and evaluation.
I have been a staunch believer that design thinking ought to be one of the most prominent tools for evaluators and that evaluation ought to be one of the principal components of any design thinking strategy. This morning, I was with my “peeps”.
Specifically, I was with Ching Ching Yap, Christine Miller and Robert Fee and about 35 other early risers hoping to learn about ways in which the Savannah College of Art and Design (SCAD) uses design thinking in support of their programs and evaluations.
The presenters went through a series of outlines for design thinking and what it is (more on that in a follow-up post), but what I wanted to focus on here was the way in which evaluation and design thinking fit together more broadly.
Design thinking is an approach that encourages participatory engagement in planning and setting out objectives, as well as in ideation, development, prototyping, testing, and refinement. In evaluation terms, it is akin to action research and utilization-focused evaluation. But perhaps its closest correlate is developmental evaluation (DE), an approach that draws on complexity-science concepts to inform an iterative evaluation process centred on innovation, the discovery of something new (or the adaptation of something into something else), and the application of that knowledge to problem solving.
Indeed, the speakers today positioned design thinking as a means of problem solving.
Evaluation, or at least DE, is about problem solving: collecting data that serves as feedback to inform the next iteration of decision making. It is also a form of evaluation that is intimately connected to program planning.
What design thinking offers is a way to extend that planning in ways that optimize opportunities for feedback, new information, participation, and creative interaction. Design thinking approaches, like the workshop today, also focus on people’s felt needs and experiences, not just their ideas. In our session today, six audience members were recruited to play the three facets of a store clerk and the three facets of a customer: the rational, emotional and executive mind of each. A customer comes looking for a solution to a home improvement/repair problem, not sure of what she needs, while the store clerk tries to help.
What this design-oriented approach does is greatly enhance the participants’ sense of the whole: the needs, desires and fears both parties are dealing with, not just the executive or rational elements. More importantly, this strategy looks at how these different components might interact by simulating a condition in which they might play out. Time didn’t allow us to explore what might have happened had we NOT done this and just designed an evaluation to capture the experience, but I can confidently say that this exercise got me thinking about all the different elements that could, and indeed SHOULD, be considered when trying to understand and evaluate an interaction.
If design thinking isn’t yet a core competency of evaluation, perhaps we should consider making it one.