Evaluating Social Innovation For Social Impact

How do the innovation letters line up?

Earlier this week I had the pleasure of attending talks from Bryan Boyer of the Helsinki Design Lab and learning about the remarkable work they are doing applying design to government and community life in Finland. While the audience's focus was on the lab's application of design thinking, I found myself drawn to the issue of evaluation whenever it came up in discussion.

One of the points raised was that design teams often work under constraints that emphasize the designed product rather than its extended outcomes, making evaluation a challenge to resource adequately. Evaluation is not a term that often features in discussions of design, but as the moderator of one talk suggested, maybe it should.

I can’t agree more.

Design and Evaluation: A Natural Partnership

It has puzzled me to no end that we have these emergent fields of practice aimed at social good — social finance and social impact investing, social innovation, social benefit — that have little built into their culture to assess what kind of influence they are having beyond the basics. Yet social innovation is rarely about simple basics; its influence is likely far larger, for better or worse.

What is the impact being invested in? What is the new thing of value being created? What is the benefit, and for whom? What else happened because we intervened?

Evaluation is often the last thing to go into a program budget (along with knowledge translation and exchange activities) and the first thing to get cut (along with the aforementioned KTE work) when things go wrong or budgets get tightened. Regrettably, our desire to act supersedes our desire to understand the implication of those actions. It is based on a fundamental idea that we know what we are doing and can predict its outcomes.

Yet, with social innovation, we are often doing things for the first time, combining known elements into an unknown whole, or repurposing existing knowledge, skills, and tools for new settings and situations. This is the innovation part. Novelty is pervasive, and with it come opportunities for learning as well as the potential for us to do good as well as harm.

An Ethical Imperative?

There are reasons beyond product quality and accountability that one should take evaluation and strategic design for social innovation seriously.

Design thinking involves embracing failure (e.g., "fail often to succeed sooner" is the mantra espoused by product design firm IDEO) as a means of testing ideas and prototyping possible outcomes to generate an ideal fit. This works well for ideas and products that can be safely isolated from their environment so that the variables associated with outcomes can be measured, if they are considered at all. That is fine for benign issues, but it becomes more problematic when such interventions are aimed at the social sphere.

Unlike technological failures in the lab, innovations involving people have real costs. Clinical intervention trials go through a series of phases — from preclinical work through five stages to post-market testing — to test their impact, gradually and cautiously scaling up with detailed data collection and analysis accompanying each step, and it's still not perfect. Medical reporter Julia Belluz and I recently discussed this issue with students at the University of Toronto as part of a workshop on evidence and noted that as the complexity of the subject matter increases, the ability to rely on controlled studies decreases.

Complexity is typically the space that much of social innovation inhabits.

As the social realm — our communities, organizations, and even global enterprises — is our lab, our interventions affect people 'out of the gate'. Because this occurs in an inherently complex environment, I argue that the imperative to evaluate and share what is known about what we produce is critical if we are to innovate safely as well as effectively. Alas, we are far from that in social innovation.

Barriers and Opportunities for Evaluation-powered Social Innovation

There are a series of issues permeating the social innovation sector in its current form that require addressing if we are to better understand our impact.

  1. Becoming more than "the ideas people": I heard this phrase used at Bryan Boyer's talk hosted by the Social Innovation Generation group at MaRS. The moderator for the talk commented that she wished she had taken more interest in statistics in university because they would have helped in assessing some of the impact of the work done in social innovation. There is a strong push for ideas in social innovation, but perhaps we should also include those who know how to make sense of and evaluate those ideas in our stable of talent and required skillsets for design teams.
  2. Guiding Theories & Methods: Having good ideas is one thing; implementing them is another. Tying the two together is the role of theory and models. Theories are hypotheses about the way things happen based on evidence, experience, and imagination. Strategic designers and social innovators rarely refer to theory in their presentations or work. I have little doubt that these designers are using theories, but they are implicit rather than explicit, and so cannot be evaluated, tested, or challenged by others. Some, like Frances Westley, have made the theories guiding their work explicit, but this is a rarity. Social theory, behaviour change models, and theories of discovery beyond Rogers' Diffusion of Innovations must be introduced to our work if we are to make better judgements about social innovation programs and assess their impact. Indeed, we need the kind of scholarship that applies theory and builds it as part of the culture of social innovation.
  3. Problem scope and its methodological challenges. Scoping social innovation is an immensely wide and complicated task requiring methods and tools that go beyond simple regression models or observational techniques. Evaluators working in social innovation require a high-level understanding of diverse methods and, I would argue, cannot be comfortable in only one methodological tradition unless they are part of a diverse team of evaluation professionals, something that is costly and resource intensive. Those working in social innovation need to live the very credo of constant innovation in methods, tools, and mindsets if they are to be effective at managing the changing conditions in social innovation and strategic design. This is not a field for the methodologically disinterested.
  4. Low attention to rigor and documentation. When social innovators and strategic designers do assess impact, too often there is little attention to methodological rigor. Ethnographies are presented with scant attention to sampling, selection, or data combination; statistics are used sparingly; and connections to theory or historical precedent are absent. Of course, there are exceptions, but this is hardly the rule. Building a culture of innovation within the field relies on the ability to take quality information from one context and apply it critically to another, and if that information is absent, incomplete, or of poor quality, the possibility for effective communication between projects and settings diminishes.
  5. Knowledge translation in social innovation. There are few fora to regularly share what we know in the kind of depth necessary to advance deep understanding of social innovation. There are a lot of one-off events, but few regular conferences or societies where social innovation is discussed and shared systematically. Design conferences tend towards the 'sage on the stage' model that favours high-profile speakers and agencies, while academic conferences favour research that is less applied or action-oriented. Couple that with the client-consultant work common in the social innovation space, and we get knowledge that is protected or privileged, with little incentive to add a KT component to the budget.
  6. Poor cataloguing of research. Related to the last point, we have no formalized methods of determining the state of the art in social innovation, as research and practice are not catalogued. Groups like the Helsinki Design Lab and Social Innovation Generation, with their vigorous attention to dissemination, are the exception, not the rule. Complicating matters is the interdisciplinary nature of social innovation. Where does one search for social innovation knowledge? What are the keywords? Innovation is not a good one (too general), but neither are more specialized disciplinary terms like economics, psychology, geography, engineering, finance, enterprise, or health. Without a shared nomenclature and networks to develop such a project, the knowledge that is made public is often left in the realm of unknown unknowns.

Moving forward, the challenge for social innovation is to find ways to make what it does more accessible to those beyond its current field of practice. Evaluation is one way to do this, but in pursuing such a course, the field needs to create space for evaluation to take place. Interestingly, FSG and the Center for Evaluation Innovation in the U.S. recently delivered a webinar on evaluating social innovation with the principal focus being on developmental evaluation, something I've written about at length.

Developmental evaluation is one approach, but as noted in the webinar: an organization needs to be a learning organization for this approach to work.

The question I am left with is: is social innovation serious about social impact? If it is, how will it know it has achieved it without evaluation?

And to echo my previous post: if we believe learning is essential to strategic design, we must ask: how serious are we about learning?

Tough questions, but the answers might illuminate the way forward to understanding social impact in social innovation.

* Photo credit: innovation_by_genlau.jpg from DeviantArt, used under a Creative Commons licence.