Data-driven strategy and decision-making are only as good as the processes we use to gather and synthesize data and put it to work informing action.
I am a strong advocate for using data to inform decisions. Good-quality data, appropriately gathered and focused, can reveal patterns that save time, money, and even lives. But data can also be a way of providing cover for poor strategy and inept leadership.
I’ve witnessed intelligent people quibble over data at the expense of the decisions that data is meant to serve. Human biases become evident when poor data is used to support decisions that leaders favour, while excellent data is ignored (or fought) when the decisions it points to are difficult. Data is only as good as the use we put it to.
To illustrate: I’ve worked in healthcare and public health for over 25 years and witnessed the shameless use of data to justify decisions under the guise of evidence-based decision-making. The shamelessness lies in the illusions and denial about the quality of the data used to generate evidence for (or against) a decision. This isn’t unique to the health sector; organizations everywhere do it. The health sector is simply notable because it explicitly describes its work as evidence-based.
The truth is that very few decisions in any organization are genuinely evidence-based. When was the last time you used an evidence-based process to guide your strategy for using evidence? We might have robust data to support a decision, but how was that decision rendered? Were the options considered in making the decision themselves based on evidence?
Were the actions taken based on the evidence guided by an implementation strategy informed by evidence?
Too often the answer is no.
Struggling with Data and Evidence
High-quality data provides insight into the impact of our work: processes, outcomes, and general direction. If I have good data, I have a more accurate picture of how my product, service, or systems operate. I am also more likely to generate questions about what I do, not just answers. This is why evaluation is such a valuable part of the innovation process. Yet evaluation remains among the most ignored and under-invested areas of innovation.
Why is this?
One reason is – to offer a cheeky quote —
Evaluation is the longest four-letter word in our language.
Evaluation exposes our assumptions. My interests might be reflected in the questions I ask, but they might not be reflected in the answers. That is the difference, and it is why evaluation (and the data it generates) is often not well received.
I suggest you ask yourself whether you quibble only when you dislike what the data suggests. I also suggest you learn to love evaluation.
Are you using data to support decisions in your network or organization?
I can’t use data if I don’t know what I am looking at and what it means. One reason many of us struggle with data is that we don’t know what it is supposed to tell us. This is where sensemaking as a process comes in and helps us use data.
Without a sensemaking process in place, the decisions I make are subject to the first thought I have. This is what I meant when I critiqued the lack of evidence-based decision-making earlier: too often, our data is not accompanied by adequate sensemaking.
Sensemaking converts our data into meaningful insights, which are not always the most obvious. Sensemaking also connects us to strategy. I can’t make sense of data if I don’t know where I’m going, what has meaning for me, and whether what I see connects the two. I want data to tell me how I connect what I want to what I actually do. It has to make sense.
If it doesn’t make sense, that’s not necessarily a reason to quibble (though it might be). A real reason to quibble is when I stop applying evidence to how I gather my data and what it’s for.
Many organizations struggle with this, so don’t feel bad if yours does, too. It is something that can be designed to work better. If you need help with this, let’s grab a coffee and talk; I can help you.
Image credit: used under license from Marketoonist.