This series of articles looks at Time to Change’s approach to Behaviour Change campaigning – this week head of marketing Katherine talks about effective evaluation design.
Designed to help readers feel more informed when feeding their own insights and evaluation learnings into their campaign strategies, the other articles in this series look at:
- Understanding your target audience
- Delving deeper into developing a campaign based on audience
- Showing how using behaviour change insights can inform short and long-term strategy
A guide to effective evaluation design
Designing our evaluation process starts as soon as we start work on a new campaign. The questions are always right there – how will we know what’s happened? Can we find out if this has made a difference? Will this make people actually behave differently? And we use what we find out through every stage of the campaign – from changing the mix of marketing channels to getting more funding. We need our evaluation to be consistent and robust, but also able to evolve as the campaign does. Here’s the ‘how to’ of what we do…
Finding the right partner
Our campaign needed to change how people acted when it came to mental health. So we wanted to understand what our audience did before our campaign – and what happened after. We needed research expertise to get a clear pre- and post-campaign understanding, making sure we were getting to the right people, and asking the right questions.
We ran a tender process looking for a partner who could keep the evaluation straightforward – but at the same time get under the skin of complicated attitudes and behaviours. We needed to understand the responses to our advertising, but also longer-term behaviour changes across campaigns. So we were looking for a long-term partner. We’ve now worked with our research agency Consumer Insight for eleven years. That continuity has added so much to our evaluation: they have real depth of knowledge about every part of our campaign, and we get a solid understanding of trends over time – important, as behaviour change is slow.
Keeping it simple
To evaluate our campaigns, Consumer Insight use an online questionnaire. People are recruited from our target audience, and research dips happen before and after campaigns. We then add in research dips to evaluate activities outside of our main campaigns. We try to strike a balance between keeping our questionnaires consistent so we can track trends over time and adding fresh questions as the campaign evolves. In the same way, who we recruit varies as the campaign targets different audiences over time. But we’ve used the same model for all our campaign bursts. The analysis we get back is deep and insightful, but the model is straightforward – meaning we can keep focussed on understanding what’s happening.
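As a rough illustration of the dip model described above, here is a minimal sketch (with entirely made-up data) of how a consistently worded question can be compared across a pre-campaign and a post-campaign wave to track movement over time:

```python
# Hypothetical sketch of comparing pre- and post-campaign research dips.
# Each dip holds yes/no (1/0) answers to the SAME tracked question, so
# waves stay comparable even as fresh questions are added around them.
# All response data below is invented for illustration.
dips = {
    "pre_campaign":  [1, 0, 0, 1, 0, 0, 1, 0, 0, 0],
    "post_campaign": [1, 1, 0, 1, 0, 1, 1, 0, 1, 0],
}

def agreement_rate(responses):
    """Share of respondents agreeing with the tracked statement."""
    return sum(responses) / len(responses)

pre = agreement_rate(dips["pre_campaign"])
post = agreement_rate(dips["post_campaign"])
uplift = post - pre

print(f"pre: {pre:.0%}, post: {post:.0%}, uplift: {uplift:+.0%}")
```

Keeping the question wording fixed between waves is what makes the uplift figure meaningful; the analysis on top can be as deep as needed, but the comparison itself stays this simple.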
So what is our model? Working out what we need to know and who we need to ask
When budgets are tight and opportunities for research infrequent, there can be pressure to cram everything into a questionnaire. We work hard to make sure that every question helps us understand the impact of our campaign. We need to know about campaign awareness, but most importantly whether and how that awareness converts to people changing their behaviour. We want to be sure that we’ll be able to use the results from every question to understand impact and return on investment, and to make the campaign work better. That means if we’re not sure a question will give us those answers, we don’t include it.
Just ONE behaviour change
Beyond working out what it is we really need to know, we always have ONE question that captures the behaviour change the campaign is designed to deliver. We ask people to consider four statements that are actions they could have taken as a result of the campaign. For our current campaign ‘Ask twice’, we are looking for agreement with the following statement: ‘I have stepped in to help a friend who might be experiencing a mental health problem (by asking twice) as a result of this campaign.’ Having this one measure keeps us focussed on driving behaviour change and lets us see the real impact on our audience.
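The single-measure idea above can be sketched in a few lines. Note that only the fourth statement comes from the article; the other three statements and all the respondent data are hypothetical placeholders:

```python
# Hypothetical sketch: respondents indicate which of four action
# statements apply to them; only ONE is the campaign's target
# behaviour change, and that is the measure reported on.
STATEMENTS = [
    "I talked about the campaign with someone",          # placeholder
    "I looked up more information about mental health",  # placeholder
    "I shared campaign content online",                  # placeholder
    "I have stepped in to help a friend who might be experiencing "
    "a mental health problem (by asking twice) as a result of this campaign",
]
KEY_BEHAVIOUR = STATEMENTS[3]  # the ONE measure that matters

# Each respondent's set of agreed statements (made-up data).
respondents = [
    {STATEMENTS[0], STATEMENTS[3]},
    {STATEMENTS[1]},
    {STATEMENTS[3]},
    {STATEMENTS[0], STATEMENTS[2]},
]

behaviour_change_rate = sum(
    KEY_BEHAVIOUR in answers for answers in respondents
) / len(respondents)

print(f"Behaviour change rate: {behaviour_change_rate:.0%}")
```

The other three statements still get asked and analysed, but reporting anchors on the one rate, which is what keeps the evaluation focussed.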
Finding the ‘so what?’
As well as the evaluation Consumer Insight do for us, we get output data from our advertising agencies and our own social listening, website and PR tracking. There is a LOT of data. Bringing it all together, we’re always looking for the ‘so what?’ Has having a certain number of click-throughs helped us change behaviour? How can we use that information to make what we do more effective?
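One way the ‘so what?’ question can be made concrete is to divide channel spend by the estimated number of people whose behaviour changed, giving a comparable cost per behaviour change. The sketch below assumes invented channel names, spend, reach, and per-channel behaviour-change rates – none of these figures come from the article:

```python
# Hypothetical sketch: turning output data (spend, reach) plus survey
# results (behaviour-change rates) into a cost-per-behaviour-change
# comparison across channels. All numbers are made up.
channels = {
    "social": {"spend": 20_000, "reached": 400_000},
    "tv":     {"spend": 80_000, "reached": 2_000_000},
}

# Share of those reached who reported the key behaviour change,
# as would come from survey evaluation (assumed figures).
behaviour_change_rate = {"social": 0.02, "tv": 0.01}

for name, data in channels.items():
    changed = data["reached"] * behaviour_change_rate[name]
    cost_per_change = data["spend"] / changed
    print(f"{name}: ~{changed:,.0f} people changed, "
          f"£{cost_per_change:.2f} per behaviour change")
```

A calculation like this is what lets raw output metrics, such as click-throughs, be weighed against the behaviour change they actually delivered.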
That’s our approach – find the right partner, keep it simple, ask the right questions and then make sure to get the most from the data. Above all, we make sure we know how many people’s behaviour we’ve changed and why. And while it might not always be possible to get outside agencies to track campaigns, it’s always possible to get totally clear on the behaviour change that’s at the heart of the campaign – and keep resources focussed on finding out if the campaign has made that change happen.