James Odene

Avoid these research pitfalls

Research can offer great insight for creating new products, shaping campaigns and optimising newsletters… but here’s the catch: if the data is no good, you could be acting on poor information.

(Image: a duck whose shadow looks like a shark. Source: dailymail.co.uk)

Data doesn’t make science

When collecting and analysing data, it can be quite easy to end up with a warped sense of the insight it offers. Imagine for a second that you have conducted a survey which found that 86% of respondents would not buy caged eggs. That sounds like a pretty significant result.


Now place that alongside another survey with a task element, in which customers had to identify which products were cage-free. Here, only 2% of those asked were able to correctly identify all the cage-free products… As much as these figures are made up, they illustrate how data can lead you down the wrong path.


In this fabricated scenario, it wouldn't make sense to celebrate the high desire for cage-free eggs without acknowledging that hardly anyone can tell which products are actually cage-free.


Great-looking data can turn out to be misleading or, at worst, entirely useless. Avoid the traps that are so easy to fall into...


"Faith in data grows in relation to your distance from the collection of it."
- Scott Berkun


1. Customer interviews:


Customer interviews are great. They go straight to the people you want to hear from and ask them what they really think. The issue is that we are brilliant at giving answers to questions we truly had no thoughts on before being asked.


In ‘The Mind is Flat’, Nick Chater describes the power of what he calls ‘the improvised mind’. Rather than customer interviews tapping into deeply held unconscious beliefs, what might really be happening is that they record lots of improvised, good-enough-sounding answers that have no true value because they are made up on the spot. These kinds of answers might give you awesome-looking data, but they won’t provide you with useful insight to act upon.


Here’s a question for you: Do you prefer the top of your sock or the toe area? Which is more important out of the two?


Chances are you can answer this question quite well. But unless you are like me and are utterly, incomprehensibly picky about what kind of sock you deem suitable, you didn’t have an opinion on this until I asked. I could ask 1,000 people this question, have 60% choose the toe area of the sock as the most important, and launch product development into that area. Yet perhaps, in truth, people buy whichever socks are cheapest, or most people just buy white, or people tend to buy socks as part of a bigger shopping list and so just go for whatever is in the shop they find themselves in.


All of these options have wider implications, and would mean that focusing product development on finding the exact tightness of toe that people prefer would have little impact on sales. (I would, however, be all for this, as I like a tight sock and rarely find one...)


This question is posed from the point of view of the concerned sock manufacturer; it is organisation-focused. Considering the duck picture at the top of this post, this product-related question is the metaphorical equivalent of only seeing the shark-y looking shadow and entirely missing the duck. Without the duck, the data looks alarming and could lead to some extreme behaviour.


And this is easy to do. If you want to ask questions about your vegan challenge, for example, you ask questions about your vegan challenge. But really, this can be too organisation-focused. It is better to try and ensure that the questions sit within the wider, audience-based context.


Rather than asking:

How helpful were the recipes in the vegan challenge newsletters?


You might find more useful data in asking:

If you clicked on a vegan challenge recipe in our newsletters, please describe what happened after you clicked and how you continued your interaction with the recipes.


It could well be that the first question tells you that, say, 75% of people found the recipes helpful, whereas the second tells you that 50% of people moved on to find a different recipe from the one you promoted, one they found more useful. These two data points express very different things and warrant different actions for optimisation.


“People can make up an opinion about anything, and they’ll do so if asked. […] You can thus get users to comment at great length about something that doesn’t matter, and which they wouldn’t have given a second thought to if left to their own devices.”
- Jakob Nielsen, The Query Effect

It’s important to remember that people’s loyal brains will make up reasons. Questionnaires typically ask direct questions that can lead to misleading data. Don’t directly ask what you want to know from a product-focused point of view; instead, see it as a process of seeking out the truer behaviours of the customer. Finding the best data can be more like finding a crumb by candlelight than a lighthouse sweeping for approaching boats.


Some key thoughts:

- Start with the wider context, rather than jumping straight into what you want to know

- Be user-centred, not product or organisation-centred



2. Surveys:


Surveys are a great way to get a lot of data without taking too much time or resource. A downside to being so quick and easy is that a survey can go out without due care over how the questions are posed. As in the previous example, the questions may look very similar, yet the data they produce can be very different.


The speed and ease of a survey can push you towards forming questions around simple metric scales (never, sometimes, often, always, etc.) so that the data can be easily captured and collated. The downside is that you lose a lot of the context, and that context could be key.

Here’s a great example shared by Leisa Reichelt, Head of Research and Insight at Atlassian. During her talk at Mind the Product, she described a large team event she ran in which people were encouraged to place a dot on a sliding spectrum in response to statements, one of which was ‘I have a great work-life balance’. The majority of people placed their dots firmly to the right: the data suggested that everyone had a great work-life balance. Rejoice!


Not so fast!


Knowing that this kind of question lacked context, Leisa stood at the board and asked everyone, as they placed their dot, why they had chosen that spot… The vast majority of people said something along the lines of ‘I get to work from home, so even though I have too much work, I can be flexible with when I complete it’.


So whilst the context-free question suggested everyone was living their best life, the context revealed something very different. People’s work-life balance was actually pretty rubbish, but the fact that they could see their kids at all before working again until 10 pm meant that they saw it as a trade-off they would accept.


To do better surveys:

- Try and capture the context, not just the answer

- Use qualitative research where possible to get the most useful data

- Yes, it takes more time, but having people analyse survey responses will make the insight more usable


Final thoughts

Research is a specialist area, and I would welcome more peer-reviewed research into communication strategies for farmed animal and vegan outreach. However, I also recognise that there are lower-level research opportunities, such as campaign surveys and gathering customer feedback, that could be conducted within NGOs but may currently be avoided or not considered, even though good data could reasonably be attained with the right support and preparation.


Another consideration is to avoid the allure of making something seem more scientific and quantifiable by reducing everything to metrics and graphs. By doing this, you could be losing vital context which could entirely shift the results.


Yes, it’s important to be efficient with time, but a quick survey producing data that makes a nice graph could offer little (to no) insight compared with a more human-powered, context-driven research process.


Don't miss the duck! Quack, quack.

