Different research objectives require different methodological approaches – but not all techniques are equal. Some questions are best suited to qualitative exploration, whereas others – especially where a large sample and a statistical base are required – benefit from a quantitative approach.
Not all quantitative surveys follow the same methodology: large samples of a population can be interviewed face to face, on the phone or online. The online approach has seen explosive growth over recent years, helping to make quantitative research – in more places and with more consumer groups – more affordable and more accessible.
This proliferation of online surveys – and the resulting plethora of statistics – has led to questions about their accuracy, and no little cynicism. Surely research done online can’t be as accurate or reliable as the same questions asked person to person (either on the phone or face to face)?
But two examples of research released in recent months have underlined the real value of online methodologies. Specifically, this value comes from people appearing to be more honest if they are answering questions put to them by a computer rather than a person.
The first example concerns a question asked several times a week in the UK by various polling companies – which party are you going to vote for in the next election?
Looking at UKIP’s results in political polling, we can see clear variation between telephone and online polls – with telephone polls showing lower levels of support. One ICM telephone poll from October put UKIP on 14 points, while three online polls conducted over the same weekend found consistently higher levels of support for Nigel Farage’s party, between 16 and 18 points.
It’s hard to know at this stage which poll is more accurate, but the difference in levels of expressed support and voting intention recall the ‘shy Tories’ phenomenon that has existed in political polling since the 1980s. The effect is the same here – when speaking to an interviewer, a respondent may be reluctant to agree with something that they feel the interviewer may judge them for, with support for an unpopular or potentially divisive party one such example.
Online polling strips the interviewer effect from the process. Another such example – and proof that this extends well beyond political polling – compares the most recent wave of World Values Survey Data (collected through face to face interviews) with our own Trajectory Global Foresight data on the importance people attach to their family. In the online poll, 79% of people in the US describe their family as ‘very’ important to them. A respectable figure, but not as respectable as the 92% saying the same thing in the face to face poll. When asked the question directly, respondents are perhaps prone to exaggerating what they really feel in order to give the response they think they should be giving.
Other benefits of online research include the ability for researchers to compare claimed attitudes and behaviours against a plethora of other data points – from social preferences to advertising exposure and many more. Beyond this, respondents have the freedom to complete the survey when and where they choose. This is particularly important as we consider the rise of mobile research and the growing use of mobile devices to complete online surveys – something that does, however, present a challenge in terms of limitations on the length and depth of surveys.
Online research is far from perfect. In the UK alone, some 10% of households – typically the very old or the very poor – are still without internet access, and are therefore excluded from online panels. But by removing the interviewer effect, online research can produce results that cause us to question other approaches – and may well get us closer to what consumers really think.