15 years ago, when we conducted our first online survey, the biggest question facing the research community was: ‘Will online surveys work?’
Fast forward to 2017: online research has certainly proven its worth to many clients through the quality of its results and the ability it gives brands to run fast, flexible surveys that don't break the bank. But at MaCorr Research our ethos is to test more and test faster in order to employ the best and most agile methods.
With response rates to customer surveys at record lows, we at MaCorr constantly challenge ourselves to optimize our online research to improve response rate, one of the best quality indicators of an online survey.
Why should we care about Response Rate?
We live in an era of enabling. Creating and hosting a survey has become easier than ever before thanks to new technology platforms and tools. Yet, access to tools doesn’t necessarily translate to good results. A beautifully designed survey is not enough to get good data; you also need a good response rate.
A good response rate provides four major benefits:
- Higher data quality and accuracy: The easier it is to respond to a survey, the greater the chance respondents will answer truthfully. Badly designed surveys will see people dropping out or clicking through senselessly.
- More representative sample: Having fewer people drop out mid-survey means your results will deviate less from your targeting goals.
- Reduce the need for high incentives: If your survey is short, relevant and engaging, doing the survey will, in itself, feel like a rewarding experience. That means you will receive good survey data while avoiding offering the kinds of incentives that can induce cheating.
- Customer engagement: The last survey experience leaves a strong lasting impression on a respondent, so it’s important to ensure respondents aren’t frightened away by a long and boring survey. This is the first step to a good customer development process.
At MaCorr, we play the dual role of both the user and the manager of our online community. On the one hand, we want to extract as much information for our clients as possible; on the other, we take the responsibility of caring for our rapidly growing online community very seriously. That's why we not only make every effort to keep our surveys short, relevant and engaging, but have also built a model to predict how well our surveys will perform.
To understand what happens to survey response rates as survey length increases, we analyzed responses and drop-offs in aggregate across all of the recent surveys we scripted and hosted. We defined response rate as the likelihood that a survey starter completes the survey.
We looked into survey attributes such as sampling patterns, intro page configurations, question-type composition, skip logic complexity, the use of media (images and videos), and survey topics.
The most important finding was that survey length has a major impact on response rate. As expected, the longer the survey, the less likely respondents are to finish it. A simple bivariate analysis between survey length in minutes and completion rate can show a lot.
Survey length accounts for more than 54% of the variability in the likelihood of a respondent completing our surveys. For a 1-minute survey, only about 85 out of 100 respondents will complete it, and for every extra minute added to the survey length, we lose roughly 3 more respondents. A 20-minute survey will likely see only 30-40 people finishing it.
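The relationship above can be sketched as a simple linear model. This is a minimal illustration, assuming the figures quoted in the text (roughly 85% completion at 1 minute, minus about 3 percentage points per additional minute); the function name and the floor at zero are our own, not part of MaCorr's actual model.

```python
def predicted_completion_rate(minutes: float) -> float:
    """Rough estimate of survey completion rate (%) from length in minutes.

    Hypothetical linear fit based on the figures above:
    ~85% completion at 1 minute, losing ~3 points per extra minute.
    """
    rate = 85.0 - 3.0 * (minutes - 1)
    return max(rate, 0.0)  # a rate can't go below zero

print(predicted_completion_rate(1))   # 85.0
print(predicted_completion_rate(20))  # 28.0
```

Note that the real relationship is unlikely to stay perfectly linear at the extremes, which is why the observed range for a 20-minute survey (30-40 completes per 100 starters) sits slightly above the straight-line estimate.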
The MaCorr Team