I’m a huge advocate of simplicity in survey design, especially when a survey is to be delivered online. Yet, when I talk to people about cutting out questions, simplifying response tasks, and minimizing the use of various presentational options (e.g., AJAX), I sometimes get the sense I’m viewed as a spoil-sport. Fortunately, people don’t have to take my word for it. A wealth of methodological research shows that completions and data quality suffer as you stray from following the Keep It Simple, Stupid (K.I.S.S.) principle in survey design (see the links at the bottom of this page).
Some respondents will also tell you how well (or badly) you’ve structured your questionnaire, although waiting for that feedback before acting is probably leaving things a little too late! Respondents can give feedback on a survey in a couple of ways: by dropping out if they are having difficulty with it, or by commenting on their experience at the end (assuming you give them an opportunity to do so). Here are some examples pulled from three surveys I’ve been involved with over the past 12 months. These went out to general population samples provided by a well-known consumer panel. The topics differed, but the surveys were similar in length – about 35 questions over 15 pages. Two of them involved presenting choice sets as part of a stated choice modelling experiment.
My intent here is not to take the glory for the results I’m about to present; I took care of the online delivery in these surveys, but the questions and structure were mainly developed by others. I’m using them because I do think the questionnaires were generally well designed. Questions were kept to a minimum, pre-testing was done, and the technology used was as simple as possible.
First, some selected respondent comments taken from across the three surveys. Many other comments echoed the same general sentiment:
“Very good survey, was easy to follow and understand.”
“Thoroughly enjoyed that survey is all I can say.”
“Clear and simple, well worded – well done, whomever designed it.”
“It was more interesting than the usual surveys :o)”
“It was a very simple, well put together survey that was easy to understand. Well done.”
“I enjoyed doing it :)”
“This was a great survey, thank you!”
“I really enjoyed this survey. Very easy to follow.”
“Great Survey, easy to do and no dumb questions.”
“Wish they were all this easy to complete.”
My key take-out points are that a) it is actually possible for people to enjoy completing a questionnaire and b) many of the online surveys people are sent appear to be complex, hard to follow, and sprinkled with “dumb questions”. Although I’m speculating, I think the “dumb question” comment refers to those that are ambiguous, overly complicated or repetitive (e.g., matrix-style questions) or attempt to psychoanalyze the respondent (e.g., brand ‘personality’ items).
However, most people won’t take the time to leave a comment. In fact, if your survey suffers from particularly bad design, many won’t even stick around to reach the last page. So you should pay attention to the second (silent) respondent feedback mechanism: completion rates. Here are the completion rates for the three surveys mentioned above, i.e., the proportion of people who started each survey who went on to complete it. A low completion rate is a key signal of problems with your survey design because it means many people dropped out.
Survey 1: 79%
Survey 2: 71%
Survey 3: 81%
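For the record, the completion rate here is just completers divided by starters. A minimal Python sketch makes the calculation explicit; note the counts below are made up for illustration, since the post reports only the resulting percentages:

```python
def completion_rate(completed, started):
    """Percentage of people who started the survey and went on to finish it."""
    return 100.0 * completed / started

# Hypothetical counts, chosen only to reproduce Survey 1's reported 79%:
print(completion_rate(790, 1000))  # → 79.0
```

The same drop-out count looms larger on a small starting sample, so it is worth noting the absolute numbers alongside the percentage when you report these figures.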
So, it really is worth keeping things as simple as possible in your survey design. Respondents can tell when you are asking them flaky, ill-prepared questions, and many won’t stick around if your questionnaire causes them frustration.
Short URL for this post: http://wp.me/pnqr9-1K