How to Create Effective User Surveys

Tom Hall
Published in UX Planet
9 min read · Oct 2, 2017


Running a survey is a quick and relatively easy way to get data about your users. But it’s also easy to create a survey that lies to you, and hard to tell when that’s happening. Even if you know the data from a survey is wonky, the results still have a way of digging into your brain.

In today’s post I’m going to detail some ways to write better surveys that engage users and provide more reliable data.

Writing the questions

This is the most important part. You have to write good questions to get reliable answers. Bad questions lead to bad data.

Types of Questions

Broadly speaking, there are two types of questions that you can ask in a survey: open and closed.

In closed questions, respondents choose from a fixed set of possible responses: yes/no, multiple choice, checkboxes, or Likert scale questions, for example. In open questions, they can respond however they want, whether that’s a short answer or a longer free-text response.

Closed questions offer a fixed number of possible responses (Image from Pexels)

So when should you use each question type?

Open questions will give you much better qualitative data. For example, if you’re looking for some insight into how users think about a problem, open questions will give you a lot more detail. They also allow for responses that you may not have accounted for if you’d used a closed question instead. Sometimes respondents will also be happy that you’ve given them an opportunity to express themselves.

On the downside, qualitative responses tend to take a lot longer to analyze, for the very reason that they provide so much detail. If you’re expecting a lot of responses, be aware that open questions can add a ton of analysis work. Also, since respondents have to type out their answers, open questions can lower your response rate. This is especially true for mobile users.

In contrast, closed questions tend to have higher response rates. It’s also much easier to analyze closed questions statistically. This can make them very useful when you’re trying to quantify things, such as how many of your users are interested in a given feature you’re proposing.
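
For example, once the responses are in, quantifying a closed question takes only a couple of lines. Here’s a minimal sketch in Python, assuming the answers to a hypothetical yes/no feature question are stored as a list of strings:

```python
from collections import Counter

# Hypothetical answers to: "Would you use a dark mode if we added one?"
answers = ["Yes", "No", "Yes", "Yes", "No", "Yes", "Yes", "No"]

counts = Counter(answers)
interested = counts["Yes"] / len(answers)
print(f"{interested:.0%} of respondents say they'd use the feature")
```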

Question Completeness

A problem that you can easily run into with closed questions is leaving valid response options out of your available answers. You may need to do some qualitative research with users beforehand (such as one-on-one interviews) to make sure you understand enough about the problem to have most of the correct options covered.

If you’re going to use closed questions, make sure all your ducks are in a row (Image from Pexels)

One way to test this is with a pilot study. Say that you’ve written a multiple choice question and included “Other, please specify” as one of the options. If you find that a significant number of people are selecting the “Other” option, that may be a sign that you should add some more response options to the question. You’ll have to think about how exactly to define what a “significant number” of people is, but one rule of thumb is to make sure 90% of responses are covered in your multiple choice options.
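
If you’re collecting pilot responses digitally, a quick tally will tell you whether you’re hitting that threshold. Here’s a minimal sketch in Python, with made-up pilot data:

```python
from collections import Counter

# Hypothetical pilot responses to a multiple-choice question about why
# users chose our (imaginary) product.
pilot_answers = [
    "Price", "Price", "Ease of use", "Other", "Ease of use",
    "Other", "Price", "Customer support", "Other", "Price",
]

counts = Counter(pilot_answers)
other_share = counts["Other"] / len(pilot_answers)
print(f"'Other' share: {other_share:.0%}")  # 30% in this made-up data

# Rule of thumb from above: the named options should cover ~90% of
# responses, so "Other" should stay under ~10%.
if other_share > 0.10:
    print("Consider adding more response options to this question.")
```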

Keep it Simple

The language you use in your survey should be simple and direct. Consider your audience. Don’t use jargon, advanced concepts, or abbreviations if you can avoid them; if you have to use them, include an explanation. People do tend to read information that’s useful to them, so if you need to add a few details to make sure a question is easily understood, go for it. If your question means the same thing to everybody, your data will be cleaner.

You especially want to avoid double-barreled questions, which ask two things at once. For example:

“Would you like to buy a new TV and laptop?”

What if the respondent wants a new TV but they like their current laptop? Make sure that your questions are asking exactly one thing at a time.

Use only single-barreled questions (Image from Pexels)

Bias and Priming

It’s important that you write your questions in a way that doesn’t bias your respondents. One way to introduce bias is by using unbalanced scales. Say that you ask the following question:

“What impact does our product have on your life?”

With the following response options:

  • Extremely positive
  • Very positive
  • Somewhat positive
  • Not positive

In this example, the scale is unbalanced: there are three positive options and only one negative one, which tilts respondents towards positive answers. A balanced scale offers as many negative options as positive ones around a neutral midpoint, for example: Extremely positive / Somewhat positive / Neither positive nor negative / Somewhat negative / Extremely negative.

Another way to introduce bias is with leading questions. A leading question nudges the respondent towards the answer you’re looking for; you’ve probably heard the term in TV courtroom dramas. Consider this question:

“How helpful is our app?”

The way the question is phrased nudges the respondent towards thinking that the app is helpful. A better way to approach this would be to ask:

“Do you think our app is helpful?”

Lawyers (mostly) aren’t allowed to ask leading questions, and you shouldn’t ask them either (Image from Pexels)

Another important concept is priming. This happens when questions or concepts introduced earlier in the survey can influence how people respond to later questions.

For example, say that your survey asks the respondent to rank a bunch of the customer support options your website offers. In the next question, you ask them:

“Does our website offer enough support options?”

By listing all of the support options first, you may have primed the respondent. They may be more likely to answer that you do offer enough support options, since you just listed a bunch of them, even if they don’t necessarily think that.

UX professionals reading this might be interested to know that priming can happen in usability testing as well.

How to get people to participate

By this point, you’re writing questions that are easy to understand, don’t introduce bias, and get you just the kind of data you’re looking for. Now you need to make sure enough people actually fill out your survey. Here are a few things that will help you do that.

Length

The length of your survey is a key factor in response rate, particularly when surveying the general public. Ask only as many questions as you need to learn what you’re trying to learn. Every question you add tends to decrease your response rate, so make sure each one is worth it.

Length is an important factor in your survey’s response rate (Image from Pexels)

SurveyMonkey found that respondents tend to spend less time on each question the longer a survey is, and for surveys longer than 7–8 minutes, completion rates dropped by 5% to 20%. Remember: keep your survey as short as possible while still getting the information you want.

Structure

You’ll want to structure your survey in a way that minimizes the number of people dropping out. One technique that may help, called the funnel, is to ask basic, general questions at the beginning, more complex questions in the middle, and then return to general questions at the end. You still want the questions to follow a logical order, because it can get confusing if the topics jump around too much.

Also, don’t forget about priming! The order in which you introduce your questions still matters.

Incentives

Using incentives (such as a gift card, a draw for an iPad, etc.) can be an effective way to increase your response rate. If you’re having trouble getting enough data from your survey, incentives can certainly help. There’s also some evidence that offering an incentive can increase the quality of your data in some ways, as respondents may put more thought into their answers.

Offering an incentive for completing the survey is an effective — if sometimes risky — strategy (Image from Pexels)

However, there are also some drawbacks. For one, if you’re looking for a lot of data, offering an incentive can get expensive. A reward can also change the types of people who respond, and respondents may be more inclined to answer positively when they’ve been given something; both of these introduce bias.

So, in short, incentives can be effective, but you have to be careful about when and how you use them.

One Other Thing

Now you’ve got your survey all ready to go. The questions are great, it’s structured in a way that encourages people to fill it out, and maybe you’ve got some incentives lined up. That’s dope. But there are still some things that can go wrong at this point.

I’ve covered bias to some extent earlier in this article, but there are some other sources of bias that are important to mention. These could easily be an entire article on their own, but I’m going to touch on them briefly here because they’re so important, particularly when you’re recruiting participants for your survey.

When you run a survey, you’re trying to get a sample (the people who respond to your survey) that represents the population you’re trying to measure (e.g. your potential customers). Selection bias is when the sample is not truly representative of the population; if that happens, the conclusions you draw from your survey may not hold for the population as a whole. Selection bias can happen for a number of reasons, but two common ones are undercoverage and nonresponse bias.

Undercoverage

Undercoverage is when part of the population isn’t properly represented in your sample. This is a problem if the group (or groups) of people you didn’t cover have meaningfully different traits, or different ideas about the questions you’re asking, than the people who filled out the survey.

Here’s an (extreme) example. Say that you’ve created a survey to research how people feel about social media and privacy. You also happen to have a mailing list for your internet privacy newsletter, so you send the survey out to them, since you’ve already got their emails. The people on that mailing list might feel very differently about the subject than people who aren’t on it, and those people aren’t covered by your survey at all. This might mean that the results from your survey don’t apply to the general population.

Not everyone feels the same about privacy (Image from Pixabay)

Here’s a less extreme example:

The Literary Digest voter survey predicted that Alfred Landon would beat Franklin Roosevelt in the 1936 presidential election. The survey sample suffered from undercoverage of low-income voters, who tended to be Democrats. How did this happen? The survey relied on a convenience sample, drawn from telephone directories and car registration lists. In 1936, people who owned cars and telephones tended to be more affluent.

Try to make sure that the people you’re sending your survey to are a reasonably accurate cross section of the population you’re trying to measure.
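
If you know some demographic breakdowns for the population you care about (from a census, your analytics, or market research), one rough sanity check is to compare your respondents against them. Here’s a sketch using a chi-square goodness-of-fit test from SciPy; the age brackets and counts are invented for illustration:

```python
from scipy.stats import chisquare

# Invented numbers: how 200 survey respondents break down by age bracket
# (18-29, 30-44, 45-59, 60+).
observed = [90, 70, 30, 10]

# Known (or estimated) share of each bracket in the target population.
population_share = [0.25, 0.30, 0.25, 0.20]
expected = [p * sum(observed) for p in population_share]

stat, p_value = chisquare(f_obs=observed, f_exp=expected)
print(f"chi-square = {stat:.1f}, p = {p_value:.4f}")

# A tiny p-value means your sample's age mix is very unlikely to have
# come from the population you care about: a red flag for undercoverage.
```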

Nonresponse bias

Nonresponse bias happens when the people who choose to complete your survey are meaningfully different from the people who choose not to. Say that 20% of the people you sent the survey to responded. When you use that data, you’re assuming that the 80% who didn’t respond would have answered the same way as the 20% who did. Sometimes that’s true, and sometimes it isn’t.
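
To see why that assumption matters, it helps to run the arithmetic. Here’s a small sketch, with invented numbers, that computes how far off the measured result could be in the worst case:

```python
# Invented example: 1,000 people invited, 20% responded, and 70% of
# respondents said they were satisfied with the product.
invited = 1000
response_rate = 0.20
satisfied_among_respondents = 0.70

respondents = invited * response_rate      # 200 people
nonrespondents = invited - respondents     # 800 people

# If the nonrespondents would have answered just like the respondents,
# the overall satisfaction rate matches what we measured: 70%.
naive_estimate = satisfied_among_respondents

# But if they'd all have answered differently, the true rate could be
# anywhere in this range:
low = (respondents * satisfied_among_respondents) / invited                    # 14%
high = (respondents * satisfied_among_respondents + nonrespondents) / invited  # 94%

print(f"Measured: {naive_estimate:.0%}, possible range: {low:.0%} to {high:.0%}")
```

Real nonrespondents are rarely that extreme, but the width of that range is exactly why the follow-up techniques below are worth the effort.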

Email surveys tend to have low response rates, and can be especially vulnerable to this problem. There are a number of ways to check for and correct nonresponse bias, generally involving following up with the people who didn’t respond.

The Takeaway

It’s not easy to write good surveys. It takes some thought and effort to make sure you’re doing it right. The key lessons from this article are:

  • Write clear and simple questions
  • Ask the right types of questions for what you’re trying to learn
  • Make sure your questions are complete
  • Make the survey short and sweet
  • Avoid introducing bias

Follow these lessons, and you’ll be well on your way to collecting some good, clean data. Get out there and start surveying!

If you liked this article, give it some applause. Share with your friends! You can find more articles by ThinkUX on our blog.

Feel free to get in touch on LinkedIn.
