Writing Good Survey Questions: 10 Best Practices (2024)

Summary: Designing a good survey is harder than it may seem. Ensure accurate and reliable data by writing questions that are appropriate for the method and worded to minimize bias.

Unfortunately, there is no simple formula for cranking out good, unbiased questionnaires.

That said, there are certainly common mistakes in survey design that can be avoided if you know what to look for. Below, I’ve provided the 10 most common and dangerous errors that can be made when designing a survey and guidelines for how to avoid them.

In This Article:

  • 1. Ask About the Right Things
  • 2. Use Language that Is Neutral, Natural, and Clear
  • 3. Don’t Ask Respondents to Predict Behavior
  • 4. Focus on Closed-Ended Questions
  • 5. Avoid Double-Barreled Questions
  • 6. Use Balanced Scales
  • 7. Answer Options Should Be All-Inclusive and Mutually Exclusive
  • 8. Provide an Opt-Out
  • 9. Allow Most Questions to Be Optional
  • 10. Respect Your Respondents

1. Ask About the Right Things

Ask Only Questions that You Need Answered

One of the easiest traps to fall into when writing a survey is to ask about too much. After all, you want to take advantage of this one opportunity to ask questions of your audience, right?

The most important thing to remember about surveys is to keep them short. Ask only about the things that are essential for answering your research questions. If you don’t absolutely need the information, leave it out.

Don’t Ask Questions that You Can Find the Answer to

When drafting a survey, many researchers slip into autopilot and start by asking a plethora of demographic questions. Ask yourself: do you need all that demographic information? Will you use it to answer your research questions? Even if you will use it, is there another way to capture it besides asking about it in a survey? For example, if you are surveying current customers, and they are providing their email addresses, could you look up their demographic information if needed?

Don’t Ask Questions that Respondents Can’t Answer Accurately

Surveys are best for capturing quantitative attitudinal data. If you’re looking to learn something qualitative or behavioral, there’s likely a method better suited to your needs. Asking the question in a survey is, at best, likely to introduce inefficiency in your process, and, at worst, will produce unreliable or misleading data.

For example, consider a survey question that asks respondents whether a particular button stood out to them on a page.

If I were asked this question, I could only speculate about what might make a button stand out. Maybe a large size? Maybe a different color, compared to surrounding content? But this is merely conjecture. The only reliable way to tell if the button actually stood out for me would be to mock up the page and show it to me. This type of question would be better studied with other research methods, such as usability testing or A/B testing, but not with a survey.

2. Use Language that Is Neutral, Natural, and Clear

Avoid Biasing Respondents

There are endless ways in which bias can be introduced into survey data, and it is the researcher’s task to minimize this bias as much as possible. For example, consider the wording of the following question.

⛔️ Biased Question

We are committed to achieving a 5-star satisfaction rating. How would you rate your satisfaction?

  • ★
  • ★★
  • ★★★
  • ★★★★
  • ★★★★★

By initially providing the context that the organization is committed to achieving a 5-star satisfaction rating, the survey creators are, in essence, pleading with the respondent to give them one. The respondent may feel guilty providing an honest response if they had a less than stellar experience.

Note also the use of the word satisfaction. This wording subtly biases the participant into framing their experience as a satisfactory one.

An alternative wording of the question might remove the first sentence altogether, and simply ask respondents to rate their experience.

Use Natural, Familiar Language

We must always be on the lookout for jargon in survey design. If respondents cannot understand your questions or response options, you will introduce bad data into your dataset. While we should strive to keep survey questions short and simple, it is sometimes necessary to provide brief definitions or descriptions when asking about complex topics, to prevent misunderstanding. Always pilot your questionnaires with the target audience to ensure that all jargon has been removed.

Speak to Respondents Like Humans

For some reason, when drafting a questionnaire, many researchers introduce unnecessary formality and flowery language into their questions. Resist this urge. Phrase questions as clearly and simply as possible, as though you were asking them in an interview format.

3. Don’t Ask Respondents to Predict Behavior

People are notoriously unreliable predictors of their own behavior. For various reasons, predictions are almost bound to be flawed, leading Jakob Nielsen to remind us to never listen to users.

Yet, requests for behavioral predictions are rampant in insufficiently thought-out UX surveys. Consider the question: How likely are you to use this product? While a respondent may feel likely to use a product based on a description or a brief tutorial, their answer does not constitute a reliable prediction and should not be used to make critical product decisions.

Often, instead of future-prediction requests, you will see present-estimate requests: How often do you currently use this product in an average week? While this type of question avoids the problem of predictions, it still is unreliable. Users struggle to estimate based on some imaginary “average” week and will often, instead, recall outlier weeks, which are more memorable.

The best way to phrase a question like this is to ask for specific, recent memories: Approximately how many times did you use this product in the past 7 days? It is important to include the word approximately and to allow for ranges rather than exact numbers. Reporting an exact count of a past behavior is often either challenging or impossible, so asking for it introduces imprecise data. It can also make respondents more likely to drop off if they feel incapable of answering the question accurately.

⛔️ Future Prediction

How likely are you to use this product?

⚠️ Present Estimate

How often do you currently use this product in an average week?

✅ Past Estimate

Approximately how many times did you use this product in the past 7 days?
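To make the advice about ranges concrete, here is a minimal sketch of how the past-estimate question above might be represented as data, with range-based answer options and an opt-out. The structure and field names are purely illustrative assumptions, not the format of any particular survey tool.

```python
# A purely illustrative representation of the past-estimate question.
# The field names ("text", "options", "required") are assumptions, not a real API.
past_estimate_question = {
    "text": (
        "Approximately how many times did you use this product "
        "in the past 7 days?"
    ),
    "options": [
        "0 times",
        "1-2 times",
        "3-5 times",
        "6-10 times",
        "More than 10 times",
        "I don't recall",  # opt-out for respondents who can't answer accurately
    ],
    "required": False,  # keeping the question optional reduces random answers
}
```

Note that the numeric ranges do not overlap and the top option is open-ended, which anticipates the all-inclusive, mutually-exclusive guidance in point 7 below.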

4. Focus on Closed-Ended Questions

Surveys are, at their core, a quantitative research method. They rely upon closed-ended questions (e.g., multiple-choice or rating-scale questions) to generate quantitative data. Surveys can also leverage open-ended questions (e.g., short-answer or long-answer questions) to generate qualitative data. That said, the best surveys rely upon closed-ended questions, with a smattering of open-ended questions to provide additional qualitative color and support to the mostly quantitative data.

If you find that your questionnaire relies too heavily on open-ended questions, it might be a red flag that another qualitative-research method (e.g., interviews) may serve your research aims better.

On the subject of open-ended survey questions, it is often wise to include one broad open-ended question at the end of your questionnaire. Many respondents will have an issue or piece of feedback in mind when they start a survey, and they’re simply waiting for the right question to come up. If no such question exists, they may end the survey experience with a bad taste. A final, optional, long-answer question with a prompt like Is there anything else you’d like to share? can help to alleviate this frustration and supply some potentially valuable data.

5. Avoid Double-Barreled Questions

A double-barreled question asks respondents to answer two things at once. For example: How easy and intuitive was this website to use? Easy and intuitive, while related, are not synonymous, and, therefore, the question is asking the respondent to use a single rating scale to assess the website on two distinct dimensions simultaneously. By necessity, the respondent will either pick one of these words to focus on or try to assess both and estimate a midpoint “average” score. Neither of these will generate fully accurate or reliable data.

Therefore, double-barreled questions should always be avoided and, instead, split up into two separate questions.

⛔️ Double-Barreled

How easy and intuitive was this website to use?

✅ Separated

How easy was this website to use?

How intuitive did you find this website to be?

6. Use Balanced Scales

Rating-scale questions are tremendously valuable in generating quantitative data in survey design. Often, a respondent is asked to rate their agreement with a statement on an agreement scale (e.g., Strongly Agree, Agree, Neutral, Disagree, Strongly Disagree), or otherwise to rate something using a scale of adjectives (e.g., Excellent, Good, Neutral, Fair, Poor).

You’ll notice that, in both of the examples given above, there is an equal number of positive and negative options (2 each), surrounding a neutral option. The equal number of positive and negative options means that the response scale is balanced and eliminates a potential source of bias or error.

In an unbalanced scale, you’ll see an unequal number of positive and negative options (e.g., Excellent, Very Good, Good, Poor, Very Poor). This example contains 3 positive options and only 2 negative ones. It, therefore, biases the participant to select a positive option.

⛔️ Unbalanced

How would you rate your experience using this website?

  • Excellent
  • Very good
  • Good
  • Poor
  • Very poor

✅ Balanced

How would you rate your experience using this website?

  • Excellent
  • Good
  • Neutral
  • Fair
  • Poor
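If you store your scales as data, you can even automate a quick balance check. Below is a minimal sketch, assuming each option has been hand-labeled as positive (+1), neutral (0), or negative (-1); the sentiment labels mirror my reading of the two scales above and are an assumption, not part of the original examples.

```python
# Minimal sketch: a rating scale is balanced when it has equally many
# positive and negative options. Sentiment labels are assigned by hand.
def is_balanced(option_sentiments):
    positives = sum(1 for s in option_sentiments if s > 0)
    negatives = sum(1 for s in option_sentiments if s < 0)
    return positives == negatives

# Excellent, Very good, Good, Poor, Very poor -> 3 positive, 2 negative
unbalanced = [+1, +1, +1, -1, -1]
# Excellent, Good, Neutral, Fair, Poor -> 2 positive, 1 neutral, 2 negative
balanced = [+1, +1, 0, -1, -1]

print(is_balanced(unbalanced))  # False
print(is_balanced(balanced))    # True
```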

7. Answer Options Should Be All-Inclusive and Mutually Exclusive

Answer options for a multiple-choice question should include all possible answers (i.e., all-inclusive) and should not overlap (i.e., mutually exclusive). For example, consider the following question:

⛔️ Answer options skipped and overlapping

How old are you?

  • 0–20
  • 20–30
  • 30–40
  • 40–50

In this formulation, some possible answers are skipped (i.e., anyone who is over 50 won’t be able to select an answer). Additionally, some answers overlap (e.g., a 20-year-old could select either the first or second response).

Always double-check your numeric answer options to ensure that all numbers are included and none are repeated.
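If your answer options are numeric brackets, that double-check can also be scripted. The sketch below is a rough illustration, assuming each option is written as an inclusive (low, high) pair with None marking an open-ended top bracket; the function and the corrected bracket values are my own, not part of the original example.

```python
# Minimal sketch: flag gaps and overlaps in numeric answer brackets.
# Each bracket is an inclusive (low, high) pair; None means "or older".
# Assumes the open-ended bracket, if present, is the last one.
def check_brackets(brackets, minimum=0):
    problems = []
    brackets = sorted(brackets, key=lambda b: b[0])
    if brackets[0][0] > minimum:
        problems.append(f"values below {brackets[0][0]} are skipped")
    for (lo1, hi1), (lo2, hi2) in zip(brackets, brackets[1:]):
        if lo2 <= hi1:
            problems.append(f"{lo1}-{hi1} overlaps {lo2}-{hi2}")
        elif lo2 > hi1 + 1:
            problems.append(f"values {hi1 + 1}-{lo2 - 1} are skipped")
    if brackets[-1][1] is not None:
        problems.append(f"values above {brackets[-1][1]} are skipped")
    return problems

# The flawed brackets from the example above: overlaps at every boundary,
# plus everyone over 50 is skipped.
print(check_brackets([(0, 20), (20, 30), (30, 40), (40, 50)]))
# A corrected set -- Under 20, 20-29, 30-39, 40-49, 50 or older -- passes cleanly.
print(check_brackets([(0, 19), (20, 29), (30, 39), (40, 49), (50, None)]))  # []
```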

8. Provide an Opt-Out

No matter how carefully and inclusively you craft your questions, there will always be respondents for whom none of the available answers are acceptable. Maybe they are an edge case you hadn’t considered. Maybe they don’t remember the answer. Or maybe they simply don’t want to answer that particular question. Always provide an opt-out answer in these cases to avoid bad data.

Opt-out answers can include things like the following: Not applicable, None of the above, I don’t know, I don’t recall, Other, or Prefer not to answer. Any multiple-choice question should include at least one of these answers. However, avoid the temptation to include one catch-all opt-out answer containing multiple possibilities. For example, an option labeled I don’t know / Not applicable covers two very different responses with different meanings; combining them muddies your data.

9. Allow Most Questions to Be Optional

It is so tempting to make questions required in a questionnaire. After all, we want the data! However, the choice to make any individual question required will likely lead to one of two unwanted results:

  • Bad Data: If a respondent is unable to answer a question accurately, but the question is required, they may select an answer at random. These types of answers will be impossible to detect and will introduce bad data into your study, in the form of random-response bias.
  • Dropoffs: The other option available to a participant unable to correctly answer a required question is to abandon the questionnaire. This behavior will increase the effort needed to reach the desired number of responses.

Therefore, before deciding to make any question required, consider if doing so is worth the risks of bad data and dropoffs.

10. Respect Your Respondents

In the field of user experience, we like to say that we are user advocates. That doesn’t just mean advocating for user needs when it comes to product decisions. It also means respecting our users any time we’re fortunate enough to interact with them.

Don’t Assume Negativity

This is particularly important when discussing health issues or disability. Phrasings such as Do you suffer from hypertension? may be perceived as offensive. Instead, use objective wording, such as Do you have hypertension?

Be Sensitive with Sensitive Topics

When asking about any topics that may be deemed sensitive, private, or offensive, first ask yourself: Does it really need to be asked? Often, we can get plenty of valuable information while omitting that topic.

Other times, it is necessary to delve into potentially sensitive topics. In these cases, be sure to choose your wording carefully. Ensure you’re using the terminology currently preferred by members of the population you’re addressing. If necessary, consider providing a brief explanation for why you are asking about that particular topic and what benefit will come from responding.

Use Inclusive and Appropriate Wording for Demographic Questions

When asking about topics such as race, ethnicity, sex, or gender identity, use accurate and sensitive terminology. For example, it is no longer appropriate to offer a simple binary option for gender questions. At a minimum, a third option indicating an Other or Non-binary category is expected, as well as an opt-out answer for those who prefer not to respond.

An inclusive question is respectful of your users’ identities and allows them to answer only if they feel comfortable.
