8 Types of Response Bias and how to avoid them in your surveys

Response bias can lead product teams to make inaccurate product-led decisions that jeopardize user retention, growth, and MRR. Here’s how to avoid it.

Response bias is the misrepresentation of customer survey data caused by factors that skew how customers respond.

Response bias is a massive risk for product teams conducting market research, and there are eight different types your customer surveys can fall victim to.

If your surveys encourage response bias, it can drastically affect your product growth decisions and result in a decline in customer retention, acquisition, and ultimately MRR. 

You’ll never be able to eliminate response bias altogether. However, there are certainly ways you can minimize it and give your surveys the best chance of being wholly accurate and truthful.

In this article, we address this survey nightmare for any product marketing VP out there looking to squash the bug upfront.

We’ll break down the eight types of response bias you need to look out for and share practical solutions to minimize these biases with your customer survey strategies.   

What is response bias?

Response bias is a misrepresentation of input from a survey respondent due to a manipulative factor. Response bias can be intentional or unintentional, and there are eight types of bias for businesses to be aware of when creating surveys. Response bias can be detrimental to survey results and survey-led business decisions as product and marketing teams end up making decisions based on inaccurate data. 

8 Types of response bias to be aware of

The eight types of response bias you’ll need to be aware of in order to build surveys that minimize the risk of inaccurate responses are:

  1. Demand bias 
  2. Social desirability bias
  3. Dissent bias
  4. Agreement bias 
  5. Extreme responses
  6. Neutral responding 
  7. Personal bias 
  8. Non-response bias 

Let’s explore each type of response bias and some B2B SaaS use cases that could be encouraging this bias. 

Demand Bias

Demand bias occurs when respondents alter their answers to deliver the results they know the survey is looking for, in an effort to appease the survey maker.

Demand bias often occurs when respondents: 

  • Have too much context around the survey’s goals
  • Can see live survey results (as is often the case with polls) 
  • Have a personal connection with the person conducting the survey 
  • Care passionately about the business conducting the survey 
  • Are eager to please

Here is an example of a B2B SaaS question that shows demand bias:

Response bias psychology at its finest! Although this often happens by simply being a survey respondent for a brand the customer feels passionately about, in the above example, you can get an idea of demand bias factors to look out for.

There are practical steps your product team can take to fight demand bias, which we’ll explore further on. 

Social desirability bias 

Social desirability bias occurs when survey respondents know their response could reflect negatively or positively on themselves or on a community they’re passionate about.

This type of bias occurs when respondents want their answers to be more socially acceptable.  

Social desirability bias can lead to dishonest answers and occur when: 

  • Surveys are not anonymous 
  • Respondents know their answer will be showcased to a community whose opinion they care about 
  • Respondents know their answer will affect the reputation of a community they care about 

Here is a response bias example question that shows social desirability bias: 

Dissent bias 

Dissent bias is when respondents purposefully select negative answers to every question in your survey.

This act of sabotage can drastically affect your survey results and can happen for a few reasons: 

  • The introductory wording in your survey has emotionally upset your respondent, and they are now answering defensively or with negative sentiment toward your brand. 
  • Your respondents don’t have time to answer your survey, but you’re making them do so regardless, perhaps via an in-app modal popup that’s stopping them from getting their job-to-be-done actually done. 
  • Your negative multiple-choice answers are in the same on-page position every time. 

A non-multiple-choice question that could encourage dissent bias might look like this: 

Agreement bias 

Agreement bias is when a survey respondent repeatedly answers positively to every survey question, with little thought about the questions at hand. Also known as acquiescence bias, agreement bias sits at the opposite end of the spectrum from dissent bias.

Agreement bias occurs most when: 

  • A respondent is repeatedly given yes/no answer options, and “yes” is consistently in the same place. This happens when a respondent suffers from survey fatigue and doesn’t want to think about their answers. 
  • A respondent becomes agreeable. It’s human nature to say yes rather than no, a tendency sometimes called agreement psychology. When answers encourage a respondent to agree or disagree with a statement, they’re more likely to give a pleasing response. 

An example of agreement bias, outside of survey fatigue, may look something like this: 

It’s worth noting that this example of response bias often occurs when the respondent has direct contact with the researcher. 

Extreme response bias  

Extreme response bias occurs most when you use Likert scale questions in your survey. Likert questions are closed questions that often start with a statement and ask the respondent how much they agree or disagree with that statement using a scale.

Extreme response bias occurs especially when your scale is small: for example, a 1–5 scale rather than a 1–10 scale. This type of bias is hard to combat, though it’s not impossible. 

A few things that can instigate extreme bias responses are: 

  • Culture: some geographical cultures are more inclined toward extreme responding than others
  • Education: respondents with less formal education are more likely to show extreme response bias 
  • Provocative questioning: questions that provoke or challenge motivations or beliefs are more likely to instill extreme response bias

An example question that could lead to inaccurate data due to extreme response bias is: 

Neutral response bias 

Neutral response bias is when survey respondents reply passively to every question you ask. On a Likert scale from one to three, this would be answering two every time. This type of response is damaging to your data, and a waste of time for your research team as it tells them nothing. 

Neutral response bias can occur when: 

  • You’ve selected the wrong group of people for your questionnaire, and they’re simply not interested in the topic.
  • Your question response options at one and three are too extreme, and there’s no middle-ground other than neutral.
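Dissent bias, agreement bias, and neutral responding all leave the same fingerprint in exported data: respondents whose answers barely vary. If you can export your Likert responses, a quick script can flag likely straight-liners for closer review. This is a minimal sketch; the respondent data below is made up for illustration:

```python
from statistics import pstdev

# Hypothetical Likert responses (scale of 1-5), keyed by respondent ID.
responses = {
    "r1": [2, 4, 1, 5, 3],   # varied answers
    "r2": [5, 5, 5, 5, 5],   # agreement bias: all top-of-scale
    "r3": [1, 1, 1, 1, 1],   # dissent bias: all bottom-of-scale
    "r4": [3, 3, 3, 3, 3],   # neutral responding: all midpoint
}

def flag_straight_liners(responses, min_stdev=0.5):
    """Return respondent IDs whose answers show almost no variation."""
    return [rid for rid, answers in responses.items()
            if pstdev(answers) < min_stdev]

print(flag_straight_liners(responses))  # -> ['r2', 'r3', 'r4']
```

Flagged respondents aren’t necessarily biased, but their rows are worth excluding or weighting down before you draw conclusions from the data.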

An example question that would encourage neutral response bias would look something like this: 

Personal bias 

Personal bias is when your survey respondent’s personal interests, opinions, experiences, and beliefs override their logical response.

Personal response bias is tricky to avoid. It will always need to be considered in your question’s wording and when analyzing your data. 

A few factors that can affect personal bias responses are: 

  • Experience: An executive or senior company official will have a very different outlook from entry-level workers, even if they work in the same field. 
  • Opinions & beliefs: Personal opinions may come into play if respondents are provoked by your question copy. 

There are no clear questions that fall victim to personal bias. It largely comes down to your audience selection and how you tailor your copy for that specific group. 

Non-response bias

Non-response bias occurs when an entire cohort of survey recipients doesn’t answer the survey. It is potentially the most damaging of all biases, as it excludes an entire group of users and their opinions.

That’s not to say these users aren’t interested in your product or don’t have opinions to share; their non-response might be for very practical reasons. 

A few reasons for non-response bias are: 

  • A cohort of users has less convenient access to the survey than another cohort 
  • The survey UI doesn’t function on a particular device type 
  • There’s a bug in your survey send-out that limits your responses 
  • The opening of your survey excludes the receiver and presents the survey as not being relevant to them
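A simple sanity check before analyzing results is to compare response rates across cohorts; a cohort responding far below the overall rate is a red flag for one of the causes above. A sketch, with made-up send and completion counts:

```python
# Hypothetical counts of surveys sent vs. completed, per cohort.
cohorts = {
    "desktop users": {"sent": 400, "responded": 180},
    "mobile users":  {"sent": 350, "responded": 35},   # suspiciously low
    "trial users":   {"sent": 250, "responded": 95},
}

def response_rates(cohorts):
    """Response rate per cohort as a fraction of surveys sent."""
    return {name: c["responded"] / c["sent"] for name, c in cohorts.items()}

def flag_underrepresented(cohorts, threshold=0.5):
    """Flag cohorts whose rate falls below `threshold` times the overall rate."""
    overall = (sum(c["responded"] for c in cohorts.values())
               / sum(c["sent"] for c in cohorts.values()))
    return [name for name, rate in response_rates(cohorts).items()
            if rate < overall * threshold]

print(flag_underrepresented(cohorts))  # -> ['mobile users']
```

In this example, the mobile cohort’s rate is well below the overall average, which might point to a survey UI that breaks on mobile devices.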

Like personal bias, there are no particular question types that will initiate a non-response.

How to minimize response bias (or get rid of it altogether) with your survey questions and strategy 

As you may have gathered, you can’t always eradicate response bias. However, there are certainly ways you can minimize it in your future surveys.

Let’s explore a few strategies for minimizing response bias while maximizing the potential insights from your target audience. 

1. Align your target audience 

First up, you’ll want to make sure your survey is going out to the people who can provide you with the responses you need. 

Wynter allows you to run B2B message testing with your ideal hand-validated B2B audience. You’ll be able to target your survey based on job title, seniority, industry, and employee headcount.  

2. Pay special attention to introductions

Your introduction is your first impression with your respondents. Make a good impression, and give them just enough context as to why they’re receiving the survey. 

3. Use different question types

In order to avoid survey fatigue and give your product team a wealth of insights, you can use different question types, such as: 

  • Likert scales
  • Open- and closed-ended questions
  • Multiple-choice
  • Multi-select
  • Single-select

4. Write conscious copy

Survey questions need to be carefully written to avoid response bias. Avoid emotionally charged copy, always remember who you’re speaking to, and try not to load your questions with presumptions.

Presumptuous questions put respondents on the defensive and pave the way for emotionally charged bias, or for respondents sabotaging your results out of spite.

5. Give your respondents time and (possibly) incentive 

Incentivising a survey response is largely down to your demographic. It could encourage one cohort to sit up and pay attention to their responses, but it could encourage another to rapidly fire through the survey to get the reward. Either way, give your respondents time to answer your survey at their pace. 

6. Test your survey’s functionality

Testing is absolutely key to making sure everyone can functionally use your survey with their device. 

7. Mix up the placement of your answers

If you’re running Likert scale questions, ensure you mix up how you place your answers. For example, in the visual mock-up below, we’ve got “YES” on the left for the first two questions and placed it on the right for the next three.  
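If you run surveys through your own tooling, answer placement can be randomized per respondent when the survey is rendered. A minimal sketch, with hypothetical questions (seeding the shuffle with the respondent ID keeps the order stable if the same person reloads the survey):

```python
import random

# Hypothetical yes/no questions for illustration.
QUESTIONS = [
    ("Do you find the dashboard easy to use?", ["Yes", "No"]),
    ("Would you recommend us to a colleague?", ["Yes", "No"]),
    ("Does the onboarding flow make sense?", ["Yes", "No"]),
]

def render_survey(respondent_id):
    """Return questions with answer options shuffled per respondent,
    so "Yes" isn't always in the same on-page position."""
    rng = random.Random(respondent_id)  # deterministic per respondent
    rendered = []
    for question, options in QUESTIONS:
        shuffled = options[:]
        rng.shuffle(shuffled)
        rendered.append((question, shuffled))
    return rendered

for question, options in render_survey("user-42"):
    print(question, options)
```

The same idea extends to Likert scales: flip the scale’s direction for a random subset of questions, then normalize the answers back to one direction before analysis.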

8. Open the floor for qualitative feedback

Don’t limit your surveys to only closed-ended questions in an attempt to quantify your results. Ask respondents to dive deeper with options to leave qualitative feedback too. 

A few open-ended survey questions to consider are: 

  • What gets in the way of doing [job] effectively? 
  • What are your three most important activities? 
  • What is your biggest challenge, frustration, or problem with [job]?

9. Give optional anonymity

Anonymity will go a long way in avoiding a lot of bias. When someone knows they’re not going to be judged for their responses, they’ll be more likely to impart their true feelings.  

10. Don’t showcase your results live

People are easily swayed. If your survey displays live results—like a social media poll—respondents may be inclined to go with the masses to appease your results. Or, they might be inclined to go the other direction as an act of defiance. Either way, it’s not the result you’re hoping for. 

Looking for proven ways to avoid bias in your surveys? Wynter gives you ready-made templates to help you spot a customer’s needs, pains, gains, and jobs-to-be-done with zero bias.

Wrapping up how to avoid bias in a survey 

That’s everything you need to know about mitigating response bias in your surveys, from the definition of response bias to survey best practices.

Hopefully, you’ve found this walk-through useful and can walk away with some practical tactics to implement in your product surveys today. 

If you’re looking to implement regular surveys or just a one-off survey with minimal response bias affecting your data, consider running your research with Wynter, today.
