The definitive guide to message testing

To know whether your copy is pulling its weight, you need to test your messaging.

If message testing isn’t part of your marketing strategy, you’re likely wasting resources on copy that doesn’t resonate, spending thousands on campaigns that don’t engage, and publishing generic landing pages.

Message testing gives you insight into which messaging actually makes an impact.

In this article, we'll share how to get started with message testing, methodologies that improve your message’s effectiveness, and three practical examples of revamped brand messaging. 

What is message testing?

Message testing analyzes a company’s marketing message to evaluate how well it resonates with its target audience. 

Marketing messages are your company’s offer to potential customers. A compelling marketing message makes customers feel understood, while showing how your product benefits them and how it can solve their problems.

Your message will only resonate if it clearly communicates solutions that matter most to your audience. 

Let’s say you’re creating messaging for an email marketing platform targeted at senior marketing leaders. If your messaging focuses on ease of use, it’s unlikely they’ll care. Marketing leaders are measured against return on investment (ROI).

With message testing, you put your landing page copy and value proposition in front of your ideal customer profile, or buyer persona, and collect qualitative insights to improve it.

Message testing is different from user testing because it tests the messaging only, not how your customers use your website.

It’s all in the messaging: Why you need message testing

Too many marketers aim to be clever and original with their marketing. However, cleverness can be confusing. Clarity, on the other hand, communicates precisely why your audience should care.

For example, a clever marketing subheading might read: 

“Entrepreneurial-approved time-savers.”

This leaves customers confused about what they’re being sold and why it matters to them. 

Clear brand statements set up the value and intentions from the beginning. 

For example, Webflow’s landing page states, “The site you want—without the dev time.” 

Screenshot of Webflow Clear Value Proposition
Webflow’s clear value proposition

Its purpose is clear: to provide a no-code web design tool that allows customers to build the site they’ve always wanted in a fraction of the time. Why should they care? Because, unlike with other tools, you don’t need coding experience.

When marketing messages work, they are 2x as influential as design in converting customers. When they don’t, you’re missing out on conversion potential—and your business's success depends on how well you can convert. 

However, message testing isn’t only for conversions. Conducted well, it can:

  • Help you appeal to your target customers by addressing their preferences, values, pain points, and objectives.
  • Refine your product messaging and serve as a brand guide for future marketing campaigns.

In contrast, when messaging isn’t working, it can confuse, frustrate, and alienate your customers.

Getting started with message testing

Message testing is a form of qualitative research

Qualitative research relies on observation or non-numerical market research analysis, such as in-depth interviews. It’s ideal for getting a more rounded understanding of customer motivations. 

In message testing, your research should focus on these key dimensions of your message: 

  • Clarity. How well you communicate your offer to your customers and how well they understand it.
  • Relevance. How your value proposition aligns with customers’ needs.
  • Value. How important, timely, and beneficial your message is for your users.
  • Differentiation. Why your offer is different from the rest.

If any of these is lacking, your message might fail to land. Collecting information on all of these dimensions may be a weeks-long process, but there are tools to make this easier.

Wynter is a tool for message testing with B2B audiences

Wynter delivers results from target customers (people you're trying to influence with your messaging) in 12–48 hours, far quicker than traditional research methods. We have our own B2B panel of validated professionals in various industries, with different seniority levels, job titles, and company sizes, to help you test your messaging with your target customers.

We test your website and landing page messaging, overall brand positioning and narrative, sales and marketing funnels, and outbound email messages and sales demos. 

How a message test works with Wynter:

  1. Choose your target audience based on job title and seniority levels.
  2. Specify the audience details and configure your test by choosing your preferred employee count and industry. You can also ask one of our CRO professionals to summarize your message testing results.
  3. Enter the URL of the page for message testing market research or upload an image of it. We offer full-page message tests where you can select up to three areas for additional feedback.
  4. Set up your test questions. Choose from our pre-made questions and question templates, or write your own. 
  5. Once you receive the results, optimize your pages and start testing the effectiveness of the new messaging. 

Wynter is the most efficient way to gather the needed qualitative data to refine your message.

Ask qualitative open-ended questions

Open-ended questions cannot be answered in a single word or number. They permit a broad range of responses and capture a nuance that Likert-scale ratings lack. As a result, you can draw out as much information as possible from your customers. 

Examples of strong open-ended questions are:

  • What do you think this message is trying to say?
  • Are there any unclear parts of the message?
  • What do you find to be the least/most effective parts of the message?
  • How do you feel about the message?
  • If you could make one change to the message, what would it be?
  • How has the message changed your perception of the product?

These questions help you gauge how well (or not) your customers understand your product. If your customers struggle with their answers (or miss the intended point), it’s a sign you need to rework your messaging. 

A/B testing is not a replacement for message testing

A/B testing is a measurement methodology. Message testing is a diagnostic exercise.

A/B testing tells you whether A is better than B, and by how much. It doesn't tell you what's wrong with your messaging, what your customers really care about, or how they think about the problems you solve.

Message testing gives you the why, what, and where: where your messaging fails and what specifically you need to improve.

A/B testing needs a significant sample size to reach valid conclusions, usually at least 500 transactions (signups, demo requests, etc.) per month.
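
To put that threshold in perspective, here's a rough sketch of the standard two-proportion sample-size calculation in Python. The 2% baseline conversion rate and 20% relative lift are assumptions for illustration, not figures from this article.

```python
# Rough sketch: visitors needed per variant for a two-proportion A/B test.
# The 2% baseline conversion rate and 20% relative lift are illustrative
# assumptions, not figures from this article.

def ab_sample_size(baseline, relative_lift, z_alpha=1.96, z_beta=0.84):
    """Approximate visitors per variant at 95% confidence and 80% power."""
    p1 = baseline
    p2 = baseline * (1 + relative_lift)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return int((z_alpha + z_beta) ** 2 * variance / (p2 - p1) ** 2)

n = ab_sample_size(baseline=0.02, relative_lift=0.20)
print(f"~{n:,} visitors per variant")
# ~21,000 visitors per variant, or roughly 420-500 conversions each:
# low-traffic pages can take months to reach a valid conclusion.
```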

I consistently come across people who are unfamiliar with the sample size requirements for qualitative research and assume it needs statistical significance like a quantitative test such as an A/B test.

Instead of statistical significance, the methodological principle used is saturation. The standard finding is that it takes 12-13 responses to reach saturation, meaning that whether you survey 13 or 130 people, you get roughly the same number of insights and themes. Researchers debate the exact number of participants, but most in the scientific community agree it's below 20.

A review of 23 peer-reviewed articles suggests that 9–17 participants can be sufficient to reach saturation, especially for studies with homogenous populations and narrowly defined objectives.
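
To see what saturation looks like in practice, here's a toy simulation (the themes and numbers are made up for illustration): each simulated respondent mentions a few themes from a fixed pool, and the running count of distinct themes flattens out after roughly a dozen responses.

```python
# Toy simulation of saturation; the themes and numbers are illustrative.
# Each respondent mentions a few themes from a fixed pool, and the running
# count of distinct themes plateaus well before 20 participants.

import random

THEMES = [f"theme_{i}" for i in range(10)]  # hypothetical recurring themes

def cumulative_theme_counts(n_respondents, themes_per_person=3, seed=1):
    random.seed(seed)
    seen, counts = set(), []
    for _ in range(n_respondents):
        seen.update(random.sample(THEMES, themes_per_person))
        counts.append(len(seen))
    return counts

print(cumulative_theme_counts(20))
# e.g. [3, 6, 7, 9, 9, 10, 10, ...]: new responses stop surfacing new
# themes long before the 20th participant.
```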

How to turn your messaging into something that resonates with customers: 3 case studies

In this section, we'll look at three B2B companies whose messaging we tested at Wynter. We'll explore each case from problem to solution, including target audience responses and our recommendations.

Case study #1: PandaDoc

Screenshot of PandaDoc Landing Page for Marketing Teams
PandaDoc’s landing page for marketing teams

PandaDoc is a cloud-based SaaS company helping users create proposals, quotes, contracts, and more. But you wouldn’t know it based on their early messaging. 

For PandaDoc, we put together a 50-person marketing panel, analyzed the results, and found that the overall brand messaging was confusing.

Screenshot of PandaDoc Marketing Panel Result in Google Sheet
A Google Sheet of the panel results

The company uses unclear, generic words like “on-brand docs.” “On-brand” left marketers uncertain, and the word “docs” made some think the company was targeting doctors. 

How PandaDoc can improve its messaging

PandaDoc is suffering from a lack of clarity. They could message mine for commonly used words; in this case, “professional look” appeared multiple times in their reviews, making it a great term to fold into their marketing copy. 

Screenshot of TrustRadius Website showing PandaDoc Review about its professional look
A review using the “professional look” customer language

Apart from this, “on-brand docs” isn’t the most essential value proposition for the target market. It left marketers wondering, “why does this matter?”:

Screenshot of panel comment calling out the flatness of the “on-brand” value proposition
A panel comment calling out the flatness of the “on-brand” value proposition

 Because there is plenty of competition in this market, we’d recommend differentiation and clarification: 

  • Why PandaDoc over alternatives? 
  • Why use PandaDoc instead of using generic templates? 
  • What is PandaDoc, and why should we care about its features? 

Another missed opportunity is the sub-heading “easy-to-use, hyper-personalized sales documents.” 

Although clear, it says nothing about how PandaDoc delivers more personalized documents than anyone else. Spell out the relevance: identify the customer's problem and sell the solution in the customer's own words. 

Finally, PandaDoc should back everything up with social proof. Add the specific number of businesses that use the software (or that have switched from a competitor) and include authentic customer testimonials to build trust. 

Case study #2: Metadata.io

Metadata is an AI demand generation platform that automates mundane tasks for B2B marketing companies. 

Screenshot of Metadata homepage

The first thing we noticed was the graphics. There are too many elements on the page fighting for our attention, namely Benjamin Franklin and the woman. This detracts from our focus on the value proposition. 

The copy also lacks consistency; the rest of the homepage doesn’t match the quirkiness of the material above the fold.

Screenshot of an illustration showing with or without MetaData comparison
The below-the-fold material is sleek but lacks the quirkiness of the hero section material.

How Metadata can improve its messaging

Apart from toning down their hero section and aiming for a more consistent tone of voice throughout, Metadata could also aim for specificity and clarity.

“Drive more revenue” is an overused, obvious, non-specific phrase. Instead of this buzzphrase, Metadata should explain what the platform does. By the end of the landing page, we still don’t know how it automates paid campaigns, and neither did the audience.

Screenshot of a Panel Comments about How MetaData Works
Panel comments confused about how the platform works

Finally, Metadata should include better examples of differentiation. They use a vague chart comparing themselves to other companies but never mention what makes them different from competitors. 

Screenshot of a Panel comments about the G2 chart of MetaData Platform
Panel comments confused about the G2 chart

This chart doesn’t clarify who the competitors are. There are logos but no company names, so it leaves customers to do the guesswork. 

To more clearly demonstrate where it excels, Metadata could use a comparison chart more like the one featured on their ROI page. 

Screenshot of a Chart that compares MetaData specific feature vs. competitors
A chart that shows how Metadata compares on a specific feature vs. competitors

This gives customers a clear view of their competition and how they stack up against them. 

Find more insights on Metadata, as we did for PandaDoc, in this Google Sheet.

Case study #3: Loom 

Loom is a screen recording application allowing users to record audio, video, browser windows, or whole screens. Loom does many things well on its website, but according to our panel, the messaging could be clearer. 

Screenshot of Loom’s Landing Page
Loom’s landing page

We put this in front of Project Management directors to find the problems.

First, there are unclear use cases. The photo shows “Q1 Closed Deals,” which focuses on sales, but this is an excellent opportunity to showcase more use cases and call out target users other than salespeople.

There’s also a lack of differentiation. It’s clear that it’s a video service, but how is this different from Zoom or Camtasia? Loom fails to communicate this well. 

Lastly, distracting visuals draw our attention away from the value prop. The video of the woman distracts from the product preview. They should feature this in another section on the page or do away with it entirely. 

How Loom can improve its messaging

Loom needs to explain who it’s for. Since the company is relatively new (founded in 2016), the copy needs to be explicit about their offer and their audience. 

Loom could list several popular use cases for their video software and why customers choose them over competitors (the ability to send video links instead of files, notifications when the video has been viewed, etc.).

Many customers requested more information on how it works: does it record your whole screen? Can you limit the recording area? Do you have to appear on camera in a circle alongside your screen share?

Screenshot of the Panel comments asking questions about how Loom works
Panel comments asking questions about how Loom works

Finally, Loom should also feature their value proposition more prominently, as right now the hero section is dominated by the graphics.

Find more insights on Loom in this Google Sheet.

Measure what resonates

If you want to connect with customers, message testing must be part of your marketing strategy. 

Without it, you’re guessing whether your marketing communications resonate with customers and potentially wasting valuable time and money on misguided campaigns.

Luckily, there are several ways to test your messages—both before and after launch. Validate your copy, resonate with your audience, and increase conversion rates with message testing.

Out now: Watch our free B2B messaging course and learn all the techniques (from basic to advanced) to create messaging that resonates with your target customers.
