If message testing isn’t part of your marketing strategy, you’re likely wasting resources on copy that doesn’t resonate, spending thousands on campaigns that don’t engage, and publishing generic landing pages.
Message testing gives you insight into which messaging actually makes an impact.
In this article, we'll share how to get started with message testing, methodologies that improve your message’s effectiveness, and three practical examples of revamped brand messaging.
Message testing analyzes a company’s marketing message to evaluate how well it resonates with its target audience.
Marketing messages are your company’s offer to potential customers. A compelling marketing message makes customers feel understood, while showing how your product benefits them and how it can solve their problems.
Your message will only resonate if it clearly communicates solutions that matter most to your audience.
Let’s say you’re creating messaging for an email marketing platform targeted at senior marketing leaders. If your messaging focuses on ease of use, it’s unlikely they’ll care. Marketing leaders are measured against return on investment (ROI).
With message testing, you put your landing page copy and value proposition in front of your ideal customer profile, or buyer persona, and collect qualitative insights to improve it.
Message testing is different from user testing because it tests the messaging only, not how your customers use your website.
Too many marketers aim to be clever and original with their marketing. However, cleverness can be confusing. Clarity, on the other hand, communicates precisely why your audience should care.
For example, a clever marketing subheading might read:
“Entrepreneurial-approved time-savers.”
This leaves customers confused about what they’re being sold and why it matters to them.
Clear brand statements set up the value and intentions from the beginning.
For example, Webflow’s landing page states, “The site you want—without the dev time.”
Its purpose is clear: to provide a no-code web design tool that allows customers to build the site they’ve always wanted in a fraction of the time. Why should they care? Because, unlike with other tools, you don’t need coding experience.
When marketing messages work, they are 2x as influential as design in converting customers. When they don’t, you’re missing out on conversion potential—and your business's success depends on how well you can convert.
However, message testing isn’t only for conversions. Conducted well, it can:
In contrast, when messaging isn’t working, it can confuse, frustrate, and alienate your customers.
Message testing is a form of qualitative research.
Qualitative research relies on observation or non-numerical market research analysis, such as in-depth interviews. It’s ideal for getting a more rounded understanding of customer motivations.
In message testing, your research should focus on these key dimensions of your message:
If any of these is lacking, your message might fail to land. Collecting information on all of these dimensions may be a weeks-long process, but there are tools to make this easier.
Wynter delivers results from target customers (people you're trying to influence with your messaging) in 12–48 hours, far quicker than any method on this list. We have our own B2B panel of validated professionals in various industries with different seniority levels, job titles and company sizes to help test your messaging with your target customers.
We test your website and landing page messaging, overall brand positioning and narrative, sales and marketing funnels, and outbound email messages and sales demos.
How a message test works with Wynter:
Wynter is the most efficient way to gather the needed qualitative data to refine your message.
Open-ended questions cannot be answered in a single word or number. They permit a broad range of potential responses and allow the nuance that Likert-scale questions lack. As a result, you can draw out as much information as possible from your customers.
Examples of strong open-ended questions are:
These questions help you gauge how well (or not) your customers understand your product. If your customers struggle with their answers (or miss the intended point), it’s a sign you need to rework your messaging.
A/B testing is a measurement methodology. Message testing is a diagnostics exercise.
A/B testing tells you whether A is better than B, and by how much. It doesn't tell you what's wrong with your messaging, what your customers really care about, or how they think about the problems you solve.
Message testing gives you the why, what, and where: where your messaging fails and what specifically you need to improve.
A/B testing also needs a significant sample size to reach valid conclusions, usually at least 500 transactions (signups, demo requests, etc.) per month.
I consistently come across people who are unfamiliar with the sample size requirements for qualitative research, assuming it needs statistical significance like a quantitative method such as A/B testing.
Instead of statistical significance, the methodological principle used is "saturation." The standard is that it takes 12–13 responses to reach saturation, meaning that whether you survey 13 or 130 people, the number of insights and themes you get is the same. The exact number of participants is debated, but most in the scientific community agree it's below 20.
A review of 23 peer-reviewed articles suggests that 9–17 participants can be sufficient to reach saturation, especially for studies with homogenous populations and narrowly defined objectives.
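To build intuition for why saturation kicks in so early, here's a toy simulation (the theme pool, weights, and mentions-per-respondent are hypothetical illustration values, not real research data): because a handful of common themes dominate, the count of distinct themes discovered plateaus after roughly a dozen respondents.

```python
import random

random.seed(0)

# Hypothetical model: 15 underlying themes, where a few common themes
# dominate the responses (illustrative weights, not real data).
themes = list(range(15))
weights = [30, 20, 15, 10, 7, 5, 4, 3, 2, 1, 1, 0.5, 0.5, 0.5, 0.5]

def themes_found(n_respondents, mentions_per_person=3):
    """Count distinct themes surfaced after interviewing n people,
    assuming each respondent mentions a few weighted-random themes."""
    found = set()
    for _ in range(n_respondents):
        found.update(random.choices(themes, weights=weights,
                                    k=mentions_per_person))
    return len(found)

# Distinct themes discovered at different panel sizes: the count climbs
# quickly at first, then flattens out well before the sample gets large.
for n in (5, 13, 50, 130):
    print(f"{n} respondents -> {themes_found(n)} themes")
```

Running this, the jump from 13 to 130 respondents adds few new themes relative to the extra effort, which is the practical meaning of saturation.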
In this section, we’ll look at three B2B companies whose messaging we tested at Wynter. We’ll explore their case from problem to solution, including target audience responses and our recommendations.
PandaDoc is a cloud-based SaaS company helping users create proposals, quotes, contracts, and more. But you wouldn’t know it based on their early messaging.
For PandaDoc, we put together a 50-person marketing panel, analyzed the results, and found that the overall brand messaging was confusing.
The company uses unclear, generic words like “on-brand docs.” “On-brand” left marketers uncertain, and the word “docs” made some think the company was targeting doctors.
PandaDoc suffers from a lack of clarity. They could message mine their reviews for commonly used words; in this case, "professional look" appeared multiple times, making it a great term to work into their marketing.
Apart from this, "on-brand docs" isn't the most essential value proposition for the target market. It left marketers wondering, "Why does this matter?"
Because there is plenty of competition in this market, we’d recommend differentiation and clarification:
Another missed opportunity is the sub-heading “easy-to-use, hyper-personalized sales documents.”
Although clear, it provides no information on how PandaDoc provides more personalized documents than anyone else. Spell out the customer relevance by identifying the problem and selling the customer on it in their own words.
Finally, PandaDoc should back everything up with social proof: add specific numbers of businesses that use the software (or have switched from a competitor) and include authentic customer testimonials to build trust.
Metadata is an AI demand generation platform for B2B marketing companies that automates mundane tasks.
The first thing we noticed was the graphics. There are too many elements on the page fighting for our attention, namely Benjamin Franklin and the woman. This detracts from our focus on the value proposition.
The copy also lacks consistency; the rest of the homepage doesn’t match the quirkiness of the material above the fold.
Apart from toning down their hero section and aiming for a more consistent tone of voice throughout, Metadata could also aim for specificity and clarity.
“Drive more revenue” is an overused, obvious, non-specific phrase. Instead of this buzzphrase, Metadata should explain what the platform does. By the end of the landing page, we still don’t know how it automates paid campaigns, and neither did the audience.
Finally, Metadata should include better examples of differentiation. They use a vague chart comparing themselves to other companies but never mention what makes them different from competitors.
This chart doesn’t clarify who the competitors are. There are logos but no company names, so it leaves customers to do the guesswork.
To more clearly demonstrate where it excels, Metadata could use a comparison chart more like the one featured on their ROI page.
This gives customers a clear view of their competition and how they stack up against them.
As with PandaDoc, you can find more insights on Metadata in this Google Sheet.
Loom is a screen recording application allowing users to record audio, video, browser windows, or whole screens. Loom does many good things on its website, but the messaging could be clearer, according to our panel.
We put the site in front of project management directors to find the problems.
First, there are unclear use-cases. The photo shows “Q1 Closed Deals,” which focuses on sales, but this is an excellent opportunity to showcase more use cases and call out target users other than salespeople.
There’s also a lack of differentiation. It’s clear that it’s a video service, but how is this different from Zoom or Camtasia? Loom fails to communicate this well.
Lastly, distracting visuals draw our attention away from the value prop. The video of the woman distracts from the product preview. They should feature this in another section on the page or do away with it entirely.
Loom needs to explain who it's for. Since the company is relatively new (founded in 2016), the copy needs to be explicit about its offer and its audience.
Loom could list several popular use cases for their video software and why customers choose them over competitors (the ability to send video links instead of files, notifications when the video has been viewed, etc.).
Many customers requested more information on how it works: does it record your whole screen? Can you limit the recording area? Do you have to be in a circle alongside your screenshare?
Finally, Loom should also feature their value proposition more prominently, as right now the hero section is dominated by the graphics.
Find more insights on Loom in this Google Sheet.
If you want to connect with customers, message testing must be part of your marketing strategy.
Without it, you’re guessing whether your marketing communications resonate with customers and potentially wasting valuable time and money on misguided campaigns.
Luckily, there are several ways to test your messages—both before and after launch. Validate your copy, resonate with your audience, and increase conversion rates with message testing.
Out now: Watch our free B2B messaging course and learn all the techniques (from basic to advanced) to create messaging that resonates with your target customers.