The definitive guide to message testing

To know whether your copy is pulling its weight, you need to test your messaging.

If message testing isn’t part of your marketing strategy, you’re likely wasting resources on copy that doesn’t resonate, spending thousands on campaigns that don’t engage, and publishing generic landing pages.

Message testing gives you insight into which messaging makes an impact.

In this article, we'll share how to get started with message testing, methodologies that improve your message’s effectiveness, and three practical examples of revamped brand messaging. 

What is message testing?

Message testing analyzes a company’s marketing message to evaluate how well it resonates with its target audience. 

Marketing messages are your company’s offer to potential customers. A compelling marketing message makes customers feel understood, while showing how your product benefits them and how it can solve their problems.

Your message will only resonate if it clearly communicates solutions that matter most to your audience. 

Let’s say you’re creating messaging for an email marketing platform targeted at senior marketing leaders. If your messaging focuses on ease of use, it’s unlikely they’ll care. Marketing leaders are measured against return on investment (ROI).

With message testing, you put your landing page copy and value proposition in front of your ideal customer profile, or buyer persona, and collect qualitative insights to improve it.

Message testing is different from user testing because it tests the messaging only, not how your customers use your website.

It’s all in the messaging: Why you need message testing

Too many marketers aim to be clever and original with their marketing. However, cleverness can be confusing. Clarity, on the other hand, communicates precisely why your audience should care.

For example, a clever marketing subheading might read: 

“Entrepreneurial-approved time-savers.”

This leaves customers confused about what they’re being sold and why it matters to them. 

Clear brand statements set up the value and intentions from the beginning. 

For example, Webflow’s landing page states, “The site you want—without the dev time.” 

Screenshot of Webflow Clear Value Proposition
Webflow’s clear value proposition

Its purpose is clear: to provide a no-code web design tool that allows customers to build the site they’ve always wanted in a fraction of the time. Why should they care? Because, unlike with other tools, you don’t need coding experience.

When marketing messages work, they are 2x as influential as design in converting customers. When they don’t, you’re missing out on conversion potential—and your business's success depends on how well you can convert. 

However, message testing isn’t only for conversions. Conducted well, it can:

  • Help you appeal to your target customers, their preferences, values, pain points, and objectives; 
  • Refine your product messaging and serve as a brand guide for future marketing campaigns. 

In contrast, when messaging isn’t working, it can confuse, frustrate, and alienate your customers.

Getting started with message testing

Message testing uses qualitative and quantitative methods.

Qualitative research relies on observation or non-numerical market research analysis, such as in-depth interviews. It’s ideal for getting a more rounded understanding of customer motivations. 

Quantitative research collects hard data using techniques like surveys, polls, and other close-ended questions. It’s a scalable way to identify patterns and trends. 

For instance, in a focus group designed to gather user input about a specific product, you can use both qualitative and quantitative observations.

Qualitative questions will sound something like: What do you like about this product? How does it solve your problems?

Quantitative questions will sound like: How likely are you to recommend our product to a friend? Rate 1–5, with 1 as very unlikely and 5 as highly likely.

In message testing, your quantitative and qualitative research should focus on these key dimensions of your message: 

  • Clarity. How well you communicate your offer to your customers and how well they understand it.
  • Relevance. How your value proposition aligns with customers’ needs.
  • Value. How important, timely, and beneficial your message is for your users.
  • Timeliness. How effectively your message inspires customers to take action soon.
  • Consistency. How uniform your brand tone is throughout your website.
  • Differentiation. Why your offer is different from the rest.

If any of these is lacking, your message might fail to land. Collecting information on all of these dimensions may be a weeks-long process, but there are tools to make this easier.

Use Wynter

Wynter delivers results in 12–48 hours, far quicker than any other method on this list. We have our own B2B panel of validated professionals across industries, seniority levels, job titles, and company sizes to help test your messaging with your target customers.

We test your website and landing page messaging, overall brand positioning and narrative, sales and marketing funnels, and outbound email messages and sales demos. 

How a message test works with Wynter:

  1. Choose your target audience based on job title and seniority levels.
  2. Specify the audience details and configure your test by choosing your preferred employee count and industry. You can also ask one of our CRO professionals to summarize your message testing results.
  3. Enter the URL of the page for message testing market research or upload an image of it. We offer full-page message tests where you can select up to three areas for additional feedback.
  4. Set up your test questions. Choose from our pre-made questions and question templates, or write your own. 
  5. Once you receive the results, optimize your pages and start testing the effectiveness of the new messaging. 

Wynter is the most efficient way to gather the qualitative data needed to refine your message. However, it isn’t the only approach to message testing. We’ll review alternative methods and how you might deploy them at your company.

Test your messaging with the Likert scale

The Likert scale is a psychometric tool used to measure attitudes, values, and beliefs. A typical scale offers five response options, each assigned a value representing the degree of agreement or disagreement with a statement.

Screenshot of likert scale survey example
An example Likert survey

To compile Likert scale questions, you’ll use the six dimensions discussed above: clarity, relevance, value, timeliness, consistency, and differentiation. 

For example, sample statements might look like this: 

The message is clear, and you know exactly what the company offers.

  • Strongly agree;
  • Somewhat agree;
  • Neutral;
  • Somewhat disagree;
  • Strongly disagree.

The company’s messaging motivates you to take action.

  • Strongly agree;
  • Somewhat agree;
  • Neutral;
  • Somewhat disagree;
  • Strongly disagree.

Likert scales have many advantages over other types of quantitative research surveys. They’re simple to create, easy for respondents to answer, and provide precise quantification of people's feelings about the topic surveyed.
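That quantification is simple arithmetic: map each response to a value from 1 to 5, then average per message dimension. Here’s a minimal Python sketch of the idea, with invented labels and sample responses:

```python
from statistics import mean

# Map each Likert response to a numeric value (5-point scale).
SCALE = {
    "Strongly disagree": 1,
    "Somewhat disagree": 2,
    "Neutral": 3,
    "Somewhat agree": 4,
    "Strongly agree": 5,
}

# Hypothetical responses, grouped by the message dimension each statement tests.
responses = {
    "clarity": ["Strongly agree", "Somewhat agree", "Neutral"],
    "timeliness": ["Somewhat disagree", "Neutral", "Somewhat agree"],
}

# Average the numeric scores for each dimension.
scores = {dim: mean(SCALE[r] for r in answers) for dim, answers in responses.items()}
print(scores)  # {'clarity': 4, 'timeliness': 3}
```

A low average on any one dimension (here, timeliness) tells you which part of the message to rework first.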

However, there are limitations. Most fundamentally, Likert scales cannot capture nuanced responses and typically fail to pinpoint why a customer disagreed with a statement. 

They’re also only as good as the person who writes them. Even a minor ambiguity in wording or presentation can compromise the validity of responses. It takes time and professional expertise to achieve reliable results.

Ask qualitative open-ended questions

Open-ended questions cannot be answered in a single word or number. They permit a broad range of potential responses and allow the nuance a Likert scale lacks. As a result, you can draw out as much information as possible from your customers.

Examples of strong open-ended questions are:

  • What do you think this message is trying to say?
  • Are there any unclear parts of the message?
  • What do you find to be the least/most effective parts of the message?
  • How do you feel about the message?
  • If you could make one change to the message, what would it be?
  • How has the message changed your perception of the product?

These questions help you gauge how well (or not) your customers understand your product. If your customers struggle with their answers (or miss the intended point), it’s a sign you need to rework your messaging. 

Mine for messages

Review mining is the process of digging into customer reviews and surveys to see what customers think of your offering. 

Using this qualitative method, identify common words to describe your product or uncover hidden insights, like what features your customers like best. Then, use this data to craft the right messaging. 

For example, software solution finder Cuspera is a goldmine for Wynter customer testimonials. 

Screenshot of Wynter Customer Review or Testimonials
Reviews of Wynter on Cuspera

As you can see, there are common themes in our reviews: 

  • Increased conversion rates;
  • Increased AOV (average order value);
  • Quick setup and results.

These are results from real people. Use this data to highlight your customers’ outcomes and favorite product features.

It’s worth noting that reviews are experiencing a dip in consumer trust. In 2020, BrightLocal reported that 79% of survey respondents trusted reviews as much as recommendations from family and friends. By 2022, just 49% did.

This is likely down to consumers wising up to fake reviews. Steady Demand’s SEO expert Ben Fisher says: 

“It’s no surprise that Google and Amazon lead the pack here, followed by Facebook and Yelp. Google is horrible when it comes to detecting and removing fake reviews. 

Amazon has always had an issue with them as well. Yelp happens to be the best when it comes to detecting and removing fake reviews, but, then again, reviews really are their business.” [via BrightLocal]

When looking for reviews to mine, be on the lookout for fake reviews as this could skew your results. Anything that sounds repetitive (key terms worded the exact same way across several reviews), overly positive (a suspicious amount of exclamation points), or like it came straight out of a boardroom deserves a skeptical eye.

The message mining process might look like this:

  1. Pick your target audience’s main pain point.
  2. Research what solutions exist to satisfy this pain point.
  3. Go through your customer testimonials and those of your competitors. These may be hosted on various sites: your website, Capterra, Facebook, LinkedIn Groups, etc.
  4. Collect commonly used words, topics, phrases, questions people ask, and problems reported in a spreadsheet.

This process helps you gain insight into user preferences to better inform your website, landing page and ad copy, blog post topics, and more. 
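The collection step above can also be partly automated. This illustrative Python snippet (with made-up reviews and a toy stopword list) tallies word pairs across reviews to surface recurring customer language like “professional look”:

```python
import re
from collections import Counter

# Made-up reviews standing in for testimonials collected in step 3.
reviews = [
    "Gave our proposals a professional look and saved hours.",
    "The professional look of the documents impressed clients.",
    "Setup was quick and the results came fast.",
]

# A tiny illustrative stopword list; real mining would use a fuller one.
STOPWORDS = {"the", "a", "an", "and", "of", "our", "was", "came"}

# Count word pairs (bigrams) within each review, not across review boundaries.
bigrams = Counter()
for review in reviews:
    toks = [w for w in re.findall(r"[a-z]+", review.lower()) if w not in STOPWORDS]
    bigrams.update(zip(toks, toks[1:]))

print(bigrams.most_common(1))  # [(('professional', 'look'), 2)]
```

Phrases that recur across independent reviews are strong candidates for your spreadsheet of customer language.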

However, while message mining can produce useful results, it can be time-consuming. Worse still, the insights don’t uncover any flaws specific to your brand’s current messaging. 

Conduct website polls

Website polls are questions that pop up as a customer navigates a website. Polls help you collect customers’ feedback about your product or service through open- and closed-ended message testing questions. 

Common survey tools include Hotjar, Mentimeter, and Qualaroo.

Screenshot of Hotjar’s Likert scale-style pop-up question
An example of Hotjar’s Likert scale-style pop-up question

Website polls work because they’re convenient for customers to answer quickly while continuing (or not) on their buying journey.

Unfortunately, feedback does not include specific customer issues and there’s no way to target ideal customers. 

Use heat maps correctly

Heat maps are visual representations of data showing where users click on a webpage. They identify and quantify user behavior patterns such as what buttons users click and where they spend the most time on the website—and can predict what may happen next.

“Heat map” is a general term that includes:

  1. Hovermaps;
  2. Click maps;
  3. Attention maps;
  4. Scroll maps.

Screenshot of Click Map Example
An example of a click map

To draw accurate conclusions from your heat map reports, use a substantial sample size. We recommend 2,000–3,000 pageviews per design screen.

Heat maps are an effective way to collect, visualize, and analyze data quickly and easily, but they’re severely limited. They only reflect the user’s actions on the site, but communicate little about their motivations, such as why specific keywords, text, links, or images draw their attention.

If you are on a budget or are more interested in evaluating user experience on your website, heat maps can be a good start.

Look out for website analytics metrics

Website analytics consists of tracking website activity and how visitors interact with a site using tools such as Google Analytics and Adobe Analytics. 

These tools provide insight into how users navigate a site, where they go when they visit, what they are looking at, what they click on, and other relevant information.

Various types of analytics exist, but all aim to provide insight into website behavior. 

These include:

  1. Key metrics. Used to measure overall website performance, including traffic, conversions, and bounce rate.
  2. Performance metrics. Used to measure individual page performance and track changes over time.
  3. Search engine metrics. Used to measure the effectiveness of search-related features such as search engine result rankings and click-through rates.

Website analytics provide great quantitative analysis but miss the emotional qualitative piece of the customer experience.

Try A/B testing

Quantitative message testing and A/B testing are both forms of empirical testing, but they’re often inaccurately lumped together. They are not the same thing.

A/B testing is a method in which two different versions of a web page or website are created and tested against each other. It does involve testing, but it doesn't determine what customers find important or how they interpret your message. 
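If you do run an A/B test alongside message testing, judging the winner is a statistics question. A minimal sketch of a two-proportion z-test in Python, with invented conversion numbers, might look like this:

```python
from math import erf, sqrt

def ab_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: did variant B convert differently from A?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p = (conv_a + conv_b) / (n_a + n_b)            # pooled conversion rate
    se = sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))   # standard error under H0
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal CDF: Phi(x) = 0.5 * (1 + erf(x / sqrt(2)))
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical numbers: 120/2400 conversions on A, 156/2400 on B.
z, p_value = ab_test(conv_a=120, n_a=2400, conv_b=156, n_b=2400)
print(f"z={z:.2f}, p={p_value:.3f}")  # significant at the 0.05 level if p < 0.05
```

Even a statistically significant result like this only tells you *which* version won, not *why*, which is exactly the gap message testing fills.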

Message testing gives you all these details. 

The method of messaging research you choose will be down to your needs and resources. For a more comprehensive understanding, let’s look at three companies and their message testing process.

How to turn your messaging into something that resonates with customers: 3 case studies

In this section, we’ll look at three B2B companies whose messaging we tested at Wynter. We’ll explore their case from problem to solution, including target audience responses and our recommendations.

Case study #1: PandaDoc

Screenshot of PandaDoc Landing Page for Marketing Teams
PandaDoc’s landing page for marketing teams

PandaDoc is a cloud-based SaaS company helping users create proposals, quotes, contracts, and more. But you wouldn’t know it based on their early messaging. 

For PandaDoc, we put together a 50-person marketing panel, analyzed the results, and found that the overall brand messaging was confusing.

Screenshot of PandaDoc Marketing Panel Result in Google Sheet
A Google Sheet of the panel results

The company uses unclear, generic words like “on-brand docs.” “On-brand” left marketers uncertain, and the word “docs” made some think the company was targeting doctors. 

How PandaDoc can improve its messaging

PandaDoc is suffering from a lack of clarity. They could mine their reviews for commonly used words; in this case, “professional look” appears multiple times, making it a great term to add to their marketing copy.

Screenshot of TrustRadius Website showing PandaDoc Review about its professional look
A review using the “professional look” customer language

Apart from this, “on-brand docs” isn’t the most essential value proposition for the target market. It left marketers wondering, “why does this matter?”:

Screenshot of panel comment calling out the flatness of the “on-brand” value proposition
A panel comment calling out the flatness of the “on-brand” value proposition

 Because there is plenty of competition in this market, we’d recommend differentiation and clarification: 

  • Why PandaDoc over alternatives? 
  • Why use PandaDoc instead of using generic templates? 
  • What is PandaDoc, and why should we care about its features? 

Another missed opportunity is the sub-heading “easy-to-use, hyper-personalized sales documents.” 

Although clear, it provides no information on how PandaDoc provides more personalized documents than anyone else. Spell out the customer relevance by identifying the problem and selling the customer on it in their own words. 

Finally, PandaDoc should back everything up with social proof. Add specific numbers of businesses who use its software (or who have switched from a competitor) and include authentic customer testimonials to build trust.

Case study #2: Metadata.io

Metadata is an AI demand generation platform for B2B marketing companies that automates mundane tasks. 

Screenshot of Metadata homepage

The first thing we noticed was the graphics. There are too many elements on the page fighting for our attention, namely Benjamin Franklin and the woman. This detracts from our focus on the value proposition. 

The copy also lacks consistency; the rest of the homepage doesn’t match the quirkiness of the material above the fold.

Screenshot of an illustration showing with or without MetaData comparison
The below-the-fold material is sleek but lacks the quirkiness of the hero section material.

How Metadata can improve its messaging

Apart from toning down their hero section and aiming for a more consistent tone of voice throughout, Metadata could also aim for specificity and clarity.

“Drive more revenue” is an overused, obvious, non-specific phrase. Instead of this buzzphrase, Metadata should explain what the platform does. By the end of the landing page, we still don’t know how it automates paid campaigns, and neither did the audience.

Screenshot of a Panel Comments about How MetaData Works
Panel comments confused about how the platform works

Finally, Metadata should include better examples of differentiation. They use a vague chart comparing themselves to other companies but never mention what makes them different from competitors. 

Screenshot of a Panel comments about the G2 chart of MetaData Platform
Panel comments confused about the G2 chart

This chart doesn’t clarify who the competitors are. There are logos but no company names, so it leaves customers to do the guesswork. 

To more clearly demonstrate where it excels, Metadata could use a comparison chart more like the one featured on their ROI page. 

Screenshot of a Chart that compares MetaData specific feature vs. competitors
A chart that shows how Metadata compares on a specific feature vs. competitors

This gives customers a clear view of their competition and how they stack up against them. 

As with PandaDoc, you can find more insights on Metadata in this Google Sheet.

Case study #3: Loom 

Loom is a screen recording application that lets users record audio, video, browser windows, or whole screens. Loom does many things well on its website, but according to our panel, the messaging could be clearer.

Screenshot of Loom’s Landing Page
Loom’s landing page

We put this in front of Project Management directors to find the problems.

First, there are unclear use-cases. The photo shows “Q1 Closed Deals,” which focuses on sales, but this is an excellent opportunity to showcase more use cases and call out target users other than salespeople.

There’s also a lack of differentiation. It’s clear that it’s a video service, but how is this different from Zoom or Camtasia? Loom fails to communicate this well. 

Lastly, distracting visuals draw our attention away from the value prop. The video of the woman distracts from the product preview. They should feature this in another section on the page or do away with it entirely. 

How Loom can improve its messaging

Loom needs to explain who it’s for. Since the company is relatively new (founded in 2016), the copy needs to be explicit about its offer and its audience.

Loom could list several popular use cases for their video software and why customers choose them over competitors (the ability to send video links instead of files, notifications when the video has been viewed, etc.).

Many customers requested more information on how it works: does it record your whole screen? Can you limit the recording area? Do you have to be in a circle alongside your screenshare?

Screenshot of the Panel comments asking questions about how Loom works
Panel comments asking questions about how Loom works

Finally, Loom should also feature their value proposition more prominently, as right now the hero section is dominated by the graphics.

Find more insights on Loom here in this Google Sheet.

Measure what resonates

If you want to connect with customers, message testing must be part of your marketing strategy. 

Without it, you’re guessing whether your marketing communications resonate with customers and potentially wasting valuable time and money on misguided campaigns.

Luckily, there are several ways to test your messages—both before and after launch. Validate your copy, resonate with your audience, and increase conversion rates with message testing.

Out now: Watch our free B2B messaging course and learn all the techniques (from basic to advanced) to create messaging that resonates with your target customers.

Know exactly what your buyers want and improve your messaging

Join 10,000+ other marketers and subscribe to get weekly insights on how to land more customers quicker with a better go-to-market machine.