Competing on features alone is no longer a winning strategy. To stand out from the competition, you need to differentiate yourself in your messaging. Your messages must fit the market, aligning with your target audience’s goals and addressing their pain points.
Copy is a critical element, arguably twice as important as your design. After testing 36,928 page variants between March 2019 and March 2020, Unbounce found that, across all industries, copy has roughly twice the influence on conversions that design does.
The only way to know if your messages land successfully is to test them against your audience.
In this article, we’ll review what copy testing is and how automated copy testing elevates your messaging, provide Wynter’s framework for copy testing, and highlight examples of copy testing with Wynter.
Copy testing puts your marketing messaging in front of your ICP (ideal customer profile) and collects feedback on how it resonates with them. It provides fast and affordable qualitative insights to refine your value proposition, offers, and landing pages.
Done right, copy testing helps you identify which messages lead your website visitors to take desired actions using both qualitative and quantitative feedback.
Back in the day, “pre-testing” measured an advertisement’s effectiveness using consumer feedback. Marketers would gather consumer juries, show them ad variations, and measure elements like likeability, persuasion, and day-after recall using both quantitative and qualitative research methods.
Modern copy testing follows the same principles. You gather a panel of 15-30 people in your ICP, expose them to your website copy, and ask them open-ended questions about your copy’s effectiveness.
Quantitative data measures audience response numerically, using collection methods like Likert scales or closed-ended questions.
Qualitative data is non-numerical, describing the attitudes, feelings, or experiences of your ICP using collection methods like open-ended questions and focus groups.
Combining qualitative and quantitative research improves your data’s integrity; the strengths of one data type balance out the limitations of the other.
For example, Zingtree blended qualitative and quantitative data when testing their homepage copy with Wynter. They asked their target customer base, Customer Support Directors, to rate the overall clarity of the page.
Zingtree scored 3.9/5 for overall clarity, but without qualitative insights, there’s no “reason why” behind this.
To uncover the reason, they asked: “After reading everything, what’s still unclear?”
Zingtree’s audience found the message clear, but only after some digging: the product’s use case was buried so far down the page that it wasn’t immediately apparent to visitors.
Feedback from their ICP included:
Mixed-method feedback enriches the data, giving a full-picture view and allowing marketers to uncover exactly what must change and why.
Automated copy testing is a new copy testing model that tests variations of your company’s messaging with a panel of B2B professionals in your target audience.
Traditional copy testing can take weeks or months. You gather your panel, ask them questions about your messages, analyze the data, tweak your website copy, and test again. By that point, you may have already lost prospects along the way who didn’t resonate with your existing messaging.
Automated copy testing gives you a head start, putting your message in front of your target customers, with results in under 48 hours so you can apply the feedback on your website as soon as possible.
Wynter helps you:
Wynter uses a 5-point scoring system to test for:
For example, we recently tested OpsLevel’s homepage.
OpsLevel, a SaaS startup, helps development teams organize and track their microservices in a centralized portal, but does their copy convey this? Our targeted panel of 15 back-end, full-stack, and software developers found that OpsLevel’s homepage lacked clarity on what services they provide.
Questions arose like “How long will it take me to go from zero to a dashboard of useful monitoring information?” and “What is the exact functionality of the service?”
Other doubts surfaced, such as how OpsLevel differs from the competition. One panelist found the app’s design and features compelling, but still questioned the distinct advantage of using the product.
Now OpsLevel has concrete feedback on how to tweak their messaging to better appeal to their target audience.
Outsourcing your copy testing cuts down on time and cost, which is valuable for all businesses but especially smaller B2B companies lacking the scale and budget to complete their own testing.
Other tests like A/B testing won’t cut it, especially because most B2B companies lack the traffic needed to run a conclusive A/B test (roughly 500 conversions per month). The alternative is to test your messages directly with the people you want to reach.
Marketers frequently blur the lines between copy testing and A/B testing. While A/B testing tells you which version of an element (e.g., a landing page variation or headline) performed best, it doesn’t tell you why it worked.
A/B testing is also limited, with only 1 in every 7 A/B tests resulting in a statistically significant winner. While A/B testing can help maximize ROI, you’re still paying for the losing variation.
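If you’re curious why low traffic makes A/B testing so hard, the rough math below shows the sample size a standard two-variant test needs. It’s a minimal sketch using a textbook two-proportion power calculation; the 2% baseline conversion rate and 20% hoped-for lift are illustrative assumptions, not figures from Wynter or Unbounce.

```python
# Rough sketch: visitors needed per variant for a two-proportion A/B test.
# The baseline rate and lift below are illustrative assumptions only.
from scipy.stats import norm

def visitors_per_variant(baseline, relative_lift, alpha=0.05, power=0.8):
    """Approximate sample size per variant (two-sided test at the given power)."""
    p1 = baseline
    p2 = baseline * (1 + relative_lift)
    p_bar = (p1 + p2) / 2
    z_alpha = norm.ppf(1 - alpha / 2)  # significance threshold (95% here)
    z_beta = norm.ppf(power)           # desired statistical power (80% here)
    numerator = (z_alpha * (2 * p_bar * (1 - p_bar)) ** 0.5
                 + z_beta * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
    return numerator / (p2 - p1) ** 2

# Detecting a 20% relative lift on a 2% baseline (2% -> 2.4%):
print(f"{visitors_per_variant(0.02, 0.20):,.0f} visitors per variant")
# ~21,000 visitors per variant, i.e. ~42,000 total before you can call a winner
```

At typical B2B traffic levels, collecting that many visitors (and the several hundred conversions per variant that come with them) can take months, which is why message testing with a small panel is often the more practical route.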
Budget constraints aside, you still face questions like:
Implementing new copy also risks inconsistent brand messaging.
Let’s say you’re testing a new features section on your homepage. Through A/B testing, you find that variant B (the treatment) wins.
But what happens if the copy on your product page still quotes the original features of variant A (the control)? Or your social media bio still reflects unrelated features or benefits? A/B testing can confuse audiences with inconsistent messaging.
Copy testing provides what A/B testing lacks: insight from the people you want to click the buy button.
At the time of writing, 53% of users wait no more than three seconds before leaving your site (according to Google Consumer Insights).
As Peep Laja says, in the B2B SaaS world, people want you to get to the point.
Copy testing helps you refine your message, increasing the likelihood that your ICP will stick around and continue their journey with your brand.
During the initial stages of copy testing, you must decide key details like:
For example, historical heatmap data may reveal that a disproportionate number of visitors engage with your product descriptions but rarely book a demo. If so, your copy may need realigning with your ICP’s needs.
Write open-ended (qualitative) and closed-ended (quantitative) questions. Here, we ask our panelists to rate the overall clarity of a message and their interest in following through on a demo.
We also ask questions like:
Asking both question types gives you a concrete number (e.g. 3.9/5) and provides a detailed explanation to back it up.
Write as many questions as you need. There’s no limit to what questions you might ask as long as they suit your research goals.
Your panel should ideally be made up of 5-15 members of your ICP. Although a single individual in your target market can offer accurate insights, you need more people to catch the outliers.
As Peep says:
“Studies consistently show that 5 to 15 users will find 90% of your usability problems. 15 will find 97% of your usability problems. The same goes for copy.”
One person saying “this is bad, I don’t like this” doesn’t necessarily mean you should throw away your messaging. If you have 5-15 people and the issue recurs, it needs fixing. You need enough people to validate the results in your study.
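The “enough people to validate” point follows from a simple probability curve. Here’s a minimal sketch of that math; the per-person detection rate is an assumption borrowed from classic usability research (roughly 31% per evaluator), not a Wynter figure, so the exact percentages will vary.

```python
# Minimal sketch of the diminishing-returns math behind small panels.
# detection_rate is an assumed chance that any one panelist surfaces a given issue.
def share_of_issues_found(panel_size, detection_rate=0.31):
    """Expected share of messaging issues surfaced by a panel of this size."""
    return 1 - (1 - detection_rate) ** panel_size

for n in (1, 5, 10, 15):
    print(f"{n:>2} panelists -> ~{share_of_issues_found(n):.0%} of issues")
# 1 panelist   -> ~31%
# 5 panelists  -> ~84%
# 10 panelists -> ~98%
# 15 panelists -> ~100% of recurring issues, in practice
```

Past roughly 15 panelists, each additional person mostly re-confirms issues you’ve already seen, which is why larger panels rarely change the conclusions.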
Find people in your target audience who aren’t yet customers so they can offer unbiased feedback.
For B2B companies, your target audience might hang out on LinkedIn or Twitter. Search for your product’s use case to find leads already talking about the problem you solve.
For example, if you offer a social media management tool, search “social media scheduler” or “social media scheduling tool” to find people discussing your offering.
Alternatively, we’ll help you get your messaging in front of the right audience for you.
Now it’s time to put your copy to the test.
Run sessions in various ways:
Although a focus group saves time because you can interview all panelists at once, individual conversations allow your participants to give unbiased answers, uninfluenced by their peers.
In the research session, either read your questions aloud and have panelists respond, or email them the copy you want reviewed along with the questions you want answered.
Compile your data in one place so you can search for themes within the feedback. We recommend using a simple spreadsheet like Google Sheets or Microsoft Excel.
If your panelists reviewed multiple pages, give each page its own sheet within the spreadsheet so you won’t mix up feedback.
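If you’d rather theme the feedback in code than by eyeballing a spreadsheet, here’s a minimal sketch of the same workflow. The CSV file, column names, and stop-word list are hypothetical placeholders for whatever your own export looks like.

```python
# Minimal sketch: average the closed-ended scores and surface recurring words
# in the open-ended answers. File and column names are hypothetical.
from collections import Counter
import pandas as pd

responses = pd.read_csv("homepage_feedback.csv")  # one row per panelist

# Closed-ended (quantitative): average the Likert scores.
print(responses[["clarity_score", "demo_interest_score"]].mean().round(1))

# Open-ended (qualitative): tally recurring words to spot candidate themes.
stopwords = {"the", "a", "an", "is", "it", "and", "to", "of", "i", "this", "that"}
word_counts = Counter(
    word.strip(".,!?")
    for answer in responses["whats_still_unclear"].dropna()
    for word in answer.lower().split()
    if word.strip(".,!?") not in stopwords
)
print(word_counts.most_common(10))  # recurring words hint at themes worth a closer look
```

Whether you do this in a spreadsheet or a script, the goal is the same: pair each numeric score with the recurring reasons behind it before you touch the copy.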
Then, use these insights to adjust your copy to match your target audience’s feedback. If you’re wondering how, here’s how Wynter would do it using our message testing method.
Each copy test with Wynter goes like this:
After 12-48 hours of private deliberation among our B2B panel, we send the test results. But once you receive the results, what do you do with them?
Here’s an example of how Wynter message tested Qualified’s homepage.
Qualified’s product helps marketers convert website visitors into sales pipeline. However, their original copy lacked clarity and muddled their messaging.
Their homepage header greets visitors with “Welcome to the Pipeline Cloud,” which leaves us wondering, “What is a pipeline cloud?”
Peep Laja searched all over LinkedIn and Twitter for “pipeline cloud,” but only Qualified’s employees were talking about it.
We put this copy in front of a panel of marketing professionals and asked two quantitative questions: how compelling is the pitch (3.7/5), and how clear is it overall (3.9/5)?
To put a “reason why” behind these scores, we asked panelists to go into detail and explain their responses.
They shared comments such as:
Qualified’s headline, while clever and a trademark of their own making, lacks clarity. If customers must read beyond the header and subheader to find out what your company does, you’ve lost them.
The subheading is also so wordy that even the Hemingway Editor finds it hard to read:
Using these gathered insights, we helped Qualified tweak their homepage copy to clearly communicate the outcome their product offers:
We removed the confusing “Pipeline Cloud” copy and simplified the subheading so people know exactly what Qualified does within a few seconds of reading the copy.
Mayple helps companies hire on-demand marketing talent. We tested their copy with ecommerce and SaaS founders.
While Qualified crammed too many words into their subheader, Mayple doesn’t say enough. It’s too vague: people may read it and wonder whether Mayple is an agency or a freelance marketplace.
Mayple didn’t lead with a market category, so we’re left guessing what their product is (a cardinal sin of product positioning):
We asked two quantitative questions: how compelling is this offer (2.9/5), and how clear is it overall (3.5/5)?
Mayple scored low in both categories, and the qualitative insights tell us why. The panelists said:
Their header and subheader lack specificity: who are they for? How do they do it?
Since this test, Mayple successfully changed their homepage copy:
They listened to their ICP’s feedback and included a pre-header sentence to call out their market category. Now visitors know they help companies hire marketing talent.
Then, they define who you can hire with their services: freelancers and boutique agencies. Visitors understand that marketing talent is vetted through results and not sales pitches, resulting in a clearer and more compelling offer.
Chili Piper helps sales teams schedule sales appointments, but their copy leaves readers uninterested in their offering.
We tested their copy with a panel of Demand Generation and Growth Marketing Directors.
They scored a 3.5/5 for overall clarity, with some panelists wondering how the product works and what it does (besides converting or qualifying leads into meetings).
Chili Piper also scored low for interest in demos with a 2.5/5 overall. Panelists didn’t trust the efficacy of the tool; there’s no case study to read or price to review. They said:
Because readers felt the offer lacked proof of efficacy, we suggested they lead with a benefit statement so it’s crystal clear why users need Chili Piper.
We also suggested they call out their target audience of B2B revenue teams and follow it up with a clear outcome or job-to-be-done: “to book with prospects as soon as they express interest in a meeting.”
Their new copy removes as much purchase friction as possible. Leads understand how the offering benefits them and what it does in less than three seconds of reading.
When visitors land on your site, your messaging can either convince them to buy or leave them to bounce.
Copy testing helps you measure how well your target audience responds to your messaging, reducing friction in the buying process.
Mixed-method feedback models like Wynter’s testing framework allow you to combine qualitative and quantitative insights, verify what doesn’t resonate with your customer base, and improve your messaging.
Out now: Watch our free B2B messaging course and learn all the techniques (from basic to advanced) to create messaging that resonates with your target customers.