This week I conducted message testing on the home page of Jellyfish (volunteered by Kyle Lacy).
One of their ICPs is engineering leaders, so that's the audience I tested their messaging against.
Jellyfish clearly solves problems their target customers acknowledge as important. The effectiveness of the messaging, however, is held back by a few very specific issues.
Right now their demo interest scores only 3.2/5 — but they can significantly raise that score and drive more pipeline by fixing a few key issues.
1. How does it work?
The home page makes a lot of marketing claims, but offers little explanation of how it's all possible.
When your ICP reads everything and still has questions, you have a problem. They really want to know where all this data comes from, and whether it requires any change to how they work and manage.
"I don't know how your product works or what it does. There's marketing fud claiming that it will measure my team's business impact.... but I don't believe you at all. I need to see more about how it works and what it measures."
"I didn't see anything about how it actually works. For example, how is the data collected? What kind of changes would I have to make to my departments' workflows to enable this tool to integrate?"
The audience has a lot of questions - answering these in context (when the questions arise) will help a lot.
2. What's the setup like? What's involved?
People are worried about what it takes to set up, whether it adds to their workload, and whether it's yet another tool they have to keep an eye on.
"It was not made clear how this would be implemented in my environment."
"What is the overhead of maintaining this tool's data vs all the project management stuff I'm already doing? I don't want to bring in an admin just to keep this up-to-date."
3. What kind of specific reports does it show, what specific insights does it offer?
The home page shows only one generic screenshot (one that turns most viewers off), so people are left wondering whether the insights it produces are merely superficial.
If you leave blanks in the story, people will fill those in by themselves.
This lack of information is holding people back from scheduling demos.
One VP of Engineering commented:
"I get what its goal is and I like it. But I wouldn't ask for a demo for a product going in blind."
Note: "a lack of information" doesn't mean the information isn't somewhere on the website. The problem is that it isn't served up when the questions arise. That's where you lose most people.
This is the very point of message testing. You discover where your messaging misses the mark, so you can fix it and get more demos.
There are also a number of smaller items that need addressing, and I go through all of them in this video.