HubSpot Sequences have always been a go-to tool for Sales Pro and Enterprise-level users. But let's face it, figuring out the perfect subject line, message body, or personalization tokens has always been a challenge, or even a guessing game—until now.
With HubSpot’s new A/B testing feature for sequences, you can easily test elements like subject lines, call-to-action buttons, and personalization tokens. So, let's dive into how this works, why it's a game-changer, and how you can leverage it to win more leads with your sales emails.
A/B testing, also known as split testing, is a simple way to improve your results by testing two versions of a user experience (such as a landing page or an email) against each other to determine which one performs better.
For instance, you might create two versions of a landing page (version A and version B, hence the name). Each one is identical except for a single change, such as a different headline or a different CTA. You then randomly split your traffic between the two pages to see which has a better conversion rate.
Similarly, you can A/B test your sales emails, and since HubSpot has incorporated A/B testing into its sales tools, this is now easier than ever.
By testing email steps in HubSpot sequences, you can refine your outreach, enhance prospect engagement, and improve your conversion rates.
Once you set up your test, HubSpot's A/B testing feature automatically splits sends evenly across two versions of your emails, capturing data on sends, opens, clicks, replies, and meetings booked for each version. This helps you make decisions more easily, so you can continuously improve your email campaigns.
A/B testing is easy to set up from a given sequence's page in HubSpot. Check out our full HubSpot Hacks video for more detail on getting it set up properly:
What email variables should you test? A/B tests work best when testing one variable at a time. Otherwise, you won’t be able to figure out which variable is making the difference. There are some exceptions to this, but let’s keep it simple for now.
Consider testing variables like these in your emails: the body copy, the subject line, personalization tokens, and the call to action (CTA).
What makes good email body text anyway? Here are some aspects of the body text you’ll want to improve. Consider what you might change about each of these, which you can then A/B test to see if your change helped.
Pro tip: In most cases, less is more when it comes to body copy length. In addition to testing your body copy messaging, keep length in mind too — test shorter vs. longer!
The subject line is about getting someone to open your email in the first place. But what makes a good subject line? Here are some tips to get started:
Pro Tip: Open rates are becoming less reliable due to increased privacy settings, so lean on stronger signals like email replies or meetings booked.
Do you use HubSpot's personalization tokens to reference contacts' names, companies, and so on? As with all marketing questions, the answer is "it depends!" So let's test.
Personalization can be a little difficult if you’re sending emails en masse, which is part of why sales results come from quality of messaging, not quantity.
How early do you make the ask? Do you ask boldly, or humbly? Optimizing your CTA is one of the most important parts of an effective sales sequence.
Let's explain this with an example of how NOT to do it. Say you run a test where you change two things at once: the subject line and the CTA. So you have:
Version 1: Subject Line A + CTA 1
Version 2: Subject Line B + CTA 2
This is a recipe for unreliable data!
Let’s say Version 2 gets a 5% better response rate. But wait — what if Subject Line B is improving the results a lot for Version 2, yet CTA 2 is actually damaging performance?
Without an additional test, you have no way of knowing if it’s the new subject line or new CTA that’s making the difference.
Each email you make can have up to five versions, with up to two versions toggled on at a time. If you wanted to test the two different variables in the example above, you could end up with four emails, like this:
Version A: Subject Line A + CTA 1
Version B: Subject Line A + CTA 2
Version C: Subject Line B + CTA 1
Version D: Subject Line B + CTA 2
So after testing A against B, let’s say B wins (meaning CTA 2 performs better). On the next test, you can test B versus D to see which subject line is better.
There's a whole lot more you can figure out from here, such as whether a subject line only gets better results when it's paired with a certain CTA. This gets more complicated with each variable you introduce, so keep it simple if you're not familiar with the theory behind it.
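To make that concrete, here's a quick illustrative sketch in Python, using purely hypothetical reply counts, of how a subject line can look better on average while only actually helping when it's paired with one particular CTA:

```python
# Hypothetical reply counts for the four versions (A-D) described above.
versions = {
    "A (Subject A + CTA 1)": (10, 100),  # (replies, sends)
    "B (Subject A + CTA 2)": (12, 100),
    "C (Subject B + CTA 1)": (20, 100),
    "D (Subject B + CTA 2)": (12, 100),
}

for name, (replies, sends) in versions.items():
    print(f"{name}: {replies / sends:.0%} reply rate")

# Subject Line B averages 16% vs. Subject Line A's 11%, but all of that lift
# comes from the pairing with CTA 1; with CTA 2, the subject line makes no
# difference at all. That's the kind of interaction that takes more data
# (and more careful testing) to untangle.
```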
A good rule of thumb is to get at least 100 contacts to receive each version of your email before you compare results, so that each version has a large enough sample to draw statistically meaningful conclusions from.
That said, this is just a rule of thumb. The smaller the difference in results, the more data points you’ll want to back up your conclusion. After all, if the difference isn’t huge, it could just be random chance that one outperformed the other.
We’ll avoid the math lesson, but you can easily test the statistical significance of your results online if you aren’t sure.
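If you'd rather check the math yourself, here's a minimal sketch of a standard two-proportion z-test in Python; the reply counts are made up for illustration, and you could just as easily compare open rates or meetings booked:

```python
# A minimal two-proportion z-test for comparing reply rates between two
# email versions. All counts below are hypothetical.
from math import erf, sqrt

def two_proportion_z_test(replies_a, sends_a, replies_b, sends_b):
    """Return the z-score and two-tailed p-value for the difference in reply rates."""
    rate_a = replies_a / sends_a
    rate_b = replies_b / sends_b
    # Pooled rate under the null hypothesis that both versions perform the same
    pooled = (replies_a + replies_b) / (sends_a + sends_b)
    std_err = sqrt(pooled * (1 - pooled) * (1 / sends_a + 1 / sends_b))
    z = (rate_b - rate_a) / std_err
    # Two-tailed p-value from the standard normal distribution
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical results: Version A got 12 replies from 120 sends,
# Version B got 25 replies from 118 sends.
z, p = two_proportion_z_test(12, 120, 25, 118)
print(f"z = {z:.2f}, p = {p:.3f}")
# A p-value below 0.05 is the conventional bar for treating the difference
# as more than random chance.
```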
A/B testing is about continuous improvement. Say you first test your subject line and find Version B is substantially better than Version A, winning you far more email opens.
Armed with this data, use Version B as your new starting point: with two new emails that share the winning subject line, you can test your body copy or your CTA to see if you can drive your results up even further.
Rinse and repeat, with one variable per test, and over time you’ll get much better sales results.
From inconsistent naming conventions to improperly categorized leads, a disorganized HubSpot account makes it harder to track results and can ruin your ability to make data-backed decisions. To make sure you're getting the most out of your HubSpot setup (and fix anything that's currently broken), get in touch with us today.