
Best Practices for A/B Testing Your Emails—Plus 3 Noteworthy Examples

When it comes to email marketing best practices, contradictory advice abounds. You’ve probably heard that the best color for your CTA buttons is red…or blue…or orange… The truth is that it’s less about the color of the button and more about the contrast between the button and the rest of your email, not to mention any cultural or industry-specific significance the color may have for your audience.

So what color will stand out and capture your audience’s attention? I won’t pretend to know the answer to that, which is exactly why A/B testing is so important.

A/B testing, also called split testing, is a simple way to test and optimize your emails for better results. With a marketing automation tool, all you need is two versions of an email and a list to send them to. The tool sends each version to a small, randomly selected portion of that list, waits for a specified period of time, analyzes the engagement results for you, and automatically distributes the winning email to the remainder of the list.
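If it helps to picture what the platform is doing behind the scenes, here’s a minimal Python sketch of that workflow. It’s an illustration only, not Pardot’s API: `send` and `measure_engagement` are hypothetical callbacks standing in for the delivery and reporting your tool handles for you.

```python
import random

# Illustration only -- not Pardot's API. `send` and `measure_engagement`
# are hypothetical callbacks standing in for delivery and reporting.
def run_ab_test(recipients, email_a, email_b, send, measure_engagement,
                test_fraction=0.2):
    """Send two variants to a small random sample, then send the winner
    to the rest of the list."""
    sample_size = max(2, int(len(recipients) * test_fraction))
    test_group = random.sample(recipients, sample_size)
    remainder = [r for r in recipients if r not in test_group]

    # Split the test group evenly between the two versions.
    half = len(test_group) // 2
    group_a, group_b = test_group[:half], test_group[half:]
    send(email_a, group_a)
    send(email_b, group_b)

    # ...wait for the test window to elapse, then compare engagement...
    if measure_engagement(group_a) >= measure_engagement(group_b):
        winner = email_a
    else:
        winner = email_b
    send(winner, remainder)
    return winner
```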

Whether you’re just getting started with A/B testing or you’re a seasoned pro who just wants to try out some new tactics, make sure to keep the following points in mind.

Have a goal.

Before you get started with an A/B test, you need to have a goal in mind. What results do you want to see? If your goal is to increase your open rate, you’ll want to focus on the subject line or the time you send the email. If you’re aiming for more downloads on your latest white paper, however, you’ll probably want to optimize the content and design of your email.

You can test virtually any aspect of your email, including:

  • Subject line
  • Layout
  • Headline
  • Copy/tone
  • Call to action
  • Button color
  • Offers and incentives
  • Send time

Keep it simple.

Make sure you’re only changing one element per email—otherwise you won’t know which factor made the winner successful.

Set a timeline.

How long should you test the email before sending out the winner? That depends. If you’re sending a time-sensitive offer, the timeline will likely be short (a few hours to two days), but in other cases, there’s no need to rush. Give your recipients enough time to interact. If you have historical data that shows how quickly recipients normally interact with your emails, you can use it to come up with a reasonable timeline.
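As a rough illustration of how that historical data could feed into a timeline, the sketch below (plain Python, with made-up numbers) picks the delay by which most of your past opens had already arrived and uses it as the test window.

```python
# Rough illustration with made-up numbers: use the delay by which most
# historical opens had already arrived as the length of the test window.
def suggested_test_window(open_delays_hours, coverage=0.9):
    """open_delays_hours: hours between send and open in past campaigns."""
    delays = sorted(open_delays_hours)
    index = min(int(len(delays) * coverage), len(delays) - 1)
    return delays[index]

# Example: if roughly 90% of past opens arrived within 30 hours,
# run the test for about 30 hours before sending the winner.
past_delays = [0.5, 1, 2, 3, 4, 6, 8, 12, 18, 24, 30, 48]
print(suggested_test_window(past_delays))  # 30
```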

Be consistent.

If you send your test emails in the morning, try to send the winning email in the morning as well. Changing the conditions of the second round of emails may result in decreased performance.

Math matters.

If your test audience consists of four people, your results won’t have any real validity. Fortunately, you don’t have to be a statistician to be sure your results are statistically significant. This free tool from VWO can tell you in just a few seconds whether you can put stock in your A/B split test’s results:

VWO A/B Split Test Significance Calculator
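If you’re curious what’s happening under the hood of a calculator like that, the usual approach is a two-proportion z-test. Here’s a minimal Python sketch of that test (VWO’s exact method may differ, and the open counts below are made up):

```python
import math

# A standard two-proportion z-test on open rates. The numbers below are
# hypothetical; a p-value under 0.05 is the usual bar for significance.
def ab_significance(opens_a, sends_a, opens_b, sends_b):
    rate_a, rate_b = opens_a / sends_a, opens_b / sends_b
    pooled = (opens_a + opens_b) / (sends_a + sends_b)
    std_err = math.sqrt(pooled * (1 - pooled) * (1 / sends_a + 1 / sends_b))
    z = (rate_a - rate_b) / std_err
    # Two-sided p-value from the standard normal distribution.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Example: 120 opens out of 500 sends vs. 150 opens out of 500 sends.
z, p = ab_significance(120, 500, 150, 500)
print(round(z, 2), round(p, 3))  # roughly -2.14 and 0.033 -> significant
```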

Never stop testing.

So, let’s say you’ve learned that personalized subject lines result in higher open rates than unpersonalized ones. That’s great, but don’t stop there. Continue tweaking your emails. For example, test short personalized subject lines against longer ones. It’s also important to understand that what works with one segment of your audience may not work with another—and what works today may not work next month. If you keep testing, you’ll keep improving.

If you’re looking for some inspiration to get started, take a look at these companies that put split testing to good use and saw incredible results.

Case Study #1: Spicerhaart’s Subject Line Test

Spicerhaart, a European real estate agency, tested two different subject lines to determine which format would get a higher open rate.

Version A: Mill View | Join us for our pre-launch weekend to find out more about these exciting new homes

Version B: Mill View | Pre-launch weekend | Quality homes affordably priced

While both subject lines have similar messages, version B is easier to skim because it’s a little bit shorter and broken into three distinct parts. As a result, this version increased opens by nearly 74%, and Spicerhaart was able to reach 97.5% of its list using the optimized subject line.

Source: WhichTestWon

Case Study #2: ActiveNetwork’s Tone Test

RegOnline, Active Network’s event management software, used split testing to learn which of two distinct “voices” resonated with leads who had started signing up for the service but abandoned the form partway through.

Version A: [image of the hard-sell email]

Version B: [image of the low-pressure email]

As you can see, version A includes a hard sell with typical sales language: “you’re just one step away”; “call us”; “get started.” Version B, on the other hand, acknowledges that the recipient may be hesitant to share their personal information. In this version of the email, the sales rep mitigates that concern with low-pressure language and a friendly tone. In the end, version B won out, increasing lead inquiries by an impressive 349%.

Source: MarketingExperiments

Case Study #3: VTNS Solutions’ Send Day Test

VTNS Solutions, a Nigerian SEO firm, used A/B testing to find the best day of the week to send their bimonthly newsletter. They used to send it out at 4 p.m. on Mondays, but they hypothesized their open rates would increase if they sent it on weekends instead—and they were right. When they sent out their newsletter on Saturday afternoon, it received 5 times as many opens.

Source: VTNS Solutions

Have you seen success with A/B split testing? Did any of your results surprise you? We’d love to hear what works for you, so feel free to let us know in the comments.

