With MailCharts’ database of thousands of ecommerce email examples from a wide variety of brands, it’s easy to gather inspiration for your next campaigns. But how do you figure out if something that seems to be working for your competitors will also work for you without sending a new type of email or campaign to your whole list – and risking lower-than-usual performance?

By A/B testing your emails.

A/B testing email marketing allows you to test campaign ideas with a small sample of your audience so you can move forward knowing what works best. In this post, you’ll learn how you can best A/B test emails as well as get concrete A/B testing examples you can try on your own campaigns.

Let’s dig in!

What is an Email A/B Test?

An email A/B test, or split test, is one where you create two versions (A and B) of the same campaign or email and send each to an equal share of a small test group drawn from your total list or a segment of it. 50% of the test group gets version A, and 50% gets version B. The version that earns the higher open rate, more clicks, or more revenue wins and is then sent to the remaining subscribers.



How to A/B Test Emails: Tips

Select a test group

First of all, you need to decide who you’ll be testing on. The answer could be your entire list or a segment of your list. If you have a rather small list and everyone gets the same newsletter, the former is a good choice, but if you send different emails to frequent buyers than you do to less engaged subscribers, you may want to target only one of those segments.

Now, you won’t send your split test to everyone in the audience you want to run the test for. Instead, you’ll send it only to a randomly selected 20-30% of them. Continuing with our examples from above, that means to 20-30% of your entire list or to 20-30% of a specific segment.

Who gets version A and who gets version B is random as well, and ideally, everyone will get their emails at the same time so delivery time can’t influence the test result.
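
To make the mechanics concrete, here’s a minimal Python sketch of the split described above: it draws a random test group (25% in this example) and divides it evenly between versions A and B, leaving the rest of the audience as the holdout that later receives the winner. The function name, placeholder addresses, and exact percentage are illustrative only – in practice, your email service provider handles this randomization for you.

```python
# Minimal sketch (not tied to any particular ESP): draw a random 25% test
# group from a subscriber list and split it 50/50 into versions A and B.
import random

def build_test_groups(subscribers, test_fraction=0.25, seed=None):
    """Return (group_a, group_b, holdout) from a list of subscriber addresses."""
    rng = random.Random(seed)
    shuffled = subscribers[:]          # copy so we don't mutate the original list
    rng.shuffle(shuffled)

    test_size = int(len(shuffled) * test_fraction)
    test_group = shuffled[:test_size]
    holdout = shuffled[test_size:]     # receives the winning version later

    midpoint = len(test_group) // 2
    return test_group[:midpoint], test_group[midpoint:], holdout

# Example with placeholder addresses:
group_a, group_b, holdout = build_test_groups(
    [f"subscriber{i}@example.com" for i in range(10_000)],
    test_fraction=0.25,
    seed=42,
)
print(len(group_a), len(group_b), len(holdout))  # 1250 1250 7500
```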

Pick one element to test

So what should you test?

You can find a list of A/B testing examples below, but in general (and there is one exception), you’ll only test one specific element of an email for each A/B test you run. If you change two things – for example, both the color and the copy of your CTA button – you won’t be able to tell which of those two changes is responsible for the test results.

If version B gets the better results, it could be because of the different button color, the different button copy, or both. The same ambiguity applies if version B loses.

Select your determining performance metric

When you A/B test emails, the winning version can be chosen based on:

  • the open rate of each version.
  • the total number of unique clicks each version received.
  • the total number of clicks on a predetermined link within each email.
  • the total revenue generated. Note that this option is only available when your store is connected to your email testing tool and revenue tracking is enabled.

Obviously, if you’re testing your in-email CTA copy, you won’t choose a winner based on open rate; if you’re testing your email subject line, on the other hand, open rate is exactly the metric to use.

You’ll need to decide which of the above metrics will determine the winner when you set up your test, as email service providers usually won’t let you change the metric once the test is running, since that would make the results unusable.

However, some ESPs do let you monitor all of these performance metrics while the test is running, even though they’ll only use one of them to determine the winner at the end of the test period.
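
As a concrete illustration (with entirely made-up numbers), here’s a short Python sketch of how a winner would be picked once the determining metric is fixed. The field names and figures are assumptions for illustration only – your ESP performs this calculation for you.

```python
# Illustrative sketch: given raw stats for each version (hypothetical numbers),
# compute the candidate metrics and pick a winner based on the one metric
# chosen when the test was set up.
stats = {
    "A": {"delivered": 1250, "opens": 300, "unique_clicks": 60,
          "tracked_link_clicks": 25, "revenue": 540.00},
    "B": {"delivered": 1250, "opens": 280, "unique_clicks": 75,
          "tracked_link_clicks": 40, "revenue": 615.00},
}

def metric_value(s, metric):
    if metric == "open_rate":
        return s["opens"] / s["delivered"]
    if metric == "unique_clicks":
        return s["unique_clicks"]
    if metric == "tracked_link_clicks":
        return s["tracked_link_clicks"]
    if metric == "revenue":
        return s["revenue"]
    raise ValueError(f"Unknown metric: {metric}")

chosen_metric = "unique_clicks"  # decided before the test started
winner = max(stats, key=lambda version: metric_value(stats[version], chosen_metric))
print(winner)  # -> "B" for these example numbers
```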

Set your test runtime

How long your email A/B test should run depends on the size of your test group as well as the behavior of your recipients. If you have a large, highly engaged list, you’ll be able to create a fairly large test group that will open your test emails quickly (or not at all), meaning your test runtime can be relatively short.

If, however, you’re working with a smaller test group of recipients who tend to open more slowly, you’ll want to extend the test period over several days.

Double-check everything

As split tests usually can’t be changed while they’re running, it’s crucial to double-check everything before you start them. That means going over all the settings to make sure they’re exactly what they need to be.

Alternatively, you can create a special test email list consisting of company email addresses you have access to so you can actually run the test and check everything live before sending it to your real test group. The downside to this is that it takes more time – especially for longer email marketing campaigns.

What to do when a mistake gets through

Even if you check everything before hitting that “Run test” button, there’s still the chance you’ll catch a mistake when your test is already in progress. When this happens, it’s good to be using an email testing tool that allows you to stop a test and download a list of the recipients who have already received their email.

This way, you can send the remaining recipients the corrected test and, if needed, also send the recipients who’ve already gotten the “wrong” version a “Whoops” email. The latter is a good option if you have, for example, included an incorrect link.

How to Evaluate A/B Test Results

Most email service providers and dedicated email testing tools will automatically select a winner at the end of the test period based on the performance metric you selected when setting up your A/B test.

Some also allow you to manually select a winner before the end of the test period, which stops the test early. This is useful when one version is clearly performing better than the other and you want to get the winning email out to the rest of your list as soon as possible.

When selecting an email service provider or dedicated A/B testing email tool, it’s a good idea to check whether it provides detailed reporting while a test is in progress, so you have the data you need in case you want to end a test early.
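
This post doesn’t prescribe a statistical method for deciding when one version is “clearly” ahead, but a common sanity check before ending a test early is a two-proportion z-test on open or click rates. Below is a rough Python sketch with hypothetical numbers; many ESPs run an equivalent check behind the scenes.

```python
# Rough sketch of a two-proportion z-test on open rates, a common way to
# check whether one version is "clearly" ahead before ending a test early.
# All numbers are hypothetical.
from math import sqrt
from statistics import NormalDist

def z_test_two_proportions(successes_a, total_a, successes_b, total_b):
    """Return (z, two-sided p-value) comparing two conversion rates."""
    p_a = successes_a / total_a
    p_b = successes_b / total_b
    pooled = (successes_a + successes_b) / (total_a + total_b)
    se = sqrt(pooled * (1 - pooled) * (1 / total_a + 1 / total_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Version A: 300 opens out of 1,250 delivered; version B: 360 out of 1,250.
z, p = z_test_two_proportions(300, 1250, 360, 1250)
print(round(z, 2), round(p, 4))  # a p-value well below ~0.05 suggests the gap isn't just noise
```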



A/B Testing Examples and Ideas for Email

Below, you can find a selection of A/B testing ideas for email marketing. If you’re looking for more inspiration, browse through the email examples we’ve curated and organized in handy categories.

Different types of email subject lines

Testing different subject lines against each other is probably one of the best-known types of email A/B testing. Here are just a few subject line tests you can run:

  • with and without emoji.
  • generic versus specific.
  • question versus statement.


Different “From:” names

People open emails from people and brands they trust. Would it serve you to have “Ben” send your newsletters instead of having them come from your more generic newsletter department? You can find out!

When A/B testing the sender, use both a different “From:” name and email address – for example, a more generic newsletter sender versus a more personal one, like an email that comes from “Ben”.

Preheader text

The preheader text of emails is often overlooked or stuffed with unappealing keywords, even though it can play a massive role in getting your emails opened. If your preheader text always looks the same, here are some ideas to test:

  • make the preheader text complementary to the subject line.
  • add a little teaser.
  • mix in some scarcity or sense of urgency.
  • create curiosity.
  • focus on the email’s main CTA.


Shorter versus longer email body length

It’s not just about what you say, but also about how much you say. If you tend to send out short emails, your audience might not be getting enough information to entice them to click. On the other hand, if your emails are text-rich, they may be hitting “delete” before they get to the bottom (and your CTA).

Experiment with shorter and longer versions of your email content to see what works best for the different segments of your list.

Call-to-action styles and copy

While switching button colors is a part of split testing your calls-to-action, there’s so much more that you can do:

  • experiment with the placement of your CTAs.
  • try swapping buttons for links and vice versa.
  • play with font style and size.
  • change the copy.


Different (header) images

Sober or colorful?
With or without text?
Focusing on your product or merely hinting at it?

There are a lot of different ways to use images within your email body, and your header image is the first one recipients will see, so you’d better make sure it works for you!

Whether you make little tweaks to its current style or test a completely different visual, the result may surprise you.

Higher versus lower email frequency

Sometimes, your emails are great, but your audience is just getting too many of them, causing people to click “delete” rather than “open”. The opposite can also be the case: your audience loves your emails, but you send them so infrequently that you’re missing out on a lot of possible clicks.

Instead of changing the content of your campaigns, try split testing a different email cadence. Can you leave a week between emails instead of just two days? Or can you send that month-long campaign in just two weeks by emailing more frequently?

This is an easy A/B test to run as the only thing you might need to change within your emails is the mention of a date or a time reference.

Different send times

Even simpler than A/B testing your email frequency is testing your send times. To do this, you send the exact same emails on the exact same days but change the time at which they are sent. This way, you learn at which time of day your audience is most likely to engage with your campaigns.

Different email templates/designs

I know, I know. I said earlier that you should pick only one email element to test, but if your campaign isn’t working for you at all, it might be a good idea to switch things up entirely. Keep the same subject lines, preheader texts, and email copy, but use a different template. You can tweak your current template, or go for a completely different look.

The downside to this type of A/B test is that you won’t know what exactly made the difference if the new version wins. Was it the entire new look, or just a few elements?

If you already have well-performing campaigns, you may want to try split testing smaller elements at first, but if your emails are highly under-performing, a whole new look may be a good idea.

Test for the Best

Don’t guess or assume what will work best for your email campaigns – test it instead. A/B tests offer a great way to figure out exactly which elements, send times, and send frequencies work for your audience, so you know how to move forward.

Use the A/B testing examples in this post and sign up for MailCharts for more inspiration for your next A/B email test.
