Email A/B Tests: What to Test and How to Do It

Email marketing is a huge part of my business success. I used email marketing to build my first business, and I’ve relied on it in every business venture since.

But here’s the thing about email marketing. It doesn’t just work automatically by virtue of your sheer brilliance. Instead, you must do the tough work of building an address list, creating emails, curating those emails, and testing those emails.

Testing is a major but often overlooked component of email marketing. The real way to score big with email marketing is to test your emails in order to find out what gives you the biggest impact.

In this article, I share some of the most powerful A/B tests that I’ve discovered for email marketing. When you start testing your emails, you’ll begin to discover things that you hadn’t even dreamed of regarding what works and what doesn’t.

This alone is an eye opener, but it gets even better. Testing your emails leads to better email marketing success, which leads to better success in general.

How do you test your emails?

First off, let’s deal with a simple question: How do you do email A/B testing?

It’s easy. Most email marketing programs have split testing functionality built right in. All you have to do is turn it on (sometimes, it’s an additional paid feature), set up your tests, and let the results pour in.

Here are some of the major email marketing programs that have A/B testing built in:

  • Infusionsoft
  • ConstantContact
  • HubSpot Email
  • AWeber
  • Marketo
  • Mailchimp

This article is not a tutorial on each email marketing platform; it’s a guide to A/B testing that applies to any email marketing program.

A Word of Warning

Once you start testing your emails, you may be tempted to start thinking of email split testing as a silver bullet to all your email marketing problems.

Well, it’s not a silver bullet. But it’s the next best thing.

There’s one problem regarding email marketing that I want to draw your attention to:  statistical significance. The problem isn’t with testing itself, but with the way that some email marketing programs deliver test results.

Although most email marketing programs allow you to set up A/B testing, not all of them report the results accurately.

As Peter Borden explained in his article on SumAll, “I have yet to see an email platform that actually takes those results and tries to determine if they’re statistically significant. Most don’t. They simply see which version has the greater number of opens and call that one the winner.”

This isn’t a show-stopper to email testing. The simple way to solve this conundrum is to look at the data yourself, rather than trusting the declared winner.

Use an A/B significance test calculator, like this free one. Simply plug in the numbers, and analyze the test results.

[Image: A/B test significance calculator]

This calculator will allow you to deal with the data yourself and figure out the significance of your split tests.
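
If you’d rather script the check than paste numbers into a calculator, here is a minimal sketch of the same idea: a two-proportion z-test comparing two variants. The counts below are made up for illustration; plug in your own.

```python
from math import erf, sqrt

def ab_significance(hits_a, n_a, hits_b, n_b):
    """Two-proportion z-test: is the difference between variant A's
    and variant B's rate (opens, clicks, conversions) significant?"""
    rate_a = hits_a / n_a
    rate_b = hits_b / n_b
    # Pooled rate under the null hypothesis that both variants perform the same
    pooled = (hits_a + hits_b) / (n_a + n_b)
    std_err = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (rate_a - rate_b) / std_err
    # Two-tailed p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return rate_a, rate_b, p_value

# Hypothetical results: variant A got 220 opens out of 1,000 sends, B got 180
rate_a, rate_b, p = ab_significance(220, 1000, 180, 1000)
print(f"A: {rate_a:.1%}  B: {rate_b:.1%}  p-value: {p:.3f}")
print("Significant at 95% confidence" if p < 0.05 else "Not significant yet; keep testing")
```

A p-value below 0.05 is the usual bar for calling a winner at 95% confidence; anything above that means keep the test running or re-run it with a bigger list.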

Now, let’s get into some of those split tests that you can start working on today, and start bringing in the results that will change your business.

Timing:  What day of the week or time of day gives you the best open rates?

One of the most blogged-about issues in email marketing is timing. What’s the best time of day or day of the week to send emails?


Everyone wants to know, but precious few actually test it. Save yourself the time and the grief of guessing: test it for your own list.

You see, there’s no stock answer to the question “what time is best?” Like everything else in marketing (and life), the true answer is that it depends.

Everything has its perfect timing — tweets, posts, retweets, +1s, and emails. That timing has a major impact on whether or not someone sees your email, let alone opens it, clicks through, or converts.

Finding that perfect time isn’t just a matter of getting the biggest open rates, but is also a major contributing factor to getting the highest CTRs and conversions.

For example, you may think that an email sent at 7:30 a.m. would get high CTRs. Yes, it might. Employees are turning on their computers and working through their email. Here’s the catch, though:  They feel rushed, open your email, but may not have time to respond to your offer. Sure, you get high CTRs, but your conversions are awful.

But what if you send it at 4:30 p.m., when employees are bored, winding down, and looking for a distraction? They may see your email, open it, and be more likely to convert. Fewer opens? Maybe. Higher conversions? Yes.

See? It just depends on a lot of things. Test and you’ll nail the perfect time eventually.
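
If your platform lets you export campaign stats, you can also run the timing comparison yourself across past sends. Here’s a rough sketch; the file name and column names (send_hour, sent, opens, conversions) are assumptions, and your export will almost certainly label things differently.

```python
import csv
from collections import defaultdict

# Assumed export: one row per campaign, with the hour it was sent and its totals
totals = defaultdict(lambda: {"sent": 0, "opens": 0, "conversions": 0})

with open("campaign_stats.csv", newline="") as f:
    for row in csv.DictReader(f):
        bucket = totals[int(row["send_hour"])]
        bucket["sent"] += int(row["sent"])
        bucket["opens"] += int(row["opens"])
        bucket["conversions"] += int(row["conversions"])

# Compare open rate AND conversion rate per send hour, since a time slot
# that wins on opens can still lose on conversions
for hour in sorted(totals):
    t = totals[hour]
    print(f"{hour:02d}:00  open rate {t['opens'] / t['sent']:.1%}"
          f"  conversion rate {t['conversions'] / t['sent']:.1%}")
```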

Subject Line:  What subject line increases your open rates and conversion rates?

By far, the most significant tests deal with subject lines.

Take a look at the results I recently got by testing my subject lines:

[Image: subject line test results]

Subject line testing can send your open rates sky high. It took me a few attempts, but I eventually found the subject line formula that boosted my open rates by 203%.

In email marketing, a 203% increase in open rates can easily translate into tens of thousands of dollars in added revenue. Think about it: What good is an email that nobody opens? Statistically speaking, most people don’t open marketing emails. Global email open rates are only 32%, and the clickthrough rate is a shocking 4%. Sorry, but that’s life.

If you can get more people to open your emails, you’ve automatically gained big time.

The goal of subject line testing is to improve open rates. But that’s not all. The right subject line doesn’t just get people to open the email. It also shapes their perception of the entire email, and whether or not they convert.

The result that is easiest to identify in subject line testing is the open rate, but you should also see what kind of impact your subject line has on your conversion rate.
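
A simple way to keep both numbers in view is to compute open rate and conversions per open side by side for each variant. A small sketch with invented counts:

```python
# Hypothetical results for two subject line variants of the same email
variants = {
    "Short, curiosity-driven": {"sent": 5000, "opens": 1900, "conversions": 95},
    "Longer, descriptive":     {"sent": 5000, "opens": 1400, "conversions": 112},
}

for name, v in variants.items():
    open_rate = v["opens"] / v["sent"]
    # Conversions per open show what the subject line set the reader up to do
    conv_per_open = v["conversions"] / v["opens"]
    print(f"{name}: open rate {open_rate:.1%}, conversions per open {conv_per_open:.1%}")
```

In this made-up example, the curiosity-driven line wins on opens but loses on conversions per open, which is exactly the trade-off worth catching.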

You can’t afford not to test subject lines. This is the most important part of your entire email.  But what is it about your subject line that you should test? Here are 6 subject line test possibilities.

1. Length:  Which works better — a short subject line or a long subject line?

MailChimp reports that 28–39 characters is the sweet spot. I’ve found that even shorter can sometimes be better.

2. Curiosity:  Which works better — a subject line that reveals the contents of the email or one that sparks curiosity?

Some email marketers swear by the mantra that you should always reveal the contents of the email in the subject line. Others declare that sparking curiosity is the only true way to get more open rates. Which is the better way?

This subject line from Quicksprout got major opens: “Who Is More Active on Social Media? Men or Women?”

[Image: the Quicksprout subject line example]

Obviously, you won’t know until you test.

3. Marketing:  Which works better — a subject line that invites the user to buy or one that directly provides value to the user?

I’ve discovered that subject lines that communicate value to the user are the ones that get the biggest open rates. People are looking at their email inbox thinking, “What’s in it for me?” If your email can deliver something good, then they will be more likely to open it.

Notice how these email subject lines make the offer more benefit-focused and less “buy”-focused. Each attempts to add value.

[Image: benefit-focused subject line examples]

4. Capitalization: Which works better — a subject line with some words in all caps or one without words in caps?

Most of us are aware that using all caps can be a big turnoff for most readers, but some email marketers have used it successfully. Will it work for your emails?

5. Questions:  Which works better — a subject line that asks a question or one that doesn’t?

Questions make users think. More specifically, questions make them think about your email. If the answer isn’t obvious or if they want to see if their answer is correct, you’ve gained a clickthrough. It’s worth trying.

6. Greetings:  Which works better — a subject line that greets the user by name, or one that doesn’t?

Name-specific greetings can be extremely powerful, or they can be a complete turnoff. The issue is a complex one, because some market segments like the personalization, whereas others see it as fawning, artificial, or even a red flag for a security breach.

The best way to find out is to test it. Once you win with your subject lines, you’ve made some of the biggest possible gains in email marketing.

Sender:  What “from” line works best — an individual or a company?

Nearly every email client clearly displays the “sender” of the email before the user even opens the email.

In fact, the sender is usually the first thing that people notice — even before the subject line! The reason for this is that many email clients display the sender prior to the subject line in the standard left-to-right reading pattern.

Here’s how Gmail displays emails in the inbox:  1) Sender first, 2) Subject line next, followed by a body copy preview.

[Image: Gmail inbox showing sender, subject line, and body copy preview]

It logically follows that the sender of an email has a huge impact on open rates and other critical email success metrics.

The best test to perform is sending the email from an individual’s name vs. the company name. It seems fairly intuitive that people are more likely to open an email from an individual rather than a faceless corporate entity.

But can you be sure? Only if you test it.

Greeting:  What works better — a personalized greeting or a non-personalized greeting?

Personalization in all arenas of marketing is hailed as a major breakthrough.

Some evidence suggests, however, that personalization might not be that effective. Were you affected by any of the recent major security breaches? If so, then you know that personalization can seem a bit scary.

As long ago as 2012, researchers were concerned that “personalized emails don’t impress customers.” According to the report from Sunil Wattal at Temple University Fox School of Business, “Given the high level of cyber security concerns about phishing, identity theft, and credit card fraud, many consumers would be wary of emails, particularly those with personal greetings.”

Now, in the wake of major cyber scandals, consumers are probably even more wary. But are your customers wary?

The issue has been tested time and again. The results vary widely. Often, the differences are negligible.

Foolishadventure tested personalized emails against non-personalized ones. The first email was the personalized one:

[Image: personalized vs. non-personalized email test results, from Foolishadventure]

The personalized email’s open rate is only slightly higher (a 1% difference), and its clickthrough rate is 2.2% higher.

When tested again, however, the opposite was true. The personalized greeting, on the top, got lower open rates and CTRs.

[Image: results from the second personalization test]

The takeaway is not that “personalization is better” or “generic is better.” The takeaway is that Foolishadventure should probably run the test a few more times.

Another takeaway is that you should test to see what results you get.
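
If you assemble your own sends rather than relying on your platform’s built-in merge tags, the test itself is just a random split plus two greeting templates. A minimal sketch; the recipient records and the send() hand-off are placeholders, not a real API:

```python
import random

# Hypothetical recipient records pulled from your list
recipients = [
    {"email": "ann@example.com", "first_name": "Ann"},
    {"email": "raj@example.com", "first_name": "Raj"},
    {"email": "mei@example.com", "first_name": "Mei"},
]

random.shuffle(recipients)  # randomize before splitting into two groups
half = len(recipients) // 2
personalized, generic = recipients[:half], recipients[half:]

for person in personalized:
    greeting = f"Hi {person['first_name']},"
    # send(person["email"], subject, greeting + body)  # hand off to your platform or API

for person in generic:
    greeting = "Hi there,"
    # send(person["email"], subject, greeting + body)
```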

Length:  What works better — a complete email message or one that requires a clickthrough?

Most of my marketing emails require a clickthrough. Why? It’s all about the goal of my email marketing. In the case of Quicksprout, in the email below, I’m trying to provide my subscribers with the best information on the Web.

I want subscribers to be able to get the full experience on my blog, to interact in the comments, and to engage at a higher level. That’s why I give them three opportunities to click through.

[Image: Quicksprout email with three clickthrough links]

Other marketers use the long email approach. Most notably, Ramit Sethi of IWT writes really long emails, and his email marketing is top notch.

Both approaches work. This entire test depends on the goal of your email marketing. Are you looking to drive traffic, improve conversions, or engage readers? Decide on your objective, or allow your testing to influence that objective.

CTA:  What works better — a button or text?

Every successful email has a call to action — something that you are asking the reader to do. It can be as simple as a “read the rest of this article,” or it can be as significant as “sign up for your free trial.”

Should you put this CTA in a line of text? On a big button? Or both?

The issue of email CTAs is a big one. The most shocking mistake that I’ve seen email marketers commit is having no CTA at all! By all means, put a CTA in your email, and then figure out which one gives you higher conversions.

This type of test is particularly valuable, because you get a close look at one of the issues that matter most in your email marketing — actual conversions or clickthroughs.

You can (and should) run more tests off of this one. Once you figure out which one converts better, you can start testing button color, size, and other features. You should also try different variations of the CTA wording, and even different CTA objectives.
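
If you build your own HTML templates, the two CTA variants can be as small as swapping a single block; everything else in the email should stay identical so the test isolates the CTA. A rough sketch, with a placeholder link and styling:

```python
# Variant A: plain text link
cta_text = '<p><a href="https://example.com/article">Read the rest of this article</a></p>'

# Variant B: the same link styled as a button (inline styles, since most
# email clients ignore external stylesheets)
cta_button = (
    '<p><a href="https://example.com/article" '
    'style="display:inline-block;padding:12px 24px;background:#e8740c;'
    'color:#ffffff;text-decoration:none;border-radius:4px;">'
    'Read the rest of this article</a></p>'
)
```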

A Testing Blueprint

So, you’ve read this article, and now you’re thinking, “Um, great. Now what?”

Let me sketch out a quick how-to of what to do next. First, go to your email marketing provider, and figure out how to turn on and use A/B testing. Next, start a test — just one test. Once you get results, move on to the next test.

I’ve listed the tests above in the order in which you should run them. They function sort of like an inverted pyramid, in which the first tests help you attract the greatest number of opens, which then leads to more conversions in the final tests.

  1. Timing: What day of the week or time of day gives you the best open rates?
  2. Subject Line: What subject line increases your open rates and conversion rates?
  3. Sender: What “from” line works best — an individual or a company?
  4. Greeting: What works better — a personalized greeting or a non-personalized greeting?
  5. Length: What works better — a complete email message, or one that requires a clickthrough?
  6. CTA: What works better — a button or text?

Once you conduct these six basic tests, move on to more advanced split tests, or simply start over and run each test again.
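
As you work down the list, one habit worth keeping: after each test, run the raw counts through the same significance check from the calculator section before declaring a winner and moving on. A quick sketch reusing the ab_significance function from earlier, with invented numbers:

```python
# Each entry: (test name, (hits, sends) for variant A, (hits, sends) for variant B)
completed_tests = [
    ("Timing",       (210, 1000), (260, 1000)),
    ("Subject line", (260, 1000), (395, 1000)),
]

for name, (hits_a, n_a), (hits_b, n_b) in completed_tests:
    rate_a, rate_b, p = ab_significance(hits_a, n_a, hits_b, n_b)
    verdict = "call the winner" if p < 0.05 else "re-run or wait for more data"
    print(f"{name}: A {rate_a:.1%} vs B {rate_b:.1%}  ->  {verdict}")
```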

What email A/B tests have you used and found to be successful?
