Email Marketing

Test, test and test again …

Firstly, let me say that testing is one of, if not the, most important areas of email marketing – if you aren't testing, what are you learning? However, in the beginning it's important to implement the basics correctly: create a schedule and content plan, ensure your data is in good shape, build templates and start sending emails. Then you can be confident in your knowledge and set-up, allowing you to 'test, test, test'.

Why is testing so important?

Simply – so we can learn and improve. Nothing is right first time, and you can learn so much from your audience. For me, this is the beauty of email marketing: your audience is right there, and you can test and measure almost anything, as long as your audience is a reasonable size.

As an email marketer, you'll get asked a range of questions: what day should you send emails, how many emails should you be sending, how long should the subject line be, does personalisation actually work? While these are all good questions, they have one thing in common – the answer is to test. Email is one of the most measurable channels; testing will ultimately improve your results, and you'll understand your audience better. It's like having an open focus group that you can learn from.

Planning your testing

You've probably got a lot of ideas to test, and I don't blame you – it's addictive. Before you start testing, it's a good idea to create a spreadsheet collating all of your ideas and thoughts, identifying what you might test, the impact it might have, and the effort needed to build the test. I've put together a few ideas you might consider:

Campaign level testing ideas 

Campaign level testing is about the elements of your campaign, for example:

  • Subject line length 
  • Subject line tone of voice
  • Subject line emojis 
  • Incentive or no incentive 
  • CTA tone of voice: 'shop now' vs. 'ready, set, shop'
  • Personalisation vs. no personalisation

Customer base testing ideas 

Customer base testing is focused on the activity of the customer base itself – for example, how many emails do customers need a month? If you are running this type of test, send your control group your regular schedule and your test group your new schedule, and measure the same metrics as above. You might also want to measure how individual customers performed: did those receiving a lower email volume purchase less or more as individuals? A sketch of that per-customer view follows below.
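
If you track orders at contact level, the per-customer comparison can be done in a few lines. Here's a minimal sketch, assuming hypothetical column names and made-up figures:

```python
# A minimal sketch of the per-customer view: each row is one contact,
# tagged with their test group and their activity over the test window.
# Column names and figures are hypothetical.
import pandas as pd

contacts = pd.DataFrame({
    "contact_id": [1, 2, 3, 4, 5, 6],
    "group":      ["control", "test", "control", "test", "control", "test"],
    "orders":     [2, 1, 0, 3, 1, 0],
    "revenue":    [80.0, 35.0, 0.0, 120.0, 40.0, 0.0],
})

# Average orders and revenue per individual contact, by group - this
# shows whether a lower send volume changed individual behaviour.
print(contacts.groupby("group")[["orders", "revenue"]].mean())
```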

So how do you run a test?

1.  Set your hypothesis 

What do you want to know more about? What do you think might happen, and what test are you going to run to review the outcome?

2. Assign your test time 

You should run your test more than twice, to give it a fair chance to perform. For example, if you're sending weekly emails, set a whole month aside to test your hypothesis – that gives you at least four sends to compare.

3. Splitting your audience

I would recommend testing on your whole database, provided your customer base is over 1,000 contacts – that allows for a sizeable split and realistic results.

You should split your base 50/50 on a contact level, creating an ‘A – Control Group’ and ‘B – Test Group’. These groups must stay the same for the duration of the test. 

If you currently segment your customer base into new, active and inactive customers, keep that grouping and split each segment, so you obtain results per segment. Keep in mind that what works for one segment might not work for another – your new users are unlikely to respond the same way as your inactive base. A simple way to do this split is sketched below.
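
As an illustration of keeping the split stable at contact level, here's a minimal sketch in Python (the segment names and contact IDs are made up; hashing the contact ID is just one way to guarantee a contact always lands in the same group):

```python
# A minimal sketch of a deterministic, contact-level 50/50 split,
# applied within each segment so you can read results per segment.
import hashlib

def assign_group(contact_id: str) -> str:
    """Place a contact in 'A - control' or 'B - test', permanently."""
    digest = hashlib.sha256(contact_id.encode("utf-8")).hexdigest()
    return "A - control" if int(digest, 16) % 2 == 0 else "B - test"

segments = {
    "new":      ["c001", "c002", "c003"],
    "active":   ["c004", "c005", "c006"],
    "inactive": ["c007", "c008", "c009"],
}

for segment, contact_ids in segments.items():
    for cid in contact_ids:
        print(segment, cid, assign_group(cid))
```

Because the group comes from a hash of the contact ID rather than a random draw, re-running the assignment never moves anyone between groups, which keeps the test clean for its whole duration.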

4. Setting your test criteria

Now you've got your A and B groups, you'll need two copies of your email – make sure you mark which campaign belongs to group A and which to group B. Let me make one key point here: everything should stay the same between the campaigns apart from the element you are testing. For example, if you're testing your subject line, only change that element between the two groups.

Launch your campaigns then wait for the results – exciting! 

5. Results 

Once your campaign has sent, wait 24 hours, then pull the results into a spreadsheet to collect the essential metrics: sent, opened, clicked, unsubscribes, orders and revenue (if you measure conversion metrics).

You'll need to calculate the open rate, click-to-open rate, unsubscribe rate and conversion rate for both your test and control (A/B) groups.

You'll also need to calculate the uplift or decrease of your test vs. your control. The formula is: =test/control-1. For example, if your test drove a 32.7% open rate and your control drove a 43% open rate, your test hasn't performed well: 0.327/0.43 - 1 = -0.24, a 24% decrease.
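
To make the arithmetic concrete, here's a short sketch of the rate and uplift calculations, using made-up campaign totals that reproduce the example above:

```python
# A minimal sketch of the rate and uplift calculations. The totals are
# hypothetical; the uplift formula is test/control - 1.
def rates(sent, opened, clicked, unsubscribed, orders):
    return {
        "open_rate":        opened / sent,
        "click_to_open":    clicked / opened,
        "unsubscribe_rate": unsubscribed / sent,
        "conversion_rate":  orders / sent,
    }

control = rates(sent=10_000, opened=4_300, clicked=860, unsubscribed=25, orders=120)
test    = rates(sent=10_000, opened=3_270, clicked=720, unsubscribed=20, orders=110)

for metric in control:
    uplift = test[metric] / control[metric] - 1  # negative = decrease
    print(f"{metric}: control {control[metric]:.2%}, "
          f"test {test[metric]:.2%}, uplift {uplift:+.1%}")
```

The same calculation applies at the end of the testing period: sum each metric across all sends for each group first, then work out the rates and uplift on the totals.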

Continue to do this until your testing period is over, then add up the results for your test group and control group and calculate the uplift/decrease across all metrics.

This will give you a view of whether your test was successful. If it was, you can roll it out; if it wasn't, you've got more questions to answer – so test again!

6. Interpreting your results

If your test shows a mixed bag of results, it's important to review them and determine which metrics are most important to you – here are some examples:

  • The test has driven a high open rate but low click-through – this depends entirely on what you're testing. If you're testing subject lines, the test was successful, because your north-star metric is opens/open rate; if you're testing content within the email, look at a click map to see how each group behaved differently over the testing period.
  • High click-through rate, low conversion rate – your click-through rate might be significantly up on your test version while the conversion rate sits below the control. In that case, I would question whether something in the on-site journey caused the fluctuation.

Be in control of your test

When running tests it is important to think about the structure of the testing process and to be confident you are in control of it. 

From my experience, many will advise you to split your audience by a certain percentage, run the test, and then send the remaining contacts the winning variant around four hours later. I would advise against this – why? Because many variables change in four hours.

Also, why aren't you testing on your whole base? The percentage you test on could consist entirely of 'rarely engaged' customers, which wouldn't give you an accurate view of how your whole base would perform.

This approach also only looks at one set of results, whereas it's always best to test for a longer period of time to obtain a greater level of confidence in the impact.

The power of emarketing is huge, and regular testing of your emarketing activity will drive results.

If you’d like to learn more about how we can help drive growth through emarketing, drop us a line and we’d love to start a conversation. 

Alternatively, sign up to our regular enewsletter for the latest news and updates from StrategiQ.
