Saturday 10 March 2007

Split testing for fun and profit: an introduction

Introduction

I've created this blog as a place to write about split testing: a subject which, because it brings together marketing, technology and a certain amount of mathematics, combines several interests in one.

Don't worry, I don't intend to delve very deeply into any of these topics, because I haven't the time to write lengthy articles, and also of course I want the largest possible audience for this blog. Too much marketing or maths is liable to turn some readers off.

So what is split testing? Very simply, it's the process of testing marketing materials by experimentally comparing how well different versions perform. Unlike many traditional forms of market research, it's carried out in the live environment, on actual prospects and customers. This is what makes it rather exciting (and a bit scary too).

I hope to take the scariness away to some extent by explaining it in a way people can understand. I'll start by taking a very simple case, that of an email newsletter. I'll continue to use this same example and expand on it in future entries.

Meet Charles Farnes-Barnes of Farnes-Barnes Fab Gear

Charles Farnes-Barnes sends a newsletter to 2,000 people every week. He uses it to inform his readership about new lines in his online clothes shop and to reward their loyalty with special offers. The special offers link to his website, where readers can claim their reward by entering a special discount code from the newsletter when they make an order.

Charles has been reading up on internet marketing and he's concerned he's not doing as well as he might be with his newsletter. He's heard about this idea of a "landing page" - a page which is designed specifically to act as part of a marketing campaign. So he wonders whether, instead of sending his newsletter readers to the front page of his site, he should send them to a landing page, one that reinforces the messages in the newsletter.

But Charles has also been reading about split testing and he's keen to try that out too. So here's what he does: he creates his landing page and at the same time makes two versions of his next newsletter, one with the special offer linked to the new landing page and the other linked to his site front page as before. The two versions also have two different discount codes: that's important for later.

Then he takes his mailing list and divides it in two, so he has two lists each of 1,000 addresses. Since the order of the email addresses on his list is not significant, he simply splits it in the middle. (However, every subsequent time he does a test he'll have to divide the list differently, ideally by shuffling it randomly before making the split.)
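
For the technically minded, here's a rough sketch of what that shuffle and split might look like in Python. I'm assuming, purely for illustration, that the list lives in a plain text file called subscribers.txt with one address per line:

```python
import random

# Load the mailing list, one address per line (the file name is a placeholder).
with open("subscribers.txt") as f:
    addresses = [line.strip() for line in f if line.strip()]

# Shuffle first, so the split doesn't depend on how the list happens to be ordered.
random.shuffle(addresses)

# Divide into two halves, one per newsletter version.
half = len(addresses) // 2
list_a = addresses[:half]   # will receive the landing-page version
list_b = addresses[half:]   # will receive the home-page version
```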

He sends one version of his newsletter to one list and the other version to the other. Then he waits for the results. He decides to set a deadline of a week, just before he prepares the next newsletter, to see how the test went. When the time arrives he looks at how many times each discount code has been used. To make it easy for himself, he used LANDING as the discount code in the version linked to the new landing page, and HOMEPAGE in the other one.
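
Most shopping systems will let you look these counts up directly, but here's a sketch of the idea in Python, assuming the week's orders have been exported to a CSV file called orders.csv with a discount_code column (both names are just placeholders):

```python
import csv
from collections import Counter

# Tally how many orders used each of the two test discount codes.
counts = Counter()
with open("orders.csv", newline="") as f:
    for row in csv.DictReader(f):
        code = (row.get("discount_code") or "").strip().upper()
        if code in ("LANDING", "HOMEPAGE"):
            counts[code] += 1

print("LANDING: ", counts["LANDING"])
print("HOMEPAGE:", counts["HOMEPAGE"])
```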

Charles gets his results

As it happens, he finds that 200 people used the LANDING discount code while only 100 used the HOMEPAGE one. That's a pretty clear result, a decisive win for the landing page, so he decides that's the way to do it in all his future newsletters.
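
Since each half of the list held 1,000 addresses, those raw counts translate directly into response rates. The arithmetic is easy enough to do in your head, but for completeness:

```python
recipients_per_list = 1000   # each half of the 2,000-address list

landing_rate = 200 / recipients_per_list    # 0.20, i.e. a 20% response
homepage_rate = 100 / recipients_per_list   # 0.10, i.e. a 10% response

print(f"LANDING:  {landing_rate:.0%}")   # LANDING:  20%
print(f"HOMEPAGE: {homepage_rate:.0%}")  # HOMEPAGE: 10%
```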

This was a very simple example of what's called an A/B split test, in which two things are compared against each other. It's slightly unusual in that most split testing - as we'll see in future examples - is subtler: the difference between the things being compared is usually far less pronounced. We'll also be looking at how to analyse results when they are less clear cut. Happy testing!
