A/B Testing: The Scientific Approach To Marketing

Originally posted for DMXENGAGE on April 3.

For me, A/B testing is kind of like your 6th grade science fair project. Gone are the days of testing how temperature affects bouncy balls or how to blow the biggest bubblegum bubble, but the scientific method comes into play more than I bet you ever thought it would back then. Let me drop some scientific marketing knowledge.



The first step of the Scientific Method, and of A/B testing, is determining what you are testing. There are endless things to test when it comes to email, web, or direct mail marketing. To figure out where to start, it’s best to think about what you want to change. If you want to increase click-through rates, try adjusting your call to action. If your team is hoping to increase your email open rate, test the length and copy of your subject lines. If you need more people to complete your forms, consider using shorter, progressive forms.

In a recent A/B test for our West Point client, we tested different envelopes to determine which would generate more opens and responses.

“Which envelope, the full-color graphic envelope with a tagline or the plain, white #10 envelope, will generate a greater student response rate?”


Any good scientist (read: marketer) knows that the best recommendations are backed by research. Understand the current behaviors of your audience: What actions are they currently taking (or not taking)? How long does it take them to act? Are they interacting with you on mobile or desktop? Are they familiar with your brand? Answers to these questions can help you form your hypothesis or tailor your question.


Who doesn’t love a little friendly competition? Make a note of which version you and the rest of your team think will perform better. For this particular test, the majority of our team felt that the flashy, full-color envelope would entice more students to open the direct mail piece and complete the call to action.

“If we send the flashy, full-color, graphic envelope, then we will see an increased open and response rate.”


If you can flash back to 6th grade, you know that this isn’t an overnight project. A/B testing, just like your science fair project, requires careful planning and documentation. No, no one is grading you, and you probably won’t receive a big blue ribbon, but I promise you will thank yourself later when you have notes on what you tested, why you tested it, and how you tested your hypothesis.

To get the most accurate results, you want to test only one variable. Changing too many things between the A test and B test will make it more difficult to determine what caused the increased response rate (or your desired success metric). In our case, we only changed the envelope, keeping the letter and CTA the same.

Sample Size and Testing Time Frame

We recommend a relatively large list in order to generate a statistically significant result. HubSpot recommends a baseline of at least 1,000 contacts. If you have fewer contacts than that, you will need to test a larger portion of your list. Most marketing automation platforms have A/B testing functionality built into their offerings, making this part of the process easy. Some platforms simply cut your list in half based on your sample size (or, in our case, mailing list). You can also use an A/B test sample size calculator like this one from Optimizely.
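If you’re curious about the math behind those calculators, the standard two-proportion sample-size formula can be sketched in a few lines of Python. This is a simplified illustration, not Optimizely’s exact methodology: the `sample_size_per_variant` helper and the example rates are hypothetical, and the z-values assume 95 percent confidence and 80 percent power.

```python
import math

def sample_size_per_variant(baseline_rate, min_detectable_rate,
                            z_alpha=1.96, z_beta=0.84):
    """Approximate contacts needed per variant for a two-proportion test.

    Defaults assume 95% confidence (z_alpha) and 80% power (z_beta).
    """
    p1, p2 = baseline_rate, min_detectable_rate
    p_bar = (p1 + p2) / 2  # pooled rate under the null hypothesis
    numerator = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p1 - p2) ** 2)

# e.g. to detect a lift from a 5% to an 8% response rate
print(sample_size_per_variant(0.05, 0.08))  # roughly 1,000 per variant
```

Notice that the answer lands right around HubSpot’s 1,000-contact baseline; smaller expected lifts require dramatically larger lists.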

The next portion of the experiment phase is deciding how long your test should run before you declare a “winner.” An easy way to do that is to look at your past mailing data: figure out when your email or direct mail opens, clicks, or responses start dropping off, and use that as a baseline for your time frame. For example, if you receive the majority of your opens and clicks in the first 24 hours and only a small percentage after that 24-hour mark, cap your test at 24 hours. Again, this is a functionality that is likely included in your marketing automation platform.

For our envelope test, we simply split our mailing list in half, with 50 percent of the list receiving the full-color envelope and 50 percent receiving the plain, white envelope. Then we sat back and waited about a week for the responses to start rolling in.
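If your platform doesn’t handle the split for you, a random 50/50 split is easy to script. This is a hypothetical sketch: the `split_ab` helper and the contact names are invented, and the fixed seed just makes the split reproducible.

```python
import random

def split_ab(contacts, seed=42):
    """Shuffle a contact list and split it into two equal-sized groups."""
    shuffled = contacts[:]  # copy so the original list is untouched
    random.Random(seed).shuffle(shuffled)
    mid = len(shuffled) // 2
    return shuffled[:mid], shuffled[mid:]

contacts = [f"student_{i}" for i in range(1000)]
group_a, group_b = split_ab(contacts)  # two groups of 500, no overlap
```

Shuffling before splitting matters: if your list is sorted by sign-up date or geography, taking the top half for variant A would bake that bias into your results.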

Check in next week for the results of our A/B test! In the meantime, which scientific marketing question are you experimenting with?
