“A Split Test a Day Keeps the Competition at Bay” – DC
At Yanik Silver’s Underground Online Seminar many years back, a friend and client shared “A Split Test a Day Keeps the Competition at Bay” and it’s always stuck with me. So often, what works in your business now will soon be copied by others, and you’ll be forced to innovate to stay one step ahead of the curve.
Increasing ad costs for fewer leads, a higher Cost Per Click, an increased Cost Per Acquisition, and fewer opt-ins on your “proven” page are all signs that what was working is no longer as effective and you’re on your way to being left in the dust.
Innovation can either inspire you or force your hand, depending on how much you seek out Kaizen – the pursuit of never-ending, continuous improvement.
Many of us entrepreneurs or marketers enjoy the thrill of crafting multiple ads, headlines, offers, email subject lines, sequences, and more to continually improve and beat our previous best numbers. We’re drawn to improving.
So if you hear a copywriter claim that their copy is “perfect” and not open to running tests to improve, you know it’s time to find a new copywriter…
How do you get better unless you practice, measure, and course correct? Even the greatest masters continue to practice and learn. And you can, too.
Proven Split Test Winners
One of my colleagues, friends, and split test machines, Russell Brunson, put together a great book full of ideas, inspiration and actual live-from-the-field results and asked if I’d like to contribute some of our recent tests and results.
We grabbed some of our latest experiments and you can find them in his publication 108 Proven Split Test Winners.
Before we dive into what some of those tests and results are, however, let’s go over how to even run a split test.
How to Run a Split Test
So how do you run an experiment in your business (or your life)?
- Start with a Hypothesis of what you think might be better. For example, Headline A vs. Headline B. Or perhaps Price Point A vs Price Point B. Or Long Copy vs. Short Copy. And so on.
- Test your hypothesis until you have a winner. For example, Headline A might get 9% more opt-ins than Headline B. Or Price Point B might convert 23.6% more prospects than Price Point A. You’ll want to test on a large enough sample over a long enough period that you can have real confidence in the numbers.
- Implement your winner. Once you’ve found your hypothesis to be true or false with some statistical significance, go with the winner.
- Create a new hypothesis and Repeat the whole process.
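The “test until you have a winner” step comes down to checking whether the difference between two conversion rates is statistically significant. Here is a minimal sketch of a two-proportion z-test using only the standard library; the visitor and conversion counts are illustrative, not numbers from the article:

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from variant A's?
    Returns (z statistic, two-sided p-value)."""
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    # Pooled proportion under the null hypothesis (no real difference)
    p = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via erf)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Illustrative example: 900 visitors per variant,
# Headline A converts 81 of them, Headline B converts 110.
z, p = two_proportion_z_test(81, 900, 110, 900)
print(f"z = {z:.2f}, p = {p:.4f}")
```

A common rule of thumb is to declare a winner when the p-value drops below 0.05; until then, keep sending traffic to both variants.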
Test #1: Video vs. No Video on an Opt-In Page
One of the tests we ran was for a webinar opt-in page. Our hypothesis was that the video we were using was hurting conversions.
Sample Set: The page received 1,914 visits and 1,542 webinar registrations at a blended 80.56% opt-in rate. (We were sending traffic from lists with an existing relationship – not cold traffic.)
Result: With the video, we were getting a 78.45% opt-in rate. Without the video, we were getting an 82.78% opt-in rate.
Analysis: I think THIS particular video was hurting conversions. We filmed it unscripted in the middle of a ski trip, and it could have been more on point. We have a bunch of other videos we could have tested given more time. Video CAN improve conversion, but this particular video was actually hurting conversions.
Test #2: Short Copy vs. Long Copy on a Sales Page
Copywriters have likely debated short copy vs. long copy since they started chiseling sales letters in stone, and almost reached a compromise with the concept of “dual path” readership – or, more recently, “triple path” readership. But which one converts better on a sales page?
After a 90 minute webinar, we were running traffic to a sales page and decided to test this out. Our hypothesis was that long copy kills conversion. The pages were the same, but the short page had all the long copy in the middle removed.
Sample Set: The sales page received 806 visitors resulting in 109 sales at a blended 13.52% conversion rate. The product price point was $997 or 3 payments of $397. (Don’t worry, we split-tested the price box and verbiage separately.)
Result: Our short copy page resulted in more clicks (23.68%) to the Add to Cart button (hooray!) but… fewer actual orders (12.44%). Our long copy page had fewer (20.62%) Add to Cart button clicks, but when someone clicked, they were more likely to buy and so had overall more purchases (14.69%).
Analysis: The short page offered too little information – no summary at all – so people clicked Add to Cart out of curiosity. In the end, though, without a summary of benefits and what was included, hot prospects didn’t order. With ClickTale installed, we could also see via an on-page heatmap that on the longer page, most visitors skipped the entire middle of the page with all the details. Our next test would be a version of the shorter page, but with a summary of what’s included.
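Test #2 illustrates why it pays to measure each stage of the funnel, not just the final conversion rate: the short page won on clicks but lost on orders. A minimal sketch of that stage-by-stage breakdown, using hypothetical counts chosen only to roughly resemble the reported rates:

```python
# Hypothetical stage counts (visits -> Add to Cart clicks -> orders)
# for a two-variant sales-page test; not the article's raw data.
variants = {
    "short_copy": {"visits": 403, "clicks": 95, "orders": 50},
    "long_copy":  {"visits": 403, "clicks": 83, "orders": 59},
}

for name, v in variants.items():
    click_rate = v["clicks"] / v["visits"]       # how many click the button
    buy_per_click = v["orders"] / v["clicks"]    # who actually buys after clicking
    overall = v["orders"] / v["visits"]          # the number that pays the bills
    print(f"{name}: click {click_rate:.2%}, "
          f"buy-per-click {buy_per_click:.2%}, overall {overall:.2%}")
```

Comparing buy-per-click between variants is what surfaces the “curiosity clicks” problem the analysis above describes: a high click rate paired with a low buy-per-click rate suggests the page created interest it couldn’t close.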
We continued to test things like showing the date/time of a webinar on the registration page. Would showing the date and time increase conversions or decrease conversions?
Would a red animated arrow over an opt-in box outperform static blue hand drawn arrows in the same spot?
Would showing the “single pay” price before the “payment plan” price result in more sales or fewer? There is so much to split test every single day.
If you’d like to see our other tests as well as those of my colleagues, pick up a copy of Russell’s 108 Proven Split Test Winners.
Tools of the Trade
There are a handful of tools that we like to use when split testing and you can use them, too.
Visual Website Optimizer (VWO) is the tool that lets us simply click on one of our pages, make a change, and test it, without ever touching any source code or HTML.
OptimizePress Plus Pack has the ability to run A/B tests right on your landing pages, sales pages, opt-in pages, and more.
ClickFunnels is a creation of Russell’s and allows for split-testing every part of your funnel.
Google Analytics “Experiments” lets you run experiments right on your page when you have Google Analytics installed.
PPC – Using Google AdWords, Facebook Ads, and other ad platforms, you can easily split test multiple ads to see which ones convert better and optimize more traffic to go to the better converting ads.
Beyond Just Webpages
On a recent Freakonomics Radio podcast episode called “The White House Gets Into the Nudge Business,” there was a discussion of how to increase military service members’ enrollment in the Thrift Savings Plan (TSP) – a free program to help service members save for retirement.
The first test that was run was on the paperwork that all 12,000 new service members coming into Fort Bragg each year have to review. After seeing success (increased enrollment) with one small change, they moved from paper applications to email split testing.
Instead of emailing 800,000 service members the same message about enrolling in TSP, they split the 800,000 randomly into 8 segments of 100,000 service members each. They then sent out 8 slightly different emails to see which one converted better, and found that one of the eight significantly outperformed the other seven – a result they wouldn’t have known without testing.
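The key to a valid multi-variant test like this is that assignment to segments is random, not alphabetical or chronological. A minimal sketch of that randomized split, shown on a toy list of made-up member IDs rather than 800,000 real addresses:

```python
import random

def split_into_segments(items, k, seed=42):
    """Randomly shuffle a list and split it into k equal-size segments.
    A fixed seed makes the assignment reproducible for auditing."""
    items = list(items)
    rng = random.Random(seed)
    rng.shuffle(items)  # randomization removes ordering bias
    size = len(items) // k
    return [items[i * size:(i + 1) * size] for i in range(k)]

# Toy stand-in for the mailing list: 80 fake member IDs into 8 segments,
# one segment per email variant.
members = [f"member-{i:03d}" for i in range(80)]
segments = split_into_segments(members, 8)
print([len(s) for s in segments])  # → [10, 10, 10, 10, 10, 10, 10, 10]
```

Each segment then receives one email variant, and because the groups were formed randomly, any difference in enrollment rates can be attributed to the message rather than to who happened to land in which group.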
Hypothesis. Test. Implement. Repeat.
And split testing isn’t just for web pages; you can also run experiments and tests on things like:
- Email Sequences
- PPC Ads (Google AdWords, Facebook Ads, etc…)
- Direct Mail
- Sales Scripts
Experimenting in Life
I got an interesting text from a friend a few years back saying, “I just saw you in a TED Talk,” which was strange since I don’t recall ever being invited to give one. I asked what he was talking about, and he sent me a link to Matt Cutts’ TED Talk, “Try Something New for 30 Days.”
Matt joined me on a climbing expedition in 2010 up Mt. Kilimanjaro, and as part of his presentation on trying something new for 30 days at a time, he discussed how he tried small changes – like biking to work, writing every day, cutting out sugar, and turning off the news – and used other 30-day changes to build up to big challenges, like flying halfway around the world with me to summit the highest mountain in all of Africa.
The main point in Matt’s TED Talk is that we can be constantly trying something new, experimenting in our own lives, and seeing what works for us and what doesn’t. Maybe it’s seeing how you feel eating vegetarian one meal a day. Perhaps it’s taking one photo a day to highlight the beauty around you. Maybe it’s biking to work every day. Perhaps it’s drinking a glass of water every morning when you wake up. Maybe it’s joining a pushup challenge and doing a set number of pushups every single day. Maybe you try journaling every morning when you wake up.
The beauty is that you get to pick the experiments. You get to run them. And you get to benefit from what you learn in the process.
Committing to Experimentation
At the Bay Area Mastermind, in addition to our monthly Accountability Action Items that we go home with after each meeting, we’re now experimenting (see?) with committing to an experiment in business or life that we’ll run between each meeting and reporting back on the results.
This is a way for our Mastermind Members to grow, learn, and share every single month. Together, we all grow.
What will YOU commit to experimenting with in your life or business over the next month?
Jeremy Shapiro facilitates the Bay Area Mastermind and runs a Software as a Service (SaaS) Company in the Customer Delight space. When not running split tests for his business and clients, you’ll find Jeremy hiking, cycling, or traveling with his family.
For more on this topic, see Jeremy’s complete presentation live from the Bay Area Mastermind, “Experiments”. If you’d like to learn more about topics like this, join us for a Mastermind Test-Drive.