This is all about testing your opt-in pages and increasing opt-in conversion.
I have an opt-in page for my weekly Star Power Half Hour webinar series. Conversion is pretty high because most people who get to the page were referred by one of the upcoming speakers, someone they already know.
The opt-in page has a video in which I welcome people. I had it set to start automatically. It starts with some loud music (drums), and then I start speaking. Do you hate it when a video starts automatically? Do you sometimes close the tab immediately when that happens?
A couple of months ago, I looked at the conversion rate and it was 25%. Here’s how I figured out the conversion rate:
- I got the number of unique visits from Google Analytics for the previous 30 days
- I got the number of subscriptions in my email service for that list for the previous 30 days
- I divided the subscriptions by the visits to get the conversion rate.
For example, if 100 people visited the page and 25 people subscribed, my conversion rate would be 25%. In fact, that’s what the conversion rate was. (I don’t remember the exact numbers that got me to that conclusion.)
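The same calculation, sketched in Python (using the illustrative numbers above, not my actual traffic):

```python
def conversion_rate(unique_visits, subscriptions):
    """Return the conversion rate as a percentage of unique visits."""
    return subscriptions / unique_visits * 100

# 100 visitors, 25 subscribers -> 25% conversion
print(conversion_rate(100, 25))  # 25.0
```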
I wanted to increase the conversion rate, and I suspected that the auto-play video might be turning people off. So I asked my Marketing Assistant to change that setting. You can see the Auto-Play setting in the above image. She just unchecked the checkbox.
The following month I did the same calculation and guess what?
The conversion rate was 40%!
What difference does this make?
Everyone who opts in gets on my email list and receives all of my emails. So what’s the difference for me over a year?
Let’s say that 300 people visit that page in a month.
If I get 25% conversion, I get 75 subscriptions in a month or 900 in a year.
If I get 40% conversion, I get 120 subscriptions in a month or 1,440 in a year.
That’s a difference of 540 new subscribers in a year!
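Here is that arithmetic as a quick sketch (the 300 visits per month is the hypothetical figure from above):

```python
MONTHLY_VISITS = 300  # hypothetical monthly traffic to the opt-in page

def yearly_subscribers(conversion_pct):
    # Subscriptions per month (visits * rate), times 12 months.
    return MONTHLY_VISITS * conversion_pct // 100 * 12

before = yearly_subscribers(25)   # 900 per year
after = yearly_subscribers(40)    # 1,440 per year
print(after - before)             # 540 extra subscribers
```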
All by unchecking a checkbox and taking the time to look at the numbers. Well worth it.
Why this isn’t a perfect test
We did this 2 months in a row, but it isn’t a perfect test. Why?
- Maybe the speakers who promoted the page did a better job of promoting it in the second month
- Perhaps the speakers in the second month were more interesting
To do a good test, you need a split test, also called an A/B test. This type of test runs both versions at the same time, so month-to-month differences can't skew the results and you know the difference you see is real.
It’s a little more complicated, but not much. Here’s what I do:
- Create a duplicate of the opt-in page.
- Make one change in the duplicate. It can be a big change at first (a headline, a picture, even the colors), but once you find a significant difference, make smaller changes to get even better results.
- Send people to a link that redirects 50% of visitors to one of the pages and 50% to the other page. I use Simple Click Tracker (affiliate link) for this. It makes the split testing easy and can even track conversions for me.
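I don't know how Simple Click Tracker implements its split internally, but the idea behind a 50/50 redirect can be sketched like this (the URLs are hypothetical placeholders):

```python
import hashlib

# Hypothetical URLs for the original page and its duplicate.
PAGES = ["https://example.com/optin-a", "https://example.com/optin-b"]

def assign_variant(visitor_id):
    # Hash the visitor ID so the same visitor always gets the same page,
    # while traffic splits roughly 50/50 between the two variants.
    digest = hashlib.sha256(visitor_id.encode()).digest()
    return PAGES[digest[0] % 2]
```

Hashing instead of picking at random keeps a returning visitor on the same variant, which keeps the conversion counts clean.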
This works for sales pages, too.
In fact, if you aren’t doing A/B testing, you’re definitely leaving money on the table.
Are you split testing? If not, why not? If you are, which tool do you use? (I tried two others before I settled on Simple Click Tracker.) Leave a comment and please share using the Share buttons below so others can benefit from this post as well.