Most businesses launch campaigns whose whole focus is the business itself, not the customer.
Data-driven marketing helps you deal with the gazillion things that could go wrong with your campaign, and that’s a good thing if you ask me.
If your campaigns had a 100% strike rate, you’d be sipping Piña Coladas on some remote beach.
It’s easy to set up campaigns and launch them. I know that. You know that. What’s hard is doing the smart (and back-breaking) things you just have to do, such as A/B testing and optimization.
In simple words, A/B testing is an online marketing best practice in which you show two different versions of a web page, a pop-up, a landing page, an ad, or whatever else you’re testing, to measure which version is more effective at bringing you the results you seek.
If Page A is the original version, Page B is the new page with a different design (with only the element being tested changed) and is commonly referred to as the “challenger” variant.
One set of visitors is shown Page A and another set is shown Page B. Depending on the results you expect (and those you’ve set as goals) – and after gathering enough data to make the decision – you arrive at a winner.
You’ll then create a new variant (maybe you’ll call it Variant C) as the new challenger to the winner of the previous round (between Variants A and B).
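The mechanics above can be sketched in a few lines of Python. The variant names, traffic numbers, and conversion counts below are hypothetical, purely to illustrate the flow:

```python
import random

def assign_variant(visitor_id, variants=("A", "B")):
    """Randomly bucket each incoming visitor into one of the variants."""
    return random.choice(variants)

# Hypothetical results after enough traffic has come through:
results = {
    "A": {"visitors": 1000, "conversions": 50},  # original (control)
    "B": {"visitors": 1000, "conversions": 65},  # challenger
}

def conversion_rate(stats):
    return stats["conversions"] / stats["visitors"]

# The variant with the higher conversion rate wins this round;
# it then becomes the control that the next challenger (C) must beat.
winner = max(results, key=lambda v: conversion_rate(results[v]))
```

In practice you’d bucket by a stable hash of the visitor ID rather than `random.choice`, so a returning visitor always sees the same variant.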
What does A/B testing really help you achieve?
- Entrepreneurs have massive egos. So do freelancers, digital marketing agencies, and everyone else who keeps bragging about how smart they are. A/B testing removes the need for anyone’s ego, whims, or opinions to influence campaigns.
- A/B testing allows you to optimize campaigns to maximize results. Effectively, it lets you squeeze more out of the campaign for the same budget.
- When you do A/B testing, you’ll have specific data – data that pertains to your brand, your customers, your traffic, and your niche – to work with. Without it, you’d have to depend on generic industry benchmark reports that don’t serve your specific needs.
Now that the basics are out of the way, let’s dig into a few case studies that prove how effective A/B testing is and why you should test instead of letting someone else tell you what to do and what not to do:
ServerDensity is a hosted server and website monitoring service. They monitor website downtime from locations around the world, combined with internal server metrics, to analyze the reasons your website goes down.
Hypothesis: Increasing the price of their services would increase overall revenue in spite of reduced sign-ups.
Initial Price Offering (Original web page)
Initially, they charged $13 per month for one server plus one website.
A/B testing (Challenger web page)
A comparative analysis showed that earlier a company had to pay on a per-website, per-server basis, whereas the challenger page offered fixed pricing tiers instead.
Making these changes produced a radical difference: ServerDensity’s revenue increased by 114%.
The founders of www.gyminee.com were focused on improving the home page conversion rate.
Even though their original version of the page was cluttered, they didn’t just jump to conclusions. They tested the page.
Existing Web Page (Before A/B testing)
If you look at the web page above, it offers too many options for the user to choose from. Also note that the press coverage sits below the fold, where a user may or may not notice it.
Challenger Web Page (A/B testing)
The challenger page has a simplified design with fewer options, and the focus is on either the “Signup” button or the “Take The Tour” button. The logos for social proof are more prominent, and the “Featured on” tab is above the fold, where the user notices it easily, increasing the company’s credibility.
The result? The conversion rate increased from 24.4% to 29.6% with the simplified design.
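As a quick sanity check on those numbers, it helps to separate absolute lift from relative lift. The figures below come straight from the case study:

```python
# Gyminee's reported conversion rates, before and after the redesign.
before, after = 0.244, 0.296

absolute_lift = after - before             # 5.2 percentage points
relative_lift = (after - before) / before  # roughly a 21% relative improvement
```

Reporting both matters: a “5.2-point” gain and a “21% improvement” describe the same result, but they read very differently in a headline.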
Question: Which button color do you like?
Ok, leave that.
Question again: Which button performs better? Red or Green?
Don’t just jump at the answer. Stop debating. No one cares about your opinion or mine.
As usual, test it out. It’s phenomenal what you’ll dig up when you actually test instead of insisting on a color you like – blue, green, or red.
Joshua Porter ran a test on Performable’s page to see whether green buttons perform better than red ones.
Out of 2,000 visits to the page, the red button outperformed the green button by 21%: that is, 21% more people clicked red than clicked green.
Now imagine what that means for you: if you ran a similar A/B test and just changed the color from green to red while keeping your budget the same, you could get 21% more conversions.
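One caveat worth adding: a lift is only trustworthy once you’ve checked it isn’t noise. The article reports only the relative lift and the total visits, so the conversion counts below are hypothetical; a two-proportion z-test is a common way to check whether a difference like this is statistically significant:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Z statistic for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical split of the ~2,000 visits, 1,000 per variant:
z = two_proportion_z(conv_a=100, n_a=1000,   # green: 10.0% conversion
                     conv_b=121, n_b=1000)   # red:   12.1% (~21% relative lift)
significant = abs(z) > 1.96  # 95% confidence threshold
```

With these made-up base rates, z comes out around 1.5, short of the 1.96 needed for 95% confidence. That’s exactly why you collect enough data before declaring a winner.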
That’s A/B Testing for you.
Car & Driver
If you ran a test in 2008, would it still hold its own now? What would happen when you add a single element of interactivity to an otherwise static block of content?
A lot, at least for Car & Driver.
For the company, page views directly equate to success. Their control (the original) was an HTML block with a few links for customers to click through. The testing variant (the treatment) took a different approach: a simple dropdown menu that let customers select the car make they preferred.
The results? A whopping 74% relative increase in total page views. The test, done in 2008, was so important that Car & Driver still uses the insights gained from it in the design of their website today.
Simplicity wins, right?
Web design best practices, from a marketing standpoint, always push you to create an uncluttered hero section on the home page or landing page, right?
Not right. At least, not for Zapier.
Their original version of the home page looked like this. Nice bright orange, a call to action that stands out, and a very simple layout.
Yet, it didn’t work.
What worked was this:
And how well did the rather cluttered but effective design work? Like this:
Wade and Mike, the founders of Zapier, had this to say:
“It’s easy to look at a page and judge it “qualitatively” based on how it looks to you.
But that doesn’t tell the whole story.
The aesthetic of a page is one thing. But if a beautiful page doesn’t convert, it’s not useful. An ugly page that does convert, though, still makes money.”
Stop assuming when it comes to digital marketing.
Always keep testing. Do you test your landing pages, web pages, subject lines, copy, forms, and everything else or do you run your business on assumptions?
Tell me about it.