Conversion Rate Optimization is a scientific process.

It involves multiple intricate steps that are often time-consuming.

For this reason, many CRO newbies (and even “experts”) try to cut corners, and end up making several mistakes as a result.

Often, they miss out on conversions. And sometimes, worse still, they lose revenue.

This post aims to make you aware of such common conversion rate optimization mistakes, and ways to avoid them.

Let’s start.

1. Not Having Enough Traffic

What is the first thing you need to get conversions on your website? That’s right, traffic!

And what happens when your website doesn’t get significant traffic? Well, your CRO efforts suffer.

Low traffic brings with it several issues:

Firstly, analyzing the website data becomes tricky.

Let’s say you use a web analytics tool (such as Google Analytics) to track your website performance and user behavior. You analyze the data, looking for web pages, traffic sources, and other areas that require optimization. However, when you have low traffic, your website data represents only a small sample of your potential customer base. Trends and projections based on such a small sample can often be misleading.

Related Post: A Guide to Using Google Analytics Metrics and Dimensions for Conversion Optimization

Secondly, traditional (Frequentist) A/B tests on small sets of website visitors can take a long time to give statistically significant results.

Another pitfall in testing ideas on low traffic is getting skewed results. With traditional A/B testing engines, it’s always advisable to have a large number of visitors before running A/B tests. Conversion expert Peep Laja advises having at least 1,000 visitors as a sample size for Frequentist A/B tests.
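
To see why small samples force long tests, here is a minimal sketch of the standard two-proportion sample-size calculation, assuming SciPy is available. The 5% baseline rate and 6% target rate are illustrative assumptions, not recommendations:

    from math import ceil
    from scipy.stats import norm

    def sample_size_per_variation(p1, p2, alpha=0.05, power=0.8):
        """Visitors needed per variation for a two-sided two-proportion
        z-test to detect a change from rate p1 to rate p2."""
        z_alpha = norm.ppf(1 - alpha / 2)  # ~1.96 for alpha = 0.05
        z_power = norm.ppf(power)          # ~0.84 for 80% power
        variance = p1 * (1 - p1) + p2 * (1 - p2)
        return ceil((z_alpha + z_power) ** 2 * variance / (p2 - p1) ** 2)

    # Illustrative: 5% baseline rate, hoping to detect a lift to 6%
    print(sample_size_per_variation(0.05, 0.06))  # ~8,155 visitors per variation

Note that the required sample grows with the inverse square of the lift you want to detect: halving the detectable lift roughly quadruples the visitors you need, which is why the 1,000-visitor floor above is a minimum, not a guarantee of a quick test.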

Without a healthy stream of visitors, CRO can be a waste of time.

Hence, a low-traffic website should focus on user acquisition first, and then on conversion optimization.

[VWO’s new Bayesian model of A/B testing delivers results faster than the Frequentist model. Use the A/B test duration calculator here, and see the difference.]

However, if you still want to perform Frequentist A/B tests on a low-traffic website, here’s a useful guide.

2. Ineffective Targeting

So you’ve got healthy traffic on your website. That should be enough for your CRO efforts to deliver favorable results, right?

Not quite.

Sure, having a significant number of website visitors is important for A/B testing. But, it’s equally important to have the “right” kind of visitors. Your CRO efforts, again, won’t deliver conversions if the visitors you attract have no interest in your product at all.

Here’s ConversionXL on how to get the kind of traffic that actually converts.

When your marketing campaigns attract users from the wrong demographics, it may result in high traffic but a minimal conversion rate. Suppose you run a fashion eCommerce store catering to young women, but your web traffic consists mostly of men; those visitors will seldom convert.

Similarly, if your website attracts visitors from geographies where you do not operate, you’ll lose out on conversions.

Ensure that your website and its related campaigns (online ads, SEO, email campaigns, etc.) only acquire relevant users. Target users based on characteristics that directly pertain to your business.

For example, Monetate has helped its clients offer personalized experiences to visitors based on geo-location, improving their conversion rates in the process.

3. Borrowing A/B Test Hypotheses

Yes, finding inspiration from A/B tests performed by other websites is fine. Even best practices can sometimes tell you which parts of your website need optimization.

But if all your A/B testing hypotheses are borrowed, chances are they’ll not work on your website.

Why? Because not all websites (and their visitors) are the same.

There is no one-size-fits-all conversion optimization strategy.

You must follow a methodical approach to find leaks and areas of improvement across your conversion funnel. Here’s what you can do:

- Use heatmaps to observe visitor behavior on key pages (see: How Heatmaps Help You Observe Visitor Behavior).
- Use on-page surveys (such as VWO On-page Surveys) to collect user feedback.

Developing hypotheses based on these insights can add far greater value to your CRO strategy.

4. Thinking Design Equals Conversions

There is an even bigger blunder than borrowing hypotheses: not hypothesizing at all (or applying changes to a website without A/B testing them).

For instance: just revamping your website and giving it a fresh, modern look will not guarantee conversions.

The mistake is to think that a newly designed website will invariably improve conversions.

Design does not equal conversions.

A case study from our archive demonstrates how a seemingly neat, modern design failed to improve conversions.

While design is an integral part of website usability, it’s not enough to convince users to convert.

Of course, your revamped, modern website may well convert better than the old one did. While that is great news, you cannot be sure you’re getting the most out of the new design.

To unleash the full potential of your new website, you need to come up with data-driven hypotheses and A/B test them.

5. Ending A/B Tests Too Early

First of all, you MUST not end a Frequentist A/B test before it arrives at a statistically significant result.

And even if you have a statistically significant result in hand, you must check whether the variations have enough conversions to back it up.

It wouldn’t make sense to declare a winning variation when your Frequentist test doesn’t even have, say, 100 conversions behind it, no matter how statistically significant the result is.

Caveat: For websites that sell high-involvement or high-investment goods, 100 conversions can be a big number. It can make sense for them to go ahead with the data they have.

So what do you do? You keep the test running until it completes the pre-calculated test duration that lets you feel confident about the result.

VWO A/B Test Duration Calculator
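
If you want a rough feel for what such a calculator does, here is a back-of-the-envelope sketch. This is not VWO’s actual formula; the per-variation sample size and daily traffic below are assumed numbers:

    from math import ceil

    def estimated_duration_days(sample_per_variation, num_variations, daily_visitors):
        """Rough test duration: total visitors the test needs,
        divided by the daily traffic entering the test."""
        return ceil(sample_per_variation * num_variations / daily_visitors)

    # Illustrative: ~8,155 visitors per variation (see the earlier sketch),
    # a control plus one variation, and 500 eligible visitors per day
    print(estimated_duration_days(8155, 2, 500))  # 33 days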

Even with a Bayesian engine, you need to run A/B tests long enough to narrow the range of expected conversion rates. (Download the ebook here to learn more about it.)
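
For the curious, here is a generic sketch of that Bayesian idea (not VWO’s SmartStats implementation): model each variation’s conversion rate with a Beta posterior and keep the test running until the credible intervals are narrow enough to act on. All counts are illustrative.

    from scipy.stats import beta

    # Illustrative counts: conversions / visitors for control (A) and variation (B)
    conv_a, n_a = 120, 2400
    conv_b, n_b = 150, 2400

    # Beta posteriors with a uniform Beta(1, 1) prior
    post_a = beta(1 + conv_a, 1 + n_a - conv_a)
    post_b = beta(1 + conv_b, 1 + n_b - conv_b)

    # 95% credible interval for each conversion rate; run the test
    # until these ranges are short enough to base a decision on
    print("A:", post_a.interval(0.95))
    print("B:", post_b.interval(0.95))

    # Monte Carlo estimate of the probability that B beats A
    samples_a, samples_b = post_a.rvs(100_000), post_b.rvs(100_000)
    print("P(B > A):", (samples_b > samples_a).mean())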

6. Giving Up Optimization After Failed Tests

So you are optimizing a crucial page of your conversion funnel. You develop a hypothesis to improve its conversion rate, and run an A/B test. What do you do if the test fails?

It would be absurd to leave the page in its original state, because the page is crucial and still requires optimization.

The alternative is to learn what the losing variation lacked and why users didn’t find it more appealing. Employ tools like heatmaps and clickmaps to observe user behavior.

Use this information to devise a new, better hypothesis, and run more A/B tests, incrementally improving your chances of a high-performing test.

The key here is to always expect unexpected results. Many times, even the most logical hypotheses fail.

If your best hypothesis fails an A/B test, it wasn’t the best.

Keep testing.

7. Not Tracking Macro Conversions

So your A/B test result has come in. Your variation beat the control. The result is statistically significant, and your conversion rate has improved!

Now while you prepare for a party, let me ask you something.

Did the variation increase just micro conversions (e.g., form submits, blog signups, visits to the next page) or did it actually improve your macro conversions (e.g., revenue)?

If your answer is “increased only micro conversions,” I suggest you cancel the party.

Why? Because while these small wins help justify your role as an optimizer and spread some smiles around, micro conversions might not amount to much unless they improve your bottom line.

This doesn’t mean that micro conversions hold no value. A series of micro conversions is what makes a macro conversion possible. The trick is to choose the goals for your A/B tests carefully, and tie your macro conversion goals to your micro conversions when you set up a test. Unless you do, it becomes a case of not seeing the wood for the trees.

Now, suppose you’ve set up conversion tracking right and see that your A/B tests lift only micro conversions and not macro conversions. In that case, some stages of your conversion funnel need optimization to cash in on the increase in micro conversions.

Measure Macro AND Micro Conversions
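
As a sketch of what tracking both levels can look like, the snippet below computes a micro metric (signup rate) and a macro metric (revenue per visitor) for each variation. All figures are made up for illustration.

    # Hypothetical per-variation totals pulled from your analytics or testing tool
    variations = {
        "control":   {"visitors": 5000, "signups": 400, "revenue": 25_000.0},
        "variation": {"visitors": 5000, "signups": 520, "revenue": 25_500.0},
    }

    for name, v in variations.items():
        signup_rate = v["signups"] / v["visitors"]          # micro conversion
        revenue_per_visitor = v["revenue"] / v["visitors"]  # macro conversion
        print(f"{name}: signup rate {signup_rate:.1%}, "
              f"revenue per visitor ${revenue_per_visitor:.2f}")

    # Here the variation lifts signups from 8.0% to 10.4%, but revenue per
    # visitor barely moves ($5.00 to $5.10): a micro win that still needs
    # downstream funnel work to pay off.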

Once you see your macro conversions rising, you can get back to planning that party.

Related Post: Average Order Value, Conversion Rate or Revenue Per Visitor – What Should You Track?

8. A/B Testing For The Sake of Testing

One of the biggest mistakes a marketer can make is running A/B tests without a properly defined CRO strategy.

A CRO strategy should be a clear roadmap of website tweaks aimed at improving user experience and, in turn, driving higher conversions. As a CRO practitioner, you want to reduce the time and effort users have to spend to convert.

Let’s take an example.

You run heatmaps on your signup form and see considerable drop-off along the length of the form. You hypothesize that shortening the form will lift registrations. You then A/B test different lengths of the form to see which one works best.

If the shortest form wins, you can reasonably deduce that your visitors don’t like filling out lengthy forms. This insight can encourage you to run similar tests on your other forms, too. Such a strategy can help you improve user experience, and consequently, conversion rate.

By going forward with a robust website optimization strategy, you incrementally understand your users’ preferences better and can plan subsequent A/B tests accordingly.

9. Expecting Huge Results

There are tons of CRO case studies across the internet. (VWO alone has 150+ A/B testing case studies.)

Many of those case studies boast double-digit (sometimes triple-digit) percentage increases in conversion rate.

MedienReich VWO A/B Testing case study

Going through these case studies, we dream about running our own A/B tests and delivering the same glorious results!

We run our tests and get a relatively small rise in the conversion rate, probably a tiny single-digit percentage.

That disheartens us. Some of us don’t even implement the winning variation of the test.

There.

Two mistakes.

  1. Expecting unreasonable results.

    For instance, according to Monetate, the global average eCommerce conversion rate is around 2.5 percent. If you are converting below that, it is reasonable to expect an increase, even a double-digit percentage increase. But if your conversion rate is already well above the average, you need to dig deeper into your specific eCommerce niche to see what results you can expect. If you are looking to optimize your landing page, our landing page analyzer can help you estimate the potential for improvement.

  2. Discarding test results because they didn’t meet your expectations.

    Let’s say you find a 1% increase in your overall conversion rate. Ask yourself this: does the resource investment required to make the change justify the expected improvement to the bottom line? If the answer is no, hold off on the implementation. The winning hypothesis should still point you in the right direction, which, if pursued further, can lead to a better hypothesis and, consequently, higher conversions. (A quick back-of-the-envelope version of this check is sketched below.)
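
Here is a minimal sketch of that judgment call in code; every figure below (traffic, value per conversion, implementation cost) is an assumption you would replace with your own numbers:

    # Is a 1% relative lift worth shipping? (All inputs are assumptions.)
    monthly_visitors = 50_000
    baseline_conversion = 0.025   # 2.5% baseline conversion rate
    value_per_conversion = 80.0   # average revenue per conversion
    relative_lift = 0.01          # the 1% relative improvement from the test

    extra_conversions_per_month = monthly_visitors * baseline_conversion * relative_lift
    extra_revenue_per_year = extra_conversions_per_month * value_per_conversion * 12

    implementation_cost = 15_000.0  # dev + design effort to ship the change
    print(f"Extra revenue per year: ${extra_revenue_per_year:,.0f}")  # $12,000
    print(f"Ship now: {extra_revenue_per_year > implementation_cost}")  # False

In this made-up case, the change wouldn’t pay for itself within a year, so you would bank the learning and pursue a stronger hypothesis instead.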

Remember:

Any optimization is better than no optimization.

We need to acknowledge every website optimization gain we can get, no matter how small it may be.

Are You Making These Mistakes?

I hope this list made you aware of the common mistakes that conversion optimizers often make. Use this as a checklist to develop and execute your website optimization strategy like a pro.

P.S. If you liked this piece, you will probably love our other posts. Subscribe to the blog to get research-driven original content delivered right to your inbox, fresh and warm.
