Same CRO Test, Different Results: Optimising Your eCommerce Checkout


Never underestimate the importance of running your own optimisation experiments and gathering your own data. This is why CRO testing for each individual merchant is so important. Simply relying on general data published online from your favourite guru may steer you down the wrong path for your individual business.

For this particular test, it’s easy to assume the data was skewed or the sample size too small, but this experiment had over 500 sales/conversions, a sample large enough to reach statistical significance. You can’t argue with the data.
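To sanity-check a result like this yourself, you can run a standard two-proportion z-test on the control and variant conversion counts. The sketch below uses only the standard library; the numbers in the example are illustrative placeholders, not the merchant's actual data.

```python
from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test comparing two conversion rates (A = control, B = variant)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)                 # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))   # standard error of the difference
    z = (p_b - p_a) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# illustrative numbers only -- not the experiment's real figures
z, p = two_proportion_z_test(conv_a=260, n_a=10000, conv_b=250, n_b=10000)
```

A low p-value (conventionally below 0.05) means the difference between control and variant is unlikely to be random noise.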

It’s funny how in this business the things that seem like clear winners turn out to be losers, and the tests you expect to underperform the control actually end up converting higher. The same test often behaves differently on different websites. As much as we like to standardise data and draw overall conclusions, these results show the importance of testing individually for each merchant.

The test was quite simple: add a security lock icon to the checkout button.

Hypothesis: Displaying a security lock icon on the checkout button will make customers feel safer when ordering knowing their information and order details are protected, hence increasing conversion rate.

Obvious winner right?

Wrong.

Here are the results:

You can see that on Desktop and Mobile/Phone the results are all negative across CR (Conversion Rate), RPV (Revenue Per Visitor), and AOV (Average Order Value).
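For readers who want the three metrics pinned down precisely, here is a minimal sketch of how each is computed from raw totals. The function name and example figures are hypothetical; note the useful identity that RPV equals CR multiplied by AOV.

```python
def checkout_metrics(visitors, orders, revenue):
    """Compute the three reported metrics from raw experiment totals."""
    cr = orders / visitors    # CR: Conversion Rate (orders per visitor)
    rpv = revenue / visitors  # RPV: Revenue Per Visitor
    aov = revenue / orders    # AOV: Average Order Value
    return {"CR": cr, "RPV": rpv, "AOV": aov}

# illustrative totals only
m = checkout_metrics(visitors=20000, orders=500, revenue=42500.0)
```

Because RPV = CR × AOV, a variant can win on conversion rate yet still lose on revenue per visitor if it pushes average order value down.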

This same test run on a different website had this result:

Remember the first website had over 500 conversions, so you can’t write it off as an anomaly. We ended up turning off this experiment for the first merchant in favour of better performing tests.

Interestingly though, a variation test did produce better results for the first merchant:

This test involved displaying the words “Quick Checkout” on the order button rather than “Checkout”. There was no security icon on the button for this test in case you were wondering.

Notice the difference in performance results across device categories.

Insider Tip: Make sure you measure each device category separately. Relying solely on desktop or overall conversion results may cannibalise your conversion rate on mobile devices. Mobile traffic is now significant enough that you need to run mobile-specific optimisation tests.
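Segmenting by device is straightforward if you log a device category against each session. The sketch below assumes hypothetical per-session records with `device`, `converted`, and `revenue` fields; the data and field names are illustrative, not from any real analytics export.

```python
from collections import defaultdict

# hypothetical session records -- fields and values are for illustration only
sessions = [
    {"device": "desktop", "converted": True,  "revenue": 90.0},
    {"device": "desktop", "converted": False, "revenue": 0.0},
    {"device": "mobile",  "converted": True,  "revenue": 60.0},
    {"device": "mobile",  "converted": False, "revenue": 0.0},
    {"device": "mobile",  "converted": False, "revenue": 0.0},
]

def metrics_by_device(sessions):
    """Aggregate CR, RPV, and AOV separately for each device category."""
    groups = defaultdict(lambda: {"visitors": 0, "orders": 0, "revenue": 0.0})
    for s in sessions:
        g = groups[s["device"]]
        g["visitors"] += 1
        g["orders"] += s["converted"]   # True counts as 1, False as 0
        g["revenue"] += s["revenue"]
    return {
        device: {
            "CR": g["orders"] / g["visitors"],
            "RPV": g["revenue"] / g["visitors"],
            "AOV": g["revenue"] / g["orders"] if g["orders"] else 0.0,
        }
        for device, g in groups.items()
    }

report = metrics_by_device(sessions)
```

Reporting each device category on its own row like this is what surfaces cases where a variant wins on desktop but quietly loses on mobile.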

In summary, the main points to take away from this are:

  • Always measure your own data against published information.
  • Measure each device category separately for results.
  • Run specific optimisation tests for mobile vs desktop devices.

If you are interested in more insights like this, head over to https://www.onlinevisions.com.au/blog/ and download our free report on the 14 game-changing experiments that have given merchants an average revenue lift of 8.12%.