Why Are Marketers Ending A/B Tests Early and Reporting Inflated Conversion Results?

We’re constantly working to understand our target audience better. Recently, we surveyed digital marketers to learn about the challenges and opportunities in their conversion rate optimization and A/B testing programs. The most striking finding: the majority of marketers end tests before they’re ready and report conversion increases that look inflated compared to industry benchmarks. Why is this happening? We have a few hunches. Read on and let us know your thoughts.

Ending Tests Before Reaching Significance

Of the marketers surveyed, 86% either end tests or implement test results before reaching significance. This means new page designs are either being discarded or pushed live prematurely, before a high level of confidence in the results can be achieved.
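To make “reaching significance” concrete, here is a minimal sketch of the kind of check a testing tool runs under the hood, using a standard two-proportion z-test. The function name and the visitor and conversion counts are our own hypothetical illustration, not figures from the survey.

```python
from math import sqrt
from statistics import NormalDist

def ab_test_p_value(conv_a, n_a, conv_b, n_b):
    """Two-sided p-value for a two-proportion z-test.

    conv_a / n_a: conversions and visitors on the control page
    conv_b / n_b: conversions and visitors on the variant page
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis of no difference
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Hypothetical numbers: 400/10,000 on control vs. 450/10,000 on the variant
p = ab_test_p_value(400, 10_000, 450, 10_000)
print(f"p-value: {p:.3f}")  # significant at the usual 0.05 level only if p < 0.05
```

In this hypothetical, a 12.5% relative lift across 20,000 visitors still yields p ≈ 0.08, just short of the conventional 0.05 threshold. That is exactly the situation where a marketer might be tempted to call the test early.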

Why is this happening? Testing may be taking longer than marketers would like. Of those surveyed, 47% said they need to run their live A/B tests for more than four weeks to reach statistical significance, and 91% say they’d find it valuable to receive test results in just days instead of weeks or months.
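Why would a test take four weeks or more? The driver is sample size: detecting the small lifts typical of A/B tests at typical conversion rates requires a lot of traffic. Here is a rough back-of-the-envelope estimate using the standard power-analysis formula; the baseline rate, target lift, and traffic figures are hypothetical.

```python
from statistics import NormalDist

def visitors_per_variant(baseline, lift, alpha=0.05, power=0.8):
    """Rough sample size per variant to detect an absolute lift in
    conversion rate (two-sided test, 50/50 traffic split)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # ~1.96 for alpha = 0.05
    z_beta = NormalDist().inv_cdf(power)           # ~0.84 for 80% power
    p_avg = baseline + lift / 2
    return (z_alpha + z_beta) ** 2 * 2 * p_avg * (1 - p_avg) / lift ** 2

# Hypothetical: 3% baseline rate, hoping to detect a 0.5-point absolute lift
n = visitors_per_variant(0.03, 0.005)
print(f"~{n:,.0f} visitors per variant needed")
```

At a 3% baseline, detecting a half-point absolute lift works out to roughly 20,000 visitors per variant. A site receiving a couple of thousand visitors a day, split across two variants, would indeed be looking at several weeks of testing.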

In a world where everything is expected to be faster, better, and smarter, it isn’t surprising that marketers want results sooner.

Reporting Inflated Conversion Results

According to industry conversion statistics, only 1 out of 8 A/B tests drives a significant change, and when tests do show a positive improvement, it’s usually a small increment of 1% to 5%. However, 70% of digital marketers surveyed believe the majority of their tests result in the new page showing increased conversion, with 30% reporting a 10% or higher increase in conversion on their most recent A/B test.

Where’s the disconnect? Interestingly, only 34% of the marketers indicated that a percentage conversion improvement is their most important key performance indicator. Most of the rest said their organization primarily measures site health data such as bounce rates and time on page.

Inflated reporting could be a result of respondents focusing on their primary KPIs of overall page improvement. Or they may genuinely believe the majority of their new page designs are performing better. Either way, this opens up a much larger conversation about data and analysis quality.

Have you seen any challenges with this within your organization? Have you ever had conflicting points of view on how well a page is or isn’t performing?

Opportunities and Solutions for Improvement

So, what did we learn from the survey about other challenges and opportunities in marketers’ conversion optimization efforts? The common theme that emerged is a desire for more qualitative insight. The majority feel confident in their quantitative reporting, but nearly half report needing more feedback from their target audience.

A few key results:

  • 50% say they need a better understanding of what visitors like and dislike
  • 48% say they face the challenge of wanting to know why visitors do or do not convert
  • 44% report using customer testing to help determine what is tested and when

Another key theme: the desire for more information before launching a live A/B test. A whopping 93% of those surveyed would run more high-impact A/B tests if they could test before going live. Additionally, 90% say the ability to test a page using just a design, rather than a live, developed page, would be valuable to them.

What about frequency? When responses were broken down into four ranges of tests performed per month, the largest group (46%) performs 6-10 tests, and 13% perform more than 10. Of those who report being satisfied with their conversion rate program, 88% perform more than four A/B tests per month.

Are you running fewer than four tests per month? Are you satisfied or dissatisfied with your testing program?

This survey was completed by more than 250 mid- to senior-level digital marketers in the U.S. who are responsible for conversion rate optimization at their organizations.

WEVO recently launched a new platform for digital marketers to test and improve website conversion before going live. The platform predicts conversion testing outcomes, provides audience insight, and helps marketers build a more effective website. WEVO makes this possible before site pages are developed or A/B tests are launched, steps that are typically required to test website effectiveness and can consume significant time and resources.

What are your thoughts on these results?
