Customer engagement has become a bit of a catchphrase. We know we want it, and we know the steps we should follow to make it happen, but few of us know how to actually ensure its success with our digital content. So we experiment with our customer experience — on live traffic. And that’s a big risk to take.
We know it’s risky, but we still do it because it’s been the historical approach to creating engaging digital content and designs. We want to capture the attention of our audience, and we want our digital content and designs to perform at a level that exceeds our expectations — and our customers’. So, we implement every best practice available in our pre-live workflow: design, coding, ads, and search engine optimization.
Then we launch the site, measure its results, and, when they fail to meet our performance goals, we make a guess at why and iterate from there. Lather, rinse, repeat. This incremental path to success is not going to generate the customer engagement we want. Why? Because we’re doing the experiment the wrong way.
Experiments (read: testing) should be done before the product reaches the customers. When you make your site visitors your test subjects, you unintentionally pay for it in more ways than you think — with opportunity cost, bad impressions, and less willingness to try new approaches over time.
The Opportunity Cost of Testing Live Customer Engagement
You can conduct extensive A/B and multivariate testing and still end up going to market with subpar customer experiences. Iteration only works if one of the options is actually good. Remember that old joke about how fast you need to be to outrun a hungry lion? Just because you hightail it a little faster than the guy running next to you, that doesn’t make you Usain Bolt. And, of course, the same is true for version A beating out version B in a split test duel. If A did better than B, it doesn’t mean that A is actually good.
It is true that, over time, experimentation should weed out conversion blockers, which should improve customer engagement and result in losing fewer visitors over time. But how many site visitors who see the “losing version” can you afford to lose? The gap in effectiveness between the loser in A/B test #1 and the final winner in test N is an opportunity cost we’re paying.
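To make that opportunity cost concrete, here is a minimal back-of-the-envelope sketch. All the numbers are hypothetical, and it assumes a simple 50/50 split where half of each test’s traffic sees the losing variant; it simply tallies the conversions forfeited to losing variants, measured against the eventual winner’s conversion rate.

```python
# Hypothetical illustration: the opportunity cost of exposing live traffic
# to losing variants across a sequence of A/B tests. All figures are made up.

def lost_conversions(visitors_per_test, winner_rate, loser_rates):
    """Conversions forfeited by the half of traffic shown each losing
    variant, measured against the final winner's conversion rate."""
    lost = 0.0
    for loser_rate in loser_rates:
        # In a 50/50 split, half the test's visitors see the losing variant.
        lost += (visitors_per_test / 2) * (winner_rate - loser_rate)
    return lost

# Three rounds of testing, 10,000 visitors each; the eventual winner
# converts at 4%, while the earlier losers convert at 2%, 2.5%, and 3%.
print(lost_conversions(10_000, 0.04, [0.02, 0.025, 0.03]))  # → 225.0
```

Even in this modest scenario, iterating on live traffic quietly costs a couple of hundred conversions — revenue that pre-live testing could have kept on the table.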
Yes, experimentation and testing is something we should be doing to improve the customer experience, but it needs to be done in a way that reduces the opportunity cost as much as possible before we go live.
Temporary Mistakes Can Leave Permanent Damage
According to a report published by Forrester and Accenture Interactive, “while 52% of respondents believe they are ahead of their competitors or ‘best in class’ in their industries, only 7% are exceeding customers’ expectations.”
Yikes. That’s a significant amount of conversion loss, damaged brand loyalty, and diminishing customer lifetime value.
Consider this: in today’s fast-paced digital landscape, content can seem short-lived, or perhaps so transient that mistakes will go unnoticed or even be acceptable. After all, unlike an ad printed on the front page of The New York Times, a digital ad or even an entire webpage can be removed in seconds. But even if it can be replaced, you can’t make your audience “unsee it.” This is a pitfall that A/B testing won’t help you avoid — as mentioned above, you may be launching the higher-performing design, but you still don’t know if there are elements on that “winning” page that fall short of your customers’ expectations. And when we come across a poor experience, we’re less likely to return for more.
Taking Creative Risks to Improve Customer Engagement
This isn’t your permission slip to skip testing or to play it safe for fear of creating disappointing digital experiences, especially when many future CXO careers can be launched on a big-bet move that shows originality, vision, and initiative. Those qualities matter for inspiring customer engagement, especially when it comes to releasing a new product or a meaningful update. Just think about Dollar Shave Club’s “our blades are f***ing awesome” messaging.
Risk-taking moves the needle, gets headlines in Adweek, builds buzz and followers, rises above the noise, and, perhaps one of the most feel-good results, inspires envy in our peers. But obviously, we can’t afford to let our content fail. To feel confident taking creative risks with customer experience, we need to back our decisions with customer-centric, pre-live data — and preferably data that isn’t the product of analysis paralysis: endless deliberation that ends only when the most senior person in the room makes the call.
In fact, a McKinsey survey found that 72% of senior executives thought bad strategic decisions either were about as frequent as good ones, or “were the prevailing norm” in their organization. Not great. So as you’re preparing to pitch your bold, new idea, be prepared with real audience data to improve your chances of making an impact.
It’s All About the Benjamins
Despite the rapid advancements in industries all around us, some marketing and CX teams are still operating in the Dark Ages. We’re not well informed, we lack time and budget, and we don’t have the tools to listen to our audience before launching a new digital experience. Launching a page based on raw guesses has a significant impact on customer engagement and, as a result, we leave a lot of money on the table.
To turn things around, we need to rethink measurement. Believe it or not, we’re measuring the wrong things. We’re overvaluing metrics like page views, trial signups, and social media “likes” — things that Eric Ries astutely calls “vanity metrics.” They are, at best, a means to an end. Revenue is the end that should drive our KPIs, and it is rapidly becoming the primary measure by which digital marketers are judged.
The pre-live timeframe — the window right before a webpage goes live to the public — is an opportunity. Take advantage of it! It’s your chance to figure out the things that will drive customer engagement, like the most relatable hero image and the most eye-catching headline, so you can design — and launch — the best digital experience possible.