The truth behind the yin and yang of conversion rate optimization

Reading Time: 3 minutes

I’m going to advocate for the flip side of the conversion coin today. I’m delving into the “dark side” that is rarely discussed.

At WiderFunnel, we always emphasize the importance of testing and data-driven decision making. The truth is that you can’t do conversion optimization without testing, and web analytics is essential for identifying optimization opportunities.

But, testing and data alone don’t tell the whole story.

The truth: testing is the easy part.

Gasp!

Yes. It’s the how and what that determine the results you achieve from your testing. The secret is in the test variation ideas.

The so-called “left brain” perspective, full of logic and analysis, only tells half the story. The creative, “right brain” input is just as important.

Left brain right brain

(For this discussion, let’s ignore the fact that the whole left/right brain meme isn’t supported by the latest research.)

I believe the left brain aspect gets more attention in the CRO discussion simply because it’s been neglected by marketers for so long. That’s one reason I called my book “You Should Test That!”: to help wake up marketers relying solely on gut feeling and unproven ideas. But, the creative side of solving problems is just as critical as the testing side.

There’s a creative yin to the rigorous yang

Just as important as testing are big, innovative ideas. Big results come from big ideas.

The Yin is innovative, intuitive, messy, artful marketing ideas. The Yang is the proof in the pudding, the sunshine of truth searing through the fog. Yes, we test, but without great ideas, there’s nothing to prove.

Creativity is needed because every situation is unique. Context is key.

There is no conversion optimization rule book

When we started WiderFunnel in 2007, we initially imagined we’d optimize to find the perfect landing page design that we could replicate for any situation. Well, I don’t know if we really believed that, but it was at least a hope.

Back then, we certainly thought we’d find the “best” button colour, the “best” headline approach and the “best” website layout.

In reality, what we’ve found is more universal and exciting: principles, patterns, and processes that we can confidently apply to any situation, platform, target market and media. We continue to test on websites, mobile sites, mobile apps, video game interfaces and more, all with the same system. Much like the fabled Mounties of the RCMP, who “always get their man,” WiderFunnel always gets to a winning test result.

RCMP always get their Bieber

A right brain test example

When WiderFunnel optimized the Expensify home page, the variations clearly weren’t developed by an algorithm. The cross-functional team, led by an experienced strategist, developed new approaches that no software alone could conceive.

For context, Expensify is a fast-growth startup with a lean team facing large competitors. The expensify.com home page is the company’s primary landing page for free online signups, and it needed to produce more of them. The page had been developed with clean design “best practices,” but the Expensify team believed it could be improved.

Here’s the original home page the company came to us with.

Expensify Control Page

Our strategists identified 16 conversion barriers using WiderFunnel’s LIFT Analysis system and prioritized 12 primary hypotheses to test. They translated those hypotheses into four initial test variations, designed to answer the major hypotheses and isolate a few important questions. WiderFunnel’s design team brought the wireframes to life, then developed and launched the first test on the page.
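
If you’re curious about the mechanics behind launching a test like this: any A/B/n test needs to bucket each visitor into one variation and keep that assignment stable across visits. Here’s a minimal sketch of deterministic bucketing by hashing a visitor ID. To be clear, this isn’t WiderFunnel’s actual tooling, and the variation names are hypothetical; it only illustrates the general technique.

```python
import hashlib

# Hypothetical arm names: a control plus the four initial test
# variations described above.
VARIATIONS = ["control", "var_a", "var_b", "var_c", "var_d"]

def assign_variation(visitor_id: str) -> str:
    """Deterministically bucket a visitor into one variation.

    Hashing a stable visitor ID (e.g. a first-party cookie value)
    yields the same assignment on every page load, with no need to
    store per-visitor state on the server.
    """
    digest = hashlib.sha256(visitor_id.encode("utf-8")).hexdigest()
    return VARIATIONS[int(digest, 16) % len(VARIATIONS)]

print(assign_variation("visitor-12345"))  # same ID -> same bucket
```

Real testing platforms layer targeting, traffic weighting and mutual exclusion on top of this, but a stable hash split is the core idea.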

Expensify home page winning variation

When you compare the winning page with the original, you’ll see that it clearly required creativity to design. No algorithm could come up with that combination of headline, design, and benefit copywriting. On the winning variation, we:

  • Added a new headline reflecting the company’s unique positioning and brand
  • Moved the form field up on the page
  • Created visual emphasis on the signup form
  • Added an anxiety-reducing message on the call-to-action (CTA)
  • Added a strong CTA subhead
  • Isolated features vs. benefit copy points
  • Designed new colour and font treatment for support points

Of course, we also tested other variations that isolated questions and led to insights, or what we call “Aha!” moments, about persuasion triggers for this target audience.

And, the testing didn’t end with a single winning test because CRO is an ongoing process.

The latest winning page clearly took another dose of creativity, and could only have been arrived at through a combination of creative thinking and rigorous testing.

Expensify latest winning page variation

This winning page not only looks great; it also produces a 47% higher signup conversion rate than the original control page.
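
If you want to sanity-check a lift claim like that, a two-proportion z-test is the standard back-of-envelope tool. The sketch below uses made-up visitor and signup counts (the post doesn’t publish Expensify’s actual traffic), chosen so the relative lift works out to roughly +47%:

```python
import math

def lift_and_significance(visitors_a, conv_a, visitors_b, conv_b):
    """Two-proportion z-test comparing control (a) vs. variation (b).

    Returns (relative_lift, z_score, two_sided_p_value).
    """
    p_a = conv_a / visitors_a            # control conversion rate
    p_b = conv_b / visitors_b            # variation conversion rate
    lift = (p_b - p_a) / p_a             # relative lift: 0.47 == +47%

    # Pooled rate under the null hypothesis of no difference
    p_pool = (conv_a + conv_b) / (visitors_a + visitors_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se

    # Two-sided p-value via the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return lift, z, p_value

# Illustrative numbers only: 10,000 visitors per arm,
# 4.0% control vs. 5.88% variation signup rate (~ +47% lift)
lift, z, p = lift_and_significance(10_000, 400, 10_000, 588)
print(f"lift = {lift:+.1%}, z = {z:.2f}, p = {p:.5f}")
```

With these illustrative numbers the lift is about +47% and the p-value falls far below 0.05, but the real conclusion always depends on the actual sample sizes.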

For more detail, check out Expensify’s SaaS conversion optimization case study.

As an industry, I believe we’ve made good progress in convincing marketers of the need to test, use big data, and become experts in analysis. Now, let’s remember that the creative side is still needed to imagine new solutions to old challenges.

  • Great design on the new landing page – but I'm more interested in the white paper/book/blog post containing a thorough run-down of your "principles, patterns, and processes that we can confidently apply to any situation, platform, target market and media."

    😉

  • Elena Carson

    Hi, Chris. Great case study!

    I love the latest winning page. It gives me a lot of confidence in the company; my perception is that, unlike the original, someone spent a considerable amount of time to present me with a professional, well-thought-through design. Someone worked hard to capture my attention.

    I would be very interested to see the page that will outperform this one!

    • chrisgoward

      Thanks, Elena!

  • Great article Chris!

    The case study is good and easy to understand. It would be great to see some more case studies if possible.

    I’m doing A/B testing for my clients and I’ve always tried to think outside the box with it. But clients are not always happy with big changes, since many of them are risky and can produce negative results.

    Thanks for sharing the info.

  • Stephen

    I had a look through your examples on your case studies page. The studies were very clear in their performance improvements. But when I went to look at the client’s actual web site, they appeared to have a complete redesign and were no longer using the designs you had suggested.

    This suggests you couldn’t overcome the client’s desire to either be in control of their own design or their propensity for change. Perhaps somebody else in the client’s organisation was setting the priorities.

    It appeared to me that most of your improvements were gained by reducing the content on the page, providing more focus. That certainly provides clarity and reduces distraction and anxiety. But is that sufficient? The clients appeared to have higher priorities.

    • chrisgoward

      Yes, for most of our clients, the website is different now than when we published each of the case studies. Your logic is off, though. Why would you assume that suggests a problem?

      To clarify, here are some other things to consider:
      ▪ The designs in our case studies are not “designs we suggested.” They are designs we created, tested, and proved through A/B testing to be winners at the time. We don’t ever “suggest” a new design. We test and optimize the design, content, layout, messaging, etc.
      ▪ The screenshots in those case studies are snapshots in time, not the forever-final result. We usually show them in the case study to illustrate an insight, result or strategy. They’re by no means comprehensive of all the work we’re doing for a client.
      ▪ For example, we often perform site-wide tests that dramatically redesign our clients’ websites, but the case studies may have been published before the most dramatic of those tests.
      ▪ The strategy we use is called Kaizen, which means “continuous improvement.” By definition, the designs shouldn’t be the same today. In fact, it would be more of a problem if the site were exactly the same today!
      ▪ We’re often training a client’s teams to do conversion optimization by “looking over our shoulders,” and they take over at some point. That’s totally fine with us.
      ▪ The improvements are sometimes achieved by reducing content, sometimes by increasing content, and sometimes without touching content. It’s sufficient if and only if it lifts business results (e.g., profitable sales).
      ▪ We focus precisely on the client’s highest priority and spend considerable effort at the beginning of an engagement ensuring we’re doing so.
      ▪ However, organizations don’t always follow the best strategies either. Politics are involved, and some people’s incentives aren’t aligned with their own company’s. They’re the minority, but I certainly won’t argue that all organizational behaviour is logical, optimal or unbiased.

  • Jen

    Really like the winning result. It's always amazing to me how tweaking certain things can make such a huge difference. Plus, this reiterates the importance of great copy – even if there's not much there.

  • sherisa

    Great test and results. Of course, I'm also curious about the data and insights you have to share about embedding the Twitter icon directly into copy highlights.

  • I’m working through a number of ideas right now to improve conversions and this article serves as a great reminder – TEST TEST TEST everything!