Controlled experiments can transform decision making into a scientific, evidence-driven process—rather than an intuitive reaction. Without them, many breakthroughs might never happen, and many bad ideas would be implemented, only to fail, wasting resources.— “The Surprising Power of Online Experiments,” Harvard Business Review
When Nate Wallingsford stepped into the role of Director of Conversions, Acquisition Marketing at The Motley Fool three years ago, he knew that experimentation would be a primary focus.
Nate understood that the ability to provide data-backed recommendations for digital experience improvements was crucial. And for a multimedia company like The Motley Fool, whose website receives millions of unique visitors each month, even a slight increase in customer acquisition rate could have a massive impact on the bottom line.
Today, Nate is the Head of US Marketing Operations and Optimization at The Motley Fool, and the company is testing across the entire customer journey and multiple teams.
While experimentation has always been celebrated at The Motley Fool, Nate’s work in partnership with Widerfunnel has led to increased visibility for the experimentation program, massive revenue gains, and—perhaps most importantly—actionable customer insights.
This case study explores how Nate and his team are using experimentation as a fool-proof (pun intended) growth strategy.
The business context
As mentioned, The Motley Fool is an established multimedia financial services company.
Their website consists of pages on pages of useful, free content for investors; the company generates revenue through various stock, investing, and personal finance premium services. The Motley Fool’s funnel consists of three primary areas:
- User acquisition, or front-end marketing: The goal is to encourage non-paying members to sign up and become paying members
- Upsell, or back-end marketing: The goal is to encourage members paying for lower-tier services to purchase more premium services
- Product: The goal is to increase member engagement with the service(s) they’re paying for and increase member retention
Each area of the funnel is the responsibility of a different team within The Motley Fool, but everyone is focused on creating the best customer experience and increasing customer lifetime value.
Across the teams, we’re all really focused on customer lifetime value (LTV) at every stage—whether it’s someone’s first product with us, or getting them into a higher tier. And on the product side, the more we get people to renew, the higher their lifetime value. It’s a win-win for both the customer and us—they get great stock advice and recommendations, and we get to retain them as a customer.— Nate Wallingsford, Head of US Marketing Operations & Optimization at The Motley Fool
As the leader of the user acquisition team, Nate’s main focus was to hit his revenue targets by increasing paid subscriptions. And experimentation was his tool of choice.
Finding the right experimentation partner
From the outset, Nate had almost everything he needed to build a successful experimentation program within his team: senior-level support, high website traffic, plenty of ideas for improvement. But he was lacking a process.
Nate was running what he refers to as “good-idea tests”: you have what you think is a good idea, you test it, and it wins, loses, or is inconclusive. If it wins, hooray! You can implement changes. But if it loses, it’s back to the drawing board.
As a self-proclaimed process-oriented person, he knew there had to be a better method. It was in this moment—searching for experimentation processes on Google—that Nate stumbled upon Widerfunnel Founder Chris Goward’s book, You Should Test That! He bought it, read it, and loved it.
I thought, ‘This book is awesome.’ I’m going to start building out our optimization program for acquisition around the Widerfunnel methodologies.— Nate Wallingsford
And he wasn’t kidding. Nate took everything he’d read in the book, as well as resources from the Widerfunnel website: The LIFT Model®, what makes a great hypothesis, the Infinity Optimization Process™, the PIE prioritization framework – and built a Trello board of the Widerfunnel process as he interpreted it.
Nate understood experimentation from day one. It was fantastic. He was already familiar with the LIFT Model and experiment design principles. We were able to jump into collaborative experiment ideation and strategy right away. You know, the fun stuff.— James Flory, Experimentation Strategist, Widerfunnel
Ultimately, Nate decided to partner with Widerfunnel to ensure his program would be as successful as possible: to solidify his use of these processes internally, to gain access to additional design and web development resources, and to collaborate with expert experimentation strategists who test across industries every day.
The power of fresh perspective
With every new client, there is a discovery phase. In the early days of our partnership, Nate and the Widerfunnel team dug into The Motley Fool’s most important user acquisition goals and conducted initial analyses of fool.com.
One of the first things we noticed was the lack of a prominent call-to-action to sign up for The Motley Fool’s premium services. Users were landing on the site and engaging with content, but there was no clear path to conversion. In fact, there was almost no indication that there were professional services available.
Nate and his team have a ton of knowledge about their product, their marketing, and their customer, but they had overlooked a seemingly common-sense improvement. This happens to marketers constantly. We are so close to our day-to-day, so wrapped up in our way of thinking, that we miss simple solutions.
Which is why a partner with a fresh perspective can be essential.
The first experiment Widerfunnel ran with us was to add a call-to-action button to our desktop site-wide navigation. It was super simple and seemed like a no-brainer, but the impact was insane. After that, I knew this was a good decision. And I knew it was going to be really cool to work with [Widerfunnel].— Nate Wallingsford
The experiment details
For this experiment, we added a “Latest Stock Picks” call-to-action in the navigation. This CTA replaced a dropdown menu labeled “Stock Picks”. The assumption was that Motley Fool users are looking specifically for stock-picking advice.
Hypothesis: Creating a clear “Latest Stock Picks” CTA in the site-wide navigation will cause more users to enter a revenue-driving funnel from all areas of fool.com.
Two variations were tested. Each featured the “Latest Stock Picks” call-to-action. But in each variation, this CTA took the user to a different page. Our ultimate goal was to find out:
- If users were even aware that there were premium paid services offered, and
- Which funnel is best to help users make a decision and, ultimately, a purchase
In variation A, the “Latest Stock Picks” call-to-action sent users to the homepage and anchored them in the premium services section. This section provides detail about The Motley Fool’s different offerings, along with a “Sign Up Today” call-to-action.
With variation B, we wanted to experiment with limiting choice. Rather than showing users several product options, the “Latest Stock Picks” call-to-action sent them directly to the sign-up page for Stock Advisor, The Motley Fool’s most popular service.
Both variations beat the control. Variation A resulted in an 11.2% lift in transactions with 99% confidence and variation B resulted in a 7.9% increase in transactions with 97% confidence.
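Results like these are typically validated with a two-proportion z-test, which is one standard way to attach a confidence level to an A/B lift. The sketch below uses hypothetical session and conversion counts chosen only to illustrate how an 11.2% relative lift might reach roughly 99% confidence; they are not The Motley Fool’s actual data.

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Pooled two-proportion z-test.

    Returns the relative lift of B over A and the one-sided
    confidence that B's conversion rate is truly higher.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    confidence = 0.5 * (1 + math.erf(z / math.sqrt(2)))  # normal CDF of z
    lift = (p_b - p_a) / p_a
    return lift, confidence

# Hypothetical counts, for illustration only (not The Motley Fool's data):
# 50,000 sessions per arm, a 2.0% control rate vs. 2.224% in the variation
lift, conf = two_proportion_z(conv_a=1000, n_a=50000, conv_b=1112, n_b=50000)
print(f"lift={lift:.1%}, confidence={conf:.1%}")  # lift=11.2%, confidence=99.3%
```

The larger the sample, the smaller the lift that can be confirmed at a given confidence level, which is why high-traffic sites like fool.com can detect relatively modest effects reliably.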
Interestingly, because variation B was built on variation A using factorial design, we could isolate the effect of variation B’s additional change (sending users directly to a single service), which actually decreased transactions by 3.3% relative to variation A.
What is Factorial Experiment Design?
Factorial Design is a method of Design of Experiments. Similar to multivariate (MVT) testing, factorial design allows you to test more than one element change within the same variation. The greatest difference is that factorial design doesn’t force you to test every possible combination of changes.
Instead of creating a variation for every combination of changed elements, you can design an experiment that focuses on the specific isolations you hypothesize will have the biggest impact or drive the most useful insights.
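The isolation logic is simple arithmetic: because variation B differed from variation A by exactly one additional change, comparing the two lifts against their shared control isolates that change’s effect. A minimal sketch, using the lifts reported earlier:

```python
# Relative lifts vs. the shared control, as reported in the case study
lift_a = 0.112  # Variation A: CTA anchors users in the premium services section
lift_b = 0.079  # Variation B: A's CTA, plus sending users straight to one product

# B layers exactly one change on top of A, so the difference between the
# two lifts isolates that change's effect (in percentage points vs. control)
isolated_effect = lift_b - lift_a
print(f"{isolated_effect:+.1%}")  # -3.3%
```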
What were the insights?
In The Motley Fool’s initial experience, users may have been unsure of how to sign up (or that they could sign up at all) due to the lack of a prominent call-to-action in the original site-wide navigation. Users also seemed to prefer some degree of choice over being sent to a single product (as seen in the decrease in transactions caused by variation B).
Following the customer insights
One of the biggest problems with the “good idea” testing Nate had been doing was that it prioritized conversion rate lift over insights and learning. If an experiment won, great. If not, it had zero value: all the effort to ideate, design, and run the test was wasted.
But a great experimentation program will generate insights with every single test. And that’s what Nate began to build with Widerfunnel.
One series of experiments was particularly fascinating. It generated a core customer insight that The Motley Fool is still leveraging and validating throughout their digital experience. The initial idea was to leverage an extremely common persuasion principle: social proof. Adding social proof to any customer experience is widely accepted as a “CRO best practice”.
The experiment details
A secondary top-of-funnel metric for The Motley Fool is email sign-ups for the company’s email marketing list. For the first experiment on this funnel, we focused on the email capture modal. In one of our variations, we added a social proof statement: “Join over 121,837,512 other Fools who have come to The Motley Fool for investing insights.”
This tactic has worked for other Widerfunnel clients in the past, encouraging more users to enter a revenue-driving funnel. In this case, however, the variation tanked: it resulted in an 11.2% decrease in sign-ups.
In this experiment, social proof resulted in an extreme reaction from users, indicating high sensitivity around this persuasion technique. One theory was as follows: Rather than being comforted by the fact that others trust The Motley Fool, prospective customers may actually be looking for exclusivity.
Nate and Widerfunnel Strategist, James Flory, wanted to understand further. They ran another experiment, this time on the primary landing page for the email capture funnel. But they leveraged a different form of social proof: a customer testimonial.
The Hypothesis: Adding testimonials will make users trust this page as a place to submit their emails and improve email capture rates.
Again, Motley Fool users responded negatively. The social proof variation resulted in a slight decrease in conversions. Seeing this result, Nate thought about the testimonials splashed across the customer journey and wondered: Should we remove social proof throughout the funnel?
“We had started to test injecting social proof into our lead capture pages. And we saw a drag on conversions any time we did that. We tried adding social proof and trust elements on our video sales letter pages. That had a drag on conversion. And we thought it was strange. Adding social proof seems like a best practice, an industry standard,” explains Nate.
“So, we thought, let’s experiment with this on our order pages. These are at the bottom of the funnel, right before purchase. What happens if we remove the testimonials and social proof we have on order pages? Will we then see a lift because earlier in the funnel we saw a drag across the board?”
We ran another experiment, this time on the order page—the stage of the funnel that includes the point of purchase. In the variation, all customer testimonials were removed. This variation performed terribly, decreasing transactions, average order value, and revenue per session. However, it generated several actionable insights:
- Previous learnings indicated social proof had a negative correlation with conversion rate. This experiment challenged that insight.
- It may be that, in the early stages of the user journey, users are not yet in a purchase state of mind and still crave exclusivity.
- Early stages of the funnel don’t hint at a paid service or subscription, but adding testimonials may put the thought of an upcoming sales pitch into the user’s mind, possibly triggering an exit or increased wariness.
- Conversely, when a user is exposed to a purchase decision, they respond positively to social proof, which may reduce anxiety and increase trust and confidence in their decision
That was really interesting to see. Even though we had a decrease in conversion rates across all three experiments, they generated this insight that social proof and testimonials are huge at the point of purchase, but may need to be avoided at the top of the funnel.— Nate Wallingsford
This series of experiments points to the importance of experimentation in general. If Nate had simply made changes to fool.com based on best practices, he might have seen conversion rates drop with no understanding as to why.
And if he hadn’t been leveraging an experimentation process to understand where to retest and revalidate insights (in this case, the threshold and elasticity of social proof), he might’ve just removed social proof lower in the funnel based on the initial experiment results, assuming that social proof doesn’t work.
Socializing experimentation: The importance of gaining visibility
Every marketer and product owner has growth targets they are trying to hit. Which is why achieving positive experiment results is hugely important. But visibility is crucial to the longevity of any experimentation program—on both winning experiments and ‘losing experiments’ that generate learnings.
Nate’s goal has always been to promote a culture of evidence-based decision-making at The Motley Fool.
Early on, Nate realized that the insights gained through process-based experimentation were a firestarter for even better tests. He wanted to spread this knowledge throughout the organization, so he began compiling his experiments and insights into a monthly email newsletter.
At first, Nate was just distributing this newsletter to the U.S. acquisition team. But people began to forward it on, and more Fools became interested in joining his distribution list. So, he began to scale this communication to other teams.
This newsletter became a key resource for other teams at The Motley Fool—specifically teams with lower website traffic. These teams lack the traffic volume to test at the same velocity as the acquisition team, but are able to leverage Nate’s insights and results to implement new experiences on their sites.
Today, Nate and his colleague Lauren conduct a weekly standup on experimentation. Attendees come from across the company—from marketing, technical, and editorial teams. This constant communication generates buzz and momentum around experimentation at The Motley Fool and is a key piece of Nate’s strategy.
The future of experimentation at The Motley Fool
At the beginning of this partnership, Nate was looking to leverage Widerfunnel’s expertise in experimentation and augment his resources to scale The Motley Fool’s experimentation program quickly. The relationship has since morphed into a highly collaborative partnership. Today, Nate and James feed off each other’s insights and ideas to develop new tests and experiences together.
The test ideation, optimization conversations, and overall rapport [between us and Widerfunnel] is exceptional. I feel like I’m having these conversations with my colleagues, not an agency.— Nate Wallingsford
Recently, Widerfunnel and The Motley Fool expanded their partnership to help drive testing strategy within The Motley Fool’s product experience. This aligns perfectly with Nate’s priorities for experimentation, which are:
- To enable deeper collaboration between the Marketing and Product teams and unify the new member journey from purchase to product experience
- And to optimize The Motley Fool’s mobile experience and improve monetization
“Things change quickly at The Motley Fool,” explains Nate. “I always try to prioritize the experiments that have the biggest potential business impact. It’s a big part of what has made our program successful, and will be a continued focus for our team.”