How to turn user research into revenue-driving experiments
It can be tempting to rush into conversion optimization. You may have a powerful tool and plenty of ideas drawn from tips, best practices, and whatever your competitors are doing today.
But when you develop test hypotheses grounded in intentional user research and information-gathering, you set your program up for major success.
In this new case study with Frontline Solvers, an analytics software vendor, you’ll learn how a 12-week, in-depth information-gathering and experiment ideation phase led to three consecutive experiments, each delivering a lift of more than 15% in conversions.
A bit of background
Frontline Solvers is a leading vendor of advanced predictive and prescriptive analytics software. It is best known for having developed Solver, a mathematical optimization tool included in Microsoft Excel.
Last year, Frontline Solvers’ President, Daniel Fylstra, realized that the company wasn’t making enough progress toward improving conversion rates on its website. The problem was resources: his team had to focus on other priorities, such as new education offerings and product refinement. But Dan knew that conversion optimization was too important to ignore.
So, he began the search for an ideal solutions partner, and found one in WiderFunnel.
While leadership at Frontline understood the importance of conversion optimization, they wanted to be sure that the optimization program was focused on the company’s most important business problems. So, the first three months of our partnership focused solely on user research and exploration.
We wanted to achieve three main goals:
- Gain a better understanding of Frontline’s online visitors
- Gain a better understanding of how to position Frontline Solvers’ products and services in a competitive marketplace
- Develop strong hypotheses to validate user research findings through experimentation
WiderFunnel conducted a 3-month research-only engagement, followed by three experiments to validate the findings.
During this time, WiderFunnel Strategist Michael St Laurent launched online polls and sent out customer surveys; he explored the impact of psychological triggers; he collected data from clickmaps, scrollmaps, user session recordings, eye-tracking, and more; he broke down Frontline’s user journey; and finally, he presented analyses and recommended hypotheses to test.
The research informed several hypotheses targeting different areas of Frontline’s funnel. Experiment 1 was conducted on the homepage and the product overview page; Experiment 2 on the product detail page; and Experiment 3 on the site-wide navigation. Each of the three experiments aimed to improve user experience and increase conversions on Frontline Solvers’ primary KPI.
And each was an A/B cluster test. We didn’t isolate small changes; instead, we tested dramatic redesigns. And we did so with confidence, because each redesign addressed specific barriers uncovered during the user research phase.
The initial 3-month engagement led to three strong hypotheses that tackled Frontline Solvers’ biggest business problems, each of which increased registrations for Frontline software by over 15% with high confidence.
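As a rough illustration of what “over 15% lift with high confidence” means (the numbers below are hypothetical, not Frontline’s actual data), a result like this is typically checked with a two-proportion z-test comparing the control and variation conversion rates:

```python
from math import erf, sqrt

def lift_and_significance(control_conv, control_n, variant_conv, variant_n):
    """Compute relative lift and a one-tailed p-value via a two-proportion z-test."""
    p1 = control_conv / control_n          # control conversion rate
    p2 = variant_conv / variant_n          # variation conversion rate
    lift = (p2 - p1) / p1                  # relative lift

    # Pooled conversion rate and standard error under the null hypothesis
    pooled = (control_conv + variant_conv) / (control_n + variant_n)
    se = sqrt(pooled * (1 - pooled) * (1 / control_n + 1 / variant_n))

    z = (p2 - p1) / se
    p_value = 1 - 0.5 * (1 + erf(z / sqrt(2)))  # one-tailed, via normal CDF
    return lift, z, p_value

# Hypothetical example: 10,000 visitors per arm, 5.0% vs. 5.9% conversion
lift, z, p = lift_and_significance(500, 10_000, 590, 10_000)
print(f"lift: {lift:.1%}, z: {z:.2f}, p: {p:.4f}")
```

With these made-up counts, an 18% relative lift is statistically significant at well above the 95% confidence level, which is the kind of threshold a “high confidence” win would need to clear.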
Read the full case study to:
- Discover the overarching questions that guided the WiderFunnel Strategy team’s in-depth user research;
- Understand how each of the three hypotheses was informed by different user research tactics;
- See what needed to be improved on the Frontline Solvers website in order to address user frustrations;
- Gain insight into the business problems that were identified through user research and their potential negative impact.
6 mistakes that will derail your testing program (and how to not make them)
In this free 70-page guide, we describe six of the worst mistakes optimization teams make in their design of experiments, and we show you the right approach to take. Get your guide now.