Business need: Better understanding of online visitors and improved conversion rate
Solution: In-depth user research and funnel optimization
Frontline Solvers is a leading vendor of advanced predictive and prescriptive analytics software. It is best known for having developed Solver, a mathematical optimization tool included in Microsoft Excel. But its full product line includes tools for data mining and machine learning, Monte Carlo simulation and risk analysis, and optimization, both inside and outside Excel.
Frontline’s tools have been used by more than 8,500 companies, while more than 500,000 students have learned analytics methods using Frontline’s tools in university MBA programs.
Last year, Daniel Fylstra, President of Frontline Solvers, realized that the company wasn’t making enough progress toward improving conversion rates on its website. The problem was resources: his team had to focus on other priorities, such as new education offerings and product refinement. But Dan knew that conversion optimization was too important to ignore.
So, he began the search for an ideal solutions partner, and found one in WiderFunnel.
While Dan understood the importance of conversion optimization, he wanted to be sure that his optimization program was focused on Frontline’s most important business problems. So, the first three months of our partnership focused solely on the Explore Phase.
The goal was to leverage WiderFunnel’s strategy and research capabilities to produce a set of powerful hypotheses, gain a better understanding of Frontline’s online visitors, and learn how best to position Frontline’s products and services.
The result? The first three experiments launched were all winners, each producing a lift of more than 15% in conversions.
What is the Explore Phase?
The Explore Phase is the first phase in WiderFunnel’s Infinity Optimization Process™. It is all about information-gathering and experiment ideation.
During Explore, an Optimization Strategist gathers insights about your business and customers through many different sources, like:
- Stakeholder interviews. These interviews reveal customer and business insights from your customer-facing teams, like Sales and Customer Support
- Extensive user research. Leveraging clickmaps, scrollmaps, eye tracking, polls, user session recordings, etc.
- Web analytics review. To gain a deeper understanding of your customer journey
- Voice-of-customer analysis. To learn how your users are engaging with and describing your product or service
- Review of past, relevant experiments within WiderFunnel’s Test Archive
All of this data-gathering is centered around the LIFT Model®, which is our framework for understanding your customer’s conversion barriers and persuasion opportunities.
For Frontline, the in-depth Explore phase took place over 12 weeks.
WiderFunnel Strategist Michael St Laurent launched online polls and sent out customer surveys; he explored the impact of psychological triggers; he collected data from clickmaps, scrollmaps, user session recordings, eye tracking, and more; he broke down Frontline’s user journey; and finally, he presented his analyses and recommended hypotheses to test.
What we learned during Explore
Online polls and user surveys
Early in the engagement, Mike got permission from Frontline to run user polls on the company’s product pages. He had a few initial ideas about problems that users were experiencing. But he wanted more direct feedback from the users themselves.
He set up several polls to try to answer the following questions:
- Are customers getting the information they need to choose a product?
- Do customers really understand which product is right for them?
- How could Frontline improve their product pages?
- What are the differentiators that cause customers to choose Frontline over a competitor?
- How do customers feel about payment terms and commitment?
Note: These are not the questions that Mike used in the actual polls. These are the overarching questions he was trying to answer.
After six weeks of collecting feedback, Mike discovered several key takeaways.
One of the most important was that users were unsure about what action they should take on the product pages. There were too many conflicting calls-to-action, and users were split between them. Should they download a free trial, live chat, call Frontline, access pricing, register for access…?
Hypothesis to validate:
- Focusing users on fewer and more direct calls to action will lead to more registrations for Frontline Solver.
Scrollmaps, heatmaps and clickmaps
User engagement tools like heatmaps and clickmaps can help you determine user intent. User engagement with the elements on a web page communicates a lot about which elements they prefer over others. And this information can help you design for probability over possibility.
For Frontline, Mike wanted to know:
- Where do users go instinctively?
- What percentage of users sees specific, important content?
- Is there content that is receiving a lot of prominence on the page, but is not being engaged with?
- When comparing multiple calls-to-action, which are more commonly clicked?
Engagement was tracked on the Frontline homepage, product overview page, and product details page.
The data revealed several compelling takeaways, like:
- On the homepage, the main call-to-action was not attracting clicks.
- Visitors weren’t scrolling down, and were missing the great customer testimonials further down the page.
- On the product page, visitors were clicking all over the place, which indicated that they may have been confused.
These are just three of the insights gathered from user engagement tools (there were many more), but each of them informed the final hypotheses.
Breaking down the customer journey
When looking at Frontline’s buyer cycle, Mike wanted to analyze it against the ‘ideal cycle’.
The WiderFunnel Strategy team scored Frontline out of 10 within each stage of the ‘ideal’ journey to identify areas for improvement. Introduction to Solution (Product Overview), Feasibility Check (Pricing / Trial), and Free Trial (Education) all received relatively low scores.
These stood out as the touchpoints within the buyer journey that Frontline should focus on improving first.
With these insights in mind, Mike began to construct Frontline’s ideal customer journey.
Awareness: Problem Identification (Homepage)
Frontline needed to…
- Clarify that the user is in the right place by clearly stating the types of problems that they solve
- Provide visual cues to help the user identify the business model
- Preview what the end goal will be (Free Trial)
- Visually guide users to the next step: Introduction to solutions
Awareness: Introduction to Solutions (Product Overview Page)
Frontline needed to…
- Present options as solutions first, then features second.
- Communicate that the free trial is the same for the majority of products, to set proper expectations and reduce the user’s cognitive load.
- Guide users to the correct product for their need and advance them to the Detailed Inquiry stage.
Consideration: Detailed Inquiry (Product Details Page)
Frontline needed to…
- Meet the feature requirements of the prospect
- Show that other customers trust the product
- Guide users into pricing / a product trial
Consideration: Feasibility Check (Product Details Page)
Frontline needed to…
- Prove this product is feasible for the user (pricing, terms, etc.)
- Build enough value to make it worth trying
- Go for the hard ask in terms of the product trial
The Validation Phase
Experiment 1: The homepage & product overview page
In the first post-Explore experiment, Mike and the Frontline team wanted to make some big changes to both the homepage and the product overview page.
This was a multi-page A/B test, which pitted two fully redesigned pages against the Control. While we often prefer to test isolated changes, in this case the Strategy team felt confident in clustering the changes, because the redesigned experience was based on insights gathered during the three-month research phase.
Major changes included:
On the homepage
- Changed the hero image to more clearly show the product and reduce distraction caused by the image of a woman looking away from the call-to-action
- Changed the primary call-to-action from a video link to one that guides people to the product overview page
- Increased the prominence of social proof with logos of reputable companies that use Frontline Solvers
- Changed “Tabs” area to clarify one primary call-to-action
- Created call-to-action hierarchy in video area
- Increased the prominence of testimonials (another social proof factor)
On the product overview page
- Attempted to limit the user’s choices, making product selection as simple and guided as possible
- Highlighted one default option for users who may need guidance; then, showcased three other options for users who are looking for something specific
- Eliminated several product options to drive users to the most popular products.
- Added a FAQ section under the assumption that many users are not sure what to do
The redesigned experience led to a 21.53% increase in registrations for Frontline, with 99% confidence.
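As an aside, reported results like “a 21.53% increase at 99% confidence” come from comparing conversion rates between the Control and the variation, typically with a two-proportion z-test. The sketch below shows how those two numbers relate. Note that the visitor and registration counts are hypothetical: the case study does not publish the raw data, so these figures were chosen only to produce a lift in the same ballpark.

```python
# Illustrative sketch only: the counts below are hypothetical, chosen to
# yield a lift similar to the one reported. The case study does not
# publish Frontline's actual traffic or registration numbers.
from math import sqrt, erf

def lift_and_confidence(control_conv, control_n, variant_conv, variant_n):
    """Relative lift and one-sided confidence via a two-proportion z-test."""
    p_c = control_conv / control_n
    p_v = variant_conv / variant_n
    lift = (p_v - p_c) / p_c
    # Pooled standard error under the null hypothesis of no difference
    p_pool = (control_conv + variant_conv) / (control_n + variant_n)
    se = sqrt(p_pool * (1 - p_pool) * (1 / control_n + 1 / variant_n))
    z = (p_v - p_c) / se
    confidence = 0.5 * (1 + erf(z / sqrt(2)))  # one-sided normal CDF
    return lift, confidence

# Hypothetical: 400/10,000 control conversions vs. 486/10,000 variant
lift, conf = lift_and_confidence(400, 10_000, 486, 10_000)
print(f"lift: {lift:.2%}, confidence: {conf:.1%}")
```

The key point is that a large lift alone is not enough; the sample sizes must be large enough for the z-statistic to push confidence past the program’s threshold (WiderFunnel reports 98–99% for these tests) before a variation is declared a winner.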
The new homepage was dramatically more effective at moving users through to the product overview page (77% more users, in fact). Users seemed to respond to the revised above-the-fold content and clear call-to-action.
The large increase in users moving to the product overview page was used as an indicator that the first page of the multi-page experiment was having a positive impact.
We also noted that fewer users were clicking on the tabulated content and tutorial videos. Because more users were moving through the funnel (due to the clear, singular CTA), fewer users were engaging with the content lower on the page.
This is okay! It’s normal to see some degree of goal cannibalization and should be expected. Users can only do one thing at a time, and as long as we are moving clicks to more valuable funnels, that is a success!
– Michael St Laurent, Senior Strategist, WiderFunnel
Experiment 2: The product detail page
The product (platform) detail page was the focus of the second experiment. This test was another A/B test, with the variation being a partial redesign.
This test addressed insights gathered from the user session recordings and eye-tracking analytics: namely, that users were distracted by the large face in the hero image and were not interested in the primary call-to-action offering.
The hypothesis: Clarifying the primary purpose on this page, reducing confusion, and improving the form user interface (UI) will lead to more conversions.
This test led to a 21.74% increase in downloads of the Analytic Solver Platform (Frontline’s most popular product), at 98% confidence.
It is clear that the original page was causing some confusion for many visitors. As indicated by the eye-tracking data, users were unsure of the goal of this page. We saw a huge increase in clicks to the primary call-to-action: from 3% to 10%. This suggested that many users wanted to follow the “Free Trial” trail, but may not have understood how to fulfill that action.
The decrease in interactions with the video call-to-action suggests that it may have been a distraction in its primary position in the Control.
Finally, the login field in the Control lacked prominence. It was positioned below the form, so returning users may have been filling out the form again before realizing they could log in below it. Improving the user interface resulted in more users actually finding the login.
Overall, it is clear the sum of these changes had a positive impact on registrations.
Experiment 3: Site-wide
Users browse and behave differently on websites based on their expectation of a business model. A user should be able to understand the business model and goal of a website within seconds.
Frontline’s website, however, was straddling two business models. It had e-commerce features such as a product drop-down, product style pricing, cart, and search. But it also had SaaS features like a Free Trial, subscription license terms, and a user login.
With the third experiment, the goal was to address the potential issues that were identified during this analysis of Frontline’s business model.
This A/B test featured a redesigned site-wide navigation, meant to set user expectations around what type of business Frontline Solvers actually is.
In the original site-wide navigation, a user could “Contact Us”, explore a “Students” CTA, visit the “Help Desk”, “Live Chat”, search, log in, or browse products, examples, support, or order tabs.
This nav was also lacking a single, key call-to-action, to push users into a profitable funnel from any page on the website.
The redesigned navigation minimized all secondary calls-to-action and clarified the main call-to-action: “Free Trial”. Users could immediately connect “Free Trial” with a subscription-style business model.
The redesigned navigation led to a 15.43% increase in free trial registrations.
With the increased prominence of the Free Trial CTA and the new layout of the navigation bar, more users entered the primary funnel and were able to easily navigate to the “Download Platform” page.
Also, clarifying that a free trial is the outcome may have better set user expectations of what would be required of them later on.
The power of exploration
It can be tempting to rush into conversion optimization. You may have a powerful tool and a lot of ideas based on tips, best practices, what your competitor is doing today…
But when you develop hypotheses to test based on 12 weeks of upfront research and information-gathering about your customers, you set yourself up for major success.
You’ll note that each of these three experiments was an A/B cluster test. We didn’t isolate small changes. Instead, we tested dramatic redesigns. And we did so with confidence because each redesign was addressing specific barriers that were discovered during Explore.
Frontline was able to take giant strides relatively quickly, with minimal risk of a failing variation. That’s the power of Explore.
This engagement with WiderFunnel was far more effective than our own previous conversion optimization efforts – and that was clearly a result of their up-front exploratory research, application of their experience with many previous tests, and their systematic approach. Thanks to careful goal definition and data gathering, we learned a lot about user behavior from each test, beyond the significant “lift” in free trials of our software. This has made an immense positive impact on our marketing.
– Daniel Fylstra, President, Frontline Systems Inc (Frontline Solvers®)
Discover how your experimentation program stacks up!
Benchmark your experimentation maturity with our new 7-minute maturity assessment and get proven strategies to develop an insight-driven growth machine.
Get started