This article will help you:

- Find the Optimizely Results page
- Understand and analyze conversion rate, improvement, difference interval, and statistical significance
- Create annotations within the results
- Filter results by date and change the variation that's used as the baseline
- Export results and create custom views of your data for reporting
- Make decisions about the data you see on the Results page
Optimizely's Results page allows you to see your experiments' performance and explore your visitors' behavior. You can use the results on this page to make key business decisions based on the data you see.
Want to see a quick overview of the Results page? Watch our short overview video.
Results page features (including segmentation, custom reports, and annotation) vary based on your Optimizely plan type. If what you see doesn’t match what’s in this article, or you want to learn more about what’s available, please refer to our pricing matrix.
Fundamentally, on the Results page, you’re looking at how each variation performed, measured by each goal in your experiment. The variation that was most successful at achieving the goal is the winner.
From the Editor: To view the current results for any experiment, click Options > View Results.
From the Home page: Select the experiment whose results you wish to see, then click Results in the right-hand sidebar.
Either way, you'll be taken to the Results page for that experiment. On the Results page, you'll see:
- A performance summary showing high-level results for your experiment
- Goal modules for each goal that you used to track success in your experiment, which may be expanded to show more information
The performance summary shows you:
- Unique Visitors: The number of unique visitors who were exposed to your experiment (this is based on visitors who received the optimizelyEndUserID cookie)
- Days Running: How many days the experiment has been running and when it was started
- Variations and Goals: Each variation's visitors and how each variation performed compared to the baseline for your goals
By default, each goal module will display unique conversions, total number of visitors, conversion rates, difference intervals, improvement, and statistical significance for each variation. You can also view total conversions and conversions per visitor by clicking Totals on the goal module.
Variations in green can be considered clear winners compared to the baseline, while variations in orange can be considered clear losers.
For each module, you'll see:
- Visitors: The number of unique visitors who reached this variation, based on visitors who received the optimizelyEndUserID cookie.
- Unique Conversions: The number of unique visitors who triggered this goal. By default, Optimizely deduplicates conversions, so even if a visitor triggers a goal multiple times, it only counts once.
- Conversion rate: The percentage of unique visitors who saw this variation and triggered this goal. This measures how often your variation was successful at achieving the conversion goal.
- Total Conversions: You can toggle a goal module to show total conversions rather than unique conversions by clicking Totals.
- Conversions per Visitor: You can toggle a goal module to show conversions per visitor rather than conversion rate by clicking Totals.
- Difference Interval: A range of likely values for the true difference in conversion rate that you would see if you actually implemented this variation for all your visitors.
- Improvement: The relative difference between the observed conversion rate of a variation and the observed baseline conversion rate, expressed as a percentage -- for example, if your baseline conversion rate is 5% and your variation conversion rate is 10%, your improvement is 100%.
- Statistical significance (formerly known as "Chance to beat baseline"): This essentially measures how confident we are that the difference we see in conversion rate between the Original and Variation is due to the changes you made on the page and is not due to randomness. Until Optimizely has enough data for statistical confidence, the Results page will tell you that we need more data. Optimizely will declare a winner when statistical significance is above the threshold you set in your Project Settings.
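To make the relationship between these headline numbers concrete, here is a minimal sketch of the arithmetic described above (illustrative Python with made-up counts, not Optimizely's internal implementation):

```python
# Illustrative definitions of conversion rate and improvement,
# matching the descriptions above. Visitor and conversion counts
# are hypothetical.

def conversion_rate(unique_conversions: int, visitors: int) -> float:
    """Fraction of unique visitors who triggered the goal."""
    return unique_conversions / visitors

def improvement(variation_rate: float, baseline_rate: float) -> float:
    """Relative lift of a variation over the baseline, as a percentage."""
    return (variation_rate - baseline_rate) / baseline_rate * 100

baseline = conversion_rate(50, 1000)    # 5% baseline conversion rate
variation = conversion_rate(100, 1000)  # 10% variation conversion rate

print(f"Improvement: {improvement(variation, baseline):.0f}%")  # Improvement: 100%
```

This matches the example in the Improvement definition: doubling a 5% conversion rate to 10% is a 100% improvement, because improvement is relative to the baseline, not an absolute difference in percentage points.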
Total versus Unique Conversions
Whereas the default Uniques metric set captures how many unique visitors converted on a single goal (i.e., de-duplicated conversions), Totals displays the total number of conversions per unique visitor.
This metric set allows you to easily measure repeat conversions with elements on your page, rather than just a single conversion. For goals in which repeat conversions are common or desired (such as how many total times was a social sharing button clicked), this could be a revealing lens to apply.
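The distinction between Uniques and Totals can be sketched with a hypothetical event log of conversion events (illustrative Python, not how Optimizely stores data):

```python
# Hypothetical conversion events for one goal: (visitor_id, goal_name).
events = [
    ("visitor-1", "share_click"),
    ("visitor-1", "share_click"),  # repeat conversion by the same visitor
    ("visitor-2", "share_click"),
    ("visitor-1", "share_click"),
]

# Totals: every conversion event counts.
total_conversions = len(events)

# Uniques: each visitor counts at most once (deduplicated).
unique_conversions = len({visitor for visitor, _ in events})

print(total_conversions)   # 4 under Totals
print(unique_conversions)  # 2 under Uniques
```

For a social-sharing button, the Totals view (4) captures the repeat clicks that the Uniques view (2) deliberately hides.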
Goal modules by default list the variations in the order they were created, oldest first. For experiments with 10 or more variations, such as multivariate tests (MVT), the table is instead sorted in descending order by Statistical Significance. With 10+ variations, you can also change the sort order of the table by clicking a column heading -- i.e. sort by Improvement, Conversions, Total Revenue, and so forth.
One goal you will see is Engagement. Engagement is the percentage of visitors who clicked any link or submitted any form while viewing a page on which this experiment was running. We recommend you add your own goals that tie to your business objectives before you start the experiment. You will then see these goals on the Results page.
For more information on how to make sense of the numbers that you see, read our articles on Optimizely's Stats Engine.
See our article on default mobile goals for more information on these goals: Average Session Length, Sessions Per Visitor Per Week, and Retention.
And if you'd like to launch a "winning" variation from our Results page, please see our article on taking action from your results.
Charts and Annotation
Click Show Charts to see a chart of conversion rate over time.
Charts can be filtered by time by clicking and dragging the area below the chart. You can also create more focused visualizations for stakeholders by turning off variations. Just click the colored button by the variation you want to hide.
Charts can also be toggled to look at other metrics and visitor patterns.
The charts allow you to see how different factors affect your experiment results. The default chart (Conversion Rate) shows the cumulative conversion rate over the life of your experiment, but filtering by Visitors may reveal weekday/weekend traffic or abnormal spikes in traffic that you wouldn't have seen in the cumulative chart. Or filtering by Conversions may reveal interesting conversion patterns that were not revealed in the Conversion Rate chart.
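The "cumulative" behavior of the default Conversion Rate chart can be sketched as follows (illustrative Python with hypothetical daily counts; Optimizely computes this for you):

```python
# Each point on the cumulative chart is conversions-to-date divided
# by visitors-to-date, so daily swings are smoothed out over time.
from itertools import accumulate

daily_visitors = [100, 120, 90, 150]   # hypothetical daily traffic
daily_conversions = [5, 9, 4, 12]      # hypothetical daily conversions

cum_visitors = list(accumulate(daily_visitors))
cum_conversions = list(accumulate(daily_conversions))

cumulative_rate = [c / v for c, v in zip(cum_conversions, cum_visitors)]
print(cumulative_rate[0])  # day 1: 5 / 100 = 0.05
```

This is why switching the chart to Visitors or Conversions can be revealing: the raw daily series shows weekday/weekend patterns and traffic spikes that the smoothed cumulative rate hides.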
You can view revenue in the following ways:
- Revenue: provides a graphical rundown of revenue per day, through the lifetime of the experiment. Use this chart to identify when certain spikes in revenue occur, or days of the week when the revenue goal peaks.
- Revenue distribution: displays the distribution of revenue events across purchase amounts, for each variation. Use revenue distribution to identify whether changes in revenue are driven by a large number of visitors making small purchases, or a small number of visitors making large purchases.
You can display the distribution by purchase amount to quickly identify outliers. Or, you can bucket revenue results by quartile to see where the biggest impact is coming from.
- Revenue per visitor: tracks how much revenue has been recorded for an experiment, divided by the number of visitors up to that point. This can be a good indicator of whether revenue through the lifespan of the experiment is trending upwards or downwards between variations.
- Revenue per paying visitor: shows how one variation performs in terms of total revenue per paying visitor, compared to another. Optimizely calculates statistical significance for revenue per paying visitor.
- Purchases: this tracks the number of purchases visitors make. Optimizely calculates statistical significance for purchases.
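As a rough sketch of how the per-visitor revenue metrics above relate (illustrative Python with a hypothetical list of per-visitor revenue amounts, where 0 means no purchase):

```python
# Hypothetical revenue recorded per visitor in one variation.
revenue_by_visitor = [0.0, 25.0, 0.0, 80.0, 0.0, 15.0]

total_revenue = sum(revenue_by_visitor)
visitors = len(revenue_by_visitor)
paying_visitors = [r for r in revenue_by_visitor if r > 0]

# Revenue per visitor: total revenue over ALL visitors.
revenue_per_visitor = total_revenue / visitors

# Revenue per paying visitor: total revenue over purchasers only.
revenue_per_paying_visitor = total_revenue / len(paying_visitors)

# Purchases: the count of visitors who bought something.
purchases = len(paying_visitors)

print(revenue_per_visitor, revenue_per_paying_visitor, purchases)
```

The same total revenue yields different answers for the two metrics, which is why comparing them helps you tell "many small purchases" apart from "few large purchases".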
Enhanced revenue reports (which include revenue distribution, revenue per paying visitor, and statistical calculations for revenue per visitor and purchases) are available for all Optimizely plan types for web and mobile app testing. Note that this view is only available in experiments that started on January 25, 2016 or later.
If you’re not already tracking revenue, enable revenue tracking in Optimizely to build these reports.
Sometimes it's helpful to make note of certain events that affect your experiment results, such as:
- Experiment-related changes (like pausing the experiment, or making changes to a running experiment -- which we don't recommend)
- External changes like spikes in traffic, redesigns, marketing campaigns, or site downtime