The Results page

This article will help you:
  • Find the Optimizely Results page
  • Understand and analyze conversion rate, improvement, difference interval, and statistical significance
  • Create annotations within the results
  • Filter results by date and change the variation that's used as the baseline
  • Export results and create custom views of your data for reporting
  • Make decisions about the data you see on the Results page

Optimizely's Results page allows you to see your experiments' performance and explore your visitors' behavior. You can use the results on this page to make key business decisions based on the data you see.

Want to see a quick overview of the Results page? Watch our short overview video. You can find more short videos about Results page features in our help video gallery, or jump down for a quick video about advanced concepts on the Results page.

Note:

Results page features (including segmentation, custom reports, and annotation) vary based on your Optimizely plan type. If what you see doesn’t match what’s in this article, or you want to learn more about what’s available, please refer to our pricing matrix.

You may also want to learn how to segment your results, make business decisions based on results, or share results with your broader team.

Fundamentally, on the Results page, you’re looking at how each variation performed, measured by each goal in your experiment. The variation that was most successful at achieving the goal is the winner.

From the Editor: To view the current results for any experiment, click Options > View Results.

From the Home page: Select the experiment whose results you wish to see, then click Results in the right sidebar.

On the Results page, you'll see:

  • A performance summary showing high-level results for your experiment
     
  • Goal modules for each goal that you used to track success in your experiment, which may be expanded to show more information

Performance Summary

The performance summary shows you:

  • Unique Visitors: The number of unique visitors who were exposed to your experiment. This is based on visitors who received the optimizelyEndUserID cookie (see the sketch after this list).
     
  • Days Running: How many days the experiment has been running and when it was started
     
  • Variations and Goals: Each variation's visitors and how each variation performed compared to the baseline for your goals
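
If you want to confirm whether a given visitor would be counted, here is a minimal browser-console sketch (illustrative only, not an official Optimizely API) that checks for the optimizelyEndUserID cookie:

```ts
// Minimal sketch: check whether the current visitor has received the
// optimizelyEndUserID cookie and would therefore be counted as a unique
// visitor. Illustrative only; not an official Optimizely API.
function getOptimizelyEndUserId(): string | null {
  const pair = document.cookie
    .split("; ")
    .find((entry) => entry.startsWith("optimizelyEndUserID="));
  return pair ? decodeURIComponent(pair.split("=")[1]) : null;
}

const visitorId = getOptimizelyEndUserId();
console.log(visitorId ? `Counted as visitor ${visitorId}` : "No cookie set yet");
```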

Goal Modules

By default, each goal module will display unique conversions, total number of visitors, conversion rates, difference intervals, improvement, and statistical significance for each variation. You can also view total conversions and conversions per visitor by clicking Totals on the goal module.

Variations in green can be considered clear winners compared to the baseline, while variations in orange can be considered clear losers.

For each module, you'll see:

  • Visitors: The number of unique visitors who reached this variation, based on visitors who received the optimizelyEndUserID cookie.
     
  • Unique Conversions: The number of unique visitors who triggered this goal. In this mode, Optimizely deduplicates conversions, so even if a visitor triggers a goal multiple times, it will only count once.
     
  • Conversion rate: The percentage of unique visitors who saw this variation and triggered this goal. This measures how often your variation was successful at achieving the conversion goal.
     
  • Total Conversions: You can toggle a goal module to show total conversions rather than unique conversions by clicking Totals.
  • Conversions per Visitor: You can toggle a goal module to show conversions per visitor rather than conversion rate by clicking Totals.  
  • Difference Interval: The range of values that the true difference in conversion rate between this variation and the baseline is likely to fall within -- that is, the range of results you could expect to see if you actually implemented this variation.
     
  • Improvement: The relative difference between the observed conversion rate of a variation and the observed baseline conversion rate, expressed as a percentage. For example, if your baseline conversion rate is 5% and your variation conversion rate is 10%, your improvement is 100% (see the worked sketch after this list).
     
  • Statistical significance (formerly known as "Chance to beat baseline"): This essentially measures how confident we are that the difference we see in conversion rate between the Original and Variation is due to the changes you made on the page and is not due to randomness. Until Optimizely has enough data for statistical confidence, the Results page will tell you that we need more data. Optimizely will declare a winner when statistical significance is above the threshold you set in your Project Settings.
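
To make the arithmetic behind Conversion rate and Improvement concrete, here is a worked sketch (hypothetical names and numbers; statistical significance itself comes from Optimizely's Stats Engine, not from this math):

```ts
// Illustrative sketch of the conversion rate and improvement arithmetic
// shown on the Results page. Names and numbers are hypothetical;
// significance is computed by Optimizely's Stats Engine, not shown here.
interface VariationStats {
  visitors: number;          // unique visitors bucketed into this variation
  uniqueConversions: number; // de-duplicated conversions for this goal
}

function conversionRate(v: VariationStats): number {
  return v.uniqueConversions / v.visitors;
}

// Improvement: relative lift of the variation over the baseline, in percent.
function improvement(baseline: VariationStats, variation: VariationStats): number {
  const base = conversionRate(baseline);
  return ((conversionRate(variation) - base) / base) * 100;
}

// The article's example: 5% baseline vs. 10% variation => 100% improvement.
const baseline: VariationStats = { visitors: 1000, uniqueConversions: 50 };
const variation: VariationStats = { visitors: 1000, uniqueConversions: 100 };
console.log(improvement(baseline, variation)); // 100
```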

Total versus Unique Conversions

Optimizely also allows you to view both unique and total conversions.
 
 

Whereas the default Uniques metric set captures how many unique visitors converted on a given goal (i.e., de-duplicated conversions), Totals displays the total number of conversions and the number of conversions per visitor.

This metric set allows you to measure repeat conversions on elements of your page, rather than just a single conversion per visitor. For goals where repeat conversions are common or desired (such as the total number of times a social sharing button was clicked), this can be a revealing lens to apply.
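
To illustrate the difference, here is a minimal sketch (hypothetical event data; not Optimizely's data model) showing how unique and total conversions diverge when a visitor converts more than once:

```ts
// Hypothetical conversion events for one goal; not Optimizely's data model.
interface ConversionEvent {
  visitorId: string; // e.g., the value of the optimizelyEndUserID cookie
  goalId: string;
}

const events: ConversionEvent[] = [
  { visitorId: "a", goalId: "share-click" },
  { visitorId: "a", goalId: "share-click" }, // repeat conversion by visitor "a"
  { visitorId: "b", goalId: "share-click" },
];

// Totals: every conversion counts, including repeats.
const totalConversions = events.length; // 3

// Uniques: each visitor counts at most once per goal.
const uniqueConversions = new Set(events.map((e) => e.visitorId)).size; // 2
```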

 
Note:

Goal modules by default list the variations in order from oldest to newest. For experiments with 10 or more variations, such as multivariate tests (MVT), the table is instead sorted in descending order by Statistical Significance. In that scenario, you can also change the sort order by clicking a column heading -- for example, sort by Improvement, Conversions, or Total Revenue.

 
Tip:

One goal you will see is Engagement. Engagement is the percentage of visitors who clicked any link or submitted any form while viewing a page on which this experiment was running. We recommend you add your own goals that tie to your business objectives before you start the experiment. You will then see these goals on the Results page.

For more information on how to make sense of the numbers that you see, read our articles on Optimizely's Stats Engine.

See our article on default mobile goals for more information on these goals: Average Session Length, Sessions Per Visitor Per Week, and Retention.

And if you'd like to launch a "winning" variation from our Results page, please see our article on taking action from your results.

Charts and Annotation

Click Show Charts to see a chart of conversion rate over time. 

 

Charts can be filtered by time by clicking and dragging the area below the chart. You can also create more focused visualizations for stakeholders by turning off variations. Just click the colored button by the variation you want to hide.

 
 

Charts can also be toggled to look at other metrics and visitor patterns.


The charts allow you to see how different factors affect your experiment results. The default chart (Conversion Rate) shows the cumulative conversion rate over the life of your experiment. Switching the chart to Visitors may reveal weekday/weekend traffic patterns or abnormal spikes that the cumulative chart would hide, and switching to Conversions may reveal conversion patterns that the Conversion Rate chart does not.

You can filter revenue in the following ways:

  • Revenue: provides a graphical rundown of revenue per day over the lifetime of the experiment. Use this chart to identify when spikes in revenue occur, or days of the week when the revenue goal peaks.
     
  • Revenue distribution: displays the distribution of revenue events across purchase amounts, for each variation. Use revenue distribution to identify whether changes in revenue are driven by a large number of visitors making small purchases, or a small number of visitors making large purchases.

    You can display the distribution by purchase amount to quickly identify outliers. Or, you can bucket revenue results by quartile to see where the biggest impact is coming from.
     
  • Revenue per visitor: tracks how much revenue has been recorded for an experiment, divided by the number of visitors up to that point. This can be a good indicator of whether revenue over the lifespan of the experiment is trending upwards or downwards between variations (see the sketch after this list).
     
  • Revenue per paying visitor: shows how one variation performs in terms of total revenue per paying visitor, compared to another. Optimizely calculates statistical significance for revenue per paying visitor.
     
  • Purchases: tracks the number of purchases visitors make. Optimizely calculates statistical significance for purchases.
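
To clarify how the two per-visitor revenue metrics above differ, here is a minimal arithmetic sketch (hypothetical numbers; not Optimizely's implementation):

```ts
// Hypothetical numbers illustrating revenue per visitor vs. revenue per
// paying visitor; not Optimizely's implementation.
const visitors = 1000;        // all visitors bucketed into the variation
const payingVisitors = 50;    // visitors who triggered a revenue event
const totalRevenue = 2500;    // in your currency units

const revenuePerVisitor = totalRevenue / visitors;             // 2.5
const revenuePerPayingVisitor = totalRevenue / payingVisitors; // 50
```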
     
 
Note:

Enhanced revenue reports (which include revenue distribution, revenue per paying visitor, and statistical calculations for revenue per visitor and purchases) are available for all Optimizely plan types for web and mobile app testing. Note that this view is only available for experiments started on or after January 25, 2016.

If you’re not already tracking revenue, enable revenue tracking in Optimizely to build these reports.
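
For reference, Optimizely Classic tracks revenue by pushing a trackEvent call with a revenue value in cents. The snippet below is a sketch of that pattern ("purchase" is a hypothetical event name); confirm the details against the revenue tracking article for your setup:

```ts
// Sketch of revenue tracking via the Optimizely Classic JavaScript API:
// push a trackEvent with the revenue value in cents. "purchase" is a
// hypothetical event name; see the revenue tracking article for specifics.
const w = window as any;
w.optimizely = w.optimizely || [];
w.optimizely.push(["trackEvent", "purchase", { revenue: 1999 }]); // $19.99
```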

Sometimes it's helpful to make note of certain events that affect your experiment results, such as:

  • Experiment-related changes (like pausing the experiment, or making changes to a running experiment -- which we don't recommend)
     
  • External changes like spikes in traffic, redesigns, marketing campaigns, or site downtime

Enterprise users can annotate the Results page graphs to indicate these changes. Annotations are displayed to all collaborators on your account, so they are an effective tool for sharing information about events that may affect your results. To annotate a results chart:
  1. Expand a goal module by clicking the Show Chart button.
     
  2. Hover over the graph where you'd like to add an annotation, then click the plus icon.


     
  3. Write your annotation, then click Add to complete it.


     
  4. Now, whenever you click View Annotations in the Goal Module, you'll see a vertical line on the chart to indicate that there is an annotation. You'll also see a panel appear on the right with a list of all annotations, most recent listed first.




     
  5. To delete an annotation, click Delete underneath the annotation you wish to delete.

Filtering and exporting data

Choosing Your Date Range

At the top of the Results page, you will see a filter to set the date range for your report. Changing the date range will show you experiment results for the selected period of time.

To select an entire 24-hour period, select from 12:00am on Day 1 until 12:00am on Day 2. If you select only 12:00am to 12:00pm on the same day, you will capture only a 12-hour period.

 

For example, selecting from 00:00 on July 16th until 00:00 on July 17th will capture a full 24-hour period.

 
Note:

Optimizely calculates data for the selected date range as if the experiment had run only over that period. For example: if the same visitor accessed an experiment on Day 1 and Day 2, results for the experiment's full lifespan would show one unique visitor. When filtering for Day 1 only, that visitor is counted as a unique visitor; likewise, when filtering for Day 2 only, the visitor is again counted as a unique visitor, because they entered the experiment on both days.
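
As a rough illustration of this per-range recalculation (hypothetical data model; not Optimizely's implementation):

```ts
// Hypothetical sketch of per-date-range unique visitor counting;
// not Optimizely's implementation.
interface Visit {
  visitorId: string;
  day: number; // experiment day on which the visit occurred
}

const visits: Visit[] = [
  { visitorId: "a", day: 1 },
  { visitorId: "a", day: 2 }, // the same visitor returns on Day 2
];

function uniqueVisitors(visits: Visit[], fromDay: number, toDay: number): number {
  const inRange = visits.filter((v) => v.day >= fromDay && v.day <= toDay);
  return new Set(inRange.map((v) => v.visitorId)).size;
}

console.log(uniqueVisitors(visits, 1, 2)); // 1: full lifespan, de-duplicated
console.log(uniqueVisitors(visits, 1, 1)); // 1: counted when filtering Day 1
console.log(uniqueVisitors(visits, 2, 2)); // 1: counted again when filtering Day 2
```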

 

Comparing filtered date ranges with non-filtered date ranges

If you compare the results page graphs for the entire period the experiment was running versus a custom date range, you may see some differences. The reason for this is that the results page loads 100 data points per time series each time you request a new date range. If the experiment results span 25 days, then there are approximately 4 data points per day. If you then select a single day using the custom date range picker, you will have 100 data points for that single day, which gives a greater level of granularity.
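
A quick sketch of that granularity arithmetic (illustrative only):

```ts
// Illustrative: the Results page loads roughly 100 data points per time
// series for whatever date range you request.
const POINTS_PER_SERIES = 100;

function pointsPerDay(rangeInDays: number): number {
  return POINTS_PER_SERIES / rangeInDays;
}

console.log(pointsPerDay(25)); // 4 points per day across a 25-day experiment
console.log(pointsPerDay(1));  // 100 points for a single selected day
```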

Additionally, with respect to how results are calculated, the Results page only takes into account data points that fall within the date range you have selected. Suppose a visitor converted once on August 1st and once on August 3rd for the same goal. If you look at the graph for the entire time the experiment was running, you will see that conversion point on August 1st. If you then select a custom date range starting on August 2nd, the conversion will be counted towards August 3rd instead.

 
Note:

If you zoom into the date range instead of selecting a date range using the date range picker, the conversion will remain attributed to August 1st. In other words, when zooming, we don't recalculate the conversion rate like we do when selecting a custom date range window - we simply zoom in on it! 

Choosing Your Baseline Variation

Let’s say you don’t want to compare the results of your variations to your original page. Maybe you want to compare their performance to a certain variation. Click the drop-down under Baseline: in the top-right of the Results page, and you can use a different variation as your baseline.

For a multivariate experiment (MVT), you can select which combination of sections and variations you want to use as your baseline from this dropdown. You can also use this drop-down to hide entire sections from results, by unchecking a checkbox.

Exporting CSV and Sharing Reports

At the top of the Results page, you have the ability to export your results in CSV (spreadsheet) format, or to share the report that you are currently viewing. Clicking Share gives you the option to download a CSV or generate a link that points to the current view.

Filtering out specific visitor IP ranges

You can exclude visitors from certain IP ranges, such as your internal employees, from affecting the results of your experiment. To learn how, read our article on IP filtering.

Results views

All plan types can see default views so you can visualize and interpret results. Enterprise customers can also create and share custom views. Results views help you package and share results data in ways that are interesting to other stakeholders in your organization.

In the right-hand column of the screen, you'll see the default views available:

  • All goals - Shows each goal in your experiment, sorted by variations
     
  • Revenue - Shows several views of revenue, sorted by variations
  • All variations - Shows each variation in your experiment, sorted by goals 
     
  • Primary goal - Shows only the goal you've selected as your primary goal -- when you select this view, you'll be able to select your primary goal from a dropdown


     
  • Winning and losing goals - Shows only goals that currently have winning or losing variations (not inconclusive)

Enterprise customers can also create custom views by clicking Add View. This displays a set of add (+) buttons that let you add and compare specific goals and/or variations.

When you save a view, you can also use filters (described above) to customize the view, for example by date range or visitor segment. As you're creating a new custom view, you can also choose specific segments for these goals or variations (see the segmentation section above).

Custom Views are especially useful when paired with Custom Dimensions, because the two in tandem allow you to segment your results and present the findings.

 
Tip:

Did you know that you can segment your results to see how your experiments performed for certain groups of visitors, rather than the whole? Check out our article on segmentation to learn more about how you can segment based on Audiences, Dimensions, or data from your other platforms!

What you may not know about the Results page

Learn more about statistical significance, risk mitigation, and how to read and report on results correctly, in just under 7 minutes.