Relevant products:
  • Optimizely X Web Experimentation
  • Optimizely X Web Personalization
  • Optimizely Classic
  • Optimizely X Mobile
  • Optimizely X OTT
  • Optimizely X Full Stack
  • Optimizely Classic Mobile

This article will help you:
  • Identify discrepancies with third-party data
  • Implement best practices to avoid data discrepancies

Optimizely offers many out-of-the-box analytics integrations, developer tools for building custom integrations, and many partner-built third-party integrations. With all of the analytics platforms you can use to measure the impact of your A/B tests, you’ll quickly find that your numbers will not match perfectly from one platform to the next.

If you’re seeing a discrepancy in your data, first make sure you’re following some best practices to ensure that all involved platforms and datasets are measuring the same thing. If your datasets are not based on the same information, user set, or activity, they will probably not align.

Common Causes

The vast majority of data discrepancies that we’ve seen come down to the root causes covered in this section. If you aren’t comparing the same data, it probably won’t match up. Below are some best practices to follow to ensure that your data can be compared to Optimizely results. You may find that you can get the results to align better just by adjusting your reports.

Not Integrated

If you don’t have an integration between Optimizely and your other platforms, there should be no expectation that they will match. Users need to be tagged with Optimizely experiment and variation information when they are exposed to an A/B test, and your other platforms’ data should be filtered to show only the data that contains that experiment and variation information.

We offer many out-of-the-box integrations that can be enabled easily. If there is no integration for the platform you want, you or your developers can create a custom analytics integration following this guide: https://developers.optimizely.com/x/integrations/#custom-analytics-beta-
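For a sense of what such an integration looks like, here’s a minimal sketch using the Web snippet’s addListener API to forward decisions to another platform. Here, sendToAnalytics is a placeholder for your platform’s tracking call, and the exact decision fields may vary by snippet version.

    window["optimizely"] = window["optimizely"] || [];
    window["optimizely"].push({
      type: "addListener",
      filter: { type: "lifecycle", name: "campaignDecided" },
      handler: function (event) {
        // Tag the user in your other platform with experiment/variation info.
        var decision = event.data.decision;
        sendToAnalytics("optimizely_decision", {
          campaignId: decision.campaignId,
          experimentId: decision.experimentId,
          variationId: decision.variationId
        });
      }
    });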

Filters

Optimizely and other platforms have many options for results segmentation. You can drill your results down to different categories of users and attributes such as web browser, location, language, plan type, etc. Reports can also be narrowed to specific date and time ranges, and results from certain IPs can be filtered out. Check the following for consistency from platform to platform:

  • User attributes
  • Date / Time range
  • IP Filtering
  • Bot Filtering

User scope

Optimizely X Web Experimentation, Full Stack, Mobile, and OTT calculate results based on unique visitor counts, whereas Optimizely X Web Personalization computes results based on unique sessions. You can read more about how visitors are counted here. Other platforms may count results differently, resulting in different counts for users who are tagged as having seen an Optimizely experiment. It’s important to understand how each platform counts so you can account for any differences.

For example, Optimizely and Google Analytics define a “visitor” differently.

  • Google Analytics uses a tracking call that is session-based, meaning a single visitor can trigger multiple visits over a given period of time (GA Support Article)

  • Optimizely, on the other hand, uses a 10-year cookie and counts unique visitors.

Events

If you have similar visitor counts but different conversion counts for an event, you’ll want to take a closer look at the event in each platform. Two similarly named events aren’t necessarily tracking the exact same action taken by users.

For example, you may have a “Signup completed” event tracking a form submission. In Optimizely, you may have this configured as a “submit” button click metric or a confirmation page view metric. While each of these events represents the same thing — the user submitted the form — they are not tracking the same action, which can lead to a discrepancy. The user may have clicked the submit button without having filled out the form, or maybe he or she submitted the form by pressing “enter”, bypassing the button click event.

If you need to, check with your engineers to see how each event is tracked. Make sure your events are tracking the same thing in the same way; if they don’t, you will have dissimilar data.
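As an illustration, here’s a hypothetical wiring that would produce exactly this kind of discrepancy: Optimizely counts every click on the submit button, while the other platform counts form submissions. The element selectors and otherAnalytics.track are placeholders.

    // Optimizely counts a click, even if validation fails or the form is
    // never actually submitted.
    document.querySelector("#signup-submit").addEventListener("click", function () {
      window["optimizely"] = window["optimizely"] || [];
      window["optimizely"].push({ type: "event", eventName: "signup_completed" });
    });

    // The other platform counts submits, including users who press Enter
    // and never click the button.
    document.querySelector("#signup-form").addEventListener("submit", function () {
      otherAnalytics.track("Signup completed"); // placeholder API
    });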

Audience

In Optimizely, a visitor’s actions only count toward A/B tests for which they still meet the audience conditions. If your other analytics platform continues to track the user after he or she no longer meets the audience condition, then our results page may show lower total conversions.

Attribution

Optimizely’s attribution model may differ from that of other analytics platforms. Read about Optimizely's attribution model.

Optimizely has decision-first counting, which means we only count conversions from users who have previously sent us a decision event. When an A/B test activates, the decision event fires. If a conversion event fires before the decision, it won’t be counted on the results page, but it may be counted in your other analytics platform.
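To make the ordering concrete, here’s a minimal sketch with the JavaScript (Node) SDK; the experiment and event keys are hypothetical, and the datafile is assumed to be fetched elsewhere.

    const optimizelySDK = require("@optimizely/optimizely-sdk");
    const client = optimizelySDK.createInstance({ datafile: datafile });

    // Counted on the results page: the decision event fires first.
    client.activate("checkout_test", "user123");
    client.track("purchase", "user123");

    // Not counted on the results page: the conversion arrives before any
    // decision event for this user, though another analytics platform
    // tracking the same action would still record it.
    client.track("purchase", "user456");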

Segmentation values may have differing attribution as well. In the article above you’ll see how we attribute segment values at the user level using the most recent value for each segment in each session.

Other Discrepancies

Discrepancies caused by differences in event tracking probably can’t be resolved without making adjustments and running a new experiment. A new experiment can track new information about visitors that might aid a future investigation into the root cause of a data discrepancy. The best practice for these types of discrepancies is to run an A/A test, make adjustments, and then run another A/A test to verify that the issue has been corrected.

Bots

Optimizely X Web Experimentation, as well as many other analytics platforms, offers bot filtering for results, but Optimizely X Full Stack does not. If you’re seeing higher counts in an Optimizely X Full Stack experiment than in your other analytics, it could be that they are filtering out bots and Optimizely is not.

If you want to filter bots for Optimizely X Full Stack, you can use getVariation() for bots and activate() for real visitors, as shown in the sketch below. That forgoes sending an impression event for bots, so they will not be counted on the results page.
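Here’s a minimal sketch of that approach in a Node web server, assuming a simple user-agent check (your bot detection will likely be more sophisticated); the experiment key is hypothetical.

    // client is an Optimizely SDK instance (createInstance({ datafile })).
    function bucketRequest(client, req, userId) {
      const userAgent = req.headers["user-agent"] || "";
      if (/bot|crawler|spider/i.test(userAgent)) {
        // Bots are bucketed consistently but send no impression event.
        return client.getVariation("homepage_test", userId);
      }
      // Real visitors are bucketed and send the impression event.
      return client.activate("homepage_test", userId);
    }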

Content blockers

Many internet users these days use content blockers such as Adblock or Ghostery to block trackers and advertisers. These content blockers are capable of blocking any client-side tracker, including Optimizely X Web Experimentation. Content blockers cannot, however, block any server-side experimentation you do using Optimizely X Full Stack: since the impression event and tracking happen in the back end, the visitor’s client-side content blocking has no effect. This could be a reason why Optimizely X Full Stack would count higher than a third-party platform. It’s valuable to know what portion of your end users have content-blocking extensions enabled.

Timing

A difference in counts can also come down to the gap between when Optimizely counts the visitor and when the other platform does. During the page load process, the visitor’s web browser requests resources in the page and sends information to various providers as time elapses. Only when all requests are complete and the browser has rendered the page with all of its images is the page completely finished loading.

The best practice for Optimizely X Web Experimentation is to have your snippet be one of the first resources in your page (near the top of the <head> tag) and to load it synchronously (blocking). We recommend synchronous loading because we want the visual edits you make in your A/B tests to be ready to display before the original content renders. This helps avoid a problem where the original version of the page shows before the variation renders (a flash of original content).
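In practice, the placement looks something like this (the project ID in the snippet URL is a placeholder):

    <head>
      <!-- Optimizely snippet: synchronous, near the top of <head>. -->
      <script src="https://cdn.optimizely.com/js/12345678.js"></script>
      <!-- ...the rest of your head content... -->
    </head>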

Since Optimizely is one of the first things to run, we activate A/B tests and evaluate page targeting right away, then begin making asynchronous requests to fire impression events and page views as soon as possible. If you’re running Optimizely X Full Stack in your web server, events may fire before the server has even responded to the visitor’s browser, before any page resources start to download.

General analytics scripts do not need to be one of the first things to load; in fact, Google Analytics recommends implementing after the <head> tag. This means Optimizely counts a visitor a considerable amount of time before most third-party analytics do. During that window, the visitor has the opportunity to close the browser tab, click back, or lose their internet connection and never be counted by the later platform. This usually manifests as Optimizely having higher counts than other platforms.

Adjust Timing in Optimizely X Web Experimentation

You can choose when Optimizely should send its events even with the snippet being one of the first things to load. The default behavior is for Optimizely to send events as soon as possible, but a new feature (in beta) lets users who want to align their data with another platform delay Optimizely’s events.

We have two APIs for you: holdEvents() and sendEvents().

  • holdEvents() – when called, all subsequent events are kept in a queue. Call this before the Optimizely snippet loads to hold all events.

  • sendEvents() — when called, the queue created by holdEvents() is sent in a batch as one network request. 

If you would like access to this feature, file a support ticket. You or your developers can deploy these APIs in Project JavaScript or in an external script. Make sure to call sendEvents() around the same time your other platform fires its impression event.
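Here’s a sketch of how a deployment might look, assuming the snippet’s push-style API (check with support for the exact invocation your beta build expects):

    // Before the Optimizely snippet loads (e.g. in an inline script above
    // it in the <head>), queue all events:
    window["optimizely"] = window["optimizely"] || [];
    window["optimizely"].push({ type: "holdEvents" });

    // Later, in Project JavaScript or an external script, flush the queue
    // in one batched request, around the time your other platform fires:
    window["optimizely"].push({ type: "sendEvents" });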

Adjust Timing in Optimizely X Full Stack

With Optimizely X Full Stack running in a web server, it might not even be a browser making the request. If you have a service that requests the page, make sure it does not trigger an A/B test activation: since these requests never render the page or run its JavaScript, other analytics can’t track the activation even though Optimizely does.

Disparities may be most exaggerated when comparing Optimizely X Full Stack results to client-side analytics implemented after the <head> tag, because that setup has the largest delay between when the user is counted on each platform.

Optimizely Full Stack gives you complete control over when to send the impression event. You can call activate(), which sends the impression event and returns the variation, or getVariation(), which just returns the variation.

If you’re using Feature Flags, your impression event fires when you call isFeatureEnabled(). You don’t need to call isFeatureEnabled() before getting the values of the variables; only call it when you want to record an impression on an A/B test using the feature.
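For example (the feature and variable keys here are hypothetical, and client is an Optimizely SDK instance):

    // Reading a variable alone does not send an impression event.
    const buttonColor = client.getFeatureVariableString(
      "checkout_flow", "button_color", userId
    );

    // Calling isFeatureEnabled() is what records the impression for a
    // feature test.
    if (client.isFeatureEnabled("checkout_flow", userId)) {
      renderNewCheckout(buttonColor); // placeholder
    }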

In a Full Stack experiment running on a web server, you’ll want to get a user’s variation when he or she requests the page, then use the variation key to respond with either the original page or an edited one.
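A minimal sketch of that flow in a Node web server (the experiment key, helpers, and templates are placeholders, and client is an Optimizely SDK instance):

    const http = require("http");

    http.createServer(function (req, res) {
      const userId = getUserIdFromCookie(req); // placeholder helper
      const variation = client.activate("homepage_test", userId);

      // Respond with one page or the other based on the variation key.
      if (variation === "variation_b") {
        res.end(renderVariationPage()); // placeholder
      } else {
        res.end(renderOriginalPage()); // placeholder
      }
    }).listen(3000);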

If your activations happen too soon for your results to align, you can move the activation into the front end by adding the JavaScript SDK and calling activate() right before your other analytics’ JavaScript runs.
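Sketched out, that front-end activation might look like this (the experiment key is hypothetical, and datafile delivery is not shown):

    // Assumes the JavaScript SDK is bundled into your page and the datafile
    // has already been fetched.
    const client = optimizelySDK.createInstance({ datafile: datafile });
    client.activate("homepage_test", userId);
    // ...then load or fire your other analytics immediately after this point.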

Getting Support

We want to make sure you have the help you need in case the tips above don’t point you to anything conclusive. There are three types of integrations you can have with Optimizely; read on to find the right support channel for yours. When submitting support requests for data discrepancies and integrations, please provide as much detail as possible, including screenshots and code samples. If you’re having an issue with a custom integration, we’ll need to know every detail of how the integration is set up and what numbers you’re comparing to Optimizely.

First Party Integrations

Integrations that Optimizely offers out of the box with its Optimizely X Web Experimentation product are supported by Optimizely Support. Feel free to file a support ticket to get help with implementation and results discrepancies.

Custom Analytics Integrations

Custom Analytics Integrations are a developer API offered and supported by Optimizely. Although we are not able to fully support results discrepancies that may occur using these integrations, we are happy to take a look at your implementation and provide guidance using our APIs.

Third Party Integrations

Many of our partners offer integrations with Optimizely. These integrations are supported by the partners who develop them. Please reach out to the partner’s support team for assistance.