Your Pre-Launch Checklist: QA your experiment before launch

This article will help you:
  • Use a thorough QA process to make sure your experiments work correctly before they go live
  • Make sure that your changes appear correctly and on time
  • Make sure that visitors are included or excluded as you expect
  • Make sure that goals are tracking correctly
  • QA an Optimizely web experiment (or learn about QA for mobile experiments instead)

Quality assurance, or QA, is the last of six steps in creating an experiment. A rigorous QA process ensures that your experiment looks and works the way you want before you show it to visitors. Once you’ve finished building your experiment, take a little extra time to verify your setup so you can trust your results.

Many mature optimization teams have access to separate staging and production environments. Separate environments can save you a headache; they reduce the risk of accidentally launching an unfinished or unverified experiment to the live site.

If you have separate environments, use a staging environment to set up the experiment and perform initial QA without exposing the experiment to visitors. Then, duplicate the experiment into your production environment project, and perform the same QA process there. When possible, ensure that your production and staging environments match so experiments function the same way in both.

If you don’t have a dedicated staging environment, don’t worry. You can still QA your experiment. Perform all the steps outlined below, but take special care when you use the test cookie to QA in a live environment.

Ultimately, the most effective QA process depends on the processes already in place at your organization. Use the framework in this article to build rigor into your QA steps and to integrate your testing process with company practices.

What you need to get started:
Materials to prepare
  • QA use case checklist that you prepared during the experiment planning stage that identifies:
    • All new functionality to QA
    • All goals and analytics to be captured
    • All user flows into the experiment
    • Places for regression testing to verify that the new experiment doesn't break existing functionality
People and resources
  • Developer or experiment implementer
  • Power user
  • QA team
Actions you'll perform
  • Verify that components in the production environment match the staging environment
  • Verify that the experiment functions correctly in staging
  • Verify all experiment changes for regression testing
  • Duplicate the experiment to the production environment project
  • QA in the production environment
Deliverables
  • Fully built experiment in the staging environment
  • Duplicated, fully built experiment in the production environment
  • A completed QA and use case doc with all cases marked "pass"
What to watch out for
  • Flashing of original content
  • Variation code that doesn't run at all
  • Goals that don't fire when you expect them to
  • Analytics integrations that don't capture any data
  • Analytics integrations that capture discrepant data
  • Failures in audience or URL targeting

Context before you QA

When you QA, you'll test these four functional areas for every experiment. In the next section, we'll break down the workflow so you know how to QA each function, step by step.

Take a moment to learn about each area below. They should look familiar, based on the steps you used to create your experiment.


Variation Code

Does your variation look the way it should?

Variation Code is generated as jQuery when you make changes in the Visual Editor. You can also write the JavaScript/jQuery manually in the Code Editor. Because many websites load content dynamically with technologies like AJAX and AngularJS, the content your code targets may not yet exist on the page when Optimizely executes your variation code, so no changes will be made.
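
For example, here is a minimal variation-code sketch in jQuery. The .promo-banner selector and the copy are hypothetical illustrations, not from a real experiment:

    /* Change the headline text and button color in the variation.
       The .promo-banner selector is a hypothetical example. */
    $(".promo-banner h1").text("Free shipping on all orders");
    $(".promo-banner .cta-button").css("background-color", "#2e7d32");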

If you're having problems with this, see our article on troubleshooting variation content.


Timing

Does your variation show up when it should, without page flashing?

Timing refers to the order in which scripts fire on the page. Flashing means the original content loads visibly in the browser before the variation content appears. Make sure your variation code executes early enough to avoid any flashing.
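
If the element your code targets loads late, one common approach is to poll briefly until it exists before applying changes. Here is a minimal sketch in plain jQuery (not an Optimizely-specific API; the selector is the same hypothetical example as above):

    /* Wait for the dynamically loaded element, then apply the change. */
    var poll = setInterval(function () {
      if ($(".promo-banner").length) {
        $(".promo-banner h1").text("Free shipping on all orders");
        clearInterval(poll);
      }
    }, 50);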

If you're having problems with this, see our article on troubleshooting flashing issues.


Targeting

Does your experiment appear on the right pages, for the right audience?

Targeting includes both URL and Audience conditions. Make sure the experiment runs only on the URLs you intend, and shows only to visitors you've identified as eligible.

If you're having problems with this, see our article on troubleshooting activation issues.


Goals

Do your goals fire when they should and capture accurate data?

Goals should fire the proper tracking calls when triggered. In other words, when you trigger a goal, it should report correctly to Optimizely. If you don't QA your goals properly, you may have to throw out your experiment's results because it wasn't set up to capture accurate data.

If you're having problems with this, see our article on troubleshooting goals.

An overview of your QA testing tools

Use the following four methods to QA every experiment:

The QA checklist that you created as part of your experiment plan will help you perform a thorough check.

As a reminder, your checklist should include:

  • All goals that were added, as well as how each is triggered
  • All functionality that’s been added (for example, a new button)
  • Every visitor use case, including all expected user flows to and from the page
  • Audiences that are eligible and ineligible to see the experiment
  • URLs where the experiment should and should not run
  • Sample workflow to fire a goal (especially for custom events)

Imagine, for example, that you’re targeting the experiment to visitors who come to your site from an organic search traffic source. Your QA list might include the following user paths: 

Audience visitor path                 Eligible for experiment?       Pass / Fail
A visitor from Bing                   Eligible for experiment
A visitor from Yahoo!                 Eligible for experiment
A visitor from Google                 Eligible for experiment
A visitor who clicks a paid ad        Not eligible for experiment
A visitor who clicks an email link    Not eligible for experiment

The actual checklist that you create will be more rigorous. Your QA team will grade each use case as pass or fail. Set the experiment live only after every item on the list has passed.

1. Preview mode in Optimizely

You may have used Optimizely's Preview mode to check the visual layout of your experiment when you built it. During QA, use Preview mode primarily to check your variation code, timing, and experiment overlap in Optimizely. (You’ll use other, more rigorous methods to evaluate goals and audiences.)

QA method: Preview Mode

Place to look: Optimizely

Navigate through your experiment as if you are a visitor. Check that each variation appears as intended and pages load without flashing.

Then, use the cross-browser test to verify that your variations display correctly in different browsers. Watch this video to learn more about these methods.

2. Variation code and timing

For each page of your experiment, verify that the variation appears as intended and pages load without flashing. Use the force parameter method to QA each variation in the browser, page by page. The force parameter enables you to force the variation to display without launching the experiment for all visitors. It also helps you check the responsive design of your variations.

QA method: Force parameter

Place to look: Your website

At a glance:

  1. Enable force parameters in Optimizely. Navigate to the Settings tab and select the Privacy sub-tab. Uncheck the Disable the force variation parameter box.

  2. To see a variation of your experiment before setting it live, make sure the Exclude draft and paused experiments box is not checked. You'll also find this under the Privacy sub-tab.

  3. Open the first page of your experiment in a browser. Force yourself to see the variation by appending the following parameter to the URL:

    ?optimizely_xEXPERIMENTID=VARIATIONINDEX

    For example: http://mytestpage.com/?optimizely_x10730927=1

  4. To find the experiment and variation number, go to the Visual Editor, select the Diagnostic Report, and click the Options dropdown menu.

Troubleshooting notes:

If your variation isn't appearing at all, your snippet may not be implemented correctly. Learn about the basics of snippet implementation. If your variation code doesn't work when the experiment runs but it functions properly when you load it in the console, consider manual or conditional activation.

If a variation appears but doesn’t look quite right, check out this variation code troubleshooting guide.

Not sure if you're seeing a variation code issue or a timing issue? Use your Developer Console to identify the problem. Copy your variation code from Optimizely’s Edit Code box and paste it directly into the console.

If a variation looks good in the browser console but it doesn't work in the Editor, you may have a scoping issue. Check your jQuery scope or ask your developer.

Tip:

An easy way to share your variation designs with stakeholders is to send the URL with force parameters attached. However, visitors who enter through the force parameter see only the variation you selected on the specified page; they won't be able to navigate through a multi-page experiment unless they manually add the parameter on every page.

3. Targeting conditions

Next, use Optimizely's data object to check your experiment's targeting. You'll set a test cookie and target it to create strong, persistent audience restrictions. When you set the experiment live, only visitors with the test cookie can see the experiment.

The test cookie method is the best way to rigorously QA your experiment end-to-end. We strongly recommend that you build this method into your standard QA process.

QA method: Test cookie

Place to look: Data object in console tab

By typing the commands outlined below into your browser console, you'll discover how Optimizely buckets you into experiments. Use this to check whether URL and audience targeting are working correctly.

Sometimes the test cookie can be useful in a staging environment as well as production. If large internal teams have access to the staging environment, launching an experiment can cause confusion. Use the test cookie to keep your experiment private for stakeholders involved in testing. If you have a large amount of control over your staging environment, you may decide to launch the experiment directly in staging and QA it live before copying it to production.

If your company doesn't allow adding cookies to staging or production environments, you can still create heavy audience restrictions so you don’t show the experiment to visitors yet. Use the test query parameter or IP targeting instead.

At a glance:

  1. First, open a new incognito or private browsing window and set a test cookie (see the sketch after this list).

  2. In your Optimizely experiment, create an audience that targets both the test cookie AND your other audience conditions. It's important to use an AND condition so only visitors who both have the test cookie and meet your audience restrictions can enter the experiment.

  3. Type optimizely.activeExperiments into the console to return a list of all experiments currently running on the page for that visitor.

  4. Type optimizely.variationNamesMap into the console to return a mapping of experiment IDs to variation names for all active and inactive experiments into which the visitor is currently cookied on the page.

    Use the force parameter to view different variations in the experiment that you’re bucketed into.
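
To set the test cookie, you can add it from the browser console before you QA. Here is a minimal sketch, assuming a hypothetical cookie named optly_qa (use whatever name your test cookie audience condition targets):

    /* Set a QA-only test cookie on the current site. */
    document.cookie = "optly_qa=true; path=/";

Your audience condition would then match a cookie named optly_qa with the value true, ANDed with your real audience conditions.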

Troubleshooting notes:

If you failed the targeting conditions when you expected to pass, or vice versa, go back and check your URL Targeting and Audience conditions. Use the URL Match Validator to make sure that the URL you're testing passes your targeting conditions. Check whether any audience conditions you set fail to match the conditions under which you viewed the page.

If you're still having problems, probe the Optimizely log and filter by your Experiment ID for greater detail.

4. Goals and data capture

Finally, check that your goals fire properly and verify that data is captured on your Results page and in your analytics integrations.

QA method: Test cookie

Place to look: Network traffic and your analytics dashboard

Network traffic is a log within your browser of all events that occur as you interact with a page. When you trigger a goal in Optimizely, it fires a tracking call, which will be picked up in the network traffic.
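
If one of your goals is a custom event, you can also trigger it manually from the console to generate a tracking call to watch for. Here is a minimal sketch using the window.optimizely.push API for custom events (the event name is a hypothetical example, not from your experiment):

    /* Manually fire a custom event goal from the console.
       "signup_complete" is an example event name. */
    window.optimizely = window.optimizely || [];
    window.optimizely.push(["trackEvent", "signup_complete"]);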

At a glance:

  1. Open a new incognito or private browsing window, go to the page(s) you want to test, and set a test cookie.

  2. Go to the Network tab of the developer console.

  3. Perform the action that you expect to fire the goal in Optimizely. Look for the event corresponding to that goal in the Network tab to see if it appears when you expect it to.

    If the action involves moving from one page to another, check the Preserve log box in the Network tab to track the network call across pages.

  4. Check your Results page and your analytics integrations to ensure that data is captured correctly.

Troubleshooting notes:

For more information on troubleshooting different types of goals and using network traffic, see our articles on troubleshooting goals and troubleshooting analytics.

If you want to test goals on a specific variation, use both the force variation and force tracking parameters.

Because the force variation parameter disables goal tracking by default, you should add another parameter to enable tracking:
optimizely_force_tracking=true

Your URL would look something like: http://www.example.com/testpage?opti..._tracking=true

Once your QA team verifies every item on your QA checklist, you're ready to launch the test to visitors. Remove the test cookie audience condition. Then, select your experiment on the Optimizely Home page and click Start Experiment.

Congratulations! Your experiment is now live for visitors to your site.