Create a full experiment plan to manage design and implementation
Create a QA checklist to build rigor and efficiency into your QA process
A full experiment plan gathers decisions from stakeholders into a single, collaborative document. It provides a detailed summary of the motivations and mechanics of your experiment or campaign. When shared, it helps build visibility across the organization and scale your experimentation strategy by establishing a standardized process.
A QA checklist builds rigor and efficiency into your experiment QA process so that your experiments and campaigns work the way you intend.
Download this QA Checklist template to outline your team's QA process. Your team will use this document to review the experiment before you publish it live to your visitors.
When you plan an experiment, involve the QA team so they can create a QA checklist. By looping in the QA team early, you'll build a rigorous, efficient process that identifies every case that needs to be checked and eliminates those that don't.
Your QA checklist should include:
All goals or events in the experiment, and how each is triggered
All functionality that’s been added (for example, a new button)
Common use cases including expected user flows to and from the page
Audiences that are included or excluded
URLs where the experiment should run
Sample workflow to fire a goal, especially for custom events (see the sketch below)
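For goals tied to custom events, the sample workflow can include a snippet that QA runs in the browser console to trigger the event and then confirm it registers on the Results page. Below is a minimal sketch using the Optimizely Web event API; it assumes the Optimizely snippet is already loaded on the page, and the event name sign_up_complete is a hypothetical placeholder for the API name configured in your project.

    // Fire a custom event through the Optimizely Web API so QA can confirm the
    // corresponding goal or event registers on the Results page.
    // "sign_up_complete" is a hypothetical event API name; replace it with the
    // name defined in your Optimizely project.
    const optimizely = ((window as any).optimizely = (window as any).optimizely || []);

    optimizely.push({
      type: "event",
      eventName: "sign_up_complete",
      tags: { revenue: 0 }, // optional tags; revenue, if used, is recorded in cents
    });

Running this in each variation and then checking the Results page gives QA a repeatable way to verify custom-event goals.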
Your QA team should grade each use case as pass or fail, and push the experiment live once everything on the list has passed.
QA for separate development and production environments
If you have access to separate development and production environments, notify the development team before running the experiment on the site.
Maintaining separate environments helps you mitigate the risk of accidentally deploying an unfinished experiment to your live site. If you have separate environments, we recommend that you build and QA experiments using the following process:
First, make sure each environment has its own Optimizely snippet in the head tag.
Build your experiment in the development environment.
QA the development environment.
Push the experiment live in the development environment. Confirm that goals fire on the Results page and that all analytics data is collected.
Duplicate the experiment into your production environment.
Set a test cookie so only you can see the experiment on the live site (see the sketch after these steps).
Push the experiment live in the production environment and QA it there.
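The test cookie step can be implemented with a cookie-based audience condition in Optimizely. Below is a minimal sketch of the console commands a QA engineer might run on the live site; the cookie name optly_qa is hypothetical and must match whatever your audience condition checks, and the state calls assume the Optimizely Web snippet has already loaded on the page.

    // Step 1: set a QA-only cookie that a cookie-based audience condition targets,
    // then reload the page so the snippet re-evaluates audiences with it present.
    // "optly_qa" is a hypothetical cookie name; match it to your audience setup.
    document.cookie = "optly_qa=true; path=/";

    // Step 2 (after the reload): use the Optimizely Web state API to confirm that
    // the experiment activated for you and see which variation you were bucketed into.
    const state = (window as any).optimizely.get("state");
    console.log("Active experiments:", state.getActiveExperimentIds());
    console.log("Variation map:", state.getVariationMap());

This lets QA verify the live experiment end to end without exposing it to your regular visitors.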