
Create an experiment in Optimizely X Full Stack

This article will help you:

  • Create a Full Stack experiment
  • Set up experiment parameters, including the experiment key, traffic allocation, variations, audiences, and metrics
  • Find the Optimizely-generated experiment code block for your primary language

Once you've set up a Full Stack project, you can create an experiment.

To get started, navigate to the Experiments dashboard and click New Experiment.

Learn more about managing Full Stack experiments. Or, read more about getting started with Optimizely X Full Stack.
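
The code samples in this article use the Python SDK, and they assume you've already initialized an Optimizely client from your project's datafile. Here's a minimal sketch of that setup; the local datafile path is illustrative, and how you fetch and refresh the datafile is up to your application:

from optimizely import optimizely as optimizely_sdk

# Load your Full Stack project's datafile (a JSON string). Reading it from
# a local file here is illustrative only.
with open('optimizely_datafile.json') as f:
    datafile = f.read()

# The other snippets in this article refer to this client as "optimizely".
optimizely = optimizely_sdk.Optimizely(datafile)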

1. Set an experiment key

The experiment key is a unique identifier for your experiment. Your developer will use it to reference the experiment in code.

Specify an experiment key, for example "SEARCH_RESULTS_ALGORITHM".

Your experiment key must contain only alphanumeric characters, hyphens, and underscores.
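
As an illustration of those rules, here's a quick sanity check. This helper is not part of the Optimizely SDK; it just encodes the character restrictions above:

import re

def is_valid_experiment_key(key):
    # Alphanumeric characters, hyphens, and underscores only.
    return re.fullmatch(r'[A-Za-z0-9_-]+', key) is not None

print(is_valid_experiment_key('SEARCH_RESULTS_ALGORITHM'))  # True
print(is_valid_experiment_key('search results?'))           # False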

The key must also be unique within your Optimizely project, so that it unambiguously identifies this experiment when you reference it in your code.

Don’t change the experiment key without making the corresponding change in your code. If you want to learn more, read about experiment activation in our developer documentation.

2. Set experiment traffic allocation

The traffic allocation is the fraction of your total traffic to include in the experiment, specified as a percentage. In this example, we allocated 5% of traffic to the experiment.

The traffic allocation is determined at the point where you call activate() in the SDK.

In the example above, the experiment is triggered when a visitor does a search, but it won't be triggered for every visitor: 5% of users who do a search will be in the experiment, and the other 95% won't. Users who don't do a search won't be in the experiment at all. In other words, the traffic allocation percentage applies only to the traffic that reaches your activate() call, not to all traffic for your application.
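
Here's a sketch of that flow, using the client initialized earlier. The search handler and the default_search()/experimental_search() helpers are hypothetical:

def handle_search(user_id, query):
    # activate() both buckets the user and reports the decision to Optimizely,
    # so calling it here makes "users who do a search" the eligible traffic.
    variation = optimizely.activate('SEARCH_RESULTS_ALGORITHM', user_id)
    if variation is None:
        # User is in the 95% not allocated to the experiment (or was
        # excluded for another reason, such as audience targeting).
        return default_search(query)       # hypothetical fallback path
    # User is in the experiment; branch on the variation key (see step 3).
    return experimental_search(query, variation)  # hypothetical new path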

3. Set variation keys and traffic distribution

Variations are the different code paths you want to test.

Each variation requires a unique variation key to identify the variation in the experiment. In this example, we added two variations, variation_a and variation_b.

You must specify at least one variation. There’s no limit to how many variations you can create.

A short, human-readable description for each variation will help make reports clear.

You can specify any traffic distribution you’d like. By default, variations are given equal traffic distribution.

You can also use a single variation to gradually roll out a feature without A/B testing its impact. Your developers should ensure that the correct code paths execute both when users are bucketed into the control variation and in the default case, when visitors are not allocated to the experiment.
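
For example, a rollout-style experiment with a single variation might look like this sketch, where the FEATURE_ROLLOUT key and the helper functions are hypothetical:

variation = optimizely.activate('FEATURE_ROLLOUT', user_id)
if variation is not None:
    # User is in the rollout variation: execute the new code path.
    enable_new_feature()   # hypothetical
else:
    # Default case: user is not allocated to the experiment.
    run_existing_code()    # hypothetical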

To learn more about activating experiments, check out experiment activation in our developer documentation.

4. Create audiences

Use audiences to show your experiment only to certain groups of users. Audiences are optional; add them if the variations you're testing apply only to specific groups.

Click the plus (+) icon to add an existing audience. Or, click Create new audience to define a new audience.

Optimizely takes the union of audiences as the eligible traffic for the experiment. So, in this example, a user browsing with Mobile web qualifies, and a user browsing with an iPhone also qualifies.

Learn more about defining audiences. Or, read about attributes in Full Stack, including passing audience data correctly in your application code.
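
In code, audience conditions are evaluated against the attributes you pass to activate(). A minimal sketch, where the 'device' attribute mirrors the example in step 6 and the value is illustrative:

# Attribute keys and values must match the audience conditions you
# defined in Optimizely.
attributes = {'device': 'iphone'}
variation = optimizely.activate('SEARCH_RESULTS_ALGORITHM', user_id, attributes)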

Audience evaluation may affect the exact traffic allocation you've specified for the experiment. For example, imagine that mobile users (iOS, Android, or mobile web) constitute 40% of total traffic. Above, the experiment traffic allocation is set to 5% of total traffic, so the expected total fraction of traffic in the experiment is 2% (40% of the 5% traffic allocation). However, the actual fraction of traffic in the experiment could be higher or lower than 2%, depending on how many mobile users there actually are during the experiment.

5. Add a metric

Next, add events that you're tracking with the Optimizely SDKs as metrics, to measure the impact of your experiment. You must add at least one metric to each experiment.

Click the plus (+) icon to add existing events as metrics to your experiment.

To re-order the metrics, click and drag them into place. 

The top metric in an experiment is the primary metric. Stats Engine uses the primary metric to determine whether an A/B test wins or loses, overall. Learn about the strategy behind primary and secondary metrics.

Learn more about tracking events with an Optimizely SDK in our developer documentation.
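
For reference, those events come from track() calls in your application code. A sketch, where the 'search_conversion' event key is hypothetical:

# Send a conversion event for the same user ID you pass to activate().
# Attributes are optional here.
optimizely.track('search_conversion', user_id, attributes)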

6. Add experiment code

Once you enter a unique experiment key and variation keys, Optimizely creates a code block in your primary language at the bottom of the page.

  1. Copy and paste the code block directly into your application code.

    For example, Optimizely created this Python code block for a Python project.

    The sample code block shows how to call activate() for your experiment key, a user ID that you provide, and the different variations. The code block distinguishes between bucketing the user in the control variation of the experiment (variation_a) and the default case, where the user doesn't enter the experiment.

  2. Then, click Create Experiment to complete your experiment setup.

In the example above, the experiment is targeted to users who qualify for a certain audience. To make sure that traffic enters the experiment, pass attributes to activate() so that Optimizely can evaluate that audience in the SDK.

For example:

# attributes of the user
attributes = {'device': user.device}

# activate user in the experiment
variation = optimizely.activate('SEARCH_RESULTS_ALGORITHM', user_id, attributes)

if variation == 'variation_a':
    # User is in the control variation.
    # Roughly 1% of traffic.
    pass
elif variation == 'variation_b':
    # User is in the treatment variation.
    # Roughly 1% of traffic.
    pass
else:
    # User is not in the experiment.
    # Roughly 98% of traffic (95% of total, plus another 3% for non-mobile traffic).
    pass

To learn more about correctly passing audience data in your application code, check out user attributes in our developer documentation.