Optimizely Knowledge Base

Share your results with stakeholders

This article will help you:
  • Communicate the results of your tests and campaigns with others in your organization
  • Focus on what to share: what your team has learned and how you’ve made an impact
  • Plan when to share: the cadence that’s best for different stakeholders
  • Use different formats and templates to communicate results

Sharing the results of your testing and personalization may not sound like work. Unlike interpreting results, it doesn’t require deep analysis. Nor does it draw deeply on creative resources like experiment design. At the results sharing stage, most of the heavy lifting is behind you. But sharing results - what the team has learned and how it's made an impact - is crucial to the success of every optimization program.

Here’s why. Optimization programs are about gathering data, drawing insights from that data, and taking action. Imagine that your team creates a test to learn that a certain value proposition increases conversions, so you implement that type of offer on your site. Great! Your program has used its insights about customers to move the needle on your company’s business goals. But if you don’t also share your data-driven insights widely, you limit the impact of your testing efforts. You’ve taken action on behalf of customers, but you haven’t enabled other teams to do the same by taking this knowledge into account in their decision-making.

Share what you’ve learned to widen the impact of your testing. Share how you’ve made an impact to demonstrate the value of your optimization program. Communicate the metrics you’ve helped move the needle on, based on test results. If you do this well, you'll evangelize data-driven decision-making within your organization.

Read on to learn more about how to share results effectively with:

  • your testing team

  • your broader organization

  • executive stakeholders

Or, take our Optimizely Academy course on sharing your results with others.

Companies that consistently share their test results end up testing more, testing more effectively, and moving the metrics that matter to their businesses.

See requirements

People and resources
    • Program manager
    • Analyst
    • Decision maker
    • Stakeholders
    • Executive sponsor (where applicable)

Materials to prepare
    • Shareable, accessible versions of all regularly used documentation, such as the experiment plan and the roadmap
    • A regular newsletter detailing changes in goals, staffing, the new quarterly roadmap, and test results
    • Results-sharing documents (find templates below)
    • Quarterly Business Review (QBR)

Actions you'll perform
    • Finalize documentation
    • Store documentation in an accessible place
    • Share results broadly at a cadence that aligns with company practices

What to watch out for
    • Not sharing at each juncture
    • Oversharing with the wrong stakeholders (instead, share results that are relevant to each stakeholder)
    • Not being clear in program-level reporting, such as a QBR, about how individual experiments affect top-line goals

This article is part of the Optimization Methodology series.

Ready to share your results? Download these templates to get started:

Click here to read more articles for our series on running an optimization program. For more downloadable resources, check out the Optimizely Testing Toolkit.

The basics: what to share

When you share results, include the following sections:

Purpose: Provide a brief description of “why” you’re running this test, including your experiment hypothesis.

Details: Include the number of variations, a brief description of the differences, the dates when the test was run, the total visitor count, and the visitor count by variation.

Results: Be concrete. Provide the percentage lift or loss, compared to the original, conversion rates by variation, and the statistical significance or difference interval.

Lessons Learned: This is your chance to share your interpretation of what the numbers mean, and key insights generated from the data. The most important part of results sharing is telling a story that influences the decisions your company makes and generating new questions for future testing.

Revenue Impact: Whenever possible, quantify the value of a given percentage lift with year-over-year projected revenue impact.
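The year-over-year projection above is straightforward arithmetic. As a minimal sketch, assuming you know your monthly traffic, baseline conversion rate, and average order value (the function name and example figures below are illustrative, not Optimizely data):

```python
# Hedged sketch: projecting annual revenue impact from a conversion-rate lift.
# Assumes the winning variation's relative lift holds across all traffic.

def projected_annual_revenue_impact(monthly_visitors, baseline_conversion_rate,
                                    average_order_value, relative_lift):
    """Estimate the yearly revenue gained by rolling out the winning variation."""
    baseline_monthly_revenue = (monthly_visitors * baseline_conversion_rate
                                * average_order_value)
    lifted_monthly_revenue = baseline_monthly_revenue * (1 + relative_lift)
    return (lifted_monthly_revenue - baseline_monthly_revenue) * 12

# Example inputs: 500k monthly visitors, 2% conversion, $80 AOV, 5% relative lift
impact = projected_annual_revenue_impact(500_000, 0.02, 80.0, 0.05)
print(f"${impact:,.0f}")  # → $480,000
```

A projection like this assumes the lift generalizes beyond the test audience and period, so present it as an estimate, not a guarantee.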

The sections above don’t apply only to winning tests. Tests that don’t produce a winning variation generate valuable lessons; learning what not to do can be just as valuable as learning what to do. In fact, you’re more likely to gain a nuanced understanding of your visitors’ behavior through tests that don’t win than through those that do.

Share with your testing team

When to share

Update your optimization team on all active tests in a weekly, bi-weekly, or monthly team meeting, depending on how often your team is testing. Don’t forget to include:

  • team members involved in execution such as designers and developers

  • executive decision-makers not typically involved in day-to-day testing

  • peers who may not be focused on testing but can make valuable contributions to the program’s mission

How to share

Emails and shareable spreadsheets (Google Spreadsheets or Smartsheets, for example) are effective ways to share updates with your team. Your team’s wiki page in Atlassian Confluence is another place to store updated results. Email results to your testing team as experiments conclude.

To create a one-page brief to share by email or on your wiki, fill out the second slide of your test plan document (download here).

Don’t forget

Save your results to a roadmap or spreadsheet where you track results of all tests in a project. Doing so will make it easier to consult these metrics in the future, when you return to results to brainstorm new tests and campaigns.

Download the CSV file of your experiment results from your Results page: on the Results page, click the Export CSV button in the toolbar. Check out this article for an explanation of the columns in the exported CSV.


Add the metrics in your CSV download to a spreadsheet where you track results of all experiments. If you’re using the Optimizely Roadmap template (download here) to plan and track your tests, add these numbers to the Results section of the document.

If you’ve set up any custom goals, such as in Google Analytics or SiteCatalyst, these will be included in your CSV download.
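If you'd rather script this step than paste rows by hand, the export can be appended to a cumulative tracking CSV. A minimal sketch using Python's standard library (the file paths and column headers here are hypothetical; match them to the actual headers in your exported CSV):

```python
# Hedged sketch: append a per-experiment CSV export to a running tracking sheet.
import csv
import os

def append_results(export_path, tracker_path):
    """Append rows from an exported results CSV to a cumulative tracking CSV,
    writing the header row only if the tracker file is new or empty."""
    with open(export_path, newline="") as f:
        reader = csv.DictReader(f)
        rows = list(reader)
        fieldnames = reader.fieldnames
    if not rows:
        return 0  # nothing to append

    new_file = (not os.path.exists(tracker_path)
                or os.path.getsize(tracker_path) == 0)
    with open(tracker_path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        if new_file:
            writer.writeheader()
        writer.writerows(rows)
    return len(rows)
```

Running this once per concluded experiment keeps all results in a single file you can consult when brainstorming future tests.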

Share with your broader organization

When to share

Share results with the rest of your organization at the end of each test to raise the visibility of your program and champion data-driven decision-making at your company.

How to share

Emails and shareable spreadsheets (Google Spreadsheets or Smartsheets, for example) are effective ways to share updates with the broader organization. Use the slides in the section above to create a one-page brief.

Update results to an internal team wiki page (in Atlassian Confluence, for example) to keep your organization involved. Create a centralized, public source for all test results. Link to and share the wiki in internal emails, so stakeholders beyond your testing team know where to find it.

Try sending out a “which test won” poll by email to generate interest and engagement with data-driven testing in the rest of your organization.

Share with executive stakeholders

When to share

Executive stakeholders help allocate time and resources for your optimization program. Share your progress with executive stakeholders once per quarter, using your company’s internal presentation format.

What to share

Review the company goals that guide your testing program and past experiment results. Use these signposts to frame your report. It's also a good idea to include the following information for executive stakeholders:


  1. Top experiments with important business results

  2. Total number of active experiments

  3. Major takeaways from the project

  4. Number of monthly unique visitors to the site (and projection of future MUV)

  5. Major tests applied to your funnel and what you’ve learned

  6. Top-level summary of insights gained from experiments and campaigns overall

Audiences, integrations, and mobile experiments:

  1. Important tests in mobile

  2. Audiences that you’ve used in tests and plans for future audiences you’ll target

  3. Integrations currently used and the case to be made for future integrations

Next steps:

  1. People and culture: what human resources or skillsets are needed for future testing and personalization? How do you plan to build a culture of testing at your company?

  2. Improved processes: what processes need improvement? For example, how would you collaborate more efficiently with the Marketing team next quarter?

  3. Strategy: where do you plan to test and personalize next, and why?

  4. Execution and resources: what non-human resources do you need for future testing and personalization?

Sharing the results of your tests with different stakeholders will help you spread the insights you've gained and communicate the ROI of your program. Doing this well will build visibility for your program and evangelize a data-driven culture at your company.