A/B testing

Game Overrides can be split into multiple variants to compare the impact they have on your game. For example, you might believe starting players are given too many coins, so they progress through your game too quickly or stop playing too early, missing in-app purchase (IAP) opportunities. To test this, you’d set multiple starting balances: the control and the variants.

Define the Override as an A/B test with statistical significance tracking

Statistical significance tracking increases the confidence that the effects observed during an A/B test are not due to chance. This option lets you configure A/B tests to run long enough and reach the required participation before they display results.

  1. In the Unity Cloud Dashboard, open Game Overrides.
  2. Select Create Override.
  3. Give your override a name and description.
  4. Select Set up A/B test with statistical significance tracking.
  5. Select a goal metric from the dropdown.
  6. Select Next.

To set up a multi-variant test using Game Overrides without enabling statistical significance tracking, refer to Get started.

Configure the test

  1. Set the Population size, Minimum effect of interest, and Statistical power of the experiment.
  2. Select Calculate.

The result displayed after the calculation is how many players need to fall into each variant for the experiment to reach statistical significance.
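
For a rough sense of what the calculation involves, the sketch below applies the standard two-proportion sample-size formula for a binary goal metric (for example, day-7 retention). It is an approximation under assumed inputs, not the exact formula the dashboard uses; the baseline rate, minimum effect, significance level, and power values are placeholders.

```python
# Hypothetical sketch: players required per variant for a two-proportion test.
# The dashboard's exact calculation may differ.
from statistics import NormalDist

def players_per_variant(baseline_rate, min_effect, alpha=0.05, power=0.8):
    """Estimate players needed per group to detect an absolute lift of
    `min_effect` over `baseline_rate` at the given significance and power."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # two-sided significance
    z_power = NormalDist().inv_cdf(power)
    p1, p2 = baseline_rate, baseline_rate + min_effect
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * (2 * p_bar * (1 - p_bar)) ** 0.5
                 + z_power * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
    return int(numerator / (min_effect ** 2)) + 1

# Example (placeholder numbers): 20% baseline retention, detect a 2-point lift.
print(players_per_variant(baseline_rate=0.20, min_effect=0.02))
```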

Target

A/B tests with statistical significance tracking target all players; you can't choose a specific Audience or use JEXL for targeting. To limit the number of players who get an alternate configuration, set a lower weight for the variants and a higher weight for the control.

Audiences and JEXL are still available for targeting within Game Overrides that do not use statistical significance tracking.

Configure the content

On the Content page, under Variant 1, select Add Keys, then choose your Key name and Value. Variant 1 is the control and matches the default value of your configuration. To add a variant, select Add a variant group, open the new tab at the top, and repeat the same steps with the different value you want to test. As a best practice, test a single change at a time so you know whether the variable you’re testing impacts your metrics.
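
The control matches the default value because players outside the test, and players in the control group, effectively receive the project's base configuration. The following minimal sketch of a client-side lookup with a default fallback is purely illustrative; the key name and the overrides dictionary are hypothetical, not a real API.

```python
# Hypothetical sketch of a config lookup with a default fallback.
# "STARTING_COINS" and the overrides dict are placeholders, not a real API.
DEFAULTS = {"STARTING_COINS": 100}

def get_config(key, overrides):
    # Players outside the test, or in the control group, get the default value.
    return overrides.get(key, DEFAULTS[key])

control_player = get_config("STARTING_COINS", overrides={})                      # -> 100
variant_player = get_config("STARTING_COINS", overrides={"STARTING_COINS": 60})  # -> 60
```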

By default, your players are allocated equally between the control and variant groups. Select Split Manually if you want to control the individual weighting of each group. We recommend splitting the variants to a level you are comfortable with.

You can change the percentage of players that will fall into each group. The values must add up to 100%. Select Next to continue.
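
Conceptually, the weights are a percentage split of incoming players. The sketch below shows one common way such a split can be implemented (deterministic hashing of a player ID into weighted buckets); it is an illustration only, not how Game Overrides assigns players internally.

```python
# Illustrative sketch of a weighted split; not Unity's actual allocation logic.
import hashlib

def assign_group(player_id, weights):
    """Deterministically map a player ID to a group using percentage weights."""
    assert sum(weights.values()) == 100, "weights must add up to 100%"
    bucket = int(hashlib.sha256(player_id.encode()).hexdigest(), 16) % 100
    cumulative = 0
    for group, weight in weights.items():
        cumulative += weight
        if bucket < cumulative:
            return group

# Example: keep most players on the control, expose 10% to each variant.
weights = {"Variant 1 (control)": 80, "Variant 2": 10, "Variant 3": 10}
print(assign_group("player-123", weights))
```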

Schedule the test

Schedule your Override with a Start Date.

To maintain the integrity of the configuration served to players as part of the test, make sure you are not running overlapping Overrides against the same variables. Select a higher priority for the test so it takes precedence over other Overrides.

Set the priority for the Override and select Continue to create the Override.
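
As a conceptual illustration of why priority matters when Overrides overlap, the sketch below resolves a key by letting the Override with the highest precedence win. The data structure and the numeric ranking are assumptions for illustration, not a description of the service's internal logic.

```python
# Illustrative sketch only: when multiple active Overrides set the same key,
# the one with higher precedence wins. `precedence` is an abstract rank here
# (bigger wins); the dashboard's priority values may be ordered differently.
def resolve_key(key, active_overrides, default):
    matching = [o for o in active_overrides if key in o["values"]]
    if not matching:
        return default
    winner = max(matching, key=lambda o: o["precedence"])
    return winner["values"][key]

overrides = [
    {"name": "Seasonal promo", "precedence": 1, "values": {"STARTING_COINS": 150}},
    {"name": "A/B test",       "precedence": 2, "values": {"STARTING_COINS": 60}},
]
print(resolve_key("STARTING_COINS", overrides, default=100))  # the A/B test wins
```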

View the results and reporting

Once the A/B test has been running for 7 days and the required number of participants has been reached, select Load experiment results from the Reporting tab.

You can then interpret the result and see how the change introduced to a subset of players as part of the A/B test impacts the chosen metric.
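
To make "statistically significant" concrete, the example below applies a standard two-proportion z-test to hypothetical control and variant counts. The dashboard's reporting performs this kind of comparison for you; the numbers and the choice of test here are assumptions for illustration.

```python
# Hypothetical illustration of interpreting an A/B result with a
# two-proportion z-test; the dashboard computes significance for you.
from statistics import NormalDist

def two_proportion_p_value(conv_a, n_a, conv_b, n_b):
    """Two-sided p-value for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = (p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_b - p_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Placeholder numbers: control vs. variant day-7 retention.
p = two_proportion_p_value(conv_a=2000, n_a=10000, conv_b=2150, n_b=10000)
print(f"p-value = {p:.3f}  ->  significant at 5%: {p < 0.05}")
```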

You can also view the corresponding Game Overrides reporting through Analytics events. In your Game Overrides list, select the Override name, then select the Reporting tab.

You can filter by metrics and adjust the time window to view results for today, the last 7 days, the last 14 days, the last 30 days, and the last quarter.

Make a decision and end the Override

Once the results of the A/B test have been loaded for review, the associated Game Override stays active until you make a decision.

Select End Override at the top right of the Override's page, then select one of the two main options:

  1. Select End your Override (the variant group will no longer be served the alternate configuration associated with this A/B test and will return to the active base configuration for the project).
  2. Select Select a variant to roll out to the players targeted by your Override (the variant from this A/B test becomes the new active configuration for all players).

To confirm and apply your choice, check the reminder checkbox at the bottom of the prompt and select End Override.

Best practices

  • Run your test with a goal metric in mind. If you wait until after your test is run to think about how you're trying to improve player behavior, you might not run the test optimally.
  • Create a control group and one or more treatment or “variant” groups to compare your change against the default behavior of the game.
  • Ensure you have enough players in each variant group to reach the calculated sample size. This keeps the test valid, because a group that is too small can affect the accuracy of the test.
  • Test a single change at a time to see if the change is impacting your metrics. Testing multiple changes makes it difficult to know which changes affect the metrics.
  • When running multiple tests, ensure that you’re not running overlapping tests against the same variable. This makes it easier to see which variable affects which metric.