A/B testing is a Dashboard feature within Game Overrides that splits your Override into variants so you can measure the impact each one has on your game. For example, you might believe starting players are given too many coins, so they progress through your game too quickly or stop playing too early, missing in-app purchase (IAP) opportunities. You’d set two (or more) starting balances: the control, and the variant.
Go to Game Overrides and select Create override. Name your Override, then select Next.
Name your Override.
Choose the players to target. You can choose between stateless targeting (JEXL expressions) and stateful targeting (Audiences generated from Analytics). You can configure a progressive rollout, but a 100% rollout is recommended. Note that to have a valid test, each variant needs approximately 10,000 players or more, so ensure the targeting isn’t set to too small a group. Select Next.
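As a purely illustrative sketch of stateless targeting, a JEXL condition combines player attributes with boolean operators. The attribute names below are hypothetical examples, not necessarily the Dashboard’s actual context fields:

```
user.country == "US" && user.daysSinceLastPlayed < 7
```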
Choose your Audience.
On the Content page, enable the A/B toggle to run an A/B test. Under Variant 1, select Add Keys, then choose your Key name and Value. Variant 1 is the control, so leave it at the game’s default values. Add a new variant group using the tab at the top and repeat the same steps with the different value you want to test. Select Add a variant group if you want to add further variants to the test. However, best practice is to test a single change (one key) so you know whether the variable you’re testing is what impacts your metrics.
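Conceptually, a variant group overrides only the keys it sets; every other key falls through to the control’s defaults. A minimal Python sketch of that layering, using hypothetical key names:

```python
# Hypothetical default config (the control) and one variant that
# changes a single key -- the one change under test.
defaults = {"starting_coins": 500, "welcome_message": "Hi!"}
variant_2 = {"starting_coins": 250}

# The effective config a player in Variant 2 would see: variant values
# override defaults, and untouched keys fall through unchanged.
effective = {**defaults, **variant_2}
print(effective)  # {'starting_coins': 250, 'welcome_message': 'Hi!'}
```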
By default, your players are allocated equally between variant groups. Select Split Manually if you want to control each group’s individual weighting; we recommend splitting evenly.
Add your variants in key-value pairs.
You can change the percentage of players that fall into each group you add. The percentages must add up to 100% to continue.
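The Dashboard handles allocation for you, but the mechanics can be sketched as deterministic, weighted bucketing: hash each player ID into one of 100 buckets, then map bucket ranges to groups whose weights sum to 100, so the same player always lands in the same group. A minimal Python sketch (not Unity’s actual implementation):

```python
import hashlib

def assign_variant(player_id: str, weights: dict[str, int]) -> str:
    """Deterministically assign a player to a variant group.

    `weights` maps variant names to integer percentages that must sum to 100.
    """
    if sum(weights.values()) != 100:
        raise ValueError("variant weights must add up to 100%")
    # Hash the player ID so the same player always lands in the same bucket.
    digest = hashlib.sha256(player_id.encode()).hexdigest()
    bucket = int(digest, 16) % 100  # a number in [0, 100)
    threshold = 0
    for variant, weight in weights.items():
        threshold += weight
        if bucket < threshold:
            return variant
    raise AssertionError("unreachable: weights sum to 100")

# Example: an even 50/50 split between the control and one treatment.
print(assign_variant("player-123", {"Variant 1 (control)": 50, "Variant 2": 50}))
```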
Schedule your Override with a Start and End Date. A shorter runtime means fewer players pass through the test. Ensure you don’t run overlapping tests against the same variable. Set the Override priority and select Finish.
Schedule your A/B test.
Test allocation is captured as part of Analytics events, and reporting is available in the Dashboard. In your Game Overrides list, select the Override name, then select the Reporting tab.
You can filter by:
ARPDAU: Average revenue per daily active user
Daily play time per daily active user
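Both reporting metrics are per-DAU ratios. A small Python sketch with hypothetical daily totals shows how they’re computed:

```python
def arpdau(daily_revenue: float, daily_active_users: int) -> float:
    """Average revenue per daily active user."""
    return daily_revenue / daily_active_users

def playtime_per_dau(total_play_minutes: float, daily_active_users: int) -> float:
    """Average daily play time (in minutes) per daily active user."""
    return total_play_minutes / daily_active_users

# Hypothetical day: $250 revenue and 40,000 minutes of play across 10,000 DAU.
print(arpdau(250.0, 10_000))             # 0.025  ($0.025 per user)
print(playtime_per_dau(40_000, 10_000))  # 4.0    (minutes per user)
```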
You can view results for today, the last seven days, the last 14 days, the last 30 days, or the last quarter.
Report of the A/B test.
Run your test with a goal metric in mind. If you wait until after your test has run to think about how you’re trying to improve player behavior, you might not run the test optimally.
Create a control group and one or more treatment (“variant”) groups so you can compare your change to the game’s default behavior.
Ensure you have at least 10,000 players in each variant group. Too small a group reduces the statistical reliability of the results.
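The 10,000-player guideline can be sanity-checked with the standard two-proportion sample-size formula. A Python sketch, assuming a 5% significance level, 80% power, and a hypothetical goal of detecting a lift in conversion rate from 5% to 6%:

```python
import math

def sample_size_per_group(p1: float, p2: float,
                          z_alpha: float = 1.96, z_beta: float = 0.84) -> int:
    """Approximate players needed per group to detect a shift from p1 to p2.

    Uses the standard two-proportion formula with defaults for a 5%
    significance level (z_alpha = 1.96) and 80% power (z_beta = 0.84).
    """
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    n = ((z_alpha + z_beta) ** 2 * variance) / (p1 - p2) ** 2
    return math.ceil(n)

# Hypothetical goal: detect a lift in conversion rate from 5% to 6%.
print(sample_size_per_group(0.05, 0.06))  # ~8146 players per group
```

Under these assumptions the formula suggests roughly 8,000+ players per group, in line with the 10,000-player guideline; smaller effects require substantially larger groups.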
Test a single change at a time to see whether that change impacts your metrics. Testing multiple changes at once makes it difficult to know which change affected which metric.
When running multiple tests, ensure that you’re not running overlapping tests against the same variable. Keeping tests separate makes it easier to see which variable affects which metric.