Most game design questions would be easily answered if you could just test your players’ reactions to different solutions. Now you can do just that with our new A/B Testing feature.


An A/B test is an experiment that runs for a particular audience during a specific timeframe, showing different versions of a game to different groups of players within that audience. You'll be able to compare the different player groups' behaviour when viewing the test results.

Test creation - Segmentation

Before you get started with creating and running experiments, please ensure that you have integrated our latest SDKs with your games and have the remote config calls set up correctly. When you create an A/B test, you will first need to define an audience group that you would like to run the experiment on by selecting a combination of:

  • Build
  • Target Percentage
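As a rough illustration of the remote config call mentioned above, the client reads the test value for a key and falls back to a default when the player has no value (the helper and key names below are hypothetical; the real call depends on your SDK and platform):

```python
# Hypothetical client-side helper: your SDK exposes an equivalent
# remote-config getter; the names here are illustrative only.
def get_remote_config(configs: dict, key: str, default: str) -> str:
    """Return the A/B test value for `key`, or `default` when the
    player has no value for it (e.g. the control group)."""
    value = configs.get(key)
    return value if value is not None else default

# Players in the control group receive no value for the key,
# so they fall back to the shipped default.
control_configs = {}                       # control group: key absent
variant_configs = {"shop_layout": "grid"}  # variant group: key present

print(get_remote_config(control_configs, "shop_layout", "list"))  # list
print(get_remote_config(variant_configs, "shop_layout", "list"))  # grid
```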

Your experiment will only target New Users; existing players will not be included in any A/B tests. Once a new player has entered an A/B test, they will remain exclusively part of that experiment until you stop it.

You can optionally include or exclude certain Countries and OS Versions in your experiment.

Test creation - Variants

Now you can define the remote config and variant groups:

- Config Key:

  • Create the key that you want to perform the A/B test on. This key will need to be supported in all the builds previously selected.

- Variants:

  • Each different config value to be tested.
  • You can create up to 3 variants, allowing for 4 test groups in total, including the control group.
  • Players included in the control group will not receive any value for the A/B test config key.
  • Players will be randomly allocated to the control group or a variant.
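The random allocation described above can be sketched as a deterministic hash of the player and experiment IDs, which keeps a player in the same group for the lifetime of the experiment. This hashing scheme is an assumption for illustration, not the platform's actual implementation:

```python
import hashlib

def assign_group(player_id: str, experiment_id: str, n_variants: int) -> int:
    """Deterministically assign a player to group 0 (control) or
    1..n_variants. Hashing the player and experiment IDs together keeps
    the assignment stable for the whole experiment. Illustrative only."""
    digest = hashlib.sha256(f"{experiment_id}:{player_id}".encode()).hexdigest()
    return int(digest, 16) % (n_variants + 1)  # control + up to 3 variants

# The same player always lands in the same group for a given experiment:
group = assign_group("player-42", "exp-1", 3)
assert group == assign_group("player-42", "exp-1", 3)
assert 0 <= group <= 3
```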

Test creation - Metric

The next step is to select the Goal Metric that determines the experiment winner. The metrics available now are Conversion and Retention; Playtime per User, Playtime per Session, and Session Count are coming soon.
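As a rough illustration of the two available goal metrics (the platform's exact definitions may differ): conversion is the share of players in a group who performed the goal action, and retention is the share who returned N days after install.

```python
def conversion_rate(converted: int, total: int) -> float:
    """Share of players in a group who performed the goal action."""
    return converted / total if total else 0.0

def retention_rate(returned_day_n: int, installs: int) -> float:
    """Share of players who came back N days after install."""
    return returned_day_n / installs if installs else 0.0

# Example comparison between a control and a variant group:
print(conversion_rate(120, 1000))  # 0.12
print(conversion_rate(150, 1000))  # 0.15
```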


Test creation - Summary

Here you can:

  • See an overview of the A/B test specification
  • Give your test a title
  • Start the test


Experiment Overview Page

The overview page shows all your experiments by status: you can see which tests are running, scheduled, completed, and cancelled. You may also see warning messages when there aren't enough players taking part in a test, i.e. when the sample size is too small.

This is a great place to get a general overview of your experiments.
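The small-sample warning above can be thought of as a simple threshold check per group. The threshold below is an arbitrary placeholder for illustration; the platform's actual cutoff is not documented here:

```python
MIN_PLAYERS_PER_GROUP = 1000  # placeholder threshold, not the real cutoff

def sample_size_warnings(group_sizes: dict) -> list:
    """Return the names of groups whose sample size is too small."""
    return [name for name, n in group_sizes.items() if n < MIN_PLAYERS_PER_GROUP]

print(sample_size_warnings({"control": 5200, "variant_a": 640}))  # ['variant_a']
```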


Results page - Completed test

Once a test has completed, the results page shows the experiment's outcome. At the top you'll see the experiment details along with the goal metric comparison between the variants. What's interesting here is that you can see which variant has the highest probability of being the best.

Probability to be the best is the probability that a given variant's goal metric is the highest among all the variants. If this probability exceeds a significance threshold for a given variant, we declare that variant the winner. With more than two variants, regardless of whether there is an overall winner, one or more variants can still be statistically significant improvements over the control variant.
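For a conversion goal, "probability to be the best" can be estimated by sampling each variant's conversion rate from a Beta posterior and counting how often each variant comes out on top. This Bayesian sketch is an assumed method for illustration; the platform's exact statistical model is not documented here:

```python
import random

def prob_to_be_best(groups: dict, draws: int = 20000, seed: int = 0) -> dict:
    """Estimate P(variant has the highest conversion rate) per group.

    `groups` maps group name -> (conversions, players). Each group's rate
    gets a Beta(conversions + 1, failures + 1) posterior; we sample all
    posteriors jointly and count wins. Illustrative sketch only.
    """
    rng = random.Random(seed)
    wins = {name: 0 for name in groups}
    for _ in range(draws):
        samples = {
            name: rng.betavariate(conv + 1, total - conv + 1)
            for name, (conv, total) in groups.items()
        }
        wins[max(samples, key=samples.get)] += 1
    return {name: count / draws for name, count in wins.items()}

# variant_a converts at ~14% vs ~10% for control with 1000 players each,
# so it should win the large majority of posterior draws:
result = prob_to_be_best({"control": (100, 1000), "variant_a": (140, 1000)})
```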


Underneath this, we can see a comparison of the core KPIs between each variant.

Finish Test

Tests can be stopped (cancelled) before or after they complete, although we strongly suggest letting them run to completion so that enough data is collected for us to identify the winning variant. There are two ways to stop (cancel) a running experiment:

  • Select "Stop" from the overview page actions
  • Select "Stop" from the View Results page

You also have the option of copying and deleting cancelled tests.
