Today, we’re excited to announce that we’re releasing A/B testing to everybody who uses GameAnalytics. You can start refining your game and testing out new ideas for free by going to GA Labs and trying it out. And in 2021, we’ll release a premium version when we launch even more features.

A big thank you to everyone who took part in our early access program and gave us feedback. It really helped us make sure that we built the best tool possible. As part of all that work, we’re also bringing a few changes to A/B testing, which we’ll get into later in this blog. If you have any questions or feel like we’ve missed something, feel free to get in touch. We’ll get back to you right away.

But before we dive into those details, let’s start at the beginning.

What is A/B testing?

A/B testing is where you run two versions of your game with small differences. You send version A to one group of players and version B to another, then see which performs better. Once your experiment is done, you can roll out the version that worked best.

This lets you answer questions like: does playtime increase if the ball moves 10% slower? Do people spend more if we give them a free gem every ten or fifteen games? Where’s the best place for the start button? Things like that.
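To make the split concrete, here's a minimal sketch of how a game might assign players to variants. (This is an illustration, not how GameAnalytics implements it; the function and experiment names are made up.)

```python
import hashlib

def assign_variant(user_id: str, experiment: str, variants=("A", "B")) -> str:
    """Deterministically assign a player to a variant by hashing their ID.

    Hashing means the same player always lands in the same group, and
    new players are split roughly evenly between the variants.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# The same player always sees the same version of the game:
assert assign_variant("player-42", "ball-speed-test") == assign_variant("player-42", "ball-speed-test")
```

The key design choice is determinism: you never want a player bouncing between versions mid-experiment, so the assignment depends only on the player and the experiment, not on when they happen to open the game.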

How can A/B testing help you?

You can test your games like a real scientist, leaving all the guesswork behind. What’s more, with A/B testing you can:

  • Grow your game revenue. Increase In-App Purchases and ad revenue faster by making smart changes. All based on accurate data.
  • Optimize your game on the fly, track your experiments, and measure the results in real-time.
  • Boost retention. Learn what your players love. Adjust your gameplay to keep them coming back for more.
  • Confirm your hunch. Instead of guessing, run tests and see how different versions of your game change the players’ experience.
  • Lower the risk of rolling out features. Test your ideas first on a sample of players.

Try it out for free this year

We’re running a free trial, so everyone can now use A/B testing through GA Labs. Studios. Publishers. Indie developers. Whatever your size, you’ll be able to give it a go.

The free trial will run until 1st January 2021, after which you can upgrade to our premium service (or keep using our free version).

We’re still talking to the community about what the pricing models should be, so we’ll share those plans closer to the time. But during the trial, you’ll be able to have up to 10 A/B tests active at once for each of your games. Other than that, there are no limits during the trial. (Except your imagination.)

What are you waiting for? Get started.

What have we changed?

We’ve tweaked how the tool works since early access, based on your feedback. So here are the big differences and important features you’ll notice.

Retention and Conversion as Goal Metrics

When creating a new experiment, you can now pick one of the below as a Goal Metric:

  • Retention (Day 1, 3, 5, 7, 14, 28)
  • Conversion

Once enough new users are in each variant, we calculate whether there’s a winning variant and give you the results in a table (check it out below).

To make sure you draw the right conclusions, we use statistical analysis to check whether the differences happened by chance or were actually caused by your variant. (If you’re interested, we use Bayesian inference to figure this out.)
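To give a flavour of what a Bayesian check looks like, here's a minimal sketch of a standard Beta-Bernoulli comparison for conversion. (The actual model GameAnalytics runs isn't described here and may well differ; the numbers and function name below are invented for illustration.)

```python
import random

def prob_b_beats_a(conv_a, n_a, conv_b, n_b, samples=100_000):
    """Estimate P(variant B's true conversion rate > variant A's).

    Models each variant's conversions as Bernoulli trials with a
    uniform Beta(1, 1) prior, then compares the two posteriors by
    Monte Carlo sampling.
    """
    wins = 0
    for _ in range(samples):
        # Draw one plausible conversion rate per variant from its posterior.
        rate_a = random.betavariate(1 + conv_a, 1 + n_a - conv_a)
        rate_b = random.betavariate(1 + conv_b, 1 + n_b - conv_b)
        if rate_b > rate_a:
            wins += 1
    return wins / samples

# Example: 500 players per variant, A converts 40 of them, B converts 55.
p = prob_b_beats_a(40, 500, 55, 500)
```

With those made-up numbers, p comes out well above 0.5, which is why a raw "B converted more" count isn't enough on its own: the Bayesian posterior tells you how likely it is that the difference is real rather than noise.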

Keep an eye out for an in-depth article on the data science behind our statistical models.

Goal Metric: Conversion

This approach lets you see exactly which variant is performing better for your specific goal, so you can decide how to fine-tune your game. You’ll still be able to see other core metrics, and their comparison to the control group, in the results page.

New status messages

As we now have goal metrics, we’ve improved the status messages to give you better information about how the experiment is going.

An experiment can now have the below statuses:

  • Scheduled
    It’ll start on the date you set.
  • Cancelled
    When you stop an experiment before we’ve calculated the goal metric results.
  • Warning – not getting enough users
    Before we can calculate a winning variant, each one needs at least 500 players. (Bear in mind, we only add new players to a group. Not existing players.)
  • Running – roughly {N} days to go
    {N} is the number of days we think you need before we can run the statistical model.
  • Running – Finalizing – {N} days to calculate retention
    The experiment has enough users, but needs to run for {N} more days to have enough data to figure out the retention.
  • Running – Finalizing
    The test has enough users and results for the goal metric, so you’ll see it soon.
  • Running – Success
    Everything’s ready, but the test is still running and you’re still sending your variants to the players. You’ll need to stop the test to remove players from the experiment.
  • Stopped
    All done. Time for a cuppa.

What else is on the horizon?

We’re always looking for new ways to make A/B testing even better, and our team is constantly updating and improving our features based on your feedback. As always, we want to be as clear with you as possible about when these changes are coming, and what they mean to you. So, here’s a list of all of the minor tweaks, fiddles, and improvements that you can expect to see in the platform soon.

  • Custom Build – The ability to manually enter the target build(s) for an experiment.
  • Weighted Means – We’ll update the Core Metrics table to show weighted mean calculations for ARPAU, ARPPU, and Retention.
  • Ad Metrics – We’ll add Total Ad Views, Total Rewarded Ad Views, and Total Interstitial Ad Views to the core metrics table.
  • More Goal Metrics including Sessions, Playtime, IAP Revenue, and Ad Views.

Ready for your first experiment? Take the guesswork out of your game design and start your A/B test.

Subscribe to our newsletter for the very best games industry news, tips, and stories.
