Mobile Game Analytics Example

This example discusses some key actions to measure for a sample mobile game. The game is called Ancient Blocks and is actually available on the App Store if you want to see the product in full. This example is meant as a starting point; it is not an exhaustive list of everything a mobile game should measure.

Ancient Blocks game in progress Ancient Blocks game in progress Ancient Blocks map screen

Common KPIs

High-level key performance indicators (KPIs) are typically the same across all genres of mobile game. Most developers will have KPIs that include the following:

  • D1, D7, D30 retention - how often players are coming back.
  • DAU, WAU, MAU - daily, weekly and monthly active users, a measurement of the active playerbase.
  • User LTV - the lifetime value of a player (typically measured across various cohorts: gender, location, acquiring ad campaign, etc).
  • DARPU - daily average revenue per user, i.e. the amount of revenue generated per active player per day.
  • ARPPU - average revenue per paying user, a measurement related to LTV but counting only the subset of users that actually pay (see the worked example just below).
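
As a quick worked example (illustrative numbers): if a game took $500 in revenue yesterday from 10,000 daily active users, DARPU is $500 / 10,000 = $0.05. If only 200 of those users actually paid, ARPPU is $500 / 200 = $2.50 - the two figures can tell very different stories about the same revenue.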

There will also be a selection of game-specific KPIs. These give insight into isolated parts of the game so that they can be improved. The ultimate goal is to improve the high-level KPIs by improving as many game areas as possible.

Retention

Player retention is a critical metric - arguably even more important to measure than revenue. If you have great retention but poor user lifetime value (LTV) then you can normally refine and improve the latter. The opposite is not true: it's much harder to monetise an application with low retention rates.

But what exactly is retention? If a user likes your application they will come back and use it again. The retention rate is the percentage of users that come back 1 day later (D1), 7 days later (D7), 14 days later (D14), 30 days later (D30) and so on. For a mobile game a D1 retention rate of over 40% is a good indicator. D7 retention rates are typically half of D1.
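
For example, if 1,000 new players install the game on Monday and 420 of them play again on Tuesday, D1 retention for that cohort is 42%. If 210 of them are still playing the following Monday, D7 retention is 21%.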

Calq provides a retention grid report to easily visualise retention for this example game.

Retention grid for last 14 days

As the game is iterated upon (either by adding/removing features, or adjusting existing ones) the retention can be checked to see whether or not the changes had a positive impact.

Active user base

The DAU/WAU/MAU measurements are industry standard measurements showing the size of your active user base. From here it's easy to spot if your audience is growing, shrinking, or flat.

Active users chart

Active user measurements need to be analysed with the additional context of the retention report. Your user base will appear flat if you are gaining lots of new users but losing existing users (i.e. churn) at the same rate. If this is the case then time may be better spent retaining existing users rather than investing in new ones.

Game tutorial

As with most successful games, new users are shown an interactive tutorial to teach them how to play. This is often the first impression a user gets of your product so it needs to be well refined. With a bad tutorial your D1 retention (and onwards) will be poor.

Ancient Blocks tutorial in progress #1 Ancient Blocks tutorial in progress #2 Ancient Blocks tutorial in progress #3

Goals

The tutorial is designed to teach the new user how to play. The data collected about the tutorial needs to show any areas which could be improved. Typically these are areas where users are getting stuck, or taking too long.

  • Identify sticking points within the tutorial.
  • Iteratively refine these steps to improve tutorial conversion (the number of users that successfully reach the end).

Key unknowns

  • What percentages of users make it through each tutorial step?
  • What percentage actually finish the tutorial?
  • How long do users spend on each step?
  • Are any steps unclear or too difficult?
  • After completing the tutorial how many go on to play the next level?

Actions

Tracking tutorial steps is relatively straightforward. In this example we will use a single action called Tutorial Step. This action needs to include a custom attribute called Step to indicate which tutorial step the user is on (0 indicates the first step). We also want to track how long a user spends on each step (in seconds). To do this we also include a property called Duration.

Calq actually has some in-built time measurement capabilities, but for maximum flexibility we are including the duration as a custom property.

Action Properties
Tutorial Step
  • Step - The current tutorial step (0 for start, 1, 2, 3 ... etc).
  • Duration - The duration (in seconds) the user took to complete the step.

The Tutorial Step action sent to Calq would look something like this (write_key and actor removed for clarity, values illustrative):

{
    "action_name": "Tutorial Step",
    "properties": {
        "Step": 2,
        "Duration": 8.5
    }
}

Reporting

Analyzing the tutorial data within Calq is relatively straightforward. Most of our unknowns can be answered by creating a simple funnel, with one funnel step for each tutorial stage. This shows the conversion rate of the entire tutorial on a step-by-step basis. From here it is very easy to see which steps "lose" the most users.

Tutorial funnel steps Tutorial funnel results

As you can see from the results: step 4 only has a conversion rate of around 97%, compared to around 99% for the other steps. This would be a good step to look at. Even though it's only a couple of percentage points, that still means around $1k in lost revenue just on that step. Per month! For a popular game the difference would be much larger.

Once you have your initial data you can refine parts of your tutorial to improve them. Continue to refine, continue to measure, and continue to improve throughout the life of your game.

Gameplay

The core of any mobile game is always going to be the gameplay itself. It doesn't matter how slick the rest of your product is if the gameplay sucks. So how do we measure this? Standard engagement and retention metrics are great overall indicators and something you should always look to improve. Drilling down into the specifics of gameplay will vary somewhat from game to game. In this example, Ancient Blocks is a level-based puzzle game and the action model will reflect that.

Goals

  • Increase retention (the percentage of your users that come back over time - D1, D7 and D30 are common measurements).
  • Increase engagement (often measured as the amount of time played, number of games played, etc).

Key unknowns

Goals such as "increase engagement" are obvious targets, but it's often quite difficult to decide how to do that in practice. Some initial unknowns which will be useful to quantify include:

  • What percentage of players finish the first level?
  • What percentage of players finish the first 5 levels?
  • Are any levels too difficult?
  • How many times does a player replay a level before they succeed?
  • How long does each level take?
  • How far does the average player get into the game before they stop playing?
  • How many "power ups" does a player use each level?
  • How many blocks does a player move each level?
  • How many launches (explosions) does a player trigger each level?

Actions

This time we are going to use 3 different actions: Gameplay.Start for when a player starts a level, Gameplay.Finish for when a player finishes playing a level (successfully or otherwise), and Gameplay.PowerUp for when a player uses a "power up" ability during a level. An example payload is shown after the property lists below.

Action Properties
Gameplay.Start
  • Level - The number / id of the current level being played.
  • Difficulty - The difficulty setting of the level being played.
Gameplay.Finish
  • Level - The number / id of the current level that was just finished.
  • Difficulty - The difficulty setting of the level that was just finished.
  • Duration - The duration (in seconds) the user took to complete the level.
  • Success - Whether the user successfully completed the level (true) or was defeated (false).
  • PowerUps - The number of times a power up was used in this level play.
  • Blocks - The number of blocks the user moved during this level play.
  • Launches - The number of launches that the player managed to trigger during this level play.
Gameplay.PowerUp
  • Id - The number / id of the power up that was used.
  • Level - The number / id of the current level being played.
  • Difficulty - The difficulty setting of the level being played.
  • After - The amount of time (in seconds) into the level the user was when they used a power up.
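
Following the same format as the earlier Tutorial Step example, a Gameplay.Finish action sent to Calq might look something like this (write_key and actor again removed; all values, including the difficulty name, are illustrative):

{
    "action_name": "Gameplay.Finish",
    "properties": {
        "Level": 7,
        "Difficulty": "Normal",
        "Duration": 94.0,
        "Success": false,
        "PowerUps": 2,
        "Blocks": 31,
        "Launches": 5
    }
}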

Early level progression

One of the first things we can look at is successful progression through the first 5 or 10 levels. This gives us data on the percentage of players reaching each level and indicates whether or not we have our difficulty progression at a reasonable level. The first way to do this is to create a conversion funnel describing the user journey.

We will need 10 funnel steps, one for each of the first 10 levels. The action we will be analyzing is Gameplay.Finish. For each step we will need to apply two filters: first on Level to restrict the step to the correct level, and secondly on Success as we are analyzing successful completions only.

Level funnel steps Level funnel results

There will be a natural user drop-off as not everyone will progress deep into the game, but if certain levels are experiencing a larger drop than expected then those levels will need looking at. A level could be too hard, or it could just be less enjoyable - it will depend on the game in question.

Completion rates

Another indicator we can look at is the number of times a level is played versus the number of times that level is completed. For example, looking at level 3, we can build a query that counts all the times players have finished that level, using the Gameplay.Finish action again. We will need to filter to the correct level using the Level attribute, and group the results by Success. We can easily see the percentages of failed versus successful attempts by displaying the result as a pie chart.

Level success rate chart

Remember that you can save your queries inside Calq so you don't have to keep writing them. Optimization is an on-going process and you will want to revisit previous queries often to see how they have changed over time.

We could also look at how long players spend on each level. If a level takes far longer than the previous one it is probably too hard. To check this we can create another query on the Gameplay.Finish action. This time we filter by level and graph the average duration (AVERAGE(Duration)) for failed and successful attempts. We could then go on to show all plays longer than this average if we wanted.

The actions being sent allow for more detailed analysis on a level by level basis as needed. We could look at whether power ups are being used to pass a particularly hard level, how many launches an average player can achieve and still fail a level, the average time into a level before power ups are used, and much more.

Abandoned sessions

Another metric which is incredibly useful for measuring early play is the number of people that start a level but don't actually finish it - i.e. quit. This is especially useful to measure for the first level after the tutorial has finished. If players are just quitting then they either don't like the game, or they are getting frustrated after the tutorial.

We can create a short conversion funnel within Calq to measure this, using the Gameplay.Start action, the Tutorial Step action (so we can account for players that dropped off before the tutorial was even finished), and the Gameplay.Finish action.

Abandoned session funnel steps Abandoned session funnel results

The results show 64.9% of players that finished the tutorial went on to finish the level (the result between the 2nd and 3rd step). This means 35.1% of players quit the game in that gap. This is rather high, and this metric should be a focus for the Ancient Blocks designers to iterate on and improve.

Remember the goal is to look at any aspects of gameplay which could be improved. Identify a potential change, implement it, and then monitor your metrics to see if they improve. Iterate, measure, refine. This is the key to getting the most out of a mobile game.

In app purchases

Often the goal of a mobile game is to drive revenue. Our example game Ancient Blocks is no different. The game has an in-game currency, Gems, which can be spent on boosting the effects of power ups. Using a power up mid-game also costs one gem.

Players can accrue gems slowly by playing. However, if a player wishes they can buy additional gems in bulk using real world payments.

Goals

We want more players actually spending cash, spending more with us when they do, and doing it more often. The purpose of this section is to improve the IAP flow itself. Improving the standard engagement and retention metrics is heavily related to user LTV (the longer you keep a player, the more likely they are to spend) but that is not the focus of this section.

  • Increase average user lifetime value (LTV).
  • Increase percentage of users that make a purchase.
  • Increase repeat purchases.

Key unknowns

  • How many players look at the IAP options but then do not make a purchase?
  • How many players try to make a purchase but fail?
  • Which user journey to the IAP screen gives the best conversions?
  • What is the most popular item?
  • What is the most popular cost bracket?
  • What percentage of customers make a repeat purchase?
  • Which customer source generates the most valuable customers?

Actions

We want to track a few actions here. Firstly we want information about when a user actually buys something with real world cash using in-app purchasing. This event will be called Monetization.IAP, and will be tracked using the trackSale(...) client method as it represents a revenue action. It will also be useful to track users who attempted the IAP process but were unable to complete it. Some information is normally given back by the store provider (whether that be iTunes, Google Play etc) and we can store that with an action called Monetization.FailedIAP.

We also want to know what users are spending their gems on. We already have an event from the gameplay section for every time a power up is used (Gameplay.PowerUp), but now we are also going to add Monetization.Spend to represent a user spending gems in the in-game shop.

As well as measuring what players are spending on, it's just as important to know how they reached the store in the first place. If a particular entry point (such as an in-game prompt) generates the most sales, then you will want to trigger that prompt more often (and probably refine its presentation). We want actions that record the game shop and IAP screen being shown, including some data about how they came to be shown - a referrer, if you like. This will be done with an action called Monetization.Shop (an example payload is shown after the property lists below).

Action Properties
Monetization.IAP
  • ProductId - The number / id of the product or bundle being purchased.
  • FirstTime - Is this a first time purchase by this user?
  • MaxLevel - The highest level the user has reached in the game when making this purchase.
  • ScreenReferrer - Identifies the screen / prompt / point of entry that eventually triggered this purchase.
  • $sale_value (added by trackSale(...)) - The value of this sale in real world currency.
  • $sale_currency (added by trackSale(...)) - The 3 letter code of the real world currency being used (e.g. USD).
Monetization.FailedIAP
  • ProductId - The number / id of the product or bundle that failed to be purchased.
  • Response - A response code from the payment provider.
  • Message - A message from the payment provider.
Monetization.Spend
  • ProductId - The number / id of the item being spent on.
  • Type - The type of spend this is (Item Upgrade, Cooldown, Lives, etc).
  • Gems - The number of gems being spent.
  • MaxLevel - The highest level the user has reached in the game when making this purchase.
  • ScreenReferrer - Identifies the screen / prompt / point of entry that eventually triggered this purchase.
  • $sale_value (added by trackSale(...)) - The value of this sale in real world currency.
  • $sale_currency (added by trackSale(...)) - The 3 letter code of the real world currency being used (e.g. USD).
Monetization.Shop
  • Screen - Which shop screen this was (such as the main shop, the IAP shop etc).
  • ScreenReferrer - Identifies the screen / prompt / point of entry that resulted in the shop being displayed.
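
As a sketch of the referrer pattern described above, a Monetization.Shop action might look like this (the screen and referrer names are illustrative assumptions, not values the game necessarily uses):

{
    "action_name": "Monetization.Shop",
    "properties": {
        "Screen": "IAP Shop",
        "ScreenReferrer": "Out of gems prompt"
    }
}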

In addition to the properties listed above, we would also normally send a range of global properties detailing how we acquired the customer in question (which campaign, which source, etc). This allows us to do further analysis on which customer sources convert better, which spend more, and which are ultimately better for user LTV.
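
Putting this together, a Monetization.IAP action recorded via trackSale(...) might look something like the following sketch. The $sale_value and $sale_currency properties are added by trackSale(...) as noted above; the product id, referrer, and Campaign global property are illustrative assumptions, and we are assuming global properties are merged into the action's property list:

{
    "action_name": "Monetization.IAP",
    "properties": {
        "ProductId": "gems_bundle_medium",
        "FirstTime": true,
        "MaxLevel": 12,
        "ScreenReferrer": "Out of gems prompt",
        "$sale_value": 4.99,
        "$sale_currency": "USD",
        "Campaign": "summer_promo"
    }
}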

IAP conversions

One of the most important metrics is the conversion rate for the in game store, i.e. how many people viewing the store go and make a purchase with real world currency.

In a typical freemium game in the style of Ancient Blocks, around 2% of players will actually make a purchase. However, the store to purchase conversion rate is typically much lower as the store is often triggered many times in a game session. If a game is particularly aggressive at funnelling players towards the store screen then the conversion rate could be even lower - and yet still be a good conversion rate for that game.

To analyse the conversion rate we are going to start with a funnel. We want to know the percentage of users that see the IAP store and then go on to make a purchase. To do this we use a simple 3-step funnel. The first step is on the action Monetization.Shop with the Screen property set to the main shop, followed by the same action again with the Screen property set to the IAP shop. The last step is the Monetization.IAP action.

In app purchase conversion funnel steps In app purchase conversion funnel results

As you can see, the conversion rate in Ancient Blocks is 1.36%. This is lower than expected and is a good indicator that the whole process needs refining. As the authors of Ancient Blocks modify the store page and the flow, they can revisit this conversion funnel to see if the changes were positive.

IAP failures

It's useful to keep an eye on the failure rates of attempted IAPs, where a customer has attempted to make a purchase but it did not complete. This can easily be measured using the Monetization.FailedIAP action from earlier.

Looking at why payments are failing allows you to try and do something about it - though a lot of the time it's out of the developer's control. Sharp changes in IAP failure rates can also indicate problems with payment gateways, API changes, or even attempts at fraud. In each of these cases you would want to take action proactively.

In app purchase failure rates over time In app purchase failure rates by failure reason

The reasons given for failure vary between payment providers (whether that's a mobile provider such as Google Play or the App Store, or an online payment provider such as Paddle). Depending on your provider you will get more or less granular data to act upon.
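
For example, a failed purchase might be recorded like this (the response code and message are placeholders - the real values depend entirely on the payment provider):

{
    "action_name": "Monetization.FailedIAP",
    "properties": {
        "ProductId": "gems_bundle_medium",
        "Response": 2,
        "Message": "Payment cancelled by user"
    }
}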

Comparing IAPs across customer acquisition sources

Most businesses measure the conversion effectiveness of acquisition campaigns (e.g. the number of impressions vs the number of people that downloaded the game). Using Calq this can be taken further to show the acquisition sources that actually went on to make the most purchases (or spend the most money etc).

Using the Monetization.IAP or Monetization.Spend actions as appropriate, Calq can chart the data based on the referral data set with setGlobalProperty(...). Remember that you may have more players from one source than another, which could bias the results. You want the query to be normalised by the total players per source.
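
As an illustration (made-up numbers): 80 purchases from 4,000 players acquired via cross-promotion is a 2% purchase rate, while 50 purchases from 1,000 players acquired through a paid campaign is 5% - the smaller source is actually the more valuable one per player, which a raw purchase count would hide.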

In app purchase campaign sources

The results indicate which customer sources are spending more, and this data should be factored in to any acquisition budgets.

Summary

This example is just meant as a starting point for you to build on. The live version of Ancient Blocks actually measures many more data points than this, and so should you.

Key take away points:

  • The ultimate goal is to improve your core KPIs (retention, engagement, and LTV), but you need to drill down and measure many components of your game to do this.
  • Metrics are often linked. Improving one metric will normally affect another and vice versa. You need to approach your product as a whole.
  • Propose, test, measure, and repeat. You should always be adding refinements or new features to your product. Measure their impact each time. If a change works then refine it further. If it doesn't then you should probably remove it.
  • Measure everything! You will likely want to answer more questions of your product later, and you will need data to answer these questions.