
Measuring Player Experience

If we can analyze a player's behaviour and work out whether or not they are having a good experience playing the game, and if not why, we can prevent a lot of the revenue that is lost when players quit.

A topic that is currently gathering a lot of interest in the games research community is how player behaviour in-game relates to the subjective experience of playing a game. However, this comes with some challenges.

One of the main challenges is that what we are actually doing in this situation is inferring experience from behavior. There is a substantial amount of anecdotal evidence supporting this approach, e.g. game user research reports at the Game Developers Conference over the past few years. Companies such as Microsoft and EA have released case examples that strongly indicate that in practice, and especially with experience, it is possible to go a long way towards debugging the playing experience of a game just by analyzing behavioral telemetry data.

Fun meters

Back in 2005, Nicole Lazzaro and Larry Mellon proposed the use of “fun meters” in games, essentially the collection of metrics of user behavior that are indicative of whether or not the players are enjoying the gaming experience. For example, looking at what people spend their money on in The Sims Online as an indicator of what entertains them in the game. Conversely, if it can be observed from behavioral data that a player keeps dying over and over, and spends several minutes in vain trying to progress through a game level, that player is probably not enjoying the experience.
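To make that last example concrete, here is a minimal sketch of such a "fun meter" heuristic in Python. The telemetry fields and thresholds are hypothetical and would need tuning (and validation) for any real game:

```python
# A minimal sketch of a "fun meter" heuristic in the spirit described above.
# The event fields (player_id, level, deaths, seconds_in_level, completed) and
# thresholds are hypothetical; real telemetry schemas differ per game.

def flag_possible_frustration(session, max_deaths=10, max_seconds_stuck=300):
    """Flag a level session where the player keeps dying and makes no progress."""
    stuck = (not session["completed"]) and session["seconds_in_level"] >= max_seconds_stuck
    dying_repeatedly = session["deaths"] >= max_deaths
    return stuck and dying_repeatedly

sessions = [
    {"player_id": "p1", "level": "3-2", "deaths": 14, "seconds_in_level": 420, "completed": False},
    {"player_id": "p2", "level": "3-2", "deaths": 2,  "seconds_in_level": 180, "completed": True},
]

frustrated = [s for s in sessions if flag_possible_frustration(s)]
print(frustrated)  # only the p1 session is flagged as a possible frustration point
```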

Inferring player experience (PX) from behavioral telemetry can, however, be prone to errors, as it is not possible to verify whether conclusions drawn from behavioral analysis are correct unless some measure of the playing experience is also collected, e.g. a survey or interview. The key to solving this issue is method triangulation, i.e. obtaining both an experience measure and behavioral telemetry.
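As a small illustration of what triangulation can look like in practice, the sketch below pairs a behavioral metric from telemetry with a self-reported survey score and checks how well they agree. All numbers are made up for the example:

```python
# A minimal sketch of method triangulation: pair a behavioral metric from
# telemetry with a self-reported experience measure from a post-play survey
# and check whether they agree. The data below is invented for illustration.
from statistics import correlation  # available in Python 3.10+

deaths_per_session = [2, 5, 14, 1, 9, 12, 3]   # behavioral telemetry
survey_enjoyment   = [5, 4, 1, 5, 2, 2, 4]     # 1-5 self-reported enjoyment

# A strong negative correlation would support using death counts as a proxy
# for (lack of) enjoyment in this particular game; a weak one would not.
print(correlation(deaths_per_session, survey_enjoyment))
```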


Breaking experience

User experience data in game testing is generally obtained using qualitative or semi-quantitative methods, such as user feedback (e.g. attitudinal data) via interviews or surveys, potentially in combination with usability-oriented testing. Usability testing generally focuses on measuring the ease of operation of a game, while playability testing explores whether users have a good playing experience, measuring this using e.g. small pop-up surveys. In comparison, gameplay metrics analysis offers insight into how users are actually playing the games being tested.

The problem with play experience measures such as surveys is that they interrupt the interaction flow between player and game. An alternative approach, combining metrics with psycho-physiological measures (e.g. heart rate), has been explored in academia but has yet to see widespread use in industrial development, as there are a range of practical and financial challenges. For now, the approach championed by Microsoft Game Studios, where simple pop-up surveys combined with behavioral telemetry are used to detect experience-related problems, remains the most widespread in the industry.

Mining experience

In recent years, however, sophisticated data mining on behavioral telemetry, performed by academics working in concert with game developers, has indicated that it may be possible to detect play experience from behavior – maybe even using automated systems. One of these studies, led by Dr. Alessandro Canossa from the IT University of Copenhagen in collaboration with IO Interactive, developed a model that could detect behaviors that aligned with playtesters feeling frustrated in the FPS Kane & Lynch: Dog Days. This kind of research points to a potential way of solving the underlying problem of analyzing behavior and experience jointly.
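As a rough illustration of what such a pipeline can look like (not the actual model or features from that study), the sketch below trains a simple classifier to predict self-reported frustration from per-session telemetry features. The feature names, data and labels are invented for the example:

```python
# A rough sketch of predicting self-reported frustration from behavioral
# telemetry, in the spirit of the research described above. Feature names,
# data and labels are invented for illustration only.
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Each row: [deaths, retries, seconds_idle, checkpoints_reached]
X = [
    [12, 9, 240, 1],
    [1, 0, 20, 5],
    [8, 6, 180, 2],
    [2, 1, 35, 4],
    [15, 11, 300, 0],
    [0, 0, 10, 5],
    [9, 7, 200, 1],
    [3, 2, 60, 4],
]
# Labels from post-session surveys: 1 = reported frustration, 0 = did not
y = [1, 0, 1, 0, 1, 0, 1, 0]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
model = LogisticRegression().fit(X_train, y_train)
print(accuracy_score(y_test, model.predict(X_test)))
```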