The documents distributed by this server have been provided by the contributing authors as a means to ensure timely dissemination of scholarly and technical work on a non-commercial basis. Copyright and all rights therein are maintained by the authors or by other copyright holders, notwithstanding that they have offered their works here electronically. It is understood that all persons copying this information will adhere to the terms and constraints invoked by each author's copyright. These works may not be reposted without the explicit permission of the copyright holder.

Evaluation of Adaptive Serious Games using Playtraces and Aggregated Play Data

Author: Christian Reuter, Florian Mehm, Stefan Göbel, Ralf Steinmetz
Date: October 2013
Kind: In proceedings
Publisher: Academic Conferences Limited
Book title: Proceedings of the 7th European Conference on Games Based Learning
Editor: de Carvalho, Carlos Vaz and Escudeiro, Paula
Keywords: serious games, evaluation, adaptation, playtrace, testbed
Number of characters: 4697
Research Area(s): Serious Games
Abstract: Adaptive Serious Games often feature complex algorithms and models, which influence the player's progression through the game. These models include properties such as pre-existing knowledge or preferred playstyle and are matched at runtime with a pool of appropriately annotated parts of the game, such as assignments or scenes. While these models remain invisible to players during play, they must be visualized for testing and evaluation purposes. To allow authors to interpret the playtraces generated by a gaming session retrospectively, we developed a replay component for adaptive serious games created with the authoring tool "StoryTec". This method removes the need for continuous observation of individual players while retaining the same level of detail, and it is much easier to understand than log files, especially for the non-programming audience addressed by StoryTec. In addition to showing the player's view, it also visualizes the state of the internal models and the progression through the story structure. Because the replay component shares the same models and data structures as the authoring tool and makes their runtime behaviour visible to the author, it offers additional benefits compared to more generic methods such as screen capturing or key recording tools. We also implemented a complementary tool that aggregates a large number of playtraces into one comprehensive spreadsheet for statistical analysis. This allows authors to gain an overview of a large number of players in less time than investigating them individually. To reduce the complexity of the result, the table contains aggregated information such as the total time the players spent in each scene or the final value of variables at the end of their sessions. If authors detect an anomaly, they can then access more detailed information by loading the original traces into the replay component, which uses the same data format.
Together, these two components support the evaluation of adaptive serious games by means of user studies with the intended target audience, for example pupils. By combining them with our testbed for rapid prototyping, "StoryPlay", we were able to provide a set of tools covering a broad range of evaluation tasks based on the same underlying models and data formats. Using these tools, it is possible to gain insights into how the adaptation algorithms behave across a large number of players, e.g. which paths were taken by how many players, or whether the time players needed to solve a task matched the author's estimate.
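The aggregation step described in the abstract (total time per scene and final variable values, one summary row per player) can be sketched as follows. This is an illustrative sketch only: the trace schema used here (a dict with "player", "events", and "variables" keys) is an assumption for demonstration and is not StoryTec's actual playtrace format.

```python
from collections import defaultdict

def aggregate_playtraces(traces):
    """Collapse a list of playtraces into one summary row per player.

    Each trace is assumed (hypothetically) to be a dict with:
      - "player":    a player identifier
      - "events":    chronological (scene, seconds_spent) pairs
      - "variables": final variable values at the end of the session
    """
    rows = []
    for trace in traces:
        # Sum up the time spent in each scene across all visits.
        time_per_scene = defaultdict(float)
        for scene, seconds in trace["events"]:
            time_per_scene[scene] += seconds
        row = {"player": trace["player"]}
        row.update({f"time_{scene}": t for scene, t in time_per_scene.items()})
        row.update({f"final_{name}": v for name, v in trace["variables"].items()})
        rows.append(row)
    return rows

# Two short example sessions; note "intro" is visited twice by p1.
traces = [
    {"player": "p1",
     "events": [("intro", 30.0), ("task1", 120.0), ("intro", 10.0)],
     "variables": {"score": 8}},
    {"player": "p2",
     "events": [("intro", 25.0), ("task1", 200.0)],
     "variables": {"score": 5}},
]
summary = aggregate_playtraces(traces)
# summary[0] -> {"player": "p1", "time_intro": 40.0, "time_task1": 120.0, "final_score": 8}
```

Each row could then be written out as one line of a spreadsheet; an author who spots an anomaly in a row would go back to that player's original trace in the replay component, as the abstract describes.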