“Serious Games for Policy Analysis and Capacity-Building” Workshop Review

As previously alluded to, on Nov. 22-23, I had the good fortune to attend a workshop entitled ‘Serious Games for Policy Analysis and Capacity-Building,’ delivered by Prof. Rex Brynen (McGill, PAXsims) via the Norman Paterson School of International Affairs Professional Training and Development centre.

The course was rich in history, provided extensive examples of modern applications of simulations and wargaming to multiple contexts, and supplied practical tools for building and applying simulations and serious games in the “complex, uncertain environments” to which they are suited.

Rex brought together a wide range of best practices for design and delivery, collected and collated from across the industry and heavily supported by his own practical experience—I would strongly recommend taking a look at his slides, as there are few opportunities to find such a wealth of practical resources on professional simulations in one place. In the coming weeks I expect to highlight a few take-aways and taxonomies raised during the course.

In the meantime, here are a few of the interesting side conversations we held:

  • Terminology is an ongoing issue in simulation-based education and policy analysis. “Matrix analysis training exercises”, “dynamic scenario exercises”, and “table-top training scenarios” are all terms course participants reported using to improve buy-in from players and sponsors and to avoid the “g-word”. “Wargaming” has been accepted by both military and business sponsors, in part thanks to its long legacy (and also the fact that “they like it because it sounds tough and macho”). In social development and humanitarian spheres, however, the term tends to be unpalatable. In establishing Lessons Learned, I have settled on “simulations” as a middle ground between definitional consistency and palatability. The term is still vague, as it lumps together table-top decision-making exercises (what I do) with immersive trainings featuring actors, simulated kidnappings, and paintballs (i.e., hazardous environment awareness trainings), which are already common in the humanitarian field. Another challenge presented by these gymnastics of nomenclature is referring back to the successes of others, when so much of the literature focuses on “serious games”. A useful project may be simply to collate a list of terms, explaining which is preferred and why.
  • The “danger of fun” was introduced. People who enjoy themselves too much during simulation exercises—i.e., when the simulation drifts towards simply being a game—tend to focus on their own enjoyment rather than the learning objectives. They may overlook learning moments or, worse, learn the wrong lessons. Simulations (or “serious games”) can and should be engaging learning products, but building a “fun” product should not be a goal in itself. A related problem in serious game design is being asked for a product that simply demonstrates the sponsor is right on a given topic—“give me a game that says we should implement this project I like” is a common ask, especially in the realm of wargaming.
  • Understanding what a simulation or game does and does not do is important, and we as practitioners have an obligation not to “oversell” the power of simulations and serious games. They should never be offered in isolation: the pre-simulation training and debriefing are as important as the exercise itself. In policy analysis, simulations are not fortune-telling machines: they help you to “find out what could go wrong before you roll out.” 
  • Often, in policy analysis, simulations can only feasibly be run once. A common concern when using probabilistic scenario-building is that the results are not useful because they constitute a study where n = 1. The response given by Rex and other experienced military wargamers: “Well, so is real life.” Good point. 
  • Rex gave a lively discussion of what he calls “the stochastic paradox”: some people just hate dice. Rolling dice gives simulations a loose, “gamey” feel that can undermine buy-in. Ironically, the same people who are suspicious of dice may have complete faith in a computer script that assigns a probability and picks a random number—exactly the same operation (see the sketch after this list)—but the intervention of the computer suggests analytical rigour whereas the dice suggest blind chance. Interestingly, Rex and other experienced participants suggested that cards often work better than dice, because in western culture “cards are destiny”. That association can cause problems elsewhere, notably in the Middle East.
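To illustrate the point about the computer script: here is a minimal sketch, in Python, of the equivalence between rolling a die and having code draw a random number against a probability. The specific event and threshold (success on a roll of 5 or 6, i.e. probability 1/3) are my own, purely for illustration.

import random

# Resolving the same event two ways: a physical d6 versus a script.
# Assumption for illustration: the event succeeds on a roll of 5 or 6,
# i.e. with probability 1/3.

def resolve_with_die() -> bool:
    """Roll a six-sided die; succeed on a 5 or 6."""
    roll = random.randint(1, 6)
    return roll >= 5

def resolve_with_script(probability: float = 1 / 3) -> bool:
    """Assign a probability and draw a random number; succeed if it falls under it."""
    return random.random() < probability

# Both functions perform the identical operation: sample a uniform random
# value and compare it to a threshold. Only the presentation differs.
print(resolve_with_die(), resolve_with_script())

The only real difference between the two is the table theatre surrounding them, which is exactly the paradox Rex was describing.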

In many cases, Rex’s own work was supplemented by course participants: we were fortunate to have experienced practitioners such as Tom Mouat, Tom Fisher, and Jerry Hall in attendance, all of whom helped drive these conversations.

The course featured a run of a sample matrix game from the Matrix Game Construction Kit. While the subject matter was military/political in nature (the manoeuvring of political actors on the eve of the death of a totalitarian leader), it let us explore the thesis/antithesis structure that matrix exercises take. We also had the opportunity to participate in a facilitated run of AFTERSHOCK, a training game developed by Rex and Tom Fisher to model the political and operational dynamics of a natural disaster response.

We participants were left with a considerable volume of ideas to work through—we certainly could have spent another day or two on the topic. Fortunately, some tentative plans to form a “working group” were shared as the workshop closed. Stay tuned!
