Successful Professional Wargames - Graham Longley-Brown
A PRACTITIONER’S HANDBOOK
There are more than seventy books currently edited or written by John Curry as part of the History
of Wargaming Project. Those on professional wargames used by the military include:
The Matrix Games Handbook: Professional Applications from Education to Wargaming
Matrix Games for Modern Wargaming: Developments in Professional and Educational Wargames:
Innovations in Wargaming Volume 2
The Sandhurst Kriegsspiel: Wargaming for the Modern Infantry Officer: Training for War: Volume 1
Peter Perla’s The Art of Wargaming: A Guide for Professionals and Hobbyists
Dark Guest Training Games for Cyber Warfare Volume 1: Wargaming Internet Based Attacks
Thomas Allen’s War Games: Professional Wargaming 1945-1985
Innovations in Wargaming Vol 1: Developments in Hobby and Professional Wargames
Army Wargames: Staff College Exercises 1870-1980
Contact! The Canadian Army’s Restricted Tactical Wargame (1980)
Dunn Kempf: The US Army Tactical Wargame (1977-1997)
Tacspiel: The US Army Wargame of the Vietnam War (1966)
The British Army Wargame (1956)
The British Army Desert Wargame (1978): MOD Wargaming Rules
Andrew Wilson’s The Bomb and the Computer: The History of Professional Wargaming 1780-1968
See the History of Wargaming Project at www.wargaming.co for other publications.
SUCCESSFUL PROFESSIONAL WARGAMES
A PRACTITIONER’S HANDBOOK
Figure 1-1. The Military Academy of the General Staff of the Armed Forces of Russia, which
features a wargaming centre[28]
In 2017, Russia re-opened their Wargames Centre in the Military Academy of the General Staff of
the Russian Armed Forces. The Commandant of the Academy, Lieutenant-General Sergei
Kuralenko, said: '…after refurbishment, in the Military Academy of the General Staff, is the
Centre of Wargames. It is a multi-media complex and… will be able to conduct inter-service
wargames, as well as [other] measures of an operational training nature.'[29] If you have
recently wargamed across a map featuring the Baltic States, Syria, Ukraine or the Arctic ‘High
North’, rest assured that there will be a startlingly similar map somewhere on a table or
computer screen inside that building.
Red Dragon Rising[30]
An increasingly assertive China (re-)started wargaming in 2008. Devin Ellis gave an overview of
wargaming within the People’s Liberation Army (PLA) in a talk to Connections UK in 2015.[31]
The PLA use wargames for educational and training purposes. The majority are kinetic,
operational-level wargames, but political-military and cyber games also feature. Although the
learning curve is steep for the Chinese, ‘those parts of the PLA which are interested in
improving [wargaming] are avidly seeking best practices and methodological insights to improve
and inform [their wargames].’[32]
"The only thing that ever really frightened me during the war was the U-boat peril." Winston Churchill. The
Western Approaches Tactical Unit (WATU) was a dedicated training and analysis team created in January 1942
and tasked to improve the development and dissemination of new tactics to Royal Navy and Allied vessels
escorting convoys across the Atlantic. Using innovative analytical methods, WATU developed a range of tactics
during the war and disseminated these to over 5,000 officers through lectures and tactical wargames. Many of
these appeared in the Atlantic Convoy Instructions and were used with considerable success by Allied naval
forces during the decisive engagements of the Atlantic War.
In January 1942, Captain Gilbert Roberts was summoned to the Admiralty and tasked to train a small team of
analysts, and identify tactics that could be used to turn the tide of the battle in the Atlantic. The seriousness of
the appointment was made even clearer after a brief face-to-face meeting with the Prime Minister who growled
"find out what is happening in the Atlantic, find ways of getting the convoys through, and sink the U-boats!”
The WATU facility was primitive, with tables, a tactical floor divided into squares, basic ship models, and a
lecture theatre, but Roberts quickly got to work. A basic set of wargame rules was developed, and a set of
processes was designed to represent real-time decision cycles, tactical doctrine, and communications issues.
Then the room was re-arranged so that players representing escort commanders could only see the game play
through a restrictive canvas screen, representing the limited information they would have in a real battle, while the
adjudication team moved the model ships according to the orders submitted by the players and their unseen
adversaries… The U-boat track was drawn on the tactical floor in a brown chalk line, so it would be invisible to
players looking through their canvas slit but allow the umpires and ‘movers’ to follow its progress.
…It was getting late but [Wrens] Raynor, Laidlaw and Okell stayed behind to test the concept on a convoy
escorted by six vessels. A range of U-boat attack options were tested… As Roberts re-examined Laidlaw's
detailed plots from each game and her meticulous record of the discussions, he realised that a U-boat that
evaded an escort would probably dive and come up again astern of the convoy. The team agreed that he was
onto something and volunteered to continue wargaming… As dawn rose, the exhausted team were sent home
and Roberts arranged a demonstration. A sceptical Sir Percy Noble arrived with his staff the next day and
watched as the team worked through a series of demonstrations…Sir Percy’s demeanour changed dramatically
as the demonstration unfolded. Unlike every other approach, the solution WATU had identified was based upon
the U-boat commander’s most logical course of action and not just a reaction to a stricken merchant vessel. The
new tactic was immediately sent up to the Admiralty and Roberts was promoted on the spot.[41]
When Roberts accepted his award as Commander of the British Empire at the end of 1943, he took a Wren
officer and rating with him to Buckingham Palace, intentionally sharing the honour with the team of remarkable
young women that helped WATU contribute to winning the Battle of the Atlantic.
The utility of wargaming is recognised by significant actors far beyond Defence. Wargaming
does not pertain just to war. Condoleezza Rice, writing in the Harvard Business Review in May
2018, said, ‘Reducing blind spots requires imagination. As one major investor told us, “The
biggest mistake is believing the future will look like the present. It almost never does.” His firm
trains all its associates to ask a simple question, over and over again: What if we are wrong?
Scenario planning, wargaming exercises, and other methods can also help firms identify hidden
risks.’[51]
Business, government departments, emergency services, crisis management organisations,
humanitarian organisations and academia increasingly use wargaming. Many of my ‘lightbulb
moments’ can be laid at the feet of business wargamers. This is because current Defence
wargamers tend to apply wargaming to theoretical contexts, whereas business practitioners use
the technique for real. Their quest for market share, profit, stress-tested strategies and so forth
is conducted in the harsh reality of the business world. Humanitarians and the emergency
services even more so; real lives depend on the effectiveness of their wargames.
General Andrew Sharpe
General Andrew Sharpe sums up the rationale for wargaming as follows:
Modern armies pour a lot of effort into a mixture of practicing tactics and procedures as a
repetitive drill, or into the planning process. Constraints on time, resources and commitments
often mean that they spend very little time, however, in practicing the execution of their plans,
or in dealing with the consequences (expected or unexpected) of the execution of their plans.
Opportunities for ‘free play live exercises’, in which teams, units or formations can practice
executing their plans against an active, thinking, disruptive and belligerent opponent are rare.
And, because they are rare, such training opportunities tend to be carefully constrained to
practice the executors in pre-arranged aspects of warfare with ‘goals’ and 'training aims’.
Exercises are rarely allowed to flow like wars - unpredictable, frictional, frustrating: the
military mind simply isn’t like that. Despite the need to be comfortable with and in chaos,
military trainers tend to use their precious time in a very controlled way, in a controlling
environment, for controlled outcomes. But war simply isn’t like that. Planning is a vital military
skill, but dealing with what happens after, inevitably, ‘no plan survives contact with the enemy’
is an even more vital skill. And, like so many such skills that rely as much upon intuitive
reasoning as on a rigid adherence to process, expertise can only be found in those who
practice that skill regularly.
This is where wargaming is such an essential tool for professional warfighters. When time
and material resources are short, and where career sensitivities mean that there is a fear of
‘failing’ during rare live training opportunities, wargaming offers professional soldiers a
vehicle in which they can train, safely and effectively, in the execution of their plans; in which
they can take the risks that they identify as the key opportunities with no fear of failure. The
‘fitness programme for the brain’ that is alluded to below, like any other fitness programme,
only works if it is conducted regularly and systematically. Wargames are competitive, like war;
outcomes are uncontrolled, like war; luck plays its part, like war; and those who tend to
succeed are those who habitually use a mixture of procedural understanding and intuitive
reasoning as events, unexpectedly, unfold (like in war). Simple manual wargames offer a cost-
effective, easy-to-conduct, safe-to-fail, and, if one dares to say it, enjoyable way of practicing
(for professionals) the simulated realities of decision-making in war. (Which is interesting, for
if one substituted: ‘sport offers a cost effective, easy-to-conduct, safe-to-fail and enjoyable
way of keeping soldiers fit to fight' no military person would question one’s logic.) And, as
every doctrinal pamphlet will tell you, the successful military leader, in war, is the person who
can seize and hold the initiative inside the decision-cycle of his opponent. So, a soldier’s
brain, like a soldier’s body, needs to be fit for purpose, and wargaming provides a very
portable mental gym in which genuinely professional officers can conduct their mental fitness
training.
Of course, completely aside from the routine training benefits described above, wargaming
also offers two other great benefits: informed force development and mission rehearsal. On
the former, those charged with developing future forces, whether in terms of equipment
capabilities, tactical developments or organisational changes, can try out ideas as they
develop. Wargaming has become an established and accepted tool for both military and
scientific force developers. On the latter, planners are afforded the opportunity to test and re-
test their actual operational plans. Before the chaos of execution, plans can be played through
(if time permits, repeatedly played through). This has two immediate benefits. First, plans tend
to become better and better as they are honed through the experience of success and failure
as they are played out. Second, teams that have played out the execution of their plans
against a genuinely belligerent opponent will be more comfortable in the inevitable confusion
of execution that will start the moment that the plan rolls into action, and thus be better pre-
prepared to succeed in that uniquely competitive environment.
Matt Caffrey’s On Wargaming[52]
Matt’s book was published as mine was in its final draft. Unsurprisingly, there are close parallels
between the two. Rather than incorporate Matt’s wisdom into this chapter, you should read his
Chapter 8 (The Utility of Wargaming)[53] for a slightly different perspective.
The benefits of good wargames
‘Games are intensely stimulating; people are very active; ideas and conjectures get tossed
around and analysed by a highly motivated group of people; a great deal of expertise is
collected in a single room, expertise that is not often collected together; and people discover
facts, ideas, possibilities, capabilities and arguments that do not in any way depend on the game
but nevertheless emerge from it.’[54]
Thomas Schelling (1964)
I show the benefits of good wargames on a single slide at the start of a presentation (Figure 1-
3). I leave this up for a good two minutes with no commentary beyond having posed the
question, “How else might you achieve these?”
Each of the bullets deserves discussion, but here are just two comments:
A ‘fitness programme for military thinking’. Talking to Connections UK in 2013, General Andrew
Sharpe pointed out that the officer class is the conceptual component of fighting power, and so
needs to hone the brain as much as the brawn: “It’s about minds, not stuff [equipment].”
‘Build confidence and trust in a team and pre-position relationships.’ Plato said, ‘You can
discover more about a person in an hour of play than in a year of conversation.’[55]
Figure 1-3. The benefits of wargaming
This bullet was introduced at Connections UK 2013 by David Hockaday, a humanitarian. I am
amazed at the richness of discussion between members of what should be one team during the
simplest of wargames. It often seems as though the G3 (operations) and G4 (logistics) staff, or
the Business Development and Operations personnel, never talk, and relish the opportunity to
establish a mutual understanding in a wargame.
However, there is a risk that moving beyond 'getting to know someone' into explicit assessment
can prevent one of the key characteristics of successful wargames discussed in chapter 6:
freedom to fail. Knowing that they are being assessed can significantly dampen someone’s
willingness to innovate and experiment with no fear of failure.
When talking to businesspeople, I usually follow the slide above with one directly relevant to the
business environment (although business folk relate well to military ideas):
· A greater understanding of competitors, frictions and market forces.
· Proactively managing, and reacting to, emerging risks and opportunities.
· Becoming comfortable with operating in dynamic and chaotic situations.
· An opportunity to practise or test:
o Risk Management.
o Business Continuity.
o Resilience.
o ‘What if’ questions.
o Collaborative ventures.
The limitations of wargaming
The references below address wargaming challenges and recommendations:
Pettijohn and Shlapak (18 February 2016) War on the Rocks article, ‘Gaming the system:
obstacles to reinvigorating defense wargaming’.[56]
Downes-Martin (2014) ‘Your Boss, Players, and Sponsor - The Three Witches of War
Gaming’.[57]
Downes-Martin, Rubel and Weuve (2004) ‘Pathologies of war gaming and what to do about
it’.[58]
Downes-Martin (2016) ‘Wargaming to Deceive the Sponsor: How and Why?’[59]
Perla and McGrady (2011), ‘Why Wargaming Works’, Naval War College Review, Vol. 64, No.
3 (specifically the ‘Cautions’ section, pp.123-125).[60]
The Wargaming Handbook introduces the main limitations of wargaming, but I include
supplementary comments after the extract.
Wargaming Handbook extract
Wargames are not a panacea and should only be applied when appropriate. Wargames are not reproducible.
Wargames are driven by player decisions, and players will make different choices even when presented with the
same situation. Add the element of chance inherent in wargaming, and no game will ever be the same, even when
the starting situation is replicated. Of course, it is this very unpredictability, coupled with the creativity of
participants, which enables wargames to generate new ideas.
Wargames are qualitative. If the output required is numerical, a wargame is unlikely to be an appropriate tool.
While most wargames include mathematical systems that produce numerical results, precise outcomes will vary.
Wargames can complement, but are not a substitute for, more rigorous or detailed forms of analysis. Wargames
are best used to inform decisions by raising questions and insights, not to provide a definitive answer.
Wargames are only as good as the participants. An uninformed, unqualified or overconfident wargame team is
unlikely to add value. Furthermore, the product of a successful wargame will be of benefit only if it is considered by
the sponsor. Greater diversity among participants is likely to generate richer collective insight. In some cases,
having military officers as the only participants, or having military officers with common experiences and
perspectives, may limit the quality of the game.
Wargames are not predictive. Wargames illustrate possible outcomes, so there is a risk of false lessons being
identified from a single run of a wargame. Wargames can illustrate that something is plausible, but will not be able
to definitively predict that it is probable.[61]
There is no single, commonly accepted, definition of ‘wargaming’. NATO defines a wargame as: ‘a simulation of a
military operation, by whatever means, using specific rules, data, methods and procedures.’[83] The importance
placed on the decisions of the wargame players, not contained in the NATO definition, means this handbook uses
the working definition of wargaming contained in the Red Teaming Guide:
‘A scenario-based warfare model in which the outcome and sequence of events affect, and are affected by, the
decisions made by the players.’[84]
Wargaming is a decision-making technique that provides structured but intellectually liberating safe-to-fail
environments to help explore what works (winning/succeeding) and what does not (losing/failing), typically at
relatively low cost. A wargame is a process of adversarial challenge and creativity, delivered in a structured format
and usually umpired or adjudicated. Wargames are dynamic events driven by player decision-making. As well as
hostile actors, they should include ‘oppositional’ factors that resist a plan. At the core of wargames are:
the players;
the decisions they take;
the narrative they create;
their shared experiences; and
the lessons they take away.
Wargames immerse participants in an environment with the required level of realism to improve their decision-
making skills and/or the real decisions they make.[85]
Space, and the fact that it is introductory doctrine, precluded a full explanation of these
elements in the Wargaming Handbook. I explore them in detail in this book, often devoting a
chapter or more to each. For the purpose of defining a wargame, please note the following.
A professional wargame includes all the elements above: they are part of a holistic whole that is
the game. The role of the first five is to enable the successful addition of the vital ingredient:
players. The decisions made by the players will be better, and be better understood, when SMEs
and analysis are integrated.
The first five elements provide a framework and a context for discussion; the last three
generate and make sense of that discussion.
The outputs from a wargame are analysed player decisions and the associated insights.
Everything else is an input to prompt and capture this discussion between players, SMEs and
analysts. For example, adjudicated in-game outcomes are an input that prompts further
decisions and discussion; they are not an output.
Elements 2 through 5 can be entirely manual, computer-assisted, or a combination thereof. I
discuss these later, but want to make the fundamental point now that manual and digital
approaches are complementary and synergistic, not preclusive or exclusive. This is discussed
in Chapter 9 (Appropriate Technology), and the theme recurs throughout the book.
Supporting and associated techniques
I introduce supporting techniques now to establish their relationship with wargaming, but
discuss them in detail later.
Wargaming Handbook extract
Several associated techniques can support, or be supported by, wargaming. These overlap to some extent and,
along with wargaming, add to the ‘tool kit’ of decision-making techniques. Their relationship with wargaming is
important.
· Operational Analysis (OA) or Operational Research (OR) is ‘the application of research and analysis
methods to the systematic investigation of operational problems to assist executive decision makers. In Defence,
it largely involves the application of OA/OR to complex socio-technical problems within the MOD and in
military/security operations’.[94] Most wargame outcomes are based at some level on OA. OA can directly support
wargaming, while wargaming is often used to integrate OA into planning.
· Modelling and Simulation (M&S) is ‘using models (a physical, mathematical, or otherwise logical
representation of a system, entity, phenomenon, or process) as a basis for simulations (methods for implementing
a model statistically or over time) to develop data as a basis for decision making.’[95]
· Red Teaming is ‘the independent application of a range of structured, creative and critical thinking
techniques to assist the end user make a better-informed decision or produce a more robust product.’[96]
Wargaming is a recognised red teaming tool,[97] while red teams often support wargames. However, neither is a
‘parent’ technique to the other; ‘sister’ is a better term.
I discuss M&S elsewhere, making the crucial differentiation between ‘simulation’ and ‘wargame’.
For now, please note what I consider to be the simplest and most useful definitions of each:
· A model is ‘a representation of something’; and
· A simulation is ‘the exercising of a model over time.’[98]
Matt Caffrey concurs: ‘Many people in today’s military use the terms “modelling”, “simulation”,
and “wargaming” as if they were interchangeable. They aren’t. Models are proportional
representations of the real world [and] are static. When a model is examined over time, it
becomes a simulation. Place an aircraft model in a wind tunnel and observe how it behaves at
different airspeeds at different moments, and you have a simulation.’[99]
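To make that distinction concrete in code rather than wind tunnels, here is a minimal, purely illustrative Python sketch (the unit types and movement rates are invented, not taken from any cited source): the static movement-rate table is the model; the loop that exercises it over time turns it into a simulation.

# Illustrative sketch only: a static 'model' becomes a 'simulation' when exercised over time.
# The movement rates below are invented for the example.
MOVEMENT_RATES_KM_PER_HOUR = {"armour": 30.0, "infantry": 5.0}  # the model: a static representation

def simulate_advance(unit_type, hours):
    """Exercise the movement model over time, returning cumulative distance after each hour."""
    rate = MOVEMENT_RATES_KM_PER_HOUR[unit_type]
    distance = 0.0
    positions = []
    for _ in range(hours):  # the time-stepping loop is what makes this a simulation
        distance += rate
        positions.append(distance)
    return positions

print(simulate_advance("armour", 4))  # [30.0, 60.0, 90.0, 120.0]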
Figure 2-2. MOD Wargaming Handbook diagram showing the strengths, weaknesses and
overlaps between wargaming, modelling and synthetic environments
‘Red Teaming’ remains an extant term but is gradually being replaced by ‘Critical Thinking’.
More on this in Chapter 3 (Misnomers and Misunderstandings). ‘Oppositional’ is the wargaming
characteristic that highlights the requirement for Red Teaming/Critical Thinking, which
challenges assumptions and mitigates cognitive biases such as group think and optimism bias.
‘Wargame’ defined
Previous definitions
I will shortly draw on the above discussion to arrive at a precise definition, but let’s first
highlight the key elements from historical and extant wargame definitions. The phrases and
terms I consider important – either to discard or to carry forward – are in bold. You will quickly
identify a pattern of common terms. I then suggest my own definition. Francis McHugh’s 1966
version is omitted from this survey; I’m saving that for a discussion on training ‘versus’
analysis wargames, and it is included in others’ definitions here. You may wish to review Peter
Perla’s Art of Wargaming[100] and Matt Caffrey’s On Wargaming[101] before continuing, as my
proposals build on theirs.
US Naval War College, 1824. ‘Exercises in the art of war, either land or sea, worked out upon
maps or tables with apparatus and constructed to simulate, as nearly as possible, real
conditions.’
New International Encyclopaedia, 1916. ‘An imaginary military operation, usually conducted on a
map and employing various moveable devices intended to represent the opposing forces, which
are moved according to rules reflecting conditions of actual warfare.’
The Dictionary of United States Terms for Joint Usage, 1964. ‘A simulation, by whatever means,
of a military operation involving two or more opposing forces, conducted using rules, data and
procedures to depict an actual or assumed real life situation.’ This is the genesis of most
modern definitions, and remains extant in US doctrine.[102]
Peter Perla, The Art of Wargaming.[103] ‘A warfare model or simulation that does not involve the
operations of actual forces, in which the flow of events affects and is affected by decisions
made during the course of those events by players representing the opposing sides.’
Peter recently updated this to ‘A warfare model or simulation in which the flow of events
shapes, and is shaped by, decisions made by a human player or players during the course of
those events.’[104] I discuss the deletion of ‘that does not involve the operations of actual
forces’ in Chapter 3.
NATO definition.[105] 'A simulation of a military operation, by whatever means, using specific
rules, data, methods and procedures.'
Draft Wargaming Handbook working definition (replaced by the extant Red Teaming
Guide definition by editorial diktat). ‘A wargame is a structured representation of adversarial
situations, in which outcomes are shaped by the decisions of the players. Effective wargames
generate an immersive narrative which can be used for training, education, planning and
executive decision-making.’
Let’s unpack the key terms from this array.
· ‘Simulation.’ A simulation is the exercising of a model over time. It can be manual,
computer-assisted or computerised. Accepting this removes the necessity to include ‘by
whatever means’ in a wargame definition.
· ‘…opposing sides…’ introduces the adversarial nature of wargames, and can be
subsumed into ‘adversarial.’
· ‘…does not involve the operations of actual forces…’. This means no formed bodies of
troops are deployed. HQs yes, but no actual force elements. This is important because it
excludes field exercises and manoeuvres. The clause should be retained because
such exercises require different skill sets and a different design approach than wargames.
· ‘…in which the flow of events affects and is affected by decisions made during the
course of those events…’. This is story-living. It is critical, but the clause might be
shortened – as long as the story-living characteristic is not lost. Phil Sabin suggested: ‘…
in which events and outcomes are shaped…’ It is important to note that events affect
player decisions as well as vice-versa.
· ‘Immersive.’ As you will see in Part 4 (Practising Successful Wargames), immersing
players in the ‘possibility space’ where decisions are made and players face the
consequences of these is key to successful wargames. It should be retained.
· ‘Rules, procedures & data’ are included in the constituent elements of a wargame, so
need not take up words in the definition. They encompass ‘structured’.
To the above, add: the previewed wargame characteristics (adversarial by nature, oppositional,
and the temporal course of events); and the essence of a wargame (players, decisions and
story-living), and you have the ingredients for a definition that synthesises the key terms from
historical and current definitions.
Suggested definition
Adversarial and oppositional by nature, a wargame is an immersive simulation, not involving the
operations of actual forces, in which the course of events shapes, and is shaped by, decisions
made by the players.
After all that, it’s noteworthy that mine is similar to the 1964 (and extant) DOD definition, and
Peter Perla’s from 1990.
Wargaming applications
So, that’s what a wargame is. Where might the technique be applied? I discuss the many
formats, contexts and variants of wargames in Chapter 5. In advance of that, I want to briefly
introduce the general domains to which wargaming can be applied.
Wargaming Handbook extract
Wargames immerse participants in an environment with the required level of realism to improve their decision-
making skills and/or the real decisions they make. Analytical (‘discovery’) wargames can be used to explore
national-strategic, strategic, operational and tactical issues across the full spectrum of military activity. Training
(‘learning’) wargames are a ‘fitness programme for thinking’, enabling practice in the conceptual elements of
command and control. Wargames are widely used by business, the emergency services, academia and
humanitarians, as well as defence organisations.[106] Historically, wargaming has proved its utility to UK
Defence and remains relevant to today’s problems. In particular, it can be applied to the following areas:
Education and training wargames focus on training personnel, using safe-to-fail environments to allow
participants to practise, experiment and innovate. Wargames are well suited to this because they create
experiential learning opportunities, helping to develop a shared narrative about situations and tasks that
personnel might face in the real world.
Planning wargames are analytical wargames used to develop and test plans for dealing with particular events
or circumstances. Applications span policy, strategic, operational and tactical situations. Their aim is to expose
plans to rigorous examination to identify risks, issues and previously unconsidered factors.
Executive decision-making wargames are analytical wargames that inform real-world decisions. The dynamic
and unpredictable nature of wargames enables players to consider future events, and supports related
decision-making. The intent is to generate insights and data that will increase understanding of, for example,
how:
Situations might develop.
Force structures and concepts might adapt to new challenges.
Science and technology might deliver a competitive advantage.
The distinction between education and training wargames and analytical wargames is not rigid. A wargame
designed for one purpose is also likely to have benefits in the other. However, in 1966, Francis McHugh wrote,
with regard to wargames designed for training or analytical purposes, ‘In practice, it has been found that it is
better to point the game towards but one of those objectives, that is, to select as the primary objective one of
the following: (a) provide military commanders with decision-making experience, or (b) provide military
commanders with decision-making information.’ This is illustrated in Figure [2-3]. Note the final sentence in the
figure: a wargame must be applicable to real-world situations to make it relevant.[107]
Figure 2-3. The general purposes of wargames[108]
Two fundamental points illustrated in Figure 2-3 need emphasising:
1. Analytical wargames help inform people so they make better decisions, while
education/training wargames help people become better decision-makers.
2. Professional wargame outputs should be applicable to the real world. This can be manifested
either in enhancing someone’s decision-making abilities or in informing actual
decisions.
Wargames as an art form
Art and science
I raised this as an issue in Chapter 1. However, the fact that wargaming is an art form is what
makes it so powerful, so I need to explain this apparent dichotomy. It is not a case of art versus
science, but of art and science. Wargames must include rational scientific elements: the
representations of physics (such as missile ranges or movement rates) must be sufficiently
accurate to be credible. Overlaying the people-centric nature of wargames introduces the
irrational human factors that win and lose wars and business campaigns. These emotional
aspects of wargaming demand an approach that is more art than science. Furthermore, it is the
‘play’ aspects of gaming that set it apart from scientific techniques. Ed McGrady noted in his
2015 Connections UK keynote speech that ‘Games are not simply a linear extension of other
forms of analysis, models and simulations, because you have multiple people
participating.’[109]
However, reconciling the tension between art and science presents an issue. Ed pointed out
that if you claim your wargame incorporates really accurate models, and that your adjudication
process gives it the exactitude of a computerised simulation, you will lose the argument. It is
the people playing wargames that differentiates the game from rational scientific constructs and
computerised simulations. Play is powerful. It causes players to: change their views; become
emotionally involved; introduce irrationality; and even change the paradigm (“Hey, we’re
examining the wrong problem!”). But many sponsors and users subscribe to the rational
approach, even to the point of viewing humans as agent-based systems that we don’t quite
understand yet. The issue, therefore, is persuading people to adopt an approach that requires
art not just to be overlaid on top of science, but to be accepted as being of equal or greater
importance.
Wargames as an act of communication
Raph Koster tells us that ‘Pinning art down is tricky.’[110] He further observes that: all games
are a form of communication; they are media that provide information; games that are
‘entertainment’ provide comforting and simplistic information that is easy to assimilate; games
that are ‘art’ deliver information that is intellectually and emotionally challenging. ‘Mere
entertainment becomes art when the communication element in the work is either novel or
exceptionally well done. It really is that simple. The work has the power to alter how people
perceive the world around them.’[111]
Peter Perla addresses wargames as acts of communication in The Art of Wargaming[112] and
Zones of Control: ‘Professor Stephen Downes-Martin at the US Naval War College has argued
that using game decisions as the key information source for wargaming insights is an unreliable
one.[113] Pursuing this line of thought, Naval War College professor Hank Brightman and
student Melissa Dewey proposed that the true source of useful information and insight available
in a game derives from the conversation among the players as they communicate by both word
and action.[114] Indeed, wargaming is an act of communication.’
In 1964, William Jones wrote, ‘In my opinion, the two most salient features of crisis games (or
any other sort of team game) are: (1) The rapidity with which the bits of information on the
problem area, known at the outset to individual participants, enter the common information
base of the whole group; (2) The rapidity and accuracy with which individual members of a
game team achieve an understanding of (although, not necessarily in agreement with) the
feelings of their team mates about the situation/problem being simulated. These two features
are, I believe, the main bases for the phenomena remarked on by Tom [Schelling]: the
noticeable post-game improvement in the ability of participants to communicate meaningfully.'[115]
(Emphasis in original).
Communication occurs on many levels and is multi-directional between (for example): sponsor
and players; multiple players; and players and analysts. Some of these are quite esoteric
relationships, but I want to focus on practicalities that arise if you think of a wargame as a
‘focussed conversation.’[116]
The implications of wargames as an act of communication are that analysts and training
mentors need to pay at least as much attention to the discussion in player cells as they do to
the decisions themselves and the resultant actions and outcomes. In a large, distributed
wargame, the technical and procedural challenges this presents are significant. As well as
decisions, analysts should capture the factors considered, options rejected and the rationale
for all this. Once a decision is taken, consequences can be examined fully cognisant of the
precursor discussion. This is a significant insight that can shape the entire analysis plan for a
game, be it in an educational or analytical context.
Data capture and initial analysis should focus on the anatomy of a decision; subsequent
analysis on its unflinching autopsy.
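As a purely illustrative sketch of what such capture might look like in practice (the field names and example values below are my own invention, not a prescribed schema), an analyst's decision record could hold the anatomy of each decision alongside its eventual adjudicated outcome:

# Hypothetical decision-capture record for wargame analysts: the point is to record the
# factors considered, the options rejected and the rationale, not just the decision itself.
from dataclasses import dataclass, field

@dataclass
class DecisionRecord:
    turn: int
    cell: str                          # e.g. "Blue G3"
    decision: str                      # the decision actually taken
    factors_considered: list = field(default_factory=list)
    options_rejected: list = field(default_factory=list)
    rationale: str = ""                # why this option was chosen over the others
    adjudicated_outcome: str = ""      # completed after adjudication, for the later 'autopsy'

record = DecisionRecord(
    turn=3,
    cell="Blue G3",
    decision="Commit the reserve brigade to the northern axis",
    factors_considered=["bridge capacity", "enemy air activity", "logistic reach"],
    options_rejected=["Hold the reserve for a further 24 hours"],
    rationale="Northern axis judged decisive before the weather closes in",
)
print(record.decision)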
Chapter 3. Wargaming Misnomers and Misunderstandings
“When I use a word," Humpty Dumpty said in rather a scornful tone, “it means just what I choose
it to mean — neither more nor less."
Lewis Carroll, Through the Looking Glass
Introduction
The pervasive mis- and non-communication I allude to in Chapters 1 and 2 warrants a chapter to
clarify terminology issues. The standard caveats apply: wargaming is a broad church; what
follows are my own views; others will likely raise valid objections; and sparking debate is
desirable. If the latter occurs, the fact that we cannot agree highlights the issue.
‘Simulation’ is not synonymous with ‘wargame’
'The game is in the mind of the players, and the instrumentality of the game is simply a tool to
help get the game into the mind of the players.'
Peter Perla, Connections UK (2013)[117]
Differentiating ‘simulation’ from ‘wargame’
Conflating ‘simulation’ with ‘wargame’ lies at the heart of many pernicious issues (hence the
inclusion of Figure 2-2 in the Wargaming Handbook). Let’s tease those terms apart. We know
from the previous chapter that simulation is but one element of a wargame. That immediately
differentiates them: an engine is one element of a car, but it is not the car. We also know that, to
have a wargame, you must have players and the decisions they make. Simply stated,
simulation + people = wargame
and not just simulation = wargame. I can understand the confusion, because many people’s
experience of a wargame involves a computerised simulation, into which many of the other
elements have been subsumed.
As well as players, you also need rules, procedures and adjudication to bind everything
together. The next step, of course, is to add the other elements of a wargame to the equation
above. This combination of all elements is a wargame system, to which the vital ingredient of
players is added. Two examples and two manifestations illustrate the ‘wargame equals
simulation’ misunderstanding.
Example 1: Mistaking a simulation for a wargame
An organisation asked me to assess the cost savings and potential improvements to a wargame
from replacing a computerised with a manual simulation. While that is almost a valid question,
what was not valid was their preferred approach to answering it. I was to: set up the manual
simulation in an empty room isolated from the controlling staff; ‘execute’ my simulation during
the wargame; emerge after two weeks and compare the outcomes and situation to those
generated by the computerised simulation; then, from that experience, evaluate the feasibility of
replacing the computerised with the manual simulation.
I eventually persuaded them that a manual simulation divorced from players is just a map that
would achieve nothing on its own, and was allowed to work in the Control Room. The
fundamental failure to recognise that you need to add people to a simulation to have a
wargame, plus a host of nuances around that, persisted. One such was a primary ‘finding’ (that
we already knew): it is not a case of ‘either/or’ computerised or manual simulations; they each
have strengths and weaknesses and, when used in a complementary fashion, deliver synergistic
benefits. Another (that we also knew) was that the most significant improvements to the
wargame could be achieved by refining the wargame’s processes and rationalising extraneous
resources. These ‘findings’ would have been apparent to an organisation that understood the
basics of wargaming: that wargames are supported by appropriate simulation and enabled by
robust processes.
Example 2: Serendipitous wargames
While demonstrating a wargame to a group of international visitors, one of them admitted to me
that she was not a fan of the technique. She had seen too many ‘serendipitous wargames’ at
which the organiser had booked a room, booked a computer ‘wargame’, arranged for the
players to turn up – and then expected magic to spontaneously occur. Sometimes it did,
because the ‘can-do’ attitude of the military will wring any potential benefits out of an event. But
not, she explained, as many benefits as there should be, and not enough to justify the
expenditure of time, effort and money.
The primary issue, she said, was that the organisers assumed that the computer simulation was
the wargame, and all that was needed to make the event work once you added players. There
was no recognition that you also needed the enabling wargame processes. Her observations
were accurate and her objections to serendipitous wargaming valid: just booking a room, hiring
computers and co-opting some players is unlikely to deliver a good wargame.
If ‘serendipitous wargaming’ is a little prosaic, here are two common and down-to-earth
manifestations of the conflation of simulation and wargame.
Manifestation 1. I am regularly asked if I can shorten the length of time it takes to conduct a
wargame turn. The assumption is that there is a ‘x 2 speed’ button that can be pressed. In fact,
there is a formula that determines the length of a turn: number of participants x complexity of
the situation = time required for the necessary discussion. The turn length can only be
shortened by reducing participant numbers and/or complexity, or by curtailing discussion.
Doing any of these risks factors being insufficiently discussed or missed altogether; or
participants disengaging because their opinion is not being heeded.
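As a rough, purely notional illustration of that relationship (the minutes-per-factor figure below is my own assumption, not a published planning factor), the arithmetic might be sketched as:

# Notional back-of-the-envelope estimate of turn length.
# Formula from the text: participants x complexity = time required for discussion.
# The minutes-per-discussion-thread constant is an illustrative assumption only.
MINUTES_PER_THREAD = 2  # assumed average discussion time per participant per factor

def estimate_turn_minutes(participants, complexity_factors):
    """Estimate the discussion time for one wargame turn."""
    return participants * complexity_factors * MINUTES_PER_THREAD

# Halving either term (fewer participants or a simpler situation) is the only way to
# halve the turn without curtailing discussion.
print(estimate_turn_minutes(participants=12, complexity_factors=5))  # 120 minutes
print(estimate_turn_minutes(participants=6, complexity_factors=5))   # 60 minutes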
Manifestation 2.
'No-one, however smart, however well-educated, however experienced, is the suppository (sic)
of all wisdom.'
Tony Abbott, former Australian Prime Minister (2013)
Despite explaining that a wargame is simply a framework for informed expert discussion, I am
frequently approached at the end of introductory briefings by well-meaning people who want to
send me volumes on non-state actors in the Ukraine, the capabilities of the S-400, counter-
cyber technology etc. Their intention is to pour information into the wargame database (which
they perceive to be me) so that, brimming with all available information, the simulation will
produce the right outcome when the ‘execute’ button is pressed.
Ellie Bartels comments that, 'One source of the simulation versus gaming conflation is that
“simulation” is the preferred term within the political science community involved in activities
we would think of as games (see for example, the title of the “Simulation & Gaming” journal). As
a result, it is used by a fair number of American think tanks and academics to describe their
activities. In my experience, these tend to fall in the direction of BOGSATs.'[118]
So, what is simulation?
'More and more we are into communications; and less and less into communication.'
Louis 'Studs' Terkel (1912 – 2008), chronicler of oral histories
Respective examples of manual, computer-assisted and computerised simulations are: map and
counters; spreadsheet-based combat calculators; and constructive computer simulations such
as the US Joint Theater Level Simulation (JTLS). As Studs Terkel alluded to, there is a
widespread pro-technology bias and, in the case of simulations to support wargaming, a
presumption that ‘electrons are best.’ I address this in Chapter 9 (Appropriate Technology).
What is Modelling and Simulation (M&S), and how does it relate to wargaming? As discussed in
Chapter 2, my preferred definition of ‘simulation’ is ‘the exercising of a model over time.’[119]
‘Model’ is a useful term, but it is cranking that model over time, powered by player decisions,
that drives a wargame.
The M&S industry agrees that a simulation can be live, virtual or constructive.
· Live: real people using real equipment. Think lasers and direct fire weapons effects
simulators.
· Virtual: real people using simulated equipment. Examples include the UK’s Combined
Arms Tactical Trainer (CATT) or first-person shooters such as Virtual Battlespace (VBS).
· Constructive: simulated people using simulated equipment. Computerised examples
include JTLS and the Advanced Battlefield Computer Simulation (ABACUS), which supports
Command and Staff Trainer events.
Those examples are computerised. But most manual simulations also qualify as constructive:
counters or miniature figures represent individuals or aggregated groups of personnel and their
equipment. Figure 3-1 shows the counter ‘fields’ used in the Rapid Campaign Analysis
Toolset (RCAT). Such a manual simulation is effectively a deconstructed constructive computer
simulation. The combat and ‘find’ factors enable algorithms and procedures that are essentially
the same as those in a constructive computerised simulation.
Figure 3-1. Counter ‘fields’ used in RCAT[120]
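To illustrate the kind of algorithm such counter factors enable (the factor values, die range and results thresholds below are invented for this sketch; they are not RCAT's actual rules), a toy differential combat-adjudication routine might look like this:

# Toy combat adjudication of the kind a manual constructive simulation encodes in its
# counter factors and combat tables. All numbers here are invented, NOT RCAT's rules.
import random

def adjudicate_combat(attack_factor, defence_factor, roll=None):
    """Resolve an attack by comparing combat factors, modified by a 1-6 die roll."""
    if roll is None:
        roll = random.randint(1, 6)                  # the element of chance
    score = attack_factor - defence_factor + roll    # simple differential system
    if score >= 7:
        return "defender eliminated"
    if score >= 4:
        return "defender retreats"
    if score >= 2:
        return "no effect"
    return "attacker repulsed"

print(adjudicate_combat(attack_factor=6, defence_factor=4, roll=3))  # defender retreats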
‘Wargame’ and ‘exercise’ are not synonymous
The definitional clause ‘…that does not involve the operations of actual forces…’ is important
because it tends to preclude live and virtual simulations from wargames. The British Army (my
background) ‘trains the trainer’: commissioned and non-commissioned officers are taught to
train their subordinates. They are qualified to design and run live-fire ranges, conduct field
exercises, and so forth. If you transpose a section of infantry from a live-firing range into a live
simulation exercise or a virtual trainer, there is little substantive difference between the delivery of live
firing and the delivery of live or virtual simulation exercises. Personnel are placed directly alongside, or into,
the simulation, or use adapted real equipment. All British Army officers can design and execute
such exercises; there is no specialist skill required.
That is not the case when designing and delivering a wargame supported by a
constructive simulation. Equipment and headquarters are simulated (usually, although real
communications and IT systems are often used). Integrating people with the simulation to
create the wargame requires bespoke processes, and synthesising all elements requires
specialist skills – those of the wargame designer. If we do not exclude the operations of actual
forces from our wargames, we succumb to the presumption that ‘it’s just another exercise, and
anyone can design that.’ That is incorrect.
Hence, while a wargame is often given an exercise name, it should be considered something
apart. Although this is not one of the more serious misnomers, it is worth bearing in mind.
Terms such as Start- and End-of-exercise (Startex and Endex) and Exercise Control (Excon) are
widespread (I use them liberally throughout the book), and I see no harm in applying them to
wargames – but remain alert to the risk of conflating ‘wargame’ and ‘exercise.’
Training ‘or’ analytical wargames
Should a wargame be one or the other?
Francis McHugh said in 1966, 'The ideal of every wargame is to provide military commanders
with both decision-making experience and decision-making information that will be useful in
real-world situations. In practice, however, it has been found that it is better to point the game
towards but one of those objectives, that is, to select as the primary objective one of the
following:
(a) provide military commanders with decision-making experience, or
(b) provide military commanders with decision-making information.'[121]
It is now apposite to introduce McHugh’s definition of a wargame (which I withheld in Chapter
2): ‘A simulation of selected aspects of a military operation in accordance with predetermined
rules, data and procedures to provide decision-making experience, or decision-making
information that is applicable to real world situations.’[122] In Figure 2-3, McHugh considers the
distinction between providing decision-making experience and decision-making information
sufficiently important to fundamentally shape his diagram.
Some reinforce the distinction. In a 2012 PAXsims post, Brant says, ‘On the Joint Staff we are
using PSOM for planning and analysis.[123] To me, that almost shoots it down as a solution to
CGSC's [training requirement][124] right there. Training tools should be focused on training, not
predictive analysis or decision support. Conflating the two because they have similar interfaces
(to people who don't know any better) will result in huge problems with outcomes, and poor
results on all fronts.’[125]
However, Rex Brynen makes the point that you often don’t have the luxury of deciding whether
a wargame is educational or analytical. Game sponsors demand insights, but also want
participants to learn from the experience, form social or team bonds etc. This reality check is
substantiated by an extract from a 2011 US Naval War College Fleet Arctic Operations Game
(FAOG): ‘In addition to serving as a highly analytic event, the FAOG was designed to enhance
participants’ understanding of potential challenges and cooperative strategies for conducting
sustained maritime operations in the Arctic.’[126] Rex notes that, ‘While that differentiation
[training vs analysis] is useful because it points to important differences in purpose and hence
design, I’ve been increasingly interested in the extent to which we might be able to develop
hybrid games – that is, wargames that serve an education/training function, but in which
participants are also generating data that is of analytical value too. My own Brynania civil
war/peacebuilding game at McGill, for example, is designed for educational purposes but has
now been used to generate data for two PhD theses (one on terrorist violence, the other on
educational gaming). While there’s a risk of compromising analytical rigour or educational
effectiveness in doing this, it could also provide a useful way of stretching limited
resources.’[127]
Does it matter?
It is inevitable that dual benefit will be derived from training and analytical wargames. However,
it remains important to bear the distinction in mind. This is McHugh’s ‘pointing’ to one or the
other, because each needs a different design approach. I explain this in detail in Chapter 12
(Wargame Design), but the essential difference is that the design of a training wargame
focusses on the effect to be enacted on the players, while the design of an analytical game
centres on the examination of a research question.
The distinction should be borne in mind throughout the ‘Wargame Lifecycle’ (see Chapters 11 -
16). For example, higher levels of verification and validation will be required of simulations that
support an analytical wargame, compared to those demanded of a training game (where
outcomes only need be ‘good enough’ to support the training objectives, ensure immersion and
prevent false lessons being learned). However, remain open to the dual benefits that will arise;
approached carefully, these are a desirable spin-off benefit, not a distraction.
‘Course of Action Wargaming’ and ‘wargaming’
Many military personnel erroneously use ‘Course of Action (COA) Wargaming’ and ‘wargaming’
synonymously. This is important because COA Wargaming is probably the most widespread
application of wargaming in the military; it is mandated in doctrine and practised in planning
activities from battlegroup upwards. Failing to recognise that COA Wargaming is just one sub-
set of wargaming risks assuming that everything can be approached as an analytical COA
Wargame. I repeatedly need to make the distinction to prevent people reverting to the COA
Wargame schema as the sole answer to all wargaming problems. Ellie Bartels, researching for
her doctoral thesis, says that, ‘the only place US doctrine discusses wargaming is in the
context of COA analysis. As a result, a shocking number of military officers will tell you that no
other purpose or structure is valid.’ (Ellie’s italics.) I use the diagram at Figure 5-5 to illustrate
this, with a red arrow showing that COA Wargaming belongs in the ‘operational planning’ box,
but that there are many other applications of wargaming. COA Wargaming is explained in
Chapter 26.[128]
Verification and Validation
This section will be contentious.
The wargaming and M&S communities have reversed the common English meanings of ‘verify’
and ‘validate.’ Unsurprisingly, this switched meaning confuses most users. Wikipedia says of
verification and validation (V&V), ‘In practice, the usage of these terms varies. Sometimes they
are even used interchangeably.’ (My italics.) That is a significant issue, because the terms have
different meanings and are fundamental to wargame design and the supporting simulations.
The accepted meaning of the terms within the systems engineering and wargaming communities
is:
· Verification: ‘The process of determining that a model or simulation implementation
accurately represents the developer’s conceptual description and specification.’[129] In
short, is something fit for the purpose for which it was designed?
· Validation: ‘The process of determining the degree to which a model or simulation is an
accurate representation of the real world from the perspective of the intended uses of the
model or simulation.’[130] In short, is something true to reality?
The above definitions are stated in Zones of Control,[131] showing that they also pervade
hobby wargaming. Now to show that those two terms have been reversed from the common
English usage.
Verify
The root word is the Latin veritas, which means ‘truth’. Veritas was the Roman goddess of truth.
The OED definition is ‘To establish the truth, accuracy, or reality of; to prove to be true;
confirm; substantiate, establish the truth.’ That is the common English meaning: something is
true to reality.
Consider these examples:
· Complete, verifiable and irreversible de-nuclearisation.
· He attempted to verify the account of the Battle of Hastings.
These everyday uses relate to ‘true to reality’. They have nothing to do with ‘accurately
representing the developer’s conceptual description and specification.’
Valid
‘Valid’ originates from the Latin validus, which means ‘strong’, ‘worthy’ or ‘with force’. The OED
definition is ‘To establish the soundness, or legitimacy of; having a sound basis in logic or
cogent; reasonable or defensible.’ Wikipedia says that validation is ‘the assurance that a
product, service, or system meets the needs of the customer.’ That is the common English
meaning: something is fit for purpose.
Consider these examples:
· The plan was proven to be valid.
· To be valid the referendum needs a turnout of over 50%.
These everyday uses pertain to ‘strong’, ‘worthy’ and ‘fit for purpose.’ They have nothing to do
with ‘an accurate representation of the real world.’
The Defence Systems Approach to Training
The UK Defence Systems Approach to Training (DSAT) uses the standard English meaning of
validation: an evaluation of training, plans and materials to determine the extent to which
training achieves its intended purpose.
The DSAT meaning of validation, to evaluate fitness against the intended purpose, is the one
used in the Wargame Lifecycle described in the Wargaming Handbook. I adopt the same
meaning in this book.
Implications
In the context of M&S, it clearly matters which word is used, and that people understand what it
means. The questions ‘is the simulation an accurate representation of the real world?’ and ‘is
the simulation fit for the purpose for which it was designed?’ are very different and very
significant. However, I have no illusions that the terms will revert to the common English usage.
Hence…
The take-away is that you must: be aware of the confusion between the terms; explain what you
mean when you use them; and confirm what is meant by whoever you are speaking to.
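For those two questions, a deliberately simplified sketch may help fix the distinction (the movement model, the specification figure and the 'real world' observation are all invented for illustration):

# Invented illustration of verification versus validation as the M&S and wargaming
# communities use the terms.
def movement_model(hours):
    """Toy model: an armoured battlegroup advances at a flat 30 km/h, per its (invented) specification."""
    return 30.0 * hours

def verify():
    """Verification: does the implementation match the developer's specification (30 km/h)?"""
    return movement_model(2) == 60.0

def validate(observed_advance_km, hours, tolerance_km=10.0):
    """Validation: does the model's output match real-world (e.g. historical) data closely enough?"""
    return abs(movement_model(hours) - observed_advance_km) <= tolerance_km

print(verify())                                      # True: implemented as specified
print(validate(observed_advance_km=45.0, hours=2))   # False: reality diverges from the model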
Red Cell versus Red Team
People inevitably refer to adversary players as the ‘Red Team’. Of course they do: the Blue
Team are a team, as is the Umpire team. The issue pertains to confusion over the Red Cell and
Red Team, as discussed on p.49. The Red Cell is the enemy; the Red Team are the Critical
Thinkers. A colleague employed at a UK Force Development wargame as a Red Teamer was told
on arrival that he was commanding an armoured division in the Red Cell. The misnomer is
common, and results in misused expertise and a failure to deliver Red Teaming. Thankfully, the
term Red Team/Teamer is gradually being replaced by Critical Thinking/Thinker.
Scenarios
Scenario writing, development and execution are so important that I devote three chapters to
them in Part 4. I will define those terms fully there; the purpose of this section is to highlight the
fact that a wargaming lingua franca is required. Conversations about scenarios provide
numerous instances of experts in their own field failing to communicate, often without even
realising it.
I attended a presentation on ‘Scenario design’ at an M&S conference. The speaker had been
talking for a few minutes about the ‘scenario’ for a first-person shooter simulation, and I was
confused because she was describing the techniques used for configuring the digital data for
individual vehicles. Whispers around the table confirmed that I was not the only one expecting
the talk to be on something different. At the end, I asked the question, “What do you mean by
‘scenario’?” Her reply was that she meant the digital information that constitutes entities within the model; how
the 0s and 1s make a tank look and act like a tank. But she hadn’t thought to clarify that at the
start or, more usefully, before the talk, so that we could have decided whether to attend. It hadn’t
crossed her mind that there could be a definition of ‘scenario’ other than the one
she used. Most of the delegates were expecting a talk about the background story, or the
‘outline of the plot of a dramatic work, giving particulars as to the scenes, characters,
[132]
situations, etc.’
This is another misnomer that should be easy to address: definitions exist for both ‘scenario’ and
‘setting’. The ones in Part 4 are from the NATO Bi-SC Collective Training and Exercise Directive
(CT&ED) 75-3. Suffice to say here that the setting is the geo-political framework within which
the scenario is developed, and the scenario is the crisis-specific story that leads to the situation
being wargamed. I prefer both terms to the historical ‘General Idea’ and ‘Special Idea’, which
sound simplistic.
Cell colours, especially White
I discuss cell colours in Chapter 22 (Controlling Wargames) but include ‘White Cell’ here as
another example of an easy-to-fix misnomer.
The Wargaming Handbook defines the White Cell as ‘National and supranational political
organisations and diplomats; humanitarians; International Organisations and Non-Governmental
[133]
Organisations.’ I use that definition, but some countries, including the US, use ‘White Cell’
to describe Control/Excon. This implies that the White Cell is a game player, or at least playing
in the same way as other cells. That might be the case: Control can sometimes be an active
[134]
player. However, it is usually not, so calling it the White Cell can cause confusion.
Two take-aways
First, we need to sort out these misunderstandings and misnomers. This might require the
governance that I said in Chapter 1 is lacking, applied at a cross-community, international level.
Absent that, it’s down to us – the wargaming community – to discuss and agree the necessary
lingua franca. We must do this.
Second, whatever terminology you decide on, or are told to use, ensure that your team
understand it, and that it is briefed to all game participants from the outset.
Chapter 4. Adjudication
‘Good adjudication is the key to successful wargames, whether by a single “umpire” or a large
organisation.’
[135]
MOD Wargaming Handbook
Introduction
[136]
Adjudication is ‘the act of determining the outcome of player decisions.’ It is at the heart of
every wargame. Although a wargame Control organisation (or mechanism) is also responsible
for other activities, the Wargaming Handbook singles out the adjudication of players’ decisions
as its primary function. While other aspects of controlling wargames are important, I find it
helps to classify these as determination, not adjudication. The centrality of adjudication is
illustrated in Phil Sabin’s ‘simplest possible representation of a wargame’ in Figure 2-1. It is so
important that I have placed this chapter in Part 1 (Wargaming Fundamentals), rather than in
Part 4 (Practising Successful Wargames), where some might have expected it.
Figure 2-1 shows players making decisions, responding to outcomes, making more decisions
and facing the consequences of those. A parallel process takes place within the ‘adjudication’
area of the diagram. Unlike most hobby wargames, where adjudication tends to be subsumed
into rigid rules, professional wargames almost always involve human-in-the-loop (HITL)
adjudication, to steer the wargame to achieve the objectives. Some think of adjudication as
umpiring, but it is more than that. It is crucial that those in the adjudication process get, and
[137]
stay, ahead of game play in a variant of the OODA loop. That is a key focus of this chapter.
Adjudication in professional games can also serve to separate, or shield, players from the
detailed wargame systems. This mitigates the risk of players 'gaming the game' and exploiting
rules and mechanics to gain an unrealistic advantage. It is easy to imagine an experienced and
capable military commander struggling against an adversary for no other reason than the other
player better understands how to manipulate the game system. A good adjudication process
should prevent this.
The Wargaming Handbook introduces the technical aspects of adjudication. I will overlay my
experience on those and explain the various methods, models and tools (MMT) that can support
adjudication. These MMT enable adjudication; they are not ‘ends in themselves’. The ‘master’
diagram Figure 4-1 informs this discussion, but contains one element that is not strictly
necessary to discuss adjudication: the level of enquiry being conducted. I offer a brief
discussion of this important adjunct at the end of this chapter because it is closely related to
adjudication.
[138]
Figure 4-1. Broad adjudication approaches and associated factors
Before we go on, I’d like to explain my convention of using ‘control’ with a lower and upper case
‘c’. The upper case ‘Control’ is a noun that refers to the organisation that contains the
wargame’s directing staff; the lower case ‘control’ is a verb. All wargames are controlled; some
feature a Control. I differentiate ‘control’ from ‘adjudication’ to focus the latter on the act of
determining the outcomes of player decisions without being distracted by the myriad control
functions required of a professional wargame (which are discussed in Chapter 22).
Scalability
The scaling of functions and MMT to make them appropriate to the size and scope of any
wargame is easier than it might appear. Irrespective of whether a wargame consists of one
person playing solitaire or a massive multi-player, multi-national exercise with distributed
players and Control organisation, the functions, factors and MMT I discuss still apply – they are
all scalable. Scaling up can dramatically increase the complexity and technical challenges
involved in a wargame, but the principles remain the same.
Conflating ‘kriegsspiel’ with ‘adjudication’; another misnomer
Some of you will have noted the absence of the term ‘kriegsspiel’ from the adjudication labels in
Figure 4-1. ‘Free kriegsspiel’, ‘semi-rigid kriegsspiel’ etc have been commonly used terms for
nearly two hundred years, so why do I marginalise them?
The literal translation of ‘kriegsspiel' is 'wargame' – they are truly synonymous. But ‘kriegsspiel’
(perhaps, more fully, ‘classic kriegsspiel’) has specific and commonly-held connotations within
[139]
the wargaming community: a three-table, manual, 'double blind' wargame format. There is
much right with this type of wargame, but it is not the only form of wargame. Conflating the
same term, 'kriegsspiel', with adjudication causes further confusion. I recently heard a leading
expert, while educating new wargamers, say, “The difference between free- and rigid-kriegsspiel
is critical” and then, moments later, “Adjudicating a kriegsspiel is hard work, with all that
running between tables.” The first sentence pertains to adjudication, the second to the wargame
format. Experienced wargamers would have followed this conversation, but the risk of confusing
newcomers and non-wargamers is significant – and easily avoided.
Wargaming Handbook extract
While adjudication is just one of the variants that produce different wargames, it is common to all wargames and is
of primary importance. Ranking adjudication as the principal variant does not imply that the others are
unimportant. Quite the opposite: the other variants generally take the majority of wargame design and
[140]
development effort. However, adjudication requires the most careful consideration and so is placed foremost.
[141]
Adjudication is the act of determining the outcome of player decisions. It enables consequences to be
[142]
highlighted and discussed, and options to be explored. The methods of adjudication are as follows.
[143]
Free adjudication. The results of interactions are determined by the adjudicators in accordance with their
professional judgment and experience.
Rigid adjudication. The results of interactions are determined according to predetermined rules, data and
procedures.
Semi-rigid adjudication. Interactions are adjudicated by the rigid method, but the outcomes can be modified or
overruled by the adjudicator.
Minimal/consensual. Adjudication is by the collective opinion of players and the adjudicators.
Several tools and techniques can be used to support adjudication.
Operational Analysis. Operational analysis informs the adjudicator(s), typically by presenting a spread of outcomes
such as the best, worst and most likely cases. Using this to inform the decision influences adjudication in the
direction of a rigid outcome.
Computers. Computer assistance in the form of a ‘plug-in’ model or spreadsheet ‘combat calculator’ likewise
informs the adjudicator’s pending decision. Like Operational Analysis, the influence is towards a rigid outcome.
Computerised simulations exert an even stronger influence in the direction of rigid adjudication. Commonly used,
computerised simulations can provide the entirety of the adjudication function.
Moderation. To moderate is defined as: cause to be less extreme; to move towards the medium or average
[144]
quantity. Moderation is used to steer a wargame to achieve specific training objectives, or to lessen extremes
in an analytical event. Moderation is generally used during semi-rigid adjudication, and can influence the decision
in either direction (towards an average expected outcome). However, moderation has perils, since shifting towards
average outcomes can all too easily side-line important conclusions about the vulnerability of plans to chance and
bad luck.
Role-play. Defence wargames sometimes include an element of role-play, but are rarely only role-play. Role-play
can exert a strong influence towards free, or consensual/minimal, adjudication. Constraining role-playing actors’
interactions can reduce this influence, but that risks lessening the benefits of role-play (free thinking creativity).
The ultimate expression of role-play is completely open-ended games featuring consensual adjudication. There is
[145]
some evidence, when considering human conflict situations, that role-play is a better predictor of outcomes
than either a single ‘expert’, or game theory, or simulated interaction and unaided judgement for forecasting
[146]
decisions in conflicts.
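To make the ‘spreadsheet combat calculator’ idea concrete, here is a minimal sketch in Python. The force-ratio bands, loss rates and outcome labels are my own invention for illustration only, not drawn from any validated model; the point is simply that such a tool presents the adjudicator with a spread of outcomes (worst, most likely, best) that they remain free to accept, modify or overrule, as the semi-rigid method allows.

```python
# Illustrative only: a toy 'combat calculator' of the kind described above.
# The force-ratio bands and loss rates are invented, not validated data.
OUTCOME_BANDS = {
    # minimum force ratio: (attacker losses %, defender losses %, result)
    3.0: (10, 40, "defender forced to withdraw"),
    2.0: (20, 30, "attacker gains ground"),
    1.0: (30, 20, "attrition, no decision"),
    0.0: (40, 10, "attack repulsed"),
}

def spread_of_outcomes(attacker_strength, defender_strength):
    """Return worst, most likely and best cases to inform the adjudicator."""
    ratio = attacker_strength / defender_strength
    cases = []
    for modifier, label in ((0.75, "worst case"), (1.0, "most likely"), (1.25, "best case")):
        band = max(t for t in OUTCOME_BANDS if ratio * modifier >= t)
        att_loss, def_loss, result = OUTCOME_BANDS[band]
        cases.append((label, att_loss, def_loss, result))
    return cases

if __name__ == "__main__":
    for label, att, dfd, result in spread_of_outcomes(900, 400):
        print(f"{label}: attacker -{att}%, defender -{dfd}%, {result}")
    # The adjudicator sees the spread, then rules - or overrules - as required.
```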
Wargame formats
Common game and wargame formats are introduced below, except for computer-assisted,
because that is discussed in detail in Chapter 9 (Appropriate Technology). For now, be mindful
of the distinction between ‘wargame’ and ‘simulation’ discussed previously, especially with
regard to computerisation. Almost all professional wargames are computer-assisted to some
degree. Hence, I find it more useful to consider the degree of computerisation as a variant.
However, it is almost inevitable that people will refer to ‘computer wargames’ (although there is
no such thing as a computerised wargame because you cannot computerise the vital ingredient
of players), so I retain computer-assisted in the table, even though this will probably be in
support of, or subsumed into, one of the other wargame formats.
Formats: Board; Seminar; Map/chart & counter; Course of Action; Matrix; Role-play; Kriegsspiel; Megagame; Sand table/miniatures; Command Post Exercise[158]; Computer-assisted.
Contexts: Educational or training; Analytical; Defence; Business; Emergency Services; Academia; Humanitarian; Hobby; Historical, through contemporary to future; Level of discovery; PMESII PT[160]; Grand strategic to micro-tactical; Complex through well-bounded; Original creative thought through to rigorous analysis.
Variants: Number of sides; Number of players; Number of force elements; Representation of force elements; Size of play space; Distributed or co-located; Degree of intelligence provided; Narrative driver; Method of generating outcomes; Size of Control; Degree of computerisation; Method of sharing situational awareness.
Figure 5-1. AFTERSHOCK (2015), a humanitarian board game that teaches some realities of
dealing with humanitarian disasters
Board wargames - Volko Ruhnke
Board wargames model military conflict on a tabletop. They often feature unit pieces on a
geographic map, but can use just cards or other displays. Most hobby board wargames
portray specific historical conflicts, from soldier-on-soldier tactics (Sniper! 1973), through
military operations (Wilderness War, 2001) and regional politico-military affairs (Here I Stand,
2006), up to global grand strategy (Twilight Struggle, 2005). Some concern very recent,
present-day, or future conflicts (Labyrinth: The War on Terror 2001 -?, 2010, or the Next War
series, from 2012). Professional wargaming has built upon that hobby tradition for Defense
training and research. Most importantly, board wargames include a rulebook to govern play
and resolve all situations that the players and chance might create.
Figure 5-2 GMT Games Fire in the Lake. Photo courtesy of the US Army War College
Seminar wargames - Fred Cameron
Seminar wargames exhibit the elements found in other formats, but with some changes in
emphasis. The players generally sit around a table and conduct a structured discussion
concerning what has happened or might happen in coming moves. They may have a map, and
counters to represent resources, or cartographic products rendered by a computer. More
stress is placed on discussing what is happening and why, and less on specific moves of
resources, detections, engagements, and the like. Seminar wargames place less focus
on systems – e.g. platforms, weapons, sensors – than other wargames. In other formats the
performance characteristics of such systems are represented with considerable fidelity. In
seminar wargaming, there is more discussion of decisions and their consequences, and
subsequent decisions and further consequences. Adjudication tends to be consensual or free.
Seminar games are good at generating discussion and identifying group consensus. Their
weakness is that they tend to gloss over the details of operational reality.
Map/chart & counter wargames
A map/chart & counter wargame is played on actual maps/charts, or a simplified version of
these. Counters can be replaced by 3D blocks or models to ensure they can be seen.
Colloquially called 'heavy cardboard' games by recreational wargamers, map/chart & counter
games are flexible, cheap, and can be applied to almost any context, and so are a prevalent
wargame format.
Figure 5-6. Phil’s MA 2018 class students playtesting their wargame designs
Historical wargames - Philip Sabin
The great majority of hobby wargames model conflicts which have already occurred, from
ancient times to the most recent wars. Around half of hobby wargames model battles or
campaigns from World War Two. Historical wargames aim to capture the observed
characteristics of their chosen conflict, while allowing players to refight the conflict for
themselves and to explore how events might have gone differently. They range from tactical
games such as the Call of Duty series (2003) to grand strategic games such as Hearts of
Iron (2002). Professional wargames are more likely to have contemporary or future settings,
but historical scenarios are also used for a variety of purposes.
Professionals use historical wargame scenarios to educate players about the dynamics of the
past conflict or crisis, or to develop general command and leadership skills (complementing
activities such as battlefield study tours), or to test the validity of generic wargame systems
by seeing whether they can capture the dynamics of known past conflicts.
Figure 5-7. The RCAT system being validated in Phil’s office in 2015, in an unclassified historical
scenario based on the 1982 Falklands War and with the participation of two veteran
commanders from that conflict, General Julian Thompson and Commodore Michael Clapp
Business Wargames - Hans Steensma
Business Wargames (BWG) are the application of wargaming to a business context. They can
utilise COA Wargaming, board wargaming, seminar wargaming, matrix gaming, or a
combination of wargaming formats. BWG engage and inspire people to think and act
differently from a more traditional training or meeting environment. They give meaning to
business ‘battles’ in various situations, ranging from engaging competitors and exploring the
future business battlefield to countering a hostile take-over. Encouraging staff to consider
their ‘gut feeling’, assembled work experience and their network/creativity, results in a gaming
format that can be played at short notice. A BWG is a relatively low-cost intervention, which
might have a huge impact on the future track of a business. Co-providing insightful
information during the game process is not only rewarding for the selected players, but can bring
together diverse minds in a challenging game setting.
Hobby Wargames - John Curry
The hobby market for wargaming is vast, sprawling, and dwarfs professional wargaming. For
example, the largest company, Games Workshop, had a revenue of £158 million in 2017. The
company behind the online game World of Tanks was valued at £1.14 billion in 2016. The
wargaming hobby includes board games, miniatures, computer games and conceptual games
such as matrix games. The games cover practically all eras of history, including current and
future confrontations. It should be noted that although English is the dominant language of
wargaming, other nations have their own traditions. The issue with using hobby wargames for
professional purposes is assessing the accuracy of the game when not an expert in that topic.
A wargame is a visualisation and generalisation of a particular confrontation, and it takes time
to establish the validity of the designer’s model. Some hobby games companies and some
designers are better than others for professional purposes. Wargaming innovation is
continuous, with a recent example being matrix gaming. Every year, new ways of gaming are
proposed, for example to model cyber or hybrid warfare. Routinely, these developments come
from the hobby market due to the commercial pressure to innovate, the absence of security
restrictions preventing the free exchange of ideas, the scale of the wargaming development in
the hobby, and the rapid testing of new ideas at an almost continuous series of hobby
conventions.
Miniatures wargames, where the units involved are represented by miniature figures, vehicles,
ships, aircraft and so on, need a mention. Although using miniatures is not currently popular, it should
be noted that they were widely established and accepted during the development of
wargaming. An early example is RUSI’s Polemos (1888), which used figures on Ordnance
Survey maps, but other examples include the Fred Jane Naval Wargame (1898-1918), the
United States Naval War College, Dunn Kempf (1977-1997), Blockbuster (1984) and Contact!
(1980). There are two key reasons for using miniatures in tactical professional wargaming.
Using scale models teaches recognition as part of the hidden curriculum. Secondly, miniatures
allow flexibility to game any scenario with a minimal set up time. For example, the games of
Dunn Kempf and Contact! were played on tabletop representations of training areas and likely
battlefields on the central front in Europe. It is likely that, as professional wargaming grows in
confidence, miniatures will again dominate tactical level wargames.
Complex/well-bounded, creative thought/rigorous analysis and level of discovery
These contexts were discussed in Chapter 4 and feature on Figure 4-1. They determine the
necessary level of complexity of the wargame.
Wargame variants
Most of the variants that follow can be applied to most wargame formats and contexts. They are
largely self-explanatory, so I present examples from each end of the spectrum to illustrate the diversity of
approach that different variants demand. Again, note the distinction between ‘wargame’ and
‘simulation’; the latter is often the variant in question.
Number of sides: one through many-sided
The number of sides refers to the number of actors and factions being actively played. A one-
sided wargame has just one side (usually Blue), with all other actors and factions represented
by Control. A wargame in which Blue contains multiple cells (branches, departments, staff
functions or levels of command) remains a one-sided wargame. Most CPX-style wargames,
irrespective of size, fall into this category.
ISIS Crisis is an example of a multi-sided matrix game: the US, Iran, ISIS, the Iraqi Prime Minister
Haider al-Abadi, Sunni Opposition and the Kurdish Regional Government. Sides might be sub-
divided into discrete cells, for example if there are divisions within a group.
Figure 5-10. Examples of counters used to model a single entity/platform (left) and an
aggregated unit (right)
Representation of ‘soft’ factors: none (entirely kinetic) through social science modelling of actors’ and
populations’ perspectives at all levels
Many simulations do not feature soft factors; they are entirely kinetic, and poorer for that. Rex
Brynen addresses the modelling of soft effects in Chapter 10 (Incorporating Non-kinetic Effects
and Semi-cooperative Play into Wargame Design). Any wargame that does not feature soft
effects and human terrain ignores the fact that war is a human affair and fought among the
population. Ellie Bartels comments that, 'Even if we are unsure about how to represent soft
factors, choosing to exclude them or rely on players to represent the mental and moral spheres
of conflict represents either a systematic bias or a completely opaque approach to modelling. In
both cases we are undermining our representation of conflict.' If soft factors do not feature in
the game, they will not be considered and played, either by Control or by the players. A simple
example is shown in Figure 5-11. This represents soft factors in a way that prompts expert discussion
and human adjudication, rather than attempting to model them.
Figure 5-11. A simple representation of soft effects and human terrain using RCAT ‘sticky’
[172]
Marker Tracks and Intelicons™
Size of play space: small (sub-tactical) through global (geo-strategic)
A valid approach to investigating individual soldiers’ equipment enhancements is to use 1/32nd
scale figures (54mm tall) and ‘skirmish’ rules. ‘Micro-tactics’ and capabilities such as mini
unmanned aerial systems, remote sighting systems and programmable-distance grenade
launchers can be examined. Computer or console first-person shooters such as Virtual
Battlespace (VBS) could also be used.
At the other end of the spectrum, the US Global series of wargames and the board game
Labyrinth: The War on Terror 2001 –? (2010) use most of the planet as their play space.
Amount of intelligence provided: closed through open disclosure
An open game is one in which everyone knows everything, with all forces displayed. In a semi-
open game, one or more sides’ dispositions and movements are known only to Control and,
maybe, to one or other side, with force elements (FEs) being revealed as appropriate by Control. Enemy play in a
COA Wargame is usually semi-open. A closed game is one in which information and intelligence
is only revealed to players as they would ascertain it in reality; a so-called ‘intelligence view’.
Degree of computerisation: manual through computer-assisted to computerised simulation
The spectrum of potential computerisation is enormous, and discussed in Chapter 9
(Appropriate Technology). The – usual! – point to note is that none can be considered an
exclusive solution to all wargaming requirements: they are complementary and must be used as
appropriate.
Method of generating outcomes
I cover this in Chapter 24 (Generating Outcomes). For now, note that many of the methods,
models and tools (MMT) used to generate outcomes blur, overlap and are used in combination.
These include, but are not limited to: human determination, stochastic and
deterministic methods, various random number generators (including dice), argument-based,
cards, fixed rules, computer simulations, and look-up tables.
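As a simple illustration of how several of these MMT can combine, the sketch below pairs a random number generator (two six-sided dice) with a look-up table. Every value in the table is invented for the example; a real wargame would use a combat results table appropriate to its setting and data.

```python
import random

def roll_2d6():
    """Two six-sided dice - a classic wargaming random number generator."""
    return random.randint(1, 6) + random.randint(1, 6)

# Purely illustrative look-up table: modified 2d6 roll -> outcome.
COMBAT_RESULTS_TABLE = [
    (range(2, 5),   "attacker repulsed with losses"),
    (range(5, 8),   "inconclusive; both sides take losses"),
    (range(8, 11),  "defender falls back one position"),
    (range(11, 13), "defender routed"),
]

def resolve(modifier=0):
    """Roll the dice, apply a situational modifier, and look up the outcome."""
    roll = max(2, min(12, roll_2d6() + modifier))
    for band, outcome in COMBAT_RESULTS_TABLE:
        if roll in band:
            return roll, outcome

if __name__ == "__main__":
    random.seed(42)                      # fixed seed so the example is repeatable
    print(resolve(modifier=+1))
```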
‘Turn’ length: real-time through turn-based cycles of months or even years
I discuss the derivation and execution of time-stepped turns in detail in Chapter 22 (Controlling
Wargames). These can cover virtually any length of turn in time-stepped games. The Wargaming
Handbook points out that many ‘real-time’ simulations still work in steps, generally no smaller
than 30 seconds. Role-play should be considered a real-time simulation. It is an example of the
challenge facing the adjudication team working with a real-time simulation: how do you get
ahead of the evolving narrative to exercise a degree of control over the outcomes? Once the
role-players start talking, there is little that Control can do to credibly moderate whatever
[173]
happens.
Narrative driver: open-ended through pre-determined, pre-scripted events
Excessively pre-scripted events that significantly constrain player decisions are arguably not
wargames. However, some games demand strict control to ensure they meet their objectives.
Attaining the nirvana of correctly balancing the primacy of player decisions with the necessity of
controlling the game is best achieved by using the lightest of Control touches. I discuss this in
Part 4.
Method of sharing situational awareness, and the number and type of table/display: single table or
screen through many – including distributed – computer displays
The range of this variant is enormous. While some wargames use just one table, or display,
most professional wargames feature a combination of manual and digitised approaches,
especially when situational awareness has to be shared. As soon as you broach a distributed
requirement, you almost certainly need computers.
Size and composition of Control: none through large, distributed, Control organisations
Most hobby wargames are controlled by the rules, procedures and rigid adjudication, and so
have no Control organisation. Conversely, almost all professional wargames have a Control,
ranging from one person to hundreds in distributed locations. However, the functions of Control
remain the same, irrespective of its size, as I discuss in Chapter 22. This is an example of
scalability.
Conclusion
There is no one-size-fits-all wargaming solution. The permutations of wargame formats,
contexts and variants are enormous. Almost every professional wargame will require a bespoke
combination of these. As well as resisting the tendency to apply a previously used wargaming
solution to a new problem without due thought, wargame practitioners must be able to classify
and communicate to everyone the multitudinous options available to a wargame project. I hope
this chapter will spark the debate necessary to advance this process.
Space is big. Wargaming likewise. Discuss.
Part 2: Establishing the Conditions for Successful Wargames
‘Wargaming is a powerful tool. I am convinced that it can deliver better understanding and
critical thinking, foresight, genuinely informed decision-making and innovation.’
[174]
General Gordon Messenger, UK Vice Chief of the Defence Staff
Chapter 6. Essential Characteristics of Successful Wargames
‘One thing a person cannot do, no matter how rigorous his analysis or heroic his imagination, is
to draw up a list of things that would never occur to him.’
Thomas Schelling
Introduction
I begin Part 2 with a review of the characteristics found in successful wargames. These are
different from the elements of a wargame explained in Chapter 2. Characteristics are less
obvious and, in some cases, intangible: a game of rugby needs a ball, pitch, players, rules and a
referee (elements), but might only be considered ‘good’ if it is played in a competitive but
sporting spirit, is exciting and – from the coaches’ perspective – the reasons for success or
failure are apparent and actionable (characteristics).
Note the addition of ‘appropriate adjudication’, ‘soft factors’ and ‘simplicity’ to the
characteristics listed in the Wargaming Handbook, and the expansion of ‘oppositional’. There
might be good reasons why your wargames do not feature all the characteristics listed but, if
any are missing, you should ask yourself why in every case. The essential characteristics that
your wargames should feature are:
· Adversarial.
· Oppositional ‘friction’.
· Chance.
· Uncertainty.
· Primacy of player decisions.
· Freedom to fail.
· Engagement (and fun).
· Cheap, frequent and played in small groups.
· ‘Soft’, non-kinetic factors.
· Control.
· Appropriate adjudication.
· Transparency.
· Analysis.
· Wargaming within a wider context.
· Appropriate technology.
· Simplicity.
Adversarial
‘[The power of wargames] ...lies in the existence of the enemy, a live, vigorous enemy in the next
room waiting feverishly to take advantage of any of our mistakes, ever ready to puncture any
visionary scheme, to haul us down to earth.’
William McCarty Little, US Navy (1912)
‘Adversarial’ is a key – perhaps the key – characteristic of wargaming. Wargaming is a competitive intellectual
activity, and the primary challenge is usually provided by a combination of:
· Opposing players representing active, thinking and adaptive adversaries and competitors; and
[175]
· Wargame controllers using the level of threat as a variable.
According to my definitional discussion, if a wargame is not to some extent adversarial it is not a
wargame. A planning exercise might be a challenging intellectual activity – but it is not a
wargame. The adversarial characteristic of wargaming ensures experiential learning, demands
innovation and adaptive thinking, and introduces the prospect of failure. The McCarty Little
quote highlights the critical requirement for players to face suitably qualified, thinking people.
Unfortunately, playing the intentions and capabilities of Red (enemy), Orange (armed non-state
actors), Black (organised crime) etc into a wargame is too often insufficiently considered, both
in training and analytical wargames. I will return to this in more detail in Chapter 7 (The Wargame
Team).
How does a one-sided wargame deliver this adversarial characteristic? This is a function of
Control. Determining the level of adversarial challenge, and how this vital control lever is
managed, is a critical consideration that I discuss in Chapter 22 (Controlling Wargames).
The adversarial characteristic of wargames is crucial, but there are nuances you need to be
aware of. Be careful that emotionally engaged players do not become confrontational. It is a
small step from adversarial to confrontational, and the change can occur quickly. Be alert for
indicators and take immediate action; a wargame that becomes confrontational can quickly
become counter-productive (as well as unpleasant). Note, also, the concept of ‘semi-
cooperative’ wargames, set in contexts such as inter-agency operations or coalition activities.
[176]
These are discussed by Rex Brynen in Chapter 10.
Oppositional
Operations rarely unfold as we wish, even in the absence of adversaries or competitors, so ‘oppositional’ friction,
[177]
in the Clausewitzian meaning, should also feature. This can be introduced by a red team [Critical Thinkers] or
wargame controllers. The red team challenges assumptions and, in conjunction with the wargame controllers, can
introduce friction.
[178]
Oppositional factors, and the means of introducing them to your wargames, should be
considered during the wargame design and development phases. Mechanisms to do this
include:
· Critical Thinking/Red Teaming. This is a primary method of injecting friction; be aware that
Critical Thinkers/Red Teamers can become unpopular, and so need support.
· Scenario injects and random events (typical of hobby wargames). Chapters 19-21 cover
injects in detail. These are a Control function that requires meticulous planning and
rigorous execution.
· Asking ‘What if?’ questions during analytical wargames. You will note a specific ‘What if’
serial in the Course of Action Wargaming process in Chapter 26. Note the distinction from
‘So what?’ questions.
Chance
‘War is the province of chance. In no other sphere of human activity must such a margin be left
for this intruder. It increases the uncertainty of every circumstance and deranges the course of
events.’
[179]
Carl von Clausewitz
Chance is an ever-present characteristic of warfare, and so must feature in wargames. It is an expression of risk,
which is a fundamental concept that all military personnel should be experienced in calculating and managing;
wargaming allows this in a safe-to-fail environment. Chance plays a key role in handling the extensive middle
ground between inevitable failure and confident success. The element of chance is most easily generated in a
[180]
wargame by using random number generators.
Space precluded a fuller discussion of chance in the Wargaming Handbook. I hope you do not
need persuading of its importance – but others might. I explain the spiel I use to introduce the
role of chance in Chapter 24 (Generating Outcomes), but paraphrase that briefly here. Military
folk venerate Clausewitz, as do many business people. Clausewitz's 'Paradoxical Trinity' is ‘…
composed of primordial violence, hatred and enmity, which are to be regarded as a blind natural
force; of the play of chance and probability within which the creative spirit is free to roam; and
of its element of subordination, as an instrument of policy, which makes it subject to reason
[181]
alone.’ My simplification of that is that Clausewitz’s Paradoxical Trinity comprises passion,
logic and chance. Chance is the ‘intruder’ in Clausewitz’s quote; it is certain to play a part in war
and so must feature in wargames.
Uncertainty
Uncertainty and the fog of war are fundamental characteristics of warfare, and should be considered in a
wargame. Experiencing uncertainty fosters a robust mental capacity among players, better allowing them to deal
with adverse outcomes. It often leads to new, and unexpected, situations and insights. Active, thinking
opponents and the element of chance are the primary means of introducing uncertainty into a wargame, but
other methods include:
· Hidden movement, until forces or intentions are revealed by intelligence;
· Unclear or unspecified aims and intentions, including those of allies, actors and factions other than the
adversary;
· Random events appropriate to the scenario such as bad weather, political interference, media scrutiny or
mechanical breakdown affecting operations; and
· Altering the sequence of play, which can allow one side to ‘steal a march’ or get inside the decision-making
[182]
cycle of the other.
Adversarial, oppositional, chance, friction and uncertainty are clearly interrelated. However,
[183]
when designing and delivering wargames, each deserves discrete consideration.
The primacy of player decisions
The players are the protagonists. Their combined behaviour should determine the course of a wargame.
Maximising story-living, with all its benefits, takes place when the narrative is driven by player decisions and when
players face the consequences. During execution, this requires wargame controllers to allow a dynamic, open-
ended narrative to evolve. Care should be taken to avoid:
· Presumptive answers influencing analytical wargame design and execution so that outcomes inevitably
reinforce these preconceptions; and
· Excessively predetermined events in a training wargame that constrain player decisions and constrict a
[184]
dynamically evolving narrative.
The second bullet reinforces the necessity to address the recurring issue of excessive pre-
scripting that I introduced in Chapter 1 and discuss in Chapters 19-21 (Scenario Writing,
Development and Execution).
Freedom to fail
‘If you're willing to fail interestingly, you tend to succeed interestingly.’
Edward Albee (Reader’s Digest, 2012)
and, more down to earth (figuratively),
‘The greatest teacher, failure is.’
Yoda, The Last Jedi
Safe-to-fail. Wargames can provide a safe-to-fail environment, where mission command is practised and ‘thought
experiments’ undertaken with no fear of failure. Commander Field Army said in January 2017: ‘Delegate and foster
mission command in barracks as much as in training. Be bold and reward boldness. Release the genie from the
junior commander bottle. I don’t want failsafe (except in security, money, Service complaints and law). I want safe-
[185]
to-fail – providing the reason is positive.’ Wargames that involve undue assessment of participants,
consciously or subconsciously, stifle innovation, risk-taking and the opportunity to learn. The Israeli Defence
[186]
Force perspective is that, to learn, the trainee has to fail, be surprised and be mentally challenged.
These are fine words – which I rarely see put into practice. The ‘genie will only be released from
the bottle’ when all senior officers engage with wargames by actually playing them, accepting
the risk of their own failure in front of their peers and subordinates, and encouraging their
subordinates to do likewise with no fear for their careers. William Lind says, ‘Wargaming should
play a much larger role in courses of instruction. The purpose of gaming should not be to see
who wins or loses, nor to attempt to “prove” certain specific approaches. There should be no
“school solutions”. Rather, it should be to teach students to make quick decisions through a
coherent, logical, thought process, while under pressure. General F.W. von Mellenthin, a 1937
graduate of the German War College, has told of the frequency and importance of wargaming at
that school. He stressed that there were no “right answers”. A student was never told his
decision was wrong. He was criticized for only two things: failure to make a timely decision, and
inability to give a logical, coherent explanation for his decision. But if he made either of these
[187]
errors, he was criticized severely.’
‘Freedom to fail’ particularly applies to business contexts. Business wargames offer a safe-to-
fail environment where strategies can be implemented that carry too much risk to attempt for
real. In the actual world, any suggestion of falling profit margins or share prices can lead to
instant dismissal and loss of livelihood, so it is often too risky to attempt bold plans. These can
be trialled in the safe-to-fail environment of a wargame.
‘You win, or you learn’
I don’t know who coined the phrase ‘you win, or you learn’, but I first heard it from Professor
Phil Sabin at Connections UK in 2013. It perfectly sums up one of the key reasons to wargame: it
is far better to make mistakes in a safe environment where no-one will die or lose their job than
to wait to make them on live operations.
Small, cheap and frequent
Recent United States experience shows that the majority of wargames should be cheap, frequent and played in
[188]
small groups. While some wargames are necessarily large, a ‘cheap and frequent’ approach maximises
learning opportunities and allows innovations to develop in subsequent games. Single wargames conducted as a
‘box ticking’ exercise generally fail to build on the educational process or analytical findings.
Pareto’s Law applies to wargames. Also known as the 80/20 Rule, the Law states that you derive
80% of total effect, or benefits, from the first 20% of resources applied; to then strive for the
final 20% of realised benefits requires 80% of the effort. Small, cheap and frequent wargames,
played iteratively, will deliver 80% of insights and innovations, while a large, comprehensive
game that might deliver the final 20% of benefits will be expensive and time consuming.
This turns on its head the current tendency for wargames to be large affairs that are so
expensive that they happen infrequently. It is a characteristic of successful wargames that I find
is bubbling up to the top of that list in importance, and I increasingly advocate the approach.
Commercial Off The Shelf (COTS) or Modified Off The Shelf (MOTS) games provide an easy
option for playing games frequently and cheaply. Small groups of players executing iterative
wargames with an opportunity to learn between these will elicit more insights than a single large
[189]
event. This is the case in an analytical or training context. Larry Bond’s Persian Incursion is
a good example of a COTS game used by small groups of Pentagon planners. It is a game about
Israeli air strikes to disrupt the Iranian nuclear programme, but the mechanisms can be applied
elsewhere. Bond’s introductory quote is instructive:
‘What an [analytical] wargame can do is show which interactions are important. Simple study will
not reveal them – there are just too many. “Banging the rocks together” gives all the factors full
play. One game will be instructive. A second game may reveal a pattern. By the third play, both
[190]
sides will know what drives the situation.’ (My italics.)
Engagement. Challenge and professional satisfaction should be inherent in all games, but wargames should also,
where appropriate, be fun. This in no way undermines the serious nature of wargames. ‘Fun’ is an acceptable
term; it is a primary factor in ensuring that players engage. Engagement, through active learning, leads to better
internalisation of training lessons and greater analytical insight. This effect extends to wargame designers, support
staff and observers.
Control is the minute-by-minute activity that ensures the wargame proceeds as required to address the problem. It
most often takes the form of a wargame control team, which can range from one person to hundreds sharing
[193]
distributed systems in many geographical locations.
I’ve discussed control and Control previously, and devote Chapter 22 to these essential topics.
Appropriate adjudication
Note the broad adjudication approaches and associated factors at Figure 4-1. You need to
ensure that the adjudication approach you select, and the supporting methods, models and
tools are appropriate to the real-world applicability and level of discovery of your wargame.
Transparency
Simulation outcomes, and the reasons for these, should be clear and open to scrutiny. This allows participants to
understand the dynamics of a situation. When adjudication is based on transparent calculations there is a clear
understanding of how the outcomes have been derived. Transparency is equally important in training and
[194]
analytical games.
Analysis is one of Rex Brynen’s ‘Three Pillars’ of good wargames. It permeates all activity from
conception through design, execution and refinement. I devote Chapter 8 to it, primarily written
by Margaret Polski and John Scott Logel from the US Naval War College. Also note the
contribution by Stephen Downes-Martin in Chapter 16 (Wargame Refinement), which discusses
AARs and other forms of in- and post-game analysis.
Wargames provide greatest utility when used iteratively within a wider decision-making process. For example,
Defence experimentation recognises the necessity to combine different techniques in a series of inter-connected
[195]
events. The Integrated Analysis and Experimentation Campaign Plan (IAECP) is accepted best practice. A
multi-technique, integrated approach enables a ‘cycle of research’, which is: ‘an iterative application of the
[196]
principal tools the military uses to explore, understand, and prepare for future conflict.’ Wargames are one
[197]
potential component. The principle applies to all wargaming, whether analytical or training.
This wider context, including the cycle of research and IAECP, is covered in Chapter 8
(Analysis).
Appropriate technology
Appropriate supporting simulation. All wargames require simulation. There are many instances where this can be
manual, rather than computerised; even role-playing is a form of simulation. Computer and manual simulations
each have strengths and weaknesses and are usually complementary. All types of simulation should be
considered, and an appropriate solution determined. Whatever supporting simulation is selected, there is generally
a requirement to incorporate a human-in-the-loop, usually as part of the adjudication process.
Appropriate technology is the subject of Chapter 9. Technology supports your wargames: ‘the
instrumentality is not the game’; and ‘wargame’ is not synonymous with ‘simulation.’ In his
keynote address to Connections UK in 2013, Peter Perla commented that when von
Muffling saw his first kriegsspiel in 1824, he didn’t see maps, counters and rules (the
instrumentality); he saw the wargame happening in the minds of the players, with the same
thought processes engaged as on staff rides and live operations. The instrumentality – the
supporting technology – is only there to inform, or affect, the players.
Simplicity
'It is very difficult to keep a game-design project simple. Once you get going, there are
tremendous temptations to add this and that. A game design is a very dynamic activity. It soon
acquires a life of its own, asking questions and providing parts of answers. The game
designer is sorely tempted to go deeper and deeper. Without some years of experience and a
high degree of professional discipline, it is extremely difficult to do an unsimple game that is not
a truly incomprehensible one. For a game is, in addition to being a source of information, also a
form of communication. If the information cannot be communicated, the game does not work.
You’ve got to keep it simple.'
[198]
Jim Dunnigan
You must fight to keep your wargame simple. Too many people offer WAGIIs (‘What A Good Idea
If…’) or CORGIs (‘Commanding Officer’s Really Good Idea’) with little understanding of the
implications. As Jim Dunnigan warns, wargames are also prone to ‘mission creep’, with the
areas being examined spiralling out of control. There is always pressure to add more detail,
examine more subjects of analysis and so forth. I have lost count of the times I’ve heard the
phrase, “We do not have time to ‘boil the ocean’, so will concentrate on x…” – and then we
press on and try to boil said ocean, adding to the scope and complexity of the wargame.
Ironically, a well-designed and well-run wargame looks simple – but isn’t. It is the same as
watching any expert at work: the apparent ease belies the hard design and development work
that has delivered a seemingly effortless wargame. It leads people to think, erroneously, “That’s easy
– I can do that.” The following quotes are self-explanatory soundbites you can use to introduce
or shape a discussion on the necessity to keep wargames simple.
‘Everything should be made as simple as possible, but no simpler.’
Einstein
‘Simplicity is the ultimate sophistication. It takes a lot of hard work to make something simple, to
truly understand the underlying challenges and come up with elegant solutions.’
Steve Jobs (the first sentence was Leonardo da Vinci)
‘Everything in war(gaming) is simple, but doing the simplest thing is difficult.’
Clausewitz-ish
Accuracy versus playability (simplicity)
‘Simplicity in manual games is achieved by abstracting ancillary elements through game
mechanics.’
[199]
Robert Hossal
The tension between accuracy and playability is at the heart of most wargame design dilemmas.
Brian Train said during Connections UK 2013 that, ‘Games should be simple, and teach basic
principles and dynamics quickly; everything else is an add-on that can distract from this
[200]
fundamental requirement.’ Phil Sabin devotes a chapter of Simulating War to this, entitled
‘Accuracy vs Simplicity’. I can express it no better than Phil, so here’s a short quote to end the
chapter. You should also read the rest of Phil’s chapter, pp.19–30 and p.68.
‘Perhaps the most pervasive trade-off affecting all human attempts to understand the world in
which we live is that between accurately capturing the almost infinite complexities of reality and
keeping our models simple enough to be grasped by ordinary minds and used as a practical
guide for action… Above all, it is crucial to remember that a simple wargame that is played will
[201]
be more instructive than a detailed wargame that is not’.
Chapter 7. The Wargame Team
‘It is important to make one thing clear at the very start; designing and delivering a wargame is
an art, not a science. Experienced military officers, practised operations research analysts, and
accomplished computer programmers are not necessarily capable of designing useful
wargames. Although some or all of the knowledge and skills of such people are important tools
for a wargame designer to possess, the nature of game design requires a unique blending of
talents.’
[202]
Peter Perla
Introduction
This chapter introduces the wargame team: who it consists of and what their roles are. The ideas are
developed in Parts 3 and 4, where the synergistic efforts of the team, and the part they play in
the wargame processes, are detailed. Some of the terminology presented here and in the
Wargaming Handbook has been queried by US commentators on this book, so be mindful that it
is not a universally accepted template.
Peter’s quote is fundamental to good wargame design and delivery. However, one aspect of the
wargame team not mentioned is the customer, or sponsor (I use the terms interchangeably). I
defer to the wisdom of Stephen Downes-Martin, who has written extensively on the relationship
between the wargame designer and sponsor. Stephen’s experience informs the ‘Dealing with the
sponsor’ section, below.
Everything in this chapter is subject to scalability: the concepts and functions are applicable to
all wargame sizes and contexts.
Wargaming Handbook extract
Roles and Responsibilities
The credibility of a wargame depends on the skills and experience of the wargame design and delivery team, with
senior sponsorship and support. The key personnel involved, and their roles and responsibilities, are outlined
below. The NATO Bi-Strategic Collective Training and Exercise Directive 075-3 details the roles required in large-
scale events. The following list is shorter, acknowledging that wargames can be small. Indeed, some of the roles
below might be combined.
[203]
Game sponsor. The sponsor is the senior officer or official under whose authority the game is conducted.
Defence wargames are usually initiated by a sponsor. As well as starting the process, their understanding,
continuing commitment and open-mindedness will contribute to successful wargames. The sponsor needs to:
· Inculcate a common and widespread culture of wargaming at all levels, characterised by senior sponsorship
and active participation.
· Define the problem to be wargamed and approve aims and objectives.
· Remain open-minded to wargame insights; cognitive bias and Service or individual interests must be avoided.
Game director. The game director represents the sponsor, and is responsible for delivering a wargame that
satisfies the problem. Once a wargame’s aims and objectives have been approved by the sponsor, the game
director is responsible for achieving them. The game director is responsible for the following:
· Ensuring that the wargame team consists of suitably qualified and experienced personnel.
· Being actively involved in the design and development of the wargame.
· Ensuring that planning is done at the appropriate time. It is too often assumed that a wargame can be ‘pulled
off the shelf’ at the last minute. Sufficient time to design and develop all elements of the wargame must be
allowed.
· Empowering the wargame team. The game director should be open to external ideas but protect the team from
unwarranted criticism and ensure that design and development outcomes are acted upon by other decision-
makers.
· Ensuring the wargame is correctly staffed.
· Providing direction and guidance as required during wargame execution to ensure objectives are met.
· Ensuring that lessons are identified throughout the wargame process, analysed and promulgated.
· Validating the wargame and promulgating findings.
The wargame team. An empowered wargame design and delivery team must be established at the outset of a
wargame project. The size of the team will vary considerably with the scale of the wargame. The wargame team
should comprise the following:
a. Sponsor representative. One or more representatives of the sponsor should form part of the design team. They
are the custodians of the aim, objectives and scope of the wargame, and should be available throughout for
direction and clarification.
b. Designer. An experienced wargame designer should orchestrate the programme of wargame design and
development.
c. Analysts. Analysts are usually required to design and/or validate:
o Simulation models, to ensure these are sufficiently realistic; and
o Data collection and management plans.
d. Simulation experts. Experts are required to ensure that the simulation(s) selected are appropriate and will
enable delivery of the wargame objectives.
Game Controller. The game controller (GameCon) is the critical role during wargame execution. They steer the
wargame minute by minute to achieve the objectives, following direction and guidance from the game director as
required. The role includes, but goes beyond, ‘umpire’: the GameCon should be the final arbiter of all key
decisions, which might relate to adjudication, scenario evolution, or any aspect of the wargame. The GameCon can
be likened to the conductor of an orchestra, controlling all sections of the ensemble to produce a harmonious and
coherent whole. As well as being the key wargame controller, the GameCon is responsible for the following (but
does not necessarily personally undertake them):
a. Adjudication. Whether a person, in the form of an adjudicator, or a multi-person, multi-tool function, the
adjudication process is key to the success of the wargame.
b. Facilitation. The complexity of the wargame might necessitate a facilitator, or facilitation organisation. The
facilitator/organisation could assist both players and wargame support staff.
Players. Wargame players can number from one to thousands. They are usually organised into cells, the size and
shape of which can vary considerably.
[220]
Figure 8-3. DCMP aide-memoire. Column headings: Research Question; Issue; Sub-issue; SoA; EEA; Observable attributes; Measure, collection and methods; Method, Model and Tool required; Scenario requirements; Scheduling requirements.
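For teams that prefer to hold the DCMP digitally, the sketch below shows one possible rendering of a single row as a Python data structure. The field names are my own, derived from the column headings above (I have assumed that SoA and EEA stand for subjects of analysis and essential elements of analysis); this is not an official schema.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class DCMPEntry:
    """One row of a data collection and management plan (illustrative, not an official schema)."""
    research_question: str
    issue: str
    sub_issue: str
    subjects_of_analysis: List[str] = field(default_factory=list)            # SoA (assumed expansion)
    essential_elements_of_analysis: List[str] = field(default_factory=list)  # EEA (assumed expansion)
    observable_attributes: List[str] = field(default_factory=list)
    measures_and_collection_methods: List[str] = field(default_factory=list)
    mmt_required: List[str] = field(default_factory=list)                    # methods, models and tools
    scenario_requirements: str = ""
    scheduling_requirements: str = ""

# Example of use (the content is invented for illustration):
entry = DCMPEntry(
    research_question="Can the force sustain operations beyond D+10?",
    issue="Logistics",
    sub_issue="Fuel resupply",
    observable_attributes=["fuel state reported each game turn"],
)
```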
Observations, insights and lessons identified
Many people find it useful to sub-divide insights arising from a wargame into observations,
insights and lessons identified (OILS). The definitions below are paraphrased from a knowledge
[221]
management web site.
1. Observations. Observations are captured from sources, whether they be people or things.
Observations are the basic building blocks for insights and lessons identified, but they often
offer limited or subjective perspective on their own.
2. Insights. Insights are objective conclusions drawn from patterns or groups of
observations.
3. Lessons identified. Lessons are insights that have specific potential and actual authorised
actions attached. Lessons identified are those that substantiate requests for recommended
actions to be authorised. Lessons learned are those that have been accepted into doctrine,
operating procedures etc. A wargame will elicit lessons identified.
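Where observations are captured digitally, the OILS progression can be made explicit in the data collection tooling. The sketch below is one possible way to do that; the class and field names are mine, not from any standard.

```python
from dataclasses import dataclass, field
from enum import Enum, auto
from typing import List

class Maturity(Enum):
    OBSERVATION = auto()        # raw capture from a person or thing
    INSIGHT = auto()            # objective conclusion drawn from a pattern of observations
    LESSON_IDENTIFIED = auto()  # insight with a recommended action awaiting authorisation

@dataclass
class Finding:
    text: str
    maturity: Maturity = Maturity.OBSERVATION
    supporting_observations: List[str] = field(default_factory=list)
    recommended_action: str = ""

    def promote(self, new_maturity: Maturity, recommended_action: str = "") -> None:
        """Promote a finding as analysis matures, e.g. insight -> lesson identified."""
        self.maturity = new_maturity
        if recommended_action:
            self.recommended_action = recommended_action
```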
The remainder of this chapter is a contribution from Margaret Polski and Jon Scott Logel. It is
based on U.S. Naval War College Wargaming Department course materials and working papers
that describe the Department’s approach to analysis.
U.S. Naval War College Wargaming Department – Doing Analysis
The Wargaming Department (WGD) at the U.S. Naval War College (NWC) has been wargaming as
a means to explore military decision challenges since 1887. Today, the WGD principally executes
analytical wargames to address Navy senior leaders’ most difficult questions about war fighting.
[222]
In our view, analytical wargaming is a flexible method for analysing and experiencing strategic
and operational decision-making and interaction. It is a tried and tested approach to rapidly
advancing knowledge about warfighting challenges. In a joint and combined arms environment,
wargaming also provides the opportunity to build and sustain essential working relationships
[223]
across services, partners, domains, and theatres.
From an analytical perspective, wargames, like real war fights, are unique events. The results
that emerge from a single analytical wargame are specific to the problem, purpose, objectives,
and research questions that provide a framework for design and execution, the participants, and
the event. In order to produce insights that are useful for command decision making, analysts
must use a disciplined method, adhere as closely as possible to the principles of scientific
inquiry, and identify the strengths and limitations of their work.
In the sections that follow we elaborate on how we think about analysis. The first section provides background on the development of analytical wargaming at the NWC, our terminology, and our research design process. The second section explains how we approach analysis in each phase of our wargaming process. The final section summarizes and provides advice.
Background
Analytical wargaming at the NWC is based on the work of Captain William McCarty Little,
Captain Wilbur Van Auken, and Francis McHugh. These early military operations researchers
developed wargaming at the college over a period that spanned the founding of the College,
the Inter-War period, and the Cold War era. McCarty Little is credited with introducing
wargaming to the Navy, and he wrote a number of papers on the topic in his time at the NWC.
[224]
Van Auken established a research department in wargames at the College in 1932, and
was one of the first to systematically analyse the BLUE-ORANGE games of the interwar period.
[225]
However, it is McHugh who is most closely associated with developing and documenting
[226]
the methodological approach that the WGD uses today. His definition of analytical
wargaming, which incorporates the definition McCarty Little articulated in a lecture at the NWC
in 1912, is: ‘A war game is a simulation, in accordance with predetermined rules, data, and
procedures, of selected aspects of a conflict situation. It is an artificial – or more strictly – a
theoretical-conflict … to afford a practice field for the acquirement of skill and experience in
the conduct or direction of war, and an experimental and trial ground for the testing of
[227]
strategic and tactical plans’.
The WGD is an integral part of the U.S. Navy’s (USN) research enterprise and supports national
defense strategy and operational design. The USN uses wargaming to research military
operations, provide experience in decision making, educate personnel, and shape critical
decisions and investments. The WGD is responsible for designing and executing wargames to
help leadership and their staff better understand their most vexing dilemmas. WGD wargames
are designed to challenge the assumptions underpinning joint and naval operating concepts and
plans, identify critical issues for further analysis, and produce a ‘punch list’ for action. Every
analytical wargame has educational and experiential as well as research components: typically, a
fleet commander’s objectives for a wargame include not only gaining insights, but also providing
an opportunity for senior staff to experience decisions when executing a concept or planning
against an aggressive adversary.
In our experience, a successful analytical wargame addresses the research questions and
generates questions that inform further analysis, exercises, experiments, and planning. At the
end of a wargame, the sponsors and participants should have a better understanding of what
they know, what they need to learn more about, and where they need to focus refinements. A
well-designed wargame, properly analysed, can also have implications for national security policy,
strategy, and naval enterprise management.
[228]
Figure 8-4. The U.S. Navy’s Approach to Analysing and Researching War-Fighting
Today the WGD uses a team research method that includes steps that are typical in professional
research design modified to accommodate the USN’s applied research requirements. Most of
our wargames investigate questions related to executing maritime concepts of operations and
plans in a joint, multi-domain, coalition environment. While we regularly design and execute
wargames for joint military operations planning courses, our primary responsibility is to
investigate challenges identified by the Navy’s Chief of Naval Operations (CNO) and fleet commanders. Hence most of our wargames are classified and, as analysts, we are required to work with our wargame teams to
ensure that we can draw implications from play that inform strategy, operational design, and
organizing, manning, training, and equipping the force.
Approach
Each fiscal year the CNO’s staff asks fleet commanders to submit wargame proposals. The WGD
reviews these proposals and makes recommendations to the CNO’s staff about which proposals
are best suited to wargaming and will contribute to the USN’s analytic agenda. Once the CNO
decides on a wargame plan, the WGD develops a schedule for the fiscal year and assigns a team
to execute each of the wargames in the schedule.
A WGD wargame core team includes a director, analyst, designer, developer, knowledge
[229]
manager, and logistician. As the wargaming process unfolds, a lead adjudicator is
integrated into the core team. The analyst provides the framework for designing and developing
a wargame, builds a Data Collection and Analysis Plan (DCAP), and a team to execute the plan
(DCAT). The analyst leads the team in collecting data on players’ decisions and decision-making
processes during play, analyses these data, and drafts wargame reports for peer review. Peer
reviewers typically include wargame team members, members of the WGD including the
Chairman of the WGD and his deputy, and command staff.
The WGD’s analytical wargame research process, which is depicted in Figure 8-5, includes
seven phases: tasking, design, development, testing, rehearsal, execution, and reporting. All
WGD wargames begin with analysis and integrate analysis into each phase of the wargaming
process. However, as the following sections describe in more detail, analytic goals and tasks
differ across the phases.
[230]
Figure 8-5: U.S. Naval War College Wargame Department Research Design Process
The WGD’s disciplined approach to analysis makes it possible for other wargamers to replicate
or repeat a wargame, to iterate on some aspect of a wargame in subsequent research, and to
understand how wargaming and wargame findings fit into a broader military operations research
agenda that includes other research methods such as modelling and simulation, experiments,
[231]
and exercises.
Wargame Analysis and Tasking
The overall team goal in the tasking phase is to reach agreement with the fleet commander
about the problem, purpose, and objectives for the wargame. At the end of the phase, which
includes an initial planning conference (IPC) with the commander and relevant staff, the WGD
executes a final proposal and schedule for the wargame.
Working from the preliminary wargame proposal and prior knowledge, the analyst begins
building a ‘literature review.’ The literature review, which often continues through the
development phase of the wargaming process and even in post-game analysis, involves
identifying and reviewing the results of other relevant studies, exercises, experiments, and
wargames to inform the problem, purpose, and objectives for the new wargame. The analyst
asks the following questions when conducting this review:
1. What are the current policy, strategy, and operational design challenges in the
commander’s area of responsibility (AOR)?
2. What does the USN know now about warfighting challenges in the AOR based on
previous research, what is the extent of the available evidence, and what is the quality of
the evidence? What don’t we know or what haven’t we previously studied?
3. What doctrine, concepts, plans, instructions, and assumptions address operational
challenges in the AOR and how well are they supported by evidence?
Based on the literature review, the analyst helps the wargame team draft an initial statement of
problem, purpose, and objectives for the wargame, which is then discussed and refined in the
IPC. WGD analysts typically participate in planning conferences with the wargame director,
designer, commander, and staff. The text box offers an example of a wargame framing
statement:
Problem: We need to refine our maritime planning products in light of changes in the
adversary’s capabilities.
Purpose: Challenge existing planning framework assumptions in order to identify warfighting
strengths and weaknesses.
Objectives: 1) Specify ways of employing capabilities to execute the planning framework; 2)
Identify the scale and scope of risk to force and mission; 3) Develop a deeper understanding of
the nature of the fight if the planning framework is executed.
[233]
Figure 8-7. Conceptualizing Data Collection and Analysis Planning
In the development phase the analyst and the wargame team are thinking through the type and
structure of the data we will collect from players and how we will collect these data (collect raw
data), how we will analyse them once collected, and how we will clean and organize the data so
that we can identify patterns (visualize) and communicate findings (report). The value of our
data and our ability to use modelling, simulation, or Artificial Intelligence (algorithms) to help us
visualize or use our data (build data products) is linked to the type of data we collect, the
structure of the data, and the tactics we use to ‘interrogate’ and archive our data. This brings us
to a discussion of tools and technologies.
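As an illustration of this collect, clean, visualise and report flow, a minimal Python sketch using pandas is shown below. It is not the WGD platform; the decision records, column names and file name are invented.

```python
import pandas as pd

# Invented raw capture: one row per player decision, as a structured log might record it
raw = pd.DataFrame([
    {"turn": 1, "cell": "Blue Ops", "decision": "Commit reserve",  "rationale": "Threat to left flank"},
    {"turn": 1, "cell": "Blue Ops", "decision": "commit reserve ", "rationale": "Threat to left flank"},
    {"turn": 2, "cell": "Red Ops",  "decision": "Screen north",    "rationale": None},
])

# Clean: normalise the free text and drop the duplicate introduced by double data entry
clean = raw.assign(decision=raw["decision"].str.strip().str.lower()).drop_duplicates()

# Visualise / identify patterns: how many decisions each cell made per turn
pattern = clean.groupby(["turn", "cell"]).size().rename("decisions")

# Report: write the tidy data for the archive and print the summary for the write-up
clean.to_csv("decisions_clean.csv", index=False)
print(pattern)
```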
Using tools and technologies
WGD analysts have access to several computer-based tools and technologies that help us
collect, analyse, clean, visualize, and build data products. For example, when we wargame in our
own facilities at the NWC, we use a custom-designed wargaming platform that automates
collection of data generated by player moves, requests for information, survey responses,
player communications, adjudication, and situation updates.
Automating data collection helps to reduce the potential for bias related to errors, omissions,
and misinterpretations. It also improves the efficiency of data collection and analysis processes.
In other words, properly done, automated data collection generates cleaner data, which reduces
the need for additional cleaning processes.
WGD analysts also have access to several analytic tools that help us crawl through and
interrogate our data more efficiently and with less human bias. For example, we use Atlas.ti,
Analysts’ Notebook, and custom-developed forms of artificial intelligence to visualize and
identify patterns in our data. Similarly, we use commercial and custom-developed software tools
to build data products and archive our data for future use.
In order to use our tools properly and to begin the process of developing new tools, WGD
analysts use the development phase of a wargame to specify data collection requirements. Table
8-1 provides an example of this type of specification.
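A minimal sketch of how such a specification might be held in machine-readable form is shown below; the data sets, fields, cell names and the simple completeness check are illustrative assumptions, not the WGD’s actual specification.

```python
# Illustrative data collection requirements: data set -> type, structure, source and tool
DATA_COLLECTION_SPEC = {
    "Operations Directives": {"type": ["text", "numbers", "images"], "structure": "structured",
                              "source": "Operational Cells", "tool": "wargaming platform"},
    "Survey Responses":      {"type": ["text"], "structure": "structured",
                              "source": "Operational Cells", "tool": "survey form"},
    "Analyst Notes":         {"type": ["text", "images"], "structure": "unstructured",
                              "source": "Data Collectors", "tool": "note template"},
}

def unsourced_data_sets(spec, cells_present):
    """Flag data sets whose nominated source will not be present in the wargame."""
    return [name for name, req in spec.items() if req["source"] not in cells_present]

print(unsourced_data_sets(DATA_COLLECTION_SPEC, {"Operational Cells", "Control Cell"}))
# ['Analyst Notes'] -> prompts the team to confirm data collectors are resourced
```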
Wargame Analysis and Testing
The WGD team goal for the testing phase of an analytical wargame is to integrate all of the
individual pieces of the wargame that were created in the development phase and make sure
that they 1) effectively and efficiently achieve the wargame purpose and objectives, and 2)
generate data that will allow us to address the wargame research questions.
Table 8-1. Example specification of data collection requirements

DATA SET | TYPE OF DATA | STRUCTURE | SOURCE
Briefings | Text and Images | Structured & Unstructured | Players
Commander’s Intent | Text | Structured | Operational Cell Leads
HHQ Guidance | Text | Structured | HHQ Cells
Operations Directives | Text, Numbers & Images | Structured | Operational Cells
Survey Responses | Text | Structured | Operational Cells
Situation Updates | Text & Images | Structured | Control Cell
Open Session Comments | Text | Structured & Unstructured | Players
Data Collector & Analyst Notes | Text & Images | Structured & Unstructured | Players & Control Cell
Summary
As analysts in the U.S. Naval War College War Gaming Department we have the good fortune
and privilege to build on over 130 years of analytical wargaming practice. We stand on the
shoulders of some truly great strategic thinkers who developed a capability that has helped the
U.S. Navy prepare to fight and win when called to do so.
This chapter provides a brief overview of how we approach analysis when we wargame, which
is summarized in Figure 8-8. From this overview, three key points emerge:
(1) By definition, an analytical wargame requires that analytic activities are integrated into all
phases of wargaming. An analyst’s work begins before the tasking phase and continues
after the wargame executes until a report is produced, disseminated, and archived.
(2) Most analytic activities are not confined to a single phase and are not performed
sequentially: they continue to evolve and are performed in parallel fashion across
wargaming phases
(3) War-fighting is a complex challenge that is very difficult to investigate. Hence, the best
analysts are humble sceptics who are willing to acknowledge the limits of their work.
[234]
Figure 8-8. Nesting analysis in wargaming
Operations research communities are rife with - and often thrive upon - controversy and
competition. Every wargamer we know has been called upon to explain or defend their methods
in comparison to other, more formal methods of investigating warfighting problems such as
modelling, virtual simulations, exercises, and experiments. However, our most interesting war-fighting challenges, such as jointly fighting peer and near-peer adversaries in a global political
and economic system with five domains, are messy and difficult to investigate. No method,
including wargaming, can generate perfect analytical rigor: these are ideal standards, not
minimum thresholds.
While WGD analysts work hard to ensure that our wargames produce findings that senior
leaders can rely on along with other analyses to ensure warfighting readiness, we know that our
analyses have strengths and limitations for decision-making. We believe that all military
operations researchers including wargamers ought to acknowledge the limits of their analyses
and provide estimates of rigor in their briefings and reports. If there is nothing else that you take
from this chapter, we urge you to consistently estimate the rigor of your analyses. The text box
below and associated table provide an example of how we handle this in our wargaming reports.
[235]
Factor | Estimate
External Validity | Medium: We based our analysis on input from a selected sample of the types of individuals who are likely to make assessments or decisions in a similar situation, and not a random sample of the entire population.
Reliability | High: We based our analysis on players’ assessments and decisions as expressed in their directives, observations of player behavior and comments by multiple individuals in three different fora, or comments directly contributed by participants.
Replicability | High: We believe that if another equally capable analyst analyzed these data using the same procedures they would arrive at similar conclusions.
Reasonable Person | High: There is a high degree of consistency in the observations and statements we collected, which a reasonable person could use to inform decision making and action.
Chapter 9. Appropriate Technology
‘Tech can be your friend or your foe – usually both.’
[236]
Dr Karl Selke
Introduction
This chapter consists primarily of a contribution from Dr Karl Selke. Before that, there are some
fundamental points that I need to introduce.
Combating biases
In Chapter 3, I prefaced the discussion of the conflation of ‘simulation’ and ‘wargame’ with a
2013 Peter Perla quote. Peter expanded on that subsequently:
'Real wargaming is not about the unverifiable quantification of computer models. Real
wargaming is about the conflict of human wills confronting each other in a dynamic decision-
making and story-living environment. There is a place for technology in supporting that clash of
wills, but electrons are not always the most useful technology to apply.
The instrumentality is not the game.
The game takes place in the minds of the players. Human players, intensely seeking ways to
beat the brains out of the guys across the table or in the other room. It is that dynamic – and the
competition, conversation and contemplation it creates – which is our most powerful and
[237]
promising source of inspiration and innovation.' (My italics)
This chapter addresses the role of technology – the instrumentality – in supporting wargames.
You may think that you can detect an anti-technology bias running through this book. You
would be wrong, and I want to quash any such notion. However, I regularly find it necessary to
counter pro-technology and innovation biases, so understand why some might think I have a
pro-manual bias when, in fact, I am merely ensuring that all sides of the argument are heard.
Those with a pro-technology bias assume any and all new technology or innovation will be an
improvement and must be used. I am no Luddite (but accept that there are wargaming
practitioners who have a pro-manual bias), but echo Peter’s caution that electrons are not
always best. The pro-technology bias is so widespread that it is addressed as early as the
foreword to Zones of Control, where the editors discuss the cursory consideration of wargames
by the military and academia ‘before they arrive at the presumptive finality of a digital
[238]
present.’
Terminology
I need to define automation, autonomy and Artificial Intelligence (AI), as well as manual, computer-assisted and computerised.
[239]
In a presentation to a NATO Working Group , Colin Marston (Dstl) and Patrick
[240]
Ruestchmann (Serious Games Network-France ) defined automation and autonomy as
follows, and explained how they are related to wargaming. I am grateful for this information, and for follow-up conversations with the authors, in compiling this chapter.
Automation is principally defined as the act of performing repeated tasks. There is little
variation in the input, besides pre-determined categories, and such variations are accounted for
by simple measures (e.g. re-calibration, normalisation). Typical examples are found in robotic
assembly on the factory floor: the workpiece provided to the robot is known to within precise tolerances, the positioning can be adjusted to a high degree of accuracy, and the action
performed is pre-programmed. Note that automation is not limited to a single set or sequence of
actions; 'decision-making' can be performed, based on the input or result of actions, via a
decision-tree ('if' statements).
Automation in wargaming
The role of automation concerns tasks which are: 1) precisely identifiable; 2) uniform
throughout multiple wargames (of the same format); and 3) providing measurable quantities of
interest. Automation could be performed by humans, but is truly beneficial if conducted by
computerised systems (leveraging speed, uniformity, data size, accuracy and longevity). Areas
within wargaming that can benefit from automation (in part or full) are: data collection,
processing, visualisation, and adjudication.
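To illustrate the rule-based ('if' statement) character of automation described above, here is a minimal Python sketch; the factors, thresholds and outcomes are invented for illustration and are not drawn from any real adjudication model.

```python
# Invented, illustrative rule set: decision-tree adjudication of a single engagement
def adjudicate_engagement(attacker_strength: float, defender_strength: float,
                          defender_dug_in: bool) -> str:
    ratio = attacker_strength / max(defender_strength, 1e-9)
    if defender_dug_in:
        ratio /= 2                      # pre-determined modifier for prepared defences
    if ratio >= 3:
        return "defender withdraws"
    elif ratio >= 1.5:
        return "attacker gains ground"
    else:
        return "attack repulsed"

print(adjudicate_engagement(attacker_strength=9, defender_strength=4, defender_dug_in=True))
```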
Autonomy is defined as the ability to perform tasks according to the input data, state of the
agent, and environment, without strict adherence to a pre-determined decision tree. By
contrast with automation, an autonomous agent can handle significant variations and
uncertainty in the input ('fuzzy' data). A key difference is that an autonomous agent learns from
experience. This makes the decision-making process non-uniform (not all agents have the same
experiences), and approximate (there is no exact answer). All autonomous agents are given a
general set of objectives, and learn through an optimisation principle, minimising the
discrepancy between achieved and desired objectives (this is generally valid for supervised,
unsupervised and reinforcement learning). For example, an autonomous program is given the
objective of achieving the highest score in a game (the objective must have a measurable
quantity associated with it); it can play against human agents, other AI agents, or itself. If an
additional objective is provided (e.g. minimum time to success), the solution will be a
[241]
combination of those (e.g. Pareto front ).
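The multi-objective point can be illustrated with a minimal Python sketch of Pareto dominance; the candidate behaviours and their scores are invented.

```python
# Invented candidates: each pursues the same two objectives (maximise score, minimise time)
candidates = [
    {"policy": "cautious", "score": 70, "time": 40},
    {"policy": "balanced", "score": 85, "time": 55},
    {"policy": "reckless", "score": 90, "time": 90},
    {"policy": "wasteful", "score": 60, "time": 80},   # dominated by 'cautious'
]

def dominates(a, b):
    """a dominates b if it is no worse on both objectives and strictly better on at least one."""
    return (a["score"] >= b["score"] and a["time"] <= b["time"]
            and (a["score"] > b["score"] or a["time"] < b["time"]))

pareto_front = [c for c in candidates
                if not any(dominates(other, c) for other in candidates if other is not c)]
print([c["policy"] for c in pareto_front])   # ['cautious', 'balanced', 'reckless']
```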
Autonomy in wargaming
Autonomy allows a flexible approach to the analytical support of wargames. Autonomous
enhancements can potentially be applied to a wider class of wargames (or parts thereof) and
are dynamically adaptable to geographical and temporal variations (updates in policies,
scenarios, technologies etc) due to their 'learning' ability. Factors to consider include:
· Which processes, currently performed by human agents, could be replaced by trained
autonomous systems (computerised 'agents').
· Which processes, currently not performed by human agents, could be added as
autonomous agents to the execution of the wargame to improve it.
The Dstl Peace Support Operations Model (PSOM) was built around a semi-agent-based model;
the population faction (which was actually multiple factions/players depending on the
associated social-ethnic considerations within the scenario) was played by the ‘computer’. The
best way to represent influence activities is to use a simple agent-based model to support
human-in-the-loop (HITL) adjudication.
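By way of illustration only (this is not PSOM or any Dstl model), a toy agent-based sketch in Python of the kind of population-influence input that might support, rather than replace, HITL adjudication could look like this; the factions, influence values and drift rule are invented.

```python
import random

random.seed(1)   # reproducible illustration

class PopulationAgent:
    def __init__(self, faction: str, support: float):
        self.faction = faction           # social-ethnic grouping in the scenario
        self.support = support           # support for Blue, in [0, 1]

    def update(self, blue_influence: float, red_influence: float) -> None:
        # Agents drift toward whichever side applied more influence, with some noise
        shift = 0.1 * (blue_influence - red_influence) + random.uniform(-0.02, 0.02)
        self.support = min(1.0, max(0.0, self.support + shift))

agents = ([PopulationAgent("faction A", 0.55) for _ in range(50)] +
          [PopulationAgent("faction B", 0.35) for _ in range(50)])

for _ in range(3):                       # three turns of influence activity
    for agent in agents:
        agent.update(blue_influence=0.6, red_influence=0.4)

average = sum(a.support for a in agents) / len(agents)
print(f"Average support for Blue after 3 turns: {average:.2f} (an input to HITL adjudication)")
```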
Artificial Intelligence is the ability of a machine or a computer programme to think and learn. The
concept of AI is based on the idea of building machines capable of thinking, acting, and learning
like humans so they can deal with tasks in a way we would call intelligent or smart. AI involves
many different fields, such as computer science, mathematics, linguistics, psychology,
neuroscience, and philosophy. For a machine or a computer programme to be considered AI, it
should be able to mimic the human thought process and human behaviour, and it should act in
a human-like way, e.g. intelligently, rationally and ethically.
Artificial Intelligence in wargaming
AI can be used to augment automation and autonomy and can feature in a number of areas
within the wargaming process, as Figure 9-1 shows. For example, the use of personal
assistants such as Alexa or Siri has already become the new normal within everyday life. These
intelligent gadgets can recognise our speech, analyse the information they have access to, and
provide an answer or solution. They continuously learn about their users until the point at
which they can accurately anticipate users’ needs. Asking a personal assistant open-ended
questions to understand and inform a decision in a wargame can be seen as using AI in
wargaming.
[242]
Figure 9-1. Potential insertions of AI in the wargaming process
Manual, computer-assisted and computerised
I categorise the technology that supports wargames into 'manual', 'computer-assisted' and
'computerised'. According to the definitions in Chapter 2, there can be no such thing as a
‘computerised wargame’ that subsumes all the elements listed in that chapter. The essence of a
wargame is people making decisions, so it is impossible to have a wargame without people. You
can have a computerised simulation, and it is possible to computerise the adjudication function
(although that carries risk, as discussed below), where everything takes place inside the
computer with no human interference – but you cannot have a computerised wargame. The
people using a computer simulation, such as the UK’s Advanced Battlefield Computer
Simulation (ABACUS), are still making the decisions and, hence, engaged in or supporting a
computer-assisted wargame. Hence, a computer-assisted approach is one in which some
degree of digitisation is used. This is the case in most professional wargames. Examples
include operator-controlled computer simulations, spread-sheet look-up tables, situational
awareness and visualisation tools, data capture tools and the distribution of any of these.
However, even when supported by significant technology, people predominate. Players are the
vital ingredient, but ‘people’ includes control staff and analysts. You can clearly also have a
manual wargame, in which all components are physical, and all adjudication is conducted by a
human(s).
The risks of computerisation
Computerising the crucial adjudication function carries risk. One of the essential characteristics
of wargaming is transparency. The biggest risk with computerised adjudication is that the
outcomes of players' decisions are determined inside an opaque 'black box'. This might be
appropriate in hobby games, where the player(s) accept outcomes to move the game forward,
but it cannot be the case in professional wargames, where transparency and control (to meet
objectives) are key and the HITL element is essential. This is particularly the case in analytical
wargames, where analysts strive to understand player decisions and the factors considered.
The case is less clear in educational and training wargames, but it is still important that Control
understands why outcomes occur. It is also frustrating for players when 'black-box'-produced
outcomes cannot be explained.
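One practical mitigation is to require any computerised adjudication to return its inputs and the rule it applied alongside the outcome, so that Control can explain every result. A minimal, illustrative Python sketch follows; the rules and thresholds are invented.

```python
# Invented, illustrative rules: the point is the audit record, not the combat model
def adjudicate_with_audit(force_ratio: float) -> dict:
    if force_ratio >= 3:
        outcome, rule = "defender withdraws", "force ratio >= 3:1"
    elif force_ratio >= 1.5:
        outcome, rule = "attacker gains ground", "1.5:1 <= force ratio < 3:1"
    else:
        outcome, rule = "attack repulsed", "force ratio < 1.5:1"
    return {"inputs": {"force_ratio": force_ratio}, "rule_applied": rule, "outcome": outcome}

record = adjudicate_with_audit(force_ratio=2.1)
print(record)   # the whole record goes to the game log, not just the outcome
```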
The computerisation of other wargame elements should also be approached carefully. Read Phil
[243]
Sabin’s ‘The benefits and limitations of computerisation in conflict simulation’. Phil makes
the point that manual and computer simulations have strengths and weaknesses. He then
[244]
explains why he prefers ‘a more balanced mix of old and new technology’:
· Computerised simulations are ‘expert-led’, and their algorithms impenetrable to the average
wargame user (‘black boxes’).
· Computerised simulations tend to be designed from the ‘bottom-up’, maximising aspects
such as technical equipment data and graphics. This contrasts with the simpler ‘top-down’
approach standard in manual simulations, which concentrate on the overall effects and less
quantifiable ‘soft effects’ and human dimension.
· Despite 3D technology, augmented reality and so forth, computer displays do not
automatically deliver benefits over analogue solutions.
Those bullets are paraphrased from Simulating War, but I want to quote Phil verbatim to
substantiate that final point: ‘When the object is to portray units positioned on a map, computer
monitors and data projectors are actually less effective than physical maps and counters on
one or more large tables, since their fixed resolution and limited field of view frustrates
employment of the human eye's wonderful combination of central acuity and breadth of
[245]
vision.'
Permit me a short anecdote to illustrate that quote.
Supporting a corps-level wargame at the US Warrior Preparation Centre, I was part of a small
UK Lower Control ‘response cell’. We represented a brigade HQ, working to the US Task Force
(TF) Tiger divisional-level HQ. We had a small room, about 100m from the TF HQ. In the TF HQ,
rows of staff officers sat at terminals facing a bank of perhaps thirty monitors suspended on
frames. We had a table, a monitor and a printer. On asking for a map, we were told there were
none available; everything was electronic. So, we printed about forty A4 screen shots of the
operational area, taped them onto the table and used page tab markers to show units. A few
days into the exercise, one of the TF HQ staff officers visited us to receive a back-brief. We
conducted this around the map, and he commented how clear it was. A little later, the same
officer appeared and asked if he could brief his team using our map. Those briefs became
regular, and others started using the facility. Presently, the primary TF Tiger G3 staff officer
brought in his team to brief around our map. After that, he set one up in the main HQ. This was for
precisely the reasons stated in Phil’s quote: a physical map can enable more effective shared
situational awareness than tens of computer monitors (but only if there is no requirement to
distribute it).
Technology must support people, not the other way around
There is no question that technology has a role to play in wargaming, but it must be in support
of people; to make their tasks easier and to free up their capacity to think – to make decisions,
to control and to analyse. The question is, ‘What technology is appropriate?’ We know from
Chapter 5 that there is no 'one size fits all' wargame – and that applies to technology, too. There
is no tool that is better than all others in all circumstances. Indeed, it is usually the case that
different models, methods and tools are complementary, not preclusive. Figure 9-2 illustrates
such a complementary approach.
[246]
Figure 9-2. Complementary approach to wargaming. Photo courtesy of NSC
Computerised calculations can be appropriate in well-bounded situations where the physics is
understood. Examples include kinetics such as missile effects, time and space calculations,
logistics consumption and sensor calculations. However, computerised outcomes become
questionable as soon as soft and intangible factors are involved. Despite advances in agent-
based modelling (ABM), AI, machine learning (ML) etc, I am not convinced that computerised
tools can model human factors and the plethora of human-related variables to deliver valid
outcomes. However, these technologies can be used to assist wargames, providing an input
into HITL adjudication that is transparent, well-understood and able to be documented. It is a
mistake to think that technology can provide an output from a wargame.
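As an illustration of the kind of well-bounded calculation that does suit computerisation, the minimal Python sketch below estimates road movement time and fuel consumption as an input offered to the human adjudicator; all rates and figures are placeholders.

```python
def movement_estimate(distance_km: float, speed_kmh: float,
                      fuel_per_km_litres: float, vehicles: int) -> dict:
    """Simple time-and-space and logistics-consumption estimate (placeholder rates)."""
    hours = distance_km / speed_kmh
    fuel_litres = distance_km * fuel_per_km_litres * vehicles
    return {"hours": round(hours, 1), "fuel_litres": round(fuel_litres)}

print(movement_estimate(distance_km=120, speed_kmh=30, fuel_per_km_litres=0.8, vehicles=45))
# {'hours': 4.0, 'fuel_litres': 4320} -> transparent figures the adjudicator can accept or adjust
```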
To summarise this introduction:
· Many people suffer from a pro-technology and innovation bias, which leads to a
presumption of a digital solution.
· Computerising the crucial adjudication function carries risk, but computerised
simulations, AI, ML etc have a role to play as an input to a HITL process.
· Technology has a place in professional wargames, but it must support the people
involved, making their tasks easier and increasing their capacity to think.
· Manual, computer-assisted and computerised approaches are usually complementary;
there is no one-size-fits-all. Hence…
· The technology you select must be appropriate.
The remainder of this chapter is a contribution from Dr Karl Selke.
Empowering wargaming through automation: the pros and cons of the computer
medium
Introduction
Simulation, whether man-only, machine-only, or a combination of the two, is a purposeful
undertaking whereby abstraction is leveraged to provide learning useful in the real world. The
bottom line is that experience suggests that fuzzy learning objectives lead to fuzzy insights
from the simulation. The ‘design to a purpose’ theme highlighted in this book is just as central
to the discussion of computer automation within a wargame. As computer automation is often
on display in front of the players, the lack of purposeful design will lead to unwanted and often
embarrassing results as the computer automation derails or distracts from the game objectives.
Types of computer automation and simulation
Computer automation can be categorized by the role of the computer and human players within
a wargame. Figure 9-3 presents a framework for considering automation support within a
wargame.
In common with all systems and projects, a wargame is best considered in terms of a holistic life cycle, as shown
in Figure [11-1]. Wargames are iterative, and follow a circular life cycle: the refine step leads directly back into
(re-)design, even if changes are minor. Even one-off wargames should follow all steps, including validation and
refinement. Lessons identified should be captured and promulgated to inform other wargames as applied lessons
learned. All wargame proceedings, findings, suggested refinements and best practice should be collected centrally
[265]
for future use.
[269]
Figure 11-2. A 5-phase wargame project
The US Naval War College War Gamers’ Handbook describes the necessary early discussions
with the sponsor: ‘The importance of properly defining the sponsor’s problem cannot be
overstated. The game purpose, objectives, and all subsequent game actions should be mapped
back to this problem. Defining and working toward the wrong problem neither fulfils the
[270]
sponsor’s needs nor effectively utilizes the collective time and talents of the department.’
The War Gamers’ Handbook contains a useful diagram that includes this ‘Tasking’ phase, and
provides another view of the overall game project process. This is at Figure 11-3. It sub-divides
‘Development’ into ‘Testing’ and ‘Rehearsal’, reflecting the larger scale and complexity of many
of the College’s wargames compared to UK and European games.
[271]
Figure 11-3. US Naval War College game project management process
[272]
Stephen Downes-Martin offers detailed guidance on how you might approach the sponsor
during project initiation (‘Tasking’). It builds on his ‘3+1’ approach introduced in Chapter 7 (The
Wargame Team):
‘Engage sponsors by using their own mission planning process and language and make clear
the sponsor is a required participant in the wargame planning process (as is the wargame
director, designer and analyst) in just the same way that the commander's presence is
necessary during phases of the operational planning process (as is the commander’s staff). The
likelihood of an operation being well-planned and thus well-executed if the commander or staff
are absent from the planning process is as low as the likelihood of a wargame being well-
designed and executed to meet the sponsor's needs if the sponsor (or wargame staff) is absent.
The action officers cannot do it all: the sponsor must engage with the wargame planning
process. You demonstrate respect for the sponsor's and action officers' time and expertise by
using their own process.
For military sponsors, treat the request for a wargame as a mission to which you bring (a
suitably adapted) mission planning process using military language. A military officer serious
about sponsoring a game should respond well. Literature from the project management world is
useful when dealing with civilian sponsors from civilian organisations. For military sponsors I
strongly recommend you explicitly use your nation's operational planning manuals/doctrine
documents to build credibility and put pressure on your sponsor to perform.
What you are after is a “Mission Statement” consisting of objective, guidance, commander's
intent, intelligence (about the problem) etc. Using this language makes clear to the military
officer that this is a serious endeavour, requiring the commander's (sponsor's) attention if the
mission (the game) is to be successful. You will learn much about whether your organisation
should accept the request for a game from the sponsor's attitude towards engaging with you
[273]
(under your guidance of course - the 3+1 questions laid out in the Three Witches paper
helps with this) as you undertake “Mission Analysis”.
The “Restatement of the Mission” part of the planning process is handled by the follow up
“Initial Planning Conference”, at which you obtain a signature for the objectives, Commander's
Intent, guidance, desired timings, etc – or start negotiating. But get a signature before
proceeding with the game. Also make clear that the game will not happen without the sponsor's
signature on a mutually agreed “restated mission”, and that late changes to this document will
probably result in delays to the timetable due to the need to redesign the game.
“Sponsor's Objective.” You will decide whether a wargame or some other type of activity best
addresses that objective, and you will be in charge of the wargame design if a wargame is the
best way forward. Objectives are contractual, i.e. how well you achieve the objectives
determines how successful was the game.
Use a drill down on “What do you want?”
“Commander's Intent.” This, with the Objective, will allow you to craft subsidiary objectives and
Research Questions. Research questions are aspirational; which ones the game provides
relevant data for will depend on the players and the game's trajectory, so not all of them will be
addressed or addressed to the same depth.
Use a drill down on “Why do you want it?”'
“Intelligence Preparation of the Environment.” The equivalent information desired here are the
resources the sponsor brings to the game, when the sponsor needs the final report (after
analysis, after the game), who is the sponsor's audience for the report and for the game, what
bureaucratic purpose does the sponsor have for sponsoring a game, analysis and report, what
has gone wrong with previous attempts to address the problem, who are the bureaucratic
enemies of the sponsor and what are their objectives, and so on.
Also drill down on “Why don't you have it?” and ask the sponsor and the action officers “When
are you rotating out of here?”
I find an initial meeting of 30 minutes discussion with the sponsor to explain the planning
process you use and obtain the initial Sponsor's Objective and Intent, followed by several hours
(a half day) with the action officers to drill down on the planning products for the game is
sufficient. The final action of this meeting is to schedule with the sponsor and action officers
the “restated mission” meeting with the requirement that the sponsor (not the action officers)
signs off on the wargame plan and design.’
Aspects of the Wargame Lifecycle discussed elsewhere in this book
Many important aspects of the Wargame Lifecycle demand their own chapters. Analysis,
scenario writing, scenario development, scenario execution, facilitation, the control function
and generating outcomes are introduced in Part 3 to highlight where these crucial and pervasive
topics relate to the Lifecycle, but are fully detailed elsewhere. Hence, there is a lot of cross-
referencing and sign-posting to other chapters of the book.
Chapter 12. Wargame Design
‘In the whole range of human activities, war most closely resembles a game of cards.’
Carl von Clausewitz, On War
Introduction
I have refined the wargame design steps in the Wargaming Handbook. Those in Table 12-1 are
better standardised between training and analytical wargames.
Wargaming Handbook extract
[274]
As McHugh advised, wargames should ‘point to’ either a training or analytical purpose, but a dual benefit will
ensue. Typical design steps are shown in Figure [12-1] for training and analysis wargames. While similar, the key
difference lies in:
· The effects to be enacted on the players (training); and
· Subjects of analysis and metrics (analysis).
The design phase often includes an initial wargame design meeting. This should be held as early as possible in the
overall planning of the activity that the wargame will support. Wargame design ‘by committee’ can be counter-
productive, so attendance should be restricted to the minimum number required. In very small wargames, one
person may perform the last three roles below. Attendees should include:
· Sponsor, or an authorised representative.
· Game director (who could be the sponsor’s representative).
· Wargame designer.
· Lead simulation expert.
· Lead operational analyst.
Table 12-1. Training and analysis wargame design steps

Training wargame design steps:
1. Specify the aim and training objectives
2. Identify how the outputs will be used and integrated
3. Identify the people to be trained, their roles and the decisions they will be expected to make
4. Determine the desired effects on the players, and the wargame activities required to create these
5. Determine the setting, scenario and types, level and sources of information the players will need to make their decisions and to enable the training objectives to be achieved
6. Identify, or design, the structures and processes required to achieve Steps 3 and 4
7. Identify or design the methods, models, techniques and subject matter experts needed to populate and enable these structures and processes, including adjudication
8. Create an audit trail by documenting all assumptions, decisions taken and the reasons for them

Analysis wargame design steps:
1. Specify the aim (usually related to the research question) and analytical objectives
2. Identify how the outputs will be used and integrated
3. Identify the subjects of analysis, the critical elements within these and key variables
4. Determine how the subjects of analysis will be examined
5. Identify the data and metrics to be gathered to enable the examination, and how this data capture will be done
6. Determine the setting, scenario, vignettes and types, level and sources of information required to enable the examination
7. Identify the people and processes required to ensure the validity of the examination findings
8. Identify, or design, the structures and processes required to achieve the examination, including adjudication of outcomes
9. Identify or design the methods, models, techniques and subject matter experts required to make these processes work
10. Create an audit trail by documenting all assumptions, decisions taken and the reasons for them
The wargame design meeting agenda typically follows the steps at [Table 12-1]. While the wargame designer
should lead, the sponsor is responsible for providing clear aims, objectives and scope for the wargame. These
frame the problem and are an essential start point for all wargame design. The outcome of the design meeting
should be an agreed and documented set of actions and their rationale. This becomes the schedule for all
development activity and provides a constant reference point for all queries arising. Wargame design is an iterative
process and outcomes should be revisited as necessary.
Scoping previous wargames. The wargame team should identify and speak to organisations that have conducted
wargames like the one they are considering. This relates directly to the ‘validate’ and ‘refine’ steps from previous
games, which should identify wargaming lessons identified for future wargame teams to use. This engagement can
[275]
occur before the design step starts in earnest.
Figure 12-2. Example extract from an analytical wargame design meeting 3-column write-up
Column 3 provides a comprehensive set of actions and outcomes. With the rationale for these
captured in the centre column, the document usually suffices as the required audit trail. I
haven’t done so in this example, but you might also capture who made the decisions and
assumptions. As well as providing an audit trail, the document serves as a prompt for agenda
items during development phase meetings to ensure that actions are complete or being
progressed.
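If it helps, the three-column write-up can also be kept in machine-readable form so that the same file serves as the audit trail and as the agenda prompt for development meetings. The minimal Python sketch below is illustrative only; the entries, column names and file name are invented.

```python
import csv

# Invented example rows: design item / decision, rationale, action and (optionally) owner
rows = [
    {"item": "Step 1: aim and analytical objectives",
     "rationale": "Sponsor requires evidence on the planning framework's weak points",
     "action": "Draft three research questions for sign-off at the IPC",
     "owner": "Analyst"},
    {"item": "Step 6: setting and scenario",
     "rationale": "Sponsor-imposed scenario lacks a maritime logistics dimension",
     "action": "Develop a logistics vignette during the development phase",
     "owner": "Designer"},
]

with open("design_meeting_audit_trail.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["item", "rationale", "action", "owner"])
    writer.writeheader()
    writer.writerows(rows)
```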
Design constraints
Determination of the scenario and simulation
Scenario writing and development is covered in detail in Chapters 19 and 20. Factors affecting
the choice of appropriate simulation were covered in Chapter 9 (Appropriate Technology). I
touch on these now because they must feature during the design meeting.
The scenario and simulation(s) are often pre-determined by the game sponsor, so will likely be
constraints on your wargame design. However, you must still address them during the
design phase, and assess their suitability thoroughly during the development phase, suggesting
amendments and work-arounds as required. Do not immediately accept the imposition of such
constraints as absolute and irrevocable, because they might have been made by people who do
not fully understand how a wargame will meet the requirement.
In an ideal world, determination of the setting, scenario and any detailed vignettes would occur
at steps 5 (training) or 6 (analysis) of the wargame design process. That is necessarily a long
way into the process. You should fight for the flexibility to adapt these givens to best meet the
requirement emerging from previous steps. At the very least, you should make the point that
the scenario will probably need to be adapted or developed to ensure that the training or
analytical objectives are met. Required refinements might arise during the design meeting itself;
they will probably surface during subsequent design work; and they will almost certainly
emerge during development and playtesting.
The simulation can be more problematic, because the option of replacing one that has already
been selected rarely exists, even if it is inappropriate. Ideally, the choice of supporting
simulation(s) is the penultimate step in both the training and analytical wargame design
processes. Again, never assume that a pre-determined simulation is fit for purpose until you
have worked through the design steps, confirmed the requirement and then assessed the
tool(s) during the development phase. This might necessitate hands-on use or, as a minimum, a
demonstration of the simulation to assess it against the now-confirmed requirement. The likely
outcome is that areas of weakness will be exposed, and you will have to work with the
simulation experts to devise ‘work-arounds’.
Game venue
The real-world space allocated is important to the success of the game. Each team, or cell,
needs sufficient space with tables, maps, computers etc. The relative position of the players,
Control staff and analysts is important. You should visit the actual game venue during the
design phase if possible, and the game should be play-tested there during development. Do not
take it for granted that the sponsor, or the executive officer, understands the wargame’s real
estate requirements. These need to be explained and agreed, and then confirmed during site
visits. It is too late when you arrive at the venue for execution to realise that player or Control
ergonomics will not work. While player HQs are usually set up by experienced teams (often
replicating reality), unless the Control set-up has been carefully considered and tested, key
cells can find themselves isolated or with insufficient space or facilities to function properly.
The wargame design team
The Core Wargame Team (CWT) was discussed in Chapter 7. The following points should be
considered during wargame design.
Involve the CWT as early as possible
The CWT should be created, and fully involved, during the wargame initiation phase; otherwise
the CWT should certainly be fully formed and present at the design meeting.
Minimise the size of the design team
The number of people involved in wargame design should be minimised; otherwise meetings
degenerate into talking shops and ‘design by committee.’ Successful wargame designers,
whether professional or hobby, report that games should be designed by one or perhaps two
people initially. Once the outline game design is ready, it can then be handed over to a larger
development team. This minimalist approach contrasts with NATO’s. The sequence of Initial,
Main and Final planning conferences suggested in Bi-SC 75-3 is accompanied by a long list of
[276]
responsible officers and attendees. What Bi-SC 75-3 does not specify is an initial wargame
design meeting. Given the size and scope of many NATO wargames, I understand the need to
involve numerous attendees at the three principal planning conferences – but try to keep the
wargame design team small, if you can.
Peter Perla’s Analyst, Artist or Architect? approach to wargame design
[277]
The 2010 edition of The Art of Wargaming explains three different design approaches.
Being aware of these is a good start; consciously picking the optimum blend to meet the
requirement is an ideal to strive for – and is often the necessary reality of most wargame
design. Each has pros and cons.
The Analyst focuses on modelling the real world. ‘The Analyst uses data and theory to model
the real world, including the players as elements of the model… You may often hear this
approach described as human-in-the-loop modelling or wargaming. The model is the main point
of emphasis in design and interest in play. For this reason, it tends to dominate the view of
most defense professionals when we talk about wargaming. But it is not the most useful
approach to most of the problems we face under current conditions. Our models, at best,
predict the past; Analyst wargames tend to imprison their players too tightly in that past for
them to lift their eyes high enough to see the circling Black Swans.’
The Artist focuses on immersing players in a story that they become part of, engaging them
intellectually and emotionally. ‘Like the Analyst, the Artist bases his game on real data, and lots
of it. But instead of using that data to build a clock-work model – as an Analyst does – the Artist
uses data to build an immersive storytelling environment. The Artist is the storyteller, and he
crafts the game’s story to engage and affect the players both intellectually and emotionally, by
communicating his own creative point of view on the subject matter… At its best, the Artist
approach allows the players freedom to surprise themselves, but it can sometimes be difficult
for them to surprise the Artist himself. Indeed, the Artist-designer has more to say to the
players of the game than the players have to say to the Artist.’
The Architect focuses on distilling a simplified decision-making environment to challenge
players with key decisions. ‘The Architect-designer is also trying to produce a story, but it is not
a story of the Architect’s own creation. Instead, it is a story the players tell each other as they
live through the game. The Architect uses data, as all designers do, but he uses that data to
create a representation of the game universe in which the players will live and work, but only to
the level of detail and completeness necessary to allow the players to focus their attention on
what the designer (and other stakeholders in the game) deem to be most critical. The Architect
distils the data of the real world into a form that is more readily accessible to the players for
making decisions in that universe. The decisions they make may be restricted somewhat to
those the Architect’s research indicates are the most critical… It presents the players a
somewhat more restricted range of decisions that might be available in the Artist’s approach,
but in the context of allowing the players more freedom to develop their own story line.’
Design phase output
The output from the design phase should be a blueprint for all subsequent development work.
This should also fulfil the audit trail requirement, although some additional documentation might
be required. This blueprint is a living document. It will be adapted and should form an agenda
item for subsequent meetings.
The US Naval War College War Gamers’ Handbook says, ‘The design phase, led by the game
designer, provides the backbone of the wargame. The primary focus driving the design phase is
creation of a game design document. All games are required to produce a game document,
which serves as a guide for the intended game and as a reference for future game
[278]
designers.'
Chapter 13. Wargame Development
‘Playtest, playtest, playtest!’
Every game designer, ever
Introduction
If there is one take-away from this chapter, it is a single word: playtest, playtest, playtest! There
is more to it than that, but ask any wargame designer – serious or hobby – what the most
important aspect of wargame development is and you will get that answer.
In the sections on development meetings and playtests, I offer example agendas from two
wargames at different points on the spectrum of scale: one of limited scope, and one
complex. These are not blueprints: every meeting and playtest will be unique.
Although unlikely to feature in a standard development phase for a specific wargame, note the
Operational Commanders Test discussed in Chapter 24 (Generating Outcomes), in which
veterans of a conflict help develop a wargaming system.
Wargaming Handbook extract
The wargame team, with appropriate assistance, individually or collectively complete the actions arising from the
design step. Examples of development taskings include the following:
a. Setting and scenario. The effort required to develop a scenario (including mapping, whether physical or digital)
[279]
can be considerable. The six modules detailed in the NATO directive provide a good guideline, including for
smaller wargames. While the use of main events lists and master incidents lists is common, take care to avoid
pre-scripting the wargame narrative.
b. Adjudication methods. These will ideally be drawn from existing and proven methods, but new tools or
techniques might be needed.
c. Wargame processes. The success or failure of most wargames depends on using correct and robust
processes, irrespective of the technologies used.
d. Analysis plan, plus any supporting processes. An analysis plan should exist for both training and analytical
wargames.
e. Data capture plan. This is derived from, and supports, the analysis plan.
f. Simulation. The simulation(s) might be original, or a modification of an existing one. Considerable effort may be
required to configure, populate and physically set up the simulation(s).
g. Players and supporting personnel. Undermanned player cells or absent subject matter expert functions can
invalidate the entire wargame.
h. Venue and layout. The physical space within which the wargame will take place can vary from a single table to
distributed multinational locations. While this should be dictated by the wargame design and development
process, it is often the case that venues will be predetermined and can act as a constraint on wargame design.
Several development meetings, workshops and conferences are typically required. These can be large, such as the
series of initial, main and final planning conferences used by NATO, but need not be for smaller wargames. The
outcomes of the design step should form the agenda for development meetings, or at least be reviewed as an
important agenda item.
‘Play testing’ is critical to deliver all wargames successfully. A series of events, interleaved with ongoing
development work, is commonplace, and could include the following:
· Internal Play Test. The internal play test is usually attended by just the wargame team. The purpose is to test
the progress of key development items such as adjudication methods, processes, the scenario, data collections
and analysis plans.
· Integrated Systems Test. The purpose of the integrated systems test is to assess whether the wargame
systems integrate to the required degree of rapidity and simplicity. It is a good opportunity to involve the
sponsor and game director to confirm that the wargame is on target to achieve the objectives.
· Test Exercise. The purpose of the test exercise (TESTEX) is to robustly test all aspects of the wargame to
ensure they are fit for purpose. While all wargame elements (including briefings, technology and processes)
should be evaluated, the TESTEX should not be mistaken for a rehearsal. A representative of the sponsor and
the game director should be present.
Rehearsal. Differentiated from the TESTEX, a rehearsal is required just before the actual wargame, with sufficient
supporting staff and player representatives. No new issues should arise; the rehearsal is primarily to confirm that
the technology and processes supporting the wargame will work.
Outcome. The desired outcome from the development step, and the series of play tests in particular, should be
that the sponsor, game director and all members of the wargame team are confident that the wargame can be set
up, executed with a full player contingent, deliver the required outputs and meet the overall aim.
The diversity of wargames that results from the combination of variants and contexts precludes a detailed
explanation here of how to execute them. Execution is a bespoke activity that will vary considerably from wargame
to wargame; it must be entrusted to a suitably staffed, qualified and experienced wargame team, supported as
required by subject matter experts. However, common activities (which will vary considerably in complexity) are
below:
· Conduct simulation and systems set-up. This can range from a map on a table to federated and distributed
computer systems.
· Conduct simulation user training as required.
· Conduct pre-wargame and start-of-wargame briefs for control staff and all participants.
· Conduct the wargame.
· Capture data and analyse the wargame. Some analysis will occur during execution, some afterwards. Data
capture during execution will be required in both cases.
· Conduct the after action review. These are not limited to a single end-of-wargame event, so can occur during
execution.
· Collect and collate lessons identified throughout for use during the ‘validate’ and ‘refine’ steps.
Set-up
Allow more time than you think you will need to set up. Within the bounds of common sense, you cannot set
up too soon. If everything is set up early and you have spare time, consider using this for
rehearsals and ‘test turns’ (see below). Even if you have ‘smashed’ the development phase,
there will be participants for whom the wargame is entirely new. Any opportunity to familiarise
them with the wargame processes and scenario through game play should be seized.
Given the variety of wargames it is not possible to produce a single set-up aide-memoire.
However, I have included some points below that I frequently encounter. You should develop
your own check-list throughout the design and development phases, and have this to hand on
arrival at set-up.
Real-estate
Despite site recces, in-situ playtests and submitting detailed lists of your requirements, do not
assume that the right equipment or facilities will be waiting for you. The person, or people, you
have been dealing with might not be at the wargame venue, having delegated set-up to an
administrative branch or department. Even if the organiser is present, they will be inundated
with people making demands. Arrive early, claim your real-estate and set up. Then get people to
come over and start them playing.
Simulation and systems user training
Whether using manual, computer simulation or both, you must allow time for, and plan, user
training. Ideally, this can dovetail into rehearsals and a ‘Turn 0’ (see below), helping people get
into the scenario and become familiar with game mechanisms.
If you can harness user training to enable people to make decisions that will actually affect
game play, it helps to engage them, as you will see in Chapter 18 (High-Engagement Wargames).
For example, making amendments to orders of battle and dispositions or enhancing personality
profiles allows people to take ownership of the actors, factions and force elements they will
control during the game. Clearly, all of this must be done within the scenario parameters – the
wargame has not started yet! Even so, build in opportunities during user training for people
(players and Control staff alike) to start taking control.
Conduct training on IT, communications equipment and any scenario management software,
according to plans you devised during development. This is often interleaved with simulation
user training, with different groups (players, Control cells, observers, analysts and so on)
needing bespoke training, probably at different times. Hence, user training across all systems
can take several days if you are using complex computer simulations, real Communication
Information Systems and/or scenario management tools.
Initial briefings
Initial briefs should have been produced during the wargame design and development phases,
and delivered in full at least once, at the Testex. Introductory briefs tend to be structured
broadly as below. Different versions of each might be required for different participants, for
example players and Control staff.
· Administration.
· Introduction. Aim, objectives etc.
· Wargame mechanics. This will probably require variants for: analysts,
Observer/Mentors (O/M) and other support staff; and players (who probably need to know
less about the processes and sub-processes).
· Scenario. The scenario brief might also have two versions: one for support staff, and
one for players.
These briefs are important, and usually contain a lot of information, so keep them as short as is
sensible or consider breaking them up. After 30 minutes of health and safety and
administration, a 10-minute pep talk from the sponsor and then a detailed brief on the scenario, the
audience’s ability to assimilate the detail of the wargame mechanics will be limited.
Exacerbating this, assume that, due to pressure of work or other priorities, no-one has read any
of the instructions, scenario or pre-reading sent out prior to the wargame. Briefs should be
prepared and delivered based on this assumption.
Scenario brief
The scenario brief deserves particular attention. You will read in Chapter 18 about the ‘Invitation
to Play.’ This is the point at which players – and support staff to a degree – move from the real
world to being immersed in the play space known as the ‘magic circle’. The point at which
people step into the magic circle, and how this is done, is important. One way is to deliver all
except the scenario brief, and then have a break. The break could be a protracted period when
participants set up cells, find their accommodation or whatever. When everything else has been
done, deliver the scenario brief. The brief should end with a high-impact, ideally multi-media,
climax that precipitates participants into the scenario. That ‘Invitation to Play’ is Startex[286]:
players and support staff are transported into the magic circle, not to emerge until the end of
the wargame.
Rehearsals, mini-games and ‘Turn 0’
At some point prior to Startex you should hold one or more rehearsals. All participants – or as
many as you can get – should take part, and all simulations, IT, communications equipment,
recording systems, scenario management tools and so forth should be used for real. This is not
the Testex: it is a genuine rehearsal, with the focus on familiarising people with the scenario,
equipment and roles they will play during the wargame.
The mini-game
An excellent way to do this is to run a mini-game. Simply putting people into their physical
locations achieves little by way of rehearsal, even with equipment live. You need to do more
than that – so why not run a mini-game? Pick a slice of the anticipated wargame play (identified
during the development phase) and run one or more turns or, if using real-time simulation, run
the game for a fixed time, including data capture. It is best to choose a relatively calm period in
the anticipated scenario, rather than pitch people straight into frenetic activity. Having run the
mini-game, re-set everything back to the Startex point.
‘Turn 0’
A variation on this if you are executing a turn-based wargame is to hold a ‘test turn’, or ‘Turn 0’.
Choose a turn that involves as many aspects of the game-play as possible, and include
recording. Run the turn as if for real, but ensure that everyone knows it is a practice and take
the opportunity to explain the game mechanisms. At the end of the turn, assume that you will
re-set everything back to the Startex situation, and then execute it again as the real Turn 1.
Hence your wargame schedule should include time for a Turn 0 and a Turn 1. However, having
played the Turn 0, if you, the Game Director, Game Controller and Lead Analyst are content,
then this can become the actual Turn 1 and be used for real. But take this decision carefully,
and do not assume that outcome. In a training event, the Turn 0 is best conducted before
Startex with just key members of Control. Typically, it takes place while Control is being set up.
Once all staff are available, the situation is replayed as the real Turn 1 with the full Control
contingent.
The output of a well-run rehearsal, mini-game or Turn 0 should be: all participants familiarised
with the scenario, wargame processes and enabling equipment; and players who have had an
opportunity to figure out how to enact their plans without these being skewed by unanticipated
aspects of the wargame mechanics.
The essentials of effective execution
So, this is it. Startex. The rubber hits the road and you’re off. While there are many important
aspects of the Wargame Lifecycle that continue beyond Endex, execution is where everything
comes together – or doesn’t. Unsurprisingly, therefore, the essentials of execution reflect the
essential characteristics of successful wargames discussed in Chapter 6.
Effective facilitation
Good facilitation is essential to successful execution. It is the third of Rex Brynen’s ‘Three
Pillars’, and covered in Chapter 23. I will highlight just one point here: the relationship between
the facilitator and Game Controller is key to success.
Transparent adjudication
Transparency is discussed in detail in Chapter 24 (Generating Outcomes). Some results might
have to be presented as a fait accompli, with no explanation, for example for training purposes
or, a less-good reason, to achieve rapid game play. However, players and Control staff make
better decisions if they understand the factors leading to an outcome, so make these as
transparent as possible.
Effective data capture and ongoing analysis
Data capture is a critical activity, and must be accorded a high priority in terms of ergonomics
and effort. While this is slightly less the case in a training wargame, it remains important, with
O/M needing to capture material for After Action Reviews (AARs). Hence data capture methods
must be effective (including capturing the vital communicative aspects of the game), with
recorders and analysts front and centre of all discussions.
Preliminary analysis is likely to occur during execution in both analytical and training wargames.
AARs (and Scientific AARs) often take place daily, or more often on an ad hoc ‘as required’
basis. When conducted immediately after wargame play, these are often called ‘hot wash-ups’
(‘hot washes’). Conducting hot washes carries risk, due to the tempo with which they are
prepared and conducted; each needs sufficient time to make it effective, and a staffed sub-
process that has been planned and tested during development.
Execute to the same purpose you designed for
An obvious point, too often forgotten: the wargame you are executing must stay true to the
purpose for which it was designed. It is easy for ‘mission creep’ to occur, with the wargame’s
scope and even aim and objectives being subjected to ‘improvements.’ If this is because of a
design or development flaw, then something must be done. Take remedial action and note
everything for the lessons identified (LI) log. Should this occur, the most likely error was
insufficient engagement from the sponsor and Game Director during the design and
development phases. Deviations or suggested amendments should have been discussed then,
not during execution.
Conversely, changes in scope can result from the wargame being too successful, when people
naturally want to exploit it and achieve more. Accepting that such pressures are the result of
positive factors, and that it is difficult to disappoint, you should still objectively consider the
impact of any proposed change. A lot of apparently good ideas originate from well-meaning
participants who do not understand the consequences of implementing what might appear to be
an innocuous and positive ‘improvement.’ Suggested changes are often the result of senior
officer WAGIIs[287] or ‘CORGIs’[288]. Coming from senior officers, these tend to be accepted
with insufficient scrutiny. Be judicious in accepting any idea that affects the wargame objectives
or scope, or that over-complicates something that should remain as simple as possible.
Effective, but light, control
The control function is critical and is discussed in detail in Chapter 22 (Controlling Wargames).
The point to note here is that while control must be effective, it should be implemented with as
light a touch as possible. This talks to characteristics of successful wargames such as the
primacy of player decisions and engagement.
High-engagement wargaming
Chapter 18 presents a myriad of methods by which you can ensure participant engagement. I
will highlight just two of these now. The first is the level of pressure on players: maintaining the
correct level of pressure is key to keeping players in ‘the flow’ and is central to continuing
improvement and engagement. I begin every Control meeting with an assessment of the level of
pressure on the players; it is the starting point for all discussion. Secondly, feedback is crucial
for player engagement. Feedback should, ideally, be constant, and amounts to far more than a
daily or weekly AAR. It should occur during execution, when it can be acted upon; it is too late
at the end.
A dynamic scenario that is primarily shaped by player decisions but remains coherent
Given that a wargame is a shared story-living experience, the scenario must be at its heart;
executing the wargame is executing the scenario. This is discussed in Chapter 21 (Scenario
Execution). And the scenario shapes, and is shaped by, player decisions, per the wargame
definition. This necessitates a light Control touch to ensure that players retain enough
ownership of the scenario’s direction of travel to remain empowered and engaged.
Yet there is a tension here: the wargame has been designed, and is now being executed, to a
purpose that must be met. So, player decisions cannot completely dictate how the scenario
unfolds. The primacy of player decisions versus controlling the wargame to meet its objectives
is a crucial balancing act.
Achieving that balance is the primary challenge facing the wargame team during execution. To
make this even harder, it must be achieved while ensuring that the scenario remains coherent.
To violently adjust the play space to accommodate both player decisions and game objectives is
to invite inconsistency and aggravate players, who will soon disengage. It is a central balancing
act that Control must achieve.
Adversarial game play
The adversaries’ freedom of action is part of this balance. In a one-sided wargame, adversary
plans and actions are part of the scenario and, while usually enacted by a Red Cell[289], are
sanctioned or ultimately determined by Control. In a multi-sided, fully adversarial, wargame, the
adversary(ies) will be freer to act, although potentially still steered by Control.[290]
Whatever variant of wargame you are executing, try to give the adversary as much freedom of
action as possible. The players or Control cell playing the adversary should be encouraged –
ideally able – to win. If this is not the case, you will lose one of the most powerful drivers of
innovation and challenge. This talks to the same tension between player primacy and control.
Even in a two- or multi-sided wargame, it is likely that a compromise must be reached. For
example, in an educational staff college wargame, one side pre-emptively destroyed the other’s
air force on the ground (1973-like). Because the learning objectives included the command and
control of joint air operations, that outcome had to be moderated, with forces resurrected to
allow both sides’ continued air play. Avoiding such clumsy interventions while allowing
adversary freedom of action is the challenge you face.
Collection of wargaming lessons identified
No wargame is perfect: issues will arise during execution. Clearly, you address these as they
occur. Beyond that, though, you should seek out these issues and the actions taken to resolve them,
and record both during execution for subsequent examination in the wargame validation and
refinement phases.
The lessons identified data base
To achieve this collection, consider using an LI data base. This can be a simple spreadsheet or
part of a bespoke exercise management system. Seemingly easy, the collection of LI happens
during execution, when more exciting things are going on. Some suggestions to ensure the
effective collection and collation of LI are listed below, followed by a minimal illustrative sketch of such a log:
· Exhort (force) everyone to complete, or review, LI at the end of every day. The tempo
and excitement of wargames can result in people forgetting everything that happened
more than a few hours (sometimes minutes!) before. If you wait until the end of the
wargame (which might be days or weeks), they will have forgotten their LI.
· Ensure the spreadsheet, or data base, has multi-access permissions, so that people can
work on it simultaneously.
· Make sure it is easy to access (i.e. have terminals to hand) and use. Over-complicating
the data base will discourage people from using it.
· Organise it into simple and obvious areas or work sheets. One huge bucket of entries is
difficult to examine subsequently.
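By way of illustration only – the handbook does not prescribe any particular tooling – the minimal Python sketch below shows one way such a log could be structured. The field names, the example areas and the file name are assumptions made for the sketch, not part of the handbook.

```python
# A minimal, hypothetical lessons identified (LI) log. Field names and areas
# are illustrative assumptions only, not a prescribed schema.
import csv
from dataclasses import dataclass, field
from datetime import date

@dataclass
class LessonIdentified:
    raised_by: str          # who logged the entry
    area: str               # simple, obvious grouping, e.g. "Adjudication", "Scenario", "IT"
    observation: str        # what happened
    action_taken: str = ""  # any remedial action taken during execution
    raised_on: date = field(default_factory=date.today)

class LILog:
    """Holds all entries; grouping by 'area' mirrors simple, obvious worksheets."""

    def __init__(self) -> None:
        self.entries: list[LessonIdentified] = []

    def add(self, entry: LessonIdentified) -> None:
        self.entries.append(entry)

    def by_area(self, area: str) -> list[LessonIdentified]:
        # Pull out one area for examination during the validate and refine steps.
        return [e for e in self.entries if e.area == area]

    def export_csv(self, path: str) -> None:
        # Export the whole log for collation after Endex.
        with open(path, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["raised_on", "raised_by", "area", "observation", "action_taken"])
            for e in self.entries:
                writer.writerow([e.raised_on.isoformat(), e.raised_by, e.area,
                                 e.observation, e.action_taken])

# Example end-of-day entry:
log = LILog()
log.add(LessonIdentified("Control cell 2", "Adjudication",
                         "Combat results table too slow for the turn tempo",
                         "Pre-computed common engagements before Turn 3"))
log.export_csv("li_log_day1.csv")
```

Whatever form the log takes, the point of the sketch is the structure: a handful of fields that are quick to fill in at the end of each day, and a grouping that keeps subsequent examination simple.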
Different grades of OIL
As an important aside, there are two different uses of the acronym OIL. You might find that you
have two ‘OIL’ data bases:
· Observations, Insights and LI. This pertains to data capture and analysis, as previously
discussed.
· Open Issues Log. This is a technical issues list that pertains to computer systems. It
ranges from user training to new code required.
Both are useful; just ensure there is no confusion.
Chapter 15. Wargame Validation
‘This is not a game! This is training for war! I must recommend it to the whole Army.’
General von Muffling,
Chief of the Prussian General Staff, 1824
Introduction
I use the common English meaning of ‘valid’[291] and apply the Defence Systems Approach to
Training (DSAT) definition of validation: an evaluation of training, plans and materials to assess
their fitness for purpose. Although DSAT specifies ‘training’, its methodologies are applicable to
any wargame, training or analytical.
Clearly, there will be ‘real’ findings from your wargame that pertain to the research question or
learning objectives. This chapter, however, focusses on validating the wargame to elicit lessons
identified (LI) that can become lessons learned (LL) when applied to future wargames. The
processes of validating and refining wargames are inseparable: the one enables the other.
Hence, there is considerable overlap between this chapter and the next (Wargame Refinement).
Wargaming Handbook extract
Validation should involve the whole wargame team. All LI, observations and feedback should be collated and
examined for internal and external validation. The customer will lead on external validation (were the event
objectives correct?); the remainder of the wargame team will lead on internal validation (did the event meet the
given objectives?).
A Post Exercise Report (PXR) should be produced. This might solely concern the wargame, or wargaming aspects
of a wider event might be included as a PXR annex. The following sections should be considered:
· Suggested refinements. Suggestions can relate to any aspect of the wargame. Promulgation should be
widespread, including to a central repository of wargaming LI.
· Wargame findings. If not extracted for promulgation elsewhere, the observations and insights arising from the
wargame should be noted. These, too, should be distributed to a central repository.
· Shaping factors for subsequent events. If not extracted for promulgation elsewhere, factors arising from the
wargame that will shape future iterations in a series of games, or other aspects of an experiment, should be
noted. The distribution of these in a timely manner is particularly important in an ongoing cycle of research.
· Suggestions to exploit findings and shape subsequent activity.
Figure 15-2. Suggested subjects of analysis to further examine in a virtual environment and/or
live trials
Classification allowing, all of these should be deposited in a central repository, with metadata,
key word tagging and so forth. Repositories and governance – or the lack of them – are covered in the
next chapter.
Chapter 16. Wargame Refinement
'It is critical that we police ourselves and act as professionals. The mark of a professional
community is one that conducts a critical assessment and records what occurred, reporting
insights gained and advances its best practices.'
Cdr Philip Pournelle[294]
Introduction
This chapter is a contribution by Stephen Downes-Martin. It discusses best practice in the USA,
where a wargame model is continually improved during its lifetime.
Wargaming Handbook extract
Lessons Identified only become Lessons Learned when applied. Many wargames are iterative, particularly in
educational and training contexts. In these instances, the incorporation of LI into subsequent events is routine.
More effort is required when a wargame is a one-off, which is often the case in analytical events, even an IAECP.
In the Wargame Lifecycle diagram, the ‘Refine’ arrow should flow back into the Design Phase. For even highly
iterative wargames, time is well spent re-confirming that the existing design remains valid. Exceptionally,
refinements might be made directly into the Development Phase, but it is wise to consider a (re-) Design Phase
first, using the design steps to check that nothing has changed.
Finally, and in common with all phases of the Wargame Lifecycle, an audit trail should be generated and maintained
by documenting suggestions made, by whom, decisions taken and the reasons why. A key output from the Validate
and Refine Phases is a set of documents that enhance wargaming corporate memory.
Wargaming governance
There are two types of effective professional wargaming governance depending on who
manages it: the government’s defense department; or the profession itself. These two can and
should be complementary (with overlaps preferable to gaps). In the US, centralized DoD
wargaming governance exists due to the then Deputy Secretary of Defense Bob Work, whose
2015 initiatives established the Defense Wargaming Alignment Group (DWAG), Wargaming
Repository (WGR) and Wargaming Incentive Fund (WIF).[295] Although resources were reduced
after the change of US government in 2016, this governance remains effective. Critically,
defense department-managed wargaming governance handles classified information. On both
sides of the Atlantic professional wargaming governance is also handled, albeit in a more
distributed fashion, by initiatives such as the various Connections wargaming conferences and
websites such as PAXsims and Milgames. In addition, a large amount of professional gaming
best practice and lessons learned is available from a wide variety of non-defense related gaming
conferences and university programs.
US wargaming governance
Phil Pournelle explains the remits of the DWAG, WGR and WIF in his 2018 Connections
UK presentation.[296] These are supporting organisations only and, although we should be
grateful for their existence, we cannot rely on them for continuity or quality. We must rely on
ourselves and create a governance that is based in the activity, not on the sponsor. For
example, while patient groups do exist, and the government passes laws concerning medical
treatments, true medical governance rests with organisations such as the British Medical Association or
the American Medical Association, which are set up and run by medical professionals (setting up codes
of ethics, best practices and a knowledge base, for example) and not by customers or sponsors
(Defense Department sponsors for wargamers, patients for doctors). We are responsible for the
lack of, or presence of, wargaming governance and only have ourselves to blame if a
bureaucratic governance is imposed on the activity by frustrated sponsors.
Introduction to wargame refinement
Wargame refinement is an expanded version of the best business practice of Reflection, in
which an organisation or individual explicitly schedules time (and therefore funds) to examine
the performance of a just-completed individual project to improve future organisational and
individual performance (see for example Stafano et al 2014 and Bravenboer 2017) through
learning and knowledge development. This is an episodic process mainly focused on identifying
and correcting performance errors in recent projects and is a necessary activity for sponsors
and providers of wargames. Wargame refinement adds to this a continuous reflection process
designed to:
1. Improve the organisation’s performance at wargaming and delivering wargames;
2. Identify and deal with hidden flaws in designing, playing and delivering wargames;
3. Deepen and extend the state of art and science of wargaming; and
4. Capture the best practices in a useful form.
How much effort an organisation or individual puts into wargame refinement depends on the
trade-off between the current and opportunity cost of the time spent on the process and the
potential increase in profitability (for a commercial activity), personal development (for an
individual) or in national security objectives (for government agencies). This chapter explores
wargame refinement and recommends some techniques based on logic and experience to obtain
its benefits while reducing its immediate and opportunity costs.
The purpose of best practice
A best practice is an activity which, if not carried out, or if carried out improperly, will result in
poor performance of individual wargames or in long term poor performance of the organisation
providing the wargame. The key point here is not just that the best practice must be carried out
properly to be effective: it is often the case that a poorly executed process is more damaging
than the absence of the properly executed process.[297] Furthermore, experience shows that
the temptation to execute a process poorly in order to check the ‘we did it’ box is often
irresistible. Therefore:
The most important best practice of all is to take best practices seriously!
There are three kinds of wargame refinement best practice, each of which provides a level of
desired outcome. The first two levels are to:
1. Improve and maintain the best possible performance of the organisation delivering the
wargame; and to
2. Provide the best possible design, delivery, execution and analysis of individual
wargames.
These are not the same. No amount of brilliant performance by the wargame team on a specific
wargame can compensate for a poorly performing organisation to which the wargame team
belongs or for poorly performing managers and leaders of that organisation. Nevertheless, the
two levels are linked, in that poor organisational performance will lead to high-performing staff
leaving. The third level is to:
3. Extend the state of art and science of wargaming through the discovery and
development of new wargaming techniques in order to wargame better than our
adversaries.
There are two levels of poor performance to which your wargaming best practices need to be
tailored, refining not only your wargaming performance but also your best-practice processes:
· First, the obvious poor performances that are easily perceived by professional
wargamers and managers. These produce results from individual wargames that we can
ignore or caveat. They indicate organisational problems that can be fixed assuming a good
cost-effectiveness trade-off and management’s will to do so.
· Second, those that are hidden. These are dangerous and hard to deal with. After all, how
can we develop best practices to avoid problems we can’t see and don’t know about?
This distinction between obvious and hidden flaws is important. Performance which is obviously
poor is the least dangerous, in that it does not lead to perverse results unless the sponsor
chooses to accept it for political or hidden agenda reasons. For example, a wargame that is
obviously flawed but supports the sponsor’s pre-conceived beliefs and wishes may result in the
sponsor ignoring the flaws in order to claim the wargame validates these wishes (Downes-
Martin 2014).
Hidden poor performance creates the risk of erroneous lessons being, at best, believed by well-
meaning wargaming or sponsor organisations or, at worst, exploited by dishonest leaders.
Failing to implement a best practice whose effects are hidden will result not only in wrong lessons
learned, but also in failing to discover the existence of that best practice, or in wrongfully ignoring it
by claiming it is ‘an unnecessary complication’. At best ‘the world has deceived you’, and at
worst ‘you have conspired to deceive yourself’. We therefore need best practices that have a
high probability of exposing and dealing with previously hidden problems.
An organisation that is so imbued with arrogance that it believes it has achieved and understands
all there is to know about wargaming, or so lacking in imagination that it cannot conceive of others
developing novel and competitive advances in technique, is already in trouble – but doesn’t yet
know it. It might be addressing the easily perceived first level of poor performance but is failing at
the hidden, deeper and more dangerous level.
Sources of best practice
Benign and malign games
The normal sources for best practices are well-run games that provide ‘what we did well’ and
‘what went wrong’ in the just-completed game, which in turn should then provide ‘how to do
things right’ and ‘what mistakes to avoid’ lessons learned for future games. They provide
characteristics to seek and behaviours that interfere with those characteristics to avoid, i.e. best
practices and lessons learned from benign games. This is routine episodic business practice but
does not go far enough.
Consider the thought experiment of deriving lessons learned and mistakes to avoid if you were
to deliberately design a malign wargame with the objective of deceiving the sponsor (Downes-
Martin 2016, Downes-Martin et al 2017). Deception here means the well-meaning and intelligent
sponsor believes in the validity of the game and, based on the game, is motivated to act in a way
that does not best support national security. Such thought experiments allow you to detect new
best practices and wargame design principles in a two-stage process: first, best practices for
how to best deceive the sponsor and what mistakes to avoid if you are to successfully deceive
the sponsor and avoid detection; and then, based on these best practices, for how sponsors and
wargame organisations might avoid being deceived and avoid deceiving themselves. This type
of thought experiment extends the state of art and science of wargaming with additional best
practices to those that surface from benign games.
We have no innate distaste for the idea of a Red Cell expending every intellectual effort in a
wargame on defeating our Blue Cells in order to explore how best to avoid a Blue defeat. We
can only credibly explore how best to achieve a Blue victory and avoid a Blue defeat if we
explore how best to defeat Blue. It’s called wargaming for a reason! If wargames inform the
national security decision process, it is therefore inappropriate squeamishness to object to the
idea of a thought experiment into creating a wargame deliberately designed to deceive those
national security decision makers, i.e. the wargame sponsors, in order to explore how best to
ensure they are not deliberately or inadvertently deceived, and to discover new wargame best
practices and principles. In other words, we apply wargaming principles to the very act of designing a
wargame to inform decision makers, and apply wargaming to the decision-making process that
wargames inform: in this game, ‘Red’ (now the wargame refinement researcher) attempts
to pervert traditional wargame design for malign purposes and ‘Blue’ (now the national security
decision maker) attempts to sponsor a benign game that correctly and best informs national
security.
By looking at wargames that are deliberately designed to be malign (deceptive) we identify
additional characteristics to explicitly avoid in wargame design that are not obvious from looking
at a list of characteristics to seek provided by benign games.
Final plenary
The final plenary is the wargame session after the final move(s) have been made. The cells are
brought together for a facilitated discussion led by the leader of the wargame’s facilitation team
and a sponsor representative. This session is part of the game in that it is focused on the
sponsor’s objectives for the game and is obviously an episodic process. As the US Naval War
Game Department’s handbook states:
‘Facilitated plenary gatherings include select questions created by the analysts to gather player
responses needed to answer game research questions and further explore unexpected insights
[298]
observed during game play.’
For example, during a deterrence game the Blue Cell lead might be asked ‘How well did you
deter your adversary?’ and then the Red Cell lead asked to comment. Although not explicitly
designed to be a source of best practice for wargame refinement, discussion during the final
plenary often does surface wargame design problems that can then be passed onto the
wargame refinement process. Players, when asked about their ability to achieve their military
objectives within the game, may complain about game design issues in addition to discussing
the capabilities they were given to play with and the cunning innovations of their adversaries.
During final plenaries I have frequently heard senior officers complain that the wargame
technology support failed to pass along an order they had given during a move or failed to
provide them with information they would have had in the real world. If true, this indicates a
technical problem that must be solved. If not true, this indicates a problem with the process by
which players kept track of what they did or did not do or know about during the game. The
latter might be a best practices issue if the game is not meant to deal with exploring unreliable
C3 or stressing people with too much information. Players should be allowed to vent about game
design during the plenary sessions since this is useful for wargame refinement purposes both
intra and inter games, but not to the extent that it interferes with the analytic purpose of the
plenary.
After Action Reviews
The After Action Review (AAR) is the primary mechanism for assessing your own wargame and
wargame organisation to identify ignored or poorly implemented best practices, and to identify
previously unknown and therefore missing best practices specific to your own organisation. It
can be episodic (following each game) or continuous (regularly scheduled independent of games
and addressing broader issues of wargame knowledge and quality within the organisation).
You identify the best practices by looking at the poor performances (problems) and asking why
the performance was poor and what best practice was known about but ignored by your
organisation, its leaders, or the wargame team. This is quite separate from the game’s final
plenary. Participants are primarily the game team from the organisation that designed and
delivered the game but should if possible include those sponsor representatives who
participated in the game and senior players who led game cells. It should not include anyone, no
matter how senior, who was not an active participant in the game.
An AAR is a disciplined procedure that digs deep into the weakness-based question:
What interfered with us providing the best possible wargame?
The answers to this provide a starting point for developing best practices and refining the
wargame process. The question is weakness-based since the purpose is not to pat yourself on
the back about how good you are, but to examine what went wrong. Its wording avoids
assuming from the beginning that the game was a failure and avoids any blame being a priori
assumed. The going-in assumption is that the game was successful, but that improvement is
possible.
A proper AAR following the wargame is not just a hot-wash, BOGSAT[299] or brainstorm of
participants and practitioners. Hot-washes and BOGSATs are easy, fast, and produce lists of
what people emotionally liked or did not like. They produce responses, not answers.
Brainstorms are a form of problem solving rarely done properly and long since debunked as an
effective process (Lehrer 2012) in favour of a three-stage normative process.
The full AAR process takes four to six hours (depending on the level of participant familiarity
with the process) to do properly, which means it must be scheduled on the wargame agenda. If it
is not scheduled, but merely shoehorned into the final plenary, then that is a warning sign that
the organisation is either uninterested in improvement (wargame refinement), too busy to do it,
or arrogant enough to believe they do not need to do it. Obviously, for small and short games a
four-hour or longer AAR might be considered excessive, and you might instead do a single full
AAR after a series of many similar short games – unless a short game is critically important and
involved very senior leadership as participants. For a wargame lasting several days to a week or
longer, the full AAR process should be mandatory for a serious organisation. Tailoring the
AAR is a trade-off analysis between cost and benefit.
The AAR process is designed to identify the real underlying problems, from which you may then
identify what best practices were ignored, not properly adhered to, or perhaps new ones that
need to be introduced. Details of the process successfully used by the author on many
occasions for business, military and government sponsors are contained in and adapted from
the Language Processing™ Manual (GOAL/QPC 1995).
An AAR is a problem identification process, after which you schedule a separate session to
propose actions specifically aimed at addressing each of the weaknesses identified in the AAR
by asking the following questions:
1. What smallest collection of actions covers all the weaknesses with the best likelihood of
dealing with them?
2. What weaknesses are worth dealing with and which ones should we live with?
3. Which critical weaknesses have we seen before, i.e. which weaknesses are
systematically present in our organisation?
4. Why are these weaknesses present? What is it about our organisation that interferes
with past attempts to deal with them?
Outside organisations
Key sources of best practices for refining your own wargames and wargaming knowledge are
the community of professional wargaming colleagues (both inside your organisation and, more
importantly, outside it), outside organisations and their wargames, government
wargaming repositories, and academic and professional organisations and their databases.
Engaging with these sources should be a continuous process with embedded episodic events.
Your wargame staff should engage with other organisations by supporting their games, playing
in them, and inviting them to play in and support your own game development and delivery. For
government organisations the only obstacle to this might be the lack of political and financial
willpower of the leadership to do so. Different government departments or military services have
their own internal wargaming efforts; these should be talking with each other to refine their
wargames and knowledge by sharing best practice. For commercial organisations ‘outside’
means other departments or profit centres, while for academic departments ‘outside’ includes
other departments within the university and other universities. Review the documentation in
wargaming repositories for lessons learned that might apply to you.
Professional development
A critical source of best practice, especially for best practices that deal with performance
problems hidden from your organisation and to extend and deepen the state of art and science
of wargaming, is the professional development of your own wargame staff. This must be by both
formal (internal and external courses) and informal (conferences, time set aside for self-study)
learning. A good rule of thumb for keeping on the leading edge is to allow staff 20% of their time
for self-directed study or self-chosen formal courses.[300]
Wargaming is vulnerable to inadvertent intellectual fraud
In nearly all cases of scientific fraud, three risk factors have been identified as present. The
perpetrators: ‘knew, or thought they knew, what the answer to the problem they were
considering would turn out to be if they went to all the trouble of doing the work properly; were
under career pressure; and were working in a field where individual experiments are not
expected to be precisely reproducible.’
A major risk to wargame refinement is a similar and inadvertent intellectual fraud driven by self-
confidence, career pressure (on sponsors, players and wargame designers and analysts), and
the strategic indeterminacy (Hanley 1991 pp 8-19, Hanley 2017 p 81) inherent in wargaming and
therefore in wargame refinement.[301] Wargaming professionals and their sponsors are not
usually uncertain or humble about their areas of expertise, national security is a vital field where
careers depend on making good decisions, and the indeterminacy of warfare and wargaming
means that repeatability is unlikely. All three risk factors are present and thus wargaming is
vulnerable to inadvertent self-deception. The wargaming refinement process must therefore
explicitly deal with this vulnerability at the individual and the institutional levels.
A major way this vulnerability manifests itself is when an organisation acts as though following
procedures is the end, rather than a means to the end of the best possible design and delivery of
wargames based on extending the art and science of wargaming. A cautionary
example of how this might happen at the institutional level is provided by a recent attempt to
replicate a number of high-profile studies originally published in prestigious journals, which failed to
find significant evidence for the original findings and found weaker effects than those originally
claimed (Camerer et al 2018, ScienceDaily 2018). At best poor, and at worst incorrect, results
were reported despite all concerned (the original experimenters, the journal editors and their
peer reviewers) scrupulously and honestly following best practices.
If this kind of failure can happen for repeatable events, the possibility this can occur for hard-to-
repeat events such as wargames must be seriously considered despite the difficulty of proving
similar failures have occurred. National security is too important to blithely assume that
wargaming, unlike every other human endeavour, is somehow immune to systematic error, self-
deception, deliberate deception or fraud. How can wargaming deal with this vulnerability?
We start with the observation that wargaming is a social group process, and therefore so are
the best practices used to refine wargames and the wargaming process. There exist several
group dynamic effects that subtly subvert the benefits of group processes and which are rarely
considered when constructing, running, analysing or refining wargames. These must be taken
into account by advanced wargaming organisations, and so a discussion of the failure modes of
group processes and how to avoid them is required.
Before proceeding further, ask yourself these questions about the wargame or wargame
refinement process:
· Do we believe we know the answer to the question or problem and, if so, how is this
influencing our design and our attitude?
· Who among the stakeholders are under career pressure, where is that pressure coming
from and how intense is it?
· How easy is it to inflate or hide results?
Then respond to the answers by explicitly naming the problems and their sources and adapting
your normal institutional processes to deal with them. For example, if you face problems from
the second question, consider replacing the relevant people, and if that is not possible identify
the pressure on that person to behave in a certain way and explicitly adapt your Data Collection
and Analysis Plan (DCAP) accordingly. Of the recommended best practices at the end of this
chapter the one concerning an external independent peer review board is extraordinarily
effective at keeping an organisation honest.
The Three Witches of Wargaming
We must face the deliberate but well-meaning interference by the ‘Three Witches of Wargaming’:
the wargame sponsor, the wargamer’s own boss, and the senior officers leading game cells
during play (Downes-Martin 2014). Each of these will attempt to interfere in the design and
execution of the wargame and the wargame refinement activities due to the erroneous belief that
they are more expert at wargame design and refinement than the expert they have hired to do
the wargame and the wargame refinement. The wargame or wargame refinement process lead
must be truly an expert, obtain sign-off by the sponsor and his own boss for the design, recruit
(not invite) senior players to play the game or execute the process as designed, and have the
intestinal fortitude to politely but uncompromisingly push back against improper interference. In
the face of improper interference, if support from your own chain of command is not
forthcoming, then document the interference and the assessed damage it will do to the quality
of the product and analysis that will be delivered to the sponsor.
Wisdom of crowds, or madness of mobs?
The work of Philip Tetlock, Dan Gardner and James Surowiecki (’Superforecasting’, ‘Wisdom of
Crowds’) and their colleagues is too often grossly oversimplified into the claim that ‘aggregate
group forecasts and decisions are much better than individual ones’. This is clearly nonsense: a
group of idiots is unlikely to make a better forecast or decision than a single expert; and a single
expert is more than likely to make a better decision about their area of expertise than a group of
experts in another field. No patient in their right mind would ask a group of expert sanitation
engineers about their cancer treatment instead of the single expert oncologist. The wisdom of
crowds and superforecasting research is much more nuanced, with interesting areas to
manipulate, than the popular understanding of them.
A more accurate summary of the research might be ‘A group of experts satisfying four
requirements who make a decision or forecast about their area of expertise is more likely to
make a better decision or forecast than a randomly selected individual from that group’.
Furthermore, research indicates that a group brought together and using BOGSATs and
brainstorming consistently underperforms the same group using normative methods in which
individuals first work independently, then in a group, then review and refine as individuals
(Nijstad et al 2006; Lehrer 2012; Mullen et al 1991). The four requirements for a group to exhibit
the ‘wisdom of crowds’ rather than the ‘madness of mobs’[302] are (Surowiecki 2004, p. 10):
· ‘Diversity of opinion: Each person should have private information even if it's just an
eccentric interpretation of the known facts.
· Independence: People's opinions aren't determined by the opinions of those around
them.
· Decentralization: People are able to specialize and draw on local knowledge.
· Aggregation: Some mechanism exists for turning private judgments into a collective
decision.'
While the DoD groups I have observed often do satisfy the decentralization requirement, they
mostly fail on the other three for wisdom of crowds. They frequently consist of subject matter
experts from the same communities of practice or Service with peer pressure to conform to
doctrine. Opinions, in the form of statements or votes, are often collected sequentially and
publicly. Aggregation is often based on flawed voting schemes using junk arithmetic.[303]
Brainstorming and BOGSATs
Ever since its introduction in the 1950s, brainstorming has routinely been debunked
as an effective mechanism, so much so that demonstrating its inferiority compared to
easily implemented normative processes is a routine experiment carried out by first-year
undergraduate social science students (for a popular description of the problem see Lehrer
2012). This is quite separate from the observation that most brainstorms do not even follow the
primary rule that brainstorming claims must be followed for the process to be effective, which is
‘no analysis during the process’. A common problem is that, shortly after the start of a
brainstorm, a senior officer grunts approval or disapproval of some junior officer’s idea –
and the brainstorm is over. It has long been demonstrated that a disciplined normative approach using
Language Processing, Silent Clustering and Formal Debate gives superior results to those
obtained from ill-disciplined methods such as brainstorming (even when run properly) or
BOGSATs.
If time constraints or your superiors dictate a brainstorm or BOGSAT and dictate who is
present, then insist on it being run properly: (a) an initial period in which everyone silently writes
down their ideas and then submits their paper to you; (b) open presentation of ideas during
which you ruthlessly call out anyone who indicates approval or disapproval of an idea; (c)
anonymous voting to rank the ideas.
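Purely as a hypothetical illustration – the text above does not specify a particular voting scheme – one simple and transparent way to aggregate the anonymous rankings at step (c) is a Borda-style count, in contrast to the ad hoc ‘junk arithmetic’ criticised earlier. The function name and ballot format below are assumptions for the sketch.

```python
# Hypothetical sketch: aggregate anonymous rankings of ideas with a Borda count.
from collections import defaultdict

def borda_rank(ballots: list[list[str]]) -> list[tuple[str, int]]:
    """Each ballot lists idea labels, best first; an idea ranked k-th on a
    ballot of n ideas scores n - k points. Highest total wins."""
    scores: dict[str, int] = defaultdict(int)
    for ballot in ballots:
        n = len(ballot)
        for k, idea in enumerate(ballot, start=1):
            scores[idea] += n - k
    return sorted(scores.items(), key=lambda item: item[1], reverse=True)

# Three anonymous ballots ranking four ideas:
ballots = [
    ["A", "C", "B", "D"],
    ["C", "A", "D", "B"],
    ["A", "B", "C", "D"],
]
print(borda_rank(ballots))  # [('A', 8), ('C', 6), ('B', 3), ('D', 1)]
```

The scheme matters less than its properties: every ballot is counted the same way, the arithmetic is visible to everyone, and no single senior voice can tip the ranking.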
The (Dis)honesty Shift
Research indicates ‘that there is a stronger inclination to behave immorally in groups than
individually’, resulting in group decisions that are less honest than the individuals would tolerate
on their own. ‘Dishonest’ means the group decisions break or skirt the ethical rules of the
organisation and societal norms, and involve cheating and lying. Furthermore, the group
discussions tend to shift the individuals’ post-discussion norms of honest behaviour towards
dishonest. First the discussion tends to challenge the honesty norm, then inattention to one’s
own moral standards (during the actual discussion) and categorization malleability (the range in
which dishonesty can occur without triggering self-assessment and self-examination) create the
effect that ‘people can cheat, but their behaviours, which they would usually consider dishonest
do not bear negatively on their self-concept (they are not forced to update their self-concept)’.
The research indicates that it is the small group communication that causes the shift towards
dishonesty that enables group members to coordinate on dishonest actions and change their
beliefs about honest behaviour. The group members ‘establish a new norm regarding
(dis)honest behaviour’.
The research has not been done on military officers (to the best of my knowledge), so at worst
there is only a reasonable suspicion that it would apply during wargames involving military
officers. Although the systematic and unremitting focus of the military on ethics means that the
level of honour within the military is probably higher than in the civilian community, note that the
research implies subtle and sometimes subconscious shifts in honesty. Experience, however, is
less charitable: the behaviours of Flag and General Officers and O6s leading player cells in
wargames have been observed to go beyond ‘vigorously advocating for their positions’. As H. G.
Wells observed over a century ago (Wells, 1914):
‘... it is remarkable how elastic the measurements of quite honest and honourable men can
become.’
Appeals to ethics standards seem to be effective in the short term but there is little evidence for
long term effectiveness (Mazar, Amir and Ariely 2008, Kocher, Schudy and Spantig 2016). So,
consider holding a short discussion before every wargame or refinement event around the
question ‘What are the legal and ethical concerns involved in the issue we are gaming or have
gamed?’ to remind and ground everyone in the ethical standards of the organisation. This will
provide protection for the immediate event.
The Risky Shift
Research into risky or cautious shifts during group discussion looks at whether and when a
group decision shifts to be riskier or more cautious than the decision that the individuals would
have made on their own (Batteux, Ferguson and Tunney 2017, Dodoiu, Leenders and Dijk 2017).
One element driving the shift appears to be who bears the consequences of the decision – the
group members, people the group members know (colleagues, friends, family), or people the
group members do not know. There is evidence that individuals tend to be myopically risk
averse when making decisions for themselves (Thaler, Tversky, Kahneman and Schwartz 1997).
Research indicates however that ‘risk preferences are attenuated when making decisions for
other people: risk-averse participants take more risk for others whereas risk seeking
participants take less.’ Whether the group shows a risky shift or a cautious shift depends on the
culture from which the group is drawn, and the size of the shift seems to depend on the degree
of empathy the group feels for those who will bear the consequences and risks of the decision.
As part of the design for a wargame or wargame refinement ask yourself the questions:
· ‘What are the relationships between the participants and those who bear the
consequences of any decisions informed by the wargame or wargame refinement?’
· ‘How risk seeking or risk averse does each participant appear to be?’
The Normative Process
Research has consistently indicated that working in a group produces fewer, and lower-quality,
ideas than the same people working independently and then combining their ideas in a group
process. The research does not compare the quality of ideas created by a group with the quality
of ideas created by any single individual in the group working alone; it compares the quality of
ideas of the group working together with the quality of the combined ideas created by all
individuals of the group working alone. In addition, individuals tend to believe they are more
productive in a group brainstorm than when working alone, and are also more satisfied with their
individual performance in a group, even though they produce fewer ideas. Research also
indicates that the reason for this illusion is that, when working in a group, each individual is less
aware of when they themselves do not come up with an idea than when working alone – they get
to hide behind the group (Nijstad and Lodewijkx 2006).
The normative approach consists of three phases: (1) individual pre-meeting preparation; (2)
group meeting; and (3) individual post-meeting work. This method has been routinely proven by
experiment to generate better results than brainstorming or BOGSATs (Lehrer 2012). Note that
work groups should not be larger than about eight people. Observation shows that in larger
groups only about eight people make constructive and sustained contributions.[304] If your
group has more than eight people, consider breaking it up into smaller teams addressing the
same problem in parallel or addressing different but related problems.
· First, experts think about the weakness-based question (‘What interfered with us
producing the best possible result?’) on their own as individuals, write down their
proposed answers in correct English or other proper language sentences (i.e. not
PowerPoint bullets or bumper stickers) and bring them to the AAR. This ensures that every
expert in the group has thought through the question without influence from others in the
group who might be more senior, vocal, or persuasive. You obtain input from all the
experts.
· Second, they meet in a group during which a disciplined process designed for purpose is
followed. For AARs that address weakness-based questions an excellent choice of process
is Language Processing™ (see Goal/QPC 1996 for a detailed description). The Silent
Clustering used in Language Processing™ is unintuitive, but experience shows it is
effective and fast.
· Finally, the group members think again as individuals and mail in any additional ideas to the
process lead, who combines the new inputs with those from the meeting into a final report.
The three-stage normative approach combines the benefits of individual expertise with those of
group work. However, the process lead must be familiar with the failure modes of group
processes and act to circumvent them or to factor them into the analysis if circumvention is not
possible.
Recommended best practices
We can extract a list of best practices aimed at wargame refinement that you should consider in
addition to those that should already be in place for the management of any
project. All of these are of course optional. You do not have to produce an excellent wargame or
create an excellent wargaming organisation if you, your leadership or your sponsor do not care
enough and can hide that fact. If you do care, then, based on experience and logic and employing
the above sources of best practices, my recommendations are:
Meta-level best practices
· Design and execute the wargame refinement processes after a trade-off analysis of their
costs and benefits to your specific organisation.
· Take the wargame refinement process seriously.
· Understand the requirements for and limitations of effective group processes (Lorenz et
al 2011, Nijstad et al 2006).
· Be aware of, and take into account, group process pathologies such as the Risky Shift
(Batteux et al 2017, Dodoiu et al 2017, Thaler et al 1997) and Dishonesty Shift (Mazar et al
2008, Kocher et al 2016).
External: engage with the sponsor and stakeholders
· Ensure the event is a wargame with the possibility that Blue can lose, and the gamed
concepts can be overcome by Red. Do not call non-game events ‘wargames’.
· Recruit, not invite, senior leaders to lead game cells to execute the game as designed. Do
not permit these leaders to derail the game in-stride to fit their non-sponsor agendas.
· Playtest the game with sponsor participation or with sponsor’s empowered action
officers to ensure the sponsor is paying proper attention to objectives and design.
· If stakeholders are to play, put each of them in whichever cell that stakeholder wants to
win, thus ensuring vigorous play aimed at success. If possible, recruit stakeholders who
disagree with each other and put them in opposing game cells. Do not let the sponsor or
key stakeholders be adjudicators.
· Be prepared to deal with attempts by your boss or sponsor and their chains of command,
or by the senior officers or executives you have recruited to lead player cells, to change
the game objectives or design at the last minute (Downes-Martin 2014) by explaining the
negative impact on the analysis of game data to support the sponsor's objectives. If these
impacts are accepted by your supervisors, then document the changes and who made
them in the final game report to the sponsor.
Internal: work within the wargame organisation and its chain of command
· Create and use an empowered Independent Peer Review Board to examine objectives,
assumptions, scenario and capabilities data, design, game play, adjudication, data
collection and analysis.
· Minimise cognitive dissonance in the mind of the sponsor by ensuring wargame design
and play is as consistent as possible with their preconceptions, while not allowing these
preconceptions to drive objectives, design, game play, analysis or reporting.
· Conduct wargame forensics and reporting to provide actionable recommendations.
· Report ruthlessly and honestly, unencumbered by sponsor or stakeholder wishful
thinking.
· Do not use groupware technology: it is unnecessary and often counter-productive.[305] Make everyone get out of their seats and actively participate using pens, paper, etc.
· Document the best practices you have identified from your own wargames in your
organisation’s own ‘Wargame Handbook’. Include warnings about those that are usually
ignored by your organisation and your chain of command. Submit your handbook or its
updates to the various wargaming repositories. Get an outside wargaming organisation to
review the handbook (i.e. treat it like an audit) and update it annually.
During game play: engage with the players
· Actively monitor for, and collect data on, players ‘breaking the game rules’. This provides information on better game design and information on the innovative players and their ideas.[306]
· Actively monitor for attempts by players to introduce items irrelevant to the sponsor’s
objectives into the game during play and deal with this by inductive adjudication.
· Actively monitor for senior leads of each player cell attempting to divert the game away from the sponsor’s objectives and onto their own agendas.[307]
· Design the game so that players rotate between Red and Blue cells.
Chapter 17. The Top 10 Things that Make a Good Wargame Designer
This chapter is a contribution from Ed McGrady and Peter Perla.
Introduction
Anyone can design a game. Game design is simple. Particularly wargames. As Jim Dunnigan, the original guru of board wargame designers, always said, ‘If you can play ‘em, you can design ‘em.’
Decide on a topic that interests you. Get a map. Make some pieces. Borrow a set of rules. Run
the game. It may not be a good game, much less a life-changing one that compels great men to
make epic decisions. But if you are doing a professional game for paying clients, at least you'll
get paid.
We have watched, and designed, a lot of games; some we have found compelling, and some we have not. We are also repeatedly asked to define what makes a ‘good game.’ The answer is that a ‘good designer’ makes a good game. So, what makes a ‘good designer’? Based on our experience
there are several attributes that good designers have, and things that good designers know, that
enable them to design good games. And not only good games. Great games. Ground-breaking
games. Games that make people go, “Wow, I'm glad I paid half a million dollars for that!”
We want to talk about the dirty, nasty, cynical secrets that make a game, and game designer, go
from good to great.
First let’s understand what kind of games we are talking about.
We are not talking about what most people think of when they think of games. We are sure that we could identify and list traits and practices of video, computer, board, and role-playing game designers that would greatly improve all of our game designs. We can all learn from game
designers in the digital, print, professional and hobby communities. But in this essay we are
talking about what we call ‘professional’ games. Games designed to accomplish a goal unrelated
—or at least related only incidentally—to entertainment. Such games often involve role-playing
and place the players within the same processes and organisations that they would work in at
their real-world jobs. These games include most of the wargames used by the U.S. Department
of Defense, but also include organisational and business games. They can be used to
understand business or organisational dynamics, figure out military operations or how new
weapon systems might be used, or help players manage their day-to-day jobs. The designer
often moderates these games; that is, introduces and facilitates the game play.
Just because we draw many of our examples from wargames, that does not mean we are
excluding civilian or organisational games. Far from it. Most of the principles we will discuss
apply to all types of professional games—and all types of professional game designers.
For example, prior to Hurricane Katrina, officials from the region and the Federal Emergency Management Agency (FEMA) engaged in a game to examine the response to a hypothetical Hurricane Pam. The objective was to help develop joint response plans based on projected
storm damage and flooding levels. Emergency officials from 50 parish, state, federal, and
volunteer organisations played the scenario out over a five-day period. This is an example of the
types of games we are talking about: involving tens or hundreds of players, extended duration,
serious subject, and a complex set of processes and procedures to be followed.
In order to design these sorts of games the professional game designer combines a variety of
skills. These include the ability to understand complex problems, to ask critical questions on
topics about which they know little, and to manage a large group of potentially unruly people. In
addition, it requires the ability to manage a design team that has a wide range of different
abilities, attitudes, and personalities.
At this point we also should explain what we mean by ‘game designer.’ We mean the individual
responsible for the design and execution of the game. Often, in our experience the designer is
responsible not only for creating the initial game design, but also for running the team that will
construct and carry out the game. The designer may also run the actual game event, or be the
chief controller for the game. (We’ll talk more about this idea of control a bit later on.) Even if
this is not always the case (there are, after all, many different ways to organize a game design
and execution), the points we make about the designer, the manager, and the controller will
apply to each individual who is responsible for those parts of the game.
None of this is particularly easy, which is why there are relatively few game designers who can pull off the most complex, difficult, and unusual games. We have had too many of these
opportunities to excel. We have put on games everywhere from the White House to the
Syracuse University Orange dome. We have had games with two players, twenty players, and
over a hundred players. Our games have been played in small rooms where only five or ten people could fit, and in a globe-spanning event with individuals from Japan to Germany to Africa playing. Our games have ranged from boardgames focused on military conflicts to seminar games dealing with policy and organisational issues.
We have had to do games where we knew almost nothing about the subject at the start of the
design. Try figuring out the military contracting process, or how the Department of Energy runs
laboratories, if you don't know anything about contracting or laboratories. We have also had to
run games where the players did not speak English as their first—or indeed any—language.
These challenges are all manageable, but they require some degree of knowledge, flexibility, and
respect for the players in order to pull off without a disaster. And we have had enough of those
(one or two is plenty!) to have learned our lessons the hard way.
When thinking about the skills and abilities the best designers have we can come up with a set
of traits that seem to be common to most. Even if they are not common to everyone, they are
things that every designer needs to think about as they approach the problem of designing a
game for professionals.
What secrets do most designers know that they don't tell anyone? What traits do they have that
make them the best?
Here we provide ten answers to those two questions.
Designers don't respect authority
Game designers don’t respect authority. This is perhaps the most common, and challenging,
trait we see in top line game designers. Of course, not respecting authority and not getting
along well with others is a poor recipe for success. But this isn't saying that to be a good game
designer you have to be an anarchistic jerk.
Game designers don’t respect the word of authority. They don’t respect the conventional
wisdom or whatever has been given to them as the truth. They do two things when they get the
conventional wisdom:
First, they ask themselves what they think of the problem.
Second, they ask why things are this way, and who benefits from it.
Both these questions are critically important to ask when beginning a game design. As Stephen
Downes-Martin says, sponsors and everyone else involved often want to influence the outcome
of games and are therefore at best vulnerable to self-deception and deception and at worst can
lie. This alone suggests that a serious amount of scepticism should be deployed by a
conscientious designer. But this scepticism of the conventional wisdom goes beyond simple
distrust.
Good designers know that it is the system itself that you can’t trust. People are often just
pawns of circumstance. The position they occupy drives their goals and objectives. The
organisational goals and objectives, incentives, and internal and external tensions set up the
framework within which sponsors want things out of games. It also creates the mental, political,
and social environment from which you will draw your players.
Understanding how these systems work can be used to a designer’s great advantage.
Asking: ‘What do I think of this problem?’ gives the designer a tension that can be used to gain
insight into the actual problems being played out in the game. Tension, here between what the
designer thinks and what everyone else thinks about the problem, leads to creativity. The
designer can focus in on the organisational and political issues that underlie the game problem,
creating something that goes beyond simply simulating an organisation or executing a process.
The designer’s point of view also cues the designer to ask what is going on within the game
problem that is unspoken but critical. It leads to asking the sponsor what they ‘really’ want. And
good designers understand that there is often a huge difference between what game sponsors
say they want, and what they really want.
There are many things that may lie hidden within a design problem. Let’s say that the sponsor of
the game wants to understand what might happen when unidentified friendly country Green is in a conflict with unidentified threat country Red. The U.S. sponsor obviously wants to understand how the United States can affect the emergent conflict. But in many situations like this the United States has only a peripheral role to play, and much of what it can do or offer may not be terribly relevant. The sponsor, and the U.S. military, may not realize this. Trust us, they often don’t realize how little influence they actually have in a situation.
But good designers ask themselves what the equities of everyone involved are. How will these be represented in the game? It’s not uncommon for a designer to immediately turn to the sponsor and say, “We need some other organisations in the game”, much to the puzzlement of the sponsor and the military, but not to those who have participated in real-world operations. The
game needs to reflect the realities of the potential situation, its complexity, and the external
drivers that will affect the sponsor’s objectives. The good designer knows this and seeks to
introduce it by widening the lens of the game.
So how is a designer able to come up with an answer to what they think of the problem? There
are several things to keep in mind when confronted with a game design challenge:
· Conflict. Even if you don’t know anything about the particulars of the problem, you do
know how people behave. Any time you have limited resources people will be in conflict
over them. And resources don’t have to be money. They could be press coverage, fame,
honour, or access to senior leaders. Anything that people value, they will squabble over.
· Self-deception. People make assumptions based on pre-disposed biases. High echelon
leaders, for example, will often assume that the whole organisation is working toward a
common goal—which they themselves have articulated clearly and, of course, elegantly.
Even when that is far from the truth. The good designer just assumes someone,
somewhere, is misbehaving. You can usually find out just who is misbehaving by following
the conflict within the group, but not always. Sometimes you can find it by following the
engineers. In some games, particularly wargames, the underlying problem relies on the
performance of a system or process. There is widespread belief that such things work.
This belief increases as you go up the chain of command. It can be very amusing for the
designer to ferret out the invalid engineering assumptions and use the game to bring them
into question.
· History. It repeats itself (sort of) because the same human processes, social structures,
and values come into play over and over again. Even if organisations try to fix them. During
the anthrax event around 9/11 there was some miscommunication between the CDC and HHS headquarters, as in the CDC and HHS HQ said two different things at the same time.
This was a challenge that HHS worked hard to fix. But if we were designing a game for
them again, that conflict is exactly where we would look for problems. As institutions CDC
and HHS are naturally set to be in conflict, and when politicians get involved that pressure
is even greater.
The second question designers ask is who benefits from the conventional wisdom. We assume
that our weapons will work as advertised. We assume that the nations we intervene in will
welcome our meddling. Most of all, we assume that we will win. Those assumptions benefit an
orderly and well-funded march toward certain organisational objectives. Sponsors may want
games to give them warm and fuzzies about their assumptions. But games help most when they
are a disruptive element. And designers ask themselves which assumptions need testing, and whether challenging them will be disruptive enough to be truly useful.
This exploration of conventional wisdom is very like looking for self-deception in an organisation
and focusing on that as an element of the game design. Games that not only address the
problem and scenario they were asked to address, but also bring out highly realistic
organisational, political, and interpersonal dynamics from within the participating organisations
are among the most insightful games that we have seen.
Designers worry about maps
Ok, we have gone from questioning authority, a broad and important subject, to maps. At most a
peripheral and trivial subject. But good game designers know a couple of important things about
maps.
First, maps are vital to orienting the players to the world where they will be playing. Maps
determine a lot of things about the game. There is a reason Tolkien put a map at the front of The
Lord of the Rings.
Second, good maps are hard to build. ‘Get a map’ is the first step in our fool-proof formula for
building a wargame. It is also one of the two hardest steps (more on the second hardest, orders
of battle, later).
Maps determine the relationships between players, and connect them to their actions. In a policy
game, political geography will determine which players or player groups will naturally be in
conflict, and which will naturally be allies. In an emergency response game players need to
understand the road network, which sections of road will be damaged, and what alternate routes
can be used to avoid the damaged areas. The distance from garrison bases to the enemy
determines deployment times. The distance from forward bases to the target determines time of
flight, the time delay before ordnance is on target, and the time it will take ground forces to
move to the objective.
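The time-and-distance relationships the map encodes reduce to simple arithmetic, and it is worth making that arithmetic explicit early in the design. The short Python sketch below illustrates the kind of back-of-envelope calculation involved; every distance, speed and label in it is an invented assumption for illustration, not a planning factor from any source.

# Back-of-envelope distance-to-time arithmetic implied by a game map.
# All numbers and labels are illustrative assumptions only.

def hours_to_cover(distance_km: float, speed_kmh: float) -> float:
    """Time in hours to cover a distance at a constant average speed."""
    return distance_km / speed_kmh

garrison_to_assembly_km = 450    # hypothetical road move to an assembly area
forward_base_to_target_km = 300  # hypothetical strike range from a forward base
convoy_speed_kmh = 40            # assumed administrative road-march speed
cruise_speed_kmh = 800           # assumed subsonic cruise speed

print(f"Ground deployment: {hours_to_cover(garrison_to_assembly_km, convoy_speed_kmh):.1f} hours")
print(f"Time of flight: {hours_to_cover(forward_base_to_target_km, cruise_speed_kmh) * 60:.0f} minutes")

Trivial as it is, running these numbers while drawing the map is what keeps deployment timelines and strike ranges honest once the players start measuring things for themselves.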
Physical geography determines movement rates for ground forces, accessible areas (swamps or
shallow water), and cover. Road networks determine how the terrain affects movement, and
whether forces are vulnerable to detection when traveling on them. Road nets also determine
refugee flows, primary ingress and egress routes, and flows of logistics support. Likewise, for
rail networks and air- and sea-ports. Cities and towns provide cover, and complex urban areas
provide another type of terrain that can be quite challenging to represent in a map and rules.
At minimum a good, large, map gives the players something to point at while talking to each
other about the situation. And that is often a central contribution of a map.
Maps have some organic challenges associated with them. First, let’s assume the earth is
actually round, rather than the even more complex reality. Usually, you want to display things on
flat maps. This raises the question of what projection you want to use. Over wide areas, say
an entire theatre of operations, nothing will be very good at representing all the various
distances accurately. Players get irritated when they realize that their carefully calculated range
rings don’t necessarily conform to the geography of the map. This can have real-world
implications when players decide something is in range when it isn’t, or they decide to defend
something that doesn’t need to be defended. There are no ideal solutions. One less-bad solution
is to put the centre of the area being fought over in the centre of the projection. The artist can
then measure other distances and fudge the geography until they are more or less correct. Of
course, players will now complain that the map looks ‘weird,’ but, again, no ideal solutions.
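To see how large the range-ring problem can get, the Python sketch below compares the true great-circle distance between two invented points with the distance a ruler would give on two flat maps: one scaled at the centre of the fought-over area (the ‘less-bad’ option just described) and one naive latitude-longitude plot. The coordinates and labels are made up purely for illustration.

# A toy comparison of true distance against distance measured on a flat map.
# The coordinates and the 'battery'/'target' labels are illustrative assumptions.
import math

EARTH_RADIUS_KM = 6371.0

def great_circle_km(lat1, lon1, lat2, lon2):
    """True surface distance between two points (haversine formula)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

def flat_map_km(lat1, lon1, lat2, lon2, ref_lat):
    """Distance a ruler gives on an equirectangular map scaled at ref_lat."""
    km_per_deg_lat = EARTH_RADIUS_KM * math.radians(1)
    km_per_deg_lon = km_per_deg_lat * math.cos(math.radians(ref_lat))
    return math.hypot((lat2 - lat1) * km_per_deg_lat, (lon2 - lon1) * km_per_deg_lon)

battery = (60.0, 20.0)  # hypothetical missile battery at high latitude
target = (63.0, 30.0)   # hypothetical target roughly 600 km away

true_km = great_circle_km(*battery, *target)
centred_km = flat_map_km(*battery, *target, ref_lat=61.5)  # map scaled on the area of interest
naive_km = flat_map_km(*battery, *target, ref_lat=0.0)     # naive lat/lon-as-x/y plot

print(f"True distance: {true_km:.0f} km")
print(f"Map centred on the area: {centred_km:.0f} km")
print(f"Naive flat map: {naive_km:.0f} km")

In this invented example the map centred on the area is off by only a kilometre or so, while the naive plot overstates the distance by roughly 85 per cent: exactly the kind of error that puts a range ring where it does not belong.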
Finding a useful map, at the resolution that you need it, can be quite vexing. Dedicated mapping
software often draws simple vectors, which does not produce maps with high visual interest.
Complex maps with lots of detail look great until you realize that players only need half the
information being displayed and the half they don’t need is overlapping and obscuring the half
they do need. If you have access to the National Geospatial-Intelligence Agency then you can at least call on them or other groups to provide high-quality, detailed maps. But if you are an independent contractor, or a staff physician or logistician trying to build a game, you may not have the access that a Combatant Command would. The Internet provides some relief to the map-challenged, through places like the University of Texas map repository or OpenStreetMap, but often the best option is just to take a bitmap screenshot of Google Maps and call it a day.
The other option for maps is to take the basic data from various sources, including a
Geographical Information System, and then have an artist redraw it. This, of course, can get
expensive as drawing detailed, high quality maps takes experienced and capable artists. And
they do not come cheap. Strategy and Tactics Magazine runs some of the cleanest, clearest,
military history maps that you can find, following and building on the standards and traditions of
Redmond A. Simonsen, the godfather of wargame graphics from back in the days of
mechanicals and the two-colour process. Modern graphical methods have access to a dizzying
array of tools, textures, and colours. Using them all in a game map is seldom the right way to
go. So, your artist needs specialized understanding.
Game maps also have to be edited to make sure they contain the information the players need,
and not a lot of information that they don’t. Game maps are tools, and only incidentally art. This,
again, can take time and resources, something that professional game designers do not
necessarily have in abundance.
Maps are hard. Be nice to your artists.
Designers know stuff, a lot of stuff
A good game designer knows how things work. Not just engineering things, but social things as
well. The designer incorporates this knowledge into the materials they give the players. They
also use it to adjust and edit the ambitions of the game sponsor. There are several different
kinds of knowledge that the game designer should have: technical, political, and social.
Technical knowledge
Technical knowledge for wargames, or any professional game dealing with a specific subject, is
an obvious kind of knowledge that the game designer should have. Or at least be able to get.
Unfortunately, there is a problem with this obvious requirement: there is a lot of knowledge to
cover.
Depending on the scale of the wargame the designer may be required to have a working
understanding of everything from missile kinematics and end-phase seeker engagement physics
to the network of communications systems the commands involved will be using to chat.
As another example, suppose you are doing a game on avian influenza. You will need to know at
least the basics of poultry production and cock fighting. You will also have to know a bit about
how animal diseases spread, and how the response to an outbreak will be handled by the people
and agencies involved. As the cynical designer who does not respect authority, you will also
want to understand the rest of the poultry protein delivery process, looking for any unexpected
implications or consequences for a disease outbreak (egg production declines, impacting human
influenza vaccine production for next year's flu season, for example). This is a lot of stuff to
know for a game about chickens.
“But we will just use someone else’s rules system (or model).” Sure. Fine. Let’s make the broad
and – from personal experience – unfounded assumption that you can push this off onto
someone else. Someone you trust to know more about the problem you are gaming than you
do. Unfortunately, the players probably have not made the same bargain that you have.
So, the players will ask ‘why?’ Why did the model produce a result? How did the extra widget or
capability that the players threw in and that's not in the model affect the outcome? What
happens when it starts raining?
Given that the players are generally experts on their weapon systems and capabilities, they will
not be willing just to trust the model. They will start pushing back. You need to be able to
answer them in at least a reasonable fashion. Game controllers will often find that modelers
have to take an urgent phone call when this happens. Or their answer fails to satisfy the players,
who continue to look to the controller, not the modelers, for answers in the game world.
On the other hand, the players are, after all, heavily invested as experts in the systems that they
are playing with. This tends to make them think that the systems will always work in ways that
are, perhaps, unlikely under real-world conditions. You cannot simply take the players’ objections at face value. They may be wrong, and the model actually may be correct.
You need to decide: the players or the model? That decision will need to be based on your own
knowledge. So, using the model does not relieve you from knowing how things work. At least in
general terms.
Now you don't have to know the physics of every single interaction. But you need to know the
direction the physics goes in. Infra-red will allow you to see things that have a temperature
difference against background. Unless the threat lights a bunch of fires. Then you will need to
factor those fires into the field of view of the seeker, and its frequency ranges compared to that
of fire. The designer does not need to know the IR frequency range emitted by fire, but they
need to know enough to know they need to ask the question when confronted with an IR sensor
with a set frequency band. And so on.
Political knowledge
The second kind of knowledge game designers have is political. Understanding the political
situation helps you right from the start in working with the sponsor to define the problem. Any
time someone is paying a lot for a game there has to be some agenda going on, otherwise they
would just get some in-house staffer or contractor to do it. They may be guilty of all the hidden
(or not so hidden) sins that Stephen Downes-Martin talks about in his work, or they may just be
trying to satisfy some tasking or question that came up. But you need to understand the
sponsor’s political position in order to position the game in a way that complements their
agendas.
Notice that we said ‘complements’ as opposed to ‘supports’ their agenda. There is nothing wrong
with designing a game that addresses the question of whether the Marine Corps needs robotic
dogs as part of the platoon. There is everything wrong with designing a game to prove that the
Marine Corps needs robotic dogs. But designing a game that does not even include robotic
dogs would be just wrong.
Designers also need to understand organisational politics. In any organisation, no matter how
small or big, no matter how squared away, somebody hates somebody else. Often entire
departments hate other departments. There are rivalries over resources, fame, or promotions or
all three. There are process challenges that mean the organisation inevitably does the wrong
thing despite publicly saying that they will do the right thing. For example, an organisation can
have a budgeting process that takes into account all of the various puts and takes, gets
everyone’s input, and manages to produce a coordinated, well considered, budget for the next
year. Then, at the very last minute, a small group sits down and decides who actually gets what
funding, who gets cut, and how they are going to, yet again, shove ten times the requirements
into one times the resources. True story.
Knowing how these things work allows the designer to identify where to look for the ‘big C’s’ in
their game: conflict, cooperation, compromise. The designer needs to know where the
organisational fault lines lie so that they can include them in their game design. Knowing how
the budget process actually works, rather than how the organisation says it works, allows the
designer to duplicate the external and internal processes and pressures that force the
organisation into those compromises. This will not only increase realism in the game, it will give
the players a way to work on the process challenges they face in addition to whatever else the
game is about.
And it makes the designer look like an all-knowing hero. Though no one will notice except the
designer. No one ever does.
Social knowledge
Finally, the designer needs to know how people work. People, after all, are the central engine
that drives the game. Otherwise it's just solitaire, which no one pays for.
The designer needs to know how people behave, and then how the specific subset of people
who will play in their game will behave. Military personnel behave very differently from college
students. Officers behave very differently from enlisted personnel. Students behave very
differently from professors. The designer needs to understand the diversity that will exist in
their game and what the players will be inclined to do, and not do.
This involves a wide range of knowledge and techniques. It ranges from understanding that
people from operational combat units do not necessarily like to sit still in a room and debate
stuff for long periods of time, to the knowledge that if you phrase your tasking to the players in
different ways you will get very different outcomes based on how people interpret your words. It
includes the ability of the designer facilitating a game to understand the narrow path between
fun and control, and to make sure they draw the players back into serious work when they begin
to veer outside of the lanes.
Unfortunately, these types of knowledge cannot be taught easily. They come from the
designer’s experience, and their own inner curiosity.
Orders of battle are annoying
Knowledge will tell us how weapon systems work, but you also need to know how many there
are, where they are located, and how they are to be employed. This is not as obvious as it may
sound. Note that while we are using military examples here, the same concept applies to
everything from avian influenza (now where did we put those depopulation systems?) to
emergency response (who owns which fire trucks and how many hospitals in the area are
capable of level 1 trauma care?).
First, there are two sides. While intelligence professionals may have an estimated order of battle
for the other side, they may not always have it in a form that is useful to a game designer. In an
operational level game, it may not be important to know exactly where every launcher in every
air defense system is, as keeping track of thousands of vehicles is not something that you want
to be doing in a game. But it may be very important to intelligence analysts supporting strike
planning. Aggregating these individual vehicles into firing groups, batteries, and battalions or
regiments may not be relevant to them, but it is to the designer.
In many ways threat capabilities are the ‘good news’ story of orders of battle. Own force
organisation, deployment, and capabilities may be even harder to come by. The biggest
challenge is task organisation and the tendency of the military to reorganize at the drop of a hat.
This means that whatever information you find today may very well be inapplicable tomorrow.
Unfortunately, the organisation of military units is of keen interest to serving personnel, as it
determines their chain of command and other things. So, they will ‘know’ the ground truth far
more easily than the designer can. Of course, it is most amusing when everyone is wrong, all in
different directions.
Weapons systems and their underlying physics do not change all that frequently, and the
vagaries of the peacetime procurement system make any changes to weapons slower still. On
the other hand, organisations can change with the stroke of a pen, sometimes even with a
simple verbal order.
Second, the U.S. military typically deploys overseas. Deploying a large military force is a major
pain in the ass (PITA). You have to get everybody notified, find a way to get them where they are
going (hopefully without any enemy interference), then you have to move them into position of
contact with the enemy. This takes a long time, a lot of coordination, and it cannot all be done at
once. Of course, the U.S. military has systems and processes that allow it to manage
deployments to a very fine degree of detail: the Time-Phased Force Deployment process.
Except it doesn’t quite work like that. Like organisational structures, plans change all the time.
And often very few people know or understand the changes. But, because military units actually
have to deploy, the information they use is structured in a way that helps them load, schedule,
and debark their units. Which is not really what we do in a game. This can result in an incredibly
detailed and massive set of information that, again, does not easily aggregate into units.
Even worse is the problem of understanding exactly what you are going to get when everything
arrives. A squadron may sound like a sensible aggregation of aircraft to use in a game.
Unfortunately, it does not tell you how many aircraft are in the squadron at any one time. The
squadron could be shorted, all the squadrons in the community may be short aircraft, or they
may have too many. And that doesn’t even begin to get into the problem of aircraft that are
down for maintenance or safety issues. A squadron may have 24 aircraft on the books, but
detachments, exercises, and low maintenance numbers may reduce that. Across an entire
theatre this can become an irritating challenge for the designer.
Good designers know that these issues are going to exist and try to deal with them in several
practical ways. First, if you have any contact with the actual commands the best way to get
accurate numbers is from the local commands. Second, you can always work with, hire, or
otherwise obtain someone with experience in the service to help you sort out the deployment
process, locations, and numbers. Finally, good designers realize that they will pretty much always be wrong; their goal is to be wrong in increments smaller than what matters. In a theatre-level game 20 vs. 24 aircraft per squadron may not really matter much in the final outcome. A
squadron or two mis-deployed between airfields also won’t lose the war for the allies. Usually.
You do the best you can, and then let the players move stuff around until they stop complaining.
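The ‘wrong in increments smaller than what matters’ standard is easier to apply if you do even crude availability arithmetic before the game. The Python sketch below shows the sort of rough aggregation a designer might run; the squadron names, detachment counts and mission-capable rates are all invented assumptions, not real readiness data.

# Rough squadron-availability arithmetic. Every name and number below is an
# invented assumption for illustration.

def flyable(on_books: int, detached: int, mission_capable_rate: float) -> int:
    """Aircraft likely available after detachments and a mission-capable rate."""
    return int(max(on_books - detached, 0) * mission_capable_rate)

squadrons = {
    "Squadron A (notional)": (24, 4, 0.70),  # four jets away on a detachment
    "Squadron B (notional)": (24, 0, 0.60),  # all present, bad maintenance week
    "Squadron C (notional)": (20, 2, 0.75),  # already short on the books
}

for name, (books, detached, mc_rate) in squadrons.items():
    print(f"{name}: {books} on the books -> {flyable(books, detached, mc_rate)} likely flyable")

nominal = sum(v[0] for v in squadrons.values())
available = sum(flyable(*v) for v in squadrons.values())
print(f"Theatre total: {available} likely flyable of {nominal} nominal")

Whether the resulting gap between the nominal and the flyable totals matters depends on the level of the game; the point of the exercise is simply to know the size of the error you are accepting before the players find it for you.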
Designers are curious
The best game designers are curious. They must be. There are simply so many things they need
to know, and many of them are things that you would not at first expect a game designer to
need to know.
Things that wargame designers must be curious about obviously include weapons systems,
military organisations, military tactics and operations, and policy decisions. But if designers are
going to do civilian games, they may also need to know how water systems, farms, and
industrial processes work.
Most professional game designers we know who are associated with the Department of Defense
have an avid interest in history, particularly military history. Knowledge of one area informs
designs and thinking in others. Napoleonic infantry tactics will give the designer perspective on
command and control, morale, firepower, and manoeuvre even though the weapons and tactics
are significantly different from modern combat. Understanding the British occupation of Afghanistan and India, and of South Africa during the Boer Wars, will give insight into insurgency and counter-insurgency. At least enough to direct the designer’s attention toward the questions that
they need to ask in a modern environment.
Because wargame designers are curious, they have a broad perspective on military operations,
and can incorporate historical events into modern games. This is a major advantage over those
who lack a deep historical perspective.
However, a game designer that also designs for civilian clients has to go well beyond military
operations. They need to understand how international organisations like the UN work, or they
will be seriously limited in their design of policy games. They need to understand business and
economics, or they will be severely challenged in designing games that have an economic
component. They need to understand national response planning and emergency response,
along with infrastructure and how it operates, or they will be limited in their ability to design for
the emergency response sector. They need to understand how organisations work, and how an
individual’s psychology affects their leadership styles and the organisation, in order to do an
organisational game.
Even for those specializing in one set of clients, this is a much broader portfolio than simply
knowing the rate of advance for an infantry company moving across broken terrain. It involves
understanding everything from how the international system of governance works, to the
command system for our military, to the likely reaction of civilians in a nation affected by
conflict. Having any chance of doing this means that the designer must be constantly curious
about everything that crosses their path. Whether it’s how a college works (great information for
doing a game on managing a national laboratory) or how large-scale scientific projects are
managed (for gaming plans on planetary defense).
Finally, game designers love games. They are extremely curious about the latest techniques,
designs, and scenarios. They want to understand how others are designing games. Because
they get their own inspiration from those designs, if they don’t just steal them in the first place.
And this does not just apply to professional games. Designers are curious about how the latest
Euro game deals with the problem of economic competition, and how role-playing games are moving in the direction of storytelling rather than tight, system-driven control. They don’t
necessarily apply these insights directly, though they may, but they can get ideas from
commercial games to help design and run professional ones.
If such requirements seem overly broad, and irrelevant to you, you have not run a professional
game shop for very long. People come to you with any number of game ideas, ideas that range
from the micro to the macro. From things you know about to things you have never heard of. If
you’ve been curious you may be able to answer their questions with good game designs. If not,
they will go elsewhere.
Knowledge is critical to being able to design a realistic and effective game. Curiosity broadens
that knowledge beyond the immediate horizon and allows the designer to progress from basic
designs to advanced ones.
Designers steal rules from everywhere they can find them
Rules. Steal them. If you don’t know where to steal them from then go play some games before
you begin designing them.
Rules. Modify and adapt them. If you can’t modify the rules to account for things like an
artificial-intelligence seeker on an artillery round then you have not been curious enough. Go
learn more before trying to design a game.
Rules. Use them with confidence. Because if you do not, then the players will make you wish
you never took on the job of standing up and running a game.
But realize that, ultimately, rules are designed to be modified. For the kinds of games we are
talking about rules are guidelines and guidelines are the way you get the players through the
game without everything falling apart. Good designers stay constantly aware of that.
Designers are confident
Which can be annoying. Particularly for those close to them. Confidence can be a two-edged
sword, something that becomes quite obvious when you start hanging around with too many
game designers. Game designers—professional game designers—have an amazingly high
degree of confidence in what they know, and what decisions they make. They have to.
Otherwise the players will eat them alive.
Confidence mostly manifests itself during the execution of a game. If the designer is also the
chief controller, as is often the case, then the designer has to walk a fine line between being in
charge and being accommodating to input from players and others involved in the game.
Projecting confidence in the control of games is essential in order to get the players to buy into
the premise and legitimacy of the game. If you don’t feel confident about the game, why should
they? Once that confidence is lost so is the willing suspension of disbelief; the game of pretend
is over and the whole edifice comes crashing down.
You often see the warning signs when the old saws ‘don’t fight the scenario’ and ‘don’t fight the
game’ begin to come up in player conversations. Players fighting the game are essentially
disagreeing with some fundamental premise in the game, whether it’s the plausibility of the
scenario or the likelihood of the other side taking some sort of action. In order to play the game,
the participants somehow have to get over those objections, or else they begin to believe the
game is a waste of time.
A good designer knows how and when to accommodate objections, and when to push back on
player input. Accommodations that don’t derail the game or its objectives, or actually improve
the game (yes, it can happen), can be welcomed. However, designers have to project an aura of
still being in charge, lest the players see an opportunity to redesign the whole game right out
from under them.
The designer also has to be capable of pushing back against bad player inputs. In order to do
this the designer has to have built up credibility as an authority. Both in the control of the game
as well as the various problems and issues associated with the game. Because the good
designer has been curious, and learned about the systems and processes involved in the game,
they can early on in the game exhibit some of that knowledge in order to gain the player’s
confidence. This confidence can be cashed in when it comes time to move the game in a
direction not all of the players may like. Our friend and award-winning designer Mark
Herman tells one of our favourite stories about this. He was giving the weather brief at the start
of a game when one of the players objected. “I’m very familiar with this region,” says he. “I was
stationed there. And the weather is never like that!” Mark’s response was classic. “Well, I’ve
never been there, and I don’t claim to be an expert. But I can read. And this weather report is the
actual forecast for the region today.” We would pay good money for an opportunity to deal with
a player objection like that.
When a designer or controller is out of their depth, or just plain wrong, players can sense it.
They sense that the controller is uncertain or mistaken. To make matters worse the design is
also often askew because of the designer’s lack of knowledge. This only puts the players in a
worse mood. It is at times like this that the best designers shine. They know how to use player
input to move the game in the right direction, and they know how to fall back from the parts of
the existing design that may not be correct. They can do this without giving the players a sense
that the entire game is sideways, and that their authority should be questioned. It's not the best
way to do things, but sometimes it is the only way to save a game from disaster.
Amateurs talk about design; professionals talk about where to buy cheap
supplies
Yes, we stole that line from Picasso. Maps, orders of battle, pieces, and rules all form the
material components of a game. In a classical wargame, materials are required to give the
players a way to interact dynamically with the battlefield. In other types of professional games,
the design of materials leads the players into the story and keeps them there.
Good designers know this and try to avoid the basic instruction sheet or memo format. That's
where you tell the players the scenario, the key issues, and the game objectives in a short essay
about the game. Basically, giving the players a set of instructions. In those kinds of games, the
players will end up talking about the problem instead of playing through the game. Instead, good
designers try to set the stage for the story the players are about to enter, using forms and
styles of information the players will expect to see in reality. News reports. Op-Ed pieces. Even
custom video or audio reports from the scene of the action.
Designers also know that in game materials, more is better. The more you give the players the
more they will invest in the game. There are a couple of reasons for this. First the players sense
that this is a serious game, one where the designers have done their homework. There is a lot of
detail, and the players really have a job to do if they are to participate effectively. This is part of the
‘flow’ experience in a professional game—the players need to be challenged. Flow experiences
keep the players hovering between too much and too little stress. Players need to feel they are
able to complete the work assigned to them, but that the work is meaningful enough to matter
when they do. This balance between too much information and complexity in the game, between
too many materials and not enough, is one that good designers learn from experience, and their
social knowledge.
But more materials alone are not enough to capture the players. The materials also need to
reflect the kind of information that the players would normally see in the context of an actual
event. Players in professional games often are expected to play the same roles in the game that
they play in real life. The more realistic a presentation they get the more likely they will be to
believe the other fictions in the game.
The information in the materials must also be presented in a way that is easy for the players to
digest. When the players show up at the game, they are unlikely to have read whatever read-aheads you may have sent.[308] And some players may have been late substitutes or add-ons
and have no idea what the game is about. This means that at the start of the game players will
need to be given clear summaries, briefings, and other written materials in order to catch up. It
also means that the controllers will have to brief the players on the game situation at start. In
other words, good designers use read-aheads for the players, then give the materials to the
players again when the game begins, and then tell the players what they want them to know.
Good designers don’t trust the players to learn what they need to know on their own.
Good designers also know that bringing players into the game at the beginning is critical to
having a successful game. This means that when the players look at their materials they should
be transported immediately into the problem. If they expect briefings from the J-3 and J-2, then
the materials should look and feel like briefings from those organisations. Likewise, if they
expect media articles, game information should be presented as news articles. Some designers
even go as far as to use simulated news broadcasts at the beginning or even throughout the
game in order to give players a sense that they are in a media environment.
Game designers need to develop extensive materials that are clear and easy to access, but that
give the players the information that they would get in the real world. This is not as arduous as
it might seem; designers can cut out much of the irrelevant information that would be given to
players in the real world. But it also means that designers need to understand what real-world
information would be available, and how it would be presented.
Designers respect their players
Knowing how things work, how they really work, comes from knowledge, curiosity, and a healthy
willingness to disrespect authority. Knowledge in games begets accuracy in game construction
and game play. Knowing how weapon systems work allows the game designer to include
accurate representations in the game. Knowing how emergency response operations work, or
how animal disease control is managed, allows designers to build games that are believable. More
importantly, on game day it allows them to adjust those representations in the face of player
shenanigans.
Without accuracy you will not have a good game. And you will likely have many upset players,
and sponsors.
However, accuracy is insufficient to make a good game. Good games also have to be fair in
addition to being accurate. Factual knowledge must be balanced with social knowledge,
because games are as much a social event as a simulation.
By fairness we do not mean game balance. Balance in game design is very important in
commercial products. Both sides in a commercial game need to have a realistic expectation of
winning. Even if winning means losing less spectacularly than the general did in the real battle.
Balance in this sense of the concept does not apply in professional games. In professional
games we are interested in exploring, understanding, or assessing the topic at hand. If one side
crushes the other side all the time, fine, that is part of what we are trying to understand. The
players being crushed may feel bad, but it doesn’t ruin the game.
Fairness is a much more subtle concept than balance in professional games. It means that the
game respects the players, the scenario, and the surrounding components of the scenario.
Players in the game don’t get the feeling that no matter what they do, the controller is going to
come down in favour of the other side. They feel that the models, scenario, and organisation of
the game have been designed to reflect their real circumstances accurately, even if those
circumstances are pretty dire.
Because the focus of many people is on models when they consider games, they miss the reality
that models cannot do everything in a game. Players will come up with ideas and concepts that
do not necessarily work in the models of combat (or whatever) as they are designed. Controllers
will need to make some decisions about those player actions. Dismissing them, shoehorning
them into existing models, or poorly reflecting the capability will lead players to believe that they
have been dismissed. Players who feel that control does not respect their decisions stop
playing. It's a natural feeling. If whatever I’m doing isn’t going to ever work, why should I do it?
Using rigid adjudication requires sensitivity and nuance on the part of the controllers. They need
to know when to go off script, especially when players invest a lot of their own energy and
sense of self in what they are doing. Controllers need to read the players’ focus and respond
with an equal degree of consideration.
But it's not just the adjudication process that affects the perception of fairness. The scenario
can be biased unfairly against one side or the other. Now we don’t mean that the situation
depicted in the scenario is unbalanced. That is fine. Rather the way in which player decisions in
the game are treated can become unfair. Players may feel that unrealistic or excessive limits are
imposed on their actions or capabilities. We often see this in terms of restrictions on what one
or another side would do during a conflict. The Red side may be told to sit quietly and watch as
coalition attacks brilliantly destroy their forces. Without the Red players being able to respond.
Another group of Red peer competitor players may be told not to escalate to nuclear use, even
though experts agree that the country would use nuclear weapons given the actions of their opponents.
While player-generated constraints are fair, arbitrarily imposed constraints on player actions
such as those described here can make the game seem biased, unfair, and a waste of time to
the victims. And it probably is.
The game designer also has to respect those elements that surround the scenario. These
include the way in which the players are organized, the information that is given to players, and how it is given to them. They also include the units the players are given, and the time sequence for their arrival.
The designer must demonstrate to the players that they understand how these various elements
of their jobs work, at least at the most general level of understanding.
Player roles and organisation can significantly affect the perception of fairness in a game. The
most common problem is mismatch between player skills and the position they are placed in.
Surprising the colonel at the game by suddenly putting them in charge of the theatre-wide
operation can be seen as unfair.
The designer can address this kind of fairness by understanding the pressures and challenges
shaping the players, their participation, and their integration into the game. A colonel is going to
be sensitive to looking the fool because of some last-minute tasking they were not prepared for.
Better to sound them out privately before announcing it to the assembly. Better still to make
sure they understand their role before coming, including providing some read-ahead materials
so they look extra smart. And during the game, keep an eye on them and be willing to brainstorm options or offer suggestions when they come up short of ideas.
Sure, this kind of ‘control’ does not conform to a rigorous, unbiased, and disinterested run of a
game. But fundamentally games are a human project. And it helps to treat the players like
humans sometimes. In fact, it’s downright unfair not to.
Designers know how, and why, to build a scenario
Now that we are at the end of our little essay, we come to what is too often the beginning of the
design process. The scenario. What is the game going to be about? What decisions are the
players going to make? What roles will the players have? These are all elements of the game
scenario.
The game scenario carries a lot of weight in creating a good game. Which is why it's a shame
that the scenario is often the first thing thought of – and specified – by the sponsor. They want
a game on NATO intervention in Fredonia, and from that comes all kinds of results including the
scale of the game, the roles of the players, the kind of forces employed and the adjudication
(and models) needed.
But a good designer knows that just because the scenario is specified, the game itself is not.
The scenario as often conceived of by sponsors and other civilians is no more than a shell, an outline of who, what, where, when, and why. Modern operations live or die by the details of the
situation. The political, social, environmental, and other constraints imposed by outside forces
are what really shape the form of the scenario. These shape the roles and the decisions that the
players make. They are part of the scenario, but are not something that sponsors can or want to
specify.
A good designer knows that no matter what scenario they are given they still have room to
create the game that they need to create.
For example, suppose the sponsor is really interested in command and control and a possible
reorganisation of the staffs they will use as a Joint Task Force commander. The sponsor
specifies a particular scenario in their theatre, simply because this is the one that they are most
concerned about at the time. This may or may not be the ideal scenario to test different JTF
organisational concepts. A far better scenario may be one with a significant humanitarian
component, because all of the elements of the JTF staff can get exercised, not just the
warfighting pieces. Also, the designer knows (because it’s always been that way) that U.S.
military forces inevitably have difficulty during the initial phases of an operation interfacing and
working with outside organisations. This would be a much more sophisticated challenge for the
JTF force providers than a simple kinetic scenario, and everyone would learn a lot more. So
maybe the designer can sneak in a humanitarian aspect of the scenario without being too obvious
or too dismissive of the sponsor’s favoured situation.
The good designer understands the power of the scenario to accomplish objectives, real
objectives, and is willing to work the sponsor and the game in order to bring things around to
the best overall scenario solution.
And so we find ourselves back at the beginning. A good game designer does not simply accept
the scenario as given by the sponsor – the authority. They ask themselves what the sponsor’s
real objectives are, not just the ones they are saying but the ones they are thinking. Or even the
ones they don’t want to think about out loud. The good designer pushes back on the objectives.
They refine them. The good designer pushes back on the scenario. They want it to match the
objectives and make it easy on the players to consider the decisions they need to consider. And
the good designer does this because they respect the players. They don’t want to waste the players’ time; they want them to be in a fair game that accurately reflects the world, and that has the potential to change the way the world actually works.
Part 4: Practising Successful Wargames
‘Wargames help strip down a strategic, operational, or tactical problem and reduce its
complexity in order to identify the few, important factors that constrain us or an opponent. They
provide structured, measured, rigorous – but intellectually liberating – environments to help us
explore what works (winning) and what doesn’t (losing) across all dimensions of warfighting.
They permit hypotheses to be challenged and theories to be tested during either adjudicated
moves or free play settings, thereby allowing current and future leaders to expand the
boundaries of warfare theory. And they provide players with the opportunity to make critical
mistakes and learn from them – and to perhaps reveal breakthrough strategies and tactics when
doing so.’
Key concepts
Eustress
Eustress means beneficial stress.[329] The word is a combination of the Greek prefix eu-, meaning ‘good’, and ‘stress’, so literally means ‘good stress’. It must, however, be self-
generated, originating from within. More accurately, it must originate from within your players,
and it is your job to create the conditions to allow this. You need to instil in them a sense of
optimism and confidence so that they voluntarily put themselves into a position where they are
being stressed. Key to this is ensuring that players know they are in a safe-to-fail environment.
Fiero
Fiero is an Italian word broadly meaning ‘pride’. Raph Koster defines it as ‘the expression of
triumph when you have achieved a significant task. This is a signal to others that you are
valuable.’[330] It’s the feeling, or reaction, you get when your team scores the winning goal, or
in the moment you finally achieve something difficult. It is usually accompanied by punching the
air or raising your arms in triumph. Fiero is such a primeval emotion that it is common to all
human cultures. It taps into our basic survival instincts. In achieving something challenging, we
are learning how best to survive. We are no longer killing mammoths, but creating ‘fiero
moments’ in our players still releases endorphins into their bodies.[331] Endorphins can also
result from physical exercise – and fun. Releasing them significantly enhances the learning
effect. It should not be a once-daily emotion, triggered by a ‘well-done’ at the After Action
Review (AAR); mini fiero moments should be constant. A slap on the back, a reinforcing snippet
of intelligence – seize any opportunity to induce them. But taking opportunities is not enough:
you need to engineer them. In that way, you tap into players’ visceral survival instincts.
Naches
Naches is Yiddish for ‘vicarious pride’: experiencing pride through someone else’s
achievements. ‘The feeling you get when someone you mentor succeeds. This is a clear
feedback mechanism for tribal continuance.’[332] It again taps into our survival instinct, so
enhances learning and engagement. A common issue during staff college wargames is the
‘appointment change’: Student A has been Chief of Staff for Week 1, and is then given a lowly
logistics post for Week 2 to allow Student B the opportunity to shine (be assessed) as Chief of
Staff. Student A switches off, disengages and reaches for their phone. Why not get Student A to
mentor Student B? Why not get strong students to work with less confident ones? This needs
careful thought and in-game management, but results in better engagement and enhanced
learning in both the supported and supporting students.
Flow
‘Flow’ is fundamental; all the key works on games highlight it.[333] Most people have heard
about ‘being in the flow’, so you should be pushing at an open door. First, lodge the fact that
the originator of the concept, Csikszentmihalyi, is simple to pronounce: Cheek-SENT-me-high.
Flow is the frame of mind characterised by intense attention and maximum performance on a
task; achieving a deep and effortless involvement in a job that reduces awareness of everyday
life. Concern for yourself is reduced, but self-awareness rises after the event; and your sense of
time changes. Colloquially, it is being ‘in the zone’, or having ‘hard fun’.
‘In the flow’ is where you want all your participants (not just the players) to be all the time.
That’s a big ask, so how do you do it? I list below the conditions required for being in the flow,
then expand on just the last two.[334] Work to ensure that each of the following is embodied in
your games:
· Intense, optimistic engagement.
· Tasks are satisfying, exhilarating, challenging but achievable.
· Creative accomplishment.
· Heightened functioning.
· Structured, self-motivated work.
· Continuous feedback.
· Self-chosen goals and personally optimised obstacles.
· Perfectly matching the challenge to capabilities.
Self-chosen goals and personally optimised obstacles. In an educational wargame, why not
invite the students to select their appointment? This is revealing in itself, and easy to
accommodate (for example, by asking for their top three preferences for positions within the
wargame). By doing this, you are helping to establish the conditions to get every student into
the flow, because they will be working at a job they themselves have chosen.
Perfectly matching the challenge to capabilities. Illustrated in Figure 18-2, this is a key
performance indicator, and my first agenda item for Control coordination meetings in an
educational or training wargame. The discussion of the level of pressure the players are under
informs all decisions taken in the meeting. To confirm this, a member of the Observer/Mentor
staff, or someone who is working alongside the Training Audience, must be present. Having
ascertained the level of pressure, the balance between challenge and ability can be regulated
using the methods I discuss in Chapter 22 (Controlling Wargames).
Figure 18-2. Achieving ‘flow’: matching challenge to ability
To achieve the conditions for flow, your participants need to:
· Be able to concentrate on the task;
· Recognise the possibility of completing the task;
· Work to clear goals;
· Receive feedback; and
· Have control over their own actions.
The importance of these bullets to your wargames is apparent, and it should be simple to ensure
they feature in your games. But (mindful of that Clausewitz quote![335]) let me give you an
example of how simple things can be difficult. Being able to concentrate on a task is one of the
prerequisites for achieving flow. I made this point to the Directing Staff at a European staff
college where I was shortly to facilitate a two-week wargame. About a week before it was to
start, I checked that nothing had been added to the wargame programme. It was as planned,
with no distractions. Arriving a week later, I found that the deadline for submission of the
Commandant’s Paper (a key assessed assignment) had been brought forward by one month to –
you guessed it – the last day of the wargame. At a stroke, the students had been effectively
prevented from concentrating on the wargame. Yet none of the wargame’s Directing Staff had
even thought to object when the new deadline was mooted. Use the bullets above as a checklist.
If you are in charge, refer to them constantly to ensure you are abiding by them; if not, check
them with whoever you are working to and try to ensure they are implemented.
Happiness
This is a lesson in life, brilliantly covered by Jane McGonigal (the full book title is Reality is
Broken: Why Games Make Us Better and How They Can Change the World), which directly
applies to wargames.[336]
Extrinsic or intrinsic happiness
‘When the source of positive emotion is yourself, it is renewable.’[337] Intrinsic
happiness comes from within, and is a result of something that you achieve, especially when
hard-won. An associated phrase is auto-telic: auto meaning ‘self’, and telos meaning ‘goal’ in
Greek. Extrinsic happiness comes from outside. It is, perhaps, best encapsulated in ‘The
American Dream’ (apologies to any States-side readers!): having a big car, nice clothes, grand
house etc. These consumer items, and money, do not make people happy: they always crave
more, and end up unhappy. Achieving challenging auto-telic goals makes you happy.
Extrinsic rewards in games include ‘bling’ on a first-person shooter avatar, victory points,
quantified success and scoring. These are less effective motivators than intrinsic rewards, such
as playing well, doing what you think the actor you are representing would have done and the
challenge of the game. Moreover, extrinsic rewards can crowd out intrinsic ones. I might give
players objectives, but I never assign victory points; rather, I invite the players in the AAR to
discuss how well they have done. This is often just as useful – and interesting – as the game
play.
The four secrets to happiness
The four secrets are:
1. Satisfying work. This is being immersed in clearly defined, demanding activities that
allow people to see the direct impact of their efforts. People must have:
· Clear goals.
· Actionable next steps towards reaching an achievable goal.
· Agency. People like to know that their actions produce an effect, and that their
decisions matter.
2. Experience success, or at least the chance of being successful. Succeeding is fine (and
needs no explanation) – but so is failure, if handled in the right way. People do not mind
failing if they have no fear of this and can see themselves improving. Many video games
involve constant failure, but the player will try again…and again…and again because each
time they get a little better – until they eventually succeed. Jane McGonigal introduces the
concept of ‘fun failure’, where failure is accompanied by ribbing. Counter-intuitively, this
mild teasing inoculates the ‘failed’ player against failure, and builds trust between the two
parties. Failure, and fun failure, are crucial concepts for successful wargames. ‘You win or
you learn.’
But your game’s outcomes must be fair. You cannot simply make someone fail for the sake
of it, even if there is a learning point to be made. The failure must teach players something,
and make them want to do better, see that they are improving and – eventually – have a
good chance of succeeding.
Many professional wargames are deliberately designed to ensure that players fail. If not
carefully managed, this can be dispiriting and cause players to disengage. The techniques
below can ensure continuing player engagement in the face of repeated failure.
· Brief people! With most professionals, this simple step will largely mitigate the
issue. If they understand what is being asked of them, and why, people usually play
along, even in the face of failure.
· The accepted ‘safe-to-fail’ wargaming doctrine[338] is particularly important. A
statement to that effect by the sponsor or senior officer present is critical, so players
know that failure is acceptable.
· Set players a challenge: to do better than [insert appropriate target], even though
failure is likely. Hobby wargames often use a historical benchmark: get off Omaha
taking fewer losses (D-Day at Omaha Beach); or hold more territory than the paras in
Operation Market Garden (Storm over Arnhem). A challenge that usually works well is
to tell players that the best the development team could achieve during playtesting
was ‘x’, and ask them to beat that.
· Switch player sides. If each team plays both sides, see who does least badly as the
failing side.
· Time permitting, play multiple iterations. Players will stay engaged if they can see
themselves improving.
3. Social connections. This speaks for itself in professional wargames. Teamwork is
developed, bonds forged and relations pre-positioned prior to operations within a wargame,
as players organise around voluntary obstacles and take actions that matter. This talks to
the very heart of wargaming: shared story-living experiences, shaped by players’
decisions.
4. Meaning. Happiness stems from being part of something greater than ourselves, belonging
and contributing to something that has lasting meaning beyond our individual lives. This
contributor to happiness is heightened if the thing to which we are giving service is of epic
proportions. Being part of an epic story can generate all four secrets to happiness – and
that is exactly what a wargame can deliver. Rules of Play notes that ‘meaningful play’ relies
on discernibility (feedback) and integration into the game system. ‘The goal of successful
game design is meaningful play.’[339]
A context for action
More awesomeness
Awe is one of the most powerful emotions. Awe involves delight and aesthetic appreciation – but
is transitory. It is something that should feature in wargames; they should, literally, be awesome.
Link that to ‘epic’, and the scale of your wargame’s context and story. Epic scale bestows the
power to act with meaning, to do something that matters in a bigger picture. The story is the
bigger picture, but the player’s actions are what matter.[340] This context for action should be
an epic environment and involve an epic project; in professional wargaming terms, an epic
mission. Everything must be credible, but that does not stop you emphasising the importance of
whatever it is your players are being asked to do. By their very nature, wargames ask that
players suspend disbelief. Introducing a sensible degree of epic awesomeness just requires
imagination tempered by common sense.
The possibility space
Wargames provide a ‘possibility space’, in which:
· Meaning is made.
· Players have the power to make choices within lots of possibilities.
· People can be sent to places you can’t go in real life.
· Players and their experiences are put first.
· Players author their own experiences.
· Events are visceral and compelling, disruptive and strange.
These points recur throughout this book. I make some suggestions, but you must use your
imagination to derive ideas appropriate to your games. For example, ‘making choices’ might
extend to a pre-game planning phase, where players have to select – or determine – some
capabilities to use that will affect the forthcoming game. A player might be given three cyber
operations to use in the game, and have to determine the system to be attacked and the effect
before play starts; or two Special Forces raids will be allowed, but the player has to select one
type of target (airfields, infrastructure, leadership, or whatever). In a futures game, a ‘world-
building’ session that allows players to create, or influence, the context within which they will then
wargame is a particularly powerful technique.
Players should be challenged in a wargame. Raph Koster’s quote at the start of this chapter
reminds us that games are art and ‘We are artists and teachers with a powerful tool.’ He makes
the point that the best art – irrespective of the medium – is challenging. So, too, with wargames;
hence ‘disruptive and strange’. Encourage players to question their own assumptions and world
view, and to compare these with the models presented in the game.
Story-living
Active learning and the power of a constructed narrative
A ‘constructed narrative’ is one that the players create themselves. This is as opposed to a
‘presented narrative’, when recipients are passive consumers of, say, a presentation or a book.
Allowing players to construct the narrative enables experiential learning: the story itself is the
fun of learning. This talks to the Confucian concept of active learning:
I hear, and I forget;
I see, and I remember;
I do, and I understand.
The concept of experiential, active learning – story-living via a constructed narrative – is a
primary reason why wargaming is so powerful. This applies to all wargaming. Player learning is
enhanced and internalised because ‘I do, and I understand.’
The dramatic elements of a story
Your wargame’s story needs to be shaped, both in its construction and its execution, as I
discuss in Chapters 19 – 21 (Scenario Writing, Development and Execution). In advance of that,
note the following, which concern the dramatic elements of a story. They apply to all story-
telling.
· The dramatic arc. This is illustrated in Figure 18-3. Easy to understand, it requires
constant consideration during wargame execution because the arc should emerge from
game play. You cannot pre-script the story because that precludes players learning via a
constructed narrative.
Feedback
Feedback is the ‘discernible’ element of Salen and Zimmerman’s ‘meaningful play’. ‘Discernible
means that the result of the game action is communicated to the player in a perceivable
way.’[342] There are many forms of feedback. For example, think about how feedback is
communicated in video games, then consider how these might be applied to your games. Rather
unimaginatively, most feedback in professional wargames still takes the form of an AAR. For the
military, and most professional organisations such as emergency services, AAR formats are
standardised. They also tend to occur at regular intervals, typically at the end of each period of
activity: after a mission; the end of each day; and then at the end of the exercise. That’s often
necessary, but there are some factors you should consider to make this, and the many other
forms of feedback, as effective as possible.
· Beware breaking the bubble. Anything that takes players out of the magic circle needs to
be carefully considered. Just because it is accepted practice to hold an AAR at the end of
every day doesn’t mean you must do it the usual way. Maybe you can get the person
leading the AAR to role-play a senior commander from the players’ chain of command.
Everything would have to be couched in terms of activities within the bubble – but so what?
Do not presume a standard AAR.
· Use multi-sensory methods to deliver feedback. The previous bullet included role-play.
Why not do this often, having visitors arrive in a player HQ? An in-game brief on the
changed situation, highlighting the role of player decision-making in delivering a favourable
outcome, is good feedback.
· Feedback should take place during game play. Jan Chappuis, an expert on learning, tells
us that feedback is most effective during learning, while there is still time to act on it.[343]
If you hold one AAR at the end of a wargame, it’s too late to act on the points arising.
· Feedback should be frequent and direct. This might be a pat on the back, an event that
goes well due to good decision-making, or whatever.
Players must face the consequences of their own decisions
The most powerful feedback occurs when people are immersed in a well-crafted and protected
magic circle, construct their own narrative and face the consequences of the decisions they
make.
It is as simple as that. Putting it into practice is not quite so simple. The following chapters
suggest some ways you can do this.
Chapter 19. Scenario Writing
Existing settings
There are open source settings you can use, although many are fictionalised.[354] Examples
include:
· The US Army Decisive Action Training Environment (DATE). This provides ‘…a detailed
description of the conditions of five operational environments in the Caucasus region:
specifically, the countries of Ariana, Atropia, Donovia, Gorgas and Limaria.’ This is a
fictionalised setting, approved for public release.[355]
Figure 19-1. US Decisive Action Training Environment
· Cerasia and Sorotan. These are fictionalised NATO settings in the Horn of Africa suitable
for ‘near peer’ warfighting and ‘hybrid’ operations. Fisko is a new setting, under
development.
· Skolkan. This is a fictionalised NATO Article 5 setting that ‘transforms our Partner
Nations of Sweden and Finland into various countries of instability and/or concern. In
addition, North Island New Zealand is added as a new country off the coast of Norway to
add to the level of complexity in the maritime domain.’[356]
Writing a setting and scenario
What follows is a worked example of writing a setting and scenario for an educational wargame.
This was a rare opportunity to write from a blank piece of paper, so it illustrates the full process.
It is followed by a discussion on writing a scenario for an analytical wargame.
Writing a setting and scenario for an educational wargame
In 2014 I had the pleasure of designing, developing and delivering a new computer-
assisted ('Simulation-Supported Exercise’, or SIMEx) training wargame for a Middle Eastern staff
college. I was working with NSC, who delivered a ‘managed service’ (complete wargame
package). My thanks to NSC for permission to use this.
Figure 19-2 shows the steps we devised to write the setting and scenario as we first
whiteboarded them (they are listed in full after Figure 19-2). This was during a week-long
Wargame Design Workshop, so we had already worked through the design steps in Chapter 12
(Wargame Design). We therefore had a full understanding of the requirement and could devise
the setting and scenario to precisely meet the aim and objectives.
Chapter 20. Scenario Development
Introduction
This is a short but critical chapter. Players are too often implored not to ‘fight the white’ when
querying aspects of setting or scenario (in)credibility. Any such instance indicates players
whose ‘magic circle’ has been violated due to a poorly developed setting or scenario. The
development phase must ensure that the setting and scenario are credible, coherent and fit for
the purpose of enabling the wargame. The development mantra of ‘playtest, playtest, playtest!’
applies. Aspects to be tested include: the balance of forces; the anticipated broad progression
of the narrative ‘story-board’; determination of the likely information flows into and out of
Control; the number of people required to support this; and the required level of detail in
paperwork and briefings. Throughout the development of the setting and scenario, it is
important to ‘keep the end in mind’, which is executing the scenario dynamically to enable a
constructed narrative.
Unstructured inject scripting workshops are a waste of time
Significant time, effort and money are spent conducting Main Events List/Master Injects
List (MEL/MIL) ‘scripting workshops’, sometimes involving hundreds of personnel and
contractors making up ‘stuff’ and populating databases with dozens of events and hundreds, if
not thousands, of associated injects. NATO’s Bi-SC 75-3 suggests that, ‘The MEL/MIL includes
the complete set of events, incidents and injections, and constitutes the detailed script of the
exercise play. These should be developed before the exercise. They may also be dynamically
scripted during the exercise. The purpose is to generate responses from, as well as to ‘paint the
picture’ for, the Training Audiences.’[360] (My italics.) Although this statement accepts some
dynamic scripting, the intention is to determine all injects prior to the exercise, with little scope
to adjust these during execution.
For the past few years, I have tried to quantify the percentage of pre-scripted injects used
during the subsequent wargame. The figure ranges from 0 (yes, zero – and I was working with
the Scenario Manager, so that is accurate) to about 20%. Furthermore, many of the injects used
have nil, or a detrimental, effect on the wargame: players are unable to act on them, or – worse
– they make no sense in the current situation. This is usually due to the situation having
changed since the inject was conceived – yet they are still played in because ‘it’s in the script.’
I am not suggesting that MEL/MIL workshops should be eliminated: a premeditated (versus
predetermined) roadmap is usually required, and there is great utility in wargaming this to
visualise the broad sequence of events. But there is a better way to run MEL/MIL scripting than
the current prevalent method: use wargaming, as explained below.
Confirming a scenario is fit for purpose
How, then, do we reach the Nirvana of a setting and scenario that supports the wargame
objectives, yet will be executed dynamically with player decisions paramount? The following
suggestions apply to training and analytical wargames, but should be adapted to fit your
wargame specifics. Many will require a ‘leap of faith’ by the sponsor and Control staff, because
the comfort blanket of a pre-determined wargame script is discarded in favour of a flexible
‘waypoint’ approach to dynamic scenario execution.
The key is the series of playtests detailed in Chapter 13 (Wargame Development). Do what you
can prior to the Internal Playtest to ensure the scenario’s coherence and internal consistency,
including ‘staff checks’ by the sponsor organisation. Then take the sanity-checked setting and
scenario into the series of playtests and MEL/MIL scripting. Focus on the following three
aspects:
· The antagonists’ anticipated plans, to ensure they will support the objectives.
· A broad ‘road map’ or story-board consisting of scenario ‘waypoints.’
· The balance of forces, to confirm this can deliver the anticipated outcomes.
These are all likely to start with fairly wide parameters, which are refined during playtesting. It
helps to achieve an initial narrowing while playtests involve only a few people: to attempt the
refinement from broad to narrow with many participants risks lengthy debate and wasted time.
Validating antagonists’ anticipated plans
Adversary plans are much easier to control in a one-sided wargame, when Red is essentially a
pressure lever that Control can adjust as required. It is far more difficult to credibly control
adversary plans in a free-play, two- or more-sided wargame. Accordingly, during playtesting, you
should test and refine the scenario to deliver a situation that, even when it evolves with minimal
Control influence, meets the wargame’s objectives but remains challenging and finely balanced
throughout execution. I recently witnessed an analytical wargame in which the Red plan (a flank
guard action) dovetailed neatly into the Blue plan, allowing both sides to achieve their given
objectives without fighting. Red advanced to a certain point, just short of the main Blue
positions – and stopped. Both sides declared that their plan had succeeded. Unfortunately, the
experiment demanded that the Blue defensive capability be tested – something the Red player
had no intention of doing. The Game Controller had to step in and – somewhat incredibly – ask
the Red player to launch attacks. This should have been identified and rectified during
playtesting.
Story-board ‘waypoints’
The waypoints might be the wargame’s objectives (e.g. practise amphibious operations) and/or
events or themes that the sponsor requires (e.g. suffer multiple and increasing set-backs when
operating within a humanitarian crisis). The crucial point is that, while the Game Controller is
responsible for steering the game to reach these waypoints, the route between them will be
determined by player decisions.
While not required at all playtests, the Game Controller must be involved at some point, because
it is they who ultimately steer the wargame to reach the waypoints. The earlier the involvement,
the more confident the Game Controller will be that waypoints can be reached. Refinements to
the scenario might be required, e.g. a credible reason why one side cannot immediately destroy
the other’s air force in the 1973-like example I cited previously. Once the Game Controller is
comfortable with the waypoints along the broad story-board, these should then be played as
part of scenario testing during subsequent playtests. The sponsor, Game Director and Game
Controller should come away from playtesting confident that the waypoints can be achieved –
but they must also be comfortable with the fact that the exact route between them will not be
determined until the players derive and execute their actual plans.
Confirming the balance of forces
The balance of forces must be correctly assessed during playtesting. The process can start
during the design phase (per the Middle Eastern staff college example), but must feature from
the very first playtest.
An example of the process in action
We were able to follow this process for a divisional-level Command Post Exercise (CPX) in 2015.
Having worked together the previous year, the Game Controller was confident that we could
deliver the required training objectives using the ‘waypoint’ approach, allowing Red freedom of
manoeuvre and with well-balanced capabilities on both sides. Following internal playtests, we
conducted a half-day wargame with the Game Controller and Red Cell to investigate the broad
Red options and confirm the balance of forces. Following that, the Test Exercise concentrated
on playing through the anticipated Red and Blue plans to see if the waypoints were achievable.
That teed up a much-reduced period of inject writing, with the scripters having just played
through the scenario waypoints. We achieved such a fine balance of forces that, during the
actual exercise, the divisional Chief of Staff took the Game Controller to one side saying that the
Training Audience were seriously concerned that the division would be overwhelmed, and were
frantically planning contingencies; could the Game Controller not tone down the threat?
Confident from playtesting, the Game Controller stood his ground and the ultimate outcome,
while uncertain throughout, delivered a favourable result for Blue. The Training Audience
emerged exhausted but elated, and with the learning outcomes firmly internalised.
MEL/MIL scripting workshops using wargaming
MEL/MIL scripting might not be required, particularly for analytical wargames. For training
events, certainly large-scale ones, I would recommend it – but structured around wargaming.
Attendees are too often briefed just the top-level scenario events, and then asked to make up
detailed injects to support these. The result is a large bucket of injects that are either generic
(‘fuel contamination downs the F-16 fleet’) or fixed (‘it’s Tuesday so must be the outbreak of
disease at the refugee camp’). It is from this bucket, referred to earlier, that the 0-20% of
injects actually used are drawn.
An example of an effective scripting week
In 2018, I ran a scripting week in preparation for a divisional CPX. We based this around simple
manual wargaming, which delivered a visualisation of the broad story-board and a structured
framework for discussion. The top-level events were actions on various geographical objectives.
These were contained in higher command’s orders and correlated closely to the training
objectives, so were a given. We wargamed the possible routes from event to event to elicit
incidents that were likely to occur between, and during, them. One event was an airborne
assault. The incidents we derived included: reactions from in-situ IOs and NGOs dealing with
Internally Displaced Persons (IDPs); the actions of local armed non-state actors and organised
crime should any power vacuum emerge; the degree of control required over actors and
factions; detailed Red play that would demand player decisions in response to well-balanced
kinetic outcomes (Operational Analysis was used to examine the expected outcomes); the level
of support from in-place host-nation forces to enable the operation; IDP movements as a result
of the operation; and regional and international reactions.
Having premeditated the broad course of action, scripters created inject serials with a clear idea
of the type of incidents likely to occur during execution. Everything created remained
provisional: it could not be a prediction because we could not know the players’ decisions until
execution. Incidents and injects would be confirmed and developed during dynamic execution –
ideally by the same people who had attended the scripting week – in response to player
decisions and Control guidance. Incidents and injects included kinetic events and the non-
kinetic ‘wrap around’ so crucial in providing an immersive context that, as far as it can, reflects
the complexity of the real world. Where an incident was complicated, or involved multiple cells’
injects, we divided it into vignettes for ease of conceptualisation and coordination. In between
bouts of wargaming, we paused to allow people to populate Exonaut (the computerised MEL/MIL
tool) while what they had just experienced was fresh in their minds.
The outcomes from the scripting week were:
· An experiential visualisation of the likely, but provisional, execution story-board, to be
refined during execution into a full MEL/MIL adaptive to player decisions.
· Population into Exonaut of definite events, likely incidents and vignettes, and some
plausible injects.
· Familiarisation with Exonaut.
· A backbrief of the anticipated key serials and scenario ‘hooks’ for potential development.
· Confirmed balance of forces, which would deliver enough play to last execution.
· Identification of Control requirements e.g. personnel, information flows etc.
· Crucially, a Game Controller confident of executing the scenario dynamically and flexibly
– but with a light touch – to achieve the objectives.
Chapter 21. Scenario Execution
'What unites people? Armies? Gold? Flags? Stories? There's nothing in the world more powerful
than a good story. Nothing can stop it. No enemy can defeat it.'
Tyrion Lannister, final episode of Game of Thrones
Introduction
Scenario development, with sufficient playtesting, has brought you to Startex with a coherent
product; broad scenario waypoints; a good understanding of the antagonists’ plans; balanced
forces; and, crucially, a Game Controller confident that all this is fit for purpose. Now the
scenario must be executed dynamically, balancing the tension between player decisions and
objectives. Much of that control is discussed in the next chapter (Controlling Wargames), but
there are key aspects specific to executing the scenario.
Your task is that of a storyteller or storyshaper, but using a light touch
I find it useful to consider myself a storyteller if I am directly relating events or outcomes,
explaining these, as far as possible, as if they were a story. More often, however, my role is that
of a storyshaper: the player actions and conversations tell the story, and my task is to shape
this and give colour to the grey areas around it. The two different styles are respectively best
suited to a small wargame, when you fulfil most or all of the control functions, and a large game
where you are part of a Control organisation. Whichever style is appropriate, they are points on
the same spectrum.
Ideally, control (verb) and Control (noun) remain invisible. Clearly, if player decisions are likely to
result in a failure to achieve the objectives, you will need to intervene to keep the game on
course. But a well-designed and thoroughly playtested wargame should, with a gentle hand on
the tiller during execution, be capable of withstanding most player decisions. The following are
some suggestions that might help in this regard.
Ensure that the Startex situation tees up the initial waypoints
The situation at Startex should point the players towards the first one or more waypoints.
Unless working at the highest levels – for example, in a political-military game – there will almost
always be direction given to the players from a ‘virtual’ superior. Combined with the starting
‘laydown’ positions, this should enable you to create the required situation. When we re-fight
the Falklands War with the Royal Marines Advanced Amphibious Warfare Course, we use the real
initial mission as at 17 April 1982 (‘Land on the Falklands with a view to re-capturing them’), but
we know that we will shortly inject further guidance: ‘Port Stanley must be captured’; then
worsening weather imposes a deadline of 5 June by which to achieve this. The initial mission
ensures we hit the waypoints of conducting the investment of the Falklands and executing an
amphibious landing; the subsequent guidance results in a fight to Stanley under time pressure.
How the students achieve this is up to them.
…but, where possible, be flexible over waypoints and objectives
Heretical as it might sound, if an evolving situation is delivering valuable wargame outcomes, it
is sometimes better to concentrate on that rather than dogmatically move on to the next
waypoint. This does not contradict the previous discussion. Clearly, flexing the objectives might
not be appropriate, and must be discussed with the Game Director and sponsor. However, if
learning and/or insights arising from Waypoint A are rich, and it is apparent that more can be
gleaned, at least consider concentrating on that objective (or even accepting that a new one has
arisen) and marginalising Waypoint B.
Look ahead
Constantly think ahead to player decision points and the next waypoint to gauge the options
open to evolve the scenario. If any of these might take the game down a route the Game
Controller does not want, you will need to pre-empt decisions that lead that way. The further
ahead you can think, the better your chances of doing this with a light touch. The intervention
might be by way of an inject, direction from a superior commander or adversary activity.
Whatever it is, make it credible and intervene in good time, so that it is a gentle nudge, not a
crude shove.
Avoid clumsy interventions
Resist Control interventions that come out of the blue or are ‘acts of God’ such as ‘There’s been
a landslide that blocks all roads’. Players immediately spot these as hasty corrections. I talk
about the ART of injects later. For now, simply register that injects must be credible and usually
develop over game time.
Minimise the moderation of outcomes
Try to resist moderating the results of combat and other interactions. If outcomes, and the
factors that lead to them, are transparent, it is usually better to run with whatever
stochastic result occurs, even if it is towards one end of the outcome distribution curve.
Understanding a result makes it more likely that players will accept it, even when an adverse
outcome arises.
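This discipline is easier to maintain when the adjudication mechanism itself is laid out in the open. Purely as an illustrative sketch – the results table, thresholds and modifiers below are invented, not taken from any published model – a simple combat results table driven by a visible 2d6 roll lets Control walk players through exactly how an extreme result arose:
```python
import random

# Purely illustrative combat results table (invented for this sketch):
# minimum modified 2d6 total -> outcome, listed from best-for-attacker downwards.
CRT = [
    (11, "defender routed"),
    (8, "defender falls back"),
    (6, "attack stalls"),
    (float("-inf"), "attacker repulsed"),
]

def adjudicate(attack_strength, defence_strength, situational_modifier=0, rng=random):
    """Generate one engagement outcome and return every factor that produced it.

    Handing back the roll, the odds shift and the modifiers alongside the result
    lets Control show players exactly why an extreme outcome occurred, rather
    than quietly moderating it.
    """
    odds_shift = round((attack_strength - defence_strength) / max(defence_strength, 1))
    roll = rng.randint(1, 6) + rng.randint(1, 6)   # transparent 2d6 roll
    total = roll + odds_shift + situational_modifier
    outcome = next(text for threshold, text in CRT if total >= threshold)
    factors = {"roll": roll, "odds_shift": odds_shift,
               "modifier": situational_modifier, "total": total}
    return factors, outcome

if __name__ == "__main__":
    factors, outcome = adjudicate(attack_strength=9, defence_strength=4, situational_modifier=1)
    print(factors, "->", outcome)
```
Because the roll, the odds shift and the modifier are returned with the result, a one-in-thirty-six outlier can be explained rather than quietly adjusted.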
Dramatic considerations
As discussed in Chapter 18, consideration during execution of dramatic factors will help ensure
player engagement. These were: The Invitation to Play; tension and conflict; uncertain
outcomes; and the dramatic arc. I re-state them because they will not be the foremost
considerations for the Game Controller, so you should champion them as appropriate during
execution.
Techniques for managing scenario execution
We are now morphing into pure execution and there is considerable cross-over with the next
chapter, but I want to concentrate on executing the scenario for a while yet. There are several
processes and techniques that can be used during execution that help Control – and the Game
Controller in particular – manage the evolving scenario to ensure it reaches the various
waypoints.
Conduct a ‘Turn 0’
I discussed the Turn 0 in Chapter 14. Allowing players to play through a complete turn, then
resetting the game clock back to Startex, is an excellent way to ensure your wargame gets off to
a good start, heading towards the first waypoint.
Hold ‘Next-Day Wargames’
A ‘Next-Day Wargame’ is a Control game conducted prior to the period to be executed for real,
typically the day before. It is a game within a game. Its purpose is for Control to anticipate,
premeditate and plan the next day’s activities. This is a key activity, and the process is detailed
in the next chapter. It is predicated on players’ plans, and Control’s shaping activities should
conform to these. Outcomes pertaining to scenario execution are:
· The understanding by all Control staff of the likely outcomes when the events being
wargamed are executed for real the following day.
· A list of injects to be introduced dynamically the next day. These might range from a
necessary adjustment to steer the wargame towards a waypoint/objective, to developing
the non-kinetic ‘wrap-around’, to adding richness and colour to the scenario.
· Confirmation of media products required for the next day, with sufficient time to produce
these.
· Production of a ‘flanks’ or ‘higher command’ situation. This might just add depth to the
scenario, but could elicit potential direction and guidance to the players. Whatever the
outcome, the risk of players feeling that they are operating in a vacuum is mitigated.
· Advance warning of issues arising. These might range from a developing imbalance of
forces to a major decision point that might skew the remainder of the event.
Premeditation of such issues enables measured Control action.
· Parameters within which Lower Controllers and computer operators are to work when
executing the game using real-time simulations.
Control levers and mechanisms
There are several levers and mechanisms that the Game Controller might use to gently steer a
wargame. In a perfect world, these remain unused, and the players’ decisions will naturally
navigate the spaces between waypoints – but that is an unlikely ideal.
· Injects. See the next section for more detail.
· Higher commander’s direction. Considered in advance, and delivered to a credible and
sensible timeline, this can provide absolute direction to participants. Direction should not
come as a surprise to players; Warning Orders and drip-fed information should precede it.
Use such direction sparingly if it does not arise naturally in response to game play.
· Adversary activity. In a one-sided wargame, adversary cells are a Control lever. Red
(Orange, Black etc) is not there to win in a one-sided game; it is there to ensure that
objectives are met. This is the flip-side of higher commander’s direction, so the same
factors apply: early consideration and credibility. You must also take care not to surprise
players. There is a fine balance between an adversary cell achieving surprise in the
doctrinal sense, and players assuming that ‘Control (Red) is making it up’ when the
adversary does something unexpected.
· Bring players into the process. If you ask a player to ‘step into Control’ or accept privy
information, you take them out of the magic circle, and their suspension of disbelief will be broken.
However, it is sometimes justified; the forthcoming time jump discussion provides one
example.
Managing injects
Terminology
· Main Events List/Master Incidents List (MEL/MIL). ‘The MEL/MIL, the main tool (normally
a database) for the Excon to control the exercise, is maintained by Excon and is structured
on the main events developed to support achievement of the exercise objectives. Each
main event will have one or more incidents that are presented to the Training Audience by
way of injections.’[361]
· Master Scenario Events List (MSEL). The MSEL is the US equivalent of the MEL/MIL.
· Event. An event is ‘an inserted major occurrence or a sequence of related incidents.’[362]
It is the highest level of the hierarchy explained below. Events should equate to, or
directly support, the primary wargame objectives, waypoints and themes.
· Incident. ‘An incident is an element or subset of an exercise event.’[363] Several
incidents combine to create the higher-level event.
· Vignette. Per Chapter 19, a vignette is a discrete action, or series of connected actions,
confined to a very specific and limited situation. NATO does not use the term, but it is
common parlance and useful when deconstructing an incident, which is often necessary.
· Inject (NATO ‘injection’). An inject is ‘the way of bringing an incident to the attention of
the players for whom it was created (and from whom a reaction is expected). The intent is
to simulate the likely source of such information in a real situation/operation.’[364] (My
italics.) Event, incident and vignette are umbrella terms; an inject is a serial delivered as if
by an actual entity in the wargame, e.g. a report.
A hierarchy of injects
A standard hierarchy used to develop and manage injects is shown at Figure 21-1.
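Where the MEL/MIL is held in a database or spreadsheet, the hierarchy maps naturally onto nested records. The sketch below is only one possible representation – the field names are assumptions for illustration, not the schema of Exonaut or any NATO tool – but it shows how every inject can be traced back through its vignette and incident to the event, and therefore to the objective or waypoint, that it supports:
```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Inject:
    """A single serial delivered to players as if from a real source (e.g. a report)."""
    serial: str
    source: str      # the simulated originator, e.g. a Sim Press report or J2 INTSUM
    audience: str    # the cell or HQ expected to react
    text: str

@dataclass
class Vignette:
    """A discrete action, or connected series of actions, within an incident."""
    title: str
    injects: List[Inject] = field(default_factory=list)

@dataclass
class Incident:
    """An element or subset of an event; several incidents build one event."""
    title: str
    vignettes: List[Vignette] = field(default_factory=list)

@dataclass
class Event:
    """Top level: should equate to, or directly support, an objective or waypoint."""
    title: str
    objective: str
    incidents: List[Incident] = field(default_factory=list)

# Illustrative use only (serials and text are invented), loosely echoing the
# airborne assault example from the scripting week described earlier.
airborne = Event(title="Airborne assault", objective="Practise joint forcible entry")
idp_move = Incident(title="IDP movement triggered by the assault")
idp_move.vignettes.append(Vignette(
    title="Aid convoy blocked near the drop zone",
    injects=[Inject(serial="S-014", source="Sim Press report",
                    audience="Blue JTF HQ",
                    text="NGO convoy halted; escort requested.")],
))
airborne.incidents.append(idp_move)
```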
I explained the potential confusion concerning the use of ‘White Cell’ as a Control label in
Chapter 3 (Wargaming Misnomers and Misunderstandings). The NATO definition of White Cell
includes ‘local police’, who I classify as ‘Green’ (host nation), and ‘local civilians’, who I call
‘Brown’ (local populations). The key point is that you must define and communicate what your
cell colours mean throughout the entire Wargame Lifecycle.
Cell leads
Control can consist of many cells beyond the ‘colours’ described above: media, senior mentors
etc. Cells often consist of disparate people from diverse organisations. While cell numbers and
composition will differ from wargame to wargame, one suggestion common to all is to nominate
and work through a designated cell lead. That person is responsible for coordinating the cell’s
effort, usually reporting directly to the Game Controller, sometimes via Chief Control. Without a
lead, cells can easily fragment and produce incoherent products. Careful consideration must be
given to the selection of cell leads (criteria can include rank, expertise or personality, or players
might select their lead – there is no right or wrong answer) and the processes by which they
interact with the facilitator and Game Controller.
Cell liaison staff
It is often worth assigning one or more liaison officers from the wargaming delivery team to cells and
wargame support staff. White Cells, analysts and Sim Press are prime examples: they usually
consist of non-military folk, and assigning someone to translate ‘mil-speak’ avoids
misunderstanding and improves the effectiveness of such cells’ products.
Information flows
The range of possible information and communications media precludes an explanation of every
possibility. However, two key areas that are often poorly executed are Requests for
Information (RFIs) and Information Management (IM).
Requests for Information
RFIs are requests, generally from players, to someone, or a cell, within Control for information.
That statement has hidden complexities, and I have seen wargames falter due to poor RFI
systems and management. Four observations:
· If you think there will be more than occasional RFIs, consider using a dedicated RFI
Manager, or even organisation. The RFI Manager tracks RFIs and can ‘name and shame’
the owners of RFIs that remain unanswered if you think that appropriate.
· A standard e-mail system is unlikely to suffice if there will be more than infrequent RFIs.
A bespoke tool is likely required that will allow RFIs to be: tracked; replied to, forwarded
and shared; accessed and answered collectively; and sorted into different cells or
headquarters with appropriate access permissions (a minimal data-model sketch follows this list).
· Differentiate the RFI system from the – simulated or real – command and control
system(s). Orders etc should not be passed using the RFI system.
· Test the RFI system rigorously during playtesting.
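By way of illustration of the first two bullets, a minimal data model for such a tool might look like the sketch below. The fields and routing behaviour are assumptions invented for this example, not a description of any fielded RFI system, but they cover tracking, forwarding, collective visibility and the ‘name and shame’ list:
```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Optional

@dataclass
class RFI:
    """One Request for Information, tracked from submission to answer."""
    rfi_id: int
    originator: str        # player cell raising the request
    assigned_cell: str     # Control cell currently owning it
    question: str
    visible_to: List[str]  # cells/HQs permitted to see this RFI
    raised_at: datetime = field(default_factory=datetime.now)
    answer: Optional[str] = None
    answered_at: Optional[datetime] = None

class RFILog:
    """Minimal register an RFI Manager could keep (or mirror in a spreadsheet)."""

    def __init__(self):
        self._rfis: List[RFI] = []

    def raise_rfi(self, originator, cell, question, visible_to) -> RFI:
        rfi = RFI(len(self._rfis) + 1, originator, cell, question, list(visible_to))
        self._rfis.append(rfi)
        return rfi

    def forward(self, rfi: RFI, new_cell: str) -> None:
        rfi.assigned_cell = new_cell   # re-route the question to another Control cell

    def answer(self, rfi: RFI, text: str) -> None:
        rfi.answer, rfi.answered_at = text, datetime.now()

    def outstanding(self, cell: Optional[str] = None) -> List[RFI]:
        """Unanswered RFIs, optionally filtered by owning cell (the 'name and shame' list)."""
        return [r for r in self._rfis
                if r.answer is None and (cell is None or r.assigned_cell == cell)]

    def visible_rfis(self, cell: str) -> List[RFI]:
        """All RFIs a given cell or HQ is permitted to see."""
        return [r for r in self._rfis if cell in r.visible_to or r.originator == cell]
```
The outstanding() view is what allows an RFI Manager to chase unanswered requests cell by cell, while the visibility list keeps access permissions explicit.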
Information Management
IM differs from RFI management and specific data capture in that it encompasses all sources,
types and repositories of information. All but the smallest military headquarters have an IM
Manager. Unless your wargame is very small, you should consider having one, probably with a
specific IM sub-process. If I haven’t been able to do so during development, one of the first
things I do when arriving to run a wargame is check the filing system – often to find that there
isn’t one, or not one configured for the wargame. Remind (enforce) players and Control about IM
as soon as possible after they arrive.
IM is integral to the Data Collection and Management Plan (DCMP) process. Specific IM
processes will flow from this consideration, for example physical and colour-coded player
message slips with ‘To’, ‘From’ and ‘Time’ fields might be required to capture player intentions
and decisions.
Player interfaces
Figure 22-1 shows a single box labelled ‘Control/Player Interfaces.’ This box hides significant
complexity: interfaces can range from incredibly simple to exceedingly complex, in terms of
both tools and processes.
Shared situational awareness, Common Operations Pictures and Recognised Pictures
There is a fundamental requirement to share evolving situational updates. These need not
necessarily be distributed, but usually are in larger professional wargames. That distinction – to
distribute or not – is crucial. Many small wargame variants are played around a single table or
board, upon which all information is displayed. In these cases, the map, or board, and
associated components deliver shared situational awareness (SSA).
There are many factors that might preclude a single table approach: geographically distributed
participants; the number of players; the necessity for a closed game using several tables; and
the need to show an ‘intelligence’ picture. In these circumstances a method must be found to
distribute the situation and/or filter information. Terms such as a Common Operations
Picture (COP) and Recognised Picture (RP – Air, Maritime, or Land) are used. Numerous options
exist to record and distribute SSA. Per my usual caution against technology- and innovation-
bias, you should ensure that appropriate tools are selected. I have listed some of the available
options below, generally progressing from simple to complicated. Note that even a ‘simple’ SSA
solution still requires effort to populate and manage. There is often a ‘credibility factor’ that
precludes low-tech solutions. That is a matter of a frank discussion with the sponsor to
determine how much resource they are prepared to devote to ensuring the wargame is
presentationally acceptable.
Methods of sharing situational awareness
Single table.[378] The simplest solution is to use a single birdtable and a manual, time-stepped
simulation. The number of players is realistically limited to those who can see the information
on the table. Furthermore, such games must be open or semi-open because it is too clunky to
run all sides in a closed wargame on one table; this is better done using multiple tables. The
situations arising on a single table, and the associated conversations, can be recorded
relatively easily, but any requirement to distribute SSA (for example to player cells for separate
study) requires some sort of electronic solution.
Multiple tables. Control (possibly a single umpire) manages a central table which is a 100%
accurate ‘Truth view.’ Reports are delivered from there to players’ tables, which are screened
off or in separate rooms. Players maintain their own tables, using reports received and their
own assessment. Players’ orders are passed back to Control, which adjudicates outcomes.
Control reports and player orders can be generated and passed manually, but could be
automated. Recording situations requires an electronic solution.
Stills photography. Stills photography can capture situations as they arise and/or at the end of
turns. Some sort of ‘clapper board’ (think of shooting film scenes) placed on the table is
required to ensure each photograph is date-time stamped. An example is in Figure 23-2.
Motion-picture photography. Video cameras can be used to project the situation onto a screen
in real time. These also enable stills photography. See Figure 22-2.
Figure 22-2. Motion-picture cameras project the birdtable situation
Scribes. Scribes need to be trained, unbiased, and have access to players during and after
game play. Multiple scribes might be required to capture all necessary conversations. For
example, separate scribes will likely be needed on every table in a multi-table wargame to record
player conversations and adjudication.
Voice recorders. Although presumed to assist note-taking, voice recorders can capture an
overwhelming quantity of material that then remains unused. And that is assuming the equipment is sensitive
enough to capture everything said. Passing a microphone to each speaker is awkward. While a
scribe or analyst might use voice data capture to confirm an occasional key statement, I would
caution against relying solely on this for recording purposes.
PowerPoint. It is simple to produce slides that capture the situation on a birdtable(s). Graphics,
such as schematics, are easily overlaid. However: this might be deemed presentationally
unworthy; and it can take considerable time to populate and annotate slides, often requiring one
or more people full-time.
Player orders and ‘Intent’ slides. One technique used by Dstl to record and distribute player
decisions and plans is the ‘Intent’ slide. An example is shown below in Figure 22-3. A second
slide generally contains a schematic, or map, to illustrate the plan or scheme of manoeuvre.
Intent slides are populated prior to each wargame turn, or phase, then refined as required to
reflect amendments and capture insights arising. The US Naval War College’s War Gamers’
Handbook suggests a similar concept of ‘Move Sheets’ to capture decisions and their rationale.[381]
Figure 22-4. NSC’s iNet.
Control processes
Much of the discussion in this and the last chapter pertains to methods, models and
tools (MMT). Irrespective of which of these you select to support your game, a robust and
effective execution process is essential for delivering a successful wargame. It is impossible to
cover all the Control processes that could arise across the gamut of game formats and variants,
but I cover situations that commonly arise. Extrapolate from these to ensure they are
appropriate for your wargame.
Control/Excon ‘battle rhythm’
It is hard to conceive of a professional wargame without some sort of higher-level controlling
function. Potentially, a small group of officers sat in a mess or wardroom playing a hobby game
of their own volition might not need one. But even that simplest of examples would benefit from
some sort of AAR to consolidate learning. In which case, time needs to be allocated at the end
of the wargame and, ideally, someone might be given the role of recorder. Why not call that
person ‘Chief AAR’, and ask them to also facilitate the AAR? That person is now part of a
Control team. They will need time allocated in the process, or ‘battle rhythm’, to ensure events,
communication and insights are captured. Examples of battle rhythms from various events,
both analytical and training, are at Figures 22-5, 22-6 and 22-7.
Figure 22-5 shows a relatively straightforward daily timetable. As well as the wargame turns, the
key serials are: prioritisation of emerging observations, insights and lessons identified;
presentation of new capabilities, and consideration of how these might have influenced the
previous wargame turn; an analyst consolidation period; and the key Excon Coordination
Meeting that determines the next day’s programme. This is a relatively simple example, although
there are complexities in each serial.
Figure 23-2: An example date/time ‘clapper board’ and labelled player positions
Final tidy up and equipment check
Just prior to starting, tidy up the play space and check that all equipment is available and to
hand. Once stood at the table with play started, you are fixed in position until the next break.
Final mental rehearsal
Mentally visualise and rehearse what is about to happen just before you start. Do this alone if
necessary, but a sotto voce run-through with a trusted member of the delivery team is ideal. This
is simply a ‘walk-through-talk-through’ of the first turn’s process in order to have this fresh in
your mind.
Other considerations
Dice, and other random number generators
I discuss dice and other random number generators (RNGs) in the next chapter. The key points
to note here are that dice and RNGs add value; but there is a misplaced and lingering perception
among the inexperienced that dice connote childish games, so be cautious when introducing
them.
Fatigue and breaks
Wargames should be fun, but they are also hard work. Look out for tell-tale signs and body
language that indicate the onset of fatigue. Facilitation itself is relentless and requires
significant mental effort. Plan sessions of manageable duration and, if necessary, take
impromptu breaks. As a rule of thumb, if a turn takes, say, 60 minutes, then two turns in the
morning and two in the afternoon is a comfortable number. Breaks are not dead time, and
enable:
· Refreshment.
· Players to plan, think and reflect.
· Players to record the reasons for their decisions.
· The scribes to check logs, and to confirm details with individuals.
· The facilitator to think through the forthcoming turns.
· Operational analysts and simulation experts to consider forthcoming event outcomes.
Ensure final recording
By the end of the day, everyone will be tired. You must ensure that the Game Controller and
scribe(s) are content that the necessary wargame outputs have been captured. Do not allow
people to leave until the immediate recording process is complete. The scribes’ logs, photo
records and so forth must be in a fit state at the end of each day, even if they are to be fully
written-up later. The detail will be forgotten unless it is clearly captured immediately.
One team approach
Participants in analytical events usually comprise one team, despite players filling diverse and
adversarial roles. Where appropriate, encourage cross-cell cooperation. For example, an air
SME designated ‘Blue Air’ should assist the Red Cell if the latter does not have an appropriate
SME, or Blue and Red might consider their next actions collaboratively if this is appropriate to
the event objectives.
Neatness
The game space should remain uncluttered, so tidy up constantly. This ensures efficient game
play and conveys a professional impression. Remain cognisant of the balance between having
enough counters, informational aids etc necessary to convey information, and presenting a
cluttered play space that detracts from SSA. This does not just apply to military units: human
terrain, humanitarian agencies, militia groups, police forces etc can easily overwhelm a map.
Appendix 1 – Wargame facilitator’s aide-memoire
Planning
· The facilitator, Game Controller and sponsors must work together from the outset.
· Turns. Keep the event aim and objectives firmly in mind.
o Are all the objectives captured within the proposed turns?
o Are any over-crowded?
o Do some need breaking down?
o Are there too many turns in any one day?
o Is the turn sequence coherent?
o Is it possible to include a test turn?
· The delivery team. Have all roles on the delivery team been identified and filled?
· Are all briefs ready?
· Players. Are there enough players to represent the required actors and factions? Are
they the right people?
· Recording. Has scribing been fully considered?
· Detractors. Has a brief been prepared?
· Testex. Essential.
Final Preparation
· Shared situational awareness. Players will only play what is on the birdtable.
o Maps.
o Counters.
o Tracks.
o Ergonomics.
o Labels.
o IT for distributing SSA.
· Equipment:
o Aide-memoire, which should contain:
§ The overall process.
§ Turn guide.
§ Look-up tables.
§ Rules extracts as required.
§ Key references (such as the answers to detractor comments).
§ Orders of Battle.
§ (In an analytical game) the research question and key extracts from the DCMP.
§ (In a training game) the learning objectives.
o Pointers (manual and laser).
o A plasticised ‘turn board’ showing the current turn, date, timeframe etc.
o Informational counters, cards etc as required for placement onto the map.
o Spare counters, some blank.
o RNG: dice, mobile phone application, laptop or whatever is appropriate.
o Notebook and pens, including non-permanent markers.
o The RCAT ‘4-box’ plasticised board or similar matrix game-style board.
o Flip-charts and white boards.
o Coffee.
· Final tidy-up and equipment check.
· Final mental rehearsal.
Execution
· Keep the aim and objective at the forefront of your mind.
· Read the room.
· Make a list of everyone’s name.
· Know the overall process and game turn processes, but use an aide-memoire as well.
· Confirm each turn’s timeframe.
· Think one serial ahead.
· Think one turn ahead.
· Manage the turn flexibly. Split the turn or change the player turn order?
· Manage debate:
o Manage time ruthlessly.
o Avoid irrelevant discussion.
o Relegate relevant but non-urgent topics to sidebars.
o Empower all participants.
o Avoid simultaneous discussion.
o Avoid private discussions.
o Ensure everyone speaks up.
o Ensure cross-communication.
· Assist the scribe.
· Assist the analyst.
· Refer to the Game Controller constantly.
· Ensure a story-living experience.
· Use dice and RNG, but carefully.
· Use 'plug-in' outcome generators as required.
· Watch out for fatigue and take breaks as necessary.
· Ensure final recording before people leave.
· One team; help each other.
· Neatness:
o De-clutter the birdtable.
o Tidy up constantly.
Chapter 24. Generating Outcomes
‘All models are wrong, but some are useful. The practical question is how wrong they have to be to not be useful?’ [388]
George Box, professor of statistics (1987)
Introduction
This chapter is not about designing models and simulations; for that, read Professor Phil
Sabin’s Simulating War. What I do here is suggest how you might use models and simulations
to ensure they best support your wargame.
The next two chapters augment the adjudication discussion in Chapter 4, and explain the
methods, models and tools (MMT) shown in Figure 4-1. Adjudication was defined as ‘the process of determining outcomes, usually by an objective human-in-the-loop.’ [389] In this
chapter I focus on generating the outcomes from players’ decisions; in the next on presenting
those outcomes to controllers and players, and affirming these before they are played into the
game. I briefly cover the adjudication MMT in Figure 4-1, but concentrate on warfare models,
which require the fullest explanation.
As George Box alludes to, anyone who claims that their model accurately predicts reality is, at
best, misguided and, at worst, a charlatan. This is especially the case with models that claim to
simulate the human characteristics of warfare. That is why my preferred method of adjudication
is the deliberative approach: the results presented will be wrong, but are useful as a start point
for expert judgement that leads to an outcome that, even if not consensual, will have been
subjected to a forced discussion. Your task is to understand the strengths and weaknesses of
the models and simulations you use; to leverage their strong points and mitigate their
limitations.
Two people are better than one
This point applies to generating, presenting and affirming outcomes, although it is less
important in the last two cases. Irrespective of the simulation used, I find it tremendously useful
to work as a pair, because this:
· Leverages the combination of simulation and military expertise. The most effective
double-acts are those that combine a simulation expert or operational analyst with
someone military. Clearly, you can delete military and insert the relevant organisation:
humanitarian, business, emergency services, aid worker etc.
· Spreads the workload, reducing the scope to make errors.
· Reduces the risk of forgetting factors that play into outcome determination.
· Provides a sanity check as a minimum and, ideally, expert confirmation, to make sure the
outcome is within acceptable parameters.
· Prevents lengthy outcome generation by committee. Two is a good number; more will
unduly prolong the process.
· Improves recall and recording. Adjudicators working in pairs must explain their reasoning
to each other to reach agreement. This forcing function of required discussion improves
recall for data capture.
Manual adjudication methods, models and tools
Human-in-the-loop (HITL) adjudication and moderation
HITL adjudication can entail the human being the loop, with outcomes determined by one person
or a small group. These can be based solely on an individual’s, or group’s, judgement, or take a
suggested outcome from another simulation method and moderate it. The following apply:
· The outcome arrived at must balance achievement of the game’s objectives with the
primacy of player decisions, while remaining credible.
· If more than one person, keep adjudication groups small to minimise what can become
prolonged discussions.
· The Game Controller must be involved, and is the final arbiter. See the next chapter for
more details.
· Adjudication discussions must be captured by scribes.
Role-play
As discussed, role-play can be considered a real-time simulation. Role-play is a perfectly valid
method of determining outcomes in any wargame. However, unless the entire construct of the
game is role-play, I find the technique most effective when used as an adjunct to other, more
easily controlled, forms of adjudication. Research demonstrates that role-play is at least as
effective as experts’ forecasts. Kesten Green’s findings are that, 'Contrary to expectations,
earlier research found game theorists' forecasts were less accurate than forecasts from student
role-players.... Current evidence suggests that decision makers would be wise to prefer
[390]
forecasts from simulated interaction [role-play].'
To get the best from role-play, ensure that people are actually in role and invested in the
decisions and outcomes, not merely acting a role. Tom Schelling said, ‘Everybody was expected
to play the role that had been assigned to him, and if you were assigned the role of Foreign
Minister of the Soviet Union, you displayed your talent by speaking the way you thought a
Foreign Minister of the Soviet Union would participate… What I wanted was for the participants
to be truly, personally engaged in the decision making, not to play a role, but to be a role and
engage in the discussion [and]… the argument of, “where do we stand and what do we do next and how do we read what the adversary is doing” ’. [391] Role-players should make what they think is the best decision in the circumstances, not the one they think whoever they are role-playing would make. [392]
To achieve this, you need to ensure role-players are sufficiently briefed to step into role, with
enough Control guidance – explicit or implicit – and objectives to at least set the role-play off in
the right direction. Once the simulated interaction has started, it is difficult to credibly intervene,
per Chapter 22.
Argument-based adjudication
Argument-based adjudication is most closely associated with, but pre-dates and is certainly not
exclusive to, matrix gaming. Being quick, simple and effective, it is easily adapted to many
situations. Tom Mouat explains a simple application of the technique as follows.
‘Actions are resolved by a structured sequence of logical arguments. Each player takes turns to
make an argument, with successful arguments advancing the game, and the player's position. In
the “pros and cons system”, each argument is broken down into:
· The active players state: something that happens; and a number of reasons why it might
happen (pros).
· The other players then state a number of reasons why it might not happen (if they can think of any) (cons).’ [393]
These arguments can easily be turned into an outcome determined by a facilitator/adjudicator or
Game Controller (especially if there are no counter-arguments, which is consensual
adjudication) or used to derive a probability (percentage) chance of occurrence, with the result
being chosen or determined stochastically. The 4-box method discussed later in this chapter
and illustrated at Figure 24-5 is an example of this.
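As a minimal illustration of that second route – turning arguments into a percentage chance and then rolling against it – the sketch below assumes a simple house scheme: start from an even chance, shift it by a fixed amount per pro and con, and cap the result. The baseline, the shift and the caps are illustrative assumptions only, not Tom Mouat’s published matrix game rules.

```python
import random

def argument_probability(pros: int, cons: int,
                         baseline: int = 50, shift: int = 10) -> int:
    """Turn counts of pro and con arguments into a percentage chance of success.

    Illustrative house scheme only: start at an even chance, move it by a
    fixed amount per argument, and clamp it so nothing is ever certain.
    """
    return max(5, min(95, baseline + shift * (pros - cons)))

def resolve(pros: int, cons: int) -> bool:
    """Roll 'percentage dice' (1-100) against the derived chance."""
    return random.randint(1, 100) <= argument_probability(pros, cons)

# Example: three supporting arguments and one counter-argument -> 70% chance.
print(argument_probability(pros=3, cons=1))  # 70
print(resolve(pros=3, cons=1))               # True or False, stochastically
```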
The aggregated modelling line
I differentiate combat models between those that generate outcomes above and below an
‘aggregated modelling line’.
· Models that work above the line base outcomes on algorithms that aggregate (combine
together) equipment and personnel capabilities into a score, or factor, and then compare
these to the opponent’s. [394] A common approach is to use Force Ratios (FR) as a basis
for outcome determination, moderated by – or in conjunction with – various other factors.
· Models that work below the line consider individual platforms and the effects of discrete
shots, missiles or whatever. There can be blurring, for example vehicles are often modelled
individually, but alongside an infantry fire team or section. A common approach is to use a
series of ‘p’ [something] to determine outcomes. ‘p’ stands for probability (usually
expressed as a percentage chance) and the ‘something’ is typically ‘detect’, ‘hit’, ‘kill’ etc.
A precursor ‘Line of Sight’ (LOS) check is required. Modifiers are usually applied to ‘p’
steps and the LOS check; more on this later.
Being aware of the broad modelling approach of the simulation you are using, and the key
modifiers, helps you assess its strengths, weaknesses and potential idiosyncrasies.
Force Ratios
FR form the basis of many combat models that work above the aggregated modelling line (and a
few below). The aggregated combat factors for both sides’ engaged forces are totalled and
compared to derive either a fixed outcome or a spread of possible outcomes from which one is
selected, or determined using a stochastic method. FR-based models have weaknesses:
· FRs do not always explain historical combat outcomes. Any number of indeterminable
factors might account for success against the odds.
· Many other factors need to be considered, such as leadership, surprise, morale, training,
equipment, advantageous positioning and so forth. While some of these might be
incorporated into aggregated numbers, FRs are just another factor, and arguably of lesser importance. [395] Jim Wallman’s Army 2020 system, for example, includes FRs as but one
modifier among others such as surprise and troop quality to derive a numerical score for
both sides. The result is then determined by the differential between the two totals.
· How do you derive a numerical combat factor from aggregated capabilities and human
factors, and how do you know this is accurate? For example, intangibles such as levels of
training and motivation might be incorporated. Even the Dstl Balanced Analysis Modelling System (BAMS), which is the basis of many FR models, has critics inside Dstl who question its validity.
· Aggregating capabilities and characteristics into a gross combat score risks losing sight
of individual factors when subsumed into a single number.
However, I find FRs a good starting point for determining combat outcomes at the aggregated
level if they are:
· Clearly and transparently explained;
· Used to inform expert judgement; and
· Presented using a deliberative approach rather than as a definitive result.
FRs are fundamental to the Rapid Campaign Analysis Toolset (RCAT), which was subjected to a
rigorous two-year long Dstl Verification and Validation process. RCAT has been in constant use
since 2012 in front of military players and hobby grognards; it has ‘survived contact’ in every
instance.
Typical outcomes from FR models can include any or all of:
· Reductions in the capabilities or Combat Effectiveness (CE) of forces involved. CE might
be displayed as casualty figures (but it includes leadership, morale, fatigue, logistics etc),
a percentage figure or a fraction of the capability remaining.
· An effect on one or more of the force elements (FEs) involved, usually ‘destroyed’,
‘defeated’, ‘neutralised’, ‘fixed’, ‘suppressed’ etc.
· Retreats and advances during and following combat.
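To make the above-the-line approach concrete, here is a minimal sketch of an FR-based resolution. The combat factors, the ratio bands and the outcome spreads are invented purely for illustration; they are not drawn from RCAT, BAMS or any validated data set.

```python
import random

# Hypothetical aggregated combat factors; not from any validated model.
FORCE_FACTORS = {"armoured_bde": 12, "mech_bde": 9, "infantry_bde": 6}

# Hypothetical force-ratio bands, each with a spread of possible outcomes.
OUTCOME_BANDS = [
    (5.0, ["defender destroyed", "defender retreats in disorder"]),
    (3.0, ["defender retreats", "defender neutralised", "attacker fixed"]),
    (1.5, ["attacker fixed", "attacker repulsed"]),
    (0.0, ["attacker repulsed with heavy losses"]),
]

def force_ratio(attackers, defenders):
    """Total the aggregated combat factors on each side and compare them."""
    return (sum(FORCE_FACTORS[u] for u in attackers)
            / sum(FORCE_FACTORS[u] for u in defenders))

def suggest_outcome(attackers, defenders):
    """Find the band for the force ratio and pick one outcome stochastically."""
    fr = force_ratio(attackers, defenders)
    for threshold, outcomes in OUTCOME_BANDS:
        if fr >= threshold:
            return fr, random.choice(outcomes)

fr, outcome = suggest_outcome(["armoured_bde", "mech_bde"], ["infantry_bde"])
print(f"FR {fr:.1f}:1 -> suggested outcome: {outcome}")
```

In a deliberative game the printed suggestion would be the start point for expert discussion, not the final answer.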
Force Ratio Risk Levels explained
Cognisant of FR limitations, the table below remains a useful reference. It was created by Paul
Syms as a quick ready-reckoner when he was part of a Dstl Operational Analysis (OA) team supporting
a Higher Command and Staff Course wargame.
This table now appears in the Planning and Execution Handbook [396] – but with no explanation.
Hence, it is often misused, with potentially dangerous results, so here is a quick example to
show how to use it.
A force conducting a prepared attack against a hasty defence (find this on the left-hand axis) at
a FR of 3:1 (find 3.0 on the bottom axis) has a borderline ‘Nominal’/ ’good/safe’ probability of
success (read up from 3.0 and you will intersect the type of attack line at this boundary).
I chose this example because it explodes one of the military’s ‘urban myths’, that attacking at a
3:1 ratio will deliver success. OA conducted during the 1991 Gulf War and thereafter indicates
that a FR of 5:1 is required to deliver a ‘V Good’ probability of success – but the ‘3:1 equals tea and medals’ syndrome persists to this day. [397] The key in Figure 24-1 only loosely defines the outcomes. That is the correct approach because the graph’s purpose is to inform military judgement prior to a HITL decision.
Figure 24-1. Force Ratio Risk Levels
P [insert effect]
Using the ‘p’ probability of achieving an effect entails modelling at a greater level of fidelity than
FRs. Typical outcome generation steps are:
1. LOS check. Does the firer have an unimpeded view of the target?
2. p (detect). Some models differentiate between detect, recognise and identify (DRI):
there’s a vehicle moving; it’s a tank; it’s a T-72. Whichever, this check establishes whether
the firer can sufficiently sense the target to engage it.
3. p (hit). Does the firer hit the target? This might be calculated as a p (hit) per shot, or
multiple shots within a given time frame, for example including jockeying for position or a
submarine obtaining a successful firing solution. Several shots might be fired using the
latter method, but only one p (hit) calculated.
4. p (kill). What effect does the shot have on the target? Some models differentiate between ‘kills’: ‘mission’ (K kill) [398]; ‘mobility’ or ‘movement’ (M kill); and armament (F kill). Other outcomes can include suppression, degradation of sensors, abort etc.
Figure 24-2 shows a platform-level outcome resolution look-up table. Note the ‘LOS check?’ top
centre, p (hit) table and a combined p(detect) and p(kill) table; many variations on the basic
approach are possible.
Figure 24-2. A look-up table showing LOS check, p (hit) and p (combat ineffective) [399]
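Read as a probability chain, the steps above are easy to sketch in code. The values below are placeholders, not figures from Figure 24-2 or any real look-up table; the point is only the sequence of checks.

```python
import random

def check(probability: float) -> bool:
    """Return True with the given probability (0.0-1.0)."""
    return random.random() < probability

def engage(has_los: bool, p_detect: float, p_hit: float, p_kill: float) -> str:
    """Work through the LOS -> detect -> hit -> kill chain for one engagement."""
    if not has_los:
        return "no line of sight"
    if not check(p_detect):
        return "target not detected"
    if not check(p_hit):
        return "shot missed"
    if not check(p_kill):
        return "hit, but no kill (e.g. suppressed or mobility-damaged)"
    return "target killed"

# Illustrative values only: a firer with LOS to a partially obscured target.
print(engage(has_los=True, p_detect=0.8, p_hit=0.6, p_kill=0.7))
```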
Deterministic and stochastic methods
All simulations based on rigid or semi-rigid adjudication use a deterministic or stochastic
method to generate outcomes. These can be used in combination.
· A deterministic model is one in which there is no random variation. Every potential state
is uniquely determined by parameters in the model, and the result of a given interaction will
be the same every time the model is run.
· A stochastic model features randomness. Variable states are described by probability
distributions, and the result will be different each time the model is run because the element of chance is incorporated. [400]
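A minimal sketch of the distinction, assuming an invented attrition formula (not a validated model): the deterministic function always returns the same result for the same force ratio, while the stochastic one samples around that value and so differs on every run.

```python
import random

def deterministic_losses(force_ratio: float) -> float:
    """No random variation: the same inputs always give the same result.
    The formula is an arbitrary placeholder, not a validated attrition model."""
    return round(30 / force_ratio, 1)   # e.g. % casualties to the attacker

def stochastic_losses(force_ratio: float) -> float:
    """Randomness included: the same inputs give a different result each run,
    here by sampling around the deterministic value."""
    expected = 30 / force_ratio
    return round(random.gauss(expected, expected * 0.25), 1)

print(deterministic_losses(3.0))  # always 10.0
print(stochastic_losses(3.0))     # varies from run to run, e.g. 7.8, 11.3, ...
```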
Random number generators, and how to introduce and use dice
There are many random number generators (RNG) you can use, including computers, look-up
tables, mobile phone apps – and dice. But dice are what I want to focus on, because they are
tremendously useful but carry unjustified baggage. Annex A of Francis McHugh’s 1966 The Fundamentals of War Gaming [401] is devoted to ‘Chance devices’. He starts with a coin,
mentions dice and then quickly moves on to tables of random numbers: ‘Currently, the most
widely used wargaming device is a table of random numbers [that] are often compiled by
electronic devices or computers. Random number generators can also be prepared, three digits
at a time, by rolling three unbiased 20-sided dice, and writing down the digits… Random number tables are also used in a similar manner to 20-sided dice…’ [402] I would like to have asked
Francis, “If you use dice to construct these tables and determine percentages – why don’t you
just use dice during the game?” I suspect that the answer might have centred on the
(un)acceptability of dice, even back in the 1960s. If this was the reason, I echo Francis’ caution.
Introducing and using dice
You should introduce and use dice cautiously. Pulling dice from your pocket will usually elicit a
response from military players. Initially, this can be visceral scepticism and incredulity; latterly,
once they recognise the utility of dice, it tends towards excitement and even trepidation.
Dice (singular, die) are but one method of generating random numbers – and these methods all
do the same thing. Open an Excel spreadsheet and enter the formula =RAND() into cell A1. Go to cell B2 and enter =A1*100. Right click in B2 and use Format Cells/Number to show zero decimal places, and Font to set a size of 72. Zoom until cell B2 fills the screen. Now press F9 to recalculate the formula and cell B2 will display a whole number between 0 and 100. Ironically, computers only generate pseudo-random numbers, while dice generate truly random numbers.
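If you prefer a script to a spreadsheet, the equivalent demonstration takes two lines of Python; like Excel’s RAND(), it is a pseudo-random number generator doing exactly the same job as the dice.

```python
import random

print(random.randint(1, 100))  # the 'percentage dice' equivalent
print(random.randint(1, 10))   # a single 10-sided die
```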
So what? I am constantly amazed at players who readily accept a number generated by a
computer that determines a stochastic outcome, but who will shout, “You killed my orc!” if that
exact same number is generated by dice. Players also happily accept mobile phone apps; in
fact, any RNG – so long as it isn’t dice! I have a short cut to an Excel spreadsheet on my
desktop so that I can generate a random number within seconds. But such measures are only
necessary until you have properly introduced players to dice. Until an opportunity like those
below presents itself, I do not even mention dice, and certainly do not have any visible at the
start of a wargame. When the opportunity arises, I find the most effective way to introduce dice
is as tools to illustrate risk management and Clausewitzian chance.
Dice as risk management tools
Managing risk is inherent in any military or business career. Indeed, anyone in Defence,
business, the emergency services or humanitarian assistance who is not comfortable managing
risk is probably in the wrong job.
1. Assume that an interaction produces an outcome that is assessed (by whatever
adjudication MMT you are using) to have an approximately 70% probability of success.
2. At this point, present a pair of percentage dice or a 10-sided die to whoever ‘owns’ the
risk of failure. That discussion, of who owns the risk, is worthwhile in its own right.
3. Make sure the risk owner – and everyone else listening – understands the risk, and the
consequences of an unfavourable outcome. Ask the risk owner to heft and feel the dice
(they should be large) and visualise rolling them in front of their peers, superiors and
subordinates – but not to throw them.
4. Ask the risk owner if they are happy to roll the dice, hoping to get an outcome of 1-70
(assuming a 70% probability of success). If not, why not? Make sure the discussion is
captured.
5. Then stop. Take the dice back before they are rolled. They have done their job. They
have helped the decision-maker understand the risk, and visualise taking it cognisant of
possible consequences.
6. Explain this, and the fact that, in this instance, you will not throw the dice because that
moves us from risk management to prediction. When you come to a point where you need
to move the wargame narrative forward, and so need an outcome, dice will be rolled; but
that is different to understanding and managing a risk.
This is a simple but powerful procedure. I find actual commanders (those who are committing
their own men and women into danger, albeit in the wargame) uncomfortable with an assessed
probability of success of less than 80%. They look for ways to increase the chances of success;
they are managing the risk. Those not in actual positions of command are far more blasé, and
just want to chuck the dice to see what happens. There are lessons there, both for the real world and for who you place in command positions in a wargame.
This process gets you into the anatomical guts of decision-making (with factors and rationale
captured, of course), illustrates risk management – and tees up a hook that, when the wargame
demands a narrative, you will roll the dice. That is a subliminal point: now we understand their
utility, we will not scoff when dice next appear.
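A trivial sketch of the risk-framing step above, assuming the 70% example: it expresses the assessed probability as a dice target and states the stakes, but deliberately never rolls. The wording and the consequence text are illustrative only.

```python
def frame_risk(p_success: int, consequence: str) -> str:
    """Express an assessed probability as a dice target for the risk owner,
    without rolling: the aim is understanding the risk, not predicting it."""
    return (f"You need 1-{p_success} on percentage dice to succeed "
            f"({100 - p_success}% chance of failure). "
            f"If it fails: {consequence}. Are you content to roll?")

print(frame_risk(70, "the assault is repulsed with heavy casualties"))
```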
Clausewitzian chance is exemplified by dice
Using dice to resolve outcomes might be inappropriate in some games (for example in some
analytical contexts or when adjudication is wholly reliant on expert judgement), in which case
don’t force them on people. However, let’s assume an opportunity arises to use dice when
exploring ‘chance’. Ensure that whatever adjudication MMT you are using incorporates a
transparent spread of outcomes. Then:
1. Introduce Clausewitz. Ask the players what constitutes Clausewitz’s ‘Paradoxical Trinity’.
Note that definitions and interpretations of this differ, but the three points I use are: logic, passion and chance. Focus on ‘the play of chance and probability (see footnote).’ [403]
2. Present a pair of percentage dice or a 10-sided die. Explain that the function of the dice
is to introduce chance and probability. This incontrovertible link to Clausewitz cannot be
argued. Then give the dice to the player initiating the action to be resolved (or to the risk
owner, as above).
3. Explain the spread of possible outcomes suggested by whatever adjudication MMT you
are using, and relate the dice to these. This is where a single 10-sided die works well
because some people struggle to apply percentage spreads. So, your patter might run
thus, if using a 10-sided die: “You have a 10% chance of destroying the enemy (a roll of 1).
You have a 50% chance of forcing the enemy to retreat (a roll of 2, 3, 4, 5 or 6). You have a
30% chance of neutralising the enemy (a roll of 7, 8 or 9). And you have a 10% chance of
merely fixing the enemy (a roll of 10).”
4. The player will now want to throw the die, desperate to see if the attack succeeds. Do
not allow this - yet.
5. Rather, present and explain the expected outcome. Then ask the Game Controller if they
want to accept the expected result and use that as the outcome. If the Game Controller
decides to do so, put the die away.
6. However, if the Game Controller decides to allow the intrusion of chance, let the player
roll the die. Then explain the outcome.
7. Remind everyone of the expected outcome and double-check with the Game Controller
that they are still content to let the randomly determined (stochastic) result stand. This
reinforces to everyone: the divergence from the expected outcome; and that the Game
Controller remains in control of the wargame – it’s not just chance and RNG.
8. Finally, gently mock bad rolls, and extoll the tactical genius behind good rolls. This is
light-hearted and engages the players. Over the course of the game, you will find that
some players roll consistently well or poorly (apparently so, at least). Player
engagement builds with such outcomes, and you can reinforce the role of chance and
probabilities. Everyone knows that Napoleon said, ‘Give me lucky generals’, so now you
can link dice to both Clausewitz and Napoleon (and generate subsequent discussions on
the role of luck, war being most like a game of cards etc, and so prompt educational
opportunities from the wargame).
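The d10 patter in step 3 maps directly onto a simple look-up, sketched below using the spread from that example (the data structure is just one convenient way to hold it).

```python
import random

# The spread from the example above: die faces -> outcome.
D10_SPREAD = {
    (1,): "enemy destroyed",            # 10%
    (2, 3, 4, 5, 6): "enemy retreats",  # 50%
    (7, 8, 9): "enemy neutralised",     # 30%
    (10,): "enemy fixed",               # 10%
}

def roll_d10_outcome() -> str:
    """Roll a single 10-sided die and read off the outcome for that face."""
    roll = random.randint(1, 10)
    for faces, outcome in D10_SPREAD.items():
        if roll in faces:
            return f"rolled {roll}: {outcome}"
    raise ValueError("unreachable: every face 1-10 is covered")

print(roll_d10_outcome())
```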
Once participants are comfortable using dice, they often demand they are used, even in
analytical wargames. This is important, because conflict does not produce ‘expected outcomes’
and our training and analysis should consider outcomes from the ends of distribution curves.
Furthermore, players and analysts usually insist on using unmodified die rolls, to reflect the
uncertainties of war and ensure that less likely outcomes are considered. Using expected
outcomes, or ones that are moderated towards them, precludes this essential discussion, as
explained in Chapter 4.
6-sided, 10-sided or percentage dice?
Although the 6-sided die is accepted as a cultural norm for games in most parts of the world,
10-sided and percentage dice have advantages. I find people understand percentages and can
relate the spread of probabilities and outcomes to a 10-sided die. Simplicity aids transparency.
One 6-sided die is reasonably easy to understand: everyone knows that there is a 16.7% probability of rolling any single number, a 33.3% probability of rolling one of two specified numbers, and a 50% probability of rolling one of three of the six. But those are clumsy numbers to bandy about when explaining possible outcomes.
This becomes even more awkward when you add a second die; see Figure 24-3, below. Rolling
dice in combination produces a bell curve, whereas using percentage dice, or a single die, produces a linear outcome: there is an equal chance of rolling a 1 as a 6. In wargame terms, you can use
the two-dice approach to generate results that tend towards the expected (centre) outcome,
with outliers a less-common occurrence. But how can you ever be sure what the ‘expected’
outcome is? If you can’t know that, is it not better to accept a wider spread of linear
possibilities? Military players embrace more extreme outcomes because it accustoms them to
adverse outcomes and instils a resilient mindset.
I find that explaining percentages and probabilities using two 6-sided dice detracts from
transparent, easy-to-understand outcomes. It’s relatively easy to explain to a player that a
Special Forces raid has an 80% probability of success so they must roll 1-80 on percentage
dice or 1 – 8 on a ten-sided die to succeed. The dice skitter across the table and the result, and
the factors that quantified it, are clear and apparent to all. That essential discussion is more
difficult if you are using combinations of dice. From the graph in Figure 24-3, you can see that
there is an 83.3% probability of rolling a 2 – 9 (30 chances in 36), so that would be the dice roll needed for success in the Special Forces example above. But it is difficult to explain the 2 through 9 roll required and the associated 83.3%.
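The arithmetic behind that comparison is easy to check. The short sketch below enumerates the 36 equally likely results of two 6-sided dice and confirms that 2 – 9 comes up 30 times in 36 (about 83.3%), whereas every target on a single 10-sided die or percentage dice is a flat, linear probability.

```python
from itertools import product

# Enumerate all 36 equally likely results of rolling two 6-sided dice.
totals = [a + b for a, b in product(range(1, 7), repeat=2)]
p_2_to_9 = sum(1 for t in totals if 2 <= t <= 9) / len(totals)
print(f"P(2-9 on 2d6)  = {p_2_to_9:.1%}")   # 83.3%

# A single 10-sided die is linear: every face has the same 10% chance.
print(f"P(1-8 on 1d10) = {8 / 10:.0%}")     # 80%
```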
Figure 24-6. OCT with Professor Phil Sabin, General Julian Thompson and Commodore Mike
Clapp
‘We liked the manual simulation very much and wish we had had such a system in Ascension
[Island] with Fieldhouse, Moore, Trant, Curtiss, Woodward, Commander 5 Bde and us sitting
around the map table thrashing through possible courses of action and, hopefully, agreeing a
thoroughly well-considered plan.’
General Julian Thompson and Commodore Michael Clapp
Chapter 25. Presenting and Affirming Outcomes
‘Rules are for the guidance of wise men and the obedience of fools.’
Attributed by Douglas Bader in Reach for the Sky
to Harry Day, Royal Flying Corps WW1 fighter ace.
Introduction
Before outcomes are presented to players, they are often affirmed by the Game Controller. The
act of presenting outcomes to the Game Controller is different to presenting them to players:
they are prospective in the first case, and definitive in the second (although sometimes subject
to a deliberative approach). However, much of the discussion below pertains to both.
Presenting outcomes
Transparency
Transparency is critical. There can be situations where players might not be privy to the factors
and calculations that determined an outcome, for example in a closed game. However, if the
purpose of a wargame is to explore player decisions and their rationale, the greater the players’
understanding of an outcome and its associated factors, the better informed will be their
decision-making. Even if players are to remain unsighted, the Game Controller needs to
understand the derivation of outcomes. Analysts, Observer/Mentors and others even more so. The
explanation of outcomes could include the factors incorporated and an honest estimate of the
confidence in the result. If possible, outcomes should be presented and explained completely
transparently.
Spread of probabilities and possible outcomes
Operational Analysis (OA) best practice is to present probabilities and outcomes as a spread:
best case, most likely, and worst case. Explaining this spread – and the factors that determine it
– to Game Controllers and players is tremendously informative. It helps them understand the
possible consequences of the decision that leads to the result, and moderation of a
stochastic result is better informed.
The deliberative approach: presenting outcomes for comment
Applying SME judgement and collective common sense to outcomes delivers high dividends.
The Douglas Bader quote continues the sentiment of George Box’s in the previous chapter.
While there are instances when rigid adjudication is appropriate, in professional wargames I find
that generating, presenting and affirming outcomes is usually best conducted using rational
science as a start point – but then applying a good dose of military and common sense. If no
rule or model can guarantee an accurate prediction, don’t pretend that they can; rather, leverage your own and your participants’ collective expertise in an honest and deliberative
discussion, as explained in Chapter 4. This delivers better outcomes and analysis by
acknowledging that simulations are not predictive; ensuring that players retain ownership of the
wargame by being part of the adjudication process; increasing engagement; pre-empting cries
of, “It’s all chance!”; and forcing a discussion that enables analysis.
Precise military language
The military use precise language, and for good reason. There are fundamental differences between ‘defeat’ and ‘destroy’, ‘feint’ and ‘demonstrate’, and so forth. [414] If the military use
well-defined terminology, we should reciprocate by understanding and using theirs, while
explaining and using ours correctly. The key point is that you must understand and speak
‘military’ and use the wargame lingua franca precisely. This reinforces the utility of working as a
pair, with at least one of the two being fluent in ‘mil-speak’. Clearly, the same applies to
wargaming with other professions.
Deconstructed computer simulation
We know that computer simulations tend to use Line of Sight (LOS) checks, p (effect) and/or
Force Ratio (FR) comparisons as a basis for their models. What, then, is the difference between:
a tactical-level computer simulation conducting a LOS check on a geo data base, and then
cross-referencing an electronically generated random number with a data base to determine
whether a platform is sensed, hit and killed; and a manual simulation that requires a human to
conduct an intervisibility check on a map and then cross-reference a die roll with look-up tables to determine whether a platform is sensed, hit and killed? Procedurally, none.
Mechanically, the map replaces the geo data base, the look-up tables replace the data base and
the die replaces the electronic random number generator (RNG).
Intuitively, a LOS check should be more accurate using a digital geo data base than an
intervisibility study. But experience shows that geo data bases do not always include buildings
(!) or vegetation, and so permit LOS and fire through woods and even urban areas. Why is a
computer’s internal data base more accurate than a look-up table? In fact, it is no more than
data from a look-up table turned into 0s and 1s. And, as we have seen, a RNG is a RNG.
Similarly, a computer simulation that uses FRs to calculate outcomes follows, to all intents and
purposes, identical procedures to those used in many manual simulations.
When you are transparently working through an outcome using a manual simulation, the players
are watching a de-constructed computer simulation. Make this point – but then go further. They
are not just watching, they are part of the process. At every step they see and understand the
factors involved; can question those factors and any assumptions made, suggesting modifiers;
understand the spread of outcomes; and watch the result being determined. That is very
powerful in terms of increased understanding and engagement.
Affirming outcomes
Except when a rigid method of outcome determination is used, affirming outcomes is a key
element of adjudication. The following discussion applies to any sort of simulation outcome,
including real-time simulations.
What do I mean by ‘affirm?’
‘Affirm’ means ‘assert, declare the truth of something, state as fact.’ [415] A Thesaurus entry
adds ‘pronounce’, ‘certify’ and ‘corroborate’. The person doing this in a professional wargame
is ultimately the Game Controller, but affirmation checks by adjudicators prior to presenting
results to the Game Controller are sensible. The point in the process where these checks and
the final affirmation take place falls between simulation outcomes being provisionally
determined and being delivered to the players. That’s why the ‘Simulation results and
outcomes’ arrow in Figure 22-1 changes from dotted to solid. The difference between affirming
real-time and stepped-time simulation outcomes comes down to when and how you do this. If
done well, it should ensure that simulation outcomes help keep the wargame on track, heading
for the next objective/waypoint and, ultimately, achieving the overall aim. This sounds obvious,
but indulge me with two examples where affirmation did not occur.
A computer simulation expert presented results in a staff college wargame. These included a
Maritime Patrol Aircraft (MPA) being shot down by a submarine. A submariner in the room
expressed his amazement. On digging into the simulation reports, it transpired that the MPA
had been downed by small arms fire, because the computer modelled ‘bottom-up’ and there was
a – tiny – chance of this happening. This was declared publicly, someone muttered, “Bloody
accurate those submariners with their pistols!” and the room dissolved in laughter. The result
was a complete loss of faith in the simulation outcomes. The cause was no affirmation of the
simulation results prior to presentation.
During a recent divisional-level Command Post Exercise using a real-time computer simulation,
a squadron of Challenger 2s was destroyed in the space of a minute at long range by anti-tank
guided weapons. Amidst the splutterings of the Royal Armoured Corps player, it transpired that
the computer simulation did not model buildings, so the ‘covered’ approach route the player
had selected was, in the simulation, open ground. The result was an embarrassing
‘resurrection’ of the tank squadron, witnessed by all players, now firmly extracted from the
magic circle, and a question mark over the appropriateness of the buildings-free computer
simulation.
I do not pick these examples to detract from computer simulations. Plenty of manual simulation
results are similarly incredible. Rather, the examples introduce a discussion on how to affirm
outcomes in both stepped and real-time simulations.
Affirming outcomes in time-stepped simulations
Affirming time-stepped simulation results is comparatively simple. Build checks into the process
as follows:
· Conduct an initial ‘sanity check’ of all simulation outcomes. Do this in a closed group,
consisting of the simulation expert and one more person, ideally someone who can apply
military common sense. Any results that seem unlikely in the extreme – such as the
submarine example – should be investigated to ascertain the reason. I am not saying that
unexpected results should be ‘smoothed’, just investigated before being declared to
ensure they are credible.
· Conduct a review of the overall simulation outcomes to ensure these holistically support
the wargame objectives; and identify any variances that need to be highlighted. This is still
a small-group activity. Again, I am not saying that results should be moderated towards a
desired outcome, just that any that cause concern are investigated before being declared
so they can be explained and considered.
· The reality-checked and reviewed outcomes are then presented to the Game Controller,
along with up-front declarations of unexpected or undesirable results and the reasons for
these. With the reasons for the results clear, the Game Controller can apply their
judgement; results can stand, or moderations can be made. The outcome of this part of
the adjudication process is a certified and authorised set of results that can be presented
to the players, by whatever means.
Affirming outcomes in real-time simulations
The key to affirming real-time simulations - including role-play - is to premeditate likely
outcomes and, through this, give computer operators and role-players (where part of Control)
parameters within which to work. This is the approach enabled by the Next-Day
Wargame discussed previously. During execution, should operators sense that they are likely
to exceed the given parameters, they can take pre-emptive action to prevent this. If the
parameters will be exceeded unless unrealistic action is taken, the operator should immediately
inform the Game Controller. Hopefully, such a situation should not arise out of the blue, and
there will be time to take credible corrective measures to keep the game on track. Otherwise,
the Game Controller will have to work with the simulation experts and appropriate Control staff
to manage the situation without having to take the final recourse of revealing what is happening
to the players and breaking the magic circle.
Of course, there will be wargames where you do not want to impose constraints. Player-on-
player role-play could be an instance where whatever happens, happens! Let it roll (or is that
role?).
In the case of manual simulations that are resolved live (in-stride) [416], the key is to premeditate
outcomes as above if possible, but then for the simulation expert and/or facilitator to constantly
refer to the Game Controller at every step of the process. This is to ensure that the Game
Controller is assimilating the factors involved, assumptions made, and the final proposed
outcome. By doing this, you should arrive at the end of the resolution process with a straightforward question for the Game Controller: “Are you happy with that outcome, or do you want to
moderate it?” Having been fully engaged, and knowing the next objective waypoint to be
reached, the Game Controller should be well-placed to affirm the result.
Chapter 26. Course of Action Wargaming
"I consider it a duty of anyone who sees a flaw in this plan not to hesitate to say so. I have no
sympathy with anyone, whatever his station, who will not brook criticism. We are here to get the
best possible results."
Eisenhower, 5 May 1944, Model Room,
HQ British 21st Army Group
Introduction
Course of Action (COA) Wargaming is the most prevalent form of wargaming practised by the
military. It is mandated in the doctrines of the US, UK, NATO and most associated militaries.
Hence, it warrants a chapter to itself.
This chapter is an updated version of the original draft section I wrote in 2012 for the Army’s
Staff Officer’s Handbook (SOHB). Since then, various editors have mangled the SOHB text and –
crucially – removed key one-page aide memoires (and replaced these with a meaningless flow
diagram). A much-reduced version is included in the 2018 Planning and Execution
Handbook (PEHB), which has replaced Part 3 of the SOHB. I have corrected these edits and
reinstated the original text and aide-memoires to present what I hope is a consolidated and
complete guide to COA Wargaming. Note that this is written from a British Army perspective.
Most units and formations have got a reasonable handle on COA Wargaming, certainly within the
British Army and in UK Joint HQs. They recognise its utility and try to apply it – but lack the
experience or wargaming intuition to overlay art on top of the raw procedures to deliver the full
benefits. When delivering COA Wargaming training, I find that a few nudges are all that is
needed to make clear what a powerful tool it is.
Examples of such ‘nudges’ are: deriving turns from the commander’s key risks; conducting
these turns in a series of short ‘impulses’, flicking between Blue and Red in a series of staccato
‘action-reaction’ cycles; explaining the different options for the commander’s participation; and
teaching – empowering – the G2/J2 officer how to play the adversary. These are straightforward,
and ‘no-brainers’ once introduced, but elude people until they have seen them. Apparently
minor, they make a huge difference to the quality of COA Wargame delivered. Please note these
details in this chapter; they are not in the SOHB or PEHB.
The COA Wargaming ‘action-reaction’ mechanism provides the turn structure start point for
many wargame formats. The overall COA Wargame approach is, arguably, the best schema to
apply to any context beyond the military. Business wargames, in particular, are well suited to the
competitive action, reaction, consideration, Critical Thinking, ‘So what?’ and Consequence
Management steps. But so, too, are emergency service and humanitarian games – the process
can be applied to almost anything. I COA Wargamed my wedding – but I’m not going to tell you
who was the Red Cell!
Introduction to COA Wargaming
COA Wargaming and Rehearsal of Concept (ROC) Drills are related but discrete tools that
support different elements of decision-making. The distinctions between these terms (and Red
Teaming, discussed elsewhere) are outlined in Table 26-1.
Table 26-1: COA Wargaming, ROC Drill and Red Teaming [Critical Thinking] distinctions
Figure 26-3: COA Wargaming dos and don'ts [432]
Time management is crucial. The Chief Controller must strike a balance between useful
discussion and driving the COA Wargame relentlessly forward. Most points raised can usually
be noted for subsequent action or captured by staff branches for their own use. Not everyone is
equal, not everyone has a speaking role, and a COA Wargame is not a career moment!
COA Wargaming is command-led. The commander decides which COA(s) to COA Wargame and
the elements within the selected COA to wargame (as turns).
Recording. Accurate recording is vital. This task should be given to a good staff officer who is
fully conversant with the plan and able to garner findings without prompts from the Chief
Controller. A simple COA Wargame record sheet is essential. Suggested headings are at Figure
26-3.
Timeframe | Action | Effect on Adversary | Reaction | Effect on Friendly Forces | Consideration | Decision Taken
Perla, P. (1990) The Art of Wargaming: A Guide for Professionals and Hobbyists Naval Institute Press. 2nd edition: Peter Perla's The Art of Wargaming: A Guide for Professionals and Hobbyists (Ed. Curry, J.) (2012) The History of Wargaming Project. If you buy another book on professional wargaming, this should be the one.
MOD. (2017) MOD Wargaming Handbook Development Concepts and Doctrine Centre. This
overarching UK wargaming doctrine was written in parallel with this book.
Sabin, P. (2012) Simulating War; studying conflict through simulation games Continuum
International Publishing Group Ltd. The book on the educational use of wargames in the
classroom.
McHugh, F. (1966) Fundamentals of War Gaming US Naval War College.
Caffrey, M. (2019) On Wargaming Naval War College Press. A wide-ranging book full of historical
and current examples of professional wargaming practice.
Professional practice
Chatfield, T. (2010) Fun Inc. Why Games are the 21st Century's Most Serious Business Virgin
Books.
Downes-Martin, S. (2013) ‘Adjudication: the Diabolus in Machina of War Gaming’, Naval War
College Review, Vol. 66, No. 3.
Downes-Martin, S. (2014) ‘Your Boss, Players, and Sponsor: The Three Witches of War Gaming’,
US Naval War College Review, Vol. 67, No. 1.
Downes-Martin, S. (2015) Stress, Paranoia and Cheating: The Three Furies of Innovative
Wargaming Connections US Wargaming Conference, National Defense University Washington
DC.
Downes-Martin, S. (2016) Wargaming to Deceive the Sponsor: Why and How? Connections UK
Annual Wargaming Conference, London.
Downes-Martin, S. (2017) Validity and Utility of Wargaming, MORS Wargaming Special Meeting.
Available online
http://www.professionalwargaming.co.uk/ValidityAndUtilityOfWargaming.pdf
Downes-Martin, S. (2018) Swarm Gaming Approach to Organizing In-Stride Games, In-Stride
Adjudication Working Group, Connections US Wargaming Conference, National Defense
University Washington DC.
Harrigan, P and Kirschenbaum, M (Eds) (2016) Zones of Control; Perspectives on Wargaming
The MIT Press. A wide-ranging anthology from key wargaming practitioners and academics
from around the world.
Longley-Brown, G. (2005) ‘The Dos and Don’ts of COA Wargaming’ British Army Review No.
138.
MOD. (2016) Joint Doctrine Publication 04 Understanding and Decision-Making Development,
Concepts and Doctrine Centre.
MOD. (2018) Planning and Execution Handbook.
MOD. (2018) Staff Officer's Handbook.
MOD. (2013) Red Teaming Guide Development, Concepts and Doctrine Centre.
NATO. (2013) Bi-SC Collective Training and Exercise Directive (CT&ED) 75-3, NATO.
Wells, H G. (1913) Little Wars Read Books Limited.
Batteux, E., Ferguson, E. and Tunney, R. (2017) ‘Risk Preferences in Surrogate Decision-Making’
Experimental Psychology 64(4).
Bravenboer, D. (2017) ‘The Unexpected Benefits of Reflection: a Case Study in University-
Business Collaboration’ Journal of Work-Applied Management Vol 10 Issue 1.
Box, G. (1987) Empirical Model-Building and Response Surfaces Wiley.
Cayirci, E. and Marincic, D. (2009) Computer Assisted Exercises and Training: A Reference
Guide Wiley.
Covey, S. (1989) The 7 Habits of Highly Effective People Simon & Schuster UK Ltd.
Fullerton, T. (2008) Game Design Workshop CRC Press.
Gilad, B. (2009) Business Wargames Career Press.
Goal/QPC. (1995) Language Processing Manual Goal/QPC.
Green, K. (2002) ‘Forecasting decisions in conflict situations: A comparison of game theory,
role-playing, and unaided judgement’ International Journal of Forecasting 18 (2002) 321–344.
Available online https://repository.upenn.edu/cgi/viewcontent.cgi?
article=1174&context=marketing_papers The article cited about the importance of roleplaying
in games.
Green, K. (2004) Game theory, simulated interaction and unaided judgement for forecasting
decisions in conflicts: Further evidence Monash University.
Herman, M., Frost, M and Kurz, R. (2009) Wargaming for Leaders McGraw Hill.
Huizinga, J. (1955) Homo Ludens: A Study of the Play Element in Culture Beacon Press.
Johnson, G and Scholes, K. (1997) Exploring Corporate Strategy Fourth Edition, Prentice Hall.
Kaplan, R and Norton, D. (2001) The Strategy-Focused Organisation Harvard Business School
Press.
Michael, D and Chen, S. (2006) Serious Games that Educate, Train and Inform Course
Technology Cengage Learning.
Kocher, M., Schudy, S. and Spantig, L. (2016) I Lie? We Lie! Why? Experimental Evidence on a
Dishonesty Shift in Groups University of Munich Discussion Paper 2016-8.
Koster, R. (2005) A Theory of Fun for Game Design Paraglyph Press.
Lehrer, J. (2012) ‘Groupthink: The Brainstorming Myth’ The New Yorker January 20. Available
https://www.newyorker.com/magazine/2012/01/30/groupthink
Lorenz, J., Rauhut, H., Schweitzer, F. and Helbing, D. (2011) ‘How Social Influence can
Undermine the Wisdom of Crowd Effect’ Proceedings of the National Academy of Sciences.
May 31;108(22):9020-5 available online
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3107299/
Mazar, N., Amir, O. and Ariely, D. (2008) ‘The Dishonesty of Honest People: A Theory of Self-
Concept’ Maintenance Journal of Marketing Research Vol. 45, No. 6.
McGonigal, J. (2012) Reality is Broken, Vintage Books.
Nijstad, B., Stroebe, W. and Lodewijkx, H. (2006) ‘The illusion of group productivity: A Reduction
of Failures Explanation’ European Journal of Social Psychology, 36(1):31 - 48.
Sabin, P. (2007) Lost Battles: Reconstructing the Great Clashes of the Ancient World Continuum
International Publishing Group Ltd. Classic analysis of history through wargaming.
Salen, K and Zimmerman, E. (2004) Rules of Play; Game Design Fundamentals The MIT Press.
Schroeder, D. (2011) Business in the Trenches SPW.
Stefano, G., Gino, F., Pisani, G., Staats, B. (2014) Making Experience Count: The Role of
Reflection in Individual Learning Harvard Business School Working Paper 14-093. Available
online
https://www.hbs.edu/faculty/Publication%20Files/14-093_defe8327-eeb6-40c3-aafe-
26194181cfd2.pdf
There are few examples in the public domain. Listed below are some of those readily available.
For other examples see www.wargaming.co.
Armatys J. and Curry J. (2019) The Pentagon's Rural AGILE/COIN Wargame (1966): A
wargaming counter insurgency megagame Level The History of Wargaming Project
Curry, J. (2012) Innovations in Wargaming, Vol 1 The History of Wargaming Project. The History
of Wargaming Project. Examples of different methods of gaming including two COIN games.
Curry J. (2018) Pentagon Urban COIN Wargame (1966): A wargaming counter insurgency
megagame Level The History of Wargaming Project.
Griffith. P. (2016) Curry, J. (Eds) Paddy Griffith’s Counter Insurgency Wargames (1980) The
History of Wargaming Project.
PAXsims: https://paxsims.wordpress.com/ PAXsims has the most active and informative series of posts on all-
matters gaming, thanks to the boundless energy of Professor Rex Brynen and his team of associate editors.
Connections (US): https://connections-wargaming.com/ The original Connections!
Connections UK: http://professionalwargaming.co.uk/ This has grown to be a unique repository of wargaming
information, thanks to the efforts of Tom Mouat. The audio and, in most cases, video of all talks since 2013 are
available to download; a fantastic resource.
War on the Rocks: https://warontherocks.com/ This hosts the most ‘grown up’ series of discussions and interviews
on wargaming and Defense matters.
Simulating War: https://groups.yahoo.com/neo/groups/simulatingwar/info Initially centring around Phil’s eponymous
book, this forum has naturally expanded to cover many wargaming topics.
RAND Published Research: https://www.rand.org/pubs.html
RAND Centre for Gaming: https://www.prgs.edu/research/methods-centers/gaming.html
Op Analytics: http://opanalytics.ca/
Simulation and Gaming Journal: https://journals.sagepub.com/home/sag
ConsimWorld http://www.consimworld.com/
Grogheads: http://grogheads.com/
Story-Living Games: https://www.storylivinggames.com/
Play The Past: http://www.playthepast.org/
US Naval War College Review: https://www.usnwc.edu/Publications/Naval-War-College-Review
US Army War College Quarterly Parameters:
https://ssi.armywarcollege.edu/pubs/parameters/
National Defense University Press Joint Force Quarterly:
https://ndupress.ndu.edu/JFQ/
[1]
See MOD (2017), Wargaming Handbook, Annex A – Applying wargaming to Defence problems: Case study 1 –
Wargaming in campaign and operational planning: shaping peace support operations in Afghanistan, 2011, pp. 66-
69.
[2]
The events were publicly known as ‘Synchronisation of Efforts’ (SoE) conferences rather than ‘wargames’ due to
the political sensitivities of some of the civilian agencies involved. See p. 34 for reputational and connotative issues
of using the term ‘wargame’, and advice on what to do.
[3]
Connections UK was the first extension of the Connections ‘franchise’, started in the US by Matt Caffrey (see
Chapter 27 for more information). The ‘franchise’ has since expanded beyond the US and UK and includes
conferences in the Netherlands, Australia, Canada (‘North’) and, most recently, France.
[4]
MOD (2017), Wargaming Handbook.
[5]
Air Marshal Edward Stringer's talk on ‘Advancing the UK’s Analytical Tools to Address Modern Deterrence and
Strategic Competition’
https://soundcloud.com/warstudies/event-advancing-the-uks-analytical-tools-to-address-strategic-competition-and-
modern-deterrence/s-zoK88?in=warstudies/sets/events dated April 2019.
[6]
See p. 8 and the remark on the fact that those who take warfare seriously, wargame.
[7]
The Naval War College Department of Wargaming has one of the largest, re-configurable and state of the art
facilities – a 100,000+ square-foot wargaming facility.
[8]
http://www.jhuapl.edu/CAC/#
[9]
Dstl is the UK’s leading government agency in applying science and technology (S&T) to the Defence and
security of the UK.
[10]
For further information see the UK MOD ‘Global Strategic Trends – The Future Starts Today’ publication,
available at https://www.gov.uk/government/publications/global-strategic-trends
[11]
The contents of this foreword include views and opinions expressed by the author that do not necessarily
reflect official policy. The foreword has been cleared for release by the UK MOD’s Defence Science and Technology
Laboratory (public release identifier DSTL/PUB115861).
[12]
Moved to the Planning and Execution Handbook in 2018.
[13]
www.wargaming.co
[14]
For example, see http://www.anquangroup.com/
[15]
www.lbsconsultancy.co.uk
[16]
Sabin (2012).
[17]
Perla (1990).
[18]
McHugh (1966).
[19]
Caffrey (2019).
[20]
https://paxsims.wordpress.com/
[21]
Defence Science and Technology Laboratory, see www.gov.uk/government/organisations/defence-science-and-
technology-laboratory
[22]
https://www.gov.uk/government/publications/defence-wargaming-handbook
[23]
www.jwc.nato.int/
[24]
This is the mission statement of the Connections wargaming conference, the subject of Chapter 27.
[25]
A conference for wargaming professionals, and a ‘franchise’ of Matt Caffrey’s (US) original Connections.
[26]
Rubel (2006), p. 111.
[27]
For example, see ‘Outgunned. Buying butter, not guns: why Germany needs a better army. Generals are
worried about Russia. Voters are not.’ The Economist July 28th 2018, p.23.
[28]
Photo: Vladimir V. Burov, Creative Commons Attribution-ShareAlike 2.5 Generic license, unmodified.
[29]
Vakademii Genshtaba poiavilsia tsentr voennykh igr, 24 marta 2017, (https:// russian.rt.com/
Russia/news/371695genshtab-centrvoennykh-igr, quoted in ‘How Russia ‘Plays’ at War’, Dr Steven J Main, British
Army Review 171: Winter 2018, p.53.
[30]
This is the name of a 2008 Decision Games Wargame.
[31]
See http://professionalwargaming.co.uk/2016WargamingwiththePLA.pdf
[32]
Ibid, slide 15.
[33]
The Dado Center for Interdisciplinary Military Studies in the Operations Directorate of the Israel Defense Forces
General Staff is a body whose purpose is to advance the operational thinking and learning processes in the IDF
(Wikipedia).
[34]
A vignette is ‘A discrete action, or series of connected actions, confined to a very specific and limited situation.’
See chapters 19 – 21 for more detail.
[35]
A wicked problem is ‘a problem that is difficult or impossible to solve because of incomplete, contradictory, and
changing requirements that are often difficult to recognise.’ Wikipedia.
[36]
Perla (1990), p. 156.
[37]
Ibid, pp. 15 – 160 and Caffrey (2019).
[38]
MOD Wargaming Handbook (2017), pp.19 – 20.
[39]
Strong (2017) available at https://paxsims.files.wordpress.com/2017/12/2017-12-10-watu-mors.pdf
[40]
See http://professionalwargaming.co.uk/2017.html
[41]
Note the parallel with von Muffling’s 1824 statement “This is not a game: it is training for war!”
[42]
For further reading, see: Lillard (2016), pp.88 – 128 in particular; Nofi (2010); and Nofi’s paper in the Afterword
to the 2010 edition of Peter Perla’s The Art Of Wargaming.
[43]
Taken from a speech by Admiral Nimitz to the US Naval War College in October 1960, and quoted in the MOD
Wargaming Handbook, (2017), p.4.
[44]
Perla, quoted in Harrigan and Kirschenbaum (2016), p. 178.
[45]
This had been built up by the Research Department (later Intelligence and Research; then Intelligence) at the
Naval War College.
[46]
For more detail see Lillard (2016), p.106.
[47]
For more detail see ibid, p.125.
[48]
Ibid, p.74.
[49]
Ibid, p. 129.
[50]
MOD Wargaming Handbook (2017).
[51]
Rice, C and Zegart, A, ‘Managing 21st-Century Political Risk’, Harvard Business Review May - June 2018, p.135.
[52]
Caffrey (2019).
[53]
Ibid, pp. 277 – 290.
[54]
Levine, Schelling, and Jones (1991, original 1964), p.27.
[55]
Also attributed to Richard Lingard. A Letter of Advice to a Young Gentleman Leaveing the University Concerning
His Behaviour and Conversation in the World, 1670. 'If you would read a man's disposition see him game, you will
then learn more of him in one hour, than in seven years conversation.'
[56]
Available at https://warontherocks.com/2016/02/gaming-the-system-obstacles-to-reinvigorating-defense-
wargaming/
[57]
Naval War College Review (Winter 2014), vol. 67, no. 1.
[58]
A CNA and Naval War College paper (September 2004).
[59]
Speaker’s notes to a presentation to Connections UK (2016).
[60]
Naval War College Review (Summer 2011), Vol. 64, No. 3.
[61]
MOD (2017), Wargaming Handbook, pp.12-13.
[62]
William F. Owen (2019), What’s Wrong with Professional Wargaming? draft paper.
[63]
See Levine, Schelling, and Jones (1991, original 1964) for the genesis of this discussion.
[64]
The Free Dictionary.
[65]
Cambridge Dictionary.
[66]
Ibid
[67]
Merriam-Webster.
[68]
Reinforcing Deterrence on NATO's Eastern Flank: Wargaming the Defense of the Baltics, David A. Shlapak and
Michael Johnson, https://www.rand.org/pubs/research_reports/RR1253.html
[69]
For example, see War on the Rocks https://warontherocks.com/2016/05/fixing-nato-deterrence-in-the-east-or-
how-i-learned-to-stop-worrying-and-love-natos-crushing-defeat-by-russia/
[70]
Downes-Martin (2016), p.1. See also Downes-Martin (2014).
[71]
'Greenwashing' is the use of marketing to portray an organisation's products, activities or policies as
environmentally friendly when they are not.
[72]
Levine, Schelling, and Jones (1991, original 1964), p.1.
[73]
‘A common language between speakers whose native languages are different.’ Oxford Dictionaries.
[74]
https://paxsims.wordpress.com/
[75]
https://warontherocks.com/
[76]
http://opanalytics.ca/navair/
[77]
Downes-Martin (2014), p.31.
[78]
Pettijohn and Shlapak (2016) Gaming the system: obstacles to reinvigorating defense wargaming. Available at
War on the Rocks at https://warontherocks.com/2016/02/gaming-the-system-obstacles-to-reinvigorating-defense-
wargaming/
[79]
Sabin (2012).
[80]
http://www.elliebartels.com/uploads/1/1/0/6/110629149/mors_cop_gaming_community_survey_briefing_7sept2017.pdf
[81]
McHugh (1966), pp. 2-3.
[82]
Presentation to Connections UK 2018.
[83]
NATO AAP-6 Glossary.
[84]
MOD (2013), Red Teaming Guide.
[85]
MOD (2017) Wargaming Handbook p.5.
[86]
‘Operation Fleet Street’, devised by Military Formats in Business, in conjunction with PricewaterhouseCoopers
Netherlands. See http://professionalwargaming.co.uk/pwcmfibbusinesswargamedemo.pdf
[87]
See http://professionalwargaming.co.uk/2016Keynote-McGrady.pdf
[88]
MOD (2017), Wargaming Handbook, p.43.
[89]
3 (UK) Division (1999) Wargaming Aide Memoire 2nd Edition, p.1.
[90]
See King (2015) for scenarios gamed by emergency services and business.
[91]
Rubel (2006), p.110.
[92]
Lind (1985), p.46.
[93]
‘Setting’ and ‘scenario’ will be differentiated later; run with ‘scenario’ for now.
[94]
Dstl, Defence and Security Analysis Division, dated 28th March 2017.
[95]
Department of Defense Instruction Number 5000.61: ‘Modeling and Simulation Verification, Validation, and
Accreditation’. Department of Defense. 2009-12-09.
[96]
MOD (2013) Red Teaming Guide, p.1-3.
[97]
Ibid, pp. 3-9, 3-11 and A-29.
[98]
UK MOD Defence Standard 03-44, p.1-2.
[99]
Caffrey (2019), p.262.
[100]
Perla (1990), Part 2 ‘Principles’.
[101]
Caffrey (2019), Part 2 ‘Towards More-Effective Wargaming’.
[102]
Dictionary of Military and Associated Terms. US Department of Defense 2005.
[103]
Perla (1990), p.164.
[104]
Introduced at Connections 2016.
[105]
NATO APP-6, 2008, Glossary.
[106]
MOD (2017) Wargaming Handbook, p.5-6.
[107]
Ibid, pp.8-9.
[108]
McHugh (1966), p. 9, reproduced in the MOD (2017) Wargaming Handbook, p.10.
[109]
See http://professionalwargaming.co.uk/2015.html
[110]
Koster (2005), p.146.
[111]
Ibid, p.148.
[112]
See, for example, Perla (1990), p.183.
[113]
This refers to Downes-Martin (2013), pp.67-80.
[114]
Harrigan and Kirschenbaum (2016), p.175.
[115]
Levine, Schelling, and Jones (1991, original 1964), p. 40.
[116]
Peter Perla, writing in Harrigan and Kirschenbaum (2016), p.180.
[117]
See http://professionalwargaming.co.uk/2013.html
[118]
Bunch Of Guys Sitting Around A Table, a term used to describe an unstructured discussion.
[119]
Ibid, p.2.
[120]
Rapid Campaign Analysis Toolset Rules v5.2 dated December 2017, p.4.
[121]
McHugh (1966), p.8.
[122]
Ibid, p.2.
[123]
UK Dstl’s Peace Support Operations Model.
[124]
U.S. Army Command and General Staff College.
[125]
http://paxsims.wordpress.com/2012/07/06/comments-wanted-draft-cgsc-stability-operations-simulation-
requirements/#comment-3940
[126]
US Naval War College (2011), Fleet Arctic Operations Game Report, 14, p.4.
[127]
https://paxsims.wordpress.com/2016/12/21/reflections-on-the-wargame-spectrum/
[128]
This chapter is the full and updated version of the draft COA Wargaming section I wrote for the UK Staff
Officer’s Handbook.
[129]
UK MOD Defence Standard 03-44, p.2.
[130]
Ibid, p.2.
[131]
Harrigan and Kirschenbaum (2016), p.184.
[132]
A definition of ‘scenario’ from dictionary.com.
[133]
MOD (2017) Wargaming Handbook, p.34.
[134]
For more detail, see Downes-Martin (2013), pp.67-80.
[135]
MOD (2017), Wargaming Handbook, p.24.
[136]
Ibid, p. 43.
[137]
Observe – Orient – Decide – Act, Colonel John Boyd, USAF.
[138]
There are several versions of this diagram. Colin Marston’s Dstl version is available on the PAXsims website at
https://paxsims.wordpress.com/2016/12/21/reflections-on-the-wargame-spectrum/
[139]
A wargame with a map-table for each of two sides, plus one for umpires. Both sides' dispositions and
movement remain hidden until intelligence or surveillance activities reveal them, with outcomes determined by
umpires.
[140]
See Downes-Martin (2013).
[141]
Dstl/WP100280, Wargaming in Defence; A Thinkpiece for VCDS v2.0, dated 20170201.
[142]
Ibid., page 8.
[143]
Interactions can be kinetic (for example, combat) or non-kinetic (for example, a meeting or aid delivery).
[144]
OED.
[145]
Green (2004), pp. 463–47.
[146]
MOD (2017) Wargaming Handbook, pp.43-44.
[147]
See Pettijohn, S, Strategic Wargaming at Connections UK 2016. Slides, audio and video available at
http://professionalwargaming.co.uk/2016.html
[148]
OED.
[149]
See slide 2 at https://paxsims.files.wordpress.com/2018/08/dstl-adjudication.pdf
[150]
Harrigan and Kirschenbaum (2016).
[151]
Source unknown, but a staple comment among OA practitioners.
[152]
Literally translated from the French as ‘grumbler’. A grognard was a soldier of Napoleon’s Imperial Guard,
officially formed in 1804. Having served the Emperor for years, while remaining loyal to him the grognards – like all
soldiers – were prone to complain about their treatment and use in battle. The term is now applied to wargamers –
and with good reason!
[153]
Sabin (2012), pp. 202-220.
[154]
Developed by Mark Flanagan and shown at Connections UK 2016.
[155]
Go to http://rprod.com/index.php?page=description-22 to see what it looks like.
[156]
Some combinations are clearly mutually exclusive, such as a solitaire megagame.
[157]
MOD Wargaming Handbook (2017), pp.39 and 45-46.
[158]
One form of this is the Crisis Management Exercise; see explanatory text below.
[159]
Sometimes called ‘domains’; in Defence these are land, air, maritime, space and cyber.
[160]
Political, Military, Economic, Social, Infrastructure, Information, Physical Environment, and Time.
[161]
Source: https://www.gov.uk/guidance/emergency-planning-and-preparedness-exercises-and-training
[162]
Source: https://www.fema.gov/media-library-data/20130726-1914-25045-8890/hseep_apr13_.pdf
[163]
Miniatures, typically 1/300th scale, are also used for vehicle recognition training because they are more
effective than computer recognition packages.
[164]
See, for example, Featherstone (1975).
[165]
Heiden (2005).
[166]
Typically using SWOT (strengths, weaknesses, opportunities and threats) analysis, interviews and clustering.
[167]
Koster (2005), p.232.
[168]
See the forthcoming Rosenstrasse, which ‘is a highly structured historical role-playing scenario about the
eponymous women-led protests in 1943’, part of the War Birds anthology, currently on Kickstarter.
[169]
Compare these extracts from the D&D 5th Edition Starter Set to the characteristics of professional games and
the requirements of the facilitator discussed in Chapter 23: ‘The DM [Dungeon Master] is a referee… The DM is a
narrator… Although the DM controls the adventure, the relationship between the players and DM isn’t adversarial.
The DM’s job is to challenge the characters with interesting encounters and tests, keep the game moving, and apply
the rules fairly… [The DM should]: be consistent; make sure everyone is involved; and be fair… D&D gives structure
to the stories – a way of determining the consequences of the adventurers’ actions.’
[170]
US Naval War College (2015), p.6.
[171]
Downes-Martin (2013).
[172]
Intelicons are 'a codified visual language that can augment the current suite of symbols and reflects the
complexity found in complex multi-actor environments' devised by AW2 Consulting.
[173]
For more on ‘in-stride’ adjudication, see the Connections US 2018 In-Stride Adjudication Working Group
report at https://paxsims.wordpress.com/2018/09/12/in-stride-adjudication-connections-2018-working-group-report/
[174]
MOD Wargaming Handbook (2017).
[175]
MOD Wargaming Handbook (2017), p.21.
[176]
See also Gaming the semi-cooperative, available on PAXsims at
https://paxsims.wordpress.com/2016/02/02/gaming-the-semi-cooperative/
[177]
Friction has been defined as ‘the propensity of unexpected delays to occur during armed conflicts.’
Simpson (2015).
[178]
Defined in Chapter 2 as ‘The act of opposing or resisting; something that acts as an obstacle to some course
or progress.’
[179]
Clausewitz, On War, Book 1, Chapter 1, section # 28.
[180]
MOD Wargaming Handbook (2017), p.22.
[181]
Clausewitz, On War, Book 1, Chapter 1, section # 28.
[182]
MOD Wargaming Handbook (2017), pp.22-23.
[183]
For more on uncertainty, see the MIT press monograph at https://mitpress.mit.edu/books/uncertainty-games
[184]
Ibid, p.23.
[185]
Commander Field Army presentation to General Staff Conference, 12 January 2017.
[186]
Brig Dr Meir Finkel, IDF, speaking at the I/ITSEC Conference, 3 December 2015.
[187]
Lind (1985), p.43.
[188]
Ed McGrady, Introduction to Wargaming Class, MORS Wargaming Special Meeting, Alexandria, Virginia, US, 17
October 2016.
[189]
Co-author of Red Storm Rising and designer of wargames including the Harpoon series.
[190]
Persian Incursion (2010), Clash of Arms, Game Rules p.4.
[191]
Military strategy; Wargames. To understand war, American officials are playing board games, The Economist,
March 15th 2014.
[192]
Koster (2005), p.96.
[193]
Ibid, p. 23.
[194]
Ibid, p.28.
[195]
See, for example, The Technical Cooperation Program (2006) and the UK Land Handbook – Force Development
Analysis and Experimentation (2014).
[196]
For more information, see Pournelle (2014).
[197]
MOD (2017), Wargaming Handbook, p.25.
[198]
Dunnigan (2000), p.147.
[199]
KCL War Studies graduate, https://medium.com/smart-war/4daf4895c0fe
[200]
Brian Train, ‘Abstracted for Your Attention’, available at http://professionalwargaming.co.uk/2013.html
[201]
Sabin (2012), pp.19 and 30.
[202]
Perla (1990), p.183.
[203]
Simpson (2015).
[204]
US Naval War College, War Gamers’ Handbook, A Guide for Professional War Gamers, p.10.
[205]
Professor Rex Brynen, ‘Ten (Not Entirely Randomly-Generated) Reflections on the Social Science of
Wargaming.’ See http://professionalwargaming.co.uk/2016.html
[206]
Ibid.
[207]
Downes-Martin (2016), Wargaming to Deceive the Sponsor: Why and How? Speakers notes at Connections
UK September.
[208]
Downes-Martin (2014). See also Downes-Martin (2013).
[209]
Ibid, p.37.
[210]
Some lessons from history about wargaming and exercises, quoted in Perla (2010 edition), Afterword.
[211]
This has been periodically updated. See: Harrigan and Kirschenbaum (2016); Nofi in Perla (2010 edition),
Afterword; and Pournelle (2014).
[212]
Perla (1990), p.287.
[213]
Perla, writing in Harrigan and Kirschenbaum (2016), p.176.
[214]
The Technical Cooperation Program (2006), pp.12 – 16. Note the Experiment and Campaign Planning Flowchart
on p.31 and the detailed discussion of the GUIDEx Principle 4 (Defense experiments should be integrated into a
coherent campaign of activities to maximize their utility) on pp.107-116.
[215]
Land Handbook: Force Development Analysis and Experimentation, UK Army Director General Capability,
undated, pp. 3-1 and 3-10.
[216]
US Naval War College (2015), p.5.
[217]
Ibid, p.5.
[218]
Wikipedia. For a deeper discussion of inductive and deductive game designs and their impact on adjudication
see Downes-Martin, S. (2013) ‘Adjudication: the Diabolus in Machina of War Gaming’, Naval War College Review, Vol.
66, No. 3
[219]
US Naval Air Systems Command Wargame Course, Background for the Basic Analytic Wargaming Course from
the Naval Postgraduate School at China Lake, 31 July to 4 August 2017. See http://opanalytics.ca/navair/dcmp.html
[220]
Ibid.
[221]
Knoco stories: OILs - Observations, Insights and Lessons http://www.nickmilton.com/2013/02/oils-
observations-insights-and-lessons.html#ixzz5EKi5xAI2
[222]
The WGD defines analytical wargaming as a systematic research method for analysing war-fighting decisions
and decision-making behaviour. A wargame is a representation of a decision-making challenge, which may or may
not closely resemble what we can or will observe in the naturally occurring world. It is a representation of ‘real
warfare’ in the sense that players are making real war-fighting decisions. From a practical perspective, analytical
wargaming provides an opportunity to think deeply about a policy, strategy, operating concept, operating plan or
course of action; rehearse decision-making; or create a learning experience for participants in making decisions
and experiencing the effects of interaction with an aggressive competitor.
[223]
The U.S. military is a joint force (Air Force, Army, Coast Guard, Marines, and Navy) that prepares to fight in
combined arms with coalitions and other partners in all warfighting domains (air, space, land, the surface of the
oceans, undersea, and in the littorals) using a full range of hard and soft power capabilities including cyber and
other information operations.
[224]
McCarty Little was appointed in 1887 as a member of the faculty and developed two-sided wargaming at the
College (Brightman and Dewey 2014). McHugh also credits McCarty Little with introducing wargaming in the College:
see McHugh (1966). McCarty Little’s papers on war gaming include Rules for the Conduct of War Games, Naval War
College, Newport RI (1901 and 1905); The Strategic Naval War Game or Chart Maneuver, U.S. Naval Institute
Proceedings, Vol. 38, No. 4, Whole No. 144 (1912); and The Chart Maneuver, Naval War College, Newport, RI (1920).
[225]
Jon Scott Logel reports on Van Auken’s appointment as Director of the Research Department at the College
and his team’s work analysing wargames in the inter-war period. See Jon Scott Logel, Captain Van Auken and the
Research Department of the Naval War College: Considerations of Analytical War Gaming in the Decade Before
Midway, prepared for the 2017 McMullen Naval History Symposium, US Naval Academy, 14-15 September 2017. Also
see, Lillard (2016).
[226]
For a professional wargamer’s perspective on McHugh’s influence see David DellaVolpe’s foreword to the
Naval War College reprint of McHugh’s 1966 volume. McHugh’s influence is reflected in the current U.S. Naval War
College War Gamers’ Handbook: A Guide for Professional War Gamers, which was edited by Shawn Burns.
[227]
See McHugh (1966). McCarty Little’s 1912 lecture, which is entitled The Strategic Naval War Game or Chart
Maneuver, is reprinted in U.S. Naval Institute Proceedings, Vol. 38, No.4, Whole No. 144.
[228]
Adapted by the authors based on Mr. Patrick Molenda’s and RADM John T. Palmer’s classified briefing,
Integrated Training, Wargaming, and Modelling and Simulation, dated 14 December 2017.
[229]
For detailed descriptions of the NWC WGD’s wargaming methodology, see McHugh (1966) and US Naval War
College (2015).
[230]
M Polski, extension based on US Naval War College (2015).
[231]
By ‘replicate’ we mean that another equally capable research team could duplicate the analysis using the same
data and obtain the same insights or findings. By ‘repeat’ we mean that the same game design could be played again
by another research team or the same research team with the same players or another group of players. By ‘iterate’
we mean that a new game could be designed and played that would represent the next step in a series of inquiries
with respect to the war-fighting challenge. For a more comprehensive discussion of how the WGD situates itself in
military operations research, see Polski, War Gaming Department Working Paper WGD_20181, U.S. Naval War
College, Newport, RI. April 2018.
[232]
Our definitions of validity, reliability, and replicability are drawn from King, Keohane, and Verba (1994).
Campbell and Stanley, who are concerned with experimental design, distinguish ‘internal validity’ from ‘external
validity’. Findings have internal validity if we can say that the experimental treatment made a difference in a specific
experimental instance. External validity refers to the extent to which findings can be generalized across other
populations, settings, treatment variables, and measurement variables (Donald T. Campbell and Julian C. Stanley,
Experimental and Quasi-Experimental Designs for Research, Rand McNally College Publishing Company, 1963). For a
general primer on research design and planning, see Paul D. Leedy and Jeanne Ellis Ormrod, Practical Research:
Planning and Design, Ninth Edition, Pearson (2010).
[233]
Polski, M (2018) A Warfighter’s Guide to Analysis, Wargaming Department Working Paper WGD 20181, U.S.
Naval War College, Newport RI.
[234]
Logel, J, Analysis Up Front: Planning and Managing Wargame Analysis, Presentation for the NWC International
Wargaming Course, June 2018.
[235]
See p. 171 for definitions of validity, reliability, and replicability.
[236]
Presentation to Connections UK 2018.
[237]
2016 e-mail exchange.
[238]
Harrigan and Kirschenbaum (2016), p.XVI.
[239]
NATO SAS Research Task Group (RTG) 139 is an operations research task group proposed by NATO Allied
Command Transformation (ACT). The title of SAS-139/RTG is ‘NATO Analytical Wargaming – Innovative Approaches
for Data Capture, Analysis, and Exploitation’.
[240]
See: https://sgnfr.wordpress.com/serious-games-forum/
[241]
See: https://en.wikipedia.org/wiki/Pareto_efficiency
[242]
NATO SAS Research Task Group RTG 139.
[243]
Summarised in Sabin (2012), p.22-27, published in Literary & Linguistic Computing, 26/3, September 2011, and
presented to the ‘Digital Humanities’ conference in July 2010.
[244]
Sabin (2012), p.23.
[245]
Ibid, p.26.
[246]
Abbreviations are: Training Audience (TA); Common Operations Picture (COP); Joint Operations Command
and Staff Training System (JOCASTS); Exercise Control (Excon); and Rapid Campaign Analysis Toolset (RCAT).
[247]
Brynen, R ‘Gaming the Non-kinetic,’ in Harrigan and Kirschenbaum (2016). See also Rex Brynen, ‘Gaming
Indirect Effects: From Cyber to Social Media,’ presentation made to the Defence Science and Technology
Laboratory, Dstl Portsdown West, June 2018, at https://paxsims.files.wordpress.com/2018/08/dstl-social-media-
cyber.pdf
[248]
Brynen, R, ‘Gaming Fog and Friction: How Simulations Enhance Student Understanding of Complex Policy
Processes,’ in Anna MacLeod and Matthew Schnurr, eds., Simulations and Student Learning: An Interdisciplinary
Perspective (University of Toronto Press, forthcoming).
[249]
For a broader discussion on the factors associated with more effective forecasting, see Brynen, R, ‘Here (Very
Likely) Be Dragons: The Challenges of Strategic Forecasting,’ in Thomas Juneau, ed., Strategic Analysis in Support
of International Policy Making (Lanham: Rowman & Littlefield, 2017). For the application of this to wargaming, see
Rex Brynen, ‘Wargaming and Forecasting,”’ presentation made to the Defence Science and Technology Laboratory,
Dstl Portsdown West, June 2018, at https://paxsims.files.wordpress.com/2018/08/dstl-forecasting.pdf
[250]
On messy problems, and the importance of unanticipated consequences and second and third order effects,
see Horst Rittel and Melvin Webber, ‘Dilemmas in a General Theory of Planning,’ Policy Sciences 4 (1973), and
Chiyuki Aoi, Cedric De Coning, and Ramesh Chandra Thakur, eds., Unintended Consequences of Peacekeeping
Operations (Tokyo: United Nations University Press, 2007).
[251]
Dixson, M and Ma, F, An Investigation into Wargaming Methods to Enhance Capability Based Planning, DRDC-
RDDC-2017-D147 (Ottawa: Defence Research and Development Canada, 2018), Annex B, at http://cradpdf.drdc-
rddc.gc.ca/PDFS/unc293/p806080_A1b.pdf
[252]
Brynen, R, ‘Playtesting RCAT,’ PAXsims blog, 30 November 2015, at
https://paxsims.wordpress.com/2015/11/30/playtesting-rcat/
[253]
Dixson, M and Ma, F, An Investigation into Wargaming Methods to Enhance Capability Based Planning, p. 20.
[254]
This is an actual example drawn from Paddy Griffith’s clever ‘Longreagh Village’ counterinsurgency game. See
John Curry, ed., Paddy Griffith’s Counter Insurgency Wargames (1980) (History of Wargaming Project, 2016).
[255]
Varda Liberman, Steven Samuels, and Lee Ross, ‘The Name of the Game: Predictive Power of Reputations
Versus Situational Labels in Determining Prisoner’s Dilemma Game Moves,’ Personality and Social Psychology
Bulletin 30, 9 (2004). See also, Rex Brynen, ‘Ten (Not Entirely Randomly-Generated) Reflections on the Social
Science of Wargaming,’ keynote address to Connections UK, September 2016. Slides and video of presentation at
http://www.professionalwargaming.co.uk/2016.html
[256]
Perla and McGrady (2011).
[257]
Dunnigan (2000), p. 147.
[258]
Brynen, R, ‘Experimenting with DIRE STRAITS,’ PAXsims blog, 7 October 2017, at
https://paxsims.wordpress.com/2017/10/07/experimenting-with-dire-straits/ See also Rex Brynen, ‘In the eye of the
beholder? Cognitive challenges in wargame analysis,’ presentation to the Connections UK, September 2018,
available at http://www.professionalwargaming.co.uk/2018.html
[259]
In one larger game of ISIS Crisis, for example, each actor was represented by a team in which there were
divergent interests, and simple decision rules for each team brought these to the fore without extensive involvement
by the game controller. By contrast, in DIRE Straits (a game exploring crisis stability in East Asia), the internal
dynamics of the Trump Administration were represented by a large policy game, which itself was nested into the
broader regional strategic game. See: Rex Brynen, ‘Exploring matrix games for mass atrocity prevention
and response,’ PAXsims blog, 3 June 2016, at https://paxsims.wordpress.com/2016/06/03/exploring-matrix-games-
for-mass-atrocity-prevention-and-response/ and Rex Brynen, ‘Dissecting DIRE STRAITS,’ PAXsims blog, 9
September 2017, at https://paxsims.wordpress.com/2017/09/09/dissecting-dire-straits/
[260]
This also reflects real world perspectives: many Sunnis in Iraq do tend to overestimate their share of the
population.
[261]
Brynen, R, Exploring US Engagement in the Middle East (Washington DC: Atlantic Council, 2016), at
https://www.atlanticcouncil.org/publications/issue-briefs/exploring-us-engagement-in-the-middle-east
[262]
Brynen, R, ‘(Ending) Civil War in the Classroom: A Peacebuilding Simulation,’ PS: Political Science &
Politics 43, 1 (January 2010), available at https://paxsims.wordpress.com/2010/01/18/ending-civil-war-in-the-
classroom/
[263]
During the 1992 US election campaign, a sign hung in Bill Clinton’s campaign headquarters to keep the team on
message: ‘The economy, stupid.’ Similarly, a robust process to design, develop and deliver a wargame is fundamental.
[264]
Cayirci and Marincic (2009), p.16.
[265]
MOD (2017), Wargaming Handbook, p.51.
[266]
‘Through-Life Management is an integrated approach to management of the delivery of all aspects of military
capability, from identification of the need for the capability to its disposal.’ UK National Audit office, Ministry of
Defence Through-Life Management, 21 May 2003.
[267]
Perla (1990), pp.183 – 272.
[268]
This is an enhanced version of the diagram on p.51 of the MOD Wargaming Handbook.
[269]
Background for the Basic Analytic Wargaming Course from the Naval Postgraduate School at China Lake, 31
July to 4 August 2017, http://opanalytics.ca/navair/
[270]
US Naval War College (2015), p.14.
[271]
Ibid, p.7.
[272]
In a 2018 e-mail.
[273]
Downes-Martin (2014).
[274]
McHugh (1966).
[275]
MOD (2017), Wargaming Handbook, pp.52 – 55.
[276]
NATO (2013).
[277]
Perla (2010 edition of The Art of Wargaming), final chapter.
[278]
US Naval War College (2015), p.21.
[279]
NATO (2013), Appendix 1 to Annex M.
[280]
NATO (2013), p.4-28.
[281]
See ibid, p.C-1 to C-4, for a summary.
[282]
Ibid, p.C-1.
[283]
Rapid Campaign Analysis Toolset, a Cranfield University/Dstl manual simulation system.
[284]
Quoted above, from the MOD (2017), Wargaming Handbook, pp.56-57.
[285]
Electronic voting featured, but other approaches are common: Deep Dives, Delphi analysis etc.
[286]
Start of Exercise (or wargame). Endex is the end of exercise (or wargame).
[287]
What A Good Idea If...
[288]
Commanding Officer’s Really Good Idea.
[289]
And/or Orange (armed non-state actors), Black (organised crime) etc.
[290]
The recreational wargame term for this is ‘plumpire’: a combination of ‘player’ and ‘umpire’. ‘Control’ is a more
formal term, likely to be more acceptable in the context of a serious game.
[291]
Something is sound, strong or fit for purpose; see Chapter 3 (Misnomers and Misunderstandings).
[292]
UK Defence Systems Approach to Training, Glossary of Terms.
[293]
Ibid.
[294]
Presentation to Connections UK 2018.
[295]
See https://www.defenseone.com/ideas/2018/08/better-wargaming-helping-us-military-navigate-turbulent-
era/150653/?oref=d-river and https://www.defenseone.com/ideas/2018/10/how-joint-staff-calculated-defense-
programs-return-investment/152171/ for links and articles on the DWAG repository.
[296]
Slides, notes and audio are available in the Day 3 section at http://professionalwargaming.co.uk/2018.html
[297]
Consider surgery, for example. Unless immediate life-saving surgery is required, a botched brain operation is
probably more damaging than no operation. There are obvious caveats to this principle.
[298]
US Naval War College (2015), p.46.
[299]
Bunch of Guys Sat Around a Table.
[300]
This is the informal practice at the Center for Naval Warfare Studies (in which the US Naval War College’s
Wargaming Department resides) for research faculty and wargame staff.
[301]
‘Intellectual fraud’ refers to the generation of results at best not supported by analysis and at worst
contradicted by analysis. Its primary motive is not to gain an unearned financial or career benefit, although it may be
true that both result from the fraud.
[302]
My thanks to Paul Vebber for the term ‘madness of mobs’ in this context.
[303]
For details of this kind of problem and how to deal with it see the briefings and reports on the Puppet Mastery
web page at https://sites.google.com/site/stephendownesmartin/puppet-mastery
[304]
I have observed this to be true over many years and a huge number of meetings. I have no idea why it is true.
[305]
Everyone having their head down, typing into groupware laptops, means they are not engaging with each other.
Alternatively, sitting slumped in a chair while a facilitator types onto a screen at the direction of members of
the group is one of the lesser-known but more painful circles of hell. You might find it helpful to remove all the
chairs from the room.
[306]
One of the informal rules of wargaming is that ‘players cheat’. This is not necessarily a bad thing (Downes-
Martin 2015).
[307]
At an observed wargame, a retired three-star leading the Blue Cell stated: “We mustn’t allow the game
schedule to interfere with this interesting conversation”, thus ensuring that critical data needed to analyse the
objectives of the active-duty four-star sponsor went missing, because the game director had neither the intestinal
fortitude nor the support of his boss to deal with it. The sponsor blamed the wargaming organization, not his retired
three-star colleague.
[308]
Pre-game briefings.
[309]
Deputy Secretary of Defense Bob Work and Gen. Paul Selva, Revitalizing Wargaming is Necessary to be
Prepared for Future Wars, War on the Rocks, 8 December 2015. Bob Work was the Deputy Secretary of the U.S.
Department of Defense; General Paul Selva was the Vice Chairman of the Joint Chiefs of Staff.
[310]
Perla and McGrady (2011), p. 125.
[311]
This chapter is based on a presentation I gave at 2016 Connections UK. See Connections UK 2016, Day 2,
Plenary 1 at http://professionalwargaming.co.uk/2016.html
[312]
Koster (2005), p.148.
[313]
Perla and McGrady (2011), p.113 and p.127.
[314]
Koster (2005), p.158 and p.184.
[315]
Ibid.
[316]
McGonigal (2012).
[317]
Fullerton (2008).
[318]
Salen and Zimmerman (2004).
[319]
Perla and McGrady (2011).
[320]
See especially McGonigal (2012), p.21 for more on game traits.
[321]
Bernard Suits (1978), The Grasshopper: Games, Life and Utopia, quoted in McGonigal (2012), p.22.
[322]
McGonigal (2012), p.34.
[323]
Ibid, quoted on pp.31 – 33.
[324]
Ibid, quoted on p.28.
[325]
Ibid, paraphrased from p. 28.
[326]
See ibid, pp. 29 – 31, for descriptions of these.
[327]
Salen and Zimmerman (2004), pp.31-37.
[328]
Robert E. Quinn and Anjan V. Thakor, ‘Creating a Purpose-Driven Organization’, Harvard Business Review,
July – August 2018, p.81.
[329]
Ibid, see p. 32.
[330]
Koster (2005), p.92.
[331]
Endorphins are the body’s natural opiates: chemicals in the brain that lead to feelings of happiness, even
euphoria.
[332]
Koster (2005), p.92.
[333]
See in particular McGonigal (2012), pp.35 – 38.
[334]
I think the significance of each word, phrase or condition is self-evident, but you should still read the
references at the start of the chapter for the full context.
[335]
Everything in war(gaming) is simple, but doing the simplest thing is difficult.
[336]
McGonigal (2012), pp. 45 – 51.
[337]
Ibid, quoted on p.47.
[338]
MOD (2017), Wargaming Handbook, pp. 24 – 25 paragraph 2.10.
[339]
Salen and Zimmerman (2004), p.37.
[340]
McGonigal (2012), pp.95 – 115.
[341]
Huizinga (1955), p.10.
[342]
Salen and Zimmerman (2004), p.34.
[343]
See http://www.janchappuis.com/
[344]
'Begin with the End in Mind' is Habit 2 from Covey, S, (1989), The 7 Habits of Highly Effective People.
[345]
Start of exercise.
[346]
US Naval War College (2015), p.23.
[347]
Sharon Ghamari-Tabrizi, writing in Harrigan and Kirschenbaum (2016), p.334.
[348]
NATO (2013), Glossary.
[349]
Ibid.
[350]
Simpson (2015).
[351]
NATO (2013), Glossary.
[352]
van Notten, P, ‘Writing on the Wall: Scenario Development in Times of Discontinuity’, quoted at
http://www.oecd.org/site/schoolingfortomorrowknowledgebase/futuresthinking/scenarios/37246431.pdf
[353]
NATO (2013), pp. M-1-1 to M-1-4.
[354]
There are many real-world settings, but these tend to be classified.
[355]
You can read a review and download DATE from the PAXsims website at
https://paxsims.wordpress.com/2016/05/17/decisive-action-training-environment/
[356]
NATO Joint Warfare Centre The Three Swords Magazine, 21/2011, p.5.
[357]
https://www.cia.gov/library/publications/the-world-factbook/
[358]
Dunnigan and Bay (2008).
[359]
Perla (1990), p.165.
[360]
NATO (2013), p.M-1-4.
[361]
NATO (2013), p. A-25.
[362]
Ibid, p. A-15.
[363]
Ibid, p. A-23.
[364]
Ibid, p. A-23.
[365]
Find: Detect, Recognise, Identify and/or Locate a unit, object, activity, situation, event or individual or group.
UK Army Staff Officer’s Handbook.
[366]
See https://www.4cstrategies.com/exonaut/
[367]
https://www.4cstrategies.com/exonaut/
[368]
http://www.nsc.co.uk/simulation/constructive-simulation/
[369]
https://www.c4itrgtech.com/webmsel-2/
[370]
McGonigal (2012).
[371]
And 1 ½-sided if following the US Naval War College War Gamers’ Handbook.
[372]
NATO (2013), p.5-1.
[373]
Ibid, p. 5-3.
[374]
Main Events List/Master Incidents List.
[375]
Also known as Observer/Trainers.
[376]
American, British, Canadian, Australian and New Zealand Armies’ Programme.
[377]
MOD (2013), Red Teaming Guide, p. 3-6 and Lexicon.
[378]
‘Event-stepped’ is an alternative to ‘time-stepped’.
[379]
Feeds from a simulation generate a picture in a separate tool.
[380]
http://www.nsc.co.uk/simulation/constructive-simulation/#
[381]
iNet is part of a collaborative and inclusive working environment. See
http://www.nsc.co.uk/simulation/constructive-simulation/
[382]
Termed Forces Synchronisation (F Synch) by HQ ARRC and HQ 3 UK Division.
[383]
Sabin (2012), pp.83-84 and pp.104-106.
[384]
This chapter is an adaptation of a ‘Facilitation Guide’ I wrote for Defence Research and Development
Canada (DRDC). It incorporates comments from Rex Brynen, who was involved in the same project.
[385]
Design to a purpose, analysis and facilitation.
[386]
Actually, having your game compared to one that has sold millions of units should be considered a compliment.
[387]
Note them on a whiteboard or flip chart for subsequent discussion.
[388]
Box (1987), p.74.
[389]
MOD (2017), Wargaming Handbook, p.24.
[390]
Green (2004), pp.465 – 470.
[391]
Schelling, writing in Harrigan and Kirschenbaum (2016), pp.229 – 230.
[392]
This is positive, not normative decision-making.
[393]
Mouat, T (2016), Baltic Challenge: A Matrix Game about NATO/Russian posturing in the Baltic Sea, MORS
Special Event.
[394]
Also called Force Equivalency Ratios (FER).
[395]
Wallman, J (2015), Army 2020: Wargame Guidelines for Operational-Level Combat 2020-25.
[396]
MOD (2018) Planning and Execution Handbook, p.10-6.
[397]
i.e. the assumption that, if one achieves 3 to 1 odds, the engagement will be successful.
[398]
Sometimes referred to as a ‘catastrophic kill.’
[399]
Author’s original, enhanced by Sandbox Ltd and Cranfield University.
[400]
Clearly, identical outcomes are possible within the bounds of randomly generated probabilities.
[401]
McHugh (1966).
[402]
Ibid, p.210-211.
[403]
Clausewitz’s original work lists these as (1) primordial violence, hatred, and enmity; (2) the play of chance and
probability; and (3) war's element of subordination to rational policy. They are not the military, the government, and
the people.
[404]
Ed McGrady differentiates between 'precision' (meaning consistency) and 'accuracy' (meaning how close to
being right).
[405]
OA is also an abbreviation for Operational Analyst.
[406]
Dstl Support to Operations.
[407]
Perla (1990), pp. 105-114 and pp.273-290.
[408]
Harrigan and Kirschenbaum (2016), pp.159-182.
[409]
Sabin (2012), pp.267-273.
[410]
That I still had from my Donald Featherstone Skirmish Wargaming days in the mid-1970s!
[411]
Rapid Campaign Analysis Toolset, Rules v5.2, dated December 2017, p.32.
[412]
The presentation of player cells’ plans should be predicated on these having already been determined and
recorded. In this way, cells cannot change their minds on hearing others’ plans before presenting their own.
[413]
Sabin (2012), pp.55 – 56.
[414]
MOD (2018) Staff Officer’s Handbook:
Defeat: To diminish the effectiveness of the enemy to the extent that he is unable to participate further in the battle
or at least cannot fulfil his mission.
Destroy: To kill or so damage an enemy force that it is rendered useless.
Feint: To distract the action of the enemy through seeking contact with it.
Demonstrate: To distract the enemy’s attention without seeking contact.
[415]
OED.
[416]
See the Connections US 2018 In-Stride Adjudication Working Group report at
https://paxsims.wordpress.com/2018/09/12/in-stride-adjudication-connections-2018-working-group-report/
[417]
UK Scientific Advisor (Land) (now Warfare Science and Technology Branch), Course of Action Analysis
Requirements, 12 April 2012.
[418]
Civil Military Cooperation.
[419]
Cultural, Political and Legal Advisor.
[420]
JDP 0-01.1 UK Glossary of Joint and Multinational terms and Definitions.
[421]
Sun Tzu, The Art of War.
[422]
Decision Support Overlay.
[423]
Decision Support Matrix.
[424]
Commander’s Critical Information Requirements.
[425]
Concept of Operations.
[426]
Armed Non-State Actors (ANSA).
[427]
Organised Crime (OC) and Transnational Organised Crime (TOC).
[428]
Traditionally the side with the initiative goes first, but consider Blue always having the first Action; either
approach can work.
[429]
Note the distinction between the Red Cell (adversary) and the Red Team (challenging assumptions).
[430]
Consider the Blue Cell always having the Counteraction, because it is their plan being examined.
[431]
Also known as a Risk Impact Graph, although this is simplistic.
[432]
Longley-Brown (2005), p.49.
[433]
John Warden (1988), The Air Campaign: John Warden and the Classical Airpower Theorists, republished many
times.
[434]
Harrigan and Kirschenbaum (2016), pp. 229 – 240.
[435]
https://paxsims.wordpress.com/