Deception: Counterdeception and Counterintelligence
Robert M. Clark
William L. Mitchell
Copyright © 2019 by CQ Press, an imprint of SAGE Publications, Inc. CQ
Press is a registered trademark of Congressional Quarterly, Inc.
All rights reserved. No part of this book may be reproduced or utilized in any
form or by any means, electronic or mechanical, including photocopying,
recording, or by any information storage and retrieval system, without
permission in writing from the publisher.
Although a number of books and journal articles deal with the historical and
theoretical aspects of deception, the focus tends toward Soviet-era
disinformation disciplines, which are now dated. This book includes many
historical examples because the principles are still relevant, but it also
highlights more current illustrations. Deception is typically presented almost
as an afterthought in intelligence and operations courses, yet it deserves its
own educational and training infrastructure, one that fits into today's
strategic context. Since the Russian deception that supported the annexation
of Crimea in 2014, increasing attention has been given to the concept. This
text is designed to elevate the level of education in intelligence and
operations courses in deception planning as well as in detecting and
countering deception.
The basic principles of deception have not changed, and we’ll cover them in
some detail. But information technology, and specifically social media,
constitutes a new collection asset (and new channels for deception). In that
role, it adds a new dynamic to the conduct and detection of deception
activities, both in their traditional roles and as part of counterintelligence or
psychological operations. For example, the cyber domain provides a powerful
channel for conducting deception. Accordingly, it receives special attention.
The book provides a framework for a deception planning course for the
civilian academic community, the intelligence community, and the military. It
bridges the divide between theory and practice concerning deception that
sometimes separates these communities. The target audience includes
intelligence analysts and operational planners, and the book addresses both
perspectives. Operations professionals have few chances during an entire
career to observe the successful execution of a deception operation. A book
that illustrates deception theory using historical and modern-day cases and
provides opportunities to practice with hypothetical case studies will
especially benefit them.
Above all, we are thankful for the efforts of Dr. Clark’s wife and partner in
this effort, Abigail, whose extensive revisions made this a better book.
Robert M. Clark
William L. Mitchell
Copenhagen, Denmark
About the Authors
Robert M. Clark
currently is an independent consultant performing threat analyses for the
US intelligence community. He is also a faculty member of the
Intelligence and Security Academy and adjunct professor of intelligence
studies at the Johns Hopkins University. He previously was a faculty
member of the Director of National Intelligence (DNI) Intelligence
Community Officers’ course and course director of the DNI introduction
to the intelligence community course. Dr. Clark served as a US Air
Force electronics warfare officer and intelligence officer, reaching the
rank of lieutenant colonel. At the Central Intelligence Agency (CIA), he
was a senior analyst and group chief responsible for managing analytic
methodologies. Clark holds an SB from MIT, a PhD in electrical
engineering from the University of Illinois, and a JD from George
Washington University. He has previously authored three books:
Intelligence Analysis: A Target-Centric Approach (5th edition, 2016),
The Technical Collection of Intelligence (2010), and Intelligence
Collection (2014). He was co-editor, with Dr. Mark Lowenthal, of The
Five Disciplines of Intelligence Collection (2016). He co-authored
Target-Centric Network Modeling (2016) with Dr. William L. Mitchell.
William L. Mitchell’s
military and intelligence career spans three decades, including
operations in Afghanistan, the Balkans, Iraq, Africa, and French Guiana.
Dr. Mitchell is currently an active member of Danish Defence as an
advisor, instructor, and lecturer. While at the Royal Danish Defence
College (RDDC), he was responsible for the synchronization of theory,
practice, and education regarding intelligence, joint, and special
operations. He served as a member of the RDDC Research Board and
continues to support NATO and US Department of Defense research,
education, and doctrine development programs. He co-authored Target-
Centric Network Modeling (2016) with Robert M. Clark, has several
publications on military intelligence and battlespace agility, and was
awarded the 2014 NATO Scientific Achievement Award for his
contributions to NATO research. Dr. Mitchell has a BA, an MA with
distinction from Kent University, and a PhD in political science from
Aarhus University. He is a decorated war veteran of two countries, with
one citation and several medals, including the French Croix du Combattant
and the Danish Defence medal.
Part I Fundamentals of Deception and
Counterdeception
1 Deception: The Basics
This chapter introduces the basics of deception and the role of intelligence in
both supporting and defeating deception. It sets the stage, with definitions
and a set of basic principles, for the following chapters that explain how to
conduct deception and to identify an opponent’s use of it. But first, let’s look
at the case of a perfect deception—something that happens only in the
movies, of course.
The Sting
The popular media have given us many great stories about well-conducted
deceptions. The staged “sinking” of a Soviet ballistic missile submarine in
The Hunt for Red October (1990) and the elaborate scam to steal $160
million from a casino owner in Ocean’s Eleven (2001) come to mind. But
few if any Hollywood movies can offer the beautifully executed deception
operation set forth in the 1973 film The Sting.
The film is set in 1936, in the depths of the Great Depression. In it, a
small-time con man, Johnny Hooker (Robert Redford), has helped pull off a
minor street scam with his friend Luther. Unfortunately for them, it turns
out that the mark was a courier of crime boss Doyle Lonnegan (Robert
Shaw) and Luther is quickly tracked and killed. Hooker must run for his
life from Lonnegan’s revenge. (Lonnegan does not know what Hooker
looks like, which turns out to be a key part of the story.) Hooker gets
advice that perhaps Luther's old friend, the legendary con master Henry
Gondorff (Paul Newman), can help him start anew, and tracks down
Gondorff, who is hiding in Chicago from the FBI. Hooker subsequently
persuades Gondorff to undertake an elaborate con operation, partially to
honor Luther, targeting Lonnegan.
The next day, at the designated time, Lonnegan receives the tip to “place it
on Lucky Dan,” and makes his $500,000 bet at Gondorff’s parlor. The race
description is another part of the scam—broadcast by an announcer in a
back room of the parlor. As the race begins, the tipster arrives, and when
told that Lonnegan had bet on Lucky Dan to win, explains that when he
said “place it” he meant, literally, that Lucky Dan would “place.” The
panicked Lonnegan rushes to the teller window and demands his money
back, but the teller says he is too late. At this moment, Agent Polk, Snyder,
and a half-dozen FBI officers break into the parlor. Agent Polk tells
Gondorff that he is under arrest and informs Hooker that he is free to go. In
reaction to the apparent treachery, Gondorff shoots down Hooker; Agent
Polk guns down Gondorff and tells Snyder to get Lonnegan out of there
and away from the crime scene.
Once Lonnegan and Snyder are gone, Gondorff and Hooker get up, unhurt
—their deception is complete. “Agent Polk” and his fellow “agents” are, of
course, part of the con.
Aside from its entertainment value, the movie illustrates many features of a
deception operation that are covered in this book.
The movie also illustrates some other points. Most deceptions require good
showmanship, and Hollywood understands how to put on a good show—a
talent that the US government occasionally calls on, as we’ll see in a later
case study. In every deception, curveballs will present themselves. Success
demands constant observing, orienting, and reacting as events unfold. The
sudden appearance of the corrupt local cop, Snyder, looking for Hooker had
to be dealt with. The deception plan was changed to include him as a target.
The con also achieved what is most often the ideal outcome—the “mark” or
opponent, Lonnegan, never realized that he had been deceived (nor did Snyder). On
some occasions, though, you prefer for an opponent to know he’s been
deceived because of the subsequent bad decisions that he makes.
Deception is a word that has been used in a great many contexts, from
deliberate acts in wars between nations to deliberate acts in personal
relationships, as exemplified in The Sting. In this sense deception is a process.
As used in this book, the term refers to a deliberate and rational process
executed by an actor in order to benefit that actor within a subjective context.
This book focuses on actions promoting governmental rather than private
interests. That includes, for example, the intelligence and operational
planning processes of military, police, and civilian organizations.
One does not conduct deception for the sake of deception itself. It is always
conducted as a part of a conflict or in a competitive context, intended to
support some overarching plan or objectives of a participant. In a military and
civilian intelligence context, the overarching plan or strategy is usually stated
clearly. In this context a simple axiom applies: The more successful the
deception in support of a plan, the greater the chance the
plan will be successful. In dealing with war and security issues, measures of
success are usually characterized by precious resources that include material
and people. Though by no means exhaustive on the issue, one of the most
accessible studies as to the effects of employing deception in operational- and
strategic-level military planning is Barton Whaley’s 1969 book Stratagem,
Deception and Surprise in War.1 By drawing on the comparative analysis of
122 historical cases, Whaley shows a clear relationship between deception,
surprise, and the ratio of adversary to friendly casualties in
engagements. Figure 1-1 illustrates his results. The bottom axis is the ratio of
adversary to friendly casualties—higher numbers are better. As one succeeds
in either deception or surprise, casualty ratios become more favorable. The
ratio improves dramatically when deception is used and surprise is achieved.
Furthermore, deception has a substantial role to play in all conflicts, not just
military ones. Governments must be able to apply deception activity in
conflicts across the political, military, economic, social, infrastructure, and
information (PMESII) domains. Nonstate actors such as terrorists, criminal
groups, and other militants directly engage governments through social and
economic lines of operation. The insurgents in Afghanistan offer “shadow”
governance. Hezbollah militancy in Lebanon has a strong social and
economic engagement, as did Daesh (also known as ISIS, ISIL, or IS) in
Syria, Libya, and Iraq. However, “shadow governance” in Afghanistan is also
a cover for narcotics cartels. The social and economic engagement of
Hezbollah conceals ideological activities that support a militant agenda and
Iranian foreign policy. Daesh did the reverse, using a radical religious
ideological screen to hide the fragile economic and social connections to the
Sunni tribes that formed their support base at the strategic level. At the
operational level they disguised their intelligence organization as a network
of tribal engagement offices in major communities and hid military command
and control (C2) within the social domain of the population they controlled.
The statistics shown in Figure 1-1 and Figure 1-2 are generally available for
the outcomes of military conflict. No similar statistics have been found that
deal with the many deceptions in the political, economic, and social realms,
in part because outcomes of “victory” or “defeat” are harder to establish in
those arenas. But it is likely that similar ratios of success for both deception
and surprise apply in all the fields of conflict covered in this book.
Comparable statistics do not exist for the counterintelligence or psychological
operations (PSYOPS) disciplines for good reason. The statistics are almost
binary: Failure of the deception almost always means failure of the operation.
The series of deceptions executed by Soviet and later Russian intelligence to
protect two major US sources for almost twenty years, Aldrich Ames and
Robert Hanssen, are described in Chapter 6. Those two were able to operate
long after they should have been caught because the deceptions were so
successful. The World War II Black Boomerang PSYOPS described in
Chapter 3 succeeded because its listeners continued to believe that they were
hearing a German Wehrmacht radio broadcast.
Deception
There are a number of definitions of deception in a variety of contexts, some
of which overlap. The eminent author on deception, Barton Whaley, defines
deception as
Another prominent writer on the topic, J. Bowyer Bell, defines deception very
simply:
Both definitions are accurate in the sense of defining an end result, in terms
of the belief and/or behavior of others. Both also correctly describe deception
as a process of deliberately inducing misperception in a target person or
group of people. Deception is therefore not an accidental or unintended
outcome.
Whaley explicitly takes the definition one step further, and it is an important
step. His focus is on manipulating behavior based on a false picture. That’s
the widely accepted view: that belief is not enough; action (or refraining from
an action that otherwise would be taken) is required for it to be deception.
This concise definition includes three basic concepts that we’ll revisit
frequently:
1. It emphasizes the idea that deception must have a target. In the next
section, we’ll introduce a structured approach to thinking about the
targets. The section following that discusses the means, in the form of
basic principles of deception.
2. It promotes the idea of using deception to gain an advantage. The key to
deception planning is being able to envision a future situation that is
more advantageous to the pursuit of the deceiver’s objectives than if he
or she did not conduct a deception. That future situation takes the form
of a “desired” scenario to be achieved through deception, as later
chapters will discuss.
3. It highlights the concept of imposing the false on the target’s perception
of reality. This false perception takes the form of a story, which will be
discussed in Chapter 5.
Counterdeception
Much like the definition of deception, the term counterdeception is often
differentiated by context and organizational mission. For example, the US
Department of Defense definition follows:
Counterintelligence
The US government defines counterintelligence as follows:
Operations
Operations is often thought of in a military context, and many of the
examples in this book describe deception to support military operations. But
law enforcement conducts operations to deter crime and capture criminals;
and CI, as the previous definition indicates, includes “activities”—that is,
operations. And nongovernmental organizations such as criminal and terrorist
groups conduct operations. So the term is used in its most general sense
throughout the book, in two ways: to describe an action taken, and to refer to
an organization that executes political, informational, or economic as well as
military actions.
Psychological Operations
Finally, before going further into the subject of deception, it’s important to
define psychological operations. The US military definition is as follows:
Your goal is not to make the opponent think something; it is to make the
opponent do something.
You want your opponent not only to do something, but to do something
specific.
It is not always necessary to make the decision maker in your target
network believe in the false state of affairs that you want to project; but
it is enough to make him so concerned over its likelihood that he feels
that he must provide for it.
Non-action is a form of action; the decision to do nothing is still a
decision.
The decision maker(s) are the targets of deception; the intelligence
services are the customers of deception.9
The view of the deception target as a defined group was developed in China
nearly two millennia ago. It is recorded in a book called Thirty-Six
Stratagems that originated in both oral and written Chinese history, with
many different versions compiled by different authors over time. The book
was rediscovered during World War II and popularized after the Communists
came to power in China. Most of the thirty-six stratagems—which use
colorful metaphors to convey the concepts—are about either deception or the
use of deception as an enabler.
One subset of the stratagems is foreign to current Western thinking about the
opponent as a deception target—though Niccolò Machiavelli would
undoubtedly recognize them, since he argued for the same stratagems in the
sixteenth century. This subset emphasizes the use of deception against
neutrals or even allies instead of opponents. Some examples follow:
The second set of stratagems (below) violates Holt’s maxim for misleading
deception against decision makers cited earlier: that the goal is to make the
opponent do something. These mostly fall into the PSYOPS realm. The
objective here is to use all means, including deception, to create chaos and to
destabilize the opponent, thereby creating a favorable outcome scenario.
There are several of these in the thirty-six stratagems, suggesting that they
play an important role in Chinese thinking about conflict. Some examples
follow:
Remove the firewood from under the pot. This argues for an indirect
approach, rather than directly confronting an opponent. Operations are
aimed instead at the opponent’s ability to wage a conflict.
Trouble the water to catch a fish. Create confusion in the opponent’s
organization and use it to promote your objectives.
Feign madness but keep your balance. Pretend to be a fool or a madman.
Create confusion about your motives and intent. Encourage an opponent
to underestimate your ability and make him overconfident.
Hit the grass to startle the snake. Do something spectacular but
apparently purposeless, strange, or unexpected to confuse the opponent
or provoke a response that furthers your goals.
Replace the beams with rotten timbers. Disrupt the opponent’s
organization or standard processes. The idea is to tear apart the
opponent’s cohesiveness.
Let the enemy’s own spy sow discord in the enemy camp. Undermine
your enemy’s ability to fight by secretly causing discord between her
and her friends, allies, advisors, family, commanders, soldiers, and
population. While she is preoccupied settling internal disputes, her
ability to attack or defend is compromised.
Once the stage has been set using one or more of these stratagems, then it is
time to apply the execution or follow-up stratagem:
The use of deception to destabilize has often been applied against criminal
and terrorist groups and in international economic matters. The British have
demonstrated some skill in this area, as the next case illustrates.
The IRA Embezzlement Sting
In the early 1970s, the British were engaged in a bitter conflict with the
Irish Republican Army (IRA) in Northern Ireland. The two sides
conducted several deceptive operations against each other. One British
operation was designed to aggravate an emerging split between older IRA
leadership and a younger faction of leaders. The veteran leaders were
willing to consider a cease-fire with the British; the younger faction
opposed any cease-fire.
Called the Embezzlement Sting, the deception was carried out largely by a
British unit called the Mobile Reconnaissance Force. It relied primarily on
allegations made to the press by a British double agent named Louis
Hammond, a Belfast Catholic who had joined the British Army’s Royal
Irish Rangers in 1970.
Unfortunately for Hammond, the press article pointed directly to him. The
IRA enforcement arm seized him and conducted an intense interrogation.
Hammond confessed to working for the British and was then shot three
times in the head and once in the stomach. The IRA gunmen then dropped
Hammond’s apparently dead body in a deserted alleyway. Hammond
somehow survived, partially paralyzed and with the loss of one eye.15
Truth
All deception works within the context of what is true. Truth establishes a
foundation of perceptions and beliefs; these are then accepted by an opponent
and can be exploited in deception. Often, supplying the opponent with real
data establishes the credibility of future communications that the opponent
then relies on.
In the Stalin example, truth was provided with no intent to deceive. But it is
possible to correct an opponent’s misperception as part of a deception plan.
Historically, this has been done to establish the credibility of double agents
who can then be used with greater effect later. For example, one of the most
famous channels used by the United Kingdom during World War II was a
double agent named Juan Pujol Garcia, known as Garbo to MI5 and as
Arabel to the German Abwehr. By the end of the war, MI5 had provided
Garbo with enough true information to relay to the Germans that he earned
an Iron Cross from them. Truth in this case was used to ensure there
was no German misperception about his credibility, so that when it came time
to use him for D-Day and the supporting deception, Operation Quicksilver
(described in Chapter 2), the Germans were conditioned to Garbo’s reporting
being credible.
A key element in breaching the defenses was getting through the sand wall
quickly. After trying conventional methods (explosives and bulldozers),
Egyptian engineers found that a sand wall could be flattened quickly by a
high-pressure stream of water. Egypt subsequently purchased several high-
pressure water cannons from the United Kingdom and East Germany.
Denial
In many texts and training courses, deception is coupled with denial, and the
two concepts are labeled denial and deception, or D&D. There are advantages
to treating them separately. One can practice denial without conducting active
deception. Denial often is used when no deception is intended; that is, the end
objective is simply to deny knowledge. Intelligence collectors routinely must
deal with this type of denial:
This book does not deal with these types of denial, unless they are a part of
deception. All deception involves some form of denial. One can deny without
intent to deceive, but not the converse. You cannot practice deception without
also practicing denial of some aspects of the truth that you want the opponent
to be unaware of or disbelieve. Operational security (OPSEC) is just as
important to a deception operation as it is to the real operation, if not more
so, depending on how much success hinges on the deception plan.
In fact, all deception requires that you deny the opponent access to some
parts of the truth. Denial conceals aspects of what is true, such as your real
intentions and capabilities.
In the same time frame, the Iraqis pursued a biological weapons program,
drawing heavily on German expertise for facility construction and on the
US Centers for Disease Control and Prevention for biological samples such
as anthrax, botulism toxin, and West Nile virus. The Iraqis claimed that
they needed the samples for medical research. By the time of the first Gulf
War (1991; codenamed by the United States as Operation Desert Storm),
Iraq had weaponized (placed into munitions) thousands of liters of
botulism toxin, anthrax, and aflatoxin.
The Iraqi denial effort was unusual in that it was an attempt to project two
diametrically opposite perceptions to two different targets. Against the UN,
the United States, and its allies, the objective was to portray an image of
weakness—the lack of WMD and WMD programs. Against Saddam
Hussein’s opponents in the Middle East—Iran and Israel in particular—the
objective was to portray an image of strength: that Iraq possessed WMD
and was prepared to use them.
Deceit
Successful deception normally requires the practice of deceit. Without deceit
the target is only the victim of misperceptions due to denial, misinformation,
and/or self-deception, which is not the same as deliberate deception.
Pseudo operations teams have proved so successful over many decades that
they are considered an essential component of any counterinsurgency
campaign:
Misdirection
Misdirection requires manipulating the opponent’s perceptions in a specific
direction. You want to redirect the opponent away from the truth and toward
a false perception. In operations, a feint—often called a diversion—is used to
redirect the adversary’s attention away from where the real operation will
occur. The idea is to draw the adversary away from an area or activity; to
divert the target’s attention from friendly assets; or to draw the target’s
attention to a particular time and place.
Using the knowledge they gained from the demarche, the Indians were
able to plan an elaborate deception campaign to conceal preparations for
the 1998 tests. The campaign was many-faceted, aimed at protecting the
operation from HUMINT and IMINT.25 The deception campaign had
several elements, making it an excellent example of multi-INT deception.
And it is a prime example of all four characteristics of a deception:
Truth. The test location was known. The Indians had to work within that truth,
knowing that the United States was going to monitor that facility using
imagery. Also, the deception was helped along by the US government’s
knowledge that India wanted to improve trade relations. US officials were
therefore predisposed to believe that India would not provoke a crisis by
testing a nuclear weapon.26
Denial. The effort was protected by extensive secrecy measures within the
Indian government. Few knew of the plan; the decision to test was not
disclosed even to senior cabinet ministers. Work was done at night, and
heavy equipment was always returned to the same parking spot at dawn
with no evidence that it had been moved. The shafts were dug under a
netting of camouflage. When cables for sensors were laid, they were
carefully covered with sand and native vegetation was replaced to conceal
the digging.
Deceit. Piles of dug-out sand were shaped to mimic the natural wind-
aligned and shaped dune forms in the desert area. All technical staff at the
range wore military fatigues, so that in satellite images they would appear
as military personnel charged with maintenance of the test range. The
Indian government issued a number of public statements just prior to the
test, designed to reassure Washington that no nuclear test was
contemplated. Indian diplomats also categorically told their US
counterparts that “there would be no surprise testings.” All scientists
involved in the operation left in groups of two or three on the pretext of
attending a seminar or a conference. Tickets were bought for some location
other than Pokhran under false names, and after arriving at their
destination the group would secretly leave for Pokhran. After finishing
their part of the work, the group would go back, retracing their path. Then
another group would leave for the range, employing similar means to do
their part of the work on the bombs.
Misdirection. Just prior to the test, Indian leaders began an effort to focus
US attention elsewhere. They were aware that the United States monitored
ballistic missile tests at their Chandipur missile test range, more than a
thousand miles from the Pokhran site. They consequently started
preparations for what appeared to be a ballistic missile test at Chandipur.
The Indians actually tested a Trishul surface-to-air missile (which was of
relatively low intelligence interest to the United States), but they moved
additional equipment into the test range so that the preparations appeared
to be for a test of the Agni intermediate-range ballistic missile (which was
of high intelligence interest).27 As a result, US reconnaissance satellites
reportedly were focused on the Chandipur site, with only minimal
coverage of the nuclear test site at the time of the test.28
The following chapters of this book lay out the methodology for organizing a
deception campaign. But first, it’s worth taking a moment to understand the
roles of operations and intelligence in conducting deception.
Roles of Operations and Intelligence
Holt’s commandments of deception described earlier were generated by
lessons learned from World War II and were shaped by the organizational
environment in which military deceptions were carried out by the Allies. It
then should come as no surprise that those tenets are based on deceptions that
required both an intelligence element and an operations element. The
relationship between the two elements has often determined the success or
failure of deception operations.
Deceptions are planned and carried out under the guidance of a decision
maker to support a policy or military objective that can be strategic,
operational, or tactical. In military terms, the decision maker is the
commander; and his or her organization for attaining the overall objective,
referring back to our definition, is called operations.
Also, in a few cases, the intelligence organization handles both the operations
and intelligence roles. These cases include several types of covert actions and
those missions in which the primary objective of the operation is intelligence
collection.
This focuses on two categories that both government and military leaders
often don’t consider: the opportunity to conduct deception against opponents,
and the threat of deception operations conducted against them. It is an
important capability: not only does it increase the chances of success for
your operations, but it also decreases the chances of success of adversarial
operations against your side.
The next several chapters of this book lay out the methodology for organizing
and managing deception and counterdeception. Chapters 2–9 cover how
intelligence analysis should support deception operations, while Chapters 10
and 11 illustrate how intelligence analysis should drive counterdeception
efforts.
Notes
1. Barton Whaley, Stratagem, Deception and Surprise in War [reprint of
1969 edition] (Norwood, MA: Artech House Publishing, 2007), 104, tables
5.19 and 5.20.
5. “Counterdeception,” http://www.militaryfactory.com/dictionary/military-
terms-defined.asp?term_id=1334.
12. Ibid.
13. Pamela H. Krause, Proteus: Insights from 2020 (Washington, DC: The
Copernicus Institute Press, 2000), D-i–D-xx.
15. Mark L. Bowlin, “British Intelligence and the IRA: The Secret War in
Northern Ireland, 1969–1988,” US Naval Postgraduate School, September
1999, pp. 80–83,
https://archive.org/stream/britishintellige00bowlpdf/britishintellige00bowl_djvu.txt
16. Ibid.
19. David Kay, “Denial and Deception: The Lessons of Iraq,” in U.S.
Intelligence at the Crossroads: Agendas for Reform, ed. Roy Godson, Ernest
R. May, and Gary Schmitt (Washington, DC: Brassey’s, 1995), 120.
20. Scott Gerwehr and Russell W. Glenn, The Art of Darkness: Deception
and Urban Operations (Santa Monica, CA: RAND, 1999), 21,
http://www.rand.org/publications/MR/MR1132.
24. Tim Weiner and James Risen, “Policy Makers, Diplomats, Intelligence
Officers All Missed India’s Intentions,” New York Times, May 25, 1998.
25. Ibid.
26. Ibid.
29. Holt, The Deceivers: Allied Military Deception in the Second World War,
54.
2 The Methodology
Roy Godson and James Wirtz, in their journal article “Strategic Denial and
Deception,” provide some detailed guidance for planning and implementing
the misleading type of deception. They observe that
To summarize, all deception has four essential components. First, there must
be a desired objective. Second, there has to be a target, some person or group
of people who are to be deceived. Third, a story is presented—a false picture
of reality. And fourth, one or more channels are selected through which the
story is to be transmitted.
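For readers who want a compact mental model, these four components can be sketched as a simple structure. The following Python fragment is purely illustrative: the class, its fields, and the example values are our own paraphrase of the Operation Barclay case described below, not part of any doctrine or software library.

```python
from dataclasses import dataclass, field

@dataclass
class DeceptionPlan:
    """Illustrative model of the four essential components of any deception."""
    objective: str                # the desired outcome scenario
    target: str                   # the person or group to be deceived
    story: str                    # the false picture of reality to present
    channels: list = field(default_factory=list)  # paths for transmitting the story

    def is_complete(self) -> bool:
        # A deception is only viable when all four components are defined.
        return bool(self.objective and self.target and self.story and self.channels)

# Example values paraphrase Operation Barclay, discussed later in this chapter.
barclay = DeceptionPlan(
    objective="German forces deployed away from Sicily",
    target="Adolf Hitler",
    story="The Allied invasion will hit the Balkans or Sardinia, not Sicily",
    channels=["radio communications", "bogus troop movements",
              "currency transactions", "maps of Greece and Sardinia"],
)
assert barclay.is_complete()
```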
The OODA loop clearly is a useful way to think about deception, and has
previously been applied to deception problems. In his work on deception
methodology, Rob Dussault uses an identical framework to describe his
interacting-agent model of deception. Dussault applies slightly different
names; the terms in his loop are sense-interpret-plan-act3—reflecting the
longer time frame involved in a deception operation as compared with the
combat time frame of a fighter pilot. Since the Boyd OODA loop is more
familiar to most readers, we have adopted it for this text.
Armed with these three perspectives of OODA loops and one explicit
additional step, we can plan and execute a deception operation. Boyd
assumed an outcome, or result of his actions in every cycle around the loop—
a feedback step between action and observation, based on the result of each
action. This result includes the unfolding circumstances, interaction with the
environment, and the opponent’s actions in response to your actions. The
opponent, after all, is running his or her own OODA loop while you are
running yours. But it is important to make the outcome of the action explicit
in deception planning. So for this book’s purposes, an additional step is
required—inserting the outcome—into Boyd’s loop. The loop now looks like
Figure 2-5. This is the loop for the planning process—with a slight change: a
reversal of the arrows creates what is called an inverse loop.
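As a minimal sketch of the extended loop just described, assuming nothing beyond the five steps named in the text (the function and parameter names are our own placeholders, not an established model), the cycle can be written as:

```python
from typing import Any, Callable

def run_extended_ooda(environment: Any,
                      observe: Callable, orient: Callable,
                      decide: Callable, act: Callable,
                      outcome: Callable, cycles: int = 3) -> Any:
    """Boyd's observe-orient-decide-act cycle with the outcome step made explicit."""
    for _ in range(cycles):
        observations = observe(environment)  # what the actor's channels report
        perception = orient(observations)    # interpretation, shaped by context and bias
        decision = decide(perception)        # the chosen course of action
        action = act(decision)               # execution of that decision
        # The explicit extra step: the result of the action, including the
        # opponent's reaction (the opponent runs a loop of his or her own),
        # becomes the environment observed on the next cycle.
        environment = outcome(action, environment)
    return environment
```

Deception planning, as the following sections describe, effectively runs this loop in reverse: it starts from the outcome you want and works back to the observations you must feed the target.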
Some type of deception, therefore, was called for. To get to their objective,
the Allies needed to persuade the Germans to deploy troops elsewhere in
the theater, away from Sicily. Decisions such as this were being handled
by Adolf Hitler, so he was the target of the deception. He would likely
make such a decision only if convinced that the invasion would occur in
another part of southern Europe. The British orchestrated an elaborate
deception operation, codenamed Operation Barclay, to convince Hitler that
such was the case.
The “story” to be used was that the invasion would hit either the Balkans
or Sardinia rather than Sicily. Several channels were used to convey the
“story.” Radio communications; bogus troop movements; currency
transactions; preparing maps of Greece and Sardinia—all accomplished in
such a fashion that German and Italian intelligence would observe them—
were designed to give the impression of a landing in either Greece or
Sardinia.
The most interesting part of Operation Barclay had the unappetizing name
Operation Mincemeat; it was first revealed in Ewen Montagu’s book (later
a motion picture) entitled The Man Who Never Was.8 The British planned
and executed this part of the deception during early 1943. Because
Operation Mincemeat proved to be a critical part of the deception, the
observation/orientation part of the OODA loop (Figure 2-6) is worth
looking at step by step.
The Outcome
That process involves selecting the channels for conducting the deception and
then identifying and dealing with possible undesired end states resulting from
adding deception to the OODA loop. As part of this, you conduct a
deconfliction analysis (see Chapter 9) and then execute the deception. The
process completes a loop as shown in Figure 2-7. Finally, you monitor
subsequent indicators to determine whether they are pointing to the desired
outcome scenario.
These steps are listed in the logical order, but in practice they create an
iterative process. That is, you might start with an inverse OODA loop based
on the belief that a certain action by an opponent will result in a favorable
outcome; then go forward through the loop and find that the opponent’s
possible actions lead to undesired outcomes—making it necessary to go back
into the loop at some point and revise it. In some cases, it may become
apparent that the desired outcome scenario is unachievable or too risky to
attempt, given the opponent’s sensory channels or decision process. In that
case, you must start over with a different desired outcome.
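The iterative process just described can be summarized in pseudocode. This sketch is illustrative only; every name in it is a hypothetical placeholder for planning activities, not a real API.

```python
def plan_deception(candidate_outcomes, derive_plan, project_responses,
                   acceptable, max_revisions=5):
    """Iterate inverse and forward passes through the loop until a workable
    deception plan emerges, or conclude that the outcome must change."""
    for desired in candidate_outcomes:
        plan = derive_plan(desired)              # inverse loop: work back from the outcome
        for _ in range(max_revisions):
            projected = project_responses(plan)  # forward pass: opponent's possible actions
            if all(acceptable(p, desired) for p in projected):
                return plan                      # execute, then monitor indicators
            plan = derive_plan(desired)          # re-enter the loop and revise the plan
        # Desired outcome unachievable or too risky, given the opponent's
        # sensory channels or decision process: try a different outcome.
    return None
```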
The planning phase also has to consider what is likely to happen once the
deception has been executed and the outcome is apparent. Often, the
opponent will recognize from the outcome that he or she has been deceived.
Sometimes, the outcome will not be what was expected. In any case, you also
often have to plan for multiple OODA loops as part of a deception, and for
multiple circuits of a single OODA loop. That idea leads us to a second
classic case from World War II—the elaborate deception that supported the
Normandy invasion.
Operation FORTITUDE
During the spring of 1944, the Allies were busy planning for the invasion
of Nazi-occupied Europe. The decision had been made for the invasion to
hit the French beaches at Normandy. The problem for the Allies was that
the invading forces would be highly vulnerable to counterattack during and
in the days after the landing. It was likely, in fact, that the German
Wehrmacht would defeat the invasion by hitting it during the landing
phase, if they were positioned to attack soon after the invasion began. To
avoid that outcome, the Allies developed an elaborate deception known as
Operation Fortitude. We’ll revisit this deception in later chapters to
illustrate a number of points. For now, let’s focus on using the inverse
OODA loop.
The desired outcome had two phases (and consequently required taking the
Germans through two circuits of an OODA loop). Prior to D-Day, the
objective (loop one) was for as many German forces as possible to be
concentrated in areas other than Normandy, and preferably far from
Normandy. After D-Day, the objective (loop two) was to delay as long as
possible the redeployment of German forces to the Normandy beachhead.
For Fortitude North, the Allies provided the Germans with evidence of a
buildup of military forces in Scotland, the logical departure point for a
Norway invasion. They had previously created a fictional British Fourth
Army, headquartered in Edinburgh Castle. The deception was conveyed
through several channels:
Some histories have reported that Quicksilver also used large numbers of
dummy tanks, military vehicles, and other equipment that appeared to be
assembling for an invasion across the Channel. The dummy equipment
supposedly made use of inflatables that were highly detailed and were
designed to look real when photographed by overflying German
reconnaissance aircraft. Though dummy equipment was used in a number
of deceptive operations in Europe, it appears that Quicksilver only made
use of dummy landing craft that could be observed from the English
Channel; by 1944, German reconnaissance aircraft couldn’t survive for
long over England.
Of equal importance was denial of other channels: The Allies denied the
Germans intelligence about the preparations for invasion in the ports of
southern England—which would have pointed to the actual invasion target.
Reports from southwestern England through doubled agents indicated few
troop sightings there.
The key observable here came from two doubled German agents,
nicknamed Garbo and Brutus, who had previously provided fictitious
reporting on the activities of FUSAG. Both reported in the days after June
6 that FUSAG was still positioned near Dover, ready to invade at Pas-de-
Calais.
7. Ibid.
8. Ewen Montagu, The Man Who Never Was (Annapolis, MD: Naval Institute
Press, 1953).
An outcome scenario seldom involves only one of these factors.
Complex scenarios are likely to involve them all. The events of the Arab
Spring in 2011, the Syrian uprising that began that year, and the Ukrainian
crisis of 2014 involved all of the PMESII factors.
Denial. Missiles were shipped from eight Soviet ports to hide the size of
the effort; the missiles were loaded under cover of darkness. On reaching
Cuba, weaponry was unloaded only at night, and moved directly to the
missile bases along back roads at night. If visible on the decks, the missile
crates and launchers were shielded with metal sheets to defeat infrared
photography.
Deceit. Soviet military units designated for the Cuban assignment were
told that they were going to a cold region. They were outfitted with skis,
felt boots, fleece-lined parkas, and other winter equipment. Officers and
missile specialists traveled to Cuba as machine operators, irrigation
specialists, and agricultural specialists. The ships’ captains made false
declarations when exiting the Black Sea and the Bosporus. They altered
the cargo records and declared tonnage well below what was being carried.
They often listed Conakry, Guinea, as their destination. During September,
Soviet diplomats gave repeated assurances to top US officials (including
President John Kennedy) that they had no intention of putting offensive
weaponry in Cuba.5
The war in Kosovo highlighted a Serbian military that was well versed in
deception means and techniques. The Serbs drew heavily on Soviet
Maskirovka doctrine in their deceptions. Maskirovka is defined in the 1978
Soviet Military Encyclopedia as
They built decoy bridges out of plastic tarp and logs, and placed them
near camouflaged real bridges—combining denial with deceit.7
They painted bridges with infrared-sensitive material to break up
their distinguishing shape in order to mislead US intelligence sensors.
Serbian forces employed numerous decoy tanks throughout Kosovo.
Some tanks were quite crudely built of wood and plastic; at close
ranges, they were obvious decoys, but they were effective in
deceiving Allied pilots flying at high altitudes.
To add more realism to their deception, they created heat sources by
burning trash, tires, and cooking oil near the decoy tanks to create an
infrared signature.8
Serbian forces disguised the results of Allied bombing in order to
deny US intelligence an accurate battle damage assessment. Besides
removing and hiding military wreckage, they created dummy bomb
craters on roads and runways to portray damage and thereby avoid
NATO targeting.
Economic
Deception has long been practiced by private persons and companies for
economic gain. Techniques such as Ponzi schemes and deceptive corporate
accounting are daily occurrences. These all are basically fraud and are not
treated in this book; they typically fall in the purview of law enforcement.
Deception about trade also is common, especially in the case of illicit trade
and gray arms trade. Godson and Wirtz note that “global financial markets
and commerce form a new and profitable venue for D&D operations.”9 At
the strategic level, deception is a tool for concealing or exaggerating
economic strength or weakness to gain economic or other advantages.
Deception in currency or commodities markets or products is practiced at the
tactical level, as the following case illustrates.
The 1972 Soviet Wheat Crop Deception
In 1971 the USSR was facing a major wheat crop shortage that had to be
covered by very large imports of wheat. Had that fact been generally
known, it would have caused a major surge in the price of wheat
worldwide—and, of course, would have resulted in the Soviets having to
cover the shortfall at substantially higher prices than the existing market
offered.
In July and August 1972, the USSR purchased 440 million bushels of
wheat for approximately $700 million. At about the same time, the Soviets
negotiated a trade deal under which the United States agreed to provide
them with a credit of $750 million for the purchase of grain over a three-
year period.10
The primary risk to the deception involved agricultural attachés from the
US Department of Agriculture’s Foreign Agricultural Service. These
attachés had the job of monitoring crop developments worldwide and
routinely reported on expected Soviet crop yields. Their estimates at the
time were based on visual observations—requiring frequent visits to crop-
producing areas. This part of the deception involved the Soviets steering
their visitors to a few selected fields that were producing high yields—
ensuring that the attachés had no chance to view other fields en route.11
The US attachés suspected that they were being deceived but couldn’t
prove it.
During the Kosovo conflict, the Serbs brought in news reporters to show
them the results of a NATO airstrike that featured many bodies. A
blood-stained baby doll was prominent in one photo and garnered such
international attention that the same doll subsequently appeared in
staged photographs of the results of other alleged air attacks.12
During Operation Desert Storm, the Iraqis sheared off the dome of the
al-Basrah mosque, then brought foreign news media to the location and
falsely accused the coalition and the United States of destroying
religious shrines. The deception was crude—no bomb could cleanly cut
the top off of a mosque and leave the surrounding building undamaged,
and the nearest bomb crater was some distance from the mosque. But
many people accepted the incident as true.
During 2015, a widespread belief developed among Iraqis that the
United States was secretly supporting Daesh. Iraqi fighters reported
viewing videos of US helicopters delivering supplies and weapons to
Daesh. The widely believed (among Iraqis) rationale for such an
unlikely collusion was a US goal to regain control over Iraq and its oil
reserves.13 Although the source of the belief (and the videos) could not
be established, a PSYOP operated by Iran or its Iraqi supporters is a
possibility.
Unlike most deceptions, the end objective of social deception may not be to
cause specific actions by the target population, but rather to shape attitudes.
Psychological operations are most commonly associated with these
deceptions. But one social deception during WWII also was aimed at
encouraging specific actions, as the next case—which we will return to later
on—illustrates.
Black Boomerang
The British Secret Service set up a novel and very successful “black”
psychological operation14 that made use of BBC monitoring of German
radio broadcasts during World War II. A black radio station in the United
Kingdom was staffed with native German speakers, including anti-Nazi
prisoner of war (POW) volunteers. Purporting to be a German Wehrmacht
station called Atlantiksender, it broadcast news for the troops. Interspersed
with actual news items were tidbits of distortions and gossip designed to
drive a wedge between the Wehrmacht and the Nazi party. The full story is
told in Sefton Delmer’s book, Black Boomerang.15
The stories in the news broadcasts typically hit themes such as the
inequality of sacrifice between the common soldier and the “privileged”
Nazi party functionaries and, therefore, were intended to promote no more
serious actions than malingering and desertions. But some broadcasts were
carefully tailored to encourage sabotage. In “news items,” the broadcasters
criticized U-boat crew members who had, due to a “selfish desire for self-
preservation,” delayed the departure of their submarines by conducting
unattributable acts of sabotage. The unattributable acts were, of course,
carefully described in the broadcast.
There are a number of cases, though, where the information channel (the
intelligence organization) is the target. These usually fall into the realm of
counterintelligence. The Soviet and later the Russian government conducted
an elaborate series of deceptions between 1985 and 2001 to protect two of its
most valuable intelligence assets—Aldrich Ames and Robert Hanssen. Those
cases are discussed later, in Chapter 6. Some of the earliest and most
effective deceptive information operations were set up to protect the Ultra
collection program.
The Ultra Deception
During World War II, British cryptanalysts (with help from the Polish
Secret Service) managed to break the German Enigma encryption system
—a story that has been told in books (The Ultra Secret and Codebreakers:
The Inside Story of Bletchley Park) and a 2014 film (The Imitation Game).
This breakthrough allowed the British to read sizeable portions of the
German military’s top-secret messages to its army and navy commanders
in the field. Ultra subsequently became a major factor in Allied military
successes during the war. We’ll revisit the Ultra case in later chapters,
because it illustrates several features of a successful deception.
Soon after they successfully broke the encryption, the British recognized
the dangers of aggressively using the intelligence they were getting. The
messages revealed, for example, German U-boat deployments to intercept
convoys and the locations of Axis supply ships bound for North Africa. If
the British were to warn the convoys, or intercept the U-boats and supply
ships, Germany would likely suspect a communications compromise.
Every one of those ships before it was attacked and sunk had to be
sighted by a British aeroplane or submarine which had been put in a
position in which it would sight it without it [the ship] knowing that it
[the British aircraft or submarine] had been put in that position, and
had made a sighting signal which the Germans and the Italians had
intercepted. That was the standard procedure. As a consequence of
that the Germans and the Italians assumed that we had 400
submarines whereas we had 25. And they assumed that we had a huge
reconnaissance Air Force on Malta, whereas we had three
aeroplanes!17
In some cases, this standard protocol wasn’t feasible, and some additional
deception was necessary. During 1941, a convoy of five ships left Naples
carrying supplies that were critically needed by Axis forces in North
Africa. Normally a reconnaissance mission would be sent out to observe
the ships, and strike aircraft would take off after the sighting was reported.
In this case, the ships would arrive at their North African port before the
strike aircraft could catch them. British prime minister Winston Churchill
ordered an attack based solely on Ultra intelligence. Fearing that the
Germans would suspect that their communications had been compromised,
the British conducted a special deception operation. They sent a radio
message congratulating a nonexistent spy in Naples for warning them of
the ships’ departure. The message was sent using a cipher system that the
Germans could decode, and it apparently succeeded.18
Sir Harry Hinsley notes that the problem in the North Atlantic required additional
deception approaches:
Similar precautions were taken in the Atlantic, but there the problem
was different. That is why the Germans got most suspicious about the
Atlantic. The great feature there was that the Enigma was used in the
first instance not to fight the U-Boats but to evade them. And the
problem was how could you evade them without their noticing. You
have a situation … in which the number of U-Boats at sea in the
Atlantic is going up, and the number of convoys they see is going
down!19
How do you cover that? We did cover it but it was done by a different
system from what I have just described in the Mediterranean. We let
captured Germans, people we had captured from U-Boats write home
from prison camp and we instructed our people when interrogated by
Germans—our pilots for example—to propagate the view that we had
absolutely miraculous radar which could detect a U-Boat even if it
was submerged from hundreds of miles. And the Germans believed
it.20
This particular technique—a deception that relies on some novel source of
intelligence (usually based on a technology breakthrough)—is particularly
effective. Intelligence services worldwide are especially vulnerable to
paranoia about an opponent’s “novel” sources.
Infrastructure
An outcome scenario can sometimes include a country’s infrastructure. Such
was the case in the covert action associated with the Farewell Dossier.
The Farewell Dossier
Covert operations usually depend on deception for success, as the Farewell
operation shows. In 1980 the French internal security service Direction de
la Surveillance du Territoire (DST) recruited a KGB lieutenant colonel,
Vladimir I. Vetrov, codenamed Farewell. Vetrov gave the French some
4,000 documents, detailing an extensive KGB effort to clandestinely
acquire technical know-how from the West, primarily from the United
States. In 1981 French president François Mitterrand shared the source and
the documents (which DST named the Farewell Dossier) with US
president Ronald Reagan.
In many ways, the Farewell operation was the perfect covert operation.
Even its subsequent exposure did not reduce the effectiveness of the
operation, since the exposure called into question all of the successful
KGB technology acquisitions and discredited the KGB’s technology
collection effort within the Soviet Union.22 Whether or not intended, the
operation damaged KGB credibility and its effectiveness. The operation
would not have been possible without the detailed knowledge that Vetrov
provided about the KGB’s intelligence channels. It allowed the United
States to create detailed models of the KGB targets; the nature of the KGB
operations; and the linkages—that is, the use of other Warsaw Pact country
intelligence services in the technology acquisition effort.
Notes
1. CIA, A Tradecraft Primer: Structured Analytic Techniques for Improving
Intelligence Analysis (Washington, DC: Author, 2009), 34.
7. Peter Martin, “The Sky’s the Limit: A Critical Analysis of the Kosovo
Airwar,” November 1, 1999,
http://www.infowar.com/iwftp/cloaks/99/CD_1999-214.txt/.
8. Ibid.; see also Steven Lee Myers, “Damage to Serb Military Less Than
Expected,” New York Times, June 28, 1999,
http://www.nytimes.com/library/world/europe/062899kosovo-
bombdamage.html, and Tim Ripley, “Kosovo: A Bomb Damage
Assessment,” Jane’s Intelligence Review (September 1999): 10–13.
9. Roy Godson and James J. Wirtz, “Strategic Denial and Deception,”
International Journal of Intelligence and CounterIntelligence 13 (2000): 430.
11. Bob Porter, Have Clearance, Will Travel (Bloomington, IN: iUniverse,
2008), 34.
13. Liz Sly, “Iraqis Think the U.S. Is in Cahoots with the Islamic State, and It
Is Hurting the War,” Washington Post, December 1, 2015,
https://www.washingtonpost.com/world/middle_east/iraqis-think-the-us-is-
in-cahoots-with-isis-and-it-is-hurting-the-war/2015/12/01/d00968ec-9243-
11e5-befa-99ceebcbb272_story.html.
14. Black PSYOPS messages purport to come from a source other than the
actual one.
15. Sefton Delmer, Black Boomerang (New York: Viking Press, January
1962).
16. F. W. Winterbotham, The Ultra Secret (New York: Dell, 1974), 133.
17. Sir Harry Hinsley, “The Influence of ULTRA in the Second World War,”
lecture at Cambridge University, October 19, 1993,
http://www.cix.co.uk/~klockstone/hinsley.htm.
18. Mark Simmons, The Battle of Matapan 1941: The Trafalgar of the
Mediterranean (Stroud, Gloucestershire, UK: The History Press, 2011), 31.
20. Ibid.
21. Thomas C. Reed, At the Abyss: An Insider’s History of the Cold War
(Novato, CA: Presidio Press, 2004).
22. Gus W. Weiss, “The Farewell Dossier,” CIA: Studies in Intelligence 39,
no. 5 (1996), http://www.cia.gov/csi/studies/96unclass.
4 The Target
Once the desired outcome scenario has been established, attention is focused
on the target(s) of deception. For most outcome scenarios, the objective is for
the target (individual or group) to make decisions and take actions that lead to
the chosen scenario. The decision and action steps are discussed together in
this chapter, with a focus on the decision step; once the target makes a
decision, then presumably the action will follow. The two steps are closely
interrelated. Recognize, though, that they are still separate steps.
Individual Decision Modeling1
Deception usually depends on assessing the likely behavior of political,
military, or nonstate organization leaders, specifically to determine what
decisions they tend to make under given conditions. Many factors have to be
taken into account in making this assessment, beginning with the target's
background and decision-making process.
The purpose of decision analysis is always predictive: How will the target
react to a given situation? Identifying the key decision makers and assessing
their backgrounds and psychological profiles are critical components. The end
state is to know what the target believes, what he wishes to believe, and what
he is prepared to believe.
Rational Models
Targets of deception rarely are rational, in the sense that your own culture
defines rationality. That is especially true of terrorist and criminal group
leadership, but it is also true of many national leaders. And even leaders
regarded as rational can have “hot buttons”—issues or situations where the
leader is emotionally involved and departs from a rational path, as we’ll
discuss later in this chapter.
We don’t encounter many truly rational actors on the world stage, which is a
good thing when you’re trying to pull off a deception because they can be
difficult to deceive. Even when the mechanics of a deception succeed, as
happened in the following case, there may be a number of rational responses
to the deception, and the decision maker may not choose the expected one.
The Ethiopian Campaign Deception
In 1940 the British were preparing to reclaim British Somaliland—which
the Italians had taken earlier that year—and to expel the Italians from
Ethiopia. The British operations plan was to attack from the North, out of
the Sudan. The British deception plan was to make the Italians expect the
attack to come from the East, across the Red Sea into British Somaliland.
The deception involved extensive dummy message traffic and military
activity in Aden, indicating a large force assembling at the notional source
of the invasion. The troops preparing for the invasion were issued maps
and pamphlets about Somaliland. The Japanese consul in Port Said,
Egypt, received false information about an invasion originating in Aden.6
Prince Amedeo, third duke of Aosta, was educated at Eton College and
Oxford University in England. He was, in some ways, more British than
the British—in behavior, mannerisms, and sports such as fox hunting and
polo. When World War I broke out, he was offered a commission in the
Italian Royal Army. Recognizing that the commission was offered only
because of his connections with the royal house of Italy, he refused. He
instead enlisted in an artillery unit. Amedeo served with distinction in the
ensuing conflict and ended the war as a major. In 1932 he became a pilot
in the Italian Air Force. As squadron commander, he led a group of Italian
airmen who helped defeat Libyan opponents in the conflict there.
The British deception plan apparently conveyed the image of a large force
positioned to invade from Aden. The duke concluded that the British force
would be too strong to resist. He instead ordered his troops to fall back
from the coastal cities to the more defensible mountainous areas. He
believed that his best chance for victory would be to wear down his
attackers on defense, and then to deliver a counterattack once the British
had impaled themselves on his lines.8
Could the British have reasonably foreseen the duke’s decision? The
profile that the British should have had was that of a competent and
experienced military officer with an excellent understanding of the British
military. Furthermore, the duke comes across as a noble and
compassionate leader who was intimately familiar with the Ethiopian
terrain and who placed a high priority on protecting his troops—in sum, a
rational man. His experiences in World War I appear to have convinced
him of the advantages that come with defending. Perhaps the British
should have put more time into studying their deception target and at least
have considered the alternative outcome scenarios in planning the
deception.
Administrative Models
The utility theory (rational) approach is useful in decision estimates, but it
must be used with caution. Why? Because people won’t always make the
effort to find the optimum action to solve a problem. The complexity of any
realistic decision problem dissuades them. Instead, people tend to select a
number of possible outcomes that would be “good enough.” They then
choose a strategy or an action that is likely to achieve one of the good-enough
outcomes.9
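The contrast between the two models—utility maximizing versus what Herbert Simon called satisficing—can be made concrete in a few lines. The following is a minimal Python sketch; the options and utility values are purely illustrative, not drawn from any real assessment:

    import random

    # Hypothetical courses of action with estimated utilities on a 0-1 scale.
    options = {
        "defend coastal cities": 0.55,
        "withdraw to mountains": 0.80,
        "immediate counterattack": 0.40,
        "negotiate": 0.35,
    }

    def rational_choice(options):
        """Utility-theory model: always pick the single optimum action."""
        return max(options, key=options.get)

    def administrative_choice(options, good_enough=0.5):
        """Satisficing model: pick any action whose utility clears a
        'good enough' threshold, not necessarily the best one."""
        acceptable = [o for o, u in options.items() if u >= good_enough]
        return random.choice(acceptable) if acceptable else rational_choice(options)

    print(rational_choice(options))        # always "withdraw to mountains"
    print(administrative_choice(options))  # any good-enough action

For the deception planner, the practical difference is that a satisficing target need not be steered toward a single best option—only toward a deception-friendly option that clears the target's threshold.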
There are limits, though, on how well we can simulate (and therefore predict)
decisions based solely on either the rational or the administrative model. To
improve decision estimates, we have to include emotional and cultural
factors. These are the factors that often cause a target’s decision, or the action
of a group or population, to be labeled “irrational.”
Emotional Models
A third aspect of the decision process to consider when analyzing behavior is
the emotional. While we discuss it last as a factor in individual decision
making, it often dominates as a factor. Pride, envy, and revenge are all
emotional motivators. All types of targets, from national leaders and military
generals to business professionals, make some decisions simply because they
want to pay back an opponent for past wrongs. (Consider again the reason
that Gondorff places his massive bet in The Sting.) The emotional aspect of
behavioral prediction cannot be ignored, and personality profiling is one way
to grasp it.
Collective Decision Modeling
In spite of the complexity involved, some analytic tools and techniques are
available to estimate the likely outcome of group decision-making. These tools and
techniques are based on the theories of social choice expounded by the
Marquis de Condorcet, an eighteenth-century mathematician. He suggested
that the prevailing alternative should be the one preferred by a majority over
each of the other choices in an exhaustive series of pairwise comparisons.
Another technique is to start by drawing an influence diagram that shows the
persons involved in the collective decision. Collective decisions tend to have
more of the rational elements and less of the emotional. But unless the
decision participants come from different cultures, the end decision will be no
less cultural in nature. In assessing collective decision-making, the Hofstede
individualism index, discussed later in this chapter, is an important
consideration.
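The Condorcet approach lends itself to a simple mechanical check. Below is a minimal sketch, with hypothetical ballots, that finds the alternative preferred by a majority over each of the other choices in an exhaustive series of pairwise comparisons—if such a winner exists:

    def condorcet_winner(ballots, candidates):
        """Return the candidate that beats every other candidate in
        pairwise majority comparisons, or None if no such winner exists.
        Each ballot ranks candidates from most to least preferred."""
        for a in candidates:
            if all(
                sum(b.index(a) < b.index(c) for b in ballots) > len(ballots) / 2
                for c in candidates if c != a
            ):
                return a
        return None

    # Three decision makers ranking three courses of action (hypothetical).
    ballots = [
        ["attack", "defend", "negotiate"],
        ["defend", "attack", "negotiate"],
        ["defend", "negotiate", "attack"],
    ]
    print(condorcet_winner(ballots, ["attack", "defend", "negotiate"]))  # defend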
Game Theory
One branch of the operations research discipline, known as game theory, is a
powerful tool for decision modeling of both individual and group decisions.
Game theory, in brief, is about analyzing the decision processes of two or
more parties (referred to as the “players”) in conflict or cooperation. It can be
applied as a thought exercise or, because it makes use of mathematical
models, by using simulations. It is, in a sense, a mathematical model of a
sequence of OODA loops.
A worthy opponent will apply game theory as well, and intelligence has a
role in identifying the opponent’s deceptive moves. Edieal J. Pinker, a Yale
University professor of operations research, has applied game theory to the
2015 P5+1-Iranian negotiations and developed a hypothesis that the Iranians
are carrying out a long-term deception. According to Pinker,
Therefore your best choice is the slower choice: First, you declare that
you are enriching uranium solely for peaceful purposes, like generating
energy and providing nuclear materials for treating cancer patients.
Second, you refrain from weaponizing the uranium until the very last
moment possible. Since your enemies have already shown that they are
reluctant to attack, if you don’t step across their threshold, you can
continue your nuclear program. Once you are ready, you will need to
make a mad rush to complete the final steps toward a weapon before the
United States and Israel react.18
The Pinker hypothesis illustrates the value of game theory in assessing the
motivation behind an opponent’s actions. Interestingly, Bueno de Mesquita in
2008 ran a game theory simulation that came to a different conclusion; it
predicted Iran would reach the brink of developing a nuclear weapon and
then stop as moderate elements came to power.19 The part about moderates
coming to power did occur when Hassan Rouhani assumed the presidency in
2013. But whether Pinker’s hypothesis or Bueno de Mesquita’s simulation
more accurately describes the Iranian situation remained an open question in
2018.
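The mechanics are easiest to see at a small scale. The sketch below defines a two-player game with hypothetical payoffs—an attacker choosing where to strike, a defender choosing where to defend—and searches for pure-strategy Nash equilibria. The search comes up empty, which is the structural feature that rewards deception: when no stable pure strategy exists, each player profits from concealing or misrepresenting its intentions.

    import itertools

    # Hypothetical payoffs: payoffs[(attacker_move, defender_move)] = (attacker, defender)
    payoffs = {
        ("flank west", "defend south"):   (4, 0),
        ("flank west", "defend west"):    (1, 3),
        ("attack south", "defend south"): (0, 4),
        ("attack south", "defend west"):  (2, 1),
    }
    attacker_moves = ["flank west", "attack south"]
    defender_moves = ["defend south", "defend west"]

    def pure_nash(payoffs):
        """Return the (attacker, defender) move pairs from which neither
        side gains by unilaterally switching."""
        equilibria = []
        for a, d in itertools.product(attacker_moves, defender_moves):
            ua, ud = payoffs[(a, d)]
            best_a = all(payoffs[(a2, d)][0] <= ua for a2 in attacker_moves)
            best_d = all(payoffs[(a, d2)][1] <= ud for d2 in defender_moves)
            if best_a and best_d:
                equilibria.append((a, d))
        return equilibria

    print(pure_nash(payoffs))  # [] -- no pure-strategy equilibrium in this game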
The Operation Desert Storm Deception
A frontal assault from Saudi Arabia into Kuwait would meet strong
resistance from Iraqis in prepared defensive positions, and the coalition
could expect heavy casualties. The more attractive option was a flanking
maneuver into Iraq from the west to encircle and cut off the Iraqi forces in
Kuwait—a tactic that later became known as the “left hook” or “Hail
Mary” maneuver. From the coalition point of view, this was an obvious
tactic. It would work; but if the Iraqis saw it coming, they would
undoubtedly deploy forces to counter it.
The coalition objective was for the Iraqis to believe a scenario in which the
ground attack would come from the South, along with an attack from the
Persian Gulf into Kuwait. If successful, this would divert Iraqi attention
away from a flanking maneuver through the desert.
Media channels were used to stage a diversion that had two major
components.
The deception was helped by Iraqi perceptions about the danger of desert
operations. They believed that any coalition force moving through the
desert would become lost and pose no threat. The Iraqis had little
operational experience in the desert, and their limited experiences there
had convinced them that desert travel was hazardous. They somehow
failed to appreciate how much the global positioning system (GPS) would
affect desert operations.21
As with any major deception, there were a few flaws—aircraft and vehicle
operations on the left flank that could have caught Iraqi attention—but
these either weren’t observed or weren’t acted upon.22 The deception was
a success in that the Iraqis were caught by surprise when the coalition
forces in the left hook maneuver swept into Iraq, cutting off supply
lines and retreat for many Republican Guard troops.
As in the Ethiopian campaign example, the Iraqi reaction could at least
have been foreseen as a possibility by looking at the situation through
the viewpoint of the Iraqi commanders. Iraq's recent experience in the
Iran-Iraq war had taught its commanders to emphasize defensive operations.
And Iraqi concerns about a flanking maneuver should not have been
difficult to foresee, given their assessment of the opposing deployments.
Before Japan attacked Pearl Harbor, both the United States and Japan made
exceptionally poor predictions about the other’s decisions. Both sides
indulged in mirror imaging—that is, they acted as though the opponent would
use a rational decision-making process as they defined rational.
The Japanese also knew that a long-term war with the United States was not
winnable because of the countries’ disparities in industrial capacity. But
Japan thought that a knockout blow at Pearl Harbor would encourage the
United States to seek a negotiated settlement in the Pacific and East Asia.25
To validate this assumption, the Japanese drew on their past experience—a
similar surprise attack on the Russian fleet at Port Arthur in 1904 had
eventually resulted in the Japanese obtaining a favorable negotiated
settlement. The Japanese mirror-imaged the United States not against
themselves, but against the Russians of 1904 and 1905: Japan believed that the
US government would behave much as the tsarist government had.
Power Distance
Latin America, Africa, the Arab world, and Asia have high power distance
index scores. Countries with long-established democratic traditions, not
surprisingly, have low scores on this index.29
Uncertainty Avoidance
The uncertainty avoidance index defines a society’s tolerance for ambiguity
and situations in which outcomes and conditions are unknown or
unpredictable. The index measures the extent to which people either accept or
avoid the unexpected or unknown. Societies that score high on uncertainty
avoidance usually have guidelines and laws that define the boundaries of
acceptable behavior. They tend to believe in absolute truth—that a single
truth dictates behavior and that people understand what it is. A low score on
this index indicates general acceptance of differing thoughts and ideas.
Societies with low scores tend to impose fewer restrictions on behavior, and
ambiguity is more generally accepted in daily life. The differences between
the two types of societies are summarized in Table 4-3.
Latin America, much of Europe, Russia, the Arab world, and Japan all score
high in uncertainty avoidance. English-speaking, Scandinavian, and Chinese
cultures tend to be more accepting of uncertainty. Few countries, though,
score low on the uncertainty avoidance index.31 How this cultural index
might apply in planning deception is illustrated by the British experience with
failed deceptions in World War II.
Deception Campaigns against the Japanese in
World War II
Thaddeus Holt, in his book about Allied deception operations during
World War II, has documented the difficulties that the British had in
conducting deception against Japan. The British, from their experience in
running deceptions against the Germans, had learned to use subtlety in
providing information. A set of clues that was too obvious would be
suspect. In dealing with the Japanese, the British repeatedly presented a
subtle set of indicators to support deceptions only to see it apparently
disappear into Japan’s intelligence services without any response. Only
when information was provided that pointed to an obvious conclusion did
the Japanese respond. German intelligence would have rejected as a
deception a level of information that pointed to an obvious conclusion.
This result is all the more puzzling because the Japanese military routinely
integrated deceptive operations into their own campaigns. Holt concluded
that the Japanese intelligence service was incompetent.32
Masculinity
Masculinity index scores are extremely low in Scandinavia. They are high in
China, Japan, and central Europe and relatively high in English-speaking
countries. Despite popular perceptions of masculine dominance, the Arab
countries score in the mid-range of the masculinity index.34
Long-Term Orientation
High long-term orientation scores are typically found in East Asia, with
China having the highest in the world and Japan not far behind. Moderate
scores characterize Eastern and Western Europe. A short-term orientation
characterizes the Anglo countries, the Muslim world, Africa, and Latin
America.35
Indulgence
Indulgence scores are highest in Latin America, parts of Africa, the Anglo
world, and Nordic Europe; restraint is mostly found in East Asia, Eastern
Europe, and the Muslim world.
A final note about the Hofstede indices: The averages of a country do not
automatically define the behavior or decisions of individuals of that country.
Most countries have existed as such for less than a century, and many
comprise multiple ethnicities and religious groups. Even though these models
have quite often proven correct when applied to the dominant ethnic or
religious group, not all individuals—or even regions with their own
subcultures—fit the national mold. When planning deception, the indices
should be used as a guide to understanding cultural differences between
groups, not as laws set in stone. As always, there are exceptions to the
general rule.
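One practical use of the indices, then, is simply to flag the dimensions on which the planner's culture and the target's culture diverge most—the places where mirror imaging is most likely to mislead. A minimal sketch with illustrative, not published, scores:

    # Hypothetical Hofstede-style profiles on a 0-100 scale.
    planner = {"power_distance": 40, "uncertainty_avoidance": 46,
               "individualism": 91, "masculinity": 62, "long_term": 26}
    target  = {"power_distance": 80, "uncertainty_avoidance": 85,
               "individualism": 25, "masculinity": 50, "long_term": 30}

    def largest_gaps(a, b, n=3):
        """Rank cultural dimensions by the planner-target gap; the largest
        gaps mark where mirror imaging is most dangerous."""
        gaps = {k: abs(a[k] - b[k]) for k in a}
        return sorted(gaps.items(), key=lambda kv: -kv[1])[:n]

    print(largest_gaps(planner, target))
    # [('individualism', 66), ('power_distance', 40), ('uncertainty_avoidance', 39)]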
Notes
1. The material in this section was taken from Robert M. Clark, Intelligence
Analysis: A Target-Centric Approach, 5th ed. (Washington, DC: CQ Press, 2016), and modified.
8. Ibid.
16. Clive Thompson, “Can Game Theory Predict When Iran Will Get the
Bomb?” New York Times, August 12, 2009,
http://www.nytimes.com/2009/08/16/magazine/16Bruce-t.html?_r=0.
18. Edieal J. Pinker, “What Can Game Theory Tell Us about Iran’s Nuclear
Intentions?” in Yale Insight (Yale School of Management), March 2015,
http://insights.som.yale.edu/insights/what-can-game-theory-tell-us-about-
iran-s-nuclear-intentions.
19. Thompson, “Can Game Theory Predict When Iran Will Get the Bomb?”
20. James F. Dunnigan and Albert A. Nofi, Victory and Deceit: Dirty Tricks
at War (New York: William Morrow, 1995), 322.
22. Major Henry S. Larsen, “Operational Deception: U.S. Joint Doctrine and
the Persian Gulf War,” US Army Command and General Staff College, Ft.
Leavenworth, Kansas, May 26, 1995, 18.
26. Dino Brugioni, Eyeball to Eyeball: The Inside Story of the Cuban Missile
Crisis (New York: Random House, 1990), 250.
32. Holt, The Deceivers: Allied Military Deception in the Second World War,
289.
5 The Story
After thinking through the desired outcome scenario and determining the
opponent’s decisions and actions that would help create that outcome, the
next step is to craft a separate scenario, often called the “story” in deception
textbooks. The deception story is the picture that the target must believe in
order to make the desired decisions that lead to the desired actions and, thus,
the desired outcome.
Successful script writers, novelists, and news reporters all know how to tell a
story. Some think that it’s a lost (or perhaps undiscovered) art in intelligence
writing. Several years ago, American movie director Steven Spielberg
reportedly told a group of senior US intelligence community officials that
they needed to learn to tell a story.
A story having these characteristics engages young children and makes them
want to hear more. The first five characteristics include the elements that
children expect to see in an adventure story. The last one is the one that
almost everyone wants to see. It reinforces the belief that good should
triumph in the end.
The point is that whether a story is judged good is decided entirely by
the audience. Like a good adventure story for an eight-year-old, a good
deception narrative should have characteristics that fit with the deception
target’s expectations and desires.
Characteristics of a Good Deception Story
A well-known literary theory called “reader response” focuses attention not
on what the author meant when she wrote the story, but instead on what the
reader thinks and feels as he reads it. The theory asserts that the same
narrative is responded to differently, based on the unique experiences of each
reader. Put another way, it doesn’t matter what the author was trying to
convey. The person reading the story interprets the elements of it based on
his personal world view. This is exactly how a deception planner must think
when creating the story. The target has to be the focus. Godson and Wirtz
confirm this in their description of what makes for a good deception
narrative.
In short, neither a generic story nor one that reflects the deception planner’s
culture or characteristics will succeed. Instead, the deception planner must
contemplate what circumstances and set of “facts” would lead the target to
take the desired action. Those circumstances and “facts” will combine to
create the story. Returning to the basic principles outlined in Chapter 1, the
deception planner must ask: What combination of truth, denial, deceit, and
misrepresentation is needed to produce a successful story for this particular
opponent?
But to answer those questions, the deception planner first needs to understand
the factors that are important in the opponent’s decision-making process and
what his or her state of mind is likely to be at the point of decision-making.
For example, when dealing with a specific type of target—such as a decision
maker who is prone to paranoia—a good start would be to develop a story
that includes a conspiracy against that target. Gaining this understanding
usually requires some decision modeling, as discussed in Chapter 4. It is also
profitable to know the target’s preferences in receiving information and
intelligence. A story that finds its way to the opponent from a trusted source
can lead to a false sense of security, a topic returned to in the next chapter.
Taking all of these elements into account, the earlier Godson and Wirtz
description identifies two characteristics that make for a good story in
deception. To have any chance for success, the story should have one,
preferably both, of these characteristics:
Plausibility. What story would fit with the target’s beliefs and
expectancies? What would convince the target that the deception story is
the real one?
Reinforcement. What story would reinforce the target’s desires? What
would reinforce the target’s fears?
Plausibility
A story must be plausible to the opponent—which is not the same thing as
what the deception planner would find plausible. A story that the deceiver
sees as flawed might be highly acceptable to the target. Such was the case in
the Operation Desert Storm deception, described in Chapter 4. A flanking
attack was, in the eyes of the coalition leaders, obvious. To the Iraqis, it was
foolhardy because of the dangers they perceived in moving through
uncharted desert.
The Argo Deception
When the Iranians took over the US embassy in 1979 and seized the
Americans there, six embassy employees with no intelligence background
managed to find shelter in the Canadian embassy. Their presence there was
not acknowledged to the Iranian government, and could not be
acknowledged; the Iranian Revolutionary Guard had already shown its
willingness to disregard the sanctity of embassies. The United States, in
cooperation with the Canadians, had to find a way to exfiltrate the six, and
the plan had to succeed. A failure would put both US hostages and
Canadian embassy personnel at risk and would be politically embarrassing
for both the United States and Canada.
The CIA was called on for a covert operation to rescue the six. It
developed a plan in coordination with senior policymakers in the US and
Canadian administrations. The plan called for an elaborate narrative in
which a “Hollywood film production crew” would travel to Tehran for the
purpose of making a movie that would be appealing to Iran’s revolutionary
government. After investigating possible filming locations, a party of eight
(now including the six) would depart for home.
The deception was successful, but it did not remain a secret for long. The
press caught wind of it within a few days. The Iranians were publicly
embarrassed, and the Canadians got a rousing "thank you" all across the
United States for their role. When Studio Six—the CIA's notional Hollywood
production company—folded several weeks after the rescue, it had received
twenty-six scripts, including some potential moneymakers. One was from
Steven Spielberg.2
The story created for the Argo deception was plausible, being both credible
and appealing to the Iranian government. Contrast it with the alternate story
proposed but rejected in the 2012 movie titled Argo. In the movie, a proposal
is made to send the exfiltration team in as agricultural experts. The story was
to be that they would consult with Iranians about how to improve their
agricultural practices. That story failed the plausibility test: The operation
was scheduled to take place in January. It made no sense to send experts to
assist with Iranian agriculture in the middle of winter.
Again, all deception works within the context of what is true. To be plausible,
the story should be based as much as possible on truth. Even the Argo
operation, farfetched as it was, had true elements—such as an actual script
that could be shown to the Iranians.
Project Azorian
The choice of this story, and of Hughes's company to build it, offered
several advantages. Manganese nodules are golf ball–sized rocks rich in
valuable manganese, nickel, copper, cobalt, and other elements. They are
found in large numbers miles down on the ocean bottom. They are very
difficult to mine because of the difficulties in operating equipment at such
depths. The reasoning for choosing this story was set out in a CIA
memorandum sent to Secretary of State Henry Kissinger.
During July and August 1974 the Glomar Explorer located the submarine
and began recovery operations. Cables were looped around the submarine,
and winches began slowly pulling the submarine up toward the barge.
About halfway up, the hull broke apart and two-thirds of the submarine fell
back and was lost on the bottom. Though the sought-after prizes—code
books and nuclear warheads—were reportedly lost, the remaining one-
third of the submarine was recovered and yielded valuable intelligence.6
The deception target was the Soviet intelligence services (the KGB and
GRU). There has been some controversy about how well the deception
succeeded. Much of the time, the Glomar Explorer’s operations were
monitored by Soviet ships in the immediate area. The truth about Project
Azorian was disclosed in a New York Times article in March 1975. By June
1975 the Soviets certainly were aware of the ship’s covert mission and had
assigned one of their ships to monitor and guard the recovery site. With the
mission exposed, President Ford canceled further recovery operations.7
There have been claims since then alleging that the Soviets had
foreknowledge of the salvage operation. Soviet naval leaders reportedly
believed such a recovery to be impossible and disregarded the intelligence
warnings.8 If true, that again illustrates the importance of plausibility: A
deep sea research and mining story was plausible; a submarine recovery
mission was, from a Soviet perspective, implausible.
Reinforcement
A good cover story should reinforce what the opponent already believes or
wishes to believe. A story that plays to or reinforces an opponent’s fears can
be equally effective. Operation Quicksilver, described in Chapter 2, fit nicely
with the German belief that Pas-de-Calais was the logical site for an Allied
invasion of Europe. And if the opponent’s wish to believe something is
strong enough, it may be possible (though never recommended) to conduct a
deception that does not even have plausibility. One component of the Ultra
deception, described in Chapter 3, projected a story that the British had a
radar that could detect submerged submarines. A radar expert would find
such a story implausible, but the Germans accepted the story because it
played to their fears.
The German Rearmament Deception
The commission was withdrawn in 1927, and its final report concluded
that Germany had not disarmed and had over a seven-year period deceived
the Allied powers about its rearmament.
The tactics applied in this deception operation set a pattern that has been
repeated in recent years in WMD proliferation efforts. They relied heavily
on skillfully managing the plant inspection visits. When warned of a
commission inspection, the Germans would remove aircraft and
components from production hangars and conceal them in the countryside.
One way to check whether the story meets the characteristics described in this
chapter is to create an influence net. Influence net modeling is an intuitive
graphical technique. It is useful in deception planning because it requires its
creator to consider the factors, forces, or influences that are part of the story
and that drive an opponent’s decision both toward and away from the desired
outcome scenario. It forces the deception planner to look at both the
influences projected by the story and at the outside evidence (provided by
other channels) that might support or undermine the story.
Note: In the influence net, the arrows come from boxes that support a
German decision to expect an invasion in the Pas-de-Calais region. The
filled dots come from boxes that do not support that decision.
The latter two factors weighed against that decision (shown as solid circles
terminating the connectors). The positive factors (with arrows terminating the
connectors) show three key parts of the deception plan that were being
considered:
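A minimal computational sketch of such a net follows. The factor labels echo the Pas-de-Calais example; the weights are hypothetical, assigned only to show how influences for and against the desired decision aggregate:

    # Positive weights push the target toward the desired decision
    # ("expect the invasion at Pas-de-Calais"); negative weights push away.
    influences = {
        "notional army order of battle visible in COMINT":  +0.8,
        "double-agent reports pointing to Pas-de-Calais":   +0.7,
        "famous commander named to lead the notional army": +0.5,
        "Allied air activity concentrated over Normandy":   -0.4,
        "shipping assembled in southwest England ports":    -0.6,
    }

    def net_support(influences):
        """Crude aggregate: does the evidence projected across all
        channels, taken together, support the desired decision?"""
        return sum(influences.values())

    score = net_support(influences)
    print(f"net support = {score:+.1f} -> "
          f"{'supports' if score > 0 else 'undermines'} the deception story")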
Notes
5. "Ripping Off the Ocean Bed," New Scientist, August 16, 1973, 388.
6 The Channels
The channels (sensors and sources) used to project a deception are mostly
traditional intelligence channels, which are covered in detail in Chapter 7.
But a number of channels for projecting deception exist that don’t fit in the
traditional intelligence domain, and these have assumed more importance in
recent years. Many of them fall into the cyberspace category, covered in
Chapter 8.
Further, some channels exist that fall into neither the intelligence nor the
cyberspace category. Open source is readily available outside of intelligence
channels, though for convenience we treat it as an “INT” in Chapter 7.
Telephones, postal mail, and emails are used to project deceptions.
Both the hypothesis and the Iranian claims are open to doubt, but they
indicate a new potential channel for projecting deception. The US military
uses a different set of codes than do civilian GPS systems to receive
broadcasts from GPS satellites. The military codes are encrypted. The civilian
GPS systems are not encrypted, and it is possible to deceive these GPS
receivers—not easily, but it has been done. In June 2013 a University of
Texas at Austin research team demonstrated the ability to deceive the GPS
navigation system on a private yacht in the Mediterranean, luring the ship
one kilometer off course.1
With that background on the channels, let’s turn to how deceivers can
understand and make use of them.
Understanding the Channels
To execute any deception, deception planners must have an understanding of
both their own and their adversary’s capabilities for observing and orienting
to the operational environment. Though related, the two steps differ in
character. Managing the observation phase of one’s own channels (sensors
and sources), or of the target’s, is usually relatively straightforward through
the identification of intelligence, surveillance and reconnaissance platforms,
and physical capabilities. However, understanding how an intelligence
organization orients information can be challenging, as it is the product of
that organization’s culture and past experiences.
For example, during World War II, Allied deception operations in the
European theater often succeeded because the Allies were culturally close to
the Germans and Italians. The Allies also had a good understanding of how
German and Italian intelligence services operated. In the Pacific theater,
Allied deception operations were less successful. This lack of success has
been blamed on the disinclination of Pacific US forces to use deception and
on the inadequacy of Japan’s intelligence services, as noted in Chapter 4.2
More likely, the causes were the cultural differences between the Americans
and the Japanese, and the inadequate US knowledge of how Japanese
intelligence actually functioned.
To appreciate how channels are used to deceive an opponent, let’s revisit the
deception, introduced in Chapter 4, that preceded Operation Desert Storm.
The discussion in Chapter 4 illustrated a skillful use of media channels to
portray an invasion of Kuwait from the south across Wadi Al-Batin
accompanied by an amphibious assault from the east.
A first step in planning how to feed information to the target is to map the
sensors in his or her OODA loop (observation) and then characterize those
channels (orientation). Then the planner decides how to exploit the channels.
Let’s go through each step.
Sensor Mapping
Sensor mapping requires the deception planner to model the observation
stage of the adversary’s loop in order to eventually manipulate that
adversary’s OODA loop and decision-making process, as Figure 6-1
indicates. To succeed in deception, your side must commit both collection
and analytic resources to model the target’s collection capabilities. The
purpose is not only to develop a fundamental knowledge as to what types of
sensors the target has available in any given operational environment, but
also to develop an understanding of any timelines or time restrictions related
to the different channels. There is no point making a great effort to deceive a
sensor if it is not in use. As an example, dummy tanks were not deployed in
Operation Quicksilver because, as noted in Chapter 2, the Germans could not
send reconnaissance aircraft over the UK to observe them.
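At its simplest, a sensor map is a table of adversary channels and the windows during which each can observe. The sketch below, using hypothetical sensors and times, poses the planning question in code: which adversary sensors would actually see a deception event staged at a given time?

    from datetime import time

    # Hypothetical sensor map; a real one is built from collection and analysis.
    sensor_map = [
        {"sensor": "imaging satellite pass", "start": time(10, 0), "end": time(10, 12)},
        {"sensor": "enemy roving patrol",    "start": time(1, 0),  "end": time(3, 0)},
        {"sensor": "attache observation route", "start": time(8, 0), "end": time(18, 0)},
    ]

    def observers_at(t):
        """List the adversary sensors able to observe an event at time t.
        There is no point staging an event that no sensor will see."""
        return [s["sensor"] for s in sensor_map if s["start"] <= t <= s["end"]]

    print(observers_at(time(10, 5)))  # satellite pass and attache route
    print(observers_at(time(2, 30)))  # roving patrol only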
Observation
The process of identifying adversarial sensors can draw on a variety of
methods, stretching from in-depth research into what the opponent had
before the conflict to targeted collection and general knowledge. The
indicator could be something simple—a few lines in a patrol leader's report
referring to "enemy roving patrols" at a specific time and place identify
that patrol as an adversarial sensor available for exploitation—or something
more complicated, such as working out the access times of an imaging
satellite over the area of interest or operations. It could also be the social
network surrounding "legal" spies in your country, such as military attachés,
or more clandestine "illegal" HUMINT networks. Identification can also
include understanding in detail how a channel functions in accepting
information to pass through it, as illustrated in the next case—an example
of a counterintelligence deception.
The Document Binder Deception
In November 2010 Igor Sporyshev arrived in New York on assignment as
a trade representative of the Russian Federation. The FBI took an
immediate interest in Sporyshev—his father had served as a major general
in the Russian Federal Security Service (FSB). Service in both the KGB
and its successors—the FSB and Foreign Intelligence Service (SVR)—
tends to run in families. The FBI subsequently set up an elaborate
counterintelligence operation targeting Sporyshev.3
The reports must have looked like a treasure trove to Sporyshev and his
SVR associates. After all, subsequent reporting seems to show that they led
rather boring lives by espionage standards. That at least might explain
what happened next. The Russian made a basic tradecraft mistake: Foreign
material (that has not been under friendly control from the outset) should
never be taken into a secure facility. But as he received each binder,
Sporyshev took it into the SVR’s secure facility in Russia’s UN office in
New York, presumably to be photocopied before it was returned. The
binders all had concealed voice-activated audio devices that recorded
conversations among SVR officers in the facility. Over a period of several
months, as binders went into and out of the SVR facility, their voice
recordings were downloaded and handed over to FBI translators.5 The
several hundred hours of conversations that the binders produced had a
great deal of uninteresting material but probably provided a number of
profitable leads for US counterintelligence. The result of one such lead
became public because it wound up in a US court.
One of those leads pointed to Evgeny Buryakov, an SVR officer operating
under nonofficial cover as a banker in New York. The FBI subsequently
turned its attention to Buryakov and exploited that source over the next
year and a half. They placed audio taps on his
telephone calls and audio and video surveillance sensors in his home, using
them to acquire additional counterintelligence information.7
Orientation
Knowing how adversaries observe their operational environment is a
prerequisite for executing an effective deception plan. However, knowing
how they collect should not translate into certainty on how they understand
what they collect. An adversary’s intelligence service may not conduct
analysis as your side does. So understanding requires assessing how the
adversary orients what is collected—also known as characterizing the
channels, discussed next.
Channel Characterization
The preceding section emphasized the importance of knowing an opponent’s
intelligence collection system (or information channels). But their analysis
capabilities also must be understood—if they have any. Nonstate actors may
depend on leadership to do its own analysis. In fact, successful deception
requires a solid appreciation of the target opponent’s operational and
intelligence cultures. One of the most effective counterintelligence
(specifically, counterespionage) operations in the previous century depended
on an intelligence service’s ability to model the analytic processes and
cultural propensities of an opposing intelligence service, as illustrated in the
next case.
Protecting Ames and Hanssen
In the waning years of the Cold War, the Soviets scored a major
intelligence coup that continued to pay dividends long after the Soviet
Union itself ceased to exist. They acquired the services of two agents who,
because of their positions in the counterintelligence services of the CIA
and FBI, represented the gold standard for “moles” in an opposing
intelligence service: Aldrich Ames in the CIA and Robert Hanssen in the FBI.
As the KGB eliminated the agents whom Ames had compromised, it in some
cases replaced them with double agents who fed false information to their
CIA handlers. The KGB also skillfully applied misdirection to lead CIA and
FBI investigators away from Ames. The misdirection efforts had several
phases.
After Ames was arrested in 1994, the SVR (successor to the KGB)
continued to apply misdirection but now to protect its remaining asset:
Robert Hanssen. Hanssen compromised a number of US intelligence
operations before his arrest in 2001, and investigators at first naturally
tended to blame Ames.12 But some of the compromises could not have
come from Ames, so the hunt began for another mole.
The Hanssen deception was helped, as most deceptions are, by the target’s
mindset. In this case, the FBI found it difficult to believe that one of their
own could be a traitor. They focused the investigation instead on an
innocent CIA employee. Hanssen made a number of mistakes beginning in
1993 that should have resulted in his arrest prior to 2001, but the FBI
apparently failed to follow up on them. Whether or not the SVR
recognized and exploited the FBI mindset is an open question. In any
event, that mindset helped to protect Hanssen for another eight years.
Efforts to understand the command and control (C2) structure of Daesh began
in earnest with the establishment of western military assistance to counter
Daesh in the spring of 2015. Understanding how the self-declared leader of
Daesh and the so-called caliphate as an organization made decisions became
a primary concern for the purposes of disrupting their efforts in Syria and
Iraq. Several media outlets as well as terrorism experts and consultants
proposed organizational diagrams of this new group based on open sources
and analytic methods. The following fictional example is inspired by a Der
Spiegel article, “Secret Files Reveal the Structure of Islamic State,” and
reflects its attempt to model the “black box” of Daesh.13 Intelligence “roll-
ups” are overviews of intelligence reporting collected on a specific topic and
placed on a timeline; they include a short description of the source as well as
a brief summary of relevant intelligence. The roll-up in Table 6-1 provides
fictional sources and information for creating a model of the Daesh “black
box.”
Table 6-1 introduces the source evaluation and rating schema for HUMINT
reporting that is used throughout the exercises in later chapters. The number
(345 in the table) identifies the source. The two characters that follow
indicate the evaluation of the source and the source’s information, as shown
in Table 6-2.
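Table 6-2 is not reproduced here, so the sketch below assumes the widely used NATO-style evaluation schema—source reliability graded A (reliable) through F (cannot be judged), information credibility graded 1 (confirmed) through 6 (cannot be judged)—to show how a rating such as "345 B2" unpacks:

    RELIABILITY = {"A": "reliable", "B": "usually reliable", "C": "fairly reliable",
                   "D": "not usually reliable", "E": "unreliable", "F": "cannot be judged"}
    CREDIBILITY = {"1": "confirmed", "2": "probably true", "3": "possibly true",
                   "4": "doubtful", "5": "improbable", "6": "cannot be judged"}

    def parse_rating(rating):
        """Split a rating like '345 B2' into the source number, the source
        evaluation, and the information evaluation."""
        number, code = rating.split()
        return {"source": number,
                "source_evaluation": RELIABILITY[code[0]],
                "information_evaluation": CREDIBILITY[code[1]]}

    print(parse_rating("345 B2"))
    # {'source': '345', 'source_evaluation': 'usually reliable',
    #  'information_evaluation': 'probably true'}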
Figure 6-2 illustrates the “Black Box” model that results from analysis of the
reporting in Table 6-1. The “R.#” boxes refer to the specific reporting in the
table.
A channel is more than just the sensor. It includes the tasking, processing,
and exploitation phases of intelligence. Because of the flood of imagery, it’s
not uncommon for an image to be taken and processed but never exploited.
More than once, a deception operation has been derailed when,
unaccountably, the opponent either didn’t receive the information that was
provided or didn’t understand or act on what it did receive. Chapter 4
discussed the Allied failures during World War II in attempts to deceive the
Japanese. There, the problem was a failure to bridge a cultural divide. Often,
however, the problem is the length of the opponent’s communications chain.
It’s a basic rule of communications that, the more nodes between transmitter
and receiver, the more likely a message is to be distorted or lost. The nodes
(here, human) can misinterpret, lose, or falsify for their own purposes a
message before transmitting it to the next node.
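The effect of chain length is easy to quantify under a simplifying assumption. If each node independently relays the message faithfully with probability p, the chance that it arrives intact after passing n nodes is p raised to the power n:

    def intact_probability(p, n):
        """Probability a message survives n intermediate nodes unaltered,
        assuming each node relays faithfully and independently with
        probability p (real nodes fail in correlated, self-interested ways)."""
        return p ** n

    for n in (1, 3, 6):
        print(n, round(intact_probability(0.9, n), 2))
    # 1 0.9 / 3 0.73 / 6 0.53: even quite reliable nodes erode a message
    # quickly, which is why short, direct channels are preferred.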
An example of what can go wrong in the intelligence channel was the Soviet
decision to invade Afghanistan in 1979. The invasion was intended to support
the existing Afghan government, which was dealing with an open rebellion.
The intervention was based largely on flawed intelligence provided by KGB
chairman Yuri Andropov. Andropov controlled the flow of information to
General Secretary Brezhnev, who was partially incapacitated and ill for most
of 1979. KGB reports from Afghanistan, passed on to Brezhnev, created a
picture of urgency and strongly emphasized the possibility that Prime
Minister Hafizullah Amin had links to the CIA and to US subversive
activities in the region.14
Most targets of deception have more than one channel available to them,
though. They also have past experiences and predispositions to draw upon in
making a decision. The ideal is for information provided in different channels
to be congruent both with each other and with the target’s experiences and
predispositions. That is, they should all reinforce each other to lead the
opponent to the desired decision. To the extent that the information provided
is incongruent—that is, it conflicts with information in other channels or with
the target’s thinking—it will be rejected.
Then, on the evening before the invasion, the second phase began, in an
effort to convince Hitler that Normandy was a diversion.
Notes
4. Ibid.
7. Ibid.
8. Ibid.
11. B. Fischer, “Spy Dust and Ghost Surveillance: How the KGB Spooked
the CIA and Hid Aldrich Ames in Plain Sight,” International Journal of
Intelligence and CounterIntelligence 24, no. 2 (2011): 284.
13. Christoph Reuter, “Secret Files Reveal the Structure of Islamic State,”
Der Spiegel, April 18, 2015,
http://www.spiegel.de/international/world/islamic-state-files-show-structure-
of-islamist-terror-group-a-1029274.html.
7 Traditional Intelligence Channels
The deception planner's own intelligence channels have two distinct roles in
deception. First, they are used to provide intelligence that supports the
deception operation in several ways.
Open source has long been a means for planting deceptive information, and
that role has increased now that OSINT is so prevalent on the web. News
reporting via radio, TV, newspapers, and online naturally provides many
opportunities to communicate deceptive information. These channels were
used extensively to support the deception prior to Operation Desert Storm, as
noted in Chapter 4. It’s worth restating the challenge in using these channels,
as was illustrated in that deception: one must avoid misleading one's own
people and allies. The US government, for example, as a matter of policy
does not lie to the press.
The GRAB Cover Story
The success of the war plan depended critically on the ability of attacking
aircraft to overcome Soviet air defenses. That meant avoiding or
neutralizing Soviet air defense radars. And that, in turn, required that
SAC’s planners create a map of the locations and types of the radars for
the entire Soviet Union.
For several years, SAC had been able to locate and identify the radars
around the northern periphery of the USSR. Air Force and naval electronic
reconnaissance aircraft, known as ferrets, flew from bases in the United
Kingdom and Alaska on routes close to the Soviet border and used
sensitive receivers to intercept and locate the radars. But radars in the
interior could not be detected by aircraft without overflying the Soviet
landmass.
For that mission, a satellite was needed, and in 1958 the Naval Research
Laboratory (NRL) proposed to build one: an electronic intelligence
(ELINT) satellite in low earth orbit that could intercept radar emissions
and geolocate radars anywhere in the world. The problem was that, if the
Soviets knew that a satellite was conducting ELINT missions, they could
practice denial—shutting down critical parts of their air defense radars
when the satellite was in view. This was already a common Soviet practice
to defeat ferret aircraft flying near the USSR’s borders. A simple
unacknowledged mission launch was not an option; it would attract too
much attention, and the Soviets might determine the satellite’s real
mission.
The solution was to create a cover story that would not attract undue
interest. NRL proposed to put two payloads on the satellite: a highly
classified ELINT receiver, and a publicly acknowledged research payload.
The result was the GRAB (Galactic Radiation and Background)
experiment. The first GRAB satellite was launched on June 22, 1960; a
second GRAB went into orbit on June 29, 1961, and operated until August
1962. GRAB transmitted its ELINT intercepts to a network of receivers at
ground sites around the world. It also provided measurements of solar
radiation (the SolRad experiment) that were publicly released in open
source channels to support the deception.
Black Boomerang
During the war, the British government published a list of the neutral firms
and businessmen who were breaking the Allied blockade by trading with
Hitler. The list was used to impose sanctions. But the British also had a
second list of firms that were only suspects. Their names were not
published. British officials, however, had orders to counter their activities
where possible.
The Black Boomerang team acquired this list of suspects along with files
that revealed both the reasons for suspicion and the personal background of
the suspect firms’ directors. Subsequently, these suspects heard the secrets
of their private and commercial lives being publicly exposed on the
Atlantiksender radio. Usually their reaction was to protest to the nearest
British authorities, who, of course, denied any connection with a German
radio broadcast. Delmer, in his book Black Boomerang, describes the
typical approach:
We then waited to see whether they would mend their ways. If they
did not, we followed up with further broadcasts about them. As a rule,
however, one broadcast was enough. One typical case was that of a
firm of Swedish exporters who were buying ball-bearings in Sweden,
and smuggling them to Germany, thereby undoing the effects of our
bombing. Our story about the firm was so accurate and so ribald that
the Swedish authorities felt deeply hurt in their national pride. They
made unofficial representations about it to the British ambassador Sir
Victor Mallet.
“We had our eye on this firm,” said the Swedes with injured dignity,
“we were waiting to prosecute them, when we had enough evidence.
Now your broadcast has warned them.” Sir Victor replied that he had
no knowledge of the Atlantiksender, and nothing to do with it….
[T]he Swedish newspapers had taken up the story we had put out, and
the directors of the Swedish firm, fearing more Atlantiksender
publicity, had cancelled all further deliveries to Germany.4
COMINT
Communications intelligence, or COMINT, is the interception, processing,
and reporting of an opponent’s communications. Communications in this
definition includes voice and data communications, facsimile, video, and any
other deliberate transmission of information that is not meant to be accessed
except by specifically intended recipients. This definition includes Internet
transmissions, though their collection overlaps into the realm of CYBER
collection, described in Chapter 8.
Once the COMINT means that an opponent relies on are known, deception
becomes straightforward. For example, during the Cold War, the
Soviets made extensive use of audio taps (bugs). US counterintelligence
operatives occasionally used these bugs to feed the Soviets false information.
In some cases, they used the bugs to take out troublesome Russian agents,
either by hinting that the agent was on the US payroll or by reporting that he
was having an affair with some Soviet minister’s wife.
COMINT deception can be used for strategic or tactical deception. The
deceptions that supported Operation Quicksilver and Operation Desert Storm
fit more into the strategic deception class. But COMINT deception is most
widely used at the tactical level, as it was in Operation Bolo.
Operation Bolo
During the Vietnam War, the pilots of the US Seventh Air Force were
becoming frustrated with the tactics of their North Vietnamese opponents.
The North Vietnamese Air Force carefully avoided confrontations that
would pit their MiG-21 Fishbed fighters against the US Air Force’s
superior F-4C Phantoms. Instead, they selectively targeted US F-105
fighter/bombers that were conducting airstrikes in the theater—and in the
process, racking up kills; the MiG-21 was superior to the F-105 in a
dogfight.
In Operation Bolo, executed in January 1967, F-4Cs flying from bases in
Thailand and South Vietnam took the flight
paths normally used by F-105s. Knowing that the North Vietnamese
monitored aircraft communications, the F-4C pilots used the
communications frequencies and call signs normally used by F-105s on
bombing runs.
At the same time, other F-4Cs flew to positions off the coast of North
Vietnam, positioning themselves to guard selected North Vietnamese
airfields to prevent the MiG-21s from returning to them.
The mission was a success. When the MiG-21s took off and flew to
intercept what appeared to be easy prey, the F-4s pounced, shooting down
seven MiG-21s in twelve minutes—in effect destroying one-third of the
enemy’s MiG-21 inventory.5
Steganography
Steganography is defined as the art and science of hiding information by
embedding messages within other, seemingly innocent messages. Secret
writing traditionally made use of invisible ink to place a message between the
visible lines of an apparently innocuous letter. Counterespionage routinely
relies on opening suspect correspondence (known in the trade from pre-
Internet days as “flaps and seals” operations) and testing for secret writing.
So secret writing is not commonly used by sophisticated intelligence services.
However, it still finds use in countries or organizations where technical
methods of communication are not available. During World War II, agents
used a sophisticated type of steganography called the microdot: a technique
of photographing and reducing a page of text to the size of a pinhead, then
making it look like the period at the end of a sentence.
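Digital steganography works on the same principle as the microdot: put the message where the observer does not think to look. The sketch below shows the classic least-significant-bit technique, operating on a plain byte sequence standing in for raw image pixels; only the lowest bit of each carrier byte is touched, so the carrier looks unchanged to casual inspection:

    def embed(carrier: bytearray, message: bytes) -> bytearray:
        """Hide message bits in the least significant bit of each carrier byte."""
        bits = [(byte >> i) & 1 for byte in message for i in range(8)]
        assert len(bits) <= len(carrier), "carrier too small"
        out = bytearray(carrier)
        for i, bit in enumerate(bits):
            out[i] = (out[i] & 0xFE) | bit  # overwrite the lowest bit only
        return out

    def extract(carrier: bytearray, length: int) -> bytes:
        """Recover 'length' hidden bytes from the carrier's low-order bits."""
        bits = [carrier[i] & 1 for i in range(length * 8)]
        return bytes(sum(bits[b * 8 + i] << i for i in range(8))
                     for b in range(length))

    cover = bytearray(range(256))           # stand-in for pixel data
    stego = embed(cover, b"RENDEZVOUS 0200")
    print(extract(stego, 15))               # b'RENDEZVOUS 0200'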
Because the use of code words and “talking around” a topic is a widely
known tactic, it is a natural way to communicate deceptive information.
Again, this deception method requires some subtlety when planting deceptive
information; the communication has to look like a credible attempt to conceal
information, but not be so difficult to analyze that the opponent’s COMINT
analyst misses it completely.
Encryption
Encryption is now widely practiced in many types of communication to deny
COMINT exploitation. The result is that encrypted traffic is more likely to be
the focus of intelligence interest, especially when encryption is used where it
should not be found in a system. Al Qaeda long relied on unencrypted emails
for communication because they knew that encrypted ones would attract
attention.8
Traffic Analysis
Even when message content cannot be read, a COMINT analyst can draw
inferences from the externals of traffic—who transmits, to whom, when, and
how often (a minimal sketch follows this list):
Frequent communications from one node can tell the traffic analyst
who’s in charge or the control station of a network.
Correlating who talks and when can indicate which stations are active in
connection with events.
Frequent communications often indicate that an operation is in the
planning stage.
No communication can indicate either no activity, or that the planning
stage is complete.
Rapid, short communications are often connected with negotiations.
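Even a crude tally of message externals illustrates the point. In the sketch below, a hypothetical traffic log of (sender, receiver) pairs—content never read—is enough to suggest which node is the net control station:

    from collections import Counter

    # Hypothetical intercept log: (sender, receiver) pairs only.
    log = [("alpha", "bravo"), ("alpha", "charlie"), ("alpha", "delta"),
           ("bravo", "alpha"), ("charlie", "alpha"), ("alpha", "bravo")]

    sender_counts = Counter(sender for sender, _ in log)
    likely_control = sender_counts.most_common(1)[0][0]
    print(likely_control, dict(sender_counts))
    # alpha {'alpha': 4, 'bravo': 1, 'charlie': 1} -- the node that transmits
    # most often is a candidate control station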
HUMINT
These different types of HUMINT divide into two broad classes—overt and
clandestine HUMINT. They are treated differently in both conducting and
detecting deception.
Overt HUMINT
Overt HUMINT is attractive for conveying deceptive information because the
channel is direct: The deceiver has some degree of confidence that the
information will reach its intended recipient. It has the attendant disadvantage
that it is often used to pass deceptive information, so it is a suspect channel.
There are a number of channels for collecting overt HUMINT, but three are
prominently used: elicitation and its formal counterpart, liaison; facilities
visits and public demonstrations; and interrogations.
Elicitation and Liaison
Both diplomatic and intelligence liaison provide many benefits, but they
carry risks that the liaison partner will use the relationship to deceive. Even if
the liaison partner doesn’t intend to deceive, he or she may be the conduit for
deception aimed at the other partner. Such was the case in Operation
Mincemeat, described in Chapter 2; the British relied on the existence of a
liaison relationship between Spanish and German intelligence to ensure that
the deceptive information would be shared.
Facilities Visits and Public Demonstrations
International arms control treaties often require onsite visits. These visits
have been used to promote the image that a chemical warfare plant, for
example, is actually a pesticide production plant. Countering such deception
frequently makes use of methods such as clandestine materials sampling.
Interrogation
Interrogation is a standard overt HUMINT technique, widely used by military
forces and governments to obtain intelligence. Interrogation might best be
described as elicitation in a controlled setting. The difference is that the
interrogator has physical custody or some form of control over the fate of the
source—who usually is a prisoner charged with a crime, a prisoner of war, an
émigré, or a defector. The specific tactics used depend on the interrogator’s
degree of control and the willingness of the detainee to cooperate. Émigrés
and defectors usually evince the greatest willingness to cooperate.
Clandestine HUMINT
A nation’s clandestine service performs highly diverse but interdependent
activities that span many intelligence channels, including, for example,
COMINT and CYBER. The unifying aspect of these activities is the fact that
they are conducted clandestinely or secretly. However, it is clandestine
HUMINT that is best known because of its popular portrayals in print
and movies. Obviously, clandestine HUMINT depends on deception to
function. And it can be a vehicle for conducting deception against both
intelligence and operational targets. Specifically, it depends on deception in
the form of cover: a false picture of personal or organizational identity,
function, or both.
Personal Cover
Clandestine services officers must conceal their actual role. So some form of
personal cover (for example, an assumed identity) is needed. There are two
types of cover—official and nonofficial. Persons under official cover work in
official government positions. The deepest level of cover is nonofficial cover.
Such operatives are called NOCs. NOCs are much more difficult for a
counterintelligence service to identify but are much more vulnerable if
identified.
All forms of overseas cover are an increasing problem for all clandestine
services. The growing availability of life history data to foreign
governments, together with the worldwide spread of technologies such as
facial recognition software and biometrics, exacerbates the cover problem.
(As an example, US law enforcement has amassed a facial recognition
database of 117 million citizens—at least half of the adult population.)11
Organizational Cover
Many intelligence agencies use organizational cover in the form of front
companies and co-opted businesses for technology and materiel acquisition
and to run agents. Countries such as Russia and China have a long history of
using organizational cover to target Western Europe, the United States, and
Japan because of the advanced technology that is available in those countries.
IMINT
IMINT deception goes a step further by presenting the false. Typically, this
relies on decoys. Their use during World War II and in Kosovo has already
been described in earlier chapters, and decoys continue to be used in military
conflicts. During 2016, Iraqi forces fighting to retake the city of Mosul from
Daesh found a cache of wooden tanks, vehicles, and even dummy soldiers
near the city. The decoys were intended to divert airstrikes away from actual
targets.15
The Al Kibar case illustrates the level of sophistication that IMINT deception
can achieve, and the effective use of other channels to defeat it. Al Kibar was
an elaborate Syrian attempt to conceal efforts to develop nuclear weaponry.
The Syrian Deception at Al Kibar
In 2003 the Syrians began to construct a nuclear reactor near the town of
Al Kibar. The reactor was a near-duplicate of one at Yongbyon, North
Korea, and was built with North Korean assistance. The Syrians realized
that US imagery analysts probably had a profile of the Yongbyon reactor
and would quickly spot any similarity in imagery. They consequently
developed an elaborate deception to conceal the reactor from imagery
collection and, more important, from imagery analysis:
They partially buried the reactor and erected a false roof and walls
around it to conceal its characteristic shape from overhead imagery.
They disguised the outer shell to look like a Byzantine fortress of the
sort to be found in Syria.
Several tactics for denial supported the deception. The Syrians did
not construct new roads or put in place airfields, rail tracks, air
defenses, or site security—any of which would have drawn the
attention of imagery analysts.16
The Israelis decided on a direct approach to deal with the facility. During
the night of September 5–6, 2007, four Israeli F-16s and four F-15s
departed Israel Air Force bases, flew north along the Mediterranean coast,
and then eastward along the border between Syria and Turkey. The
attackers used electronic warfare en route to deceive and disrupt the Syrian
air defense network. The seventeen tons of bombs that they dropped on Al
Kibar totally destroyed the reactor building.
On April 28, 2008, CIA director Michael Hayden confirmed that Al Kibar
would have produced enough nuclear material for one to two weapons a
year, and that it was of a similar size and technology to North Korea’s
Yongbyon Nuclear Scientific Research Center.
Countering infrared, spectral, and synthetic aperture radar (SAR) imagery has
relied mostly on denial (camouflage and radar-reflective coverings) rather than
deception. Deception using decoys against all of these imagers can be done,
though, and is most effective when combined with denial to protect the real
targets. The Serbian deception efforts in Kosovo, described in Chapter 3,
involved creating heat sources to deceive infrared imagers, though that was a
crude effort by current deception standards.
Deception against technical collection has a history that dates back to World
War II, when technical collection began to play an important role in warfare.
It began with efforts to deceive enemy radar.
Radar
During World War II, Allied aircraft attacking European targets began to
deceive German radars by dropping chaff—strips of tinfoil that, released
from an aircraft, appeared on radar to be another aircraft. After the war, this
type of misdirection was applied in more sophisticated versions. Because
chaff moved slowly with the wind, a radar operator could distinguish it from
the fast-moving target that was the aircraft. So deception moved into the
purely electronic domain: Using a technique called spoofing, the attacking
aircraft received and retransmitted the radar signals in such a way as to make
the aircraft appear to be in several other locations, while hiding the reflected
signal from the aircraft itself. As radars became more sophisticated, this type
of deception moved back into the physical domain, with the use of air-
launched decoys—small pilotless vehicles designed to present much the same
radar return as the aircraft that launched them.
Deception against electronic intelligence (ELINT) also dates back to World
War II. During that time, the radar communities developed deception
techniques to counter both ELINT and electronic attack (EA). In the
post–World War II period,
reconnaissance aircraft began flying regular ELINT missions to locate an
opponent’s radars and to assess each radar’s purpose, performance, and
weaknesses. Many such flights targeted radar test ranges to assess new radars
in development.
ACOUSTINT Deception
Some acoustic sensors detect sound traveling through the atmosphere or in
the ground near the surface and therefore function only at comparatively
short ranges (a few meters to a few kilometers). The intelligence product of
such collection is usually called ACOUSTINT. These sensors are used both in
military operations and in intelligence to identify and track land vehicles and
airborne platforms. They can locate these targets and determine their speed
and direction of motion based on sound transmitted through the air or earth.
Such sensors can be readily deployed and disguised in all types of terrain.19
Most of the sensing of sound in air or underground finds tactical military or
law enforcement use because detection can be made only at short ranges.
Acoustic sensing has been used to detect enemy ground movements for
millennia. Armies routinely send out patrols or establish listening posts to
listen for the sounds of tanks, trucks, and marching troops. Deception—
creating deceptive sounds to simulate such movements—also has a long
history. During World War II an Allied military unit in Europe made use of
massive loudspeakers mounted on trucks to simulate the movements of tanks
and infantry.
ACINT Deception
Acoustic intelligence derived from underwater sound is usually called
ACINT. ACINT relies on a class of acoustic sensors that detect sound in
water. Sound travels much better in water than in air. Underwater sound
created by ships and submarines can be detected at distances of several
hundred kilometers. Tactical deception to counter ACINT has been practiced
for decades, in the form of decoys deployed to simulate the sound produced
by a ship or submarine.
Notes
1. Robert M. Clark, Intelligence Collection (Washington, DC: CQ Press,
2014).
5. NSA Cryptologic History Series, “Working against the Tide,” Part 1, June
1970, 149–53.
7. Ibid.
9. Mark L. Bowlin, “British Intelligence and the IRA: The Secret War in
Northern Ireland, 1969–1988,” September 1999, 83–89,
https://archive.org/stream/britishintellige00bowlpdf/britishintellige00bowl_djvu.txt
10. Ibid.
11. Steven Nelson, “Half of U.S. Adults Are in Police Facial Recognition
Networks,” US News and World Report, October 18, 2016,
https://www.law.georgetown.edu/news/press-releases/half-of-all-american-
adults-are-in-a-police-face-recognition-database-new-report-finds.cfm.
12. “A Tide Turns,” The Economist, July 21, 2010,
http://www.economist.com/node/16590867/.
15. John Bacon, “Islamic State Used Fake Tanks to Confuse Airstrikes,” USA
Today, November 14, 2016, http://www.msn.com/en-us/news/world/islamic-
state-used-fake-tanks-to-confuse-airstrikes/ar-AAkhs2f.
18. David Makovsky, “The Silent Strike,” New Yorker, September 17, 2012,
http://www.newyorker.com/magazine/2012/09/17/the-silent-strike.
19. B. Kaushik and Don Nance, “A Review of the Role of Acoustic Sensors
in the Modern Battlefield,” paper presented at the 11th AIAA/CEAS
Aeroacoustics Conference, May 23–25, 2005,
https://ccse.lbl.gov/people/kaushik/papers/AIAA_Monterey.pdf.
8 The Cyberspace Channel
Within the complex domain that we call CYBER, there are three basic types
of deception. One supports cyber offense; one is a part of cyber defense; and
one supports deception operations that transcend the cyber domain. Many of
the examples in this chapter are of the first two types: They describe
deception where the objective was a cyber target, and the deception remained
entirely in cyberspace. But deception can be conducted in cyberspace to
support operations across all of the PMESII domains. It is used to support
covert actions and influence operations. It is used to manage perceptions,
beliefs, and understanding, and to create flawed situational awareness among
individuals or groups, thereby increasing the chance that a deception will
succeed. It can be a powerful tool for conducting deception against a defined
group, as discussed in Chapter 1.
For the purposes of this book, and to remain within the focus of intelligence
support to a deception plan, CYBER here is treated as a two-way channel:
information can be extracted from it to support planning, and information can
be projected through it in executing the deception. Cyber operations
therefore divide into two broad types: computer network exploitation (CNE),
which is basically passive, and computer network attack (CNA), which is
unquestionably active. Both have roles to play in deception: CNE for
collecting intelligence, and CNA for projecting a deception by providing
material to the target that supports the deception story.
The World Wide Web is thus a primary channel for conducting CNE and CNA. But
the web can also be an effective channel for projecting a deception without
using CNA, as discussed next.
Web-Based Deception
In 2012 a popular US television commercial featured a young woman
proclaiming, “They can’t put anything on the Internet that isn’t true.”
Asked “Where’d you hear that?” she replies, “The Internet.”
The Internet and its associated social media provide what is probably the
most commonly used channel for deception by governments, organizations,
and individuals today. The extensive development of cyber-based social
media and information sources over the past fifteen years has opened a richly
dynamic environment in which to conduct deception and counterdeception. It
provides the communication backbone for a wide variety of channels to and
from adversaries, channels that can be monitored, analyzed, and exploited.
The web also has become the primary source for research on almost any
topic. The web pages that search engines lead to, and online reference sites,
are excellent places to plant misleading information. Sites that have editors
or a validation process, such as Wikipedia, are somewhat better equipped to
defend against deceptive inputs, but they are not immune. Ironically,
Wikipedia’s own site carries an article noting that many academic
institutions have banned it as a primary source because of reliability and
credibility issues.4 Increased access to CYBER has expanded the depth and
scope of available “common knowledge,” leaving deception practitioners an
easily accessible canvas for “designing common knowledge” through sites
such as Wikipedia and blogs.
E-mails
E-mails that appear to come from a trusted source have a well-known role in
emplacing malware. But such e-mails also can be used to project a deception.
Sometimes it isn’t even necessary to impersonate a trusted source. Prior to
Operation Iraqi Freedom in 2003, PSYOPS specialists from the coalition
called senior Iraqi officers directly on their personal cell phones and sent
e-mails to their personal accounts, attempting to induce them to surrender
and providing deceptive information about the upcoming conflict. These
measures sowed discord and mistrust among the senior Iraqi leadership
that had a definite adverse impact later in the conflict.5 To return to the
terms introduced in Chapter 1, this was an “A” type deception.
Social Media
Social media may be the most widely used channel for deception today.
Individuals, organizations, and governments convey deceptive information
via social media for both worthy and nefarious purposes. Because it is
difficult to determine that a profile presented on the Internet is that of an
identifiable person, deception is easy to pull off. Such deception is, in effect,
an online version of pseudo operations.
Law enforcement uses social media, for example, to ensnare online sexual
predators via posts that appear to come from young girls. Websites
pretending to recruit for Daesh are used to identify and arrest would-be
Daesh volunteers.
Deceptive posts such as the Chinese example often are the work of Internet
trolls—persons either anonymous or using assumed names. These posts
typically are intended to advance causes and influence thinking through
emotional appeals. The Russian government has developed a sizeable team of
Internet trolls who regularly post deceptive information in an ongoing
psychological operations campaign to undermine NATO and US interests and
promote Russian interests.8
Social media such as Twitter, YouTube, Facebook, Instagram, and blogs are
just the tip of the iceberg of available cyber social networks that contribute
significantly to the formation of public opinions and perspectives across
national borders. It has been demonstrated repeatedly that “going viral” in
social media can quickly inspire violent events in the physical domain. A
single picture illustrating political hypocrisy can in a matter of seconds
weaken international alliances abroad, and undermine support for policy at
home. Perhaps the most dramatic outcome of this online mobilization,
Internet activism, and grassroots organization occurred in 2011. The events
of that year that became known as the Arab Spring started in Tunisia and
spread rapidly to Egypt and Libya, toppling governments in all three
countries, and to Syria, sparking what seems to be an endless conflict that has
drawn in major powers.
Memetic Conflict
Deception plays a major role in a new area of interstate and intercultural
conflict known as memetic warfare or memetic conflict. A meme (a term
modeled on the word gene) is an idea or type of behavior that spreads from
person to person within a population. It is a carrier of cultural ideas,
symbols, or practices, transmitted from one person to another through
writing, speech, gestures, symbols, or rituals.
The term meme was introduced by Oxford professor Richard Dawkins in his
1976 book The Selfish Gene. Though the name is of recent origin, the meme
obviously has been around for a long time in human affairs. But it has
become a powerful tool for shaping opinions and actions in the era of social
media.
Memetic conflict has been defined as “competition over narrative, ideas, and
social control in a social-media battlefield. One might think of it as a subset
of ‘information operations’ tailored to social media.”9 In contrast to cyber
conflict,
Of course, the tools of these conflicts can be used by both sides. In a memetic
conflict campaign against Daesh, for example, it has been suggested that one
could
Techniques such as these are easily applied by, for example, the Russian
government. They would pose both political and legal problems for many
Western governments that tried to apply them. The United States, for
example, is not allowed to spread propaganda domestically, and social media
knows no borders.13
Web-Based CNE/CNA
The preceding section discussed a straightforward use of the cyber realm to
project a deception; such use requires no specialized technical expertise. This
section is about applying the tools of cyber operations to obtain
intelligence or to project a deception. When conducted against networks,
cyber offense pursues the two broad objectives introduced earlier:
exploitation (CNE) and attack (CNA).
Although these terms refer to offensive deception against a network, the same
principles and tools apply in attacking or exploiting a single computer that is
not connected to a network; only the techniques for obtaining access are
different. Let’s look at some of the channels for CNE/CNA and then examine
some of the tools for using them to obtain access and implant malware.
Web-Based Channels
A basic rule of deception is that the more trustworthy a channel is believed to
be, the more effective it is as a channel for deception. Several components in
cyberspace are designed to provide a trusted environment for sharing
information or conducting daily activities. Making use of these generally
requires CNE/CNA tools, but the channels can then be effective in projecting
a deception.
Intranets
An intranet is an internal network that people can access only from within
their organization or trusted group. It is intended as a place to securely share
files or sensitive documents. Some intranets are not connected to the Internet;
others have Internet access, but only through a gateway administered from
within the organization. In this section, we’re looking at the type of intranet
that connects to the Internet but has some form of protection. Such intranets
usually rely on virtual private networks (VPNs), which allow people to
operate with an expectation of privacy on the Internet. An intranet that does
not connect directly to the web requires a different approach and is discussed
in a later section.
The Deep Web
The deep web refers to the vast part of the Internet that is not indexed and
therefore not normally visible or accessible from search engines. Access-
restricted commercial databases, websites, and services comprise much of the
deep web. Parts of it can be reached only with special browser software such
as Tor (based on onion routing, originally developed by the US Naval
Research Laboratory to protect government communications). The Tor
software makes use of a set of VPNs, allowing users to travel the deep web
securely and remain anonymous. It protects users by bouncing their
communications around a distributed network of relays run by volunteers
around the world. This prevents others from watching users’ Internet
connections to learn what sites they visit, prevents the sites that users visit
from learning their physical location, and lets users reach sites that would
otherwise be blocked. Government databases, such as those maintained by
NASA and the US Patent and Trademark Office, also reside in the deep web.
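For analysts, reaching such sites programmatically usually means routing
traffic through a local Tor client. The following is a minimal sketch in
Python, not part of the cases described in this book: it assumes a Tor client
is running with its default SOCKS proxy on port 9050 and that the requests
library is installed with SOCKS support (pip install requests[socks]); the
.onion address shown is a placeholder, not a real site.

    import requests

    # Tor's local SOCKS proxy. The socks5h scheme makes DNS resolution
    # happen inside Tor as well, so the lookup itself does not leak the
    # destination to a local observer.
    TOR_PROXY = {
        "http": "socks5h://127.0.0.1:9050",
        "https": "socks5h://127.0.0.1:9050",
    }

    def fetch_via_tor(url):
        """Fetch a page with all traffic routed through the Tor network."""
        response = requests.get(url, proxies=TOR_PROXY, timeout=60)
        response.raise_for_status()
        return response.text

    # Placeholder onion address; substitute a real service.
    print(fetch_via_tor("http://exampleonionaddress.onion/")[:200])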
Within the deep web lies what is often referred to as the dark web. Much of
the dark web content fits well with the name: It includes all types of black
markets, illicit drug traffic, fraud-related material, and child pornography. It
is used for a number of scams and hoaxes, but it also hosts political
discussion groups, whistleblowing sites, and social media sites, often used to
avoid government censorship. These legitimate uses of the dark web offer attractive
channels for projecting a deception, because many governments pay close
attention to the material posted on these sites. Tracing the source of a post in
the dark web is very difficult—an added advantage in executing deception.
Blockchains
Blockchains are another example of a seemingly secure channel for
projecting a deception. The challenge is to find a way to make use of them.
The Internet of Things
Experts estimate that 6.4 billion connected things were in use worldwide in
2016, up 30 percent from 2015. The Internet of Things (IoT) is expected to
consist of almost 50 billion objects by 2020. It includes a wide class of
cyber-physical systems, encompassing technologies such as smart grids,
smart homes, intelligent transportation, and smart cities. Each thing is
uniquely identifiable through its embedded computing system and is able to
interoperate within the existing Internet infrastructure.
The Mirai botnet used vulnerable IoT technology to launch a distributed
denial-of-service (DDoS) attack. One of the major IoT resources exploited
was security cameras sold by the Chinese firm Hangzhou Xiongmai;
millions of these cameras have been sold in the United States. The attack
exploited the default passwords in the equipment and organized the
cameras into a botnet. Hangzhou Xiongmai issued a recall for the cameras
on October 24, 2016, while complaining that users should have changed
the default passwords.
Botnets such as Mirai exploit weak security measures in IoT devices. Most
such devices, if they have any protection at all, are delivered with a
standard username and password combination (“admin” and “1111” are
typical). The botnet scans the Internet for IoT systems still using these
standard credentials and infects them with malware that directs them to a
command-and-control (C&C) server; the server then uses them as hosts to
launch cyber attacks.
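The scanning logic is simple enough to sketch. What follows is a minimal,
hypothetical defensive audit in Python, not attack code: it checks whether
devices on your own network still accept factory-default web logins of the
kind Mirai hunted for. It assumes the devices expose an HTTP admin page
protected by basic authentication; the addresses and credential pairs are
illustrative.

    import urllib.request
    from urllib.error import HTTPError, URLError

    # Factory-default credential pairs of the kind Mirai scanned for
    # (illustrative examples only).
    DEFAULT_CREDS = [("admin", "admin"), ("admin", "1111"), ("root", "root")]

    # Hypothetical addresses of devices on your own network.
    DEVICES = ["192.168.1.10", "192.168.1.11"]

    def accepts_default_creds(host):
        """Return the default credential pairs the device's admin page accepts."""
        hits = []
        for user, password in DEFAULT_CREDS:
            mgr = urllib.request.HTTPPasswordMgrWithDefaultRealm()
            mgr.add_password(None, f"http://{host}/", user, password)
            opener = urllib.request.build_opener(
                urllib.request.HTTPBasicAuthHandler(mgr))
            try:
                opener.open(f"http://{host}/", timeout=3)  # 200 = login accepted
                hits.append((user, password))
            except (HTTPError, URLError):
                continue  # 401 or timeout: pair rejected, or host is down
        return hits

    for device in DEVICES:
        for user, password in accepts_default_creds(device):
            print(f"{device} still accepts default login {user}/{password}")

Mirai ran essentially this loop against random public addresses, over telnet
rather than HTTP, and enrolled every hit in the botnet.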
A compromise can happen, for example, when the target receives an e-mail
that appears to come from a trusted source: an acquaintance or someone
within the same organization. The e-mail might ask the target to open an
attachment. Adobe
PDFs, images, and Microsoft Office files are commonly used. When the file
is opened by the vulnerable program on the victim’s computer (such as
Adobe Acrobat or Microsoft Excel, PowerPoint, or Word), a backdoor
program executes and the computer has been compromised. At the same
time, a seemingly normal file or image appears on the target’s computer
screen, so that the recipient has no reason to suspect that something is amiss.
E-mails are widely used for deception because it is possible to identify an
employee’s trusted relationships and professional networks by looking at his
or her e-mail patterns.16
Alternatively, the e-mail may direct the target to a website that contains the
backdoor, with much the same outcome. Such a website is called a drive-by
download site. It typically relies on vulnerabilities in web browsers and
browser add-ons. Users with vulnerable computers can be infected with
malware simply by visiting such a website, even without attempting to
download anything.17 The attacker can then acquire files from the computer
or e-mail or send data from the computer, or force the compromised
computer to download additional malware. From there, the attacker can use
the infected computer to exploit the victim’s contacts or other computers on
the target network.18
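Because the deception turns on apparent trust, one of the simplest defensive
cross-checks is to compare a message’s visible headers against each other.
The sketch below is a toy heuristic, not a real mail-security product: it uses
Python’s standard email parser to flag messages whose Reply-To domain
disagrees with the From domain and to note office-document attachments of
the kinds listed above. The sample message and domains are fabricated.

    from email import message_from_string
    from email.utils import parseaddr

    SUSPECT_TYPES = (".pdf", ".doc", ".docx", ".xls", ".xlsx", ".ppt", ".pptx")

    def header_warnings(raw_message):
        """Flag simple signs that a message may impersonate a trusted sender."""
        msg = message_from_string(raw_message)
        warnings = []
        from_domain = parseaddr(msg.get("From", ""))[1].rpartition("@")[2].lower()
        reply_domain = parseaddr(msg.get("Reply-To", ""))[1].rpartition("@")[2].lower()
        # A Reply-To that routes answers to a different domain is a classic
        # tell (not proof: mailing lists do this legitimately).
        if reply_domain and reply_domain != from_domain:
            warnings.append(f"Reply-To domain {reply_domain} != From domain {from_domain}")
        # Attachments of the types named above warrant extra scrutiny.
        for part in msg.walk():
            name = part.get_filename() or ""
            if name.lower().endswith(SUSPECT_TYPES):
                warnings.append(f"office-document attachment: {name}")
        return warnings

    sample = ("From: boss@headquarters.example\n"
              "Reply-To: boss@elsewhere.example\n"
              "Subject: urgent\n\nPlease open the attachment.")
    print(header_warnings(sample))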
The basic tools of malware are known as exploits, discussed next.
Exploits
An exploit takes advantage of software vulnerabilities to infect, disrupt, or
take control of a computer without the user’s consent and preferably without
the user’s knowledge. Exploits take advantage of vulnerabilities in operating
systems, web browsers, applications, or other software components.19
Four of the most widely known exploits are Trojan horses (usually
abbreviated “Trojans”), worms, rootkits, and keystroke loggers.
Although all of these exploits can be used for cyber deception, they are most
effective when they are used against a zero day vulnerability. Also called a
zero hour or day zero vulnerability, this is an application vulnerability that is
unknown to defenders or the software developer. It derives its name from that
time (called the zero hour or zero day) when the software developer first
becomes aware of the vulnerability. Until that moment of awareness, the
developer obviously cannot develop a security fix or distribute it to users of
the software. Zero day exploits (software that uses a security gap to carry out
an intrusion) are highly valued by hackers and cyber espionage units because
they cannot be defended against effectively—at least not until sometime after
zero day arrives.21
Exploits are usually emplaced via the web, but they can be emplaced directly
on machines, as discussed later. Deception via the web requires more than the
deployment of exploits. The cyber espionage organization must control the
exploits and use them to insert the desired information while maintaining the
secrecy, or at least the deniability, of the operation. Often this is done by
botnets such as Mirai, used in the DDoS attack described earlier. The botnet’s
command-and-control server can’t be easily shut down because it’s hard to
determine its real location.
The current state of the art in advanced persistent threats (APTs) has been
named Duqu 2. It had an unusual target, though a logical one for a
government that is in the cyber espionage business.
Duqu 2
Kaspersky Lab is one of the world’s major software security companies,
operating in almost 200 countries from its Moscow headquarters. Its
founder, Eugene Kaspersky, reportedly has ties to Russia’s intelligence
services (he is a graduate of the FSB academy; the FSB is the successor to
the KGB). Kaspersky Lab has been the most prominent of software
security firms in identifying, analyzing, and countering the malware
described in this chapter.
In 2015, Kaspersky Lab was itself the target of a malware attack. The
attack successfully accessed the company’s intellectual property,
proprietary technologies, and product innovations. It was aimed
specifically at the Kaspersky tools used for discovering and analyzing
advanced persistent threats, and at the data on current Kaspersky
investigations into sophisticated malware attacks. The attack was
discovered only after it had been inside the company’s intranet for several
months.28 It was, in counterespionage terms, a wilderness of mirrors event
—using computer espionage to target the web’s equivalent of a
counterespionage organization—presumably so that the attacker could
better evade discovery by Kaspersky in future cyber attacks.
The malware, which Kaspersky named Duqu 2, has very little persistence,
making it difficult both to detect and to eliminate. It exists almost entirely
in the memory of the targeted system. As a result, according to the
Kaspersky Lab report, “the attackers are sure there is always a way for
them to maintain an infection—even if the victim’s machine is rebooted
and the malware disappears from the memory.”29 Duqu 2 was so named
because it shares much of the code of the original Duqu and of Stuxnet,
leading observers to believe that it was developed by the same unidentified
organization.
Standalone Computers and Intranets
Attacking a network that is physically isolated from the Internet (a private
intranet) or a single computer that never connects to the Internet requires a
different type of effort from that used in CNE. The collector has to gain
access to the computer or the intranet in some way. Once access has been
gained (through a USB drive, a network jack or cable, a utility closet, or
some similar route), almost anything can be done. From the defense point of
view, the game is over and the defense has lost.
Social Engineering
Isolated intranets are disconnected from the Internet most often as a security
measure. They therefore are generally viewed as safe from attack and from
the placement of deceptive information. But this perceived safety makes them
more vulnerable to deception. The challenge, of course, is to get into the
intranet to place the misleading information. Social engineering is one means
of doing that. It’s used to facilitate both CNA and CNE.
In cases where computers never leave a secure facility, and where remote
access is not possible, it is necessary to use field operations to access
networks. This category encompasses deployment of any CNA or CNE tool
through physical access or proximity. In intelligence, these are called
HUMINT-enabled operations; in the world of hackers, they are usually
referred to as social engineering.30 They encompass such classic HUMINT
techniques as gaining access under false pretenses, bribery or recruitment of
trusted personnel in a facility, and surreptitious entry.31 HUMINT-enabled
operations are often facilitated by human error or carelessness, and complex
intranets are particularly susceptible to both.
The case of Stuxnet, which has attracted much international attention and
was even the subject of a 2016 documentary titled Zero Days, illustrates
the importance of direct access to an intranet as well as the value of social
engineering. Although Stuxnet is an example of CNA, it relied heavily on
deception for its success.
Stuxnet
Stuxnet illustrates the type of precision attack that is most effective in both
CNA and CNE. Stuxnet was a worm designed to infect and disable a
specific type of computer performing a specific task. The target,
investigators believe, was the computer controlling the isotope separation
centrifuges in Iran’s Natanz uranium enrichment facility.
Stuxnet was used in a sustained and directed attack, conducted over a ten-
month period beginning in 2009. Reportedly, at least three versions of the
program were written and introduced during that time period. Investigators
found that the first version had been completed just twelve hours before
the first successful infection in June 2009. One attack, in April 2010,
exploited a zero day vulnerability in Windows-based computers.
Deception in Hardware
Even better than getting access to a target’s computer is to manufacture the
computer. Technology has allowed us to hide malware in many places, and
the supply chain (all the way from component manufacturer to end user) is a
very attractive place. Anyone in the supply chain before sale has the access
necessary for inserting malware in a computer or other electronic device.
Such embedded malware is difficult to detect, and most purchasers do not
have the resources to check for such modifications.
The hardware can be modified in ways that are not readily detectable, but that
allow an intelligence service to gain continuing entry into the computer or
communications system. Targeted components can be add-ons that are
preinstalled by the computer manufacturer before the computer is sold. A
user may not even use the vulnerable add-on or be aware that it is
installed.39 Malware inserted in a computer before sale can call home after being
activated, exfiltrate sensitive data via USB drives, allow remote control of the
computer, and insert Trojan horses and worms. Such backdoors are not
limited to software installed on the computer. Hardware components such as
embedded radio-frequency identification (RFID) chips and flash memory can
be the sources of such malware.
Cyber Deception
This chapter is about the use of cyberspace means to project a deception. But,
as previously stated, the term cyber deception has a specific meaning within
the cyber community. The target of this type of deception is the cyber
attacker. The objective is to deceive the attacker and cause him or her to
behave in a way that gives the defender an advantage.
Projecting a Deception
Cyber deception obviously is a form of deception in and of itself. But it also
can be used as a channel to project a deception. You simply place the
deceptive material in the engagement server so that it appears to consist of
valid, protected files. When opponents hack into your site, they obtain files
that lead them into the story you want to project.
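A minimal sketch of the idea follows, with hypothetical paths and
placeholder content, and with no pretense of production-grade deception: a
small Python web server that serves planted “protected” documents and logs
every retrieval, giving the defender feedback on who took the bait and which
parts of the story they swallowed.

    import http.server
    import logging

    logging.basicConfig(filename="engagement.log", level=logging.INFO,
                        format="%(asctime)s %(message)s")

    # Planted documents supporting the deception story (placeholder content).
    DECOYS = {
        "/internal/plans_q3.txt": b"[deceptive material goes here]",
        "/internal/budget.txt": b"[deceptive material goes here]",
    }

    class DecoyHandler(http.server.BaseHTTPRequestHandler):
        def do_GET(self):
            body = DECOYS.get(self.path)
            if body is None:
                self.send_error(404)
                return
            # Record which decoy was taken and by whom: the feedback channel.
            logging.info("decoy %s fetched by %s", self.path,
                         self.client_address[0])
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        http.server.HTTPServer(("0.0.0.0", 8080), DecoyHandler).serve_forever()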
This has been a brief introduction to the topic. Cyber deception to protect
against cyber attack is a complex subject that is treated in detailed texts
elsewhere. Readers who want more detail may wish to consult Gartzke and
Lindsay’s article on the subject41 or the 2016 book Cyber Deception.42
Let’s next turn to the planning and execution part of the process.
Notes
1. See, for example, Sushil Jajodia, V. S. Subrahmanian, Vipin Swarup, and
Cliff Wang, eds., Cyber Deception: Building the Scientific Foundation
(Switzerland: Springer, 2016).
3. Amar Toor, “Germany Is Worried about Fake News and Bots ahead of
Election,” The Verge, November 25, 2016,
http://www.theverge.com/2016/11/25/13745910/germany-fake-news-
facebook-angela-merkel.
4. See https://en.wikipedia.org/wiki/Reliability_of_Wikipedia.
5. Wang Yongming, Liu Xiaoli, et al., Research on the Iraq War (Beijing,
PRC: Academy of Military Science Press, 2003).
7. Gary King, Jennifer Pan, and Margaret E. Roberts, How the Chinese
Government Fabricates Social Media Posts for Strategic Distraction, Not
Engaged Argument, working paper, August 26, 2016,
http://gking.harvard.edu/50c.
14. Steve Norton, “The CIO Explainer: What Is Blockchain,” Wall Street
Journal, February 2, 2016, http://blogs.wsj.com/cio/2016/02/02/cio-
explainer-what-is-blockchain/.
15. John McAfee and Colin Haynes, Computer Viruses, Worms, Data
Diddlers, Killer Programs, and Other Threats to Your System (New York: St.
Martin’s Press, 1989), 79.
18. Shadows in the Cloud: Investigating Cyber Espionage 2.0, Joint Report
JR03-2010, of the Information Warfare Monitor and Shadowserver
Foundation, April 6, 2010,
http://www.scribd.com/doc/29435784/SHADOWS-IN-THE-CLOUD-
Investigating-Cyber-Espionage-2-0.
20. Ben Weitzenkorn, “Adobe Flash Player Hit by Hackers on Both Ends,”
Security News Daily, http://www.securitynewsdaily.com/2191-adobe-flash-
player-iphone-android.html.
23. Kim Zetter, “Son of Stuxnet,” The Intercept, November 12, 2014,
https://theintercept.com/2014/11/12/stuxnet/.
25. Ibid.
26. Bernt Ostergaard, “Black Hat Roundup: Keeping Tabs on the Ones That
Got Away,” July 31, 2012,
https://itcblogs.currentanalysis.com/2012/07/31/black-hat-roundup-keeping-
tabs-on-the-ones-that-got-away/.
28. David Gilbert, “Duqu 2: The Most Advanced Cyber-Espionage Tool Ever
Discovered,” International Business Times UK, June 10, 2015,
http://www.ibtimes.co.uk/duqu-2-most-advanced-cyber-espionage-tool-ever-
discovered-1505439.
32. John Markoff, “Malware Aimed at Iran Hit Five Sites, Report Says,” New
York Times, February 11, 2011,
http://www.nytimes.com/2011/02/13/science/13stuxnet.html.
33. Ibid.
34. Ibid.
35. William J. Broad, John Markoff, and David E. Sanger, “Israeli Test on
Worm Called Crucial in Iran Nuclear Delay,” New York Times, January 15,
2011, http://www.nytimes.com/2011/01/16/world/middleeast/16stuxnet.html.
37. Ibid.
40. Carolyn Crandall, “The Ins and Outs of Deception for Cyber security,”
Network World, January 6, 2016,
http://www.networkworld.com/article/3019760/network-security/the-ins-and-
outs-of-deception-for-cyber-security.html.
41. Erik Gartzke and Jon R. Lindsay, “Weaving Tangled Webs: Offense,
Defense, and Deception in Cyberspace,” Security Studies 24, no. 2 (2015):
316–48, doi:10.1080/09636412.2015.1038188.
42. Jajodia et al., eds., Cyber Deception: Building the Scientific Foundation.
9 Planning and Executing Deception
1. Will the deception plan result in the desired decision and action?
2. Will it produce the desired outcome scenario, or an unexpected and
undesired one?
3. Will it affect only the desired target?
Checking the Decision/Action Model: Red Team Analysis
Red team analysis requires that an analyst or analytic team (ideally, not the
one who developed the deception plan) take a fresh look at it. The analyst or
team examines the story, as conveyed in the deception plan, from the point of
view of the deception target. At its core, red teaming is intended to avoid the
trap of ethnocentric bias or the mirror imaging problem.1
It’s most important to ask the question: “What different decisions could an
opponent make, besides the intended one?” In answering that question, the
red team analysis must consider all relevant PMESII factors. This is a check
on the analytic logic that supports the assessment of an opponent’s likely
decision. What might the likely decisions be, both with and without
deception?
Checking for Unfavorable Outcomes: Alternative Analysis
Alternative analysis is closely related to the idea of red teaming. Its purpose
is to identify possible outcome scenarios other than the expected one.
Cultural or organizational assumptions, biases, and preferences often shape
analytic conclusions, so applying structured techniques to challenge them and
force consideration of other possible outcomes is essential.3 Chapter 4
addressed the importance of modeling alternative decisions an opponent
could make. Alternative analysis takes that one step further and considers not
only an opponent’s possible alternative decisions but also the possible
alternative outcomes that result.
Both the United States and the USSR failed to identify possible unfavorable
outcomes in some deception operations that they conducted during the Cold
War, and they proved to be costly mistakes. Two examples stand out, and
they have common features. Both deceptions—one by each country—
succeeded. And both were strategic disasters for the deceiving country.
Moscow Air Show, 1955
During the 1950s and 1960s, the USSR leadership saw a compelling need
to display the image of a militarily powerful Soviet Union, in order to
protect and promote its interests worldwide. The Soviets accordingly
conducted a number of deceptions designed to depict this projection of
strength. One such deception occurred in July 1955, during the Soviet
Aviation Day demonstrations at Tushino Airfield, near Moscow.
During the 1955 event, a highlight of the show occurred when a flight of
ten M-4 Bison bombers flew past the reviewing stand, followed shortly
thereafter by another flight of eighteen Bisons.
It later became apparent that the flyover was a deception. After the first ten
bombers had passed the reviewing stands and were out of sight, they
quickly turned around in a racetrack pattern and flew past the stands again
with eight more Bisons joining the formation, presenting the illusion that
there were twenty-eight aircraft in the flyby.
During the Cold War, the United States conducted a series of deceptive
operations to convince the Soviets that it was making advances in chemical
warfare and subsequently in biological warfare capabilities. The deceptions
proved to be at least as counterproductive for the United States as the
Moscow Air Show deception had been for the USSR.
The CW and BW Deceptions
In 1959 the GRU (Soviet military intelligence) recruited US Army
sergeant Joseph Cassidy and used him as an agent for more than twenty
years. During all that time, Cassidy was actually serving as a double agent,
run by the FBI and US Army intelligence. Cassidy passed to the GRU
secret information about the Army’s chemical warfare (CW) program
being conducted at Edgewood Arsenal in Maryland, in order to build up
his credentials for a subsequent deception.4
The problem was that the expanded Soviet program actually succeeded in
developing a highly toxic and effective nerve agent called Novichok. It is
not clear whether Novichok was a result of their work to duplicate GJ or
something completely different. What appears to be the case, though, is
that the agent resulted from an expanded program in response to a US
deception.6
The story to be conveyed in this deception was that the United States was
pursuing a clandestine BW program at the same time that it was
negotiating the Biological Weapons Convention outlawing such weaponry.
The objective of this plan remains unclear. Under the circumstances, it is
hard to see how such a deception could have a good outcome.
The deception was carried out through a number of channels, but Colonel
Polyakov was the key one. Alarmed by this new threat, the Soviet
leadership ordered the creation of a new program for research and
development of biological weaponry. That program, nicknamed
Biopreparat, involved a number of military and civilian research and
production institutes and at least one test site.7
A few years ago, one US agency developed a new technology for tracking
vehicles of high intelligence interest. It needed to provide the resulting
intelligence to military units in the field. But such wide access created a
high potential for compromise of the intelligence, so a cover story was
needed.
Source: Barton Whaley, Stratagem, Deception and Surprise in War (Norwood, MA: Artech
House, 2007/reprint of 1969 edition), 111–14.
Let’s return to the deception that was part of Operation Desert Storm,
described in Chapter 4, with some additional facts to illustrate how this works
in practice:
Place. The Iraqis were led to expect an attack on their positions from the
south, not one across the desert to the west.
Time. The Iraqis were led to believe that some days would pass after the
0600 hour deadline on February 24, 1991, before any large-scale ground
assault by coalition forces into Iraq would occur. (The Iraqis were
slowly complying with coalition demands for withdrawal from Kuwait.)
However, a full ground campaign was set in motion before 0600 hours
on February 24.
Strength. The Iraqis were led to believe the US VII Corps was not yet at
full strength just to the west of Kuwait; the corps was in fact at full
strength.
Intention. Iraq was led to believe the coalition’s primary intended effects
were the liberation of Kuwait and the withdrawal of Iraqi forces from
Kuwait. The Iraqis assumed those actions would quickly halt further
coalition action. However, just as important to the coalition was the
destruction of Iraqi forces. (The Iraqi decision to withdraw made them
easier targets as they raced to the perceived safety of the Iraqi border in
open terrain.)
Style. Leading the Iraqis to expect an amphibious landing instead of an
armored attack through the western desert was a major part of the deception.
Now let’s shift focus to another part of the execution management process:
the opponent’s OODA loop.
Operating inside the Opponent’s OODA Loop
Frans Osinga argues that Boyd’s concept of an OODA loop is much deeper,
richer, and more comprehensive than commonly understood. The objective,
according to Osinga, is to
There are two key concepts here. Let’s deal with the first one before turning
to the second. First, there is the concept of operating inside the opponent’s
OODA loop so that your side can act more effectively than can the opponent.
We always want to do that—meaning that we react to opponents’ actions
more quickly than they can react to ours. But how, in practice, do you do
that? The cycle of CW deception by Libya and counterdeception by the
United States and its allies throughout the 1980s and 1990s illustrates how
the process works.
The Rabta/Tarhunah Deception and
Counterdeception
Beginning in 1984 the Libyan government began constructing a CW agent
production plant near the city of Rabta, forty miles southwest of Tripoli.
By late 1988, with extensive foreign assistance, Libya had completed
construction and begun producing CW agents. During its three years of
operation, the facility produced at least 100 metric tons of mustard gas and
the nerve agent sarin.12 To conceal the purpose of the plant, the Libyans
set up an elaborate deception. It failed because they presented an
incongruent picture to Western intelligence, and because the United States
and its allies were able to react to unfolding events more rapidly than the
Libyans.
Many of the resources needed to build the Rabta plant had to come from
foreign companies. The Libyans set up a network of front companies, fake
subsidiaries, and middlemen to hide their purchases. Considerable
assistance in building the Rabta complex was provided by European and
Japanese firms. The prime contractor at Rabta was Imhausen-Chemie, a
West German chemical firm. Other supplies came from firms in
Switzerland, Austria, and Hong Kong. Most of the equipment and supplies
left European ports under false export documents. To circumvent export
controls, the Libyans relied on a complex commercial network involving
front companies that transferred goods through ports in the Far East.
Libyan government officials publicly claimed that the Rabta facility was a
pharmaceutical plant, and designated it Pharma-150. Construction at Rabta
was carried out by 1,300 low-wage laborers imported from Thailand under
tight security conditions.
On September 14, 1988, the State Department went public with the
following statement: “The U.S. now believes Libya has established a CW
production capability and is on the verge of full-scale production of these
weapons.” CIA director William Webster provided further details in a
speech on October 25, 1988, claiming that the Libyan plant was the largest
chemical weapons facility the agency had detected anywhere in the
developing world.13 The announcement triggered a second Libyan
deception and another circuit around the OODA loop.
The Libyans apparently recognized that IMINT was the source of the US
assessment. They responded with an elaborate deception operation to
counter imagery analysis, fabricating evidence of a fire on March 13, 1990,
to make the Rabta facility appear to be damaged and possibly
inoperative.14 The fabrication reportedly involved painting burn marks on
buildings and burning stacks of tires to create the perception of a fire for
imagery collectors. They also rushed ambulances to the area to make it
appear that the plant had suffered severe fire damage and casualties—
probably targeting HUMINT channels.15 When the French commercial
Earth-resources satellite SPOT 1 photographed the Rabta facility on March
18, however, it looked to be intact.
The second round of Libyan deception also was a failure, again due to the
incongruent picture that the Libyans presented to Western intelligence. By
mid-1990 the United States and its allies had again operated inside the
Libyan OODA loop. In August 1990 the US intelligence community
deduced that large-scale production of chemical agents at Rabta had begun
after a photoreconnaissance satellite observed specialized trucks, designed
to transport CW agents, picking up barrels of the suspected agent at the
plant. In 1992 a US intelligence official stated publicly that the Rabta
facility had produced and stockpiled more than 100 metric tons of mustard
gas and the nerve agent sarin.16
Once again, the United States and its allies had operated inside Libya’s
OODA loop by revealing the deception before the plant could produce CW
agents. Outmaneuvered and again frustrated, Libya subsequently stopped
construction at Tarhunah. Many of the chemical weapons caches from the
Rabta plant were later destroyed under supervision of the Organization for
the Prohibition of Chemical Weapons after an agreement was reached with
Muammar Gaddafi’s regime. The destruction reportedly was completed in
2014.
The preceding discussion was focused on making your side’s OODA loop
work faster and more effectively than the opponent’s. An alternative to
making your loop work faster is to slow down the opponent’s OODA loop,
and there are ways to do that. They involve applying Osinga’s second
concept: choosing to confuse or overload the target’s intelligence apparatus
and to introduce ambiguity or the “uncertainty, doubt, mistrust, confusion,
disorder, fear, panic, chaos, …” that he refers to. In the terms of Chapter 1,
that is an “A”-type deception.
A cover legend and airline tickets showed the six studio executives
arriving in Tehran from Hong Kong at approximately the same hour that
the actual exfiltration team arrived from Zurich. So passengers
disembarking from either flight would have been processed by the same
immigration officers. Consequently, the Iranian entry cachets stamped in
team passports could be duplicated and used to create fake passports with
the same entry cachets for the six. If the team had used the entry stamps of
an officer who hadn’t been on duty that day, the discrepancy could have been
spotted, blowing the entire operation. The deception was in some ways a model for
how to attend to such details. For example,
The team also made smart use of timing in choosing their departure. The
Swissair flight departed Tehran at 7:30 a.m. The team arrived at the airport
at 5 a.m., when the airport was relatively quiet; the officials manning the
exit gates were sleepy; and the Revolutionary Guards were mostly still
asleep at home. A 2012 movie about the rescue, also titled Argo, depicts
the Swissair flight departure as an exciting event, with Revolutionary
Guards pursuing the departing aircraft down the runway. That depiction is
pure Hollywood. Thanks to careful planning, the actual departure was
uneventful.
Managing the Developing Scenario
The final step in the execution process is responding to unfolding events in
the opponent’s OODA loop. Those executing the deception have to identify
and monitor the leading indicators that show which of the scenarios—or
which combination of scenarios—is actually taking place. As Babson
College professor Liam Fahey points out, indicators also give important
insights into what scenarios are not taking place.19
What this means, to begin with, is that the team must identify feedback
channels. A channel is needed that will confirm whether the deception
information has reached its intended target (the decider). Another channel
may be needed to identify the actions that the target is taking based on that
information. Feedback allows a determination as to whether the deception
plan is believed and, what is more important, being acted upon. As Godson
and Wirtz describe it,
The monitoring job may involve watching trends. The intelligence team, for
example, might monitor demographic and economic trends or decreases in
the number of terrorist cells. A political or economic analyst might look at the
questions that opponents ask and the positions they take in negotiations. A
military intelligence analyst might monitor troop movements and radio traffic
for signals as to which scenario is developing.
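The bookkeeping behind such monitoring can be kept very simple. Below is a
minimal Python sketch with invented scenarios and indicators: each candidate
scenario is paired with its leading indicators, and observed reports are
scored against every scenario at once, so the team sees both which story is
developing and which is not.

    # Hypothetical scenarios paired with their leading indicators.
    SCENARIOS = {
        "withdrawal": {"units entrain north", "negotiators soften positions"},
        "offensive": {"radio silence", "fuel convoys move forward",
                      "bridging units deploy"},
    }

    def score_scenarios(observed):
        """Fraction of each scenario's indicators seen so far.

        Low scores matter too: they show which scenarios are NOT
        taking place, as Fahey notes above.
        """
        return {name: len(indicators & observed) / len(indicators)
                for name, indicators in SCENARIOS.items()}

    reports = {"radio silence", "fuel convoys move forward"}
    print(score_scenarios(reports))
    # {'withdrawal': 0.0, 'offensive': 0.666...}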
3. Ibid.
5. Ibid.
6. Ibid.
7. Ibid.
8. Ibid.
9. Scott Gerwehr and Russell W. Glenn. The Art of Darkness: Deception and
Urban Operations (Santa Monica, CA: RAND, 1999),
http://www.rand.org/publications/MR/MR1132, p. 35.
10. Ibid.
11. Frans P. B. Osinga, Science, Strategy and War: The Strategic Theory of
John Boyd (London: Routledge, 2007), 186.
14. Ibid.
15. Ibid.
17. Ibid.
20. Roy Godson and James J. Wirtz, “Strategic Denial and Deception,”
International Journal of Intelligence and CounterIntelligence 13 (2000): 427.
21. Andrew Sleigh, ed., Project Insight (Farnborough, UK: Centre for
Defence Analysis, Defence Evaluation and Research Agency, 1996), 13.
23. Thaddeus Holt, The Deceivers: Allied Military Deception in the Second
World War (New York: Skyhorse Publishing, 2007), 378.
10 Preparing to Counter Deception
The previous chapters described how to plan for and conduct a deception.
They provided the essential background on how to detect deception when it is
conducted against your side. This chapter is therefore addressed to the
intelligence team that will in most cases have the job of identifying
deception.
The procedures for detecting a deception are relatively easy to grasp. The real
challenge lies in establishing an intelligence-driven framework that organizes
all of the participants in your OODA loop to provide deception warning and
to answer the questions that Waltz poses. This chapter describes a
methodology for doing that.
There are three steps to take toward countering deception. Two are
preconditions; that is, they need to be completed before a deception is under
way. In advance, a vulnerability assessment must be conducted. That
means identifying channels where your side is vulnerable to deception—
usually the elements of your OODA loop that the opponent is familiar with.
Then follow up with the next step—a continuing assessment of possible
threats. Completing these two tasks requires that several principles of
counterdeception be followed (equally critical in conducting deception, of
course):
Know yourself—a sound understanding of the cognitive vulnerabilities
in your side’s analysis and decision processes
Know your channels—an understanding of your collection disciplines—
their capabilities, their limitations, and their vulnerabilities to deception
Know your decision maker
Know your situation
Know your adversary—an understanding and consideration of the
means, motives, and culture of your adversaries.
Once a specific threat has been identified, it is time to execute the third step:
using all of your channels to identify deception when it occurs. The
remainder of this chapter describes how the first two steps are best handled.
Chapter 11 addresses the third step.
Assessing Vulnerability
Preceding chapters have made clear that if you can’t assess the opponent’s
OODA loop, deception is unlikely to succeed. The opponent has the same
problem. To the extent that opponents can’t figure out your side’s loop, they
are unlikely to attempt any deception. And any deception they do attempt
should be easy to identify and counter.
Therefore, it is important to know what your OODA loop is, then make it
strong and secure. This is in part about security: the protection of sources and
methods, or the “OO” part of the loop. But it is much more than that. The
decision/action part has to be secured as much as possible, and that can be
very difficult. Let’s start with the observation process.
Observation (Collection)
When collection becomes too predictable, and the opponent understands the
observation part of the loop, tactics for countering denial and deception stop
working. If opponents can understand your collection process, they can
defeat it, as the examples throughout this text illustrate. There is a tendency
to believe that IMINT and COMINT are less vulnerable to deception. But
deception can succeed against both IMINT and COMINT if the opponent
knows enough about the collection system. The effectiveness of deception is
a direct reflection of the predictability of collection. So the first line of
defense is to keep all potential opponents uncertain about your collection
channels.
The case of Igloo White, which occurred during the Vietnam War, illustrates
how deception can work at the tactical level, but also how defensive
deception can be effective when the opponent knows too much about your
collection sensors.
Technical Deception: Igloo White
During the Vietnam War, the North Vietnamese supplied their forces in the
South via a network of trails and roads running through Laos known as the
Ho Chi Minh Trail. One US effort to degrade this supply route was known
as Operation Igloo White. Between 1967 and 1972, American aircraft
dropped over 20,000 battery-powered sensors along parts of the trail. Some
sensors hung from trees, detecting the sounds of voices or truck engines.
Some buried themselves in the ground on impact; these were designed to
look like plants and to detect human urine or the seismic disturbances
created by foot traffic and vehicles.
However, the success was short-lived and ultimately illusory, the result of an
effective counterdeception campaign. Over time, the North Vietnamese
were able to pick up and analyze an abundance of sensors, identify their
purpose, and figure out how to counter them. Armed with this knowledge,
North Vietnamese troops began dropping bags of urine near the urine
sensors and playing recordings of truck engines near the acoustic sensors.
In consequence, later inquiries found that many US attacks hit empty
jungle.
Viewed in retrospect, the collection plan had a major vulnerability that the
North Vietnamese could and did exploit. They could easily obtain copies
of all the sensors used on the trail and assess how the sensors operated.
Once that was done, a deception effort was no great challenge. It well
illustrates the principle that knowledge of the opponent’s channels makes
deception relatively straightforward. Modern air-dropped sensors such as
the US Steel Eagle devices are still easy to acquire but more difficult to
deceive, because the new sensors have high resolution and are able to
detect fine details in an audio signature.
Orientation (Analysis)
Orientation, or analysis, is the phase during which your side is most likely to
catch deception. It is a simple tenet: All available channels have to be cross-
checked during analysis. An opponent can deceive one channel, but it’s
difficult to deceive them all at once and present a consistent story.
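The cross-check lends itself to simple tooling. Here is a toy Python
illustration with fabricated reports: single-fact claims from each channel are
grouped by topic, and any topic on which the channels disagree is flagged for
a closer look. A disagreement is not proof of deception, but it marks where
to dig.

    from collections import defaultdict

    # Fabricated single-fact reports: (channel, topic, claim).
    reports = [
        ("IMINT", "division_location", "south"),
        ("COMINT", "division_location", "south"),
        ("HUMINT", "division_location", "west"),
        ("OSINT", "attack_date", "May 4"),
    ]

    def incongruences(reports):
        """Group claims by topic; return topics where channels disagree."""
        by_topic = defaultdict(dict)
        for channel, topic, claim in reports:
            by_topic[topic][channel] = claim
        return {topic: claims for topic, claims in by_topic.items()
                if len(set(claims.values())) > 1}

    print(incongruences(reports))
    # {'division_location': {'IMINT': 'south', 'COMINT': 'south',
    #                        'HUMINT': 'west'}}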
Collectors also should be able to easily share the false signals that they detect
—something that is often not done in large intelligence organizations. The
natural tendency is to move the material into a counterintelligence unit,
compartment it, and not share it with the all-source analysts who have access to
the other channels and are better positioned to identify deception.
Be aware also of the speed of information flow through the channels, and
shorten it where possible by using close ties with collectors and decision
makers. Some channels naturally have to operate more slowly than others. An
IMINT channel can provide real-time reporting. Clandestine HUMINT
sources may have to report only periodically. And collection agencies usually
want to analyze the material that they collect before publishing it. There are
valid reasons for this, but the result can be that your side’s OODA loop
operates more slowly than the opponent’s, so that the needed warning of
deception reaches the decision maker only after it is too late. A competent
deceiver will time the arrival of information to do exactly that, as the Allies
did in Operation Quicksilver when agent Garbo sent his handler the urgent
message that an invasion of Normandy was imminent, only hours before the
landings began.
Finally, remember that the decision maker is part of this network. Channels
that can be used to convey deceptive information often exist outside the
intelligence organization; these typically are channels that intersect with the
operations team and with leadership. But they usually won’t be available
unless the analyst expends some effort to develop them. It bears
repeating that access to incoming information from all channels that might
carry deceptive information is critical, including collectors, operations,
leadership, their advisors, and decision makers.
One last caution about the orientation phase: A serious cognitive
vulnerability is associated with being aware of possible deception. Veteran
CIA analyst and writer on intelligence Richards Heuer has noted the paradox
that the more alert we are to deception, the more likely we are to be deceived.
As Heuer explains it, our sensitivity to deception can influence our openness
to new information.5 When there is an existing belief about a given situation,
and new evidence challenges that belief, we have a natural tendency to find
some reason to discount the new evidence. This is one reason that it is so
desirable for the deceiver to time the arrival of deceptive information so that
an opponent is forced to make fast, poor decisions on incomplete
information. The opponent should have as little time as possible to see
through the deception. In counterdeception, information that arrives through
any channel just before a decision must be made is especially suspect for that
reason. This leads to a conundrum concerning new information: to detect
deception, an analyst must remain open to new evidence, yet that openness is
exactly what a deceiver exploits.
Lucky intelligence teams don't have to deal with easily deceived leaders.
Unlucky teams should at least prepare for the deceptions that are coming
their way. But an opponent who cannot be confident of the decision your
leader is likely to make when presented with a story should hesitate to use
deception. A decision maker usually needs help to understand that.
Those who have completely rational decision makers are on fairly good
ground. Even better are decision makers who follow the thirty-six stratagems
described in Chapter 1. Leaders whose decisions appear to be unpredictable
are a double-edged sword: They may cause difficulties for your own
operations, but they also play havoc with an opponent’s deception planners.
Recognize that the threat of deception can come from an opponent, an ally, or
a neutral power. The deception that preceded the Indian nuclear test, for
example, was conducted by India against the United States while the two
countries were on reasonably friendly terms. So when the term adversary or
opponent is used in this chapter, it includes anyone who might use deception
against your side’s interests, including neutrals and allies.
Does the adversary have a history of engaging in deception? If so, what
kind? Opponents have a preference for using the methods that have worked
in the past. Remember that a stratagem is not invalid or unlikely just because
it’s old. The betting parlor trick in The Sting was old, but it continues to work
today, just in updated settings. Confidence scams such as Ponzi schemes are
old but continue to separate people from their money.
Consider the case of Evgeny Buryakov, a Russian SVR officer operating
under nonofficial cover in New York, who had been advised to be cautious in
his dealings with a businessman offering inside information. That was good
advice, but Buryakov apparently ignored it and fell into the trap. The
businessman, at Buryakov's request, provided him with government
documents marked “Internal Treasury Use Only” and “For Official Use
Only.” The documents were subsequently used to convict Buryakov.9
8. Ibid.
9. Ibid.
11 Identifying Deception
Once the detection preparation work has been done, the next task is to
identify when deception is actually occurring. This is about monitoring your
channels for indicators. And that requires some sort of channel management
plan.
Deception Detection via Channel Management
Chapter 9 discussed the use of channel management when conducting
deception. This same channel management framework can be used to identify
abnormalities or incongruences in collected information that could indicate
an adversarial deception. This means that the initial steps of sensor mapping
your own organization and your adversary’s organization are still required in
order to establish a picture of existing channels between both parties, as
shown in Figure 11-1 (replicated here from Figure 9-2 for convenience).
To illustrate how this works in practice, let’s revisit the Rabta deception from
a counterdeception perspective.
Rabta Counterdeception
Chapter 9 showed the deception/counterdeception OODA loops for the
Rabta CW plant. The Rabta case illustrates the importance of using as
many channels as possible to detect deception: COMINT, IMINT, and
technical channels in addition to traditional OSINT and HUMINT
channels. There were a number of incongruences in the channels, as
discussed below.
It was not until August 1988 that the CIA obtained more solid
evidence that the Rabta plant was engaged in CW agent production.
Following a partial test run of the production process, an accidental spill
occurred as highly toxic wastes were being transferred for disposal outside
the plant. The resulting cloud of fumes killed a pack of wild desert dogs in
the vicinity of the plant. Their bodies, detected by satellite, indicated that
the plant was producing chemicals of warfare toxicity.
Literal Sources
Much deception is conveyed in literal form, with HUMINT and OSINT being
primary channels. But COMINT and CYBER channels are increasingly used
to deceive. And in communication, people can be misinformed, or they may lie to
each other. So even if the product is not a deliberate deception, the outcome
can be much the same. Given those parameters, consider these
counterdeception source evaluation tenets:
Accept nothing at face value. Evaluate the source of evidence carefully and
beware of the source’s motives for providing the information. Evaluating the
literal source involves answering these questions:
What is the basis for judging the source to be reliable? Seek and heed
the opinions of those closest to the reporting. Is the source vulnerable to
control or manipulation by the potential deceiver?
Is the true identity of the source an issue? Identity intelligence
(biometrics, for example) can be used to validate or authenticate an
identity. Biographic data (public and private records) and behavioral data
(travel, consumer patterns, Internet and other communications activity)
can be used to check on human sources.
How good is the source's track record of reporting? Check all instances
in which a source's reports that initially appeared correct later turned out
to be wrong—and yet the source always seemed to offer a good
explanation for the discrepancy. Before and during the Quicksilver
deception, Juan Pujol Garcia (Garbo) repeatedly was able to deceive the
Germans because of his skill at explaining discrepancies.
In the HUMINT business, this is called determining bona fides for human
sources. Even when not dealing with HUMINT, one must ask these
questions. They are useful not only for identifying state-sponsored or
questions. They are useful not only for identifying state-sponsored or
organization-sponsored deception but also for catching attempts by
individuals (such as Curveball) to deceive for their own purposes.
OSINT
OSINT has to be vetted and analyzed very carefully. Two standard checks
done in validation of material are to examine
Note that a failure to pass validity checks does not mean that the information
is useless. Material having questionable validity may have intelligence value.
Deception attempts may not pass validity checks, but the fact that an attempt
was made can tell you much about an opponent’s intent.
COMINT
As with the other literal channels, identifying COMINT deception relies
heavily on understanding your own OODA loop, how the opponent views it,
and the difference between the two views. If the opponent is aware that you
have broken an encryption system or that you search for certain key words,
for example, then intelligence based on those channels should be vetted
carefully. COMINT derived from push-to-talk radios (which are indeed still
in use) or high-frequency (HF) communications also should be looked at
closely, because such communications are easily intercepted.
Having the best possible talent on the COMINT analysis team makes a big
difference, as does having team members with long experience on their
assigned targets. The clues to detecting deception in COMINT are usually
subtle, and experienced analysts are best positioned to spot inconsistencies
that tip off deception.
HUMINT
Countering HUMINT operations falls into the broad category of
counterespionage. This field of deception and counterdeception is the subject
of many books. Counterespionage is touched on here only lightly, with the
mention of double agents, moles, and dangles.
Double Agents
A major tool of offensive counterespionage is the double agent. A double
agent can be a spy who becomes an employee or agent of the target
intelligence service or an agent who is turned as a result of discovery and
threats (usually of execution) for noncooperation. Double agents are often
used to transmit disinformation or to identify other agents as part of
counterespionage operations. Their success depends on trust by the
organization that they are working against; the organization that has doubled
them helps build that trust by providing true, but low value, information to
pass along.
Moles
The great fear of any intelligence service is a specific type of double agent:
the mole, a trusted insider who provides information to an opposing service.
On the flip side, the holy grail of HUMINT is a penetration of a foreign
intelligence service—ideally, a double agent who works in the
counterintelligence component. The Ames and Hanssen cases, described in
Chapter 6, illustrate the extensive damage that such moles can cause.
Dangles
A dangle is a seemingly attractive potential recruit intentionally placed in
front of a targeted intelligence service. The idea is for the targeted service to
initiate contact and thus believe it has spotted, developed, and recruited an
agent. If the target service swallows the bait and accepts the agent, then the
dangle’s parent service may learn the identities and vulnerabilities of some of
the target’s officers, its collection requirements, and tradecraft.3 Dangles can
also be used to pass misinformation to the recruiting organization.
The 2002 US national intelligence estimate (NIE) on Iraq's weapons of mass
destruction illustrates the consequences of poor source evaluation. The
analysts who prepared the estimate readily accepted any evidence that
supported the theory that Iraq had stockpiles and was developing weapons
programs, and they explained away or disregarded evidence that pointed in
other directions. Two of the most egregious examples were the evaluation
of a key HUMINT source on Iraq’s biological weapons program and of
HUMINT and IMINT sources on Iraq’s chemical weapons program.
The NIE also erroneously concluded that Iraq had restarted chemical
weapons production and increased its chemical weapons stockpiles, based
on poor evaluation of both IMINT and HUMINT.
Nonliteral Sources
Nonliteral intelligence channels include IMINT, SIGINT other than
COMINT (that is, ELINT and FISINT), and MASINT. These channels,
especially IMINT, are used to deceive. The wide use of dummy tanks and
landing barges to deceive imagery collection during World War II has
already been noted. The Serbs used IMINT deception frequently during the
Kosovo conflict described in Chapter 3.
Following are some of the important applications of synthetic aperture radars
(SARs) that can be used to counter deception:
Terrain mapping and characterization. Optical imagers can provide
maps, but because they do not measure range to the target, they don’t do
well at determining terrain elevation. SARs do measure range, so they
can provide precise topography of the target area. Furthermore, SARs
can characterize terrain, identifying soil and moisture changes that often
accompany attempts at visible imagery deception.
Change detection. One of the major advantages of SAR is its ability to
detect changes in a scene over time, an application to which it is
particularly well suited. Examples of surface changes that can be
observed include vehicle tracks, crops growing or being harvested, and
soil excavation. Changes in the terrain surface due to underground
construction also can be observed, because underground excavation
produces both vertical settlement and a detectable horizontal strain.12 (A
minimal sketch of this technique appears after this list.)
Foliage and camouflage penetration. An application of SAR that has
obvious counterdeception uses is foliage penetration. SARs can be built
to operate in the low-frequency VHF and UHF bands. At these lower
frequencies SARs can effectively image objects concealed in dense
foliage, even objects located underneath the forest
canopy.13 They also penetrate into dry earth for short distances to detect
buried objects such as land mines.14
Identifying targets. SARs can identify and classify targets of intelligence
interest—ships, aircraft, and military vehicles—and collect details that
would identify visible imagery deception.15 The higher the SAR’s
resolution, the more detail it can obtain about a target.
Monitoring moving targets. SARs also have the capability to detect
target motion. This capability can be used to create an image that
highlights moving targets in an area. It is possible to detect aircraft,
helicopter, or ship movements, as well as ground vehicular
movement.16
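As a concrete illustration of how change detection works, the sketch below (Python with NumPy; the images and threshold are hypothetical, and this is a log-ratio toy example rather than a depiction of any fielded system) differences two co-registered SAR amplitude images and flags pixels whose change exceeds a threshold.

import numpy as np

def amplitude_change_map(image_t0, image_t1, threshold_db=3.0):
    """Flag pixels whose amplitude changed by more than threshold_db
    between two co-registered SAR amplitude images (log-ratio method)."""
    eps = 1e-6  # guard against division by zero in dark pixels
    ratio_db = 20.0 * np.log10((image_t1 + eps) / (image_t0 + eps))
    return np.abs(ratio_db) > threshold_db

# Hypothetical 4 x 4 amplitude images; one pixel brightens between passes,
# as fresh excavation or a new vehicle track might.
before = np.ones((4, 4))
after = before.copy()
after[2, 1] = 3.0
print(amplitude_change_map(before, after).astype(int))

Operational change detection adds co-registration, speckle filtering, and coherence measures, but the flagged-difference idea is the same.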
Deception that targets technical collection can be defeated if your side can
conceal the details of your collection sensors' performance. If your
technical collection system is more sophisticated than the opponent thinks it
is—can measure signatures to a finer level of detail, for example—then your
intelligence network can probably spot the deception. It is difficult, for
example, to simulate exactly all of the acoustic signatures created by
submarines, because there are so many of them. Machinery vibration, water
flow over the vessel’s hull, propeller rotation or cavitation, and crew activity
all generate acoustic signatures. If the opponent misses just one of these, and
your system collects it, then you can readily identify the deception.
Furthermore, sensors that measure signatures such as sound or radar
emissions are continually improving in resolution and in the details they
measure. A misstep by the deceiver in underestimating the performance of
the opponent's sensor can be fatal to the deception effort.
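The logic of the preceding paragraph (miss one signature component and the deception is exposed) can be written as a simple completeness check. In the minimal sketch below, the platform class and its expected signature components are hypothetical catalog entries, not real acoustic intelligence.

# Hypothetical catalog: the signature components expected from a platform.
EXPECTED_COMPONENTS = {
    "diesel submarine": {
        "machinery vibration",
        "hull flow noise",
        "propeller cavitation",
        "crew transients",
    },
}

def simulation_check(claimed_class, observed_components):
    """Return expected signature components that the contact failed to show."""
    missing = EXPECTED_COMPONENTS[claimed_class] - set(observed_components)
    if missing:
        print(f"Contact claiming '{claimed_class}' lacks {sorted(missing)}: "
              "possible decoy or simulation")
    return missing

# A contact presenting only two of the four expected components is suspect.
simulation_check("diesel submarine", ["machinery vibration", "propeller cavitation"])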
Evaluating the Channel
In detecting deception, it is important to understand the channel through
which raw intelligence flows and to make it as effective as possible. In a large
intelligence system, collection requirements must move through a
bureaucracy to a requirements officer, from there to a country desk, a field
requirements officer, and a SIGINT collector or a HUMINT case officer (for
instance), then to an agent in the case of HUMINT; the response then
goes back through the reports chain. In the process, what HUMINT
operatives call “operational information”—details about the source or
collection methods—is stripped out. As a result, critical clues that could
identify deception can disappear. A somewhat different process leads to a similar
result in COMINT and IMINT reporting. In those cases, things that don’t
make sense to the COMINT or IMINT analyst often are omitted in the
published report. But the items that don’t make sense can be critical in
identifying deception.
The intelligence team should also consider the level of the opponent’s
understanding of each channel. In crafting the 1955 Moscow Air Show
deception, the Soviets took advantage of their intimate knowledge of attaché
collection. They even knew exactly where the attachés would stand to
observe the flyby, so the Bison racetrack flight pattern was designed to
follow a path that the attachés could not observe. And they knew that two
passes by the same number of bombers would be likely to arouse suspicion,
so they increased the number of bombers in the second pass. As another
example, in deceiving IMINT during preparations for their nuclear test, the
Indian government took advantage of their knowledge of US imagery quality
and of the details that imagery analysts were relying on.
The major flaw in Operation Mincemeat, though, could have been caught
if Spanish and German intelligence had shortened the channel by going
back to an ancillary but critical source. The Spanish coroner who
examined the body was, contrary to British expectation, an expert
pathologist with long experience in examining drowning victims. He
noticed several incongruences in his examination: no fish or crab bites,
shiny instead of dull hair, and clothing that wasn't shapeless, all of which
indicated that the body had not been in the water as long as the briefcase's
documents suggested; yet the state of decay indicated that the body had
been in the water longer than the documents indicated.17
The point is that complex deception efforts are very difficult to pull off,
even with a good model of the opposing services such as the British had. In
the end, the success of the British effort depended on a few lucky breaks—
the biggest one being the failure of their opponents to manage their own
intelligence channels.
Evaluating the Evidence
Finally, your side needs to step back and look at the whole picture—the story
that emerges from all of the evidence in all channels. The data from different
collection sources are most valuable when used together. The synergistic
effect of combining data from many sources both strengthens the conclusions
and increases confidence in them. It also allows a counterdeception
intelligence team to identify deception.
In the 1955 Moscow Air Show case, there had been many flyovers
before. The established pattern was for aircraft of one type to fly by in a
single formation. Two flybys by the same aircraft type were an incongruity
that should have received some attention.
Vividness Weighting
In general, the channel for communication of intelligence should be as short
as possible; but a channel can also be too short. If it is, the result is vividness
weighting, in which evidence that is experienced directly is the most
compelling (“seeing is believing”). Decision makers
place the most weight on evidence that they obtain directly—a dangerous
pitfall that executives fall into repeatedly and that makes them vulnerable to
deception. Strong and dynamic leaders are particularly vulnerable: Franklin
Roosevelt, Winston Churchill, and Henry Kissinger are examples of
statesmen who occasionally did their own collection and analysis, sometimes
with unfortunate results.
Source Preferences
One of the most difficult traps to avoid is that of weighing evidence based on
its source. HUMINT operatives routinely value information gained from
clandestine sources—the classic spy—above that from refugees, émigrés, and
defectors. COMINT gained from an expensive emplaced telephone tap is
valued (and protected from disclosure) above that gleaned from high-
frequency radio communications (which almost anyone can monitor). The
most common pitfall, however, is to devalue the significance of OSINT;
being the most readily available, it is often deemed to be the least valuable.
Using open sources well is a demanding analytic skill, and it can pay high
dividends to those who have the patience to master it. Collectors may
understandably make the mistake of equating cost with importance.
Having spent a sizable portion of their organization’s budget in collecting the
material, they may believe that its value can be measured by the cost of
collecting it. No competent analyst should ever make such a mistake.
Premature Closure
The opposite of favoring recent evidence, premature closure also has been
described as “affirming conclusions,” based on the observation that people
are inclined to verify or affirm their existing beliefs rather than modify or
discredit those beliefs when presented with new information. It has been
observed that “once the Intelligence Community becomes entrenched in its
chosen frame, its conclusions are destined to be uncritically confirmed.”20
The primary danger of premature closure is not that the analyst might make a
bad assessment because the evidence is incomplete. Rather, the danger is that
when a situation is changing quickly or when a major, unprecedented event
occurs, the analyst will become trapped by the judgments already made. The
chances of missing indications of change increase, and it becomes harder to
revise an initial estimate.
2. William R. Doerner, “On Second Thought,” Time, January 23, 1989, 31.
5. Ibid., 113.
6. Ibid., 96.
7. Ibid., 92.
9. Ibid., 127.
13. Merrill Skolnik, ed., Radar Handbook, 3rd ed. (New York: McGraw-Hill,
2008), 17.33–17.34.
15. Dai Dahai, Wang Xuesong, Xiao Shunping, Wu Xiaofang, and Chen
Siwei, “Development Trends of PolSAR System and Technology,” Hefei
Leida Kexue Yu Jishu, February 1, 2008, 15.
16. Zhou Hong, Huang Xiaotao, Chang Yulin, and Zhou Zhimin, “Ground
Moving Target Detection in Single-Channel UWB SAR Using Change
Detection Based on Sub-Aperture Images,” Hefei Leida Kexue Yu Jishu,
February 1, 2008, 23.
18. Thaddeus Holt, The Deceivers: Allied Military Deception in the Second
World War (New York: Skyhorse Publishing, 2007), 379.
19. The material in this section was taken from Robert M. Clark, Intelligence
Analysis: A Target-Centric Approach, 5th ed. (2016) and modified.
21. CIA, Penetrating the Iron Curtain: Resolving the Missile Gap with
Technology (Washington, DC: US Government Printing Office, 2014), 43–
45.
Part Two Exercises
12 Sensor Mapping and Channel Tracking
Exercises
Use the following example as a model for formatting final products for the
three sensor mapping exercises in this chapter.
Since the closure of most of the North Sea oil fields, due to their depletion
and to falling demand for oil, there has been an increase in the pirating of
regular transport and fishing vessels, rather than tankers, in order to ransom
the crew members. Pirating incidents usually occur within 200 kilometers
(km) of the Scottish east coast, and many of the pirate boardings are done at
night. Investigations by cooperating authorities have determined that the
pirating along the east coast of Scotland is dominated by a large clan, the
Roberts Clan. Despite coast guard and navy analysts providing good
pattern-of-life (PoL) information, the pirates manage to maintain a steady
level of activity along the coast. The Scottish government has worked to
improve coordination between sea, land, air, and cyber assets. However, the
pirates seem always to be one step ahead. It is becoming clear that the
Roberts Clan has an effective intelligence collection capability. Table 12-1
illustrates a sensor mapping of the clan in roll-up form.
A reminder: The rating schema in the roll-ups in this chapter and the
following ones uses the source evaluation and rating schema introduced in
Table 6-2. The numbers in HUMINT reporting (for example, 023) refer to
specific sources.
Suggestion for class use: In each of the following three exercises, read the
short scenario and, individually or in teams, use the accompanying raw
intelligence to develop your own sensor map of the target
organization. Follow the format in Figure 12-2. Participants should then
present, discuss, and compare their sensor maps with those developed by
other students or teams.
List and discuss some of the challenges with identification of possible sensors
from raw intelligence. Discuss what is just “noise” and what is useful.
Discuss how training could be conducted in your organization to ensure that
adversarial sensors are identified from raw intelligence.
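Where teams want to automate part of this work, the roll-up bookkeeping is easy to prototype. The following minimal sketch (Python; the report records and sensor names are hypothetical and are not drawn from the exercise tables) tallies which INTs and sensors appear in tagged raw reporting, which is the bookkeeping that underlies a sensor map such as Table 12-1.

from collections import defaultdict

# Hypothetical tagged raw reporting. Each record names the INT discipline
# and the specific adversary sensor or source that the report reveals.
raw_reports = [
    {"int": "HUMINT", "sensor": "informant network, harbor district"},
    {"int": "COMINT", "sensor": "VHF scanner, coastal site"},
    {"int": "OSINT",  "sensor": "social media monitoring cell"},
    {"int": "COMINT", "sensor": "VHF scanner, coastal site"},
]

def build_sensor_map(reports):
    """Roll up tagged reports into {INT: {sensor: report count}}."""
    sensor_map = defaultdict(lambda: defaultdict(int))
    for report in reports:
        sensor_map[report["int"]][report["sensor"]] += 1
    return sensor_map

for int_type, sensors in build_sensor_map(raw_reports).items():
    print(int_type)
    for sensor, count in sensors.items():
        print(f"  {sensor} (reports: {count})")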
Sensor Mapping Exercise 12.1: Red 5th Battalion
Scenario
You or your team are part of the 10th Blue Battalion’s S2 (intelligence
section) and are preparing to support deception planning. The Blue Battalion
has been assigned a geographic area, their battlespace, called Krag Valley
West. Opposing you in your battlespace is an adversarial Red 5th Battalion.
Task
Use the intelligence provided in Table 12-2 to construct a simple sensor
map/model of the enemy Red 5th Battalion. Use Figure 12-2 as a model
template. How many different types of INTs do they have?
Sensor Mapping Exercise 12.2: Hells Cross Motorcycle
Gang
Sensor mapping is not restricted to military environments. This sensor
mapping assignment takes place in a fictional civilian environment common
to most nations where there is a constant struggle between authorities and
organized crime.
Scenario
In this fictional scenario, you or your team are part of the Copenhagen Police
District specialists in a new operational-level intelligence section that has
been created to help the police gain the initiative against the most powerful
organized crime gang in the city. The gang is known as the Hells Cross
Motorcycle Club (Hells Cross MC). A key figure in the gang is named Bo
Sondergaard, but it is not clear exactly what role he has.
Task
The first order of business for your intelligence section is to produce an easily
communicable model of how the Hells Cross MC collects and orients
information for use in their illicit activities as part of understanding their
overarching OODA loop. Use the provided intelligence reporting in Table
12-3 to construct a visual model that depicts how you as the subject matter
expert (SME) believe the gang observes and orients intelligence in the
Copenhagen Police District. Use Figure 12-2 as a model template.
Sensor Mapping Exercise 12.3: People’s Party Militia
Wing
Sensor mapping to support a coming strategic campaign is no different from
sensor mapping at other levels. It will quickly involve tactical considerations
as to how whole organizations observe and orient in the expected theater of
operations. Intelligence collection assets and platforms can be tactical in their
performance but strategic in their effects. Sensor mapping to support a
strategic campaign still encompasses the same methodological approach as
the two previous exercises.
Scenario
In this fictitious scenario, the Russian Federation is concerned about possible
unconventional warfare activities being directed by North Korea against the
Russian Federation. In the far east of the Russian Federation along the border
with North Korea, a local insurgency is being conducted by a group calling
themselves the People’s Party Militia Wing (PPMW). The group has
established shadow governance in the cities of Bamburovo, Bezverkhovo,
and Narva, and currently threatens Primorsky. Moscow has established a
fusion analysis cell consisting of members from the Foreign Intelligence
Service (FIS), Federal Security Service (FSS), and the Main Intelligence
Directorate (GRU) and sent them to Primorsky in order to assess the PPMW.
Task
Your team is tasked to sensor map the PPMW in order to understand how
they collect intelligence. You have three weeks before counterinsurgency
operational planning begins. Use the provided intelligence reporting in Table
12-4 to construct your sensor map, and use Figure 12-2 as a model template.
Channel Management Exercises—Deception Projection
Chapter 9 introduced the concept of channel management, along with the
concepts of congruence and incongruence. The following exercises allow
readers to practice identifying possible abnormal congruences or
incongruences within one’s own organizational channels being used to plan
and project a deception against a targeted adversary—in short, to better
manage deception. The objective in each exercise is to illustrate how channel
management can improve deception planning and reduce abnormalities that
might undermine the story line of the deception plan.
For each scenario you or your team are supporting an eventual deception
through channel management. The task is to apply the abnormality detection
framework found in Figure 12-3 and identify possible issues with the planned
story-line projection that could undermine the deception.
Each exercise includes a brief summary of the proposed story line for the
deception, as part of the situation description. The details of the story are
contained in a channel table, based on your understanding of the adversary’s
sensors. So you do not need to develop a sensor map. The table includes a
proposed timeline for the projection of the different elements of the
deception.
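Before turning to the scenarios, it may help to see the congruence and incongruence checks expressed mechanically. The minimal sketch below (Python; the channels, story elements, and values are hypothetical and are not taken from Figure 12-3 or the exercise tables) flags a story element that tells different stories on different channels, and one that rides only a single channel.

# Hypothetical planned projection: each row places one story element
# on one channel. None of this is taken from the exercise tables.
plan = [
    {"element": "loading port", "channel": "double agent",  "content": "Kourou"},
    {"element": "loading port", "channel": "radio chatter", "content": "Kourou"},
    {"element": "escort level", "channel": "press leak",    "content": "light"},
    {"element": "escort level", "channel": "radio chatter", "content": "heavy"},
    {"element": "sail date",    "channel": "press leak",    "content": "jan 9"},
]

def check_projection(plan):
    """Flag incongruent elements and elements lacking cross-channel support."""
    by_element = {}
    for row in plan:
        by_element.setdefault(row["element"], []).append(row)
    for name, rows in by_element.items():
        contents = {row["content"] for row in rows}
        if len(contents) > 1:
            print(f"INCONGRUENCE: '{name}' tells different stories: {sorted(contents)}")
        elif len(rows) == 1:
            print(f"WEAK CONGRUENCE: '{name}' rides only one channel "
                  f"({rows[0]['channel']}); a cross-checking opponent finds no corroboration")

check_projection(plan)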
Scenario
The date is October 2016. You or your intelligence analysis team are in South
America working for a EUROPOL counternarcotics task force, focusing on
countering cocaine smuggling from South America to Europe. French Guiana
has long been a favorite cocaine smuggling hub to Europe because of its
departmental status in France. Cocaine is smuggled via private yachts over
the Atlantic Ocean to Africa and eventually to Marseille in southern France.
It is an important smuggling route, as 1 kilogram (kg) of raw cocaine from
Guiana can be turned into as much as 10 kg for sale to users on the streets of
Europe. Intelligence analysis to date has determined with high certainty that a
large shipment of cocaine is being readied somewhere in the equatorial
forests of French Guiana’s interior. The shipment is expected to be sent
overland to a port on the coast of French Guiana, to be placed on waiting yachts, in
January 2017. The task force commander, recognizing the limited resources
for coastal surveillance, has ordered your intelligence section to project a
deception story via identified channels to the cartel, in order to convince the
cartel to choose a specific port for an expected attempt to load the yachts. The
purpose of the deception is to mitigate the lack of resources for monitoring
the whole of Guiana’s west coast and still intercept the shipment before the
yachts depart. There are five possible ports for the delivery. They include
Cayenne, Tonate, Kourou, Organabo, and Mana. The task is to project a story
to convince the cartel to attempt to load the cocaine at Kourou. The cartel’s
window for delivery to the coast is January 5–13, 2017, so you have
approximately two months to project the story.
Task
Compare and discuss abnormal congruences and incongruences between the
channels presented in the draft story-line deception plan (see Table 12-5) of
the counternarcotics task force.
Suggestion: Discuss in groups how you could improve the deception story
projection by mitigating the identified abnormalities.
Scenario
The date is May 12, 2016. The Central African Republic has asked the United
Nations for assistance in transporting a large amount of weapons-grade
uranium out of the country to an undisclosed location in the United States.
The move was prompted by the growing threat of radical insurgents in
Bangui who have close ties with international terrorist networks. As the
assigned joint task force J3 planner (operations), in close coordination with
the intelligence section, you are to put together a deception plan to support
the planned transport of the material from 1st Arrondissement in Bangui,
approximately 8 km to the Bangui International Airport. Three possible
routes have been put forward: Ave. de France, Ave. de Martyrs, and Route
Nationale 2 (RN2). The route that will be used for the actual transport of the
uranium is Ave. de Martyrs. The Central African government has already
announced that the route will be closed to the public on the planned date for
the delivery, October 4, 2016, due to the uranium transport. However, this
could possibly play to your advantage in terms of planning a deception. The
objective of the deception is to fix the insurgent focus and resources on Ave.
de France and/or RN2. You have a timeline of approximately three months to
work with.
Task
Compare and discuss abnormal congruences and incongruences between the
channels presented in the draft story-line deception plan (Table 12-6) that
could undermine the attempt to deceive the insurgents as to the route that will
be used to transport the uranium to the airport.
Suggestion: Discuss in groups how you could improve the deception story
projection by mitigating those identified abnormalities. Discuss if, and how,
the announcement of the Central African government as to the planned route
for uranium transport was exploited in the deception plan.
Note
1. A tear line is a short intelligence report that does not name a specific
source in order to protect the collection method.
13 General Deception Planning Exercises
The exercises are designed to illustrate the use of deception in a wide range
of transnational issues. Military operations are a part of some exercises, but
the emphasis is on aspects other than traditional force-on-force operations.
Chapter 14 covers purely military exercises.
Although the details presented vary from plan to plan, each deception plan
should describe at least the following elements (a skeletal template follows the list):
Desired outcome
Target(s)
Story
Channels to be used
Timing of information
Deconfliction analysis
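To keep exercise write-ups consistent, it can help to fix these elements in a template. The skeleton below (Python; every value is a placeholder, not a solution to any exercise) is one possible format.

# A skeletal deception plan record mirroring the required elements.
# Every value is a placeholder for the planning team to fill in.
deception_plan = {
    "desired_outcome": "what the target should decide or do",
    "targets": ["the decision maker(s) the story must reach"],
    "story": "the false picture, stated in a sentence or two",
    "channels": ["which adversary sensors or sources carry each element"],
    "timing": "when each element is projected, keyed to the target's OODA loop",
    "deconfliction": "checks that the plan does not disrupt friendly operations",
}

for element, value in deception_plan.items():
    print(f"{element}: {value}")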
Exercise 1: Derailing a Trade Negotiation
You are the special assistant to the minister of industry and trade (MIT) in the
country of Monopolitania. The ministry recently received a classified report
from the country’s intelligence service, the Monopolitania Intelligence
Bureau (MIB). The report indicates that secret multilateral trade negotiations
are under way involving the Philippines, Thailand, Vietnam, and Indonesia.
The objective of the negotiations is to establish a free trade zone among these
four countries for iron and steel products, computers and electronics, and
chemical products.
Key Participants
The trade negotiations are taking place in Bangkok and are hosted by the
Thai government. The key participants are listed below. The comments about
their positions in the negotiations were made by a clandestine MIB source
who acts as secretary to one of the ministers participating in the trade talks.
The MIB has a cyber operations branch that targets Internet servers and
government computer installations in Thailand and the Philippines. It has a
number of successful intrusion operations that could assist in targeting the
negotiating teams.
Guidance
Your assignment is to prepare a deception operation, targeting each
individual minister, that will cause the negotiations to collapse. Remember:
Monopolitania is not the United States. Therefore, there are no restrictions on
lying to the press and no concerns about unethical conduct. Because
negotiations are ongoing, you have two days to craft the plan.
Exercise 2: Protecting Drug Cartel Shipments
This exercise requires the development of two deception plans to divert US
attention away from a drug cartel’s proposed novel methods for shipping
drugs into the United States and exfiltrating profits. Though based loosely on
factual information, the exercise scenario is notional.
You are employed as the intelligence officer for Carlos “el Chacal” Gutiérrez,
the head of the Sinaloa drug cartel. El Chacal is desperately trying to find a
way to deal with declining cartel revenues and has turned to you for help.
His cartel has long been a leader in the distribution of Colombian cocaine,
Mexican marijuana, methamphetamine, and Mexican and Southeast Asian
heroin into the United States. In its early years, the cartel relied on
automobiles, buses, tractor trailers, and rail cars for bulk delivery. As those
channels were methodically shut down by US drug enforcement actions, the
cartel switched to airborne, seaborne, and even subterranean delivery. Small
aircraft, flying low to avoid US radar coverage, delivered drugs to airports in
the South and Southwest. “Go-fast” boats and small submarines (semi-
submersibles) moved drugs via the sea route from Colombia to Mexico and
from there into California. Tunnels—some as long as 700 meters and as deep
as 30 meters—were dug beneath the US–Mexican border near Tijuana and
used to move both bulk drugs and illegal immigrants into the United States.
In the last two years, though, these channels for delivery have been severely
restricted and losses of drugs in transit have increased dramatically. Tunnels
beneath the border near Tijuana are being discovered quickly, in some cases
during their first use. The US Navy and US Coast Guard are routinely
intercepting the cartel’s sea shipments, with consequent loss of cargo and
crew. It has become harder to find crews willing to attempt the sea delivery
route. The airborne delivery route has been basically shut down. Radar
detection and interception of aircraft (including unmanned aerial vehicles)
now is routine. Other cartels are having the same problem. The result is that
the wholesale price of cocaine has skyrocketed in the last year. Gutiérrez is
dealing with the most lucrative market the drug trade has ever seen, but he
needs to be able to deliver drugs in bulk to meet the demand.
Gutiérrez can afford to lose 75 percent of his drug shipments—the loss rate
up until two years ago—and still make a substantial profit. But his current
loss rate is approaching 90 percent, and the resulting low profit margin is
unacceptable over the long term. Furthermore, the cartel is suffering
substantial losses in its attempts to move the profits back across the border
into Mexico. Gutiérrez can launder the funds once they are in Mexico, but the
US Drug Enforcement Administration (DEA) increasingly has had success in
intercepting the profits before they can be exfiltrated.
Gutiérrez believes that he has the solution to both problems, in the form of a
new way to clandestinely move goods in both directions across the border.
He has two options for increasing the quantities of drugs delivered. One
option makes use of an unmanned underwater vehicle (UUV). The other
option relies on two unmanned aerial vehicles (UAVs). Both options result in
about the same delivery rate—about 400 kg/day. The current street prices are
$110/gram for cocaine and $245/gram for heroin. Either option would
generate income for the cartel, at the current street prices, of between $44
million and $98 million per day.
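The income range quoted follows directly from the delivery rate and the street prices. A quick check, using only the numbers given in the exercise:

kg_per_day = 400                 # delivery rate for either option
grams_per_kg = 1_000
cocaine_price = 110              # US$ per gram, street
heroin_price = 245               # US$ per gram, street

grams_per_day = kg_per_day * grams_per_kg        # 400,000 grams/day
low = grams_per_day * cocaine_price              # all-cocaine load
high = grams_per_day * heroin_price              # all-heroin load
print(f"${low / 1e6:.0f} million to ${high / 1e6:.0f} million per day")
# -> $44 million to $98 million per day, matching the figures in the text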
Gutiérrez can afford to pursue one of the two options—to acquire one UUV
or to acquire two UAVs. The problem is that increased quantities of drugs
available on US streets will be quickly noticed by the US DEA, and the
agency will make a determined effort to find out how the drugs are entering
the country. The deception needs to keep the DEA and its partner agencies
looking in the wrong places for entry routes. El Chacal wants a detailed
deception plan to protect each of his proposed options.
The UUV is a modified copy of the Talisman UUV design first produced by
the Underwater Systems Division of BAE Systems. It features an
innovatively shaped carbon fiber composite hull, with internal pressure
vessels containing the electronics systems and payload. The hull was
designed to be stealthy against both passive and active sonar. The vehicle
weighs approximately 2,000 kg without cargo and is approximately 6 meters
long by 2.5 meters wide. Once loading is completed, it dives to a depth of
200 meters and stays submerged until reaching its destination.
The primary concern is detection en route. If the United States suspects that a
UUV is being used, and has a general idea of the delivery route, then the
UUV is likely to be intercepted and captured or sunk. Although it is silent
and stealthy during normal operation, some acoustic techniques (discussed
later) make detection likely if a thorough search is made for it.
The operational plan calls for the drone to be launched from a different
Mexican airfield on each mission, to avoid creating a noticeable pattern. The
drone would fly nighttime missions to one of six small airfields (having sod
runways) located in Arizona or New Mexico, well away from the Mexican
border. There the drug payload would be offloaded, drug profits onloaded,
and the drone refueled for a return mission the next night. At all times on the
ground, the drone would be concealed in a barn at each end of the flight
route.
Cartel OODA
HUMINT: You believe that the DEA has recruited one of the cartel’s
enforcers in Tijuana as a source for reporting on cartel activities. Gutiérrez
wants to eliminate him but on your recommendation has reluctantly agreed to
keep the man alive—for now—so that you can use him as an unwitting
channel for passing misleading information to the DEA. You do not know
how credible the source is in the DEA’s eyes.
OSINT: You have two staffers who monitor news wires and Internet
accounts of “drug busts” in the United States. These accounts have provided
valuable insights into DEA, Border Patrol, and Coast Guard methods for
identifying and intercepting drug shipments.
US OODA
Although your title is “intelligence officer,” most of your work to date has
been counterintelligence, that is, assessing and countering official US and
Mexican governmental intelligence efforts to target the cartel. The primary
threat, the United States, has deployed a number of sensors to monitor drug
trafficking from Mexico and Central America. You have developed threat
analyses on the following intelligence sources and sensors.
HUMINT
Although you are aware of one DEA HUMINT source in your organization,
you suspect that the Mexican government or the DEA has at least one other
source that you have not yet identified.
OTH Radars
Two US Navy high-frequency (HF) over-the-horizon (OTH) radars known as
ROTHR (Relocatable Over-the-Horizon Radar) are operated at Corpus
Christi, Texas, and Chesapeake, Virginia. The two radars provide coverage of
the Caribbean Sea and portions of the Atlantic Ocean and the Gulf of
Mexico. The radars are operated full time in a counternarcotics surveillance
role. They can detect small aircraft flying at any altitude. However, you have
consulted with radar experts who believe that the UAV is not detectable by
these radars—or, at the most, is detectable only intermittently. Although the
present ROTHR coverage does not include the likely flight routes for cartel
UAVs, the DEA could have the radars reaimed to cover those routes if they
suspect the existence of UAV flights.
Aerostats
The US Tethered Aerostat Radar System (TARS) is designed to detect low-
level air, maritime, and surface smugglers and narcotics traffickers. TARS
provides coverage of the US–Mexico border, the Florida Straits, and the
Northern Caribbean. These aerostats resemble blimps but are raised and
lowered by tether to adjust to weather conditions and for maintenance. The
sensor suite is known to include a radar that can detect small aircraft and
small boats. You suspect that it also has SIGINT equipment that is used to
monitor communications. Six aerostats are deployed at Yuma and Fort
Huachuca, Arizona; at Deming, New Mexico; and at Marfa, Eagle Pass, and
Rio Grande City, Texas.
After talking with technical experts, you have concluded that the stealth UAV
probably could not be detected by the TARS radars operating in normal
mode, unless the UAV comes within 18 miles (30 km) of a radar. But your
expert has advised that, if the Yanquis suspect that stealth aircraft are being
used, they probably could network the radars using a technique called
“bistatic radar” (using one TARS transmitter in conjunction with the receiver
on another TARS) to detect the UAV.
P-3 Patrol Aircraft
All P-3 flights are intelligence driven and planned by a joint task force. A
typical flight is designed to conduct surveillance of a designated search area.
If the P-3 spots a suspect craft, the aircraft will typically stay out of the ship’s
visual range, taking pictures of the vessel and its cargo. The P-3 usually
continues to monitor the suspect craft until a US Coast Guard or US Navy
ship interdicts the craft. If interdiction is not possible, the P-3 will overfly the
boat at close range, usually resulting in the crew tossing the drugs overboard.
Cyber Operations
A DEA cyber operations team is believed to be responsible for some recent
diversions of cartel funds into accounts that may have been DEA controlled.
The team is also suspected of identifying cartel funds-laundering efforts.
SIGINT
The TARS aerostats are believed to carry SIGINT equipment that can detect
and track radars operating in their areas of coverage. They also likely collect
COMINT from cell phones and push-to-talk radios. The DEA also has
intercept stations at several locations in California, New Mexico, Arizona,
and Texas. These stations are believed to monitor the cartel’s cell phone
traffic.
Guidance
You are to develop two detailed deception plans—one for each drug transport
and funds recovery option. The deception must lead US enforcement teams
away from the actual method and route used. Gutiérrez will choose which
transport option to pursue based on the deception plan that he believes is
most likely to succeed.
Exercise 3: Taking Down the Fordow Uranium
Enrichment Facility
This exercise involves deception planning by Israel’s Mossad to protect the
source of a cyber attack against a clandestine uranium enrichment operation
at Fordow, Iran. Though based loosely on factual information, the exercise
scenario is notional. You are the Mossad agent in charge of planning the
deception.
Fordow Background
The Fordow fuel enrichment facility is located 20 miles (32 km) northeast of
the Iranian city of Qom, near the village of Fordow. Iran secretly began
construction of the facility in 2006. In September 2009, US, French, and
British officials notified the International Atomic Energy Agency (IAEA)
that Fordow was a uranium enrichment facility and presented evidence to
support their claim. Fordow at that time was Iran’s second uranium
enrichment facility, the other one being located at Natanz.
Recent Events
Mossad intelligence has determined that Iran is working to evade restrictions
and produce enough fissile material for at least one nuclear warhead. The
Iranians have expanded the underground facility and installed approximately
6,000 of the newest and most efficient centrifuges for uranium enrichment,
with the goal of producing sufficient 90-percent enriched U-235 for a nuclear
weapon. The facility began operation two months ago.
Since his recruitment as a Mossad source inside the plant, Rafael has provided plans of the facility, details on the
progress of enriched U-235 production, and details about the computers and
software that operate the centrifuges. The centrifuge control system is on a
separate intranet accessible only by four key people at the facility (see “Key
Personnel at the Fordow Facility” section). This intranet is protected by a
firewall from the main Fordow network (which has no physical connection to
the outside world). Because the plant’s main network is not connected to the
outside, the only way to introduce malware is by an insider with access to the
computer network. Rafael has access to the main network but not to the
centrifuge control intranet.
The malware makes use of what is called a VPN pivot. Once introduced into
the main network, the malware “pivots” to bypass the centrifuge intranet
firewall. It then infects the centrifuge control system and eliminates all
evidence of its existence in both networks. It is designed to disrupt
production by causing the centrifuges to spin out of control and self-destruct.
The trigger date and time for the disruption is to be set by Rafael after he
loads the malware into the main network.
Israeli scientists expect the resulting damage to include explosions, fires, and
the release of enriched U-235 gases. Due to the facility’s subterranean
location, they do not know whether evidence of an explosion or fire will
reach the surface. They do expect the gaseous component to eventually
escape into the atmosphere. They also expect a number of casualties among
workers at the plant. The malware is designed to erase any evidence of its
presence after it has done its work—in case the computing facility survives.
Rafael has agreed to introduce the malware only if he can be assured that the
resulting investigation will not center on him. He also does not want to be in
the facility when the malware is triggered because of uncertainty about the
possible collateral damage, but has agreed to report on the result later.
Ranjbar Saeed
Prior to his appointment as plant director at Fordow, Ranjbar Saeed was the
chief scientist in charge of fuel enrichment at Natanz. Saeed is known to be a
strong supporter of Iran’s nuclear weapons program. IAEA files indicate that
Saeed was the head of the Institute of Applied Physics, which acted as a
cover for scientific work on a possible Iranian nuclear weapons program. He
previously chaired the physics department at Tehran’s Imam Hossein
University, which is linked to the Iranian Revolutionary Guard Corps and to
work on nuclear weaponization. Saeed has also reportedly worked at Shahid
Beheshti University, sanctioned by the European Union for associations with
Iran’s missile and nuclear programs. His name has been associated with the
Iranian Nuclear Society, which was previously called Iran’s Coordination
Committee for nuclear specialists, allegedly founded by the Ministry of
Defense to employ university professors and scientists in defense projects. He
was a deputy chair of that group.
Hatef
Hatef was an early addition to the United Nations sanctions list under
Security Council resolution 1747 in 2007 as a person “involved in nuclear or
ballistic missile activities.” These sanctions restrict his travel and impose an
asset freeze on any international holdings. He was subsequently placed under
European Union sanctions. Although the Security Council did not provide the
exact reason for his being added to the sanctions list, according to an expert
close to the IAEA, his involvement in the Iranian nuclear program led to his
sanctioning.
Mehran Asgharian
Mehran Asgharian graduated from the Iran University of Science and
Technology with a doctorate in computer engineering. He is currently the
chief of the computing section at Fordow and oversees both the hardware and
software operations at the facility. Rafael frequently visits Asgharian’s office
to discuss process details with him.
Hassan Gharmeei
Hassan Gharmeei also graduated from the Iran University of Science and
Technology with a bachelor’s degree in computer engineering. He is
responsible for development and maintenance of the software that controls
the centrifuge operation. Rafael has worked closely with Gharmeei on several
occasions during the control software development process and considers
Gharmeei a friend. Like Rafael, Gharmeei is a moderate Muslim.
Iranian OODA
The Ministry of Intelligence and Security (MOIS) employs all means at its
disposal to protect the Islamic Revolution of Iran, utilizing such methods as
infiltrating internal opposition groups, monitoring domestic threats and
expatriate dissent, arresting alleged spies and dissidents, exposing
conspiracies deemed threatening, and maintaining liaison with other foreign
intelligence agencies as well as with organizations that protect the Islamic
Republic’s interests around the world.
All organizations must share information with the MOIS. The ministry
oversees all covert operations. It usually executes internal operations itself,
but the Quds Force of the Islamic Revolutionary Guards Corps for the most
part handles extraterritorial operations such as sabotage, assassinations, and
espionage. Although the Quds Force operates independently, it shares the
information it collects with MOIS.
Israeli OODA
HUMINT: Mossad has no sources in Fordow other than Rafael. It has one
source who frequently visits the city of Qom, where the key personnel live.
IMINT: Israel has a suite of imagery satellites, the Ofek (or Ofeq) suite. The
following members of that suite provide imagery of Fordow on a regular
basis:
Ofek 5 and Ofek 7 are visible imagers offering better than one-half
meter of image resolution.
Ofek 9, launched in 2010, is a multispectral imager. Like Ofek 5 and
Ofek 7, Ofek 9 offers a resolution that is substantially better than one-
half meter.
Ofek 10, also known as TecSAR, is a radar reconnaissance satellite. It is
equipped with a high-resolution synthetic aperture radar that is capable
of collecting imagery at night and through clouds.
CYBER: The IDF has several elite teams of cyber warriors who engage in both cyber
offense and defense to support military operations. The teams target Iranian
missile and nuclear-related establishments for cyber intrusion and intelligence
collection as well. Over the last year, the teams have had several successes
against military targets, but they report increasing difficulty in penetrating
these targets as the Iranians have developed better cyber defenses.
Guidance
The Mossad operations commander has directed you to prepare a deception
plan with three objectives.
Exercise 4: Countering the Attack on Fordow
In this exercise, the counterpart of Exercise 3, you take the Iranian side as an
officer of the Ministry of Intelligence and Security (MOIS). Farghadani has
received from the Israelis a software file, with instructions to
load it onto a thumb drive and insert it into the Fordow main computer
system. The file contains malware that is designed to infect the plant’s
centrifuge control system. Your experts have reviewed the malware and
believe that it would indeed cause the destruction of the plant centrifuges—if
the details about the plant software that the Israelis received had been
accurate.
The MOIS minister wants the Israelis to believe that their plan succeeded and
that Fordow is out of operation for the foreseeable future. He has turned to
you with instructions to make that happen.
Iranian OODA
This is your understanding of your own Iranian OODA—which differs from
the Israeli view presented in Exercise 3. Disregard the corresponding
information from that exercise.
HUMINT: You are aware that the Islamic Revolutionary Guard Corps—
Quds Force has an officer clandestinely operating in Jerusalem who runs an
agent network in the Israeli capital. You do not have any details about the
available sources, but the director of the operation has agreed to use his
sources, if they have the necessary access, to either pass information or to
obtain information. Based on discussions with your friends in Quds Force,
you suspect that the sources are low level, better positioned to carry out
covert action than to collect intelligence or to feed misleading information to
the Israelis.
Israeli OODA
This is Israel’s OODA as viewed by the MOIS—which differs from the
Israeli view presented in the previous exercise. Disregard the corresponding
information from that exercise.
IMINT: Israel has a suite of imagery satellites, the Ofek (or Ofeq) suite. The
satellites are believed to image government and defense installations in Iran
on every pass:
Ofek 5 and Ofek 7 are visible imagers reportedly offering better than
one-half meter of image resolution.
Ofek 9, launched in 2010, reportedly is a multispectral imager. Like
Ofek 5 and Ofek 7, Ofek 9 offers a resolution that is substantially better
than one-half meter.
Ofek 10, also known as TecSAR, is a radar reconnaissance satellite. It is
reportedly equipped with a high-resolution synthetic aperture radar that
is capable of collecting imagery at night and through clouds.
Guidance
Your assignment is to develop a deception plan to convince the Israelis that
their malware has successfully wrecked the centrifuges. Your plan must
include some means to ensure that Israeli intelligence has bought in to the
deception.
Exercise 5: Supporting a NEO Evacuation
This exercise features the development of a multi-INT deception plan to
cover a noncombatant evacuation operation (NEO) in the city of Tripoli,
Libya. It is fictional but is based loosely on real organizations.
Background
Two weeks ago, a Daesh contingent conducted a surprise attack on Tripoli
from the sea, seizing much of Tripoli harbor. The attack was launched by
Daesh units stationed in Sirte, Libya. The attackers had support from Daesh
cells in Tripoli. The attack is being resisted by fighters from the pro-Islamist
militias known collectively as Libya Dawn, who control most of the Tripoli
area and serve as the military enforcement arm of the current Tripoli
government.
During the last week, Daesh units have mounted a series of attacks in Tripoli
from their harbor positions. They currently control the entirety of Tripoli
harbor, as shown in Figure 13-1. Libya Dawn militias control the rest of the
area shown, though their level of “control” is quite loose.
The city appears to be descending into chaos, and local residents are
panicked. The government of Tripoli, the General National Congress (GNC),
has fled the city in the face of Daesh advances. Libya Dawn forces remain in
the city and outnumber the Daesh force, but Daesh is better organized and
equipped.
A seaborne NATO joint task force (JTF) has been established and is currently
en route to Tripoli, expected to arrive off the coast in two days. The JTF is
charged with conducting the NEO of foreigners trapped in two hotels, the
Corinthia Hotel Tripoli and the Sheraton Tripoli. The JTF commander has received three
different proposals from his subordinate commanders on how to conduct the
NEO. All proposals involve sending an advance special operations team into
both hotels to protect civilians trapped there and to coordinate the evacuation.
The JTF commander has concluded that, no matter which option is selected,
it will be essential to conduct a deception to misdirect Daesh and Libya Dawn
forces away from the actual evacuation to improve the chances of success for
the NEO. He has asked for a detailed and complete deception plan to support
each of the three operations.
Daesh Background
The Daesh commander in Tripoli is Abdul Sa’ad al-Tajuri. He reportedly was
the mastermind behind several attacks on government forces and civilians in
the Sirte region. Associates describe him as brutal, short tempered, impulsive,
and suspicious, bordering on paranoiac about both external and internal
threats. As a unit commander in Sirte, he was responsible for numerous
public beheadings and hanging men in orange jumpsuits from scaffolding in
what he called “crucifixions” for alleged crimes including blasphemy,
sorcery, and spying.
Tajuri appears determined not to allow Westerners trapped in the two hotels
to escape, though his plans for them remain unknown. In Internet
communications monitored by the NATO SIGINT units, Tajuri has ordered
all units in the Tripoli area to be on alert for an airborne or seaborne attempt
to rescue foreigners in the two hotels.
The areas of Daesh control are shown in Figure 13-1. Tajuri has positioned
snipers close to the Corinthia Hotel, and Libya Dawn units near the hotel
have taken some casualties from sniper fire.
Daesh OODA
Daesh units now are well positioned to observe activity at the Corinthia Hotel
Tripoli. Unconfirmed reports indicate that a Daesh cell also has the Sheraton
Tripoli under observation. Daesh monitors CNN and Al Jazeera broadcasts
about the crisis.
Daesh SIGINT units in the area make use of commercially available French
SIGINT equipment and captured Russian SIGINT equipment. They are
believed to be able to monitor radio traffic from shipping in the area;
handheld two-way radio traffic; cell phones; and air-to-ground
communications from commercial and military aircraft. Daesh also
reportedly monitors Internet traffic to and from both hotels. The SIGINT
units may be able to detect the approach of the JTF by monitoring ship radar
emissions and ship-to-ship communications.
Daesh SIGINT units are also believed to be able to monitor at least Libya
Dawn's two-way radio traffic, and possibly its cell phone traffic as well.
Some Libya Dawn units in Tripoli probably have been infiltrated by Daesh
agents, but there are no details on specific infiltrations.
Zintan Brigades
The powerful anti-Islamist Zintan militia or Zintan brigades support the
internationally recognized government of Libya (the Government of National
Accord, or GNA) and have clashed on numerous occasions with Libya Dawn.
The group is equally hostile to Daesh but has not previously engaged Daesh
units.
On July 14, 2014, Tripoli International Airport was the site of a fierce battle
between Libya Dawn forces and the Zintan militia, leaving the airport
facilities almost completely destroyed. The airport has been closed since then,
but the runways remain operational. The Zintan militia currently has units in
the region surrounding the airport; the units moved in after Libya Dawn
militias moved back into the center of Tripoli to deal with the threat from
Daesh.
NATO Force
German units afloat include frigates Brandenburg and Rheinland-Pfalz as
well as the task force supply ship Berlin and the fleet service boat Oker. The
Germans have three C-160 aircraft (each with the capacity to carry 95
passengers) on standby in Malta that are available to support the NEO. A
Bundeswehr Special Forces team (Kommando Spezialkräfte) of 400 troops is
on standby in Malta.
British units include four C-130 aircraft (each with the capacity to carry 92
passengers) and a 500-man special operations contingent, also on Malta.
A US Navy strike group from the Sixth Fleet is the main force supporting the
operation. It includes the following:
NATO OODA
Cyber operations support is provided by the US Tenth Fleet (US Fleet Cyber
Command). The NATO information operations team also has tapped into the
Daesh intranet and is monitoring traffic between the leadership and individual
units scattered around Tripoli.
Guidance
Your assignment as the lead planner is to develop and present a deception
plan to support one of the three proposed evacuation options. (Note for
classroom use: This assumes a class divided into three teams, each team
taking a different option.) The JTF commander is amenable to modifying any
of the three proposed evacuation plans or conducting demonstrations to
support the deception. The plan must identify the information to be provided
to each opposing or friendly force (Daesh, Libya Dawn, and the Zintan
brigades), who provides it, and how and when to provide it. It must also
identify and schedule information that is to be provided publicly (e.g., via
press releases, or CNN or Al Jazeera broadcasts).
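A deception plan of this kind is, at bottom, a schedule of messages: for each
target audience, the planner specifies the channel expected to carry the
message, the friendly element that injects it, the impression it should create,
and its timing relative to the operation. The short Python sketch below
illustrates one way to keep such a schedule organized; the audiences,
channels, timings, and message contents are hypothetical placeholders rather
than part of the exercise materials.

from dataclasses import dataclass

# Minimal sketch of a deception message schedule. Every entry below is a
# hypothetical placeholder; a real plan would be built from the exercise's
# own audiences and channels.
@dataclass
class PlannedMessage:
    day: int        # D-day offset (negative = days before the NEO)
    audience: str   # opposing or friendly force the message targets
    channel: str    # collection channel expected to carry it
    provider: str   # friendly element that injects the message
    content: str    # impression the message should create

plan = [
    PlannedMessage(-2, "Daesh", "ship-to-ship radio", "JTF surface units",
                   "task force is staging for an airport landing"),
    PlannedMessage(-1, "Libya Dawn", "two-way radio", "liaison team",
                   "evacuation will go overland, not by sea"),
    PlannedMessage(-1, "public", "press release", "JTF public affairs",
                   "operation delayed pending negotiations"),
]

# Reviewing the schedule in execution order makes conflicts between
# messages, and audiences left uncovered, easier to spot.
for msg in sorted(plan, key=lambda m: m.day):
    print(f"D{msg.day:+}: {msg.provider} -> {msg.audience} "
          f"via {msg.channel}: {msg.content}")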
Exercise 6: Disrupting Boko Haram Operations
This exercise features a no-holds-barred PSYOPS deception plan to disrupt
the terrorist group Boko Haram in the Lake Chad Basin region of Africa. It
is fictional but is based loosely on real organizations.
You are currently planning a series of deception operations to carry out your
mission. Following are details about your opponent and the resources
available to you. Military leaders in Chad, Cameroon, and Nigeria have
agreed to conduct operations to assist in the execution of your overall
deception plan, so long as it does not unduly interfere with their normal
operations.
Boko Haram initially relied on donations from members. Its links with al-
Qaeda in the Islamic Maghreb opened it up to funding from groups in Saudi
Arabia and the United Kingdom. But it also gets funding from bank robberies
and kidnapping for ransom. Boko Haram also occasionally has been
connected in media reports with cocaine trafficking. The group cloaks its
sources of finance through the use of a highly decentralized distribution
network. The group employs an Islamic model of money transfer called
hawala, which is based on an honor system and a global network of agents
that makes the financing difficult to track.
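Why hawala is so difficult to track can be made concrete with a toy model:
value moves by offsetting ledger entries between trusting agents, so no
individual transaction ever crosses a border. The Python sketch below
illustrates the mechanism only; the agent names and amounts are invented.

# Toy model of a hawala transfer. At transfer time nothing moves between
# countries; only the trusted agents' ledgers change, which is why there
# is no cross-border transaction to intercept. Names are invented.
ledgers = {"agent_lagos": 0, "agent_ndjamena": 0}

def hawala_transfer(amount, sender_agent, receiver_agent):
    # The sender hands cash to a local agent; the receiving agent pays the
    # beneficiary from local funds. The two agents now owe each other.
    ledgers[sender_agent] += amount    # holds cash collected locally
    ledgers[receiver_agent] -= amount  # out of pocket until settlement

hawala_transfer(10_000, "agent_lagos", "agent_ndjamena")
print(ledgers)  # imbalances are settled later, often in goods or trade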
Boko Haram's loyalty pledge to Daesh has so far been mostly a branding
exercise designed to boost its international jihadi credentials, attract recruits,
and appeal to the Daesh leadership for assistance.
There have been periodic reports of cooperation between Boko Haram and
the Libyan branch of Daesh. In April 2016, OSINT reported that an arms
convoy believed bound for Boko Haram from Libya was intercepted in Chad,
providing one of the first concrete examples of cooperation. Evidence shows
that Boko Haram cooperation with the Daesh Libya branch has increased
since then. Boko Haram fighters travel to Libya to fight for Daesh. In turn,
Daesh sends arms and supplies to Boko Haram via convoys through Chad.
Key Officers
Abubakar Shekau, Mamman Nur, and an unidentified leader of the Ansaru
splinter group are the three most influential leaders in Boko Haram's
network. Shekau is the network's most visible leader, but the other two
leaders wield power in the network and have a complex relationship with
Shekau.
Abubakar Muhammad Shekau
Boko Haram’s current leader was born in Shekau village in Nigeria’s
northeastern state of Yobe. He is said to be a fearless loner, a complex,
paradoxical man—part theologian, part gangster. He is known for his intense
ideological commitment and ruthlessness. He reportedly said, “I enjoy killing
anyone that God commands me to kill—the way I enjoy killing chickens and
rams.”3 Shekau is fluent in Hausa and Arabic as well as his native Kanuri
language. He is believed to be in his late thirties to mid-forties. Shekau does
not communicate directly with his troops. He wields power through a few
select subcommanders, but even with them contact is minimal. He remains
firmly opposed to any negotiations or reconciliation with the Nigerian
government.
Shekau is a divisive leader but has legitimacy because he was deputy to Boko
Haram founder Muhammad Yusuf and remained close to grassroots followers
in Borno State. He retains a core group of loyalists because many militants
who opposed him have defected or were killed by Boko Haram’s more
ruthless and indoctrinated militants. The loyalty of these militants to Shekau
remains high, though some reportedly oppose his decision to align closely
with Daesh.
Mamman Nur
Mamman Nur is connected to al-Qaeda affiliates in Africa and is both an
operational and ideological leader. Nur reportedly bases his operations in
Kano and has little or no network presence in Borno. His fighters are former
followers of Boko Haram founder Muhammad Yusuf, with whom Nur was
closely allied. They tend to have an international outlook. Nur occasionally
cooperates with Ansaru militants. He has ideological disagreements with
Shekau because of Shekau’s leadership style and close relations with Daesh.
Ansaru
Ansaru is an Islamist jihadist militant organization based in the northeast of
Nigeria. It is a Boko Haram splinter group, founded in January 2012. Ansaru
was headed by Khalid al-Barnawi until his capture in April 2016. Al-Barnawi
has been replaced, but his successor has not been identified in press
reporting.
The group has its home base in Kano State in north-central Nigeria. It
coordinates its operations with al-Qaeda in the Islamic Maghreb (AQIM),
based in northern Mali. Many of its members were trained in AQIM camps in
Algeria.
Ansaru's most recent operations have been in Niger, Cameroon, and possibly
the Central African Republic. It now functions as an "external operations
unit" in its self-declared area of operations in "Black Africa," a division of
labor that separates Ansaru from Boko Haram in Borno and avoids conflict
between the two groups.
Your Resources
Channels
Your special operations unit has acquired two Boko Haram short-range
communications transceivers and can use them to monitor Boko Haram
communications in a local area or to communicate with Boko Haram officers.
Boko Haram has increasingly used videos for online recruiting and for
indoctrinating new recruits. It has incorporated into its recruiting strategy the
same message that Daesh has used successfully: that it is steadily winning
and controls territory that constitutes a state. These videos are produced by
Daesh operatives outside the region and transmitted via a dark web
connection that the French cyber unit monitors. The videos include several
Daesh-produced manuals on weapons manufacturing and training.
Guidance
Your assignment is to develop and execute a plan comprising several
separate deception operations that carry out the operational missions set
forth at the beginning of this exercise.
Notes
1. The Institute for Science and International Security, “Nuclear Sites:
Fordow Fuel Enrichment Plant (FFEP),”
http://www.isisnucleariran.org/sites/detail/fordow/.
2. The term chicken feed refers to true material of low intelligence value that
a double agent provides to an opposing intelligence service to lead it to
accept him or her as a credible source.
Situation
It is the summer of 2012. The country of Chad has been locked in civil war
for the better part of two years. The conflict is between pro-Western
government troops and a radical tribe called the Zagaweri, which has control
of the second largest city, Abeche. Since taking control of Abeche, the
militants have reinforced all perimeter defenses with deep trenches,
improvised explosive device (IED) belts, and tunnels.
The Chadian government has determined that Abeche must be retaken in the
spring of 2013, and it wishes to do so from the inside out, avoiding the
perimeter defenses, in a plan called OP Termite Junction. While the Chadian
military is busy with covert training and infiltration operations supported by
Western special operations forces (SOFs), the government has asked African
Union (AU) contingent operational planners to design and execute a
deception plan in support of an uprising being planned by anti-Zagaweri
tribes inside the defensive perimeter.
3. End-of-Operation Statement
MILDEC operations typically end after the real operation has begun. The
condition for ending the deception is thus that the real operation has
progressed to a point where adversaries can no longer be uncertain about its
intent, are reacting to it, or are simply ignoring the deception.
Scenario
Political
The country of Tyronika (TKA) has been in a state of civil war with the
North Tyronika Region (NTR) since 2014, when NTR declared unilateral
independence and named the second largest city in Tyronika, Bruks, as its
capital. The TKA government reacted immediately, mobilizing the southern
region and developing a military campaign plan to reassert central control
over NTR. By the end of 2016, despite TKA air superiority over NTR-held
territory, a stalemate had developed with the NTR forces, called the Northern
Rebel Defense Forces (NRDF), restricted to a pocket on both sides of the
Frank River and backed onto the Northern Sea coast.
Key Infrastructure
The NTR pocket includes the city of Bruks and the strategically important
Marble Dam power station that supplies the two million citizens of Bruks
with electrical power. NTR also has access to a coal-fueled power station
(Tetia) at the southern perimeter of the pocket near the town of Avalon.
However, this station is not activated, and it would require some effort to get
it running; being so close to the front lines, it would also require
considerable resources for protection. A TKA PMESII analysis of NTR
infrastructure has identified NTR's electric power situation as a possible
"center of gravity" for the strategic defeat of the NTR. TKA analysis has
determined that life in the NTR city of Bruks is fully dependent on
electricity from the Marble Dam, especially in order to survive the winters,
and that the loss of the dam would leave the NTR with only one option: to
activate the coal power plant at Avalon. (See Figure 14-4.)
The NRDF has the bulk of its mobile reaction force in Bruks, including the
1st Motorized Rifle Division, consisting of approximately 12,000 men.
The 1st Motorized Rifle Division (see Table 14-1) is intended to be used as a
reserve force to protect Bruks and the Marble Dam. It is severely restricted
in venturing out of Bruks by reduced NRDF SAM defense coverage, and it
risks being cut off if it moves too far south within the rebel-controlled area.
The division represents an excellent bargaining chip so long
as it creates a credible defensive capacity for the city of Bruks. Should the
NRDF lose the division, NTR’s hope for political concessions from the TKA
will be diminished.
TKA Military
The TKA air force enjoys air superiority but is constrained by tight controls
on targeting and by an NRDF SAM threat that increases as aircraft approach
Bruks. Even so, the air force has restricted the NRDF's freedom of
movement throughout their area of control. If substantial NRDF land units try
to maneuver, they risk heavy losses from air attacks.
The TKA mobile land forces enjoy a substantial numerical advantage over
the NRDF, but their edge is not sufficient to overwhelm defenses at Bruks,
especially with the NRDF reaction force of 12,000 sitting in Bruks. The TKA
land forces consist of two tank divisions (see Table 14-2) that are not well
suited to urban warfare. Attacking Bruks with these forces would also be
very costly, especially without close air support, and NRDF SAM defenses
would in any case have to be degraded enough to permit that support.
Tyronika also has a mobile amphibious force consisting of two marine
brigades and one SEAL regiment (see Table 14-3).
Situation
You work within Tyronikian Joint Operational Command (TKA JOC) as a
future operations officer (J35). The date is June 2, 2017. To break the
stalemate, the commander would like to develop a plan that would allow the
majority of his mobile forces (land and/or amphibious) to engage and destroy
the bulk of NRDF mobile forces outside the city of Bruks. The problem has
been in getting the NRDF reaction force to come out of Bruks. Recently,
however, TKA JOC J2 (intelligence) has suggested such an opportunity is
likely to occur between November 4 and November 7.
TKA intelligence has been tracking the water levels at the Marble Dam
through HUMINT and MASINT since the dry period started in May, and
reports that the levels have been falling fast, faster than usual.
Specialists in the J2 have determined that when the water levels go below a
certain level, the Marble Dam will stop producing electricity. At current rates,
the expected date for the electricity to stop is on or around November 23,
2017. The only other source of electricity is the coal-fueled Tetia power
station near Avalon that is not operational while the Marble Dam is
producing power. However, electricity is a strategic necessity for the city of
Bruks over the winter, so the NTR will have to start up the Tetia power
station and will send its mobile reaction force south to ensure the start-up is
successful. It is assessed with a high level of confidence that the NTR
believes the falling water level remains a closely held secret.
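The J2's date estimate is a simple linear extrapolation: given the current
water level, the cutoff level below which the dam stops generating, and the
observed rate of decline, the crossing date follows directly. Here is a
minimal sketch in Python; the figures are invented, chosen only so that the
projected date falls near November 23, since the scenario supplies the
method and the answer but not the measurements.

from datetime import date, timedelta

# Linear extrapolation of the Marble Dam water level. All figures below
# are assumptions for illustration; none come from the scenario.
level_today = 14.0        # meters above the turbine intake (assumed)
cutoff_level = 2.5        # level at which generation stops (assumed)
decline_per_day = 0.066   # observed average drop in m/day (assumed)
today = date(2017, 6, 2)  # the scenario's current date

days_to_cutoff = (level_today - cutoff_level) / decline_per_day
stop_date = today + timedelta(days=round(days_to_cutoff))
print(f"Projected generation stop: {stop_date}")  # 2017-11-23 here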
Task
TKA JOC expects that NTR has no choice but to use the NRDF mobile force
to help secure the Tetia power station near Avalon. In other words, the NTR
will have to send the reaction force south to Avalon from Bruks. The TKA
JOC has every intention of engaging the NRDF mobile force at or near
Avalon and the Tetia power station and will dedicate 80 percent of its own
mobile force to destroy the NRDF force. However, to improve the degree of
surprise, reduce casualties, and increase the chance of success, the TKA JOC
wants
J2 and J3 to develop a MILDEC CONOP with the following objectives, in
order of priority:
1. Deceive the NRDF into underestimating the number and type of TKA
forces that will be available in the Avalon area during the period of
November 4–7, 2017.
2. Fix as many of the NRDF mobile force air defense units as possible in
or around Bruks during the period of November 4–7, 2017.
3. Fix as many of the NRDF mobile artillery units as possible in or around
Bruks during the period November 4–7, 2017.
4. Fix as many of the NRDF anti-tank units as possible in or around Bruks
during the period November 4–7, 2017.
The MILDEC CONOP can make use of 20 percent of all TKA mobile forces.
The final configuration of the main fighting force will be established after the
approval of the MILDEC CONOP. The deception plan should be ready to be
set in motion no later than August 1, 2017, and should continue until the main
offensive begins on November 4, 2017.
The first step—for the J2 to sensor map the NRDF forces in order to identify
channels for projecting a deception and tracking its progress—has already
been executed. The sensor map in Figure 14-5 is provided for this purpose.
Once a channel tracker has been established, you are to develop a MILDEC
CONOP with a timeline and eventual phases to be presented to the
commander.
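Figure 14-5 is not reproduced here, but the bookkeeping a channel tracker
performs can be sketched abstractly: each row ties an adversary channel
from the sensor map to the observable being projected through it and to
whatever reaction has been observed. The Python sketch below is a minimal
illustration under assumed channel names and entries; nothing in it is taken
from the figure.

# Minimal sketch of a channel tracker. Each row links an adversary channel
# (from the sensor map) to the deceptive observable projected through it
# and to the observed reaction, if any. All entries are hypothetical.
channel_tracker = {
    "NRDF coastal radar": {
        "projection": "amphibious feint activity off the Northern Sea coast",
        "start": "2017-08-01",
        "adversary_reaction": None,  # filled in as MoE reporting arrives
    },
    "NRDF SIGINT intercept": {
        "projection": "dummy radio traffic simulating an extra TKA division",
        "start": "2017-08-15",
        "adversary_reaction": "increased NRDF direction-finding activity",
    },
}

# Channels with no observed reaction are the natural source of new RFIs
# or of a stronger projection, which is the tracker's day-to-day use.
for channel, row in channel_tracker.items():
    status = row["adversary_reaction"] or "no reaction observed yet"
    print(f"{channel}: projecting '{row['projection']}' -> {status}")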
Scenario
This scenario begins with a campaign plan already in its early draft stages.
An ethnic conflict on the Pastonian peninsula (see Figure 14-6) has been
brewing for many years. The Niberian minority populations have been
oppressed by local authorities who complain about the large percentage of
organized crime being committed by ethnic Niberians. In the early spring of
2017, four Pastonian policemen were killed in a gunfight with narco-barons
of Niberian descent. What followed was three months of escalating conflict
between the Pastonian and Niberian ethnic groups. Extremists on both sides
sabotaged any attempt at conflict resolution. On July 4, 2017, the Pastonian
government announced that all ethnic Niberians would have to report to
“holding centers” once a day. Despite being warned by the Niberian
government that any further action against ethnic Niberians would be a grave
mistake, a week later Pastonia began rounding up all ethnic Niberians now
registered via the holding centers, and placing them in internment camps
spread throughout the peninsula. As a result, unknown to Pastonia, the
Niberian Navy has been tasked by the Niberian government to prepare a
surprise amphibious invasion of the Pastonian peninsula. The Pastonian
Armed Forces (PAF) are completely unaware of the impending attack. The
PAF are relying on a political assessment: They believe that Niberia has not
yet been pushed to its limits.
The RNN has a small but well-equipped amphibious landing force consisting
of two marine brigades and several landing ships. These are supported by an
air assault ship, with twelve heavy-duty Chinook helicopters for troop
transport and eight attack helicopters. In a joint operation these capabilities
ideally would be synchronized to deliver the maximum effect in the shortest
amount of time.
Task
The window for the amphibious landing is sometime between November 16
and November 30, 2017, so the deception should initially be phased from
August 10, 2017, to November 15, 2017. Thanks to its national intelligence
service, the RNN has the minefield passage data. The passage is small, but
RNN experts believe it can be widened very quickly by minesweepers;
therefore, it has been decided that the actual landing will be executed in West
Bay. As the assigned J2 chief, you have the responsibility, in conjunction
with the J3, to produce a first-draft deception CONOP to support the
amphibious invasion. To improve the degree of surprise, reduce casualties,
and increase the chance of success, the commander wants J2 and J3 to
develop a MILDEC CONOP with the following objectives, listed in order of
priority:
Guidance
1. Using the given sensor map (Figure 14-7), establish a channel tracker
for projection management and eventual evaluation of the deception
progress.
2. Prepare the first rough draft of the MILDEC CONOP for presentation,
including timeline and phase breakdown, ideally on one or two slides.
3. List and discuss measures of effectiveness (MoEs) and/or requests
for information (RFIs) that can be collected during the deception
projection to control for effectiveness.
Scenario
The NATO Joint Task Force Land (JTFL) has been supporting the host
country Taman in a war against radical religious insurgents for the better part
of three years. The JTFL is responsible for defending a northern section of
the Haldun River valley in a largely arid plateau surrounding a narrow
irrigated strip known as the Green Zone. In early 2017 the JTFL established a
string of patrol bases for Tamanian troops running east to west across the
Green Zone, known as the Patrol Base Line (see Figure 14-8). It is June
12, 2017, and as part of the overall plan, the three-year-old Main Operations
Base (MOB) Rhino will be closed on September 30, 2017. MOB Dragon in
the south will remain operational for another six months as the Tamanian
troops reestablish control of the valley.
Task
As the TF 222 J2, you and your team have been ordered to coordinate support
for the closing of MOB Rhino, relying on deliberate targeting activities to
project a deception. The objective is to disrupt the insurgents' capability to
mount a coordinated assault on MOB Rhino while it is being closed on
September 30, 2017. To improve the degree of surprise, reduce casualties,
and increase the chance of success, you are to develop a first-draft deception
CONOP to support the closing of MOB Rhino, with the following objectives,
in order of priority:
Guidance
1. Using the given sensor map (Figure 14-9), establish a channel tracker
for projection management and eventual evaluation of the deception
progress.
2. Prepare the first rough draft of the MILDEC CONOP for presentation,
including timeline and phase breakdown, ideally on one slide. It should
also indicate a progression through the high-value target list (HVTL).
3. Compare and discuss differences in the prioritization of the HVTL; list
and discuss measures of effectiveness (MoEs) and/or requests for
information (RFIs) that can be collected during the deception projection
to control for effectiveness.
The short exercises in this chapter give readers practice with methods for
detecting deception. The ability to move among the three OODA
perspectives in a situation is a fundamental skill for deception projection,
and all three perspectives are inherently involved in deception detection.
Keep this in mind when completing these exercises. In going through them,
participants should pose and discuss the following questions:
Scenario
It is November 15, 2017. The Blue State is expecting a peace agreement
between the Red and Blue States to collapse and a major Red State offensive
to begin along the demarcation line between December 15 and 19, 2017.
There are two avenues for a possible Red offensive: The most direct approach
is along the 40-km Red main logistics route (MLR) Tiger, with an assault
through the East Valley. This is the Blue XX Division’s Area of
Responsibility (AoR), and they have orders to retain and hold East Valley.
(See Figure 15-1.)
The second option for the Red State offensive is attacking through West
Valley. However, that option requires repairing or replacing the destroyed
Pony River Bridge at Gran Ben, and it extends the logistics lines supporting
the assault by 40 km, to almost double those of an East Valley option. West
Valley is the Blue X Division's AoR, and they have orders to retain and hold
West Valley.
Situation
Though the Red State has more forces available, with a nearly three-to-one
advantage, its forces are partially made up of conscripts and reservists. The
among the rank and file. The Red State will have to commit the majority of
its assaulting forces to one avenue of advance in order to ensure relative
numerical superiority for breaching the demarcation line. The Blue State is
aware of this situation and has established a well-organized mobile reserve
that should be sufficient to defeat any assault if it is in the right place at the
right time. However, the Blue State must decide in which valley, East or
West, to place its mobile reserve by December 15, 2017, in order to defeat
the Red State’s offensive. Essentially, each side has to commit a majority of
its forces at the proper location to ensure a successful offense or defense. For
the Red State to be successful, it must attack at a point where the Blue State
mobile reserves are not committed. For the Blue State to be successful as the
defending army, it must commit its mobile reserve to the location of the
impending attack.
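Stripped of detail, the situation is a two-by-two commitment game: Red
breaks through only where Blue's reserve is absent, and Blue holds only
where it is present. A minimal sketch with purely illustrative outcomes
makes the structure explicit and shows why each side's best investment lies
in deceiving, or seeing through, the other.

import itertools

# Two-by-two commitment game for the valley problem. The payoff values
# are illustrative; the structure, not the numbers, is the point.
valleys = ("East", "West")

def blue_payoff(red_attack: str, blue_reserve: str) -> int:
    # Blue defeats the assault only if its reserve sits in the attacked valley.
    return 1 if blue_reserve == red_attack else -1

for red, blue in itertools.product(valleys, valleys):
    outcome = "Blue holds" if blue_payoff(red, blue) > 0 else "Red breaks through"
    print(f"Red attacks {red}, Blue reserve in {blue}: {outcome}")

With symmetric stakes this is the classic matching-pennies structure: absent
better information, neither side can do better than an even guess, and that
margin is exactly what a successful deception, or a successful detection of
one, shifts.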
Task
As the Blue J2 chief, you are responsible for providing an opinion as to
which valley the Blue mobile reserve must be positioned in before December
15, 2017. You expect the Red opponent to attempt to mislead your
commander into committing his reserve forces in the wrong valley. Your
analytic section has collated intelligence reporting and produced a channel
tracker to be used to identify adversarial indicators of a deception (see Table
15-1).
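Table 15-1 itself is not reproduced here, but the analytic step it supports can
be sketched: reduce each channel's reporting to the indicator it points
toward, then flag channels that disagree with the rest (incongruences) and
note when agreement is suspiciously uniform across channels the adversary
could control (abnormal congruences). The Python sketch below uses
hypothetical reporting as a stand-in for the table.

from collections import Counter

# Toy incongruence check across channels. The reports are hypothetical
# stand-ins for Table 15-1, each reduced to the valley it points toward.
reports = {
    "COMINT": "East",  # intercepted traffic points to an East assault
    "IMINT": "East",   # imagery shows bridging assets moving east
    "HUMINT": "West",  # a single source insists the blow falls west
    "OSINT": "East",   # open media echo the eastern buildup
}

majority, _ = Counter(reports.values()).most_common(1)[0]

# Channels that disagree with the rest are incongruent; too-perfect
# agreement among channels Red controls would be an abnormal congruence.
# Either pattern deserves its own deception hypothesis.
incongruent = [ch for ch, v in reports.items() if v != majority]
print(f"Majority indicator: {majority}")
print(f"Incongruent channels: {incongruent or 'none'}")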
Guidance
1. Using the given channel tracker (see Table 15-1), identify abnormal
congruences or incongruences.
2. List and discuss the merits of any deception hypotheses developed from
the channel tracker.
3. Make a list of prioritized RFIs that could help confirm or deny any
hypotheses developed.
Exercise 2: The YUTA Ghost
Scenario
Since February 2017, the United Nations (UN) has been attempting to
enforce an arms embargo on the country of Sladivista in order to stop the
spiral of violence between the Sladivistian regime and several ethnic factions
that live in the country. The embargo has been difficult to enforce on land,
where the UN has relied primarily on neighboring states; at sea, the UN has
attempted to enforce it directly. However,
many international criminal networks are ready to exploit every chance to
make some money from weapons smuggling. And though both the rebels and
the government are under the embargo, smuggling weapons by sea favors the
Sladivistian regime and criminal networks with seafaring assets.
Situation
Intelligence reporting from the UN Southern Naval Command UNTF 91,
charged with enforcing the arms embargo along the coast of Sladivista,
suggests there will be an attempt by the international criminal organization
YUTA to deliver a shipment of illegal weapons via mini-submarine on
November 23, 2017. It is October 10, 2017. Your task group, UNTG 34, is
responsible for intercepting the expected delivery. The challenge is that your
resources cannot cover the whole of the coastal region at once, so it has been
divided into three maritime patrol zones, of which you can cover only two at
a time. UNTG 34 has two medium-sized
submarines available for patrol duty: the UNS Gartner and the UNS Pike.
YUTA has only one known mini-sub, but it is famous for its successful
smuggling exploits and has earned the apt call sign Ghost.
Furthermore, YUTA has two support ships available for the mini-sub: the
Shark and the Gamlin. Either can be used to launch and recover the Ghost.
The coastline itself is also divided into areas of interest (AoIs) by the UNTG
34 N2 (naval intelligence section): Coast Zone Alpha with the main port of
Bramaville; Coast Zone Bravo with the main port of Laurdal; and Coast
Zone Charlie with the main port of Kipton. (See Figure 15-2.)
Task
As the UNTG 34 N2 chief, your section is responsible for providing an
assessment to the commander as to the two maritime patrol zones in which
she should place her two submarines on November 23, 2017, in order to
interdict the Ghost. Your
analysts have collated intelligence reporting and produced a channel tracker
in order to identify possible deception indicators (see Table 15-2).
Guidance
1. Using the given channel tracker (see Table 15-2), identify abnormal
congruences or incongruences.
2. List and discuss the merits of any deception hypotheses developed from
the channel tracker.
3. Make a list of prioritized RFIs that could help confirm or deny any
hypotheses developed.
Index
Canada, 82
Case studies
Al Kibar case, 116–117
Ames, 94–95
analysis capabilities, opponent’s, 94–95
Argo deception, 81–83, 156–157
Black Boomerang, 52, 106–107
Bolo, 108
channel management, 185
COMINT deception, 108
competence, 165–166
conditioning, 18–19
counterdeception, 153–155, 174–175
counterintelligence deception, 92–93
Cuban missile crisis, 47–48, 165–166
cultural differences, 72
CYBER, 129, 132–133, 135–136
decision-making models, 59–60
denial, 20–21, 24–25
Desert Storm, 65–66
detection, 180–181
detection preparation, 169
Document Binder Deception, 92–93, 169
economic outcome scenario, 50–51
Ethiopian campaign deception, 59–60
evaluating evidence, 188
evaluating sources, 180–181
failed deceptions against Japanese, 72
Farewell Dossier, 54–55
Fortitude, 41–43
German rearmament, 85–86
GRAB satellite, 105–106
Hanssen, 94–95
HUMINT, 113–114
Igloo White, 163–164
IMINT, 116–117
Indian nuclear test, 24–25
information outcome scenario, 53–54
infrastructure outcome scenario, 54–55
interrogation, 113–114
IRA Embezzlement Sting, 16–17
IRA prison sting, 113–114
Iraqi WMD programs, 20–21, 180–181
Kosovo, 49–50
malware, 132–133
managing developing scenario, 158
Mincemeat, 38–40, 158, 185
Mirai, 129
modeling target’s alternative decisions, 65–66
Moscow Air Show, 145, 188
need for attention to details/timing, 156–157
OODA loop, 38–40
OODA loop, opponent’s, 153–155
OSINT, 105–107
outcome scenarios, 47–48
planning, 41
Project Azorian, 83–84
Rabta/Tarhunah deception and counterdeception, 153–155,
174–175
securing collection, 163–164
social deception, 52
The Sting, 3–5
story, 81–84, 85–86
Stuxnet APT, 135–136
tactical defensive deception, 49–50
technical deception, 163–164
Ultra deception, 53–54
unfavorable outcomes, 145
wheat crop deception, 50–51
Yom Kippur War, 18–19
Cassidy, Joseph, 146
Casualties, 6, 7, 7 (figure)
Chaff, 117–118
Chamberlain, Neville, 167
Channel management
detection via, 173 (figure), 173–176
Rabta/Tarhunah deception and counterdeception, 174–175
Channel management exercises, 203–208
Bangui Backroads, 206–208
Pick a Port, 204–205
Channels, 5, 11, 31, 38, 89–101
channel exploitation, 99–101
for CNE/CNA, 127–129
in Desert Storm, 90–91
evaluating, 184–185
feedback channels, 157
in Fortitude, 42
GPS, 89
importance of knowing, 162
in inverse OODA loop, 37
managing, 148–149
managing opponent’s, 151–152, 152 (figure)
modeling black box of orientation, 96–99
observation of, 90
opponent’s understanding of, 185
planner’s, 103
roles in deception, 103, 121
selecting, 40
short, 186
speed of information flow through, 165
as target, 52–54
timing and, 100–101
traditional, 101. See also COMINT; HUMINT; IMINT; OSINT;
Technical collection
understanding, 101
understanding command and control structure of Daesh and, 96–99
understanding how opponent uses, 92
understanding opponent’s analysis capabilities, 94–99
understanding opponent’s INTs, 93
web-based, 127–129
See also COMINT; CYBER; HUMINT; IMINT; OSINT; Sensor
mapping; Technical collection
Channels, opponent’s
importance of knowing, 121
targeting, 104–120
Chaos, as objective, 15, 16
Chemicals, dual-use, 86
Chemical warfare (CW)
agent production, 86
Libyan deception, 153–155
Rabta/Tarhunah deception and counterdeception, 174–175
US deception, 146–147
See also Weapons of mass destruction
China
organizational cover by, 115
use of social media, 124
Choice, optimal, 58
Churchill, Winston, 53
Circumstances, in story, 80
Civilian populations, 147
Civil War, 112–113
Clark, Robert, 186
Clinton, Hillary, 124
Closure, premature, 187
CNA (computer network attack), 122, 126–133
CNE (computer network exploitation), 122, 126–133
Code words, 110
Cognitive vulnerabilities, 162
Cold War, 68, 94–95. See also Soviet Union
Collection
literal, 104–115
nonliteral, 115–120
securing, 162–164
See also Observation
Collection requirements, 184, 185
Collective decision modeling, 62–67
Collectivism, 69–70, 70 (table)
COMINT (communications intelligence), 104, 107–111
Bolo, 108, 118
credibility of, 109–110
identifying deception in, 176, 177–178
sources, 107, 187
techniques for defeating, 109–110
traffic analysis, 110–111
vulnerability to deception, 162–163
Command-and-control (C&C) server, 129
Compartmentalization, 165
Competence, 165
Computer files, hiding messages in, 109
Computer network attack (CNA), 122, 126–133
Computer network exploitation (CNE), 122, 126–133
Computers, standalone, 134–137
Concept of operations (CONOPs). See CONOPs
Conclusions, affirming, 187
Conditioning, 18–19
Condorcet, Marquis de, 62–63
Conflict, memetic, 67, 125–126
Conflict, nontraditional, 14, 16
Confusion, 10, 16
Congruences, abnormal, 151, 152 (figure), 174, 174 (figure)
CONOPs (concept of operations), 149–152
basic format, 237–239
MILDEC CONOP exercises, 235–254
Consequences, unintended, 144–147. See also Outcomes, unfavorable
Context, and definitions, 10
Coordination, need for, 147–148
Cost-benefit analysis, 58
Counterdeception, 161–162, 174
defined, 10–11
HUMINT in, 178–180
vs. intelligence, 11
Rabta/Tarhunah deception and counterdeception, 174–175
Counterespionage, 94–99, 122, 178. See also Counterintelligence
Counterintelligence (CI), 92, 122
deception supporting, 14
deconfliction in, 147–148
defined, 11
lack of statistics for, 8
overlap with deception, 27
overlap with PSYOPS, 27
purpose of deception in, 13
Cover, 114–115
Credibility, 17, 18, 22, 109–110
Crops, 50, 163, 179
Cuban missile crisis, 47–48, 165–166
Cultural modeling, 63, 67–75
Curveball, 176–177, 180–181
CW (chemical warfare). See Chemical warfare
CYBER (cyber intelligence), 11, 107, 121–139
CNA/CNE, 122, 126–133
deconfliction and, 148
identifying deception in, 176
importance for projection, 122
intranets, 127, 134–137
need for attention to detail in, 156
standalone computers, 134–137
types of deception in, 121
web-based deception, 122–126
Cyber Deception (Jajodia), 139
Cyber defense, 137, 139
Cyber intelligence (CYBER). See CYBER
Cyber operations, HUMINT-enabled, 116–117
Cyberspace, 121. See also Channels
Fabrication, 21
Facial recognition software, 115
Facilities visits, 112–113
Facts, in story, 80
Fahey, Liam, 157
False, presented as truth, 21
False flag operations, 21–23
Farewell Dossier, 54–55, 147
Fatigue factor, 85
Feedback channels, 157
Femininity, 72–73, 72–73 (table)
Fields of conflict, 7–8
Film industry, 156–157
Finance, 50–51
FISINT (foreign instrumentation signals intelligence), 119, 181
Five Disciplines of Intelligence Collection, The (Lowenthal and Clark),
103
Flame, 132
Flash Player, 130–131
Force X, 22
Foreign instrumentation signals intelligence (FISINT), 119, 181
Forrest, Nathan Bedford, 113
Fortitude, 41–43, 68, 87, 88 (figure), 151
Fratricide, 147
Friendly forces, disguising, 21–23
Front companies, 115
Japan, 67, 72
Kaspersky, Eugene, 133
Kaspersky Lab, 133, 136
Kenya, 22
Keystroke loggers, 131, 136
Khrushchev, Nikita, 68
Kissinger, Henry, 83
Knowledge, common, 124
Kosovo, 49–50, 51, 117
Kremlinologists, 68
Kuwait, 65
Leaders, 57
vulnerability to deception, 167–168
See also Decision makers; Decision-making; Targets
Left hook, 65, 66
Liaison channel, 112, 175
Libya, 153–155, 156
Life history data, 115
Lindsay, J., 139
Long-term orientation, 73–74, 74 (table)
Loss aversion, target’s, 57
Objectives, 4, 31
in defensive deceptions, 49
in inverse OODA loop, 37
See also Outcome scenarios
Observation, 92
in Fortitude, 41–42, 43
identifying adversarial sensor, 92–93
in Mincemeat, 39
securing, 162–164
See also Collection
Observe-Orient-Decide-Act (OODA) loop. See OODA loop
OODA (Observe-Orient-Decide-Act) loop, 32, 34 (figure), 36, 91
assessing vulnerability of, 162–164
Boyd’s, 32–33, 33 (figure)
competing, 33
with explicit outcome, 36–37, 37 (figure)
forward, 40 (figure), 40–43
interactive identity and, 33–34
inverse, 37–40
Mincemeat and, 38–40
multiple, 41
objective of, 152–153
opponent’s, 35 (figure), 35–36, 152–155
opponent’s understanding of yours, 170
opponent’s view of yours, 36, 36 (figure)
perspectives, 33–37
understanding your own, 34 (figure), 34–35
See also Decision/action; Observation; Orientation
OPELINT (operational electronic intelligence), 118
Open source intelligence (OSINT). See OSINT
Operational electronic intelligence (OPELINT), 118
Operational information, 184
Operational security (OPSEC), 20
Operations
Argo deception, 81–83, 156–157
Black Boomerang, 9, 52, 106–107
Bolo, 108, 118
defined, 12
Farewell, 54, 147
Fortitude, 42, 68, 87, 88 (figure), 151
Husky, 38–40
Mincemeat, 38–40, 42, 48, 109, 158, 180, 185
Quicksilver, 18, 42, 48, 54, 87, 88 (figure), 101, 101 (figure), 108,
165
relation with intelligence, 25
role of, 26
scope of, 7
Ultra deception, 52–54, 84–85, 110, 163, 164
See also Desert Storm
Opponent
importance of knowing, 162
intelligence service of, 13–14
OODA loop of, 35 (figure), 35–36, 152–155
sensor mapping of, 91 (figure), 91–94
in threat assessment, 168–169
understanding analysis capabilities of, 94–99
understanding of, 5
use of term, 168
See also Targets
OPSEC (operational security), 20
Orientation
black box of, 96–99
in Fortitude, 41–42, 43
in inverse OODA loop, 38
in Mincemeat, 39
securing, 164–167
See also Analysis
Osinga, Frans, 152, 155
OSINT (open source intelligence), 11, 89, 104–107
in Desert Storm, 90
devaluing significance of, 187
identifying deception in, 176, 177
use of, 164
Othman, Ibrahim, 115–116
Outcomes
in Fortitude, 43
good-enough, 60–61
in Mincemeat, 39
OODA loop and, 36–37
predictable, 155
surprise and, 7
unfavorable, 144–147
Outcome scenarios, 4, 45–55
classes of, 45. See also PMESII
in Fortitude, 41
identifying, 45
in inverse OODA loop, 37
in Mincemeat, 38
Quicksilver, 18, 42, 48, 84, 87, 88 (figure), 100–101, 101 (figure), 108,
165