Secret History The Story of Cryptology
Craig P. Bauer
York College of Pennsylvania
and
National Security Agency
Center for Cryptologic History
2011-2012 Scholar-in-Residence
Second edition published 2021
by CRC Press
First edition published 2013
by CRC Press
6000 Broken Sound Parkway NW, Suite 300, Boca Raton, FL 33487-2742
and
by CRC Press
Reasonable efforts have been made to publish reliable data and information, but the author and publisher cannot assume
responsibility for the validity of all materials or the consequences of their use. The authors and publishers have attempted to
trace the copyright holders of all material reproduced in this publication and apologize to copyright holders if permission to
publish in this form has not been obtained. If any copyright material has not been acknowledged please write and let us know
so we may rectify in any future reprint.
Except as permitted under U.S. Copyright Law, no part of this book may be reprinted, reproduced, transmitted, or utilized
in any form by any electronic, mechanical, or other means, now known or hereafter invented, including photocopying,
microfilming, and recording, or in any information storage or retrieval system, without written permission from the publishers.
For permission to photocopy or use material electronically from this work, access www.copyright.com or contact the Copyright
Clearance Center, Inc. (CCC), 222 Rosewood Drive, Danvers, MA 01923, 978-750-8400. For works that are not available on
CCC please contact mpkbookspermissions@tandf.co.uk
Trademark notice: Product or corporate names may be trademarks or registered trademarks, and are used only for
identification and explanation without intent to infringe.
The first edition of this book contained chapters devoted to how German and Japanese systems
from World War II were cracked. These are extremely important chapters and they were retained
in this new edition, but now the other side of this cipher war is told — how the United States was
able to come up with systems that were never broken. A new chapter details SIGABA, which enci-
phered text, and another covers SIGSALY, which was used for voice communications. Despite the
addition of these two new chapters, the book you are holding contains only 21 chapters, compared
to the first edition’s 20. That’s because the first two chapters of the original have been combined
into the new Chapter 1. So, if you’re using this book to teach a course, please make note of this
fact! Nothing has been deleted in the process of compressing these first two chapters and the other
chapters retain their original order, with the new chapters inserted at the appropriate positions.
New material has come to light and been incorporated throughout the book concerning vari-
ous eras in cryptology’s long history. Some of the “history” is so recent that it should be referred
to as current events. For example, much has happened concerning political aspects of cryptology
since the first edition was completed. The still unfolding story is updated in this new edition. The
final chapter includes the impact of quantum computers, which is both a current event and an
extremely important part of the future. We are living in interesting times!
Note to the Reader
This book was intentionally written in an informal, entertaining style. It can be used as leisure
reading for anyone interested in learning more about the history or mathematics of cryptology.
If you find any material confusing, feel free to skip over it. The history alone constitutes a book
and can be enjoyed by itself. Others, especially high school teachers and college professors in fields
such as mathematics, history, and computer science, will find the book useful as a reference that
provides many fascinating topics for use in the classroom. Although the book assumes no previous
knowledge of cryptology, its creation required making use of a tremendous number of sources,
including other books, research papers, newspaper articles, letters, original interviews, and previ-
ously unexamined archival materials. Thus, even the experts will likely find much that is new.
The purpose of the book is to give as complete a picture of cryptology as is possible in a single
volume, while remaining accessible. The most important historical and mathematical topics are
included. A major goal is that the reader will fall in love with the subject, as I have, and seek out
more cryptologic reading. The References and Further Reading sections that close every chapter
make this much easier.
I’ve used this book for two completely different classes I teach. One is a 100-level general
elective titled “History of Codes and Ciphers.” It is populated by students with a wide range of
majors and has no prerequisites. The other is a 300-level math and computer science elective with
a prerequisite of Calculus I. This prerequisite is only present to make sure that the students have a
bit of experience with mathematics; I don’t actually use calculus. In the 100-level class, much of
the mathematics is skipped, but for the upper-level class, the minimal prerequisite guarantees that
all of the material is within the students’ reach.
Anyone else wanting to use this book as a text for a cryptology class will also want to take
advantage of supplementary material provided at https://www.routledge.com/9781138061231.
This material includes hundreds of exercises that can be taken as challenges by any of the read-
ers. Many of the exercises offer real historic ciphers for the reader to test his or her skill against.
These were originally composed by diverse groups such as spies, revolutionists, lovers, and crimi-
nals. There’s even one created by Wolfgang Amadeus Mozart. Other exercises present ciphers that
played important roles in novels and short stories. Some of the exercises are elementary, while oth-
ers are very difficult. In some cases, it is expected that the reader will write a computer program
to solve the problem or use other technology; however, the vast majority of the problems may be
solved without knowledge of a programming language. For the problems that are not historic
ciphers, the plaintexts were carefully chosen or created to offer some entertainment value to the
decipherer, as a reward for the effort required to find them. Not all of the exercises involve break-
ing ciphers. For those so inclined, there are numerous exercises testing the reader’s mastery of the
mathematical concepts that are key components of the various systems covered. Sample syllabi
and suggested paths through the book for various level classes are also present. The website will
also feature a (short, I hope!) list of errata. Should you wish to correspond with the author, he may
be reached at cryptoauthor@gmail.com.
Introduction
This brief introduction defines the necessary terminology and provides an overview of the book.
The reader should feel free to flip back to this material, or make use of the detailed index, to find
the first appearances of definitions that might require refreshing over the course of the book.
Codes are a part of everyday life, from the ubiquitous Universal Product Code (UPC) to postal
Zip Codes. They need not be intended for secrecy. They generally use groups of letters (sometimes
pronounceable code words) or numbers to represent other words or phrases. There is typically no
mathematical rule to pair an item with its representation in code. Codes intended for secrecy are
usually changed often. In contrast to these, there’s no harm in nonsecret codes, like Zip Codes,
staying the same for decades. In fact, it is more convenient that way.
The invention of the telegraph and the desire to decrease the length of messages sent in that
manner, in order to minimize costs (they did not go for free like e-mails!), led to nonsecret com-
mercial codes in which phrases were replaced by small groups of letters (see Figure I.1). These pro-
vide an early example of data compression, a topic that will arise again in Section 1.17.
Of course, codes were used in wars too numerous to list here, as well as in countless intrigues
and conspiracies throughout history, and they are still with us. For now, we only consider one
more example. A code is produced by many copiers and printers on each page they turn out
that identifies the machine that was used. Few users are even aware that it is there, as you need a
blue light, magnifying glass, or microscope to make out the dots that conceal the information.1
Although it was not how he was identified, such a code appears to have been part of the evidence
used to hunt for serial killer BTK (Dennis Rader), who used a copier at Wichita State University
for one of his taunting letters to the police.
The copier and printer code is also an example of steganography, in which the very presence of
a message is intended to be concealed. Other forms that steganography may take include invisible
inks and microdots.
Examples of codes and steganography appear from place to place in this book, but they are not
the main focus. The focus is on ciphers, which typically work on individual letters, bits, or groups of
these through substitution, transposition (reordering), or a combination of both. In contrast to codes,
modern ciphers are usually defined in terms of mathematical rules and operations. In the early days,
however, this was not the case. In fact, Charles Babbage (1791–1871), the great pioneer in computer
science, is often credited as being the first to model ciphers using mathematics. This isn’t correct; there
were earlier efforts, but they didn’t catch on.2 In any case, it’s not until the twentieth century that
cryptology really became mathematical. A few more definitions will make what follows easier.
Cryptography is the science of creating cipher systems. The word comes from the Greek
κρυπτός, meaning “hidden,” and γραφία, meaning “writing.” Cryptanalysis is the science and art
1 The Electronic Frontier Foundation (EFF) has a nice webpage on this topic that includes many relevant links.
See http://www.eff.org/issues/printers.
2 Buck, Friederich Johann, Mathematischer Beweiß: daß die Algebra zur Entdeckung einiger verborgener
Schriften bequem angewendet werden könne, Königsberg, 1772, available online at https://web.archive.org/
web/20070611153102/http://www-math.uni-paderborn.de/∼aggathen/Publications/buc72.pdf. This is the first
known work on algebraic cryptology.
Figure I.1 A page from an 1875 code book includes short code groups for commonly used
phrases such as “Some squabble or fight on shore with crew. Crew Imprisoned.” (From Greene,
B. F., editor, The International Code of Signals for All Nations, American Edition, published
under the authority of the Secretary of the Navy by the Bureau of Navigation, U.S. Government
Printing Office, Washington, DC, 1875, p. 49. Courtesy of the National Cryptologic Museum.)
of breaking ciphers (deciphering without the key). Cryptology, the most general term, embraces
both cryptography and cryptanalysis. Most books on ciphers are on cryptology, as one cannot
determine the security of a cipher without attempting to break it, and weaknesses in one system
must be understood to appreciate the strengths in another. That is, it doesn’t make sense to study
cryptography without studying cryptanalysis. Nevertheless, the term cryptography is used more
frequently and taken to mean cryptology.
Encipher and encrypt both refer to the process of converting a message into a disguised form,
ciphertext, using some cryptographic algorithm. Decipher and decrypt refer to the reverse pro-
cess that reveals the original message or plaintext (sometimes called cleartext). The International
Organization for Standardization (ISO3), a group that offers over 22,800 voluntary standards for
technology and business, even has a standard (7498-2) regarding these terms; encipher and deci-
pher are the ones to use, as the terms encrypt and decrypt are considered offensive by some cultures
because they refer to dead bodies.4
Modern expectations of encrypted communications include not only the inability of an eaves-
dropper to recover the original messages, but much more. It is expected, for example, that any
changes made to a message in transit can be detected. This is referred to as data integrity. Suppose
an encrypted order to buy 500 shares of a given stock at a particular price is sent. Someone could
intercept the ciphertext and replace a certain portion of it with alternate characters. This is possible
without an ability to decipher the message. It only requires knowledge of which positions in the
message correspond to the stock, the number of shares, and the price. Altering any of these will
result in a different order going through. If unauthorized alterations to a message can be detected,
this sort of mischief will be less of a problem. Another important property is authentication, the
ability to determine if the message really originated from the person indicated. Billions of dollars
can be lost when the expectations of authenticity and data integrity are not met.
Encryption protects both individual privacy and industrial secrets. Financial transactions
from your own ATM withdrawals and online credit card purchases on up to fund transfers in
international banking and the major deals of transnational corporations can all be intercepted and
require protection. Encryption has never protected more data than it does now.
In today’s world, a cryptosystem is a collection of algorithms that attempts to address the con-
cerns outlined above. One algorithm takes care of the actual encryption, but many others play
an important role in the security of the system. In the pages that follow, I’ll sometimes refer to
incredibly simple ciphers as cryptosystems. At such times, I only mean to distinguish them from
other ciphers, not to imply that they have modern features.
Presumably, no one reading this needs to be talked into pursuing the subject, but if you want
more reasons to study cryptology:
1. “In Russia, no private encryption is allowed, at least not without a license from Big Brother.”5
2. “In France, cryptography is considered a weapon and requires a special license.”6 (This book
was written over many years — I am leaving this quote in because it is interesting and was
3 “Because ‘International Organization for Standardization’ would have different acronyms in different languages
(IOS in English, OIN in French for Organisation internationale de normalisation), our founders decided to give
it the short form ISO. ISO is derived from the Greek ‘isos’, meaning equal. Whatever the country, whatever the
language, we are always ISO.” — quoted from https://www.iso.org/about-us.html.
4 Schneier, Bruce, Applied Cryptography, second edition, Wiley, New York, 1996, p. 1.
5 Kippenhahn, Rudolph, Code Breaking: A History and Exploration, The Overlook Press, New York, 1999, p. 209.
6 Kippenhahn, Rudolph, Code Breaking: A History and Exploration, The Overlook Press, New York, 1999, p. 209.
once true, but in 1998 and 1999 France repealed her anti-crypto laws.7 In general, nations
that are members of the European Union place fewer restrictions on cryptology than other
nations.)
3. The Kama Sutra lists secret writing as one of the 64 arts that women should know and prac-
tice.8 (It is #45.)
4. “No one shall be subjected to arbitrary interference with his privacy, family, home or cor-
respondence, nor to attacks upon his honor and reputation. Everyone has the right to the
protection of the law against such interference or attacks.” (Article 12, Universal Declaration
of Human Rights, United Nations, G.A. res. 217A (III), U.N. Doc A/810 at 71, 1948).9
Part I
Chapter 1 begins by detailing some systems used by the ancient Greeks and the Vikings, as well
as the impact steganography had in the history of ancient Greece. It continues with a close look at
monoalphabetic substitution ciphers (MASCs), including historical uses, as well as appearances in
fictional works created by Edgar Allan Poe, Sir Arthur Conan Doyle (creator of Sherlock Holmes),
J. R. R. Tolkien, and others. Important ideas such as modular arithmetic are first presented in
this chapter, and sophisticated modern attacks on MASCs are included, along with sections on
data compression, nomenclators (systems that make use of both a cipher and a code), and book
codes. Chapter 2, as already mentioned, shows the logical progression from the Vigenère cipher
(which uses multiple substitution alphabets) to the running key cipher, and on to the unbreakable
one-time pad. Historical uses are provided from the U.S. Civil War (for the Vigenère cipher) and
World War II (for the one-time pad). Chapter 3 shifts gears to take a look at transposition ciphers,
in which letters or words are rearranged rather than replaced by others. Most modern systems
combine substitution and transposition, so this helps to set the stage for later chapters such as 5,
13, and 20. In Chapter 4, we examine a steganographic system alleged to reveal Francis Bacon as
the true author of William Shakespeare’s plays. Although I hope to convince the reader that such
arguments are not valid, the system has been used elsewhere. This chapter also examines Thomas
Jefferson’s cipher wheel, which saw use as recently as World War II, and looks at how John F.
Kennedy’s life hung in the balance during that war, dependent on the security of the 19th-century
Playfair cipher. Again, modern attacks on these older systems are examined. Stepping back a bit,
Chapter 5 examines the great impact cryptology had on World War I and looks closely at the
fascinating cryptologic figure Herbert O. Yardley, who may be accurately described as the “Han
Solo of cryptology.” This chapter also includes a brief description of censorship, with emphasis on
censorship of writings dealing with cryptology. Linear algebra shows its importance in Chapter 6,
where matrix encryption is examined. Two attacks on this system that, prior to the first edition
of this work, had never before appeared in book form are presented. Electromechanical machines
come on the scene in Chapter 7, as the Germans attempt to protect their World War II-era secrets
with Enigma. This chapter contains a detailed look at how the Poles broke these ciphers, aided
by machines of their own. Following the invasion of Poland, the setting shifts to Bletchley Park,
England, and the work of Alan Turing, the great pioneer in computer science. A brief look is taken
at the Nazi Lorenz ciphers and the computer the British used to break them. Chapter 8 shifts to
the Pacific Theater of World War II and takes a close look at Japanese diplomatic ciphers and naval
codes and the effect their cryptanalysis had on the war. The role the Navajo code talkers played in
securing the Allies’ victory closes out this chapter. Having seen how weaknesses in cipher machine
design can be exploited by cryptanalysts, Chapter 9 details SIGABA, a World War II-era machine
used by the United States that was never broken. Chapter 10 moves away from text and looks at
systems used to encipher speech prior to and during World War II.
Part II
Chapter 11 leads off Part II with a look at how Claude Shannon’s ideas helped shape the informa-
tion age, as well as modern cryptology. His method of measuring the information content of a
message (using the terms entropy and redundancy) is explained and simple ways to calculate these
values are provided, along with the impact such concepts had in other fields. A history of the
National Security Agency is given in Chapter 12. Included with this is a discussion of how elec-
tromagnetic emanations can cause an otherwise secure system to be compromised. TEMPEST
technology seeks to protect systems from such vulnerabilities. A betrayal of the agency is looked
at in some detail and the new Crypto AG revelations are covered. The chapter on NSA is followed
by a close examination in Chapter 13 of a cipher (DES) that the Agency had a role in designing.
The controversy over DES’s key size and its classified design criteria is given full coverage, as is the
Electronic Frontier Foundation’s attack on the system. The revolutionary concept of public key
cryptography, where people who have never met, and thus never agreed on a key, can nevertheless communicate
securely despite the presence of eavesdroppers (!), is introduced in Chapter 14 with Diffie-Hellman
key exchange and RSA. The mathematical background is given, along with historical context
and a look at the colorful personalities of the key players. Attempts by the U.S. government to
exert control over cryptologic research and the reaction in the academic world receive a thorough
treatment as well. Chapter 15 is devoted to attacks on RSA. A dozen are presented that don’t
involve factoring, then a series of more and more sophisticated factoring algorithms are examined.
In Chapter 16, practical considerations such as how to quickly find primes of the size needed for
RSA encryption are examined, as well as the important field of complexity theory. The public key
systems of Ralph Merkle and Taher Elgamal are included in this chapter. Although this chapter is
more technical than many of the others, some of the key work described was first done, in whole
or in part, by undergraduates (Merkle, Kayal, Saxena). Chapter 17 opens with the trouble that the
lack of authenticity caused during World War II. It then shows how RSA and Elgamal offer the
possibility of attaining authenticity by allowing the sender to sign messages. Unlike traditional
letters, the time required to sign a digital message increases with the size of the message, if signing
is done in the most straightforward manner! To fight back against this problem, hash functions
condense messages to shorter representations that may then be signed quickly. Thus, a discussion
of hash functions naturally appears in this chapter. Chapter 18 covers PGP and shows how such
hybrid systems can securely combine the speed of a traditional encryption algorithm with the con-
venience of a (slower) public key system. Many political issues already seen in Chapters 13 and 14
are expanded upon in this chapter, which is mainly historical. The unbreakable one-time pad isn’t
very practical, so we have more convenient stream ciphers to approximate it. These are detailed
in Chapter 19, and the most modern examples are used to encrypt cellphone conversations in
real time. Finally, Chapter 20 looks at elliptic curve cryptography and the Advanced Encryption
Standard, two of the very best (unclassified) modern systems. These systems were endorsed by
NSA. Chapter 21 closes Part II, and the book, with a look at quantum cryptography, quantum
computers, and DNA computers. Cryptographers have spent years preparing for the threat posed
by quantum computers, but post-quantum cryptography still looks like the wild west.
Enjoy!
Acknowledgments
Thanks to Chris Christensen and Robert Lewand for reading the entire manuscript of the first edi-
tion and providing valuable feedback; to Brian J. Winkel for his comments on several chapters and
much help and great advice over the years, as well as unbelievably generous cryptologic gifts; to
René Stein, who researched numerous queries on my behalf at the National Cryptologic Museum;
to Robert Simpson, the new librarian at the National Cryptologic Museum, who is off to a fabu-
lous start; to David Kahn for his inspiration and generosity; to everyone at the National Security
Agency’s Center for Cryptologic History (the best place to work!) for my wonderful year as the
2011–2012 scholar in residence; to the editorial board of Cryptologia, for sharing their expertise;
and to Jay Anderson, for getting me hooked on the subject in the first place.
Thanks also to the American Cryptogram Association, Steven M. Bellovin, Paolo Bonavoglia,
Gilles Brassard, Jay Browne, Stephen Budiansky, Jan Bury, Kieran Crowley, John W. Dawson,
Jr., John Dixon, Sarah Fortener, Benjamin Gatti, Josh Gross, Sam Hallas, Robert E. Hartwig,
Martin Hellman, Regan Kladstrup, Neal Koblitz, George Lasry, Susan Landau, Harvey S. Leff,
Robert Lord, Andrea Meyer, Victor S. Miller, Adam Reifsneider, Karen Rice-Young, Barbara
Ringle, Kenneth Rosen, Neelu Sahu, Klaus Schmeh, Mary Shelly at Franklin & Marshall College
Library, William Stallings, Bob Stern, Ernie Stitzinger, Dave Tompkins, Sir Dermot Turing,
Patrick Weadon, Bob Weiss, Avi Wigderson, Betsy Wollheim (president of DAW Books), John
Young, and Philip Zimmermann.
Thank you all!
I
CLASSICAL CRYPTOLOGY
Chapter 1
Monoalphabetic Substitution
Ciphers, or MASCs:
Disguises for Messages
Early cavemen may have developed a system of secret oral grunts or mimetic signs to
convey messages to one another.
Male Female
60 – Anu 55 – Antu
50 – Enlil 45 – Ninlil
40 – Ea/Enki 35 – Ninki
30 – Nanna/Sin 25 – Ningal
20 – Utu/Shamash 15 – Inanna/Ishtar
10 – Ishkur/Adad 5 – Ninhursag
The number paired with the name of the god was sometimes used instead of the name;1 thus,
we have a substitution cipher. In general, though, as explained in the Introduction, when entire
words or names are swapped out for numbers or letters, we refer to it as a code, rather than a cipher.
It seems that every culture that develops writing (which itself offers some secrecy if nearly
everyone is illiterate) develops cryptography soon thereafter. Although many more examples could
be included, we now move on to the ancient Greeks’ use of cryptography.
1 Röllig, Werner, “Götterzahlen,” in Ebeling, Erich and Bruno Meissner, editors, Reallexikon der Assyriologie und
Vorderasiatischen Archäologie, Vol. 3, Walter de Gruyter & Co., Berlin, Germany, 1971, pp. 499–500.
2 Singh, Simon, The Code Book, Doubleday, New York, 1999, p. 9.
3 Kelly, Thomas, “The Myth of the Skytale,” Cryptologia, Vol. 22, No. 3, July 1998, pp. 244–260.
4 The Greeks reportedly used a leather strip, sometimes disguised as a belt.
1 2 3 4 5
1 A B C D E
2 F G H I&J K
3 L M N O P
4 Q R S T U
5 V W X Y Z
In the Polybius cipher, each letter is replaced by the position in which it appears in the square,
using first the row number and then the column number. An example is
T H I S I S E A S Y T O B R E A K
44 23 24 43 24 43 15 11 43 54 44 34 12 42 15 11 25
This system was originally intended for signaling over long distances. To send the first letter,
T, one would hold four torches in the right hand and four in the left hand. To send the next letter,
H, one would hold two torches in the right hand and three in the left hand. We will see how to
break such a cipher later in this chapter. It’s a special case of a general class of ciphers known as
monoalphabetic substitution ciphers, mentioned at the beginning of this chapter, where a given letter
is always replaced by the same symbol wherever it appears.
The five-by-five square forced us to combine I and J as 24. The Greeks, using a smaller alpha-
bet, did not have this inconvenience. For us, though, decipherment is not unique, but context
should make clear which choice is correct. Of course, we could use a six-by-six square, which
would allow for 26 letters and 10 digits. Alternatively, numbers can be spelled out. A six-by-six
5 Polybius wrote about this system in Chapter 46 of his tenth book of history, circa 170 BCE.
square would be used for languages written in the Cyrillic alphabet. The Hawaiian alphabet, on
the other hand, contains only 12 letters: 5 vowels (a, e, i, o, u) and 7 consonants (h, k, l, m, n, p,
w). Thus, for the Hawaiian alphabet, a four-by-four square would suffice.
The Polybius Square is sometimes called a Polybius checkerboard. The letters may be placed in
the square in any order; for example, the keyword DERANGEMENT could be used to rearrange
the letters like so:
1 2 3 4 5
1 D E R A N
2 G M T B C
3 F H I&J K L
4 O P Q S U
5 V W X Y Z
Observe that repeated letters of the keyword are left out when they reappear. Once the key-
word is used up, the remaining letters of the alphabet are filled in the square.
The term derangement has a technical meaning, in that it refers to a reordering where no object
occupies its original position. Thus, the scrambling provided above is not a derangement, since the
letters U, V, W, X, Y, and Z are left in their original locations.
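The keyword-mixing procedure, and the check that the DERANGEMENT square is not a true derangement, can both be sketched in a few lines of Python (function names are illustrative, not from the text):

```python
# Build a keyword-mixed 25-letter alphabet for a Polybius square:
# keyword letters first (duplicates dropped), then the unused letters
# in alphabetical order. J is folded into I, as in the squares above.
def keyed_alphabet(keyword, alphabet="ABCDEFGHIKLMNOPQRSTUVWXYZ"):
    seen = []
    for ch in keyword.upper() + alphabet:
        ch = "I" if ch == "J" else ch
        if ch in alphabet and ch not in seen:
            seen.append(ch)
    return "".join(seen)

# A true derangement leaves no letter in its original position.
def is_derangement(mixed, alphabet="ABCDEFGHIKLMNOPQRSTUVWXYZ"):
    return all(m != a for m, a in zip(mixed, alphabet))

mixed = keyed_alphabet("DERANGEMENT")
print(mixed)                  # DERANGMTBCFHIKLOPQSUVWXYZ
print(is_derangement(mixed))  # False: U through Z keep their cells
```

Reading the mixed alphabet into the square five letters per row reproduces the DERANGEMENT square shown above, and the check confirms that U, V, W, X, Y, and Z stay put.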
Figure 1.3 Example of Viking cryptography. (Redrawn from Franksen, Ole Immanuel, Mr. Babbage’s
Secret: The Tale of a Cypher—and APL, Prentice Hall, Englewood Cliffs, New Jersey, 1984.)
Markings on the Swedish Rotbrunna stone, reproduced in Figure 1.3, may appear meaningless,
but they’re actually an example of Viking cryptography.
If we note the numbers of long strokes and short strokes in each group, we get the numbers 2, 4, 2,
3, 3, 5, 2, 3, 3, 6, 3, 5. Pairing the numbers up gives 24, 23, 35, 23, 36, 35. Now consider Figure 1.4.
The Vikings used diagrams like these to translate the numbers to runes; for example, 24
indicates the second row and the fourth column. In the diagram on the left, this gives the rune
for J. Thus, this Viking encryption system was essentially a Polybius cipher. This is just one of
the Viking systems; there are others. Secrecy must have been important for these people from the
beginning, for the word rune means “secret” in Anglo-Saxon.
The warning, combined with the 300 Spartans who held off the Persians for three days,
allowed time to prepare a successful defense. Leonidas was among those who died trying to gain
the time necessary for the Greeks to build a stronger defense. Without the advance warning given
in this instance (or for the later attack mentioned previously, where the skytale was used to convey
the warning), it’s conceivable that the lack of preparation could have led to a victory for Persia, in
which case there would have been no “cradle of western civilization.”
Another steganographic trick was carried out by Histiaeus. In 499 BCE, he had a slave’s head
shaved for the purpose of tattooing a message on it, encouraging Aristagoras of Miletus to revolt
against the Persian King. When the slave’s hair grew back, concealing the message, the slave was
dispatched with instructions to tell the intended recipient to shave his head. This is not a good
method, though, if time is of the essence or if the message is long and the slave’s head small!7
A B C D E F G H I J K L M N O P Q R S T U V W X Y Z plaintext
D E F G H I J K L M N O P Q R S T U V W X Y Z A B C ciphertext
For example,
ET TU BRUTE? message
becomes
HW WX EUXWH? ciphertext
(Note: These were not Caesar’s last words. They are the last words of Caesar in Shakespeare’s play,
which is not completely historically accurate.)
We don’t have to shift by three. We could shift by some other integer value K. If we think in
terms of the numbers 0 through 25 representing the letters A through Z, the enciphering process may
be viewed mathematically as C = M + K (mod 26), where C is the ciphertext letter, M is the plaintext
letter, and K is the key. The “mod 26” part (short for “modulo 26”) simply means that if the sum M
+ K is greater than or equal to 26, we subtract 26 from this number to get our result. The keyspace
(defined as the set of possible choices for K ) has 25 elements, because the identity, K = 0, leaves the
message unchanged, as does K = 26. Only values strictly between 0 and 26 offer distinct encipher-
ments. For those of you who have had a semester of abstract algebra, we are now working with ele-
ments from the group Z26. Just as Brutus helped kill Caesar, a brute force attack (trying all possible
keys) quickly destroys his cipher. One requirement for a strong cryptosystem is a big keyspace.
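As a quick sketch of this arithmetic (my illustration; the function name `caesar` is not from the text), the encipherment C = M + K (mod 26) and a brute-force attack take only a few lines of Python:

```python
def caesar(text, k):
    """Shift each letter of a message by k places (mod 26)."""
    return "".join(
        chr((ord(c) - ord("A") + k) % 26 + ord("A")) if c.isalpha() else c
        for c in text.upper()
    )

print(caesar("ET TU BRUTE?", 3))  # HW WX EUXWH?

# Brute force: with only 25 usable keys, simply try them all and
# look for the one shift that yields readable text.
ciphertext = caesar("ET TU BRUTE?", 3)
for k in range(1, 26):
    print(k, caesar(ciphertext, -k))
```

Note that deciphering with key K is just enciphering with key −K, so one routine serves both directions.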
Perhaps in an act of rebellion against more modern ciphers, during the U.S. Civil War, General
Albert S. Johnston (fighting for the Confederacy), agreed with his second in command, General
Pierre Beauregard, to use a Caesar shift cipher!10
n! ∼ √(2πn) (n/e)^n
10 Kahn, David, The Codebreakers, second edition, Scribner, New York, 1996, p. 216.
11 Ghahramani, Saeed, Fundamentals of Probability, Prentice-Hall, Upper Saddle River, New Jersey, 1996, p. 45.
12 Discovered by James Stirling (1692–1770) in 1730. You will see the ∼ notation again in Section 16.1, where the
importance of prime numbers in cryptology is discussed. For a closer look at this formula, see Bauer, Craig P.,
Discrete Encounters, CRC/Chapman & Hall, Boca Raton, Florida, 2020, pp. 200-201.
Monoalphabetic Substitution Ciphers, or MASCs: Disguises for Messages ◾ 9
where π ≈ 3.1415 and e ≈ 2.71828. The ∼ is read as “asymptotically approaches” and means

lim(n→∞) n! / (√(2πn) (n/e)^n) = 1
So, we have a nice compact formula that relates the concept of factorials to some of the most
important constants in mathematics!
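A quick numerical check of the approximation (my illustration, not from the text):

```python
import math

def stirling(n):
    """Stirling's approximation: sqrt(2*pi*n) * (n/e)^n."""
    return math.sqrt(2 * math.pi * n) * (n / math.e) ** n

# The ratio n!/approximation approaches 1 as n grows.
for n in (5, 10, 20):
    print(n, math.factorial(n) / stirling(n))
```

Even for modest n the ratio is already within a fraction of a percent of 1.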
In classical cryptography, it has often been desirable to have a key that can be memorized,
because if the key is written down, it is susceptible to seizure. It would be time-consuming to
memorize A goes to H, B goes to Q, C goes to R, etc. for all 26 letters. This leads us to keyword
ciphers.13 For example, using the keyword PRIVACY, we have
A B C D E F G H I J K L M N O P Q R S T U V W X Y Z plaintext
P R I V A C Y B D E F G H J K L M N O Q S T U W X Z ciphertext
Letters that are not used in the keyword follow it in alphabetical order when writing out the
ciphertext alphabet. Thus, we have a cipher superior to the Caesar shift, but still weakened by some
order being retained in the cipher alphabet. Such ciphers may be further complicated by having
the keyword placed somewhere other than the start of the alphabet. For example, we may use the
two-part key (PRIVACY, H) and encipher with
A B C D E F G H I J K L M N O P Q R S T U V W X Y Z plaintext
Q S T U W X Z P R I V A C Y B D E F G H J K L M N O ciphertext
Also, long key phrases may be used to determine the order of the substitutions. If a letter is
repeated, simply ignore it when it reappears,14 as you write the cipher alphabet out. Key phrases
may be chosen that contain all of the letters of the alphabet. For example,
The quick brown fox jumps over a lazy dog. (33 letters)
A B C D E F G H I J K L M N O P Q R S T U V W X Y Z plaintext
T H E Q U I C K B R O W N F X J M P S V A L Z Y D G ciphertext
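All three constructions above (keyword, keyword placed under a letter other than A, and key phrase) follow the same recipe, sketched here in Python (the function name and arguments are my own invention):

```python
import string

def keyword_alphabet(key, under="A"):
    """Build a mixed cipher alphabet from a keyword or key phrase.

    Repeated letters are ignored on reappearance, and the unused
    letters follow in alphabetical order. The optional second
    argument starts the keyword under a plaintext letter other than A.
    """
    seen = []
    for c in key.upper() + string.ascii_uppercase:
        if c.isalpha() and c not in seen:
            seen.append(c)  # ignore a letter when it reappears
    shift = ord(under) - ord("A")
    return "".join(seen[-shift:] + seen[:-shift]) if shift else "".join(seen)

print(keyword_alphabet("PRIVACY"))       # PRIVACYBDEFGHJKLMNOQSTUWXZ
print(keyword_alphabet("PRIVACY", "H"))  # QSTUWXZPRIVACYBDEFGHJKLMNO
```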
13 The Argentis were the first to use keywords in this manner (during the late 1580s). See Kahn, David, The
Codebreakers, second edition, Scribner, New York, 1996, p. 113.
14 An enciphered message sent to Edgar Allan Poe was made much more difficult to break because the sender did
A few devices have been invented to ease translation from plaintext to ciphertext and back again.
We now take a look at two of these.
Figure 1.5 Leon Battista Alberti’s cipher disk. (Courtesy of the David Kahn Collection, National
Cryptologic Museum, Fort Meade, Maryland.)
The manner in which Leon Battista Alberti’s cipher disk (Figure 1.5) can be used to encrypt
and decrypt should be immediately clear and require no explanation. The inner alphabet is on a
separate disk that may be rotated with respect to the larger disk in order to form other substitu-
tions. Alberti (1404–1472) invented this disk in 1466 or 1467.16
Another device is the St.-Cyr Slide (Figure 1.6).17 Here again, one alphabet (the one written twice
in the figure) may be moved relative to the other. This device is only good for performing a Caesar
type shift, if using a straight alphabet, as pictured below, and keeping the slide in a fixed position.
Yet, even with a completely mixed alphabet, it is a trivial matter to break such a cipher. MASCs
have often been used in fiction. The hero of the tale typically succeeds in his attempt at cryptanaly-
sis and explains how he did it.18 One of the best of these tales was penned by Edgar Allan Poe. His
connection with cryptology is examined next. The following section examines Sherlock Holmes’s
encounters with ciphers and we then conclude our treatment of MASCs with a look at solution
techniques unknown to Poe or Arthur Conan Doyle’s famous detective.
16 Kahn, David, The Codebreakers, second edition, Scribner, New York, 1996, pp. 126–128.
17 This device is named after the French national military academy.
18 For an overview of codes and ciphers in fiction see Codes and Ciphers in Fiction: An Overview by John F. Dooley
in the October 2005 issue of Cryptologia, Vol. 29, No. 4, pp. 290–328. This paper contains an annotated bibli-
ography with 132 entries. For an updated list, go to https://www.johnfdooley.com/ and follow the link “Crypto
Fiction.” As of October 7, 2020, this list contains 420 examples.
Let any one address us a letter in this way, and we pledge ourselves to read it forth-
with—however unusual or arbitrary may be the characters employed.
He did insist that word spacing be preserved. Poe solved the ciphers that were sent in, with
a few exceptions. Some wiseguys apparently sent in fake ciphers that never contained a message,
but rather consisted of random letters. Several readers made Poe’s work easier by enciphering well-
known writings. For example, Poe only needed to glance at one such ciphertext to know that it was
the Lord’s Prayer. Poe had articles on cryptography in a total of 15 issues,19 yet he didn’t reveal his
method of deciphering, despite the pleas of readers.
Poe quit Alexander’s in May of 1840 and began serving as editor for Graham’s magazine a year
later. He repeated his cipher challenge, although this time it was buried in a review, “Sketches of
Conspicuous Living Characters” (April 1841). Poe’s longest article on the subject, titled, oddly
enough, “A Few Words on Secret Writing,” appeared in the July 1841 issue. The articles were once
again very popular, yet Poe still did not reveal his method. The suspense was good for sales.
Polyphonic ciphers were among the systems Poe solved in Graham’s. In these, more than one
letter may be enciphered as the same character. This makes deciphering more difficult for the
cryptanalyst, as well as for the intended recipient. An example is shown below.
At a glance, we can tell that this ciphertext wasn’t obtained by simply replacing each letter
with another in a one-to-one manner; for example, the second to last word in the second line,
inotiiiiv, has four copies of the same letter grouped together, but there is no such word in English!
Nevertheless, Poe broke this cipher. He determined the key phrase (in Latin) to be Suaviter in
modo, fortiter in re. So, the substitutions were determined as follows.
A B C D E F G H I J K L M N O P Q R S T U V W X Y Z plaintext
S U A V I T E R I N M O D O F O R T I T E R I N R E ciphertext
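Inverting the table makes the many-to-one character of the cipher explicit; a small sketch (Poe, of course, worked by hand):

```python
# Invert the key-phrase substitution to see which plaintext letters
# each ciphertext character can stand for.
plain = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"
cipher = "SUAVITERINMODOFORTITERINRE"  # Suaviter in modo, fortiter in re

inverse = {}
for p, c in zip(plain, cipher):
    inverse.setdefault(c, []).append(p)

print(inverse["I"])  # ['E', 'I', 'S', 'W']
```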
The letter i in the ciphertext may represent E, I, S, or W. Other letters provide multiple pos-
sibilities as well. Intentionally or not, the encipherer made things even more difficult through the
presence of misspellings or typos. Also, he didn’t exactly use a simple vocabulary. Poe stated the
solution as
In both periodicals, Poe made a claim that he repeated in his short story “The Gold Bug”
(1843).20 The phrasing differed. It is stated here in the form seen in the story.
Yet it may be roundly asserted that human ingenuity cannot concoct a cipher which
human ingenuity cannot resolve.
This is Poe’s most famous quote concerning cryptology. It is in error, as we will see later; there
is a theoretically unbreakable cipher.
In “The Gold Bug,” Poe finally gave his readers what they were clamoring for. He revealed his
method. It became his most popular short story. I believe there is a tendency to read fiction more
passively than nonfiction. Perhaps this is a result of the willing suspension of disbelief. In most
references to “The Gold Bug,” a glaring error in cryptanalysis is not mentioned. The hero of the
tale, Legrand, begins his cryptanalysis of the message by observing that word divisions are not
present in the ciphertext, yet one of the symbols could have stood for a blank space. This symbol
would have the greatest frequency—larger than that of e. Okay, no big deal, but when Legrand
says,
Now, in English, the letter which most frequently occurs is e. Afterward, the succes-
sion runs thus: a o i d h n r s t u y c f g l m w b k p q x z. E predominates so remarkably,
that an individual sentence of any length is rarely seen in which it is not the prevailing
character.
we have a more serious error. Do you see it? If not, look again before reading on.
In a fantastic computer science text,21 William Bennett pointed out the errors that Poe made.22
He titled this section “Gold Bug or Humbug?” and referred to the origin of the frequency table
above as “the most fundamental mystery of the entire short story.” The letter t should be the
second most frequent. Why did Poe place it tenth? This was not an error made by the printer, as
Legrand’s decipherment is consistent with the erroneous frequency table. Identifying the second
most frequent cipher symbol (which stood for t) as a, according to the table, would lead to a dead
end. Hence, Legrand discards the table and uses the fact that the is the most common word.
Bennett’s solution to the mystery is that the story was originally written to have the cipher in
Spanish and when changed to English, at the request of the publisher, the frequency table was
left unaltered. However, before setting out on a search for an early draft of the story to confirm
Bennett’s conjecture, a search of the cryptologic literature reveals a better solution.
Although Bennett wasn’t aware of it, the mystery had been solved decades before the appear-
ance of his text. Raymond T. Bond, in the role of editor, assembled a collection of short stories
involving codes and ciphers and penned an introduction for each.23 In the introduction to Poe’s
tale, he described an article on ciphers in Abraham Rees’s Cyclopædia or, Universal Dictionary of
Arts, Sciences, and Literature (1819). This article, which was written by William Blair, discussed
letter frequencies, but handled consonants and vowels separately. Blair split the consonants into
four groups, according to frequency and didn’t attempt to rank the letters within each group, but
rather ordered each group alphabetically:
d h n r s t c f g l m w b k p q x z
In describing the vowels, Blair stated that e was the most frequent, followed by o, a, and i,
and then pointed out that some consonants, such as s and t, are more frequent than u and y. The
relatively recent letters v and j were not included!
Upon reading this, Poe placed u and y after s and t in the ordering given above and then put
the other vowels at the start, accidentally switching the order of o and a to get
e a o i d h n r s t u y c f g l m w b k p q x z
21 Bennett, Jr., William Ralph, Scientific and Engineering Problem-Solving with the Computer, Prentice-Hall,
Upper Saddle River, New Jersey, 1976, pp. 159–160.
22 Poe also made errors concerning wines in his excellent short story “The Cask of Amontillado.” Hey Edgar,
Amontillado is a Sherry! For more errors in “The Cask of Amontillado” see Fadiman, Clifton, editor, Dionysus:
a Case of Vintage Tales About Wine, McGraw-Hill, New York, 1962.
23 Bond, Raymond T., Famous Stories of Code and Cipher, Rinehart and Company, New York, 1947, pp. 98–99.
The contents of the paperback edition differ from the hardcover by one story, but “The Gold Bug” is present in
both.
14 ◾ Secret History: The Story of Cryptology
Because he created the genre of detective fiction, I’d like to think Poe would have enjoyed the
work that went into solving the “Mystery of the Incorrect Letter Frequencies.”
David Kahn pointed out several other errors in The Gold Bug that have nothing to do with
cryptography.24 Read the whole story and see how many you can find.
Other authors have made mistakes in their explanations of cryptanalysis. I came across an
amusing example when I thought I was reading something far removed from crypto. If you can
understand a little German, check out Das Geheimnis im Elbtunnel by Heinrich Wolff (National
Textbook Company, 1988). A familiarity with vowel recognition algorithms (see Section 1.12)
shows that the detective in this tale isn’t much better than Poe’s protagonist.
In Honoré de Balzac’s The Physiology of Marriage (published in 1829 as part of his Human
Comedy), the author includes four pages of nonsense.25 This consisted of meaningless combina-
tions of letters, as well as upside-down letters and punctuation. In the years since it first appeared,
it has never been deciphered and it is widely believed among cryptanalysts that Balzac might have
been blowin’ a little smoke. In fact, the nonsense pages changed from one edition to the next.
This nitpicking isn’t meant to detract from Poe’s greatness. Generations of cryptologists trace
their interest in this subject back to reading “The Gold Bug” as children, and I will have more to
say about Poe’s cipher challenge in Chapter 2. For now, note that Poe placed a hidden message in
the following sonnet. Can you find it? The solution is given at the end of this chapter.
An Enigma (1848)
by Edgar Allan Poe
“Seldom we find,” says Solomon Don Dunce,
“Half an idea in the profoundest sonnet.
Through all the flimsy things we see at once
As easily as through a Naples bonnet-
Trash of all trash!–how can a lady don it?
Yet heavier far than your Petrarchan stuff-
Owl-downy nonsense that the faintest puff
Twirls into trunk-paper the while you con it.”
And, veritably, Sol is right enough.
The general tuckermanities are arrant
Bubbles–ephemeral and so transparent-
But this is, now–you may depend upon it-
Stable, opaque, immortal–all by dint
Of the dear names that he concealed within’t.
24 Kahn, David, The Codebreakers, second edition, Scribner, New York, 1996, pp. 790–791.
25 Kahn, David, The Codebreakers, second edition, Scribner, New York, 1996, p. 781.
26 Available online at http://sherlock-holm.es/stories/pdf/a4/1-sided/danc.pdf.
We’ll develop techniques for breaking such ciphers later in this chapter, but let’s make a few
simple observations now. First, observe that either the messages consist of very long words or word
spacing has been disguised. Assuming the latter, either we have spacing eliminated completely or a
special character represents the space. If a special symbol designates the space, it should be of very
high frequency, but the highest frequency symbol in the ciphertexts sometimes appears at the end
of a message. This would be pointless if it really represented a space.
Finally, we hit upon the idea that the flags represent the spaces. This makes our task much
easier, as we now have word divisions. However, there are still some complicating factors. The
messages are short, turn out to include proper names and locations (not just words you’d find in a
dictionary) and there’s at least one typo present! The number of typos varies from edition to edi-
tion, but all have at least one!
Holmes never mentioned the typos, so one may assume that they were errors made by Doyle
and/or the printers and not intended to make things more difficult. Anyone attempting to deci-
pher a message must be able to deal with typos. These occur frequently in real-world messages.
They can be caused by a careless encipherer or by problems in transmission. “Morse mutilation” is
a term sometimes used to describe a message received in Morse code with errors arising from the
transmission process.
Leo Marks, who was in charge of the Special Operations Executive (SOE) code and cipher
group during World War II,27 frequently had to contend with “indecipherables” arising from
agents in occupied Europe improperly enciphering. When he began in 1942, about 25% of incom-
ing messages were indecipherable for one reason or another. At that time, only 3% were recovered,
but Marks gave lectures on recovery techniques to his team and developed new methods. The
27 The Special Operations Executive was roughly the British equivalent of America’s Office of Strategic Services
(OSS). They ran risky operations behind enemy lines.
number of solutions quickly rose to a peak of 92% and finally settled at an average of 80%. An
individual message required an average of 2,000 attempted decipherments before Marks’s team hit
upon the real error.28 It was worth the effort, for the only alternative was to have the agent resend
the message, a risky activity with Nazis constantly on the lookout with direction-finding equip-
ment for agents transmitting radio messages. A second attempt at transmission could easily result
in capture and torture for the agent.
More recently, a copycat killer nicknamed Zodiac II sent an enciphered message to the New
York Post. In order to get the correct decipherment, the cryptanalyst must grapple with some typos
the killer made. See the online exercises for this cipher.
Sherlock Holmes, of course, was able to break the dancing men cipher completely and com-
pose a message of his own using it (Figure 1.9).
Holmes’s adversary in this tale had no idea that the message was from Holmes and this decep-
tion led to his capture. So, the story raises an issue that we’ll visit again in this book. Namely,
how can we be sure that the message comes from the person we think it does? Holmes was able to
impersonate another, because he learned the secret of the cipher, but in an ideal system this would
not be possible.
Many agents were captured and impersonated over the course of World War II. In one instance,
when it was suspected that a particular SOE agent had been compromised and that a Nazi was car-
rying out his end of the radio conversation, a test was devised to ascertain the truth. The suspicious
British radio operator signed off on his message to the agent with HH, short for Heil Hitler. This
was a standard sort of “goodbye” among the Nazis and the operator abroad responded automati-
cally with HH, giving himself away.29
Was Doyle’s use of dancing men to represent letters as creative as it might seem? It turns
out that these fellows had been protecting messages long before Doyle wrote his story. When
confronted with previous uses, he chalked it up to coincidence.30 Doyle’s dancing men are also
reminiscent of some figures in a lost script, known as Rongorongo, which was used by the people
of Easter Island and, oddly, resembles Indus Valley script, another indecipherable at the moment
(Figure 1.10). The idea that there’s a connection between the two scripts is not backed by the lead-
ing experts, as the two cultures were widely separated, not only in terms of distance but also in
time.
Although such topics are only treated in passing here, deciphering lost scripts attracts the inter-
est of a decent number of cryptologists. See the References and Further Reading section at the end
of this chapter for papers Cryptologia has published on Rongorongo script.
“The Adventure of the Dancing Men” is not the only story in which Doyle’s detective encoun-
tered cryptology. Another is discussed in Section 1.21.
28 Marks, Leo, Between Silk and Cyanide: A Codemaker’s War, 1941-1945. The Free Press, New York, pp. 192, 417.
29 Marks, Leo, Between Silk and Cyanide: A Codemaker’s War, 1941-1945. The Free Press, New York, pp. 348–349.
30 Shulman, David, “The Origin of the Dancing Men,” The Baker Street Journal, (New Series), Vol. 23, No. 1,
Figure 1.10 A comparison of Rongorongo (right side of each column) and Indus Valley script
(left side of each column). (Adapted from Imbelloni, Sr., J., “The Easter Island Script and the
Middle-Indus Seals,” The Journal of the Polynesian Society, Vol. 48, No. 1, whole no. 189, March
1939, pp. 60-69, p. 68 cited here.)
A 8.2 N 6.7
B 1.5 O 7.5
C 2.8 P 1.9
D 4.3 Q 0.1
E 12.7 R 6.0
F 2.2 S 6.3
G 2.0 T 9.0
H 6.1 U 2.8
I 7.0 V 1.0
J 0.2 W 2.4
K 0.8 X 0.2
L 4.0 Y 2.0
M 2.4 Z 0.1
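Tallies like the frequency table above are easily computed; a minimal sketch (the sample sentence is my own):

```python
from collections import Counter

def letter_frequencies(text):
    """Percentage frequency of each letter appearing in text."""
    letters = [c for c in text.upper() if c.isalpha()]
    counts = Counter(letters)
    return {c: 100 * n / len(letters) for c, n in counts.most_common()}

freqs = letter_frequencies("THE QUICK BROWN FOX JUMPS OVER A LAZY DOG")
print(freqs)  # in this short sample, O happens to lead
```

Of course, a 33-letter sample is far too small to match the table; real counts converge to it only over long stretches of normal English.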
If youth, throughout all history, had had a champion to stand up for it; to show a
doubting world that a child can think; and, possibly, do it practically; you wouldn’t
constantly run across folks today who claim that “a child don’t know anything.” A
child’s brain starts functioning at birth; and has amongst its many infant convolu-
tions, thousands of dormant atoms, into which God has put a mystic possibility for
noticing an adult’s act, and figuring out its purport.
Wright was not alone in meeting this challenge. The French author Georges Perec completed
La Disparition (1969) without using a single E. Frequency tables vary from language to language,
but E is commonly high. Perec’s performance was perhaps more impressive, as E is even more fre-
quent in French (17.5%) than in English. Gilbert Adair translated this work without introducing
an E. He titled it A Void and it begins
Today, by radio, and also on giant hoardings, a rabbi, an admiral notorious for his links
to masonry, a trio of cardinals, a trio, too, of insignificant politicians (bought and paid
for by a rich and corrupt Anglo-Canadian banking corporation), inform us all of how
our country now risks dying of starvation. A rumour, that’s my initial thought as I switch
off my radio, a rumour or possibly a hoax. Propaganda, I murmur anxiously – as though,
just by saying so, I might allay my doubts – typical politicians’ propaganda. But public
opinion gradually absorbs it as a fact. Individuals start strutting around with stout clubs.
‘Food, glorious food!’ is a common cry (occasionally sung to Bart’s music), with ordinary
hard-working folk harassing officials, both local and national, and cursing capitalists
and captains of industry. Cops shrink from going out on night shift. In Mâcon a mob
storms a municipal building. In Rocadamour ruffians rob a hangar full of foodstuffs,
pillaging tons of tuna fish, milk and cocoa, as also a vast quantity of corn – all of it, alas,
totally unfit for human consumption. Without fuss or ado, and naturally without any
sort of trial, an indignant crowd hangs 26 solicitors on a hastily built scaffold in front of
Nancy’s law courts (this Nancy is a town, not a woman) and ransacks a local journal, a
disgusting right-wing rag that is siding against it. Up and down this land of ours looting
has brought docks, shops and farms to a virtual standstill.
Although this paragraph was written without the letter E, it cannot be read without it! By writ-
ing 26 using digits instead of letters, Adair avoided that particular E.
Frequency counts used for the purpose of cryptanalysis first appear in Arabic works. The inter-
est was apparently a byproduct of studying the Koran. All sorts of statistics on letters and words
used in the Koran had been compiled as part of the intense study this work was subject to. Finally,
someone took the next step and applied this data to an enciphered message.
aleph beth gimel daleth he waw zayin heth teth yod kaph
taw shin resh qoph sadhe pe ayin samekh nun mem lamed
This system was used in Jeremiah 25:26 and Jeremiah 51:41 where Sheshach appears as an
enciphered version of Babel. Until it was recognized that this encipherment had been used, biblical
scholars spent many hours puzzling over where Sheshach was located! In Section 1.5, we exam-
ined the Caesar cipher. There is a variant of this system, called a reverse Caesar cipher, for which the
ciphertext alphabet is written backwards before being shifted. If this is suspected, one may simply
transform the ciphertext Atbash style and then break it like a regular Caesar cipher. Unfortunately,
much media attention has been focused on a nonexistent Bible Code, where equidistant sequences
are alleged to reveal information about events from biblical times through the present and into the
future. Rather than waste space on it here, I’ll simply reference a paper that debunks this nonsense
and point out that such “messages” can be found in any book of sufficient length.31
31 Nichols, Randall, “The Bible Code,” Cryptologia, Vol. 22, No. 2, April 1998, pp. 121–133.
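Setting the Bible Code aside, the Atbash and reverse-Caesar transforms described above can be sketched as follows (my illustration, applied to the English alphabet):

```python
import string

ALPHA = string.ascii_uppercase

def atbash(text):
    """Mirror the alphabet: A<->Z, B<->Y, and so on."""
    return text.upper().translate(str.maketrans(ALPHA, ALPHA[::-1]))

def reverse_caesar(text, k):
    """Write the cipher alphabet backwards, then shift it by k."""
    shifted = ALPHA[k:] + ALPHA[:k]
    return atbash(text).translate(str.maketrans(ALPHA, shifted))

# Atbash is its own inverse, so applying it to a reverse-Caesar
# ciphertext leaves an ordinary Caesar cipher to break.
print(atbash("ABBA"))            # ZYYZ
print(atbash(atbash("SECRET")))  # SECRET
```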
I 59 0.9 V 4.1 -
J 7 - W 63.3 3.7
Note: Figures above represent the frequency of occurrence in 1,000 words. V, Q, J, and Z occur so
rarely as terminals that their frequencies cannot be expressed in this table.
The letters most commonly doubled are shown in Table 1.3. Frequency tables for all two-letter
combinations (known as digraphs) can be found in many books and online sources. Figure 1.12
provides such an example. Usually, TH is the most frequent digraph, but for the government tele-
grams used for Friedman’s table EN is the most frequent and TH places fifth.
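Digraph tallies like those in Friedman’s table can be generated the same way as single-letter counts. A sketch (note that with spaces stripped, pairs straddling word boundaries are counted too, as they would be in a ciphertext without word divisions):

```python
from collections import Counter

def digraph_counts(text):
    """Count adjacent letter pairs, ignoring non-letters."""
    letters = [c for c in text.upper() if c.isalpha()]
    return Counter(a + b for a, b in zip(letters, letters[1:]))

counts = digraph_counts("THE THEORY THAT THEY THOUGHT THROUGH")
print(counts.most_common(3))  # TH dominates this contrived sample
```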
Pattern word lists are also of value for deciphering without a key. The pattern of the
letters in the word may be looked up as in a dictionary. As an example, consider the pat-
tern ABCADEAE. This is meant to indicate that the first, fourth, and seventh letters are all
the same. Also, the sixth and eighth letters match. No other matches may be present, as B,
C, and D denote distinct letters. There are very few words that fit this form. ARKANSAS,
EXPENDED, and EXPENSES are the only possibilities given in the reference cited here.32
32 Carlisle, Sheila, Pattern Words Three Letters to Eight Letters in Length, Aegean Park Press, Laguna Hills,
California, 1986, p. 65. There are more complete lists that include other possibilities.
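The pattern lookup is easy to mechanize. A sketch (the tiny word list here stands in for a full dictionary):

```python
def pattern(word):
    """Encode a word's letter-repetition pattern, e.g. ARKANSAS -> ABCADEAE."""
    mapping = {}
    out = []
    for c in word.upper():
        if c not in mapping:
            mapping[c] = chr(ord("A") + len(mapping))
        out.append(mapping[c])
    return "".join(out)

words = ["ARKANSAS", "EXPENDED", "EXPENSES", "EXAMPLES"]
matches = [w for w in words if pattern(w) == "ABCADEAE"]
print(matches)  # EXAMPLES has a different pattern and is excluded
```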
LL 19 FF 9 MM 4
EE 14 RR 6 GG 4
SS 15 NN 5 DD 1.5
OO 12 PP 4.5 AA 0.5
TT 9 CC 4 BB 0.25
Source: Pratt, Fletcher, Secret and Urgent, Bobbs-Merrill, New York, 1939, p. 259.
If such a rare pattern word appears in a MASC, we can quickly obtain several letters in the
cipher alphabet. If the substitutions suggested by ARKANSAS yield impossible words else-
where, simply try the next choice on the list.
There are websites that allow you to enter a pattern and see all of the words, in a particular
language, that fit that pattern. A pair of these are https://design215.com/toolbox/wordpattern.php
and https://www.hanginghyena.com/solvers/cryptogram-helper.
It’s very easy to manipulate data in electronic form, but this was not how things were done
in the old days. An excerpt from an interview with Jack Levine, the creator of several volumes of
pattern word books, sheds some light on this.33
Levine: Cryptography is my hobby. I enjoy doing it. Years ago there were very few
mathematicians working in this area but today there are a good many prestigious
mathematicians studying cryptography, in particular algebraic cryptography, which
by the way is an expression I invented. My pattern word list, which I produced in the
70s, is now the standard work in this area.
Burniston: Tell me about this.
Levine: What I did was to take Webster’s Unabridged Dictionary which has over
500,000 words in it, and copied each word and classified it by its pattern. In other
words, if you wanted to know all six-letter words where the first and fourth letters were
the same, you could go to my book and find all the words with that pattern quickly.
Burniston: Let me get this straight, now. You copied out all the words in Webster’s
Unabridged Dictionary?
Levine: Yes. In fact, I started with the second edition and while I was doing this, the
third edition came out and I more or less had to start the whole thing again. That was
a pain.
Burniston: How long did it take you to do this?
Levine: About 15 years. I had the word list published by the print shop here on cam-
pus [N.C. State] at my own expense and gave the copies to members of the American
Cryptogram Association, of which I am a past president. Now because of the very limited
number of copies, it has become a valuable item. It also probably ruined my eyesight.
33 History of the Math Department at NCSU: Jack Levine, December 31, 1986, interview, https://web.archive.org/
web/20160930110613/http://www4.ncsu.edu/∼njrose/Special/Bios/Levine.html.
Figure 1.12 Frequency table for two-letter combinations. (From Friedman, William F. and
Lambros D. Callimahos, Military Cryptanalytics, Part I, National Security Agency, Washington,
DC, 1956, p. 257.)
Pattern word lists were used during World War II, although America’s lists weren’t as complete
as those made later by Levine. Amazingly, I’ve been unable to find decent lists from before World
War II. It seems like such an obvious approach to cracking MASCs must be hundreds of years old,
but, if so, where are the lists?
Non-pattern words are also useful to have in lists. These are words that do not use any letter
more than once. They are also called isograms. The record length seems to be 16 letters. A few
examples follow:
uncopyrightables (16 letters)
uncopyrightable (15 letters)34
dermatoglyphics (15 letters—the science of fingerprints)
ambidextrously (14 letters)
thumbscrewing (13 letters)35
sympathized (11 letters)
pitchforked (11 letters)
gunpowdery (10 letters)
blacksmith (10 letters)
prongbucks (10 letters)
lumberjack (10 letters)
Many more words can be added to the shorter word length lists above. How many can you find?
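Testing candidates is straightforward; a sketch using examples from the list above:

```python
def is_isogram(word):
    """True when no letter appears more than once in the word."""
    letters = [c for c in word.lower() if c.isalpha()]
    return len(letters) == len(set(letters))

for w in ["uncopyrightable", "dermatoglyphics", "thumbscrewing", "lumberjack"]:
    print(w, is_isogram(w))
```

A filter like this over any machine-readable dictionary will extend the lists quickly.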
Computer programs that use dictionaries of pattern (and nonpattern) words can rapidly break
the simple ciphers discussed in this chapter. But before we start cracking ciphers, let’s examine
some more tools that may be useful.
34 Fourteen- and 15-letter isograms are from Lederer, Richard, Crazy English, Pocket Books, New York, 1989, p. 159.
35 Ten-, 11-, and 13-letter isograms are from http://www.wordways.com/morenice.htm.
the A column than in the B column. We now look at an algorithm that may be applied to mono-
alphabetic ciphertexts, as well as unknown scripts.
Steps 2 and 3
C E G I L N O R S V W Z Sum Consonant/Vowel
C 0 1 0 0 0 0 1 0 0 0 0 0 2 C
E 1 0 0 0 1 0 0 3 0 0 2 0 7 V
G 0 0 0 0 0 2 1 0 0 0 0 0 3 C
I 0 0 0 0 0 2 0 0 0 0 0 2 4 C
L 0 1 0 0 0 0 0 0 1 0 0 0 2 C
N 0 0 2 2 0 0 1 0 0 0 0 0 5 C
O 1 0 1 0 0 1 0 0 0 1 2 0 6 C
R 0 3 0 0 0 0 0 0 0 0 0 0 3 C
S 0 0 0 0 1 0 0 0 0 0 0 0 1 C
V 0 0 0 0 0 0 1 0 0 0 0 0 1 C
W 0 2 0 0 0 0 2 0 0 0 0 0 4 C
Z 0 0 0 2 0 0 0 0 0 0 0 0 2 C
Step 4
E looks like a vowel, because it has the highest row sum. We then adjust the row sums.
Step 5
C E G I L N O R S V W Z Sum Consonant/Vowel
C 0 1 0 0 0 0 1 0 0 0 0 0 0 C
E 1 0 0 0 1 0 0 3 0 0 2 0 7 V
G 0 0 0 0 0 2 1 0 0 0 0 0 3 C
I 0 0 0 0 0 2 0 0 0 0 0 2 4 C
L 0 1 0 0 0 0 0 0 1 0 0 0 0 C
N 0 0 2 2 0 0 1 0 0 0 0 0 5 C
O 1 0 1 0 0 1 0 0 0 1 2 0 6 V
R 0 3 0 0 0 0 0 0 0 0 0 0 −3 C
S 0 0 0 0 1 0 0 0 0 0 0 0 1 C
V 0 0 0 0 0 0 1 0 0 0 0 0 1 C
W 0 2 0 0 0 0 2 0 0 0 0 0 0 C
Z 0 0 0 2 0 0 0 0 0 0 0 0 2 C
Back to Step 4
Now O looks like a vowel. We adjust the row sums again (Step 5):
C E G I L N O R S V W Z Sum Consonant/Vowel
C 0 1 0 0 0 0 1 0 0 0 0 0 -2 C
E 1 0 0 0 1 0 0 3 0 0 2 0 7 V
G 0 0 0 0 0 2 1 0 0 0 0 0 1 C
I 0 0 0 0 0 2 0 0 0 0 0 2 4 C
L 0 1 0 0 0 0 0 0 1 0 0 0 0 C
N 0 0 2 2 0 0 1 0 0 0 0 0 3 C
O 1 0 1 0 0 1 0 0 0 1 2 0 6 V
R 0 3 0 0 0 0 0 0 0 0 0 0 −3 C
S 0 0 0 0 1 0 0 0 0 0 0 0 1 C
V 0 0 0 0 0 0 1 0 0 0 0 0 −1 C
W 0 2 0 0 0 0 2 0 0 0 0 0 −4 C
Z 0 0 0 2 0 0 0 0 0 0 0 0 2 C
After adjusting for O, the next vowel appears to be I:
C E G I L N O R S V W Z Sum Consonant/Vowel
C 0 1 0 0 0 0 1 0 0 0 0 0 −2 C
E 1 0 0 0 1 0 0 3 0 0 2 0 7 V
G 0 0 0 0 0 2 1 0 0 0 0 0 1 C
I 0 0 0 0 0 2 0 0 0 0 0 2 4 V
L 0 1 0 0 0 0 0 0 1 0 0 0 0 C
N 0 0 2 2 0 0 1 0 0 0 0 0 −1 C
O 1 0 1 0 0 1 0 0 0 1 2 0 6 V
R 0 3 0 0 0 0 0 0 0 0 0 0 −3 C
S 0 0 0 0 1 0 0 0 0 0 0 0 1 C
V 0 0 0 0 0 0 1 0 0 0 0 0 −1 C
W 0 2 0 0 0 0 2 0 0 0 0 0 −4 C
Z 0 0 0 2 0 0 0 0 0 0 0 0 −2 C
Continuing the process, G and S are declared vowels! The technique is not perfect, but it works
much better with a text of greater length. This procedure is well suited for implementation on a
computer, so we can often quickly separate the vowels from the consonants in a ciphertext. The most frequent characters are usually E and T, and with the help this technique gives us in distinguishing them, we are well on our way to a solution.
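The steps above are easy to program. Here is a minimal Python sketch (the function name, and the choice to ignore doubled letters, are my own); it repeats Steps 4 and 5 until no positive row sum remains:

```python
from collections import defaultdict

def sukhotin(text):
    """Separate likely vowels from consonants using the adjacency-count
    procedure described above (often credited to Boris Sukhotin)."""
    stream = [c for c in text.upper() if c.isalpha()]
    letters = sorted(set(stream))
    adj = {a: defaultdict(int) for a in letters}
    # Steps 1-3: count how often each pair of distinct letters is adjacent
    for a, b in zip(stream, stream[1:]):
        if a != b:
            adj[a][b] += 1
            adj[b][a] += 1
    sums = {a: sum(adj[a].values()) for a in letters}
    vowels = set()
    while True:
        rest = [a for a in letters if a not in vowels]
        if not rest:
            break
        best = max(rest, key=lambda a: sums[a])
        if sums[best] <= 0:          # Step 4: stop when no positive sum remains
            break
        vowels.add(best)             # the highest remaining sum is called a vowel
        for a in rest:               # Step 5: adjust the remaining row sums
            if a != best:
                sums[a] -= 2 * adj[a][best]
    return vowels
```

As the text warns, on short samples the function will misclassify some letters, just as G and S were misclassified above.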
36 Knight, H. Gary, “Cryptanalyst’s Corner,” Cryptologia, Vol. 2, No. 1, January 1978, pp. 68–74.
37 McCormick, Donald, Love in Code, Eyre Methuen Ltd, London, UK, 1980, pp. 4–5.
38 Kruh, Louis, “The Churchyard Ciphers,” Cryptologia, Vol. 1, No. 4, October 1977, pp. 372–375.
39 Kruh, Louis, “The Churchyard Ciphers,” Cryptologia, Vol. 1, No. 4, October 1977, pp. 372–375.
Figure 1.13 Find the message hidden on the cover of this book. (Courtesy of Houghton Mifflin
and the Tolkien estate.)
Figure 1.14 A cipher created by Wolfgang Amadeus Mozart—not intended for performance!
(Retyped by Nicholas Lyman from McCormick, Donald, Love in Code, Eyre Methuen Ltd.,
London, UK, 1980, p. 49.)
Figure 1.17 Tales from the crypt(ologist)? (Image created by Josh Gross.)
We can begin our attack by constructing a frequency table for the ciphertext letters (Table 1.4).
A 1 N 5
B 4 O 0
C 5 P 0
D 4 Q 7
E 0 R 0
F 0 S 11
G 7 T 1
H 1 U 6
I 2 V 2
J 6 W 3
K 0 X 1
L 0 Y 0
M 4 Z 3
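Tallying frequencies by hand is tedious for longer messages; a few lines of Python (my own sketch, applied to the ciphertext of this example) reproduce the table:

```python
from collections import Counter

ciphertext = ("JDS CGWMUJNCQV NSIBVMJCBG QGW CJU ZBGUSHMSGZSU DQIS ASSG Q "
              "WCUQUJSN XBN JDS DMTQG NQZS.")

# Tally only the letters, ignoring spaces and punctuation
counts = Counter(c for c in ciphertext if c.isalpha())
print(counts.most_common(5))   # S leads with 11, matching Table 1.4
```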
The letter S sticks out as having the largest frequency; therefore, it's likely that it represents the plaintext letter E. The three-letter ciphertext word JDS appears twice. The most common three-letter combination (as well as the most common three-letter word) is THE. This agrees with our suspicion that S represents E. Is it likely that J represents T? T is very common in English and the ciphertext letter J appears 6 times, so it seems plausible. Writing our guesses above the corresponding ciphertext letters gives:
THE T E T T E E E H E EE
JDS CGWMUJNCQV NSIBVMJCBG QGW CJU ZBGUSHMSGZSU DQIS ASSG Q
TE THE H E.
WCUQUJSN XBN JDS DMTQG NQZS.
There are many possible ways to proceed from here. We have a 12-letter word, ZBGUSHMSGZSU,
so let’s consider its pattern. It has the form ABCDEFGECAED. At one of the websites referenced
earlier, you can type in this pattern and you’ll see that there’s only one word that fits, namely
CONSEQUENCES. Substituting for all of the letters now revealed fills in most of the message.
The fifth word cannot be anything other than ITS. Placing I above every C quickly leads to more letters, and the message is revealed to be THE INDUSTRIAL REVOLUTION AND ITS CONSEQUENCES HAVE BEEN A DISASTER FOR THE HUMAN RACE.
This is a quote from former mathematician Theodore Kaczynski. The substitutions used to
encipher it follow.
A B C D E F G H I J K L M N O P Q R S T U V W X Y Z plaintext
Q A Z W S X E D C R F V T G B Y H N U J M I K O L P ciphertext
Do you see the pattern in the substitutions now that it is entirely revealed?40
Notice that the statistics given in this chapter helped, but they don't match our message perfectly. T was the second most frequent letter in our table, but it's tied for fourth place in the sample cipher. Nevertheless, its frequency was high enough to make it seem like a reasonable substitution. In general, we may make some incorrect guesses in trying to break a cipher. When this happens, simply backtrack and try other guesses!
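The pattern search used on ZBGUSHMSGZSU is also easy to automate. The sketch below (names mine) computes a word's letter pattern; the three-word list is only a stand-in for the large dictionary a real search would scan:

```python
def pattern(word):
    """Return the letter pattern of a word, e.g. CONSEQUENCES -> ABCDEFGECAED."""
    labels, seen = "ABCDEFGHIJKLMNOPQRSTUVWXYZ", {}
    return "".join(seen.setdefault(c, labels[len(seen)]) for c in word.upper())

# Stand-in word list; a real attack would use a full dictionary file.
words = ["CIRCUMSTANCE", "CONSEQUENCES", "INTELLIGENCE"]
matches = [w for w in words if pattern(w) == pattern("ZBGUSHMSGZSU")]
```

Against a real dictionary, only CONSEQUENCES survives the filter, just as the website search found.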
Now that we’ve achieved some skill in breaking MASCs, it’s time to laugh at those who are
not so well informed; may they forever wallow in their ignorance! One fellow, whose name has
been lost to history, unwittingly displayed his ignorance when he proudly explained how he had
deciphered a message of the type we’ve been examining.41
From the moment when the note fell into my hands, I never stopped studying from
time to time the signs which it bore…. About 15 years more or less passed, until the
moment when God (Glory to Him!) did me the favor of permitting me to comprehend
these signs, although no one taught them to me…
On a more serious note, Chevalier de Rohan’s death was a direct consequence of his inability
to decipher such a message.42 The original message was in French and the ciphertext read
43 For more information, see Chapter 3 of Bauer, Craig P., Unsolved! The History and Mystery of the World’s Greatest
Ciphers from Ancient Egypt to Online Secret Societies, Princeton University Press, Princeton, New Jersey, 2017.
× 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25
1 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25
2 2 4 6 8 10 12 14 16 18 20 22 24 0 2 4 6 8 10 12 14 16 18 20 22 24
3 3 6 9 12 15 18 21 24 1 4 7 10 13 16 19 22 25 2 5 8 11 14 17 20 23
4 4 8 12 16 20 24 2 6 10 14 18 22 0 4 8 12 16 20 24 2 6 10 14 18 22
5 5 10 15 20 25 4 9 14 19 24 3 8 13 18 23 2 7 12 17 22 1 6 11 16 21
6 6 12 18 24 4 10 16 22 2 8 14 20 0 6 12 18 24 4 10 16 22 2 8 14 20
7 7 14 21 2 9 16 23 4 11 18 25 6 13 20 1 8 15 22 3 10 17 24 5 12 19
8 8 16 24 6 14 22 4 12 20 2 10 18 0 8 16 24 6 14 22 4 12 20 2 10 18
9 9 18 1 10 19 2 11 20 3 12 21 4 13 22 5 14 23 6 15 24 7 16 25 8 17
10 10 20 4 14 24 8 18 2 12 22 6 16 0 10 20 4 14 24 8 18 2 12 22 6 16
11 11 22 7 18 3 14 25 10 21 6 17 2 13 24 9 20 5 16 1 12 23 8 19 4 15
12 12 24 10 22 8 20 6 18 4 16 2 14 0 12 24 10 22 8 20 6 18 4 16 2 14
13 13 0 13 0 13 0 13 0 13 0 13 0 13 0 13 0 13 0 13 0 13 0 13 0 13
14 14 2 16 4 18 6 20 8 22 10 24 12 0 14 2 16 4 18 6 20 8 22 10 24 12
15 15 4 19 8 23 12 1 16 5 20 9 24 13 2 17 6 21 10 25 14 3 18 7 22 11
16 16 6 22 12 2 18 8 24 14 4 20 10 0 16 6 22 12 2 18 8 24 14 4 20 10
17 17 8 25 16 7 24 15 6 23 14 5 22 13 4 21 12 3 20 11 2 19 10 1 18 9
18 18 10 2 20 12 4 22 14 6 24 16 8 0 18 10 2 20 12 4 22 14 6 24 16 8
19 19 12 5 24 17 10 3 22 15 8 1 20 13 6 25 18 11 4 23 16 9 2 21 14 7
20 20 14 8 2 22 16 10 4 24 18 12 6 0 20 14 8 2 22 16 10 4 24 18 12 6
21 21 16 11 6 1 22 17 12 7 2 23 18 13 8 3 24 19 14 9 4 25 20 15 10 5
22 22 18 14 10 6 2 24 20 16 12 8 4 0 22 18 14 10 6 2 24 20 16 12 8 4
23 23 20 17 14 11 8 5 2 25 22 19 16 13 10 7 4 1 24 21 18 15 12 9 6 3
24 24 22 20 18 16 14 12 10 8 6 4 2 0 24 22 20 18 16 14 12 10 8 6 4 2
25 25 24 23 22 21 20 19 18 17 16 15 14 13 12 11 10 9 8 7 6 5 4 3 2 1
Monoalphabetic Substitution Ciphers, or MASCs: Disguises for Messages ◾ 35
Then each cipher letter is obtained by taking 11M + 8 (modulo 26), where M is the message
letter. Make sure to use the mod 26 multiplication table to save time.
11(7) + 8 = 7 11(14) + 8 = 6 11(22) + 8 = 16 11(0) + 8 = 8 11(17) + 8 = 13
11(4) + 8 = 0 11(24) + 8 = 12 11(14) + 8 = 6 11(20) + 8 = 20 11(8) + 8 = 18
11(12) + 8 = 10 11(0) + 8 = 8 11(5) + 8 = 11 11(5) + 8 = 11 11(8) + 8 = 18
11(13) + 8 = 21 11(4) + 8 = 0
So our numerical ciphertext is 7, 6, 16, 8, 13, 0, 12, 6, 20, 18, 10, 8, 11, 11, 18, 21, 0. Converting
back to letters and replacing punctuation (not good for security!) we have
HGQ INA MGU? S’K ILLSVA.
To decipher, we need to convert back to numbers and apply the equation M = a⁻¹(C − b). Our choice for a was 11 and the mod 26 multiplication table shows (11)(19) = 1, so a⁻¹ is 19. Go ahead and apply M = 19(C − 8) to each letter of ciphertext to recover the original message, if you are at all unsure. You can do the subtraction, followed by the multiplication, or convert the formula like so:
M = 19(C − 8) = 19C − 152 ≡ 19C + 4 (mod 26)
and then use the formula M = 19C + 4 to decipher in the same manner as we originally enciphered.
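The whole affine scheme fits in a few lines. This sketch (function name mine) applies C = aM + b (mod 26) to each letter; with a = 19, b = 4 it also deciphers, as just shown:

```python
def affine(text, a, b):
    """Map each letter M (A=0, ..., Z=25) to aM + b (mod 26); leave
    spaces and punctuation alone (not good for security!)."""
    return "".join(
        chr((a * (ord(c) - 65) + b) % 26 + 65) if c.isalpha() else c
        for c in text.upper()
    )

cipher = affine("HOW ARE YOU? I'M AFFINE.", 11, 8)   # enciphering key (11, 8)
plain = affine(cipher, 19, 4)   # M = 19C + 4 undoes it, since (19)(11) = 1 (mod 26)
```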
Although the affine cipher’s keyspace is so small that we don’t need to look for anything more
sophisticated than a brute-force attack to rapidly break it, I’ll point out another weakness. If we’re
able to obtain a pair of distinct plaintext letters and their ciphertext equivalents (by guessing how
the message might begin or end, for example), we can usually recover the key mathematically.
Example 1
Suppose we intercept a message and we guess that it begins DEAR…. If the first two ciphertext
letters are RA, we can pair them up with D and E to get
R = Da + b , A = Ea + b
Replacing the letters with their numerical equivalents gives
17 = 3a + b , 0 = 4a + b
We have several methods we can use to solve this system of equations:
1. Linear Algebra offers several techniques.
2. We can solve for b in one of the equations and substitute for it in the other to get an equa-
tion with one unknown, a.
3. We can subtract one equation from the other to eliminate the unknown b.
Let’s take approach 3.
17 = 3a + b
− ( 0 = 4a + b )
17 = − a
Since we are working modulo 26, the solution becomes a = −17 ≡ 9. Plugging a = 9 into 0 = 4a + b, we get 36 + b = 0, so b = −36 ≡ 16. We've now completely recovered the key: it is (9, 16).
Example 2
Suppose we intercept a message sent by Cy Deavours and we guess that the last two ciphertext
letters arose from his signature: CY. If the ciphertext ended LD, then we have
L = aC + b, D = aY + b
Replacing the letters with their numerical equivalents gives
11 = 2a + b , 3 = 24a + b
Let’s take approach 3 again to solve this system of equations:
3 = 24a + b
− (11 = 2a + b )
− 8 = 22a
Because we’re working modulo 26, this result becomes 18 = 22a. Looking at our mod 26 multipli-
cation table, we see that there are two possible values for a, 2 and 15. Let’s see what each equation
gives us when we plug in a = 2. The first becomes 11 = 4 + b, so b = 7; the second becomes 3 = 48 + b, so b = −45 ≡ 7 as well. Plugging in a = 15 also yields b = 7 from both equations, giving two candidate keys, (2, 7) and (15, 7). Only a = 15 is relatively prime to 26, however, so only the key (15, 7) would have allowed the sender to decipher uniquely; it must be the one in use.
Now let’s make it more general by letting C1 and C2 represent the ciphertext values and M1 and
M2 the plaintext values:
C2 = M2a + b
− (C1 = M1a + b)
C2 − C1 = (M2 − M1)a
Thus, we'll fail to have a unique solution whenever the equation C2 − C1 = (M2 − M1)a fails to have a unique solution. If M2 − M1 has an inverse modulo 26 (something we can multiply by to get 1 modulo 26), then there will be a unique solution for a, namely a = (M2 − M1)⁻¹(C2 − C1), where (M2 − M1)⁻¹ denotes the inverse of M2 − M1.
In Example 1, M2 − M1 was 1, which is invertible mod 26. However, for Example 2, M2 − M1
was 22, which is not invertible mod 26 (there is no number in the mod 26 table that 22 can be
multiplied by to get 1).
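The analysis of both examples can be wrapped in a short function. In this sketch (names mine), the key is returned when M2 − M1 is invertible mod 26, and None when the crib is ambiguous:

```python
from math import gcd

def recover_key(m1, c1, m2, c2):
    """Given plaintext letters m1, m2 enciphered to c1, c2, solve
    C = aM + b (mod 26); return None when M2 - M1 is not invertible."""
    m1, c1, m2, c2 = (ord(x.upper()) - 65 for x in (m1, c1, m2, c2))
    if gcd(m2 - m1, 26) != 1:
        return None                            # ambiguous, as in Example 2
    a = pow(m2 - m1, -1, 26) * (c2 - c1) % 26  # a = (M2 - M1)^(-1)(C2 - C1)
    b = (c1 - a * m1) % 26
    return a, b
```

For Example 1 (DE enciphered to RA) this returns the key (9, 16); for Example 2 (CY enciphered to LD) it reports the ambiguity.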
Before we move on to the next section, here's a challenge: can you find the hidden message on the book cover reproduced in Figure 1.22? If not, try again after reading the next few pages.
Figure 1.22 Cryptic science fiction cover art. (Courtesy of DAW Books, www.dawbooks.com.)
A .-    N -.
B -...  O ---
C -.-.  P .--.
D -..   Q --.-
E .     R .-.
F ..-.  S ...
G --.   T -
H ....  U ..-
I ..    V ...-
J .---  W .--
K -.-   X -..-
L .-..  Y -.--
M --    Z --..
Although it looks like a substitution cipher, you’ll get confused looks if you refer to this system
as Morse cipher. Notice that the most common letters have the shortest representations, whereas
the rarest letters have the longest. This was done intentionally so that messages could be conveyed
more rapidly.
There are also combinations of dots and dashes representing the digits 0 through 9, but as
these can be spelled out, they are not strictly necessary. Notice that the most common letters, E
and T, have single character representations, while V is represented by four characters. V is easy to
remember as the beginning of Beethoven's Fifth Symphony. The Allies used V for victory during World War II and made use of Beethoven's Fifth Symphony for propaganda purposes.
Look closely at the back of the coin shown in Figure 1.23. Notice the series of dots and dashes
around the perimeter that spell out a message in Morse code. Start reading clockwise from the
bottom of the coin just to the left of the “N” in “CENTS” and the message WE WIN WHEN
WE WORK WILLINGLY is revealed.44 There have been much more disturbing uses of Morse
code; for example,45
[Jeremiah] Denton is best known for the 1966 North Vietnamese television interview
he was forced to give as a prisoner, in which he ingeniously used the opportunity to
communicate to American Intelligence. During the interview Denton blinked his eyes
in Morse code to spell out the word "T-O-R-T-U-R-E" to communicate that his captors were torturing him and his fellow POWs.
44 Anonymous, “DPEPE DPJO,” Cryptologia, Vol. 1, No. 3, July 1977, pp. 275–277, picture on p. 275.
45 http://en.wikipedia.org/wiki/Jeremiah_Denton.
Figure 1.23 A Canadian coin with a hidden message (Thanks to Lance Snyder for helping with
this image.).
As a historical note, in 1909 the international distress signal SOS, which in Morse code is ... --- ..., was first radioed by Jack Binns, when his ship, the S.S. Republic, collided with the S.S. Florida.
Morse code offers no secrecy, because there is no secret key; the substitutions are available to everyone. In fact, broadcast messages are so easy to intercept that they may as well be sent directly to the enemy, whereas it is much harder for an eavesdropper to get hold of messages conveyed by courier. Thus, the telegraph (and radio) made cryptology much more important: if the enemy is going to get copies of your messages, they had better be well protected. The convenience of telegraph and radio communication, combined with the usual overconfidence in whatever means of encipherment is used, makes interception a very appealing method for the enemy. The telegraph was first used for military purposes in the Crimean War (1853–1856) and then to a much greater extent in the U.S. Civil War.
It may appear that Morse code only requires two symbols, dots and dashes, but there is a third: the space. If all of the dots and dashes are run together, we cannot tell where one letter ends and the next begins.46
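A short script makes the ambiguity concrete (this example is mine, not the one from the footnoted source, and the table below is only a subset of the alphabet):

```python
# A partial Morse table, dots and dashes only
MORSE = {"E": ".", "T": "-", "I": "..", "A": ".-", "N": "-.", "M": "--",
         "S": "...", "U": "..-", "R": ".-.", "W": ".--", "D": "-..",
         "K": "-.-", "G": "--.", "O": "---"}

def readings(signal):
    """Return every way a run-together dot/dash string can be split
    into letters of the table above."""
    if not signal:
        return [""]
    out = []
    for letter, code in MORSE.items():
        if signal.startswith(code):
            out += [letter + rest for rest in readings(signal[len(code):])]
    return out
```

For instance, the run-together string ... can be read as S, EI, IE, or EEE, so the spaces between letters are doing real work.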
There is another, more modern, system for coding that prevents this problem and truly requires just two characters. It's due to David A. Huffman (Figure 1.24), who came up with the idea while working on his doctorate at MIT.47
46 http://rubyquiz.com/quiz121.html.
47 Huffman, David A., “A Method for the Construction of Minimum-Redundancy Codes,” Proceedings of the
Institute of Radio Engineers, Vol. 40, No. 9, September 1952, pp. 1098–1101.
Figure 1.24 David A. Huffman (1925–1999). (Courtesy of Don Harris, University of California,
Santa Cruz.)
Huffman codes make use of the same idea as Morse code. Instead of representing each character
by eight bits, as is standard for computers, common characters are assigned shorter representations
while rarer characters receive longer representations (see Table 1.7). The compressed data is then
stored along with a key giving the substitutions that were used. This is a simple example of an important area known as data compression. High compression rates allow information to be sent more rapidly, as well as take up less space when stored. Zip files are an example. If not zipped, the download time for the file would be longer. Huffman coding is also used to compress images such as JPEGs.
Using the code shown in Table 1.7, MATH would be expressed as 1101001000011001. You may
now try to break the 0s and 1s up any way you like, but the only letters you can get are MATH. To
see why this is the case, examine the binary graph in Figure 1.25.
To read this graph, start at the top and follow the paths marked by 0s and 1s until you get to a
letter. The path you followed is the string of bits that represents that letter in our Huffman code. In
Morse code, the letter N (-.) could be split apart to get - and . making TE, but this cannot happen
with the letters represented in the graph shown in Figure 1.25, because a particular string of bits
that leads to a letter doesn’t pass through any other letters on the way. We only labeled letters at the
ends of paths. The tree could be extended out more to the right to include the rest of the alphabet,
but enough is shown to make the basic idea clear.
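Decoding a prefix-free code like this is simple to program. The sketch below (names mine) uses a handful of entries from Table 1.7; because no codeword is a prefix of another, a letter can be emitted as soon as its codeword is complete:

```python
# A subset of the Huffman code in Table 1.7
CODE = {"E": "000", "T": "001", "A": "0100", "H": "1001", "M": "11010"}

def decode(bits, code=CODE):
    """Read bits left to right, emitting a letter whenever the buffer
    matches a complete codeword of the prefix-free code."""
    lookup = {v: k for k, v in code.items()}
    out, buf = [], ""
    for bit in bits:
        buf += bit
        if buf in lookup:
            out.append(lookup[buf])
            buf = ""
    return "".join(out)
```

Feeding it 1101001000011001 recovers MATH and nothing else, just as claimed above.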
Table 1.7
E 000 M 11010
T 001 W 11011
A 0100 F 11100
O 0101 G 111010
I 0110 Y 111011
N 0111 P 111100
S 1000 B 111101
H 1001 V 111110
R 1010 K 1111110
D 10110 J 11111110
L 10111 X 111111110
C 11000 Q 1111111110
U 11001 Z 1111111111
Figure 1.25 Binary tree for the Huffman code of Table 1.7.
The next level of Huffman coding is to replace strings of characters with bit strings. A common word may be reduced to less space than a single character normally requires, while a rarer word becomes longer after encoding. This method should only be applied to large files; it would not be efficient to encode a short letter in this manner.
Although the terms encode and decode are used in this context, data compression is not the same as coding theory. Coding theory lengthens messages in an effort to make garbled bits recoverable. It adds redundancy, while data compression seeks to remove it, as does cryptography.
The coding technique described above earned Huffman a place in this book, but he may have
done other work that is relevant. A 1955 letter written by John Nash48 to Major Grosjean of the
National Security Agency contains the following passage.49
Recently a conversation with Prof. Huffman here indicated that he has recently been
working on a machine with similar objectives. Since he will be consulting for NSA I
shall discuss my results with him.
Many of the great mathematicians and computer scientists of the second half of the twentieth century have connections to NSA. I expect that declassification efforts will reveal fascinating stories in the decades to come.
nash_letters1.pdf.
50 See the References and Further Reading at the end of this chapter.
51 Said of the French cryptanalyst Viète by King Philip II of Spain when writing the Pope in an attempt to get Viète tried in a Cardinal's Court. See Singh, Simon, The Code Book, Doubleday, New York, 1999, p. 28.
Figure 1.26 Alphabet of the Magi. (From Christian, Paul (pseudonym for Jean Baptiste Pitois), Histoire de la Magie, du Monde Surnaturel et de la Fatalité à travers les Temps et les peuples, 1870, p. 177.)
Figure 1.27 Chaucer’s cipher. (Courtesy of the David Kahn Collection, National Cryptologic
Museum, Fort Meade, Maryland.)
1.19 Nomenclators
From 1400 to 1850, nomenclators were the kings of the European nations’ cipher systems. They
are basically a combination of a code and a MASC, which may be used to spell out words not provided for in the code portion. The code portions initially just consisted of names, hence, nomenclator. Figure 1.28 shows one used by Mary, Queen of Scots.
Figure 1.28 Nomenclator used by Mary, Queen of Scots. (From Singh, Simon, The Code Book,
Doubleday, New York, 1999, p. 38. With permission.)
You can probably guess what happened when Mary's life hung in the balance, dependent on whether messages in this cipher could be read by a cryptanalyst without access to the key. Mary was wise to include nulls, but there were no homophones (different symbols representing the same letter, as in the Zodiac ciphers) and far too few words in the code portion.
Mary was crowned queen of Scotland in 1543 at only nine months of age. In 1559, she married
Francis, the dauphin of France. It was hoped that this would serve to strengthen the ties between the
two Roman Catholic nations, but he died in 1560 and in the meanwhile Scotland was becoming more
and more Protestant. Mary then married her cousin Henry Stewart, who caused so much trouble for
Scotland that it was planned for him to die by having his house blown up while he was inside. He
escaped the explosion only to die of strangulation, which certainly looked suspicious. Mary found a
third husband, James Hepburn, but he was exiled in 1567 by the now powerful Protestant population
of Scotland, and Mary was imprisoned. She escaped and with an army of her supporters attempted to
reclaim her crown, but the attempt failed, and so she fled to England and the imagined protection of her cousin Queen Elizabeth I. But Queen Elizabeth, knowing that England's Catholics considered Mary the true Queen of England, had Mary imprisoned to minimize any potential threat.
Angered by England's persecution of Catholics, Mary's former page, Anthony Babington, and others put together a plan in 1586 to free Mary and assassinate Queen Elizabeth. The conspirators decided they could not carry out their plans without Mary's blessing and managed to smuggle a message to her in prison. It was enciphered using the nomenclator pictured in Figure 1.29. However, the conspirators didn't realize that Gilbert Gifford, who helped to smuggle the letter (and earlier messages from other supporters of Mary), was a double agent. He turned the messages over to Sir Francis Walsingham, Principal Secretary to Queen Elizabeth. Thus, they were copied before being delivered to Mary, and the cryptanalyst Thomas Phelippes succeeded in breaking them.
Mary responded, supporting the conspiracy, as long as her liberation came before or simultaneously with the assassination, as she feared for her life if the assassination came first. Like the previous message, this one was read by Phelippes and relayed to Walsingham. Both Babington and Queen Mary were now marked for death, but the other conspirators remained unnamed. To snare the rest, Walsingham had Phelippes add a bit more enciphered text to Mary's response in the style of Mary's own hand (Figure 1.29).
Figure 1.29 A forged ciphertext. (Courtesy of the David Kahn Collection, National Cryptologic
Museum, Fort Meade, Maryland.)
Massachusetts, 1930.
54 Pennypacker, Morton, General Washington’s Spies, Long Island Historical Society, Brooklyn, New York, 1939,
facing p. 50.
Figure 1.30 A nomenclator used in the American Revolution. (From George Washington Papers
at the Library of Congress, 1741-1799: Series 4. General Correspondence. 1697-1799, Talmadge,
1783, Codes.)
Although Culper Junior’s identity was eventually revealed, we still don’t have a very good idea
of his appearance. Pennypacker identified a silhouette (Figure 1.31, left) as Robert Townsend, but
it is actually Townsend’s brother. The only depiction currently accepted as Robert (Figure 1.31,
right) was drawn by his nephew Peter Townsend.
During the Revolutionary War, General Washington only spent $17,617 on espionage activi-
ties, and he paid for these activities out of his own pocket! He did later bill Congress, but this sort
of budget contrasts very strongly with the situation today. See Chapter 12.
Size ≈ ((n + 1)/n) × Max − 1
where n is the number of codegroups observed and Max is the largest value among the observed codegroups.
55 Two part codes, which avoid this pitfall, are discussed in Section 4.2.
56 For a derivation see Ghahramani, Saeed, Fundamentals of Probability, Prentice-Hall, Upper Saddle River, New
Jersey, 1996, pp.146–148. This is not, however, the first place the result appeared or was proven.
For example, if an intercepted message contains 22 codegroups and the largest is 31,672, then the number of entries in the code book is roughly
((22 + 1)/22) × 31,672 − 1 ≈ 33,111
Subtracting the 1 makes no practical difference here, but it was included to be mathematically
correct. The formula above has other uses. In the probability text I referenced for the derivation, it
is used to estimate the number of enemy tanks based on how many were observed and the highest
number seen on them.
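The estimate is a one-liner to compute (function name mine):

```python
def estimated_code_size(n, max_group):
    """Estimate the number of codebook entries from n observed codegroups
    whose largest value is max_group: ((n + 1)/n) * Max - 1."""
    return (n + 1) / n * max_group - 1

est = estimated_code_size(22, 31672)   # the example above: about 33,111
```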
Once we know the size of the code, a dictionary of similar size may be consulted. Because the dictionary is not likely to be the actual key, we cannot expect to simply read off the word at the position given by the code number; however, there's a good chance that the first letter, and possibly the second, of that word may be correct. Thus, we can jot down assumptions for the first two letters of each word. This, when combined with context, may allow us to guess some phrases of plaintext.
Table 1.8 shows where words beginning with the given letters stop in a dictionary of 60,000–
65,000 words.
57 Doyle, Arthur Conan, The Valley of Fear, Doran, New York, 1915. This story was first serialized in Strand Magazine
from September 1914 to May 1915. Three months after the New York edition, a British edition appeared.
Figure 1.32 A solution to a book code. (Courtesy of the David Kahn Collection, National
Cryptologic Museum, Fort Meade, Maryland.)
1 2 3
F A M Y
I B N Z
L C O
N D P
P E Q
T F R
U G S
V H T
W I U
X J V
Y K W
Z L X
Thus, HELLO would become V1 P1 Z1 Z1 L2. The book promotes free thinking and
exploration in the spirit of Charles Fort, who is quoted at various points.
Bauer, Craig P., Discrete Encounters, CRC/Chapman & Hall, Boca Raton, Florida, 2020. If you like the
style of Secret History: The Story of Cryptology, you might also like this book. It merges history with the
presentation of discrete mathematics, nearly all of which finds applications in cryptology.
Bellovin, Steven M., Compression, Correction, Confidentiality, and Comprehension: A Modern Look
at Commercial Telegraph Codes, Department of Computer Science, Columbia University, New
York, 2009, available online at http://www.usenix.org/events/sec09/tech/slides/bellovin.pdf. This
PowerPoint® presentation on commercial code books provides many entertaining examples, such as
this excerpt from The Theatrical Cipher Code (1905):
Filibuster Chorus girls who are shapely and good looking and can sing
Hunt, Arthur S., “A Greek Cryptogram,” Proceedings of the British Academy, Vol. 15, 1929, pp. 127–134.
This easy-to-read paper presents a Greek ciphertext in which the letters were turned half over or modi-
fied in other small ways to disguise the writing. No knowledge of Greek is needed to understand this
paper, as an English translation of the ciphertext is provided. Unfortunately, an approximate date for
the ciphertext examined (The Michigan Cryptographic Papyrus) is not given.
Kahn, David, The Codebreakers, Second Edition, Scribner, New York, 1996. Kahn surveys the cryptography
of the entire ancient world in Chapter 2 of his classic history.
The following three references deal with controversial decipherments. I’m simply providing the titles; it’s up
to you to decide whether or not the claims are correct.
Landsverk, Ole G., Ancient Norse Messages on American Stones, Norseman Press, Glendale, California, 1969.
Landsverk, Ole G., “Cryptography in Runic Inscriptions,” Cryptologia, Vol. 8, No. 4, October 1984, pp.
302–319.
Mongé, Alf and Ole G. Landsverk, Norse Medieval Cryptography in Runic Carvings, Norseman Press,
Glendale, California, 1967.
Reeds, Jim, Commercial Code Book Database, Mar 23, 2001, archived at https://web.archive.org/
web/20140130084013/http://www.dtc.umn.edu:80/∼reedsj/codebooks.txt. This online source provides bibliographic details of 1,745 commercial code books. Here's your checklist, collectors!
On Poe
Brigham, Clarence S., “Edgar Allen Poe’s Contributions to Alexander’s Weekly Messenger,” Proceedings of the
American Antiquarian Society, Vol. 52, No. 1, April 1942, pp. 45–125. This paper marks the rediscovery of Poe's columns on cryptography.
Friedman, William F., “Edgar Allan Poe, Cryptographer,” American Literature, Vol. 8, No. 3, November
1936, pp. 266–280.
Friedman, William F., “Edgar Allan Poe, Cryptographer,” Signal Corps Bulletin, No. 97, July–September
1937, pp. 41–53; Friedman, William F., “Edgar Allan Poe, Cryptographer (Addendum),” Signal Corps
Bulletin, No. 98, October–December 1937, pp. 54–75. These items were reprinted in Friedman,
William F., editor, Cryptography and Cryptanalysis Articles, Vol. 2, Aegean Park Press, Laguna Hills,
California, 1976.
Pirie, David, The Patient’s Eyes: The Dark Beginnings of Sherlock Holmes, St. Martin’s Minotaur, New York,
2002. This is a novel that involves some cryptanalysis. Interestingly, the author reproduced Poe's frequency ordering and introduced another error by omitting the letter x.
Silverman, Kenneth, Edgar A. Poe: Mournful and Never-ending Remembrance, HarperCollins, New York,
1991.
Wimsatt, Jr., William K., “What Poe Knew about Cryptography,” Publications of the Modern Language
Association, Vol. 58, No. 3, September 1943, pp. 754–779.
Not of Interest
Rosenheim, Shawn James, The Cryptographic Imagination: Secret Writing from Edgar Poe to the Internet,
Parallax: Re-Visions of Culture and Society, The Johns Hopkins University Press, Baltimore,
Maryland, 1997.
For details, see the following review by a professor of computer science at the University of Waterloo:
Shallit, Jeffrey, “Book review of Menezes, van Oorschot, and Vanstone, Handbook of Applied Cryptography,
and Rosenheim, The Cryptographic Imagination: Secret Writings from Edgar Poe to the Internet,”
American Mathematical Monthly, Vol. 106, No. 1, January 1999, pp. 85–88.
Schenk, Remsen Ten Eyck, “Holmes, Cryptanalysis and the Dancing Men,” The Baker Street Journal, (New
Series), Vol. 5, No. 2, April 1955, pp. 80–91.
Schorin, Howard R., “Cryptography in the Canon,” The Baker Street Journal, (New Series), Vol. 13,
December 1963, pp. 214–216.
Shulman, David, “Sherlock Holmes: Cryptanalyst,” The Baker Street Journal, (Old Series), Vol. 3, 1948, pp.
233–237.
Shulman, David, “The Origin of the Dancing Men,” The Baker Street Journal, (New Series), Vol. 23, No. 1,
March 1973, pp. 19–21.
Trappe, Wade and Lawrence C. Washington, Introduction to Cryptography with Coding Theory, Prentice
Hall, Upper Saddle River, New Jersey, 2002, pp. 26–29. These authors summarize the story (a sort of
Cliff’s Notes edition) and discuss the typos in various editions.
For more examples of codes and ciphers used in fiction, John Dooley is the person to turn to:
Dooley, John F., “Codes and Ciphers in Fiction: An Overview,” Cryptologia, Vol. 29, No. 4, October 2005,
pp. 290–328. For an updated list, go to https://www.johnfdooley.com/ and follow the link “Crypto
Fiction.” As of October 7, 2020, this list contains 420 examples.
On RongoRongo script
If you’d like to learn more, these Rongorongo references, in turn, reference many more books and papers.
Fischer, Steven Roger, Glyphbreaker, Copernicus, New York, 1997.
Melka, Tomi S., "Structural Observations Regarding the RongoRongo Tablet 'Keiti'," Cryptologia, Vol. 32, No. 2, January 2008, pp. 155–179.
Melka, Tomi S., “Some Considerations about the Kohau Rongorongo Script in the Light of Statistical
Analysis of the ‘Santiago Staff’,” Cryptologia, Vol. 33, No. 1, January 2009, pp. 24–73.
Melka, Tomi S., and Robert M. Schoch, “Exploring a Mysterious Tablet from Easter Island: The Issues of
Authenticity and Falsifiability in Rongorongo Studies,” Cryptologia, Vol. 44, No. 6, November 2020,
pp. 482–544.
Wieczorek, Rafal, “Putative Duplication Glyph in the Rongorongo Script,” Cryptologia, Vol. 41, No. 1,
January 2017, pp. 55–72.
On Arabic Cryptology
Al-Kadi, Ibraham A., “Origins of Cryptology: The Arab Contributions,” Cryptologia, Vol. 16, No. 2, April
1992, pp. 97–126.
Mrayati, M., Y. Meer Alam and M.H. at-Tayyan, series editors, Series on Arabic Origins of Cryptology,
Volume 1: al-Kindi’s Treatise on Cryptanalysis, KFCRIS (King Faisal Center for Research and Islamic
Studies) & KACST (King Abdulaziz City for Science and Technology), Riyadh, 2003.
Mrayati, M., Y. Meer Alam and M.H. at-Tayyan, series editors, Series on Arabic Origins of Cryptology, Volume
2: Ibn ‘Adlān’s Treatise al-mu’allaf lil-malik al-’Ašraf, KFCRIS (King Faisal Center for Research and
Islamic Studies) & KACST (King Abdulaziz City for Science and Technology), Riyadh, 2003.
Mrayati, M., Y. Meer Alam and M.H. at-Tayyan, series editors, Series on Arabic Origins of Cryptology,
Volume 3: Ibn ad-Durayhim’s Treatise on Cryptanalysis, KFCRIS (King Faisal Center for Research
and Islamic Studies) & KACST (King Abdulaziz City for Science and Technology), Riyadh, 2004.
Mrayati, M., Y. Meer Alam and M.H. at-Tayyan, series editors, Series on Arabic Origins of Cryptology,
Volume 4: Ibn Dunaynīr’s Book: Expositive Chapters on Cryptanalysis, KFCRIS (King Faisal Center
for Research and Islamic Studies) & KACST (King Abdulaziz City for Science and Technology),
Riyadh, 2005.
Mrayati, M., Y. Meer Alam and M.H. at-Tayyan, series editors, Series on Arabic Origins of Cryptology,
Volume 5: Three Treatises on Cryptanalysis of Poetry, KFCRIS (King Faisal Center for Research and
Islamic Studies) & KACST (King Abdulaziz City for Science and Technology), Riyadh, 2006.
Monoalphabetic Substitution Ciphers, or MASCs: Disguises for Messages ◾ 55
Schwartz, Kathryn A., “Charting Arabic Cryptology’s Evolution,” Cryptologia, Vol. 33, No. 4, October
2009, pp. 297–305.
On Cryptanalysis
Barker, Wayne G., Cryptanalysis of the Simple Substitution Cipher with Word Divisions Using Non-Pattern
Word Lists, Aegean Park Press, Laguna Hills, California, 1975. This work also discusses techniques for
distinguishing vowels from consonants.
Edwards, D. J., OCAS – On-line Cryptanalysis Aid System, MIT Project MAC, TR-27, May 1966. Bruce Schatz
wrote, “reported on a SNOBOL-like programming language specially designed for cryptanalysis.”
Gaines, Helen F., Cryptanalysis: A Study of Ciphers and Their Solutions, corrected from prior printings and
augmented with solutions, Dover, New York, 1956, pp. 74ff, 88–92; the first printing appeared in
1939 and was titled Elementary Cryptanalysis.
Girsdansky, M. B., “Cryptology, the Computer, and Data Privacy,” Computers and Automation, Vol. 21, No.
4, April 1972, pp. 12–19. This is a survey of automated cryptanalysis, so you can see that this has been
investigated publicly for quite some time.
Guy, Jacques B. M., “Vowel Identification: An Old (But Good) Algorithm,” Cryptologia, Vol. 15, No. 3, July
1991, pp. 258–262.
Mellen, Greg E., “Cryptology, Computers, and Common Sense,” in AFIPS ‘73: Proceedings of National
Computer Conference and Exposition, Vol. 42, June 4–8, 1973, AFIPS Press, Montvale, New Jersey,
pp. 569–579. This is a survey of automated cryptanalysis. AFIPS stands for American Federation of
Information Processing Societies.
Moler, Cleve and Donald Morrison, “Singular Value Analysis of Cryptograms,” American Mathematical
Monthly, Vol. 90, No. 2, February 1983, pp. 78–87. This paper uses the singular value decomposition
for vowel recognition.
Olson, Edwin, “Robust Dictionary Attack of Short Simple Substitution Ciphers,” Cryptologia, Vol. 31, No.
4, October 2007, pp. 332–342.
Schatz, Bruce R., “Automated Analysis of Cryptograms,” Cryptologia, Vol. 1, No. 2, April 1977, pp. 116–142.
Silver, R., “Decryptor,” in MIT Lincoln Laboratory Quarterly Progress Report, Division 5 (Information
Processing), December 1959, pp. 57–60.
Sutton, William G., “Modified Sukhotin: A Manual Method,” from Computer Column in The Cryptogram,
Vol. 58, No. 5, September–October 1992, pp. 12–14.
Vobbilisetty, Rohit, Fabio Di Troia, Richard M. Low, Corrado Aaron Visaggio, and Mark Stamp, “Classic
Cryptanalysis Using Hidden Markov Models,” Cryptologia, Vol. 41, No. 1, January 2017, pp. 1–28.
58 Raja is the American Cryptogram Association (ACA) nom de plume for the author and his wife Josephine.
Goddard, Eldridge and Thelma Eldridge, Cryptodyct, Wagners, Davenport, Iowa, 1976. Upon seeing the
title, I thought it must have been written in some foreign language I wasn’t familiar with; however, it
is in English and is simply a pattern word dictionary of 272 pages for words of length 14 and shorter.
This effort took two years and appears to have been privately printed.
Goddard, Eldridge and Thelma Eldridge, Cryptokyt I, Non-Pattern Nine Letter Word List, Wagners,
Davenport, Iowa, 1977, 28 pages.
Goddard, Eldridge and Thelma Eldridge, Cryptokyt II, Non-Pattern Five Letter Word List, Wagners, Davenport, Iowa,
1977, 64 pages.
Hempfner, Philip and Tania Hempfner, Pattern Word List For Divided and Undivided Cryptograms, self-
published, 1984. An electronic dictionary was used to generate the 30,000+ entries in this 100-page
book, which goes up to the 21-letter word electroencephalograph.
Levine, Jack, A List of Pattern Words of Lengths Two Through Nine, self-published, 1971, 384 pages.
Levine, Jack, A List of Pattern Words of Lengths Ten Through Twelve, self-published, 1972, 360 pages.
Levine, Jack, A List of Pattern Words of Lengths Thirteen to Sixteen, self-published, 1973, 270 pages.
Lynch, Frederick D., Colonel, USAF, Ret., Pattern-Word List, Volume 1, Containing Words up to 10 Letters
in Length, Aegean Park Press, Laguna Hills, California, 1977, 152 pages. “much of Colonel Lynch’s
work remains classified,” but this work was “compiled manually by the author over a period of many
years” using open source material (Webster’s New International Dictionary, third edition). Words where
only a single letter repeats itself once were not included.
On Zodiac
Bauer, Craig P., Unsolved! The History and Mystery of the World’s Greatest Ciphers from Ancient Egypt to
Online Secret Societies, Princeton University Press, Princeton, New Jersey, 2017. See Chapter 4.
Crowley, Kieran, Sleep My Little Dead: The True Story of the Zodiac Killer, St. Martin’s Paperbacks, New
York, 1997. This is about a copycat killer in New York, not the original Zodiac. The author signed my
copy “To Craig – What’s Your Sign?” It gave me a chill – thanks!
Graysmith, Robert, ZODIAC, St. Martin’s/Marek, New York, 1986. This is the best book on Zodiac. It’s
creepy and in 2007 was made into a movie of the same title that is also creepy.59 It is not to be confused
with an extremely low budget film titled Zodiac Killer,60 which appeared in 2005, a year that also saw
the release of The Zodiac.61
Graysmith, Robert, Zodiac Unmasked: The Identity of America’s Most Elusive Serial Killer Revealed, Berkley
Books, New York, 2002.
Hunt for the Zodiac Killer, The. This is a 5-part television series that premiered on History in 2017. I recommend reading chapter 4 of the first reference in this section before viewing it.
Oranchak, David, “Let’s Crack Zodiac - Episode 5 - The 340 Is Solved!” December 11, 2020, https://www.
youtube.com/watch?v=-1oQLPRE21o&feature=youtu.be. This video was posted when the present
book was at the proof stage.
On MASCs
Bamford, James, Body of Secrets, Doubleday, New York, 2001. Although this is a book about the National
Security Agency, each chapter begins with cryptograms—can you find the correct decipherments?
Fronczak, Maria, “Atbah-Type Ciphers in the Christian Orient and Numerical Rules in the Construction of
Christian Substitution Ciphers,” Cryptologia, Vol. 37, No. 4, October 2013, pp. 338–344.
Huffman, David A., “A Method for the Construction of Minimum-Redundancy Codes,” Proceedings of the
Institute of Radio Engineers, Vol. 40, No. 9, September 1952, pp. 1098–1101.
Kruh, Louis, “The Churchyard Ciphers,” Cryptologia, Vol. 1, No. 4, October 1977, pp. 372–375.
59 http://www.imdb.com/title/tt0443706/.
60 http://www.imdb.com/title/tt0469999/.
61 http://www.imdb.com/title/tt0371739/.
Reeds, Jim, “Solved: The Ciphers in Book III of Trithemius’s Steganographia,” Cryptologia, Vol. 22, No. 4,
October 1998, pp. 291–317. This paper concerns an old and hidden cipher of Trithemius, only recognized and deciphered hundreds of years after his death. Trithemius’s life and his cryptologic work are
discussed in Section 2.2, but the paper referenced here can be appreciated now.
Hint for the pattern in the cipher alphabet from the Kaczynski example: Look at a keyboard.
Chapter 2
Simple Progression to an
Unbreakable Cipher
This chapter describes a simple cipher system and proceeds to patch it against attacks until the
final result of a theoretically unbreakable cipher is achieved.
Figure 2.1 A ciphertext that Poe could not solve. (From Winkel, Brian J., Cryptologia, Vol. 1,
No. 1, January 1977, p. 95; the ciphertext originally appeared in Poe, Edgar Allan, Alexander’s
Weekly Messenger, February 26, 1840.)
1 Poe demonstrated that the cipher was nonsense in his article, “More of the Puzzles,” which appeared in
Alexander’s Weekly Messenger, Vol. 4, No. 9, February 26, 1840, p. 2, column 4, and gave the quote provided
over a year later (reflecting back) in his article “A Few Words on Secret Writing,” which appeared in the July
1841 issue of Graham’s Magazine, Vol. 19, No. 1, pp. 33–38. Ironically, Poe’s “A Few Words on Secret Writing”
was the longest of his essays dealing with cryptology.
Jumping ahead to the 1970s, Mark Lyster, an undergraduate in Brian Winkel’s cryptology
class at Albion College, became curious and attempted a solution. Together, the professor and
his student solved it. Brian then challenged Cryptologia’s readers to attempt their own solutions in the paper referenced with Figure 2.1. In the August 1977 Scientific American, Martin
Gardner challenged his readers to solve it. You may consider yourself so challenged after
reading the material on cryptanalysis that follows in this chapter! A solution was presented
in Brian Winkel’s article, “Poe Challenge Cipher Solutions,” in the October 1977 issue of
Cryptologia (pp. 318–325). Look at this paper only after making a serious attempt to solve it
yourself!
The system behind the Poe cipher was long known as Le Chiffre Indéchiffrable (“The
Unbreakable Cipher”). Today, it is simply referred to as the Vigenère cipher—we have to make a
few improvements before it becomes truly unbreakable!
The main weaknesses of monoalphabetic ciphers are the preservation of letter frequencies (only
the symbols representing the letters change) and word patterns, as detailed in the previous chapter.
So, a necessary condition for a cipher to be secure is that it be invulnerable to these attacks. The
Vigenère cipher accomplishes this by using a variety of substitutions for each letter in the plaintext
alphabet. The frequencies of the letters in the ciphertext are thus flattened. Pattern words are also
disguised. This is an example of a polyalphabetic substitution cipher. Really, it shouldn’t be named
after Vigenère (Figure 2.2), but we’ll let the history wait for a moment while we take a look at
an example of this cipher using the keyword ELVIS, which can be seen running down the first
column in the substitution table below.
A B C D E F G H I J K L M N O P Q R S T U V W X Y Z plaintext
E F G H I J K L M N O P Q R S T U V W X Y Z A B C D alphabet 1
L M N O P Q R S T U V W X Y Z A B C D E F G H I J K alphabet 2
V W X Y Z A B C D E F G H I J K L M N O P Q R S T U alphabet 3
I J K L M N O P Q R S T U V W X Y Z A B C D E F G H alphabet 4
S T U V W X Y Z A B C D E F G H I J K L M N O P Q R alphabet 5
Alphabet 1 is used to encipher the first letter in the message, alphabet 2 is used for enciphering
the second letter, and so on. When we get to the sixth letter, we return to alphabet 1. A sample
encipherment follows.
The words THANK YOU are enciphered in two different ways, depending upon the position
relative to the key alphabets. Also, we have doubled letters in the ciphertext, VV and ZZ, where
there are no doubled letters in the plaintext. When this system first appeared, there were no
cryptanalytic techniques in existence that were any better than simply guessing at the key. In
general, longer keys are better. If the key is only a single character, this system reduces to the
Caesar cipher.
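The mechanics just described are easy to sketch in code. The short Python function below (an illustration; the function name and style are not from the original text) converts each key letter to a shift and applies the shifts cyclically:

```python
def vigenere_encipher(plaintext, key):
    """Encipher an uppercase, letters-only message with a Vigenere keyword.

    Each key letter names a shift alphabet (A = shift 0, B = shift 1, ...),
    so the keyword ELVIS applies the shifts 4, 11, 21, 8, 18 in rotation,
    returning to the first alphabet at the sixth letter.
    """
    shifts = [ord(k) - ord('A') for k in key]
    return ''.join(
        chr((ord(p) - ord('A') + shifts[i % len(shifts)]) % 26 + ord('A'))
        for i, p in enumerate(plaintext)
    )
```

For example, vigenere_encipher("THANKYOU", "ELVIS") gives XSVVCCZP; notice the doubled VV and CC in the ciphertext even though the plaintext has no doubled letters. With a one-letter key the function reduces to a Caesar cipher, as noted above.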
Figure 2.4 Johannes Trithemius, cryptology’s first printed author. (Courtesy of the National
Cryptologic Museum, Fort Meade, Maryland.)
The next step was taken by Johannes Trithemius (1462–1516) (Figure 2.4), the author of the
first printed book on cryptology, Polygraphiae (Figure 2.5). It was written in 1508 and first printed
in 1518, after his death. This was actually his second book dealing with cryptology, but the first,
Steganographia, did not reach printed form until 1606. Steganographia had long circulated in manuscript form and had even attracted the attention of the Roman Catholic Church, which placed
it on the Index of Prohibited Books. It is now available online at http://www.esotericarchives.
com/esoteric.htm#tritem. Most of the cryptographers of Trithemius’s era were also alchemists
and magicians of a sort. In fact, Trithemius knew the real Dr. Faustus (whom he considered a
charlatan) and is said to have been a mentor of Paracelsus and Cornelius Agrippa.2 According
to legend, Trithemius himself was said to have raised the wife of Emperor Maximilian I from
the dead.3
2 Kahn, David, The Codebreakers, second edition, Scribner, New York, 1996, p. 131.
3 Goodrick-Clarke, Nicholas, The Western Esoteric Traditions: A Historical Introduction, Oxford University Press,
New York, 2008, p. 52.
Figure 2.5 Title page of Polygraphiae by Trithemius. (Courtesy of the National Cryptologic
Museum, Fort Meade, Maryland.)
Polygraphiae contained the first “square table” or “tableau.” This is pictured below and simply
represents all possible shift ciphers. The first row is the plaintext.
A B C D E F G H I J K L M N O P Q R S T U V W X Y Z
B C D E F G H I J K L M N O P Q R S T U V W X Y Z A
C D E F G H I J K L M N O P Q R S T U V W X Y Z A B
D E F G H I J K L M N O P Q R S T U V W X Y Z A B C
E F G H I J K L M N O P Q R S T U V W X Y Z A B C D
F G H I J K L M N O P Q R S T U V W X Y Z A B C D E
G H I J K L M N O P Q R S T U V W X Y Z A B C D E F
H I J K L M N O P Q R S T U V W X Y Z A B C D E F G
I J K L M N O P Q R S T U V W X Y Z A B C D E F G H
J K L M N O P Q R S T U V W X Y Z A B C D E F G H I
K L M N O P Q R S T U V W X Y Z A B C D E F G H I J
L M N O P Q R S T U V W X Y Z A B C D E F G H I J K
M N O P Q R S T U V W X Y Z A B C D E F G H I J K L
N O P Q R S T U V W X Y Z A B C D E F G H I J K L M
O P Q R S T U V W X Y Z A B C D E F G H I J K L M N
P Q R S T U V W X Y Z A B C D E F G H I J K L M N O
Q R S T U V W X Y Z A B C D E F G H I J K L M N O P
R S T U V W X Y Z A B C D E F G H I J K L M N O P Q
S T U V W X Y Z A B C D E F G H I J K L M N O P Q R
T U V W X Y Z A B C D E F G H I J K L M N O P Q R S
U V W X Y Z A B C D E F G H I J K L M N O P Q R S T
V W X Y Z A B C D E F G H I J K L M N O P Q R S T U
W X Y Z A B C D E F G H I J K L M N O P Q R S T U V
X Y Z A B C D E F G H I J K L M N O P Q R S T U V W
Y Z A B C D E F G H I J K L M N O P Q R S T U V W X
Z A B C D E F G H I J K L M N O P Q R S T U V W X Y
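The tableau is nothing more than the 26 cyclic shifts of the alphabet, so it can be reproduced in a couple of lines of Python (a sketch, not part of the original text):

```python
import string

def square_table():
    """Return the 26 rows of the Trithemius tableau.

    Row i is the alphabet shifted cyclically left by i positions; row 0 is
    the plaintext alphabet, and each later row is one of the 26 possible
    shift (Caesar) cipher alphabets.
    """
    a = string.ascii_uppercase
    return [a[i:] + a[:i] for i in range(26)]
```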
Trithemius used the alphabets in order, enciphering 24 letters of plaintext with each (his Latin
alphabet had 24 letters, which seems to be why he chose this number). He also enciphered by
changing the alphabet after each letter, but he always used the alphabets in order. As with Alberti,
the idea of using a keyword was not realized. It was finally hit upon by Giovan Battista Bellaso in
1553 in his work La cifra del Sig. Giovan.
Now that all of the ideas were finally present, Giovanni Battista Porta4 (1535–1615)
(Figure 2.6) combined them. He used Bellaso’s keyword to determine which alphabets to use, but
he also mixed the letters within the cipher alphabets, as Alberti had with his cipher disk. It should
be noted that the mixed alphabets represent a greater level of security than provided by the straight
alphabets of the Vigenère cipher. Porta’s work was published as De Furtivis Literarum Notis in
1563. This work also included the first digraphic cipher, a topic we shall return to in Section 4.4.
Blaise de Vigenère (1523–1596) published his work in Traicté des Chiffres in 1586, by which
time the cipher described above already existed. Vigenère was careful to give credit to those who
had earned it, yet somehow his name became attached to a system that wasn’t his, and his real
contribution, the autokey, was ignored.5 We shall also ignore it—for now. As a further example
4 Also of interest is that Porta founded an “Academy of Secrets” in Naples. For more information see Zielinski,
Siegfried, “Magic and Experiment: Giovan Battista Della Porta,” in Zielinski, Siegfried, editor, Deep Time
of the Media: Toward an Archaeology of Hearing and Seeing by Technical Means, MIT Press, Cambridge,
Massachusetts, pp. 57–100, available online at https://gebseng.com/media_archeology/reading_materials/
Zielinsky-deep_time_of_the_media.pdf.
5 Yet, even this contribution should really be credited to a previous discoverer, Giovan Battista Bellaso, who
described it in 1564. See LABRONICUS [ACA pen-name of Augusto Buonafalce], “Historical Tidbits,” The
Cryptogram, Vol. 58, No. 3, May–June 1992, p. 9.
of the involvement of early cryptographers in alchemy and magic, let it be known that Traicté des
Chiffres contains a recipe for making gold.
The Vigenère cipher was one of the best at the time, especially when using mixed alphabets;
nevertheless, there are still various cases of its being cracked. One amusing anecdote involves such
a cipher being broken by Casanova (Figure 2.7), who then used his accomplishment as a means to
a seduction.6 He wrote:
Five or six weeks later, she asked me if I had deciphered the manuscript which had the
transmutation procedure. I told her that I had.
“Without the key, sir, excuse me if I believe the thing impossible.”
“Do you wish me to name your key, madame?”
“If you please.”
I then told her the word, which belonged to no language, and I saw her surprise.
She told me that it was impossible, for she believed herself the only possessor of that
word which she kept in her memory and which she had never written down.
I could have told her the truth – that the same calculation which had served me
for deciphering the manuscript had enabled me to learn the word – but on a caprice
it struck me to tell her that a genie had revealed it to me. This false disclosure fettered
Madame d’Urfé to me. That day I became the master of her soul, and I abused my
power. Every time I think of it, I am distressed and ashamed, and I do penance now in
the obligation under which I place myself of telling the truth in writing my memoirs.
[I took my leave] bearing with me her soul, her heart, her wits and all the good sense
that she had left.
7 Friedrich W. Kasiski (1805–1881) was a retired Prussian infantry major, who published his method in Die
Geheimschriften und die Dechiffrir-kunst in 1863.
8 For a bit more on this see Singh, Simon, The Code Book, Doubleday, New York, 1999, p. 78. For a tremendous
amount more see Franksen, Ole Immanuel, Mr. Babbage’s Secret, The Tale of a Cypher and APL, Prentice-Hall,
Englewood Cliffs, New Jersey, 1984.
Figure 2.8 William Friedman (1891–1969). (Courtesy of National Cryptologic Museum, Fort
Meade, Maryland.)
A wonderful attack, published in 1920 by William Friedman (Figure 2.8),9 arises from a calculation called the index of coincidence (IC). Simply stated, this is the probability that two randomly
chosen letters from a text of length N will be the same.
For both to be A, we take
P(first letter is A) ⋅ P(second letter is A) = (F_A/N) ⋅ ((F_A − 1)/(N − 1)),
where F_A denotes the frequency of A.
Because both letters could have been B, or both letters could have been C, etc., we must sum
these probabilities over each letter in the alphabet, which then gives
IC = Σ_{i=A}^{Z} F_i(F_i − 1) / ( N(N − 1) ).
The use of multiple substitution alphabets in a cipher flattens the frequency distribution for
the letters and therefore decreases the chance of two randomly selected ciphertext letters being the
9 Friedman, William F., The Index of Coincidence and Its Applications in Cryptography, Publication No. 22,
Riverbank Laboratories, Geneva, Illinois, 1920.
same, as compared to the chance for letters in the original plaintext message. Thus, the value of the
index of coincidence can be said to measure the flatness of the frequency distribution or, in other
words, estimate the number of alphabets in use.
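The index of coincidence is easy to compute directly from the formula above. A minimal Python sketch (the function name is mine, not from the text):

```python
from collections import Counter

def index_of_coincidence(text):
    """Probability that two randomly chosen letters of the text match.

    Implements IC = sum over i of F_i * (F_i - 1), divided by N * (N - 1),
    where F_i is the frequency of the i-th letter and N is the total
    number of letters; non-letter characters are ignored.
    """
    freqs = Counter(c for c in text.upper() if c.isalpha())
    n = sum(freqs.values())
    if n < 2:
        return 0.0
    return sum(f * (f - 1) for f in freqs.values()) / (n * (n - 1))
```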
Due to the variation of letter frequencies in normal plaintext, we will not get exactly the same
value from the IC every time a keyword of a given length is used; however, the expected value may
be calculated for each size keyword. It is provided in the following table. The values depend in part
on the length of the text. Separate tables can be constructed for various message lengths. The table
below gives values for long messages.
Number of Alphabets    Expected IC
1                      0.0660
2                      0.0520
3                      0.0473
4                      0.0449
5                      0.0435
6                      0.0426
7                      0.0419
8                      0.0414
9                      0.0410
10                     0.0407
As N gets large, we approach a limiting value of approximately 0.0388. This is the value we’d
get for purely random text—that is, text where all letters are equally frequent and thus share a
probability of 1/26.
Notice that the difference between expected values is largest when the number of alphabets
used is small. We can easily distinguish between one alphabet (a monoalphabetic substitution
cipher) and two alphabets, but distinguishing between nine and ten alphabets is difficult.
Suppose the index of coincidence is 0.04085, indicating that nine or ten alphabets have been
used. We can investigate further by assuming that the correct keylength is 9 and splitting the
ciphertext into nine groups of letters, each of which would have been enciphered by the same
alphabet, if our assumption is correct. The first group would contain the letters in positions 1, 10,
19, 28,…, because a key of length 9 forces us to start over with the first alphabet at position 10,
and again at positions 19, 28, etc. The second group would contain all letters enciphered with the
second alphabet, positions 2, 11, 20, 29, etc. Now applying the IC to group one should indicate
(by resulting in a value close to 0.066) if those letters truly did arise from encipherment with the
same alphabet. If the value of the IC is closer to 0.038, we lose confidence in a keylength of 9.
But the first group is not the only one we should consider. The IC value for this group could be a
misleading fluke! Testing all nine groups of letters separately gives a much firmer statistical base
upon which to decide if a keylength of 9 is correct. If the nine IC values, considered as a group, are
discouraging, we can start over and assume the keylength is 10. Splitting the ciphertext letters into
ten groups and computing the IC for each will show if this assumption is better or not. The smaller
groups of ciphertext letters, for which these computations are done, are referred to as decimated
alphabets, even if there aren’t exactly ten of them.
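Forming the decimated alphabets for an assumed keylength is just a slicing exercise. In Python (an illustrative sketch), group i collects every letter that would have been enciphered with alphabet i:

```python
def decimated_alphabets(ciphertext, keylength):
    """Split the ciphertext into keylength groups by key position.

    Group i holds the letters at positions i + 1, i + 1 + keylength,
    i + 1 + 2*keylength, ... (1-indexed, as in the text), i.e., all the
    letters enciphered with the same alphabet if the guessed keylength
    is correct. Computing the IC of each group should then give values
    near 0.066 rather than 0.038.
    """
    return [ciphertext[i::keylength] for i in range(keylength)]
```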
A few examples will indicate how reliable the Kasiski and IC tests are. The first is presented
below and others are left as exercises. Caution: Some books concoct examples where such tests
work perfectly, creating the false impression that this is always the case.
The IC equation can be turned around to give the length (L) of the key, when the number of
characters (N) in the ciphertext and the ciphertext index of coincidence (IC) are known:
L ≈ 0.028N / ( (IC)(N − 1) − 0.038N + 0.066 )
Although you needn’t understand the derivation of the formula above in order to use it, it’s
easy to demonstrate. Suppose we have a ciphertext consisting of N letters and that the enciphering
key has length L. If we randomly pick two letters, what is the probability that they are the same
(As ciphertext—the plaintext letters they represent needn’t match)? The probability that the two
letters match is much higher if they were both enciphered with the same alphabet, so we consider
two separate cases and combine them for our final answer.
Case 1: The Two Letters Arose from the Same Cipher Alphabet
It doesn’t matter which letter we pick first, but the second letter has to come from the same alphabet, so it must be one of the remaining (N/L) − 1 letters of this type (the L alphabets will roughly divide the N letters of the text into groups of size N/L, each enciphered differently). Thus, there are (N/L) − 1 choices left out of the total remaining N − 1 letters. So the probability is
( (N/L) − 1 ) / ( N − 1 ).
But we also want the two letters to be the same. Because we are already within the same alphabet,
this probability is simply 0.066, the value of the IC for one alphabet. We multiply these two values
together to get
( ( (N/L) − 1 ) / ( N − 1 ) ) ⋅ (0.066).
Case 2: The Two Letters Arose from Different Cipher Alphabets
The probability that the second letter comes from a different alphabet than the first is simply the complement of the probability found above:
1 − ( (N/L) − 1 ) / ( N − 1 ).
Now that we have two letters from different alphabets, we need to multiply by the probability
that they match. This is simply the IC value for random text (or, equivalently, a large number of
alphabets), namely 0.038. So, our probability for case 2 is
( 1 − ( (N/L) − 1 ) / ( N − 1 ) ) ⋅ (0.038).
Combining the two cases, we have
IC ≈ ( ( (N/L) − 1 ) / ( N − 1 ) ) ⋅ (0.066) + ( 1 − ( (N/L) − 1 ) / ( N − 1 ) ) ⋅ (0.038).
It’s now just a matter of doing the algebra to solve for L (see Exercise 23) and obtain the result.
L ≈ 0.028N / ( (IC)(N − 1) − 0.038N + 0.066 )
Using this equation, you don’t need a table of values like the one given above; however, this version of the equation is only intended for ciphers having English plaintexts. For other languages,
the constants may vary.
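As a quick check of the rearranged formula, here it is as a one-line Python function (with the English-language constants from the text):

```python
def estimate_keylength(ic, n):
    """Friedman's estimate of the key length L from the ciphertext IC.

    Implements L ~ 0.028*N / (IC*(N - 1) - 0.038*N + 0.066). Note that
    0.028 = 0.066 - 0.038, the gap between the one-alphabet and
    random-text coincidence values for English.
    """
    return 0.028 * n / (ic * (n - 1) - 0.038 * n + 0.066)
```

For a 1,000-letter ciphertext with IC = 0.066 the estimate is 1 (a monoalphabetic cipher), as expected.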
The Kasiski test and the index of coincidence may sound complicated at first, but they are very
easy to use. Take a look at the following example to see how simple they make the task of Vigenère
cipher cryptanalysis.
Example
IZPHY XLZZP SCULA TLNQV FEDEP QYOEB SMMOA AVTSZ VQATL LTZSZ
AKXHO OIZPS MBLLV PZCNE EDBTQ DLMFZ ZFTVZ LHLVP MBUMA VMMXG
FHFEP QFFVX OQTUR SRGDP IFMBU EIGMR AFVOE CBTQF VYOCM FTSCH
ROOAP GVGTS QYRCI MHQZA YHYXG LZPQB FYEOM ZFCKB LWBTQ UIHUY
LRDCD PHPVO QVVPA DBMWS ELOSM PDCMX OFBFT SDTNL VPTSG EANMP
MHKAE PIEFC WMHPO MDRVG OQMPQ BTAEC CNUAJ TNOIR XODBN RAIAF
UPHTK TFIIG EOMHQ FPPAJ BAWSV ITSMI MMFYT SMFDS VHFWQ RQ
Several character groups repeat. We need to note their positions to make use of the Kasiski test.
Character Grouping    Starting Positions    Difference Between Starting Positions
IZP 1 and 57 56
HYX 4 and 172 168
EPQ 24 and 104 80
MBU 91 and 123 32
TSM 327 and 335 8
Consider the last column of the table above. All of the values are multiples of 8. This suggests that the
key is of length 8. It’s possible that the keylength is 4 (or 2), but if this were the case, it’s likely that
one of the numbers in the difference column would be a multiple of 4 (or 2) but not a multiple of 8.
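Hunting for repeated groups by eye is tedious, but the search is easy to automate. A Python sketch (the function names are mine):

```python
from collections import defaultdict
from functools import reduce
from math import gcd

def kasiski_differences(ciphertext, length=3):
    """Differences between starting positions of repeated groups.

    Records the 1-indexed starting position of every group of the given
    length, then, for each group seen more than once, returns the
    differences between its later occurrences and its first one.
    """
    positions = defaultdict(list)
    for i in range(len(ciphertext) - length + 1):
        positions[ciphertext[i:i + length]].append(i + 1)  # 1-indexed
    diffs = []
    for pos in positions.values():
        if len(pos) > 1:
            diffs.extend(p - pos[0] for p in pos[1:])
    return diffs

def suggested_keylength(diffs):
    """The gcd of all the differences is the natural keylength guess."""
    return reduce(gcd, diffs)
```

Feeding in the differences from the table above, suggested_keylength([56, 168, 80, 32, 8]) returns 8, in agreement with the reasoning in the text.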
Calculating the index of coincidence requires more work. We begin by constructing a frequency table (Table 2.1) for the ciphertext letters.
The numerator of the index of coincidence for this example is then given by
Σ_{i=A}^{Z} F_i(F_i − 1) = (18)(17) + (14)(13) + (12)(11) + (12)(11) + (15)(14) + (22)(21)
+ (9)(8) + (14)(13) + (13)(12) + (2)(1) + (4)(3) + (15)(14) + (26)(25)
+ (7)(6) + (18)(17) + (22)(21) + (17)(16) + (10)(9) + (16)(15)
+ (20)(19) + (8)(7) + (18)(17) + (5)(4) + (7)(6) + (9)(8) + (14)(13)
= 5178
The index of coincidence is then
IC = Σ_{i=A}^{Z} F_i(F_i − 1) / ( N(N − 1) ) = 5178 / ( (347)(346) ) ≈ 0.0431.
This value, although not matching an expected value perfectly, suggests that five or six alphabets were used. We can assume the key is of length 5, split the ciphertext letters into groups that
would have been enciphered with the same alphabet, and perform the IC calculation for each
group. We get the values 0.046, 0.050, 0.041, 0.038, and 0.046. These values are far below 0.066,
so it’s very unlikely that they are really from the same alphabet. Our assumption that the key is of
length 5 must be wrong. Repeating these calculations based on a key of length 6 gives the values
0.050, 0.041, 0.035, 0.048, 0.038, and 0.037. Again, it seems that the key cannot be of this length.
We could try again with four alphabets and then with seven, before moving on to values further from the ones suggested by the IC, but the Kasiski test suggested eight, so let’s skip ahead to
this value. Splitting the ciphertext into eight separate groups and calculating the IC for each gives
0.105, 0.087, 0.075, 0.087, 0.056, 0.069, 0.065, and 0.046. These values are by far the largest, so
we have another test backing up the results of the Kasiski test. Another way in which an IC calculation can be used to support the result of the Kasiski test is described in Exercise 22. We now
rewrite the ciphertext in blocks of length 8, so that characters in the same column represent letters
enciphered by the same alphabet. We may construct a frequency table for each column (Table 2.2).
This had to be done to get the values for the IC given in the paragraph above, but I left out showing
the work until it could be seen to lead to a positive conclusion. Take a moment to examine Table
2.2 and the accompanying text before returning here.
For each column, there are 26 possible choices for which letter represents E. In column 1, the
maximum value for the sum of the frequencies of E, A, and T is 21 and is obtained when M represents E. Now for column 2, assuming that P represents E yields a sum of only 9. The greatest sum
is 11, which is obtained when Q represents E. For column 3, the maximum sum is also obtained
when Q represents E. This time the sum is 17. For columns 4 through 8, this technique suggests E
is represented by S, V, X, E, and P, respectively. We see that, if these substitutions are correct, E is
only the most frequent character in columns 1, 3, 4, and 5, while it is tied for first place in column
7. You are encouraged to investigate other techniques for determining the shift of each alphabet in
Exercise 14. The substitutions above imply the keyword (the letters representing A in each alphabet
strung together in order) is IMMORTAL. Since this is a real word, we gain some confidence in our
solution. Applying this keyword to the ciphertext by subtracting modulo 26 gives:
IZPHY XLZZP SCULA TLNQV FEDEP QYOEB SMMOA AVTSZ VQATL LTZSZ
IMMOR TALIM MORTA LIMMO RTALI MMORT ALIMM ORTAL IMMOR TALIM
ANDTH ELORD GODSA IDBEH OLDTH EMANI SBECO MEASO NEOFU STOKN
AKXHO OIZPS MBLLV PZCNE EDBTQ DLMFZ ZFTVZ LHLVP MBUMA VMMXG
MORTA LIMMO RTALI MMORT ALIMM ORTAL IMMOR TALIM MORTA LIMMO
OWGOO DANDE VILAN DNOWL ESTHE PUTFO RTHHI SHAND ANDTA KEALS
FHFEP QFFVX OQTUR SRGDP IFMBU EIGMR AFVOE CBTQF VYOCM FTSCH
RTALI MMORT ALIMM ORTAL IMMOR TALIM MORTA LIMMO RTALI MMORT
OOFTH ETREE OFLIF EANDE ATAND LIVEF OREVE RTHER EFORE THELO
ROOAP GVGTS QYRCI MHQZA YHYXG LZPQB FYEOM ZFCKB LWBTQ UIHUY
ALIMM ORTAL IMMOR TALIM MORTA LIMMO RTALI MMORT ALIMM ORTAL
RDGOD SENTH IMFOR THFRO MTHEG ARDEN OFEDE NTOTI LLTHE GROUN
LRDCD PHPVO QVVPA DBMWS ELOSM PDCMX OFBFT SDTNL VPTSG EANMP
IMMOR TALIM MORTA LIMMO RTALI MMORT ALIMM ORTAL IMMOR TALIM
DFROM WHENC EHEWA STAKE NSOHE DROVE OUTTH EMANA NDHEP LACED
MHKAE PIEFC WMHPO MDRVG OQMPQ BTAEC CNUAJ TNOIR XODBN RAIAF
MORTA LIMMO RTALI MMORT ALIMM ORTAL IMMOR TALIM MORTA LIMMO
ATTHE EASTO FTHEG ARDEN OFEDE NCHER UBIMS ANDAF LAMIN GSWOR
UPHTK TFIIG EOMHQ FPPAJ BAWSV ITSMI MMFYT SMFDS VHFWQ RQ
RTALI MMORT ALIMM ORTAL IMMOR TALIM MORTA LIMMO RTALI MM
DWHIC HTURN EDEVE RYWAY TOKEE PTHEW AYOFT HETRE EOFLI FE
The plaintext turns out to be a passage from the Bible about gaining immortality.10
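The final step, subtracting the key modulo 26, simply mirrors encipherment. A minimal Python sketch (names mine):

```python
def vigenere_decipher(ciphertext, key):
    """Strip off a Vigenere encipherment.

    Subtracts each key letter's shift (A = 0, B = 1, ...) from the
    corresponding ciphertext letter modulo 26, cycling through the
    keyword just as the encipherer did.
    """
    shifts = [ord(k) - ord('A') for k in key]
    return ''.join(
        chr((ord(c) - ord('A') - shifts[i % len(shifts)]) % 26 + ord('A'))
        for i, c in enumerate(ciphertext)
    )
```

For instance, vigenere_decipher("IZPHYXLZZP", "IMMORTAL") returns ANDTHELORD, matching the opening of the recovered plaintext.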
The index of coincidence gave too low a value for the number of alphabets, as will often happen
when two or more of the “different” alphabets used are, in fact, the same. The repeated M alphabet
in the key IMMORTAL is to blame in this instance.
Kasiski’s attack worked better in this example, but Friedman’s index of coincidence is, in gen-
eral, the more powerful technique. Friedman would have gained immortality through this work,
even if he had done nothing else. It can be applied in many different contexts and can even be used
in some cases to distinguish between languages.11
Language IC
English 0.0667
French 0.0778
German 0.0762
Italian 0.0738
Russian 0.0529 (30-letter Cyrillic alphabet)
Spanish 0.0775
There are many other ways to distinguish between languages in a monoalphabetic substitution
cipher without deciphering. One such measure, the entropy of a text (discussed more fully later in
this book), can even be used to determine (very roughly) the era in which a text was produced. The
entropy of a language seems to increase with time, obeying the second law of thermodynamics!12
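Entropy is taken up properly later in the book, but as a rough sketch, the first-order (single-letter) entropy that such comparisons rest on can be estimated as:

```python
import math
from collections import Counter

def letter_entropy(text: str) -> float:
    """Shannon entropy, in bits per letter, of the text's single-letter distribution."""
    letters = [c for c in text.upper() if c.isalpha()]
    n = len(letters)
    return -sum((f / n) * math.log2(f / n) for f in Counter(letters).values())
```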
The Vigenère cipher has seen extensive use over hundreds of years. It was used by the Confederacy
in the Civil War, and it had long been believed that they only ever used three keys: MANCHESTER
BLUFF, COMPLETE VICTORY, and (after General Lee's surrender) COME RETRIBUTION. In
2006, however, Kent Boklan, attempting to break an old Confederate message, discovered a fourth
key.13 The small number of keys certainly aided the Union codebreakers! Kasiski's attack was pub-
lished during this war, but appears to have gone unnoticed by the Confederacy.
Mathematician Charles Dodgson (1832–1898), who wrote Alice’s Adventures in Wonderland
and Through the Looking Glass under the pseudonym Lewis Carroll, independently discovered
what we call the Vigenère cipher. He wrote that it would be “impossible for any one [sic], ignorant
of the key-word, to decipher the message, even with the help of the table.”14
10 Publishing seems to be the surest path. Some seek immortality through accomplishments in sports, but how
many ancient Greek athletes can you name? On the other hand, you can probably name several ancient Greek
playwrights and authors of works on mathematics, science, and philosophy.
11 Kahn, David, The Codebreakers, second edition, Scribner, 1996, p. 378.
12 Bennett, Jr., William Ralph, Scientific and Engineering Problem-Solving with the Computer, Prentice-Hall,
Englewood Cliffs, New Jersey, 1976. Sections 4.13 and 4.14 are relevant here.
13 Boklan, Kent, “How I Broke the Confederate Code (137 Years Too Late),” Cryptologia, Vol. 30, No. 4, October
2006, pp. 340–345. Boklan later evened things out by breaking an old Union cipher.
14 The source of this quote is typically cited as “The Alphabet Cipher,” which was published in 1868 in a children’s
magazine. Thus far, I’ve been unable to obtain further bibliographic information. I have a photocopy of the
two-page paper, but even this is no help, as no title, date, author name, or even page numbers appear on the
pages. Christie’s auctioned off an original as lot 117 of sale 2153. The price realized was $1,000, and the item
was described as “Broadsheet on card stock (180 × 123 mm). The table of letters printed on one side and the
Explanation on the other.” So perhaps it never was in a magazine? See http://www.christies.com/lotfinder/books-
manuscripts/dodgson-charles-lutwidge-5280733-details.aspx?from=salesummary&pos=5&intObjectID=
5280733&sid=&page=9 for more information.
Even as late as 1917, not everyone had heard that the system had been broken. In that year,
Scientific American reprinted an article from the Proceedings of the Engineers’ Club of Philadelphia
that proclaimed the system to be new (!) and “impossible of translation.”15 The article also stated,
“The ease with which the key may be changed is another point in favor of the adoption of this
code by those desiring to transmit important messages without the slightest danger of their mes-
sages being read by political or business rivals.” However, the mistake was eventually recognized,
and a 1921 issue of Scientific American Monthly carried an article entitled “The Ciphers of Porta
and Vigenère, The Original Undecipherable Code and How to Decipher It” by Otto Holstein.16
2.4 Kryptos
A more recent example of a Vigenère cipher is in the form of an intriguing sculpture called Kryptos
(Figure 2.9). Created by James “Jim” Sanborn in 1990, this artwork is located in an outdoor area
within the Central Intelligence Agency (CIA). Although the location is not open to the public, it
has attracted a great deal of public attention and is even alluded to via its latitude and longitude
on the dust jacket of Dan Brown’s novel The Da Vinci Code.
Figure 2.9 Sanborn’s sculpture Kryptos. (Courtesy of National Cryptologic Museum, Fort
Meade, Maryland.)
15 “A New Cipher Code,” Scientific American Supplement, Vol. 83, No. 2143, January 27, 1917, p. 61.
16 Holstein, Otto, “The Ciphers of Porta and Vigenère, The Original Undecipherable Code and How to Decipher
It,” Scientific American Monthly, Vol. 4, No. 4, October 1921, pp. 332–334.
The left half of the sculpture is the ciphertext, which may be divided into two panels. These are
referred to as panel 1 (top half of left side) and panel 2 (bottom half of left side). Both panels are
reproduced in Figures 2.10 and 2.11, as shown on the CIA’s website. Each contains two distinct
ciphers. These ciphers are distinguished in the figures here, but not in the original sculpture.
Figure 2.10 Panel 1 of Kryptos; a horizontal line has been added to separate cipher 1 from cipher
2. (Adapted from https://www.cia.gov/about-cia/headquarters-tour/kryptos/KryptosPrint.pdf.)
Figure 2.11 Panel 2 of Kryptos; a (mostly) horizontal line has been added to separate cipher
3 from cipher 4. (Adapted from https://www.cia.gov/about-cia/headquarters-tour/kryptos/
KryptosPrint.pdf.)
The right side of Kryptos (panels 3 and 4) provides a clue as to the means of encipherment
used on the left side. These panels are reproduced in Figures 2.12 and 2.13, as shown on the CIA’s
website.
Figure 2.12 Panel 3 of Kryptos provides a clue as to the means of encipherment. (Adapted from
https://www.cia.gov/about-cia/headquarters-tour/kryptos/KryptosPrint.pdf.)
Figure 2.13 Panel 4 of Kryptos, in which the clue continues. (Adapted from https://www.cia.
gov/about-cia/headquarters-tour/kryptos/KryptosPrint.pdf.). Note: The CIA made a mistake in
transcribing this panel! Read chapter 9 in my book listed in the References and Further Reading
section at the end of this chapter for the details and much more.
K R Y P T O S A B C D E F G H I J L M N Q U V W X Z plaintext
G H I J L M N Q U V W X Z K R Y P T O S A B C D E F alphabet 1
A B C D E F G H I J L M N Q U V W X Z K R Y P T O S alphabet 2
U V W X Z K R Y P T O S A B C D E F G H I J L M N Q alphabet 3
S A B C D E F G H I J L M N Q U V W X Z K R Y P T O alphabet 4
S A B C D E F G H I J L M N Q U V W X Z K R Y P T O alphabet 5
Below this, the key GAUSS is written vertically down the left-hand side to provide the first let-
ters of our five cipher alphabets. Each of these cipher alphabets is continued from its first letter in
the same order as our initial mixed alphabet. Now, to encipher, the five alphabets are used in order,
cycling as many times as necessary, until the message is at an end.
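The construction just described can be sketched in Python; the helper names are my own, but the KRYPTOS mixed alphabet and the GAUSS key match the example above:

```python
def keyed_alphabet(keyword: str) -> str:
    """The keyword (duplicate letters dropped) followed by the unused letters in order."""
    letters = dict.fromkeys(keyword.upper())
    for c in "ABCDEFGHIJKLMNOPQRSTUVWXYZ":
        letters.setdefault(c)
    return "".join(letters)

def quagmire3_alphabets(alphabet_key: str, indicator: str) -> list[str]:
    """One rotation of the keyed alphabet per indicator letter."""
    base = keyed_alphabet(alphabet_key)
    return [base[base.index(k):] + base[:base.index(k)] for k in indicator.upper()]

def quagmire3_encrypt(message: str, alphabet_key: str, indicator: str) -> str:
    """Cycle through the cipher alphabets, one letter at a time."""
    base = keyed_alphabet(alphabet_key)
    rows = quagmire3_alphabets(alphabet_key, indicator)
    return "".join(rows[i % len(rows)][base.index(c)]
                   for i, c in enumerate(message.upper()))
```

Here `quagmire3_alphabets("KRYPTOS", "GAUSS")` reproduces the five cipher alphabets shown above, with alphabets 4 and 5 identical because of the repeated S in GAUSS.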
This sort of cipher is referred to as a Quagmire III by members of the American Cryptogram
Association (ACA), such as James J. Gillogly. The entire plaintext of Kryptos has not yet been
recovered, but Gillogly, using computer programs of his own design, deciphered the majority of it
in 1999. Several factors served to make his work more difficult:
1. There was no obvious indication that the left side contained four ciphers instead of just one.
2. The mixing of the alphabets was done with a keyword, but not KRYPTOS, as in the clue on
the right side.
3. Sanborn intentionally introduced some errors in the ciphers.
4. Only the first two ciphers are Quagmire IIIs. Ciphers 3 and 4 arose from other systems.
Determining the correct keys and recovering the first two messages is left as a challenge to
the reader. The solutions can easily be found online. Can you use the techniques detailed in this
chapter to meet this challenge? The second part should be easier than the first, since there is more
ciphertext to work with.
The third cipher made use of transposition (see Chapter 3) and was also solved by Gillogly.
With only a little more than three lines of ciphertext left, Gillogly got stuck. He was unable to
break the fourth and final cipher.
After Gillogly's success, the CIA revealed that one of its employees, David Stein, had already
deciphered the portions Gillogly recovered back in 1998. Stein's results appeared in a classified
publication, which Gillogly had no opportunity to see. Not to be bested by the CIA, the National
Security Agency (NSA) revealed that some of its employees had solved it in 1992, but initially
NSA wouldn’t provide their names! In 2005, the information was released that it was actually a
team at NSA (Ken Miller, Dennis McDaniels, and two others whose identities are still not pub-
licly known).17 Despite intense attention, the last portion of Kryptos has resisted decipherment, at
least as far as the general public knows! “People call me an agent of Satan,” says artist Sanborn,
“because I won’t tell my secret.”18
Sanborn (Figure 2.14) has, at least, revealed valuable clues. On November 20, 2010, he indi-
cated that the letters starting at position 64 of the undeciphered portion, namely NYPVTT, deci-
pher to BERLIN.19 Surprisingly, this didn’t help! No solution came forth! Exactly four years later,
on November 20, 2014, Sanborn released another clue. It represented an expansion of his previous
clue. He said that NYPVTTMZFPK deciphered to BERLIN CLOCK. Still no solution appeared,
17 http://en.wikipedia.org/wiki/Kryptos.
18 Levy, Steven, “Mission Impossible: The Code Even the CIA Can’t Crack,” Wired, Vol. 17, No. 5, April 20, 2009.
19 Scryer, “Kryptos Clue,” The Cryptogram, Vol. 77, No. 1, January–February 2011, p. 11. Scryer is the American
and now we have another mini-mystery—why were both clues released on November 20? Does
this date have some special significance for Sanborn or Kryptos? In 2020, Sanborn provided a third
clue: the letters in positions 26 through 34 decipher to NORTHEAST. This clue broke the previous
pattern of dates, being given on January 29.20
Although matrix encryption is not presented until Chapter 6, I should point out now that
some researchers believe that a form of matrix encryption was used to create the still unsolved
portion of Kryptos. Greg Link, Dante Molle, and I investigated this possibility, but were unable to
offer definitive proof one way or the other.21 There are many ways in which matrices can be used to
encipher text and we were only able to use Sanborn’s clues to rule out some of the simpler methods.
2.5 Autokeys
Now that we’ve examined the cryptography, cryptanalysis, and historical uses of the basic Vigenère
cipher, let’s examine the real contribution that Blaise de Vigenère made to this system, which,
20 Schwartz, John, and Jonathan Corum, “This Sculpture Holds a Decades-Old Mystery. And Now, Another Clue,”
The New York Times, January 29, 2020, available online at https://www.nytimes.com/interactive/2020/01/29/
climate/kryptos-sculpture-final-clue.html.
21 Our investigation appeared as Bauer, Craig, Gregory Link, and Dante Molle, “James Sanborn’s Kryptos and the
Matrix Encryption Conjecture,” Cryptologia, Vol. 40, No. 6, 2016, pp. 541–552. The results were later summa-
rized in a broader survey of Kryptos as part of chapter 9 (pp. 386–407) of Bauer, Craig P., Unsolved! The History
and Mystery of the World’s Greatest Ciphers from Ancient Egypt to Online Secret Societies, Princeton University
Press, Princeton, New Jersey, 2017.
recall, already existed. Vigenère’s autokey only used the given key (COMET in the examples below)
once. After that single application, the original message (or the ciphertext generated) is used as the
key for the rest of the message.
S E N D S U P P L I E S … message
C O M E T S E N D S U P … key
U S Z H L M T C O A Y H … ciphertext
The ciphertext used as “key”:
S E N D S U P P L I E S … message
C O M E T U S Z H L O H … key
U S Z H L O H O S T S Z … ciphertext
This particular example was presented by Claude Shannon in his classic paper “Communication
Theory of Secrecy Systems.”22
Since the encipherment of each letter depends on previous message or cipher letters, we have a
sort of chaining in use here. This idea was applied again when matrix encryption was discovered
and is still in use in modern block ciphers. It’s examined in greater detail in this book in Section
14.6, which covers various modes of encryption. Using the ciphertext as the key, although sound-
ing like a complication, yields an easily broken cipher!
One risk associated with various autokey methods is that an error in a single position can
propagate through the rest of the ciphertext. Observe what happens in our earlier example if the
third character is mistakenly enciphered as N instead of Z. An error-free encipherment is repro-
duced below for comparison.
S E N D S U P P L I E S … message
C O M E T U S N H L O H … key
U S N H L O H C S T S Z … ciphertext obtained
U S Z H L O H O S T S Z … ciphertext desired
Continuing on we would find that every fifth ciphertext character after our initial error is also
incorrect.
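Both autokey variants can be sketched together in Python; by default the key is extended with the plaintext (Vigenère's original scheme), and passing `key_from_ciphertext=True` gives the second variant:

```python
def autokey_encrypt(message: str, primer: str, key_from_ciphertext: bool = False) -> str:
    """Vigenère autokey: after the primer, either the plaintext or the
    ciphertext itself supplies the rest of the key."""
    message = message.upper()
    key = list(primer.upper())
    cipher = []
    for i, p in enumerate(message):
        c = chr((ord(p) + ord(key[i]) - 2 * ord("A")) % 26 + ord("A"))
        cipher.append(c)
        key.append(c if key_from_ciphertext else p)  # extend the key as we go
    return "".join(cipher)

print(autokey_encrypt("SENDSUPPLIES", "COMET"))        # -> USZHLMTCOAYH
print(autokey_encrypt("SENDSUPPLIES", "COMET", True))  # -> USZHLOHOSTSZ
```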
The Vigenère cipher was introduced in this chapter, then broken, and we’re about to go on to patch
it, creating a stronger system. However, it should be noted that the Vigenère cipher did not get disposed
of so quickly in the real world. It had an extremely successful run. We may never again see a system that
survives for hundreds of years before successful attacks are discovered.
22 Shannon, Claude, “Communication Theory of Secrecy Systems,” The Bell System Technical Journal, Vol. 28,
No. 4, October 1949, pp. 656–715. Shannon noted, “The material in this paper appeared in a confidential
report “A Mathematical Theory of Cryptography” dated Sept. 1, 1945, which has now been declassified.”
In the running key cipher, the message is enciphered using a long text, such as A Tale of Two Cities by Charles Dickens, as the key.
The Kasiski attack won’t work against this upgrade to the Vigenère cipher, because the key never
repeats and Friedman’s index of coincidence will only indicate that a large number of cipher alpha-
bets was used. However, that doesn’t mean that Friedman was defeated by the running key cipher. In
a cover letter introducing his paper “Methods for the Solution of Running-Key Ciphers,” he wrote:
Concerning the possibility of the decipherment of a message or a series of messages
enciphered by a running-key, it was said until as recently as three months ago, “It can’t
be done” or “It is very questionable.” It is probably known to you that the U.S. Army
Disk in connection with a running-key has been used as a cipher in field service for
many years, and is, to the best of our knowledge, in use to-day. I suppose that its long-
continued use, and the confidence placed in its safety as a field cipher has been due
very probably to the fact that no one has ever taken the trouble to see whether “It could
be done.” It is altogether probable that the enemy, who has been preparing for war for a
long time, has not neglected to look into our field ciphers, and we are inclined to credit
him with a knowledge equal to or superior to our own. We have been able to prove that
not only is a single short message enciphered by the U. S. Army Disk, or any similar
device, easily and quickly deciphered, but that a series of messages sent out in the same
key may be deciphered more rapidly than they have been enciphered!23
Friedman’s new attack was based on some very simple mathematics and is now examined.
A list of the probabilities of letters in English is provided once more in Table 2.3 for handy
reference.
Table 2.3 Probabilities of Letters in English
Letter Probability Letter Probability
A = 0 0.08167 N = 13 0.06749
B = 1 0.01492 O = 14 0.07507
C = 2 0.02782 P = 15 0.01929
D = 3 0.04253 Q = 16 0.00095
E = 4 0.12702 R = 17 0.05987
F = 5 0.02228 S = 18 0.06327
G = 6 0.02015 T = 19 0.09056
H = 7 0.06094 U = 20 0.02758
I = 8 0.06966 V = 21 0.00978
J = 9 0.00153 W = 22 0.02360
K = 10 0.00772 X = 23 0.00150
L = 11 0.04025 Y = 24 0.01974
M = 12 0.02406 Z = 25 0.00074
Source: Beutelspacher, Albrecht, Cryptology, Mathematical Association
of America, Washington DC, 1994, p. 10.
23 Friedman, William F., Methods for the Solution of Running-Key Ciphers, Publication No. 16, Riverbank
Laboratories, Geneva, Illinois, 1918.
Now suppose we see the letter A in a ciphertext arising from some running key. It could
have arisen from an A in the message combining with another A from the key or it could have
arisen from a B in the message combining with a Z from the key. Which seems more likely
to you? The letter A is much more common than B or Z, so the first possibility is more likely.
There are other possible combinations that would yield an A in the ciphertext. Table 2.4 lists
all plaintext/key combinations along with their probabilities (obtained using Table 2.3).
Note that we must double the probabilities for distinct key and message letters. For exam-
ple, the pair B and Z, which combine to give A, can do so with B in the message and Z in
the key, or with Z in the message and B in the key. Thus, there are two equally probable ways
this pair of letters can yield A. However, there is only one way that the letters A and A can
combine to yield A. Similarly N and N can only combine in one way to give A. So, we do not
double the probability if two of the same letter combine to give the ciphertext letter.
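This doubling rule makes the rankings easy to generate by program; the sketch below, using the probabilities of Table 2.3, reproduces rankings like those in Tables 2.4 and 2.5:

```python
# Single-letter probabilities for A through Z, from Table 2.3.
FREQ = [0.08167, 0.01492, 0.02782, 0.04253, 0.12702, 0.02228, 0.02015,
        0.06094, 0.06966, 0.00153, 0.00772, 0.04025, 0.02406, 0.06749,
        0.07507, 0.01929, 0.00095, 0.05987, 0.06327, 0.09056, 0.02758,
        0.00978, 0.02360, 0.00150, 0.01974, 0.00074]

def ranked_pairings(cipher_letter: str) -> list[tuple[str, float]]:
    """All unordered message/key pairs summing to the given ciphertext letter,
    ranked by probability (doubled when the two letters differ)."""
    c = ord(cipher_letter.upper()) - ord("A")
    pairs = []
    for p in range(26):
        k = (c - p) % 26
        if p > k:
            continue  # count each unordered pair only once
        prob = FREQ[p] * FREQ[k] * (1 if p == k else 2)
        pairs.append((chr(p + 65) + chr(k + 65), prob))
    return sorted(pairs, key=lambda pair: -pair[1])
```

For a ciphertext L, the top five pairings come out as EH, ST, AL, DI, and RU, matching the L column of Table 2.5.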
The rankings in Table 2.4 show that a ciphertext A is most likely to arise from combining an
H and a T. However, the other pairings will sometimes be correct. Considering the top five pos-
sibilities will yield the correct pairings often enough that the remaining solutions can be found in
a manner similar to fixing typos. The tables of ranked pairings for each letter of the alphabet are
given in Table 2.5:24
24 Thanks to Adam Reifsneider for writing a computer program to calculate these rankings.
E F G H
AE 0.0207474468 OR 0.0089888818 NT 0.0122237888 OT 0.0135966784
NR 0.0080812526 NS 0.0085401846 OS 0.0094993578 DE 0.0108043212
LT 0.0072900800 MT 0.0043577472 CE 0.0070673928 AH 0.0099539396
IW 0.0032879520 BE 0.0037902768 AG 0.0032913010 NU 0.0037227484
MS 0.0030445524 AF 0.0036392152 IY 0.0027501768 PS 0.0024409566
BD 0.0012690952 HY 0.0024059112 PR 0.0023097846 LW 0.0018998000
GY 0.0007955220 CD 0.0023663692 DD 0.0018088009 CF 0.0012396592
CC 0.0007739524 LU 0.0022201900 MU 0.0013271496 BG 0.0006012760
KU 0.0004258352 IX 0.0002089800 LV 0.0007872900 MV 0.0004706136
PP 0.0003721041 KV 0.0001510032 BF 0.0006648352 QR 0.0001137530
HX 0.0001828200 JW 0.0000722160 KW 0.0003643840 IZ 0.0001030968
OQ 0.0001426330 PQ 0.0000366510 HZ 0.0000901912 JY 0.0000604044
FZ 0.0000329744 GZ 0.0000298220 JX 0.0000045900 KX 0.0000231600
JV 0.0000299268 QQ 0.0000009025
Table 2.5 (Continued) Ranked Pairings for Each Letter of the Alphabet
I J K L
EE 0.0161340804 RS 0.0075759498 RT 0.0108436544 EH 0.0154811976
AI 0.0113782644 EF 0.0056600112 DH 0.0051835564 ST 0.0114594624
OU 0.0041408612 CH 0.0033907016 EG 0.0051189060 AL 0.0065744350
RR 0.0035844169 NW 0.0031855280 SS 0.0040030929 DI 0.0059252796
PT 0.0034938048 BI 0.0020786544 CI 0.0038758824 RU 0.0033024292
DF 0.0018951368 DG 0.0017139590 OW 0.0035433040 NY 0.0026645052
BH 0.0018184496 LY 0.0015890700 AK 0.0012609848 PW 0.0009104880
NV 0.0013201044 OV 0.0014683692 MY 0.0009498888 FG 0.0008978840
MW 0.0011356320 PU 0.0010640364 FF 0.0004963984 BK 0.0002303648
CG 0.0011211460 AJ 0.0002499102 PV 0.0003773124 OX 0.0002252100
KY 0.0003047856 QT 0.0001720640 NX 0.0002024700 CJ 0.0000851292
LX 0.0001207500 MX 0.0000721800 LZ 0.0000595700 MZ 0.0000356088
QS 0.0001202130 KZ 0.0000114256 QU 0.0000524020 QV 0.0000185820
JZ 0.0000022644 BJ 0.0000456552
M N O P
EI 0.0176964264 AN 0.0110238166 AO 0.0122619338 EL 0.0102251100
TT 0.0082011136 TU 0.0049952896 HH 0.0037136836 HI 0.0084901608
AM 0.0039299604 FI 0.0031040496 DL 0.0034236650 TW 0.0042744320
SU 0.0034899732 RW 0.0028258640 SW 0.0029863440 CN 0.0037551436
OY 0.0029637636 GH 0.0024558820 GI 0.0028072980 AP 0.0031508286
FH 0.0027154864 CL 0.0022395100 BN 0.0020139016 RY 0.0023636676
BL 0.0012010600 SV 0.0012375612 EK 0.0019611888 BO 0.0022400888
RV 0.0011710572 PY 0.0007615692 TV 0.0017713536 DM 0.0020465436
CK 0.0004295408 BM 0.0007179504 CM 0.0013386984 UV 0.0005394648
GG 0.0004060225 DK 0.0006566632 UU 0.0007606564 FK 0.0003440032
DJ 0.0001301418 EJ 0.0003886812 RX 0.0001796100 SX 0.0001898100
NZ 0.0000998852 OZ 0.0001111036 FJ 0.0000681768 GJ 0.0000616590
PX 0.0000578700 QX 0.0000028500 QY 0.0000375060 QZ 0.0000014060
QW 0.0000448400 PZ 0.0000285492
Q R S T
EM 0.0061122024 EN 0.0171451596 EO 0.0190707828 AT 0.0147920704
DN 0.0057406994 AR 0.0097791658 AS 0.0103345218 IL 0.0056076300
II 0.0048525156 DO 0.0063854542 HL 0.0049056700 EP 0.0049004316
CO 0.0041768948 TY 0.0035753088 FN 0.0030073544 FO 0.0033451192
SY 0.0024978996 GL 0.0016220750 BR 0.0017865208 CR 0.0033311668
FL 0.0017935400 CP 0.0010732956 DP 0.0016408074 HM 0.0029324328
UW 0.0013017760 FM 0.0010721136 UY 0.0010888584 GN 0.0027198470
BP 0.0005756136 HK 0.0009409136 IK 0.0010755504 BS 0.0018879768
GK 0.0003111160 VW 0.0004616160 GM 0.0009696180 VY 0.0003861144
TX 0.0002716800 IJ 0.0002131596 WW 0.0005569600 DQ 0.0000808070
HJ 0.0001864764 SZ 0.0000936396 TZ 0.0001340288 WX 0.0000708000
AQ 0.0001551730 UX 0.0000827400 CQ 0.0000528580 UZ 0.0000408184
VV 0.0000956484 BQ 0.0000283480 VX 0.0000293400 JK 0.0000236232
RZ 0.0000886076 JJ 0.0000023409
U V W X
HN 0.0082256812 ER 0.0152093748 ES 0.0160731108 ET 0.0230058624
DR 0.0050925422 IN 0.0094027068 IO 0.0104587524 FS 0.0028193112
AU 0.0045049172 HO 0.0091495316 DT 0.0077030336 IP 0.0026874828
CS 0.0035203428 DS 0.0053817462 AW 0.0038548240 GR 0.0024127610
IM 0.0033520392 CT 0.0050387584 FR 0.0026678072 DU 0.0023459548
GO 0.0030253210 AV 0.0015974652 HP 0.0023510652 LM 0.0019368300
BT 0.0027023104 BU 0.0008229872 LL 0.0016200625 KN 0.0010420456
WY 0.0009317280 GP 0.0007773870 CU 0.0015345512 BW 0.0007042240
FP 0.0008595624 KL 0.0006214600 YY 0.0003896676 CV 0.0005441592
EQ 0.0002413380 JM 0.0000736236 KM 0.0003714864 AX 0.0002450100
JL 0.0001231650 XY 0.0000592200 BV 0.0002918352 JO 0.0002297142
KK 0.0000595984 FQ 0.0000423320 JN 0.0002065194 HQ 0.0001157860
VZ 0.0000144744 WZ 0.0000349280 GQ 0.0000382850 YZ 0.0000292152
XX 0.0000022500 XZ 0.0000022200
We now take a look at a running key ciphertext to see how Friedman used rankings like those
given above to read the original message and key—for example,
L A E K A H B W A G W I P T U K V S G B
The L that starts the ciphertext is likely to have arisen from E + H, S + T, A + L, D + I, or R +
U, as these are the top five pairings that yield L. We write these letters under L in a long vertical
column, and then write them out again with the ordering reversed in each pair. We do the same
with the top five pairings for each of the other letters in the ciphertext. This gives us Table 2.6.
Table 2.6 Sample Running Key Ciphertext
L A E K A H B W A G W I P T U K V S G B
E H A R H O I E H N E E E A H R E E N I
H T E T T T T S T T S E L T N T R O T T
S I N D I D N I I O I A H I D D I A O N
T S R H S E O O S S O I I L R H N S S O
A A L E A A H D A C D O T E A E H H C H
L A T G A H U T A E T U W P U G O L E U
D E I S E N A A E A A R C F C S D F A A
I W W S W U B W W G W R N O S S S N G B
R N M C N P D F N I F P A C I C C B I D
U N S I N S Y R N Y R T P R M I T R Y Y
H T E T T T T S T T S E L T N T R O T T
E H A R H O I E H N E E E A H R E E N I
T S R H S E O O S S O I I L R H N S S O
S I N D I D N I I O I A H I D D I A O N
L A T G A H U T A E T U W P U G O L E U
A A L E A A H D A C D O T E A E H H C H
I W W S W U B W W G W R N O S S S N G B
D E I S E N A A E A A R C F C S D F A A
U N S I N S Y R N Y R T P R M I T R Y Y
R N M C N P D F N I F P A C I C C B I D
Reversing the order of the letter pairs in the second block allows a nice correspondence. If the
third letter under the L was actually used in the message or key to generate the L, then the third letter
down in the bottom block of text is what it was paired with. You’ll soon see how this helps with the
cryptanalysis. Letters that pair with themselves to give the desired ciphertext letter will appear twice
in our table. This is redundant, but aesthetically pleasing; it keeps all of the columns the same length.
We now focus on the first block of text, the ten rows of letters directly beneath the ciphertext.
We try to select a single letter from each column, such that a meaningful message is formed by them,
when read across. There may be many, but we go slowly and can easily tell if we are likely to be on
the right track. We do this by considering letters in the lower block of text that occupy the same posi-
tions as the letters in the message we are attempting to form. If the letters in the bottom block are also
forming words, we gain confidence in our solution. This will become clearer as our example continues.
THE is the most common word in English, so we may as well try to start there. We select let-
ters in the top block of text that spell THE and see what we get from those positions in the bottom
block of text (see Table 2.7).
We get STA, which sounds promising. It could continue as STAY, STATION, STAB,
STATISTICIAN, STALINGRAD, STALACTITE, STAPHYLOCOCCUS…—we have many pos-
sibilities! However, the top rows of each rectangle contain the letters most likely to be used in the
continuations. It’s best to look for possibilities there first. The Y in STAY (the first word that came
to mind) doesn’t even show up in the appropriate column of the bottom rectangle. It is a possibil-
ity that STAY is correct, but not the strongest possibility. Take a moment to examine the bottom
rectangle for yourself before reading any further. What word do you think is formed?
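Looking up a letter's partner in the bottom block is just subtraction modulo 26; a sketch (the function name is my own):

```python
def partner(cipher_letter: str, guess: str) -> str:
    """If `guess` is the message (or key) letter, return the key (or message)
    letter it must have combined with to produce `cipher_letter`."""
    c = ord(cipher_letter.upper()) - ord("A")
    g = ord(guess.upper()) - ord("A")
    return chr((c - g) % 26 + ord("A"))

# Guessing T, H, E under the first three ciphertext letters L, A, E:
print(partner("L", "T"), partner("A", "H"), partner("E", "E"))  # -> S T A
```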
As the words start to form, we cannot tell which text is the message and which is the key.
Hopefully, when we’re done we’ll be able to distinguish the two.
Okay, did you find the word START? That seems like the best choice. Let’s see what it gives us
in the corresponding positions of the upper rectangle (Table 2.8).
The top rectangle now reads THE TH. This looks okay. It could turn out to be THE THOUGHT
IS WHAT COUNTS or THE THREE AMIGOS or THE THREAT OF DEFEAT LOOMS
LARGE. To see what happens if we make a wrong turn, let’s investigate the result if we guessed the
bottom rectangle read STAGE (Table 2.9).
The top text would be THE EW…, which we’d have trouble continuing or possibly THEE W…,
but THEE seems like an unlikely word, if the message wasn’t sent by Shakespeare or Thor.
So continuing on with the texts we’ve recovered, THE TH and START, we may look at either
rectangle, whichever seems easier to continue. I prefer the top rectangle. Examine it yourself and
see if your selection matches the one I give in Table 2.10.
THE TH can be extended to THE THOUSAND and the bottom rectangle then yields START
THE ATT… This must be START THE ATTACK. It seems that we’re getting the message in the
bottom block and the key in the top block this time. We try to complete the word ATTACK in the
bottom rectangle and check to make sure the top rectangle is still giving something meaningful.
But we hit a snag—there’s no K to be found where we need one! That’s okay. The rectangles list
the most likely pairings, but less likely pairings can occur. We simply tack on the K we need, along
with (in the top rectangle) the J it must combine with to yield the ciphertext letter T (Table 2.11).
We can always add letters in this manner, but really shouldn’t unless we are either fairly confident
that they are correct or have no other reasonable options.
Our two texts now read START THE ATTACK and THE THOUSAND INJ… Once more,
please take a moment to look for the continuation of INJ in the top rectangle (to see that it’s not
so hard to do) before reading on.
Of course, one could proceed in a different direction by trying to extend START THE
ATTACK. In general though, it’s easier to complete partial words than to find new ones, unless the
preceding words are the beginning of a well-known phrase. Speaking of which, there will likely
be a reader who has already recognized the source of the key being used here. More in a moment.
Okay, did you extend the top text to THE THOUSAND INJURIES? This makes the bottom
text read START THE ATTACK AT NOO. Looking at the bottom text one last time, we extend
it by another letter and then do the same in the top rectangle.
We now have the message and the key:
THE THOUSAND INJURIES O key
START THE ATTACK AT NOON message
The key was, by the way, the beginning of Edgar Allan Poe’s short story “The Cask of
Amontillado.”
The example chosen above to illustrate this attack was a little easier than usual. Nine times
out of twenty (45%), the pair of letters most likely to yield a particular ciphertext letter did yield
that letter. Experiments reveal that this happens less than a third of the time on average. Also, the
example only had one ciphertext letter (5% of the cipher) that didn’t arise from one of the five most
likely pairings. This was a lower percentage than usual.
Nevertheless, this is a great attack. Friedman made the attack even quicker by cutting out the
columns. This allowed him to move individual columns up and down. When he got a word or
phrase going across letters from the top rectangle, he could glance down at the bottom rectangle
and also read the corresponding letters there straight across. I used bold text and underlining,
because it is easier to present it this way in a book, but the sliding paper works better for classroom
demonstrations.
Although Friedman didn’t pursue it further, his attack can be expanded. Instead of taking the
ciphertext characters one at a time, groups may be considered. For example, suppose the ciphertext
begins MOI. Our tables suggest
M is most likely E + I.
O is most likely A + O.
I is most likely E + E.
But these tables don’t take context into consideration. The top pairing suggested for O doesn’t
consider what letters appear before or after it in the ciphertext. Taking the letters MOI as a group
and using trigraph frequencies to rank the possibilities, we see it is most likely to arise from
THE + THE.
Christian N.S. Tate, an undergraduate at the time, and I investigated this new attack. We split
the ciphertext into groups of characters and used frequencies for letter combinations of the group
size to rank our pairings. For example, if the ciphertext was HYDSPLTGQ, and we were using
groups of three characters to launch our attack, we’d split the ciphertext up as HYD SPL TGQ
and replace each trigram with the pair most likely to yield it.
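A sketch of the grouping step; the frequency table here is a tiny hypothetical stand-in for real trigraph statistics, which would be tabulated from a large corpus:

```python
def rank_ngram_pairings(cipher_ngram: str, ngram_freq: dict[str, float]) -> list:
    """Rank message/key n-gram pairs that combine to give `cipher_ngram`,
    scored by the product of their corpus frequencies."""
    def forced_key(cipher: str, plain: str) -> str:
        # The key n-gram is determined letter by letter once the plaintext is guessed.
        return "".join(chr((ord(c) - ord(p)) % 26 + ord("A"))
                       for c, p in zip(cipher, plain))
    scored = []
    for plain, p_freq in ngram_freq.items():
        key = forced_key(cipher_ngram, plain)
        k_freq = ngram_freq.get(key)
        if k_freq is not None:
            scored.append((plain, key, p_freq * k_freq))
    return sorted(scored, key=lambda t: -t[2])

# With even a toy table, MOI is most plausibly THE enciphered with THE:
toy_freq = {"THE": 0.0181, "AND": 0.0073, "ING": 0.0072}
print(rank_ngram_pairings("MOI", toy_freq)[0][:2])
```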
The idea needed to be tested and the easiest way to do this was to write computer programs to
carry out the calculations and analyze the results. We tested single-character analysis (as Friedman
had), digraphs, trigraphs, tetragraphs, pentagraphs, and hexagraphs.
I expected that after the results were in, I could compare the new attack to Friedman’s and
say, “It is a far, far better thing I have done.” However, this was not the case. The results were only
marginally better! We tried to place the results in the best light possible for the write-up. Figuring
even the slightest improvement on Friedman’s work ought to be worth publishing, we submitted
the paper. A kind editor accepted it following some revising.
Fortunately, Alexander Griffing, a student in the Bioinformatics Ph.D. program at North
Carolina State University, saw the paper and was able to turn the attack into something truly suc-
cessful. He did this by not just taking the blocks of ciphertext characters one after the other, but
by also considering their overlap when computing the most likely solution.25 So, when considering
the ciphertext HYDSPLTGQ, Griffing’s method didn’t just look at how HYD, SPL, and TGQ could
arise, but rather HYD, YDS, DSP, SPL, PLT, LTG, and TGQ. A graph reproduced from his paper is
provided in Figure 2.15. It shows how his method is better than the method Tate and I proposed
for every size letter group considered, and dramatically better when pentagraphs and hexagraphs
are used.
25 Griffing, Alexander, “Solving the Running Key Cipher with the Viterbi Algorithm,” Cryptologia, Vol. 30, No.
4, October 2006, pp. 361–367.
92 ◾ Secret History
Figure 2.15 Griffing’s results (solid line) compared to an earlier attempt by others (dotted line). The graph plots the fraction of characters recovered correctly (0 to 0.8) against the n-gram size (1 to 6 characters).
With the techniques discussed in this chapter, running key ciphers of any length can be bro-
ken; however, extremely short messages are not likely to have unique solutions. The graph in
Figure 2.16 shows how the number of potential solutions changes as a function of the message’s
length. Beyond eight characters, we expect only a single solution.
Figure 2.16 Number of spurious decipherments as a function of the message size. (From Deavours, Cipher A., “Unicity Points in Cryptanalysis,” Cryptologia, Vol. 1, No. 1, January 1977, pp. 46–68, p. 62 cited here.)
the attacks already described, and all other attacks. Used properly, it is unbreakable! This is, in fact,
the only cipher that is theoretically unbreakable.26 Edgar Allan Poe didn’t know about it when he
wrote that “human ingenuity cannot concoct a cipher which human ingenuity cannot resolve.” We
can forgive him, because this method had not yet been discovered. Despite the way in which the
unbreakable system seems to naturally evolve from “patching” the running key cipher, historians
long believed27 that it didn’t arise until the years 1917 to 1918, when it was developed by Gilbert
Vernam (Figure 2.17) and Major Joseph Mauborgne (Figure 2.18) at AT&T. The form it took then
was different from how it is introduced here, but functionally equivalent. We’ll return shortly to how
Vernam and Mauborgne described their system. Sometimes it’s referred to as a Vernam cipher, but
the name one-time pad is a bit better because it emphasizes the proper use of the key—only once! If a
random key is used for more than one message, it is no longer unbreakable, as we shall see.
Figure 2.18 Joseph Mauborgne (1881–1971). (From The Signal Corps Bulletin, October–December 1937.)
26 Shannon, Claude, “Communication Theory of Secrecy Systems,” The Bell System Technical Journal, Vol. 28, No. 4, October 1949, pp. 656–715. Shannon noted, “The material in this paper appeared in a confidential report ‘A Mathematical Theory of Cryptography’ dated Sept. 1, 1945, which has now been declassified.”
27 If you know about Frank Miller, please be patient. I discuss his work at the end of this chapter.
The one-time pad could easily have been discovered hundreds of years earlier, as it works in
much the same way as a Vigenère cipher or a running key cipher, except the key is random and
must be as long as the message. As an example, suppose the one-time pad begins U SNHQ LCIYU
and Bob wants to send the message: I LOVE ALICE. Using the pad as the key and adding letters
(using their numerical equivalents) mod 26, we have
I LOVE ALICE plaintext
U SNHQ LCIYU key
C DBCU LNQAY ciphertext
If Eve intercepts the message and correctly guesses the key, she recovers the message. However, she
has no reason to guess this particular key. If she instead guesses U WBJQ LCIYU, then the message
deciphers to I HATE ALICE. Or suppose she tries the key U SNHQ TTYAL. In this case, the
message becomes I LOVE SUSAN. Any message ten characters in length will arise from some key.
As Eve has no reason to favor one key over another, she receives no information beyond the length of
the message. In fact, the length of the ciphertext only provides an upper bound on the length of the
message. Padding may have been added to make a very brief message appear longer.
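The arithmetic above can be checked with a short Python sketch. Letters are taken as 0 through 25 and added mod 26; the spacing shown in the book’s examples is dropped, so the strings below are the same texts run together.

```python
# One-time pad over the alphabet: add letter values mod 26 to encipher,
# subtract to decipher.
def encipher(plaintext, key):
    return "".join(chr((ord(p) + ord(k) - 2 * ord("A")) % 26 + ord("A"))
                   for p, k in zip(plaintext, key))

def decipher(ciphertext, key):
    return "".join(chr((ord(c) - ord(k)) % 26 + ord("A"))
                   for c, k in zip(ciphertext, key))

ciphertext = encipher("ILOVEALICE", "USNHQLCIYU")
print(ciphertext)  # CDBCULNQAY

# Eve's problem: every ten-letter message corresponds to SOME key, so the
# ciphertext alone gives her no reason to favor one decipherment over another.
print(decipher(ciphertext, "USNHQLCIYU"))  # ILOVEALICE
print(decipher(ciphertext, "UWBJQLCIYU"))  # IHATEALICE
print(decipher(ciphertext, "USNHQTTYAL"))  # ILOVESUSAN
```

The three decipherments at the end are exactly the point: with a truly random key, C DBCU LNQAY is equally consistent with I LOVE ALICE, I HATE ALICE, and I LOVE SUSAN.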
Following the development of an unbreakable cipher, we might well expect that it would
quickly be adopted by everyone and all other methods of encryption would vanish. This was not
the case! In fact, it wasn’t until the early 1920s that this American discovery saw heavy use, and it
was by the Germans!28 They used it as an extra step for their diplomatic codes. That is, after the
message was converted to numerical code groups, the one-time pad, in the form of a list of random
digits between 0 and 9, was used to shift each of the ciphertext values. Whenever such an extra
step is taken, whether the key is random or not, we refer to the system as an enciphered code. One-
time pads were also used by the Office of Strategic Services (OSS),29 an American group in World
War II that evolved into both the CIA and the Green Berets, and heavily by the Soviet Union for
diplomatic messages beginning in 1930.30
One famous Russian spy caught by the Federal Bureau of Investigation (FBI) in New York
City with a one-time pad in 1957 was Rudolf Abel.31 Five years after his arrest he was traded for
Francis Gary Powers, who was shot down while piloting a U-2 spy plane over the Soviet Union.
The trade took place on the Glienicke Bridge connecting West Berlin and Potsdam.
Figure 2.19 is a page from a one-time pad used by a Communist spy in Japan in 1961.32 One
side was probably for enciphering and the other for deciphering.
Although it had been talked about for some time, it was only in 1963, after the Cuban mis-
sile crisis, that the “hot line” was set up between Washington, DC, and Moscow (Figure 2.20).
28 This is according to Kahn, David, The Codebreakers, second edition, Scribner, New York, 1996, p. 402. However,
Alexander “Alastair” Guthrie Denniston recalled German use beginning in 1919 in a memoir (penned in
1944) titled “The Government Code and Cypher School between the Wars,” which appeared posthumously in
Intelligence and National Security, Vol. 1, No. 1, January 1986, pp. 48–70 (p. 54 cited here). Kahn’s estimate of
1921–1923 was based on interviews conducted with German cryptographers in 1962.
29 But not exclusively, as it was just one of many systems they employed. See Kahn, David, The Codebreakers, second edition, Scribner, New York, 1996.
Figure 2.19 Page from a one-time pad used by a Communist spy in Japan in 1961. (Courtesy of
the David Kahn Collection, National Cryptologic Museum, Fort Meade, Maryland.)
Figure 2.20 The one-time tape is visible on the left side in this picture of the hot line between
Washington, DC, and Moscow. (Courtesy of the National Cryptologic Museum, Fort Meade,
Maryland.)
Actually, there were two hot lines (a backup is always a good idea), and both were secured with
one-time pads. A commercially available system that was keyed by tapes was used.33
When Ché Guevara was killed in Bolivia in 1967, he was found to be carrying a one-time
pad.34 A message Ché sent to Fidel Castro months earlier, which used a Polybius cipher to convert
33 Kahn, David, The Codebreakers, second edition, Scribner, New York, 1996, pp. 715–716.
34 Polmar, Norman, and Thomas B. Allen, Spy Book: The Encyclopedia of Espionage, Random House, New York,
1997, p. 413.
the text to numbers prior to applying a numerical one-time pad, was later decoded by Barbara
Harris and David Kahn using Ché’s calculation sheet.35
35 James, Daniel, Ché Guevara, Stein and Day, New York, 1969.
36 Erskine, Ralph, “Enigma’s Security: What the Germans Really Knew,” in Erskine, Ralph and Michael Smith,
Action this Day, Bantam Press, London, UK, 2001, pp 370–385, p. 372 cited here.
37 Phillips, Cecil James, “What Made Venona Possible?” in Benson, Robert Louis and Michael Warner, editors,
Venona: Soviet Espionage and the American Response, 1939-1957, NSA/CIA, Washington, DC, 1996, p. xv.
Simple Progression to an Unbreakable Cipher ◾ 97
meaningful messages. In such differences, we expect identical pairs of letters to align about 6.6%
of the time (using English as an example). So, about 6.6% of the characters in the difference should
be A = 0. If C1 and C2 did not arise from the same one-time pad key, the difference C1 – C2 would
essentially be random, and for such text the expected probability of A = 0 is only about 3.8% (that is, 1/26). But in the
case of the Soviet ciphers, the underlying code made the pairing and deciphering processes more
difficult. Despite this extra obstacle, portions of over 2,900 messages were eventually read.38 In
late 1953, after years of effort, it was discovered that a copy of a partially burned codebook found
in April 194539 had been used for messages for 1942 and most of 1943.40 However, by this time,
the main breakthrough had already been made, the hard way.41
The intelligence derived from this material was first codenamed Jade, but then changed to
Bride, Drug, and finally Venona, the name it’s referred to by historians today. Many of the deci-
pherments weren’t found until the 1960s and 1970s.42 Declassified documents containing the
plaintexts of these messages were released beginning in July 1995.
Because all of the security lies in keeping the key secret, it must be easily hidden and easy to
destroy if the agent is compromised; otherwise, we have another way the system can fail.
Figure 2.21 A one-time pad, very much like those used by Cold War spies. (Copyright Dirk
Rijmenants, 2009; https://users.telenet.be/d.rijmenants/en/onetimepad.htm.)
A one-time pad, like the one pictured in Figure 2.21, was found in the possession of Helen
and Peter Kroger, two spies for the Soviets caught in England in 1961. Both were Americans; their
38 Benson, Robert Louis and Michael Warner, editors, Venona: Soviet Espionage and the American Response, 1939-
1957, NSA/CIA, Washington, DC, 1996, p. viii.
39 It was originally found by Finnish troops, who overran the Soviet consulate in Finland in 1941. The Germans
then got the book from the Finns, and finally, in May 1945, Americans found a copy in a German signals
intelligence archive in Saxony, Germany. See Haynes, John Earl and Harvey Klehr, Venona: Decoding Soviet
Espionage in America, Yale University Press, New Haven, Connecticut, 1999, p. 33.
40 Haynes, John Earl and Harvey Klehr, Venona: Decoding Soviet Espionage in America, Yale University Press, New Haven, Connecticut, 1999.
true names were Morris and Lona Cohen. They had done spy work in the United States, but fled
the country following the arrest of Julius Rosenberg. After their capture, they were sentenced to
20 years in prison, but 8 years later they were traded to the Soviets. A Soviet newspaper stated,
“Thanks to Cohen, designers of the Soviet atomic bomb got piles of technical documentation
straight from the secret laboratories in Los Alamos.”43
Our initial example used letters for the key, but the pad depicted above used numbers. Suppose
we use a random number generator to get a string of values, each between 0 and 9. We then shift
each letter of a message by the digits, one at a time. Here’s an example:
IS THIS SECURE? message
74 9201 658937 key
PW CJIT YJKDUL ciphertext
The answer to the question posed by the message is no! If there are only ten possible shifts for
each letter, then there are only ten possible decipherments for each ciphertext character. For exam-
ple, the initial ciphertext letter P couldn’t possibly represent plaintext A, as that would require the
first digit of the key to have been 15.
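A short sketch makes the weakness concrete: enciphering with single decimal digits reproduces the example above, and listing the ten candidate plaintexts for the first ciphertext letter shows A is not among them.

```python
# Shifting letters by single decimal digits gives only ten possible shifts,
# so each ciphertext letter has just ten candidate plaintext letters.
def shift_encipher(message, digits):
    return "".join(chr((ord(m) - ord("A") + d) % 26 + ord("A"))
                   for m, d in zip(message, digits))

ciphertext = shift_encipher("ISTHISSECURE", [7, 4, 9, 2, 0, 1, 6, 5, 8, 9, 3, 7])
print(ciphertext)  # PWCJITYJKDUL

# Only ten plaintext letters can produce the ciphertext letter P; plaintext A
# is not one of them, since that would require a key digit of 15.
candidates = {chr((ord("P") - ord("A") - d) % 26 + ord("A")) for d in range(10)}
print(sorted(candidates))        # ['G', 'H', 'I', 'J', 'K', 'L', 'M', 'N', 'O', 'P']
print("A" in candidates)         # False
```

With 16 of the 26 letters ruled out at every position, frequency analysis and cribbing become far easier than against a genuine one-time pad.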
In order to securely implement a numerical one-time pad, the message must first be converted
to numbers. Using a numerical code like the Russian spies did is fine, as is first applying the
Polybius cipher, like Ché Guevara.
Such ciphers remained popular with spies during the Cold War, even though the field continued
to advance with machine encryption. There are a number of reasons for this. High among them
is the fact that they can be created with pencil and paper; the spy needn’t carry around a cipher
machine, which would tend to be incriminating!
43 Komsomolskaya Pravda, quoted here from Polmar, Norman and Thomas B. Allen, Spy Book: The Encyclopedia
of Espionage, Random House, New York, 1997, p. 128.
Figure 2.22 Cipher printing telegraph machine, with its printer, message-tape and key-tape transmitters, keyboard, and perforator labeled. (From Vernam, Gilbert S., “Cipher printing telegraph systems: For secret wire and radio telegraphic communications,” Journal of the American Institute of Electrical Engineers, Vol. 45, No. 2, February 1926, pp. 109–115, p. 109 cited here.)
The key tape looks like the message tape except that the sequence of characters represented is
random. Vernam suggested it be generated in advance by “working the keyboard at random.” Each
pair of message and key characters is combined by the machine to yield a cipher character. This is
done using a rule like that of the Vigenère tableau, but with 32 shifted alphabets instead of 26. On the
recipient’s end, the cipher tape is combined with a duplicate of the key tape to reclaim the original
message, which is then automatically printed in letter form.
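In modern terms, the combining rule amounts to a bitwise XOR of the 5-bit tape codes, and XOR is its own inverse, which is why the same operation on the recipient’s end recovers the message. The 5-bit values in this sketch are arbitrary illustrations, not actual Baudot code assignments.

```python
# Vernam-style combining, viewed as XOR of 5-bit tape codes:
# cipher = message XOR key, and (m ^ k) ^ k = m recovers the message.
message_code = [0b10000, 0b00110, 0b01101]  # sample 5-bit values (illustrative)
key_code     = [0b11010, 0b10101, 0b00011]  # sample random key tape values

cipher_code = [m ^ k for m, k in zip(message_code, key_code)]
recovered   = [c ^ k for c, k in zip(cipher_code, key_code)]

print(cipher_code)  # [10, 19, 14]
assert recovered == message_code  # the duplicate key tape undoes the cipher
```

The self-inverse property is the mechanical elegance of the design: one machine, fed either a message tape or a cipher tape, performs both operations.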
Vernam intended for the tape to be used in loops, but this repetition made it equivalent to a
Vigenère cipher. An engineer, Lyman F. Morehouse, had the idea of using two loops of tape, with
one of them being one character longer than the other. The characters produced by combining the
pairs of characters (one from each tape loop) would be used as the key. Although he knew such
a sequence could not be truly random, he did get much longer key segments (the product of the
two lengths) than he would if each tape had been used individually (the sum of the two lengths).44
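Morehouse’s gain can be checked quickly. Two loops of coprime lengths m and n, combined character by character, repeat only after lcm(m, n) = mn characters, while only m + n characters of tape need to be stored; the lengths 999 and 1000 below are an illustrative choice, not from the source.

```python
from math import gcd

# Two key tapes of lengths m and n, combined character by character, repeat
# together only after lcm(m, n) characters -- the product, when m and n are
# coprime (as consecutive integers always are).
def combined_period(m, n):
    return m * n // gcd(m, n)  # least common multiple

m, n = 999, 1000
print(combined_period(m, n))  # 999000 key characters before the key repeats
print(m + n)                  # only 1999 characters of stored tape
```

This also shows why the footnote’s caution matters: if the lengths share a factor, the period drops from the product to the (smaller) least common multiple.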
Mauborgne’s contribution was to recognize, in 1918, as the Vernam cipher was evolving, that
the system would be completely unbreakable if the key was random and never repeated. Years
later, Vernam promoted his device in a paper by describing how impractical it would be to imple-
ment this unbreakable cipher by hand:45
This method, if carried out manually, is slow and laborious and liable to errors. If
errors occur, such as the omission of one or more letters, the messages are difficult
for the recipient to decipher. Certain difficulties would also be involved in preparing,
copying and guarding long random keys. The difficulties with this system are such as
to make it unsuitable for general use, unless mechanical methods are used.
44 He also suggested that the two lengths could differ by some other amount, as long as that amount was not a
factor of the length of either tape. If this doesn’t sound quite right to you, good! There’s a much better way to
restrict the lengths of the two tapes if we want to gain a long period.
45 Vernam, Gilbert S., “Cipher Printing Telegraph Systems: For Secret Wire and Radio Telegraphic
Communications,” Journal of the American Institute of Electrical Engineers, Vol. 45, No. 2, February 1926, pp.
109–115, p. 113 cited here.
Vernam also noted in this paper, “This cipher was demonstrated before the delegates to the
Preliminary International Communications Conference in October, 1920.”46 In today’s world
a government employee would not be demonstrating the latest in encryption technology at an
international conference, or publishing it in an open journal!
Figure 2.23 Five-unit printing telegraph code. This is sometimes referred to as a Baudot code,
after its French inventor J.M.E. Baudot, who is also the source for the term baud. (Image drawn
by and courtesy of Sam Hallas.)
46 Vernam, Gilbert. S., “Cipher Printing Telegraph Systems: For Secret Wire and Radio Telegraphic
Communications,” Journal of the American Institute of Electrical Engineers, Vol. 45, No. 2, February 1926, pp.
109–115, p. 115 cited here.
If the enciphered text in Table 2.12 arose from a one-time pad, we cannot read it without the
key; however, it is believed that Mauborgne didn’t yet have the idea of the one-time pad in 1915.
On the other hand, the known systems from this year (or earlier) shouldn’t be too hard to crack
with modern attacks and technology. So, why don’t we have a plaintext yet? My best guess is that
it used a wheel cipher of the sort described in Section 4.2.48
47 Kruh, Louis, “A 77-Year Old Challenge Cipher,” Cryptologia, Vol. 17, No. 2, April 1993, pp. 172–174.
48 See Bauer, Craig P., Unsolved! The History and Mystery of the World’s Greatest Ciphers from Ancient Egypt to
Online Secret Societies, Princeton University Press, Princeton, New Jersey, 2017, Chapter 8, for a much deeper
look at this unsolved cipher.
49 Marks, Leo, Between Silk and Cyanide: A Codemaker’s War, 1941-1945, The Free Press, New York, 1998, p. 250.
Figure 2.24 Leo Marks holding a silk one-time pad. (From Sara Krulwich/The New York Times/Redux Pictures. With permission.)
In that instant, Bellovin had a publishable result. It could’ve been the easiest paper he’d ever
written. In fact, I saw a comment posted online that was critical of stumbling over a codebook
being considered scholarly research. Adacrypt wrote “‘Trainspotting’ old links to defunct cryp-
tography is hardly to be called crypto research.”52 David Eather responded with “And like you say
it is really nothing… so it is strange you couldn’t do it for yourself.”53 I mention this exchange to
emphasize the point that chance discoveries are usually made by people who are looking and not
by people who are busy criticizing others. There’s plenty to do in Washington, DC. Bellovin didn’t
have to spend his time at the Library of Congress looking at code books. Because he was looking,
though, he was much more likely to make such a discovery. And, having made the discovery, he
brought the full force of his research abilities to bear on it. Not just anyone could have done this. In
fact, Bellovin crafted a 20-page paper with 78 references! It impressed the editor of Cryptologia
enough to earn it the position of lead article in the July 2011 issue. It was also covered in The
New York Times.54 Finds like Bellovin’s are sometimes among the earned rewards of a life spent
passionately engaged in research. A paper by a pair of retired NSA historians detailing two more
examples of this sort of reward, as well as giving tips on how to increase the likelihood of such
finds, is Hanyok, Robert J. and Betsy Rohaly Smoot, “Sources and Methods: Contingency and its
Role in Researching Records of Cryptologic History – A Discussion and Some Lessons to Apply
for Future Research,” Cryptologia, Vol. 44, No. 6, November 2020.
Dunin, Elonka, Elonka’s Kryptos Page, http://elonka.com/kryptos/. This page, focused on Kryptos, is a great
resource for anyone wanting to learn more about the mysterious sculpture and its creator.
Friedman, William F., “Jacques Casanova de Seingalt, Cryptologist,” Casanova Gleanings, Vol. 4, 1961, pp.
1–12.
Gardner, Martin, “Mathematical Games, A New Kind of Cipher that Would Take Millions of Years to
Break,” Scientific American, Vol. 237, No. 2, August 1977, pp.120–124. This important paper will be
revisited in Section 15.4. For now, the relevant part is inset on page 124 of this article and concerns
the decipherment of the Vigenère cipher sent to Poe.
Grošek, Otokar, Eugen Antal, and Tomáš Fabšič, “Remarks on Breaking the Vigenère Autokey Cipher,”
Cryptologia, Vol. 43, No. 6, November 2019, pp. 486–496.
Hamilton, Michael and Bill Yankosky, “The Vigenère Cipher with the TI-83,” Mathematics and Computer
Education, Vol. 38, No. 1, Winter 2004. Hamilton was an undergraduate at North Carolina Wesleyan
College when this paper was written.
Kaeding, Thomas, “Slippery Hill-climbing Technique for Ciphertext-only Cryptanalysis of Periodic
Polyalphabetic Substitution Ciphers,” Cryptologia, Vol. 44, No. 3, May 2020, pp. 205–222.
Levy, Steven, “Mission Impossible: The Code Even the CIA Can’t Crack,” Wired, Vol. 17, No. 5, April 20,
2009, http://www.wired.com/science/discoveries/magazine/17-05/ff_kryptos.
Lipson, Stanley H. and Francine Abeles, “The Key-Vowel Cipher of Charles L. Dodgson,” Cryptologia, Vol.
15, No. 1, January 1991, pp. 18–24. This paper describes a cipher invented by the famed author of
Alice in Wonderland (under the pseudonym Lewis Carroll) that turns out to be of the Vigenère type
with nulls inserted systematically.
McCloy, Helen, Panic, William Morrow, New York, 1944. This is a novel. The author thought she found a
nice solution to the problem of having sufficiently mixed alphabets and a hard-to-guess key without
requiring the users to write any of it down; that is, the key and alphabets are easy to generate when
needed. The work is of no value to cryptographers but might interest literature buffs. The cipher seems
to have been the motivation for the novel.
Park, Seongmin, Juneyeun Kim, Kookrae Cho, and Dae Hyun Yum, “Finding the Key Length of a Vigenère
Cipher: How to Improve the Twist Algorithm,” Cryptologia, Vol. 44, No. 3, May 2020, pp. 197–204.
Schwartz, John and Jonathan Corum, “This Sculpture Holds a Decades-Old Mystery. And Now, Another
Clue,” The New York Times, January 29, 2020, available online at https://www.nytimes.com/interac-
tive/2020/01/29/climate/kryptos-sculpture-final-clue.html.
Scryer, “The Kryptos Sculpture Cipher: A Partial Solution,” The Cryptogram, Vol. 65, No. 5, September–
October 1999, pp. 1–7. Scryer is the ACA pen name used by James J. Gillogly.
Scryer, “Kryptos Clue,” The Cryptogram, Vol. 77, No. 1, January–February 2011, p. 11. Scryer is the ACA
pen name used by James J. Gillogly.
Tuckerman, Bryant, A Study of the Vigenère-Vernam Single and Multiple Loop Enciphering Systems, IBM
Research Report RC-2879, T. J. Watson Research Center, Yorktown Heights, New York, May 14,
1970. This 115-page report shows such systems to be insecure.
de Vigenère, Blaise, Traicté des Chiffres, ou, Secretes Manieres D’escrire, Abel l’Angelier, Paris, 1586.
Vigenère, Cryptool – Online, https://www.cryptool.org/en/cto/ciphers/vigenere. This website allows users
to encipher and decipher using Vigenère. It’s part of a much larger (and still growing) site that covers
many ciphers and includes online cryptanalysis programs.
Winkel, Brian J., “Casanova and the Beaufort Cipher,” Cryptologia, Vol. 2, No. 2, April 1978, pp. 161–163.
Friedman, William F., Methods for the Solution of Running-Key Ciphers, Publication No. 16, Riverbank
Laboratories, Geneva, Illinois, 1918. Friedman showed that the U.S. Army’s field cipher was insecure
in this paper, even for short messages. This was reprinted together with other Friedman papers in
Friedman, William, F. The Riverbank Publications, Vol. 1, Aegean Park Press, Laguna Hills, California,
1979. As the original printing only consisted of 400 copies, I suggest looking for the reprint instead.
Griffing, Alexander, “Solving XOR Plaintext Strings with the Viterbi Algorithm,” Cryptologia, Vol. 30, No.
3, July 2006, pp. 257–265. This paper attacks running key ciphers where word spacing is preserved in
the message and the key.
Griffing, Alexander, “Solving the Running Key Cipher with the Viterbi Algorithm,” Cryptologia, Vol. 30,
No. 4, October 2006, pp. 361–367. This paper dramatically improves upon the results in Bauer and
Tate and in Bauer and Gottloeb, to the point that these papers ought to be burned.
On One-Time Pads
Note: There is some overlap between papers on generating one-time pads and papers on random number
generators. Papers that fall in the overlap are only referenced in this book in Chapter 19, which is on stream
ciphers. Stream ciphers serve as approximations of the one-time pad without having the problems associated
with true one-time pads.
Anon., “Automatic Code Messages,” in “Science News” section of Science, New Series, Vol. 63, No. 1625,
February 19, 1926, pp. x, xii.
Anon., “A Secret-Code Message Machine,” The Literary Digest, Vol. 89, No. 3, Whole No. 1878, April
17, 1926, p. 22. This article, after an introductory paragraph, reproduces text from Science Service’s
Daily News Bulletin: “The new machine was described by G. S. Vernam, engineer of the American
Telegraph and Telephone Company, who stated that it had been developed for the use of the Signal
Corps of the U.S. Army during the war, but until recently it had been kept secret.” Kept secret? Why?
It’s not like the Signal Corps was actually using it or anything…
Bellovin, Steven M., “Frank Miller: Inventor of the One-Time Pad,” Cryptologia, Vol. 35, No. 3, July 2011,
pp. 203–222.
Benson, Robert L., The Venona Story, Center for Cryptologic History, National Security Agency, Fort
George G. Meade, Maryland, 2001, available online at https://www.nsa.gov/Portals/70/documents/
about/cryptologic-heritage/historical-figures-publications/publications/coldwar/venona_story.pdf.
Benson, Robert Louis and Michael Warner, editors, Venona: Soviet Espionage and the American Response,
1939–1957, NSA/CIA, Washington DC, 1996. The bulk of this book is reproductions of formerly
classified documents. The preface is nice, but the rest is dry. Although the reproductions are of value
to historians, casual readers will prefer the book by John Earl Haynes and Harvey Klehr referenced
below.
Bury, Jan, “Breaking Unbreakable Ciphers: The Asen Georgiyev Spy Case,” Cryptologia, Vol. 33, No. 1,
2009, pp. 74–88.
Bury, Jan, “From the Archives: Breaking OTP Ciphers,” Cryptologia, Vol. 35, No. 2, April 2011, pp. 176–188.
Filby, P. William, “Floradora and a Unique Break into One-Time Pad Ciphers,” Intelligence and National
Security, Vol. 10, No. 3, July 1995, pp. 408–422.
Foster, Caxton C., “Drawbacks of the One-Time Pad,” Cryptologia, Vol. 21, No. 4, October 1997, pp.
350–352. This paper briefly addresses the matter of determining the random sequence used as the
key. If it is not truly random, then the cipher ceases to be unbreakable. Algorithms run on traditional
computers are never truly random. To get true random numbers, one needs a quantum computer.
Haynes, John Earl and Harvey Klehr, Venona: Decoding Soviet Espionage in America, Yale University Press,
New Haven, Connecticut, 1999. Although focused on the history, this book has a chapter (“Breaking
the Code”) that gives more detail on the cryptology than other works.
Marks, Leo, Between Silk and Cyanide: a Codemaker’s War, 1941–1945. The Free Press, New York, 1998.
Marks, a cryptographer for the Special Operations Executive (SOE), writes about his experiences in
a very entertaining manner. (After the war, but before this volume, he was a screenwriter, so he can
write!) The back cover sports blurbs from David Kahn and Martin Scorsese.
Mauborgne, Ben P., Military Foundling, Dorrance and Company, Philadelphia, Pennsylvania, 1974. A mix
of fact and fiction, this novel is of interest to us mainly for its dedication:
This book is respectfully dedicated to the memory of my illustrious, talented and versatile
father, Major General Joseph O. Mauborgne, chief signal officer of the United States Army
from 1937 to 1941; scientist, inventor, cryptographer, portrait painter, etcher, fine violin maker
and author.
Military history has recorded that he was the first person to establish two-way wireless com-
munication between the ground and an airplane in flight; that he invented an unbreakable
cipher; and that he was “directly responsible” for probably the greatest feat of cryptanalysis in
history – the breaking of the Japanese PURPLE code – more than a year prior to the sneak
attack on Pearl Harbor.
Miller, Frank, Telegraphic Code to Insure Privacy and Secrecy in the Transmission of Telegrams, Charles M.
Cornwell, New York, 1882.
Phillips, Cecil, “The American Solution of a German One-Time-Pad Cryptographic System,” Cryptologia,
Vol. 24, No. 4, October 2000, pp. 324–332.
Redacted, “A New Approach to the One-Time Pad,” NSA Technical Journal, Vol. 19, No. 3, Summer 1974.
The title of this paper was released by the National Security Agency as part of a much redacted index
to this journal. In fact, the author’s name was redacted. But we do know it comes somewhere between
Gurin, Jacob and Jacobs, Walter. Any guesses?
Rubin, Frank, “One-Time Pad Cryptography,” Cryptologia, Vol. 20, No. 4, October 1996, pp. 359–364.
This paper attempts to make the one-time pad more practical.
Shannon, Claude, “Communication Theory of Secrecy Systems,” The Bell System Technical Journal, Vol. 28,
No. 4, October 1949, pp. 656–715. Shannon shows that the one-time pad is unbreakable and any-
thing that is unbreakable must be a one-time pad. Despite his having found this result over 70 years
ago, one needn’t look hard for other cipher systems that are billed as being unbreakable.
Vernam, Gilbert S., “Cipher Printing Telegraph Systems: For Secret Wire and Radio Telegraphic
Communications,” Journal of the American Institute of Electrical Engineers, Vol. 45, February 1926,
pp. 109–115.
Vinge, Vernor, A Fire Upon the Deep, St. Martin’s Press, New York, 1993. A one-time pad is used in this
science fiction novel, but it must first be pieced together by three starship captains.
Yardley, Herbert O., “Are We Giving Away Our State Secrets?,” Liberty, Vol. 8, December 19, 1931, pp.
8–13. Yardley argued that America ought to be making use of the one-time pad.
Chapter 3
Transposition Ciphers
The 21st century will see transposition regain its true importance.
‒ Friedrich L. Bauer1
1 Bauer, Friedrich L., Decrypted Secrets: Methods and Maxims of Cryptology, second edition, Springer, Berlin,
Germany, 2000, p. 100.
2 Quote from a soldier in the Middle East, as heard on the nightly news.
The “fence” needn’t be limited to two tiers. We could encipher the same message as follows.
A W K T N W L L
N E H O S S H O G T I I L E Y
Y N O O A U E R W Y C W S R D E
O L T W A E U I
ATTACK
DAMASC → ADUWT ASNTM AKAAT ECSDT KCAW
USATDA
WNKETW
Note: The last four letters in the message are only there to complete the rectangle. It is common
to see the letter X used for this purpose, but it is better to use more frequent letters, making the
cryptanalyst’s job a bit harder.
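The rectangle encipherment above can be sketched in a few lines: write the padded message into rows of the chosen width, read the ciphertext off by columns, and reverse the process to decipher. The function names are mine, chosen for illustration.

```python
# Columnar transposition as in the ATTACK DAMASCUS example: write the padded
# message into a rectangle row by row, then read the ciphertext off by columns.
def encipher_rectangle(message, width):
    rows = [message[i:i + width] for i in range(0, len(message), width)]
    return "".join("".join(row[c] for row in rows) for c in range(width))

def decipher_rectangle(ciphertext, width):
    # Rebuild the columns, then read the rectangle back out row by row.
    height = len(ciphertext) // width
    cols = [ciphertext[i:i + height] for i in range(0, len(ciphertext), height)]
    return "".join("".join(col[r] for col in cols) for r in range(height))

ciphertext = encipher_rectangle("ATTACKDAMASCUSATDAWNKETW", 6)
print(ciphertext)                          # ADUWTASNTMAKAATECSDTKCAW
print(decipher_rectangle(ciphertext, 6))   # ATTACKDAMASCUSATDAWNKETW
```

Note that no key beyond the rectangle’s width is involved, which is exactly why the cryptanalyst’s search space is just the set of divisors of the ciphertext length.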
The rectangle can have any dimension. If a cryptanalyst suspects this manner of encipherment,
the number of possible cases to consider depends on the number of factors of the ciphertext’s
length. For example, consider the following intercept:
With 72 letters of ciphertext, the enciphering rectangle could be 2 × 36, 3 × 24, 4 × 18, 6 ×
12, 8 × 9, 9 × 8, 12 × 6, 18 × 4, 24 × 3, or 36 × 2. If forced to look at each of these possibilities
individually, it would be wise to start with the dimensions closest to a square and work outward, but
we have a nice technique to eliminate this tedium.
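Enumerating the candidate rectangles is itself a one-liner, shown here as a small sketch for a 72-letter intercept:

```python
# Every possible enciphering rectangle for a ciphertext of a given length:
# each divisor pair (rows, columns) with rows * columns = length, excluding
# the trivial single-row and single-column cases.
def rectangles(length):
    return [(r, length // r) for r in range(2, length) if length % r == 0]

print(rectangles(72))
# [(2, 36), (3, 24), (4, 18), (6, 12), (8, 9), (9, 8), (12, 6), (18, 4), (24, 3), (36, 2)]
```

The output matches the ten possibilities listed in the text.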
A probable word search often works nicely, and because every message has some sort of context
from which to guess a crib, this is fair. Suppose we can guess the word WHIP appears in the mes-
sage. Reproducing the ciphertext with the appropriate letters underlined and boldfaced reveals a
pattern. (We only need to consider characters between the first W and the last P).
Looking at the position of each of these letters and the distances between them:
W1 16
W2 20
H1 25 H1 – W1 = 9 H1 – W2 = 5
H2 29 H2 – W1 = 13 H2 – W2 = 9
I1 33 I1 – H1 = 8 I1 – H2 = 4
I2 34 I2 – H1 = 9 I2 – H2 = 5
I3 38 I3 – H1 = 13 I3 – H2 = 9
P1 43 P1 – I1 = 10 P1 – I2 = 9 P1 – I3 = 5
P2 47 P2 – I1 = 14 P2 – I2 = 13 P2 – I3 = 9
The only number that shows up as a distance between each pair of letters in WHIP is 9. It may look
like 13 works, but chaining the letters to get through the probable word is not possible, as we would
have to use two different Hs. Reproducing the table and boldfacing the 9s for convenience, we have:
W1 16
W2 20
H1 25 H1 – W1 = 9 H1 – W2 = 5
H2 29 H2 – W1 = 13 H2 – W2 = 9
I1 33 I1 – H1 = 8 I1 – H2 = 4
I2 34 I2 – H1 = 9 I2 – H2 = 5
I3 38 I3 – H1 = 13 I3 – H2 = 9
P1 43 P1 – I1 = 10 P1 – I2 = 9 P1 – I3 = 5
P2 47 P2 – I1 = 14 P2 – I2 = 13 P2 – I3 = 9
If we start with W1 to form WHIP, we then have to use H1, as it is the only H 9 units away from W1.
This eliminates the ambiguity in the I section. Because we had to use H1, we must also use I2 and, it
follows from there, P1. Thus, we have a consistent solution. The 9s indicate that nine rows were used.
Similarly, if we start with W2 to form WHIP, we then have to use H2, as it is the only H 9 units
away from W2. This eliminates the ambiguity in the I section. Because we had to use H2, we must
also use I3 and, it follows from there, P2. Thus, we have another consistent solution. Apparently
the word WHIP appeared twice in this message. Again, the 9s indicate that nine rows were used.
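This search mechanizes easily. The intercept is not reprinted here, so the string below is reconstructed from the deciphering grid that follows; the random filler letters contribute a few stray occurrences of W and H beyond those tabulated above, but a search for equally spaced chains still isolates exactly the two solutions with spacing 9.

```python
# Find all chains W < H < I < P with a common spacing; the spacing, 9 here,
# gives the number of rows in the enciphering rectangle.
ct = ('YLAOHTEROOYNNEOWLNUWFGSLHERCHOUTIISD'
      'AIRNAKPMHNPSTRECAWOAOITTHNCNMLLSHASU')
pos = {ch: [i + 1 for i, c in enumerate(ct) if c == ch] for ch in 'WHIP'}
chains = [(w, h, i, p)
          for w in pos['W'] for h in pos['H']
          for i in pos['I'] for p in pos['P']
          if h - w == i - h == p - i > 0]
print(chains)
# [(16, 25, 34, 43), (20, 29, 38, 47)]
```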
To decipher, we write the ciphertext as columns and read the message out in rows.
Y O U C A N O N
L Y W H I P A M
A N F O R S O L
O N G U N T I L
H E S T A R T S
T O L I K E T H
E W H I P C H A
R L E S M A N S
O N R D H W C U
Message: YOU CAN ONLY WHIP A MAN FOR SO LONG UNTIL HE STARTS TO
LIKE THE WHIP – CHARLES MANSON
A few random letters were used to round out the block. This approach will work when the
probable word appears on a single line of the enciphering block.
A keyword or password that can be guessed is a poor choice. See Section 16.5 of the present book for examples of poorly
chosen passwords.
Transposition Ciphers ◾ 111
We’d like to read the text off by columns, taking them in alphabetical rather than numerical order,
but there are two As. No problem. Start with the first A, then move on to the second A, and then
take the rest of the columns in alphabetical order. Our ciphertext is
ETNR CEOY EBGN IITT RENN CHIA YGFE SSNY.
In the examples above, we’d be wise to express the final ciphertexts in groups of five letters (the
standard) so as not to reveal the number of rows in our rectangle of text.
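A sketch of keyword columnar transposition, with repeated key letters taken left to right as described above; the keyword and message here are my own illustrations, not the book's example.

```python
# Columnar transposition: write the message in rows as wide as the key,
# then read off whole columns in alphabetical order of the key letters.
def keyword_columnar(plaintext, key):
    cols = [plaintext[i::len(key)] for i in range(len(key))]
    # ties between repeated key letters are broken left to right
    order = sorted(range(len(key)), key=lambda i: (key[i], i))
    return ''.join(cols[i] for i in order)

print(keyword_columnar('MEETATTHEUSUALPLACEXYZAB', 'PAPAYA'))
# EHLXTULZTUCBMTAEEEPYASAA
```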
Which possibility looks the best to you? The first choice can almost certainly be eliminated
because of the TALKBD in the second row and the third seems unlikely due to TIFRTT in row
nine. Going with the second possibility, and working on the left hand side, we see that there is no
I available in the necessary position to form IT’S OKAY, but we do have two columns with an
A in the position needed to make THAT’S OKAY. We try each.
OTHERPE PTHERPE
ETALKED BTALKED
ESETHIN SSETHIN
ATSOKAY ATSOKAY
EVERHEA DVERHEA
FRUTTIB PRUTTIB
NETHERE AETHERE
TTIFRUT ITIFRUT
TLERICH RLERICH
Consider the fifth rows. Which looks better, EVERHEA or DVERHEA? It seems that the first
possibility is the better choice. Other rows support this selection. To continue forming THAT’S
OKAY, we have two choices for a column with a T in the necessary position and two choices for
a column with an H in the necessary position. Thus, there are four possibilities, altogether. We
compare these below.
OOOTHERPE OEOTHERPE MOOTHERPE MEOTHERPE
HAETALKED HVETALKED AAETALKED AVETALKED
TGESETHIN THESETHIN TGESETHIN THESETHIN
THATSOKAY THATSOKAY THATSOKAY THATSOKAY
YREVERHEA YUEVERHEA OREVERHEA OUEVERHEA
TYFRUTTIB TIFRUTTIB TYFRUTTIB TIFRUTTIB
BSNETHERE BONETHERE OSNETHERE OONETHERE
OTTTIFRUT OUTTIFRUT TTTTIFRUT TUTTIFRUT
LATLERICH LTTLERICH IATLERICH ITTLERICH
The first choice is comical with three consecutive Os in the first row. The second isn’t any better,
and the third only makes sense if the message was composed by a cow. Thus, we go with the fourth.
MEOTHERPE
AVETALKED
THESETHIN
THATSOKAY
OUEVERHEA
TIFRUTTIB
OONETHERE
TUTTIFRUT
ITTLERICH
The deciphering should be moving along faster now. Can you guess what letter comes just
before AVETALKED or OUEVERHEA or ITTLERICH? The strangest rows are even beginning to
make sense now. TIFRUTTIB and TUTTIFRUT may look odd by themselves, but taken together,
anyone familiar with early rock and roll music ought to recognize them.
Also, as the unused columns dwindle, the placements become easier, because there are fewer
possibilities. Thus, at this point, you should have no trouble completing the rectangle to get
SOMEOTHERPEOPL
EHAVETALKEDABO
UTTHESETHINGSB
UTTHATSOKAYHAV
EYOUEVERHEARDT
UTTIFRUTTIBYPA
TBOONETHERESAL
SOTUTTIFRUTTIB
YLITTLERICHARD
96: UEIMS NRFCO OBISE IOMRO POTNE NANRT HLYME PPROM TERSI
HEELT NBOFO LUMDT TWOAO ENUUE RMDIC SRILA SSYHP PRSGI IOSIT B
The 96 tells us that there are that many letters in the ciphertext. It serves as a useful check that
nothing was accidentally dropped or repeated. To decipher, we need a rectangle whose dimensions
multiply together to yield 96. Although we might not try the right dimensions the first time, we
eventually get around to considering 8 rows and 12 columns.
5 Mahon, Tom and James J. Gillogly, Decoding the IRA, Mercier Press, Cork, Ireland, 2008.
Figure 3.2 IRA ciphertexts from 1927. (From Mahon, Thomas and James J. Gillogly, Decoding
the IRA, Mercier Press, Cork, Ireland, 2008, p. 11. With permission)
1 2 3 4 5 6 7 8 9 10 11 12
U C O E Y T L U O D S G
E O M N M E T M E I S I
I O R A E R N D N C Y I
M B O N P S B T U S H O
S I P R P I O T U R P S
N S O T R H F W E I P I
R E T H O E O O R L R T
F I N L M E L A M A S B
Cutting the columns out and rearranging soon yields the result below.
2 3 1 7 10 5 9 8 12 4 6 11
C O U L D Y O U G E T S
O M E T I M E M I N E S
O R I N C E N D I A R Y
B O M B S P U T O N S H
I P S O R P U T S R I P
S O N F I R E W I T H P
E T R O L O R O T H E R
I N F L A M M A B L E S
The plaintext, with one typo, may then be read off as
COULD YOU GET SOME TIME MINES OR INCENDIARY BOMBS PUT ON SHIPS
OR PUT SRIPS ON FIRE WITH PETROL OR OTHER INFLAMMABLES
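The whole decipherment can be checked in a few lines (my own sketch; the ciphertext below has the apparent O/C misprint in its fifteenth group corrected so that INCENDIARY comes out intact).

```python
# Write the 96 letters into the 8x12 grid by columns, then read the rows
# with the columns taken in the keyed order 2 3 1 7 10 5 9 8 12 4 6 11.
ct = ('UEIMSNRFCOOBISEIOMROPOTNENANRTHLYMEPPROMTERSIHEE'
      'LTNBOFOLUMDTTWOAOENUUERMDICSRILASSYHPPRSGIIOSITB')
rows, order = 8, [2, 3, 1, 7, 10, 5, 9, 8, 12, 4, 6, 11]
cols = [ct[i * rows:(i + 1) * rows] for i in range(12)]
plain = ''.join(cols[j - 1][r] for r in range(rows) for j in order)
print(plain)
```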
Some IRA transposition ciphers were made harder to crack by inserting columns of nulls.
During World War II, Hong Kong was attacked by the Japanese at about the same time as
Pearl Harbor. Royal Air Force (RAF) Pilot Donald Hill was captured and kept a journal of his
experiences between December 7, 1941 and March 31, 1942, even though this was forbidden by
the Ministry of War in London, who feared intelligence might be relayed to the enemy through
such writings. Hill fooled his Japanese captors by disguising his entries as mathematical tables (he
converted the letters to numbers prior to transposing). The story of Donald Hill and his love, as
well as the cipher, its cryptanalysis (by Philip Aston) and the story the plaintext revealed are all
detailed in Andro Linklater’s The Code of Love.6 A page of Hill’s cipher is reproduced in Figure 3.3.
Various forms of transposition were used during World War II by Britain’s Special Operations
Executive (SOE), in addition to the one-time pads mentioned in Section 2.11. The best form,
double transposition, is described shortly.
More recently, investigators found a tremendous amount of enciphered text in the Unabomber’s
shack. The enciphering algorithm included numerical substitutions for letters, punctuation, com-
mon combinations of letters, and some words, followed by extensive transposition. Could a genius
mathematician, such as the Unabomber, have come up with a pencil and paper cipher that could
withstand the scrutiny of our nation’s best cryptanalysts? We may never know, as the key to the
system was also found in the shack. FBI cryptanalyst Jeanne Anderson detailed the system in a 2015
paper.7 The original ciphers were sold at a government auction, along with many other items that
were once property of the Unabomber. The funds raised went to his victims and their families.8
www.newsytype.com/7120-unabomber-auction/.
Figure 3.3 A page from the enciphered diary of POW Donald Hill. (Thanks to Phillip Aston, the
mathematician who cracked the diary, for providing this image.)
3.4 Anagrams
If the transposition key is very long and random (not arising from valid words) compared to the
length of the message, this system can be difficult to break. In particular, if the length of the trans-
position key equals the length of the message, the cryptanalyst is essentially playing Scrabble with
a large number of tiles and may be able to form several meaningful solutions with no statistical
reason for favoring one over another. A rearrangement of letters is also known as an anagram.9 Both
Galileo and Newton concealed discoveries through anagramming; however, they did not scramble
their messages in a systematic way, so they could not be recovered as easily as in the example above.
William Friedman also used an anagram to state his opinion on the Voynich manuscript.
Example 5
Galileo Galilei
Haec immature a me iam frustra leguntur O. Y.
Giuliano de Medici received word of a discovery made by Galileo in the anagram form given
above. As you can see, Galileo constructed his anagram such that another sentence was formed.
However, his new sentence didn’t use all the letters of the original. He had an O and Y left over
and simply placed them at the end. The translation is “These unripe things are now read by me in
vain.” It was meant to disguise
cynthiae figures aemulatur mater amorum
which translates as “The mother of love [Venus] imitates the phases of Cynthia [the moon].” This
was revealed by Galileo on January 1, 1611.
9 Some reserve this term for when the letters of a word are rearranged to make another word. I use it more gener-
ally to indicate any rearrangement.
Example 6
Christian Huygens
a7c5d1e5g1h1i7l4m2n9o4p2q1r2s1t5u5
In this alphabetized transposition, the exponents indicate how many of each letter appeared in the
original message. The decipherment is
annulo cingitur tenui plano, nusquam cohaerente, ad
eclipticam inclinato
which translates as “[Saturn] is girdled by a thin flat ring, nowhere touching, inclined to the ecliptic.”
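Huygens' inventory is easy to verify: the exponents are just the letter counts of the concealed sentence. A quick check, in code of my own:

```python
# Tally the letters of the hidden Latin sentence and rebuild the inventory.
from collections import Counter

plain = ('annulo cingitur tenui plano nusquam cohaerente ad '
         'eclipticam inclinato')
counts = Counter(ch for ch in plain if ch.isalpha())
inventory = ''.join(f'{ch}{counts[ch]}' for ch in sorted(counts))
print(inventory)  # a7c5d1e5g1h1i7l4m2n9o4p2q1r2s1t5u5
```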
Example 7
Isaac Newton
a7c2d2e14f2i7l3m1n8o4q3r2s4t8v12x1
Concerned with establishing priority, Newton included this alphabetized transposition in his sec-
ond letter to Leibniz (1677). It deciphers to
Data aequatione quodcumque fluentes quantitates involvente,
fluxiones invenire et vice versa.
which translates as “From a given equation with an arbitrary number of fluentes to find the flux-
iones, and vice versa.”
Example 8
William F. Friedman
I put no trust in anagrammatic acrostic ciphers, for they are
of little real value–a waste–and may prove nothing–Finis.
In reference to the Voynich Manuscript (an enciphered manuscript of well over 200 pages that no
one has been able to crack), Friedman wrote that he “has had for a number of years a new theory
to account for its mysteries. But not being fully prepared to present his theory in plain language,
and following the precedents of three more illustrious predecessors, he wishes to record in brief
the substance of his theory:” His theory followed in anagram form (see above), with the rearrange-
ment, like Galileo’s, making sense; however, he topped Galileo by not having any letters left over.
Three (incorrect) solutions were found by others for this anagram:10
William F. Friedman in a feature article arranges to use
cryptanalysis to prove he got at that Voynich Manuscript. No?
This is a trap, not a trot. Actually I can see no apt way
of unraveling the rare Voynich Manuscript. For me, defeat
is grim.
To arrive at a solution of the Voynich Manuscript, try
these general tactics: a song, a punt, a prayer. William F.
Friedman.
10 Zimansky, Curt A., “William F. Friedman and the Voynich Manuscript,” Philological Quarterly, Vol. 49, No.
4, 1970, pp. 433–442. This was reprinted in Brumbaugh, Robert S., editor, The Most Mysterious Manuscript,
Southern Illinois University Press, Carbondale and Edwardsville, Illinois, 1978, pp. 99–108.
11 Shannon, Claude, “Communication Theory of Secrecy Systems,” The Bell System Technical Journal, Vol. 28,
No. 4, October 1949, pp. 656–715. Page 695 cited here.
UBULO UDERI TPARS SSDNM ECMIE LEBOD RAAIN IRYHB LOLSR WIETM
LOSTD GUHIU EETAO HNDNI MNOIH HTLCI DSACE IEETR TLSAA NEURT
UDEYN LOHAR LATGS NSHEC EGWWN EGDYE YHINV NTUES NTLK
Another sort of double transposition was used in the second half of the 19th century by the anar-
chist enemies of the czars. This Nihilist cipher, rather than performing columnar transposition
twice, separately transposed columns and rows. Other Nihilist ciphers based on substitution were
in use as well.
William Friedman described double transposition as a “very excellent” method;12 however, he
did mention special cases in which it could fail. The biggest concern is that a careless encipherer
will forget to perform the second transposition. In this event, an intercepted message can be easily
read and will then provide the key for other messages that were correctly transposed twice. Other
special cases where solutions may be obtained include the following:
Friedman wrote this in 1923, before high speed computer attacks were feasible. A dictionary
attack on the keyword could now yield a solution even if the message does not satisfy any of the
three special conditions he mentioned.
However, we did not have to wait for the digital age, as a general attack was (secretly) published
in 1934.13 The author was Solomon Kullback, whom you will hear more about in Chapter 8.
One way to square the number of keys that would need to be checked using a dictionary attack
is to use two different words, one for the first transposition and another for the second. Although
the composition of two transpositions is a transposition, the “composite word” is not likely to be
in the dictionary. An attack for this improved version was also presented in Kullback’s paper.
Several years later, Britain’s Special Operations Executive (SOE) would use single and double
transposition with their agents in occupied Europe. Leo Marks tried to replace this system with
one-time pads, but the result was that both were then used, although not for the same messages!
On the other side (of the pond and the war), German operatives in Latin America used columnar
transposition until the spring of 1941.14
12 Friedman, William F., Elements of Cryptanalysis, Aegean Park Press, Laguna Hills, California, 1976, p. 103.
This is an easily available reprint edition of the May 1923 first edition, which was marked FOR OFFICIAL
USE ONLY and published by the Government Printing Office for the War Department.
13 Kullback, Solomon, General Solution for the Double Transposition Cipher, published by the Government
Printing Office for the War Department, Washington, DC, 1934. This was eventually declassified by the
National Security Agency and then quickly reprinted by Aegean Park Press, Laguna Hills, California, in 1980.
14 Bratzel, John F. and Leslie B. Rout, Jr., “Abwehr Ciphers in Latin America,” Cryptologia, Vol 7, No 2, April
1983, pp. 132–144.
communications in this manner. As a sample ciphertext, consider the following June 1, 1863 dis-
patch from Abraham Lincoln.15
GUARD ADAM THEM THEY AT WAYLAND BROWN FOR KISSING VENUS
CORESPONDENTS AT NEPTUNE ARE OFF NELLY TURNING UP CAN GET
WHY DETAINED TRIBUNE AND TIMES RICHARDSON THE ARE ASCERTAIN
AND YOU FILLS BELLY THIS IF DETAINED PLEASE ODOR OF LUDLOW
COMMISSIONER
GUARD indicates the size of the rectangle and what path to follow for the transposition. In
this case, to decipher, the words should be filled in by going up the first column, down the second,
up the fifth, down the fourth, and up the third. After GUARD, every eighth word is a null, and is
therefore ignored.16 We get
FOR VENUS LUDLOW RICHARDSON AND
BROWN CORRESPONDENTS OF THE TRIBUNE
WAYLAND AT ODOR ARE DETAINED
AT NEPTUNE PLEASE ASCERTAIN WHY
THEY ARE DETAINED AND GET
THEM OFF IF YOU CAN
ADAM NELLY THIS FILLS UP
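The whole decipherment can be sketched in a few lines (my own code, following the route and null rule described above; note that the ciphertext's own CORESPONDENTS misspelling survives into the grid).

```python
# Strip every eighth word (the nulls), then fill the 7x5 grid by the route:
# up column 1, down column 2, up column 5, down column 4, up column 3.
cipher = ('ADAM THEM THEY AT WAYLAND BROWN FOR KISSING VENUS CORESPONDENTS '
          'AT NEPTUNE ARE OFF NELLY TURNING UP CAN GET WHY DETAINED TRIBUNE '
          'AND TIMES RICHARDSON THE ARE ASCERTAIN AND YOU FILLS BELLY THIS '
          'IF DETAINED PLEASE ODOR OF LUDLOW COMMISSIONER').split()
words = [w for i, w in enumerate(cipher, start=1) if i % 8 != 0]

rows = 7
route = [(0, 'up'), (1, 'down'), (4, 'up'), (3, 'down'), (2, 'up')]
grid = [[None] * 5 for _ in range(rows)]
it = iter(words)
for col, direction in route:
    for r in (range(rows - 1, -1, -1) if direction == 'up' else range(rows)):
        grid[r][col] = next(it)

for row in grid:
    print(' '.join(row))
```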
If transposition were the only protection, we’d be able to read the message now; however, the
Union used an extra level of protection—code words:
VENUS = colonel
WAYLAND = captured
ODOR = Vicksburg
NEPTUNE = Richmond
ADAM = President of the U.S.
NELLY = 4:30 pm
Applying the code words (and removing the last words THIS FILLS UP, which are more
nulls used to fill out the block above) yields the original message:
For Colonel Ludlow,
Richardson and Brown, correspondents of the Tribune, captured
at Vicksburg, are detained at Richmond. Please ascertain why
they are detained and get them off if you can.
The President, 4:30 pm
This system completely stymied the Confederacy. It’s been claimed often in the cryptologic
literature that the Confederacy even resorted to printing some intercepted ciphertexts in southern
newspapers, along with solicitations for help! Despite this being a frequently repeated claim, all of
my attempts to find the actual solicitations only led to others seeking the same! Finally, in April
2012, I found an article that I believe solves the mystery of the missing Confederate ads. It was a
piece by Albert J. Myer of the Signal Corps titled “The Cypher of the Signal Corps.” It ran in the
October 7, 1865 edition of Army Navy Journal, p. 99. I know this is the wrong side, and after the
war, but read on! The piece is reproduced below.
15 Kahn, David, The Codebreakers, second edition, Scribner, New York, 1996, p. 215.
16 That is, we ignore KISSING, TURNING, TIMES, BELLY, and COMMISSIONER.
17 Boklan, Kent D. and Ali Assarpour, “How We Broke the Union Code (148 Years Too Late),” Cryptologia, Vol.
34, No. 3, July 2010, pp. 200–210.
Figure 3.4 “I was so exceptionally clever that everyone marveled at my extraordinary skill.”
(Girolamo Cardano, 1501–1576.) (From University of Pennsylvania, Rare Book & Manuscript
Library, Elzevier Collection, Elz D 786. Thanks to Regan Kladstrup for help with this image.)
Cardano is best remembered for being the first to publish (in Ars Magna, 1545) the solution to
the cubic equation. However, a controversy ensued immediately, as he had obtained the formula
from Tartaglia, after much harassment and a promise to keep it secret. Cardano is also credited
with authoring the first book on probability18 (Liber de Ludo Aleae) and making the first explicit
use of complex numbers in a calculation (Ars Magna again). On the personal side, things didn’t go
as well. In 1560, Cardano’s son Giambattista was jailed and executed following his conviction for
killing his wife. Another son is alleged to have had his ears cut off by Cardano for some offense!
Cardano himself was jailed in 1570. He was charged with heresy for casting the horoscope of
Jesus Christ and writing a book that praised Nero.19 His punishment was much less severe than
what others faced at the hands of the Inquisition; he spent 77 days in prison and additional time
under house arrest. Although he was banned from publishing (and even writing!) for a time, he authored 100
works over the course of his life, some of which consisted of more than one “book.”
Cardano had been plagued by health problems throughout his life. His list includes catarrh,
indigestion, congenital heart palpitations, hemorrhoids, gout, rupture, bladder trouble, insomnia,
plague, carbuncles, tertian fever, colic, and poor circulation, plus various other ailments. One is
tempted to add hypochondria. He seemed to delight in recounting his troubles: “My struggles,
my worries, my bitter grief, my errors, my insomnia, intestinal troubles, asthma, skin ailments
and even phtheiriasis, the weak ways of my grandson, the sins of my own son… not to mention
my daughter’s barrenness, the drawn out struggle with the College of Physicians, the constant
intrigues, the slanders, poor health, no true friends” and “so many plots against me, so many
tricks to trip me up, the thieving of my maids, drunken coachmen, the whole dishonest, cowardly,
traitorous, arrogant crew that it has been my misfortune to deal with.”20
18 Yet he didn’t know as much as he thought he did about the laws of chance, for in 1533 he was forced to pawn
his wife’s jewelry and some of his furniture to pay gambling debts.
19 If you’re curious, the horoscope was reprinted in Shumaker, Wayne, Renaissance Curiosa, Medieval &
Renaissance Texts & Studies Vol. 8, Center for Medieval & Renaissance Studies, Binghamton, New York,
1982, pp. 53–90. An introduction is included.
20 Muir, Jane, Of Men and Mathematics: The Story of the Great Mathematicians, Dodd, Mead and Company, New
York, 1965, p. 45.
Initially Cardano’s grille was used steganographically—that is, to conceal the presence of the
message. It consisted of a piece of paper (usually heavy to withstand repeated use) with rectangular
holes cut out at various locations. The encipherer placed this grille atop the paper and wrote his
message in the holes. The holes could be big enough for entire words or just individual letters. After
removing the grille, he would then attempt to fill in other words around the real message to create
a cover text that he hoped would fool any interceptor into thinking it was the real message. This
last step can be tricky; awkward phrasing and handwriting that doesn’t flow naturally can result,
tipping off the interceptor to the fact that a grille was used. Nevertheless, grilles saw use.
One with a single large hole, in the shape of an hourglass, was used in the American Revolution.
As described thus far, this is not a transposition scheme. The words or letters remain in their
original order. However, a slight twist turns this into a transposition device, called a turning grille.
To see how this works, consider the ciphertext shown in Figure 3.5.
By itself, the above looks like a crossword puzzle gone bad, or perhaps a word search puzzle,
but see what happens when we slide the grille in Figure 3.6 over it.
A message begins to take shape, with word spacing preserved (THE ABILITY TO DESTROY
A), but it seems to be incomplete. We rotate our grille 90° clockwise (from your perspective, not
the clock’s!) and place it down again to observe more of the message (Figure 3.7).
Our message continues (PLANET IS INSIGNIFICANT), but still doesn’t make much
sense, although it has meaning. We rotate our grille another 90° clockwise (Figure 3.8) to get
more of the message (COMPARED TO THE POWER O).
We turn the grille yet another 90° clockwise (last time - Figure 3.9) to get the final part of the
message (F THE FORCE – DARTH VADER).
The full message is now revealed to be a quote from the Dark Lord of the Sith:
THE ABILITY TO DESTROY A PLANET IS INSIGNIFICANT
COMPARED TO THE POWER OF THE FORCE - DARTH VADER
A close look at the original ciphertext shows there are four letters that were not used. Punching
one more hole in the grille would allow us to make use of those four extra positions, if we needed
them. Instead, these were filled in with nulls. Actually, the four null letters can be anagrammed
to continue the theme of the message.
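The figures are not reproduced here, so the following is a minimal turning-grille sketch of my own: a 4 × 4 grille with one hole chosen from each rotation orbit, so that four quarter-turns expose every cell exactly once. The grille pattern and message are illustrations, not the book's example.

```python
N = 4
HOLES = [(0, 0), (0, 1), (0, 2), (1, 1)]  # one cell from each rotation orbit

def rotate(holes):
    # rotate the hole coordinates 90 degrees clockwise; holes are always
    # used top-to-bottom, left-to-right, so keep them sorted
    return sorted((c, N - 1 - r) for r, c in holes)

def encipher(plaintext):
    grid = [[''] * N for _ in range(N)]
    holes, i = HOLES, 0
    for _ in range(4):                 # four positions of the grille
        for r, c in holes:
            grid[r][c] = plaintext[i]
            i += 1
        holes = rotate(holes)
    return ''.join(''.join(row) for row in grid)

def decipher(ciphertext):
    grid = [list(ciphertext[i * N:(i + 1) * N]) for i in range(N)]
    out, holes = [], HOLES
    for _ in range(4):
        out.extend(grid[r][c] for r, c in holes)
        holes = rotate(holes)
    return ''.join(out)

ct = encipher('USETHEFORCELUKES')
print(ct, decipher(ct))
```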
Turning grilles were used as recently as World War I by the Germans, although only for a
period of four months. French cryptanalysts learned to break these, and the Germans moved on
to a better system, which will be examined in Section 5.2.
Most modern ciphers use both substitution and transposition. Some of them are detailed in
the second half of this volume.
Eyraud, Charles, Precis de Cryptographie Moderne, Editions Raoul Tari, Paris, 1953. An attack on double
transposition is presented here.
Friedman, William F., Formula for the Solution of Geometrical Transposition Ciphers, Riverbank Laboratories
Publication No. 19, Geneva, Illinois, 1918.
Friedman, William F. and Elizebeth S. Friedman, “Acrostics, Anagrams, and Chaucer,” Philological
Quarterly, Vol. 38, No. 1, January 1959, pp. 1–20.
Kullback, Solomon, General Solution for the Double Transposition Cipher, published by the Government
Printing Office for the War Department, Washington, DC, 1934. This was eventually declassified
by the National Security Agency and then quickly reprinted by Aegean Park Press, Laguna Hills,
California, in 1980.
Lasry, George, Nils Kopal, and Arno Wacker, “Solving the Double Transposition Challenge with a Divide-
and Conquer Approach,” Cryptologia, Vol. 38, No. 3, July 2014, pp. 197–214.
Lasry, George, Nils Kopal, and Arno Wacker, “Cryptanalysis of Columnar Transposition Cipher with Long
Keys,” Cryptologia, Vol. 40, No. 4, July 2016, pp. 374–398.
Leighton, Albert C., “Some Examples of Historical Cryptanalysis,” Historia Mathematica, Vol. 4, No. 3,
August 1977, pp. 319–337. This paper includes, among others, a Union transposition cipher from the
U.S. Civil War.
Leighton, Albert C., “The Statesman Who Could Not Read His Own Mail,” Cryptologia, Vol. 17, No. 4,
October 1993, pp. 395–402. In this paper, Leighton presents how he cracked a columnar transposi-
tion cipher from 1678.
Mitchell, Douglas W., “‘Rubik’s Cube’ as a Transposition Device,” Cryptologia, Vol. 16, No. 3, July 1992,
pp. 250–256. Although the keyspace makes this cipher sound impressive, a sample ciphertext I generated
was broken overnight by Brett Grothouse, a student of mine who was also a cube enthusiast.
Recall that a large keyspace is a necessary condition for security, but not a sufficient condition. It was
recently shown that any scrambling of Rubik’s Cube can be solved in 20 moves or less.21
Ritter, Terry, “Transposition Cipher with Pseudo-random Shuffling: The Dynamic Transposition
Combiner,” Cryptologia, Vol. 15, No. 1, January 1991, pp. 1–17.
Zimansky, Curt A., “Editor’s Note: William F. Friedman and the Voynich Manuscript,” Philological
Quarterly, Vol. 49, No. 4, October 1970, pp. 433–442. This was reprinted in Brumbaugh, Robert
S., editor, The Most Mysterious Manuscript, Southern Illinois University Press, Carbondale and
Edwardsville, Illinois, 1978, pp. 99–108.
21 Fildes, Jonathan, “Rubik’s Cube Quest for Speedy Solution Comes to an End,” BBC News, August 11, 2010,
available online at http://www.bbc.co.uk/news/technology-10929159.
Chapter 4
Shakespeare, Jefferson, and JFK
In this chapter, we examine a controversy involving the works of William Shakespeare, the contri-
butions of Thomas Jefferson, and a critical moment in the life of John F. Kennedy.
Sir Francis Bacon (Figure 4.1) is best known as a philosopher and advocate of applied science and
the scientific method, which he called the New Instrument. His views became more influential
following his death. In particular, he provided inspiration to the men who founded the Royal
Society. Bacon earned a place in these pages because he also developed a binary cipher—that is, a
cipher in which only two distinct symbols are needed to convey the message. An updated example
of his biliteral cipher follows.
A = aaaaa N = abbab
B = aaaab O = abbba
C = aaaba P = abbbb
D = aaabb Q = baaaa
E = aabaa R = baaab
F = aabab S = baaba
G = aabba T = baabb
H = aabbb U = babaa
I = abaaa V = babab
J = abaab W = babba
K = ababa X = babbb
L = ababb Y = bbaaa
M = abbaa Z = bbaab
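The table above is simply each letter's alphabetical position written in five-bit binary (A = 0, …, Z = 25), with 0 as a and 1 as b. A one-function sketch of my own:

```python
# Encode each letter as its five-bit binary index, using 'a' for 0, 'b' for 1.
def bacon(word):
    to_ab = str.maketrans('01', 'ab')
    return ' '.join(format(ord(ch) - ord('A'), '05b').translate(to_ab)
                    for ch in word.upper())

print(bacon('HELLO'))  # aabbb aabaa ababb ababb abbba
```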
One could use this to encipher a message, sending the 25-letter string aabbb aabaa ababb
ababb abbba to say “hello,” but this is a particularly inefficient way to do a monoalphabetic
substitution! The strength in this cipher lies in its invisibility. Let a be represented by normal
text characters and let b be represented by boldface characters. Now observe the message hidden
behind the text that follows.
1 Pratt, Fletcher, Secret and Urgent, Bobbs Merrill, New York, 1939, p. 85.
Shakespeare, Jefferson, and JFK ◾ 131
authorship involved a convoluted numerical scheme with a great deal of flexibility. Such flexibil-
ity in determining the plaintext caused most to react with great skepticism. In fact, in the same
year that Donnelly’s book appeared, Joseph Gilpin Pyle authored a parody, The Little Cryptogram,
which in only 29 pages used Donnelly’s technique to generate messages of his own that could not
possibly have been intended by Bacon. Although Pyle’s approach doesn’t provide a rigorous proof
that Donnelly was wrong, it is very satisfying.
In general, we may draw all sorts of conclusions, depending on what sort of evidence we are
willing to accept. For example, let’s take a look at Psalm 46. The words from the psalm have been
numbered from beginning and end to position 46.
1 2 3 4 5 6
God is our refuge and strength,
7 8 9 10 11 12
a very present help in trouble.
13 14 15 16 17
Therefore will not we fear,
24 25 26 27 28
though the mountains be carried
21 20 19 18 17 16 15
I will be exalted in the earth.
Word 46 from the beginning of the psalm is shake, and word 46 from the end (not counting
Selah) is spear. Some have suggested that Shakespeare took part in the
translation to honor his 46th birthday (the King James Version was used). Or perhaps it is all a
coincidence.
Elizabeth Gallup was the first to publish an anti-Shakespeare theory using the cipher intro-
duced in this chapter. Her 1899 work was titled The Bi-literal Cypher of Francis Bacon. In it
she claimed that different-looking symbols for the same letters used in printing Shakespeare’s
plays actually represented two distinct symbols and spelled out messages in Bacon’s biliteral
cipher to indicate that he was the true author. Bacon revealed his system of concealing mes-
sages in 1623, the year that the first folio was published, so the timing is right, but that is
about all.
The evidence against Gallup’s case is convincing. At the time of the printing of the first folio,
various old typefaces were commonly mixed, and broken characters were used along with better
copies. Thus, the characters seem to form a continuum between the new and the old, rather than
two distinct forms. Also, Gallup had Bacon use words that did not exist at the time his hidden
message was allegedly printed.2 Over a half-century after Gallup’s book appeared, William and
Elizebeth Friedman authored a book examining the controversy.3 Their research ended with the
conclusion that the Bacon supporters were mistaken. The uncondensed version of the book (1955)
won the Folger Shakespeare Library Literature Prize of $1,000.4
The Friedmans got much more out of this controversy than a published book and a prize, as
it was actually how they met. They both worked for Riverbank Laboratories, run by eccentric
millionaire George Fabyan, in Geneva, Illinois, just outside Chicago. The research areas included
acoustics, chemistry, cryptology (only with the aim of glorifying Bacon as the author of those
famous plays—Gallup worked there), and genetics. William Friedman was hired as a geneticist,
but he helped the dozen plus workers in the cryptology section with his skill at enlarging photo-
graphs of texts believed to contain hidden messages. It could be said that William fell in love twice
at Riverbank Labs. In addition to meeting his wife-to-be Elizebeth,5 who worked in the cryptol-
ogy section, he also began researching valid cryptologic topics for Fabyan. When a cryptologist
hears someone refer to “The Riverbank Publications,” Friedman’s cryptologic publications are
what come to mind, even though other works were put out by the Lab’s press. As America headed
into World War I, Friedman’s cryptologic education proved valuable. Remember, America still
didn’t have a standing cryptologic agency. Like spies, codemakers and codebreakers went back to
other work at the end of each of America’s wars.
Broken type in Shakespeare first folios may not reveal a hidden message, but something can be
learned from type in general. Penn State biology professor S. Blair Hedges found a way to estimate
the print dates of various editions of books by comparing the cracks in the wood blocks used to
print the illustrations. These cracks appear at a fairly constant rate over time. Other processes acting on copper
plates allow dating for images printed in that manner, as well.6
Contrarians attempting to show that Shakespeare’s plays were really written by Sir Francis
Bacon seem to have been gradually replaced by contrarians attempting to show that Shakespeare’s
2 Pratt, Fletcher, Secret and Urgent, Bobbs Merrill, New York, 1939, p. 91.
3 Friedman, William F. and Elizebeth S. Friedman, The Shakespearean Ciphers Examined, Cambridge University
Press, Cambridge, UK, 1957.
4 Kahn, David, The Codebreakers, second edition, Scribner, New York, 1996, p. 879.
5 They were married in May 1917.
6 Marino, Gigi, “The Biologist as Bibliosleuth,” Research Penn State, Vol. 27, No. 1, Fall 2007, pp. 13–15.
134 ◾ Secret History
plays were really written by Edward de Vere, although there are several other names bandied
about. The arguments made today for de Vere are much the same as those of their predecessors,
and are not taken seriously by professional cryptologists.
On the bright side, progress is still being made in the study of Shakespeare himself. In 2009,
for the first time, a contemporary portrait of Shakespeare was publicly revealed. It was made in
1610 and is reproduced in Figure 4.3. For generations it passed down through the Cobbe family,
in a house outside of Dublin, without anyone realizing who the painting depicted. Finally, some-
one noticed the resemblance and some top experts agree that it is actually William Shakespeare.
Shakespeare, Jefferson, and JFK ◾ 135
Figure 4.5 A two-part code by Thomas Jefferson. (Courtesy of the David Kahn Collection,
National Cryptologic Museum, Fort Meade, Maryland.)
them. We now come to his most famous discovery (Figure 4.6). The older wheel cipher pictured
on the right is described below.8
This enciphering and deciphering device was acquired from West Virginia by NSA
in the early 1980s. It was first thought to have been a model of the “Jefferson cipher
wheel,” so called because Thomas Jefferson described a similar device in his writings.
We believe it to be the oldest extant device in the world, but the connection with
Jefferson is unproven. Such devices are known to have been described by writers as
8 http://www.nsa.gov/museum/wheel.html.
early as Francis Bacon in 1605 and may have been fairly common among the arcane
“black chambers” of European governments. This cipher wheel was evidently for use
with the French language, which was the world’s diplomatic language up through
World War I. How it came to be in West Virginia is unknown.
Jefferson, and several others, independently invented an enciphering device like the ones pic-
tured in Figure 4.6. For this reason, it is sometimes referred to as the “Thomas Jefferson cipher
wheel” or “Thomas Jefferson wheel cipher.” To encipher using the wheel cipher, simply turn the
individual wheels to form the desired message across one of the lines of letters. Copy any of the
other lines to get the ciphertext. Deciphering is just as easy. To do this, form the ciphertext along
one line of the wheel and then search the other lines for a meaningful text.
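The enciphering and deciphering procedure just described is easy to simulate. The sketch below is an illustrative model, not any historical device: the wheels carry randomly scrambled alphabets, the key is the order of the wheels on the shaft, and the function names are mine.

```python
import random

ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"

def make_wheels(n=25, seed=42):
    # Each wheel carries its own scrambled alphabet around its rim.
    rng = random.Random(seed)
    return ["".join(rng.sample(ALPHABET, 26)) for _ in range(n)]

def encipher(message, wheels, key, line):
    # Turn each wheel so the message reads across one row, then copy the
    # row `line` places below it (any value 1..25 will do).
    return "".join(
        wheels[w][(wheels[w].index(ch) + line) % 26]
        for ch, w in zip(message, key)
    )

def candidate_rows(ciphertext, wheels, key):
    # Set the ciphertext along one row and read off the other 25 rows;
    # the meaningful one is the plaintext.
    return [
        "".join(wheels[w][(wheels[w].index(ch) + line) % 26]
                for ch, w in zip(ciphertext, key))
        for line in range(1, 26)
    ]

wheels = make_wheels()
key = random.Random(7).sample(range(25), 25)   # secret order of wheels on the shaft
ct = encipher("RETREATATDAWN", wheels, key, 9)
print(ct)
print("RETREATATDAWN" in candidate_rows(ct, wheels, key))  # True
```

The recipient, who knows the key, simply scans the 25 candidate rows for the one that reads as sensible text.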
The wheel cipher pictured on the left in Figure 4.6 has 25 wheels. Each wheel has the alphabet
ordered differently around the edge (notice the distinct letters appearing above the four Rs). The
key is given by the order in which the wheels are placed on the shaft. Hence, the 25-wheel model
has a keyspace almost as big as a monoalphabetic substitution cipher. It is, however, much more
difficult to break. Jefferson’s version had 36 wheels.9
Others following Jefferson also came up with the idea independently. Major Etienne Bazeries
proposed such a device with 20 disks in 1891 for the French Ministry of War (which turned it
down).10 Captain Parker Hitt came up with the idea in 1914 in the strip-cipher variant.11 Here,
vertical slips of paper bearing scrambled alphabets are held in place horizontally by a backing that
allows vertical motion (Figure 4.7).
Moving the strips up and down is equivalent to turning the wheels on Jefferson’s device. In this
format, it is necessary to have two copies of the shuffled alphabet on each strip. Otherwise, when
attempting to read a given row off the device, one or more letters might be missing due to strips
being shifted too far up or down. If it weren’t for this repetition in the alphabets, joining the ends
of each strip would turn the device into a wheel cipher. Hitt’s device became, in cylinder form,
9 Salomon, David, Data Privacy and Security, Springer, New York, 2003, p. 82.
10 Kahn, David, The Codebreakers, second edition, Scribner, New York, 1996, p. 247.
11 Kahn, David, The Codebreakers, second edition, Scribner, New York, 1996, p. 493.
Figure 4.7 A device equivalent to a wheel cipher. (Courtesy of the National Cryptologic
Museum, Fort Meade, Maryland.)
the U.S. Army’s field cipher in 1922. In this form, it is known as the M-94, and was used until
the middle of World War II.12 The Navy adopted the device in 1928, naming it CSP 488, and the
Coast Guard was using it by 1939 under the name CSP 493.13 The U.S. Navy still had a version
12 Mellen, Greg and Lloyd Greenwood, “The Cryptology of Multiplex Systems,” Cryptologia, Vol. 1. No. 1,
January 1977, pp. 4–16, p. 13 cited here.
13 Gaddy, David W., “The Cylinder-Cipher,” Cryptologia, Vol. 19, No. 4, October 1995, pp. 385–391, p. 386
cited here. Note that the dates of adoption given for the various service branches vary from author to author!
For example, in Weller, Robert, “Rear Admiral Joseph N. Wenger USN (Ret) and the Naval Cryptologic
Museum,” Cryptologia, Vol. 8, No. 3, July 1984, pp. 208–234 these wheel ciphers were delivered to the Navy
in December 1926 and use by the Coast Guard began “about 1935.” p. 214 cited here.
of this cipher in use in the mid-1960s!14 A wheel cipher is shown in Figure 4.8 with an operator to
give a sense of scale.
Figure 4.8 Former high school cryptology student Dustin Rhoades gives us a sense of scale as
he examines a wheel cipher in the National Cryptologic Museum library. A poster in the back-
ground seems to show that this pleases David Kahn, who generously donated his own crypto-
logic library to the museum.
The wheel cipher is an example of a multiplex system. This simply means that the user is able
to choose from more than one ciphertext for each message. The term is actually an abbreviation
coined by William Friedman for multiple possible ciphertexts. In this case, we have 25 choices,
although the line or two directly beneath the message was sometimes forbidden. An advantage
of a multiplex system is that identical plaintext portions of a message needn’t generate identical
portions of ciphertext.
14 Mellen, Greg and Lloyd Greenwood, “The Cryptology of Multiplex Systems,” Cryptologia, Vol. 1. No. 1,
January 1977, pp. 4–16, p. 5 cited here.
determine the new key. There are 25! possible orderings, so we need an approach more sophisti-
cated than brute force. The alphabets for the U.S. Navy wheel cipher were as follows.15
1 BCEJIVDTGFZRHALWKXPQYUNSMO
2 CADEHIZFJKTMOPUQXWBLVYSRGN
3 DGZKPYESNUOAJXMHRTCVBWLFQI
4 EIBCDGJLFHMKRWQTVUANOPYZXS
5 FRYOMNACTBDWZPQIUHLJKXEGSV
6 GJIYTKPWXSVUEDCOFNQARMBLZH
7 HNFUZMSXKEPCQIGVTOYWLRAJDB
8 IWVXRZTPHOCQGSBJEYUDMFKANL
9 JXRSFHYGVDQPBLIMOAKZNTCWUE
10 KDAFLJHOCGEBTMNRSQVPXZIYWU
11 LEGIJBKUZARTSOHNPFXMWQDVCY
12 MYUVWLCQSTXHNFAZGDRBJEOIPK
13 NMJHAEXBLIGDKCRFYPWSZOQUVT
14 OLTWGANZUVJEFYDKHSMXQIPBRC
15 PVXRNQUIYZSJATWBDLGCEHFOKM
16 QTSEOPIDMNFXWUKYJVHGBLZCAR
17 RKWPUTQEBXLNYVFCIMZHSAGDOJ
18 SONMQUVAWRYGCEZLBKDFIJXHTP
19 TSMZKXWVRYUFIGJDABEOPCHNLQ
20 UPKGSCFJOWAYDHVELZNRTBMQIX
21 VFLQYSORPMHZUKXACGJIDNTEBW
22 WHOLBDMKEQNIXRTUZJFYCSVPAG
23 XZPTVOBMQCWSLJYGNEIUFDRKHA
24 YQHACRLNDPBOVZSXWITEGKUMJF
25 ZUQNXWRYALIVPBESMCOKHGJTFD
One possible means of breaking it, if modern technology is allowed, is to use a probable word
search. Suppose we believe the word MONEY appears in the message. There are 25P5 = (25)(24)(23)
(22)(21) = 6,375,600 possibilities as to which of the five wheels were used to encipher this word. A
computer can examine, for each possibility, the various ciphertexts that would result. If one of them
matches part of the ciphertext that has been intercepted, we may know the order of five of the wheels
(a coincidence is also possible, as the word MONEY may not be present in the message). This attack
assumes that we know the order of the letters on each wheel and only the ordering of the wheels on
the shaft is unknown. Once we know the order of a few of the wheels, the calculations to determine
the rest become less time-consuming. The various wheels can be tried on the end of the one containing
the crib such that the ciphertext is continued on the appropriate line, while looking to see if the line
of plaintext continues to make sense. If there’s a given ciphertext of length 25 or greater for which we
know, or can guess, the plaintext, we can recover the order of the wheels with just pencil and paper.
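The count of wheel orderings for a five-letter crib is quick to confirm (Python 3.8+ for `math.perm`):

```python
# Check the size of the probable-word search described above.
from math import factorial, perm

print(perm(25, 5))      # orderings of 5 wheels chosen from the 25 on the shaft
print(factorial(25))    # the full keyspace: orderings of all 25 wheels
```

The first number is small enough to search exhaustively by computer; the second is why brute force over the full key is out of the question.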
Example 1
Perhaps a commonly sent message is NOTHING TO REPORT AT THIS TIME. If we suspect
the ciphertext YTWML GHWGO PVRPE SDKTA QDVJO represents this message, we pair the
two up and examine the distance between each pair of plaintext/ciphertext letters for each of the
25 disks. Table 4.1 shows the result.
15 This is according to Salomon, David, Data Privacy and Security, Springer, New York, 2003, p. 84. Elsewhere,
other alphabets have been stated as being in use.
Because each ciphertext character sits a fixed distance around its wheel from the message letter it
represents, and that distance (the shift) is the same for every position, we need to find a numerical
value that appears in at least one row of every column of the table.
Column 1 doesn’t contain 2, 5, 11, 13, 14, 18, 20, 21, and 25, so these may be eliminated as
possible shifts. The possibilities that remain are 1, 3, 4, 6, 7, 8, 9, 10, 12, 15, 16, 17, 19, 22, 23, and
24. But column 2 doesn’t contain 1, 3, 6, 9, 10, 17, 19, or 22, so our list is quickly reduced to just
4, 7, 8, 12, 15, 16, 23, and 24. Column 3 doesn’t contain a 16, so we are then left with 4, 7, 8, 12,
15, 23, and 24. Column 4 eliminates 8 and 23, leaving 4, 7, 12, 15, and 24. Column 5 reduces
the choices to 7, 12, and 15. There’s no 7 in column 6, so we now know the shift is either 12 or 15.
Things now start to move more slowly! Every column contains both a 12 and a 15 until we get to column
18, which lacks the 15. Finally (with seven columns to spare!) we conclude the shift was by 12.
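The elimination just performed is mechanical, so it can be checked by machine. A minimal sketch, assuming the Navy alphabets quoted above (from Salomon) and the suspected plaintext/ciphertext pairing; the function name is mine, and the only shift surviving every column should be 12:

```python
# Reproduce the column-by-column elimination using the U.S. Navy wheel alphabets.
WHEELS = [
    "BCEJIVDTGFZRHALWKXPQYUNSMO",
    "CADEHIZFJKTMOPUQXWBLVYSRGN",
    "DGZKPYESNUOAJXMHRTCVBWLFQI",
    "EIBCDGJLFHMKRWQTVUANOPYZXS",
    "FRYOMNACTBDWZPQIUHLJKXEGSV",
    "GJIYTKPWXSVUEDCOFNQARMBLZH",
    "HNFUZMSXKEPCQIGVTOYWLRAJDB",
    "IWVXRZTPHOCQGSBJEYUDMFKANL",
    "JXRSFHYGVDQPBLIMOAKZNTCWUE",
    "KDAFLJHOCGEBTMNRSQVPXZIYWU",
    "LEGIJBKUZARTSOHNPFXMWQDVCY",
    "MYUVWLCQSTXHNFAZGDRBJEOIPK",
    "NMJHAEXBLIGDKCRFYPWSZOQUVT",
    "OLTWGANZUVJEFYDKHSMXQIPBRC",
    "PVXRNQUIYZSJATWBDLGCEHFOKM",
    "QTSEOPIDMNFXWUKYJVHGBLZCAR",
    "RKWPUTQEBXLNYVFCIMZHSAGDOJ",
    "SONMQUVAWRYGCEZLBKDFIJXHTP",
    "TSMZKXWVRYUFIGJDABEOPCHNLQ",
    "UPKGSCFJOWAYDHVELZNRTBMQIX",
    "VFLQYSORPMHZUKXACGJIDNTEBW",
    "WHOLBDMKEQNIXRTUZJFYCSVPAG",
    "XZPTVOBMQCWSLJYGNEIUFDRKHA",
    "YQHACRLNDPBOVZSXWITEGKUMJF",
    "ZUQNXWRYALIVPBESMCOKHGJTFD",
]
PLAIN  = "NOTHINGTOREPORTATTHISTIME"
CIPHER = "YTWMLGHWGOPVRPESDKTAQDVJO"

def column_shifts(p, c):
    # All shifts some wheel could have applied to send plaintext p to ciphertext c.
    return {(w.index(c) - w.index(p)) % 26 for w in WHEELS}

candidates = set(range(1, 26))   # shift 0 (the message line itself) is never sent
for p, c in zip(PLAIN, CIPHER):
    candidates &= column_shifts(p, c)
print(candidates)
```

Intersecting the column sets does in one loop what the text did column by column.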
Locating all of the 12s in Table 4.1 will help us to find the order of the wheels (Table 4.2).
Table 4.2 shows that N is followed 12 places later by Y on wheels 6, 9, and 19, but we don’t
know which of these it is. The best strategy is to not worry about it for now. Moving on to the
fifth letter in the message, we see that I is followed 12 places later by L on wheel 19. Thus, wheel
19 must be in position 5 on the shaft of the wheel cipher. Similarly, wheels 2, 25, 12, 17, and 11
must be in positions 9, 10, 15, 21, and 25, respectively. We may label these determinations like so
19 2 25 12 17 11
N O T H I N G T O R E P O R T A T T H I S T I M E
Y T W M L G H W G O P V R P E S D K T A Q D V J O
We now take these wheels “off the table” since their positions in the key have been determined
(Table 4.3).
Notice that wheel 4 only moves one character of the message to the appropriate ciphertext
character—namely, the letter in position 6. Although there are other wheels that move the letter
in position 6 to where it needs to go, it must be wheel 4 that actually does so. This is because wheel
4 must be used somewhere, and it doesn’t work anywhere else. We may therefore take wheel 4 off
the table and remove the underlining and boldfacing that indicated the other possible wheels for
position 6. We do the same (by following the same reasoning) for wheels 7, 8, 15, 16, 21, and 24.
This is reflected in the updated key below and in Table 4.4.
8 19 4 7 2 25 16 12 21 24 17 15 11
N O T H I N G T O R E P O R T A T T H I S T I M E
Y T W M L G H W G O P V R P E S D K T A Q D V J O
With some of the underlining and boldfacing removed in the previous step, we see that we can
apply the same argument again. Wheels 1 and 3 must be in positions 13 and 18, respectively. We
now update our key and table (Table 4.5).
8 19 4 7 2 25 16 1 12 3 21 24 17 15 11
N O T H I N G T O R E P O R T A T T H I S T I M E
Y T W M L G H W G O P V R P E S D K T A Q D V J O
Removing another bit of underlining and boldfacing, as a consequence of the previous step, reveals that
wheel 5 must be in position 14. We update again to get the following key and Table 4.6.
8 19 4 7 2 25 16 1 5 12 3 21 24 17 15 11
N O T H I N G T O R E P O R T A T T H I S T I M E
Y T W M L G H W G O P V R P E S D K T A Q D V J O
This reveals that wheel 6 must be in position 1. Again, we update to get the following key and
Table 4.7.
6 8 19 4 7 2 25 16 1 5 12 3 21 24 17 15 11
N O T H I N G T O R E P O R T A T T H I S T I M E
Y T W M L G H W G O P V R P E S D K T A Q D V J O
Positions 17 and 22 must be wheels 13 and 14 (in one order or another), so the other under-
lined and boldfaced options for these wheels no longer need to be considered (Table 4.8).
In the same manner, positions 3 and 8 must be wheels 10 and 22 (in one order or another), so
the other underlined and boldfaced option for wheel 22 no longer needs to be considered (Table 4.9).
The updated table now reveals that position 2 must be wheel 20. We update our key (below)
and the table (Table 4.10 on p. 146) again.
6 20 8 19 4 7 2 25 16 1 5 12 3 21 24 17 15 11
N O T H I N G T O R E P O R T A T T H I S T I M E
Y T W M L G H W G O P V R P E S D K T A Q D V J O
Notice that Table 4.10 uses three shades of highlighting and underlining/boxing for the unde-
termined possibilities that remain. This is because we cannot continue as we’ve been going. To
brute force a solution at this stage would seem to require 128 configurations of the wheel cipher
(two possibilities for each of seven unknowns). A little reasoning will reduce this but we cannot
narrow it down to a single possibility based on the information we have. With more pairs of plain-
text and ciphertext we would likely be able to do so, but we don’t have this.
Consider the four lightly shaded values. Positions 3 and 8 are occupied by wheels 10 and 22,
in one order or the other. This represents two possibilities, not the four it might seem to be at first
glance, because a particular wheel cannot be in two positions at once. Similarly, the four under-
lined/boxed values give us two possibilities altogether. For the six darkly shaded values, assigning
wheel 9 to either position forces wheels 18 and 23 to particular positions. Thus, there are only
two ways to assign those three wheels. Assignments for the various shaded and underlined/boxed
values are all independent.
Thus, the total number of possibilities left to check (and these must be checked by hand) is
(2)(2)(2) = 8. These possibilities all convert the given message to the given ciphertext, but only one
is likely to correctly decipher the next message that is received.
The attack presented here relied on knowing some plaintext and the corresponding ciphertext,
as well as the order of the alphabet on each wheel. Only the key was unknown. There are more
sophisticated attacks that do not demand as much. See the paper “The Cryptology of Multiplex
Systems. Part 2: Simulation and Cryptanalysis” by Greg Mellen and Lloyd Greenwood in the
References and Further Reading section, if you would like to learn more about these attacks.
Another weakness with this cipher is that a letter can never be enciphered as itself. If we have a
phrase that we believe appears in the message, this weakness can sometimes help us decide where.
Although primarily known in cryptologic circles for his “invention” of the wheel cipher, Thomas
Jefferson (1743–1826) also wrote the Declaration of Independence, served as the third
president of the United States, and founded the University of Virginia. One might expect that a
figure as important as Jefferson would have been so closely examined that there is no room left
for original research; however, this is not the case. In the winter of 2007, mathematician Lawren
Smithline learned from a neighbor, who was working on a project to collect and publish all of
Jefferson’s letters and papers, that several were written in code or cipher. In June, the neighbor
mentioned that one of these letters, from Robert Patterson to Jefferson, included a cipher or code
portion that couldn’t be read. The letter discussed cryptography and the unreadable passage was
a sample ciphertext that Patterson thought couldn’t be broken. Lawren got a copy of the letter,
which was dated December 19, 1801, and went to work. It was a columnar transposition cipher
with nulls and Lawren was able to solve it. The plaintext turned out to be the preamble to the
Declaration of Independence.16 There are two lessons we can take away from this. First, don’t
assume there’s nothing new to be discovered, just because a topic is old or already much studied.
Second, be social. Because Lawren talked with a neighbor, both benefited. You may be amazed at
how often you profit from letting people know your interests.
In mathematics, we begin with a small number of assumptions that we cannot prove and then
try to prove everything else in terms of them. We call these assumptions axioms or postulates and
ideally they would seem “obviously true” although no proof of them can be given. Jefferson must
have been in a mathematical mindset when he began his greatest piece of writing with “We hold
these truths to be self-evident…”
The cipher system named after Playfair (although he is not the creator of it) is more sophisti-
cated than those that typically appeared in the papers. It’s an example of a digraphic substitution
cipher, which simply means that the letters are substituted two at a time.
Before the Playfair Cipher, digraphic ciphers required the users to keep copies of the key writ-
ten out, because they were clumsy and not easy to remember, as Porta’s example (believed to be
the first) demonstrates in Figure 4.9.
We’ll use this table to encipher REVOLT. First we split the message into two-letter groups: RE
VO LT. To encipher the first group, RE, we find R in the alphabet that runs across the top of the
table and then move down that column until we come to the row that has E on the right hand
16 See Smithline, Lawren M., “A Cipher to Thomas Jefferson: A Collection of Decryption Techniques and the
Analysis of Various Texts Combine in the Breaking of a 200-year-old Code,” American Scientist, Vol. 97, No. 2,
March-April 2009, pp. 142–149.
17 McCormick, Donald, Love in Code, Eyre Methuen Ltd., London, UK, 1980, p. 84.
Figure 4.9 A digraphic cipher created by Porta. (Courtesy of the David Kahn Collection,
National Cryptologic Museum, Fort Meade, Maryland.)
side. The symbol found at this position takes the place of RE. In the same manner, VO becomes
18 Deavours, Cipher A., “Unicity Points in Cryptanalysis,” Cryptologia, Vol. 1, No. 1, January 1977, pp. 46–68.
19 Kahn, David, The Codebreakers, second edition, Scribner, New York, 1996, p. 198.
plus side for Wheatstone, Wadsworth invented a cipher that became known as the Wheatstone
cipher.20 We’ll now examine how the Playfair cipher works.
Example 2
To start, we fill a rectangle with the alphabet. I and J are once again (see Polybius) equated:
A B C D E
F G H I&J K
L M N O P
Q R S T U
V W X Y Z
Given the message
LIFE IS SHORT AND HARD - LIKE A BODYBUILDING DWARF.21
we begin by breaking it into pairs:
LI FE IS SH OR TA ND HA RD LI KE AB OD YB UI LD IN GD WA RF
To encipher the first pair, LI, we find those letters in the square above. We can then find two more
letters, F and O, to get the four corners of a rectangle.
A B C D E
F G H I&J K
L M N O P
Q R S T U
V W X Y Z
We take these two new corners as our ciphertext pair. But should we take them in the order FO or
OF? It was arbitrarily decided that the letter to appear first in the ciphertext pair should be the one
20 Clark, Ronald, The Man Who Broke Purple, Little, Brown and Company, Boston, Massachusetts, 1977, pp. 57–58.
21 From “Lift Your Head Up High (And Blow Your Brains Out),” by Bloodhound Gang.
in the same row as the first plaintext letter. Making note of this first encryption and continuing
in the same manner we have
LI → OF
FE → KA
IS → HT
SH → ??
Here we have a problem. S and H appear in the same column, so we cannot “make a rectangle”
by finding two other letters as we did for the previous pairs. We need a new rule for this special
case: if both letters appear in the same column, encipher them with the letters that appear directly
beneath each. We then have
SH → XN
If one of the letters was in the last row, we’d circle back to the top of the column to find its enci-
pherment. Now we continue with the other pairs.
OR → MT
TA → QD
ND → OC
HA → FC
RD → TB
LI → OF
KE → PK
AB → ??
Another problem! A and B appear in the same row. Again, we cannot form a rectangle. In this case,
we simply take the letters directly to the right of each of the plaintext letters. We get
AB → BC
If one of the letters was in the last column, we’d circle back to the start of the row to find its enci-
pherment. Our rules now allow us to finish the encryption:
OD → TI
YB → WD
UI → TK
LD → OA
IN → HO
GD → IB
WA → VB
RF → QG
Thus, our ciphertext is
OFKAH TXNMT QDOCF CTBOF PKBCT IWDTK OAHOI BVBQG.
Although it did not arise with this message, there is an ambiguous case. What do we do when
a plaintext pair consists of two of the same letter? Do we shift down (because they are in the same
column) or shift to the right (because they are in the same row)?
The solution is to avoid this situation! An X is to be inserted between doubled letters prior
to encipherment to break them up. Because X is a rare letter, it will not cause any confusion. A
recipient who, after deciphering, sees an X between two Ls or two Os, for example, would simply
remove the X. The example above was just for instructional purposes. The alphabet in the grid
would normally be scrambled (perhaps using an easy-to-remember keyword).
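The rules of Example 2 are easy to code. The sketch below is an illustrative model using the unscrambled square above (I and J merged); the function names are mine. Feeding it the letters behind the pair list (ending GD WA RF) reproduces the ciphertext.

```python
# Playfair encipherment following the rules described above.
SQUARE = ["ABCDE", "FGHIK", "LMNOP", "QRSTU", "VWXYZ"]   # I and J merged as I
POS = {ch: (r, c) for r, row in enumerate(SQUARE) for c, ch in enumerate(row)}

def encipher_pair(a, b):
    ra, ca = POS[a]
    rb, cb = POS[b]
    if ra == rb:              # same row: take the letters directly to the right
        return SQUARE[ra][(ca + 1) % 5] + SQUARE[rb][(cb + 1) % 5]
    if ca == cb:              # same column: take the letters directly beneath
        return SQUARE[(ra + 1) % 5][ca] + SQUARE[(rb + 1) % 5][cb]
    # rectangle: first ciphertext letter shares a row with the first plaintext letter
    return SQUARE[ra][cb] + SQUARE[rb][ca]

def encipher(message):
    text = "".join(ch for ch in message.upper().replace("J", "I") if ch.isalpha())
    pairs, i = [], 0
    while i < len(text):
        a = text[i]
        b = text[i + 1] if i + 1 < len(text) else "X"
        if a == b:            # break up doubled letters with an X
            b = "X"
            i += 1
        else:
            i += 2
        pairs.append((a, b))
    return "".join(encipher_pair(a, b) for a, b in pairs)

print(encipher("LIFE IS SHORT AND HARD LIKE A BODYBUILDING DWARF"))
```

Deciphering uses the same rectangle rule, but steps left (same row) or up (same column) in the two special cases.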
The first recorded solution of the Playfair cipher was by Joseph O. Mauborgne in 1914. At this
time, Playfair was the field cipher for the British. There are reports that this cipher was used in the
Boer War (1899–1902),22 but the example below is more recent. Imagine that, as an Australian
coastwatcher, you’re the intended recipient of the following Playfair cipher sent on August 2, 1943,
in the midst of the war in the Pacific (Figure 4.11).
Figure 4.11 Playfair message sent during World War II in the Pacific Theater. (Courtesy of the
David Kahn Collection, National Cryptologic Museum, Fort Meade, Maryland.)
The ciphertext, which is typically sent in groups of five letters, has already been split into
groups of size two. At about the middle of the second line you notice a doubled letter, TT. You fear
the message has been garbled or, perhaps, isn’t a Playfair cipher, after all.
In any case, the key is ROYAL NEW ZEALAND NAVY, so you form the following square:
R O Y A L
N E W Z D
V B C F G
H I K M P
Q S T U X
You begin deciphering (recall that I and J are not distinguished here).
KX → PT JE → BO YU → AT RE → ON BE → EO
ZW → WE EH → NI EW → NE RY → LO TU → ST
HE → IN YF → AC SK → TI RE → ON HE → IN
GO → BL YF → AC IW → KE TT → ?? TU → ST
OL → RA KS → IT YC → TW AJ → OM PO → IL
BO → ES TE → SW IZ → ME ON → RE TX → SU
BY → CO BW → CE TG → XC ON → RE EY → WO
CU → FT ZW → WE RG → LV DS → EX ON → RE
SX → QU BO → ES UY → TA WR → NY HE → IN
BA → FO AH → RM YU → AT SE → IO DQ → NX
Putting it all together, you get
PTBOATONEOWENINELOSTINACTIONINBLACKE??STRAITTWOMILESSW
MERESUCOCEXCREWOFTWELVEXREQUESTANYINFORMATIONX
The mystery ciphertext pair TT, deciphered as ?? temporarily, is easy to determine in the context
of the plaintext BLACKE??STRAIT. This must be BLACKETT STRAIT. The TT was left as is,
not even enciphered!23 After inserting word spacing you get
PT BOAT ONE OWE NINE LOST IN ACTION IN BLACKETT STRAIT TWO MILES
SW MERESU COCE X CREW OF TWELVE X REQUEST ANY INFORMATION X
22 Kahn, David, The Codebreakers, second edition, Scribner, New York, 1996, p. 202.
23 Although an obvious weakness, the Playfair cipher was actually sometimes used this way, as the present exam-
ple shows!
There’s an error, but again, context makes it easy to fix. You produce the final message.
PT BOAT ONE OWE NINE LOST IN ACTION IN BLACKETT STRAIT TWO MILES
SW MERESU COVE X CREW OF TWELVE X REQUEST ANY INFORMATION X
The message is describing John F. Kennedy’s patrol torpedo boat, which had been sliced in half
by a Japanese destroyer that had rammed it. More messages will follow and eventually allow the
crew, which had swum ashore, to be rescued from behind enemy lines. Perhaps years later you
will recall how the failure of the Japanese to read this (and other messages) may have saved the life
of a future American president.
On dividing the unknown substitution into groups of two letters each, examine the
groups and see if any group consists of a repetition of the same letter, as SS. If so, the
cipher is not a Playfair.
—J. O. Mauborgne24
Although Mauborgne was one of the (re)discoverers of the only unbreakable cipher, his advice
above wasn’t correct this time. Ciphers are often used improperly by individuals in highly stressful
situations. Also, a letter could repeat accidentally due to Morse mutilation.
Alf Mongé solved it (by hand) in the following manner.27 Splitting the ciphertext into pairs
and numbering the pairs for easy reference, we have:
1 2 3 4 5 6 7 8 9 10 11 12 13 14 15
BU FD AG NP OX IH OQ YT KV QM PM BY DA AE QZ
Indicating the pairs OQ and QM in positions 7 and 10, Mongé pointed out that O and Q are
close to each other in a straight alphabet, as are Q and M. Looking for two other high frequency
digraphs with letters that are close to each other in the alphabet and have a letter in common
between the pairs, Mongé came up with NO and OU. (He did not say how many other possibilities
he tried first!) The proposed ciphertext/plaintext pairings would arise from the following square.
24 Mauborgne, Joseph O., An Advanced Problem in Cryptography and its Solution, second edition, Army Service
Schools Press, Fort Leavenworth, Kansas, 1918.
25 Deavours, Cipher A., “Unicity Points in Cryptanalysis,” Cryptologia, Vol. 1, No. 1, January 1977, pp. 46–68.
26 Aston, George, Secret Service, Faber & Faber, London, England, 1933.
27 Mongé, Alf, “Solution of a Playfair Cipher,” Signal Corps Bulletin, No. 93, November–December 1936,
reprinted in Friedman, William F., Cryptography and Cryptanalysis Articles, Vol. 1, Aegean Park Press, Laguna
Hills, California, 1976 and in Winkel, Brian J., “A Tribute to Alf Mongé,” Cryptologia, Vol. 2, No. 2, April
1978, pp. 178–185.
1 2 3 4 5
6 7 8 9 10
11 12 13 14 15
M N O Q U
V W X Y Z
Thus, Mongé determined 40% of the square already! Returning to the ciphertext, he filled in as
much as he could, indicating multiple possibilities where they existed and were not too numerous.
1 2 3 4 5 6 7 8 9 10 11 12 13 14 15
BU FD AG NP OX IH OQ YT KV QM PM BY DA AE QZ
-- -- -- -- -O -- NO -- -- OU -- -- -- -- UY
M M V W N Q
N O W X O V
O Q X Y Q W
Q U Z Z U X
Q M Z
Which letters do you think would make the best choices for positions 8 and 9? Think about it
for a minute before reading the answer below!
Mongé selected W and Y to form the words NOW and YOU, but positions 8 and 9 represent pairs
of letters, so there must be a two-letter word connecting NOW and YOU in the plaintext. Making
these partial substitutions, T must occur in position 2, 7, or 12 of the enciphering square and K
must be in position 4, 9, or 14. Mongé assumed K didn’t occur in the key, forcing it to be in posi-
tion 14. He then had the following partially recovered square to work with:
1 T? 3 4 5
6 T? 8 9 10
11 T? 13 K 15
M N O Q U
V W X Y Z
Position 15 must be L, so the square quickly becomes:
1 T? 3 4 5
6 T? 8 9 10
11 T? 13 K L
M N O Q U
V W X Y Z
Moving back to the ciphertext/plaintext again gives:
1 2 3 4 5 6 7 8 9 10 11 12 13 14 15
BU FD AG NP OX IH OQ YT KV QM PM BY DA AE QZ
-- -- -- -- -O -- NO W- -Y OU -- -- -- -- UY
M M N Q
N O O V
O Q Q W
Q U U X
L T Z
Mongé then focused on the letters T and K from ciphertext groups 8 and 9. If T was in posi-
tion 12 of the square, then the keyword would be at least 12 letters long and consist of A, B, C,
D, E, F, G, IJ, P, R, S, and T. Mongé rejected this as unlikely, so T was in either position 2 or 7
of the square.
If the keyword was less than 11 letters long, then three of the letters A, B, C, D, E, F, G, H, and
IJ would have to appear in positions 11, 12, and 13 of the square.
Mongé noticed that H and IJ cannot appear in position 11 of the square, as there are not
enough letters between them and K to fill in positions 12 and 13. Thus, position 11 must be A,
B, C, D, E, F, or G. Mongé simply tried each possibility and found that only one worked. For
example, placing A in position 11 causes ciphertext block 9 to decipher to AY, which, in context,
gives a plaintext of NOW –A YOU. There is no letter that can be placed in front of the plaintext A
that makes sense. Similarly, all but one of the other possibilities fizzle out.
Placing F in position 11 makes ciphertext block 9 decipher to FY, so that the plaintext contains
the phrase NOW –F YOU, which may sound vulgar until it is recalled that “–” represents an unknown
letter. It is then easy to see that the plaintext must be NOW IF YOU. Thus, it is also revealed that I
must be in position 4 or 9 of the square. Once F is placed in position 11 and I is forced into position 4
or 9 of the partially recovered square, positions 12 and 13 can be nothing but G and H. We now have
1 T? 3 I? 5
6 T? 8 I? 10
F G H K L
M N O Q U
V W X Y Z
Continuing to work back and forth between the square and the ciphertext, Mongé wrote
1 2 3 4 5 6 7 8 9 10 11 12 13 14 15
BU FD AG NP OX IH OQ YT KV QM PM BY DA AE QZ
-- -- -- -- HO -K NO W- -Y OU -- -- -- -- UY
and saw that the phrase NOW IF YOU was really KNOW IF YOU.
If the attacker can recover the keyword, the solution is immediately obtained. Mongé’s work
thus far indicates that IJ, P, R, S, and T must be part of the key. He expected more than a single
vowel in the key, and so supposed either A or E or both were part of the key. That then left B, C,
and D as (perhaps) not part of the key. Mongé therefore placed them in the square as follows
1 T? 3 I 5
6 T? B C D
F G H K L
M N O Q U
V W X Y Z
This has the added benefit (if correct) of eliminating the ambiguity over the location of I. This
conjecture may be tested against the ciphertext as follows:
1 2 3 4 5 6 7 8 9 10 11 12 13 14 15
BU FD AG NP OX IH OQ YT KV QM PM BY DA AE QZ
DO L- -- -- HO -K NO W- -Y OU -- CX -- -- UY
The decipherment of group 12 as CX may be discouraging at first, but we recall that doubled
plaintext letters are broken up with an X if they are to be enciphered together. Because C can be
doubled in English words, we continue on, now following up on group 13 representing C–. This
hypothesis suggests A belongs in position 7 of the square.
1 T 3 I 5
6 A B C D
F G H K L
M N O Q U
V W X Y Z
This assignment also eliminates the ambiguity concerning the position of T in the square.
Mongé was then able to determine the keyword, which consisted of the letters E, I, P, R, S, T,
but in his explanation continued the analysis by looking at the ciphertext again. Feel free to take
a moment to determine the key before reading the rest of Mongé’s explanation! The ciphertext/
plaintext pairings now become28
1 2 3 4 5 6 7 8 9 10 11 12 13 14 15
BU FD AG NP OX IH OQ YT KV QM PM BY DA AE QZ
DO L- TA -- HO -K NO WI FY OU -- CX C- -- UY
Group 2   Group 4   Group 6   Group 13
  LP        MA        PK        CP
  LR        MT        RK        CR
  LS        OT        SK        CS
  LE        UT        EK        CE
Group 2 must be LE, which then forces E to take position 6 in the square, which, in turn,
makes ciphertext groups 13 and 14, CE and ED, respectively. Thus, the message ends with
––CXCEEDUY. Recalling that the X is only in the plaintext to split up the pair CC, the ending
of the message should read ––CCEEDUY. At this point, the attacker either may guess SUCCEED
followed by two meaningless letters so that the ciphertext could be evenly split into groups of five
characters or he may look at the recovered square again, which is now almost complete, testing the
very few remaining for a meaningful plaintext. With either technique, the square and message are
both quickly revealed as
S T R I P
E A B C D
F G H K L
M N O Q U
V W X Y Z
and
BU FD AG NP OX IH OQ YT KV QM PM BY DA AE QZ
DO LE TA UT HO RK NO WI FY OU SU CX CE ED UY
DO LET AUTHOR KNOW IF YOU SUCCEED
There were several places in the above solution where guesses or assumptions were made. They
all proved correct, but it wouldn’t have been a disaster if one or more were wrong. We’d simply
generate some impossible plaintext and then backtrack to try another guess. It’s no different from
the backtracking that is typically needed when attempting to solve a monoalphabetic substitution
cipher by hand.
28 Mongé left out the possibilities for ciphertext blocks 11 and 14 as being too numerous.
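As a check on the recovered square, the standard Playfair deciphering rules can be applied mechanically. The short Python sketch below is mine, not Mongé's or Friedman's; it deciphers the challenge ciphertext with the STRIP square.

```python
# Decipher Playfair ciphertext with the recovered STRIP square. The rules
# invert encipherment: same row -> shift each letter left, same column ->
# shift each letter up, otherwise swap the letters' columns.
ROWS = ["STRIP", "EABCD", "FGHKL", "MNOQU", "VWXYZ"]
POS = {ch: (r, c) for r, row in enumerate(ROWS) for c, ch in enumerate(row)}

def decipher_pair(a, b):
    (ra, ca), (rb, cb) = POS[a], POS[b]
    if ra == rb:                                  # same row
        return ROWS[ra][(ca - 1) % 5] + ROWS[rb][(cb - 1) % 5]
    if ca == cb:                                  # same column
        return ROWS[(ra - 1) % 5][ca] + ROWS[(rb - 1) % 5][cb]
    return ROWS[ra][cb] + ROWS[rb][ca]            # rectangle rule

blocks = "BU FD AG NP OX IH OQ YT KV QM PM BY DA AE QZ".split()
print("".join(decipher_pair(a, b) for a, b in blocks))
```

This prints DOLETAUTHORKNOWIFYOUSUCXCEEDUY, the running text of DO LET AUTHOR KNOW IF YOU SUCCEED with its X padding intact.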
Mongé didn’t indicate how many incorrect guesses he may have made, but as the challenge
appeared in 1933 and Mongé’s solution appeared in 1936, there is a bound of a few years on how
long this could possibly have taken. It’s unlikely, however, that this is a tight bound! William
Friedman had written, “The author once had a student [Mongé] who ‘specialized’ in Playfair
ciphers and became so adept that he could solve messages containing as few as 50-60 letters within
30 minutes.”29
29 Friedman, William F., Military Cryptanalysis, Part I, Aegean Park Press, Laguna Hills, California, 1996,
p. 97, taken here from Winkel, Brian J., “A Tribute to Alf Mongé,” Cryptologia, Vol. 2, No. 2, April 1978,
pp. 178–185.
30 Cowan, Michael J., “Breaking Short Playfair Ciphers with the Simulated Annealing Algorithm,” Cryptologia,
Cowan gives much more detail in his paper. It seems that his approach is reliable, but runtime
may vary greatly depending on the particular cipher being examined and the initial guess. His
solving times for particular ciphers (averaged over many initial guesses) ranged from about 6 sec-
onds to a little more than a half hour.
After the idea of replacing characters two at a time is contemplated, a good mathemati-
cian ought to quickly think of a generalization. Why not replace the characters three at a time
(trigraphic substitution) or four at a time, or n at a time? A nice mathematical way of doing this
(using matrices) is described in Section 6.1.
Figure 4.12 Auguste Kerckhoffs (1835–1903). (Courtesy of the David Kahn Collection, National
Cryptologic Museum, Fort Meade, Maryland.)
K1. The system should be, if not theoretically unbreakable, unbreakable in practice.
K2. Compromise of the system should not inconvenience the correspondents.
K3. The method for choosing the particular member (key) of the cryptographic system to be
used should be easy to memorize and change.
K4. Ciphertext should be transmittable by telegraph.
K5. The apparatus should be portable, and its use should not require more than one person.
K6. The system should be easy to use, demanding neither mental strain nor knowledge of a long
series of rules.
31 Taken here from Konheim, Alan G., Cryptography, A Primer, John Wiley & Sons, New York, 1981, p. 7.
Item K6 was echoed by Claude Shannon years later in his paper “Communication Theory of
Secrecy Systems” with a justification: “Enciphering and deciphering should, of course, be as simple
as possible. If they are done manually, complexity leads to loss of time, errors, etc. If done mechan-
ically, complexity leads to large expensive machines.”32 Shannon also shortened K2 to “the enemy
knows the system,” which is sometimes referred to as Shannon’s maxim.
Revealing the details of the system is actually a good way to make sure it’s secure. If the world’s
best cryptanalysts cannot crack it, you have an ad campaign that money can’t buy. Despite all of
this, some modern purveyors of cryptosystems still try to keep their algorithms secret. An example
that will be examined in greater detail in Section 19.5 is RC4, sold by RSA Data Security, Inc.
Despite the effort to maintain secrecy, the algorithm appeared on the cypherpunks mailing list.33
32 Shannon, Claude, “Communication Theory of Secrecy Systems,” The Bell System Technical Journal, Vol. 28,
No. 4, October 1949, pp. 656–715. Shannon noted, “The material in this paper appeared in a confidential
report, ‘A Mathematical Theory of Cryptography,’ dated Sept. 1, 1945, which has now been declassified.”
33 See http://www.cypherpunks.to/ for more information on the cypherpunks.
Marino, Gigi, “The Biologist as Bibliosleuth,” Research Penn State, Vol. 27, No. 1, Fall 2007, pp. 13–15.
Pyle, Joseph Gilpin, The Little Cryptogram, The Pioneer Press Co., St. Paul, Minnesota, 1888, 29 pages. This
is a spoof of Donnelly’s 998-page book The Great Cryptogram.
Schmeh, Klaus, “The Pathology of Cryptology – A Current Survey,” Cryptologia, Vol. 36, No. 1, January
2012, pp. 14–45. Schmeh recommends Pyle’s approach to investigation of alleged hidden messages:
If the technique yields messages in similar items, selected at random, or different messages from the
original source, then it is likely to be an invalid technique.
Stoker, Bram, Mystery of the Sea, Doubleday and Company, New York, 1902. Stoker used the biliteral cipher
extensively in this novel—not to hide a message within its text, but rather for two characters in the
novel to communicate with each other. The two symbols needed for the cipher are manifested in a
great variety of ways, not limited to print.
Walpole, Horace, Historic Doubts on the Life and Reign of King Richard the Third, J. Dodsley, London, UK,
1768, reprinted by Rowman & Littlefield, Totowa, New Jersey, 1974. The claims Pratt says Walpole
makes are not to be found in here!
Zimansky, Curt A., “Editor’s Note: William F. Friedman and the Voynich Manuscript,” Philological
Quarterly, Vol. 49, No. 2, October 1970, pp. 433–443. The last two pages reproduce text mask-
ing messages via Bacon’s biliteral cipher. This paper was reprinted in Brumbaugh, Robert S., editor,
The Most Mysterious Manuscript, Southern Illinois University Press, Carbondale and Edwardsville,
Illinois, 1978, pp. 99–108 with notes on pp. 158–159.
On Wheel Ciphers
Bazeries, Étienne, Les Chiffres Secrets Dévoilés, Charpentier-Fasquelle, Paris, France, 1901.
Bedini, Silvio A., Thomas Jefferson Statesman of Science, Macmillan, New York, 1990. Although this biog-
raphy contains only a few paragraphs dealing with cryptology, it does focus on Jefferson’s scientific
interests and accomplishments.
de Viaris, Gaëtan, L’art de Chiffrer et Déchiffrer les Dépêches Secretes, Gauthier-Villars, Paris, France, 1893.
The attack described by de Viaris makes the same assumption as the example in this chapter.
Friedman, William F., Several Machine Ciphers and Methods for their Solution, Publication No. 20, Riverbank
Laboratories, Geneva, Illinois, 1918. Friedman showed attacks on the wheel cipher in part III of this
paper. This paper was reprinted together with other Friedman papers in Friedman, William F., The
Riverbank Publications, Vol. 2, Aegean Park Press, Laguna Hills, California, 1979. As the original
printing only consisted of 400 copies, I suggest looking for the reprint instead.
Gaddy, David W., “The Cylinder-Cipher,” Cryptologia, Vol. 19, No. 4, October 1995, pp. 385–391. Gaddy
argues that the wheel cipher was probably not an independent invention of Jefferson, but rather that
he got the idea from an already existing wheel or description.
Kruh, Louis, “The Cryptograph that was Invented Three Times,” The Retired Officer, April 1971, pp. 20–21.
Kruh, Louis, “The Cryptograph that was Invented Three Times,” An Cosantoir: The Irish Defense Journal,
Vol. 32, No. 1–4, January–April, 1972, pp. 21–24. This is a reprint of Kruh’s piece from The Retired
Officer.
Kruh, Louis, “The Evolution of Communications Security Devices,” The Army Communicator, Vol. 5,
No. 1, Winter 1980, pp. 48–54.
Kruh, Louis, “The Genesis of the Jefferson/Bazeries Cipher Device,” Cryptologia, Vol. 5, No. 4, October
1981, pp. 193–208.
Mellen, Greg and Lloyd Greenwood, “The Cryptology of Multiplex Systems,” Cryptologia, Vol. 1, No. 1,
January 1977, pp. 4–16. This is an interesting introduction and overview of wheel cipher/strip cipher
systems. The cryptanalysis is done in the sequel, referenced below.
Mellen, Greg and Lloyd Greenwood, “The Cryptology of Multiplex Systems. Part 2: Simulation and
Cryptanalysis,” Cryptologia, Vol. 1. No. 2, April 1977, pp. 150–165. A program in FORTRAN V
to simulate the M-94 is described. Cryptanalysis for three cases is examined: (1) known alphabets
and known crib; (2) unknown alphabets and known crib (“A crib of 1000–1500 characters is desir-
able. Shorter cribs of several hundred letters can be used but prolong the effort.”); and (3) unknown
alphabets and unknown crib. The authors noted, “The general method for this case was originated by
the Marquis de Viaris in 1893 [15] and elaborated upon by Friedman [16].” In the reference section at
the end of this paper, we see that [15] refers to David Kahn’s The Codebreakers, pp. 247–249, but [16]
is followed by blank space. Perhaps this work by Friedman was classified at the time and couldn’t be
cited!
Rohrbach, Hans, “Report on the Decipherment of the American Strip Cipher O-2 by the German Foreign
Office (Marburg 1945),” Cryptologia, Vol. 3, No. 1, January 1979. Rohrbach was one of the German
codebreakers who cracked this cipher during World War II. Following a preface, his 1945 report on
how this was done is reprinted.
Smithline, Lawren M., “A Cipher to Thomas Jefferson: A Collection of Decryption Techniques and the
Analysis of Various Texts Combine in the Breaking of a 200-year-old Code,” American Scientist,
Vol. 97, No. 2, March–April 2009, pp. 142–149.
Smoot, Betsy Rohaly, “Parker Hitt’s First Cylinder Device and the Genesis of U.S. Army Cylinder and Strip
Devices,” Cryptologia, Vol. 39, No. 4, October 2015, pp. 315–321.
For 29 years (116 issues), Cryptologia almost never repeated a cover. When it was decided to settle
on a single cover, only changing the dates each time, the image that won was of a wheel cipher
(Figure 4.13). This is fitting, as a wheel cipher cover marked the journal’s debut.
In World War I, all of the ciphers previously discussed in this book saw use, even the weakest
ones! We focus here on the best systems, ADFGX and ADFGVX. The fascinating life of Herbert
O. Yardley is also covered, but first we look at a coded telegram that had a huge effect on the war.
1 http://www.nara.gov/education/teaching/zimmermann/zimmerma.html.
President of the above most secretly as soon as the outbreak of war with the United
States of America is certain and add the suggestion that he should, on his own ini-
tiative, invite Japan to immediate adherence and at the same time mediate between
Japan and ourselves. Please call the President’s attention to the fact that the ruthless
employment of our submarines now offers the prospect of compelling England in a
few months to make peace.
—Signed, Zimmermann.
Mexico is hardly regarded as a military threat to the United States today, but 1917 was only one
year after the punitive expedition of American troops into Mexico. Bearing this in mind helps us
to see how Mexico might have reacted positively to German overtures. Herbert O. Yardley noted,
“Mexico was openly pro-German. Our own spies who had been sent into Mexico reported that
hundreds of German reservists who fled across the border at the declaration of war were recruiting
and drilling Mexican troops.”2
The British intercepted a copy of the telegram and broke the code. The message, along with
the sinking of the Laconia (two years after the Lusitania), prompted America to join the war on
the side of England. If America had not joined the war, Germany might have won. The British
faced a challenging problem following their decoding of the telegram. How could they share it
with America and (1) not tip the Germans off to the fact that their code had been broken, and
(2) convince President Wilson that the telegram was real? Problem 2 ended up being solved by
Zimmermann himself, when he admitted on March 3 that the telegram was genuine, crushing
theories that it was a British invention designed to gain the badly needed military strength of the
United States.3
The telegram didn’t arrive like an email. It passed through Washington, where it was decoded
and put into an older code, as the ultimate destination didn’t have the codebook it was originally
sent in. The British were able to obtain the second version of the telegram that was received in
Mexico. It is this version, after decoding, that they shared with President Wilson. It differed
slightly from the original. The Germans recognized these differences and, instead of realizing
their code was broken, assumed there must have been a traitor, or a flaw in the security protocol,
in Mexico.
Although this is usually the only Zimmermann telegram mentioned in cryptology books,
another enciphered message of interest was sent earlier, on January 26, 1915:4
For Military Attaché: You can obtain particulars as to persons suitable for carrying on
sabotage in the U.S. and Canada from the following persons: one, Joseph MacGarrity,
Philadelphia; two, John P. Keating, Michigan Avenue, Chicago; three, Jeremiah
O’Leary, 16 Park Row, New York. One and two are absolutely reliable and discreet.
Number three is reliable but not always discreet. These persons were indicated by Sir
Roger Casement. In the U.S. sabotage can be carried out in every kind of factory for
supplying munitions of war. Railway embankments and bridges must not be touched.
Embassy must in no circumstances be compromised. Similar precautions must be
taken in regard to Irish pro-German propaganda.
—Signed, Zimmermann.
A few words should be written on the British cryptologists at this point. First, they were ahead of
the Germans, who didn’t even have any cryptanalysts on the western front for the first two years
of the war!5 But compared to today’s gigantic cryptologic agencies, they were very few in number.
2 Yardley, Herbert O., The American Black Chamber, Espionage/Intelligence Library, Ballantine Books, New
York, 1981, p. 90.
3 Kippenhahn, Rudolf, Code Breaking: A History and Exploration, The Overlook Press, New York, 1999, p. 65.
4 Sayers, Michael and Albert E. Kahn, Sabotage! The Secret War Against America, Harper & Brothers Publishers,
New York, 1942, p. 8. A pair of pictures is provided on page 9.
5 Kahn, David, The Codebreakers, second edition, Scribner, New York, 1996, p. 313.
A total of 50 or so cryptanalysts worked in Room 40 of the Old Admiralty Building, where they
recovered about 15,000 encoded or enciphered messages between October 1914 and February
1919.6 Imagine yourself in a classroom with 50 of your peers. How many messages would your
group be able to crack? Although the cryptanalysts that were recruited were carefully chosen, to be
of very high intelligence, and many possessed fluency in foreign languages, they initially knew less
about cryptanalysis than anyone who has read this far. So, the comparison is fair, and it’s a good
thing those in Room 40 were quick learners.
These cryptanalysts received help on a few occasions in the form of recovered German code
books. One came as a gift from the Russians. On August 26, 1914, the German light cruiser
Magdeburg became stuck in shallow water at Odensholm (now Osmussaar) in the Baltic Sea. This
was Russian territory and their troops were able to recover the German Navy’s main code book
from the wreck, despite attempts by the Germans to destroy everything. The Russians then passed
it on to the British, who were the stronger naval power and could use it to great advantage. Indeed,
the Germans kept this particular code in use for years!7
World War I is referred to as “The Chemists’ War” due to the major role of chemical warfare,
and World War II is called “The Physicists’ War” because of the atomic bomb. It has been claimed
that, if it occurs, World War III will be “The Mathematicians’ War” (if anyone is left to talk about
it). Just imagine a cyberattack that renders all of the enemies’ computer systems useless and shuts
down all enemy communications.
6 Kahn, David, The Codebreakers, second edition, Scribner, New York, 1996, pp. 275 and 273.
7 Rislakki, Jukka, “Searching for Cryptology’s Great Wreck,” Cryptologia, Vol. 31, No. 3, July 2007, pp.
263–267.
8 Kahn, David, The Codebreakers, second edition, Scribner, New York, 1996, p. 307.
9 Kahn, David, The Codebreakers, second edition, Scribner, New York, 1996, p. 344.
10 Norman, Bruce, “The ADFGVX Men,” The Sunday Times Magazine, August 11, 1974, pp. 8–15, p. 11 cited
here.
11 This is not the first time that substitution and transposition were combined. Some earlier instances are pointed
out in the References and Further Reading list at the end of this chapter.
World War I and Herbert O. Yardley ◾ 167
Figure 5.2 General Erich Ludendorff directed Germany’s spring offensive under the protection
of a new and seemingly secure cipher system. (http://en.wikipedia.org/wiki/Erich_Ludendorff.)
12 Kahn, David, The Codebreakers, second edition, Scribner, New York, 1996, p. 345.
13 This translates to “God is With Us” and appeared on the belt buckles of some German troops in both World
War I and World War II. If true, it looks like God is 0 for 2 in world wars.
was common to have 20 values. This is an example of a fractionating cipher, so-called because the
original message letters are replaced by pairs that become split in the transposition step.
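Fractionation is easy to see in miniature. In the Python sketch below (mine, for illustration only), the Polybius square and the transposition keyword are invented, not taken from any historical key:

```python
LABELS = "ADFGX"
SQUARE = "BTALPDHOZKQFVSNGICUXMREWY"  # hypothetical 5x5 square, I/J merged

def adfgx_encipher(plaintext, key):
    # Step 1 (substitution): replace each letter by its row and column labels.
    pairs = ""
    for ch in plaintext:
        i = SQUARE.index(ch)
        pairs += LABELS[i // 5] + LABELS[i % 5]
    # Step 2 (transposition): write the pairs in rows under the key, then read
    # the columns in alphabetical order of the key letters (assumed distinct).
    # This is the step that splits the two halves of each letter's pair.
    columns = {k: pairs[j::len(key)] for j, k in enumerate(key)}
    return "".join(columns[k] for k in sorted(columns))

print(adfgx_encipher("ATTACK", "CARGO"))
```

Because the transposition separates the two halves of each letter's pair, single-letter frequency analysis of the ciphertext tells the attacker almost nothing until the columns are restored.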
His masterly solutions of German ciphers caused him to become known as “artisan
of the victory” over the Germans when Paris might have fallen but for the knowledge
gained of German intentions by Painvin of where they would strike.
Figure 5.3 French cryptanalyst Georges Painvin. (Courtesy of the David Kahn Collection at the
National Cryptologic Museum.)
14 Kahn, David, The Codebreakers, second edition, Scribner, New York, 1996, p. 347.
15 Childs, J. Rives, “My Recollections of G.2 A.6,” Cryptologia, Vol. 2, No. 3, July 1978, pp. 201–214, p. 206
quoted here.
Painvin’s solution didn’t allow all ADFGVX ciphers to be read, but did crack some special cases.
A general solution was found only after the war ended. We examine a special case below, but some
of the comments can be applied more generally. If the following paragraphs are too abstract, feel
free to skip ahead to the example.
To attack an ADFGVX cipher, all we need to worry about is how to unravel the transposition
portion. Once this is done, we’re left with a Polybius cipher that’s very easy to break. The first
step in unraveling the transposition portion is determining how many columns were used. We’ll
assume that the intercepted message has no “short columns.” That is, the message forms a perfect
rectangle. This will make things somewhat easier. As a first step, we determine if the number of
columns is even or odd. This can be done by comparing two sets of frequencies. To see this, con-
sider the generic examples below.
Expressing the pairs of ciphertext letters representing each message letter as BE (B stands for
“beginning” and E stands for “end”), our rectangle, prior to transposition will take one of two
forms, depending on whether the number of columns is even or odd.
Even # of Columns

B E B E B E … B E
B E B E B E … B E
B E B E B E … B E
B E B E B E … B E
:
B E B E B E … B E

Odd # of Columns

B E B E B E … B E B
E B E B E B … E B E
B E B E B E … B E B
E B E B E B … E B E
:
(form of last row depends on number of rows)
For the even case, after transposing columns, each column will still be all Bs or all Es. For the
odd case, after transposing columns, each column will still alternate Bs and Es.
Unless the placement of the letters in the Polybius square is carefully done to avoid it, the fre-
quencies of the individual letters A, D, F, G, V, and X will differ as beginning characters and end
characters in the ciphertext pairs. This allows a cryptanalyst to determine, using the patterns above,
if the number of columns is even or odd. The manner in which this is done is now described.
Given a message of n characters, we divide by a number we feel is an upper bound for the
number of columns, c, used. The result, n/c, will be a lower bound on the number of rows. Suppose
n/c is 18, for example, then the first 18 letters in the ciphertext are all from the same column. It
could be a column of all Bs, all Es, or half and half. Take the characters in the odd positions and
construct a frequency distribution. These characters must all be of the same type (all Bs or all Es),
whether the number of columns is even or odd. Now take the characters in the even positions and
construct a frequency distribution.
Again, these characters must all be of the same type, B or E. Now compare the two frequency dis-
tributions. If they look similar, then all characters in that column are of the same type, so the number
of columns must be even. If they look dissimilar, an odd number of columns must have been used.
If we decide that the frequency distributions match, then the number of columns is even. We
can then plug the ciphertext into rectangles representing each possible number of columns, 2,
4, 6, …, 22, 24, 26, 28, … (the extreme ends are not likely). For each case, we may then calculate
frequency distributions for each column. For the correct case, the distributions should fall into
two distinct groups, each containing the same number of columns. A similar approach is used to
determine the number of columns when it is known to be odd (See Exercise 7).
Once the number of columns is known, the ciphertext may be written out in the appropriate
size rectangle. In order to undo the transposition, we first use the distinct frequency distributions
to label each column as either a B or an E (in the odd case, this label would merely indicate which
type of character begins the column). At this stage there is no way of knowing which is which,
but that’s not a problem! We simply label an arbitrary column as a B, then label the other columns
with similar frequency distributions as Bs, and finally label the rest as Es.
If the first column we labeled was labeled correctly, all is well. If it was actually an E column,
that’s okay too, as the only change that makes is to index the entries in the Polybius square by
column and row, rather than by row and column.
We may then pair B columns with E columns such that the resulting ciphertext letters have a
frequency distribution that resembles the suspected plaintext language. Once the letters have been
recreated by joining Bs to Es, we should be able to pick out some high frequency plaintext letters
such as E and T. This will help us to order the paired columns correctly, especially if we have a crib
or can find the appearances of common words such as THE and AND. When the columns are all
correctly ordered, the rest is easy—just a monoalphabetic substitution cipher without word spac-
ing. The example that follows should make this attack much clearer.
Example 1
We’ll attack a message enciphered with the original version, ADFGX, so that we needn’t be con-
cerned with the frequencies of the various numbers, after we unravel the transposition. We’ll
assume that the message is in English and that I and J are enciphered by the same pair, as a 5-by-5
grid only allows for 25 plaintext characters. Our ciphertext is
AXDXD XDDDX DXXDD DXXDD DXXDG DXGXX XDFGA AGGAF FGGFA AAFFA
ADGGF GFFAD FAFAD FGGAF DFDXD XDFFX AXDXG FGFGX DXGXX DXFAD
XGFDA AFADF FFGGA DFGDF FADFA GAAFF GAAGG XFFDF GGDFG FDFFF
GAFDA FAFAF GAFAA FAFFX DXFXF GDDGX DFFFG XDFXX XDFFX ADAFA
FDXFX FGADD GGDDA AXXXX FFGXX FDXXD FXFGD DFFFD DXDDA DDXDD
GXAFD DXXXX DGGDF XXXXF XXDDD AGGDA FAAGF GGGFA GFGAG FFXAG
FFFGF FXXFX AFXDG DXXXD XXXXD XAADF FXDDF GXGDX XFXXX AGGXD
AFFAX FGFAX XXXAD FFDFD DFDXD XFFXX XDXDA GDGFX XGDFA FGXFG
DDXXX XXGXF XFXXF AXGXF DXDDD AXDDD XFXFD XAFDG XFGGA AAAGF
GAAAA FGAGA AGAAA FDGAF DAGAA GGFDF FGGGG GGAGG AFGAA GFFFG
FGAFF DFAFA GGAGA FGAAD AGGGF GFGFG FFAGA GGAAF AAAGD GGXGF
GGAFF AGAFG AAAAF GDAAG DGFGF FGGXX DDXFD FXXXG GAXXX GGDDG
FFGXD XGDGX FXXGA AGAFG ADAGG FXFGG GAAGA FFGFD DAAAA DGAFF
AFGDA ADFGD FAAGG AFAAG FGGGG FFGDG
We have 680 characters and we assume that no more than 30 columns were used, so there must be at
least 22 characters in each column (680/30 ≈ 22.67). We take the first 22 characters, AXDXD XDDDX
DXXDD DXXDD DX, and find the frequency distribution for the characters in the odd positions:
A = 1, D = 8, F = 0, G = 0, X = 2
and in the even positions:
A = 0, D = 4, F = 0, G = 0, X = 7
Experience helps us to decide whether the two distributions are similar or dissimilar. The
marked difference in the frequency of X might incline us to the latter, but F and G have identi-
cal frequencies, and A is as close as possible without being identical. With three out of five letters
matching so closely, we conclude that the distributions are the same.
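The two tallies can be reproduced mechanically. The few lines of Python below are a sketch of mine, not part of the original analysis:

```python
from collections import Counter

# The first 22 ciphertext letters -- one presumed column's worth.
first_column = "AXDXDXDDDXDXXDDDXXDDDX"

odd_positions = Counter(first_column[0::2])    # 1st, 3rd, 5th, ... letters
even_positions = Counter(first_column[1::2])   # 2nd, 4th, 6th, ... letters
print(dict(odd_positions))    # {'A': 1, 'D': 8, 'X': 2}
print(dict(even_positions))   # {'X': 7, 'D': 4}
```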
Under the assumption that the rectangle is completely filled in, we can also examine the last
22 characters, GD FAAGG AFAAG FGGGG FFGDG, to see if our conclusion is reinforced. For
characters in odd positions, we have:
A = 2, D = 1, F = 4, G = 4, X = 0
and in the even positions:
A = 3, D = 1, F = 1, G = 6, X = 0
The values for D and X match exactly, the values for A only differ by one, and the values for G
differ by two, so our conclusion gains further support. Also, observe how markedly both of these
distributions (no Xs!) differ from the first 22 characters. It seems that the columns these letters
represent cannot both be of the same type (B or E).
So, we have an even number of columns, and that number must divide 680. Our choices are 2,
4, 8, 10, 20, 34, 40, 68, 136, 170, 340, or 680. We already assumed that no more than 30 columns
were used, so our list quickly shrinks to 2, 4, 8, 10, 20. The smaller values seem unlikely, so we test 10 and 20.
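The divisor bookkeeping is easy to check mechanically (a quick sketch):

```python
# Even divisors of 680 no larger than our assumed 30-column maximum.
n, max_cols = 680, 30
candidates = [c for c in range(2, max_cols + 1, 2) if n % c == 0]
print(candidates)   # [2, 4, 8, 10, 20]
```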
10 Columns

Column  Characters    A    D    F    G    X
   1       1-68      12   18   12   12   14
   2      69-136     13   13   18   12   12
   3     137-204     11   12   27    8   10
   4     205-272      5   20   12    8   23
   5     273-340     10   12   16   14   16
   6     341-408      7   13   15    9   24
   7     409-476     21   11   12   12   12
   8     477-544     19    3   19   27    0
   9     545-612     10   10   13   21   14
  10     613-680     21    8   15   22    2
Columns 8 and 10 stand out as having very few Xs, but we need to split the columns into two
groups, Bs and Es, so each group must contain five columns. What other three columns resemble
these? The next lowest frequencies for X are 10, 12, and 12—quite a jump!
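Column tallies like these are tedious by hand but immediate by machine. The following Python sketch (mine, not part of the original solution) rebuilds the frequency table for any trial number of columns:

```python
from collections import Counter

# The 680-letter ciphertext of the example, with spaces removed.
CT = (
    "AXDXDXDDDXDXXDDDXXDDDXXDGDXGXXXDFGAAGGAFFGGFAAAFFA"
    "ADGGFGFFADFAFADFGGAFDFDXDXDFFXAXDXGFGFGXDXGXXDXFAD"
    "XGFDAAFADFFFGGADFGDFFADFAGAAFFGAAGGXFFDFGGDFGFDFFF"
    "GAFDAFAFAFGAFAAFAFFXDXFXFGDDGXDFFFGXDFXXXDFFXADAFA"
    "FDXFXFGADDGGDDAAXXXXFFGXXFDXXDFXFGDDFFFDDXDDADDXDD"
    "GXAFDDXXXXDGGDFXXXXFXXDDDAGGDAFAAGFGGGFAGFGAGFFXAG"
    "FFFGFFXXFXAFXDGDXXXDXXXXDXAADFFXDDFGXGDXXFXXXAGGXD"
    "AFFAXFGFAXXXXADFFDFDDFDXDXFFXXXDXDAGDGFXXGDFAFGXFG"
    "DDXXXXXGXFXFXXFAXGXFDXDDDAXDDDXFXFDXAFDGXFGGAAAAGF"
    "GAAAAFGAGAAGAAAFDGAFDAGAAGGFDFFGGGGGGAGGAFGAAGFFFG"
    "FGAFFDFAFAGGAGAFGAADAGGGFGFGFGFFAGAGGAAFAAAGDGGXGF"
    "GGAFFAGAFGAAAAFGDAAGDGFGFFGGXXDDXFDFXXXGGAXXXGGDDG"
    "FFGXDXGDGXFXXGAAGAFGADAGGFXFGGGAAGAFFGFDDAAAADGAFF"
    "AFGDAADFGDFAAGGAFAAGFGGGGFFGDG"
)

def column_counts(ct, c):
    """Assuming c complete columns, column k is simply the k-th block of
    len(ct) // c consecutive letters, since the ciphertext was read off
    column by column."""
    r = len(ct) // c
    return [Counter(ct[k * r:(k + 1) * r]) for k in range(c)]

# With 10 columns, columns 8 and 10 (indices 7 and 9) have almost no Xs.
for k, tally in enumerate(column_counts(CT, 10), start=1):
    print(k, [tally[ch] for ch in "ADFGX"])
```

Running the same function with c = 20 reproduces the twenty-column table as well.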
20 Columns

Column  Characters    A    D    F    G    X
   1       1-34       1   15    1    3   14
   2      35-68      11    3   11    9    0
   3      69-102      3    8    7    5   11
   4     103-136     10    5   11    7    1
   5     137-170      8    4   16    5    1
   6     171-204      3    8   11    3    9
   7     205-238      3    8    8    5   10
   8     239-272      2   12    4    3   13
   9     273-306      7    4   11   11    1
  10     307-340      3    8    5    3   15
  11     341-374      5    6    9    3   11
  12     375-408      2    7    6    6   13
  13     409-442      3    9    8    2   12
  14     443-476     18    2    4   10    0
  15     477-510      7    2   12   13    0
  16     511-544     12    1    7   14    0
  17     545-578      9    3    8   13    1
  18     579-612      1    7    5    8   13
  19     613-646     12    4    6   10    2
  20     647-680      9    4    9   12    0
From this table, it's easy to split the columns into two groups with distinct frequency distributions.
The frequency of X, by itself, clearly distinguishes them. Thus, we conclude that 20 columns were
used. Our two distinct groups are Group 1, columns 1, 3, 6, 7, 8, 10, 11, 12, 13, and 18 (many Xs),
and Group 2, columns 2, 4, 5, 9, 14, 15, 16, 17, 19, and 20 (few Xs).
We must now pair them together to represent the plaintext letter. Our work thus far fails to indi-
cate whether Group 1 columns are beginnings or ends of pairs. Happily, it doesn’t matter. As men-
tioned prior to this example, reversing the order of the pairs arising from the Polybius cipher will
simply correspond to someone having misused the table—writing first the column header, then
the row header, instead of vice versa. As long as all pairs are switched, switching doesn’t matter. So,
we’ll assume that the high frequency X group provides the beginnings. To determine which Group
2 column completes each of the Group 1 beginnings, Painvin, and the American cryptanalysts
who examined the problem in the years to follow, simply looked at the frequency distributions for
the various possibilities and selected the ones that looked the most like the language of the mes-
sage. We’d prefer a more objective method, but the obvious approaches don’t produce great results.
Two approaches are examined below.
Although not discovered until after World War I, the index of coincidence seems like it should
be a good measure. If a potential pairing of columns yields a value near 0.066, we favor it over
pairings yielding other values. The complete results are given below, with the correct pairings
marked by an asterisk.

                               End Column
        2      4      5      9      14     15     16     17     19     20
Start Column
 1   0.0909 0.0802 0.1087 0.0891 0.1462 0.1052 0.1034* 0.0856 0.0963 0.0873
 3   0.0517 0.0481 0.0749 0.0624* 0.0731 0.0517 0.0784 0.0481 0.0481 0.0446
 6   0.0535 0.0463 0.0766* 0.0553 0.0766 0.0713 0.0731 0.0517 0.0446 0.0588
 7   0.0570 0.0446* 0.0606 0.0606 0.0660 0.0642 0.0677 0.0535 0.0517 0.0535
 8   0.0731* 0.0553 0.0856 0.0695 0.0998 0.0820 0.0784 0.0677 0.0980 0.0624
10   0.0713 0.0606 0.0749 0.0588 0.1248 0.0802* 0.1230 0.0570 0.0588 0.0695
11   0.0624 0.0535 0.0517 0.0446 0.0873 0.0695 0.0802 0.0463* 0.0624 0.0677
12   0.0606 0.0713 0.0677 0.0660 0.0731 0.0695 0.0731 0.0624 0.0570 0.0695*
13   0.0570 0.0535 0.0660 0.0606 0.0873 0.0624 0.1141 0.0588 0.0535* 0.0570
18   0.0713 0.0624 0.0570 0.0499 0.1016* 0.0713 0.0677 0.0606 0.0660 0.0749

The correct values range from 0.0446 to 0.1034; thus, this test is not as useful as we might expect.
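The index of coincidence computation is simple enough to state as code. In this Python sketch of mine, the "symbols" fed to the function would be the two-letter groups formed by joining a candidate B column to a candidate E column:

```python
from collections import Counter

def index_of_coincidence(symbols):
    """Probability that two symbols drawn at random, without replacement,
    from the sequence agree; about 0.066 when the symbols distribute like
    English letters."""
    n = len(symbols)
    return sum(f * (f - 1) for f in Counter(symbols).values()) / (n * (n - 1))

# For a candidate column pairing, form the two-letter groups first, e.g.
#   pairs = [b + e for b, e in zip(b_column, e_column)]
# and then evaluate index_of_coincidence(pairs).
print(round(index_of_coincidence("ATTACKATDAWN"), 4))
```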
Another obvious approach is to examine, for each possible pairing, the frequency table and see
how it compares to that of normal English. To do this, we order the frequencies for each pairing
and the regular alphabet, then compare the most frequent in each group, the second most fre-
quent in each group, and so on. To attach a number to this, we sum the squares of the differences
between observed and expected frequencies. This yields the table below. Once again, values for
correct pairings are marked by an asterisk.

                               End Column
        2      4      5      9      14     15     16     17     19     20
Start Column
 1   0.0208 0.0140 0.0310 0.0187 0.0556 0.0277 0.0285* 0.0167 0.0245 0.0191
 3   0.0039 0.0029 0.0123 0.0064* 0.0105 0.0039 0.0161 0.0029 0.0072 0.0040
 6   0.0061 0.0045 0.0115* 0.0056 0.0149 0.0109 0.0105 0.0046 0.0033 0.0055
 7   0.0050 0.0040* 0.0068 0.0076 0.0078 0.0084 0.0099 0.0070 0.0046 0.0054
 8   0.0112* 0.0069 0.0162 0.0084 0.0269 0.0175 0.0135 0.0105 0.0254 0.0064
10   0.0119 0.0076 0.0140 0.0052 0.0388 0.0127* 0.0438 0.0053 0.0074 0.0084
11   0.0071 0.0045 0.0035 0.0028 0.0192 0.0092 0.0144 0.0045* 0.0083 0.0103
12   0.0073 0.0109 0.0087 0.0100 0.0105 0.0115 0.0105 0.0063 0.0061 0.0119*
13   0.0047 0.0038 0.0092 0.0058 0.0174 0.0084 0.0347 0.0055 0.0046* 0.0062
18   0.0103 0.0063 0.0057 0.0049 0.0253* 0.0090 0.0089 0.0075 0.0092 0.0135
Looking at the fourth row (headed with 7) we see that the smallest value represents the correct
pairing. Sadly, this is the only row for which this happens! Thus, this approach also fails to readily
pair the columns.
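The rank-matching measure just described amounts to a few lines of Python (a sketch, with made-up frequency lists for illustration):

```python
def match_score(observed, expected):
    """Sort both frequency lists in decreasing order, then sum the squared
    differences between rank-matched frequencies (smaller = more alike)."""
    return sum((o - e) ** 2
               for o, e in zip(sorted(observed, reverse=True),
                               sorted(expected, reverse=True)))

# Identical shapes score 0; a lopsided distribution scores worse.
print(match_score([0.5, 0.5], [0.5, 0.5]))   # 0.0
print(match_score([1.0, 0.0], [0.5, 0.5]))   # 0.5
```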
As was mentioned before, Painvin, and later American cryptanalysts who approached this
problem, didn’t use either of these measures. They simply looked at the frequency distributions for
possible pairings and determined by sight which were most likely. Given that there are 10! ways
to pair the columns when 20 columns are used, this must have taken them a great deal of time.
Surely this was the most difficult step in solving ADFGX and ADFGVX.
With today’s technology, we can consider all 10! possibilities. Each possibility then gives 10
columns (each consisting of two letters per row), which may be arranged in 10! ways. The correct
arrangement then represents a monoalphabetic substitution cipher without word spacing, which
may easily be solved with technology or by hand.
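To get a feel for the size of this search, here is a minimal Python sketch. The column labels are the ones from this example, but the scoring step is only indicated as a comment, since any reasonable English-likeness measure could be plugged in.

```python
from itertools import permutations
import math

# Stand-in labels for the ten "start" columns and ten "end" columns of the
# 20-column example; a real attack would carry each column's letters along.
start_cols = [1, 3, 6, 7, 8, 10, 11, 12, 13, 18]
end_cols = [2, 4, 5, 9, 14, 15, 16, 17, 19, 20]

print(math.factorial(len(start_cols)))  # 3628800 candidate pairings

# A toy enumeration with three columns shows the shape of the search loop;
# scoring each candidate pairing (e.g., with an English-likeness measure)
# would pick out the most promising assignments.
for perm in permutations(["b1", "b2", "b3"]):
    pairing = list(zip(["a1", "a2", "a3"], perm))
    # score(pairing) would be evaluated here
```

The same loop, run over the full ten end columns, visits all 3,628,800 assignments, which is entirely feasible on modern hardware.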
We continue our attack, assuming that the correct pairings have been determined, probably
after tremendous trial and error. The pairings are
1 ↔ 16
3 ↔ 9
6 ↔ 5
7 ↔ 4
8 ↔ 2
10 ↔ 15
11 ↔ 17
12 ↔ 20
13 ↔ 19
18 ↔ 14
We must now find the proper order for these ten pairs of columns and solve the Polybius cipher
(without word divisions) that they provide. There are 10! = 3,628,800 ways to arrange pairs of
columns, so we could brute force a solution with a computer.
174 ◾ Secret History
If we prefer to stick to World War I-era technology, we can use the frequency distribution for
all of the pairs to guess at some letters and piece together columns by making words. We have
AA = 6 DA = 27 FA = 19 GA = 15 XA = 36
AD = 3 DD = 7 FD = 8 GD = 6 XD = 8
AF = 12 DF = 22 FF = 23 GF = 7 XF = 31
AG = 5 DG = 31 FG = 12 GG = 13 XG = 43
AX = 0 DX = 1 FX = 2 GX = 0 XX = 3
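Ranking these counts mechanically is straightforward; a brief sketch using the counts from the table above:

```python
# Digraph counts from the table above; the two most frequent digraphs are
# conjectured to stand for E and T, the two most common English letters.
counts = {
    "AA": 6,  "DA": 27, "FA": 19, "GA": 15, "XA": 36,
    "AD": 3,  "DD": 7,  "FD": 8,  "GD": 6,  "XD": 8,
    "AF": 12, "DF": 22, "FF": 23, "GF": 7,  "XF": 31,
    "AG": 5,  "DG": 31, "FG": 12, "GG": 13, "XG": 43,
    "AX": 0,  "DX": 1,  "FX": 2,  "GX": 0,  "XX": 3,
}
ranked = sorted(counts, key=counts.get, reverse=True)
print(ranked[:2])  # ['XG', 'XA'], conjectured E and T
```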
Based on these frequencies, we conjecture that XG = E and XA = T. Substituting these values in wher-
ever they appear, we have
1↔16 3↔9 6↔5 7↔4 8↔2 10↔15 11↔17 12↔20 13↔19 18↔14
AG AD DF XF FA E XD DG XX E
E FD XF FD DA XF FG T FG E
DA DD FD GA DG FD E FF T DA
E FA XF AA E XF XX FF FA DA
DA DG FG DF DA AF E T E T
XF E GG DA DF FG AF XF T FA
DG DD DD GD AF E GG E FF DG
DA T DF GF DG DG GG DD AG FF
DA DF GG DF DG GG T T T E
XD FA XF DF XF DG DF DA GD T
DA FA DD AG DA E AF AD T T
E E FF AG DA T FA GF FG GA
E AF FF T GA E FG DG DG GA
DG E FF XD XF DG AA GD XF AF
DF DG GG XF AF T XF FF DX E
DG E T E FA XF FG T DF T
XF GF DF FD DA E GA T DG E
E FA FD FF DD T FA GG AG GA
DF GG T GF E DA AA DG E GA
DG FF XF T E E T FA DA DG
DF GG T XD XF AF XF AF DA DA
XF T DF FF E AF E FA DG GA
T DG FA DA DF DF XD GA T FA
DG XF FF E GF FG AA E FF FF
GA GF E T GA FF DA FF XF GD
DG XX AA DA DD E FG GG FG E
E T DF FF FF DA FD DG DF DA
GA DG AA XF T DF DG DG XD XF
T XF FA FG XF FF FF E AD GD
XF FF AF GA T GD DG XF FA DA
T AF FA DA XD XF DF XF DA GG
DA DG DF DG FF GA FF E GA T
FA XF XF FG E DF DG XD T FA
GG GF FX FX E T E GG FD E
We’ll use the high frequency of the trigraph THE to order the columns. We start with column
1↔16. Looking down at positions 12 and 18, we find Es that match the positions of Ts in column
10↔15. We expect that there is a column that appears between these two that has a plaintext H
in positions 12 and 18. We look for a column such that the pairs in positions 12 and 18 match and
come up with two possibilities: 11↔17 and 18↔14. They can be considered separately.
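This matching step can also be mechanized. In the sketch below, the columns are short hypothetical stand-ins (the real worksheet columns run much longer), but the logic is the one just described: a candidate middle column must show the same digraph in every position bracketed by a T and an E.

```python
# Sketch of the middle-column search: find columns whose entries agree at
# every position where a T...E bracket occurs, since the same plaintext
# letter (hopefully H) must sit in each of those slots.

def candidate_middles(bracket_positions, columns):
    """Return (name, digraph) for columns with one repeated pair there."""
    out = []
    for name, col in columns.items():
        vals = {col[i] for i in bracket_positions}
        if len(vals) == 1:          # same digraph in every bracketed slot
            out.append((name, vals.pop()))
    return out

# Toy stand-ins for three candidate columns (not the book's actual data):
cols = {
    "11-17": ["FG", "FA", "GG", "FA"],
    "18-14": ["DA", "GA", "T",  "GA"],
    "3-9":   ["DD", "FA", "DF", "GG"],
}
# T...E brackets at (0-indexed) positions 1 and 3 in this toy example:
print(candidate_middles([1, 3], cols))
```

In this toy run, two columns survive the filter, each proposing a different digraph for H, mirroring the two cases considered in the text.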
Case 1 Case 2
10↔15 11↔17 1↔16 10↔15 18↔14 1↔16
E XD AG E E AG
XF FG E XF E E
FD E DA FD DA DA
XF XX E XF DA E
AF E DA AF T DA
FG AF XF FG FA XF
E GG DG E DG DG
DG GG DA DG FF DA
GG T DA GG E DA
DG DF XD DG T XD
E AF DA E T DA
T FA E T GA E
E FG E E GA E
DG AA DG DG AF DG
T XF DF T E DF
XF FG DG XF T DG
E GA XF E E XF
T FA E T GA E
DA AA DF DA GA DF
E T DG E DG DG
AF XF DF AF DA DF
AF E XF AF GA XF
DF XD T DF FA T
FG AA DG FG FF DG
FF DA GA FF GD GA
E FG DG E E DG
DA FD E DA DA E
DF DG GA DF XF GA
FF FF T FF GD T
GD DG XF GD DA XF
XF DF T XF GG T
GA FF DA GA T DA
DF DG FA DF FA FA
T E GG T E GG
If Case 1 is correct, then FA = H. If Case 2 is correct, then GA = H.
At the moment, we don't have enough information to decide which is correct, so we begin to pursue
each possibility further. For Case 1, we look for a column that can appear at the beginning of
the chain we’re building such that a T would be joined to an E in columns 10↔15. We have four
columns that do this: 7↔4, 12↔20, 13↔19, and 18↔14. Initially, 7↔4 looks best, because
it pairs two Ts and Es, whereas the others only pair one T and E. However, there is no column
that can be placed between 7↔4 and 10↔15 to place an H between each T and E. Recall that H is
identified as FA in Case 1, so it is not enough to find a column that has the same letter pair in
each of these two positions; we also want that pair to be FA. (12↔20 offers an FA in one of those
positions, but then the other T-E would have some other letter in the middle, which is possible…)
Now suppose 12↔20 leads into Case 1. Then we’d want an FA in position 17 of some other col-
umn (to fill in the H), and we don’t have it anywhere. Moving on to 13↔19, we’d want an FA
in position 11. We are given this by 3↔9. The last case is the same: For 18↔14, we’d want an
FA in position 11 and we are given this by 3↔9. So, there are three reasonable possibilities, with
the last two being slightly favored, because they don’t require a non-H to appear between T and E
anywhere. We examine these last two possibilities as subcases of Case 1.
Case 1a Case 1b
13↔19 3↔9 10↔15 11↔17 1↔16 18↔14 3↔9 10↔15 11↔17 1↔16
XX AD E XD AG E AD E XD AG
FG FD XF FG E E FD XF FG E
T DD FD E DA DA DD FD E DA
FA FA XF XX E DA FA XF XX E
E DG AF E DA T DG AF E DA
T E FG AF XF FA E FG AF XF
FF DD E GG DG DG DD E GG DG
AG T DG GG DA FF T DG GG DA
T DF GG T DA E DF GG T DA
GD FA DG DF XD T FA DG DF XD
T FA E AF DA T FA E AF DA
FG E T FA E GA E T FA E
DG AF E FG E GA AF E FG E
XF E DG AA DG AF E DG AA DG
DX DG T XF DF E DG T XF DF
DF E XF FG DG T E XF FG DG
DG GF E GA XF E GF E GA XF
AG FA T FA E GA FA T FA E
E GG DA AA DF GA GG DA AA DF
DA FF E T DG DG FF E T DG
DA GG AF XF DF DA GG AF XF DF
DG T AF E XF GA T AF E XF
T DG DF XD T FA DG DF XD T
FF XF FG AA DG FF XF FG AA DG
XF GF FF DA GA GD GF FF DA GA
FG XX E FG DG E XX E FG DG
DF T DA FD E DA T DA FD E
XD DG DF DG GA XF DG DF DG GA
AD XF FF FF T GD XF FF FF T
FA FF GD DG XF DA FF GD DG XF
DA AF XF DF T GG AF XF DF T
GA DG GA FF DA T DG GA FF DA
T XF DF DG FA FA XF DF DG FA
FD GF T E GG E GF T E GG
In Case 1a, T is followed by FA three times, but in Case 1b, T is followed by FA four times. We
thus conclude that Case 1b looks better. Now we attempt to expand Case 1b further.
Consider position 6. We have an FA (presumed H) followed by an E in the next column. We
therefore look for a column with a T in position 6 to complete another THE. Column 13↔19 is
the only one that works. We now have
We could continue to work the front of the key, but none of the matches at this stage can be made
with great confidence, so we move to the end. Notice how 1↔16 has Ts in positions 23, 29, and
31. Column 6↔5 provides a wonderful match with FAs (presumed Hs) in all of these positions.
We place this in our partially reconstructed key.
We could continue trying to place the three remaining columns in our partially recovered key, but
there is not a very strong reason to make another placement at this stage, so instead we consider
these remaining columns as a group. There are six ways to order three objects and we quickly
notice that two of these orderings, 12↔20 8↔2 7↔4 and 7↔4 8↔2 12↔20, are such that
THE appears (positions 16 and 20, respectively). See below.
We could look more closely at these two possibilities in isolation, but we can also see how they fit
onto the key we’ve partially assembled; for example, placing 12↔20 8↔2 7↔4 at the start of our
partial key gives us some nice results.
12↔20 8↔2 7↔4 13↔19 18↔14 3↔9 10↔15 11↔17 1↔16 6↔5
DG FA XF XX E AD E XD AG DF
T DA FD FG E FD XF FG E XF
FF DG GA T DA DD FD E DA FD
FF E AA FA DA FA XF XX E XF
T DA DF E T DG AF E DA FG
XF DF DA T FA E FG AF XF GG
E AF GD FF DG DD E GG DG DD
DD DG GF AG FF T DG GG DA DF
T DG DF T E DF GG T DA GG
DA XF DF GD T FA DG DF XD XF
AD DA AG T T FA E AF DA DD
GF DA AG FG GA E T FA E FF
DG GA T DG GA AF E FG E FF
GD XF XD XF AF E DG AA DG FF
FF AF XF DX E DG T XF DF GG
T FA E DF T E XF FG DG T
T DA FD DG E GF E GA XF DF
GG DD FF AG GA FA T FA E FD
DG E GF E GA GG DA AA DF T
FA E T DA DG FF E T DG XF
AF XF XD DA DA GG AF XF DF T
FA E FF DG GA T AF E XF DF
GA DF DA T FA DG DF XD T FA
E GF E FF FF XF FG AA DG FF
FF GA T XF GD GF FF DA GA E
GG DD DA FG E XX E FG DG AA
DG FF FF DF DA T DA FD E DF
DG T XF XD XF DG DF DG GA AA
E XF FG AD GD XF FF FF T FA
XF T GA FA DA FF GD DG XF AF
XF XD DA DA GG AF XF DF T FA
E FF DG GA T DG GA FF DA DF
XD E FG T FA XF DF DG FA XF
GG E FX FD E GF T E GG FX
The last column has a T in position 19. When reading the plaintext out of this rectangle, the end
of this row is continued at the start of row 20, where we have HE. This indicates a nice placement
of the three remaining columns.
Placing 12↔20 8↔2 7↔4 at the end instead doesn’t yield such a nice result, nor does
placing 7↔4 8↔2 12↔20 at either end. Therefore, we assume that we now have the proper
ordering of the columns and that we have correctly identified T, H, and E. Recall that this was all
part of Case 1b. If we had reached an impasse, we would have backed up to consider Case 1a or
even Case 2, but because things seem to be working out, there’s no need.
We fill in all 19 Hs to get the partial plaintext below:
DG H XF XX E AD E XD AG DF
T DA FD FG E FD XF FG E XF
FF DG GA T DA DD FD E DA FD
FF E AA H DA H XF XX E XF
T DA DF E T DG AF E DA FG
XF DF DA T H E FG AF XF GG
E AF GD FF DG DD E GG DG DD
DD DG GF AG FF T DG GG DA DF
T DG DF T E DF GG T DA GG
DA XF DF GD T H DG DF XD XF
AD DA AG T T H E AF DA DD
GF DA AG FG GA E T H E FF
DG GA T DG GA AF E FG E FF
GD XF XD XF AF E DG AA DG FF
FF AF XF DX E DG T XF DF GG
T H E DF T E XF FG DG T
T DA FD DG E GF E GA XF DF
GG DD FF AG GA H T H E FD
DG E GF E GA GG DA AA DF T
H E T DA DG FF E T DG XF
AF XF XD DA DA GG AF XF DF T
H E FF DG GA T AF E XF DF
GA DF DA T H DG DF XD T H
E GF E FF FF XF FG AA DG FF
FF GA T XF GD GF FF DA GA E
GG DD DA FG E XX E FG DG AA
DG FF FF DF DA T DA FD E DF
DG T XF XD XF DG DF DG GA AA
E XF FG AD GD XF FF FF T H
XF T GA H DA FF GD DG XF AF
XF XD DA DA GG AF XF DF T H
E FF DG GA T DG GA FF DA DF
XD E FG T H XF DF DG H XF
GG E FX FD E GF T E GG FX
There are many ways to pursue the decipherment from here. The problem has been reduced to
a Polybius cipher without word spacing. A vowel recognition algorithm (as seen in Section 1.11)
could be applied and, perhaps after a few guesses, the As, Os, and Is could all be filled in. From there,
it ought to be easy to guess a few words. The solving process would then quickly accelerate. Other
approaches are possible. Ultimately, it’s up to you, as the final steps are left as an exercise.
The approach used in this example won't work as nicely (although it can be patched) if the rectangle
is not completely filled in. However, if the length of the transposition key is 20, then 5% of the
messages should, by chance, complete a rectangle. Without knowing the length of the key, the attack
could be tried on every message, and once the key is found, the other messages may be easily broken.
With the large number of messages that flow in every war, 5% will give cryptanalysts a lot to work with.
It wasn’t until 1966 that the creator of ADFGVX, Fritz Nebel, learned that his system had
been cracked. He commented:16
I personally would have preferred the second stage to have had a double transposition
rather than a single one, but, in discussion with the radiotelegraphy and decipherment
chiefs, the idea was rejected. The result was a compromise between technical and tac-
tical considerations. Double transposition would have been more secure but too slow
and too difficult in practice.
Nebel met Painvin in 1968. He described it as “the enemies of yesterday meeting as the friends of
today.” At this meeting, Painvin recalled, “I told him, if he had his way and they’d used a double
transposition I’d never have been able to make the break.”17
Following previous wars, America’s spies and codebreakers went back to the lives they led
before their time of service, but America was now headed into an era of permanent institutions to
handle such activities. In this regard, America was behind the Europeans, and there would still
be some bumps in the road, but the nation was on its way to establishing an intelligence empire.
William Friedman would play an important role as he transitioned from his work at Riverbank
Laboratories to the government. We’ll come back to him soon, but for now we’ll take a look at the
contributions and controversies of Herbert O. Yardley.
16 Norman, Bruce, “The ADFGVX Men,” The Sunday Times Magazine, August 11, 1974, pp. 8–15, p. 11 cited here.
17 Norman, Bruce, “The ADFGVX Men,” The Sunday Times Magazine, August 11, 1974, pp. 8–15, p. 15 cited here.
one was given to his widow for Yardley to be buried in.18 Yardley was anti-Semitic, whereas Friedman
was Jewish.19 Friedman had a long, apparently happy marriage, but one gets the impression that he
didn’t have a great deal of confidence with women. Yardley, on the other hand, was not at all shy. He
even bragged about knowing his way around a Chinese whorehouse and hosted orgies for visiting
journalists and diplomats while he was in China.20 Although it’s not often a good indicator, even
their handwriting represents their distinct personalities (Figures 5.5 and 5.6).
Figure 5.5 The precise penmanship of the Friedmans. (From the collection of the author.)
Figure 5.6 Yardley's sloppy scrawl. (From the collection of the author.)
Yardley started out in a noncryptographic position at the State Department; he was just a telegraph
operator and code clerk, but having access to coded messages, including ones addressed to President
Wilson, he made attempts to break them. He eventually produced a report, "Solution of American
Diplomatic Codes," for his boss. His successes made a strong case for the need for an improved
American cryptographic bureau, or black chamber, and his skills at self-promotion catapulted him into
a new position as chief of this new organization under the War Department, formally called the Cipher
Bureau, in 1917. Black chambers had played an important role in European history, but such cryptana-
lytic units were new in the United States. In 18 months, Yardley’s team (Military Intelligence Section
8, or MI-8) had read almost 11,000 messages in 579 cryptographic systems. This is even more amazing
18 Kahn, David, The Reader of Gentlemen’s Mail, Yale University Press, New Haven, Connecticut, 2004, p. 288.
19 Kahn, David, The Reader of Gentlemen’s Mail, Yale University Press, New Haven, Connecticut, 2004, pp. 88
and 146.
20 Kahn, David, The Reader of Gentlemen’s Mail, Yale University Press, New Haven, Connecticut, 2004, p. 196.
when one considers that the team initially consisted of only Yardley himself and two clerks. At its peak
in November 1918, the team consisted of 18 officers, 24 civilians, and 109 typists.21
Of course, protecting America’s messages was also an important aspect of Yardley’s organiza-
tion. Messages traveling to and from Europe via the transatlantic cable could be obtained by the
Germans by having their submarines place long cables (hundreds of feet long) next to the transat-
lantic cable to pick up the messages by induction. Yardley’s team overhauled our codes and ciphers
so that such interceptions wouldn’t matter. Yardley lamented the results of the Germans reading
our World War I communications prior to this overhaul:
The American offensive of September 12, 1918, was considered a triumph, but it rep-
resents only a small part of what might have been a tremendous story in the annals of
warfare, had the Germans not been forewarned. The stubborn trust placed in inad-
equate code and cipher systems had taken its toll at the Front.22
Yardley had a method for gaining intercepts that was much simpler than induction. He or a State
Department official approached high officers in the cable companies and asked for the messages.
It was illegal for anyone to agree to this request, but reactions weren’t all negative. “The govern-
ment can have anything it wants,” was the response from W. E. Roosevelt of the All-America
Cable Company.23 At one point, when the cable companies cut off Yardley’s supply of messages,
he regained them with bribes.24
The Cipher Bureau’s work wasn’t strictly limited to codes and ciphers. They also dealt with
secret inks and shorthand systems. The ability to detect messages hidden by the use of secret inks
exposed German spy networks in the United States, but such work could be dangerous, even in
times of peace. In an experiment with secret ink chemicals in 1933, Yardley cut his palm on a piece
of glass and an infection led to one of his fingers having to be amputated (Figure 5.7).25
Figure 5.7 Yardley’s injured right hand. (Courtesy of the David Kahn Collection, National
Cryptologic Museum, Fort Meade, Maryland.)
21 Lewand, Robert Edward, Cryptological Mathematics, MAA, Washington, DC, 2000, p. 42. Kahn put the total
at 165. Perhaps the few extra didn’t fit into any of the categories listed here. See p. xvii of the foreword to Yardley,
Herbert O., The American Black Chamber, Espionage/Intelligence Library, Ballantine Books, New York, 1981.
22 Yardley, Herbert O., The American Black Chamber, Espionage/Intelligence Library, Ballantine Books, New
York, 1981, p. 19.
23 Kahn, David, The Reader of Gentlemen’s Mail, Yale University Press, New Haven, Connecticut, 2004, p. 58.
24 Kahn, David, The Reader of Gentlemen’s Mail, Yale University Press, New Haven, Connecticut, 2004, p. 84.
25 Kahn, David, The Reader of Gentlemen’s Mail, Yale University Press, New Haven, Connecticut, 2004, pp.
146–147.
The chamber might have been disbanded when World War I ended, but Yardley convinced
his superiors to keep it open. This represented a tremendous change for America. It was the first
time that American codebreakers didn't return to their prior lives following the end of a war.
Yardley explained, “As all the Great Powers maintained such a Cipher Bureau, the United States
in self-defense must do likewise.”26
26 Yardley, Herbert O., The American Black Chamber, Espionage/Intelligence Library, Ballantine Books, New
York, 1981, p. 133.
27 The funding from the State Department couldn’t be used within DC; thus, a move was required. See Yardley,
Herbert O., The American Black Chamber, Espionage/Intelligence Library, Ballantine Books, New York, 1981, p. 156.
28 Yardley, Herbert O., The American Black Chamber, Espionage/Intelligence Library, Ballantine Books, New
York, 1981, p. 208.
Knowing how far Japan could be pushed concerning ship tonnage ratios allowed the United States
to do so with full confidence, merely waiting for the Japanese to give in. The Japanese finally
accepted the 10:10:6 ratio on December 10, 1921.
This was the greatest peacetime success of the Cipher Bureau, but they did attack many other
systems and eventually cracked the ciphers of 20 different countries.29 With the election of Herbert
Hoover in 1928, however, the politics of decipherment changed. Hoover’s secretary of state, Henry
L. Stimson, was actually offended when decrypts were provided to him. He later summed up his
feelings with the famous quote, “Gentlemen do not read each other’s mail.” Stimson withdrew the
Cipher Bureau’s funding and it was formally shut down on October 31, 1929.30
Of course, such organizations have a way of staying around whether they’re wanted or not.
Usually they just change names. In this case, William Friedman took possession of Yardley’s files
and records and the work continued under the Signal Intelligence Service (SIS),31 part of the Army
Signal Corps. Apparently Stimson was not aware of this group! Yardley was offered a position with
SIS, but the salary was low, and he was expected to refuse, as he did.
Out of work, in the aftermath of the stock market crash, and strapped for cash, Yardley decided
to write about his adventures in codebreaking. On June 1, 1931, Yardley’s book, The American
Black Chamber, was released. He stated in the foreword, “Now that the Black Chamber has been
destroyed there is no valid reason for withholding its secrets.”32 The book sold 17,931 copies, which
was a remarkable number for the time.33 An unauthorized Japanese edition was even more popu-
lar. American officials denied that the Cipher Bureau existed, but privately sought to prosecute
Yardley for treason. The nations whose ciphers had been broken were now aware of the fact and
could be expected to change systems. It’s often claimed that, as a result of Yardley’s disclosures, the
Japanese changed their ciphers and eventually began to make use of a tough machine cipher that
they called “type B cipher machine.” The Americans called it Purple and this was the system in use
at the time of the attack on Pearl Harbor. However, David Kahn makes a good case for Yardley’s
revelations having done no harm! Graphs of the number of cryptanalytic solutions to Japanese
codes and ciphers (and those of foreign nations in general) show no dip following the publication
of The American Black Chamber (Figure 5.8).
29 From the foreword (p. xi) to Yardley, Herbert O., The American Black Chamber, Espionage/Intelligence Library,
Ballantine Books, New York, 1981. On page 222 of this book, Yardley lists the countries as “Argentina, Brazil,
Chile, China, Costa Rica, Cuba, England, France, Germany, Japan, Liberia, Mexico, Nicaragua, Panama,
Peru, Russia, San Salvador, Santo Domingo, Soviet Union and Spain.”
30 From the foreword (p. xii) to Yardley, Herbert O., The American Black Chamber, Espionage/Intelligence
Library, Ballantine Books, New York, 1981. Years later, in 1944, Secretary of State Edward P. Stettinius acted
similarly and had the Office of Strategic Services (OSS) return Soviet cryptographic documents (purchased by
the OSS from Finnish codebreakers in 1944) to the Soviet Embassy! See Benson, Robert Louis and Michael
Warner, editors, Venona: Soviet Espionage and the American Response, 1939-1957, NSA/CIA, Washington,
DC, 1996, p. xviii and p. 59.
31 The size of the SIS at various times is given in Foerstel, Herbert N., Secret Science: Federal Control of American
Science and Technology, Praeger, Westport, Connecticut, 1993, p. 103.
32 Yardley, Herbert O., The American Black Chamber, Espionage/Intelligence Library, Ballantine Books, New
York, 1981, p. xvii.
33 From the foreword (p. xiii) to Yardley, Herbert O., The American Black Chamber, Espionage/Intelligence
Library, Ballantine Books, New York, 1981.
Figure 5.8 Graphs charting cryptanalytic successes show no dip from Yardley’s book. (Courtesy
of the David Kahn Collection, National Cryptologic Museum, Fort Meade, Maryland.)
Frank Rowlett, who played a key role in breaking Japanese ciphers during World War II, actu-
ally thought Yardley’s book helped U.S. codebreakers!34
they actually began to read. Kahn described it as “suffocatingly dull.”37 Yardley’s prior book was written
in a lively, exciting style, but this volume didn't even seem like it had been written by the same person.
In fact, it hadn’t. Yardley didn’t enjoy writing and used ghost writers for most of his work. Such was
the case for Japanese Diplomatic Secrets.38 Although anyone can purchase this book at the present time,
there is a new mystery associated with it. David Kahn explains:
I saw the fabled manuscript many years ago at the National Archives and used it in
my Yardley book. But when I called for it again recently, the half-dozen or so manila
envelopes that had held its 970 pages were all empty. The file had no withdrawal slip.
I have no idea where the manuscript may be.39
Figure 5.9 An advertisement promoting Yardley’s novel The Blonde Countess. (From Hannah,
T. M., The many lives of Herbert O. Yardley, Cryptologic Spectrum, Vol. 11, No. 4, 1981, p. 14.)
37 From the introduction (p. xiv) to Yardley, Herbert O., The American Black Chamber, Espionage/Intelligence
Library, Ballantine Books, New York, 1981.
38 The real author was Marie Stuart Klooz; for more information, see https://web.archive.org/
web/20111130025234/http://www.intelligence-history.org/jih/reviews-1-2.html.
39 Kahn, David, email to the author, July 13, 2010. Note: the manuscript is well over 1,000 pages with the
appendices. Kahn’s figure of 970 is for the text proper.
40 From the foreword (p. xiv) to Yardley, Herbert O., The American Black Chamber, Espionage/Intelligence
Library, Ballantine Books, New York, 1981.
The publicity generated by the government’s attack on The American Black Chamber was used
by Yardley’s publisher for promotional purposes. The small print at the top of the ad reproduced
in Figure 5.9 instructs “Be sure your Congressman’s in Washington—Then dip this sheet in water.”
Doing so revealed the hidden message
The only author ever Gagged by an Act of Congress resorts to fiction in his novel about
the American Black Chamber. “The Blonde Countess,” by Major Herbert O. Yardley,
published by Longmans, Green and Company, 114 Fifth Avenue, New York.
Yardley wrote several other books, including the fictional spy/adventure novels The Blonde
Countess, Red Sun of Nippon (Figure 5.10), and Crows Are Black Everywhere. These books were
not as controversial as his first two. Yardley’s last book sold over 100,000 copies. It was titled The
Education of a Poker Player.41
Figure 5.10 One of Yardley’s novels. (Book cover by Paul Bartlett Taylor.)
41 Lewand, Robert Edward, Cryptological Mathematics, MAA, Washington, DC, 2000, p. 44.
Other attempts to cash in on his fame included a radio program, “Stories of the Black Chamber,”
which aired in 1935, and two movies: Rendezvous (Metro-Goldwyn-Mayer, 1935) was an adaptation
of Yardley’s novel The Blonde Countess, and Cipher Bureau followed in 1938 (see Figure 5.11).
Figure 5.11 Can Hollywood make frequency counts exciting? (Thanks to René Stein, former
National Cryptologic Museum librarian, for finding this poster for me and to Nicholas Altland
for photographing it.)
In addition to making money with books about cryptology, Yardley wanted to continue doing
cryptologic work. From September 1938 to July 1940, he attacked Japanese codes and ciphers for the
Chinese. Years after Yardley’s death, James Bamford visited with his widow and was thrilled to learn
that a manuscript by Yardley detailing his cryptanalytic work in China had been gathering dust in a
closet. Bamford penned an introduction and got the work published as The Chinese Black Chamber.42
From June 1941 through November 1941, Yardley worked as a codebreaker in Canada, where
he cracked transposition ciphers used by German spies in South America.43 There are rumors (you
might even say evidence) of Yardley having once again landed a role with American intelligence,
after his time in Canada, but like so much else having to do with Yardley, we don’t have proof one
way or the other. The greatest Yardley mystery is detailed below.
42 Yardley, Herbert O., The Chinese Black Chamber: An Adventure in Espionage, Houghton Mifflin, Boston, 1983.
43 Kahn, David, The Reader of Gentlemen’s Mail, Yale University Press, New Haven, Connecticut, 2004, p. 206.
44 Farago, Ladislas, The Broken Seal: The Story of “Operation Magic” and the Pearl Harbor Disaster, Random
House, New York, 1967, pp. 57–58.
At 1661 Crescent Place, an elegant little graystone house off Connecticut Avenue, he
[Yardley] was received by Setsuzo Sawada, counselor of the Japanese embassy. Yardley
went to the heart of the matter at once. He introduced himself as the United States
government’s senior cryptologist, briefly sketched his background, then told Sawada
that he was prepared to sell his country’s most closely guarded secret – for $10,000 in
cash.
The offer was so staggering that it aroused Sawada’s suspicions. According to his
first report to Tokyo, in which he described this strange encounter, he had said to
Yardley: “But you’re making a lot of money in your job! Why are you willing to sell
your country?”
“Simple, sir,” Yardley replied, according to Sawada. “It just so happens that I need
more money.”
This was an unparalleled opportunity and Sawada acted quickly to make the most
of it. When his report reached Tokyo the two most important Japanese officials in
cryptography were sent to Washington, under assumed names on diplomatic pass-
ports, to examine Yardley’s proposition and advise Sawada. One of them was Captain
Kingo Inouye of the Imperial Navy, on loan to the Foreign Ministry to organize a
Code Research Group within its Cable Section. The other was Naoshi Ozeki, chief
cryptographer of the Foreign Ministry.
A deal was made, but not without some haggling. Contrary to the popular belief
that the Japanese had unlimited funds for such transactions, the Foreign Ministry
operated on a very tight budget from secret funds. Sawada countered Yardley’s demand
by offering him $3000 at first, and then $5000. For a while Yardley refused to lower
his price, but an agreement was finally reached. Yardley received $7000, with the
understanding that he would be paid more if he decided to continue to work for the
Japanese.
It was an amazing bargain at any price. In return for their money, the Japanese
obtained all the secrets of the “black chamber” – Yardley’s methodology in breaking
their codes, copies of his work sheets, and his solutions of other codes as well, includ-
ing those of the British Foreign Office, which they were especially anxious to get.
Moreover, Yardley agreed to cut back his work on Japanese messages.
A convincing amount of detail is provided; however, we must be careful not to confuse good writing
with good history.
John F. Dooley went back to the original source Farago cited and pursued other avenues of investiga-
tion, as well. Ultimately, he concluded that Yardley was innocent of this particular crime. Farago’s work
is, in general, not very reliable. He was not a careful researcher. Indeed, Dooley found that some of the
details in the paragraphs reproduced above cannot be true45 and others are not backed by the material
Farago referenced. The reader is encouraged to examine the literature, especially Dooley’s paper, and
come to his or her own conclusions. While I side with Dooley on this issue, I’m sure the debate will con-
tinue. Just as English professors will always have the sanity or insanity of Hamlet to contemplate, we’ll
continue to have the guilt or innocence of Yardley to argue about.
45 Dooley wrote, "Farago says '1661 Crescent Place, an elegant little graystone house off Connecticut Avenue'
but such a house does not currently exist. The closest current building to this address is 1661 Crescent Place
NW, which is a six story apartment building between 16th and 17th Streets NW and about 5 or 6 blocks from
Connecticut Avenue."
You might think that Yardley’s writings alone would forever place him on some sort of “enemy
of the state” list, but his previous work was eventually recognized by his placement in NSA’s Hall
of Honor. He was also buried with military honors in Arlington National Cemetery.46
Often when looking for one thing, something else of interest turns up. Such was the case with
Dooley’s research into Yardley. He had the following telegram translated and presented it in his
paper for the first time.
Telegram Section
Confidential #48
Date: March 10th, 1925
From: Isaburo Yoshida, Acting Ambassador to the US
To: Kijuro Shidehara, Minister of Foreign Affairs
Re: Telegram Codes
Mr. W. Friedman, an American, from Cornell University seems very skilled in break-
ing codes; for he was engaged in breaking codes at the war in Europe (i.e., WWI),
and he is now working for the US Army. When he came to see me recently, he men-
tioned that the US Army had no difficulty breaking codes. In order to prevent this,
we have no choice but change codes very frequently. I am sending this note for your
information.
So even if Dooley managed to resolve the question of Yardley’s alleged treason, another mystery has
arisen: What was Friedman up to?
5.9 Censorship
Japanese Diplomatic Secrets has been referred to as the only manuscript ever seized by the U.S.
government. This is not correct. The oral history of Captain Joseph Rochefort, who was involved
with Navy cryptologic activities during World War II, was impounded by the National Security
Agency (NSA).47 In another case, the U.S. Government bought a manuscript to block its pub-
lication. War Secrets of the Ether, by Wilhelm F. Flicke, described Germany’s interception and
cryptanalysis of messages from 1919 to 1945, and included how phone conversations between
Roosevelt and Churchill were unscrambled. The book passed from the Army Security Agency to
the National Security Agency and was classified “Restricted.”48
The National Security Agency considered the “purchase and hide” method to stop David
Kahn’s The Codebreakers from appearing. They even considered “clandestine service applications”
against Kahn, which certainly sounds sinister. James Bamford interpreted this as possibly mean-
ing “anything from physical surveillance to a black-bag job [theft].” The “surreptitious entry”
into Kahn’s home that was considered leaves less room for interpretation. In the end, NSA settled
for getting Kahn to delete three paragraphs; however, an endnote that made it through allowed
anyone willing to trace sources to see much of what had been removed. The deleted material appeared
years later, exactly as written by Kahn, in James Bamford’s The Puzzle Palace.49
46 From the Introduction (p. xvi) to Yardley, Herbert O., The American Black Chamber, Espionage/Intelligence
Library, Ballantine Books, New York, 1981.
47 Lewin, Ronald, The American Magic, Farrar Straus Giroux, New York, 1982, p. 139.
48 Clark, Ronald, The Man Who Broke Purple, Little, Brown and Company, Boston, Massachusetts, 1977, p. 209.
49 Bamford, James, The Puzzle Palace, Penguin Books, New York, 1983, pp. 168–173.
World War I and Herbert O. Yardley ◾ 193
By 1942, Stimson, who had shut down the Cipher Bureau, felt differently about how gentle-
men behaved. He now heartily approved of reading others’ mail.50 As Secretary of War, Stimson
wrote to the American Library Association (ALA):51
It has been brought to the attention of the War Department that member libraries
of the American Library Association have received numerous requests for books deal-
ing with explosives, secret inks, and ciphers.
It is requested that these books be removed from circulation and that these librar-
ies be directed to furnish their local office of the Federal Bureau of Investigation with
the names of persons making requests for same.
Stimson’s letter closed with the following paragraph:
This document contains information affecting the national defense of the United States
within the meaning of the espionage Act, U.S.C. 50; 31 and 32. Its transmission or the
revelation of its contents in any manner to an unauthorized person is prohibited by law.
Librarians who considered Stimson’s measure pointless, given the number of popular books on cryptol-
ogy available in bookstores, could not make the debate public. This approach has been used
in recent years in the form of the Federal Bureau of Investigation’s National Security Letters.
Despite the gag order, word got out, as evidenced by a piece of short fiction by Anthony
Boucher titled “QL69.C9.” This story, copyright 1943, describes such a program of librarian coop-
eration with the FBI, although it is simply part of the backdrop and not the point of the tale.
Happily, Stimson’s censorship ended a few months after the war:
Since hostilities have ceased, it is agreed that the necessity for limiting the circulation
and use of these types of books no longer exists and that the libraries which partici-
pated in this program should be notified to this effect.52
It appears that the FBI didn’t catch anyone with bad intentions attempting to borrow books on
cryptology, but this official censorship was nothing new.53
In 1918, the US War Department told the American Library Association to remove a
number of pacifist and “disturbing” books, including Ambrose Bierce’s Can Such Things
Be? from camp libraries, a directive which was taken to also apply to the homefront.
Bierce had passed away by this time, but those still living took a risk if they chose to openly oppose
the draft.
During World War I, the US government jailed those who were distributing anti-draft
pamphlets like this one.54 Schenck, the publisher of the pamphlet, was convicted, and
his conviction was upheld by the Supreme Court in 1919. (This decision was the source
of the well-known “fire in a theatre” quote.)55
And there were political censorship demands made by the government during the Cold War, as well.56
In the 1950s, according to Walter Harding, Senator Joseph McCarthy had overseas
libraries run by the United States Information Service pull an anthology of American
literature from the shelves because it included Thoreau’s Civil Disobedience.
The Food and Drug Administration (FDA) has engaged in book censorship by claiming that books
are used as labeling for banned foods and drugs. For example, the FDA seized a large quantity of
the book Folk Medicine by D. C. Jarvis, M.D., claiming it was used to promote the sale of honey
and vinegar. In November 1964, the U.S. Court of Appeals ruled that the seizure was improper.
The court also ruled against the FDA when they seized a book that promoted molasses, claiming
that it was shipped with molasses and therefore constituted labeling. Another FDA seizure was
Calories Don’t Count by Dr. Herman Taller.57
The British government has also cracked down on material deemed inappropriate for public
consumption. They seized the manuscript GCHQ: The Negative Asset by Jock Kane under the
Official Secrets Act in 1984. It was never published, but James Bamford was able to obtain a copy
of the manuscript before the seizure and incorporate some of it into his own book, Body of Secrets.58
In addition to these examples, there has been a great deal of censorship in America targeted
at works that some find offensive because of sexual content. A somewhat typical example is pro-
vided by James Joyce’s Ulysses. It was declared obscene and barred from the United States for 15
years. U.S. Postal Authorities even seized copies of it in 1918 and 1930. The ban was finally lifted
in 1933. The Modern Library recently chose Ulysses as the best novel of the 20th century.59 This
example is typical, as many of the works once considered lewd have since become modern classics.
The Catholic Church’s Index of Prohibited Books, which included many classic scientific works
(as well as more risqué titles) over the centuries, was not abolished until 1966.
While this section is by no means meant to provide a thorough survey, another example will
help to illustrate the breadth of censorship in 20th-century America.
In 1915, Margaret Sanger’s husband was jailed for distributing her Family Limitation, which
described and advocated various methods of contraception. Sanger herself had fled the
country to avoid prosecution, but would return in 1916 to start the American Birth Control
League, which eventually merged with other groups to form Planned Parenthood.60
Later in the 20th century, various methods of contraception were taught in many public schools,
including the junior high school I attended. Other portions of my formal education, such as skim-
ming a few of William Shakespeare’s plays, would also have been controversial in generations past.
Some individuals even went as far as to blame Shakespeare’s violent plays for inspiring the assas-
sination of Abraham Lincoln.61 There have been many censored versions of these plays published
56 http://onlinebooks.library.upenn.edu/banned-books.html.
57 Garrison, Omar V., Spy Government: The Emerging Police State in America, Lyle Stuart, New York, 1967, pp.
145–149.
58 Bamford, James, Body of Secrets, Doubleday, New York, 2001, p. 645.
59 See http://onlinebooks.library.upenn.edu/banned-books.html, which provides many more examples.
60 http://onlinebooks.library.upenn.edu/banned-books.html.
61 For example, according to the assassin’s friend and fellow actor John M. Barron, “the characters he [John
Wilkes Booth] assumed, all breathing death to tyrants, impelled him to do the deed.” I found this quote in
Shapiro, James, Shakespeare in a Divided America: What His Plays Tell Us About Our Past and Future, Penguin
Press, New York, 2020, p. 247, which cites Alford, Terry, Fortune’s Fool: The Life of John Wilkes Booth, Oxford
University Press, New York, 2015, p. 246.
over the centuries. An example is The Family Shakespeare edited by Harriet and Thomas Bowdler
(1807 and 1818), which removed profanity, sexual content, and violence. This led to a new verb
being added to the English language: to “bowdlerize” a work of literature is to ruin it by
expurgating material of this sort.
So many of today’s classics appear in old lists of censored works that one might reasonably
expect to be able to predict the future curriculum by looking at what is presently banned.
It is very rare to find both complicated transposition and substitution methods used in combination.
If one is complicated, the other will usually be very simple; and ordinarily both are simple, the sender
depending on the combination of the two to attain indecipherability. It is evident how futile this idea is.
Hoy, Hugh Cleland, 40 O.B. or How the War Was Won, Hutchinson & Co., London, UK, 1932. “How the
War Was Won” books typically explain how the author won it. But not in this case! Hoy was never in
Room 40. The O.B. in the title indicates that Room 40 was in the Old Buildings (of the Admiralty).
James, Admiral Sir William, The Code Breakers of Room 40: The Story of Admiral Sir William Hall, Genius of
British Counter-Intelligence, St. Martin’s Press, New York, 1956.
Kelly, Saul, “Room 47: The Persian Prelude to the Zimmermann Telegram,” Cryptologia, Vol. 37, No. 1,
January 2013, pp. 11–50.
Knight, H. Gary, “Cryptanalysts’ Corner,” Cryptologia, Vol. 4, No. 4, October 1980, pp. 208–212. Knight
describes a cipher similar to ADFGVX that was used in Christopher New’s 1979 novel Goodbye
Chairman Mao and presents several ciphertexts of his own in the system for readers to solve. These
ciphers are substantially easier than ADFGVX because the Polybius square pairs are not split in the
transposition stage.
Konheim, Alan, “Cryptanalysis of ADFGVX Encipherment Systems,” in Blakley, George Robert “Bob”
and David Chaum, editors, Advances in Cryptology: Proceedings of CRYPTO 84, Lecture Notes in
Computer Science, Vol. 196, Springer, Berlin, Germany, 1985, pp. 339–341. This is an “Extended
Abstract” rather than a full paper. A full-length paper by Konheim on this topic appears as appendix A
in Childs, J. Rives, General Solution of the ADFGVX Cipher System, as reprinted in 1999 by Aegean
Park Press, Laguna Hills, California.
Langie, Andre, How I Solved Russian and German Cryptograms During World War I, Imprimerie T. Geneux,
Lausanne, 1944, translated from German into English by Bradford Hardie, El Paso, Texas, 1964.
Lasry, George, Ingo Niebel, Nils Kopal, and Arno Wacker, “Deciphering ADFGVX Messages from the
Eastern Front of World War I,” Cryptologia, Vol. 41, No. 2, March 2017, pp. 101–136.
Lerville, Edmond, “The Radiogram of Victory (Le Radiogramme de la Victoire),” La Liaison des Transmissions,
April 1969, pp. 16–23, translated from French to English by Steven M. Taylor.
Lerville, Edmond, “The Cipher: A Face-to-Face Confrontation After 50 Years,” L’Armee, May 1969, pp.
36–53, translated from the French “Le Chiffre «face à face» cinquante ans après” to English by Steven
M. Taylor.
Mendelsohn, Charles, Studies in German Diplomatic Codes Employed During the World War, War Department,
Office of the Chief Signal Officer, United States Government Printing Office, Washington, DC, 1937.
Mendelsohn, Charles, An Encipherment of the German Diplomatic Code 7500, War Department, Office
of the Chief Signal Officer, United States Government Printing Office, Washington, 1938. This is a
supplement to the item listed above.
Norman, Bruce, “The ADFGVX Men,” The Sunday Times Magazine, August 11, 1974, pp. 8–15. For
this piece, Norman interviewed both Fritz Nebel, the German creator of the ADFGVX cipher, and
Georges Painvin, the Frenchman who cracked it.
Ollier, Alexandre, La Cryptographie Militaire avant la guerre de 1914, Lavauzelle, Panazol, 2002.
Partenio, Pietro, unpublished booklet, 1606. This booklet was only used by students in Partenio’s cryp-
tography course. A copy is in the Venice State Archives. Partenio combined substitution and trans-
position by using a two-letter nomenclator (each group was encrypted with two letters) followed by
a shuffling of the pairs according to a phrase to be kept in memory. His sample phrase was “Lex tua
meditatio mea uoluntas tua …” Thanks to Paolo Bonavoglia for pointing this reference out to me. He
is planning to publish it. The transcription is still in progress as of this writing.
Pergent, Jacques, “Une figure extraordinaire du chiffre français de 1914 à 1918: le capitaine Georges
Painvin,” Armée et Défense, Vol. 47, No. 4, April 1968, pp. 4–8.
Rislakki, Jukka, “Searching for Cryptology’s Great Wreck,” Cryptologia, Vol. 31, No. 3, July 2007, pp.
263–267. This paper summarizes the Russian capture of the German Navy’s main code book from the
wreck of the Magdeburg and David Kahn’s excursion to the site 92 years later. This is what historians
of cryptology do on vacation!
Samuels, Martin, “Ludwig Föppl: A Bavarian cryptanalyst on the Western front,” Cryptologia, Vol. 40, No.
4, July 2016, pp. 355–373.
Tuchman, Barbara W., The Zimmermann Telegram, The Macmillan Company, New York, 1970. This book
has gone through many editions, and this is not the first.
von zur Gathen, Joachim, “Zimmermann Telegram: The Original Draft,” Cryptologia, Vol. 31, No. 1,
January 2007, pp. 2–37.
About Yardley
Anonymous, “Yardley Sold Papers to Japanese,” Surveillant, Vol. 2, No. 4, January/February 1992, p. 99.
Denniston, Robin, “Yardley’s Diplomatic Secrets,” Cryptologia, Vol. 18, No. 2, April 1994, pp. 81–127.
62 https://en.wikipedia.org/wiki/Rendezvous_(1935_film).
Dooley, John F., “Was Herbert O. Yardley a Traitor?” Cryptologia, Vol. 35, No. 1, January 2011, pp. 1–15.
Farago, Ladislas, The Broken Seal: The Story of “Operation Magic” and the Pearl Harbor Disaster, Random
House, New York, 1967, pp. 56–58.
Hannah, Theodore M., “The Many Lives of Herbert O. Yardley,” Cryptologic Spectrum, Vol. 11, No. 4,
Fall 1981, pp. 5–29, available online at http://www.nsa.gov/public_info/_files/cryptologic_spectrum/
many_lives.pdf.
Kahn, David, “Nuggets from the Archive: Yardley Tries Again,” Cryptologia, Vol. 2, No. 2, April 1978, pp.
139–143.
Kahn, David, The Reader of Gentlemen’s Mail, Yale University Press, New Haven, Connecticut, 2004.
Nedved, Gregory J., “Herbert O. Yardley Revisited: What Does the New Evidence Say?” Cryptologia, to
appear.
Turchen, Lesta VanDerWert, “Herbert Osborne Yardley and American Cryptography,” Master’s Thesis,
University of South Dakota, Vermillion, South Dakota, May 1969. Turchen made the following com-
ment concerning Farago’s book: “Obviously biased against Mr. Yardley The Broken Seal could have
been more clearly documented and more judicious in its conclusions.” (p. 95).
Addendum
Long before meeting his World War I adversary Fritz Nebel, Painvin met his American counter-
part Herbert Yardley. An image from their meeting (Figure 5.12) is one of many treasures pre-
served by the National Cryptologic Museum adjacent to Fort Meade, Maryland.
Figure 5.12 Georges Painvin and Herbert O. Yardley. (Courtesy of the National Cryptologic
Museum, Fort Meade, Maryland.)
Chapter 6
Matrix Encryption
Polygraphic substitution ciphers replace characters in groups, rather than one at a time. In
Section 4.4, we examined the Playfair cipher, which falls into this category because its charac-
ters are substituted in pairs. We now turn to a more mathematically sophisticated example,
matrix encryption.
1 Levine, Jack, “Variable Matrix Substitution in Algebraic Cryptography,” American Mathematical Monthly, Vol.
65, No. 3, March 1958, pp. 170–179.
2 The Jack Levine Papers, 1716–1994, North Carolina State University, MC 308.1.7, Correspondence 1981–
1991, Various, Levine, Jack, letter to Louis Kruh, July 24, 1989.
(or 3-message). I had some correspondence with Hill and I think told him of my
youthful efforts. All this to explain why I believe my system was the precursor to his
very general mathematical formulation (I also used equations).
Figure 6.1 Lester Hill (1890–1961). (Courtesy of the David Kahn Collection, National
Cryptologic Museum, Fort Meade, Maryland.)
Figure 6.2 Jack Levine (1907–2005). (Courtesy of the Jack Levine Archive at North Carolina
State University.)
Flynn’s Weekly was a pulp magazine that consisted mainly of detective fiction. Agatha Christie
had a short story in the same November 13, 1926 issue in which Levine’s system was explained.
This was not the best place to publish a mathematical idea, but Levine was still a teenager at the
time.
Hill’s description came three years later, but it appeared in American Mathematical Monthly,
so it isn’t hard to see why the system was named after Hill.3 The American Mathematical
Monthly paper provided a complete explanation of matrix encryption with examples, whereas
Levine’s letter didn’t actually reveal his method, although it can be inferred. And, although
Levine used algebra, he wasn’t quite doing the same thing as Hill, at least not publicly.
Levine went on to earn a doctorate in mathematics from Princeton University, serve his coun-
try in a cryptologic capacity (in the Army), and author many more papers on cryptology, some
of which are described in the pages to follow. Hill, on the other hand, only published two papers
dealing directly with cryptology. Hill served in the Navy during World War I, but not in a cryp-
tologic capacity. In later decades, he did do some cryptologic work for the Navy, but it appears to
have been unsolicited and not considered valuable.4 In any case, the military work of both Levine
and Hill was not known to the academic community.
 A  B  C  D  E  F  G  H  I  J  K  L  M  N  O  P  Q  R  S  T  U  V  W  X  Y  Z
 0  1  2  3  4  5  6  7  8  9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25
This gives
19 7 4 1 4 18 19 22 0 24 19 14 3 4 0 11 22 8 19 7 19 4 12 15
19 0 19 8 14 13 8 18 19 14 24 8 4 11 3 19 14 8 19.
The key we select in this system is an invertible matrix (modulo 26) such as

    M = [ 6 11 ]
        [ 3  5 ]

To encipher, we simply multiply this matrix by the numerical version of the plaintext in pieces of
length two and reduce the result modulo 26; for example,
3 Hill, Lester, S., “Cryptography in an Algebraic Alphabet,” American Mathematical Monthly, Vol. 36, No. 6,
June–July 1929, pp. 306–312.
4 For details, see Christensen, Chris, “Lester Hill Revisited,” Cryptologia Vol. 38, No. 4, October 2014, pp.
293–332.
    [ 6 11 ] [ 19 ]   [ 191 ]   [  9 ]
    [ 3  5 ] [  7 ] = [  92 ] ≡ [ 14 ]   (mod 26)

gives the first two ciphertext values as 9 and 14 or, in alphabetic form, J O. Continuing in this
manner yields the remaining ciphertext.
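In code, the encipherment looks like this. The following is a minimal Python sketch, not the author's program; padding an odd-length message with a final X is my assumption, since the full ciphertext is not reproduced here.

```python
# Sketch of 2 x 2 matrix encryption modulo 26, using the key from the text.

def encrypt(plaintext, key):
    nums = [ord(c) - ord('A') for c in plaintext]
    if len(nums) % 2:                      # pad odd-length messages;
        nums.append(ord('X') - ord('A'))   # the X pad is an assumption
    out = []
    for i in range(0, len(nums), 2):
        x, y = nums[i], nums[i + 1]
        out.append((key[0][0] * x + key[0][1] * y) % 26)  # row 1 of the key
        out.append((key[1][0] * x + key[1][1] * y) % 26)  # row 2 of the key
    return ''.join(chr(n + ord('A')) for n in out)

key = [[6, 11], [3, 5]]
print(encrypt('TH', key))  # JO, matching the first ciphertext pair above
```

Applying this to the full numerical plaintext listed above reproduces the message's ciphertext pair by pair.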
The details of how to test whether a matrix is invertible and, if so, how to calculate its inverse follow
below, so that you can create your own keys for matrix encryption. A natural starting point is the following definition.
The determinant of a 2 × 2 matrix

    M = [ a b ]
        [ c d ]

is given by ad − bc. It is often denoted by det(M).
The determinant offers a quick test to determine if the matrix is invertible.
Theorem:
A matrix M is invertible if and only if its determinant is invertible.
If the entries of the matrix are real numbers, then the matrix is invertible if and only if its determinant
is not zero. This is because every nonzero real number is invertible. However, for matrix encryption
the entries of the matrix are not real numbers. They are integers modulo n. In the example
above, n = 26. That means we are only allowed to use the numbers 0, 1, 2, 3, 4, 5, 6, 7, 8, 9,
10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, and 25. Which of these numbers are invertible?
Theorem:
A number a is invertible modulo n if and only if gcd(a, n) = 1.
That is, a number a is invertible modulo n if and only if the greatest common divisor of a and n
is 1. Another way of saying gcd(a, n) = 1 is “a and n are relatively prime.” A mod 26 multiplication
table was given in Section 1.16. You can use it to confirm that in the case of n = 26, the invertible
values are 1, 3, 5, 7, 9, 11, 15, 17, 19, 21, 23, and 25. This is simply all of the odd values with the
exception of 13. You can quickly find the inverse of each of these numbers in the table. Simply look
for the number you have to multiply the given number by to get 1.
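Both the invertibility criterion and the table lookup can be checked by brute force; a small sketch:

```python
from math import gcd

def invertible_residues(n):
    # a is invertible mod n exactly when gcd(a, n) = 1
    return [a for a in range(n) if gcd(a, n) == 1]

def inverse_mod(a, n):
    # Brute-force search for the inverse; fine for a modulus as small as 26.
    for b in range(n):
        if (a * b) % n == 1:
            return b
    raise ValueError(f'{a} is not invertible mod {n}')

print(invertible_residues(26))  # [1, 3, 5, 7, 9, 11, 15, 17, 19, 21, 23, 25]
print(inverse_mod(23, 26))      # 17
```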
For a 2 × 2 matrix M with invertible determinant (mod n), the inverse is given by

    M⁻¹ = (det M)⁻¹ [  d  −b ]
                    [ −c   a ]

Example:
You can see that the formula won’t work if the determinant of M is not invertible, because the
formula begins with the inverse of the determinant. Applying this formula to the matrix used in
the encryption example at the beginning of this section, we have
    M⁻¹ = (23)⁻¹ [  5 −11 ] = 17 [  5 15 ] = [ 17·5  17·15 ] = [ 7 21 ]   (mod 26)
                 [ −3   6 ]      [ 23  6 ]   [ 17·23 17·6  ]   [ 1 24 ]
So, if we use

    M = [ 6 11 ]
        [ 3  5 ]

to encipher, then we can use

    M⁻¹ = [ 7 21 ]
          [ 1 24 ]

to decipher, as was claimed.
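The formula translates directly into code. This sketch (mine, not the book's) inverts a 2 × 2 matrix modulo 26 via the determinant and the adjugate:

```python
from math import gcd

def inverse_mod(a, n=26):
    # smallest b with a*b = 1 (mod n); assumes gcd(a, n) = 1
    return next(b for b in range(n) if (a * b) % n == 1)

def matrix_inverse(m, n=26):
    # Inverse of a 2 x 2 matrix mod n: det(M)^(-1) times the adjugate.
    (a, b), (c, d) = m
    det = (a * d - b * c) % n
    if gcd(det, n) != 1:
        raise ValueError('matrix is not invertible mod %d' % n)
    di = inverse_mod(det, n)
    return [[(di * d) % n, (di * -b) % n],
            [(di * -c) % n, (di * a) % n]]

print(matrix_inverse([[6, 11], [3, 5]]))  # [[7, 21], [1, 24]]
```

Running the function on the result returns the original key, confirming that the two matrices are inverses of one another.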
There are other methods that can be used to calculate the inverse of a matrix. You can find
them detailed in linear algebra textbooks. Some of these generalize easily to larger matrices, which
is great because square (invertible) matrices of any dimension may be used for encryption. A 5 ×
5 matrix would allow us to encipher characters in groups of five, and a particular word could be
enciphered in as many as five different ways depending on its position in the message. There are
other advantages to using larger matrices, such as larger keyspaces.
5 Levine, Jack, “Some Applications of High-Speed Computers to the Case n = 2 of Algebraic Cryptography,”
Mathematics of Computation, Vol. 15, No. 75, July 1961, pp. 254–260.
6 Levine gave the incorrect value 740 in his paper. A list of the number of self-inverse matrices for various moduli
is given at http://oeis.org/A066907.
7 The number of plaintext/ciphertext pairs needed to uniquely determine the matrix depends on the size of the
matrix and another factor. You are asked to determine this for 3 × 3 matrix encryption in one of the online
exercises for this chapter.
8 Levine, Jack, “Some Elementary Cryptanalysis of Algebraic Cryptography,” American Mathematical Monthly,
Vol. 68, No. 5, May 1961, pp. 411–418.
Labeling these four forms as 0, 1, 2, and 3, we see (by applying the facts above in the context
of matrix multiplication) that the enciphering matrix sends form 0 to another pair of letters having
form 0. In the case of form 1, enciphering yields a pair of letters having form 2. A nice symmetry
exists, as a plaintext pair of form 2 will be sent to a ciphertext pair of form 1. Finally, a plaintext
pair of form 3 will be sent to a ciphertext pair of form 3.
These pairings can be tersely expressed as
0 ↔ 0, 1 ↔ 2, 3 ↔ 3
If the enciphering matrix has a form other than

    [ even odd  ]
    [ odd  even ]

these pairings may differ, but the
pairings are always constant for a given matrix and we always have 0 ↔ 0. If the matrix is self-
inverse, the pairings will be reciprocal, otherwise some of the arrows could be one-way.
Now consider the following ciphertext obtained by enciphering with a 2 × 2 self-inverse matrix
of unknown form:
CRSFS HLTWB WCSBG RKBCI PMQEM FOUSC
PESHS GPDVF RTWCX FJDPJ MISHE W
If this message was sent from one mathematician to another, we might guess that the word
MATHEMATICS appears somewhere in the plaintext. Word spacing has not been preserved in the
ciphertext, so we cannot immediately locate the word’s position. However, the word can appear in
only two fundamentally different positions with respect to the enciphering matrix. We have either:
1. MA TH EM AT IC Sx
or
2. xM AT HE MA TI CS
We disregard the pairs with an x (representing an unknown plaintext letter) in them. The
remaining pairs have the forms (0, 3, 0, 1, 0) and (1, 2, 0, 2, 0), respectively. Finding the forms of
the ciphertext pairs yields:
1  1  1  3  1  0  1  1  1  0  2  0  1  0  0  2  1  0  3  3  3
CR SF SH LT WB WC SB GR KB CI PM QE MF OU SC PE SH SG PD VF RT
0  3  3  3  0  1  0
WC XF JD PJ MI SH EW
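The four forms are determined by the parities of the two letter values. Assuming the definition (even, even) → 0, (even, odd) → 1, (odd, even) → 2, (odd, odd) → 3, which reproduces every form listed above, the form sequence can be computed mechanically:

```python
def pair_forms(text):
    # Form of a letter pair (A = 0, ..., Z = 25):
    # (even, even) -> 0, (even, odd) -> 1, (odd, even) -> 2, (odd, odd) -> 3
    nums = [ord(c) - ord('A') for c in text]
    return [2 * (nums[i] % 2) + (nums[i + 1] % 2)
            for i in range(0, len(nums), 2)]

print(pair_forms('MATHEMATIC'))  # [0, 3, 0, 1, 0]
ct = 'CRSFSHLTWBWCSBGRKBCIPMQEMFOUSCPESHSGPDVFRTWCXFJDPJMISHEW'
print(pair_forms(ct))            # the 28 ciphertext forms tabulated above
```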
Suppose MATHEMATICS lined up with the enciphering matrix in the manner of the first
possibility listed, (0, 3, 0, 1, 0). Could the message begin with this word? If it did, this would
imply the plaintext/ciphertext pairing 0 ↔ 1, while we know 0 ↔ 0 for all matrices. Therefore, if
MATHEMATICS appears in the message, it cannot be at the very beginning. Lining up with the
first zero in the ciphertext gives:
          0 3 0 1 0
1 1 1 3 1 0 1 1 1 0 2 0 1 0 0 2 1 0 3 3 3 0 3 3 3 0 1 0
This is not possible, because the middle pairing is 0 ↔ 1. Also, if 3 ↔ 1, we must have 1 ↔
3, but the above alignment has 1 ↔ 1. We now line up the first 0 in our proposed plaintext with
the second zero in the ciphertext.
                  0 3 0 1 0
1 1 1 3 1 0 1 1 1 0 2 0 1 0 0 2 1 0 3 3 3 0 3 3 3 0 1 0
The pairings given by this alignment are consistent. This is, in fact, the only consistent align-
ment for when MATHEMATICS is broken up as MA TH EM AT IC Sx and enciphered.
However, as explained earlier, MATHEMATICS could be broken up as xM AT HE MA TI CS
and then enciphered. In this case, the form pattern is (1, 2, 0, 2, 0). Because this pattern has two
zeros separated by a single value we check alignments in positions where the ciphertext also has
this form:
              1 2 0 2 0
1 1 1 3 1 0 1 1 1 0 2 0 1 0 0 2 1 0 3 3 3 0 3 3 3 0 1 0
This alignment cannot be correct, because it sends the first 2 to 1 and the second 2 to 2, which
is inconsistent.
                  1 2 0 2 0
1 1 1 3 1 0 1 1 1 0 2 0 1 0 0 2 1 0 3 3 3 0 3 3 3 0 1 0
This alignment is rejected, because it sends the 1 to 0.
                                              1 2 0 2 0
1 1 1 3 1 0 1 1 1 0 2 0 1 0 0 2 1 0 3 3 3 0 3 3 3 0 1 0
This alignment is also impossible, as 1 cannot be paired with 3, if 2 is also paired with 3.
Sometimes there will be several consistent alignments. The fact that the ciphertext was short
made this less likely for the example above. Usually having more ciphertext makes the cipher easier
to break, but for this attack it creates more work! Our only consistent alignment is:
                  0 3 0 1 0
1 1 1 3 1 0 1 1 1 0 2 0 1 0 0 2 1 0 3 3 3 0 3 3 3 0 1 0
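The consistency check can be automated. This sketch (my own formulation) uses only two rules from the discussion above: 0 always pairs with 0, and, because the matrix is self-inverse, every pairing is reciprocal.

```python
def consistent_offsets(crib_forms, ct_forms):
    # Return the offsets (counted in pairs) at which the crib's form
    # pattern can line up consistently with the ciphertext's forms.
    good = []
    for off in range(len(ct_forms) - len(crib_forms) + 1):
        pairing = {}
        ok = True
        for p, c in zip(crib_forms, ct_forms[off:]):
            if (p == 0) != (c == 0):        # form 0 pairs only with form 0
                ok = False
                break
            # Self-inverse matrix: p -> c forces c -> p as well.
            if pairing.setdefault(p, c) != c or pairing.setdefault(c, p) != p:
                ok = False
                break
        if ok:
            good.append(off)
    return good

ct_forms = [1, 1, 1, 3, 1, 0, 1, 1, 1, 0, 2, 0, 1, 0, 0, 2, 1, 0,
            3, 3, 3, 0, 3, 3, 3, 0, 1, 0]
print(consistent_offsets([0, 3, 0, 1, 0], ct_forms))  # [9]  (the 10th pair)
print(consistent_offsets([1, 2, 0, 2, 0], ct_forms))  # []
```

As the output shows, only the alignment at the tenth ciphertext pair survives, matching the hand analysis.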
A little time could have been saved by calculating the deciphering matrix first—that is, by
finding the matrix that sends the known ciphertext to the known plaintext. The steps would be
the same as above; only the numbers would change. It should also be noted that the equations
generated by the crib may be solved by placing them in a matrix and row reducing, rather than
investigating them in the manner detailed above. For larger matrices, with more unknowns to
solve for, the more systematic approach of row reduction would be preferred.
A good cipher must resist crib attacks. An intercepted message always has some context that
allows an attacker to guess cribs. If one particular crib doesn’t lead to a solution, the attacker can
start over with another crib. Modern ciphers are even expected to resist chosen plaintext attacks,
where the attacker gets to pick any plaintext he or she likes and receive the corresponding cipher-
text. For a matrix encryption system using a 2 × 2 matrix, the chosen plaintext pairs BA and AB
would be excellent choices. The resulting ciphertext, upon being converted back to numerical
values, would give the entries in columns 1 and 2 of the matrix, respectively. One could work these pairs
into a meaningful message like so:
For a 3 × 3 matrix, the ideal chosen plaintext triplets would be BAA, ABA, and AAB. Casually
fitting these into an innocent sounding message would be a little harder.
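A sketch of the chosen-plaintext recovery for the 2 × 2 case, under the column-vector convention of the encipherment example (the oracle and helper names are mine): enciphering BA = (1, 0) exposes one column of the key and AB = (0, 1) the other.

```python
def encrypt_pair(key, pair):
    # Multiply the key matrix by a plaintext pair (as a column vector) mod 26.
    x, y = (ord(c) - ord('A') for c in pair)
    return ((key[0][0] * x + key[0][1] * y) % 26,
            (key[1][0] * x + key[1][1] * y) % 26)

def recover_key(encrypt_oracle):
    # BA = (1, 0) yields column 1 of the key; AB = (0, 1) yields column 2.
    a, c = encrypt_oracle('BA')
    b, d = encrypt_oracle('AB')
    return [[a, b], [c, d]]

secret = [[6, 11], [3, 5]]
oracle = lambda pair: encrypt_pair(secret, pair)
print(recover_key(oracle))  # [[6, 11], [3, 5]]
```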
9 Bauer Craig and Katherine Millward, “Cracking Matrix Encryption Row by Row,” Cryptologia, Vol. 31, No.
1, January 2007, pp. 76–83. The following pages are adapted from this paper and updated with material from
sequels by other authors.
Figure 6.3 Craig P. Bauer. (Photograph by Mike Adams and used with permission.)
deciphering matrix as

    [ a b ]
    [ c d ]

and guess that a = 7 and b = 3; applying this row will then give the
plaintext equivalents for positions 1, 3, 5, 7, … We cannot tell at a glance if our guess is correct, as
we could if we made guesses for both rows and had a complete potential decipherment, but we can
analyze the letters we do obtain statistically to see if they seem reasonable. We do this by compar-
ing the frequencies of the letters recovered (every other letter of the complete plaintext) with the
frequencies of the characters in normal English. The choice for a and b that yields the best match
provides our most likely values for the first row of the matrix.
Basically, if our guess for row 1 of the deciphering matrix yields common letters such as E, T, and
A, it is considered a good guess; however, if we get rare letters like Z and Q, it is a bad guess. All we
need is a way to assign scores to each guess based on the letters generated, so that the process can be
automated.
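The row-guessing machinery can be sketched as follows; the letter frequencies and the plain sum-of-frequencies score are illustrative stand-ins, not the authors' final point scheme.

```python
# Approximate English letter frequencies in percent (illustrative values).
FREQ = {'E': 12.7, 'T': 9.1, 'A': 8.2, 'O': 7.5, 'I': 7.0, 'N': 6.7,
        'S': 6.3, 'H': 6.1, 'R': 6.0, 'D': 4.3, 'L': 4.0, 'C': 2.8,
        'U': 2.8, 'M': 2.4, 'W': 2.4, 'F': 2.2, 'G': 2.0, 'Y': 2.0,
        'P': 1.9, 'B': 1.5, 'V': 1.0, 'K': 0.8, 'J': 0.2, 'X': 0.2,
        'Q': 0.1, 'Z': 0.1}

def apply_row(row, ciphertext):
    # One row of the deciphering matrix applied to each ciphertext pair
    # recovers every other letter of the plaintext.
    a, b = row
    nums = [ord(c) - ord('A') for c in ciphertext]
    return ''.join(chr((a * nums[i] + b * nums[i + 1]) % 26 + ord('A'))
                   for i in range(0, len(nums), 2))

def score(row, ciphertext):
    # Higher totals mean the recovered letters look more like English.
    return sum(FREQ[c] for c in apply_row(row, ciphertext))

# Rank all 26 * 26 candidate rows for a given ciphertext, best first:
# ranked = sorted(((a, b) for a in range(26) for b in range(26)),
#                 key=lambda r: score(r, ciphertext), reverse=True)
print(apply_row((7, 21), 'JO'))  # T, the first plaintext letter of pair JO
```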
Repeating this procedure for the second row will suggest the same candidate values, since each row is
scored independently on the letters it produces. For a matrix to be invertible, the rows must be distinct, so we need to consider a few of the most likely solutions
for the rows and try them in different orders to recover the matrix. For example, if the rows yielding the
best match to normal English letter frequencies are (4 13) and (9 6), the deciphering matrix could be

    [ 4 13 ]        [ 9  6 ]
    [ 9  6 ]   or   [ 4 13 ]

Applying each of these matrices to the ciphertext would quickly reveal which is
correct. However, for some ciphertexts, the two rows that seem to be the most likely are not correct! In
those cases we have to consider a larger number of potential rows, again trying them in various orders
within the matrix until we finally recover a meaningful message.
So how can we score each potential row to determine which are most likely to be correct? A
natural way of ranking them is to award each a point value equal to the sum of the probabilities of
the letters it generates when applied to the ciphertext. However, this approach to scoring was not
as successful as we had hoped and we actually achieved better results with a less refined approach!
We settled on the following scheme for awarding points to a potential row:
In order to test this attack, we needed a selection of ciphertexts. To create these we took a list
of books, namely Modern Library’s 100 Best English-Language Novels of the Twentieth Century,10
and arbitrarily took the first 100 letters of each of the top 25 novels as our plaintext messages. We
then used a computer program to generate invertible matrices, which were used to encipher these
messages. Separate programs examined the attack against 2 × 2, 3 × 3, and 4 × 4 matrices. While
working on the programs, we discovered that our attack for the special case of a 3 × 3 matrix was
previously and independently put forth online by Mark Wutka of the Crypto Forum.11 We con-
tacted him by email and were encouraged to continue with our investigation.
Our first program investigated the case in which a 2 × 2 matrix was used. All possible rows
were considered, and it was discovered that the highest scoring rows were often impossible! For
example, the row (13 0) sends every pair of ciphertext letters where the first letter is even to 0 =
A, because we are performing all arithmetic modulo 26. The letter A, being so frequent in normal
English, was worth 2 points in our scoring scheme, so this row scored very high. However, any
matrix containing this row would have a determinant that is a multiple of 13 and therefore not
invertible modulo 26. Hence, this row could not possibly arise as part of the deciphering matrix.
Similarly, (0 13) and (13 13) cannot be rows in an invertible matrix. Also, all rows of the form
(even even) would be impossible, as well, because such a row would make the determinant an
even number and therefore not invertible modulo 26. In terms of coding, it was easiest to inves-
tigate possibilities for the row (a b) by using nested loops where a and b each run from 0 to 25.
The impossible rows described above were assigned scores, but were ignored when displaying the
results.
The results (possible matrix rows and their scores) were then ordered by score. The rows with
the highest scores should be tried first; some combination of them is very likely to yield the correct
deciphering matrix.
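The row-ranking attack for the 2 × 2 case can be sketched in Python. For clarity, this sketch uses the simple sum-of-probabilities score mentioned earlier rather than the exact point scheme we settled on, and the function names are illustrative only.

```python
# A sketch of the 2 x 2 row-ranking attack, scoring each candidate row by
# the summed English-letter probabilities of the text it produces. The
# exact point scheme used in the actual attack differed; this is only an
# illustration of the method.
from itertools import permutations
from math import gcd

# Approximate relative frequencies of A-Z in normal English.
FREQ = [0.082, 0.015, 0.028, 0.043, 0.127, 0.022, 0.020, 0.061, 0.070,
        0.002, 0.008, 0.040, 0.024, 0.067, 0.075, 0.019, 0.001, 0.060,
        0.063, 0.091, 0.028, 0.010, 0.024, 0.002, 0.020, 0.001]

def possible(a, b):
    """A row cannot appear in an invertible matrix mod 26 if both entries
    are even or both are multiples of 13."""
    return not (a % 2 == 0 and b % 2 == 0) and not (a % 13 == 0 and b % 13 == 0)

def apply_rows(rows, nums):
    """Decipher pairs of numbers with a candidate matrix given as two rows."""
    return ''.join(chr(65 + (r[0] * nums[i] + r[1] * nums[i + 1]) % 26)
                   for i in range(0, len(nums) - 1, 2) for r in rows)

def ranked_rows(nums, keep=7):
    """The `keep` highest scoring possible rows, best first."""
    scored = sorted(((sum(FREQ[(a * nums[i] + b * nums[i + 1]) % 26]
                          for i in range(0, len(nums) - 1, 2)), (a, b))
                     for a in range(26) for b in range(26) if possible(a, b)),
                    reverse=True)
    return [row for _, row in scored[:keep]]

def candidates(ciphertext, keep=7):
    """Ordered pairs of top rows forming an invertible deciphering matrix,
    each paired with the trial decipherment it produces."""
    nums = [ord(c) - 65 for c in ciphertext.upper() if c.isalpha()]
    for (a, b), (c, d) in permutations(ranked_rows(nums, keep), 2):
        if gcd((a * d - b * c) % 26, 26) == 1:
            yield ((a, b), (c, d)), apply_rows(((a, b), (c, d)), nums)
```

Each candidate that passes the invertibility test yields a trial decipherment, which can then be scanned for meaningful text.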
The results for our investigation of the 2 × 2 case are summarized graphically in Figure 6.4.
The graph shows an increasing number of ciphertexts being correctly deciphered as we go deeper
into our list of possible matrix rows ordered by the scores they generate.
Examining some particular results in detail will make the graph clearer. For the fourth
ciphertext we attacked, the top scoring rows were (7 25) with a score of 62 and (14 21) with a score
of 57.5. The correct deciphering matrix was $\begin{pmatrix}7&25\\14&21\end{pmatrix}$, so our attack worked wonderfully.
For the seventh ciphertext, the top scoring rows were (19 16) with a score of 63 and (22 19) with
a score of 59.5. The correct deciphering matrix in this case was $\begin{pmatrix}22&19\\19&16\end{pmatrix}$, so the rows appeared in
a different order, but we still had the correct rows as our top two possibilities. We found that, for
10 Modern Library’s 100 Best English-Language Novels of the Twentieth Century, https://web.archive.org/web/
20150910153230/http://home.comcast.net/~netaylor1/modlibfiction.html.
11 Wutka, Mark, The Crypto Forum, http://s13.invisionfree.com/Crypto/index.php?showtopic=80.
210 ◾ Secret History
Figure 6.4 Recovery rate of keys as a function of the number of high scoring rows considered
for the 2 × 2 case.
64% of the ciphertexts we attacked, the two most likely rows could be used to obtain the correct
deciphering matrix, as described above. Hence, in Figure 6.4, we have the point (2, 64).
For the second ciphertext we attacked, the two most likely rows (highest scoring) did not yield
the correct deciphering matrix. The list of highest scoring rows began (13 22), (0 9), (13 9), (0 7),
(13 7), (4 15), … The correct deciphering matrix was $\begin{pmatrix}4&15\\13&22\end{pmatrix}$, which used the rows in positions
1 and 6 of our ordered list. So, while the correct rows usually appeared as the two most likely, we
sometimes had to go further down the list. When considering the six highest scoring rows, we
found the correct two rows for the deciphering matrix to be among them 92% of the time. Hence,
the graph in Figure 6.4 includes the point (6, 92).
Considering the top seven rows, by our ranking scheme, caught the correct two 100% of the
time. Selecting two rows from the seven most likely to form a matrix allows for 7P2 = 42 possibili-
ties. It was necessary to use a permutation in this calculation, because the order of the rows mat-
ters. The 42 distinct matrices obtained as possibilities in this worst-case scenario (we can usually
find our answer within a smaller set) can still be found and checked far more rapidly than the set
of all 157,248 invertible 2 × 2 matrices. Even considering the overhead of calculating scores for
26² = 676 rows, we still have a great savings.
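The count of 157,248 invertible 2 × 2 matrices mod 26 is easy to verify by direct enumeration (a quick check, not part of the attack itself):

```python
# Brute-force count of invertible 2 x 2 matrices mod 26: a matrix is
# invertible exactly when its determinant is a unit, i.e., coprime to 26.
from itertools import product
from math import gcd

count = sum(1 for a, b, c, d in product(range(26), repeat=4)
            if gcd((a * d - b * c) % 26, 26) == 1)
print(count)  # 157248
```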
We continued on to apply our attack to the same set of messages enciphered with 3 × 3 matri-
ces. Once again, our list of ranked rows greatly reduced the number of matrices that needed to
be considered. Our attack found the correct three rows over half the time among those scoring
in the top 17. This leaves more possibilities to consider than in the 2 × 2 case, yet testing 17P3 =
4,080 matrices once again represents a tremendous savings over a brute force attack on the set of
all 1,634,038,189,056 possibilities. In this case, the overhead consists in calculating scores for
26³ = 17,576 rows. The correct rows were among the top 76 in 88% of the trials, but we had to consider
the top 394 scorers before correctly identifying all 25 matrices. Still, 394P3 = 60,698,064 represents
only about 0.0037% of the full brute-force search.
Matrix Encryption ◾ 211
The last case we investigated was when a 4 × 4 matrix had been applied to yield the cipher-
text. The results continue the trend established by the smaller cases. We achieved success in 52%
of the cases with a relatively small number of rows considered from our ranking (namely 1,469),
continued on to get 88% within the top 7,372, and then had to go all the way out to the top
24,541 rows to get 100% of the test messages correctly deciphered. Thus, the trend is that a larger
and larger number of rows must be considered as the dimension of the matrix grows, yet as a
percentage of the total keyspace, which, for the 4 × 4 case, is 12,303,585,972,327,392,870,400,
our attack is seen to be improving in terms of efficiency over brute force. It should also be noted
that, because the messages were kept at a fixed length of 100 characters, as the enciphering
matrix grew from 2 × 2 to 4 × 4, the number of characters on which the rankings of the rows
were based diminished from 50 to 25. It is likely that the results would be improved if the
sample ciphertexts were longer.
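The keyspace figures quoted above for the 2 × 2, 3 × 3, and 4 × 4 cases can be confirmed with the standard counting formula (see the Overbey, Traves, and Wojdylo paper in the references at the end of this chapter); this is a quick check, not part of the attack itself:

```python
# Verifying the keyspace figures for the 2 x 2, 3 x 3, and 4 x 4 cases.
# By the Chinese Remainder Theorem, a matrix is invertible mod 26 exactly
# when it is invertible mod 2 and mod 13, so
# |GL_n(Z_26)| = |GL_n(Z_2)| * |GL_n(Z_13)|.

def gl_order(n, p):
    """Number of invertible n x n matrices over Z_p for a prime p:
    (p^n - 1)(p^n - p)...(p^n - p^(n-1))."""
    result = 1
    for i in range(n):
        result *= p**n - p**i
    return result

def hill_keyspace(n):
    """Number of invertible n x n matrices mod 26."""
    return gl_order(n, 2) * gl_order(n, 13)

for n in (2, 3, 4):
    print(n, hill_keyspace(n))
```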
The scoring scheme we used was never claimed to be optimal. Indeed, in 2009, Dae Hyun
Yum and Pil Joong Lee, a pair of Korean researchers, found a better scoring method, improving
the efficiency of the attack.12 They called their scoring scheme “simplified multinomial” or SM
for short and directly compared it to our scoring scheme (Bauer-Millward, or BM for short) in
Table 6.1.
Yum and Lee also generalized the attack to work on Hill ciphers where the numerical represen-
tations of the letters do not simply follow the alphabet.
Further improvements were made by Elizabethtown College professors Tom Leap (com-
puter science) and Tim McDevitt (mathematics), working with a pair of undergraduate applied
mathematics majors, Kayla Novak and Nicolette Siermine. They refined the scoring scheme by
first doing preliminary scoring with the index of coincidence. An important observation they
made was that if a row yields a low score for the index of coincidence, then it is safe to not only
reject that row, but to also reject all multiples of it, where the multiplier is relatively prime to
the modulus, because they would give the same IC value. This leads to a savings of a factor of
φ(L), where L is the size of the alphabet being used.13 They followed this calculation (for the
highest scoring rows) with a check of a goodness-of-fit statistic, like Yum and Lee used. For the
best of these possibilities, they used plaintext digraph statistics to determine the correct
deciphering matrix.14
12 Yum, Dae Hyun and Pil Joong Lee, “Cracking Hill Ciphers with Goodness-of-Fit Statistics,” Cryptologia, Vol.
33, No. 4, October 2009, pp. 335–342.
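The index-of-coincidence pre-screen can be sketched as follows (an illustration of the idea, not the published code). Multiplying a row by a unit mod 26 merely relabels the letters it produces, so the IC is unchanged and all φ(26) = 12 unit multiples of a rejected row can be discarded together:

```python
# A sketch of the index-of-coincidence pre-screen (illustrative code, not
# the published implementation).
from collections import Counter
from math import gcd

def index_of_coincidence(text):
    """IC of a string: probability that two randomly chosen letters match."""
    n, counts = len(text), Counter(text)
    return sum(c * (c - 1) for c in counts.values()) / (n * (n - 1))

def unit_multiples(row, modulus=26):
    """Rows obtained by multiplying `row` by a multiplier coprime to the
    modulus. They relabel the output letters without changing the letter
    counts, so they all share the same IC and can be rejected as a group."""
    return {tuple((u * x) % modulus for x in row)
            for u in range(1, modulus) if gcd(u, modulus) == 1}
```

English plaintext has an IC near 0.066, while an incorrect row typically produces near-uniform output with an IC near 1/26 ≈ 0.038, so low-IC rows (and the twelve unit multiples of each) can be discarded before any goodness-of-fit testing.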
The Elizabethtown College team confirmed that their attack works for matrices all the way
up to 8 × 8, for which the attack took 4.8 hours on an ordinary quad core desktop computer. The
ciphertext in this instance was 1,416 characters long and arose from a message using a 27-letter
alphabet that included the space needed to separate words. For smaller matrices, attacks were
made on shorter ciphertexts. The team noted, “For short texts with small matrices, the multino-
mial approach of Yum and Lee may be best, but for larger matrices or longer texts, our method
seems to be a substantial improvement.”15
Two years later, Tim McDevitt, working with a new pair of coauthors, Jessica Lehr (an actuarial
science major) and Ting Gu (a computer science professor), put forth an even better attack.
This time the team was able to crack 8 × 8 matrix encryption, over a 29-character alphabet, in
seconds. The attack allowed them to succeed all the way up to the 14 × 14 case, with an average
runtime just under four hours.16
In 2020, George Teşeleanu, a Romanian mathematician and computer scientist, broadened
the attack to other modes of matrix encryption.17 These modes are covered in Section 13.6.
The Hill cipher is important because of the explicit connection it makes between algebra and
cryptography. It is not known to have been important in use. In fact, Jack Levine enjoyed working
on it because he didn’t have to worry about intersecting classified work. That is not to say that it
wasn’t used during World War II. Indeed, it was used by the American military to encipher radio
call signs in that war18 and in the Korean War.19 There are also rumors of it having been used in
Vietnam, where jungle conditions sometimes prevented more secure machine systems from being
implemented successfully.
13 The function φ was previously seen in Section 1.16 of this book and will be seen again later.
14 Leap, Tom, Tim McDevitt, Kayla Novak, and Nicolette Siermine, “Further Improvements to the Bauer-
Millward Attack on the Hill Cipher,” Cryptologia, Vol. 40, No. 5, September 2016, pp. 452–468.
15 Leap, Tom, Tim McDevitt, Kayla Novak, and Nicolette Siermine, “Further Improvements to the Bauer-
Millward Attack on the Hill Cipher,” Cryptologia, Vol. 40, No. 5, September 2016, pp. 452–468.
16 McDevitt, Tim, Jessica Lehr, and Ting Gu, “A Parallel Time-Memory Tradeoff Attack on the Hill Cipher,”
Cryptologia, Vol. 42, No. 5, September 2018, pp. 408–426. The authors noted that the results were “gener-
ated on a Supermicro 828-14 Server with 4× AMD Operation 6376 Sixteen Core 2.30 GHz processors. It has
sixteen 8 GB DIMM RAM and a 120 GB SSD drive, and the programming was done in Java.”
17 Teşeleanu, George, “Cracking Matrix Modes of Operation with Goodness-of-Fit Statistics,” in Megyesi, Beáta,
editor, Proceedings of the 3rd International Conference on Historical Cryptology, HistoCrypt 2020, pp. 135–145.
18 See Kahn, David, The Codebreakers, second edition, Scribner, New York, 1996, p. 408 and Bauer, Friedrich L.,
Decrypted Secrets: Methods and Maxims of Cryptology, second edition, Springer, Berlin, Germany, 2007, p. 85.
19 Burke, Colin B., “It wasn’t All Magic: The Early Struggle to Automate Cryptanalysis, 1930s-1960s,” United
States Cryptologic History, Special Series, Vol. 6, Center for Cryptologic History, National Security
Agency, Fort George G. Meade, MD, 2002, p. 254, available online at http://cryptome.org/2013/06/NSA-
WasntAllMagic_2002.pdf and https://fas.org/irp/nsa/automate.pdf.
20 History of the Math Department at NCSU: Jack Levine, December 31, 1986, interview, https://web.archive.org/
web/20160930110613/http://www4.ncsu.edu/~njrose/Special/Bios/Levine.html.
Levine, Jack, “Some Further Methods in Algebraic Cryptography,” Journal of the Elisha Mitchell Scientific
Society, Vol. 74, 1958, pp. 110–113.
Levine, Jack, “Some Elementary Cryptanalysis of Algebraic Cryptography,” American Mathematical
Monthly, Vol. 68, No. 5, May 1961, pp. 411–418.
Levine, Jack, “Some Applications of High-Speed Computers to the Case n = 2 of Algebraic Cryptography,”
Mathematics of Computation, Vol. 15, No. 75, July 1961, pp. 254–260.
Levine, Jack, “Cryptographic Slide Rules,” Mathematics Magazine, Vol. 34, No. 6, September-October
1961, pp. 322–328. Levine presented a device for performing matrix encryption in this paper. Also
see Figures 6.5 and 6.6.
Levine, Jack, “On the Construction of Involutory Matrices,” American Mathematical Monthly, Vol. 69, No.
4, April, 1962, pp. 267–272. Levine provides a technique for generating matrices that are self-inverse
in this paper.
Noninvertible matrices can be used for encryption as the following four papers demonstrate.
Levine, Jack and Robert E. Hartwig, “Applications of the Drazin Inverse to the Hill Cryptographic System
Part I,” Cryptologia, Vol. 4, No. 2, April 1980, pp. 71–85.
Levine, Jack and Robert E. Hartwig, “Applications of the Drazin Inverse to the Hill Cryptographic System
Part II,” Cryptologia, Vol. 4, No. 3, July 1980, pp. 150–168.
Levine, Jack and Robert E. Hartwig, “Applications of the Drazin Inverse to the Hill Cryptographic System
Part III,” Cryptologia, Vol. 5, No 2, April 1981, pp. 67–77.
Levine, Jack and Robert E. Hartwig, “Applications of the Drazin Inverse to the Hill Cryptographic System
Part IV,” Cryptologia, Vol. 5, No. 4, October 1981, pp. 213–228.
Levine, Jack and Richard Chandler, “The Hill Cryptographic System with Unknown Cipher Alphabet but
Known Plaintext,” Cryptologia, Vol. 13, No. 1, January 1989, pp. 1–28.
McDevitt, Tim, Jessica Lehr, and Ting Gu, “A Parallel Time-Memory Tradeoff Attack on the Hill Cipher,”
Cryptologia, Vol. 42, No. 5, September 2018, pp. 408–426.
Ohaver, M. E., “Solving Cipher Secrets,” Flynn’s Weekly, October 22, 1926, p. 798. M. E. Ohaver is one
of the pseudonyms used by Kendell Foster Crossen (1910–1981). Problem No. 6, on page 798 of this
article, is the one posed by Jack Levine.
Ohaver, M. E., “Solving Cipher Secrets,” Flynn’s Weekly, November 13, 1926, pp. 794–800. This is the
column in which an explanation of Levine’s system, from the October 22, 1926 issue, appeared (see
pages 799–800). Levine believed it laid the foundation for matrix encryption.
Overbey, Jeffrey, William Traves, and Jerzy Wojdylo, “On the Keyspace of the Hill Cipher,” Cryptologia, Vol.
29, No. 1, January 2005, pp. 59–72, available online at https://web.archive.org/web/20050910055747/
http://jeff.actilon.com/keyspace-final.pdf. This paper presents formulas yielding the size of the keyspace
for matrix encryption with arbitrary dimension and moduli. Some of this material may also be
found (more tersely) in Friedrich L. Bauer’s Decrypted Secrets, 1st edition, Springer, Berlin, Germany,
1997, p. 81. Second, third, and fourth editions of the latter have since been released.
Teşeleanu, George, “Cracking Matrix Modes of Operation with Goodness-of-Fit Statistics,” in Megyesi,
Beáta, editor, Proceedings of the 3rd International Conference on Historical Cryptology, HistoCrypt 2020,
pp. 135–145.
Thilaka, B. and K. Rajalakshni, “An Extension of Hill Cipher Using Generalized Inverses and mth Residue
Modulo m,” Cryptologia, Vol. 29, No. 4, October 2005, pp. 367–376.
Wutka, Mark, The Crypto Forum, http://s13.invisionfree.com/Crypto/index.php?showtopic=80. This link
is now broken and the page was not archived by Wayback Machine, but I’m including it, because I
want to continue pointing out Wutka’s priority on what is sometimes called the “Bauer-Millward
attack.”
Yum, Dae Hyun and Pil Joong Lee, “Cracking Hill Ciphers with Goodness-of-Fit Statistics,” Cryptologia,
Vol. 33, No. 4, October 2009, pp. 335–342.
There have been many other papers in recent years in less specialized journals that attempt to describe
stronger variants of matrix encryption; however, the cryptographic community is, in general, skeptical. One
modification described in some of the papers referenced above foreshadows the various modes of encryption
used for modern cipher systems. This is discussed in Section 13.6.
The Jack Levine archive at North Carolina State University is home to cipher wheels for performing matrix
encryption. They are shown in Figures 6.5 and 6.6.
Figure 6.5 A cipher wheel for performing matrix encryption. (From Jack Levine Papers,
1716–1994, North Carolina State University, MC 308.5.1, General Cryptography, Algebraic
Encipherment Wheels.)
Figure 6.6 A set of cipher wheels for performing matrix encryption. (From Jack Levine Papers,
1716–1994, North Carolina State University, MC 308.5.1, General Cryptography, Algebraic
Encipherment Wheels.)
Chapter 7
1 Kahn, David, The Codebreakers, second edition, Scribner, New York, 1996, pp. 415–420.
2 Kahn, David, The Codebreakers, second edition, Scribner, New York, 1996, p. 420.
3 Kahn, David, The Codebreakers, second edition, Scribner, New York, 1996, pp. 420–422.
Eventually, with Hitler’s rise to power and the re-arming of Germany, Enigma machines were
mass produced. Scherbius was probably dead by this time, and it is not known if anyone else
profited from these sales. In any case, the Nazis did not nationalize the business.4 The majority of
this chapter is focused on the Enigma, but the last of the four rotor inventors should be mentioned
before moving on.
Arvid Gerhard Damm filed a patent in Sweden for a cipher machine with rotors only a few
days after Koch. Damm died just as the company he had founded to market cipher machines started to
take off. Boris Hagelin took over and in 1940 came to America, where he was eventually able to
sell machines to the U.S. Army.5 The M-209 created by Hagelin and used by America is pictured
in Figures 7.2 and 7.3. It was not the most secure machine America had in use during World War
II, but it was lightweight and easy to use.
Hagelin earned millions and returned to Sweden in 1944. The Cold War paved the way for
millions more to be made, and Hagelin relocated his business to Switzerland, so the Swedish gov-
ernment couldn’t take over in the name of national defense. In Switzerland, the company became
Crypto Aktiengesellschaft, or Crypto AG for short.6 The Swiss reputation for neutrality worked to
Hagelin’s advantage, as many nations felt safe purchasing cipher machines from them. This may
have been to the United States’ advantage, as well. Hagelin’s story is continued in Section 12.7, but
for now we return to Scherbius’s Enigma.
The Enigma machine existed first in a commercial version. It was modified (in small ways) for
military use and adopted by the German Navy around 1926 and by the German Army on July 15,
1928.7 The Luftwaffe also came to use it. Altogether, about 40,000 of these machines were used
Figure 7.2 Former National Cryptologic Museum curator Patrick Weadon with an M-209B. The
M-209B is a later version of the M-209, but the differences between the two are very minor.
Figure 7.3 A closer look at an M-209B. (This machine is in the collection of the National
Cryptologic Museum and was photographed by the author.)
by the Nazis during World War II.8 A description of the military version follows, referencing the
commercial version only when necessary.
When a key was pressed on the Enigma machine (Figure 7.4), one of the lights above the keys
would switch on. This represented the ciphertext letter. As will be seen, a letter could never be
enciphered as itself. This is a weakness. Another weakness is that, at a given setting, if pressing X
lights Y, then pressing Y lights X. This reciprocity allows the same setting that was used to encipher
the message to be used for deciphering. It is analogous to using an involutory (self-inverse) matrix
for the Hill cipher, although this machine operates in a completely different manner. We will now
examine the electrical path connecting the depressed key to the lighted ciphertext letter. Each key
is linked to a pair of holes in the Steckerbrett (literally “stick-board” or “plugboard”)9 pictured in
Figure 7.5. Cables may connect these in pairs, performing the first substitution. Initially, six cables
were used.
8 Erskine, Ralph, “Enigma’s Security: What the Germans Really Knew,” in Erskine, Ralph and Michael Smith,
editors, Action this Day, Bantam Press, London, UK, 2001, pp. 370–385, p. 370 cited here.
9 The plugboard didn’t exist for early (commercial) versions of the Enigma. It was introduced in 1928.
World War II: The Enigma of Germany ◾ 221
Later, the number of cables used could be anywhere from 0 to 13, inclusive. Because the letters
are connected in pairs, 13 is the limit for a 26-letter alphabet. The German alphabet has a few
extra letters, namely ä, ö, ü, and ß, which are rendered as ae, oe, ue, and ss, respectively, for this
machine.
After the plugboard substitution, we may follow the path in the schematic diagram provided
in Figure 7.6 to the rightmost rotor. There are 26 wires internal to the machine connecting the
plugboard to the rotor system. This offers the opportunity for another scrambling (more on this
later). The rotors each make another substitution.
Figure 7.7 shows a disassembled rotor. Each side has 26 electrical contacts, one representing
each letter. The wires inside connect these, yielding the substitution.
Initially, there were only three distinct rotors that could be used, although they could be
placed in the machine in any order. Their inner wirings differed from those on the commercial
version of the Enigma and were kept secret. They were as follows:
Rotor I
Input: ABCDEFGHIJKLMNOPQRSTUVWXYZ
Output: EKMFLGDQVZNTOWYHXUSPAIBRCJ
Rotor II
Input: ABCDEFGHIJKLMNOPQRSTUVWXYZ
Output: AJDKSIRUXBLHWTMCQGZNPYFVOE
Rotor III
Input: ABCDEFGHIJKLMNOPQRSTUVWXYZ
Output: BDFHJLCPRTXVZNYEIWGAKMUSQO
The above might seem like the tersest possible notation for the action of the rotors, but this isn’t
so! Such permutations (as mathematicians usually refer to them) can be written even more briefly.
Consider Rotor I. It sends A to E, E to L, L to T, T to P, P to H, H to Q, Q to X, X to R, R to U, and
U to A, which is where I started this chain. We may write this as (AELTPHQXRU), leaving off the
last A with the understanding that when we reach the end, we loop back to the start. Now, this is
very nice, but it doesn’t include all of the letters. So, we pick a letter that was missed and repeat the
process to get (BKNW). Putting these together yields (AELTPHQXRU)(BKNW), which still doesn’t
include all 26 letters, so again we start with one that was missed and form another cycle, as these
groups of letters are called. Eventually we get (AELTPHQXRU)(BKNW)(CMOY)(DFG)(IV)(JZ)(S).
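The cycle decomposition is mechanical enough that a few lines of Python (an illustrative sketch, not anything historical) can produce it from the published wiring of Rotor I:

```python
# Computing the cycle decomposition of a rotor permutation, as described
# above, from the published wiring of Rotor I.
OUTPUT = "EKMFLGDQVZNTOWYHXUSPAIBRCJ"  # Rotor I: A maps to E, B to K, ...

def cycles(perm):
    """Return the disjoint cycles of a permutation of A-Z given as a string
    listing the images of A, B, ..., Z in order."""
    seen, result = set(), []
    for start in range(26):
        if start in seen:
            continue
        cycle, x = [], start
        while x not in seen:
            seen.add(x)
            cycle.append(chr(65 + x))
            x = ord(perm[x]) - 65
        result.append("(" + "".join(cycle) + ")")
    return "".join(result)

print(cycles(OUTPUT))  # (AELTPHQXRU)(BKNW)(CMOY)(DFG)(IV)(JZ)(S)
```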
Figure 7.6 From keyboard to bulb—the enciphering path of an Enigma. (From Miller, R., The
Cryptographic Mathematics of Enigma, Center for Cryptologic History, Fort Meade, Maryland, 2001,
p. 2.)
Figure 7.7 A disassembled Enigma rotor. (Courtesy of René Stein, National Cryptologic Museum.)
In Figure 7.8, we see three rotors side by side. Each rotor performs a substitution. After pass-
ing through all three rotors, the electrical impulse passes to the reflector, pictured up close in
Figure 7.9. The reflector makes another substitution.
In our cyclic permutation notation, the reflector first used by the German military, called
reflector A, was (AE)(BJ)(CM)(DZ)(FL)(GY)(HX)(IV)(KW)(NR)(OQ)(PU)(ST). A differently wired
reflector B was introduced later. Notice that all of the cycles in the permutation for reflector A
are in pairs (2-cycles) and no two 2-cycles have a letter in common. When cycles have nothing in
common, they are said to be disjoint.
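A quick check (illustrative code, not anything historical) confirms that reflector A pairs all 26 letters into 13 disjoint 2-cycles and fixes no letter:

```python
# Reflector A as a dictionary built from its 13 disjoint 2-cycles; applying
# it twice returns every letter to itself, and no letter maps to itself.
PAIRS = ["AE", "BJ", "CM", "DZ", "FL", "GY", "HX", "IV", "KW",
         "NR", "OQ", "PU", "ST"]
REFLECTOR_A = {}
for a, b in PAIRS:
    REFLECTOR_A[a], REFLECTOR_A[b] = b, a

print(len(REFLECTOR_A))  # 26 letters, covered by 13 disjoint 2-cycles
print(all(REFLECTOR_A[REFLECTOR_A[c]] == c != REFLECTOR_A[c]
          for c in REFLECTOR_A))  # True
```

The same involutory property, combined with the plugboard and the rotors' reversed return path, is what makes the whole machine self-reciprocal.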
After passing through the reflector, the electrical impulse passes back through the three rotors
(along a different path than the first time), through the wire connecting it to the plugboard,
through the plugboard itself, and finally to one of the lights above the keypad.
The composition of monoalphabetic substitutions is a monoalphabetic substitution, so, if
Enigma worked exactly as described above, it would not result in a strong encryption. What
makes the machine special is that the first rotor turns by one position every time a key is pressed,
changing the substitution that will be performed. The first two rotors have notches that cause the
next rotor to turn by one position for every 26 turns they make themselves.
Thus, it appears that after each of the three rotors has turned all of the way around—that is,
after (26)(26)(26) = 17,576 letters have been typed—the machine returns to its original rotor set-
ting. In this way, the Enigma would work like a Vigenère Cipher with 17,576 independently mixed
alphabets. Actually, although many authors have made this mistake,10 the period is not 17,576, but
rather a little smaller. Stephen Budiansky is one of the authors who got this right. In his excellent
history of World War II cryptanalysis, Battle of Wits, he explained:11
There are 26 × 26 × 26, or 17,576, different possible combinations of the three rotor
settings. However, in actual operation of the Enigma, the turnover mechanism causes
a “double stepping” to occur in the middle rotor: each time the middle rotor advances
to the position where it will trigger a turnover of the left rotor, it then immediately
advances again (along with the left rotor) as the next letter is typed in. If, for example,
the turnover occurs between E and F on the middle rotor and between V and W on the
right rotor, then an actual rotor sequence would be as follows:
ADU
ADV
AEW
BFX
BFY
BFZ
BFA
Thus the key length of the normal Enigma is actually 26 × 25 × 26, or 16,900. When
rotors with multiple turnover notches were later introduced, the key length was short-
ened even further.
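A minimal simulation of the stepping motion reproduces both the rotor sequence in the quoted example and the 16,900-press period. The notch positions here are the ones from Budiansky's example, not those of any particular historical rotor set:

```python
# A minimal model of Enigma rotor stepping with double stepping. The notch
# positions follow the quoted example: the middle rotor turns over between
# E and F, the right rotor between V and W.
MID_NOTCH, RIGHT_NOTCH = ord('E') - 65, ord('V') - 65

def step(left, mid, right):
    """Advance the rotor positions by one key press."""
    if mid == MID_NOTCH:          # middle rotor at its notch: it steps again,
        left, mid = (left + 1) % 26, (mid + 1) % 26  # carrying the left rotor
    elif right == RIGHT_NOTCH:    # right rotor at its notch: middle steps
        mid = (mid + 1) % 26
    right = (right + 1) % 26      # right rotor steps on every key press
    return left, mid, right

# Reproduce the sequence from the quoted example, starting at ADU.
state, seq = (0, 3, 20), []
for _ in range(7):
    seq.append(''.join(chr(65 + r) for r in state))
    state = step(*state)
print(seq)  # ['ADU', 'ADV', 'AEW', 'BFX', 'BFY', 'BFZ', 'BFA']

# The period: pressing keys returns the rotors to their starting positions
# after 26 * 25 * 26 = 16,900 presses, not 26^3 = 17,576.
state, count = (0, 0, 0), 0
while True:
    state, count = step(*state), count + 1
    if state == (0, 0, 0):
        break
print(count)  # 16900
```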
10 Even Gordon Welchman, a mathematician at Bletchley Park who worked on breaking Enigma, made this error
when he wrote about his work decades later! See Welchman, Gordon, The Hut Six Story, McGraw-Hill Book
Company, New York, p. 45 footnote.
11 Budiansky, Stephen, Battle of Wits: The Complete Story of Codebreaking in World War II, The Free Press, New
York, 2000.
$$\binom{26}{2p}(2p-1)(2p-3)(2p-5)\cdots(1) = \frac{26!}{(26-2p)!\,(2p)!}\cdot\frac{(2p)!}{p!\,2^p} = \frac{26!}{(26-2p)!\,p!\,2^p}$$
But the above result counts only the settings that use exactly p cables. Because p can be anywhere
between 0 and 13 inclusive, the total number of possible plugboard settings is
$$\sum_{p=0}^{13}\frac{26!}{(26-2p)!\,p!\,2^p} = 532{,}985{,}208{,}200{,}576$$
Originally, exactly six cables were used. So, in calculating the keyspace, some authors use the fig-
ure for six cables, instead of the much larger summation value given above. Later on, the number
of cables used varied.
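The plugboard counts quoted above can be checked directly (a quick verification, not anything the codebreakers needed):

```python
# Checking the plugboard counts: the number of ways to choose 2p of the 26
# letters and pair them up with p cables is 26! / ((26 - 2p)! * p! * 2^p).
from math import factorial

def plugboard(p):
    """Number of plugboard settings using exactly p cables."""
    return factorial(26) // (factorial(26 - 2 * p) * factorial(p) * 2**p)

print(plugboard(6))                          # 100391791500, for six cables
print(plugboard(13))                         # 7905853580625, all letters paired
print(sum(plugboard(p) for p in range(14)))  # 532985208200576
```

The p = 13 value is also the number of possible reflector wirings quoted later in this section, since a reflector pairs up all 26 contacts just as 13 cables would.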
The next factor is how the internal wiring connects the plugboard to the rotor assembly. There
are 26! ways to do this. We now come to the first rotor. There are 26! ways for a rotor to be wired.
We assume that the users will want to be able to reorder the rotors in the machine to get different
encipherments. Therefore, it makes sense for the rotors to be wired differently. If they are all wired
identically, reordering would have no effect. So, if each rotor is wired differently, we have (26!)
(26! – 1) (26! – 2) possibilities for the wirings.
This brings us to some common errors. It is tempting to insert other factors at this point hav-
ing to do with the possible orderings of the three rotors and the 26 positions each individual rotor
can be rotated to, but these have already been accounted for. To see this, imagine setting a rotor
in place and then turning it by one position. This is no different from having inserted another
rotor that is wired this way, and we already counted all 26! ways that the first rotor can be wired.
Similarly, having accounted for the possible wirings of each of the three rotors, rearranging them
simply counts these possibilities again. We do not want this duplication. To make a simple anal-
ogy, consider three mailboxes arranged side by side. If we have five letters, and can place only one
in each box, then there are five choices for the first box, four choices for the second box, and three
choices for the third box. We get a total of 5 × 4 × 3 = 60 possibilities. We do not then consider
rearranging the order of the letters in the boxes. For the Enigma, distinctly wired rotors take the
place of letters in this example and we have 26! of them instead of just five, but the argument
against counting rearrangements is unchanged.
The locations of the notches (ring setting) on the two fastest rotors (right and middle) deter-
mine when the next wheel turns, so they must be considered as part of the key. This gives us
another factor of (26) (26) = 676. The contact points of the reflector were wired together in
pairs, so the number of possible wirings is the same as for a plugboard with 13 cables, namely
7,905,853,580,625. To summarize, we have the following enumerations:
Plugboard Settings: 532,985,208,200,576
Wiring from Plugboard to Rotors: 403,291,461,126,605,635,584,000,000
Wiring of the Rotors: 65,592,937,459,144,468,297,405,473,480,371,753,615,896,841,298,988,710,328,553,805,190,043,271,168,000,000
Notch Position of Rotors: 676
Reflector Wiring: 7,905,853,580,625
The keyspace is obtained by multiplying all of the numbers above together to get 753,506,019,827,
465,601,628,054,269,182,006,024,455,361,232,867,996,259,038,139,284,671,620,842,209,198,
855,035,390,656,499,576,744,406,240,169,347,894,791,372,800,000,000,000,000.
This is ridiculously larger than is necessary to prevent a brute force attack.
Figure 7.10 Marian Rejewski (c. 1932), Jerzy Różycki, and Henryk Zygalski. (Rejewski photo-
graph Creative Commons Attribution-Share Alike 2.5 Generic license, Obtained from Marian
Rejewski’s daughter and published in commons under CC-BY-SA with her permission.)
Figure 7.11 A first day cover for the first stamp honoring codebreaking.
the wiring of the rotors. Enigma messages were intercepted by the Poles from July 15, 1928, when
they first went on the air (with Army messages), but the only progress that was made in the first
few years was the creation of a mathematical model of the machine. Later writers have continued
the use of the Poles’ notation, which follows:
S = plugboard permutation (the S stands for Steckerbrett, German for “plugboard”)
N = rightmost rotor permutation (this is the fast turning rotor)
M = middle rotor permutation
L = leftmost rotor permutation
R = reflector permutation
H = permutation representing the internal wiring from the plugboard to the entry point for
the set of rotors
228 ◾ Secret History
H:
Input: ABCDEFGHIJKLMNOPQRSTUVWXYZ
Output: JWULCMNOHPQZYXIRADKEGVBTSF
H−1:
Input: ABCDEFGHIJKLMNOPQRSTUVWXYZ
Output: QWERTZUIOASDFGHJKPYXCVBNML
Does it look more familiar now? H−1 almost matches our American QWERTY keyboards. In
fact, it matched the Enigma keyboard perfectly. Of course, the World War II cryptanalysts didn’t
have to look at different notations to see this pattern. The Poles had a commercial Enigma and
could simply observe, without using any notation at all, that the keys were connected, in order, to
the rotor system input. I chose to present it in this manner to illustrate that notation needs to fit
the circumstance and we can’t simply say one form is always superior.
The values of the reflector, called R now, and the three rotors, N, M, and L, were given
earlier in this chapter. Of course, the rotors could be used in any order, so we cannot simply say
N = Rotor I, for example. The Poles, however, didn’t know any of these permutations at first.
They had to derive them mathematically. Hence, all of the letters above represented unknowns
initially.
We need one more permutation to show the change that occurs every time the fast rotor
advances one position. Fortunately, this one is known and is given by
P = (ABCDEFGHIJKLMNOPQRSTUVWXYZ).
In order to encipher a message, the user set the machine up to the “daily key,” say HLD, by
turning the rotors so these are the three letters on top.12 He then selected another “session key”
(randomly, if he was following orders…), say EBW. Typing the session key twice might result in
the ciphertext GDOMEH. This would be sent at the start of the message. The intended recipient,
who knew the daily key and had set his Enigma to it, typed in GDOMEH to get out EBWEBW. Now
back to the encipherer. After he typed the session key twice, he reset his Enigma to the session key
and typed out the message. The intended recipient, having used the daily key setting to recover
the session key, also reset the machine to the session key and typed the rest of the ciphertext to
recover the original message. Clearly, the session key needn’t be typed twice. This redundancy was
intentionally introduced to ensure the session key was received correctly. It worked in that regard,
but it also proved to be a weakness.
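The indicator procedure just described can be sketched in a few lines of code. The model below is a toy (assumed code; the real machine is of course the Enigma itself): each position of the message gets its own self-inverse substitution, standing in for the machine at one rotor step.

```python
# A toy model (not from the book) of the indicator procedure.  Each message
# position i gets its own fixed-point-free self-inverse substitution, a
# stand-in for the Enigma at one rotor step under a given daily key.
import random

def involution(i):
    """A fixed-point-free self-inverse substitution for position i."""
    letters = list("ABCDEFGHIJKLMNOPQRSTUVWXYZ")
    random.Random(i).shuffle(letters)          # seeded: stand-in for the daily key
    table = {}
    for a, b in zip(letters[::2], letters[1::2]):
        table[a], table[b] = b, a
    return table

def type_at_daily_key(text):
    """Type text with the machine set to the daily key."""
    return "".join(involution(i)[c] for i, c in enumerate(text))

session = "EBW"
indicator = type_at_daily_key(session + session)    # session key typed twice
# The recipient, at the same daily-key setting, types the indicator back in and
# recovers the doubled session key, because each substitution is self-inverse.
assert type_at_daily_key(indicator) == session + session
```

The doubled indicator gives the recipient a built-in error check, and the final assertion shows why typing the ciphertext back in at the same setting undoes the encipherment.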
An example will illustrate how Marian Rejewski exploited the repeated session key.13 Consider
the following beginnings for ciphertexts using various session keys, but sent on the same day (thus
using the same daily key for the first six letters):14
12 If the rotors used displayed numbers, rather than letters, the user would set them using the correspondence
A = 1, B = 2, etc.
13 The method that follows has been adapted from Rejewski’s own explanation.
14 From Bauer, Friedrich L., Decrypted Secrets: Methods and Maxims of Cryptology, second edition, Springer, New
AUQ AMN IND JHU PVJ FEG SJM SPO WTM RAO
BNH CHL JWF MIC QGA LYB SJM SPO WTM RAO
BCT CGJ JWF MIC QGA LYB SJM SPO WTM RAO
CIK BZT KHB XJV RJL WPX SUG SMF WKI RKK
DDB VDV KHB XJV RJL WPX SUG SMF XRS GNM
EJP IPS LDR HDE RJL WPX TMN EBY XRS GNM
FBR KLE LDR HDE RJL WPX TMN EBY XOI GUK
GPB ZSV MAW UXP RFC WQQ TAA EXB XYW GCP
HNO THD MAW UXP SYX SCW USE NWH YPC OSQ
HNO THD NXD QTU SYX SCW VII PZK YPC OSQ
HXV TTI NXD QTU SYX SCW VII PZK ZZY YRA
IKG JKF NLU QFZ SYX SCW VQZ PVR ZEF YOC
IKG JKF OBU DLZ SYX SCW VQZ PVR ZSJ YWG
It will be convenient to let A, B, C, D, E, and F denote the permutations imposed upon the first,
second, third, fourth, fifth, and sixth letters typed, under the Enigma setting that yielded these
messages.
The first encrypted session key is AUQ AMN. Now looking at the first and fourth ciphertext
letters (which result from enciphering the same plaintext letter), we see that permutations A and
D both send the first letter of the session key to the ciphertext letter A. The first letter of the ses-
sion key is unknown, but we may denote it by α. We have A(α) = A and D(α) = A. Because the
Enigma is self-inverse, we also have A(A) = α and D(A) = α. Thus, if we form the composition of
the permutations, AD (read left to right—that is, perform A first, then D), we see that this new
permutation will send the letter A to A.
Thus, the permutation AD begins with (A). Continuing this argument with the second and
fourth encrypted session keys, BNH CHL and CIK BZT, we see that the permutation AD sends
B to C, and also sends C to B, so we now have AD = (A) (BC)…
Examining more session keys, we next get the longer cycles, (DVPFKXGZYO) and
(EIJMUNQLHT). Finally, we end the permutation with (RW) and (S). Putting it all together, with
the cycles in order of decreasing size, we have
AD = (DVPFKXGZYO) (EIJMUNQLHT) (BC) (RW) (A) (S)
Repeating this approach for the second and fifth letters and then once more for the third and
sixth letters we get two more permutations.
BE = (BLFQVEOUM) (HJPSWIZRN) (AXT) (CGY) (D) (K)
CF = (ABVIKTJGFCQNY) (DUZREHLXWPSMO)
Notice that the cycles making up AD have the lengths 10, 10, 2, 2, 1, and 1. For BE the cycle
lengths are 9, 9, 3, 3, 1, and 1. And for CF we have 13 and 13. The exact results for the lengths
of the cycles depended upon the daily key, but if a cycle of a given length is present, then another
cycle of that same length will also appear.15 Rejewski referred to the pattern in the cycle lengths as
the characteristic structure, or more briefly the characteristic for a given day.
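The bookkeeping above is entirely mechanical. The sketch below (hypothetical helper code, not from the book or from Rejewski) rebuilds AD, BE, and CF from the forty distinct message openings tabulated earlier:

```python
# Hypothetical helper code: rebuild the products AD, BE, and CF from a day's
# doubled session keys.  Positions 1/4, 2/5, and 3/6 of each opening encrypt
# the same unknown letter, and Enigma is self-inverse, so AD maps ciphertext
# letter 1 to ciphertext letter 4 (similarly for BE and CF).
openings = (
    "AUQAMN INDJHU PVJFEG SJMSPO WTMRAO BNHCHL JWFMIC QGALYB CIKBZT KHBXJV "
    "RJLWPX SUGSMF WKIRKK DDBVDV XRSGNM EJPIPS LDRHDE TMNEBY FBRKLE XOIGUK "
    "GPBZSV MAWUXP RFCWQQ TAAEXB XYWGCP HNOTHD SYXSCW USENWH YPCOSQ NXDQTU "
    "VIIPZK HXVTTI ZZYYRA IKGJKF NLUQFZ VQZPVR ZEFYOC OBUDLZ ZSJYWG BCTCGJ"
).split()

def product(i):
    """The permutation sending ciphertext letter i to letter i + 3 (i = 0, 1, 2)."""
    return {g[i]: g[i + 3] for g in openings}

def cycles(perm):
    """Disjoint cycles of a permutation given as a dict, as strings."""
    seen, out = set(), []
    for start in sorted(perm):
        if start in seen:
            continue
        cyc, x = "", start
        while x not in seen:
            seen.add(x)
            cyc += x
            x = perm[x]
        out.append(cyc)
    return out

for name, i in (("AD", 0), ("BE", 1), ("CF", 2)):
    cs = cycles(product(i))
    print(name, sorted(map(len, cs), reverse=True), cs)
```

Run on these openings, the sketch reports cycle lengths 10, 10, 2, 2, 1, 1 for AD; 9, 9, 3, 3, 1, 1 for BE; and 13, 13 for CF, reproducing the characteristic derived in the text.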
It would be more useful to know A, B, C, D, E, and F, instead of the products above, but we
cannot get them as easily. In fact, we will need to determine A, B, C, D, E, and F by factoring the
15 This is a consequence of the Enigma being self-inverse. The permutations A, B, C, D, E, and F must therefore
all be self-inverse and, hence, consist of disjoint 2-cycles. Any product of permutations that consists of disjoint
2-cycles will have a cycle structure with all lengths appearing in pairs.
230 ◾ Secret History
more readily available product permutations AD, BE and CF. Fortunately, there’s a nice formula
for doing so. That is, knowing the product AD we can quickly find A and D and similarly for
BE and CF. However, unlike factoring an integer as a product of primes, the factorizations here
won’t be unique. This non-uniqueness of factorization makes extra work for the cryptanalyst! For
instance, if XY = (AB)(CD), factorizations are given by
X = ( AD)(BC) and Y = (BD)( AC)
and
X = ( AC)(BD) and Y = (BC)( AD).
For larger pairs of cycles, there are more possible factorizations. In general,16 if XY = (x1x3x5…x2n−1)
(y2n…y6y4y2), then one factorization is
X = (x1y2)(x3y4)(x5y6)…(x2n−1y2n) and Y = (y2x3)(y4x5)…(y2n−2x2n−1)(y2nx1).
Expressing the cycle (x1x3x5…x2n−1) as (x3x5…x2n−1x1) and following the same rule gives a different
factorization. Because a cycle of length n can be expressed in n different ways, we’ll get n distinct
factorings. We can factor pairs of disjoint cycles having the same length independently and
then piece all such results together to get our “overall” factoring. This will be done below. When
Rejewski explained his work, he skipped over much of what follows by writing
We assume that thanks to the theorem on the product of transpositions, combined
with a knowledge of encipherers’ habits, we know separately the permutations A
through F.
It is hoped that what the following pages lose in terms of terseness, compared to Rejewski’s expla-
nation, is made up for in clarity.
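The factoring rule itself is easy to mechanize. The sketch below (an assumed implementation, not Rejewski's) takes a pair of equal-length cycles, the x-cycle as written and the y-cycle as it appears in the product, and returns all n factorizations as lists of transpositions:

```python
# Assumed helper: factor XY = (x1 x3 ... x_{2n-1})(y_{2n} ... y4 y2) into all
# n products of transpositions, one factorization per rotation of the x-cycle.
def factorizations(xcyc, ycyc):
    n = len(xcyc)
    assert n == len(ycyc)
    ys = ycyc[::-1]                      # y2, y4, ..., y_{2n} in order
    out = []
    for r in range(n):                   # rotate the x-cycle r places
        xs = xcyc[r:] + xcyc[:r]
        X = [frozenset((xs[i], ys[i])) for i in range(n)]
        Y = [frozenset((ys[i], xs[(i + 1) % n])) for i in range(n)]
        out.append((X, Y))
    return out

# The toy example from the text: XY = (AB)(CD) has exactly two factorizations,
# and they swap the roles of X and Y, just as shown above.
assert factorizations("AB", "CD") == [
    ([frozenset("AD"), frozenset("BC")], [frozenset("BD"), frozenset("AC")]),
    ([frozenset("BD"), frozenset("AC")], [frozenset("AD"), frozenset("BC")]),
]
```

Applied to the pair of 10-cycles in AD, the function returns the ten alternatives enumerated in the text.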
We start by factoring AD = (DVPFKXGZYO) (EIJMUNQLHT) (BC) (RW) (A) (S) by making
repeated use of our factoring rule, for each pair of cycles of equal length, with the example to guide us.
AD includes the 1-cycles (A) and (S) ⇒
(AS) is part of A and (SA) is part of D.
AD includes the 2-cycles (BC) and (RW) ⇒
(BR)(CW) is part of A and (RC)(WB) is part of D
or
(BW)(CR) is part of A and (WC)(RB) is part of D.
AD includes the 10-cycles (DVPFKXGZYO) and (EIJMUNQLHT) ⇒
(DT)(VH)(PL)(FQ)(KN)(XU)(GM)(ZJ)(YI)(OE) is part of A and
(TV)(HP)(LF)(QK)(NX)(UG)(MZ)(JY)(IO)(ED) is part of D
or
(DE)(VT)(PH)(FL)(KQ)(XN)(GU)(ZM)(YJ)(OI) is part of A and
(EV)(TP)(HF)(LK)(QX)(NG)(UZ)(MY)(JO)(ID) is part of D.
16 It should be noted that this result and other theorems from abstract algebra used in this chapter existed before
they were needed for cryptanalysis. The Poles didn’t have to invent or discover the mathematics; they only had
to know it and apply it.
World War II: The Enigma of Germany ◾ 231
or
(DI)(VE)(PT)(FH)(KL)(XQ)(GN)(ZU)(YM)(OJ) is part of A and
(IV)(EP)(TF)(HK)(LX)(QG)(NZ)(UY)(MO)(JD) is part of D.
or
(DJ)(VI)(PE)(FT)(KH)(XL)(GQ)(ZN)(YU)(OM) is part of A and
(JV)(IP)(EF)(TK)(HX)(LG)(QZ)(NY)(UO)(MD) is part of D.
or
(DM)(VJ)(PI)(FE)(KT)(XH)(GL)(ZQ)(YN)(OU) is part of A and
(MV)(JP)(IF)(EK)(TX)(HG)(LZ)(QY)(NO)(UD) is part of D.
or
(DU)(VM)(PJ)(FI)(KE)(XT)(GH)(ZL)(YQ)(ON) is part of A and
(UV)(MP)(JF)(IK)(EX)(TG)(HZ)(LY)(QO)(ND) is part of D.
or
(DN)(VU)(PM)(FJ)(KI)(XE)(GT)(ZH)(YL)(OQ) is part of A and
(NV)(UP)(MF)(JK)(IX)(EG)(TZ)(HY)(LO)(QD) is part of D.
or
(DQ)(VN)(PU)(FM)(KJ)(XI)(GE)(ZT)(YH)(OL) is part of A and
(QV)(NP)(UF)(MK)(JX)(IG)(EZ)(TY)(HO)(LD) is part of D.
or
(DL)(VQ)(PN)(FU)(KM)(XJ)(GI)(ZE)(YT)(OH) is part of A and
(LV)(QP)(NF)(UK)(MX)(JG)(IZ)(EY)(TO)(HD) is part of D.
or
(DH)(VL)(PQ)(FN)(KU)(XM)(GJ)(ZI)(YE)(OT) is part of A and
(HV)(LP)(QF)(NK)(UX)(MG)(JZ)(IY)(EO)(TD) is part of D.
With 10 options for the pair of 10-cycles, 2 for the 2-cycles, and 1 for the 1-cycles, there are
(10)(2)(1) = 20 possible factorizations of AD.
We factor BE = (BLFQVEOUM) (HJPSWIZRN) (AXT) (CGY) (D) (K) in the same manner.
BE includes the 1-cycles (D) and (K) ⇒
(DK) is part of B and (KD) is part of E.
BE includes the 3-cycles (AXT) and (CGY) ⇒
(AG)(XC)(TY) is part of B and (CT)(GX)(YA) is part of E
or
(AC)(XY)(TG) is part of B and (CX)(YT)(GA) is part of E
or
(AY)(XG)(TC) is part of B and (YX)(GT)(CA) is part of E.
BE includes the 9-cycles (BLFQVEOUM) and (HJPSWIZRN) ⇒ nine possible factorizations, one for
each rotation of the first cycle. One of them is
(BR)(LZ)(FI)(QW)(VS)(EP)(OJ)(UH)(MN) is part of B and
(RL)(ZF)(IQ)(WV)(SE)(PO)(JU)(HM)(NB) is part of E.
Mixing and matching the choices we have for the decomposition of the 9-cycle and the 3-cycle
to go with our only option for the 1-cycle yields 27 possible decompositions for BE.
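The count of possibilities for a whole day follows directly from the characteristic: each pair of k-cycles contributes a factor of k. A quick calculation (a sketch, with the cycle-length pairs of our example hard-coded):

```python
# Sketch: each pair of k-cycles in a product admits k factorizations, so the
# day's characteristic alone fixes the size of the search space.
from math import prod

AD_pair_lengths = [10, 2, 1]   # AD's paired cycle lengths
BE_pair_lengths = [9, 3, 1]    # BE's paired cycle lengths
CF_pair_lengths = [13]         # CF's paired cycle lengths

total = prod(AD_pair_lengths) * prod(BE_pair_lengths) * prod(CF_pair_lengths)
print(total)  # 7020
```

This 20 × 27 × 13 = 7,020 is the number of candidate solutions mentioned later in connection with the psychological method.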
We also have CF = (ABVIKTJGFCQNY)(DUZREHLXWPSMO).
CF includes only 13-cycles, so we have one of the following:
C = (AO)(BM)(VS)(IP)(KW)(TX)(JL)(GH)(FE)(CR)(QZ)(NU)(YD)
F = (OB)(MV)(SI)(PK)(WT)(XJ)(LG)(HF)(EC)(RQ)(ZN)(UY)(DA)
or
C = (AD)(BO)(VM)(IS)(KP)(TW)(JX)(GL)(FH)(CE)(QR)(NZ)(YU)
F = (DB)(OV)(MI)(SK)(PT)(WJ)(XG)(LF)(HC)(EQ)(RN)(ZY)(UA)
or
C = (AU)(BD)(VO)(IM)(KS)(TP)(JW)(GX)(FL)(CH)(QE)(NR)(YZ)
F = (UB)(DV)(OI)(MK)(ST)(PJ)(WG)(XF)(LC)(HQ)(EN)(RY)(ZA)
or
C = (AZ)(BU)(VD)(IO)(KM)(TS)(JP)(GW)(FX)(CL)(QH)(NE)(YR)
F = (ZB)(UV)(DI)(OK)(MT)(SJ)(PG)(WF)(XC)(LQ)(HN)(EY)(RA)
or
C = (AR)(BZ)(VU)(ID)(KO)(TM)(JS)(GP)(FW)(CX)(QL)(NH)(YE)
F = (RB)(ZV)(UI)(DK)(OT)(MJ)(SG)(PF)(WC)(XQ)(LN)(HY)(EA)
or
C = (AE)(BR)(VZ)(IU)(KD)(TO)(JM)(GS)(FP)(CW)(QX)(NL)(YH)
F = (EB)(RV)(ZI)(UK)(DT)(OJ)(MG)(SF)(PC)(WQ)(XN)(LY)(HA)
or
C = (AH)(BE)(VR)(IZ)(KU)(TD)(JO)(GM)(FS)(CP)(QW)(NX)(YL)
F = (HB)(EV)(RI)(ZK)(UT)(DJ)(OG)(MF)(SC)(PQ)(WN)(XY)(LA)
or
C = (AL)(BH)(VE)(IR)(KZ)(TU)(JD)(GO)(FM)(CS)(QP)(NW)(YX)
F = (LB)(HV)(EI)(RK)(ZT)(UJ)(DG)(OF)(MC)(SQ)(PN)(WY)(XA)
or
C = (AX)(BL)(VH)(IE)(KR)(TZ)(JU)(GD)(FO)(CM)(QS)(NP)(YW)
F = (XB)(LV)(HI)(EK)(RT)(ZJ)(UG)(DF)(OC)(MQ)(SN)(PY)(WA)
or
C = (AW)(BX)(VL)(IH)(KE)(TR)(JZ)(GU)(FD)(CO)(QM)(NS)(YP)
F = (WB)(XV)(LI)(HK)(ET)(RJ)(ZG)(UF)(DC)(OQ)(MN)(SY)(PA)
or
C = (AP)(BW)(VX)(IL)(KH)(TE)(JR)(GZ)(FU)(CD)(QO)(NM)(YS)
F = (PB)(WV)(XI)(LK)(HT)(EJ)(RG)(ZF)(UC)(DQ)(ON)(MY)(SA)
or
C = (AS)(BP)(VW)(IX)(KL)(TH)(JE)(GR)(FZ)(CU)(QD)(NO)(YM)
F = (SB)(PV)(WI)(XK)(LT)(HJ)(EG)(RF)(ZC)(UQ)(DN)(OY)(MA)
or
C = (AM)(BS)(VP)(IW)(KX)(TL)(JH)(GE)(FR)(CZ)(QU)(ND)(YO)
F = (MB)(SV)(PI)(WK)(XT)(LJ)(HG)(EF)(RC)(ZQ)(UN)(DY)(OA)
Here, once again, are the intercepted openings:
AUQ AMN IND JHU PVJ FEG SJM SPO WTM RAO
BNH CHL JWF MIC QGA LYB SJM SPO WTM RAO
BCT CGJ JWF MIC QGA LYB SJM SPO WTM RAO
CIK BZT KHB XJV RJL WPX SUG SMF WKI RKK
DDB VDV KHB XJV RJL WPX SUG SMF XRS GNM
EJP IPS LDR HDE RJL WPX TMN EBY XRS GNM
FBR KLE LDR HDE RJL WPX TMN EBY XOI GUK
GPB ZSV MAW UXP RFC WQQ TAA EXB XYW GCP
HNO THD MAW UXP SYX SCW USE NWH YPC OSQ
HNO THD NXD QTU SYX SCW VII PZK YPC OSQ
HXV TTI NXD QTU SYX SCW VII PZK ZZY YRA
IKG JKF NLU QFZ SYX SCW VQZ PVR ZEF YOC
IKG JKF OBU DLZ SYX SCW VQZ PVR ZSJ YWG
To recover the first enciphered session key, AUQ AMN, we begin with the first character, A.
Because it is in the first position, it was enciphered by permutation A. Permutation A contains the
swap (AS), so the ciphertext letter A becomes plaintext S. The second cipher letter is U, so we look
for U in permutation B and find the swap (UJ). Thus, U deciphers to J. Finally, we look up Q in
permutation C. We find the swap (QZ), so ciphertext Q deciphers to Z. We’ve now recovered the
session key SJZ. The three remaining ciphertext letters, AMN, serve to check our result. We look
for A, M, and N in permutations D, E, and F, respectively, and the swaps once again give us SJZ.
Thus, we are confident that, if the permutations are correct, the first session key is SJZ.
Deciphering the entire list of session keys with our proposed permutations, A through F, gives
SJZ SJZ YBY YBY LWL LWL AUB AUB CCB CCB
RBG RBG ZVE ZVE FXO FXO AUB AUB CCB CCB
RTX RTX ZVE ZVE FXO FXO AUB AUB CCB CCB
WQW WQW NMM NMM BUJ BUJ AJH AJH CDP CDP
TKM TKM NMM NMM BUJ BUJ AJH AJH ULV ULV
OUI OUI PKC PKC BUJ BUJ DHU DHU ULV ULV
QNC QNC PKC PKC BUJ BUJ DHU DHU UPP UPP
MOM MOM GYK GYK BZR BZR DYO DYO UAK UAK
VBA VBA GYK GYK AAT AAT XEF XEF IOR IOR
VBA VBA KGY KGY AAT AAT HQP HQP IOR IOR
VGS VGS KGY KGY AAT AAT HQP HQP JFD JFD
YDH YDH KRN KRN AAT AAT HIQ HIQ JSE JSE
YDH YDH ENN ENN AAT AAT HIQ HIQ JEL JEL
This is the first of the two potential solutions you are asked to consider.
Now, consider another possible factorization (option 2):
A = (AS)(BR)(CW)(DI)(VE)(PT)(FH)(KL)(XQ)(GN)(ZU)(YM)(OJ)
B = (DK)(AY)(XG)(TC)(BJ)(LH)(FN)(QR)(VZ)(EI)(OW)(US)(MP)
C = (AX)(BL)(VH)(IE)(KR)(TZ)(JU)(GD)(FO)(CM)(QS)(NP)(YW)
D = (SA)(RC)(WB)(IV)(EP)(TF)(HK)(LX)(QG)(NZ)(UY)(MO)(JD)
E = (KD)(GT)(YX)(CA)(JL)(HF)(NQ)(RV)(ZE)(IO)(WU)(SM)(PB)
F = (XB)(LV)(HI)(EK)(RT)(ZJ)(UG)(DF)(OC)(MQ)(SN)(PY)(WA)
With these selections, the session keys decipher to
SSS SSS DFG DFG TZU TZU ABC ABC CCC CCC
RFV RFV OOO OOO XXX XXX ABC ABC CCC CCC
RTZ RTZ OOO OOO XXX XXX ABC ABC CCC CCC
WER WER LLL LLL BBB BBB ASD ASD CDE CDE
IKL IKL LLL LLL BBB BBB ASD ASD QQQ QQQ
VBN VBN KKK KKK BBB BBB PPP PPP QQQ QQQ
HJK HJK KKK KKK BBB BBB PPP PPP QWE QWE
NML NML YYY YYY BNM BNM PYX PYX QAY QAY
FFF FFF YYY YYY AAA AAA ZUI ZUI MMM MMM
FFF FFF GGG GGG AAA AAA EEE EEE MMM MMM
FGH FGH GGG GGG AAA AAA EEE EEE UVW UVW
DDD DDD GHJ GHJ AAA AAA ERT ERT UIO UIO
DDD DDD JJJ JJJ AAA AAA ERT ERT UUU UUU
Before you continue reading, stop and ponder the lists of session keys arising from factorization
option 1 and option 2. Which of these was more likely to have been chosen by the Nazis using the
machines? The answer follows below.
Recall that the Nazis operating these machines were told to select their session keys randomly.
Well, despite what some war criminals claimed at the Nuremberg trials, Nazis didn’t always follow
orders. The second set of session keys is the correct one, and these keys are far from random. For
example, there are many triplets, such as AAA and KKK.
Looking at the layout of an Enigma keyboard reveals the inspiration for other keys.
Q W E R T Z U I O
A S D F G H J K
P Y X C V B N M L
Of the 65 keys, 39 use a triplet, 18 read along a row of keys, 3 read down a diagonal on the
keyboard (RFV, IKL, QAY), and 5 use three consecutive letters (one of which, CDE, happens to
also be up a diagonal on the keyboard). Every single session key had some pattern! By contrast, in
option 1 we can’t find any of these patterns.
This is what Rejewski meant by “knowledge of encipherers’ habits.” The patterns he looked for
did not have to be of the type shown above. Enigma operators sometimes used a person’s initials,
letters connected to a girlfriend’s name, or something else that could be guessed. Taking advan-
tage of this human tendency toward the nonrandom is often referred to as using the psychological
method.
With this method, the session keys from the correct factorization will stand out from a large
group of possibilities just as clearly as from the pair presented above. Having to consider 7,020
possibilities may sound painful, but the correct answer will be found, on average, after 3,510
attempts (half the total), and the problem is perfect for parallel processing. Of course, during
World War II, this meant many people working simultaneously on separate possibilities. This is
why cryptanalytic groups often had a large number of clerks! With 100 people working on this
problem, a single person would have only 35 possibilities to check on average. Also, to speed things
up, one could move on to the next possibility if, say, the first five session keys all fail to fit any of
the expected forms. But due to security concerns, the Poles didn’t adopt the parallel processing
approach. It should be noted that the number of factorization possibilities varies depending on the
characteristic for the set of messages. Some days there were more possibilities to investigate and
some days fewer.
Rejewski presented the correct solution in a manner that looks a bit different, but recall that
disjoint cycles commute and 2-cycles may be written in the form (XY) or (YX). Rejewski’s repre-
sentation of permutations A through F follows:
A = (AS)(BR)(CW)(DI)(EV)(FH)(GN)(JO)(KL)(MY)(PT)(QX)(UZ)
B = (AY)(BJ)(CT)(DK)(EI)(FN)(GX)(HL)(MP)(OW)(QR)(SU)(VZ)
C = (AX)(BL)(CM)(DG)(EI)(FO)(HV)(JU)(KR)(NP)(QS)(TZ)(WY)
D = (AS)(BW)(CR)(DJ)(EP)(FT)(GQ)(HK)(IV)(LX)(MO)(NZ)(UY)
E = (AC)(BP)(DK)(EZ)(FH)(GT)(IO)(JL)(MS)(NQ)(RV)(UW)(XY)
F = (AW)(BX)(CO)(DF)(EK)(GU)(HI)(JZ)(LV)(MQ)(NS)(PY)(RT)
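Once the correct permutations are in hand, applying them is mechanical. The sketch below (assumed helper code, not from the book) deciphers a doubled session key using the swaps Rejewski listed above:

```python
# Assumed helper code: decipher a doubled session key with Rejewski's
# recovered self-inverse permutations A through F, given as lists of swaps.
swaps = {
    "A": "AS BR CW DI EV FH GN JO KL MY PT QX UZ",
    "B": "AY BJ CT DK EI FN GX HL MP OW QR SU VZ",
    "C": "AX BL CM DG EI FO HV JU KR NP QS TZ WY",
    "D": "AS BW CR DJ EP FT GQ HK IV LX MO NZ UY",
    "E": "AC BP DK EZ FH GT IO JL MS NQ RV UW XY",
    "F": "AW BX CO DF EK GU HI JZ LV MQ NS PY RT",
}

def perm(name):
    """Build the involution's lookup table from its list of swaps."""
    table = {}
    for pair in swaps[name].split():
        a, b = pair
        table[a], table[b] = b, a
    return table

def session_key(opening):
    """Decipher a doubled session key such as 'AUQAMN'."""
    first = "".join(perm(p)[c] for p, c in zip("ABC", opening[:3]))
    check = "".join(perm(p)[c] for p, c in zip("DEF", opening[3:]))
    assert first == check, "permutations inconsistent with this intercept"
    return first

print(session_key("AUQAMN"))  # SSS
```

The built-in check uses the doubling: letters four through six must decipher, via D, E, and F, to the same key that A, B, and C produced from letters one through three.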
If only the rightmost rotor turns while enciphering the session key, then we have the following.18
A = SHPNP−1QPN−1P−1H−1S−1
B = SHP2NP−2QP2N−1P−2H−1S−1
C = SHP3NP−3QP3N−1P−3H−1S−1
D = SHP4NP−4QP4N−1P−4H−1S−1
E = SHP5NP−5QP5N−1P−5H−1S−1
F = SHP6NP−6QP6N−1P−6H−1S−1
18 Rejewski, in explaining his work, sometimes wrote the first equation without the necessary P−1 in position 5
and P in position 11. It seems he didn’t want to confuse the reader with too many details. Later on he includes
it (when he’s being more detailed). Christensen followed this style in his excellent paper.
Because there are six equations and only four unknowns (S, H, N, and Q), we ought to be able
to find a solution.
Figure 7.12 Hans Thilo Schmidt. (From the David Kahn Collection, National Cryptologic
Museum, Fort Meade, Maryland.)
H−1S−1ASH = PNP−1QPN−1P−1
H−1S−1BSH = P2NP−2QP2N−1P−2
H−1S−1CSH = P3NP−3QP3N−1P−3
H−1S−1DSH = P4NP−4QP4N−1P−4
H−1S−1ESH = P5NP−5QP5N−1P−5
H−1S−1FSH = P6NP−6QP6N−1P−6
All of the permutations on the left side of the above equalities were believed to be known.
19 Rejewski, Marian, “Mathematical Solution of the Enigma Cipher,” Cryptologia, Vol. 6, No. 1, January 1982,
pp. 1–18, p. 8 cited here.
A quick definition: If we start with a permutation A and use another permutation P to form
the product P−1AP, we say that A has been transformed by the action of P.
Now take the six equations above and transform both sides by the action of P, P2, P3, P4, P5,
and P6, respectively. We label the results with new letters for convenience:
U = P−1(H−1S−1ASH)P = NP−1QPN−1
V = P−2(H−1S−1BSH)P2 = NP−2QP2N−1
W = P−3(H−1S−1CSH)P3 = NP−3QP3N−1
X = P−4(H−1S−1DSH)P4 = NP−4QP4N−1
Y = P−5(H−1S−1ESH)P5 = NP−5QP5N−1
Z = P−6(H−1S−1FSH)P6 = NP−6QP6N−1
Now form the products UV, VW, WX, XY, and YZ. Removing the parentheses, the N−1N in the
middle drops out of each equation. We then combine the powers of P that appear next to one
another and insert a new pair of parentheses (to stress a portion that all the equations share) and
get the following.
UV = NP−1(QP−1QP)PN−1
VW = NP−2(QP−1QP)P2N−1
WX = NP−3(QP−1QP)P3N−1
XY = NP−4(QP−1QP)P4N−1
YZ = NP−5(QP−1QP)P5N−1
Now take the first four equations above and transform both sides of each by the action of NPN−1:
NP−1N−1(UV)NPN−1 = NP−1N−1 NP−1(QP−1QP)PN−1 NPN−1
NP−1N−1(VW)NPN−1 = NP−1N−1 NP−2(QP−1QP)P2N−1 NPN−1
NP−1N−1(WX)NPN−1 = NP−1N−1 NP−3(QP−1QP)P3N−1 NPN−1
NP−1N−1(XY)NPN−1 = NP−1N−1 NP−4(QP−1QP)P4N−1 NPN−1
Canceling the N−1N pairs and combining adjacent powers of P shows that the right sides are
simply VW, WX, XY, and YZ, respectively.
Returning to our example,
UV = (AEPFTYBSNIKOD) (RHCGZMUVQWLJX)
VW = (AKJCEVZYDLWNU) (SMTFHQIBXOPGR)
Writing VW beneath UV and reading off the permutation T that transforms UV into VW (T
standing in for NP−1N−1) gives
T = (A)(EKWONDUILPJGFCT)(YVBZHMQXRS)
The only problem is that UV could be written, switching the order of the 13-cycles, as
UV = (RHCGZMUVQWLJX) (AEPFTYBSNIKOD)
and this will give a different result. That is, the solution to the problem of finding a permutation
T such that T−1(UV)T = VW is not unique. Indeed, beginning one of the 13-cycles that make up
UV with a different letter also changes the solution.
So, the first equation will give us dozens of possibilities for NP−1N−1 (recall that T was taking
the place of NP−1N−1 in the discussion above). Exactly how many solutions exist depends on the
cycle structure of UV.
In any case, each of the equations below will offer various possibilities for NP−1N−1.
NP−1N−1(UV)NPN−1 = VW
NP−1N−1(VW)NPN−1 = WX
NP−1N−1(WX)NPN−1 = XY
NP−1N−1(XY)NPN−1 = YZ
However, there will only be one possibility suggested repeatedly—this is the one we take. We
won’t even need all four equations. We can simply find all solutions given by the first two equa-
tions above and take the one that arises twice.
Continuing with our example, we have
UV = (AEPFTYBSNIKOD) (RHCGZMUVQWLJX)
VW = (AKJCEVZYDLWNU) (SMTFHQIBXOPGR)
WX = (AQVLOIKGNWBMC) (PUZFTJRYEHXDS)
So we write VW under UV all possible ways (13 × 13 × 2 = 338 of them!)20 and see what permu-
tations they give, and we also write WX under VW all possible ways, and then look for a match.
For example,
VW = (AKJCEVZYDLWNU) (SMTFHQIBXOPGR)
WX = (AQVLOIKGNWBMC) (PUZFTJRYEHXDS)
gives
(A)(KQJVIRSPXEOHTZ) (CLWBYGDNMU)(F)
AYURICXQMGOVSKEDZPLFWTNJHB ⇒ ABCDEFGHIJKLMNOPQRSTUVWXYZ
ABCDEFGHIJKLMNOPQRSTUVWXYZ AZFPOTJYEXNSIWKRHDMVCLUGBQ
AYURICXQMGOVSKEDZPLFWTNJHB ⇒ ABCDEFGHIJKLMNOPQRSTUVWXYZ
BCDEFGHIJKLMNOPQRSTUVWXYZA BAGQPUKZFYOTJXLSIENWDMVHCR
AYURICXQMGOVSKEDZPLFWTNJHB ⇒ ABCDEFGHIJKLMNOPQRSTUVWXYZ
CDEFGHIJKLMNOPQRSTUVWXYZAB CBHRQVLAGZPUKYMTJFOXENWIDS
⋮
AYURICXQMGOVSKEDZPLFWTNJHB ⇒ ABCDEFGHIJKLMNOPQRSTUVWXYZ
ZABCDEFGHIJKLMNOPQRSTUVWXY ZYEONSIXDWMRHVJQGCLUBKTFAP
One of the 26 possibilities indicated above will be N, the wiring of the fast rotor.
20 The factors of 13 come from the fact that either 13-cycle can be expressed beginning with any of its 13 letters.
The factor of 2 comes from the choice as to which of the two 13-cycles we write first.
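The alignment procedure is easy to automate. The sketch below (assumed helpers, not Rejewski's notation) enumerates every way of writing VW's cycles under UV's cycles; each alignment defines one candidate permutation transforming UV into VW:

```python
# Assumed helpers: enumerate every alignment of VW's cycles under UV's cycles.
# Each alignment defines a candidate T transforming UV into VW, in the sense
# of the definition above: VW sends T(a) to T(UV(a)).
UV = ["AEPFTYBSNIKOD", "RHCGZMUVQWLJX"]
VW = ["AKJCEVZYDLWNU", "SMTFHQIBXOPGR"]

def candidates(top, bottom):
    n = len(top[0])
    for order in (bottom, bottom[::-1]):       # which cycle is written first
        for r1 in range(n):                    # rotation of the first cycle
            for r2 in range(n):                # rotation of the second cycle
                below = [order[0][r1:] + order[0][:r1],
                         order[1][r2:] + order[1][:r2]]
                yield dict(zip(top[0] + top[1], below[0] + below[1]))

def step(cycles, letter):
    """Apply a permutation given in cycle notation to one letter."""
    for c in cycles:
        if letter in c:
            return c[(c.index(letter) + 1) % len(c)]
    return letter

sols = list(candidates(UV, VW))
print(len(sols))  # 338, the count from footnote 20
# Every alignment really does transform UV into VW:
assert all(T[step(UV, a)] == step(VW, T[a]) for T in sols for a in T)
```

Repeating the enumeration for VW and WX, and keeping only the candidates common to both, is the "take the one that arises twice" step described above.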
The attack described above was based on Rejewski’s description. He didn’t use real data, so,
although the approach works, the rotor possibilities we end up with don’t match any that were
actually used.
In any case, the work we just went through only provides the wiring for one rotor—the right-
most and fastest rotor. The Germans, however, changed the order of the rotors in the machine, as
part of the key, every three months, and the settings supplied by Schmidt straddled two quar-
ters. Happily, two different rotors fell in the rightmost position in these two quarters and both
were recovered by the method detailed above.
There was some more work involved, as the wiring of the third rotor and the reflector still had
to be recovered. But the above gives the flavor of the work. It should also be noted that we can-
not narrow down the possibilities for each rotor individually. We must look at them as a group to
decide which are correct.21
After everything was determined, and the Poles were expecting to be able to read messages,
only gibberish appeared! Something was wrong. Finally, Rejewski turned to the wiring from the
plugboard to the rotor entry points. Perhaps it wasn’t the same as in the commercial Enigma the
Poles had acquired. So, what complex mathematical machinery did Rejewski apply to this new
problem? None—he guessed. Maybe H was simply
Input: ABCDEFGHIJKLMNOPQRSTUVWXYZ
Output: ABCDEFGHIJKLMNOPQRSTUVWXYZ
The guess was correct, and with it the Polish reconstruction of the military Enigma was complete.
A far less accurate account of how the Allies came to possess Enigma appears in William
Stevenson’s A Man Called Intrepid:
The new Enigmas were being delivered to frontier units, and in early 1939 a military
truck containing one was ambushed. Polish agents staged an accident in which fire
destroyed the evidence. German investigators assumed that some charred bits of coils,
springs, and rotors were the remains of the real Enigma.22
The author does mention the star of this chapter, sort of; he refers to “Mademoiselle Marian
Rejewski.” As Richard A. Woytak noted, Stevenson thereby managed, “with masterful economy
of expression,” to get wrong both Rejewski’s sex and marital status.23
That was a lot easier than wrapping your head around the mathematics, wasn’t it? Everyone
knows you can’t believe everything you read on the internet, but I really don’t see how it’s any dif-
ferent from the print world.
Thus, the Poles, having been given the daily keys, were able to reconstruct the Enigma machine.
Eventually the keys expired and the Poles had to face the opposite problem: Having the machine,
how could the daily keys be recovered?
21 Rejewski, Marian, translated by Christopher Kasparek, “Mathematical Solution of the Enigma Cipher,”
Cryptologia, Vol. 6, No. 1, pp. 1–18, p. 11 states “But those details may only be established following the basic
reconstruction of the connections in all the rotors.”
22 Stevenson, William, A Man Called Intrepid, Harcourt Brace Jovanovich, New York, 1976, p. 49. Or see the
Cryptologia, Vol. 6, No.1, January 1982, pp. 75–83. This piece was translated into English by Christopher
Kasparek and contains a prefatory note by Richard A. Woytak.
The Poles observed that as the settings on Enigma were changed from day to day, the disjoint
cycle structures of the products AD, BE, and CF also changed. That is, the cycle structure is
determined by the order of the rotors and their initial positions. The ring settings do not affect it
and can therefore be ignored. If the
and their initial positions. The ring settings do not affect it and can therefore be ignored. If the
Poles could build a catalog showing the correspondence between the rotor settings and the disjoint
cycle structures, then the latter, when recovered from a set of intercepted messages, would tell
them how to set up their bootleg Enigmas to decipher the intercepts. But how large would this
catalog be? Would its creation even be possible?
There are 6 ways to order the three rotors and 26³ = 17,576 ways to select a daily key to determine
their initial positions. Thus, the total number of possibilities comes to (6)(17,576) = 105,456.
A catalog with this many entries would take some time to create, but it could be done. However,
there is also the plugboard. We saw in Section 7.3 that there are 532,985,208,200,576 possible
plugboard settings. Having a factor of this size in the calculation of the catalog size would make
constructing it impossible. This brings us to what is sometimes called “The Theorem that Won the
War.” It can be stated tersely as:
Conjugate permutations have the same disjoint cycle structure.
Recall that if P is a permutation and C is some other permutation, then P and the product C−1PC
are conjugate permutations. That is, we get such a pair by multiplying the original on one side by
a permutation C and on the other side by the inverse of that permutation, C−1. The fact that this
does not change the cycle structure can be stated as follows.
If P and C are permutations and P(α) = β, then the permutation C−1PC sends C(α) to
C(β). Hence, P and C−1PC have the same disjoint cycle structure.24
24 A proof of this theorem is provided in Rejewski, Marian, Memories of my work at the Cipher Bureau of the
General Staff Second Department 1930–1945, Adam Mickiewicz University, Poznań, Poland, 2011.
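The theorem is easy to test numerically. The sketch below (assumed code, not a proof) conjugates random permutations of the alphabet and confirms that the multiset of cycle lengths never changes:

```python
# A quick numerical check of the theorem: a permutation P and its conjugate
# C^-1 P C always share the same multiset of disjoint-cycle lengths, no matter
# which permutation C is chosen.
import random

ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"

def random_perm(rng):
    letters = list(ALPHABET)
    rng.shuffle(letters)
    return dict(zip(ALPHABET, letters))

def cycle_lengths(perm):
    seen, lengths = set(), []
    for start in ALPHABET:
        if start in seen:
            continue
        x, n = start, 0
        while x not in seen:
            seen.add(x)
            x, n = perm[x], n + 1
        lengths.append(n)
    return sorted(lengths)

rng = random.Random(0)
for _ in range(100):
    P, C = random_perm(rng), random_perm(rng)
    C_inv = {v: k for k, v in C.items()}
    # reading left to right, as in the text: apply C^-1, then P, then C
    conjugate = {a: C[P[C_inv[a]]] for a in ALPHABET}
    assert cycle_lengths(P) == cycle_lengths(conjugate)
print("conjugates matched in all 100 trials")
```

Since the plugboard S enters the encryption exactly as such a C does, the check illustrates why the plugboard can be ignored when building the catalog.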
The plugboard, represented by S, acts on the rest of the Enigma encryption by conjugation. We can
see this by looking again at the mathematical model the Poles built of Enigma. A set of parentheses
is included around the letters representing the permutations caused by non-plugboard components:
A = S(HPNP−1QPN−1P−1H−1)S−1
We have S on one side and S−1 on the other.25 Thus, the plugboard doesn’t alter cycle structures, no
matter how it is wired. It will change what letters are involved in each cycle, but not the lengths
of the cycles. So, the plugboard can be ignored when building the catalog! The catalog only needs
to contain the 105,456 entries corresponding to the order and initial positions of the three rotors.
To create the catalog, the Poles could have set their new bootleg military Enigma to a particular
key and enciphered a session key twice, noting the result, then reset the key and enciphered another ses-
sion key twice, and so on. After obtaining dozens of these enciphered session keys, Rejewski’s method,
described in Section 7.4, could be applied to yield the permutations AD, BE, and CF, and thus deter-
mine their cycle structure; however, to do this 105,456 times would be very time-consuming.
Instead, to create the desired catalog, the Poles made a machine called a cyclometer, depicted
in Figure 7.13.
Figure 7.13 The Polish cyclometer. (Illustration by Dan Meredith from Christensen, Chris,
Mathematics Magazine, Vol. 80, No. 4, 2007, p. 260.)
25 Note: It doesn’t matter if we have S on the left and S−1 on the right or S−1 on the left and S on the right. All that
matters is that the permutations we have on either side are inverses of each other. The labeling is arbitrary anyway.
This time-saving device consisted of two rotor sets (with reflectors). As you may have already
guessed, one represents A and the other D (or B and E, C and F, whichever permutation is being
investigated at that moment). A charge can be applied to any of the letters and the current will
flow through the set of rotors representing permutation A and then through the rotors repre-
senting permutation D; the letter that comes out will be illuminated, but this is not the end.
The charge continues through A and D again and may light another letter, if the cycle is not
yet complete. The handle on the left side of the front of the cyclometer is to control the amount
of current.26 If the amount of current needed to clearly light the bulbs for all of the letters in a
large cycle were used for a very short cycle, it might burn out the filaments. Hence, the operator
should start out low, and increase the current only if it looks like enough bulbs are lighting to
avoid damage.
Naturally it took some time to construct the cyclometer, but it allowed the catalog to be
built much more quickly than by applying the method described in Section 7.4. The original
catalog no longer exists, but Alex Kuhl, an undergraduate at Northern Kentucky University
at the time, reconstructed it. 27 Naturally, he used a personal computer rather than doing it by
hand with a cyclometer. Although the cyclometer method was quicker than a mathematical
analysis, it still took over a year for the Poles to complete their catalog. I find it amazing that
more people were not assigned to this extremely important work in order to speed its comple-
tion. It was a task extremely well suited to what we would now call parallel processing. Twice
as many workers could have completed the job in half the time, and so on for three, four, etc.
times as many workers. Once the cyclometers were made, anyone could have been trained to
work on a portion of the catalog. It didn’t require any skills other than attention to detail. In
fact, it must have been very monotonous work. But, even though it was tremendously impor-
tant, the Poles weren’t even working on the catalog full time; they were also reconstructing
Enigma keys by another method each day. Kozaczuk described what it was like when cyclom-
eter work was being done:28
This was a tedious, time-consuming job and, on account of the work’s secrecy, the
mathematicians could not delegate it to the Cipher Bureau’s technical personnel. In
their haste the men would scrape their fingers raw and bloody.
When completed, the catalog did not offer a one-to-one correspondence, for there were 105,456
ways to order and position the rotors, but only 21,230 different disjoint cycle structures. A recov-
ered disjoint cycle structure would lead to, on average, 5 possibilities for the rotors. However,
averages can be misleading. On average, humans have one testicle and one ovary each. Perhaps
the average of 5 rotor settings is also unrepresentative of a typical result. What does the catalog
28 Kozaczuk, Wladyslaw, Enigma: How the German Machine Cipher Was Broken, and How It Was Read by the
Allies in World War Two, edited and translated from the original 1979 Polish version by Christopher Kasparek,
University Publications of America, Inc., Frederick, Maryland, 1984, p. 29.
actually look like? Kuhl’s reconstruction revealed exactly how the 105,456 rotor settings map to
the 21,230 different disjoint cycle structures. Some of his results are provided below.29
• The good news is that 11,466 of the disjoint cycle structures have unique rotor settings that
give rise to them. Thus, over half the time the cycle structure, when checked in the catalog,
would immediately yield the rotor settings. Over 92% of the disjoint cycle structures cor-
respond to 10 or fewer possible rotor settings.
• The bad news is that there are disjoint cycle structures that give far more possibilities (See
Table 7.1).
It’s not known what the Poles did on the worst days, but such days were rare, and recovery of
the daily key typically took only 10 to 20 minutes.
Nevertheless, the Poles’ work was not ended! Rejewski lamented:
Unfortunately, on November 2, 1937, when the card catalogue was ready, the Germans
exchanged the reversing drum30 that they had been using, which they designated by
the letter A, for another drum, a B drum, and consequently, we had to do the whole
job over again, after first reconstructing the connections in drum B, of course.31
Still, a catalog with 105,456 entries can be generated by hand, even twice.
It should be pointed out that the catalog doesn’t reveal how the plugboard is wired. The catalog
only helps find the order and positions of the rotors. Once the correct rotor setting is determined, we
still won’t get perfect plaintext out unless no plugboard cables were in use. The more cables in use,
the worse our result will be; however, unless 13 cables were used, some correct plaintext letters will be
revealed. Recall that, originally, exactly six cables were used. This, coupled with the use of cribs, and
the fact that no letter can be enciphered as itself, allows the plugboard to be reconstructed.
29 Kuhl, Alex, “Rejewski’s Catalog,” Cryptologia, Vol. 31, No. 4, October 2007, pp. 326–332, pp. 329–330 cited
here.
30 I’ve been referring to this as the reflector.
31 Rejewski, Marian, “How the Polish Mathematicians Broke Enigma,” Appendix D, in Kozaczuk, Wladyslaw,
Enigma: How the German Machine Cipher Was Broken, and How It Was Read by the Allies in World War Two,
Arms & Armour Press, London, UK, 1984, pp. 246–271, p. 264 cited here.
32 According to Kahn’s chronology. In contrast, the date of July 25 is given in Welchman, Gordon, The Hut Six
Story, McGraw-Hill, New York, 1982, p. 16, and in Rejewski, Marian, “Mathematical Solution of the Enigma
Cipher,” Cryptologia, Vol. 6, No. 1, January 1982, pp. 1–18, p. 17 cited here.
World War II: The Enigma of Germany ◾ 247
Enigma keys to England. Once there, however, in the words of David Kahn, “The British showed
their gratitude by excluding them from any further contact with codebreaking.”33
33 Kahn, David, “The Significance of Codebreaking and Intelligence in Allied Strategy and Tactics,” Cryptologia,
Vol. 1, No. 3, July 1977, pp. 209–222.
34 http://www-groups.dcs.st-and.ac.uk/~history/Mathematicians/Turing.html.
35 Hodges, Andrew, Alan Turing: The Enigma, Simon and Schuster, New York, 1983, p. 26.
248 ◾ Secret History
This says more about the failings of public schools than about Turing. The Headmaster seemed to
think education meant becoming “familiar with the ideas of authority and obedience, of coopera-
tion and loyalty, of putting the house and the school above your personal desires.”36 He was later
to complain of Turing, who did not buy into this, “He should have more esprit de corps.”37 The
spirit at Sherborne was perhaps summarized by Turing’s form-master for the fall of 1927: “This
room smells of mathematics! Go out and fetch a disinfectant spray!” Turing later observed that38
36 Hodges, Andrew, Alan Turing: The Enigma, Simon and Schuster, New York, 1983, p. 22.
37 Hodges, Andrew, Alan Turing: The Enigma, Simon and Schuster, New York, 1983, p. 24.
38 Hodges, Andrew, Alan Turing: The Enigma, Simon and Schuster, New York, 1983, p. 381.
The great thing about a public school education is that afterwards, however miserable
you are, you know it can never be quite so bad again.
Turing educated himself by reading Einstein’s papers on relativity and Eddington’s The Nature of
the Physical World. He seems to have encountered one decent teacher before graduating and mov-
ing on to King’s College, Cambridge, in 1931. This teacher wrote:39
All that I can claim is that my deliberate policy of leaving him largely to his own
devices and standing by to assist when necessary, allowed his natural mathematical
genius to progress uninhibited.
In 1933, Turing joined the Anti-War Council, telling his mother in a letter, “Its programme is
principally to organize strikes amongst munitions and chemical workers when government intends
to go to war. It gets up a guarantee fund to support the workers who strike.”40 The group also pro-
tested films such as Our Fighting Navy, which Turing called “blatant militarist propaganda.”41
Turing graduated from King’s College in 1934. In 1935, he attended a course that dealt with
a result found by the Austrian logician Kurt Gödel and an open question that traced back to the
German mathematician David Hilbert. Gödel had shown, in 1931, that in any consistent axiomatic system
sufficient to do arithmetic, there will always be statements that can be neither proven nor disproven.
That is, such statements are independent of the axioms. This disappointing result is known
as Gödel’s incompleteness theorem, because it shows that mathematics will always be incomplete
in a sense. The question of decidability asked if there was a way, ideally an efficient way, to identify
which statements fall into this unfortunate category. That is, can we decide whether or not a given
statement is provable (in the system under consideration)? This was known as the decision problem,
although more commonly referred to by its German name, the Entscheidungsproblem. It represented
a generalization of Hilbert’s 10th problem, from a famous talk he gave in 1900, in which he described
the most important problems in mathematics for the coming century.
Turing began working on the Entscheidungsproblem at this time, but his dissertation was on
another topic: why so many phenomena follow a Gaussian distribution. He
proved the central limit theorem (seen in every college-level probability course), but later learned
that his proof was not the first. Although another proof had been published slightly earlier, Turing
had discovered his independently.
Turing then took on the decision problem in his landmark paper “On Computable Numbers,
with an Application to the Entscheidungsproblem.” It was submitted in 1936 and published in 1937.
Like Gödel’s incompleteness theorem, the answer was disappointing. Turing proved that there can
be no general process for determining whether a given statement is provable or not. And, again, Turing’s
proof was not the first: Alonzo Church had established the same result shortly before him.42
There is a silver lining, however. Turing’s proof, unlike Church’s, included a descrip-
tion of what is now known as a Turing machine. The theoretical machine read symbols from a tape
39 Hodges, Andrew, Alan Turing: The Enigma, Simon and Schuster, New York, 1983, p. 32.
40 Hodges, Andrew, Alan Turing: The Enigma, Simon and Schuster, New York, 1983, p. 71.
41 Hodges, Andrew, Alan Turing: The Enigma, Simon and Schuster, New York, 1983, p. 87.
42 Church’s paper saw print in 1936. The citation is Church, Alonzo, “An Unsolvable Problem of Elementary
Number Theory,” American Journal of Mathematics, Vol. 58, No. 2, April 1936, pp. 345–363. Also see Church,
Alonzo, “A Note on the Entscheidungsproblem,” The Journal of Symbolic Logic, Vol. 1, No. 1, March 1936, pp.
40–41.
and could erase or write symbols on the tape as well. A computable number was defined to
be a real number whose decimal expansion could be produced by a Turing machine starting with a
blank tape. Because only countably many real numbers are computable and there are uncountably
many real numbers, there must exist real numbers that are not computable. This argument was made
possible by Georg Cantor’s work on transfinite numbers. Turing described a number that is not
computable, remarking that this seemed to be a paradox, because he had apparently described,
in finite terms, a number that cannot be described in finite terms. The resolution was that it is impossible
to decide, using another Turing machine, whether a Turing machine with a given table of instruc-
tions will output an infinite sequence of numbers.43 The Turing machine provides a theoretical
foundation for modern computers; thus, it is one of the key steps leading to the information age.
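The tape model can be made concrete with a toy simulator. The following is a sketch using my own rule encoding and state labels; the sample machine, which prints 0 and 1 alternately with blanks between, is in the spirit of the first example in Turing’s 1936 paper:

```python
def run_turing_machine(rules, state, steps):
    """Simulate a one-tape Turing machine for a fixed number of steps.
    rules maps (state, symbol) -> (symbol_to_write, move, next_state);
    the tape is a dict from position to symbol, with ' ' as blank."""
    tape, pos = {}, 0
    for _ in range(steps):
        symbol = tape.get(pos, ' ')
        write, move, state = rules[(state, symbol)]
        tape[pos] = write
        pos += 1 if move == 'R' else -1
    return ''.join(tape.get(i, ' ')
                   for i in range(min(tape), max(tape) + 1)).rstrip()

# A four-state machine that prints 0 and 1 alternately, separated
# by blanks (state names 'a'..'d' are my own labels):
rules = {
    ('a', ' '): ('0', 'R', 'b'),
    ('b', ' '): (' ', 'R', 'c'),
    ('c', ' '): ('1', 'R', 'd'),
    ('d', ' '): (' ', 'R', 'a'),
}
print(run_turing_machine(rules, 'a', 8))  # 0 1 0 1
```

A universal machine, discussed next, is simply a fixed rule table that reads another machine’s rule table from the tape and simulates it.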
Turing also discussed a universal machine in his 1936 paper. It is a machine:44
… which can be made to do the work of any special-purpose machine, that is to say to
carry out any piece of computing, if a tape bearing suitable “instructions” is inserted into it.
Turing then began studying at Princeton. Systems of Logic Based on Ordinals (1939) was the main
result of this work.
Figure 7.15 The mansion at Bletchley Park. (Creative Commons Attribution-Share Alike 4.0
International license, by Wikipedia user DeFacto, https://en.wikipedia.org/wiki/File:Bletchley_
Park_Mansion.jpg)
When war was declared in 1939, Turing moved to the Government Code and Cypher45 School
(GCCS) at Bletchley Park (Figure 7.15).46 Bletchley eventually grew to employ about 10,000 peo-
ple. The vast majority were women.47 His work there in breaking some of the German ciphers,
43 http://www-groups.dcs.st-and.ac.uk/~history/Mathematicians/Turing.html.
44 Newman, Maxwell Herman Alexander, “Alan Mathison Turing, 1912-1954,” Biographical Memoirs of Fellows
of the Royal Society of London, Vol. 1, November 1955, pp. 253–263, pp. 257–258 quoted here, available online
at https://royalsocietypublishing.org/doi/pdf/10.1098/rsbm.1955.0019.
45 This is how the British spell Cipher.
46 Also known as Station X, as it was the tenth site acquired by MI-6 for its wartime operations.
47 Smith, Michael, The Emperor’s Codes: The Breaking of Japan’s Secret Ciphers, Penguin Books, New York, 2002, p. 2.
provided the Allies with information that saved many lives. It has been estimated that this short-
ened the war by about two years.48 It was also a lot of fun. Turing remarked,49
Before the war my work was in logic and my hobby was cryptanalysis and now it is
the other way round.
The cryptanalysts at Bletchley Park had been working on cracking Enigma prior to learning of the
successes of the Polish mathematicians. They had been stymied by the wiring from the plugboard
to the rotor assembly, never having considered that it simply took each letter to itself. Despite his
earlier impressive academic work, not even Turing considered this possibility. His colleague and
fellow mathematician, Gordon Welchman, was furious upon learning how simple the answer was.
Turing redesigned the Polish bomba, and then Welchman made further improvements. The
machine was now known as a bombe. This seems to have simply been a modification of the Polish
name, but nobody knows why the Poles chose the name in the first place. According to one of
several stories, it was because it ticked while in operation, like a time bomb.50 The ticking stopped
when a potential solution arose.
Stories of Turing’s eccentricities circulated at Bletchley; for example, beginning each June,
he would wear a gas mask while bicycling. This was to keep the pollen out; he suffered from hay
fever. The bicycle was also unusual: periodically a bent spoke would touch a particular link, and
unless action was taken the chain would come off. Turing counted the wheel’s revolutions as he
rode and would stop the bike and reset the chain just before it came off.51
Still, Turing and the other eccentrics at Bletchley were very good at what they did. By 1942,
the deciphered messages totaled around 50,000 per month;52 however, in February 1942, the
German Navy put a four-rotor Enigma into use. This brought the decipherment of naval messages
to an immediate halt, and the results were bloody.
Kahn convincingly illustrated the importance of Enigma decipherments in the Atlantic naval
war by comparing sinkings of Allied ships in the second half of 1941, when Enigma messages were
being read, with the second half of 1942, when the messages went unsolved. The respective figures
for tons sunk are 600,000 vs. 2,600,000.53 Kahn also puts a human face on these figures:54
And each of the nearly 500 ships sunk in those six months meant more freezing deaths
in the middle of the ocean, more widows, more fatherless children, less food for some
toddler, less ammunition for some soldier, less fuel for some plane—and the prospect
of prolonging these miseries.
Reading Enigma messages also allowed the Allies to sink Axis convoys in greater numbers. Rommel
was greatly handicapped in Africa by the lack of much-needed gasoline due to these sinkings. At
48 Smith, Michael, The Emperor’s Codes: The Breaking of Japan’s Secret Ciphers, Penguin Books, New York, 2002, p. 3.
49 Hodges, Andrew, Alan Turing: The Enigma, Simon and Schuster, New York, 1983, pp. 214–215.
50 There are several different stories given to explain why the machines were called bombes. There is no consensus
as to which is correct.
51 Hodges, Andrew, Alan Turing: The Enigma, Simon and Schuster, New York, 1983, p. 209.
52 Hodges, Andrew, Alan Turing: The Enigma, Simon and Schuster, New York, 1983, p. 237.
53 Kahn, David, Seizing the Enigma, Houghton Mifflin Company, Boston, Massachusetts, 1991, pp. 216–217.
54 Kahn, David, Seizing the Enigma, Houghton Mifflin Company, Boston, Massachusetts, 1991, p. 217.
one point he sent a sarcastic message of thanks to Field-Marshal Kesselring for the few barrels that
washed ashore from wrecks of a convoy.55
Figure 7.16 Former National Cryptologic Museum curator Patrick Weadon with a Lorenz
machine. This machine dwarfs the M-209 and is also significantly larger than the Enigma.
For a description of how the Lorenz SZ machines operated and how the British cryptanalysts
broke them, the reader is referred to the items listed in the References and Further Reading portion of
this chapter. I will, however, mention one point of great historical importance: the cryptanalysis of these
ciphers marked a key moment in the history of computer science.
Colossus, shown in Figure 7.18, was constructed by Tommy Flowers, a telegraph engineer, to
defeat the Lorenz ciphers. It saw action by D-Day, June 6, 1944, and is now considered the first
programmable electronic computer. It was long believed that ENIAC was the first, as the existence of
Colossus remained classified until the 1970s! A Colossus has since been rebuilt, by a team led by
Tony Sale, and is on display at Bletchley Park.57
55 Winterbotham, Frederick William, The Ultra Secret, Harper & Row, New York, 1974, p. 82 (p. 122 of the
paperback edition).
56 The “SZ” in the name is short for Schlüsselzusatz, which translates to “cipher attachment.” These machines were
Figure 7.18 Nazi ciphers being broken by the first programmable computer, Colossus. (https://
en.wikipedia.org/wiki/Colossus_computer#/media/File:Colossus.jpg.)
Bletchley also has a bombe that was rebuilt by a team led by John Harper. Again, no originals
survived in England to the present day. One U.S. Navy bombe, built in Dayton, Ohio, was pre-
served and is now on display at the National Cryptologic Museum.
I feel that intelligence was a vital factor in the Allied victory – I think that without it
we might not have won, after all.
—Harold Deutsch58
58 Kahn, David, “The Significance of Codebreaking and Intelligence in Allied Strategy and Tactics,” Cryptologia,
Vol. 1, No. 3, July 1977, pp. 209–222, p. 221 cited here.
Other experts share Deutsch’s opinion. Among those who believe we would have
won without the cryptanalysts, the consensus is that victory would have arrived at least a
year later, perhaps two. Certainly, a tremendous number of lives were saved by the work of the
cryptanalysts.
These cryptanalytic heroes had to wait in silence for decades before their contributions could
be revealed. Eventually some honors were bestowed upon them. We saw the Polish postage stamp
earlier in this chapter. Other countries eventually issued special stamps, as well. In addition to a
monument in Poland (Figure 7.19), there are cryptologic museums in England and America. This
is good. We should pay our respects in this manner, but this is not the reward of the cryptanalysts.
Look closely the next time you see a crowd of people at a sporting event or at a shopping center
and know that many of them would never have existed if the war had lasted a year or two longer.
How many of our parents, grandparents, and great-grandparents made it back from the war, but
wouldn’t have, if it had gone on longer? How many of us have forefathers who weren’t called to
duty because they were a little bit too young? How many more hospital beds would be filled with
the horribly wounded? How many more Holocaust victims would there have been? The reward
of the cryptanalysts is pride in knowing their work made an incredible difference in the lives of
those around them.
Figure 7.19 Monument in Poznan, Poland, honoring Rejewski, Różycki, and Zygalski. (From
Grajek, Marek, “Monument in Memoriam of Marian Rejewski, Jerzy Różycki and Henryk
Zygalski Unveiled in Poznań,” Cryptologia, Vol. 32, No. 2, April 2008, pp. 101-103, p. 103 cited
here.)
59 Parrish, Thomas, The Ultra Americans, Stein and Day, Briarcliff Manor, New York, 1986. A trade paperback
edition of this book published in 1991 by Scarborough House, Chelsea, Michigan, bears the title The American
Codebreakers.
60 The best list of extant Enigmas is available online at http://enigmamuseum.com/dhlist.xls.
61 More information on the National Cryptologic Museum can be found at http://www.nsa.gov/about/
cryptologic_heritage/museum/.
62 http://www-groups.dcs.st-and.ac.uk/~history/Mathematicians/Turing.html.
63 Hodges, Andrew, Alan Turing: The Enigma, Simon and Schuster, New York, 1983, p. 386.
64 Hodges, Andrew, Alan Turing: The Enigma, Simon and Schuster, New York, 1983, p. 11.
The Turing test was introduced, in which an examiner poses questions and must decide
whether a person or a machine is answering them.65 Turing wrote:66
…I believe that at the end of the century the use of words and general educated opin-
ion will have altered so much that one will be able to speak of machines thinking
without expecting to be contradicted.
These ideas must have seemed absurd to many in 1950 when “it was not unknown for computer
users to be sweltering in 90° F heat, and banging the racks with a hammer to detect loose valves.”67
In 1952, Turing, attempting to head off a blackmailer, gave the police details of a homosexual affair he
had had. He was charged with “gross indecency” for violating the British statutes criminalizing homosexuality. On the
advice of his counsel, he offered no defense and pled guilty, although privately he maintained that he
saw nothing wrong in his actions. Turing was found guilty and given a choice
of prison or a year of estrogen treatment (as a subcutaneous implant). He took the estrogen treat-
ment. Some speculate that after this incident Turing was viewed as a security risk; his clearance was
withdrawn and his foreign colleagues were investigated. Hodges’s biography of Turing indicates that
Turing was fairly open about his sexuality, although Turing’s nephew, Sir Dermot Turing, states that
Alan was open about it only in safe company, with those who were very close to him. Alan’s older
brother, John, had no idea for 40 years.68 It is possible that some of the powers that be knew Turing
was gay but chose to ignore the fact, because he was so useful to the war effort.
Turing died in 1954 of potassium cyanide poisoning. He had been conducting electrolysis
experiments and cyanide was found on a half-eaten apple beside him. It was ruled a suicide, but
Turing’s mother maintained that it was an accident.69
In September 2009, Gordon Brown, the Prime Minister of the United Kingdom, offered a formal apology to
Turing. While this pleased many, mathematics professor Steve Kennedy had a different reaction:70
I didn’t feel elated and I wondered if there was something wrong with me. Oh sure, I
recognized that this was a good and necessary step, but I couldn’t help but feel that it
was not proportionate. The British government, in the name of the British people, tor-
tured this good and decent man (and thousands of others) because they disapproved of
his sexual habits. Now, half a century later, they offer only words of regret. Maybe I’d
feel better if Gordon Brown vowed not to rest until gay marriage was legal in Britain.
Of course in Britain today the legal status of gays is a thousand times better than in
the US, so maybe Brown could work on educating America? How about passing a
heavy tax—the Turing Tariff—on all computing hardware and software imported
into the UK from countries, like the US, that still discriminate against homosexuals
by banning gay marriage? The proceeds of the tariff could be donated, in the name
of Alan Mathison Turing, to the leading gay rights organizations in the exporting
country.
65 Philip K. Dick made use of this test in his novel Do Androids Dream of Electric Sheep?, which was later made into
the film Blade Runner. The book is, of course, superior. There are also three humorous installments of the comic strip
“Dilbert” that deal with Turing tests. They can be viewed online at http://search.dilbert.com/comic/Turing%20Test.
66 http://www-groups.dcs.st-and.ac.uk/~history/Quotations/Turing.html.
67 Hodges, Andrew, Alan Turing: The Enigma, Simon and Schuster, New York, 1983, p. 402.
68 Turing, Sir Dermot, email to the author, June 22, 2018.
69 http://www-groups.dcs.st-and.ac.uk/~history/Mathematicians/Turing.html.
70 Kennedy, Steve, “A Politician’s Apology,” Math Horizons, Vol. 17, No. 2, November 2009, p. 34.
Kennedy knew his suggestions wouldn’t be adopted, but his point was well made. Actions speak
louder than words.
I’ll try to lighten the mood a little by closing this chapter with an anecdote. This and sev-
eral other humorous incidents are recounted in Neal Stephenson’s Cryptonomicon. Although it is a
work of fiction, many real events were incorporated into the book. Alan Turing is one of its
characters, and it is obvious that Stephenson read Hodges’s biography of Turing and
that it had a big influence on parts of the novel.
At one point during the war, Turing enrolled in the infantry section of the Home Guard. He
had to complete forms to do this. One question asked, “Do you understand that by enrolling in
the Home Guard you place yourself liable to military law?” Turing saw no advantage in writing
yes, so he wrote no. Of course, nobody ever looked closely at the form. He was taught how to
shoot, became very good at it, and then no longer having any use for the Home Guard, moved on
to other things. He was eventually summoned to explain his absence. Turing explained that he
only joined to learn how to shoot and that he was not interested in anything else. He did not want,
for example, to attend parades. The conversation progressed as follows:
“But it is not up to you whether to attend parades or not. When you are called on
parade, it is your duty as a soldier to attend.”
“But I am not a soldier.”
“What do you mean, you are not a soldier! You are under military law!”
“You know, I rather thought this sort of situation could arise. I don’t know I am under
military law. If you look at my form you will see that I protected myself against this
situation.”
He was right and nothing could be done.
References and Further Reading
Grey, Christopher, “From the Archives: Colonel Butler’s Satire of Bletchley Park,” Cryptologia, Vol.
38, No. 3, July 2014, pp. 266–275.
Hinsley, Francis Harry and Alan Stripp, editors, Codebreakers: The Inside Story of Bletchley Park, Oxford
University Press, Oxford, 1993.
Hodges, Andrew, Alan Turing: The Enigma, Simon and Schuster, New York, 1983. Hodges, able to relate to
Turing on many levels, is an ideal biographer. Like Turing, Hodges is a mathematician, a homosexual,
an atheist, and British. More mathematical detail is provided in this biography than one could fairly
expect from anything written by a non-mathematician. If you want to learn more about Turing’s life
and work, start here. The 2014 film The Imitation Game71 was based on this book, but it introduced
many errors that Hodges would never have made.
Kahn, David, Seizing the Enigma, Houghton Mifflin Company, Boston, Massachusetts, 1991.
Kahn, David, “An Enigma Chronology,” Cryptologia, Vol. 17, No. 3, July 1993, pp. 237–246. This is a very
handy reference for anyone wishing to keep the dates and details straight when writing on or speaking
about Enigma.
Kenyon, David and Frode Weierud, “Enigma G: The Counter Enigma,” Cryptologia, Vol. 44, No. 5,
September 2020, pp. 385–420.
Körner, Thomas William, The Pleasure of Counting, Cambridge University Press, Cambridge, UK, 1996.
Part IV of this book takes a look at Enigma cryptanalysis.
Kuhl, Alex, “Rejewski’s Catalog,” Cryptologia, Vol. 31, No. 4, October 2007, pp. 326–331. This paper won
one of Cryptologia’s undergraduate paper competitions.
Lasry, George, Nils Kopal, and Arno Wacker, “Ciphertext-only Cryptanalysis of Hagelin M-209 Pins and
Lugs,” Cryptologia, Vol. 40, No. 2, March 2016, pp. 141–176.
Lasry, George, Nils Kopal, and Arno Wacker, “Automated Known-Plaintext Cryptanalysis of Short Hagelin
M-209 Messages,” Cryptologia, Vol. 40, No. 1, January 2016, pp. 49–69.
Lasry, George, Nils Kopal, and Arno Wacker, “Ciphertext-only Cryptanalysis of Short Hagelin M-209
Ciphertexts,” Cryptologia, Vol. 42, No. 6, November 2018, pp. 485–513.
Lasry, George, Nils Kopal, and Arno Wacker, “Cryptanalysis of Enigma Double Indicators with Hill
Climbing,” Cryptologia, Vol. 43, No. 4, July 2019, pp. 267–292.
List, David and John Gallehawk, “Revelation for Cilli’s,” Cryptologia, Vol. 38, No. 3, July 2014, pp. 248–265.
Marks, Philip, “Enigma Wiring Data: Interpreting Allied Conventions from World War II,” Cryptologia,
Vol. 39, No. 1, January 2015, pp. 25–65.
Marks, Philip, “Mr. Twinn’s bombes,” Cryptologia, Vol. 42, No. 1, January 2018, pp. 1–80.
Miller, Ray, The Cryptographic Mathematics of Enigma, revised edition, Center for Cryptologic History,
National Security Agency, Fort George G. Meade, Maryland, 2019. This is available at no charge at
the National Cryptologic Museum in print form, as well as online at https://tinyurl.com/y635qumc.
Muggeridge, Malcolm, Chronicles of Wasted Time, Chronicle 2: The Infernal Grove, Collins, London, 1973.
This is a pre-Winterbotham revelation that the Allies broke Nazi ciphers during World War II.
Ostwald, Olaf and Frode Weierud, “History and Modern Cryptanalysis of Enigma’s Pluggable Reflector,”
Cryptologia, Vol. 40, No. 1, January 2016, pp. 70–91.
Ostwald, Olaf and Frode Weierud, “Modern Breaking of Enigma Ciphertexts,” Cryptologia, Vol. 41, No. 5,
September 2017, pp. 395–421.
Parrish, Thomas, The Ultra Americans, Stein and Day, Briarcliff Manor, New York, 1986. A trade paperback
edition of this book published in 1991 by Scarborough House, Chelsea, Michigan, bears the title The
American Codebreakers.
Randell, Brian, The Colossus, Technical Report Series No. 90, Computing Laboratory, University of
Newcastle upon Tyne, 1976.
Randell, Brian, “Colossus: Godfather of the Computer,” New Scientist, Vol. 73, No. 1038, February 10,
1977, pp. 346–348.
Rejewski, Marian, “How Polish Mathematicians Deciphered the Enigma,” Annals of the History of
Computing, Vol. 3, No. 3, July 1981, pp. 213–234.
71 https://www.imdb.com/title/tt2084970/?ref_=fn_al_tt_1.
Rejewski, Marian, “Mathematical Solution of the Enigma Cipher,” Cryptologia, Vol. 6, No. 1, January
1982, pp. 1–18.
Rejewski, Marian, Memories of My Work at the Cipher Bureau of the General Staff Second Department 1930-
1945, Adam Mickiewicz University, Poznań, Poland, 2011.
Sebag-Montefiore, Hugh, Enigma: The Battle for the Code, The Folio Society, London, UK, 2005. This book
was first published by Weidenfeld & Nicolson, London, UK, 2000.
Stevenson, William, A Man Called Intrepid, Harcourt Brace Jovanovich, New York, 1976. A paperback edi-
tion is from Ballantine Books, New York, 1977.
Sullivan, Geoff, Geoff’s Crypto page, http://www.hut-six.co.uk/. This page has links to emulators for vari-
ous cipher machines, including several versions of the Enigma.
Teuscher, Christof, editor, Alan Turing: Life and Legacy of a Great Thinker, Springer, New York, 2004.
Thimbleby, Harold, “Human Factors and Missed Solutions to Enigma Design Weaknesses,” Cryptologia,
Vol. 40, No. 2, March 2016, pp. 177–202.
Turing, Sara, Alan M. Turing, W. Heffer & Sons, Ltd., Cambridge, UK, 1959.
Turing, Sara, Alan M. Turing: Centenary Edition, Cambridge University Press, Cambridge, UK, 2012. This
special edition of the long out-of-print title listed above commemorates what would have been Alan
Turing’s 100th birthday in 2012. It includes a new foreword by Martin Davis and a never-before-
published memoir by John F. Turing, Alan’s older brother.
Vázquez, Manuel and Paz Jiménez-Seral, “Recovering the Military Enigma Using Permutations–Filling in
the Details of Rejewski’s Solution,” Cryptologia, Vol. 42, No. 2, March 2018, pp. 106–134.
Weierud, Frode and Sandy Zabell, “German Mathematicians and Cryptology in WWII,” Cryptologia, Vol.
44, No. 2, March 2020, pp. 97–171.
Welchman, Gordon, The Hut Six Story, McGraw-Hill, New York, 1982.
Wik, Anders, “Enigma Z30 Retrieved,” Cryptologia, Vol. 40, No. 3, May 2016, pp. 215–220.
Winterbotham, Frederick William, The Ultra Secret, Harper & Row, New York, 1974.
Despite what you might read elsewhere, this was not the first public revelation that the Allies had bro-
ken Nazi ciphers during World War II. Winterbotham claimed that Ultra revealed that the Germans
would be bombing Coventry, but that Churchill declined to order an evacuation for fear that the Germans
would take it as a sign that the British had inside information, perhaps from a compromised cipher; if
the Germans replaced Enigma or modified it in such a way as to shut out the codebreakers, the loss,
in lives and materiel, would far exceed the damage at Coventry. This claim was sup-
ported by Anthony Cave Brown (Bodyguard of Lies) and William Stevenson (A Man Called Intrepid)
but has not stood up to the scrutiny of other historians.72 Brown knew that Enigma had been broken
before Winterbotham revealed it, but he did not get his own book out until after The Ultra Secret.73
Wright, John, “Rejewski’s Test Message as a Crib,” Cryptologia, Vol. 40, No. 1, January 2016, pp. 92–106.
Wright, John, “A Recursive Solution for Turing’s H-M Factor,” Cryptologia, Vol. 40, No. 4, July 2016, pp.
327–347.
Wright, John, “The Turing Bombe Victory and the First Naval Enigma Decrypts,” Cryptologia, Vol. 41, No.
4, July 2017, pp. 295–328.
Wright, John, “Rejewski’s Equations: Solving for the Entry Permutation,” Cryptologia, Vol. 42, No. 3, May
2018, pp. 222–226.
Video
There are many videos, aimed at a general audience, describing World War II codebreaking. Just one of
these is singled out here, for it contains information on the American construction of bombes (Figure 7.20)
in Dayton, Ohio, an important topic mentioned only very briefly in this book. The focus of the video is on
72 See Evans, N. E., “Air Intelligence and the Coventry Raid,” Royal United Service Institution Journal, September
1976, pp. 66–73 for an early refutation.
73 See Parrish, Thomas, The Ultra Americans, Stein and Day, Briarcliff Manor, New York, 1986, p. 287.
Figure 7.20 René Stein, former National Cryptologic Museum librarian, in front of an American-
made bombe that is preserved in the museum’s collection.
Joe Desch, who was responsible for much of the success in Dayton. Decades after carrying out his wartime
work, Desch was inducted into NSA’s Hall of Honor.
Dayton Codebreakers, The Dayton Codebreakers Project, 2006, 56 minutes. See http://www.daytoncodebreakers.org/,
a website maintained by Desch’s daughter, Debbie Anderson, for much more information.
Ordering instructions for the DVD can be found at https://daytoncodebreakers.org/video/dvds/.
Equipment
An Enigma machine made with modern technology (Figure 7.21) shows the positions of the simulated
rotors digitally. These are available for sale in kit form—some assembly required!
Figure 7.21 Enigma machine made with modern technology. (Simons, Marc and Paul Reuvers,
Crypto Museum, http://www.cryptomuseum.com/kits/.)
Chapter 8
Cryptologic War against Japan
This chapter examines how American cryptanalysts broke Japanese diplomatic ciphers during
World War II, while some of the United States’ own communications were protected by being
“enciphered” using the natural language of the Navajo, as spoken by Navajos themselves, with
some code words mixed in. But before we get to these topics, we take a look at a potential case of
steganography.
Figure 8.1 Advertisement appearing in The New Yorker, November 22, 1941.
Flipping ahead to page 86 of the magazine, we find the rest of the ad (Figure 8.2).
Figure 8.2 Continuation of advertisement appearing in The New Yorker, November 22, 1941.
Beneath this second image is the following text and image (see also Figure 8.3).
We hope you’ll never have to spend a long winter’s night in an air-raid shelter, but we
were just thinking… it’s only common sense to be prepared. If you’re not too busy
between now and Christmas, why not sit down and plan a list of the things you’ll
want to have on hand. …Canned goods, of course, and candles, Sterno, bottled water,
sugar, coffee or tea, brandy, and plenty of cigarettes, sweaters and blankets, books or
magazines, vitamin capsules… and though it’s no time, really, to be thinking of what’s
fashionable, we bet that most of your friends will remember to include those intrigu-
ing dice and chops which make Chicago’s favorite game.
Figure 8.3 Image appearing below the advertisement in The New Yorker, November 22, 1941.
Cryptologic War against Japan ◾ 263
Did you notice that the dice show 12-07 (December 7)? Was it meant to serve as a warning? Perhaps for Japanese living in America? Federal Bureau of Investigation (FBI) agent Robert L. Shivers wondered the same thing and on January 2, 1942, sent a radiogram to FBI Director J. Edgar Hoover posing the question. Other inquiries followed from government employees, as well as private citizens. The ensuing investigation revealed that it was simply a coincidence.1
1 Kruh, Louis, “The Deadly Double Advertisements - Pearl Harbor Warning or Coincidence?” Cryptologia, Vol.
3, No. 3, July 1979, pp. 166–171.
2 Rowlett, Frank B., The Story of Magic: Memoirs of an American Cryptologic Pioneer, Aegean Park Press, Laguna
Hills, California, 1989, p. 17.
3 Rowlett, Frank B., The Story of Magic: Memoirs of an American Cryptologic Pioneer, Aegean Park Press, Laguna
Hills, California, 1989, pp. 6–35.
In this ciphertext, A, E, I, O, U, and Y make up 39% of the total and their distribution in
the ciphertext resembles that of vowels in plaintext. Hence, it appears that the encryption process
takes vowels to vowels and consonants to consonants, but both the frequencies of individual letters
and the index of coincidence indicate that it is not simply a monoalphabetic substitution cipher.
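The index of coincidence referred to here can be computed with a short helper (the standard formula, not code from the book):

```python
# A small index-of-coincidence helper of the kind used in this test
# (the standard formula; function name is my own).
from collections import Counter

def index_of_coincidence(text):
    """IC = sum of n_i(n_i - 1) over N(N - 1); roughly 0.066 for English
    plaintext and about 0.038 for uniformly random 26-letter text."""
    letters = [c for c in text.upper() if c.isalpha()]
    n = len(letters)
    counts = Counter(letters)
    return sum(k * (k - 1) for k in counts.values()) / (n * (n - 1))
```

Because a monoalphabetic substitution only relabels the letter counts without changing them, it leaves the index of coincidence at the plaintext value; an IC below that value is what rules out a simple monoalphabetic cipher here.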
After some further study, we find what appear to be isomorphs (identical plaintexts enciphered
differently)7:
4 Also known as 91-shiki-obun In-ji-ki, which translates to “alphabetical typewriter 91.” The 91 refers to the year
it was first developed, 2591 by the Japanese reckoning, 1931 for the Americans.
5 Originally, it was referred to in conversation and official reports as “A Machine,” but Friedman realized this was
poor security, as it was so close to the Japanese name for the device. The color code name was settled on after much
discussion, and the other colors of the spectrum were assigned to other machines, including Enigma, Kryha, and
Hagelin. (From Kruh, Louis, “Reminiscences of a Master Cryptologist,” Cryptologia, Vol. 4, No. 1, January 1980,
pp. 45–50, p. 49 cited here.) Also, the binders in which information on these messages was kept were red.
6 Found, along with the solution I provide, in Deavours, Cipher and Louis Kruh, Machine Cryptography and
Modern Cryptanalysis, Artech House, Inc., Dedham, Massachusetts, 1985, pp. 213–215.
7 In general, two strings of characters are isomorphic if one can be transformed into the other via a monoalpha-
betic substitution. Isomorph attacks have been used against various other cipher systems, including the Hebern
machine. See Deavours, Cipher A., “Analysis of the Hebern Cryptograph Using Isomorphs,” Cryptologia, Vol.
1, No. 2, April 1977, pp. 167–185.
We list the consonants because they appear to be enciphered separately from the group of vowels:
BCDFGHJKLMNPQRSTVWXZ (20 consonants)
Looking at the longest set of three isomorphs, we make some interesting observations.
1. LNOLLIWQAVEMZIZS
PRYPPEBTUZIQDEDW
From any consonant in the top line to the one directly beneath it is distance 3 in our consonant
alphabet. This is not likely to be a coincidence!
2. VXOVVIHBAGEWKIKD
LNOLLIWQAVEMZIZS
For this pair, the distance is always 12. We start, for example, at V and move forward through the
consonants W, X, and Z, and then start back at the beginning of the alphabet and continue until
we arrive at L. Again, this is not likely to be a coincidence!
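Both observations can be verified mechanically. The helper below is my own, not code from the book; it collects the distance, measured in the 20-letter consonant alphabet, between aligned consonants of two isomorphic ciphertexts:

```python
# Check of the constant-shift observation: for every consonant position,
# the distance between the two isomorphic ciphertexts, measured in the
# 20-letter consonant alphabet, is the same. (Helper name is my own.)
CONSONANTS = "BCDFGHJKLMNPQRSTVWXZ"

def consonant_distances(a, b):
    """Set of consonant-alphabet distances between aligned consonants;
    vowel positions (letters outside CONSONANTS) are skipped."""
    return {(CONSONANTS.index(y) - CONSONANTS.index(x)) % 20
            for x, y in zip(a, b) if x in CONSONANTS and y in CONSONANTS}

print(consonant_distances("LNOLLIWQAVEMZIZS", "PRYPPEBTUZIQDEDW"))  # {3}
print(consonant_distances("VXOVVIHBAGEWKIKD", "LNOLLIWQAVEMZIZS"))  # {12}
```

A single-element set for each pair confirms that one fixed shift of the consonant alphabet accounts for every position.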
This attack, which should remind you of Kasiski’s attack on the Vigenère cipher (but using iso-
morphs instead of identical ciphertext segments), indicates that the basic consonant substitution
alphabet is simply shifted to create the various encipherment possibilities. Thus, our substitution
table should look something like this:
BCDFGHJKLMNPQRSTVWXZ Plaintext
1. BCDFGHJKLMNPQRSTVWXZ Cipher Alphabet 1
2. CDFGHJKLMNPQRSTVWXZB Cipher Alphabet 2
3. DFGHJKLMNPQRSTVWXZBC
4. FGHJKLMNPQRSTVWXZBCD
5. GHJKLMNPQRSTVWXZBCDF
6. HJKLMNPQRSTVWXZBCDFG
7. JKLMNPQRSTVWXZBCDFGH
8. KLMNPQRSTVWXZBCDFGHJ
9. LMNPQRSTVWXZBCDFGHJK
10. MNPQRSTVWXZBCDFGHJKL
11. NPQRSTVWXZBCDFGHJKLM
12. PQRSTVWXZBCDFGHJKLMN
13. QRSTVWXZBCDFGHJKLMNP
14. RSTVWXZBCDFGHJKLMNPQ
15. STVWXZBCDFGHJKLMNPQR
16. TVWXZBCDFGHJKLMNPQRS
17. VWXZBCDFGHJKLMNPQRST
18. WXZBCDFGHJKLMNPQRSTV
19. XZBCDFGHJKLMNPQRSTVW
20. ZBCDFGHJKLMNPQRSTVWX
However, our results above don’t specify that we start with what is labeled Cipher Alphabet 1. We may
start anywhere in this table. We can simply try each of the 20 possible start positions for the first line,
as if we were breaking a Caesar shift cipher by brute force. We delete the artificial spacing in groups of
five to save space and progress through our 20 cipher alphabets one at a time with each letter of text.
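This sweep is easy to automate. The sketch below uses my own naming, not code from the book: it prints all 20 candidate rows, with a dash standing in for every vowel position.

```python
# Brute-force sweep over the 20 shifted consonant alphabets. Each successive
# letter is deciphered with the next alphabet in the cycle; vowels (which are
# enciphered separately) are shown as dashes.
CONSONANTS = "BCDFGHJKLMNPQRSTVWXZ"
VOWELS = set("AEIOUY")

def decrypt_row(ciphertext, start):
    """Decipher the consonants assuming the first letter used cipher
    alphabet `start` (1 through 20), stepping one alphabet per letter."""
    out = []
    for i, c in enumerate(ciphertext):
        if c in VOWELS:
            out.append("-")
        else:
            shift = (start - 1 + i) % 20
            out.append(CONSONANTS[(CONSONANTS.index(c) - shift) % 20])
    return "".join(out)

ct = "EBJHEVAWRAUPVXOVVIHBAGEWKIKDYUCJEKBPOEKYGRYLU"
for start in range(1, 21):
    print(f"{start:2} {decrypt_row(ct, start)}")
```

Row 10 of this output reproduces the line on which INTRODUCTION begins, with a plain hyphen at every vowel position.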
Starting cipher:
EBJHEVAWRAUPVXOVVIHBAGEWKIKDYUCJEKBPOEKYGRYLU
1 -ZGD-P-NH––BGH-CB-KC-F-SF-CT––PT-SJV––M-GQ-H-
2 -XFC-N-MG––ZFG-BZ-JB-D-RD-BS––NS-RHT––L-FP-G-
3 -WDB-M-LF––XDF-ZX-HZ-C-QC-ZR––MR-QGS––K-DN-F-
4 -VCZ-L-KD––WCD-XW-GX-B-PB-XQ––LQ-PFR––J-CM-D-
5 -TBX-K-JC––VBC-WV-FW-Z-NZ-WP––KP-NDQ––H-BL-C-
6 -SZW-J-HB––TZB-VT-DV-X-MX-VN––JN-MCP––G-ZK-B-
7 -RXV-H-GZ––SXZ-TS-CT-W-LW-TM––HM-LBN––F-XJ-Z-
8 -QWT-G-FX––RWX-SR-BS-V-KV-SL––GL-KZM––D-WH-X-
9 -PVS-F-DW––QVW-RQ-ZR-T-JT-RK––FK-JXL––C-VG-W-
10 -NTR-D-CV––PTV-QP-XQ-S-HS-QJ––DJ-HWK––B-TF-V-
11 -MSQ-C-BT––NST-PN-WP-R-GR-PH––CH-GVJ––Z-SD-T-
12 -LRP-B-ZS––MRS-NM-VN-Q-FQ-NG––BG-FTH––X-RC-S-
13 -KQN-Z-XR––LQR-ML-TM-P-DP-MF––ZF-DSG––W-QB-R-
14 -JPM-X-WQ––KPQ-LK-SL-N-CN-LD––XD-CRF––V-PZ-Q-
15 -HNL-W-VP––JNP-KJ-RK-M-BM-KC––WC-BQD––T-NX-P-
16 -GMK-V-TN––HMN-JH-QJ-L-ZL-JB––VB-ZPC––S-MW-N-
17 -FLJ-T-SM––GLM-HG-PH-K-XK-HZ––TZ-XNB––R-LV-M-
18 -DKH-S-RL––FKL-GF-NG-J-WJ-GX––SX-WMZ––Q-KT-L-
19 -CJG-R-QK––DJK-FD-MF-H-VH-FW––RW-VLX––P-JS-K-
20 -BHF-Q-PJ––CHJ-DC-LD-G-TG-DV––QV-TKW––N-HR-J-
Now there’s a small surprise—we can’t simply read across any of the lines, as we could nor-
mally do if one of them were a plaintext only missing the vowels! Double checking reveals no
errors were made. Line 10 starts out promising, but then fizzles out. Taking a closer look at line
10, along with nearby lines shows that the word INTRODUCTION seems to begin on line 10, but
break off and continue on line 11.
7 -RXV-H-GZ––SXZ-TS-CT-W-LW-TM––HM-LBN––F-XJ-Z-
8 -QWT-G-FX––RWX-SR-BS-V-KV-SL––GL-KZM––D-WH-X-
9 -PVS-F-DW––QVW-RQ-ZR-T-JT-RK––FK-JXL––C-VG-W-
10 -NTR-D-CV––PTV-QP-XQ-S-HS-QJ––DJ-HWK––B-TF-V-
11 -MSQ-C-BT––NST-PN-WP-R-GR-PH––CH-GVJ––Z-SD-T-
12 -LRP-B-ZS––MRS-NM-VN-Q-FQ-NG––BG-FTH––X-RC-S-
13 -KQN-Z-XR––LQR-ML-TM-P-DP-MF––ZF-DSG––W-QB-R-
Another shift occurs later, from line 11 to line 12. These shifts are referred to as a stepping action. The mechanics of the Red cipher, which will be detailed momentarily, make it clearer how this happens.
Filling in vowels and word breaks is now easy:
The word STOP shows the message to be in the style of telegraph traffic. We were even able to
complete the last word, which requires one more letter of ciphertext than was provided. One
can now go back and see that the vowel substitutions were made using the mixed order alphabet
AOEUYI with various shifts, like the consonant alphabet.
8.3.1 Orange
The Americans used the code name Orange for a variant of Red used as the Japanese naval attaché
machine. This machine enciphered kana syllables, rather than romaji letters. The first breaks for
the Americans came in February 1936.11 Lieutenant Jack S. Holtwick, Jr. of the U.S. Navy made
a machine that broke this naval variant.12 The British also attacked this system and had results
even earlier. Hugh Foss and Oliver Strachey broke it for the British in November 1934.13 The
8 Smith, Michael, The Emperor’s Codes: The Breaking of Japan’s Secret Ciphers, Penguin Books, New York, 2002,
pp. 46–47.
9 The Japanese called this romaji. It is still used.
10 Deavours, Cipher and Louis Kruh, Machine Cryptography and Modern Cryptanalysis, Artech House, Inc., Dedham, Massachusetts, 1985. The American Red machine is pictured on pages 216–217.
11 Smith, Michael, The Emperor’s Codes: The Breaking of Japan’s Secret Ciphers, Penguin Books, New York, 2002, p. 35.
12 Kahn, David, The Codebreakers, second edition, Scribner, New York, 1996, pp. 20 and 437. Also see Deavours,
Cipher and Louis Kruh, Machine Cryptography and Modern Cryptanalysis, Artech House, Inc., Dedham,
Massachusetts, 1985, p. 11.
13 Smith, Michael, The Emperor’s Codes: The Breaking of Japan’s Secret Ciphers, Penguin Books, New York, 2002,
pp. 34–35.
successes were only temporary, however. The Japanese would eventually switch over to a more
secure machine, Angooki Taipu B (type B cipher machine).14
Figure 8.4 Schematic of the Purple machine, showing the stepping mechanism and the twenties switches L, M, and R.
The characters typed are immediately permuted by a plugboard, which then separates them
into two groups of unequal size—the sixes16 and the twenties. If a letter is among the sixes, it fol-
lows one of 25 paths through S, each of which is a permutation (among these six letters). The paths
are cycled through in order (controlled by a telephone switch, not a rotor, as in Red and Enigma).
The result is then fed through another plugboard to get the output.
If the letter typed does not become one of the sixes, it follows a path through R, M, and L
before returning to the plugboard on the left side of the schematic. R, M, and L each contain 25
permutations of the alphabet, but these are not simply cycled through in order; rather, a stepping
mechanism determines the rotation. The stepping mechanism takes its cue from S. The details follow, but we first note that R, M, and L differ in that one switches slowly, another quickly, and the third at an intermediate (medium) rate. This refers to how long it takes to cycle
14 Also known as 97-shiki-obun In-ji-ki, which translates to “alphabetical typewriter 97.” The 97 refers to the year
it was first developed, 2597 by the Japanese reckoning, 1937 for the Americans.
15 Of course, this division of the rainbow into seven colors is arbitrary. That particular value was chosen by Isaac Newton to create a nice symmetry with the seven-note musical scale (A, B, C, D, E, F, G).
16 Unlike Red, the sixes in Purple weren’t necessarily vowels; they could be any six letters and they changed daily.
Actually, this innovation was one of the changes made in Red usage before its demise.
through all 25 permutations, not the speed of a single switch, which is the same for all. The speeds
for the switches are part of the key.
If we label the permutations for S, R, M, and L as 0 through 24, we then have:
If S is in position 23 and M is in position 24, the slow switch advances.
If S is in position 24, the medium switch advances.
In all other cases, the fast switch advances.
The period for Purple is a bit shorter than for the Enigma, which cycled through 16,900 permutations before returning to the first. Purple cycled through 25³ = 15,625. Of course, the sixes for Purple had a much shorter cycle of 25.
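The stepping rules above can be simulated directly. The sketch below uses my own state layout, with every switch started at position 0, and confirms the 15,625-character period of the twenties switches:

```python
# A sketch (my own state layout, not a faithful Purple implementation) of the
# stepping rules quoted above: the sixes switch S advances with every
# character, and exactly one of the three twenties switches advances with it.
def step(s, fast, med, slow):
    if s == 23 and med == 24:        # slow switch advances
        slow = (slow + 1) % 25
    elif s == 24:                    # medium switch advances
        med = (med + 1) % 25
    else:                            # fast switch advances
        fast = (fast + 1) % 25
    return (s + 1) % 25, fast, med, slow

# Count characters until the full switch state repeats.
state = (0, 0, 0, 0)
period = 0
while True:
    state = step(*state)
    period += 1
    if state == (0, 0, 0, 0):
        break
print(period)  # 15625, i.e., 25**3 for the twenties; the sixes repeat every 25
```

Per 25 characters the medium switch moves once, per 625 the slow switch moves once, and the fast switch moves on every remaining character, so the three twenties switches return to their starting positions together only after 25 × 25 × 25 characters.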
As with the Enigma, the plugboard settings were determined in advance and the same for all
users on a particular day; however, the Japanese limited themselves to 1,000 different connections
(out of a possible 26!; unlike Enigma's plugboard connections, these needn't be reciprocal).17 Like Enigma, part
of the Purple key for any given message was to be determined at random and sent as a header along
with the ciphertext. This was a five-digit number, selected initially from a list of 120 (later on
the list offered 240 choices), which stood for the initial positions of each switch and the stepping
switch. There was no mathematical relation between the five-digit number and the setting it stood
for. It was simply a code number. For example, 13579 meant the sixes switch starts at position 3
and the twenties switches start at positions 24, 8, and 25, while the 20-switch motion (assignment
of speeds slow, medium, and fast to particular switches) is 3-2-1.18 Of the 120 five-digit codes, half consisted only of odd digits and half only of even digits. It was a great mistake for the Japanese to artificially restrict themselves to this tiny fraction of the initial settings. With 25 possible settings for each of the twenties switches and the sixes switch, we have 25⁴ = 390,625 possibilities.
The Japanese did make some wise choices in how Purple was used. The first was enciphering
the five-digit key using an additive. Thus, messages using the same key would not bear identical
indicators. Another good decision was encoding messages, prior to running them through Purple,
but they did this with the commercially available Phillips Code—not the best choice!19
Because a large keyspace is essential for security, we should address the overall keyspace
for Purple. A variety of values may be given, depending on how we perform the calculation.
Determining the keyspace of Purple requires that several subjective choices be made; for example,
should one use all possible plugboard configurations as a factor, or limit it to the 1,000 that were
actually used? Should all possible initial positions of the switches be considered or only the 120
(later 240) the Japanese allowed? Should we count all possible internal wirings? The cryptanalysts
had to determine these, but the Japanese couldn’t change them once the machines were assembled,
so one could say they weren’t part of the key. Thus, it’s not surprising that a range of values for the
keyspace can be found in the literature. Stephen J. Kelley offers a figure of 1.8 × 10¹³⁸, which is much larger than for the three-rotor Enigma.20 Mark Stamp, more conservative in assigning values to various factors, came up with 2¹⁹⁸ ≈ 4 × 10⁵⁹ as the keyspace.21 Quite a difference! However,
17 Smith, Michael, The Emperor’s Codes: The Breaking of Japan’s Secret Ciphers, Penguin Books, New York, 2002,
p. 68.
18 Freeman, Wes, Geoff Sullivan, and Frode Weierud, “Purple Revealed: Simulation and Computer-Aided
Cryptanalysis of Angooki Taipu B,” Cryptologia, Vol. 27, No. 1, January 2003, pp. 1–43, p. 38 cited here.
19 Smith, Michael, The Emperor’s Codes: The Breaking of Japan’s Secret Ciphers, Penguin Books, New York, 2002,
p. 68.
20 Kelley, Stephen J., Big Machines, Aegean Park Press, Laguna Hills, California, 2001, p. 178.
21 Stamp, Mark and Richard M. Low, Applied Cryptanalysis: Breaking Ciphers in the Real World, John Wiley & Sons, Hoboken, New Jersey, 2007.
even this lower value was sufficiently large to avoid a brute force attack. So, the keyspace was defi-
nitely large enough. This was not why Purple ultimately proved insecure.
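As a quick sanity check on the arithmetic (not on the modeling choices behind either estimate), the magnitudes quoted above follow directly:

```python
# Verify the magnitude conversions used in this discussion:
# Stamp's 2**198 as a decimal figure, and the switch-setting counts.
print(f"{2**198:.1e}")  # 4.0e+59
print(25**4)            # 390625 possible initial switch positions
print(25**3)            # 15625, the period of the twenties switches
```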
22 Smith, Michael, The Emperor’s Codes: The Breaking of Japan’s Secret Ciphers, Penguin Books, New York, 2002,
pp. 70–71.
Figure 8.5 A pattern in the Purple alphabets. (From Budiansky, Stephen, Battle of Wits: The
Complete Story of Codebreaking In World War II, Free Press, New York, 2000, p. 354. With
permission.)
The example in Figure 8.5 shows one of the patterns in the Purple alphabets. Suppose the first
alphabet enciphers A as U, then when we come to the 26th alphabet (i.e., alphabet 1 of cycle 2), A
is enciphered as H. Then whatever is enciphered as U in alphabets 2 through 25 will be enciphered
as H in alphabets 2 through 25 of cycle 2. The same goes for all other letters. So, knowing alpha-
bets 1 through 25 of cycle 1 and the first alphabet of cycle 2 reveals the remaining 24 alphabets of
cycle 2. In real cryptanalysis, the process wouldn't be so orderly. The cycles would get filled in like a jigsaw puzzle, with some values in each cycle known from cribs, allowing the corresponding values in other cycles to be filled in.
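The relationship just described can be illustrated with synthetic alphabets (randomly generated stand-ins, not actual Purple alphabets): knowing all 25 alphabets of cycle 1 and only the first alphabet of cycle 2 determines the remaining 24 alphabets of cycle 2.

```python
# Synthetic illustration (my own construction) of the pattern in Figure 8.5:
# each cycle-2 alphabet equals the corresponding cycle-1 alphabet followed by
# one fixed substitution tau applied to the ciphertext letters.
import random

LETTERS = [chr(ord("A") + i) for i in range(26)]
rng = random.Random(1)

def random_perm():
    p = LETTERS[:]
    rng.shuffle(p)
    return dict(zip(LETTERS, p))

cycle1 = [random_perm() for _ in range(25)]   # alphabets 1..25 of cycle 1
tau = random_perm()                           # fixed cipher-to-cipher map
cycle2 = [{p: tau[c] for p, c in alpha.items()} for alpha in cycle1]

# The analyst knows cycle 1 and only the FIRST alphabet of cycle 2.
# Recover tau from that single alphabet, then predict the other 24.
inv1 = {c: p for p, c in cycle1[0].items()}   # invert cycle-1 alphabet 1
recovered_tau = {c: cycle2[0][inv1[c]] for c in LETTERS}
predicted = [{p: recovered_tau[c] for p, c in alpha.items()} for alpha in cycle1]
print(predicted == cycle2)  # True
```

The construction is circular by design; its point is only to show why one known alphabet of a new cycle, combined with a fully recovered earlier cycle, fills in the rest of the table at once.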
Another pattern Grotjan recognized (Figure 8.6) is even easier to grasp and exploit for crypt-
analytic purposes. The columns in the substitution table for cycle 2 are identical to those of cycle
1, but reordered. Which pattern held for a particular message depended on the location of the slow
switch during that message’s encipherment. Grotjan first found a pattern on September 20, 1940,
and Purple was completely broken on September 27, 1940. Thus ended 18 months of intense work.
Images of Rowlett, Rosen, and Grotjan are provided in Figure 8.7.
Following this feat, Friedman had a nervous breakdown that required several months off, and Leo Rosen led the construction of a "Purple analog" to simulate the Japanese machine (Figure 8.8).
A later American version is shown in Figure 8.9. These recreations were actually better than
the originals! They used brass for some contact points that, in the Japanese version, were made of
copper, which would wear with use and then tend to produce garbled text.23
23 Lewin, Ronald, The American Magic: Codes, Ciphers and the Defeat of Japan, Farrar Straus Giroux, New York,
p. 40 footnote.
Figure 8.6 Another pattern in the Purple alphabets. (From Budiansky, Stephen, Battle of Wits:
The Complete Story of Codebreaking In World War II, Free Press, New York, 2000, p. 355. With
permission.)
Figure 8.8 A Purple analog built in 1940 without seeing the original. (Courtesy of the National
Security Agency, https://web.archive.org/web/20160325223400/http://www.nsa.gov/about/_
images/pg_hi_res/purple_analog.jpg.)
Figure 8.9 René Stein, former National Cryptologic Museum librarian, stands by the back of a
Purple Analog from 1944.
The Americans eventually had success in breaking the Japanese naval code JN-25, as well as subsequent versions such as JN-25a and JN-25b. A JN-25b message decoded on May 14, 1942,
provided warning of a large invasion force heading to AF, where AF represented a still unbroken
portion of the message. Although there was no definite proof, Navy cryptanalyst Joseph Rochefort
believed that AF stood for Midway Island. He was supported in this by Admiral Nimitz, but back
in Washington, DC, the Aleutian Islands were believed to be the target. To test his conjecture,
Rochefort had a message sent in plain language24 from Midway saying that their desalination
plant had broken. Following this, a Japanese message encoded in JN-25b was soon intercepted that
included the unknown code group AF. After plugging in the known code groups and translating,
the message stated that AF was short of water. Thus, the conjecture was proven.25 Yamamoto was
indeed planning to invade Midway, and when he did on June 4, the U.S. Navy was ready. The
battle of Midway proved to be the turning point of the Pacific war. Prior to Midway, the United
States had never had a victory, and following Midway, she never saw defeat.
Yamamoto suffered a much more personal defeat when a message, encoded with the latest version of JN-25 and sent on April 13, 1943, provided the American codebreakers with his itinerary for the
next five days. This itinerary brought him close to the combat zone and Nimitz made the decision to
attempt to shoot down his plane. It was risky, for if the Japanese realized how the U.S. Navy knew
where to find Yamamoto, their naval codes would be promptly changed. On April 18, Major John
Mitchell led a formation of 16 P-38 Lightnings, which included four ace pilots. They carried out their
mission successfully. Because of the tremendous importance of Admiral Yamamoto to the Japanese,
David Kahn described his assassination as “the equivalent of a major victory.”26
The decoded message that resulted in Yamamoto’s death was translated by Marine Corps
Captain Alva Lasswell. Despite being a farmboy from Arkansas with only an eighth-grade educa-
tion, Lasswell, as a Marine, traveled to Japan, where he mastered the language. Tom “Captain T”
Hunnicutt commemorated Lasswell’s accomplishments by dubbing him “The Sigint Sniper” in a
song of the same title.27
Reading Purple kept the Allies informed of German plans as well as Japanese. Ōshima Hiroshi, the Japanese ambassador in Berlin, sent detailed messages concerning Nazi plans back to Tokyo. Carl Boyd devoted a book to the study of the intelligence about the Nazis obtained from Ōshima.28 A few paragraphs are reproduced below from messages Ōshima sent on November 10, 1943 (nearly six months before the June 6, 1944, D-Day invasion), enciphered with Purple.29
All of the German fortifications on the French coast are very close to the shore and it
is quite clear that the Germans plan to smash any enemy attempt to land as close to
the edge of the water as possible.
The Strait of Dover area is given first place in the German Army’s fortification
scheme and troop dispositions, and Normandy and the Brittany peninsula come next.
Other parts of the coast are regarded with less significance. Although the possibility
24 In some accounts, the message was enciphered, but in a weak system that it was assumed the Japanese could break.
25 Kahn, David, The Codebreakers, second edition, Scribner, New York, 1996, p. 569.
26 Kahn, David, The Codebreakers, second edition, Scribner, New York, 1996, pp. 595–601.
27 “Sigint Sniper (The Yamamoto Shoot Down),” The Hunnicutt Collection, White Swan Records, ASCAP
Music, 2009.
28 Boyd, Carl, Hitler's Japanese Confidant: General Ōshima Hiroshi and Magic Intelligence, 1941–1945, University Press of Kansas, Lawrence, Kansas, 1993.
Figure 8.10 Part of a Japanese Purple machine. (Courtesy of the National Security Agency,
http://www.nsa.gov/about/_images/pg_hi_res/purple_switch.jpg.)
Cryptologic Museum adjacent to Fort Meade, Maryland. In the photograph behind the Purple
fragment, Ōshima may be seen shaking hands with Hitler.
President Truman awarded William Friedman the Medal of Merit, which was the highest
Presidential civilian award. Friedman retired in 1955 and died on November 12, 1969. He’s bur-
ied in Arlington National Cemetery.30 Frank Rowlett received the National Security Medal from
President Johnson.
It has been estimated that cryptanalysis saved a year of war in the Pacific.
—David Kahn31
Just as in the Enigma chapter, there is much more to this story of cryptanalysis. The Japanese
also had a Green machine, which was “a rather strangely constructed version of the commercial
Enigma machine,”32 and variants of Purple, codenamed Coral and Jade.
There was overlap between the American and British cryptanalytic efforts, but a basic division of
labor emerged with the British attacking the codes and ciphers of the Germans, while the Americans
focused on those of the Japanese. The two nations shared their results in what began as an uneasy
relationship between intelligence agencies. This is discussed in greater detail in Section 12.10.
Were it not for the Navajos, the marines would never have taken Iwo Jima!
—Major Howard M. Conner33
It’s been claimed that the Japanese would be the world’s worst foreign language students if it
weren’t for the Americans. So perhaps it isn’t surprising that the Americans have repeatedly used
foreign languages as codes and that they had their most famous success against the Japanese. The
story of such codes goes back to the U.S. Civil War, when the North used Hungarians to befuddle
the South.
In the last month of World War I, Native Americans began serving as code talkers, but as in
the Civil War, this language code did not play a major role. Only eight Choctaws were initially
used as radio operators. They were in Company D, 141st Infantry, under Captain E. W. Horner in
Northern France.34 Before World War I ended, the number of Choctaw code talkers grew to fif-
teen.35 They were able to use their own language to communicate openly without fear of the enemy
understanding. The effort was successful. As Colonel A. W. Bloor of the 142nd Infantry put it,
“There was hardly one chance in a million that Fritz would be able to translate these dialects.”36
30 http://www.sans.org/infosecFAQ/history/friedman.htm.
31 Kahn, David, The Codebreakers, second edition, Scribner, New York, 1996, p. xi.
32 Deavours, Cipher and Louis Kruh, Machine Cryptography and Modern Cryptanalysis, Artech House, Inc.,
2002, p. 18.
36 Meadows, William C., The Comanche Code Talkers of World War II, University of Texas Press, Austin, Texas,
2002, p. 20.
Other Native American tribes who contributed to the code war in World War I, with their native
tongues, included the Comanche and Sioux.37
Some problems arose from the lack of necessary military terms in the language. A few Indian
words were applied to these terms, but they were few in number and did not cover everything that
was needed. Marine TSgt. Philip Johnston found a way around this difficulty in time for World War
II.38 By adopting code words for terms not provided for in the Indian language, and spelling out
other needed words or names or locations that arose and weren’t in the code, the code talkers could
have even greater success. Johnston, the son of a missionary, had lived on a Navajo reservation for 22
years beginning at age 4; thus, he learned the language, which he recognized as being very difficult.
Johnston’s reasons for suggesting use of Navajos weren’t solely due to his own experience. He
pointed out that, despite their low literacy rate compared with other tribes, the sheer size of the
Navajo Nation, at nearly 50,000 people (more than twice the size of any other tribe at that time),
would make it easier to recruit the desired numbers. Another advantage of using Navajo was
expressed by Major General Clayton B. Vogel.
Mr. Johnston stated that the Navajo is the only tribe in the United States that has not
been infested with German students during the past twenty years. These Germans,
studying the various tribal dialects under the guise of art students, anthropologists,
etc., have undoubtedly attained a good working knowledge of all tribal dialects except
Navajo. For this reason the Navajo is the only tribe available offering complete security
for the type of work under consideration.39
But would the idea work? The Germans weren’t the only enemy who had attempted to study the
Indian languages between the wars. Some Japanese had been employed by the Indian Affairs
Bureau. And why would the Navajo, who faced countless agonies at the hands of the white men,
including attempted genocide, be willing to help? This is an obvious question that many modern
authors attempt to answer. I’ll let the Navajos answer for themselves. The following is a resolution
passed unanimously by the Navajo Tribal Council at Window Rock on June 3, 1940:
Whereas, the Navajo Tribal Council and the 50,000 people we represent, cannot fail
to recognize the crisis now facing the world in the threat of foreign invasion and the
destruction of the great liberties and benefits which we enjoy on the reservation, and
Whereas, there exists no purer concentration of Americanism than among the first
Americans, and
Whereas, it has become common practice to attempt national destruction through the
sowing of seeds of treachery among minority groups such as ours, and
Whereas, we hereby serve notice that any un-American movement among our people
will be resented and dealt with severely, and
Now, Therefore, we resolve that the Navajo Indians stand ready as they did in 1918, to
aid and defend our Government and its institutions against all subversive and armed
conflict and pledge our loyalty to the system which recognizes minority rights and a
way of life that has placed us among the greatest people of our race.40
37 Meadows, William C., The Comanche Code Talkers of World War II, University of Texas Press, Austin, Texas,
2002, p. 29.
38 Johnston came up with his idea in late December 1941 (after Pearl Harbor).
39 Paul, Doris A., The Navajo Code Talkers, Dorrance Publishing Co., Inc., Pittsburgh, Pennsylvania, 1973, p. 157.
40 Paul, Doris A., The Navajo Code Talkers, Dorrance Publishing Co., Inc., Pittsburgh, Pennsylvania, 1973, pp. 2–3.
Eventually, 540 Navajos served as Marines, of whom 420 were code talkers.41
The lack of Navajo words for needed military terms was remedied by the creation of easily
remembered code words. A tank would be referred to as a TORTOISE, an easy code word to
remember because of the tortoise’s hard shell. Planes became BIRDS, and so on. The basic idea
was Johnston’s, but the Navajo came up with the actual code words themselves.
Words that were not part of the code, such as proper names and locations, or anything else
that might arise, could be spelled out. Initially the alphabet consisted of one Navajo word for each
letter, but as the enemy might catch on when words with easily recognized patterns of letters,
such as GUADALCANAL, were spelled out, alternate representations of frequent letters were soon
introduced.42 The expanded alphabet is reproduced in Table 8.1 along with a handful of the 411
code words. The complete list of code words may be found in various books on the Navajo Code
Talkers, as well as online at https://web.archive.org/web/20130329065820/http://www.history.
navy.mil/faqs/faq61-4.htm, which was the source used here.
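The spelling procedure can be sketched as follows. The letter words below are taken from Table 8.1, but rotating through the alternates in a fixed cycle is my own simplification; the talkers could vary their choices freely.

```python
# Illustrative sketch of spelling with the multi-valued alphabet of Table 8.1.
# Only the letters needed for the example are included here.
import itertools

ALPHABET = {
    "A": ["WOL-LA-CHEE", "BE-LA-SANA", "TSE-NILL"],
    "C": ["MOASI", "TLA-GIN", "BA-GOSHI"],
    "D": ["BE", "CHINDI", "LHA-CHA-EH"],
    "G": ["AH-TAD", "KLIZZIE", "JEHA"],
    "L": ["DIBEH-YAZZIE", "AH-JAD", "NASH-DOIE-TSO"],
    "N": ["TSAH", "A-CHIN"],
    "U": ["SHI-DA", "NO-DA-IH"],
}

def spell(word):
    """Spell a word, rotating among the alternates for each letter so that
    repeated letters don't produce repeated code words."""
    cycles = {letter: itertools.cycle(words) for letter, words in ALPHABET.items()}
    return [next(cycles[letter]) for letter in word]

print(spell("GUADALCANAL"))
```

Spelled this way, the four A's of GUADALCANAL come out as three different code words, defeating the repeated-letter pattern that a single-valued alphabet would expose.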
Even with the initial code (prior to its expansion), tests showed that Navajo who were not
familiar with the code words couldn’t decipher the messages.43 An important feature of using a
natural language was increased speed. There was no lengthy process of looking up code groups
in order to recover the original messages. A significant savings in time (minutes instead of
hours) yielded a combat advantage to the American troops. The expanded code was even faster,
as there were fewer delays due to having to spell out words not in the Navajo language or the
code. Also, both the original and expanded code caused fewer errors than traditional codes.
It did cause some confusion though, for allies not in on the secret. When the Navajo first hit
the combat airwaves in Guadalcanal, some of the other American troops thought it was the
Japanese broadcasting.
With regard to the Navajo role at Iwo Jima (Figure 8.11), Major Conner had this to say:44
The entire operation was directed by Navajo code. Our corps command post was
on a battleship from which orders went to the three division command posts on
the beachhead, and on down to the lower echelons. I was signal officer of the Fifth
Division. During the first forty-eight hours, while we were landing and consolidat-
ing our shore positions, I had six Navajo radio nets operating around the clock. In
that period alone they sent and received over eight hundred messages without an
error.
The “without an error” portion of the quote above is not something that was taken for granted
in World War II-era coded transmissions. When Leo Marks began his cryptographic work for
Britain’s Special Operations Executive (SOE) in 1942, about 25% of incoming messages from
their agents couldn’t be read for one reason or another.
41 Paul, Doris A., The Navajo Code Talkers, Dorrance Publishing Co., Inc., Pittsburgh, Pennsylvania, 1973, p. 117.
42 This idea was due to Captain Stilwell, a cryptographer. See Paul, Doris A., The Navajo Code Talkers, Dorrance
Publishing Co., Inc., Pittsburgh, Pennsylvania, 1973, p. 38.
43 Paul, Doris A., The Navajo Code Talkers, Dorrance Publishing Co., Inc., Pittsburgh, Pennsylvania, 1973, p. 30.
44 Paul, Doris A., The Navajo Code Talkers, Dorrance Publishing Co., Inc., Pittsburgh, Pennsylvania, 1973, p. 73.
Table 8.1 Navajo Code Talkers’ Dictionary. (Revised 15 June 1945, and Declassified
under Department of Defense Directive 5200.9)
Letter Navajo Word Translation Letter Navajo Word Translation
A WOL-LA-CHEE Ant K KLIZZIE-YAZZIE Kid
A BE-LA-SANA Apple L DIBEH-YAZZIE Lamb
A TSE-NILL Axe L AH-JAD Leg
B NA-HASH-CHID Badger L NASH-DOIE-TSO Lion
B SHUSH Bear M TSIN-TLITI Match
B TOISH-JEH Barrel M BE-TAS-TNI Mirror
C MOASI Cat M NA-AS-TSO-SI Mouse
C TLA-GIN Coal N TSAH Needle
C BA-GOSHI Cow N A-CHIN Nose
D BE Deer O A-KHA Oil
D CHINDI Devil O TLO-CHIN Onion
D LHA-CHA-EH Dog O NE-AHS-JAH Owl
E AH-JAH Ear P CLA-GI-AIH Pant
E DZEH Elk P BI-SO-DIH Pig
E AH-NAH Eye P NE-ZHONI Pretty
F CHUO Fir Q CA-YEILTH Quiver
F TSA-E-DONIN-EE Fly R GAH Rabbit
F MA-E Fox R DAH-NES-TSA Ram
G AH-TAD Girl R AH-LOSZ Rice
G KLIZZIE Goat S DIBEH Sheep
G JEHA Gum S KLESH Snake
H TSE-GAH Hair T D-AH Tea
H CHA Hat T A-WOH Tooth
H LIN Horse T THAN-ZIE Turkey
I TKIN Ice U SHI-DA Uncle
I YEH-HES Itch U NO-DA-IH Ute
I A-CHI Intestine V A-KEH-DI-GLINI Victor
J TKELE-CHO-G Jackass W GLOE-IH Weasel
J AH-YA-TSINNE Jaw X AL-NA-AS-DZOH Cross
J YIL-DOI Jerk Y TSAH-AS-ZIH Yucca
K JAD-HO-LONI Kettle Z BESH-DO-TLIZ Zinc
K BA-AH-NE-DI-TININ Key
Table 8.1 (Continued) Navajo Code Talkers’ Dictionary. (Revised 15 June 1945, and Declassified under Department of Defense Directive 5200.9)
Countries Navajo Word Translation
[remaining table entries omitted]
Patrol plane GA-GIH Crow
1 I know. This was the category it was placed under in the code. Don’t blame me.
2 The Navajos were unable to think of an appropriate, easy-to-remember code word for Italy. Finally, one mentioned that he knew an Italian who stuttered....
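The alphabet portion of Table 8.1 is effectively a homophonic spelling cipher: most letters map to two or three interchangeable Navajo words, which flattens the letter frequencies a cryptanalyst would otherwise exploit. A minimal sketch in Python, using a small subset of the table (the random word choice is an illustration, not the talkers' documented procedure):

```python
import random

# A subset of the letter alphabet from Table 8.1. Most letters had two or
# three interchangeable Navajo words; choosing among them at random keeps
# word frequencies from mirroring English letter frequencies.
LETTER_WORDS = {
    "N": ["TSAH", "A-CHIN"],                         # Needle, Nose
    "A": ["WOL-LA-CHEE", "BE-LA-SANA", "TSE-NILL"],  # Ant, Apple, Axe
    "V": ["A-KEH-DI-GLINI"],                         # Victor
    "Y": ["TSAH-AS-ZIH"],                            # Yucca
}

def spell(word):
    """Spell a word letter by letter, picking a word for each letter at random."""
    return " ".join(random.choice(LETTER_WORDS[c]) for c in word.upper())

print(spell("NAVY"))
```

Running `spell("NAVY")` might yield, say, TSAH WOL-LA-CHEE A-KEH-DI-GLINI TSAH-AS-ZIH; a repeated letter in a message need not repeat the same word.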
Figure 8.11 The U.S. Marine Cemetery on Iwo Jima shows the price of victory, a price that
would’ve been even higher without the Navajo; Mount Suribachi, site of the famous flag raising,
is in the background (https://web.archive.org/web/20130306120012/http://history.navy.mil/
library/online/battleiwojima.htm).
The security surrounding the use of the Navajo as code talkers was very poor. Several accounts
appeared in the media before the war’s end. Without leaks like these, however, the program might
never have existed! Johnston explained how he came up with his idea:
[O]ne day, a newspaper story caught my eye. An armored division on practice maneu-
vers in Louisiana had tried out a unique idea for secret communication. Among the
enlisted personnel were several Indians from one tribe. Their language might possibly
offer a solution for the oldest problem in military operations – sending a message that
no enemy could possibly understand.45
William C. Meadows has tentatively identified this “newspaper story” as a piece from the November 1941 issue of The Masterkey for Indian Lore and History, from which the relevant paragraphs are reproduced below.46
The classic World War I trick of using Indians speaking their own languages as “code”
transmitters, is again being used in the Army, this time during the great maneuvers in
the South, says Science Service. Three units of the 32nd Division have small groups of
Indians from Wisconsin and Michigan tribes, who receive instructions in English, put
them on the air in a tongue intelligible only to their listening fellow-tribesmen, who in
turn retranslate the message into English at the receiving end.
The Indians themselves have had to overcome certain language difficulties, for
there are no words in their primitive languages for many of the necessary military
terms. In one of the groups, ingenious use was made of the fact that infantry, cavalry,
and artillery wear hat cords and other insignia of blue, yellow, and red, respectively.
The Indian word for “blue” thus comes to mean infantry, “yellow” means cavalry, and
“red” means artillery. The Indian term for “turtle” signifies a tank.
The article went on to state that 17 [Comanche] Indians had been trained.
Recall that Johnston preferred Navajo, in part, because it hadn’t been studied by the Germans.
Yet, despite the Germans’ study of the other dialects, the Comanche code talkers, used by the U.S.
Army for the D-day landing at Normandy and after, sent and received messages that the Germans
failed to crack.47 The Comanche were recruited to serve as code talkers about 16 months before
the Navajo, but have attracted much less attention, in large part because of their much smaller
numbers. Although 17 were trained, only 14 actually served in Europe.48 As the Navajo would do later, the Comanche created their own code words (nearly 250 of them), with the result that Comanches who were not code talkers couldn’t understand the messages. In contrast to the Navajo, no attempt was made to keep the Comanche code talkers secret, which is ironic considering how few people are aware of them today, compared to the Navajo!49
Despite poor security (see the items in the reference section that appeared during the war!), the
World War II code talkers were highly successful. Although it is difficult to measure the impact
of any single component in a war, due to the many other variable factors, at least one statistic does
support their impact being substantial. American pilots faced a 53% fatality rate prior to the intro-
duction of the Navajo code talkers, a number that dropped afterwards to less than 7%.50
45 Johnston, Philip, “Indian Jargon Won Our Battles,” The Masterkey for Indian Lore and History, Vol. 38, No. 4,
October–December, 1964, pp. 130–137, p. 131 quoted here.
46 Meadows, William C., The Comanche Code Talkers of World War II, University of Texas Press, Austin, Texas, 2002,
p. 75.
47 Meadows, William C., The Comanche Code Talkers of World War II, University of Texas Press, Austin, Texas, 2002,
p. xv.
48 Meadows, William C., The Comanche Code Talkers of World War II, University of Texas Press, Austin, Texas, 2002,
p. 80.
49 Meadows, William C., The Comanche Code Talkers of World War II, University of Texas Press, Austin, Texas, 2002,
pp. 108–109.
50 McClain, Sally, Navajo Weapon: The Navajo Code Talkers, Rio Nuevo Publishers, Tucson, Arizona, 2001, p. 118.
In the two world wars, the U.S. military used Native Americans from at least 19 different
tribes who spoke their natural languages with or without the addition of code words.51 Of these,
the Hopi also deserve to be singled out for mixing in code words with their natural language, just
as the Navajo, Comanche, and Choctaw did. It is believed that most of the tribes did not take this step. The Hopi, who numbered only 11, first hit the airwaves in the Marshall Islands, then New Caledonia and Leyte.52 As far as is known, the largest code-talking group after the Navajo was
only 19 strong, and it did not make use of code words. For most of the groups, very little is known,
but the presence of Navajo in such comparatively large numbers helped to ensure that their story
would be told. Despite honors being bestowed upon various code talkers following the official
declassification of the not-so-secret program in 1968,53 the veterans hadn’t always been treated
fairly. One of the code talkers complained to Philip Johnston in a letter dated June 6, 1946:
The situation out in the Navajoland is very bad and we as vets of World War II are
doing everything we can to aid our poor people. We went to Hell and back for what?
For the people back here in America to tell us we can’t vote!! Can’t do this! Can’t do
that!, because you don’t pay taxes and are not citizens!! We did not say we were not
citizens when we volunteered for service against the ruthless and treacherous enemies,
the Japs and Germans! Why?54
Indians in Arizona and New Mexico weren’t allowed to vote until 1948.
51 Meadows, William C., The Comanche Code Talkers of World War II, University of Texas Press, Austin, Texas,
2002, p. xv.
52 Meadows, William C., The Comanche Code Talkers of World War II, University of Texas Press, Austin, Texas,
2002, p. 68.
53 Navajo and other code talkers served in the Korean and Vietnam wars, so despite the many leaks, it was still
officially secret.
54 Paul, Doris A., The Navajo Code Talkers, Dorrance Publishing Co., Inc., Pittsburgh, Pennsylvania, 1973, p. 111.
55 The episode was titled “Anasazi.”
56 Paul, Doris A., The Navajo Code Talkers, Dorrance Publishing Co., Inc., Pittsburgh, Pennsylvania, 1973, p. 85.
57 McClain, Sally, Navajo Weapon: The Navajo Code Talkers, Rio Nuevo Publishers, Tucson, Arizona, 2001, p. 104.
appeared before the film was made and was not simply a creation of the script writer. It was presented
in Deanne Durrett’s Unsung Heroes of World War II: The Story of the Navajo Code Talkers.58
In one case of mistaken identity, non-Navajo Marines, who had advanced to a location previously
held by Japanese, were being bombarded by artillery from their fellow troops and when they tried to
call off the attack, it continued! The Japanese had so often imitated Americans on the airwaves that
these real Americans were thought to be fakes. Finally headquarters asked, “Do you have a Navajo?”
The Japanese couldn’t imitate the Navajo, and when one responded, the salvo ceased.59
Joe Kieyoomia, a Navajo who was not a code talker, was captured early in the war by the
Japanese. At first, despite his denials, they thought he was a Japanese-American. Eventually, when
they realized an Indian language was being used as a code, they came to believe he was in fact
Navajo, but they didn’t believe he couldn’t understand the coded messages. Joe, who had already
survived the Bataan Death March, now faced more torture, but there was nothing he could tell
them. In all, he spent 1,240 days as a POW before being freed after the end of the war.60
The action figure shown in Figure 8.12, with voice supplied by the real Navajo code talker
Sam Billison, was first released in 1999, and was described by a vendor as follows:61
58 Durrett, Deanne, Unsung Heroes of World War II: The Story of the Navajo Code Talkers, Facts On File, New
York, 1998, p. 77. Thanks to Katie Montgomery for introducing me to this source.
59 Paul, Doris A., The Navajo Code Talkers, Dorrance Publishing Co., Inc., Pittsburgh, Pennsylvania, 1973, p. 66.
60 McClain, Sally, Navajo Weapon: The Navajo Code Talkers, Rio Nuevo Publishers, Tucson, Arizona, 2001, pp.
119–121.
61 http://www.southernwestindian.com/prod/G-I-Joe-Navajo-Code-Talker.cfm. This link is now broken and was
62 Meadows, William C., The Comanche Code Talkers of World War II, University of Texas Press, Austin, Texas,
2002, p. 5.
63 Dawidoff, Nicholas, The Catcher Was a Spy: The Mysterious Life of Moe Berg, Pantheon Books, New York, 1994,
p. 34.
Clark’s biography of Friedman does not do him justice. He deserves better. The book should
have been written by someone who knew more of what went on after the Signal Intelligence
Service was formed. The early years of Friedman’s career were excellently characterized but the
biography was very weak during the period after 1930.64
A good deal of misinformation has been written about U.S. cryptologic work and, unfortu-
nately, succeeding writers such as Clark pick up that kind of incorrect material with the result
errors are perpetuated and are eventually accepted as facts.65
Clarke, Brigadier General Carter W., with introductory material from the editors of Cryptologia, “From the
Archives: Account of Gen. George C. Marshall’s Request of Gov. Thomas E. Dewey,” Cryptologia,
Vol. 7, No. 2, April 1983, pp. 119–128. The secrecy of the successful cryptanalysis of Purple wasn’t
maintained nearly as well as the Ultra secret. Dewey, a political opponent of Roosevelt, could have
used his awareness of this success to claim that Roosevelt should have anticipated the Pearl Harbor
attack; however, General Marshall was able to convince Dewey to sacrifice his most potent political
weapon to the greater good—keeping a secret that would continue to save lives and shorten the war.
Currier, Prescott, “My ‘Purple’ Trip to England in 1941,” Cryptologia, Vol. 20, No. 3, July 1996, pp.
193–201.
Deavours, Cipher and Louis Kruh, Machine Cryptography and Modern Cryptanalysis, Artech House, Inc.,
Dedham, Massachusetts, 1985.
Freeman, Wes, Geoff Sullivan, and Frode Weierud, “Purple Revealed: Simulation and Computer-Aided
Cryptanalysis of Angooki Taipu B,” Cryptologia, Vol. 27, No. 1, January 2003, pp. 1–43. In this paper,
the authors provide a level of detail sufficient for implementing Purple, as well as a modern attack.
Jacobsen, Philip H., “Radio Silence of the Pearl Harbor Strike Force Confirmed Again: The Saga of Secret
Message Serial (SMS) Numbers,” Cryptologia, Vol. 31, No. 3, July 2007, pp. 223–232.
Kahn, David, The Codebreakers, second edition, Scribner, 1996. The first chapter concerns Pearl Harbor,
and more information on the various World War II-era codes and ciphers used by the Japanese can be
found elsewhere in the book.
Kahn, David, “Pearl Harbor and the Inadequacy of Cryptanalysis,” Cryptologia, Vol. 15, No. 4, October
1991, pp. 273–294. Pages 293–294 are devoted to Genevieve Grotjan, whom Kahn had interviewed.
Kelley, Stephen J., Big Machines, Aegean Park Press, Laguna Hills, California, 2001. This book focuses on
Enigma, Purple (and its predecessors), and SIGABA, the top American machine of World War II, one
that was never broken (See Chapter 9 of the present book).
64 Kruh, Louis, “Reminiscences of a Master Cryptologist,” Cryptologia, Vol. 4, No. 1, January 1980, pp. 45–50.
65 Kruh, Louis, “Reminiscences of a Master Cryptologist,” Cryptologia, Vol. 4, No. 1, January 1980, pp. 45–50.
Kruh, Louis, “The Deadly Double Advertisements – Pearl Harbor Warning or Coincidence?” Cryptologia, Vol. 3, No. 3, July 1979, pp. 166–171.
Kruh, Louis, “Reminiscences of a Master Cryptologist,” Cryptologia, Vol. 4, No. 1, January 1980, pp.
45–50. The following are some quotes from Frank Rowlett, from this article:
The successful cryptanalysis of the Japanese Purple system was accomplished by a team of
Army cryptanalysts. At first, it was a joint Army-Navy project, but after a few months the Navy
withdrew its cryptanalytic resources to apply them to the Japanese naval systems. The Navy
did however, continue to provide some intercept coverage of diplomatic traffic.
The Chief Signal Officer, General Joseph O. Mauborgne, was personally interested in the
Purple effort and supported our work to the fullest degree possible. He liked to refer to us as
his magicians and called the translations of the messages we produced by the name “magic”.
Friedman played a signal role in the selection and assignment of personnel and participated in
the analytical work on a part-time basis.
Lewin, Ronald, The American Magic: Codes, Ciphers and the Defeat of Japan, Farrar Straus Giroux, New
York, 1982.
Parker, Frederick D., “The Unsolved Messages of Pearl Harbor,” Cryptologia, Vol. 15, No. 4, October 1991,
pp. 295–313.
Rowlett, Frank B., The Story of Magic: Memoirs of an American Cryptologic Pioneer, Aegean Park Press,
Laguna Hills, California, 1989. Not only was Rowlett there, but he also writes well! This book does
the best job of capturing the atmosphere of World War II-era codebreaking in America.
Smith, Michael, The Emperor’s Codes: The Breaking of Japan’s Secret Ciphers, Penguin Books, New York,
2002. Smith describes the British successes against Japanese codes and ciphers, pointing out that they
were the first to crack a Japanese diplomatic cipher machine (Orange, see pp. 34–35) and JN-25 (see
pp. 5, 59–60).
Stamp, Mark and Richard M. Low, Applied Cryptanalysis: Breaking Ciphers in the Real World, John Wiley
& Sons, Hoboken, New Jersey, 2007.
Tucker, Dundas P., edited and annotated by Greg Mellen, “Rhapsody in Purple: A New History of Pearl
Harbor – Part I,” Cryptologia, Vol. 6, No. 3, July 1982, pp. 193–228.
Weierud, Frode, The PURPLE Machine 97-shiki-obun In-ji-ki Angooki Taipu B, http://cryptocellar.org/
simula/purple/index.html. An online Purple simulator can be found at this website.
On Code Talkers
Aaseng, Nathan, Navajo Code Talkers, Thomas Allen & Son, Markham, Ontario, Canada, 1992. This is a
young adult book.
Anon., “Comanches Again Called for Army Code Service,” New York Times, December 13, 1940, p. 16.
Anon., “DOD Hails Indian Code Talkers,” Sea Services Weekly, November 27, 1992, pp. 9–10.
Anon., “Pentagon Honors Navajos, Code Nobody Could Break,” Arizona Republic, September 18, 1992, p. A9.
Anon., “Played Joke on the Huns,” The American Indian Magazine, Vol. 7, No. 2, 1919, p. 101. This article revealed
the role the Sioux played in World War I with their native language. It is quoted in Meadows, William C.,
The Comanche Code Talkers of World War II, University of Texas Press, Austin, Texas, 2002, p. 30.
Bianchi, Chuck, The Code Talkers, Pinnacle Books, New York, 1990. This is a novel.
Bixler, Margaret, Winds of Freedom: The Story of the Navajo Code Talkers of World War II, Two Bytes
Publishing Company, Darien, Connecticut, June 1992.
Bruchac, Joseph, Codetalker: A Novel About the Navajo Marines of World War Two, Dial Books, New York,
2005.
Davis, Jr., Goode, “Proud Tradition of the Marines’ Navajo Code Talkers: They Fought With Words–Words
No Japanese Could Fathom,” Marine Corps League, Vol. 46, No. 1, Spring 1990, pp. 16–26.
Donovan, Bill, “Navajo Code Talkers Made History Without Knowing It,” Arizona Republic, August 14,
1992, p. B6.
Durrett, Deanne, Unsung Heroes of World War II: The Story of the Navajo Code Talkers, Facts On File, New York, 1998.
Gyi, Maung, “The Unbreakable Language Code in the Pacific Theatre of World War II,” ETC: A Review of
General Semantics, Vol. 39, No. 1, Spring 1982, pp. 8–15.
Hafford, William E., “The Navajo Code Talkers,” Arizona Highways, Vol. 65, No. 2, February 1989, pp. 36–45.
Huffman, Stephen, “The Navajo Code Talkers: A Cryptologic and Linguistic Perspective,” Cryptologia, Vol.
24, No. 4, October 2000, pp. 289–320.
Johnston, Philip, “Indian Jargon Won Our Battles,” The Masterkey for Indian Lore and History, Vol. 38, No.
4, October–December 1964, pp. 130–137.
Kahn, David, “From the Archives: Codetalkers Not Wanted,” Cryptologia, Vol. 29, No. 1, January 2005,
pp. 76–87.
Kawano, Kenji, Warriors: Navajo Code Talkers, Northland Pub. Co., Flagstaff, Arizona, 1990.
King, Jodi A., “DOD Dedicates Code Talkers Display,” Pentagram, September 24, 1992, p. A3.
Langille, Vernon, “Indian War Call,” Leatherneck, Vol. 31, No. 3, March 1948, pp. 37–40.
Levine, Captain Lincoln A., “Amazing Code Machine That Sent Messages Safely to U.S. Army in War
Baffles Experts: War Tricks That Puzzled Germans,” New York American, November 13, 1921.
America’s use of Choctaw code talkers in World War I was described in this article, which is quoted
in Meadows, William C., The Comanche Code Talkers of World War II, University of Texas Press,
Austin, Texas, 2002, pp. 23–24.
Marder, Murrey, “Navajo Code Talkers,” Marine Corps Gazette, September 1945, pp. 10–11.
McClain, Sally, Navajo Weapon, Books Beyond Borders, Inc., Boulder, Colorado, 1994.
McCoy, Ron, “Navajo Code Talkers of World War II: Indian Marines Befuddled the Enemy,” American
West, Vol. 18, No. 6, November/December 1981, pp. 67–73, 75.
Meadows, William C., The Comanche Code Talkers of World War II, University of Texas Press, Austin,
Texas, 2002. This thorough and scholarly account also contains very useful appendices with data
concerning all of the Native American tribes, identified thus far, that served as code talkers in World
War I or World War II.
Paul, Doris A., The Navajo Code Talkers, Dorrance Publishing Co., Inc., Pittsburgh, Pennsylvania, 1973.
Price, Willson H., “I Was a Top-Secret Human Being During World War 2,” National Enquirer, February
4, 1973. In general, National Enquirer is not a reliable source!
Shepherdson, Nancy, “America’s Secret Weapon,” Boy’s Life, November 1997, p. 45.
Stewart, James, “The Navajo at War,” Arizona Highways, June 1943, pp. 22–23. This article (published
while the war was still on!) included the following passage:
[T]he U.S. Marine Corps has organized a special Navajo signal unit for combat communica-
tions service… Its members were trained in signal work using the Navajo language as a code,
adapting a scheme tried with considerable success during World War I.
Whether or not the Japanese saw this before the war ended is unknown. In any case, they did catch on to
the fact that some of the conversations they couldn’t understand were being carried out in Navajo.
Thomas, Jr., Robert McG, “Carl Gorman, Code Talker in World War II, Dies at 90,” The New York Times,
February 1, 1998, p. 27.
United States Congress, “Codetalkers Recognition: Not Just the Navajos,” Cryptologia, Vol. 26, No. 4,
October 2002, pp. 241–256. This article provides the text of the Code Talkers Recognition Act.
U.S. Marine Corps, Navajo Dictionary, June 15, 1945.
Watson, Bruce, “Navajo Code Talkers: A Few Good Men,” Smithsonian, Vol. 24, No. 5, August 1993,
pp. 34–40, 42–43.
Wilson, William, “Code Talkers,” American History, February 1997, pp. 16–20, 66–67.
Bibliography
A bibliography that includes unpublished (archival) sources can be found at https://web.archive.org/
web/20130306115918/http://www.history.navy.mil/faqs/faq12-1.htm.
Videography
Chibitty, Charles, Bob Craig, and Brad Agnew, American Indian Code Talkers [VHS], Center for Tribal
Studies, College of Social and Behavioral Sciences, Northeastern Oklahoma State University,
Tahlequah, Oklahoma, 1998.
Chibitty, Charles, Dwayne Noble, Eric Noble, and Jeff Eskew, Recollections of Charles Chibitty—The Last
Comanche Code Talker [VHS], Hidden Path Productions, Mannford, Oklahoma, 42 minutes, 2000.
Hayer, Brandi, Hinz Cory, and Matt Wandzel, Dine College, and Winona State University, Samuel Tso:
Code Talker, 5th Marine Division [DVD], Dine College, Tsaile, Arizona, and Winona State University,
Winona, Minnesota, 2009.
Meadows, William C., Comanche Code Talkers of World War II [VHS], August 4, 1995. This is a videotaped
interview with Comanche code talkers Roderick Red Elk and Charles Chibitty. Meadows was both
the host and producer. A copy is available in the Western History Collections of the University of
Oklahoma.
NAPBC, In Search of History: The Navajo Code Talkers [VHS], History Channel and Native American
Public Broadcasting Consortium, Lincoln, Nebraska, 50 minutes, 2006 (originally broadcast in
1998).
Red-Horse, Valerie, Director, True Whispers, The Story of the Navajo Code Talkers [DVD], PBS Home Video,
∼60 minutes, 2007 (originally broadcast 2002).
Sam, David, Patty Talahongva, and Craig Baumann, The Power of Words: Native Languages as Weapons of
War [DVD], National Museum of the American Indian, Smithsonian Institution, Washington, DC,
2006.
Tully, Brendan W., Director, Navajo Code Talkers: The Epic Story [VHS], Tully Entertainment, 55 minutes,
1994.
Wright, Mike, Code Talkers Decoration Ceremony, Oklahoma State Capitol, November 3, 1989 [VHS], Oral
History Collections, Oklahoma Historical Society, Oklahoma City, Oklahoma, 1989.
Chapter 9
SIGABA:
World War II Defense
It seems that most writers are concerned with the breaking of other nations’ ciphers.
Isn’t it more important and even more of a feat to make your own systems secure
against foreign cryptanalysts?
—Frank Rowlett1
Figure 9.1 Frank Rowlett. (Courtesy of the National Cryptologic Museum, Fort Meade,
Maryland.)
1 Quoted in Kruh, Louis, “Reminiscences of a Master Cryptologist,” Cryptologia, Vol. 4, No. 1, January 1980,
pp. 45–50, p. 49 cited here.
creating the paper tape keys (Figure 9.2) that needed to be fed through Friedman’s device.2 The
key on the tape would control which rotor(s) turned at each step, thus avoiding the regularity of
the turning in other machines, such as Enigma. Fortunately, creating the key tape was a horrible
job, made even worse for Rowlett by the fact that Friedman told him to spend half of his time at
it, while the other half was to be devoted to his continued training, which he much preferred.3
Figure 9.2 M-134 paper key tape. (Courtesy of the National Cryptologic Museum, Fort Meade,
Maryland.)
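The stepping idea described above, in which the key tape rather than a fixed odometer pattern decides which rotors advance, can be sketched as follows (the five-rotor count and the sample tape rows are illustrative assumptions, not the M-134's actual parameters):

```python
# Sketch of tape-controlled rotor stepping: each row of the key tape
# flags which rotors advance before the next letter is enciphered,
# so the motion has none of Enigma's odometer-like regularity.
NUM_ROTORS = 5  # illustrative, not the machine's actual configuration

def step_by_tape(positions, tape_row):
    """Advance (mod 26) exactly the rotors flagged in this tape row."""
    return [(p + 1) % 26 if move else p
            for p, move in zip(positions, tape_row)]

positions = [0] * NUM_ROTORS
tape = [
    (1, 0, 0, 1, 0),  # rotors 1 and 4 step
    (0, 1, 1, 0, 0),  # rotors 2 and 3 step
    (1, 1, 0, 0, 1),  # rotors 1, 2, and 5 step
]
for row in tape:
    positions = step_by_tape(positions, row)
print(positions)  # [2, 2, 1, 1, 1]
```

Both ends need identical tapes kept in synchronization, which hints at the production and distribution burden Rowlett faced.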
Friedman showed Rowlett how to operate the equipment to make the tape and observed him
making a test run. He then suggested Rowlett make several more test runs and left the room. Rowlett related what came next:
After he departed, I continued as he had proposed. I decided that I would dupli-
cate the test run I had made under his supervision to see if the keys prepared on two
separate runs were identical as they should be. When I finished the second run and
compared the two keys, I found several points of discrepancy. I decided that I would
make another attempt to duplicate the first run. When it was finished and I compared
it with the two previous runs, I found all three to be different. And when I tried
two more duplicate runs, I found that I got different results for each. At this point I
decided that I had better consult with Friedman.
When I showed Friedman the results I had obtained, he came with me to the
equipment room. When he tried to produce a duplicate of the test run I had made,
he also obtained different results. We spent until lunchtime trying to get satisfactory
results, but with only moderate success.
My first day’s experience with the equipment was only a preview of the succeeding
days. The equipment operated erratically, and frequently I had to dismantle a piece in
order to locate the trouble. After pursuing this course for some time, I was finally able
to make several runs with identical results. By the end of the first month I had com-
pleted only a small portion of the compilation task that I had been assigned.
2 Mucklow, Timothy J., SIGABA/ECM II: A Beautiful Idea, Center for Cryptologic History, National Security
Agency, Fort George G. Meade, Maryland, 2015, p. 7.
3 Rowlett, Frank B., The Story of Magic: Memoirs of an American Cryptologic Pioneer, Aegean Park Press, Laguna
Hills, California, 1999, p. 92.
1. What if Friedman had tasked someone with a less powerful intellect than Rowlett to cre-
ate the paper tape key? Would the new and improved tapeless cipher machine have been
invented in time for World War II? In this instance at least, it seems that assigning someone
a tedious task for which he was overqualified paid off!
2. What if Rowlett had not persisted with his idea? If Friedman’s machine went into service
using the paper tape keys, how many hours of labor would have to be devoted to tape cre-
ation? Who would carry out this work? What work would he or she be diverted from to do
so? What impact would this have on World War II? Also, would the paper tape be distrib-
uted successfully and function properly in all cases? If not, which messages would fail to go
through and what would the effect of this be?
4 Rowlett, Frank B., The Story of Magic: Memoirs of an American Cryptologic Pioneer, Aegean Park Press, Laguna
Hills, California, 1999, pp. 92–93.
5 Rowlett, Frank B., The Story of Magic: Memoirs of an American Cryptologic Pioneer, Aegean Park Press, Laguna
Hills, California, 1999, p. 94.
6 Mucklow, Timothy J., SIGABA/ECM II: A Beautiful Idea, Center for Cryptologic History, National Security
Agency, Fort George G. Meade, Maryland, 2015, p. 9, which cites Frank B. Rowlett, Oral History Interview
1974, OH-1974-01, Part B, 45c, Center for Cryptologic History, National Security Agency, Fort George G.
Meade, Maryland. Also see Rowlett, Frank B., The Story of Magic: Memoirs of an American Cryptologic Pioneer,
Aegean Park Press, Laguna Hills, California, 1999, p. 96.
7 Mucklow, Timothy J., SIGABA/ECM II: A Beautiful Idea, Center for Cryptologic History, National Security
Agency, Fort George G. Meade, Maryland, 2015, p. 15.
The new and improved cipher machine was dubbed SIGABA by the Army, while the Navy called
it ECM (Electric Cipher Machine) II or CSP-888/889. A modified Navy version was known as the
CSP-2900. The machines were first sent into the field in June 1941 and, before being replaced by another device years after World War II, a closely accounted-for 10,060 machines saw use.8 These
SIGABAs enciphered the most important messages, while lesser secrets were run through weaker
machines such as the M-209. The Germans were often able to take advantage of operational mis-
takes by M-209 operators, such as sending messages in depth (encrypted with the same key), and
recover the messages. However, this process typically took seven to ten days, by which time the
information might well be worthless. Between such cryptanalysis and captured keys, about 10%
of the M-209 traffic was compromised.9 By contrast, SIGABA was never cracked.
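Why depth is so damaging can be seen with a toy model that treats the M-209 abstractly as a cipher adding a keystream to the plaintext mod 26 (a deliberate simplification of the real lug-and-pin machine):

```python
# Two messages "in depth" share one keystream. Subtracting the ciphertexts
# cancels the key entirely, leaving the difference of the two plaintexts,
# which can be attacked with language statistics alone.
def encrypt(plain, key):
    return [(p + k) % 26 for p, k in zip(plain, key)]

def diff(a, b):
    return [(x - y) % 26 for x, y in zip(a, b)]

key = [7, 19, 3, 22, 11, 0, 14]   # one keystream, mistakenly reused
p1  = [0, 19, 19, 0, 2, 10, 18]   # A T T A C K S
p2  = [17, 4, 19, 17, 4, 0, 19]   # R E T R E A T
c1, c2 = encrypt(p1, key), encrypt(p2, key)

# The keystream has vanished from the equation:
assert diff(c1, c2) == diff(p1, p2)
```

In practice the cryptanalyst guesses a probable word in one message, derives the corresponding stretch of the other plaintext from the difference, and extends both; the seven-to-ten-day timeline mentioned above reflects this kind of work.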
m209/index.htm.
10 Mucklow, Timothy J., SIGABA/ECM II: A Beautiful Idea, Center for Cryptologic History, National Security
Agency, Fort George G. Meade, Maryland, 2015, p. 20, which cites Ratcliff, Rebecca Ann, Delusions of
Intelligence: Enigma, Ultra, and the End of Secure Ciphers, Cambridge University Press, New York, 2006, p. 81
and Safford, Captain Laurance, History of Invention and Development of the Mark II ECM, SRH-360, United
States Navy OP-20-S-5, Office of the Chief of Naval Operations, Washington, DC, October 30, 1943, p. 52.
This history is available at NARA (National Archives and Records Administration) RG 457, Box 1124, College
Park, Maryland. Note: SRH stands for Special Research Histories.
Figure 9.3 A SIGABA rotor prior to being wired. (Courtesy of the National Cryptologic
Museum, Fort Meade, Maryland.)
Figure 9.4 A pair of women at work on SIGABA rotors. (Courtesy of the National Cryptologic
Museum, Fort Meade, Maryland.)
Figure 9.5 A completely wired SIGABA rotor. (Courtesy of the National Cryptologic Museum,
Fort Meade, Maryland.)
A completely wired SIGABA rotor is shown in Figure 9.5, along with a nickel to give a sense
of scale. Looking at the wires makes me think of knitting. It has been speculated that one of the
reasons the women outperformed the men at the task of rotor wiring is that they tended to have
greater prior experience with activities such as knitting, crocheting, sewing, embroidery, cross-
stitching, etc., and that these skills transferred over. The women’s typically smaller hands may also
have been advantageous to carrying out such precise small-scale work. It has also been suggested
that the women exhibited greater patience, becoming frustrated less quickly than the men.
Prior to America entering World War II and the women’s involvement, Rear Admiral Leigh
Noyes, Director of Naval Communications, foresaw the contributions they could make. He sent
a letter to Ada Comstock, the President of Radcliffe, Harvard University’s women’s college, ask-
ing that she raise the possibility of extra-curricular training of some seniors in naval cryptanalytic
work. He wrote, “In the event of total war, … women will be needed for this work, and they can
do it probably better than men.”15 In the instance of constructing rotors, at least, he was right! And
the women contributed in many other ways. Their work is detailed in Liza Mundy’s excellent book
Code Girls.16 Following the appearance of this book, Mundy accepted the opportunity to serve as
the 11th Scholar-in-Residence in the National Security Agency’s Center for Cryptologic History.
This guarantees that there will be a sequel to Code Girls, for that is part of Mundy’s contractual
obligation in the SiR role. I am looking forward to it!
14 Mucklow, Timothy J., SIGABA/ECM II: A Beautiful Idea, Center for Cryptologic History, National Security
Agency, Fort George G. Meade, Maryland, 2015, p. 20.
15 Bauer, Craig, “The Cryptologic Contributions of Dr. Donald Menzel,” Cryptologia, Vol. 30, No. 4, 2006,
pp. 306–339, p. 306 cited here. The original document is: Leigh Noyes to Ada Comstock, letter, September
25, 1941, Papers of President Comstock, Radcliffe Archives, Radcliffe Institute for Advanced Study, Harvard
University.
16 Mundy, Liza, Code Girls: the untold story of the American women code breakers of World War II, Hachette Books,
Figure 9.6 The 15-rotor heart of SIGABA. (Courtesy of the National Cryptologic Museum, Fort
Meade, Maryland.)
While the Enigma machines used by the Nazis had 3, and later 4, rotors in use at a time,
SIGABA employed 15, placed in three banks of 5 rotors each. They can be seen in Figure 9.6. The
five smaller rotors on the left-hand side are called index rotors. The five rotors in the middle are
called stepping control rotors, or simply control rotors. The right-most rotors are called the alphabet
rotors or cipher rotors. It is with this last batch of rotors that the explanation of SIGABA’s functioning begins in the next section.
rotor, and the enciphered letter comes out on the right-hand side. In contrast to Enigma, there is
no plugboard and no reflector. The letter to be enciphered makes only one trip through the cipher
rotors. When the enciphered message is received by the intended recipient, he sets the rotors in the
same position, but runs the ciphertext letter through them in the opposite direction, starting at
the right-hand side and receiving the plaintext letter out on the left-hand side. The lack of a reflec-
tor requires the user to carefully select the correct direction, depending on whether the message is
being enciphered or deciphered. This can be done with the turn of a switch on the machine.
The cipher rotors are where the action happens, but the other two banks of rotors are what
make SIGABA secure. These are the rotors responsible for making the cipher rotors turn in a very
irregular manner. Remember, the predictable way in which the rotors of Enigma turned was one
of its major weaknesses. To see how the irregularity is introduced, we examine the last two banks
of rotors one at a time, starting with the control rotors (Figure 9.8).
While the cipher rotors only have current passing through one wire at a time, determined by
which letter is being enciphered, there are four live wires for the control rotors at each step. These are
indicated with arrows at the top left of Figure 9.8. Note that it is always these specific wires that are
live, regardless of what letter is being enciphered. After passing through all five control rotors, some
of the live wires may meet up on the right-hand side. As Figure 9.8 shows, many of the output wires,
following the fifth control rotor, are bundled. In the specific instance shown, the four live wires
entering this rotor bank exited in just three live wires, indicated by arrows on the right. Depending
on the positions of the five control rotors, their wirings can take the four live wires to anywhere from
one to four wires at the end. The positions of three of these rotors can change, as the user enters the
plaintext message on the keyboard, to achieve these varied results. This is indicated in Figure 9.9.
In Figure 9.9, the rotor labeled “Fast” advances one position with every letter of the message.
The Medium rotor advances one position for every full rotation of the Fast rotor and the Slow rotor
advances one position for every full rotation of the Medium rotor. The total period for these rotors is
thus 26^3 = 17,576. SIGABA does not have the double-stepping phenomenon seen in Enigma (see the end
of section 7.2). The two rotors on the extreme ends do not turn. These control rotors do not do any
enciphering. Their only purpose is to generate some (pseudo)randomness to determine which cipher
rotors will turn, but before this decision is made, some help is provided by the index rotors shown in
Figure 9.10.
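For readers who like to see such things concretely, the stepping of the three moving control rotors amounts to a base-26 odometer. The short Python sketch below (variable names and starting positions are mine, not official SIGABA terminology) confirms that the pattern repeats only after 26^3 = 17,576 letters:

```python
# Sketch of SIGABA's control-rotor stepping (odometer style, with no
# double stepping). Names and start positions are illustrative only.
def step(fast, medium, slow):
    """Advance the three moving control rotors by one letter."""
    fast = (fast + 1) % 26
    if fast == 0:                  # fast rotor completed a full turn
        medium = (medium + 1) % 26
        if medium == 0:            # medium rotor completed a full turn
            slow = (slow + 1) % 26
    return fast, medium, slow

fast = medium = slow = 0
count = 0
while True:
    fast, medium, slow = step(fast, medium, slow)
    count += 1
    if (fast, medium, slow) == (0, 0, 0):   # back to the start
        break
print(count)  # 17576
```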
The live wires from the output of the control rotors snake around and enter the index rotors.
These rotors are smaller (ten contacts each) and stationary. That is, they never turn. Depending
on how many wires coming out of the control rotors are live, there could be input to anywhere
SIGABA: World War II Defense ◾ 299
from 1 to 4 of the contacts on the left-hand side of the first index rotor. If more than one wire is
live, there is a possibility for a smaller number of live wires coming out of the bundling that occurs
after all five index rotors have been traversed. In general, the result will be somewhere between
one and four live wires at the very end, although it will never be more than the number of live
wires entering the index rotor bank. That is, the number of live wires may decrease, but can never
increase. Also, there will always be at least one live wire at the end. Figure 9.11 pieces all of the
above together and shows how the cipher rotors are made to turn.
Figure 9.11 shows four live wires entering the bank of control rotors (in the middle of the
diagram). Three live wires exit this rotor bank and snake down to enter the index rotors. Two live
wires exit the index rotor bank and pass their current on to a pair of cipher rotors, C1 and C3, mak-
ing them turn one position each. Every time a letter is typed on the SIGABA keyboard, a control
rotor turns and the current starting at the control rotors follows a different path through those
rotors and the index rotors, leading to a selection of anywhere from one to four cipher rotors to be
turned. Thus, the cipher rotors turn in a manner that is very difficult to predict!
The paper tape that so frustrated Rowlett is not needed. However, after each letter’s cipher
equivalent is determined, it is printed on a narrow paper tape. This is yet another difference between
SIGABA and Enigma. For Enigma, a bulb would be illuminated to indicate the enciphered letter.
By automatically printing the letter, SIGABA allowed encryption to be carried out more rapidly.
SIGABA could actually encipher at a rate of 60 words per minute, if the operator could type that
fast!17 However, the SIGABA user’s manual, Crypto-Operating Instructions for Converter M-134-C,
17 Mucklow, Timothy J., SIGABA/ECM II: A Beautiful Idea, Center for Cryptologic History, National Security
Agency, Fort George G. Meade, Maryland, 2015, p. 16.
300 ◾ Secret History
warned that the device “should be operated at a maximum speed of 45 to 50 words per minute…
if this speed is exceeded, characters may fail to print.”18
For many cipher machines of this era, the intended recipient would have to figure out where
to insert word breaks to make the recovered message readable. In contrast to such machines,
SIGABA was implemented in a way that conveyed the spacing along with the message. Because
there are only 26 wires in each cipher rotor, it might seem that this is impossible, but a 27th symbol really isn’t needed. The space-preserving method worked as follows:
• Prior to enciphering, SIGABA converts every Z to an X. Real Xs are left unchanged. Basically,
Zs and Xs are combined in a single character denoted by X.
• Spaces are converted to (the now available!) Zs.
18 Mucklow, Timothy J., SIGABA/ECM II: A Beautiful Idea, Center for Cryptologic History, National Security
Agency, Fort George G. Meade, Maryland, 2015, p. 33.
Example19
ZERO ONE TWO THREE FOUR FIVE SIX
is converted by SIGABA to
XEROZONEZTWOZTHREEZFOURZFIVEZSIX
which is then enciphered as
IEQDEMOKGJEYGOKWBXAIPKRHWARZODWG
It is left to the decipherer to recognize that the first X should be taken as a Z, while the last X is an
X. Because Z is the rarest letter in the English alphabet, there won’t be many Xs that need to be
changed to Zs. In any case, context should allow them to be recognized easily.
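The conversion can be sketched in a few lines (the cipher step itself is omitted here, and the function names are mine, not SIGABA terminology):

```python
# Sketch of SIGABA's space-preserving preprocessing: Z -> X, then
# space -> Z. The encipherment of the result is not shown.
def preprocess(plaintext):
    return plaintext.upper().replace("Z", "X").replace(" ", "Z")

def postprocess(recovered):
    # Spaces come back exactly; each remaining X means "X or Z" and
    # must be resolved from context by the reader.
    return recovered.replace("Z", " ")

msg = "ZERO ONE TWO THREE FOUR FIVE SIX"
encoded = preprocess(msg)
print(encoded)                # XEROZONEZTWOZTHREEZFOURZFIVEZSIX
print(postprocess(encoded))   # XERO ONE TWO THREE FOUR FIVE SIX
```

Note that the round trip restores every space but leaves the leading X for the reader to recognize as a Z, just as in the book’s example.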
Figure 9.12 Part of a SIGABA factory. (Courtesy of the National Cryptologic Museum, Fort
Meade, Maryland.)
19 Taken from Stamp, Mark and Wing On Chan, “SIGABA: Cryptanalysis of the Full Keyspace,” Cryptologia,
Vol. 31, No. 3, July 2007, pp. 201–222.
20 See Mucklow, Timothy J., SIGABA/ECM II: A Beautiful Idea, Center for Cryptologic History, National
Security Agency, Fort George G. Meade, Maryland, 2015, p. 39, which cites Safford, Captain Laurance,
History of Invention and Development of the Mark II ECM, SRH-360, United States Navy OP-20-S-5, Office of
the Chief of Naval Operations, Washington, DC, October 30, 1943, p. 61. This history is available at NARA
(National Archives and Records Administration) RG 457, Box 1124, College Park, Maryland. Note: SRH
stands for Special Research Histories.
21 Mucklow, Timothy J., SIGABA/ECM II: A Beautiful Idea, Center for Cryptologic History, National Security
If the enemy has no idea how any of the SIGABA rotors are wired, there are 26! choices for
each of the 10 large rotors and 10! choices for each of the smaller index rotors. Thus, there would
seem to be about (26!)^10 (10!)^5 ≈ 7.2 × 10^298 total possibilities. For modern encryption algorithms,
the keysize is usually stated in bits. Because 7.2 × 10^298 ≈ 2^992.8, we see that the keyspace is about
993 bits. I wrote “about” because it is easy to nitpick this number by pointing out that it is unlikely
for a system to use two rotors that have the exact same wiring or a rotor whose wiring is the iden-
tity (although the internal wiring from the military Enigma’s plugboard to its rotor assembly was
the identity!). A more serious nitpick arises from the fact that the index rotors are stationary. Thus,
a collection of 5 of them is equivalent to some differently wired single rotor. Hence, one should
replace the factor (10!)^5 in the calculation above with 10!.22 This reduces the keyspace to ≈ 4.13 ×
10^272 ≈ 2^905.6.
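Both keyspace figures are easy to check with a few lines of Python:

```python
import math

# Keyspace if the enemy knows nothing about the rotor wirings:
# 26! wirings for each of the 10 large rotors, 10! for each of the
# 5 small index rotors.
naive = math.factorial(26)**10 * math.factorial(10)**5
print(f"{naive:.1e}")        # about 7.2e298
print(math.log2(naive))      # about 992.8 bits

# The index rotors never turn, so the bank of 5 acts like a single
# fixed 10-contact permutation: replace (10!)^5 by 10!.
reduced = math.factorial(26)**10 * math.factorial(10)
print(math.log2(reduced))    # about 905.6 bits
```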
If an enemy was able to learn the wiring of all of SIGABA’s rotors through a spy, a double
agent, blackmail, surreptitious entry, etc., then there are far fewer possible keys to consider, but
the number is still immense. To calculate it, we first note that the 10 large rotors can be placed in
the machine in 10! different orders. However, each could be placed in a given position right-side-
up or upside-down! These orientations were referred to as “forward” or “reverse.” Thus, we have
another factor of 2^10. Once each rotor is placed in the machine, in whatever orientation, there are
26 choices as to how far along in its rotation it is started. This gives another factor of 26^10 when all
10 large rotors are considered. Altogether then, the large rotors may be set in (10!)(2^10)(26^10) ways.
Similarly, the 5 small index rotors may be ordered in 5! ways, inserted in normal or reverse position23 (a factor of 2^5), and set to any of 10 initial positions each (a factor of 10^5). The grand total is
(10!)(2^10)(26^10)(5!)(2^5)(10^5) ≈ 2.0 × 10^32. This is approximately 2^107.3, so SIGABA could be said to
have about a 107-bit key, if the wirings of all of the rotors are known.
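The arithmetic of this paragraph can also be verified directly:

```python
import math

# Number of machine settings when all rotor wirings are known:
# order, orientation, and initial position of each rotor.
large = math.factorial(10) * 2**10 * 26**10   # 10 big rotors
index = math.factorial(5) * 2**5 * 10**5      # 5 index rotors
total = large * index
print(f"{total:.1e}")      # about 2.0e32
print(math.log2(total))    # about 107.3 bits
```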
The above calculation shows the number of potential keys, given only the limitations imposed
by the wirings of the available rotors. However, there were other limitations imposed by the pro-
cedures that dictated how the machine was actually used. For example, the SIGABA manual
instructs for the settings of the control rotors to be sent in plaintext (!) as a message indicator,
along with the enciphered message. This obviously reduces the keyspace for an enemy who knows
what a message indicator means. On the other hand, communications between Roosevelt and
Churchill were not carried out in this manner. For all users, the bundling of outputs from the
index rotors has the effect that different orderings of the index rotors can produce identical results,
reducing the effective keyspace.24
Mark Stamp and Wing On Chan calculated the keyspace available for Roosevelt and Churchill
to be about 2^95.6, and that achieved by following the manual as to message indicators as about 2^48.4.
They pointed out that while this smaller keyspace could be brute-forced at the time of their writing
(2007), it “would have been unassailable using 1940s technology, provided no shortcut attack was
available.” They attacked the more impressive keyspace of 2^95.6, assuming 100 characters of known
plaintext, and found that they could achieve success 82% of the time with a total workload of only
2^84.5. While they conceded that this was “far from practical,” anything better than brute-force is
22 This was pointed out in Stamp, Mark and Wing On Chan, “SIGABA: Cryptanalysis of the Full Keyspace,”
Cryptologia, Vol. 31, No. 3, July 2007, pp. 201–222.
23 Although the reverse position could be utilized for the index rotors, it never actually was during World War
II (see Mucklow, Timothy J., SIGABA/ECM II: A Beautiful Idea, Center for Cryptologic History, National
Security Agency, Fort George G. Meade, Maryland, 2015, p. 29), so it would be reasonable to eliminate the
factor of 2^5 in calculating the keyspace.
24 Stamp, Mark and Wing On Chan, “SIGABA: Cryptanalysis of the Full Keyspace,” Cryptologia, Vol. 31, No. 3, July 2007, pp. 201–222.
considered an attack, as it shows the cipher to have less than its apparent strength. They also com-
mented that “it is certainly possible to improve on the attack presented here.”25
The next attack to be published came from George Lasry in 2019. It also required some known
plaintext, but only needed 2^60.2 steps.26
There was another SIGABA scare late in the war. On February 3, 1945, two U.S. Army sergeants
in Colmar, France, left their truck, which contained a SIGABA, unguarded as they entered a
brothel. When they returned, the truck was gone. A frantic search was begun by counterintelli-
gence, but only the trailer that had been attached to the truck was located. The SIGABA was still
missing. General Eisenhower made locating the machine an extremely high priority, but weeks
went by with no leads. U.S. and French counterintelligence agents formed a joint squad to try to
find the SIGABA, General Fay B. Prickett became involved, inquiries were made with Swiss spies,
and General Charles de Gaulle was even consulted to see if the French might have taken the device
to learn how to strengthen their own cryptographic efforts!28
Eventually, a tip from some French source led to a pair of safes lying in the mud after appar-
ently having been dumped into the Giessen river from a bridge upstream. This was fantastic
progress, but there was a third safe. Where was it? Men searched the banks and divers checked
under the water in vain. In desperation, the river was dammed and a bulldozer dredged the bot-
tom. Thus, days went by with no certainty that the efforts would be rewarded. Finally, on March
20, a reflection from the sun revealed the last safe to be mired in the mud at a spot previously
underwater.29
After the recovery, the French explained that one of their military chauffeurs had lost his truck
and simply “borrowed” the one with the SIGABA as a replacement. He ditched the safes (pushing
them off a bridge into the Giessen) because he didn’t want to be accused of stealing them!30 Prior
to the recovery and this explanation coming forth, Eisenhower had no way of knowing whether
SIGABA had been in the hands of the enemy or not. In the meanwhile, top level communications
25 Stamp, Mark and Wing On Chan, “SIGABA: Cryptanalysis of the Full Keyspace,” Cryptologia, Vol. 31, No.
3, July 2007, pp. 201–222.
26 Lasry, George, “A Practical Meet-in-the-Middle Attack on SIGABA,” in Schmeh, Klaus and Eugen Antal, edi-
tors, Proceedings of the 2nd International Conference on Historical Cryptology, HistoCrypt 2019, Mons, Belgium,
June 23-26, 2019, Linköping University Electronic Press, Linköping, Sweden, pp. 41–49, available online at
http://www.ep.liu.se/ecp/158/005/ecp19158005.pdf.
27 ULTRA and the Army Air Forces in World War II: An Interview with Associate Justice of the U.S. Supreme Court
Lewis F. Powell, Jr., edited with an introduction and essay by Diane T. Putney, Office of Air Force History,
United States Air Force, Washington DC, 1987, p. 96, available online at https://tinyurl.com/ydcdw78b.
28 Kahn, David, The Codebreakers, second edition Scribner, New York, 1996, pp. 510–512.
29 Kahn, David, The Codebreakers, second edition Scribner, New York, 1996, pp. 510–512.
30 Kahn, David, The Codebreakers, second edition Scribner, New York, 1996, pp. 510–512.
had to continue. To be on the safe side, Eisenhower ordered the production of a new, differently
wired, set of 15 rotors for all 10,060 SIGABAs.31
SIGABA and its temporary successor SIGROD were slowly replaced in the 1950s by
the TSEC/KL-7 (ADONIS/POLLUX). The new cipher machine was an electronic-
mechanical hybrid that employed a programmable cipher rotors/bezel assembly (eight
rotors/thirty-six pins), cams, and vacuum tube technology along with a novel re-
flexing principle. It was phased out of the U.S. military inventory in the early 1980s.33
Figure 9.14 A SIGABA, no longer needed, rests in a locked case. (Courtesy of the National
Cryptologic Museum, Fort Meade, Maryland.)
Historians of cryptology long thought that the reason SIGABA went out of use was that it
was no longer fast enough. However, this wasn’t exactly true. The real reason is that the machine
was unbreakable for the time period and it was feared that, if use continued, the Soviets might
31 Mucklow, Timothy J., SIGABA/ECM II: A Beautiful Idea, Center for Cryptologic History, National Security
Agency, Fort George G. Meade, Maryland, 2015, p. 30.
32 Mucklow, Timothy J., SIGABA/ECM II: A Beautiful Idea, Center for Cryptologic History, National Security
Figure 9.15 Frank Rowlett shows off a SIGABA to Admiral Bobby Ray Inman (Director
NSA) and Ann Caracristi (NSA’s first female Deputy Director). (Courtesy of the National
Cryptologic Museum, Fort Meade, Maryland.)36
34 Mucklow, Timothy J., SIGABA/ECM II: A Beautiful Idea, Center for Cryptologic History, National Security
Agency, Fort George G. Meade, Maryland, 2015, pp. 26 and 41.
35 Thanks to Robert Simpson, National Cryptologic Museum librarian, for providing this list.
36 A similar image at https://www.nsa.gov/Resources/Everyone/Digital-Media-Center/Image-Galleries/Historical/
B1, United States Patent and Trademark Office, January 16, 2001, available online at https://tinyurl.com/
ybmehc3h.
Enciphering Speech1
Mathematical ideas seem to inevitably find applications that were undreamed of when they were
originally discovered. This chapter details how modular arithmetic and logarithms helped the
Allies win World War II.
1 This chapter originally appeared in a slightly different form as Bauer, Craig, “How Modular Arithmetic Helped
Win World War II,” Cryptologic Quarterly (CQ), 2015-01, Vol. 34, No. 1, pp. 43-57, Center for Cryptologic
History, National Security Agency, Fort George G. Meade, Maryland.
Actually, nobody could speak securely using an inverter. This system protected only against
casual eavesdropping and could be easily inverted back by determined amateurs. There was no
key as such, and inverters are not hard to build. In some cases, the devices were not even needed.
With practice it is possible to understand much inverted speech, even if it isn’t that old professor
of yours speaking.
AT&T and RCA offered a slightly more sophisticated scheme in 1937. Known as the A-3
Scrambler, this system split the speech into five channels (aka subbands), each of which could be
inverted, and shuffled them before transmitting. However, this was still weak, and it was implemented in an especially weak manner. Because there are only 5! = 120 ways to reorder the 5 subbands and 2^5 = 32 ways to decide which (if any) of the subbands will be inverted, we have a total of
(120)(32) = 3,840 ways to scramble the speech. Thus, the key space is way too small. If the attacker
knows how the system works, he or she could simply try all of the possibilities. Even worse, many
of these keys failed to garble the speech sufficiently to prevent portions of it from remaining
understandable. Worst of all, of the 11 keys deemed suitable for use, only 6 were actually used!
They were applied in a cycle of 36 steps, each lasting 20 seconds, for a full period of 12 minutes.2
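The tiny keyspace is easy to enumerate. In the sketch below, the `scramble` function is only a schematic model of the reordering and inversion, not the actual analog circuitry:

```python
from itertools import permutations

# Every A-3 "key" is a reordering of the 5 subbands plus a choice of
# which subbands to invert: 5! * 2**5 = 3,840 keys in all.
keys = [(order, mask)
        for order in permutations(range(5))
        for mask in range(2**5)]
print(len(keys))  # 3840

def scramble(subbands, order, mask):
    # Reorder the subbands; inversion is modeled abstractly as a
    # True/False tag on each band rather than as signal processing.
    return [(subbands[i], bool(mask >> i & 1)) for i in order]

print(scramble(["b0", "b1", "b2", "b3", "b4"], (4, 3, 2, 1, 0), 0b00001))
```

With so few keys, an attacker who knows the scheme can simply try all 3,840 of them, which is the point of the paragraph above.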
Hence, like the inverters of the 1920s, the A-3 Scrambler was understood to offer “privacy, not
security.” A good analogy is the privacy locks on interior doors of homes. If someone walks up to
a home bathroom that is in use, and the lock prevents the doorknob from turning, he’ll think,
“Oh, someone’s in there,” and walk away. Privacy is protected. However, there’s no real security.
Someone intent on entering that bathroom will not be stopped by the lock. In the same manner,
a scrambler would protect someone on a party line,3 but could not be expected to protect national
secrets against foreign adversaries.
When President Franklin D. Roosevelt and Prime Minister Winston Churchill spoke on the
phone, they needed real security, not just privacy, yet they initially used the A-3 Scrambler! It
was solved by the Germans by September 1941, after only a few months’ work.4 As the following
quotes show, allies on both sides of the Atlantic were aware of the problem.
The security device has not yet been invented which is of any protection whatever
against the skilled engineers who are employed by the enemy to record every word of
every conversation made.—British Foreign Office Memorandum, June 19425
In addition, this equipment furnishes a very low degree of security, and we know
definitely that the enemy can break the system with almost no effort.—Colonel Frank
McCarthy, Secretary to the Army General Staff, October 19436
2 Kahn, David, The Codebreakers, second edition, Scribner, New York, 1996, p. 554.
3 Younger readers will likely require an explanation of the term “party line.” As a first step, imagine a house with
phones that actually connect to jacks in the walls (i.e., landlines). A boy upstairs might pick up the phone in
his room and hear his dad talking to someone. He’d realize his dad was using the downstairs phone and hang
up. All of the phones in the house were wired via a common line. This would be convenient for conference calls,
but inconvenient the rest of the time. A family member would sometimes have to wait his turn, when wanting
to make a call. “Party lines” worked on the same principle, but the phones were in different homes. That is, in
the old days, you might be on a party line with one or more neighbors. You could listen in on their calls, if you
desired, but would hopefully respect their privacy and hang up when you discovered the line was in use.
4 Kahn, David, The Codebreakers, second edition, Scribner, New York, 1996, pp. 555-556.
5 British Foreign Office memorandum FO/371/32346. Taken here from Hodges, Andrew, Alan Turing: The
Enigma Simon & Schuster, New York, 1983, p. 236.
6 From a letter to Harry Hopkins, assistant to President Roosevelt. Taken here from Mehl, Donald E., The Green
Hornet, self-published, 1997, p. 5.
Given that the Americans and the British knew that the system they were using for voice encryp-
tion offered no security, it’s natural to ask why they didn’t use something better. The answer is that
securing speech with encryption is much more difficult than encrypting text. There are several
reasons why this is so, but one of the most important is redundancy. Redundancy in speech allows
us to comprehend it through music, background noise, bad connections, mumbling, other people
speaking, etc. Text is at least 50% redundant (in other words, removing half of the letters from a
given paragraph does not typically prevent it from being reconstructed—see Section 11.3 for more
details), but speech is much more redundant and it is hard to disguise because of this.
Speech that is scrambled in the manner of the A-3 Scrambler can be reconstructed using a
sound spectrograph, which simply involves plotting the tones and reassembling them like a jigsaw
puzzle. So, although splitting the voice into more channels would increase the number of possible
keys, the attacker could simply reassemble what amounts to a jigsaw puzzle with more pieces. A
successful voice encryption system would have to operate in a fundamentally different manner
than inverting and shuffling.
7 For a reasoned argument that it would have made little difference if the warning had arrived in time, see
Christensen, Chris, “Review of two collections of essays about Alan Turing,” Cryptologia, Vol. 44, No. 1, 2020,
pp. 82-86.
As indicated above, SIGSALY, the ciphony system that would replace the A-3 Scrambler for
Roosevelt and Churchill (and others), had many different names. This is an indication of its impor-
tance. The sixth name may be seen on the cover of a formerly classified directory for the system
(Figure 10.2). The cover is certainly attention grabbing, but the contents are quite dry by comparison.
Before getting into the details of how SIGSALY worked, a picture is presented (Figure 10.3).
Upon first seeing this image, I asked, “So where in the room is SIGSALY?” I wasn’t sure which
item I should be looking at. The answer was, “It is the room!” The result of the quest for secure
voice communication led to a 55-ton system that took up 2,500 square feet. In fact, the image only
shows part of SIGSALY. It literally filled a house. A little reflection explains why the project
didn’t yield a more compact device.
Necessity is the mother of invention, so it’s not surprising that the need to keep voice commu-
nications secure from Nazi cryptanalysts is what finally motivated the design of a secure system.
But this impetus also meant that no time could be wasted. The designers didn’t have the luxury
of taking a decade to make a system of utmost elegance. Instead, they based it on earlier tech-
nology that could be readily obtained, saving much time. The heart of the system was a vocoder,
which is a portmanteau of voice coder. The original intent of such devices was to digitize speech so
that it might be sent on undersea phone cables using less bandwidth, thus reducing costs. Due to
the aforementioned high redundancy of human speech, compression down to 10 percent of the
original was found to be possible, while still allowing the original meaning to be recovered.8 For
SIGSALY, the compression was a bonus. The important thing was to digitize the voice, so that a
8 Tompkins, Dave, How to Wreck a Nice Beach, Stopsmiling Books, Chicago, Illinois, 2010, p. 23.
random digital key could be added to it in the manner of the one-time pad. Off-the-shelf vocoder
technology took up much space!
For those interested in hearing how early vocoders transformed speech, a recording of
a Bell Labs vocoder from 1936 may be heard at http://www.complex.com/music/2010/08/
the-50-greatest-vocoder-songs/bell-telephone-laboratory.
Middle-aged readers might find the sound reminds them of the Cylons in the original (1970s)
Battlestar Galactica TV series. Indeed, this sound effect was produced using a vocoder.9 Decades
earlier, Secretary of War Henry Stimson had remarked of a vocoder, “It made a curious kind of
robot voice.”10
This brings us to an interesting point. Vocoders sound cool. For this reason, many musicians
have used them. Dave Tompkins, a hip-hop journalist, aware of the use of vocoders in voice
encryption and music, wrote a very entertaining book that examines both applications. The front
cover of this book appears in Figure 10.4. The title of Tompkins’s book arose from the manner in
which vocoders were tested. Various phrases would be passed through the vocoders, and listen-
ers, ignorant of what they were supposed to hear, would try to determine the messages. In one
instance, the phrase “How to recognize speech” was misheard as “How to wreck a nice beach.”
Clearly that vocoder was not suitable for military applications in which a slight misunderstanding
could have a calamitous effect.
Figure 10.4 For a book with cryptologic content, Tompkins’s work contains a record-shattering
amount of profanity. (Courtesy of Dave Tompkins).
The diverse applications of the vocoder, detailed in Tompkins’s book, are represented by
Figures 10.5 and 10.6.
The vocoder used by SIGSALY broke the speech into ten channels (from 150 Hz to 2950 Hz),
and another channel represented pitch. Some sources describe the pitch as being represented by
a pair of channels. Both points of view can be considered accurate, as will be made clear shortly.
Each channel was 25 Hz, so the total bandwidth (with two pitch channels) was (12)(25) = 300 Hz.
Ultimately, the communications were sent at VHF. The digitization of each channel was done on a
senary scale; that is, the amplitude of each signal was represented on a scale from 0 to 5, inclusive.
A binary scale was tried initially, but such rough approximation of amplitudes didn’t allow for an
understandable reconstruction of the voice on the receiving end.11 For some reason the pitch had
to be measured even more precisely, on a scale from 0 to 35. Because such a scale can be repre-
sented by a pair of numbers between 0 and 5, pitch may be regarded as consisting of two channels.
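The split of the pitch value into two ordinary channels is just base-6 arithmetic (the function names below are mine):

```python
# A pitch value 0-35 fits exactly in two base-6 digits, so it can
# ride in two ordinary 0-5 channels.
def pitch_to_channels(p):
    assert 0 <= p <= 35
    return divmod(p, 6)        # (high digit, low digit), each 0-5

def channels_to_pitch(hi, lo):
    return 6 * hi + lo

print(pitch_to_channels(35))   # (5, 5)
print(pitch_to_channels(7))    # (1, 1)
```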
11 Hodges, Andrew, Alan Turing: The Enigma, Simon & Schuster, New York, 1983, p. 246.
Figure 10.5 These men knew nothing about the future use of vocoders by musicians. (Courtesy
of the National Cryptologic Museum, Fort Meade, Maryland).
Figure 10.6 Musicians, represented here by Michael Jonzun (and a Roland SVC vocoder), knew
nothing of the use of vocoders by the military. (Courtesy of Dave Tompkins and Michael Jonzun.)
Before we get to modular arithmetic, the mathematical star of this tale, we examine how loga-
rithms contributed to winning the war. When discretizing sound, it seems reasonable to represent
the amplitude using a linear scale, but the human ear doesn’t work in this fashion. Instead, the
ear distinguishes amplitudes more finely at low volumes. Thus, if we wish to ease the ability of the ear to reconstruct the sound from a compressed form, measuring the amplitude on a
logarithmic scale is a wiser choice. This allows for greater discernment at lower amplitudes. Thus,
the difference in amplitude between signals represented by 0 and 1 (in our senary scale) is much
smaller than the difference in amplitude between signals represented by 4 and 5. This technique
goes by the technical name logarithmic companding, where companding is itself a compression of
compressing and expanding.12 The concept described above will already have been familiar to all
readers. Who hasn’t heard of the (logarithmic) decibel scale for measuring sound intensity?
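A generic logarithmic quantizer illustrates the idea. This is a sketch only: SIGSALY’s actual companding curve is not given here, and the 40 dB range below is an arbitrary choice of mine:

```python
import math

# Illustrative only: quantize an amplitude in (0, 1] to a 0-5 level
# on a logarithmic (decibel) scale. The -40 dB floor is an arbitrary
# assumption for this sketch, not SIGSALY's actual curve.
def log_level(amplitude, floor_db=-40.0):
    db = 20 * math.log10(amplitude)                    # amplitude -> dB
    t = max(0.0, min(1.0, (db - floor_db) / -floor_db))  # 0..1 on the dB scale
    return min(5, int(6 * t))                          # senary level 0-5

# Equal *ratios* of amplitude map to equal spacing in levels, so small
# signals are resolved more finely than large ones.
for a in [0.01, 0.03, 0.1, 0.3, 1.0]:
    print(a, log_level(a))
```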
Having discretized the speech, we’re ready to add the random key. With both the speech and
the key taking values between 0 and 5, the sum will always fall between 0 and 10. SIGSALY,
however, performed the addition modulo 6, so that the final result remained between 0 and 5, as
represented in Figure 10.7.
Figure 10.7 The mod 6 addition of the key was referred to as “reentry” by the creators of
SIGSALY. (From Boone, James V. and Peterson, R. R., The Start of the Digital Revolution: SIGSALY
Secure Digital Voice Communications in World War II, Center for Cryptologic History, National
Security Agency, Fort Meade, Maryland, July 2000, p. 19.)
Why was the addition of the key done in this complicated manner? Why not just add without
the mod 6 step? Three reasons are given below.
1. The mod 6 step was Harry Nyquist’s idea.13 Students of information theory will recognize
this name and, for them, it certainly lends a stamp of authority to support the inclusion of this
step. But an argument from authority is not a proof! Fortunately, we have two more reasons.
2. If we don’t perform the mod 6 step, then a cipher level of 0 can arise only from both message
and key being 0. So, whenever a 0 is the output, an interceptor will know a portion of the
signal. Similarly, the greatest cipher level, 10, can only arise from both message and key being
5. Hence, without the mod 6 step, an interceptor would be able to immediately identify
2/36 ≈ 5.5% of the signal from the simple analysis above.
3. Simply adding the key without the mod step would result in random increases in ampli-
tude, which may be described as hearing the message over the background noise of the key.
Are you able to understand a friend talking despite the white noise produced by an air-
conditioner or chainsaw in the background?
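The reentry step is exactly one-time-pad addition modulo 6, and deciphering is subtraction modulo 6. The sample and key values below are made up for illustration:

```python
# One-time-pad encryption of senary (0-5) samples, as in SIGSALY's
# "reentry" step: add the key mod 6 to encipher, subtract it mod 6
# to decipher. These particular values are invented for the example.
def encipher(samples, key):
    return [(s + k) % 6 for s, k in zip(samples, key)]

def decipher(cipher, key):
    return [(c - k) % 6 for c, k in zip(cipher, key)]

samples = [0, 3, 5, 2, 4, 1]   # digitized speech levels
key     = [5, 5, 3, 0, 2, 4]   # random key (from a SIGGRUV record in practice)
cipher  = encipher(samples, key)
print(cipher)                  # [5, 2, 2, 2, 0, 5]
```

Note that every ciphertext value stays in the 0-5 range, so nothing leaks the way the extreme sums 0 and 10 would without the mod 6 step.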
SIGSALY enciphered every channel in this manner using a separate random key for each. A sim-
plified schematic for the overall encryption process is provided in Figure 10.8.
Figure 10.8 shows the speech entering the system on the left-hand side and getting broken
down into a pitch channel (pitch detector) and ten voice channels (spectrum 1 through spectrum
10). There are steps, not discussed here, both before and after the mod 6 (reentry) takes place.
The “missing steps” are of greater interest to engineers than mathematicians, and can be found in
Donald E. Mehl’s book The Green Hornet.14
At this point I’d like to draw your attention to the lower left-hand corner of Figure 10.8. The
“key phonograph” is exactly what it sounds and looks like. The source of the key that needed to
be combined with each channel was simply a record (see Figure 10.9). The one-time key for voice
encryption was codenamed SIGGRUV. As with text, the key was added to encipher and subtracted
to decipher. Because the key took the form of a record, a built-in safety mechanism caused communication
to cease if the record stopped; without it, the speaker would suddenly have been broadcasting in the clear.
Figure 10.9 A SIGSALY turntable and record, with a modern CD for scale. (Courtesy of the
National Cryptologic Museum, Fort Meade, Maryland.)
The digitized speech was sampled 50 times per second, so to separately encipher all of the
channels, the record had to be simultaneously playing twelve tones at different frequencies, and
these tones had to change every fiftieth of a second. It’s natural to ask why the sampling rate was
50 times per second and not higher or lower. The fundamental unit of speech, known as a pho-
neme, has a duration of about a fiftieth of a second, so the sampling rate is just high enough to
allow it to be captured. A higher sampling rate is not needed to make the digitized voice compre-
hensible and would worsen the synchronization problem—the record at the receiving terminal,
used to subtract the key, must be synchronized with the incoming message, if there is to be any
hope of recovering it! While we’re on the topic of synchronization, it should be mentioned that the
records contained tones for purposes other than encryption. For example, a tone at one particular
frequency was used for fine-tuning the synchronization.
Ideally the keys would be random, a condition simulated for SIGGRUV by recording thermal
noise backward. None of these records would become classic tunes, but the military was content
with one-hit wonders. Indeed, the system would become vulnerable if the same record were ever
replayed. Although not labeled as such, the implicit warning was “Don’t Play it Again, Uncle
Sam!,” and the records were destroyed after use.
Vinyl aficionados may have noticed that the record in Figure 10.9 is unexpectedly large in
comparison to the CD. SIGSALY’s records measured sixteen inches and could be played from start
to finish in twelve minutes. Over 1,500 of these key sets were made.15
10.4 Plan B
Once the SIGSALY installations were in place, all that was necessary for communication was that
each location have the same record. Initially spares were made, but as confidence was gained, only
two copies of each record were made. Still, there was a Plan B.
Figure 10.10 looks like a locker room, but it is simply SIGSALY’s back-up key, codenamed
SIGBUSE. If for some reason the records couldn’t be used for keying purposes, SIGBUSE could
generate a pseudorandom key mechanically.
Figure 10.10 SIGSALY’s back-up key SIGBUSE. (Courtesy of the National Cryptologic Museum,
Fort Meade, Maryland.)
Because SIGSALY would link Roosevelt and Churchill, the Americans and the British needed
to be satisfied that it was secure. The British had the added concern that the operating teams,
which would consist of Americans, even in London, would hear everything. Thus, in January
1943, the British sent their top cryptanalyst, Alan Turing, to America to evaluate the system. After
much debate, probably reaching President Roosevelt,16 Turing was allowed access to details of the
closely guarded secret project.
Turing helped by suggesting improvements to the SIGBUSE key, and he reported to the
British, “If the equipment is to be operated solely by U.S. personnel it will be impossible to prevent
them listening in if they so desire.” In reality, the Americans were often so focused on their jobs
they had no idea what was actually said.
16 We have no proof, but Mehl, Donald E., The Green Hornet, self-published, 1997, p. 69; Hodges, Andrew. Alan
Turing: The Enigma. Simon & Schuster, New York, 1983, p. 245; and Tompkins, Dave, How to Wreck a Nice
Beach, Stopsmiling Books, Chicago, Illinois, 2010, p. 59, all believe the matter reached Roosevelt. In any case,
Secretary of War Stimson resolved it.
320 ◾ Secret History
Turing’s examination of SIGSALY inspired him to create his own (completely different) sys-
tem, Delilah. Turing’s report on Delilah appeared publicly for the first time in the October 2012
issue of Cryptologia.17
Ultimately, SIGBUSE turned out to be wasted space. The records never failed, so the alternate
key was never used. A more critical part of SIGSALY was the air-conditioning system. It is shown in
Figure 10.11. A voice encryption system that fills a house requires a cooling system on the same scale!
Figure 10.11 SIGSALY’s air conditioning system. Donald Mehl appears on the right in this
photo. (Courtesy of the National Cryptologic Museum, Fort Meade, Maryland.)
17 Turing, Alan M. and Donald Bayley, “Report on speech secrecy system DELILAH, a Technical Description
Compiled by A. M. Turing and Lieutenant D. Bayley REME, 1945–1946,” Cryptologia, Vol. 36, No. 4,
October 2012, pp. 295–340.
18 This refers to the digitization process.
19 Mehl, Donald E., The Green Hornet, self-published, 1997, p. 86.
Enciphering Speech ◾ 321
Figure 10.12 Another view of SIGSALY. (Courtesy of the David Kahn Collection, National
Cryptologic Museum, Fort Meade, Maryland.)
Figure 10.12 provides another view of a SIGSALY installation. In this one, a phone is clearly
visible, but this is not what the caller would be using. The phone you see was used by a member of
the operating team to make sure synchronization was being maintained.
A separate room allowed the user(s) to converse in more comfortable conditions
(Figure 10.13).
Figure 10.13 SIGSALY users—fighting the Germans and Japanese… and loving it! (Courtesy of
the National Cryptologic Museum, Fort Meade, Maryland.)20
20 An alternate caption for this image is “SIGSALY: Your digital pal who’s fun to be with!”
Technological developments, over the decades that followed, rapidly diminished the space needed
for secure voice encryption. Still, JFK’s system (Figure 10.14) looked decidedly less cool.
Figure 10.14 President Kennedy’s voice encryption system. (Courtesy of the David Kahn
Collection, National Cryptologic Museum, Fort Meade, Maryland.)
The system in Figure 10.14 looks like something Maxwell Smart of the TV series Get Smart
might have used. What would the next step be, three phones?
21 Hodges, Andrew, Alan Turing: The Enigma, Simon & Schuster, New York, 1983, p. 247.
22 General Eisenhower complained that it made his wife sound like an old woman. The system was optimized for
male voices, and as a result, deciphered female voices sounded worse.
Long since retired, SIGSALY was finally declassified in 1976. This allowed patents, applied for
decades earlier, to finally be granted. Three recipients, also long since retired (from Bell Telephone
Laboratories Inc.), were the engineers Robert C. Nathes, Ralph K. Potter, and P. W. Blye.23
A mock-up of a portion of SIGSALY (Figure 10.15) may be seen today at the National
Cryptologic Museum adjacent to Ft. Meade, Maryland. This museum also has an excellent library
that includes the David Kahn Collection.24 Kahn is widely regarded as cryptology’s greatest histo-
rian and, prior to his donation, his collection was the largest in private hands.
Figure 10.15 The National Cryptologic Museum’s SIGSALY mock-up, which has been scaled
down since this photo was taken.
In Section 10.2, we saw the consequences that may be faced when a nation is without a secure
voice encryption system. We close with a reminder of the advantage gained when a nation does
possess such a system.
23 Jones, Stacy V., “From WWII era ‘Green Hornet’ Patent Awarded,” The New York Times, July 3, 1976, p. 27,
found at the National Cryptologic Museum, David Kahn Collection, Folder 12-7.
24 Hamer, David, “The David Kahn Collection at NSA’s National Cryptologic Museum,” Cryptologia, Vol. 35,
There are two kinds of cryptography in this world: cryptography that will stop your
kid sister from reading your files, and cryptography that will stop major governments
from reading your files. This book is about the latter.
– Bruce Schneier1
1 Schneier, Bruce, Applied Cryptography, second edition, John Wiley & Sons, New York, 1996, p. xix.
Chapter 11
Claude Shannon
In Section 7.7, we took a brief look at the life of Alan Turing, who is considered by many to be
the father of computer science. If anyone could be considered the American version of Turing, it
would be Claude Shannon, who is known as “the father of information theory.”
1 Golomb, Solomon W., Elwyn Berlekamp, Thomas M. Cover, Robert G. Gallager, James L. Massey, and
Andrew J. Viterbi, “Claude Elwood Shannon (1916-2001),” Notices of the American Mathematical Society, Vol.
49, No. 1, January 2002, pp. 8–16, p. 10 cited here.
Figure 11.1 Claude Shannon (1916–2001). (Attribution 2.0 Generic (CC BY 2.0) by Tekniska
museet, https://www.flickr.com/photos/tekniskamuseet/6832884236/sizes/o/.)
For example, suppose the message is a weather report and the various possibilities, along with their
probabilities are as follows:
M1 = Sunny 0.05
M2 = Cloudy 0.15
M3 = Partly Cloudy 0.70
M4 = Rain 0.10
A report of “Sunny” is more surprising than a report of “Partly Cloudy” and can therefore be said
to convey more information.
Let the function that measures the amount of information conveyed by a message be denoted
by I(M), where M is the message. In general, if the probability of Mi is greater than the probability
of Mj, we should have I(Mi) < I(Mj). Also, if we receive two weather reports (for different days), the
total amount of information received, however we measure it, should be the sum of the informa-
tion provided by each report; that is, I(MiMj) = I(Mi) + I(Mj).
The input of the function I is shown to be the message, but I should really just be a function
of the probability of the message; that is, if two messages are equally probable, they should be
evaluated as containing the same amount of information. We also want the function I to change
continuously as a function of this probability. Informally, a very small change in the probability of
a message should not cause a “jump” (discontinuity) in the graph of I.
Can you think of any functions that fit all of the conditions given above?
This is the manner in which Claude Shannon approached the problem. Rather than guess at a
formula, he formulated rules, like those above, and then sought functions that fit the rules.2 As an
extra clue as to what the function could be, consider the fact that if a message M has probability
1, no real information is conveyed; that is, I(M) = 0 in this case.
Shannon found that there was essentially just one function that satisfied his conditions,
namely the negation of the logarithm.3 The amount of information contained in a message M
is thus given by
−K ∑ᵢ log₂(Mᵢ)
where the sum is taken over the individual components of the message, using the probability of each
component as the value for Mi. If the message consists of seven weather reports, we would sum seven
terms to get the total amount of information. There is some flexibility in that K may be any positive con-
stant, but since this only amounts to choosing units, Shannon simplified matters by taking it to be one.
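As a quick numerical check (not a proof), we can verify that I(p) = −log₂(p) satisfies the conditions given earlier. The 0.2 and 0.5 probabilities in the additivity check are invented for this sketch.

```python
import math

# I(p) = -log2(p): the information content of a message with probability p.
def I(p):
    return -math.log2(p)

# Rarer messages carry more information (weather probabilities from above).
assert I(0.05) > I(0.70)
# Information is additive over independent messages.
assert abs(I(0.2 * 0.5) - (I(0.2) + I(0.5))) < 1e-9
# A certain message (probability 1) carries no information.
assert I(1.0) == 0.0
```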
Shannon presented his result as a weighted average.4 Using his formula, given below, one can
calculate the average amount of information conveyed by a message selected from a set of possibili-
ties with probabilities given by pi.
−K ∑ᵢ pᵢ log₂(pᵢ)
For example, a single weather report, selected from the four possibilities given above, in accordance with its probability, will have an average information content of approximately 1.32 bits.
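This average can be computed directly from the four probabilities given above; a quick sketch:

```python
import math

# Average information (entropy) of the weather distribution:
# Sunny 0.05, Cloudy 0.15, Partly Cloudy 0.70, Rain 0.10.
probs = [0.05, 0.15, 0.70, 0.10]
H = -sum(p * math.log2(p) for p in probs)
# H is roughly 1.32 bits per report
```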
2 Shannon, Claude E., “A Mathematical Theory of Communication,” reprinted with corrections from The Bell
System Technical Journal, Vol. 27, pp. 379–423, 623–656, July, October, 1948, p. 10.
3 Nyquist and Hartley both made use of logarithms in their work on information theory, prior to Shannon. It
seems that it was an idea whose time had arrived.
4 Shannon, Claude, “A Mathematical Theory of Communication,” reprinted with corrections from The Bell
System Technical Journal: Vol. 27, pp. 379–423, 623–656, July, October, 1948. The result appears on p. 11 of
the revised paper, but the proof is given in Appendix II, which is in the second part of the split paper.
5 Tribus, Myron and Edward C. McIrvine, ‘‘Energy and Information,’’ Scientific American, Vol. 225, No.
3, September 1971, pp. 179–184, 186, 188, p. 180 cited here. Thanks to Harvey S. Leff for providing this
reference!
In physics, this mysterious quantity is denoted by S. Although the formula is the same, Shannon
used H for his entropy. The name entropy stuck, but it is sometimes referred to as “information
entropy” or “Shannon entropy” to distinguish it from the concept in physics. Another reason that
Shannon’s original names were problematic is that the results we get from his formula don’t always
correspond to how we’re used to thinking about information or uncertainty.
If you are trying to evaluate Shannon entropy on a calculator or with a computer program,
you’ll quickly bump into the problem of evaluating a base 2 logarithm. Let’s take a quick look at
how to get around this difficulty.
y = log₂(x) ⇔ 2^y = x (by definition)
Now take log_b of both sides. You may use b = 10 (common logarithm) or b = e (natural logarithm)
or any other base b > 0 with b ≠ 1. Bases 10 and e are the ones that commonly have keys devoted to them (log
and ln, respectively) on calculators. I’ll use base b = e, with the notation ln.
ln(2^y) = ln(x)
Making use of one of the properties of logarithms, we may bring the exponent of the argument
down in front.
y ln(2) = ln(x)

y = ln(x) / ln(2)
Thus, we have rewritten our base 2 logarithm in terms of logarithms with base e. We have, by the
way, just derived the change of base formula for logarithms.
Also, because we can rewrite this last equation to express y = log₂(x) as y = (1/ln(2)) · ln(x),
we see logarithmic functions with different bases only differ by a constant multiplier. Recalling
that Shannon’s formulation for entropy was only unique up to a constant multiple, we see that
this flexibility is equivalent to being able to use any base for the logarithm. Base 2 is especially
convenient when considering digital data; thus, 2 was the base chosen. Regardless of what form
the data takes, using the base 2 logarithm results in the units of entropy being “bits per message”
or “bits per character,” if we want an average rate of information transmission.
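The derivation above is easy to confirm numerically; a minimal sketch (the function name is mine):

```python
import math

# Change of base: log2(x) computed from natural logarithms, as derived above.
def log2_from_ln(x):
    return math.log(x) / math.log(2)

# Agrees with the built-in base 2 logarithm.
assert abs(log2_from_ln(26) - math.log2(26)) < 1e-12
assert abs(log2_from_ln(8) - 3.0) < 1e-12
```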
The average entropy, H, of English may be calculated by using the probability of each of the
26 letters in the following formula:
H = −∑ᵢ pᵢ log₂(pᵢ)
but this is really only an estimate, as the effects over groups of letters have not been accounted for
yet. Rules such as Q being (almost always) followed by U and “I before E, except after C” show
there is order in English on the scale of two- and three-character groupings. Entropy approximations
based on single-letter frequencies are often denoted by H1. Better estimates are given by H2
and H3, where the probabilities used are those of digraphs and trigraphs. As N grows, HN/N
converges monotonically to a limit (see Table 11.1).
Table 11.1 First-, Second-, and Third-Order Entropy for Various Languages
Language H1 H2 H3
English
As von Neumann indicated, the idea of entropy existed in physics before Claude Shannon
applied it to text. To make sense of entropy in physics, one must first understand that imbalances
in systems offer usable energy. As an example, consider a hot cup of coffee in a room. It has energy
that can be used—one could warm one’s hands over it or use it to melt an ice cube. If left alone, its
energy will slowly dissipate with no effect other than slightly warming the room. When the coffee
and the room settle to an equilibrium temperature, the room will contain the same amount of
total energy, but there will no longer be any usable energy. The amount of entropy or “unusable
energy” can be said to have increased.
One of the few theories that scientists have enough confidence in to label a “law” is the second
law of thermodynamics. This states that entropy must increase in a closed system; that is, if no
energy is being added to a system (i.e., the system is closed), then the amount of energy in the
system that cannot be used must increase over time. In other words, the amount of usable energy
in the system must decrease. To simplify this further, everything winds down. If the cup of cof-
fee described in the previous paragraph had a heating coil in it, which was plugged into an outlet
that connected it to an outside power source, then the coffee would not cool and the entropy of
the room would not increase. This is not a violation of the second law, however, because the room
could no longer be considered a closed system. It receives energy from outside.
Shannon’s “text entropy” follows the second law of thermodynamics, as Table 11.1 shows. Over
the centuries, the entropy of a particular language increases, as it does when one language springs
from another. This is an empirical result; in other words, experiment (measuring the entropy of
various texts) indicates it is true, but we do not have a proof. It makes sense though, because, as
a language evolves, more exceptions to its rules appear, and more words come in from foreign
languages. Hence, the frequency distribution of character groups tends to grow more uniform,
increasing the entropy. This phenomenon can be used to roughly date writings; however, not all
authors in a particular generation, or even century, will exhibit the same entropy in their texts.
Edgar Allan Poe’s writings, for example, exhibit higher entropy than those of his peers.6 This is
due to the unusually large vocabulary he commanded.
The maximum possible value for H1 occurs when all probabilities are equal (1/26). This gives
us H1 = log₂(26) ≈ 4.7.
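A quick computational check of this value:

```python
import math

# Maximum single-letter entropy for a 26-letter alphabet: all letters
# equally likely, each with probability 1/26.
H1_max = -sum((1 / 26) * math.log2(1 / 26) for _ in range(26))

# The sum simplifies to log2(26), about 4.70 bits per letter.
assert abs(H1_max - math.log2(26)) < 1e-9
```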
The idea of entropy also reveals approximately how many meaningful strings of characters we can
expect of length N. The answer is given by 2^(HN). This can be used, for example, to estimate the
keyspace for a running key cipher of a given length in any particular language.
The idea of entropy has also been influential in the arts (literature, in particular) to various
degrees over the decades. The best example I’ve come upon is Isaac Asimov’s short story “The Last
Question.”7 A more recent example of entropy in pop culture is provided by nerdcore hip hop art-
ist MC Hawking’s song “Entropy.”8 This song educates while it entertains!
6 Bennett, Jr., William Ralph, Scientific and Engineering Problem-solving with the Computer, Prentice Hall,
Englewood Cliffs, New Jersey, 1976, p. 140.
7 The story first appeared in Science Fiction Quarterly, November 1956. See the References and Further Reading
section at the end of this chapter for more on this tale.
8 http://www.mchawking.com/ is the main page for the musician. The lyrics to Entropy can be found at http://
www.mchawking.com/includes/lyrics/entropy_lyrics.php.
(using a 26-letter alphabet).9 Just as HN/N converges down to a limiting value as N increases,
DN/N increases to a limiting value with N.
Shannon found the redundancy of English to be D ≈ 0.7 decimal (base 10) digits per letter.
Dividing this value by log(26), we get the relative redundancy of English, which is about 50%. The
value 26 was used in the log, as Shannon chose to omit word spacing and simply use a 26-letter
alphabet for his calculation.10 He explained what his value for the redundancy of English means
and how it may be obtained.
The redundancy of ordinary English, not considering statistical structure over greater
distances than about eight letters, is roughly 50%. This means that when we write
English half of what we write is determined by the structure of the language and half
is chosen freely. The figure 50% was found by several independent methods which all
gave results in this neighborhood. One is by calculation of the entropy of the approxi-
mations to English. A second method is to delete a certain fraction of the letters from
a sample of English text and then let someone attempt to restore them. If they can be
restored when 50% are deleted the redundancy must be greater than 50%. A third
method depends on certain known results in cryptography.11
Cipher Deavours used 1.11 for his approximation of D, which converts to 78%.12 Although he
didn’t explain how he came up with this larger value, one possibility is that he included a blank
space as a character of the alphabet. As spacing rarely changes the meaning of a sentence, its pres-
ence increases the redundancy. In many early examples of writing, ranging from ancient Greek
through medieval times, word spacing isn’t present. And word spacing isn’t the only omission that
has been common historically. The original Hebrew version of the Old Testament was written
without vowels. The redundancy of this language allows it to be read anyway.
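The redundancy figures above are easy to reproduce; a quick sketch (the helper name is mine):

```python
import math

# Relative redundancy: D (in decimal digits per letter) divided by log10(26),
# as in the calculations above.
def relative_redundancy(D):
    return D / math.log10(26)

shannon_estimate = relative_redundancy(0.7)    # roughly 0.50, i.e., about 50%
deavours_estimate = relative_redundancy(1.11)  # roughly 0.78, i.e., about 78%
```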
I decided to do an experiment of my own. I asked a student, Josh Gross, to send me random
messages with various percentages of letters removed and word spacing preserved. The hardest of
his challenges (the ones with the most letters removed) were the following:
65% Removed
_n _h_s _i_ _ _e _i_e _t _s _ _ _a_ _ _t _ _ _t _h_ i_ _ _e _f _i_t_
_ _s _e_ _ _ _ _n_ _ _r_ _ _ f_ _ _ t_e _o_ _ _r _f _h_ _ _d_ _ _d_ _ _ t_
t_e _k_
_h_ i_ _ _e _a_ h_ _ _ b_e_ d_ _ _ _ _d _r_ _ _h_ g_e_t _ _ _d_t_o_ _ _f
_h_ s_ _ _h. _t _s _p_ _ _e_ h_ _ _ _ _r _o _ d_ _ _i_ _ _l_ _ _a_a_i_t_ _
_ _ _t_m _f _x_ _ _i_ _ _e
9 Shannon, Claude E., “Communication Theory of Secrecy Systems,” The Bell System Technical Journal, Vol. 28,
No. 4, October 1949, pp. 656–715, p. 689 cited here.
10 Shannon, Claude E., “Communication Theory of Secrecy Systems,” The Bell System Technical Journal, Vol. 28,
No. 4, October 1949, pp. 656–715.
11 Shannon, Claude E., “A Mathematical Theory of Communication,” reprinted with corrections from The Bell
System Technical Journal, Vol. 27, pp. 379–423, 623–656, July, October, 1948, pp. 14–15.
12 Deavours, Cipher, “Unicity Points in Cryptanalysis,” Cryptologia, Vol. 1, No. 1, January 1977, pp. 46–68,
p. 46 cited here. Also see p. 660 of Shannon, Claude E., “Communication Theory of Secrecy Systems,” The Bell
System Technical Journal, Vol. 28, No. 4, October 1949, pp. 656–715.
I sent him my solutions and then asked for messages with more letters removed. Josh sent these:
70% Removed
The answers are given at the end of this chapter, after the References and Further Reading
list, but I encourage you to try to solve them yourself first. You may use pattern word programs to
make your work easier. It doesn’t matter how long it takes you to determine the missing letters or
what resources you make use of. If the letters are recoverable, they’re redundant!
Not solely focused on solving “important problems,” Shannon applied the idea of redundancy
to a popular recreation for the cryptologically inclined:13
The ratio of the entropy of a source to the maximum value it could have while still
restricted to the same symbols will be called its relative entropy. This is the maximum
compression possible when we encode into the same alphabet.
Speaking of compression, it should be pointed out that Shannon’s work has been greatly com-
pressed in this chapter. The reader is encouraged to pursue the references for a fuller treatment.
13 Shannon, Claude, “A Mathematical Theory of Communication,” reprinted with corrections from The Bell
System Technical Journal, Vol. 27, pp. 379–423, 623–656, July, October, 1948. The result appears on p. 15 of
the revised paper.
14 Shannon, Claude, “A Mathematical Theory of Communication,” reprinted with corrections from The Bell
System Technical Journal, Vol. 27, pp. 379–423, 623–656, July, October, 1948. The result appears on p. 14 of
the revised paper.
Shannon also provided the theoretical background for error-correcting codes.15 These codes are
the opposite of the sort we are interested in. They seek to make the message easier to read by introduc-
ing extra redundancy. In this manner, mutilated messages may still be recovered. Cryptographers aim
to minimize redundancy. Patterns are a cryptanalyst’s best friend, so a good cipher should mask their
presence!
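As a small illustration of the parity-check idea, here is the classic Hamming(7,4) code, introduced by Richard Hamming: three parity bits protect four data bits, and any single-bit error can be located and corrected. This is a sketch of the general technique, not a construction from Shannon's paper.

```python
# Hamming(7,4): codeword layout [p1, p2, d1, p3, d2, d3, d4],
# with parity bits chosen so each parity check has an even sum.

def hamming74_encode(d):
    p1 = d[0] ^ d[1] ^ d[3]
    p2 = d[0] ^ d[2] ^ d[3]
    p3 = d[1] ^ d[2] ^ d[3]
    return [p1, p2, d[0], p3, d[1], d[2], d[3]]

def hamming74_decode(c):
    # Each syndrome bit re-checks one parity set; together they
    # spell out the position (1..7) of a single-bit error, or 0.
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    pos = s1 + 2 * s2 + 4 * s3
    c = c[:]
    if pos:
        c[pos - 1] ^= 1              # correct the erroneous bit
    return [c[2], c[4], c[5], c[6]]  # extract the data bits

codeword = hamming74_encode([1, 0, 1, 1])
received = codeword[:]
received[4] ^= 1                      # simulate a one-bit transmission error
recovered = hamming74_decode(received)
assert recovered == [1, 0, 1, 1]      # the mutilated message is still recovered
```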
15 We need to be careful in this area, though, not to credit Shannon with too much, as he does make use
of previous work by Richard Hamming. This is done on p. 28 (cited on p. 27) in Shannon’s revised paper
(Shannon, Claude, “A Mathematical Theory of Communication,” reprinted with corrections from The Bell
System Technical Journal, Vol. 27, pp. 379–423, 623–656, July, October, 1948). A very simple means of adding
redundancy to create an error-correcting code is to insert extra bits at regular intervals to serve as parity checks
(making certain sets of bits have an even sum).
ADFGX and ADFGVX, employed both substitution and transposition, this did not become a
standard approach to encryption until the computer era.
Shannon noted a disadvantage in ciphers with high confusion and diffusion.16
Although systems constructed on this principle would be extremely safe they possess
one grave disadvantage. If the mix is good then the propagation of errors is bad. A
transmission error of one letter will affect several letters on deciphering.
We’ll see examples of modern systems that combine substitution and transposition to satisfy
Shannon’s conditions in Sections 13.1 and 20.3.
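Shannon's observation about error propagation can be demonstrated with a toy cipher that alternates substitution with a mixing (diffusion) layer. Everything here, the S-box, the mixing rule, and the round count, is invented for illustration and offers no real security.

```python
# A toy substitution-mixing cipher on 4-byte blocks. Good mixing means a
# single transmission error garbles several bytes on deciphering.

SBOX = [(7 * i + 3) % 256 for i in range(256)]   # a simple byte permutation
INV_SBOX = [0] * 256
for i, s in enumerate(SBOX):
    INV_SBOX[s] = i

def mix(block):
    """XOR each byte with the XOR of the whole block; self-inverse."""
    t = block[0] ^ block[1] ^ block[2] ^ block[3]
    return [t ^ b for b in block]

def encrypt(block, rounds=3):
    for _ in range(rounds):
        block = mix([SBOX[b] for b in block])
    return block

def decrypt(block, rounds=3):
    for _ in range(rounds):
        block = [INV_SBOX[b] for b in mix(block)]
    return block

msg = [72, 69, 76, 80]                 # "HELP"
ct = encrypt(msg)
assert decrypt(ct) == msg              # error-free transmission deciphers fine

garbled = ct[:]
garbled[0] ^= 1                        # a one-bit transmission error
damaged = decrypt(garbled)
changed = sum(1 for a, b in zip(msg, damaged) if a != b)
# the single-bit error has spread to multiple plaintext bytes
```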
In addition to the important work described above, Shannon found time to pursue other proj-
ects of a more recreational nature. Examples include rocket-powered Frisbees, a motorized pogo
stick, machines to play chess and solve Rubik’s cube, a flame-throwing trumpet, and a mysterious
box with a switch on it. When someone saw this box sitting on his desk and flipped the switch,
the box would open and a mechanical hand would reach out and flip the switch back again. After
this, the hand would pull back into the box and the lid would close, returning the system to its
original state.17
The National Security Agency, not bothered by mild eccentricities, invited Shannon to join
their Scientific Advisory Board.
Perhaps, because of that information theory, it was suggested that he [Shannon] should
be on our Advisory Board [NSASAB] – and he was appointed to it. He came down
there and was tremendously interested in what he found there. He sort of repudiated
his book on secret communications after that. He said that he would never have writ-
ten it if he knew then what he learned later.18
So, what did Shannon see when he joined NSASAB? While this question will remain unanswered,
NSA is the focus of the next chapter. The present chapter closes with brief sections on entropy in
religion and literature.
The first and second laws of thermodynamics have been used, affirmed, rejected,
manipulated, exploited, and criticized in order both to further and to censure religion.
– Erwin N. Hiebert19
Sir Arthur Eddington remarked, “The law that entropy always increases—the second law of ther-
modynamics—holds, I think, the supreme position among the laws of Nature.”20 As such, it is not
16 Shannon, Claude E., “Communication Theory of Secrecy Systems,” The Bell System Technical Journal, Vol. 28,
No. 4, October 1949, pp. 656–715, p. 713 cited here.
17 http://en.wikipedia.org/wiki/Claude_Shannon.
18 Campaigne, Howard H, AFIPS Oral History Interview, 1974, p. 10.
19 Hiebert, Erwin N., “The Uses and Abuses of Thermodynamics in Religion,” Daedalus, Vol. 95, No. 4, Fall
p. 74.
surprising that it has been used to rationalize previously held beliefs of various individuals, despite
these beliefs sometimes being mutually exclusive!
One example is its application in 1951 by Pope Pius XII, who claimed that the second law
of thermodynamics confirmed traditional proofs of the existence of God.21 By his reasoning, if
everything must wind down, the Universe cannot be infinitely old or it would have already wound
down completely. Therefore, the Universe must have had a beginning, a creation, and therefore
a creator. The big bang theory, which many scientists back, also gives a date for the beginning of
the Universe, but these scientists tend to see it as a process of creation that does not require a God.
In another direction, some scientists have argued that the idea of an eternal afterlife is a viola-
tion of the second law. That is, if there is a heaven or hell, it cannot last forever. The punk rock
band Bad Religion recorded a song titled “Cease” that makes this point in a somewhat subtle way.
The second law is not explicitly mentioned in the song, but it can be inferred.
Creationists sometimes argue that life on Earth has become more and more complex over time
and that the only way this apparent violation of the second law could be possible is if God caused
it. The scientific rebuttal to this is that the second law only applies to closed systems. The Earth is
not a closed system, because a massive amount of energy is constantly transferred to it by the sun.
It is this energy, ultimately, that makes evolution possible.
There’s more that could be said on this topic, but we now turn to a less controversial use of
entropy.
Entropy in The Crying of Lot 49 can thus be seen both as its main organizing principle
and as Pynchon’s basic philosophical assumption… Even if the world described there
is entropic, it would seem that by the very act of writing and reading about it, a certain
21 Freese, Peter, From Apocalypse to Entropy and Beyond: The Second Law of Thermodynamics in Post-War American
Fiction, Die Blaue Eule, Essen, Germany, 1997, p. 126. William Ralph Inge made the same argument in the
1930s. See p. 1069 of Hiebert, Erwin N., “The Uses and Abuses of Thermodynamics in Religion,” Daedalus,
Vol. 95, No. 4, Fall 1966, pp. 1046–1080.
22 Lewicki, Zbigniew, The Bang and the Whimper: Apocalypse and Entropy in American Literature, Greenwood
Library and Information Science, No. 9.2, 1983, pp. 135–148, p. 137 cited here.
amount of information is passed on, which would cause a decrease of entropy, at least
locally. But – and this seems to be Pynchon’s ultimate coup – The Crying of Lot 49 conveys
practically no such information. We do not learn anything about the characters
that is not ambiguous.24
Many novels make use of mysteries to intrigue readers and keep them turning the pages, but by
the end a nice resolution has been presented and everything makes sense. By contrast, at the end
of a novel in the postmodern genre, things make less sense than at the start. The Crying of Lot
49 accomplishes this, providing another style of entropic literature. There is less order at the end.
Postmodernism isn’t limited to text, however. An example made for television is the series Lost
(2004–2010). Mad magazine suggested that the title referred to the script.
Some authors have attempted to reverse entropy. There was even a journal devoted to this pur-
pose, namely Extropy: Transhuman Technology, Ideas, and Culture, which first appeared in 1989. It
featured articles such as “The Heat Death of Timothy Leary,” in which the deceased was criticized
for allowing himself to be cremated instead of fighting entropy by being cryogenically preserved
(1996).25 This was a nonfiction piece. Can fiction fight entropy? Lewicki related the manner in
which some authors attempted this:
It would follow from what has been said that the most improbable messages, namely
those composed of words haphazardly put together, could most effectively counter
entropy and provide the greatest amount of information. Such works of literature have
in fact been created, but common sense tells us that they have neither decreased the
level of entropy nor offered much information.26
Recall that Shannon considered messages of low probability to contain the most information.
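Shannon’s measure can be stated in a few lines of code. The following is a minimal sketch (the function name is mine) of self-information, the number of bits conveyed by a message of probability p; the rarer the message, the more bits it carries:

```python
import math

def self_information(p):
    """Bits of information conveyed by an event of probability p: -log2(p)."""
    return -math.log2(p)

# A likely message (p = 1/2) carries 1 bit;
# an improbable one (p = 1/1024) carries 10 bits.
print(self_information(1 / 2))     # 1.0
print(self_information(1 / 1024))  # 10.0
```

On this measure a string of haphazardly chosen words scores very high, which is precisely the gap between Shannon’s technical sense of “information” and everyday meaning that the passages quoted here turn on.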
Anthony Purdy, a Professor of Romance Languages at the University of Alberta, also addressed
such attempts.
Hence the simplistically reductive (and self-contradictory) belief that, since entropy
is a measure of probability, the less predictable and the more ‘experimental’ a literary
work, the more effective it will be in the ‘struggle against entropy’.27
He noted, however, that “there is no necessary correlation between information and meaning.”28
While the information content may have been high, according to Shannon’s calculation, it didn’t
24 Lewicki, Zbigniew, The Bang and the Whimper: Apocalypse and Entropy in American Literature, Greenwood
Press, Westport, Connecticut, 1984, pp. 92–93.
25 More, Max, “The Heat Death of Timothy Leary,” Extropy: Transhuman Technology, Ideas, and Culture,
#17, August 1996. The full table of contents for this issue can be seen at http://extropians.weidai.com/
extropians.96/0133.html. I first saw this particular article referenced in Freese, Peter, From Apocalypse to
Entropy and Beyond: The Second Law of Thermodynamics in Post-War American Fiction, Die Blaue Eule, Essen,
Germany, 1997, pp. 98–99.
26 Lewicki, Zbigniew, The Bang and the Whimper: Apocalypse and Entropy in American Literature, Greenwood Press, Westport, Connecticut, 1984, p. 9.
28 Bruce, Donald and Anthony Purdy, editors, Literature and Science, Rodopi, Amsterdam, Netherlands, 1994,
p. 11.
Claude Shannon ◾ 339
correspond to what we are used to thinking of as information. Purdy gave examples of how some
attempts failed:
…the high information content of such literary works as Marc Saporta’s Composition
#1 or Raymond Queneau’s Cent mille milliards de poèmes, which depend on a random-
izing principle akin to the shuffling of a deck of cards, does not generate a correspond-
ing increase in meaning. In fact, the high entropy of the source tends, if anything, to
reduce the amount of information transmitted in any single reading…29
The journal devoted to the cause of extropy gave in to entropy in 1996 and folded.
As was explained in Section 11.2, languages follow the second law of thermodynamics. That
is, the (information) entropy of a language increases over time. Some novels set in the future do
not reflect this and the characters speak in the same manner as the author’s contemporaries. One
notable exception is Anthony Burgess’s A Clockwork Orange. This novel begins with
Our pockets were full of deng, so there was no real need from the point of crasting
any more pretty polly to tolchock some old veck in an alley and viddy him swim in his
blood while we counted the takings and divided by four, nor to do the ultraviolent on
some shivering starry grey haired ptitsa in a shop and go off with the till’s guts. But, as
they say, money isn’t everything.
At least one edition has a small dictionary in the back to help readers comprehend the future
slang Burgess introduced. Much of it was Russian in origin. In America, many Spanish words are
part of everyday speech, whether it is the speaker’s native language or not. We understand what is
meant when someone is described as “macho” and when a hate-monger refers to a group of people
as “bad hombres.”
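The entropy of a sample of text can be estimated directly from its character frequencies. Below is a minimal sketch of such a zeroth-order estimate; note that it ignores the dependencies between letters, which is why it overstates the true entropy of English that Shannon, in his 1951 paper, put at roughly one bit per character:

```python
import math
from collections import Counter

def char_entropy(text):
    """Zeroth-order entropy estimate, in bits per character:
    H = sum over observed characters of -p * log2(p)."""
    counts = Counter(text)
    n = len(text)
    return sum((c / n) * -math.log2(c / n) for c in counts.values())

# Repetition means low entropy; spreading mass over more symbols raises it.
print(char_entropy("aaaa"))  # 0.0
print(char_entropy("abab"))  # 1.0
print(char_entropy("abcd"))  # 2.0
```

By this crude measure, an influx of borrowed and coined words, like the Russian-derived slang Burgess gives his characters, spreads a language’s letter and word statistics over more possibilities and so nudges its entropy upward.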
While there’s a large quantity of fiction in which entropy plays an important role, my favorite
is a short story by Isaac Asimov, a man who straddled the worlds of literature and science (he had
a PhD in chemistry). The tale is titled “The Last Question” and it connects entropy and religion
in a very entertaining way. Instead of summarizing it here, I encourage you to seek out and enjoy
the original. It has been reprinted often and details are provided in the On Entropy in Literature
section of the References and Further Reading list below.
29 Bruce, Donald and Anthony Purdy, editors, Literature and Science, Rodopi, Amsterdam, Netherlands, 1994,
p. 11.
Elias, Peter, “Two Famous Papers,” IRE Transactions on Information Theory, Vol. 4, No. 3, September 1958,
p. 99. In this humorous piece Elias, the editor of the journal it appeared in, complains about two
types of papers that he wishes people would stop writing. One type details premature attempts at
revolutionizing various fields using the ideas of information theory. Elias used “Information Theory,
Photosynthesis and Religion” (title courtesy of D. A. Huffman) to represent this class of papers.
Gleick, James, The Information: A History, A Theory, A Flood, Vintage Books, New York, 2011.
Golomb, Solomon W., Elwyn Berlekamp, Thomas M. Cover, Robert G. Gallager, James L. Massey, and
Andrew J. Viterbi, “Claude Elwood Shannon (1916-2001),” Notices of the American Mathematical
Society, Vol. 49, No. 1, January 2002, pp. 8–16.
Hellman, Martin, “An Extension of the Shannon Theory Approach to Cryptography,” IEEE Transactions on
Information Theory, Vol. 23, No. 3, May 1977, pp. 289–294.
Leff, Harvey S., Maxwell’s Demon, Entropy, Information, Computing, Princeton University Press, Princeton,
New Jersey, 2014.
Pierce, John Robinson, Symbols, Signals, and Noise: The Nature and Process of Communication, Harper &
Row Publishers, New York, 1961. This is a longer, less mathematical, presentation of Shannon’s ideas,
intended for a broader audience.
Reeds, James, “Entropy Calculations and Particular Methods of Cryptanalysis,” Cryptologia, Vol. 1, No. 3, July
1977, pp. 235–254. The author notes the difficulty encountered in solving ciphers that are just slightly
over the unicity point in length. He shows how to approximate the length L that will allow solution in
practice, as opposed to merely in theory. His calculations rely on using a value for D that represents the
amount of redundancy in the language that is actually exploited by the cryptanalytic methods applied.
Roch, Axel, “Biopolitics and Intuitive Algebra in the Mathematization of Cryptology? A Review of
Shannon’s ‘A Mathematical Theory of Communication’ from 1945,” Cryptologia, Vol. 23, No. 3, July
1999, pp. 261–266.
Shannon, Claude E., “A Mathematical Theory of Communication,” reprinted with corrections from The
Bell System Technical Journal: Vol. 27, pp. 379–423, 623–656, July, October, 1948.
Shannon, Claude E., “Communication Theory of Secrecy Systems,” The Bell System Technical Journal, Vol.
28, No. 4, October, 1949, pp. 656–715. Shannon noted, “The material in this paper appeared in a
confidential report ‘A Mathematical Theory of Cryptography’ dated Sept. 1, 1946, which has now
been declassified.” Following Shannon’s death, Jiejun Kong wrote, “Recently I am shocked to find
that this paper does not have a typesetted version on the colossal Internet, the only thing people can
get is a set of barely-legible scanned JPEG images from photocopies (see http://www3.edgenet.net/
dcowley/docs.html). So here is my memorial service to the great man. I spent a lot of time to input
and inspect the entire contents of this 60-page paper. During my typesetting I am convinced that his
genius is worth the time and effort I spent!” His work may be found at http://netlab.cs.ucla.edu/wiki/
files/shannon1949.pdf. Thank you, Jiejun!
Shannon, Claude E., “Prediction and Entropy of Printed English,” The Bell System Technical Journal, Vol.
30, No. 1, January 1951, pp. 50–64.
Shannon, Claude E., “The Bandwagon,” IRE Transactions on Information Theory, Vol. 2, No. 1, March 1956,
p. 3. In this piece, Shannon wrote,
I personally believe that many of the concepts of information theory will prove useful in
other fields… but the establishing of such applications is not a trivial matter of translating
words into a new domain, but rather the slow tedious process of hypothesis and experimental
verification.
Shannon, Claude E. and Warren Weaver, The Mathematical Theory of Communication, University of Illinois
Press, Urbana, 1949. This book reprints Shannon’s 1948 paper and Weaver’s popularization of it.
Sloane, Neil J. A. and Aaron D. Wyner, editors, Claude Elwood Shannon: Collected Papers, IEEE Press, New
York, 1993.
Soni, Jimmy and Rob Goodman, A Mind at Play: How Claude Shannon Invented the Information Age, Simon
& Schuster, New York, 2017. This is a biography of Claude Shannon.
Tribus, Myron and Edward C. McIrvine, “Energy and Information,” Scientific American, Vol. 225, No. 3,
September 1971, pp. 179–184, 186, 188.
“Variations of the 2nd Law of Thermodynamics,” Institute of Human Thermodynamics, http://www.
humanthermodynamics.com/2nd-Law-Variations.html. This page, which is part of a much larger
website, gives 118 variations of the famous law.
Weaver, Warren, “The Mathematics of Communication,” Scientific American, Vol. 181, No. 1, July
1949, pp. 11–15. This is an early popularization of Shannon’s work in information theory.
On Entropy in Literature
Asimov, Isaac, “The Last Question,” Science Fiction Quarterly, November 1956. This short story has
been reprinted numerous times. Collections of Asimov stories that include it are Nine Tomorrows
(1959), Opus 100 (1969), The Best of Isaac Asimov (1973), Robot Dreams (1986), Isaac Asimov: The
Complete Stories, Vol. 1 (1990). There’s even an audio version narrated by Leonard Nimoy. See http://
bestsciencefictionbooks.com/forums/threads/the-last-question-by-asimov.398/.
Bruce, Donald and Anthony Purdy, editors, Literature and Science, Rodopi, Amsterdam, Netherlands, 1994.
Burgess, Anthony, A Clockwork Orange, Ballantine Books, New York, 1965. This later paperback edition
includes a seven-page glossary, not present in the first edition, to help readers translate the slang used
in the novel.
Di Filippo, Paul, Ciphers: A Post-Shannon Rock-n-Roll Mystery, Cambrian Publications, Campbell,
California, 1997. The first sentence of Chapter 00000001 of this novel gives the reader an idea of
what he or she is getting into:
That this sophic, stochastic, Shannonesque era (which, like most historically identifiable
periods, resembled a nervous tyro actor insofar as it had definitely Missed Its Cue, arriving
when it did precisely in July 1948, ignoring conventional calendars and expectations, which
of course dictated that the Zeitgeist should change only concurrently with the decade)—that
this era should today boast as one of its most salient visual images the widely propagated photo
of a barely post-pubescent actress dry-humping a ten-foot long, steel-grey and olive-mottled
python thick as a wrestler’s biceps (and what a cruel study for any wrestler, whether to fuck or
pinion this opulent opponent)—this fact did not bother Cyril Prothero (who was, after all, a
product of this selfsame era) half so much as that it (the era) seemed—the more he learned, the
more wickedly perverse information that came flooding into his possession—to be exquisitely
poised, trembling, just awaiting A Little Push, on the verge of ending.
I expect that readers will either love or hate this novel, and that all those who fall into the first (smaller)
group will be aware of the significance of July 1948.
Freese, Peter, From Apocalypse to Entropy and Beyond: The Second Law of Thermodynamics in Post-War
American Fiction, Die Blaue Eule, Essen, Germany, 1997.
Lewicki, Zbigniew, The Bang and the Whimper: Apocalypse and Entropy in American Literature, Greenwood
Press, Westport, Connecticut, 1984.
Poe, Edgar Allan, “The Fall of the House of Usher,” Burton’s Gentleman’s Magazine, September 1839. You
won’t have trouble finding a reprint of this tale.
Pynchon, Thomas, “Entropy,” Kenyon Review, Vol. 22, No. 2, Spring 1960, pp. 277–292. Despite his studies,
Pynchon admitted:
Since I wrote this story I have kept trying to understand entropy, but my grasp becomes less
sure the more I read. I’ve been able to follow the OED definitions, and the way Isaac Asimov
explains it, and even some of the math. But the qualities and quantities will not come together
to form a united notion in my head.30
Pynchon, Thomas, The Crying of Lot 49, J.B. Lippincott, Philadelphia, Pennsylvania, 1966.
Shaw, Deborah and Charles H. Davis, “The Concept of Entropy in the Arts and Humanities,” Journal of
Library and Information Science, Vol. 9, No. 2, 1983, pp. 135–148, available online at https://jlis.glis.
ntnu.edu.tw/ojs/index.php/jlis/article/viewFile/141/141.
Wells, H. G., The Time Machine, Henry Holt and Company, New York, 1895. Prior to appearing in this
form, it was serialized in the January through May issues of The New Review. Near the end of the
novel, the protagonist travels farther into the future and witnesses the consequences of entropy.
Discography
A pair of songs dealing with entropy are:
“Cease,” from Bad Religion’s 1996 album The Gray Race.
“Entropy,” from MC Hawking’s 2004 album A Brief History of Rhyme: MC Hawking’s Greatest Hits.
Videography
Claude Shannon: Father of the Information Age, http://www.ucsd.tv/search-details.aspx?showID=6090. This
first aired on January 30, 2002, on UCTV, San Diego.
In this simple life it is apparent that the image of birth has been transferred from the mother
of the individual to the sky.
The issue may have been debated from the great traditions of the surah. It is opposed however
to a distinctly shamanistic system of experience.
In this simple rite it is apparent that the image of birth has been transferred from the mother
of the individual to the sky.
The image can have been derived from the great traditions of the south. It is applied however
to a distinctly shamanistic system of experience.
For the challenges with 70% of the letters missing, I was only able to come up with a solution that made
sense for the first one:
As you walk through the industrial towns you lose yourself in labyrinths of little tract houses
blackened by smoke.
30 Pynchon, Thomas, Introduction to “Entropy,” Slow Learner: Early Stories, Little, Brown, Boston, Massachusetts,
1984, p. 14.
I noted that “little” could also be “simple” and “tract” could be “ranch” or “shack.” I considered other alter-
natives, but these were my best guesses. Josh had actually started with:
As you walk through the industrial towns you lose yourself in labyrinths of little brick houses
blackened by smoke.
So, I was only off by one word. I had considered “brick” as a possibility, but I’m used to thinking of brick
homes as being more expensive and that didn’t seem to fit the context.
In each example, I got at least one word wrong. How did you do?
As for the other two challenges with 70% of the letters missing, I managed to string some words together
that made sense, but I couldn’t keep the whole sentence on a single topic! They ended up looking like diverse
sentences spliced together. My nonsensical “solutions” were:
The Texan made me bury Timothy the newsgroup sargent of blog metal cutaways filed metal
into flat deadly havoc of cavalry gun.
Italy harbor are giant ships and the cooperating children exactly fish where raw open bed of
leveled mud exists, and such lakes join two seas.
The sentences Josh had actually used were:

The train bore me away, through the monstrous scenery of slag heaps, chimneys, piled scrap
iron, foul canals, paths of cindery mud…
Still houses are being built, and the Corporation building estates, with their row upon row of
little red houses, all much liker than two peas.
Probably, you did better than I did! I should note that I resisted the temptation to Google short strings of
words I suspected were present in the hope of finding the quotes. To make it a fair test, just work with pro-
grams accessing lists of words, not texts.
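A word-list-only attempt of the kind just suggested might look like the following sketch. The function and the tiny word list are mine, purely for illustration; a real attempt would load a full dictionary file:

```python
import re

def candidates(pattern, wordlist):
    """Words from the list that fit a pattern in which '.' marks a missing
    letter, e.g. 'br.c.' fits 'brick'. No source texts are consulted."""
    rx = re.compile(pattern.replace(".", "[a-z]") + "$")
    return [w for w in wordlist if len(w) == len(pattern) and rx.match(w)]

# A tiny illustrative list; /usr/share/dict/words would serve in practice.
words = ["brick", "black", "track", "tract", "shack", "ranch", "little", "simple"]
print(candidates("..ack", words))  # ['black', 'track', 'shack']
print(candidates(".r.c.", words))  # ['brick', 'track', 'tract']
```

The ambiguity these challenges exploit shows up immediately: a single five-letter pattern can leave several plausible words, and only context, the very thing the program lacks, decides among them.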
Chapter 12

National Security Agency
There is a large gap in the literature on cryptology. Following Claude Shannon’s papers “A
Mathematical Theory of Communication” (1948) and “Communication Theory of Secrecy Systems”
(1949), there was virtually nothing new until 1967, when David Kahn’s The Codebreakers, a historical
work, was published. There are some exceptions, such as the papers by Jack Levine, which although
original, were not in the direction that Shannon’s work pointed. A great deal of new research was
being done, but the public was unaware of it because of the National Security Agency (NSA).
Figure 12.1 Providing and protecting vital information through security. (http://www.nsa.gov.)
As Figure 12.1 indicates, NSA does not stand for No Such Agency. Yet, for much of the
agency’s history this name was appropriate, as the organization was deeply shrouded in secrecy.
What few histories exist are so highly classified with multiple codewords that almost
no one has access to them.
—James Bamford1
1 Bamford, James, Body of Secrets: Anatomy of the Ultra-Secret National Security Agency from the Cold War through
the Dawn of a New Century, Doubleday, New York, 2001.
The situation has changed somewhat, since Bamford made this comment. NSA has released a
four-volume history of the agency, but a great deal of material has been redacted from it. Still, this
represents a tremendous break from tradition; the agency was literally born in secrecy.
Figure 12.2 The National Security Agency’s headquarters at Fort George G. Meade. (Courtesy
of the National Security Agency, https://web.archive.org/web/20160325220758/http://www.
nsa.gov/about/_images/pg_hi_res/nsa_aerial.jpg.)
2 Bamford, James, The Puzzle Palace, Houghton, Mifflin, and Company, New York, 1982, p. 16.
3 Both fell under the Signal Corps, and many other names were used for the group over the years. Because
Friedman obtained Yardley’s materials, we may start with the Cipher Bureau, and continue the chain with
Signal Intelligence Service, Signal Security Division, Signal Security Branch, Signal Security Service, Signal
Security Agency, and finally Army Security Agency. And this was all in a 30-year period (1917–1947)!
National Security Agency ◾ 347
Unlike the much smaller Central Intelligence Agency (CIA), NSA is under the Secretary of
Defense and is, per Truman’s original order, tasked with serving the entire government. Today,
NSA’s main directorates are the Information Assurance Directorate (IAD) and the Signals
Intelligence Directorate (SID). While the SID side (cryptanalysis) is sexier, IAD could be more
important. If you could only spend money on one side, which would it be, offense or defense? You
can ponder this while reading the history that follows.
12.2 TEMPEST
NSA’s precursor, AFSA, failed to predict the outbreak of war in Korea, but America’s cryptologists did
go on to have some cryptanalytic success later in that conflict.4 There were also early successes against
the Soviets in the Cold War.5 On the other side of the code battle, there were serious problems, right
from the start, in protecting America’s own communications, as a once-classified history relates.
At this point, the newly established NSA decided to test all of its equipment. The
result—everything radiated. Whether it was mixers, keying devices, crypto equip-
ment, EAM machinery, or typewriters, it sent out a signal…[half a line of text
redacted]… Plain text was being broadcast through…[half a line of text redacted]…
the electromagnetic environment was full of it.6
Various countermeasures were taken to minimize the distance at which emanations could be
measured to reveal information. These countermeasures were dubbed TEMPEST (Transient
Electromagnetic Pulse Emanation Standard).7 The term is used for both an equipment specifica-
tion and the process of preventing usable emanations. If you pursue this topic, you’ll see references
to TEMPEST attacks, but this is not technically correct. Although it is clear what the writers
mean, TEMPEST technology is purely defensive.
Van Eck phreaking is a term that may properly be used for attacking a system by measuring
electromagnetic radiation, but only in the special case where the intent is to reproduce the
contents of a monitor. This can be done at impressive distances, or from as nearby as the hotel
room next door.
This type of attack is named after the Dutchman Wim van Eck, who authored the 1985 paper
“Electromagnetic Radiation from Video Display Units: An Eavesdropping Risk?” in which he
demonstrated the attack for CRTs.8 In 2004, another researcher revealed that LCD systems are
4 Johnson, Thomas, R., American Cryptology During the Cold War, 1945–1989, Book I: The Struggle for
Centralization, 1945–1960, Center for Cryptologic History, National Security Agency, Fort George G. Meade,
Maryland, 1995, p. 33.
5 See Section 2.8 for information on Venona.
6 Johnson, Thomas, R., American Cryptology During the Cold War, 1945–1989, Book I: The Struggle for
Centralization, 1945–1960, Center for Cryptologic History, National Security Agency, Fort George G. Meade,
Maryland, 1995, p. 221.
7 Some sources say that TEMPEST was simply a codeword and not an acronym. If true, what I provide here must
have been made up later. A list of variants, including Tiny Electromagnetic Particles Emitting Secret Things,
is given at https://acronyms.thefreedictionary.com/tempest.
8 Van Eck, Wim, “Electromagnetic Radiation from Video Display Units: An Eavesdropping Risk?,” Computers
& Security, Vol. 4, No. 4, December 1985, pp. 269–286. John Young wrote, “Wim van Eck’s article is actually
the source of most of the incorrect TEMPEST information out there.” See http://cryptome.org/tempest-time.
htm, for a TEMPEST timeline with Young’s corrections.
also vulnerable to this sort of attack and constructed the necessary equipment to carry out the
attack for less than $2,000.9
Attacks made possible by electromagnetic emissions aren’t limited to monitors. In 1956, a
bugged telephone allowed the British to hear a Hagelin machine used by Egyptians in London.
The mere sound of the machine allowed the British cryptanalysts to determine its settings and
recover messages.10 In general, electromechanical cipher machines are vulnerable to such acousti-
cal attacks. This one example is used here to represent many incidents.
In October 1960, following a briefing by NSA, the United States Communications Security
Board (USCSB) established the Subcommittee on Compromising Emanations (SCOCE) to study
the problem. The committee learned that the Flexowriter was the worst emanator, allowing a
properly equipped observer to read plaintext from a distance of 3,200 feet.11
Non-electronic/electric encryption doesn’t offer a safe alternative. Microphones have been
hidden in typewriters to allow recordings to be made of sensitive information being typed. The
sounds of the keys hitting the paper may be distinguished to reveal the individual letters. This
can also be done by placing a tiny microphone between keys on a computer keyboard.12 It doesn’t
matter how good a nation’s ciphers are if the enemy can get the messages by other means! A great
many (possibly) secure methods of encryption are implemented in insecure ways.
11 Johnson, Thomas, R., American Cryptology During the Cold War, 1945–1989, Book II: Centralization Wins,
1960–1972, Center for Cryptologic History, National Security Agency, Fort George G. Meade, Maryland,
1995, p. 381.
12 Keefe, Patrick Radden, Chatter: Dispatches from the Secret World of Global Eavesdropping, Random House,
New York, 2005.
14 Johnson, Thomas, R., American Cryptology During the Cold War, 1945–1989, Book II: Centralization Wins, 1960–
1972, Center for Cryptologic History, National Security Agency, Fort George G. Meade, Maryland, 1995, p. 293.
15 Johnson, Thomas, R., American Cryptology During the Cold War, 1945–1989, Book II: Centralization Wins, 1960–
1972, Center for Cryptologic History, National Security Agency, Fort George G. Meade, Maryland, 1995, p. 479.
16 Bamford, James, Body of Secrets: Anatomy of the Ultra-Secret National Security Agency from the Cold War through
the Dawn of a New Century, Doubleday, New York, 2001, p. 578. Johnson, Thomas, R., American Cryptology
During the Cold War, 1945–1989, Book II: Centralization Wins, 1960–1972, Center for Cryptologic History,
National Security Agency, Fort George G. Meade, Maryland, 1995, p. 368 reveals that the 5-acre mark had
almost been hit by 1968.
17 Odom, William, Fixing Intelligence for a More Secure America, second edition, Yale University Press, New
Haven, Connecticut, 2004.
to capture signals, yet avoided danger by remaining in international waters. The expectation was
that the ships would be ignored by those being spied upon, just as the United States ignored Soviet
SIGINT ships.18 That expectation was not met.
During the Six-Day War in June 1967, Israel attacked the spy ship USS Liberty. Thirty-four
men died as a result and many more were injured. Although the Liberty was flying an American
flag, the Israeli government claims to this day that they weren’t aware that it belonged to the
United States until after the attack. There is intense disagreement among researchers as to whether
this is true or not. The section on the Liberty in the reference list at the end of this chapter offers
volumes representing both perspectives. Like the question of whether or not Yardley sold secrets to
the Japanese, this debate is likely to continue for many years.
In January 1968, close on the heels of the attack on the Liberty, another SIGINT ship, the
USS Pueblo was captured, along with her crew, by North Korea. This happened so quickly that
only a small fraction of the classified material on board could be destroyed. A declassified history
described the event: “It was everyone’s worst nightmare, surpassing in damage anything that had
ever happened to the cryptologic community.”19
By following Kerckhoffs’s rules, however, NSA averted an even greater disaster. All of the NSA’s
cipher equipment was designed to remain secure, even if the enemy somehow gained the full details
of the designs, as happened here. Security truly resided in the keys, and those that were captured
would not be used again; hence, only messages from late 1967 and early 1968 were compromised.20
Although there was less loss of life with the capture of the Pueblo (one death) than in the attack
on the Liberty, the Koreans held the surviving 82 crewmembers as prisoners, and they proved to be
brutal captors. Beatings were carried out in private while the North Koreans provided pictures for
public consumption intended to show how well the Americans were being treated. The crew of the
Pueblo minimized the propaganda value of these pictures by positioning their hands to convey a
coded signal the Koreans didn’t recognize. Upon being asked, the Americans explained that it was
the “Hawaiian good luck sign.” See Figure 12.3.
How the North Koreans learned that the true meaning of this gesture was not “good luck”
will be related shortly. But understand now that they only learned this after the pictures had been
distributed throughout the world! Upon getting the message, the North Koreans retaliated with
greater physical abuse. In all, the crew spent 11 months in captivity, before America signed an
insincere apology that gained their release.
A reunion picture similar to that in Figure 12.4 served as the cover of the Fall 2008
CRYPTOLOG (Vol. 29, No. 4), the journal of the U.S. Naval Cryptologic Veterans Association
(NCVA). After seeing this cover, I emailed Jay Browne to tell him that it gave me a chuckle. He
responded, in part, with the following:
The cover photo was somewhat “controversial.” I told Bob Payne (our Editor) to
standby for heavy seas. In fact, of the 4000 or so copies we print, we received a grand
total of 2 negative comments! After we received the first one I drafted an editorial for
18 Spy planes were another matter. The Soviets tried, and too often succeeded, in shooting these down, the most
famous incident being the U-2 piloted by Francis Gary Powers.
19 Johnson, Thomas, R., American Cryptology During the Cold War, 1945–1989, Book II: Centralization Wins,
1960–1972, Center for Cryptologic History, National Security Agency, Fort George G. Meade, Maryland,
1995, p. 439.
20 Johnson, Thomas, R., American Cryptology During the Cold War, 1945–1989, Book II: Centralization Wins,
1960–1972, Center for Cryptologic History, National Security Agency, Fort George G. Meade, Maryland,
1995, p. 452.
Figure 12.4 The 40th reunion of Pueblo crewmembers. (Courtesy of Karen Pike Photography.)
the winter issue - attached. Bob chose to ignore the issue all together, and in hindsight
he was probably right.
The unpublished editorial saw print for the first time in the first edition of this book and is
included again in this new edition.21
EDITORIAL
CRYPTOLOG has received several “comments” regarding the cover photograph
of the Fall issue. Some readers may have been offended by the display of the assem-
bled PUEBLO crew members and the so-called “Hawaiian Good Luck” sign, but
CRYPTOLOG believes there is a larger story involved.
To appreciate the historical context, the reader must go back to the events surround-
ing the capture of the United States Ship PUEBLO—the first such capture since the
1880s—and the treatment of her surviving crew. The late Commanding Officer of the
ship, Commander Lloyd M. Bucher, wrote in his book, “My officers and I were hauled
before [The] Glorious General who suddenly reappeared on the scene and delivered one
of his towering histrionic rages, which were both comic and frightening to behold. He
confronted us with and let us examine a copy of a page from Time magazine [18 October
1968 issue] containing a group picture of our infamous Room 13 gang. The caption
fully explained the meaning of the “Hawaiian Good Luck Sign.” … I also knew we were
about to pay for their severe loss of face. I had not been beaten yet, but Glorious General
kept me after he had dismissed the other officers and during a denunciation lasting sev-
eral hours, threatened me with speedy execution after a trial which he indicated was now
absolutely inevitable. He was pretty convincing about it and I was returned to my cell
feeling that my chances for survival had sunk to zero.”
“On the following day, men continued to be selected for beatings. Shingleton and
Scarborough received brutal ones. Radioman Hayes had his jaw broken. The officers
began catching it as well.”22
So the cost of an expression, a gesture, the men—our men—suffered mightily at
the hands of their captors. Other newspapers and magazines printed the “offending”
photo but it was Time that explained the meaning.
While the cover photo may have offended some readers, CRYPTOLOG is offended,
even today some 40 years later, by the treatment of the North Koreans and the callous
disregard of our people by Time magazine.
The cover photo speaks volumes to both—here’s to you North Korea and to Time!
The losses described above brought an end to the use of slow-moving ships for gathering intelligence.
22 Bucher, Lloyd M., with Mark Rascovich, Bucher: My Story, Doubleday, Garden City, New York, 1970.
23 Szulc, Tad, “The NSA - America’s $10 Billion Frankenstein,” Penthouse, November 1975, pp. 54–56, 70, 72,
184, 186, 188, 191–192, 194–195, p. 194 cited here.
24 The National Reconnaissance Office (NRO) once lost track of over $2 billion! See Weiner, Tim, “A Secret
Agency’s Secret Budgets Yield Lost Billions, Officials Say,” The New York Times, January 30, 1996, p. 5A.
25 Some budgets were, in fact, lumped together, but not as Szulc figured it. Beginning with fiscal year 1959, a
Consolidated Cryptologic Program (CCP) centralized all cryptologic budgeting (including the three services,
NSA, and, to a lesser extent CIA) under the Director of NSA (DIRNSA). See Johnson, Thomas, R., American
Cryptology During the Cold War, 1945–1989, Book I: The Struggle for Centralization, 1945–1960, Center for
Cryptologic History, National Security Agency, Fort George G. Meade, Maryland, 1995, p. 260.
National Security Agency ◾ 353
be working at the Fort Meade headquarters with a budget of around one billion dollars per year.
Not to be outdone, Playboy published an article by David Kahn the following month in which he
estimated that NSA employed 100,000 people with a budget of several billion dollars per year.26
A graph from a declassified history of the agency reveals how many employees were actually under
NSA’s control (Figure 12.5).
[Line graph: two series, labeled Civilian and Military; vertical axis in thousands (0 to 30); horizontal axis 1973 to 1993.]
Figure 12.5 Employment figures for NSA from 1973 to 1993. (From Johnson, Thomas R.,
American Cryptology During the Cold War, 1945–1989. Book III. Retrenchment and Reform,
1972–1980, Center for Cryptologic History, National Security Agency, Fort Meade, Maryland,
1995, p. 23.)
But why would Playboy and Penthouse both attempt to expose NSA’s size and budget within a
month of each other? In addition to their normal distaste for cover-ups, there was a bandwagon to
jump on. The U.S. intelligence agencies were being investigated for alleged crimes by a congressional
committee and almost every magazine and newspaper had something to say about it. The congres-
sional committee was led by Senator Frank Church, and is therefore often referred to as the Church
Committee. It examined ways in which the agencies illegally spied on and disrupted the activities of
American citizens. It seems that any group that did not fall in line with the status quo was targeted.
War protestors, civil rights activists, feminists, and Native American activists were all harassed
under the government sponsored COINTELPRO (counterintelligence program). Many of the vic-
tims, such as Martin Luther King, are now generally considered to have helped move the country in
the right direction.27 Other well-known individuals who had their privacy violated included actress
Jane Fonda, pediatrician and best-selling author Dr. Benjamin Spock, and folk singer Joan Baez.
The programs uncovered by the Church Committee investigation aroused much public indig-
nation. The CIA and FBI received the greatest scrutiny. NSA might have had a tougher time, but it
appears that the committee didn’t even want to investigate that particular agency! The paragraph
from Johnson’s history of NSA that indicates this follows (without any redactions this time).
To begin with NSA wasn’t even on the target list. But in the course of preliminary
investigation, two Senate staffers discovered in the National Archives files some
Defense paperwork relating to domestic wiretaps which referred to NSA as the source
of the request. The committee was not inclined to make use of this material, but the
two staffers leaked the documents to Representative Bella Abzug of New York, who
26 Kahn, David, “The Code Battle,” Playboy, December 1975, pp. 132–136, 224–228.
27 See http://www.icdc.com/~paulwolf/cointelpro/churchfinalreportIIIb.htm for details on the harassment of
King and http://www.icdc.com/~paulwolf/cointelpro/cointel.htm for more general information.
was starting her own investigation. Church terminated the two staffers, but the dam-
age had been done, and the committee somewhat reluctantly broadened its investiga-
tion to include the National Security Agency.28
NSA programs included SHAMROCK, which involved the interception of private cables from
the United States to certain foreign countries,29 and MINARET, which involved checking all elec-
tronic messages that had at least one terminal outside the United States for names on watch lists
provided by other agencies.30 According to NSA Director Lew Allen, Jr., between 1967 and 1973,
the Agency gave about 3,900 reports on about 1,680 Americans who were on the watch list.31
Another estimate for the larger time period from 1962 to 1973 includes about 75,000 Americans
and organizations in the group that was spied upon.32
Notice that in both cases at least one side of the communication link was outside the United
States, even though, in many cases, both individuals were Americans. There is no evidence of the
NSA spying on pairs of Americans within the United States. It’s been pointed out that Canada
can legally spy on Americans and that the NSA has a cozy sort of reciprocal intelligence agreement
with Canada and other countries,33 but NSA maintains that it doesn’t ask its allies to do anything
that it’s prohibited from doing itself.
Some web pages change frequently. At one time, the following was part of NSA’s official web
presence.
The agency, the investigations showed, had monitored the domestic conversations of
Americans without the proper court warrants. It was chastised and forbidden to over-
hear such communications, and Congress established a special court to grant national-
security wiretaps.
This is typically what happens when some part of the Federal Government is caught breaking the
law. An investigation is held, nobody is punished, and legislation is passed to re-outlaw the crime.
In another example of crime without punishment, former CIA director Richard Helms commit-
ted perjury and got off with just a $2,000 fine.34
The special court that was established to grant national-security wiretaps came into being with
the Foreign Intelligence Surveillance Act of 1978. Permission slips for eavesdropping thus became
known as FISA warrants and were granted by the Foreign Intelligence Surveillance Court. This
court approved so many requests that one critic refused to characterize it as a rubber stamp, point-
ing out that even a rubber stamp runs out of ink sometimes! On the other hand, supporters of the
program argue that warrants were almost never refused because the applications were well justified
28 Johnson, Thomas R., American Cryptology During the Cold War, 1945–1989, Book III: Retrenchment and
Reform, 1972–1980, Center for Cryptologic History, National Security Agency, Fort George G. Meade,
Maryland, 1995, pp. 92–93.
29 This program predated NSA, having begun with ASA, following World War II.
30 Halperin, Morton H., Jerry J. Berman, Robert L. Borosage, and Christine M. Marwick, The Lawless State, The
Crimes of the U.S. Intelligence Agencies, Penguin Books, New York, 1976, p. 173.
31 Foerstel, Herbert N., Secret Science: Federal Control of American Science and Technology, Praeger, Westport,
Connecticut, 1993.
32 de Leeuw, Karl and Jan Bergstra, editors, The History of Information Security, A Comprehensive Handbook, Elsevier,
Amsterdam, Netherlands, 2007, pp. 523–563, pp. 545–546 cited here.
33 Constance, Paul, “How Jim Bamford Probed the NSA,” Cryptologia, Vol. 21, No. 1, January 1997, pp. 71–74.
34 Bamford, James, Body of Secrets: Anatomy of the Ultra-Secret National Security Agency from the Cold War through
the Dawn of a New Century, Doubleday, New York, 2001.
in nearly every case. When FISA was first imposed, NSA decided that there would be no close
calls. Applications were only to be made on solid evidence. Also, the statistics are skewed by the
fact that in the early days weaker applications were sometimes returned without being officially
denied. They could then be strengthened with more information and submitted again or simply
forgotten. In this way, some possible rejections never became part of the statistics we see today.
In his four-volume history of NSA, Thomas R. Johnson maintains that the Agency did not act
improperly. For example, he states repeatedly that the 1968 “Omnibus Crime Control and Safe
Streets Act” overruled Section 605 of the Federal Communications Act of 1934, which forbade
eavesdropping.35 Even if this is true under the letter of the law, it clearly violates the spirit of the
constitution. Certainly the founding fathers would not have been amused by this justification.
The second half of the 1970s also marked the beginning of a debate between NSA and
American professors, mainly mathematicians, computer scientists, and engineers, who had begun
to make important cryptographic discoveries. Previously the NSA had a monopoly on research in
these areas, and they did not want to see it end. They feared the loss of control that public pursuits
in this field would entail. The various attempts of NSA to stop the academics are discussed in the
following chapters of this book, along with the relevant mathematics. Although much time was
spent battling congressional committees and academics in the 1970s, the NSA did manage to sign
a treaty in 1977 with a long-time enemy, the CIA.36
Details are lacking, but a breakthrough was reportedly made in 1979 in deciphering Russia’s
encrypted voice transmissions.37 With Venona successes having been disclosed in 1996, we may be
able to look forward to revelations on how voice decrypts affected Cold War politics in the 1980s
in the not-too-distant future.
35 Johnson, Thomas R., American Cryptology During the Cold War, 1945–1989, Book I: The Struggle for
Centralization, 1945–1960, Center for Cryptologic History, National Security Agency, Fort George G. Meade,
Maryland, 1995, p. 274 and Johnson, Thomas R., American Cryptology During the Cold War, 1945–1989, Book
II: Centralization Wins, 1960–1972, Center for Cryptologic History, National Security Agency, Fort George
G. Meade, Maryland, 1995, p. 474.
36 Johnson, Thomas R., American Cryptology During the Cold War, 1945–1989, Book III: Retrenchment and
Reform, 1972–1980, Center for Cryptologic History, National Security Agency, Fort George G. Meade,
Maryland, 1995, p. 197.
37 Bamford, James, Body of Secrets: Anatomy of the Ultra-Secret National Security Agency from the Cold War through
the Dawn of a New Century, Doubleday, New York, 2001, pp. 481–482.
40 Kahn, David, “The Code Battle,” Playboy, December 1975, pp. 132–136, 224–228.
spend it on some more jet fighters or ICBMs, probably the NSA investment is bet-
ter. Intelligence is cheap and cost-effective. It can often save more than it costs. But
if the Government were actually to spend the money on schools and hospitals and
transportation, that investment is probably better. For a nation’s strength depends far
less upon its secret intelligence than upon its human and material resources. No doubt
a balance is best. The problem is to strike that balance, and this depends largely on the
wisdom and determination of a country’s leaders, and of its people.
This did not represent a brand-new perspective. Decades earlier, an American President and
World War II general commented on the opportunity cost of military spending:
Every gun that is made, every warship launched, every rocket fired, signifies, in the
final sense, a theft from those who hunger and are not fed, those who are cold and are
not clothed. The world in arms is not spending money alone. It is spending the sweat
of its laborers, the genius of its scientists, the hopes of its children.
—Dwight D. Eisenhower41
Unfortunately, it is very difficult for anyone on the outside to determine whether the people’s
money is best spent on NSA or elsewhere. Although the agency’s successes are typically kept
secret (have you heard, for example, that NSA helped prevent a nuclear war between India and
Pakistan?), its failures usually receive a great deal of publicity. This leads to a warped perspective,
making it difficult to write a balanced account.
NSA’s gigantic parking lot, which fills completely early in the morning every weekday, seems
to indicate a healthy budget, which, in turn, indicates that Congress must be convinced it’s getting
its money’s worth from the Agency. Whatever the budget is, tough choices still need to be made
to stay within it. Although more parking spaces are badly needed, huge parking decks are not
being constructed. There are other projects that have a stronger need for the funds the decks would
require. So, the budget is not unlimited! Nor is it squandered on ridiculous salaries. Many agency
employees, with technical skills in high demand, could make much more money on the outside.
One employee that I met gave up such a high-paying job to go to work for NSA following 9/11.
The rewards of his new career are of a different nature.
41 Eisenhower, Dwight D., “The Chance for Peace,” speech to the American Society of Newspaper Editors,
Washington, DC, April 16, 1953. Quoted here from Zinn, Howard, Terrorism and War, Seven Stories Press,
New York, 2002, p. 96.
42 Bamford, James, The Puzzle Palace, Houghton, Mifflin, and Company, New York, 1982, pp. 321–324.
Ronald W. Clark.43 NSA got wind of this and exhibited great interest in the manuscript, which
seemed to confirm that there was something to the story. Clark was not intimidated by the Agency,
but he didn’t really know much about the trip. His biography merely opened the door on this
topic.
As the story went, the deal wasn’t made in a single trip. Friedman had to return, and in 1958,
Hagelin agreed. Crypto AG machines were eventually adopted for use by 120 nations, but it
seemed unlikely that they were all rigged. According to some accounts the security levels provided
depended on the country in which the machine was to be used. In any case, there doesn’t appear
to have been any suspicion of rigging until 1983.44 Twenty-five years is a very long time to keep
such a large-scale project secret. This particular quarter-century includes the switch from electro-
mechanical machines to computerized digital encryption, and the belief was that the backdoors
remained in place through this transition.
There are various accounts of how knowledge of the allegedly rigged machines leaked out. One ver-
sion is as follows. The spy Jonathan Pollard betrayed a tremendous amount of material to Israel, includ-
ing, apparently, details of the Crypto AG machines’ backdoors. This information was then given to the
Soviets, in 1983, in exchange for allowing a larger number of Jews to leave the Soviet Union for Israel.45
However the news got out, it is claimed to have later spread to Iran. The next episode in the
story is very well-documented. It is the arrest of the Crypto AG salesman Hans Buehler in March
1992. The Iranians charged him with spying for the Federal Republic of Germany and the United
States and imprisoned him for nine months. He was only released when Crypto AG paid a $1
million bail. A few weeks later, the company dismissed Buehler and insisted that he reimburse
them for their expense! Buehler initially had no idea why he was arrested, as he relates in his book
Verschlüsselt.46 He tells his story in the same style as Franz Kafka’s Der Prozess (The Trial), in which
the protagonist, who is arrested early in the story, never learns what the charges against him are.
Prior to his arrest, Buehler was not aware of any rigging of the machines he was selling, but he
was to eventually conclude, after speaking with several former employees of Crypto AG, that the
machines were rigged and that he was paying the price for the duplicity of others.
As you have surely noticed, I am not conveying any details of how exactly the backdoors
worked. Full details are not publicly available. Statements such as “The KGB and GRU found out
about the ‘seed key’ used by NSA as a ‘master key’ to unlock encoded communications transmit-
ted by Crypto AG machines,” made by Wayne Madsen, only hint at how it might have worked.47
43 Clark, Ronald, The Man Who Broke Purple, Little, Brown and Company, Boston, Massachusetts, 1977. As we
saw in Section 8.5, Clark’s title isn’t very accurate. Previous biographies by this author include the very popular
Einstein: the Life and Times (1971) and another on Bertrand Russell.
44 Madsen, Wayne, “Crypto AG: The NSA’s Trojan Whore?” Covert Action Quarterly, Issue 63, Winter 1998.
45 Madsen, Wayne, “The Demise of Global Communications Security, The Neocons’ Unfettered Access to America’s
Secrets,” Online Journal ™, September 21, 2005, http://67.225.133.110/~gbppr
org/obama/nytimes_ww2/09-21-05_Madsen.pdf. In this piece, Madsen also provides an alternate explanation,
one that doesn’t involve Pollard, for how the Soviets learned the secret: “Ex-CIA agents report that the Russian
intelligence successors to the former KGB were actually tipped off about the Crypto AG project by CIA spy
Aldrich Ames.” Having two different accounts shows how speculative this whole story really is.
46 Buehler, Hans, Verschlüsselt, Werd, Zürich, Switzerland, 1994. This book is in German and no translation is
currently available.
47 Madsen, Wayne, “The Demise of Global Communications Security, The Neocons’ Unfettered Access to
America’s Secrets,” Online Journal ™, September 21, 2005.
Other sources describe the keys as somehow being sent with the messages by the rigged machines.
Or, in the early days of mechanical machines, it could have been a simple matter of omitting
certain levers in the devices sold to particular customers. Could a mathematical analysis of the
machines reveal the secret? This would make an excellent research project, but a reluctance to
approach it is understandable. It may merely result in months of wasted effort with no new theo-
rems or results to show for the work.
In 1995, a Swiss engineer spoke to Scott Shane, then of The Baltimore Sun,48 under the condi-
tion that his anonymity be maintained. Shane revealed what he learned:
Sometimes the mathematical formulas that determined the strength of the encryption
contained certain flaws making the codes rapidly breakable by a cryptanalyst who
knew the technical details.
Again, this is intriguing, but not nearly as detailed as we would desire! Shane provided another
piece of evidence in support of the conspiracy, for which Crypto AG failed to provide any alterna-
tive explanation. It’s a 1975 document that shows NSA cryptographer Nora L. Mackebee attended
a meeting with Crypto AG employees to discuss the design of new cipher machines.50 Motorola
engineer Bob Newman recalls Mackebee at several meetings, as she was one of several consultants
who were present when Motorola was helping Crypto AG with designs as the company made
the switch from mechanical machines to electronic.51 Shane contacted Mackebee, who had since
retired, but she said she couldn’t talk about Crypto AG.52 Crypto AG executives consistently
denied the existence of backdoors in any of their machines, as one would expect whether they were
present or not.
In 2014 and 2015, NSA released over 52,000 pages of material connected with William F.
Friedman.53 Among these were papers indicating that the NSA-Crypto AG connection was real.
They showed that negotiations between Friedman and Hagelin dated back to 1951 and included
$700,000 in compensation for Hagelin. This was well before Friedman’s 1957 trip, uncovered by
his biographer Ronald W. Clark. It took a great deal of time to work out all of the details of the
agreement between Friedman and Hagelin, although Hagelin was cooperating from the start.54
Following the release of these documents, the Swiss company’s response to the old allegations
News-Features/Declassified-Documents/Friedman-Documents/.
54 What can be determined from NSA’s Friedman release is thoroughly detailed at Simons, Marc and Paul Reuvers,
“The gentleman’s agreement, Secret deal between the NSA and Hagelin, 1939–1969,” Crypto Museum, https://
www.cryptomuseum.com/manuf/crypto/friedman.htm, created: July 30, 2015, last changed: May 10, 2020.
shifted from strong denial to “whatever happened in the past, this is certainly not happening
today” and “mechanisms have been put in place, to prevent this from happening in the future.”55
On February 11, 2020, a much more complete story was revealed, to the dismay of the US
intelligence community. This time it wasn’t a planned release. The investigative team consisted of
Greg Miller of The Washington Post, and men and women from German and Swiss television.56
The compromise of Crypto AG was more complete than had been suspected. While NSA was
deeply involved, it was the CIA that turned out to be the Victor Kiam of the crypto equipment
market. Older readers will remember Kiam from commercials he did for Remington in which he
enthusiastically said, “I liked the shaver so much, I bought the company!” This is exactly what the
CIA did. On June 12, 1970, they secretly bought Crypto AG, in a joint purchase with the West
German Federal Intelligence Service (Bundesnachrichtendienst, or BND for short). In addition to
the United States and West Germany, four other countries, Israel, Sweden, Switzerland, and the
U.K., knew of the operation, or were given intelligence gathered from it.57 This group managed to
keep the identities of the new owners of Crypto AG secret from the public for 50 years. It makes
one wonder what other long-term secrets have been kept.58
The researchers based much of their reporting on histories prepared by the CIA and BND,
although they did not indicate how they obtained these histories.59 They also conducted interviews
with current and former members of the intelligence community and Crypto AG employees.
The CIA history noted that the Crypto AG material “represented over 40 percent of NSA’s total
machine decryptions, and was regarded as an irreplaceable resource.”60
Prior to becoming a secret owner, the CIA had made payments to Hagelin. There was one in
1960 for $855,000 to renew the “licensing agreement” that he had made with Friedman. There
were also annual payments of $70,000 and cash infusions of $10,000 for marketing Crypto AG
products. The latter helped to insure that the company would continue to dominate the world
market.61 Prior to their ownership, the CIA needed the company to stay successful! When CIA
and BND became co-owners, the profits were a nice bonus, which they poured into other
operations. And what a way to make money! The CIA history noted:
Foreign governments were paying good money to the U.S. and West Germany for the
privilege of having their most secret communications read by at least two (and possibly
as many as five or six) foreign countries.62
55 “The Crypto Agreement,” BBC Radio 4, July 28, 2015, available online at https://www.bbc.co.uk/programmes/
b0639w3v.
56 Simons, Marc and Paul Reuvers, “Operation RUBICON/THESAURUS, The secret purchase of Crypto AG by
BND and CIA,” Crypto Museum, https://www.cryptomuseum.com/intel/cia/rubicon.htm, created: December 12,
2019, last changed: May 10, 2020.
57 Miller, Greg, “The Intelligence Coup of the Century,” The Washington Post, February 11, 2020, available online
at https://tinyurl.com/yck5xur2.
58 Hint: https://tinyurl.com/y5oy4v78.
59 Miller noted, “The first [history] is a 96-page account of the operation completed in 2004 by the CIA’s Center
for the Study of Intelligence, an internal historical branch. The second is an oral history compiled by German
intelligence officials in 2008.”
60 Miller, Greg, “The Intelligence Coup of the Century,” The Washington Post, February 11, 2020, available online
at https://tinyurl.com/yck5xur2.
61 Miller, Greg, “The Intelligence Coup of the Century,” The Washington Post, February 11, 2020, available online
at https://tinyurl.com/yck5xur2.
62 Miller, Greg, “The Intelligence Coup of the Century,” The Washington Post, February 11, 2020, available online
at https://tinyurl.com/yck5xur2.
This operation went under the code name “Thesaurus,” later changed to “Rubicon.” The technical
details will be uncovered in the years to come, but it appears that the idea of backdoors wasn’t
quite right. Instead, two versions would be made of a machine: a good one for friendly nations
and one that appeared to be good, but was really much less secure, for other nations. Creating the
illusion of a secure system was tricky and, sometimes, foreign crypto experts would get suspicious.
Still, it worked in the electromechanical machine era, as well as with more advanced devices in
the decades that followed. Peter Jenks, of NSA, recognized that a circuit-based system could be
designed so that it appeared to generate random streams of characters, while it really had a short
enough period to be broken by NSA cryptanalysts with powerful computers at their disposal.63
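Jenks’s idea can be illustrated in miniature. The sketch below is a hypothetical toy, not the actual Crypto AG design (whose technical details remain unpublished): a keystream generator whose letters look random at a glance, but whose internal state is so small that the output repeats after only 256 characters, a cycle an analyst with enough intercepted keystream can find by simple search.

```python
# Hypothetical toy generator: a tiny multiplicative congruential
# generator over the prime 257. Because 3 is a primitive root mod 257,
# a nonzero state cycles through all 256 possible values, so the
# keystream has period exactly 256 -- far too short for real security.
def weak_keystream(seed, n):
    state = seed % 257 or 1              # avoid the stuck all-zero state
    letters = []
    for _ in range(n):
        state = (3 * state) % 257        # next internal state
        letters.append(chr(ord('A') + state % 26))  # emit a letter A..Z
    return ''.join(letters)

def find_period(stream):
    """Smallest shift p at which the observed stream lines up with
    itself; a short answer betrays the generator's tiny state."""
    for p in range(1, len(stream)):
        if all(stream[i] == stream[i + p] for i in range(len(stream) - p)):
            return p
    return None
```

Collecting 1,024 characters of this keystream and calling find_period on it returns 256; a generator with adequate internal state would show no repetition at any length an analyst could feasibly collect.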
The list of countries that would receive weaker systems kept growing, and the West Germans
became nervous about how broadly the Americans were spying. Nations receiving weaker
machines even included members of NATO. Fearing the fallout if this were to be exposed, the
Germans allowed the CIA to buy them out in 1994 and become the sole owner. But by 2018,
what had been the intelligence coup of the (20th) century may have been surpassed, for in that
year the CIA sold off Crypto AG’s assets.64
The Crypto AG connection is tremendously important, but it wasn’t the only newsworthy
item connected with NSA. The next section looks at some other developments at the agency over
the last 20 years.
63 Miller, Greg, “The Intelligence Coup of the Century,” The Washington Post, February 11, 2020, available online
at https://tinyurl.com/yck5xur2. This will make more sense after you read Chapter 19. Alternatively, you can
read Simons, Marc and Paul Reuvers, “Operation RUBICON/THESAURUS, The secret purchase of Crypto
AG by BND and CIA,” Crypto Museum, https://www.cryptomuseum.com/intel/cia/rubicon.htm, created:
December 12, 2019, last changed: May 10, 2020.
64 Miller, Greg, “The Intelligence Coup of the Century,” The Washington Post, February 11, 2020, available online
at https://tinyurl.com/yck5xur2.
65 Bamford, James, A Pretext for War: 9/11, Iraq, and the Abuse of America’s Intelligence Agencies, Doubleday, New
York, 2004.
There were massive increases in both budget and the number of employees after 9/11, but this
was not immediate. First came some reorganizing cuts, which were made by encouraging retire-
ments. In particular, NSA no longer needed so many Soviet linguists or high-frequency special-
ists.69 Bamford provides a physical description of NSA (as of 2004):
Nicknamed Crypto City, it consists of more than fifty buildings containing more than
seven million square feet of space. The parking lot alone covers more than 325 acres
and have [sic] room for 17,000 cars.70
While these stats are impressive, there’s a lot more to the agency. Former NSA Deputy Director
Chris Inglis put matters in perspective near the end of his service in 2014:
But if you want to really know what the core of NSA is, it’s its brain trust. It’s its
people. All right? We employ some, you know, number of people which includes 1,000
Ph.D.s, which includes a diverse array of disciplines that we bring to bear.71
It’s likely that much of the information gathered (by all modern nations) is collected by means
other than direct mathematical attacks on the mature cipher systems of the 21st century.
Backdoors, exploitation of electromagnetic emissions, and hacking attacks are probably the
source of much intelligence. A new Cyber Command Center (Cybercom) that carries out such
work, in addition to safeguarding American systems from such attacks, has been established
and is located at NSA. The director of NSA is now dual-hatted and also directs Cybercom. It
is simply too inconvenient to not have important systems online, and too dangerous to do so
without making intense efforts to protect these systems.
In recent years there’s been a massive amount of media attention on alleged domestic spy-
ing by NSA. The majority of the journalists making such claims are likely well-intentioned and
simply trying to report accurately; however, it seems that they are (in some cases) confusing
mere interception with actually reading messages or listening to conversations. NSA is allowed
to accidentally intercept domestic conversations, and one must understand that the technologi-
cal environment is such that it is, in many cases, impossible to intercept the desired targets, in
isolation, without also gathering untargeted items. Email messages don’t travel like postcards.
Instead they’re broken into packets, which may follow various paths before being reassembled
at the intended destination. Phone conversations are combined in large groups, to which data
compression algorithms are applied. Thus, to gather the intercepts NSA legitimately needs to
do its job, it must also unintentionally acquire other data. The unintended intercepts are then
filtered out.
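The packet behavior just described can be sketched in a few lines. This is a simplified illustration, not any agency’s actual collection code: a message is split into numbered packets, the packets may arrive in arbitrary order after taking different routes, and the receiver restores the original by sorting on the sequence numbers.

```python
import random

def packetize(message, size):
    """Split a message into (sequence_number, chunk) packets."""
    return [(seq, message[i:i + size])
            for seq, i in enumerate(range(0, len(message), size))]

def reassemble(packets):
    """Sort packets by sequence number and rejoin the chunks."""
    return ''.join(chunk for _, chunk in sorted(packets))

pkts = packetize("Email does not travel like a postcard.", 8)
random.shuffle(pkts)      # simulate packets taking different routes
print(reassemble(pkts))   # prints the original message
```

An eavesdropper sitting on any one route sees only some of the packets, mixed with packets from unrelated messages, which is why targeted interception inevitably sweeps up untargeted data.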
Of course, the potential for abuse is present. Americans tend to fear big government and
secrecy. And NSA is a very big government agency that must also be very secretive! NSA
employees swear an oath to protect the constitution and I believe they take this oath much
more seriously than recent Presidents have. Certainly, the Agency takes the oath seriously. An
NSA employee told me that the people who work there can make mistakes and keep their jobs,
69 Bamford, James, A Pretext for War: 9/11, Iraq, and the Abuse of America’s Intelligence Agencies, Doubleday, New
York, 2004, p. 356.
70 Bamford, James, A Pretext for War: 9/11, Iraq, and the Abuse of America’s Intelligence Agencies, Doubleday, New
York, 2004.
nsa-deputy-director-john-inglis, January 10, 2014. Note: Inglis goes by his middle name, Chris, despite how he
is named in this piece.
in many cases, but that if they spy on Americans they can be fired the same day. He said that
he’d seen it happen.
Everything I’ve seen and heard at NSA has convinced me that respect for the constitution is
a key component of the culture there. A casual conversation I had with an NSA employee helps
to illustrate this. I complained about Mitt Romney having said to a protestor, “Corporations
are people, my friend” 72 and she responded with something like, “I know and that’s a huge
pain for us, because if they’re American corporations, we can’t spy on them, even if they are
45% owned by a foreign country that’s really controlling them and they’re up to no good.”
So, American corporations have the same constitutional rights as American citizens and those
rights are respected by NSA.
Another NSA employee described how an American President wanted NSA to do things that
the director thought were prohibited. The director stood his ground and refused to cooperate.
Much of the legislation that presidents and congress have pushed for has not been asked for, or
even desired, by the Agency.
1. The applicant does not have anything in his or her background that might make him or her
subject to blackmail.
2. The applicant is not a spy.
Item number 1 justifies the asking of many potentially embarrassing questions. An acquaintance
of mine was asked if he had ever cheated on his wife. If he had, someone who knew about it
could blackmail him into revealing classified information in return for his silence on the matter.
I assume he was, indeed, faithful, as he was hired; however, the polygraph test was still rough on
him. Initially he was so nervous that everything showed up as a lie. Finally, they asked him if he
had ever killed anyone. He said, “No,” which also showed up as a lie! After a break, during which
he managed to calm down, things went more smoothly.
Oddly enough, there are applicants who confessed to crimes such as murder, rape, and whole-
sale selling of illegal drugs during the polygraph test.75 In fact, of the 20,511 applicants between
1974 and 1979, 695 (3.4%) admitted to the commission of a felony, nearly all of which had
72 I know that he is legally correct. Corporate personhood is law. We live in a strange world. The United States
once had people as property (slaves) and now has property as people (corporations).
73 See Interviewing with an Intelligence Agency (or, A Funny Thing Happened on the Way to Fort Meade), by Ralph J.
previously gone undetected.76 Bamford described how these tests received a black eye during the
1950s and early 1960s because of the heavy use of EPQs (embarrassing personal questions).
These questions are almost inevitably directed toward intimate aspects of a person’s
sex life and bear little relationship to the person’s honesty or patriotism. Following a
congressional investigation and an internal crackdown, the personal questions are now
somewhat tamer. “Have you ever had an adult homosexual experience?” for example,
is one of the standard questions today.77
This quote is from 1982. The NSA is now more tolerant. Although EPQs are still used, homosexu-
ality is not necessarily considered a problem. There is even a social club, GLOBE, for Gay, Lesbian,
or Bisexual Employees.78 Of course, the applicant need not worry in any case. The answers to these
questions are kept confidential.
It wouldn’t be revealed, for example, that Bernon F. Mitchell told his interrogator about certain
“sexual experimentations” with dogs and chickens he had carried out when he was between the
ages of 13 and 19.79 Okay, maybe this wasn’t the best example. Despite Mitchell’s strange history,
he was hired along with William H. Martin. Johnson summarized the interview process for these
two men with “Certain questions about their psychological health came up on the polygraph and
background investigation but were not regarded as serious impediments to employment.”80
In 1960, these men betrayed the agency to the Russians, sparking a purge of homosexuals. In
all, 26 NSA employees were fired because of their sexual conduct.81 The discrimination was even
extended to other government positions. President Eisenhower ordered a secret blacklisting of gays
from employment within the federal government. He was aided in this by J. Edgar Hoover, who
maintained a list of homosexuals.82
76 Bamford, James, Body of Secrets: Anatomy of the Ultra-Secret National Security Agency from the Cold War through
the Dawn of a New Century, Doubleday, New York, 2001, p. 540. A pair of NSA historians expressed skepticism
that such statistics were ever compiled.
77 Bamford, James, The Puzzle Palace, Houghton, Mifflin, and Company, New York, 1982, p. 162.
78 Bamford, James, Body of Secrets: Anatomy of the Ultra-Secret National Security Agency from the Cold War through
the Dawn of a New Century, Doubleday, New York, 2001. Part of my purpose in relating this story is to ease concerns of potential NSA appli-
cants. You don’t have to be 100% squeaky clean to get hired. You’ll probably do better on the interview than
Mitchell and they hired him!
81 Johnson, Thomas, R., American Cryptology During the Cold War, 1945–1989, Book I: The Struggle for
Centralization, 1945–1960, Center for Cryptologic History, National Security Agency, Fort George G. Meade,
Maryland, 1995, p. 284.
82 Bamford, James, Body of Secrets: Anatomy of the Ultra-Secret National Security Agency from the Cold War through
the Dawn of a New Century, Doubleday, New York, 2001, pp. 543–544.
1. William Weisband—His betrayal took place before NSA was created and he is thought by
some to be responsible for “Black Friday,” October 29, 1948, the day on which all Warsaw
Pact encryption systems were changed, shutting out the cryptanalysts.
2. Joseph Sydney Petersen, Jr.—Caught in 1953, his indictment for betraying secrets to the
Dutch got the NSA some unwanted publicity.
3. Roy A. Rhodes—He provided the Soviets with cryptographic information and was caught
based on information from an NKVD Lieutenant Colonel who defected in 1957.
4. Robert Lee Johnson—Active in the 1950s and early 1960s, he provided the Soviets with key
lists for cipher machines and other valuable material.
5. Jack Dunlap—As a spy for the Soviets from 1959 to 1963, he stole documents by tucking
them under his shirt. He committed suicide while being investigated.
6. Victor Norris Hamilton—This former NSA cryptanalyst defected to the Soviet Union in July
1963.
7. Robert S. Lipka—He got away with betraying NSA to the Soviets while he worked there
from 1964 to 1967, but in 1993 his ex-wife turned him in.
8. Christopher Boyce and Daulton Lee—The story of these traitors was told in Robert Lindsey’s
The Falcon and the Snowman (Simon and Schuster, New York, 1979). In 1984, the book was
made into an excellent movie of the same title that starred Timothy Hutton and Sean Penn.
9. William Kampiles—This CIA employee sold the Soviets the Big Bird Satellite manual in
1978. The system delivered both signals intelligence and photo surveillance.
10. John Anthony Walker, Jr.—He began spying for the Soviets in 1967, and before his arrest in
1984, had managed to recruit three other Navy men into what became known as the Walker
Spy Ring.
11. Jonathan Jay Pollard—Speculation concerning his revelations to Israel was provided earlier
in this chapter.
12. Ronald William Pelton—He was convicted of espionage in 1986 for providing the Soviets
with detailed information on the United States’ electronic espionage abilities.
13. David Sheldon Boone—This NSA cryptanalyst sold secrets to the Russians and in 1998
received a sentence of 24 years.
83 Examples taken from Fitsanakis, Joseph, “National Security Agency: The Historiography of Concealment,”
in de Leeuw, Karl and Jan Bergstra, editors, The History of Information Security, A Comprehensive Handbook,
Elsevier, Amsterdam, 2007, pp. 523–563, pp. 535, 538, 543–544 cited here; Johnson, Thomas, R., American
Cryptology During the Cold War, 1945–1989, Book I: The Struggle for Centralization, 1945–1960, Center
for Cryptologic History, National Security Agency, Fort George G. Meade, Maryland, 1995, pp. 277–279;
Johnson, Thomas, R., American Cryptology During the Cold War, 1945–1989, Book II: Centralization Wins,
1960–1972, Center for Cryptologic History, National Security Agency, Fort George G. Meade, Maryland,
1995, pp. 470–471; Polmar, Norman and Thomas B. Allen, Spy Book, Random House, New York, 1997.
this writing. Snowden is considered by some to be a patriotic whistle-blower. Chris Inglis, former
Deputy Director, NSA, explained why this appellation is inappropriate.
I do find it curious that Snowden, who is now kind of in the protective embrace of
Russia and who once enjoyed the protective embrace of China, has said nothing about
those legal regimes which most independent observers would say runs roughshod over
civil liberties, human rights. I find it curious he would not say a word about that. But
it’s consistent actually with what he said while he was in the United States. Nothing.
He made no complaint to anyone about what he now observes, what’s, in his view, a
violation of US person privacy, said nary a word the whole time he was at the National
Security Agency, nary a word the whole time he was with the CIA. When asked about
that by Jim Bamford, I believe two years ago, the spring of 2014, he said he had at
one time raised a question to an NSA lawyer. When we went back and took a look at
that it turns out the question that he asked was “Are the priorities that were kind of
articulated to me in a class that I took on the protection of US person privacies, the
US constitution, an equal priority between law and executive order, and then policies,
regulations, and the like?” Lawyer came back the same day answering that question
for Mr. Snowden, saying “No, that’s not exactly right. Turns out that a statute, a law,
trumps an executive order. They’re only on the same line in the priorities table because,
in the absence of a law, an executive order stands in.” If that’s a complaint about the
protection of US person privacy in the United States of America. I’m hard pressed
to see it. I’m hard pressed to understand it. Having raised not one question about
that issue while he was here in the United States, my assumption is that Snowden
doesn’t have the courage of his convictions when he thinks he might be held person-
ally accountable for standing up and defending those convictions. It’s not an official
position, but that’s how I feel.84
He [Snowden] said he was worried about the violation of US person privacy. Most of
the information he released has nothing to do with that. He said that he could prove
that the United States violated US person privacy beyond the reach of law, beyond
the constitutional norms that are established. There’s been no proof of that. Now we
as a matter of policy might decide that we’re uncomfortable with collecting telephone
metadata. That doesn’t make it illegal. Bad policy or a different choice about policy
doesn’t make it illegal or unconstitutional.85
Inglis also explained that much of what Snowden claimed is inaccurate. An example will help
illustrate this.
We have to distinguish between what Snowden said and what was true. His allega-
tions are not one and the same as revelations. Much of what he extrapolated from his
information was frankly untrue. He said, early on, that any NSA analyst could, sitting
at his or her desk, target the communications, the content of the communications, of
the President of the United States of America. Quote unquote. Patently untrue. It’s not
only illegal to do such a thing, but there are procedural controls and technical controls in
place that make that impossible. He said that the National Security Agency targeted the
content of the communications of US persons. That’s not true. It is absolutely true that in
targeting the content of legitimate foreign intelligence targets that sometimes the other
end of that conversation is a US person and it’s almost impossible to determine that with
great precision upfront, but because of that there are procedures in place of what exactly
do you do when you encounter that situation. I would describe that as a feature, not as a
burden, not as a sin. There has been no evidence, since Snowden has come out, that
what he alleged is in fact true, that there have been any violations of law.86
84 Irari Report, “Edward Snowden: NSA Perspective from former Deputy Director,” https://www.youtube.com/
watch?v=G5evenZOFU0, March 30, 2016.
85 Irari Report, “Edward Snowden: NSA Perspective from former Deputy Director,” https://www.youtube.com/
watch?v=G5evenZOFU0, March 30, 2016.
The House Permanent Select Committee on Intelligence produced a study on the Snowden
betrayal. While it remains classified, a three-page executive summary was released. It is repro-
duced below.87 You will see that it mirrors Inglis’s views and contains the lines “Snowden was not
a whistleblower” and “Snowden was, and remains, a serial exaggerator and fabricator.”
86 Irari Report, “Edward Snowden: NSA Perspective from former Deputy Director,” https://www.youtube.com/
watch?v=G5evenZOFU0, March 30, 2016.
87 U.S. House of Representatives, Executive Summary of Review of the Unauthorized Disclosures of Former National
Security Agency Contractor Edward Snowden, September 15, 2016, available online at https://fas.org/irp/
congress/2016_rpt/hpsci-snowden-summ.pdf.
since he fled there on June 23, 2013. Accordingly, the Committee did not
interview individuals whom the Department of Justice identified as pos-
sible witnesses at Snowden’s trial, including Snowden himself, nor did the
Committee request any matters that may have occurred before a grand
jury. Instead, the IC provided the Committee with access to other indi-
viduals who possessed substantively similar knowledge as the possible
witnesses. Similarly, rather than interview Snowden’s NSA co-workers and
supervisors directly, Committee staff interviewed IC personnel who had
reviewed reports of interviews with Snowden’s co-workers and supervi-
sors. The Committee remains hopeful that Snowden will return to the
United States to face justice.
The bulk of the Committee’s 36-page review, which includes 230 foot-
notes, must remain classified to avoid causing further harm to national secu-
rity; however, the Committee has made a number of unclassified findings.
These findings demonstrate that the public narrative popularized by Snowden
and his allies is rife with falsehoods, exaggerations, and crucial omissions, a
pattern that began before he stole 1.5 million sensitive documents.
First, Snowden caused tremendous damage to national security,
and the vast majority of the documents he stole have nothing to do
with programs impacting individual privacy interests–they instead per-
tain to military, defense, and intelligence programs of great interest to
America’s adversaries. A review of the materials Snowden compromised
makes clear that he handed over secrets that protect American troops
overseas and secrets that provide vital defenses against terrorists and
nation-states. Some of Snowden’s disclosures exacerbated and acceler-
ated existing trends that diminished the IC’s capabilities to collect against
legitimate foreign intelligence targets, while others resulted in the loss of
intelligence streams that had saved American lives. Snowden insists he has
not shared the full cache of 1.5 million classified documents with anyone;
however, in June 2016, the deputy chairman of the Russian parliament’s
defense and security committee publicly conceded that “Snowden did
share intelligence” with his government. Additionally, although Snowden’s
professed objective may have been to inform the general public, the
information he released is also available to Russian, Chinese, Iranian, and
North Korean government intelligence services; any terrorist with Internet
access; and many others who wish to do harm to the United States.
The full scope of the damage inflicted by Snowden remains unknown.
Over the past three years, the IC and the Department of Defense (DOD)
have carried out separate reviews–with differing methodologies–of
the damage Snowden caused. Out of an abundance of caution, DOD
reviewed all 1.5 million documents Snowden removed. The IC, by con-
trast, has carried out a damage assessment for only a small subset of the
documents. The Committee is concerned that the IC does not plan to
assess the damage of the vast majority of documents Snowden removed.
Nevertheless, even by a conservative estimate, the U.S. Government has
spent hundreds of millions of dollars, and will eventually spend billions,
The House Committee that produced this report consisted of 22 members, a mix of Republicans and
Democrats, who were unanimous in signing their names. This should help convince you that, once
people are made privy to the classified details, the Snowden betrayal is not a partisan issue.
General Michael Hayden, who served as Director of the National Security Agency from 1999
to 2005 and as Director of the Central Intelligence Agency from 2006 to 2009, offered a list of
questions he would have liked to have asked Snowden:88
88 Hayden, Michael V., Playing to the Edge: American Intelligence in the Age of Terror, Penguin Press, New York,
2016, pp. 419–420.
You’ve cited Jim Clapper’s response to Ron Wyden on NSA surveillance as motivating
your actions. That was March 2013 but you began offering documents to Greenwald in
December 2012 and to Laura Poitras in January 2013. Weren’t you already committed?
While you were in Hong Kong fighting extradition, you told the press that NSA
was hacking into Chinese computers. On the surface that looks like you were trying
to buy safe passage. Were you?
The week before you fled Hong Kong, the London Guardian (based on your docu-
ments) claimed that the United States had intercepted Russian president Medvedev’s
satellite phone while he was at a G20 summit in England. What’s the civil liberties
issue there or is this just trading secrets for passage again?
You said that you raised your concerns within the system and that you were told
not to rock the boat. NSA can’t find any evidence. You took hundreds of thousands
of documents. Do any of them show your raising concerns? A single e-mail, perhaps?
You sound pretty authoritative, but the first PRISM stories were wrong, claiming
NSA had free access to the server farms of Google, Hotmail, Yahoo!, and the like. The
Washington Post later walked that back. Did you misread the slides too?
Le Monde and El País, based on your documents, claimed that NSA was collect-
ing tens of millions of metadata events on French and Spanish citizens each month.
It turns out those events were collected by the French and Spanish in war zones and
provided to NSA to help military force protection. Did you get that wrong too?
Hayden described Snowden’s betrayal as “the greatest hemorrhaging of legitimate American secrets
in the history of the republic.” He also noted that “the Snowden revelations kept on coming, often
timed for maximum embarrassment and crafted for maximum impact.”89 As for the man himself,
Hayden remarked, “I think Snowden is an incredibly naive, hopelessly narcissistic, and insuffer-
ably self-important defector.”90
INSKEEP: I want to ask about mistakes, errors, violations of privacy. You gave a fas-
cinating talk late last year at the University of Pennsylvania in which you referred to
a document that had been disclosed that referred to something like 2,700 errors by the
NSA. You argued that about 2,000 of those were not really relevant, set them aside. And
then acknowledged there were 711 actual errors where you violated someone’s privacy in
a way that was not authorized. What happened on those 711 times in one year?
89 Hayden, Michael V., Playing to the Edge: American Intelligence in the Age of Terror, Penguin Press, New York,
2016, p. 411.
90 Hayden, Michael V., Playing to the Edge: American Intelligence in the Age of Terror, Penguin Press, New York,
2016.
INGLIS: Yeah, so if I could clarify that. The report, first and foremost, was written in
the early part of 2012. We wrote it ourselves. And we generate these reports essentially to
take a hard look at how all the various things that we do to collect a communication of
interest, store the communication of interest, query the communication of interest, we
want to make sure we do that exactly right. And we determined in that report that on an
annualized basis, we extrapolated the numbers that we had essentially had about 2,776
situations that didn’t go exactly according to plan. That was immediately interpreted by
some press outlets when that was released - again, it was another unauthorized release
- but when it was released, some number of press outlets immediately equated that to
2,776 privacy violations and went so far as to say that they were either willful or kind of
attributable to the gross lack of conscientious actions on the side of NSA.
Which is why I went then to some pains to explain what that really was. It turned out
in 2,065 of those cases, so about 75 percent of those cases, the situation was that the indi-
vidual, the organization that we were authorized to understand something about, whose
communications we were trying to collect, had moved, right. Either they had physically
moved or their services had moved and they were in a different location. Our authorities
essentially asked the question up front of where is the party of interest? You know, where
is the communication of interest? And where is the collection taking place? And if any of
those change, we’re probably using the wrong authority. And so, 2,065 we notified our-
selves that that had changed. They don’t consult with us before they change their location.
And so the system actually worked exactly as it should, which is that it figured that
out, stopped the collection, purged back to the point where we last knew with precision
where they were and then went after the right authority to essentially begin that again.
In my view, that would be a feature, right, a positive feature. That leaves then 711. They
weren’t privacy violations, per se. What they were was that an analyst somewhere across
NSA entered the wrong telephone number, the wrong email address when they were
attempting to target A, but instead they could have potentially targeted A-prime. In
most of those cases the number that they entered because they fingered it, they got a 2 in
there instead of a 3, or something of that sort. The number didn’t exist and so it returned.
But in all those cases it was caught because we essentially had checks inside the
system, almost always a second check to make sure that what we have done is exactly
what we intended to do. And we caught all of those things. And essentially took the
right action. Whether it was how we formed the selector or whether it was how we que-
ried a database, whether it was how we disseminated a piece of information. And those
711 occurrences have to be considered against all the activities we took that year. And
it turns out that the average analyst, if you attributed those errors to an analyst, none
of which were willful, all of which were simply accidents, the average analyst at NSA
would make a mistake about every 10 years. The accuracy rate at NSA is 99.99984
percent, which is a pretty good record. But that said, we worry enough about making
any mistakes that the 711 are of particular interest to us.
We’re going to fix those. And so we have driven those down quarter by quarter,
year by year.
Later in the interview, Inglis discussed some incidents, from other years, where NSA employees
broke the rules:93
Of note—you didn’t ask me but I’ll bring this up. You know, there is – a discussion
has taken place where there have, in fact, been some willful abuses of the SIGINT capa-
bilities that NSA brings to bear. There have been 12 cases over the last 10 or so years
where individuals made misuse of the SIGINT system. They essentially tried to collect a
communication that they were not authorized to collect 12 times.
The vast majority of those were, in fact, overseas. Right? They were NSAers oper-
ating in foreign locations trying to collect the communication of an acquaintance
so that they could better understand what that acquaintance was doing, but those
acquaintances were foreigners. And our capabilities must be applied in a way that
essentially meets the requirements imposed on me such that we would protect the
privacy of foreign persons as much as we would protect the privacy of U.S. persons.
It was not stated in the interview, but all of those people were fired.
The terser General Michael Hayden summed up the matter as follows:94
all the incidents were inadvertent; no one claimed that any rules were intentionally
violated. All of the incidents were discovered, reported and corrected by NSA itself.
Fully two-thirds of the incidents were composed of “roamers”–legitimately tar-
geted foreigners who were temporarily in the United States (and thus temporarily
protected by the Fourth Amendment).
He also pointed out that the 115 incidents of queries being incorrectly entered (typos or too-broad
search criteria) were out of 61 million inquiries. Hayden then went on to suggest that the headline
for The Washington Post article “NSA broke privacy rules thousands of times per year, audit finds”
should instead have been “NSA Damn Near Perfect.”95
94 Hayden, Michael V., Playing to the Edge: American Intelligence in the Age of Terror, Penguin Press, New York,
2016, p. 411. Hayden quoted these lines from a piece he wrote for USA Today.
95 Hayden, Michael V., Playing to the Edge: American Intelligence in the Age of Terror, Penguin Press, New York,
2016, p. 412.
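As a quick sanity check, the arithmetic behind the figures quoted above can be reproduced in a few lines of Python. This is only a sketch: the published numbers are rounded, so the computed query accuracy comes out slightly different from the 99.99984 percent Inglis cited.

```python
# Figures from the 2012 internal NSA report as described by Inglis and Hayden.
total_incidents = 2_776   # annualized incidents that "didn't go exactly according to plan"
roamers = 2_065           # legitimately targeted parties that had moved location

analyst_errors = total_incidents - roamers
print(analyst_errors)                       # 711, the figure Inglis addresses
print(round(roamers / total_incidents, 2))  # 0.74, i.e., "about 75 percent"

# Hayden's query-error figures: 115 mistyped or over-broad queries
# out of 61 million inquiries in the year in question.
query_errors = 115
queries = 61_000_000
print(f"{(1 - query_errors / queries) * 100:.5f}%")  # 99.99981%
```

Either way the numbers are sliced, the error rate is on the order of a couple of mistakes per million queries, which is the point Hayden's suggested headline was making.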
London to present the British with a Purple analog. This balance of Army and Navy was impor-
tant, as the distrust between these service branches was an even greater barrier to the budding
intelligence sharing relationship than the distrust between nations!
After World War II, the intelligence sharing continued and in March 1946, the BRUSA
Agreement made it formal. This agreement was renamed UKUSA in 1948. Project BOURBON
was the codename for the work of America and England against the new common enemy, the
Soviet Union.96 The United States was involved with other nations to varying degrees. American
cooperation with Canada had begun back in 1940.97 Australia and New Zealand were to become
partners, as well. Although the agreements between NSA and GCHQ became public knowledge
decades earlier, the declassified documents only became available in June 2010.98
Europeans have expressed concern about how far the “Five Eyes” partners (United Kingdom,
United States, Canada, Australia, and New Zealand)99 have gone in regard to violating the privacy
of individuals and businesses. Of particular concern is a program codenamed ECHELON that
allows the agencies to search the worldwide surveillance network for desired information by using
keywords. Two examples of spying on nonmilitary targets are provided below:
In 1990, the German magazine Der Spiegel revealed that the NSA had intercepted mes-
sages about an impending $200 million deal between Indonesia and the Japanese satel-
lite manufacturer NEC Corp. After President Bush intervened in the negotiations on
behalf of American manufacturers, the contract was split between NEC and AT&T.100
In September 1993, President Clinton asked the CIA to spy on Japanese auto manu-
facturers that were designing zero-emission cars and to forward that information to
the Big Three U.S. car manufacturers: Ford, General Motors and Chrysler.101
Yet it would be naïve to believe the rest of the world was “playing fair.” President Barack Obama
said, “some of the folks who have been most greatly offended publicly, we know privately engage
in the same activities directed at us.”102 Another example shows how NSA is sometimes able to
cancel out the treachery of others.
From a commercial communications satellite, NSA lifted all the faxes and phone calls
between the European consortium Airbus, the Saudi national airline and the Saudi
government. The agency found that Airbus agents were offering bribes to a Saudi
96 Johnson, Thomas, R., American Cryptology During the Cold War, 1945–1989, Book I: The Struggle for
Centralization, 1945–1960, Center for Cryptologic History, National Security Agency, Fort George G.
Meade, Maryland, 1995, p. 159.
97 Johnson, Thomas, R., American Cryptology During the Cold War, 1945–1989, Book I: The Struggle for
Centralization, 1945–1960, Center for Cryptologic History, National Security Agency, Fort George G.
Meade, Maryland, 1995, p. 17.
98 NSA, Declassified UKUSA Signals Intelligence Agreement Documents [Press Release], National Security Agency,
Fort George G. Meade, Maryland, June 24, 2010, available online at http://www.nsa.gov/public_info/press_
room/2010/ukusa.shtml.
99 There are also secondary or junior partners, with whom some information is shared. Israel is not among them,
although in other ways the relationship between Israel and the United States is very close.
100 Poole, Patrick S., ECHELON, Part Two: The NSA’s Global Spying Network, http://www.bibliotecapleyades.
net/ciencia/echelon/echelon_2.htm.
101 Poole, Patrick S., ECHELON, Part Two: The NSA’s Global Spying Network, http://www.bibliotecapleyades.
net/ciencia/echelon/echelon_2.htm.
102 Hayden, Michael V., Playing to the Edge: American Intelligence in the Age of Terror, Penguin Press, New York,
2016, p. 413.
official. It passed the information to U.S. officials pressing the bid of Boeing Co.
and McDonnell Douglas Corp., which triumphed last year [1994] in the $6 billion
competition.103
In any case, private intelligence agencies are on the rise. If a large corporation cannot get the gov-
ernment’s help, it can turn to one of these.
Figure 12.6 A wall listing the names of those who died serving NSA. (Courtesy of National
Security Agency, https://web.archive.org/web/20160325230227/http://www.nsa.gov/about/_
images/pg_hi_res/memorial_wall.jpg)
We’ve taken a look at some people who betrayed NSA, but they are, of course, the rare excep-
tions. There are likely more than are publicly known, but I doubt that they outnumber those at the
other extreme, who gave their lives in service to America through the agency. A wall inside NSA
commemorates these men and women (Figure 12.6). Sadly, during my time with NSA’s Center
for Cryptologic History, I saw this list grow. The rightmost column now extends to the bottom
and new names have been added in the triangular space above. While NSA employees are often
accused of violating the privacy of Americans, the numbers show that they are much more likely
to die in the line of duty than to intentionally break privacy laws these days.
103 Shane, Scott and Tom Bowman, No Such Agency, America’s Fortress of Spies, Reprint of a six-part series that
appeared in The Baltimore Sun, December 3–15, 1995, p. 2.
Bamford, James, Body of Secrets: Anatomy of the Ultra-Secret National Security Agency from the Cold War
through the Dawn of a New Century, Doubleday, New York, 2001.
Bamford, James, The Shadow Factory: The Ultra-secret NSA from 9/11 to the Eavesdropping on America,
Anchor Books, New York, 2008.
Barker, Wayne G. and Coffman, Rodney E., The Anatomy of Two Traitors, The Defection of Bernon F.
Mitchell and William H. Martin, Aegean Park Press, Laguna Hills, California, 1981.
Boak, David G., A History of U.S. Communications Security, The David G. Boak Lectures, National Security
Agency, Fort George G. Meade, Maryland, Revised July 1973, Declassified December 2008, avail-
able online at https://www.nsa.gov/Portals/70/documents/news-features/declassified-documents/
cryptologic-histories/history_comsec.pdf.
Boak, David G., A History of U.S. Communications Security, The David G. Boak Lectures, Vol. II, National
Security Agency, Fort George G. Meade, Maryland, July 1981, Declassified December 2008, available
online at https://www.archives.gov/files/declassification/iscap/pdf/2009-049-doc2.pdf.
Breedan II, John, “What a Former NSA Deputy Director Thinks of the Snowden Movie,” Nextgov,
https://www.nextgov.com/ideas/2016/09/former-nsa-deputy-director-calls-out-snowden-movie-
grossly-inaccurate/131911/, September 28, 2016.
Briscoe, Sage and Aaron Magid, “The NSA Director’s Summer Program,” Math Horizons, Vol. 13, No. 4,
April 2006, p. 24.
Brownell, George A., The Origin and Development of the National Security Agency, Aegean Park Press,
Laguna Hills, California, 1981. This is a 98-page book.
Buehler, Hans, Verschlüsselt, Werd, Zürich, 1994. This book is in German and no translation is currently
available.
Churchill, Ward, and Jim Vander Wall, The COINTELPRO Papers, South End Press, Boston, Massachusetts,
1990. NSA is barely mentioned in this book, which is referenced here solely for the information it
contains on COINTELPRO. More information on NSA’s role may be found at http://www.icdc.
com/~paulwolf/cointelpro/churchfinalreportIIIj.htm.
Central Intelligence Agency, Family Jewels, 1973, available online at https://www.cia.gov/library/
readingroom/collection/family-jewels, released on June 25, 2007, during Michael V. Hayden’s term as
director of CIA. This nearly 700-page document, created by CIA employees in response to a request
from then Director of Central Intelligence James Schlesinger, details illegal activities carried out by
the agency.
Constance, Paul, “How Jim Bamford Probed the NSA,” Cryptologia, Vol. 21, No. 1, January 1997, pp. 71–74.
de Leeuw, Karl, and Jan Bergstra, editors, The History of Information Security, A Comprehensive Handbook,
Elsevier, Amsterdam, 2007. Chapter 18 (pp. 523–563) of this large $265 book is titled “National
Security Agency: The Historiography of Concealment.” It is by Joseph Fitsanakis, who complains about
the lack of study of this topic and then provides a list of 291 references.
Halperin, Morton H., Jerry J. Berman, Robert L. Borosage, and Christine M. Marwick, The Lawless State,
The Crimes of the U.S. Intelligence Agencies, Penguin Books, New York, 1976.
Hayden, Michael V., “Beyond Snowden: An NSA Reality Check,” World Affairs, Vol. 176, No. 5, January/
February 2014, pp. 13–23.
Hayden, Michael V., Playing to the Edge: American Intelligence in the Age of Terror, Penguin Press, New York,
2016. General Hayden served as Director of the National Security Agency from 1999 to 2005 and as
Director of the Central Intelligence Agency from 2006 to 2009.
Johnson, Thomas, R., American Cryptology During the Cold War, 1945–1989, Book I: The Struggle for
Centralization, 1945–1960, Center for Cryptologic History, National Security Agency, Fort George
G. Meade, Maryland, 1995 available online at https://www.nsa.gov/Portals/70/documents/news-
features/declassified-documents/cryptologic-histories/cold_war_i.pdf. This book, and the next three
references, were declassified (with many redactions) beginning in 2008.
Johnson, Thomas, R., American Cryptology During the Cold War, 1945–1989, Book II: Centralization Wins,
1960–1972, Center for Cryptologic History, National Security Agency, Fort George G. Meade,
Maryland, 1995, available online at https://www.nsa.gov/Portals/70/documents/news-features/
declassified-documents/cryptologic-histories/cold_war_ii.pdf.
376 ◾ Secret History
Johnson, Thomas R., American Cryptology During the Cold War, 1945–1989, Book III: Retrenchment
and Reform, 1972–1980, Center for Cryptologic History, National Security Agency, Fort George
G. Meade, Maryland, 1998, available online at https://www.nsa.gov/Portals/70/documents/news-
features/declassified-documents/cryptologic-histories/cold_war_iii.pdf.
Johnson, Thomas R., American Cryptology During the Cold War, 1945–1989, Book IV: Cryptologic Rebirth,
1981–1989, Center for Cryptologic History, National Security Agency, Fort George G. Meade,
Maryland, 1999, available online at https://www.nsa.gov/Portals/70/documents/news-features/
declassified-documents/cryptologic-histories/cold_war_iv.pdf.
Kahn, David, The Codebreakers, Macmillan, New York, 1967. (A second edition, with a few pages of updates,
appeared in 1996.) The NSA wasn’t pleased that Kahn devoted a chapter to them in his book, and they
considered various means of suppressing it. See Section 5.9.
Keefe, Patrick Radden, CHATTER: Dispatches from the Secret World of Global Eavesdropping, Random
House, 2005. On page 97, Keefe placed NSA's budget at $6 billion a year with 60,000 employees.
Langmeyer, Navah and Amy M. Grimes, “Mathematical Life at the National Security Agency,” Math
Horizons, Vol. 8, No. 3, February 2001, pp. 30–31.
Madsen, Wayne, “Crypto AG: The NSA’s Trojan Whore?” Covert Action Quarterly, Issue 63, Winter
1998, available online at http://mediafilter.org/caq/cryptogate/ and https://web.archive.org/
web/20000815214548/http://caq.com:80/CAQ/caq63/caq63madsen.html.
Miller, Greg, “The Intelligence Coup of the Century,” The Washington Post, February 11, 2020, available
online at https://tinyurl.com/yck5xur2.
National Security Agency, Website, http://www.nsa.gov/.
National Security Agency, NSA Employee’s Security Manual. This manual (leaked in 1994) can be found
online at http://theory.stanford.edu/~robert/NSA.doc.html.
Odom, General William E., Fixing Intelligence for a More Secure America, second edition, Yale University
Press, New Haven, Connecticut, 2004. General Odom served as Director of the National Security
Agency from 1985 to 1988.
Ransom, Harry Howe, Central Intelligence and National Security, Harvard University Press, 1958; third
printing, 1965. Pages 116 to 118 discuss NSA.
Shane, Scott and Tom Bowman, No Such Agency, America’s Fortress of Spies, Reprint of a six-part series that
appeared in The Baltimore Sun, December 3–15, 1995.
Shane, Scott and Tom Bowman, “U.S. Secret Agency Scored World Coup: NSA Rigged Machines for
Eavesdropping,” The Baltimore Sun, January 3, 1996, p. 1A.
Sherman, David, “The National Security Agency and the William F. Friedman Collection,” Cryptologia,
Vol. 41, No. 3, May 2017, pp. 195–238.
Simons, Marc and Paul Reuvers, “Operation RUBICON/THESAURUS, The secret purchase of Crypto
AG by BND and CIA,” Crypto Museum, https://www.cryptomuseum.com/intel/cia/rubicon.htm,
created: December 12, 2019, last changed: May 10, 2020.
Simons, Marc and Paul Reuvers, “The Gentleman’s Agreement, Secret Deal between the NSA and Hagelin,
1939–1969,” Crypto Museum, https://www.cryptomuseum.com/manuf/crypto/friedman.htm, cre-
ated: July 30, 2015, last changed: May 10, 2020.
Smoot, Betsy Rohaly, “NSA Release and Transfer of Records Related to William F. Friedman,” Cryptologia,
Vol. 39, No. 1, January 2015, pp. 1–2.
Smoot, Betsy Rohaly, “National Security Agency releases Army Security Agency histories covering 1945–
1963,” Cryptologia, Vol. 41, No. 5, September 2017, pp. 476–478.
Tully, Andrew, The Super Spies, William Morrow, New York, September 1969.
Wagner, Michelle, “Organizational Profile: The Inside Scoop on Mathematics at the NSA,” Math Horizons,
Vol. 13, No. 4, April 2006, pp. 20–23.
Weiner, Tim, Blank Check: The Pentagon’s Black Budget, Warner Books, New York, 1990. This book makes
a study of undisclosed budgets.
Willemain, Thomas Reed, Working on the Dark Side of the Moon: Life Inside the National Security Agency,
Mill City Press, Maitland, Florida, 2017.
National Security Agency ◾ 377
On the Liberty
Borne, John E., The USS Liberty: Dissenting History vs. Official History, Reconsideration Press, New York,
1995. This doctoral dissertation was submitted in partial fulfillment of the requirements for the degree
of Doctor of Philosophy, Department of History, New York University, September 1993.
Cristol, A. Jay, The Liberty Incident: The 1967 Israeli Attack on the U.S. Navy Spy Ship, Brassey’s, Inc.,
Washington DC, 2002. Cristol served for many years in the U.S. Navy, and argues that the Israelis
didn’t know they were attacking an American ship.
Ennes, Jr., James M., Assault on the Liberty, Random House, New York, 1979. Ennes, a lieutenant who was
on the Liberty, thinks the Israelis knew they were attacking an American ship.
Scott, James, The Attack on the Liberty, Simon & Schuster, New York, 2009. Scott, the son of a Liberty survivor, thinks the Israelis knew they were attacking an American ship.
Other works that include material on the Liberty were listed in the “On NSA” section of the references above;
for example, James Bamford’s Body of Secrets argues that the attack was known at the time to have been
on an American ship, whereas Book II of Thomas R. Johnson’s history presents the view that it was not.
On the Pueblo
Armbrister, Trevor, A Matter of Accountability, The True Story of the Pueblo Affair, Coward-McCann, Inc.,
New York, 1970.
Brandt, Ed, The Last Voyage of the USS Pueblo, W.W. Norton & Co., New York, 1969.
Bucher, Lloyd M. and Mark Rascovich, Bucher: My Story, Doubleday, Garden City, New York, 1970.
Crawford, Don, Pueblo Intrigue, Tyndale House Publishing, Wheaton, Illinois, 1969.
Gallery, Daniel V., The Pueblo Incident, Doubleday, Garden City, New York, 1970.
Harris, Stephen R. and James C. Hefley, My Anchor Held, Fleming H. Revell Company, Old Tappan, New
Jersey, 1970. Harris was the intelligence officer aboard the Pueblo at the time of capture.
Lerner, Mitchell B., The Pueblo Incident, University Press of Kansas, Lawrence, Kansas, 2002.
Liston, Robert A., The Pueblo Surrender, M. Evans and Company, Inc., New York, 1988. This book actually
argues that it was intended that the Pueblo be captured!
Also of Interest
Bamford, James, A Pretext for War: 9/11, Iraq, and the Abuse of America’s Intelligence Agencies, Doubleday,
New York, 2004. Bamford examines the following questions: Did Saddam Hussein have weapons
of mass destruction, as George W. Bush claimed? Was there a connection between Hussein and Al
Qaeda? The results of a poll showed that most Americans believed the answer to both questions was yes.
Bamford clearly shows that the correct answer was no in both cases and details the abuse of the intel-
ligence agencies that led to the public’s misinformed beliefs concerning these issues.
Not of Interest
Brown, Dan, Digital Fortress, St. Martin’s Press, New York, 1998. Dan Brown’s breakthrough novel was The
Da Vinci Code, but his first novel, Digital Fortress, dealt with the NSA. It can be read for entertain-
ment, but doesn’t offer any insight into NSA.
Videography
America’s Most Secret Agency, The History Channel, January 8, 2001. Although supposedly on NSA, this
program features material on World War II, as well. As NSA was born in 1952, this can only be back-
ground. It turns out that there was more footage shot on NSA, but the agency got cold feet and asked
for it to be cut; hence, the filler—stock footage from World War II.
Inside the NSA: America’s Cyber Secrets, National Geographic Video, 45 minutes, 2012.
Pueblo (alternate title, Pueblo Affair), ABC Theatre, 102 minutes, originally broadcast March 29, 1973.
A reviewer for The New York Times commented, “Despite network restrictions of the era, Pueblo
is refreshingly frank, right down to the first-ever TV display of a familiar obscene gesture (which
the American prisoners explain away to their captors as a ‘salute!’).” (http://movies.nytimes.com/
movie/128338/Pueblo/overview).
The Spy Factory, Nova, 53 minutes, originally broadcast February 3, 2009. This serves as a companion to
James Bamford’s book The Shadow Factory: The Ultra-secret NSA from 9/11 to the Eavesdropping on
America.
Top Secret: Inside the World’s Most Secret Agencies, Discovery Channel, 1999. This series explores the National
Security Agency, Scotland Yard, and Israel’s Mossad and is narrated by Johnny Depp.
Chapter 13
The Data Encryption Standard
We now turn to a cipher far more advanced than anything previously discussed in this book. The only reason it isn't a good choice for use today is that increased computing power allows brute-force solutions.
1 You may pronounce DES like a word (it rhymes with Pez) or pronounce each letter individually. There is no
standard for this!
2 Hellman, Martin E., “Work on Cryptography,” http://www-ee.stanford.edu/∼hellman/crypto.html.
3 Shannon, Claude E., “Communication Theory of Secrecy Systems,” The Bell System Technical Journal, Vol. 28,
No. 4, October 1949, pp. 656–715.
4 Hellman, Martin E., “Work on Cryptography,” http://www-ee.stanford.edu/∼hellman/crypto.html.
Horst Feistel (Figure 13.1), an IBM employee born in Germany, is the man credited as the cre-
ator of DES (although others were involved—more on this soon). He wanted to call the sys-
tem Dataseal, but IBM used the term Demonstration Cipher, which was truncated to Demon.
Finally, the name was changed to Lucifer, maintaining what Feistel called “the evil atmosphere”
of Demon, as well as “cifer” (cipher).5 DSD-1 was another name used internally for this cipher.6
Lucifer was used by Lloyds Bank of London for a cash dispensing system in the early 1970s.7
The National Bureau of Standards (NBS)8 held a competition for a cipher system to meet
civilian needs. This system was to be called the Data Encryption Standard or DES. The call for
algorithms appeared in the Federal Register on May 15, 1973 (Vol. 38, No. 93, p. 12763) and again
on August 27, 1974 (Vol. 39, No. 167, p. 30961). Lucifer was the only algorithm deemed accept-
able by NBS and their NSA advisors.
The algorithm appeared in the Federal Register on March 17, 1975, and again on August 1, 1975,
with a request for reader comments.9 Lucifer was then adopted as the standard on July 15, 1977,
with a final name change to DES. IBM agreed to place the relevant patents in the public domain,
so anyone who desired could freely use the algorithm; however, this didn’t prevent money being
made from the system by other companies that manufactured chips implementing the algorithm.10
DES can be intimidating when viewed all at once, but the individual pieces it is made out
of are very simple. The basic units (called blocks) on which the algorithm works are 64 bits (8
5 Kahn, David, The Codebreakers, second edition, Scribner, New York, 1996, p. 980.
6 Levy, Steven, Crypto: How the Code Rebels Beat the Government, Saving Privacy in the Digital Age, Viking, New
York, 2001.
7 Kinnucan, Paul, “Data Encryption Gurus: Tuchman and Meyer,” Cryptologia, Vol. 2, No. 4, October 1978, pp.
371–381.
8 NBS was founded in 1901, but renamed Bureau of Standards in 1903. It became NBS again in 1934 and then,
in 1988, the National Institute of Standards and Technology (NIST).
9 “Requests for Comments,” Federal Register, Vol. 40, No. 52, March 17, 1975, pp. 12134–12139 and Hoffman,
John D., National Bureau of Standards, “Federal Information Processing Data Encryption Proposed Standard,”
Federal Register, Vol. 40, No. 149, August 1, 1975, pp. 32395–32414.
10 Morris, Robert, Neil J. A. Sloane, and Aaron D. Wyner, “Assessment of the National Bureau of Standards
Proposed Federal Data Encryption Standard,” Cryptologia, Vol. 1, No. 3, July 1977, pp. 281–291, p. 284 cited
here. Also see Winkel, Brian J., “There and there a department,” Cryptologia, Vol. 1, No. 4, 1977, pp. 396–397.
The Data Encryption Standard ◾ 381
characters) long. One operation used in DES consists of breaking the 64-bit message block in half
and switching sides, as depicted in the diagram in Figure 13.2.
Figure 13.2 L1, the new left-hand side, is simply R0, the old right hand side; R1, the new right-hand
side, is L0, the old left-hand side. (http://csrc.nist.gov/publications/fips/fips46-3/fips46-3.pdf.)
[Figure 13.3: One round of DES. The right half R0 and the round key K1 are fed into the function f, and the output is XORed onto the old left half: L1 = R0, R1 = L0 ⊕ f(R0, K1).]
This combination of two self-inverse operations is referred to as a round. DES goes through
16 such rounds. The manner in which the round keys are derived from K will be detailed, but
first we examine the function f. In general, we refer to a cipher that uses rounds of the form
depicted above (switching sides and applying a function to one half) as a Feistel system, or Feistel
cipher.
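The inversion property of a Feistel round can be seen in a short sketch. The round function `toy_f` below is a made-up stand-in (DES's actual f hasn't been fully described yet); the point is that the round can be undone no matter what f is, because XORing f of the same half with the same key twice cancels out.

```python
def feistel_round(left, right, key, f):
    """One Feistel round: swap sides and XOR f(old right, key) onto the old left."""
    return right, left ^ f(right, key)

def feistel_unround(left, right, key, f):
    """Undo a round: the same two self-inverse operations, applied in reverse."""
    return right ^ f(left, key), left

def toy_f(r, k):
    # An arbitrary stand-in round function on 32-bit integers (NOT the DES f).
    return (r * 0x9E3779B1 ^ k) & 0xFFFFFFFF

L0, R0 = 0x01234567, 0x89ABCDEF
L1, R1 = feistel_round(L0, R0, key=42, f=toy_f)
assert feistel_unround(L1, R1, key=42, f=toy_f) == (L0, R0)  # the round inverts cleanly
```

Note that invertibility never requires f itself to be invertible; this is what lets DES use S-boxes that compress 6 bits down to 4.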
The most natural way of combining Ri and Ki would be to XOR them, but Ri is 32 bits long
and each of the round keys is 48 bits long. To even things up, R is expanded by repeating some of
the bits (their order is changed as well). This is indicated in Figure 13.4, and referred to as E (for
expansion). E is given by the following:
32 1 2 3 4 5
4 5 6 7 8 9
8 9 10 11 12 13
12 13 14 15 16 17
16 17 18 19 20 21
20 21 22 23 24 25
24 25 26 27 28 29
28 29 30 31 32 1
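Applying a selection table like E is mechanical: entry i of the table names which bit of the input (numbered from 1) lands in output position i. A minimal sketch, using the table just given and an arbitrary test block (not a value from the standard):

```python
# The expansion table E, copied from the text: 48 entries, each a bit position
# (1-32) of R; positions such as 1, 4, 5, ... appear twice, which is the expansion.
E = [32,  1,  2,  3,  4,  5,
      4,  5,  6,  7,  8,  9,
      8,  9, 10, 11, 12, 13,
     12, 13, 14, 15, 16, 17,
     16, 17, 18, 19, 20, 21,
     20, 21, 22, 23, 24, 25,
     24, 25, 26, 27, 28, 29,
     28, 29, 30, 31, 32,  1]

def expand(r32):
    """Expand a 32-bit string to 48 bits by selecting (and repeating) bits per E."""
    return ''.join(r32[i - 1] for i in E)

r = '1' + '0' * 31          # a 32-bit block with only bit 1 set
out = expand(r)
print(len(out))             # 48
print(out.count('1'))       # bit 1 appears twice in E, so the single 1 is duplicated
```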
[Figure 13.4: The calculation of f(R, K). The 32-bit R is expanded to 48 bits by E, XORed with the 48-bit round key, fed through the S-boxes S1 through S8, and the resulting 32 bits are permuted by P.]
Once the expanded right-hand side and the round key have been XORed, the result is broken
up into eight pieces of six bits each, each of which is fed into a substitution box (S-box) that returns
only four bits. Finally, a permutation, P, is performed on the output and the round is complete. See
Figure 13.4 for a depiction of these steps. The permutation P is given by:
16 7 20 21
29 12 28 17
1 15 23 26
5 18 31 10
2 8 24 14
32 27 3 9
19 13 30 6
22 11 4 25
The nonlinear heart of the algorithm is the S-boxes. Each box converts a 6-bit number, b1b2b3b4b5b6, to a 4-bit number by first breaking it up into b1b6 and b2b3b4b5. That is, we now have a
2-bit and a 4-bit number. Converting each of these to base 10, the first is between 0 and 3 and the
second is between 0 and 15. Thus, a row and column of the S-box is referenced. The value at that
location is our 4-bit result. All eight S-boxes are provided in Table 13.1.
S1                                Column Number
Row No.   0   1   2   3   4   5   6   7   8   9  10  11  12  13  14  15
0 14 4 13 1 2 15 11 8 3 10 6 12 5 9 0 7
1 0 15 7 4 14 2 13 1 10 6 12 11 9 5 3 8
2 4 1 14 8 13 6 2 11 15 12 9 7 3 10 5 0
3 15 12 8 2 4 9 1 7 5 11 3 14 10 0 6 13
S2                                Column Number
Row No.   0   1   2   3   4   5   6   7   8   9  10  11  12  13  14  15
0 15 1 8 14 6 11 3 4 9 7 2 13 12 0 5 10
1 3 13 4 7 15 2 8 14 12 0 1 10 6 9 11 5
2 0 14 7 11 10 4 13 1 5 8 12 6 9 3 2 15
3 13 8 10 1 3 15 4 2 11 6 7 12 0 5 14 9
S3                                Column Number
Row No.   0   1   2   3   4   5   6   7   8   9  10  11  12  13  14  15
0 10 0 9 14 6 3 15 5 1 13 12 7 11 4 2 8
1 13 7 0 9 3 4 6 10 2 8 5 14 12 11 15 1
2 13 6 4 9 8 15 3 0 11 1 2 12 5 10 14 7
3 1 10 13 0 6 9 8 7 4 15 14 3 11 5 2 12
S4                                Column Number
Row No.   0   1   2   3   4   5   6   7   8   9  10  11  12  13  14  15
0 7 13 14 3 0 6 9 10 1 2 8 5 11 12 4 15
1 13 8 11 5 6 15 0 3 4 7 2 12 1 10 14 9
2 10 6 9 0 12 11 7 13 15 1 3 14 5 2 8 4
3 3 15 0 6 10 1 13 8 9 4 5 11 12 7 2 14
S5                                Column Number
Row No.   0   1   2   3   4   5   6   7   8   9  10  11  12  13  14  15
0 2 12 4 1 7 10 11 6 8 5 3 15 13 0 14 9
1 14 11 2 12 4 7 13 1 5 0 15 10 3 9 8 6
2 4 2 1 11 10 13 7 8 15 9 12 5 6 3 0 14
3 11 8 12 7 1 14 2 13 6 15 0 9 10 4 5 3
S6                                Column Number
Row No.   0   1   2   3   4   5   6   7   8   9  10  11  12  13  14  15
0 12 1 10 15 9 2 6 8 0 13 3 4 14 7 5 11
1 10 15 4 2 7 12 9 5 6 1 13 14 0 11 3 8
2 9 14 15 5 2 8 12 3 7 0 4 10 1 13 11 6
3 4 3 2 12 9 5 15 10 11 14 1 7 6 0 8 13
S7                                Column Number
Row No.   0   1   2   3   4   5   6   7   8   9  10  11  12  13  14  15
0 4 11 2 14 15 0 8 13 3 12 9 7 5 10 6 1
1 13 0 11 7 4 9 1 10 14 3 5 12 2 15 8 6
2 1 4 11 13 12 3 7 14 10 15 6 8 0 5 9 2
3 6 11 13 8 1 4 10 7 9 5 0 15 14 2 3 12
S8                                Column Number
Row No.   0   1   2   3   4   5   6   7   8   9  10  11  12  13  14  15
0 13 2 8 4 6 15 11 1 10 9 3 14 5 0 12 7
1 1 15 13 8 10 3 7 4 12 5 6 11 0 14 9 2
2 7 11 4 1 9 12 14 2 0 6 10 13 15 3 5 8
3 2 1 14 7 4 10 8 13 15 12 9 0 3 5 6 11
As an example, suppose that just prior to heading into the S-boxes, you have the following
string of 48 bits:
010100000110101100111101000110110011000011110101.
The first six bits, b1b2b3b4b5b6 = 010100, will be substituted for using the first S-box, S1. We have
b1b6 = 00 and b2b3b4b5 = 1010. Converting these to base 10, we get 0 and 10, so we look in row 0
and column 10 of S1, where we find 6. Converting 6 to base 2 gives 0110. Thus, the first 6 bits of
the 48-bit string above are replaced by 0110.
We then move on to the next 6 bits of our original 48-bit string, 000110. If we now label these as b1b2b3b4b5b6, we have b1b6 = 00 and b2b3b4b5 = 0011. Converting these to base 10, we get 0 and 3, so we look
in row 0 and column 3 of S2, where we find 14. Converting 14 to base 2 gives 1110. Thus, the second 6
bits of the 48-bit string above are replaced by 1110. We continue in this manner, 6 bits at a time, until
all 48 bits have been replaced by using each of the 8 S-boxes, in order. The final result is a 32-bit string.
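The two lookups just performed can be checked with a short sketch. Only S1 and S2 are reproduced here (copied from Table 13.1); the remaining six boxes are applied the same way, and the helper name `sbox_lookup` is just illustrative.

```python
# S1 and S2 from Table 13.1: 4 rows of 16 entries each.
S1 = [
    [14,  4, 13,  1,  2, 15, 11,  8,  3, 10,  6, 12,  5,  9,  0,  7],
    [ 0, 15,  7,  4, 14,  2, 13,  1, 10,  6, 12, 11,  9,  5,  3,  8],
    [ 4,  1, 14,  8, 13,  6,  2, 11, 15, 12,  9,  7,  3, 10,  5,  0],
    [15, 12,  8,  2,  4,  9,  1,  7,  5, 11,  3, 14, 10,  0,  6, 13],
]
S2 = [
    [15,  1,  8, 14,  6, 11,  3,  4,  9,  7,  2, 13, 12,  0,  5, 10],
    [ 3, 13,  4,  7, 15,  2,  8, 14, 12,  0,  1, 10,  6,  9, 11,  5],
    [ 0, 14,  7, 11, 10,  4, 13,  1,  5,  8, 12,  6,  9,  3,  2, 15],
    [13,  8, 10,  1,  3, 15,  4,  2, 11,  6,  7, 12,  0,  5, 14,  9],
]

def sbox_lookup(box, six_bits):
    """Apply one S-box: b1b6 selects the row, b2b3b4b5 the column; return 4 bits."""
    row = int(six_bits[0] + six_bits[5], 2)   # b1b6
    col = int(six_bits[1:5], 2)               # b2b3b4b5
    return format(box[row][col], '04b')

print(sbox_lookup(S1, '010100'))  # row 0, column 10 of S1 holds 6, so '0110'
print(sbox_lookup(S2, '000110'))  # row 0, column 3 of S2 holds 14, so '1110'
```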
It might seem that the particular values that fill the substitution boxes aren’t important. One
substitution is as good as another, right? Wrong! The official description for DES stated, “The
choice of the primitive functions KS, S1,…, S8 and P is critical to the strength of the encipherment
resulting from the algorithm.”11
Much more will be said about these S-boxes in this chapter, but for now, we continue the dis-
cussion of how DES works. We are now ready to look at the big picture.
Figure 13.5 illustrates the 16 rounds that were discussed above. Notice that there is no swap-
ping of right and left sides after the last round. This is so that encryption and decryption follow the
same steps with the only difference being that the round keys must be used in the opposite order for
decryption. It is important to note that the composition of rounds is not a round; that is, in general,
two rounds with different keys cannot be realized by a single round using some third key. If this were
the case, 16 rounds would be no better than one; they’d only take longer!
The only new elements here are the initial permutation and, at the end, the inverse initial
permutation.
The initial permutation is given by:
58 50 42 34 26 18 10 2
60 52 44 36 28 20 12 4
62 54 46 38 30 22 14 6
64 56 48 40 32 24 16 8
57 49 41 33 25 17 9 1
59 51 43 35 27 19 11 3
61 53 45 37 29 21 13 5
63 55 47 39 31 23 15 7
Nobody seems to know why the designers bothered to rearrange the bits of the
plaintext—it has no cryptographic effect—but that’s how DES is defined.
—Bruce Schneier12
11 National Institute of Standards and Technology, Data Encryption Standard (DES), Federal Information
Processing Standards Publication 46-3, October 25, 1999, p. 17, available online at http://csrc.nist.gov/
publications/fips/fips46-3/fips46-3.pdf. Note: KS stands for Key Schedule and refers to how the 16 round keys
are derived from the 56-bit key K. This is detailed later in this chapter.
12 Schneier, Bruce and Niels Ferguson, Practical Cryptography, Wiley, Indianapolis, Indiana, 2003, p. 52.
[Figure 13.5: The enciphering computation. The input passes through the Initial Permutation to give the permuted input L0R0, then through 16 rounds of the form Ln = Rn−1, Rn = Ln−1 ⊕ f(Rn−1, Kn), before the output is produced. (http://csrc.nist.gov/publications/fips/fips46-3/fips46-3.pdf.)]
Apparently, back in the 1970s, this permutation made it easier to load the data onto the chip
when DES encryption was carried out in hardware rather than software. Hardware implementations
of DES work really well, but software implementations aren't very efficient, because software
doesn't deal well with permutations of bits. On the other hand, permuting bytes could be done more efficiently.
This approach is taken in portions of AES, an algorithm examined in Section 20.3.
The inverse of the Initial Permutation is:
40 8 48 16 56 24 64 32
39 7 47 15 55 23 63 31
38 6 46 14 54 22 62 30
37 5 45 13 53 21 61 29
36 4 44 12 52 20 60 28
35 3 43 11 51 19 59 27
34 2 42 10 50 18 58 26
33 1 41 9 49 17 57 25
These permutations are always as above. They are not part of the key and add nothing to the
security of DES. The cipher would be just as strong (or weak) without them, but it wouldn’t be DES.
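Since these permutations contribute nothing to security, a quick sketch can at least confirm that the two tables really do undo one another. The helper names `permute` and `invert` below are illustrative, not part of any standard; the inverse table is computed rather than typed in, so it doubles as a check on the printed table.

```python
# The Initial Permutation, copied from the text; entry i names which bit of the
# input (numbered 1-64) lands in output position i.
IP = [58, 50, 42, 34, 26, 18, 10,  2,
      60, 52, 44, 36, 28, 20, 12,  4,
      62, 54, 46, 38, 30, 22, 14,  6,
      64, 56, 48, 40, 32, 24, 16,  8,
      57, 49, 41, 33, 25, 17,  9,  1,
      59, 51, 43, 35, 27, 19, 11,  3,
      61, 53, 45, 37, 29, 21, 13,  5,
      63, 55, 47, 39, 31, 23, 15,  7]

def permute(bits, table):
    """Output position i takes bit table[i] (1-indexed) of the input."""
    return [bits[i - 1] for i in table]

def invert(table):
    """Build the inverse table: if output i came from input j, then j maps back to i."""
    inv = [0] * len(table)
    for i, j in enumerate(table):
        inv[j - 1] = i + 1
    return inv

IP_INV = invert(IP)
block = list(range(1, 65))                       # stand-in for 64 message bits
assert permute(permute(block, IP), IP_INV) == block
print(IP_INV[:8])  # first row of the inverse table: [40, 8, 48, 16, 56, 24, 64, 32]
```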
Now, to complete the description of DES, we need to examine how the round keys are obtained
from K. It is accurate to refer to the key K as being 56 bits, but an extra 8 bits were used for error
detection. These check bits are inserted in positions 8, 16, 24, 32, 40, 48, 56, and 64 in order to
make the parity of each byte odd. So, when selecting key bits from this 64-bit string to use as
a round key, the positions holding the check digits should be ignored. The relevant 56 bits are
selected and permuted as follows.
57 49 41 33 25 17 9
1 58 50 42 34 26 18
10 2 59 51 43 35 27
19 11 3 60 52 44 36

63 55 47 39 31 23 15
7 62 54 46 38 30 22
14 6 61 53 45 37 29
21 13 5 28 20 12 4
There is more work to be done before we obtain a round key, though. A blank line was placed in
the middle of the 56 bits to indicate that it is split in half, just like the message blocks. To avoid
the confusion with L and R, we label these halves C and D.
[Figure 13.6: Obtaining the 16 round keys from the 56-bit key. Permuted Choice 1 produces C0 and D0; in each round, both halves undergo left shifts, and Permuted Choice 2 then selects the 48-bit round key Kn. (http://csrc.nist.gov/publications/fips/fips46-3/fips46-3.pdf.)]
13 Cyclic shifts are often indicated with the notations “⋘” or “⋙”, depending on whether the shift is to the left
or to the right.
After the number of left shifts required for a particular round has been performed, the two
halves are recombined and 48 bits are selected (and permuted) according to the following table.
14 17 11 24 1 5
3 28 15 6 21 10
23 19 12 4 26 8
16 7 27 20 13 2
41 52 31 37 47 55
30 40 51 45 33 48
44 49 39 56 34 53
46 42 50 36 29 32
The last necessary detail is the number of positions to shift left by in each round. It is usually (but
not always!) two (see Table 13.2).
Table 13.2  Left Shifts per Round
Round  Shifts    Round  Shifts
1      1         9      1
2      1         10     2
3      2         11     2
4      2         12     2
5      2         13     2
6      2         14     2
7      2         15     2
8      2         16     1
Notice that the shifts add up to 28, half the length of the key.
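Putting the key-schedule pieces together gives a compact sketch. PC-1 and PC-2 are the two selection tables just given, and the 64-bit test key is an arbitrary sample, not an official test vector; `round_keys` is an illustrative name.

```python
# PC-1 drops the 8 parity bits and permutes the remaining 56; PC-2 selects 48
# of the 56 shifted bits for each round key.
PC1 = [57, 49, 41, 33, 25, 17,  9,
        1, 58, 50, 42, 34, 26, 18,
       10,  2, 59, 51, 43, 35, 27,
       19, 11,  3, 60, 52, 44, 36,
       63, 55, 47, 39, 31, 23, 15,
        7, 62, 54, 46, 38, 30, 22,
       14,  6, 61, 53, 45, 37, 29,
       21, 13,  5, 28, 20, 12,  4]
PC2 = [14, 17, 11, 24,  1,  5,
        3, 28, 15,  6, 21, 10,
       23, 19, 12,  4, 26,  8,
       16,  7, 27, 20, 13,  2,
       41, 52, 31, 37, 47, 55,
       30, 40, 51, 45, 33, 48,
       44, 49, 39, 56, 34, 53,
       46, 42, 50, 36, 29, 32]
SHIFTS = [1, 1, 2, 2, 2, 2, 2, 2, 1, 2, 2, 2, 2, 2, 2, 1]  # Table 13.2

def round_keys(key64):
    """Derive the 16 round keys (48-bit strings) from a 64-bit key string."""
    cd = ''.join(key64[i - 1] for i in PC1)          # 56 bits, parity bits dropped
    c, d = cd[:28], cd[28:]
    keys = []
    for s in SHIFTS:
        c, d = c[s:] + c[:s], d[s:] + d[:s]          # cyclic left shifts
        keys.append(''.join((c + d)[i - 1] for i in PC2))
    return keys

ks = round_keys('0001001100110100010101110111100110011011101111001101111111110001')
print(len(ks), len(ks[0]))   # 16 round keys of 48 bits each
```

Because the shifts sum to 28, the halves C and D return to their starting positions after round 16, which is what makes reusing the same circuit for decryption (with the shifts run in reverse) convenient.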
You now have enough information to implement DES in the programming language of your
choice. There were many details that required our attention, but each piece is very simple to
explain, as well as to code.14 However, why the S-boxes took the form they did was far from clear
when Lucifer officially became DES. There’ll be more on this soon.
For now, we note that Claude Shannon would have been pleased by the design of DES. The com-
bination of the transpositions and the substitutions made by the S-boxes over 16 rounds provides the
diffusion and confusion he desired. Suppose we encipher some message M with DES using key K to
14 It should be kept in mind, though, that DES was designed to be implemented on a chip, not expressed in software.
get ciphertext C. Then, we change a single bit of M or K and encipher again to get a second ciphertext
C′. Comparing C and C′, we will typically find that they differ in about half of the positions.
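This avalanche behavior is easy to observe experimentally even without a full DES implementation. The sketch below builds a toy 16-round Feistel network with a made-up round function (emphatically NOT real DES) and flips one plaintext bit; the two ciphertexts typically differ in a large fraction of their 64 bits.

```python
# A self-contained toy experiment illustrating avalanche, not real DES.
def toy_f(r, k):
    # An arbitrary nonlinear mixing function on 32-bit values (a stand-in for f).
    x = (r ^ k) & 0xFFFFFFFF
    x = (x * 0x9E3779B1 + 0x7F4A7C15) & 0xFFFFFFFF
    return x ^ (x >> 13)

def encrypt(block64, keys):
    """Run a 64-bit block through 16 Feistel rounds with the given round keys."""
    left, right = block64 >> 32, block64 & 0xFFFFFFFF
    for k in keys:
        left, right = right, left ^ toy_f(right, k)
    return (left << 32) | right

keys = [(3 * i + 1) & 0xFFFFFFFF for i in range(16)]   # 16 arbitrary round keys
c1 = encrypt(0x0123456789ABCDEF, keys)
c2 = encrypt(0x0123456789ABCDEE, keys)                 # flip one plaintext bit
print(bin(c1 ^ c2).count('1'), 'of 64 ciphertext bits changed')
```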
See Figure 13.7 for another bit of cryptographic humor.
Figure 13.7 Another cartoon for the cryptologically informed (xkcd.com/153/). If you hover
over the cartoon with the mouse you get the alternate text, “If you got a big key space, let me
search it.”
Diffie, Hellman and others have objected that a 56-bit key may be inadequate to
resist a brute force attack using a special purpose computer costing about $20 million.
Others have estimated ten times that much. Whichever figure is correct, there is little
safety margin in a 56-bit key.15
15 Morris, Robert, Neil J. A. Sloane, and Aaron D. Wyner, “Assessment of the National Bureau of Standards
Proposed Federal Data Encryption Standard,” Cryptologia, Vol. 1, No. 3, July 1977, pp. 281–291, p. 281 cited
here. References for their objection are: Diffie, Whitfield, Preliminary Remarks on the National Bureau of
Standards Proposed Standard Encryption Algorithm for Computer Data Protection, unpublished report, Stanford
University, May 1975; Diffie, Whitfield and Martin E. Hellman, “A Critique of the Proposed Data Encryption
Standard,” Communications of the ACM, Vol. 19, No. 3, March 1976, pp. 164–165; Diffie, Whitfield and
Martin E. Hellman, “Exhaustive Cryptanalysis of the NBS Data Encryption Standard,” Computer, Vol. 10,
No. 6, June 1977, pp. 74–84.
The machine hypothesized by Diffie and Hellman would have 1,000,000 chips, and could break
DES, given a single plaintext/ciphertext pair, in about 12 hours.16 It’s not hard to get a plaintext/
ciphertext pair. In fact, NBS agreed that this type of attack was reasonable.17
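A back-of-the-envelope check of the machine's figures, assuming it must sweep the entire 2^56 key space in 12 hours (ignoring the average-case factor of two):

```python
# Rough arithmetic behind the Diffie-Hellman brute-force machine.
keys = 2 ** 56                        # size of the DES key space
chips = 1_000_000
seconds = 12 * 3600                   # 12 hours
per_chip = keys / (chips * seconds)   # keys each chip must test per second
print(f"{keys:,} keys; about {per_chip:,.0f} keys/second per chip")
```

A rate on the order of a million key trials per second per chip was plausible for mid-1970s special-purpose hardware, which is why the estimate was taken seriously.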
Walter Tuchman, part of the IBM design team, argued that the machine would cost $200 mil-
lion and went on to say that it would be cheaper and easier to get the information via bribery or
blackmail than to build the machine. However, he also said, “In our judgment, the 56-bit key length
is more than adequate for the foreseeable future, meaning 5 to 10 years.”18 Why not look ahead a bit
farther? Whether the machine would cost $20 million or $200 million is a trivial detail. Such costs
would be minor for a government with a military intelligence budget in the billions of dollars.
If 56 bits is too small, what size key should be used? Hellman pointed out that the military
routinely used key sizes nearly 20 times as large as that of DES.19 He suggested that NSA was
attempting to limit publicly available encryption keys to a size they could break. Because they
routinely allowed systems with keys shorter than 64 bits to be exported, but not systems with
longer keys, a 56-bit key must be vulnerable to their attacks.20 Others objecting to the 56-bit key
included the following:
Yasaki, E. K., “Encryption Algorithm: Key Size is the Thing,” Datamation, Vol. 22, No. 3,
March 1976, pp. 164–166.
Kahn, David, “Tapping Computers,” The New York Times, April 3, 1976, p. 27.
Guillen, M., “Automated Cryptography,” Science News, Vol. 110, No. 12, September 18, 1976,
pp. 188–190.
16 Diffie, Whitfield and Martin E. Hellman, “Exhaustive Cryptanalysis of the NBS Data Encryption Standard,”
Computer, Vol. 10, No. 6, June 1977, pp. 74–84.
17 Morris, Robert, Neil J. A. Sloane, and Aaron D. Wyner, “Assessment of the National Bureau of Standards
Proposed Federal Data Encryption Standard,” Cryptologia, Vol. 1, No. 3, July 1977, pp. 281–291, p. 286 cited
here.
18 Kinnucan, Paul, “Data Encryption Gurus: Tuchman and Meyer,” Cryptologia, Vol. 2, No. 4, October 1978,
pp. 371–381.
19 Kolata, Gina Bari, “Computer Encryption and the National Security Agency Connection,” Science, Vol. 197,
Morris, Sloane, and Wyner concluded that it would “very probably become feasible sometime in the
1980s for those with large resources (e.g., government and very large corporations) to decrypt [the
proposed standard] by exhaustive key search.”22 They did admit that they were not aware of any
cryptanalytic short-cuts to decrypting; that is, brute-force was the best attack they could find.
It was also observed that if the 56-bit key were obtained from 8 typed characters, as opposed
to 56 random bits, then “the cost of surreptitious decryption is lowered by a factor of about 200”
and suggested that users be warned of this, as well as advised that security can be increased by
enciphering twice with two different keys.23 This last statement seems premature (although it did
turn out to be correct, by a small margin). It wasn’t until 1993 that DES keys were proven not to
form a group. So, until 1993, one couldn’t be sure that double enciphering was any better.24
Morris, Sloane, and Wyner suggested the key length should be increased to 64 or even 128
bits.25 In earlier versions, the algorithm did, in fact, use a 128-bit key.26 They also wanted at least
32 iterations, instead of just 16.
Addressing concerns of the possibility of a backdoor having been built into DES by NSA col-
laborators, Tuchman said, “We developed the DES algorithm entirely within IBM using IBMers.
The NSA did not dictate a single wire.”27 It is now known that IBM did in fact receive help from
NSA. After receiving IBM’s work, NBS sent it on to NSA, where changes were made, with the
algorithm described here being the final result.28
There were even contemporary accounts admitting NSA’s involvement:
Aaron Wyner of Bell Laboratories in Murray Hill, New Jersey, says ‘IBM makes no
bones about the fact that NSA got into the act before the key size was chosen.’ [Alan]
Konheim [of IBM at Yorktown Heights, New York] admits that ‘IBM was involved
with the NSA on an ongoing basis. They [NSA employees] came up every couple of
months to find out what IBM was doing.’ But, Konheim says, ‘The 56-bit key was
chosen by IBM because it was a convenient size to implement on a chip.’29
More inconvenient would have been an inability to obtain an export license for a version with a
larger key!
22 Morris, Robert, Neil J. A. Sloane, and Aaron D. Wyner, “Assessment of the National Bureau of Standards
Proposed Federal Data Encryption Standard,” Cryptologia, Vol. 1, No. 3, July 1977, pp. 281–291, p. 281 cited
here.
23 Morris, Robert, Neil J. A. Sloane, and Aaron D. Wyner, “Assessment of the National Bureau of Standards
Proposed Federal Data Encryption Standard,” Cryptologia, Vol. 1, No. 3, July 1977, pp. 281–282, 286.
24 Although the claim should not have been stated as a fact, it wasn’t based on nothing. For the evidence, see
Grossman, Edna, Group Theoretic Remarks on Cryptographic Systems Based on Two Types of Addition, Research
Report RC-4742, Thomas J. Watson Research Center, IBM, Yorktown Heights, New York, February 26, 1974.
25 Morris, Robert, Neil J. A. Sloane, and Aaron D. Wyner, “Assessment of the National Bureau of Standards
Proposed Federal Data Encryption Standard,” Cryptologia, Vol. 1, No. 3, July 1977, pp. 281–291, p. 282 cited
here.
26 Girdansky, M. B., Data Privacy: Cryptology and the Computer at IBM Research, IBM Research Reports, Vol.
7, No. 4, 1971, 12 pages; Meyer, Carl H., “Design Considerations for Cryptography,” AFIPS ‘73 Conference
Proceedings, Vol. 42, AFIPS Press, Montvale, New Jersey, 1973, pp. 603–606 (see Figure 3 on p. 606).
27 Kinnucan, Paul, “Data Encryption Gurus: Tuchman and Meyer,” Cryptologia, Vol. 2, No. 4, October 1978,
pp. 371–381.
28 Trappe, Wade and Lawrence C. Washington, Introduction to Cryptography with Coding Theory, Prentice Hall,
In 1973 NBS solicited private industry for a data encryption standard (DES). The
first offerings were disappointing, so NSA began working on its own algorithm. Then
Howard Rosenblum, deputy director for research and engineering discovered that
Walter Tuchman of IBM was working on a modification to Lucifer for general use.
NSA gave Tuchman a clearance and brought him in to work jointly with the Agency
on Lucifer modification.31
NSA worked closely with IBM to strengthen the algorithm against all except brute
force attacks and to strengthen substitution tables, called S-boxes. Conversely, NSA
tried to convince IBM to reduce the length of the key from 64 to 48 bits. Ultimately,
they compromised on a 56-bit key.32
Though its export was restricted, it was known to be widely used outside the United
States. According to a March 1994 study, there were some 1,952 products developed
and distributed in thirty-three countries.33
30 Although not in the original declassified version, these passages were later released thanks to a FOIA request
filed by John Young.
31 Johnson, Thomas R., American Cryptology During the Cold War, 1945–1989, Book III: Retrenchment and
Reform, 1972–1980, Center for Cryptologic History, National Security Agency, Fort George G. Meade,
Maryland, 1995, p. 232.
32 Johnson, Thomas R., American Cryptology During the Cold War, 1945–1989, Book III: Retrenchment and
Reform, 1972–1980, Center for Cryptologic History, National Security Agency, Fort George G. Meade,
Maryland, 1995, p. 232. Another source tells it differently: Foerstel, Herbert N., Secret Science: Federal Control
of American Science and Technology, Praeger, Westport, Connecticut, 1993, p.129 says the key was chopped
from 128 bits (as used in Lucifer) to 56 bits and that NSA really wanted a 32 bit key!
33 Johnson, Thomas, R., American Cryptology During the Cold War, 1945–1989, Book III: Retrenchment and
Reform, 1972–1980, Center for Cryptologic History, National Security Agency, Fort George G. Meade,
Maryland, 1995, p. 239.
34 Morris, Robert, Neil J. A. Sloane, and Aaron D. Wyner, “Assessment of the National Bureau of Standards Proposed
Federal Data Encryption Standard,” Cryptologia, Vol. 1, No. 3, July 1977, pp. 281–291, p. 282 cited here.
35 For claim and quote about doubt (from the authors), see Morris, Robert, Neil J. A. Sloane, and Aaron D. Wyner,
“Assessment of the National Bureau of Standards Proposed Federal Data Encryption Standard,” Cryptologia,
Vol. 1, No. 3, July 1977, pp. 281–291, p. 287 cited here.
36 Lexar Corporation, An Evaluation of the NBS Data Encryption Standard, unpublished report, Lexar Corporation,
the communications, data processing, and banking industries, by the police and the medical and
legal professions, as well as being widely used by the Federal agencies for which it was officially
designed.” They were right:
DES quickly took over the encryption market, becoming the code provided by 99
percent of the companies selling equipment.37
Even the Moscow-based Askri company offers a software encryption package called
Cryptos for $100. The package is based on the Data Encryption Standard, America’s
own national standard.38
The plea for a longer key wasn’t heeded, nor were details of the S-box construction revealed. Although
adoption of DES was widespread, these problems prevented it from being universally accepted.
Robert Morris of Bell Laboratories in Murray Hill, New Jersey, says that officials of
the Bell Telephone Company have decided that the DES is too insecure to be used in
the Bell System.39
In 1985, when DES was used by bankers to encrypt electronic fund transfers, NSA Deputy
Director Walter Deeley said he “wouldn’t bet a plugged nickel on the Soviet Union not breaking
it.”40 Yet, DES remained mandatory until 2002 for all federal agencies, for protection of sensitive
unclassified information needing encryption.
A common question over the years since the arrival of DES was whether or not NSA could
break it. The quote above seems to indicate an affirmative answer by 1985, but the quote by Matt
Blaze reproduced below comes at the question from another perspective.41
An NSA-employed acquaintance, when asked whether the government can crack DES
traffic, quipped that real systems are so insecure that they never need to bother.
Blaze went on to give a list of the “Top Ten Threats to Security in Real Systems.” Anyone concerned
with actual security, and not just the mathematical aspects of the subject, should study this list.42
37 Foerstel, Herbert N., Secret Science: Federal Control of American Science and Technology, Praeger, Westport,
Connecticut, 1993, p. 129.
38 Foerstel, Herbert N., Secret Science: Federal Control of American Science and Technology, Praeger, Westport,
the DES designers. They called it “T attack.” The mysterious S-boxes were generated and tested until
ones that fit certain criteria were found; they needed to be able to resist T attack and linear cryptanaly-
sis. Neither attack was publicly known at the time. The NSA knew about differential cryptanalysis,
but didn’t want the information shared; hence, the generation of the S-boxes had to remain secret.
Susan Landau, among others, believes that NSA had not anticipated linear cryptanalysis.43 In any
case, both of these attacks were rediscovered in the open community and are now available to anyone.
There are some “weak” keys for DES. This term is used to denote keys such that all of the
round keys they produce are identical. Obvious examples are a key consisting of 56 zeros and a
key consisting of 56 ones, but there are two others. There are also six pairs of semi-weak keys that
should be avoided. Rather than generating 16 distinct round keys, these only produce two, each
of which is then used for 8 rounds. The result is that a key in each semi-weak pair can decrypt
messages enciphered with the other key.44 Each extra key that can be used reduces the average
run-time of a brute-force attack by a factor of two.
Figure 13.8 The EFF DES Cracker sits behind Paul Kocher, the principal designer, who is
holding one of the 29 boards, each of which contains 64 custom microchips. (From http://
www.cryptography.com/technology/applied-research/research-efforts/des-key-search/des-key-
search-photos.html and used with the permission of Paul Kocher.)
43 In any case, it first appeared publicly in Matsui, Mitsuru, “Linear Cryptanalysis Method for DES Cipher,” in
Helleseth, Tor, editor, Advances in Cryptology – EUROCRYPT ’93 Proceedings, Lecture Notes in Computer
Science, Vol. 765, Springer, Berlin, Germany, 1994, pp. 386–397.
44 For more on this topic see Moore, Judy H. and Gustavus Simmons, “Cycle Structure of the DES with Weak
and Semiweak Keys,” in Odlyzko, Andrew M., editor, Advances in Cryptology – CRYPTO ‘86 Proceedings,
Lecture Notes in Computer Science, Vol. 263, Springer, Berlin, Germany, 1987, pp. 9–32.
396 ◾ Secret History
Figure 13.9 A close-up view of one of the DES Cracker circuit boards. (From http://www.cryp-
tography.com/technology/applied-research/research-efforts/des-key-search/des-key-search-
photos.html and used with the permission of Paul Kocher.)
Figure 13.10 A close-up view of one of the “Deep Crack” custom microchips. (From http://
www.cryptography.com/technology/applied-research/research-efforts/des-key-search/des-key-
search-photos.html and used with permission of Paul Kocher.)
The Data Encryption Standard ◾ 397
To prove the insecurity of DES, EFF built the first unclassified hardware for cracking
messages encoded with it. On Wednesday, July 17, 1998 the EFF DES Cracker, which
was built for less than $250,000, easily won RSA Laboratory’s DES Challenge II
contest and a $10,000 cash prize. It took the machine less than 3 days to complete the
challenge, shattering the previous record of 39 days set by a massive network of tens
of thousands of computers.45
Six months later, on Tuesday, January 19, 1999, Distributed.Net, a worldwide coalition
of computer enthusiasts, worked with EFF’s DES Cracker and a worldwide network of
nearly 100,000 PCs on the Internet, to win RSA Data Security’s DES Challenge III in a
record-breaking 22 hours and 15 minutes. The worldwide computing team deciphered
a secret message encrypted with the United States government’s Data Encryption
Standard (DES) algorithm using commonly available technology. From the floor of
the RSA Data Security Conference & Expo, a major data security and cryptography
conference being held in San Jose, Calif., EFF’s DES Cracker and the Distributed.Net
computers were testing 245 billion keys per second when the key was found.46
Nevertheless, the broken system was reaffirmed as the standard in 1999! This statement should be
qualified—it was reaffirmed in the Triple DES implementation, which neither the EFF machine
nor Distributed.Net could break. In 2002, a new standard was finally named: the Advanced
Encryption Standard. It is described in Section 20.3.
For now, the record for the least expensive DES cracking machine is held jointly by team
members from the German universities of Bochum and Kiel. Dubbed COPACOBANA (Cost-
Optimized Parallel Code Breaker), their $10,000 device cracked a DES message in 9 days in 2006.
Modifications made since then have improved COPACOBANA’s efficiency and it now produces
plaintext in less than a day.
C = E_key2(E_key1(M))
we simply form two columns of partial decipherments.
45 “‘EFF DES Cracker’ Machine Brings Honesty to Crypto Debate, Electronic Frontier Foundation Proves that
DES is not Secure,” Electronic Frontier Foundation, https://tinyurl.com/y8fzjymd, July 17, 1998.
46 “Cracking DES,” Electronic Frontier Foundation, https://tinyurl.com/y9eqmwdc.
47 Merkle, Ralph, and Martin Hellman, “On the Security of Multiple Encryption,” Communications of the ACM,
These two columns will have an entry in common. When key 1 is used in Column 1 and key 2 is
used in Column 2, the entries will both match the result following the first encipherment of the
original message. Thus, after calculating 2 columns of 2⁵⁶ values each, the keys must be revealed.
This is how we get 2⁵⁷ as the size of the space that must be brute-forced.
When looking for a match between Columns 1 and 2, we might find several. If we have more
than a single block of text to work with, we can apply the potential keys to each of these to see
which is actually the correct key pair.
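The table-building idea can be sketched with a toy cipher. Everything below is invented for illustration (a 16-bit cipher with 16-bit keys, so both columns can be built exhaustively); a real meet-in-the-middle attack on double DES faces columns of 2⁵⁶ entries, but the structure is identical.

```python
# Meet-in-the-middle against double encryption, C = E_k2(E_k1(M)).
def toy_encrypt(block, key):
    x = (block ^ key) & 0xFFFF
    x = ((x << 5) | (x >> 11)) & 0xFFFF      # rotate left 5
    return x ^ ((key * 31) & 0xFFFF)

def toy_decrypt(block, key):
    x = block ^ ((key * 31) & 0xFFFF)
    x = ((x >> 5) | (x << 11)) & 0xFFFF      # rotate right 5
    return x ^ key

k1, k2 = 0x1234, 0xBEEF                      # the secrets we want to recover
pairs = [(m, toy_encrypt(toy_encrypt(m, k1), k2))
         for m in (0x0000, 0x5555, 0xCAFE)]  # known plaintext/ciphertext pairs

m0, c0 = pairs[0]
column1 = {}                                 # Column 1: E_k(m0) for every key k
for k in range(2**16):
    column1.setdefault(toy_encrypt(m0, k), []).append(k)

candidates = []                              # Column 2: D_k(c0), matched against Column 1
for k in range(2**16):
    for ka in column1.get(toy_decrypt(c0, k), []):
        candidates.append((ka, k))

# the extra known pairs weed out false matches
survivors = [kp for kp in candidates
             if all(toy_encrypt(toy_encrypt(m, kp[0]), kp[1]) == c
                    for m, c in pairs[1:])]
print(survivors)
```

The work done is two passes over the 16-bit key space (2 × 2¹⁶ encryptions) rather than one pass over the 32-bit double-key space, mirroring the 2⁵⁷-versus-2¹¹² saving for double DES.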
With Triple DES, we gain more of an advantage. We may carry out the triple encryption with
just two keys by applying the following steps.48
48 This procedure was suggested in Tuchman, Walter, “Hellman Presents No Shortcut Solutions to DES,” IEEE
Spectrum, Vol. 16, No. 7, July 1979, pp. 40–41.
49 van Oorschot, Paul C. and Michael J. Wiener, “A Known-plaintext Attack on Two-key Triple Encryption,” in
Damgård, Ivan B., editor, Advances in Cryptology – EUROCRYPT ’90 Proceedings, Lecture Notes in Computer
Science, Vol. 473, Springer, Berlin, Germany, 1991, pp. 318–325.
50 Kaliski, Jr., Burton S., Ronald L. Rivest, and Alan T. Sherman, “Is the Data Encryption Standard a Group?
(Results of Cycling Experiments on DES),” Journal of Cryptology, Vol. 1, No. 1, 1988, pp. 3–36.
51 Campbell, Keith W. and Michael J. Wiener, “DES is Not a Group,” in Brickell, Ernest F., Advances in
Cryptology – Crypto ’92, Springer, Berlin, Germany, 1993, pp. 512–520. This paper credits Don Coppersmith
for the proof. It’s available online at http://dsns.csie.nctu.edu.tw/research/crypto/HTML/PDF/C92/512.PDF.
We may double encipher a given message M by using both keys: E₁E₀(M). This double encipherment
may then be applied to the ciphertext, E₁E₀(E₁E₀(M)). For convenience, we denote this as
(E₁E₀)²(M). What is of interest to us is that there are choices for M such that (E₁E₀)ⁿ(M) = M,
where n is about 2³². This power is small enough that a cryptanalyst can investigate and determine
the exact value of n without having a ridiculously long wait. The lowest value of n yielding the
original message is referred to as the cycle length of M. The lowest value for n that will work for all
messages is the order of E₁E₀. The cycle length of any particular message M must divide the order
of E₁E₀, which in turn divides the order of the group formed by the DES keys (if it is indeed a
group). So, to show that DES is not a group, all that is necessary is to calculate the cycle lengths
for various choices of M and look at their least common multiple, which provides a lower bound on
the order of E₁E₀ (and hence, the DES keys themselves).
Don Coppersmith was the first to do this.52 The cycle lengths of 33 messages he examined
implied the order of E₁E₀ to be at least 10²⁷⁷. Keith W. Campbell and Michael J. Wiener followed
up on this with 295 more messages showing that the subgroup generated by the DES permutation
was bounded below by 1.94 × 10²⁴⁹⁹. Campbell and Wiener noted that back in 1986 cycle lengths
were published by Moore and Simmons53 that, when taken with the argument above, were suf-
ficient to show DES was not a group; however, this point was missed and the question remained
open for years!
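The argument is easy to try on a small scale. The random permutation below is just a stand-in for E₁E₀ (it has nothing to do with DES); every sampled cycle length must divide the permutation's order, so the least common multiple of the samples bounds that order from below:

```python
import math
import random

random.seed(0)
N = 10_000
perm = list(range(N))
random.shuffle(perm)                  # stand-in for the permutation E1E0

def cycle_length(start):
    x, steps = perm[start], 1
    while x != start:                 # follow the cycle back to where it began
        x = perm[x]
        steps += 1
    return steps

samples = [cycle_length(random.randrange(N)) for _ in range(33)]
lower_bound = math.lcm(*samples)      # each sample divides the order, so the lcm does too
print(sorted(set(samples)), lower_bound)
```

For DES, the analogous lcm computed from Coppersmith's 33 sampled messages already exceeded the 2⁵⁶ keys, settling the group question.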
DES was not to be the last cipher that NSA had a hand in designing for outside use. A less
successful attempt was made with the Clipper chip in 1993.54 It differed from DES in that
the algorithm was classified; all the user would get would be a tamperproof chip. Even worse,
the proposal came with the idea of “key escrow,” meaning that the built-in key for each chip
would be kept on file, so that it could be accessed by law enforcement agents who had obtained
warrants. The government’s past history of warrantless eavesdropping didn’t inspire much con-
fidence in this proposal. Following some heated debate, Clipper vanished. Key escrow attempts
proved to be even less welcome in the European Union, where there are, in general, more laws
protecting privacy.
52 Don Coppersmith, In Defense of DES, personal communication to author(s) of DES is not a Group, July 1992.
The work was also described briefly in a posting to sci.crypt on Usenet News, May 18, 1992.
53 Moore, Judy H. and Gustavus Simmons, “Cycle Structure of the DES with Weak and Semiweak Keys,” in
Odlyzko, Andrew M., editor, Advances in Cryptology – CRYPTO ‘86 Proceedings, Lecture Notes in Computer
Science, Vol. 263, Springer, Berlin, Germany, 1987, pp. 9–32.
54 For a non-technical history of the Clipper chip, see pages 226–268 of Levy, Steven, Crypto: How the Code
Rebels Beat the Government, Saving Privacy in the Digital Age, Viking, New York, 2001. An analysis of the
algorithm used by Clipper (it was eventually released) is given in Kim, Jongsung, and Raphaël C.–W. Phan
“Advanced Differential-Style Cryptanalysis of the NSA’s Skipjack Block Cipher,” Cryptologia, Vol. 33, No. 3,
July 2009, pp. 246–270.
As I said, we were hoping for some correlation, not necessarily a perfect correlation. We investigated
in a very unsophisticated way. We wrote a program that generated a random message and
enciphered it with 100 random keys and then calculated the correlation between the bitsums of
the ciphertexts and the keys. Actually, this was all done inside a large loop, so many different ran-
dom messages were tested in this manner. Every time a message yielded a higher correlation than
any previous messages (or tied the current high), it was displayed on the screen. Thus, when run,
we saw a list of ever better (but never actually good) correlation values displayed beside various
messages. Take a look at Figure 13.11, which shows some results, and see if you notice anything
unusual, before reading the explanation that follows.
We thought it very strange that the last two messages displayed were complementary! Because
the messages were randomly generated it seemed unlikely that the complement of one would
appear so quickly, and why would they have the same correlation value? I went back and read a
bit more about DES and learned that replacing every bit of a message with its complement and
doing the same for the key yields a ciphertext that is the complement of the original ciphertext.55
Algebraically,
E(K̄, P̄) = C̄, where C = E(K, P) and the overbar denotes bitwise complementation.
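The property can be confirmed on any Feistel network whose round function XORs the key into the data, because the two complements cancel inside the XOR. A toy sketch (decidedly not DES, but with the same key-mixing structure):

```python
import random

random.seed(42)
SBOX = list(range(256))
random.shuffle(SBOX)                 # a fixed, public round-function table

def feistel_encrypt(block16, key8, rounds=8):
    # Toy Feistel network: the key enters only via XOR, which is
    # exactly what makes complements cancel in the round function.
    left, right = block16 >> 8, block16 & 0xFF
    for _ in range(rounds):
        left, right = right, left ^ SBOX[right ^ key8]
    return (left << 8) | right

m, k = 0x3A7C, 0x5D
c = feistel_encrypt(m, k)
c_comp = feistel_encrypt(m ^ 0xFFFF, k ^ 0xFF)   # complement message and key
print(hex(c), hex(c_comp))
print(c_comp == c ^ 0xFFFF)                      # True: the ciphertexts are complements
```

Inside each round, (right ⊕ key) is unchanged when both halves are complemented, so the round-function output is identical and only the XORed-in left half flips; by induction the final ciphertext is the complement.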
This result was known (to others) well before we began our attack, but it was new to us. Taking
the message yielding the “best” correlation, we tried it repeatedly with random sets of keys and
computed the correlation for each set. As the output in Figure 13.12 indicates, the correlation
didn’t remain at the value found above.
The situation is analogous to taking a large number of coins (as opposed to messages) and
flipping them 100 times each. Perhaps one of them will land head side up 60% of the time. The
distribution of the number of heads the coins show should follow a bell curve, so it's not surprising
that some coins yield many more heads than others. However, if we take the coin yielding the
most heads and flip it another 100 times, repeatedly, we can expect, on average, that there will be
50 heads each time.
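A quick simulation of the analogy (seeded so it is repeatable):

```python
import random

random.seed(7)

def heads_in_100():
    return sum(random.random() < 0.5 for _ in range(100))

first_round = [heads_in_100() for _ in range(1000)]    # 1,000 fair coins
best = max(first_round)
retries = [heads_in_100() for _ in range(50)]          # re-flip the "best" coin

print(f"best first showing: {best} heads")             # well above 50
print(f"average on retries: {sum(retries) / 50:.1f}")  # back near 50
```

The champion coin's impressive first showing is pure selection effect; on retrial it regresses to the mean, exactly as the "best" message's correlation did.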
55 For a proof see Schneier, Bruce and Niels Ferguson, Practical Cryptography, Wiley, Indianapolis, Indiana, 2003,
p. 54.
Figure 13.12 Result of retrying the “best” message with more random key sets.
So, a purely investigatory line of thought, not backed by any theory, pointed us to an interest-
ing feature of DES that we had not anticipated. Though the result was not new, I think it serves as
a good illustration of how research may go. Even failures may prove interesting and useful.
Several DES variants have been offered to address attacks. The most notable (and simplest) is
Triple DES, but there is also DES with Independent Subkeys, DESX, CRYPT(3), Generalized
DES (GDES), DES with Alternate S-Boxes, RDES, snDES (a number of variants, with n taking
the values 2, 3, 4, and 5), DES with Key-Dependent S-Boxes, and NEWDES.56
implementing matrix encryption.58 Before showing Levine's approach, we set our notation. We will
represent traditional matrix encryption by Cᵢ = APᵢ, where A is the (never changing) enciphering
matrix and Pᵢ and Cᵢ are the ith groups of plaintext and ciphertext characters, respectively.
58 Levine, Jack, “Variable Matrix Substitution in Algebraic Cryptography,” American Mathematical Monthly, Vol.
65, No. 3, March 1958, pp. 170–179.
59 Feynman, Richard, “What Do You Care What Other People Think?” W. W. Norton & Company, New York,
1988, p. 61.
So, the ciphertext begins 0 14 3 0 4 14, or AODAEO, in terms of letters. This should be
enough to make Levine’s matrix encryption mode clear.
Levine also presented a version that used previous ciphertext to vary the encryption:
Cᵢ = APᵢ + BCᵢ₋₁
Another method used by Levine to vary the substitutions will only be mentioned in passing, as it
didn't lead in the direction of the modern modes of encryption. It took the form Cᵢ = AᵢPᵢ. That is,
the matrix A changed with every encipherment. One need only be careful that the Aᵢ are generated
in a manner that ensures each will be invertible; otherwise, the ciphertext will not have a unique
decipherment.
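Levine's chained rule can be run directly mod 26. The matrices below are illustrative choices (not the ones from Levine's paper), with C₀ taken to be the zero vector; decryption just inverts the rule as Pᵢ = A⁻¹(Cᵢ − BCᵢ₋₁):

```python
# Levine's chained mode: C_i = A*P_i + B*C_(i-1) (mod 26)
A     = [[3, 3], [2, 5]]          # enciphering matrix; invertible mod 26
A_INV = [[15, 17], [20, 9]]       # satisfies A * A_INV = I (mod 26)
B     = [[1, 2], [3, 4]]          # chaining matrix; need not be invertible

def mat_vec(M, v):
    return [(M[0][0] * v[0] + M[0][1] * v[1]) % 26,
            (M[1][0] * v[0] + M[1][1] * v[1]) % 26]

def encrypt(blocks, c0=(0, 0)):
    out, prev = [], list(c0)
    for p in blocks:
        c = [(x + y) % 26 for x, y in zip(mat_vec(A, p), mat_vec(B, prev))]
        out.append(c)
        prev = c
    return out

def decrypt(blocks, c0=(0, 0)):
    out, prev = [], list(c0)
    for c in blocks:              # P_i = A_INV * (C_i - B*C_(i-1))
        diff = [(x - y) % 26 for x, y in zip(c, mat_vec(B, prev))]
        out.append(mat_vec(A_INV, diff))
        prev = c
    return out

plain = [[7, 4], [11, 11], [14, 23]]   # "HE", "LL", "OX" as values 0-25
cipher = encrypt(plain)
print(cipher)
```

Note how the chaining makes repeated plaintext pairs (the two "LL" columns here would, with repeated blocks) encrypt differently, which is exactly the feature that anticipates modern chained modes.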
damaged blocks will be of the same number and in the same positions. This mode was invented in
1976 by researchers at IBM. They were granted a patent two years later.60
[Figure 13.13: Diagram of the mode — the shift register feeds the Encrypt step, whose output is XORed with Pᵢ (8 bits) to produce Cᵢ (8 bits); Cᵢ is fed back into the shift register.]
The two Ci paths heading out indicate the ciphertext goes to the intended recipient, as well as
back up to the right-hand side of the shift register. When the 8 bits of ciphertext hit the shift reg-
ister, they push out the 8 bits on the left-hand side of the register; that is, all the bits in the register
shift eight positions to the left and the leftmost 8 bits fall off the edge (get discarded).
If an error is present in a plaintext block, it will change all the ciphertext blocks that follow, but
this isn’t as bad as it sounds. The error will undo itself upon deciphering, until the original flawed
plaintext block is reached. On the other hand, if an error creeps into a ciphertext block, there will
60 Ehrsam, William F., Carl H. W. Meyer, John L. Smith, and Walter L. Tuchman, Message Verification and
Transmission Error Detection by Block Chaining, U.S. Patent 4,074,066, February 14, 1978, https://patents.
google.com/patent/US4074066A/en.
be an error in the corresponding plaintext block, and it will then creep into the shift register, where
it will cause further errors until it is shifted all the way out of the register.
[Figure 13.14: Diagram of the mode — the shift register feeds the Encrypt step, with Pᵢ (8 bits) in and Cᵢ (8 bits) out; here the bits fed back into the shift register come directly from the encryption step.]
The only difference between Figure 13.13 and Figure 13.14 is that in the present mode the bits
advancing the shift register arise directly from the encryption step, rather than after the XOR. In this
mode, an error that creeps into a ciphertext block will affect only the corresponding plaintext block.
At the Crypto ’82 conference held at the University of California, Santa Barbara, a pair of
talks addressed weaknesses in using OFB to generate keys in groups of less than 64 bits at a time,
as illustrated above. An abstract published in the proceedings of this conference put it bluntly.
The broad conclusion is reached that OFB with m = 64 is reasonably secure but OFB
with any other value of m is of greatly inferior security.61
61 Davies, Donald W. and Graeme I. P. Parkin, “The Average Cycle Size of the Key Stream in Output Feedback
Encipherment (Abstract),” in Chaum, David, Ronald L. Rivest, and Alan T. Sherman (editors), Advances in
Cryptology, Proceedings of Crypto 82, Plenum Press, New York, 1983, pp. 97–98.
The problem was that smaller values of m, such as the 8 used above, could give rise to cycles of key
to be XORed with the text that are simply too short for heavy traffic. So, it shouldn’t be a surprise
that the update to this mode definition, as given in SP 800-38A, is to have it operate on the entire
block, not just a portion of it.62
Step 2: R = E(N ⊕ L)
Efficient Authenticated Encryption,” ACM Conference on Computer and Communications Security (CCS ‘01),
ACM Press, pp. 196–205, 2001, available online at http://krovetz.net/csus/papers/ocb.pdf. The journal version
is in ACM Transactions on Information and System Security (TISSEC), Vol. 6, No. 3, August 2003, pp. 365–403,
available online at https://dl.acm.org/doi/pdf/10.1145/937527.937529 and https://www.cs.ucdavis.edu/∼rogaway/
papers/ocb-full.pdf.
65 Digital Fountain was founded to commercialize encoding technology for efficient reliable asynchronous mul-
N stands for nonce, which is another name for an initialization vector. It is a random string of bits.

Step 3: Lᵢ = Lᵢ₋₁ · 2

This step, performed repeatedly, generates the sequence L₁, L₂, L₃, … The symbol · is not used to
represent the simple kind of multiplication that everyone is familiar with. Instead, it designates
multiplication over the finite field GF(2ⁿ) with n = 128. This requires some explanation. A much
smaller example than applies to OCB will be used to convey the idea.
Suppose we want to multiply two values over the finite field GF(2ⁿ) for the case n = 3. Then
the values being multiplied will be 3 bits each. But before the multiplication is carried out, the bits
will become the coefficients of polynomials. For example, to multiply 111 and 011, we multiply 1x²
+ 1x + 1 and 0x² + 1x + 1. Of course, we don't need to write a coefficient that is 1 (unless it's the
constant term) and the terms with 0 as a coefficient drop out. We are really multiplying x² + x +
1 and x + 1. This product is x³ + 2x² + 2x + 1. But in our binary world we reduce the coefficients
modulo 2, so that they can only be 0 or 1. Our product becomes x³ + 1. Now there is one last
step to make. We need to divide by an irreducible polynomial (i.e., one that cannot be factored in
our system where the coefficients must be 0 or 1) and take the remainder as our answer. Because
we're looking at the case n = 3, the irreducible polynomial must be of degree 3. There are only two
such polynomials. The one that will be used here is x³ + x + 1. When we divide our product by
this polynomial (again adjusting coefficients modulo 2), the remainder is x. Converting this back to
a binary representation, we get 010.
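The worked example fits in a few lines of code. Here the polynomials are encoded as integers whose bits are the coefficients (0b1011 is x³ + x + 1):

```python
def gf_mult(a, b, mod_poly=0b1011):
    """Multiply two field elements; bits are polynomial coefficients over GF(2)."""
    product = 0
    while b:                                   # schoolbook multiply; adding is XOR
        if b & 1:
            product ^= a
        a <<= 1
        b >>= 1
    degree = mod_poly.bit_length() - 1
    while product.bit_length() - 1 >= degree:  # long division, keep the remainder
        shift = product.bit_length() - mod_poly.bit_length()
        product ^= mod_poly << shift
    return product

print(bin(gf_mult(0b111, 0b011)))              # 0b10, matching the answer above
```

Running the multiply stage on 111 and 011 produces 0b1001 (x³ + 1), and the reduction stage XORs off 0b1011 to leave 0b10 (x), exactly as in the hand calculation.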
For OCB, n = 128, so the bits that make up Lᵢ₋₁ become the coefficients of a polynomial
with as many as 128 terms (i.e., a maximum degree of 127, if the leading coefficient is 1). In Step
3, this value is multiplied by 2, which corresponds to the binary number 10 (ignoring the 126
leading 0s) and the polynomial x. The irreducible polynomial that the product is then divided
by in OCB is m(x) = x¹²⁸ + x⁷ + x² + x + 1. The remainder (which is what we are interested in) is
converted back to bits (by taking the coefficients) and is our final answer. This weird way of
multiplying strings of bits will be seen again when we get to the Advanced Encryption Standard
in Chapter 20.
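Because the multiplier is always x, the repeated doubling never needs full polynomial arithmetic: shift left one bit, and if a 1 was carried out of position 127, XOR in the low terms of m(x). A sketch (the starting value of L below is made up; in OCB it comes from enciphering a block of zeros):

```python
R_LOW = 0x87                     # x^7 + x^2 + x + 1, the low terms of m(x)
MASK128 = (1 << 128) - 1

def gf128_double(a):
    carry = a >> 127             # leading coefficient: will x^128 appear?
    return ((a << 1) & MASK128) ^ (R_LOW if carry else 0)

L = 0x0123456789ABCDEF0123456789ABCDEF   # made-up stand-in for the real L
seq = [L]
for _ in range(3):               # L1 = L*2, L2 = L1*2, ...
    seq.append(gf128_double(seq[-1]))
```

When the shift produces an x¹²⁸ term, replacing it by x⁷ + x² + x + 1 is exactly the "divide by m(x) and keep the remainder" step, done in a single XOR.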
Step 4: Z₁ = L ⊕ R
This step generates the first block of key to be XORed with the first message block.
Step 5: Zᵢ = Zᵢ₋₁ ⊕ L_ntz(i)

This last step introduces a new function, ntz(i), which gives the number of trailing 0s in the binary
representation of i. If i is read from right to left, this is the number of 0s that are encountered
before coming to a 1. For odd values of i, we have ntz(i) = 0. The result of ntz(i) gives us the subscript
for L. That is, it tells us which term in the L sequence is to be XORed with Zᵢ₋₁ to give us
Zᵢ. This step must be performed repeatedly, until enough key is generated to encipher the entire
message.
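ntz is one loop (Python's `(i & -i).bit_length() - 1` is an equivalent one-liner):

```python
def ntz(i):
    """Number of trailing zero bits in i (i must be a positive integer)."""
    count = 0
    while i & 1 == 0:            # strip trailing zero bits
        i >>= 1
        count += 1
    return count

print([ntz(i) for i in range(1, 9)])   # [0, 1, 0, 2, 0, 1, 0, 3]
```

So every odd block index reuses L₀, every second block reaches L₁, every fourth reaches L₂, and so on, which is what makes the Z sequence cheap to maintain.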
Notice that none of the five steps detailed above require any knowledge of the message that
is to be enciphered. These calculations can be carried out in advance. Once the sequence Zi is
known, the message blocks can be enciphered. It should also be noted that if identical messages are
enciphered with the same encryption key, but different nonces are used to generate the sequence Zi,
then the final results will be completely different. The nonce needs to be communicated to the
recipient, who is only assumed to know the key used by the encryption algorithm E.
There are a few more aspects of OCB that need to be detailed. The first concerns enciphering
the last message block, which may be shorter than the rest. A new value must be calculated like so:
X = len(Mₘ) ⊕ (L · 2⁻¹) ⊕ Zₘ

where len(Mₘ) is the length of the last message block expressed as an n-bit number.
The weird polynomial multiplication plays a role in this step. This time, instead of multiplying
by 2 = 10 = x, we are multiplying by 2⁻¹, which is x⁻¹, the multiplicative inverse of x modulo
m(x) = x¹²⁸ + x⁷ + x² + x + 1. That is, we must multiply by the polynomial that, when multiplied by x
and divided by m(x) yields a remainder of 1. But how do we find this polynomial? More generally,
how can we find the multiplicative inverse of a polynomial modulo another polynomial? The steps
for the simpler case of finding the multiplicative inverse of a number mod another number (using
the extended Euclidean algorithm) are shown in Section 14.3. To solve this more intimidating
sounding problem, there is no difference other than using polynomials instead of integers. While
it might seem very tedious to carry out the multiplication by whatever x−1 turns out to be, there
are great shortcuts that can be taken and it is not hard at all to code this up on computer (or to
put it on a chip).66
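One such shortcut, sketched here (not necessarily the exact one in the cited appendix): dividing by x is a right shift, except that when the constant term is 1 we first add m(x) so that the bit shifted out is 0. Doubling is included only to check the round trip:

```python
MASK128 = (1 << 128) - 1
HALF_M = (1 << 127) | 0x43      # m(x) shifted right one bit: x^127 + x^6 + x + 1

def gf128_double(a):            # multiply by x
    return ((a << 1) & MASK128) ^ (0x87 if a >> 127 else 0)

def gf128_halve(a):             # multiply by x^(-1)
    if a & 1:                   # constant term present: add m(x), then shift right
        return (a >> 1) ^ HALF_M
    return a >> 1               # no constant term: dividing by x is a plain shift

a = 0x0123456789ABCDEF0123456789ABCDEF
print(gf128_double(gf128_halve(a)) == a)   # True for any field element
```

Adding m(x) doesn't change the value (m(x) ≡ 0 in the field) but clears the constant term, so the right shift is an exact division by x.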
After X is determined, another new value is calculated:
Y = E(X )
Y is likely longer than Mₘ, but the appropriate number of bits from the right side of Y are deleted so
that the result is the same length as Mₘ. That is, some of the least significant bits of Y are removed.
The truncated Y, trunc(Y), is then XORed with Mₘ to get the last ciphertext block. That is:

Cₘ = trunc(Y) ⊕ Mₘ
Thus, the final ciphertext block is the same length as the final message block.
Although the entire message has now been enciphered, a few more calculations are performed
to give this mode a special property. The first is called a checksum and is calculated as
checksum = M₁ ⊕ M₂ ⊕ ⋯ ⊕ Mₘ₋₁ ⊕ Y ⊕ Cₘ0*

Cₘ0* is simply Cₘ with 0s appended to the end (as least significant bits) to make this block the
same length as the others. The process of adding 0s is called padding. In some other systems, padded
bits follow a pattern, such as alternating 1s and 0s.
Next, an authentication tag must be calculated by enciphering the checksum and keeping only the first τ bits.
The number, τ, of most significant bits retained for the tag varies from application to application.
The checksum is an intermediate calculation that is not included in what is sent to the intended
recipient. All that is transmitted is the ciphertext and the tag.
Now, what is the advantage of doing the extra calculations needed to determine the tag? The
tag gives the recipient a way to confirm the authenticity of the message. After the message is
completely deciphered, the recipient can calculate the checksum, and then the tag. If the tag thus
66 The shortcut is detailed in the appendix of Stallings, William, “The Offset Codebook (OCB) Block Cipher
Mode of Operation for Authenticated Encryption,” Cryptologia, Vol. 42, No. 2, March 2018, pp. 135–145.
calculated matches the one he received, then the recipient can conclude that the message is authen-
tic. That is, it came from the person claiming to have sent it, without alteration.
Ciphers don’t typically have this property. For example, if DES is implemented in Electronic
Code Book (ECB) mode, someone who is able to control the communication channel might be
able to rearrange the enciphered blocks so that the deciphered message has a different meaning.
Perhaps the message was an order for the purchase of 1,000 shares of stock X and 50,000 shares
of stock Y. The rearranged blocks might decipher to 1,000 shares of stock Y and 50,000 shares
of stock X. If the same key is used for more than one message, an attacker could insert blocks
from one message into another as additions or replacements. Both sorts of mischief are blocked
by OCB. Changes that could go undetected in ECB will be revealed in OCB, when the tag the
recipient calculates doesn’t match the tag sent with the ciphertext. If the tag does match, then the
message had to come from the person claiming to have sent it. This is what is meant by authen-
ticity. The fact the message could not have been altered without being detected is called message
integrity. Usually, integrity follows as a consequence of authenticity. Of course, what authenticity
really means in this instance is that the message came from the person who has the key. If the key
is stolen, even OCB will be unable to distinguish between authentic and unauthentic messages.
In addition to the great feature of authenticity, OCB is very fast and requires little energy to
implement. Hence, it is popular in devices for which these issues matter. To be precise, what was
detailed here is OCB1. There is also OCB3, which is slightly more efficient and allows the incorpo-
ration of data that must be authenticated, but doesn’t require encryption.67 To answer an obvious
question, there was an OCB2, but it was shown to be insecure in 2018.68
67 Stallings, William, “The Offset Codebook (OCB) Block Cipher Mode of Operation for Authenticated
Encryption,” Cryptologia, Vol. 42, No. 2, March 2018, pp. 135–145, details both OCB1 and OCB3.
68 See Inoue, Akiko and Kazuhiko Minematsu, “Cryptanalysis of OCB2,” https://eprint.iacr.org/2018/1040;
De Meyer, Lauren and Serge Vaudenay, “DES S-box generator,” Cryptologia, Vol. 41, No. 2, March 2017,
pp. 153–171.
Diffie, Whitfield, and Martin E. Hellman, “A Critique of the Proposed Data Encryption Standard,”
Communications of the ACM, Vol. 19, No. 3, March 1976, pp. 164–165.
Electronic Frontier Foundation, Cracking DES: Secrets of Encryption Research, Wiretap Politics & Chip
Design, O’Reilly, Sebastopol, California, 1998. This book is in the public domain. The bulk of it is
source code, intended to be scanned by OCR. The purpose was to get around export laws on software
that did not apply to code in print form. The Electronic Frontier Foundation’s website is http://www.
eff.org/.
Güneysu, Tim, Timo Kasper, Martin Novotný, Christof Paar, and Andy Rupp, “Cryptanalysis with
COPACOBANA,” IEEE Transactions on Computers, Vol. 57, No. 11, November 2008, pp. 1498–1513.
Hellman, Martin, Ralph Merkle, Richard Schroeppel, Lawrence Washington, Whitfield Diffie, Stephen
Pohlig, and Peter Schweitzer, Results of an Initial Attempt to Cryptanalyze the NBS Data Encryption
Standard, Information Systems Laboratory SEL 76-042, Stanford University, September 1976. This is
the paper that found a symmetry that cut the keyspace in half under a chosen plaintext attack.
Hellman, Martin, http://cryptome.org/hellman/hellman-ch1.doc. The beginning of an autobiography can
be found here.
Juels, Ari, moderator, RSA Conference 2011 Keynote – The Cryptographers’ Panel, video available online at
http://www.youtube.com/watch?v=0NlZpyk3PKI. The panel consisted of Whitfield Diffie, Martin
Hellman, Ron Rivest, Adi Shamir, and Dickie George. George was involved with DES from the NSA
side, as a technical director. The conversation is wide-ranging, but much of it concerns DES.
Kahn, David, The Codebreakers, second edition, Scribner, New York, 1996, p. 980. Do not consult the first
edition for information on DES. This cipher didn’t exist when the first edition was published.
Katzman, Jr., Harry, The Standard Data Encryption Algorithm, Petrocelli Books, New York, 1977.
Kinnucan, Paul, “Data Encryption Gurus: Tuchman and Meyer,” Cryptologia, Vol. 2, No. 4, pp. 371–
381, October 1978, reprinted from Mini-Micro Systems, Vol. 2, No. 9, October 1978, pp. 54, 56–58,
60. This paper quotes Walter Tuchman as saying, “The DES algorithm is for all practical purposes
unbreakable.” Feistel is mentioned, but this paper gives nearly all of the credit for DES to Walter
Tuchman and Carl Meyer. Tuchman was also quoted as saying, “The NSA told us we had inadver-
tently reinvented some of the deep secrets it uses to make its own algorithms.”
Landau, Susan, “Standing the Test of Time: The Data Encryption Standard,” Notices of the AMS, Vol.
47, No. 3, March 2000, pp. 341–349, available online at http://www.ams.org/notices/200003/fea-
landau.pdf.
Levy, Steven, Crypto: How the Code Rebels Beat the Government, Saving Privacy in the Digital Age, Viking,
New York, 2001.
Matsui, Mitsuru, “Linear Cryptanalysis Method for DES Cipher,” in Helleseth, Tor, editor, Advances in
Cryptology – EUROCRYPT ‘93 Proceedings, Lecture Notes in Computer Science, Vol. 765, Springer,
Berlin, Germany, 1994, pp. 386–397.
Merkle, Ralph and Martin Hellman, “On the Security of Multiple Encryption,” Communications of the
ACM, Vol. 24, No. 7, July 1981, pp. 465–467.
Morris, Robert, Neil J. A. Sloane, and Aaron D. Wyner, “Assessment of the National Bureau of Standards
Proposed Federal Data Encryption Standard,” Cryptologia, Vol. 1, No. 3, July 1977, pp. 281–291. The
authors attended the second of two workshops held by NBS on September 21–22, 1976 to evaluate
the proposed Data Encryption Standard. This paper presents their conclusions. It also provides many
references on the reports, papers, and patents leading up to DES.
Simovits, Mikael J., The DES: an Extensive Documentation and Evaluation, Aegean Park Press, Laguna
Hills, California, 1996.
Solomon, Richard J., “The Encryption Controversy,” Mini-Micro Systems, Vol. 2, No. 2, February 1978,
pp. 22–26.
U.S. Department of Commerce, Data Encryption Standard, FIPS Pub. 46–3, National Institute of Standards
and Technology, Washington, DC, 1999, available online at http://csrc.nist.gov/publications/fips/
fips46-3/fips46-3.pdf. This is the final version of the government document detailing the Data
Encryption Standard.
The Data Encryption Standard ◾ 411
van Oorschot, Paul C. and Michael J. Wiener, “A Known-plaintext Attack on Two-key Triple Encryption,”
in Damgård, Ivan B., editor, Advances in Cryptology – EUROCRYPT ‘90 Proceedings, Lecture Notes
in Computer Science, Vol. 473, Springer, Berlin, Germany, 1991, pp. 318–325.
On Modes of Encryption
Davies, Donald W. and Graeme I. P. Parkin, “The Average Cycle Size of the Key Stream in Output
Feedback Encipherment,” [abstract] in Chaum, David, Ronald L. Rivest, and Alan T. Sherman, edi-
tors, Advances in Cryptology, Proceedings of Crypto 82, Plenum Press, New York, 1983, pp. 97–98.
de Vigenère, Blaise, Traicté des Chiffres, ou, Secretes Manieres D’escrire, Abel l’Angelier, Paris, 1586.
Although Cardano had previously hacked at the problem, this is where the first working autokeys
were presented.
Diffie, Whitfield and Martin Hellman, “Privacy and Authentication: An Introduction to Cryptography,”
Proceedings of the IEEE, Vol. 67, No. 3, March 1979, pp. 397–427.
Ehrsam, William F., Carl H. W. Meyer, John L. Smith, and Walter L. Tuchman, Message Verification and
Transmission Error Detection by Block Chaining, U.S. Patent 4,074,066, February 14, 1978, https://
patents.google.com/patent/US4074066A/en.
Hwang, Tzonelih and Prosanta Gope, “RT-OCFB: Real-Time Based Optimized Cipher Feedback Mode,”
Cryptologia, Vol. 40, No. 1, January 2016, pp. 1–14.
Hwang, Tzonelih and Prosanta Gope, “PFC-CTR, PFC-OCB: Efficient Stream Cipher Modes of
Authencryption,” Cryptologia, Vol. 40, No. 3, 2016, pp. 285–302. The authors argue that “both
of the proposed stream cipher modes of Authencryption [in the title of the paper] are quite robust
against several active attacks (e.g., message stream modification attacks, known-plain-text attacks,
and chosen-plain-text attacks) [… and] can efficiently deal with other issues like “limited error propa-
gation,” and so on, existing in several conventional stream cipher modes of operation like CFB, OFB,
and CTR.”
Levine, Jack, “Variable Matrix Substitution in Algebraic Cryptography,” American Mathematical Monthly,
Vol. 65, No. 3, March 1958, pp. 170–179. Levine, being familiar with classical cryptology, applied
autokeys to matrix encryption in this paper.
Rogaway, Phillip, OCB: Documentation, https://www.cs.ucdavis.edu/∼rogaway/ocb/ocb-doc.htm. This
website provides a list of papers by Phillip Rogaway on Offset Codebook mode.
Stallings, William, “NIST Block Cipher Modes of Operation for Confidentiality,” Cryptologia, Vol. 34,
No. 2, April 2010, pp. 163–175.
Stallings, William, “NIST Block Cipher Modes of Operation for Authentication and Combined
Confidentiality and Authentication,” Cryptologia, Vol. 34, No. 3, July 2010, pp. 225–235.
Stallings, William, “The offset codebook (OCB) block cipher mode of operation for authenticated encryp-
tion,” Cryptologia, Vol. 42, No. 2, March 2018, pp. 135–145.
Chapter 14

The Birth of Public Key Cryptography
A major problem with all of the methods of encipherment examined thus far is that the sender
and receiver must agree on a key prior to the creation and delivery of the message. This is often
inconvenient or impossible. There were some failed attempts to overcome this problem before an
elegant solution was found. One is detailed below, before examining current solutions.
1 Weber, Ralph E., “James Lovell and Secret Ciphers During the American Revolution,” Cryptologia, Vol. 2, No.
1, January 1978, pp. 75–88.
414 ◾ Secret History
Notice that Lovell included & as one of the characters in his alphabet.
The first letter of our message is I, so we look for I in the column headed by W (our first
alphabet). It is found in position 14, so our ciphertext begins with 14. Our next plaintext letter
is H. We look for H in the column headed by I (our second alphabet) and find it in position 27.
Thus, 27 is our next ciphertext number. Then we come to plaintext letter A. Looking at our third
alphabet, we find A in position 15. So far, our ciphertext is 14 27 15. We’ve now used all three
of our alphabets, so for the fourth plaintext letter we start over with the first alphabet, in the same
manner as the alphabets repeat in the Vigenère cipher. The complete ciphertext is 14 27 15 27
24 1 20 12 12 10 12 16 10 26 8 19 12 2 11 1 21 13 12.
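Lovell's lookup amounts to a Vigenère-style cipher over a 27-character cycle (A–Z plus &) that outputs positions instead of letters. The sketch below reproduces the worked example; the third key letter (N) and the plaintext are not stated outright in this passage, but both are consistent with the ciphertext above.

```python
# Lovell's system modeled as a Vigenère variant over the 27-character
# alphabet A-Z plus &, outputting 1-based positions instead of letters.
# The key letters W, I, N and the plaintext are assumptions consistent
# with the worked example; Lovell's actual tables appear in a figure.
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ&"

def encipher(plaintext, key):
    """Position of each plaintext letter in the cyclic alphabet headed by
    the corresponding key letter, with the key repeating as in Vigenère."""
    out = []
    for i, ch in enumerate(plaintext):
        k = key[i % len(key)]
        out.append((ALPHABET.index(ch) - ALPHABET.index(k)) % 27 + 1)
    return out

def decipher(numbers, key):
    out = []
    for i, num in enumerate(numbers):
        k = key[i % len(key)]
        out.append(ALPHABET[(ALPHABET.index(k) + num - 1) % 27])
    return "".join(out)

ciphertext = encipher("IHAVENOTYETBEGUNTOFIGHT", "WIN")
print(ciphertext)                    # begins 14, 27, 15, as in the text
print(decipher(ciphertext, "WIN"))
```

Deciphering the full 23-number ciphertext above under these assumptions yields John Paul Jones's famous line, which supports the reconstruction.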
Lovell explained this system to John Adams and Ben Franklin and attempted to communicate
with them using it. He avoided having to agree on keys ahead of time by prefacing his ciphertexts
with clues to the key, such as “You begin your Alphabets by the first 3 letters of the name of that
family in Charleston, whose Nephew rode in Company with you from this City to Boston.”2
Thus, for every message Lovell sent using this scheme, if a key hadn’t been agreed on ahead of
time, he had to think of some bit of knowledge that he and the recipient shared that could be
hinted at without allowing an interceptor to determine the answer. This may have typically taken
longer than enciphering! Also, Lovell’s cipher seems to have been too complicated for Adams and
Franklin even without the problem of key recovery, as both failed to read messages Lovell sent.
Abigail Adams was even moved to write Lovell in 1780, “I hate a cipher of any kind.”3
Figure 14.2 Martin Hellman (1945–). (Photograph supplied by Martin Hellman, who dates it as
1976 plus or minus a couple of years.)
a quarter century by Hellman, against war and the threat of nuclear annihilation. He traced his
interest in cryptography to three main sources, one of which was the appearance of David Kahn’s
The Codebreakers.8 As this book was also an influence on Diffie (and many others!), its importance
in the field is hard to overestimate. The scheme developed by Diffie and Hellman works as follows.
Alice and Bob must first agree on a prime number (p) and a generator (g) of the multiplica-
tive group of units modulo p. If you haven’t yet had an abstract algebra course, the concept of a
generator is probably new to you. It is simply a number that, when raised to consecutive powers
(mod p), results in the entire group being generated. That’s why it’s called a generator and often
denoted by the letter g. If someone is eavesdropping on the line and obtains the values p and g,
that’s okay! After Alice and Bob agree on these numbers, each person (except the eavesdropper!)
selects another number. Let’s say Alice chooses x and Bob chooses y. They keep these values secret.
Alice calculates g^x (mod p) and Bob calculates g^y (mod p). They then exchange these new values.
Alice would have great difficulty calculating y from g^y. Similarly, Bob is unlikely to be able to
determine Alice's x. However, both can form g^xy. Alice simply raises the number Bob sent her to
the power of x and Bob raises the number Alice sent him to the power of y. The eavesdropper can-
not do this as she knows neither x nor y. She may have intercepted g, p, g^x, and g^y, but this won't
help her to find g^xy. Thus, Alice and Bob now share a secret number. This will serve as their key for
future communication, using whatever system they choose.
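A toy run of the exchange takes only a few lines of code. The prime and generator here are for illustration; real deployments use primes thousands of bits long.

```python
# Toy Diffie-Hellman key exchange. The parameters p = 23, g = 5 are
# illustrative only (5 generates the multiplicative group mod 23).
import secrets

p = 23                       # public prime
g = 5                        # public generator

x = secrets.randbelow(p - 2) + 1    # Alice's secret exponent
y = secrets.randbelow(p - 2) + 1    # Bob's secret exponent

A = pow(g, x, p)             # Alice sends g^x (mod p)
B = pow(g, y, p)             # Bob sends g^y (mod p)

alice_key = pow(B, x, p)     # Alice computes (g^y)^x = g^(xy)
bob_key = pow(A, y, p)       # Bob computes (g^x)^y = g^(xy)
assert alice_key == bob_key  # the shared secret
print(alice_key)
```

An eavesdropper sees p, g, A, and B, but recovering x or y from them is the discrete log problem.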
When g, p, and g^x (mod p) are known, but not x, determining x is known as the discrete log
problem. The security of Diffie–Hellman key exchange sounds like it should be equivalent to this
problem, but this has yet to be proven. To recap, the two problems are:
1. Discrete Log: Given g, p, and g^x (mod p), find x.
2. Diffie–Hellman: Given g, p, g^x (mod p), and g^y (mod p), find g^xy (mod p).
It is possible that someone may find a way to defeat the key exchange (problem 2) without solving
the discrete log problem (problem 1). This is a very big open problem. More will be said about the
difficulty of the discrete log problem in Section 16.8, which follows a discussion of complexity
theory. In any case, we have a lovely system built on nothing more complicated than the laws of
exponents and the idea of remainder arithmetic!
Malcolm Williamson discovered Diffie–Hellman key exchange before the individuals it is
named after, but he was employed by GCHQ at the time, and his work was classified. It was only
released in 1997.
Implemented as described above, the key exchange requires several messages be sent. If we are
willing to do this, even some of the cryptosystems discussed earlier in this text may be used to
securely send a message to someone with whom a key exchange has not taken place. An analogy
will make the process clear, after which it will be described more mathematically.
Alice can send Bob a message she places in a box and secures with a padlock. Bob doesn’t try to
open it, because he doesn’t have the key. Instead, he simply adds his own padlock and mails it back
to Alice. Anyone wishing to see the message at this stage must be able to remove both padlocks.
Alice can only remove her own, which she does. She then mails it back to Bob, who removes his
padlock and reads the message.
Now for the mathematical version of the physical process described above:
• Let E_A, E_B, D_A, and D_B represent the enciphering and deciphering algorithms (along with
the keys A and B) used by Alice and Bob.
• Alice sends Bob E_A(M).
• Bob sends Alice E_B(E_A(M)).
• Alice sends Bob D_A(E_B(E_A(M))).
Note: If D_A and E_B commute, D_A and E_A will then cancel out and Alice's final message will
amount to E_B(M).
• Bob then applies D_B to Alice's last message and reads the plaintext.
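One concrete way to get commuting encipherment is exponentiation modulo a prime: enciphering raises the message to a power e, deciphering raises it to the inverse power d with ed = 1 (mod p − 1), and any two such operations commute. This is essentially Shamir's three-pass protocol, offered here as an illustrative sketch rather than the particular system the text has in mind.

```python
# The locked-box protocol realized with modular exponentiation, which
# commutes: (m^a)^b = (m^b)^a (mod p). Illustrative toy-scale sketch.
from math import gcd
import secrets

p = 2_147_483_647          # a public prime (2^31 - 1)

def make_keys(p):
    """Pick an enciphering exponent invertible mod p-1, plus its inverse."""
    while True:
        e = secrets.randbelow(p - 3) + 2
        if gcd(e, p - 1) == 1:
            return e, pow(e, -1, p - 1)

ea, da = make_keys(p)      # Alice's padlock and its key
eb, db = make_keys(p)      # Bob's padlock and its key

m = 271_828                        # the message, encoded as a number < p
step1 = pow(m, ea, p)              # Alice -> Bob:  E_A(M)
step2 = pow(step1, eb, p)          # Bob -> Alice:  E_B(E_A(M))
step3 = pow(step2, da, p)          # Alice -> Bob:  D_A(E_B(E_A(M))) = E_B(M)
assert pow(step3, db, p) == m      # Bob removes his padlock and reads M
```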
The Birth of Public Key Cryptography ◾ 417
Figure 14.3 Adi Shamir (1952–), Ron Rivest (1947–), and Len Adleman (1945–). Between the necks
of Rivest and Adleman, the chalkboard shows ∴P = NP. (Courtesy of Len Adleman, https://web.
archive.org/web/20160305203545/http://www.usc.edu/dept/molecular-science/RSApics.htm.)
Three MIT professors, Ron Rivest, Adi Shamir, and Len Adleman (Figure 14.3), developed the
(asymmetric) public key cryptosystem now known as RSA (after their last initials). It doesn’t
require any messages to be sent prior to the intended ciphertext. With RSA the enciphering key
9 Levy, Steven, Crypto: How the Code Rebels Beat the Government, Saving Privacy in the Digital Age, Viking, New
York, 2001, p. 88. Levy points out that the title of the paper brings to mind the “mind-blowing paperbacks of
the New Directions publishing house,” which issued Waiting for Godot, Siddhartha, and In the American Grain.
10 Diffie, Whitfield and Hellman, Martin, New Directions in Cryptography, IEEE Transactions on Information
Theory, Vol. IT-22, No. 6, November 1976, pp. 644–654, p. 644 cited here.
11 Knapsack systems are discussed in Chapter 16.
for any recipient can be made public and his deciphering key cannot be obtained from it without
extremely time consuming calculations.
The creators of RSA did not have immediate success in their attack on the problem. In fact,
they were able to break their first 32 attempts themselves.12 Finally, three years after beginning, a
workable scheme was found. There may have been an earlier breakthrough, as the following anec-
dote from a ski trip the researchers went on suggests.
On only the second day that the Israeli [Shamir] had ever been on skis, he felt he’d
cracked the problem. “I was going downhill and all of a sudden I had the most remark-
able new scheme,” he later recalled. “I was so excited that I left my skis behind as I
went downhill. Then I left my pole. And suddenly… I couldn’t remember what the
scheme was.” To this day he does not know if a brilliant, still-undiscovered cryptosys-
tem was abandoned at Killington.13
The details of RSA will soon be laid out, but first, we must review some (old!) results from number
theory.
Fermat's little theorem states that if p is a prime and (a, p) = 1, then a^(p−1) = 1 (mod p). Fermat
stated the theorem in a different form and did not offer a proof. It is presented here in a manner
that will prove useful, but it first needs to be generalized a bit. This was done by Leonhard Euler
(with proof!) in 1760.14 Euler's generalization may be stated tersely as

(m, n) = 1 ⇒ m^φ(n) = 1 (mod n)
The notation (m, n) = 1 means that m and n are relatively prime; that is, their greatest com-
mon divisor is 1. It is sometimes written using the more descriptive notation gcd(m, n) = 1. The
exponent φ (n) is called the “Euler φ function” with φ pronounced as fee rather than figh. It’s also
sometimes referred to as Euler’s totient function.15 However you read it, φ (n) is defined to be the
number of positive integers less than n and relatively prime to n. For example, φ (8) = 4, because 1,
3, 5, and 7 are the only positive integers less than 8 that have no positive factors in common with
8 other than 1. It is easy to see that φ (p) = p − 1 if p is a prime. Hence, Fermat’s little theorem is
just a special case of Euler’s theorem.
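Both theorems are easy to check numerically. The brute-force totient below is fine for small n; it is not how large moduli are handled in practice, where φ(n) = (p − 1)(q − 1) is used instead.

```python
# Numerically checking Euler's theorem: (m, n) = 1  =>  m^phi(n) = 1 (mod n).
from math import gcd

def phi(n):
    """Euler's totient: how many of 1, ..., n-1 are relatively prime to n."""
    return sum(1 for k in range(1, n) if gcd(k, n) == 1)

assert phi(8) == 4                   # 1, 3, 5, and 7, as in the text
for m in (1, 3, 5, 7):
    assert pow(m, phi(8), 8) == 1    # Euler's theorem with n = 8
assert phi(11) == 10                 # phi(p) = p - 1 for a prime p
assert pow(2, 10, 11) == 1           # Fermat's little theorem
```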
12 Levy, Steven, Crypto: How the Code Rebels Beat the Government, Saving Privacy in the Digital Age, Viking, New
York, 2001, p. 97.
13 Levy, Steven, Crypto: How the Code Rebels Beat the Government, Saving Privacy in the Digital Age, Viking, New
York, 2001, p. 96. Was this what prompted Hughes-Hallett et al. to put a skier on the cover of their calculus
book? An attempt to jog Shamir’s memory? They even used the first person perspective as if attempting to
recreate Shamir’s field of vision at the moment that the great idea was passing through his mind!
14 Sandifer, C. Edward, The Early Mathematics of Leonhard Euler, Mathematical Association of America,
Washington, DC, 2007, p. 203. Euler’s phi function was used at this time, but not given a name until three
years later (see the next footnote). Carl Friedrich Gauss introduced the “mod n” notation, the congruence
symbol, and φ in 1801.
15 Totient is from Latin and means “to count.” Euler named this function in Euler, Leonhard, “Theoremata
Arithmetica Nova Methodo Demonstrata,” Novi Commentarii academiae scientiarum Petropolitanae, Vol. 8,
1763, pp. 74–104, reprinted in Opera Omnia, Series 1, Vol. 2, B. G. Teubner, Leipzig, 1915, pp. 531–555.
Proof
Observing that the multiplicative group modulo n has order φ(n), and recalling that the order of a
group element must divide the order of the group, we see that m^φ(n) = 1 (mod n). The requirement
that (m, n) = 1 is necessary to guarantee that m is invertible modulo n or, in other words, to ensure
that m is an element of the group of units modulo n.
Although Fermat and Euler had not realized it, their work would find application in cryptog-
raphy, making RSA possible. The eventual applications of once pure mathematics are typically
impossible to predict. To see how RSA works, we first multiply both sides of Euler’s equation by
m to get:
m^(φ(n)+1) = m (mod n).
Any message can easily be converted to blocks of numbers. We could, for example, replace each
letter with its numerical value A = 01, B = 02, …, Z = 26, where the leading zeros eliminate
ambiguity when the values are run together. Or we may replace each character with its ASCII
representation in bits for a base 2 number (which could be converted to base 10). So, let m denote
a block of text in some numerical form. Choose a positive integer e such that (e, φ (n)) = 1. We can
then compute d, the multiplicative inverse of e (mod φ (n)). That is, the product ed will be one more
than a multiple of φ (n). The result is m^(ed) = m (mod n).
If I want to be able to receive an enciphered message from someone, we don't have to secretly
meet to exchange keys. I simply publicly reveal my values for e and n. Anyone who wants to send
me a message m can compute and send me m^e (mod n). All I have to do is raise this value to the d
power and mod out again: (m^e)^d = m^(ed) = m (mod n). I get the original message back. The idea is
that I can tell everyone e and n, allowing them to send messages, but it will be very hard for them
to compute d; hence, only I can read the messages.
We now take a look at a toy example. The modulus is not large enough to offer any security,
but it does help illustrate the key steps.
The first thing Alice must do, if she is to receive RSA enciphered messages, is generate the keys.
She starts by selecting two primes, p = 937 and q = 1,069. Thus, her modulus is n = pq = 1,001,653.
Next, Alice needs to come up with an enciphering exponent e and a deciphering exponent d. Not
just any value will do for e. If e fails to have an inverse modulo φ (n), she will not be able to decipher
the messages she receives uniquely! For e to be invertible modulo φ (n), it must be relatively prime
to φ (n). Because n = pq, where p and q are distinct primes, we have φ (n) = (p − 1)(q − 1) = (936)
(1,068) = 999,648.16 Alice tries e = 125. To be sure that 125 and 999,648 have no common divisors
greater than 1, we may apply the Euclidean algorithm. This is a simple procedure used to calculate
gcd values. It does, in fact, go all the way back to Euclid (c. 371–285 BCE).17
999,648 = 125(7,997) + 23
16 Proving φ(n) = (p − 1)(q − 1), when n is the product of two distinct primes, p and q, is left as an exercise (with
some hints). See the online exercises. All exercises are available online only.
17 Dates are according to Wikipedia; however, http://www.gap-system.org/∼history/Biographies/Euclid.html
gives (c. 325–265 BCE), but then, both are indicated to be estimates!
We then repeat the procedure using 125 and 23 in place of 999,648 and 125:
125 = 23(5) + 10
Repeating the process gives:
23 = 10(2) + 3
And again:
10 = 3(3) + 1
One more iteration yields a remainder of 0:
3 = 1(3) + 0
This algorithm may require more or fewer iterations for another pair of numbers, but no matter
what values are investigated the algorithm always ends with a remainder of 0 and the last nonzero
remainder is the gcd. That we obtained 1 shows the two numbers are relatively prime, so 125 was
a good choice for the enciphering key. It is invertible modulo 999,648.
Now that Alice has her enciphering exponent (part of her public key), how can she compute her
deciphering exponent (her private key)?
Rewriting the equations we obtained from the Euclidean algorithm allows us to express the
gcd as a linear combination of the two numbers that yielded it. We rewrite the equations as:
1 = 10 − (3)(3) (14.1)
3 = 23 − (10)(2) (14.2)
10 = 125 − (23)(5) (14.3)
23 = 999,648 − (125)(7,997) (14.4)
Substituting for 3 in Equation 14.1 using Equation 14.2 gives:
1 = 10(7) − 23(3)
Next we substitute for 10 using Equation 14.3 and collect multiples of 125 and 23:
1 = 125(7) − 23(38)
Finally we substitute for 23 using Equation 14.4 and collect multiples of 125 and 999,648:
1 = 125(303,893) − 999,648(38)

Reducing this equation modulo 999,648 shows that 125(303,893) = 1 (mod 999,648), so Alice's
deciphering exponent is d = 303,893.
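This back-substitution is exactly what the extended Euclidean algorithm automates. A short sketch, applied to Alice's numbers:

```python
# The extended Euclidean algorithm: returns gcd(a, b) together with
# coefficients s and t satisfying s*a + t*b = gcd(a, b).
def extended_gcd(a, b):
    if b == 0:
        return a, 1, 0
    g, s, t = extended_gcd(b, a % b)
    return g, t, s - (a // b) * t

e, phi_n = 125, 999_648
g, s, t = extended_gcd(e, phi_n)
assert g == 1                 # relatively prime, so e is invertible
d = s % phi_n                 # normalize to a positive representative
assert d == 303_893           # Alice's deciphering exponent
assert (e * d) % phi_n == 1
```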
Nuclear war is not war. It is suicide and genocide rolled into one.
Ignoring case, he converts the characters to numbers using the following scheme.
A B C D E F G H I J K L M N O P Q R S T U V W X Y Z
01 02 03 04 05 06 07 08 09 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26
He gets
14210312050118230118091914152023011809200919192109030904050114
04070514150309040518151212050409142015151405
Because the modulus is only a little over a million, the text must be split into pieces less than a
million; otherwise the encryption could fail to be one-to-one. That is, a given ciphertext block
could have more than one possible decipherment. Groups of six numbers will do; thus, we have
142103 120501 182301 180919 141520 230118 092009 191921 090309 040501 140407 051415
030904 051815 121205 040914 201515 1405
The fact that the last number is only four digits does not matter, nor do leading zeros in the sev-
enth and other number groups.
To get the final ciphertext, Bob simply raises each number to the enciphering exponent, 125,
and reduces the answer modulo 1,001,653. He gets
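The computation is easy to reproduce; the script below rebuilds the six-digit blocks from the digit string above, enciphers each with Bob's exponentiation, and checks that Alice's exponent recovers them.

```python
# Recomputing Bob's ciphertext. The digit string is the A = 01, ..., Z = 26
# encoding above, split into six-digit blocks; each block is raised to the
# 125th power modulo 1,001,653.
digits = ("14210312050118230118091914152023011809200919192109030904050114"
          "04070514150309040518151212050409142015151405")
blocks = [int(digits[i:i + 6]) for i in range(0, len(digits), 6)]

n, e = 1_001_653, 125
ciphertext = [pow(m, e, n) for m in blocks]
print(ciphertext)

# Alice's deciphering exponent inverts e modulo phi(n) = 936 * 1068.
d = pow(e, -1, 936 * 1068)
assert [pow(c, d, n) for c in ciphertext] == blocks
```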
It should be noted that it is not guaranteed that all ciphertext blocks will be nice six-digit
numbers (after adding leading zeros when necessary). Modding out by 1,001,653 could yield
a remainder larger than 999,999, which would then require seven digits. If the ciphertext were
run together with the occasional block of length 7, decipherment would be ambiguous. Alice
wouldn’t know where to insert the breaks (i.e., when to take six digits and when to take seven).
The problem could be resolved by writing all ciphertext numbers using seven digits. For the
example above (because we never got a number in excess of 999,999), all of the ciphertext blocks
would begin with a leading zero.
18 The plaintext was taken from The Nazi Within, an essay by Martin E. Hellman, which can be found online at
http://www-ee.stanford.edu/∼hellman/opinion/bitburg.html
If one can factor the modulus, RSA encryption is easy to break, but the security of RSA is
not known to be equivalent to factoring. There may be a way to break RSA by some other means
that would not reveal the factorization of n. This is an important open problem. So, as with
Diffie–Hellman key exchange, security is based on, if not equivalent to, the inability of an attacker
to solve a mathematical problem (factoring for RSA and the discrete log problem for Diffie–
Hellman). Other public key systems are built on other supposedly hard problems. There will be
more on this in later chapters. It’s possible that someone could come up with a quick solution for
one (or all) of these problems tomorrow or that someone already has and we just don’t know it!
The RSA algorithm has an important practical drawback—it is slow. We’ll take a quick look
at a good way to compute powers, but it is still tedious compared to extremely fast operations
like XORing used in other encryption algorithms. Thus, RSA is typically used only to encipher
a key for some symmetric system (like DES). The (enciphered) key is then sent along with a mes-
sage enciphered with that key in the symmetric system. This will be discussed in greater detail in
Chapter 18. Because RSA is just applied to the relatively short (compared to most messages) key,
the delay is tolerable!
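The hybrid approach fits in a few lines. In this sketch the toy RSA numbers from this chapter stand in for a real key pair (d = 303,893 is the inverse of 125 modulo 999,648), and a simple XOR keystream stands in for DES; both are illustrative stand-ins, not the real systems.

```python
# Hybrid encryption in miniature: slow RSA protects only a short session
# key; the message itself is enciphered with a fast symmetric operation.
import secrets

n, e, d = 1_001_653, 125, 303_893   # toy RSA parameters from this chapter

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy symmetric cipher: XOR with a repeating key (stand-in for DES)."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

session_key = secrets.randbelow(n)        # random symmetric key, < n
wrapped_key = pow(session_key, e, n)      # RSA applied only to the key

message = b"Meet at noon."
ciphertext = xor_cipher(message, session_key.to_bytes(3, "big"))

# The recipient unwraps the session key, then deciphers quickly.
recovered_key = pow(wrapped_key, d, n)
assert xor_cipher(ciphertext, recovered_key.to_bytes(3, "big")) == message
```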
Even if we are only enciphering a tiny amount of text with RSA, we’d still like to do it as effi-
ciently as possible. The technique of repeated squaring, demonstrated below, is a nice way to carry
out the exponentiation. As an example, consider 385^1,563 (mod 391). Obviously this is too small a
modulus, but the same technique applies for other values, and smaller numbers make for clearer
explanations. Naturally, we do not wish to repeatedly multiply by 385, modding out as we go, 1,563
times. Instead, we calculate the following squares:

385^2 = 36 (mod 391)
385^4 = 123 (mod 391)
385^8 = 271 (mod 391)
385^16 = 324 (mod 391)
385^32 = 188 (mod 391)
385^64 = 154 (mod 391)
385^128 = 256 (mod 391)
385^256 = 239 (mod 391)
385^512 = 35 (mod 391)
385^1024 = 52 (mod 391)

We may stop here. We only need to go up to the power of 2 closest to the desired power (1,563 in
this example) without exceeding it.
Now, to compute 385^1,563, we observe that 1,563 = 1024 + 512 + 16 + 8 + 2 + 1, so we may write:

385^1,563 = (385^1024)(385^512)(385^16)(385^8)(385^2)(385) = (52)(35)(324)(271)(36)(385) = 63 (mod 391)
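In code, the same square-and-multiply idea processes the exponent's bits from least significant to most, squaring as it goes; a sketch:

```python
# Square-and-multiply: the repeated-squaring technique described above.
def power_mod(base, exp, mod):
    result = 1
    square = base % mod          # base^1, then base^2, base^4, ...
    while exp:
        if exp & 1:              # include this power of two in the product
            result = (result * square) % mod
        square = (square * square) % mod
        exp >>= 1
    return result

assert power_mod(385, 1563, 391) == pow(385, 1563, 391)
print(power_mod(385, 1563, 391))  # 63
```

This needs only about log2(exp) squarings and multiplications, which is what makes modular exponentiation with enormous exponents feasible at all.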
19 Gardner, Martin, “Mathematical Games, A New Kind of Cipher that Would Take Millions of Years to Break,”
Scientific American, Vol. 237, No. 2, August 1977, pp. 120–124.
20 Deavours, Cipher, “A Special Status Report The Ithaca Connection: Computer Cryptography in the Making,”
Cryptologia, Vol. 1, No. 4, October 1977, pp. 312–317, p. 312 cited here.
21 Levy, Steven, Crypto: How the Code Rebels Beat the Government, Saving Privacy in the Digital Age, Viking, New
York, 2001.
22 Simmons, Gustavus J. and Michael J. Norris, “Preliminary Comments on the M.I.T. Public-Key Cryptosystem,”
Cryptologia, Vol. 1, No. 4, October 1977, pp. 406–414. The improvement consisted of showing how poor
choices for the key can allow message recovery without factoring. Over the years a number of special cases that
need to be avoided have been recognized. These are detailed in Section 15.1.
with nuclear weapons! Indeed, this was made explicit in the letter: “[A]tomic weapons and cryptol-
ogy are also covered by special secrecy laws.”23
The author’s provocation appears to have been a symposium on cryptology that the IEEE had
slated for October 10, 1977, at a conference at Cornell University in Ithaca, New York. Hellman,
Rivest, and others were scheduled as speakers. The letter was reported on in Science, The New York
Times, and elsewhere, generating a great deal of publicity for the conference.24
Basically the claim was that such publications and lectures should first be cleared by the
National Security Agency (NSA). If this sort of voluntary censorship could be achieved, the con-
stitutionality of the matter need not arise. The full letter is provided in Figure 14.4.25
It was eventually revealed by an NSA spokesman that the letter was written (unofficially) by
Joseph A. Meyer (Figure 14.5), an NSA employee.26
This wasn’t the first time Meyer had angered supporters of civil liberties. Paul Dickson’s The
Electronic Battlefield (1976) summarizes a January 1971 article “Crime Deterrent Transponder
System” by Meyer in IEEE Transactions on Aerospace and Electronic Systems:27
The article by Joseph Meyer, an engineer in the employ of the National Security
Agency, recommends a system in which tiny electronic tracking devices (transpon-
ders) are attached to those 20 million Americans who have been in trouble with the
law—in fact, wearing one of the devices would be a condition of parole. The tran-
sponders would be linked by radio to a computer that would monitor the wearer’s
(or “subscriber’s” in the words of Meyer) location and beep out a warning when the
person in question was about to violate his territorial or curfew restrictions. In addi-
tion, these little boxes would be attached to people in such a manner that they could
not be removed without the computer taking note of the act. Taking off or tinkering
with your transponder would, in Meyer’s world, be a felony. Good engineer that he
is, Meyer has also thought out some of the other applications of these portable units,
which include monitoring aliens and political minorities. Robert Barkan, the writer
who first brought the Meyer proposal and other such ideas to a broader audience
through his articles, had this to say about the transponder system in The Guardian,
“‘1984’ is still fiction, but no longer science fiction. The technology of the police state
is ready. All that remains is for the government to implement it.”
Many of the requests generated by Martin Gardner’s column for the RSA paper came from outside
the United States. In light of Meyer’s letter, Rivest remarked, “If I were more of a skeptic, I’d think
I was being set up.”28
23 Levy, Steven, Crypto: How the Code Rebels Beat the Government, Saving Privacy in the Digital Age, Viking, New
York, 2001, p. 109.
24 Shapley, Deborah and Gina Bari Kolata, “Cryptology: Scientists Puzzle Over Threat to Open Research,
Publication,” Science, Vol. 197, No. 4311, September 30, 1977, pp. 1345–1349; Browne, Malcolm W., “Harassment
Alleged over Code Research,” The New York Times, October 19, 1977, available online at https://www.nytimes.
com/1977/10/19/archives/harassment-alleged-over-code-research-computer-scientists-say-us.html.
25 My attempt to find the full letter led to John Young, who runs www.cryptome.org. He was able to obtain it.
26 Shapley, Deborah and Gina Bari Kolata, “Cryptology: Scientists Puzzle Over Threat to Open Research,
Publication,” Science, Vol. 197, No. 4311, September 30, 1977, pp. 1345–1349.
Figure 14.5 Joseph A. Meyer. (Adapted from Meyer, Joseph A., “Crime deterrent transponder
system,” IEEE Transactions on Aerospace and Electronic Systems, Vol. AES-7, No. 1, January
1971, pp. 2–22, p. 22 cited here. With permission from IEEE.)
On the advice of Stanford’s general counsel, I even presented two papers at a 1977
symposium at Cornell University, instead of my usual practice of having the student
co-authors do the presentations. The attorney told me that if the ITAR were inter-
preted broadly enough to include our papers, he believed they were unconstitutional.
But a court case could drag on for years, severely hindering a new Ph.D.’s career (espe-
cially if the attorney’s belief was not shared by the jury), whereas I was already a ten-
ured professor.
I presented these thoughts to Ralph Merkle and Steve Pohlig, the students in question, but
left the final decision to them. Initially they wanted to take the risk and give the papers, but
eventually concern from their parents won out. Fortunately, the presentations went off with-
out incident, though it was dramatic having Ralph and Steve stand mute by the podium, so
they would get the recognition they deserved, as I gave the papers.29
Although not scheduled to present, Diffie made a point of delivering a talk at an informal session,
showing that he wasn’t easily intimidated.30
Did the reaction of the IEEE, as an organization, mirror that of the individuals discussed
above? To see the official reaction, look at Figure 14.6 on this and the next two pages.31 The first
is a reply to the Meyer letter that started it all.
The combination of the cryptographers’ brave stance and the fact that Meyer’s claims were not
actually backed up by the ITAR, which contained an exemption for published material, resulted
in no real changes in the manner in which cryptologic research was now being carried out—
publicly. Indeed, the publicity the letter generated served to promote public interest! The threats
did, however, delay the RSA team’s distribution of their paper. As mentioned earlier, Simmons
and Norris described it in the pages of Cryptologia before the technical report was mailed to the
Scientific American readers or appeared in a journal. Ironically, prior to all of this, Simmons, who
managed the Applied Mathematics Department at Sandia Laboratories, attempted to hire Rivest,
31 These letters, like the original Meyer letter, were obtained from Martin Hellman via John Young.
but Rivest declined because he thought he would have more freedom at MIT! Meyer’s letter was
far from the last attempt to control public cryptologic research.32
Whitfield Diffie relates:33
A more serious attempt occurred in 1980, when the NSA funded the American
Council on Education to examine the issue with a view to persuading Congress to give
it legal control of publications in the field of cryptography. The results fell far short
32 Nor was it the first! Back in 1975, the NSA attempted to intimidate other organizations, such as the National
Science Foundation (NSF), out of providing funding for cryptologic research (for details, see Levy, Steven,
Crypto: How the Code Rebels Beat the Government, Saving Privacy in the Digital Age, Viking, New York, 2001, p.
107). If grant money for important cryptographic work is only available through NSA, censorship is achieved
in a more subtle (but still apparent) manner.
33 From p. xvii of the foreword by Whitfield Diffie in Schneier, Bruce, Applied Cryptography, second edition, John Wiley & Sons, New York, 1996.
34 Kolata, Gina Bari, “Cryptography: A New Clash between Freedom and National Security,” Science, Vol. 209,
No. 4460, August 29, 1980, pp. 995–996.
35 Bamford, James, The Puzzle Palace, Houghton Mifflin Company, Boston, 1982, p. 363.
of these reasons, it seems to me that something more fundamental will in the end prevent any
restrictions anyway. It is called the First Amendment.”36
I thought this would be the least important paper my name would ever appear on.
—Len Adleman37
Adleman actually argued that he shouldn’t be listed as an author, claiming he had done little—
mainly breaking previous attempts. The others insisted and he agreed as long as his name would
be put last. It seems logical that the authors’ names should appear on papers in the order of their
contributions, but this is very rarely the case in mathematics. The convention is to list the authors
alphabetically. So, RSA could easily have become known as ARS instead, if Adleman hadn’t asked
to be put last.
Unlike many of the systems we’ve examined, RSA was patented38 in 1983 and was therefore
not royalty-free; however, it was placed in the public domain in 2000, two weeks before the patent
was set to expire. It’s likely that you’ve used RSA, whether you were aware of the fact or not.
Internet transactions in which you provide a credit card number are often secured in this manner.
Although it happens behind the scenes, your computer uses the vendor’s public key to secure the
message you send.
The creators of RSA all made other contributions to cryptology that will be discussed in later
chapters, as will the way in which RSA may be used to sign messages. It is worth mentioning now
that Rivest invented Alice and Bob,39 the two characters who are typically used to illustrate cryp-
tographic protocols.40 Previously, messages were sent from person A to person B. The creation of
Alice and Bob helped to humanize the protocols. Over the years, other characters have been added
to the cast. These characters are so convenient, I chose to use them in explaining earlier systems,
even though the original descriptions did not include them.
36 Quoted in Foerstel, Herbert N., Secret Science: Federal Control of American Science and Technology, Praeger,
Westport, Connecticut, 1993, p. 125, which cites The Government’s Classification of Private Ideas, hearings
before a subcommittee of the Committee on Government Operations, U.S. House of Representatives, 96th
Congress, Second Session, 28 February, 20 March, 21 August 1980, U.S. Government Printing Office,
Washington, DC, 1981, p. 410.
37 Levy, Steven, Crypto: How the Code Rebels Beat the Government, Saving Privacy in the Digital Age, Viking, New York, 2001.
39 Rivest, Ronald L., Adi Shamir, and Leonard Adleman, On Digital Signatures and Public-Key Cryptosystems (There was soon a title change to A Method for Obtaining Digital
Signatures and Public-key Cryptosystems. The date is the same for both.), MIT Laboratory for Computer Science
Report MIT/LCS/TM-82, April 1977, Cambridge, Massachusetts.
40 Levy, Steven, Crypto: How the Code Rebels Beat the Government, Saving Privacy in the Digital Age, Viking, New York, 2001.
Dramatis Personae
Eve Eavesdropper
Peggy Prover
Victor Verifier
Alice and Bob are also handy characters for those working in coding theory. This does not
refer to the study of the sort of codes discussed in this text. Rather than provide a definition, I
encourage you to read the transcript (which does provide one) of The Alice and Bob After Dinner
Speech, given at the Zurich Seminar, April 1984, by John Gordon, by invitation of Professor James
Massey. To intrigue you, a few paragraphs are reproduced below.43
Coding theorists are concerned with two things. Firstly and most importantly they are
concerned with the private lives of two people called Alice and Bob. In theory papers,
whenever a coding theorist wants to describe a transaction between two parties he
doesn’t call them A and B. No. For some longstanding traditional reason he calls them
Alice and Bob.
Now there are hundreds of papers written about Alice and Bob. Over the years
Alice and Bob have tried to defraud insurance companies, they’ve played poker for
high stakes by mail, and they’ve exchanged secret messages over tapped telephones.
If we put together all the little details from here and there, snippets from lots of
papers, we get a fascinating picture of their lives. This may be the first time a definitive
biography of Alice and Bob has been given.
41 Schneier, Bruce, Applied Cryptography, second edition, John Wiley & Sons, New York, 1996, p. 23.
42 In the first edition of Schneier’s Applied Cryptography, this character was named Mallet. Although he has since
been replaced by Mallory, he served as the inspiration for the pen-name of an American Cryptogram Association
(ACA) member.
43 The transcript is available at http://downlode.org/Etext/alicebob.html.
Whatever the details of their personal lives may be, Alice and Bob have certainly helped cryptolo-
gists. In Malgorzata Kupiecka’s paper, “Cryptanalysis of Caesar Cipher,”44 she formally acknowl-
edged their help by writing at the end, “I wish to thank Alice and Bob.”
However, it turns out that Alice and Bob weren’t always Alice and Bob. Recently, Leonard
Adleman shared their origin story with me. The famous pair did not appear in early draft versions
of the RSA paper. Rivest had sent one of these drafts to Richard Schroeppel, asking for comments
and information on the current state of factoring. Schroeppel’s August 1, 1977 reply included the
following:
On p. 4, I would suggest a notation change. Using p and q for primes is quite stan-
dard, but you shouldn’t continue assigning letters in the same sequence to nonprimes.
Perhaps m would be a good symbol for pq, and e for the public exponent you have
called s. Another literary suggestion: name your protagonists, perhaps Adolf and
Bertholt or somesuch. This would reserve isolated letters for mathematical quantities.
In sharing this with me, Adleman commented, “While the names Richard suggests are problem-
atic, I never liked Ron’s choice either.”45
44 Kupiecka, Malgorzata, “Cryptanalysis of Caesar Cipher,” Journal of Craptology, Vol. 3, November 2006.
Available online at http://www.anagram.com/∼jcrap/Volume_3/caesar.pdf/. You may recall from Section 13.2
that when differential cryptanalysis was discovered, it was found that DES, an older system by that time, was
optimized against it. Hence, it was concluded that NSA was aware of the attack years earlier. In a similar vein,
Malgorzata demonstrates that differential cryptanalysis is not a good attack against the Caesar shift cipher and
concludes that the Romans must have been aware of this approach and optimized their encryption against it.
Journal of Craptology specializes in humorous papers of this sort. See http://www.anagram.com/∼jcrap/ for
complete electronic contents.
45 Adleman, Leonard, email to the author, January 5, 2020.
46 The Story of Non-Secret Encryption, http://web.archive.org/web/19980415070754/http://www.cesg.gov.uk/
Williamson, Malcolm J., Thoughts on Cheaper Non-Secret Encryption, CESG Report, 10 August 1976.
Williamson had the idea put forth in this second paper long before it was published.
49 Cocks, Clifford C., Note on “Non-Secret Encryption,” CESG Report, 20 November 1973, available online at
http://fgrieu.free.fr/histnse/Cliff%20Cocks%20paper%2019731120.pdf.
The Birth of Public Key Cryptography ◾ 433
However, Bobby Ray Inman, a director of NSA, claimed in congressional testimony that NSA
had discovered public key cryptography 10 years before the academics.50 If correct, this would
mean that NSA had it before GCHQ. There was no evidence publicly available in support of
Inman’s claim until 2019, when a Freedom of Information Act request led to the release of a 100+
page, previously top secret, history titled Fifty Years of Mathematical Cryptanalysis (1937–1987).
Although the released version was heavily redacted, the lines reproduced below survived.
Rick Proto, [188], had suggested the exponentiation scheme as an “irreversible” trans-
formation, prior to Ellis’ paper on nonsecret encryption. No public key cryptosystem
has yet been used operationally,51
Whatever followed the comma at the end of the quoted lines was redacted and remains a
mystery, as do the details of the reference [188]. In the bibliography at the end of the history, this
entire reference was redacted. Recognizing a transformation as apparently irreversible and build-
ing a public key system out of it are two different things, but if all Proto did was the former, why
would the title of his paper still be secret in 2019, when the history was released?
Back in 2009, NSA recognized Proto’s importance by naming a facility at Fort Meade the
“Richard C. Proto Symposium Center.” Prior to this, the only named facility was Friedman
Auditorium.52 A pair of references on Proto are given in the list below, but the specifics of his
contributions to NSA are sorely lacking.
50 Diffie, Whitfield and Susan Landau, Privacy on the Line: The Politics of Wiretapping and Encryption, The MIT
Press, Cambridge, Massachusetts, 1998, p. 253, note 15 for Chapter 3.
51 Stahly, Glenn F., Fifty Years of Mathematical Cryptanalysis (1937-1987), National Security Agency, August
pp. E1269–E1270, from the Congressional Record Online through the Government Publishing Office,
available online at https://www.govinfo.gov/content/pkg/CREC-2009-06-02/html/CREC-2009-06-02-pt1-
PgE1269-4.htm.
Ellis, James H., The possibility of secure non-secret analogue encryption, CESG Report 3007, May 1970, avail-
able online at http://fgrieu.free.fr/histnse/CESG_Research_Report_No_3007_1.pdf.
Ellis, James H., The Story of Non-Secret Encryption, 1987, available online at http://web.archive.org/
web/20030610193721/http://jya.com/ellisdoc.htm. This paper details the discovery of non-secret
encryption at the U.K. Government Communications Headquarters (GCHQ). References are pro-
vided to the now declassified technical papers. It was reprinted as Ellis, James H., “The History of
Non-Secret Encryption,” Cryptologia, Vol. 23, No. 3, July 1999, pp. 267–273.
Foerstel, Herbert N., Secret Science: Federal Control of American Science and Technology, Praeger, Westport,
Connecticut, 1993. This book takes a broad look at the topic, but Chapter 4 focuses on cryptography.
Gardner, Martin, “Mathematical Games, A New Kind of Cipher That Would Take Millions of Years to
Break,” Scientific American, Vol. 237, No. 2, August 1977, pp. 120–124.
Hellman, Martin, Homepage, http://www-ee.stanford.edu/∼hellman/.
Kahn, David, “Cryptology Goes Public,” Foreign Affairs, Vol. 58, No. 1, Fall 1979, pp. 141–159.
Levy, Steven, Crypto: How the Code Rebels Beat the Government, Saving Privacy in the Digital Age, Viking,
New York, 2001. Aimed at a general audience, this book is the best single source on the history of
modern cryptology up to 2001. Of course, more technical surveys exist for those wishing to see the
mathematics in greater detail.
Reilly, Larry, “Top-Secret Famous,” Fairfield Now, Fall 2009, pp. 22–26, available online at http://
saccovanzettiexperience.com/site/wp-content/uploads/2015/05/FairfieldNow_ALifeinSecrets.
pdf. This article is on Richard C. Proto.
Rivest, Ron, Homepage, http://theory.lcs.mit.edu/∼rivest/.
Rivest, Ronald L., Adi Shamir, and Leonard Adleman, On Digital Signatures and Public-Key Cryptosystems
(There was soon a title change to A Method for Obtaining Digital Signatures and Public-key Cryptosystems.
The date is the same for both.), MIT Laboratory for Computer Science Report MIT/LCS/TM-82,
April 1977, Cambridge, Massachusetts. This report later appeared as cited in the reference below.
Rivest, Ronald L., Adi Shamir, and Leonard Adleman, “A Method for Obtaining Digital Signatures and
Public-key Cryptosystems,” Communications of the ACM, Vol. 21, No. 2, February 1978, pp. 120–126.
Simmons, Gustavus J. and Michael J. Norris, “Preliminary Comments on the MIT Public-Key
Cryptosystem,” Cryptologia, Vol. 1, No. 4, October 1977, pp. 406–414.
Williamson, Malcolm J., Non-Secret Encryption Using a Finite Field, CESG Report, 21 January 1974.
Williamson, Malcolm J., Thoughts on Cheaper Non-Secret Encryption, CESG Report, 10 August 1976.
Breaking News!
On December 15, 2020, when the present book was in the proof stage, Whit Diffie was inducted
into NSA’s Hall of Honor in what he described as “the most unlikely honor of my life.” David
Kahn was also inducted into the Hall on this great day. Thus, work done in the academic com-
munity (not subject to NSA’s control) was recognized as being of great value.
Chapter 15
Attacking RSA
The most obvious, most direct, and most difficult way to attack RSA is by factoring the modulus.
The paragraph below indicates how this approach works. This chapter then details 12 non-factor-
ing attacks, before returning to examine various factoring algorithms.
One way to compute d (given e and n) is to first factor n. We will have n = pq. Because p and q are
distinct primes, they are relatively prime; hence, φ(n) = (p − 1)(q − 1). Having calculated the value of
φ(n), we may easily find the multiplicative inverse of e modulo φ(n). Because this inverse is d, the deci-
phering exponent, we have now broken the system; however, the “factor n” step is highly nontrivial!
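The paragraph above can be run through in a few lines of Python, with toy numbers far too small for real security (the three-argument pow with exponent −1, available in Python 3.8 and later, computes the modular inverse):

```python
# Hypothetical scenario: an attacker has just factored the public modulus n.
p, q = 61, 53            # the recovered prime factors (toy sizes)
n = p * q                # 3233, the public modulus
e = 17                   # the public enciphering exponent
phi = (p - 1) * (q - 1)  # φ(n) = (p − 1)(q − 1), valid since p ≠ q are prime
d = pow(e, -1, phi)      # the deciphering exponent: the inverse of e mod φ(n)

# Sanity check: enciphering then deciphering recovers the message.
M = 65
C = pow(M, e, n)
assert pow(C, d, n) == M
print(d)  # 2753
```

Once d is in hand, the attacker deciphers exactly as the legitimate recipient would; the entire difficulty of the attack lives in the “factor n” step.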
(C1^(−1))^(−x) C2^y = C1^x C2^y = (M^e1)^x (M^e2)^y = M^(xe1 + ye2) = M^1 = M (mod n).
Thus, Eve, who hasn’t recovered d, can obtain M.
PATCH: Don’t have the same modulus for any two users!
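The shared-modulus attack is easy to demonstrate in Python. The parameters below are illustrative toys: two users share n but have exponents e1 and e2 with gcd(e1, e2) = 1, and the extended Euclidean algorithm supplies the x and y used in the equation above (one of them is negative, which is why a modular inverse of a ciphertext appears):

```python
# Toy shared-modulus scenario.
p, q = 1013, 2003
n = p * q
e1, e2 = 5, 17                     # gcd(e1, e2) = 1
M = 123456                         # the common message, sent to both users
C1, C2 = pow(M, e1, n), pow(M, e2, n)

def ext_gcd(a, b):
    # Extended Euclidean algorithm: returns (g, x, y) with a*x + b*y = g.
    if b == 0:
        return a, 1, 0
    g, x, y = ext_gcd(b, a % b)
    return g, y, x - (a // b) * y

g, x, y = ext_gcd(e1, e2)
assert g == 1                      # x*e1 + y*e2 = 1

# One of x, y is negative; pow with a negative exponent and a modulus
# (Python 3.8+) supplies the needed inverse, so C1^x * C2^y = M (mod n).
recovered = (pow(C1, x, n) * pow(C2, y, n)) % n
assert recovered == M
```

Eve never learns either d, yet she reads the message, which is exactly why no two users should ever share a modulus.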
1 Simmons, Gustavus J., “A ‘Weak’ Privacy Protocol Using the RSA Crypto Algorithm,” Cryptologia, Vol. 7,
No. 2, April 1983, pp. 180–182.
435
Imagine the malicious hacker Mallory controls Alice and Bob’s communication channel.
When Alice requests Bob’s public key, Mallory changes the e that Bob tries to send her by a single
bit. Instead of (e, n), Alice receives (e ′, n). When Alice enciphers her message, Mallory lets it pass
unchanged to Bob, who is unable to read it. After some confusion, Bob sends his public key to
Alice again, since she clearly didn’t use the right values. Alice then sends the message again using
(e, n). Mallory may then use the attack described above to read M.2
PATCH: Never resend the same message enciphered two different ways. If you must resend,
alter the message first.
2 Joye, Marc, and Jean-Jacques Quisquater, “Faulty RSA Encryption,” UCL Crypto Group Technical Report
CG-1997/8, Université catholique de Louvain, Louvain-la-Neuve, Belgium, 1997. This attack was first pre-
sented in the paper cited here, although Mallory was not present. It was assumed that an accidental error
necessitated the resending of the message.
3 Schneier, Bruce, Applied Cryptography, second edition, John Wiley & Sons, New York, 1996.
n − φ(n) = n − (n − p − q + 1) = p + q − 1
But for this attack, we required q < p < 2q. The second half of this double inequality gives us
p + q − 1 < 3q − 1
and the first half of the double inequality tells us that q is the smaller factor and therefore
q < √n
Hence,
3q − 1 < 3√n − 1 < 3√n
The last few steps thus establish that
n − φ(n) < 3√n
4 Wiener, Michael J., “Cryptanalysis of Short RSA Secret Exponents,” IEEE Transactions on Information Theory,
Vol. 36, No. 3, 1990, pp. 553–558.
5 Adapted from p. 206 of Boneh, Dan, “Twenty Years of Attacks on the RSA Cryptosystem,” Notices of the
American Mathematical Society, Vol. 46, No. 2, February 1999, pp. 203–213.
|e/n − k/d| < 1/(3d^2)
which in turn is
< 1/(2d^2).
The weaker bound is indicated because it allows us to apply a theorem concerning the number of
solutions k/d satisfying the inequality. Namely,
|e/n − k/d| < 1/(2d^2)
implies that there are fewer than log2(n) fractions k/d that approximate e/n this closely. A tech-
nique is available to efficiently check the possibilities until the correct d is found.6
PATCH 1: Using a small value for d allows for quicker decryption, but don’t do it!
PATCH 2: Use a small value for d, but increase e to the point that the attack above won’t work.
We can increase e by multiples of φ(n), without making any difference other than increasing the
time needed for encryption. Making e > n^1.5 will block the attack above.
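A toy run of Wiener's attack makes the continued-fraction machinery concrete. The parameters below are hypothetical, chosen so that d < n^(1/4)/3 and q < p < 2q; each candidate denominator is tested by checking whether it correctly deciphers a known power of 2:

```python
# Toy key with a dangerously small deciphering exponent.
p, q = 1049, 857            # q < p < 2q, as the attack assumes
n = p * q                   # 898993
phi = (p - 1) * (q - 1)
d = 7                       # small enough that d < n^(1/4)/3
e = pow(d, -1, phi)         # the matching public exponent (256311)

def convergents(a, b):
    # Yield the continued-fraction convergents (numerator, denominator) of a/b.
    num0, den0, num1, den1 = 0, 1, 1, 0
    while b:
        c = a // b
        a, b = b, a - c * b
        num0, num1 = num1, c * num1 + num0
        den0, den1 = den1, c * den1 + den0
        yield num1, den1

# By the theorem above, k/d appears among the convergents of e/n. Test each
# candidate denominator: the true d turns 2^e back into 2 modulo n.
recovered_d = next(cand for k, cand in convergents(e, n)
                   if cand > 0 and pow(pow(2, e, n), cand, n) == 2)
assert recovered_d == d
```

The attacker sees only the public pair (e, n); the secret exponent falls out of nothing more than the Euclidean algorithm applied to e/n.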
x = Σ_{i=1}^{k} ai Ni Mi (mod n)
where Ni = n/ni and Mi = Ni^(−1) (mod ni). The easiest way to see how a solution may be obtained is
through an example.
through an example.
Example 1
Suppose we have
x = 5 ( mod 9 )
x = 8 ( mod 11)
x = 6 ( mod 14 ) .
For convenience, we label the moduli as n1 = 9, n2 = 11, and n3 = 14. We compute the product of
the moduli n = n1n2n3 = (9)(11)(14) = 1,386. Using Gauss’s formula, we have
x = Σ_{i=1}^{k} ai Ni Mi (mod n) = a1N1M1 + a2N2M2 + a3N3M3 (mod n)
The values for the ai were given in the problem, and the Ni are very easy to calculate. To get the Mi,
we could use a technique from Section 14.3 (the Euclidean algorithm), but for such small values, it is
easier to do the calculations in one’s head. For large numbers we’d resort to the algorithm. We have
M1 = N1^(−1) = (154)^(−1) = (1)^(−1) = 1 (mod 9)
M2 = N2^(−1) = (126)^(−1) = (5)^(−1) = 9 (mod 11)
M3 = N3^(−1) = (99)^(−1) = (1)^(−1) = 1 (mod 14)
Hence,
x = a1N1M1 + a2N2M2 + a3N3M3 = (5)(154)(1) + (8)(126)(9) + (6)(99)(1)
= 10,436 (mod 1,386)
= 734 (mod 1,386)
Checking against our original three equations, we see that the answer, x = 734, works.
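The whole computation is a few lines of Python (using the three-argument pow, available in Python 3.8 and later, for the modular inverses):

```python
from math import prod

def crt(residues, moduli):
    # Gauss's formula: x = sum of a_i * N_i * M_i (mod n), where N_i = n/n_i
    # and M_i is the inverse of N_i modulo n_i (moduli pairwise coprime).
    n = prod(moduli)
    total = 0
    for a_i, n_i in zip(residues, moduli):
        N_i = n // n_i
        M_i = pow(N_i, -1, n_i)   # modular inverse via three-argument pow
        total += a_i * N_i * M_i
    return total % n

# The example above: x = 5 (mod 9), x = 8 (mod 11), x = 6 (mod 14).
x = crt([5, 8, 6], [9, 11, 14])
print(x)  # 734
```

For the large moduli that arise in attacks on RSA, exactly the same function works; Python's integers and pow handle arbitrary sizes.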
Now that the Chinese Remainder Theorem is clear, we’re ready to examine how it can be used
to attack RSA encryption. We already saw the danger in users destined to receive the same mes-
sage sharing a modulus. It is also a disaster if the same message goes to users who have different
values for n, but share the same e. The Chinese remainder theorem may allow the message to be
recovered in this case. It only depends on how many copies of the message are sent out relative
to the value of e. If there are m copies sent, all using the same e and distinct moduli, and m ≥ e,
then the message can be recovered. As a small example, we use e = m = 3. Our three intercepted
ciphertexts follow below:
C1 = M^3 (mod n1)
C2 = M^3 (mod n2)
C3 = M^3 (mod n3)
This looks a bit different than the example of the Chinese remainder theorem above, but we may
rewrite these equations as
M^3 = C1 (mod n1)
M^3 = C2 (mod n2)
M^3 = C3 (mod n3)
Now they match the example, with M^3 taking the place of x.
If the moduli are not all pairwise relatively prime, then we may compute the gcd of two of them
to arrive at a prime factor of one of the moduli. Because this allows us to then factor a modulus,
and go on to recover d very easily, we assume the moduli are pairwise relatively prime. This being the case,
we can apply the Chinese remainder theorem to the equations above to get an integer C ′ such that
9 Håstad, Johan, “On using RSA with Low Exponent in a Public Key Network,” in Williams, Hugh C., editor,
Advances in Cryptology – CRYPTO ’85 Proceedings, Lecture Notes in Computer Science, Vol. 218, Springer,
Berlin, Germany, 1986, pp. 403–408.
10 Coppersmith, Don, “Small Solutions to Polynomial Equations, and Low Exponent RSA Vulnerabilities,”
Journal of Cryptology, Vol. 10, No. 4, December 1997, pp. 233–260.
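The e = 3 broadcast attack can be sketched end to end. The key observation is that when M^3 is smaller than the product of the three moduli, the integer the Chinese remainder theorem produces is literally M^3, so an ordinary (non-modular) cube root finishes the job. All numbers below are illustrative toys:

```python
from math import prod

def crt(residues, moduli):
    # Chinese remainder theorem via Gauss's formula (moduli pairwise coprime).
    n = prod(moduli)
    return sum(a * (n // m) * pow(n // m, -1, m)
               for a, m in zip(residues, moduli)) % n

def icbrt(x):
    # Integer cube root by binary search (exact for perfect cubes).
    lo, hi = 0, 1 << (x.bit_length() // 3 + 2)
    while lo < hi:
        mid = (lo + hi) // 2
        if mid ** 3 < x:
            lo = mid + 1
        else:
            hi = mid
    return lo

# The same M broadcast to three users, all with e = 3 and coprime moduli.
moduli = [1009 * 2003, 1013 * 3001, 1019 * 5003]   # hypothetical toy moduli
M = 424242
ciphertexts = [pow(M, 3, m) for m in moduli]

C_prime = crt(ciphertexts, moduli)   # equals M^3 as an ordinary integer
assert icbrt(C_prime) == M
```

No factoring and no private key is involved at any point, which is what makes sending the same unpadded message to e or more recipients so dangerous.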
Figure 15.2 Daniel Bleichenbacher (1964–). (Courtesy of Daniel Bleichenbacher and Kit
August.)
A chosen ciphertext attack is one in which the attacker gets to create any ciphertext he likes and
then get the corresponding plaintext. Adding the adjective “adaptive” means that he can do this
repeatedly, using knowledge he gained in previous iterations to adapt his next ciphertext in such
a way as to optimize the information it yields. This is the sort of attack Daniel Bleichenbacher
(Figure 15.2) launched against RSA padded in accordance with PKCS #1. Actually, his attack
was a bit less demanding in that he didn’t need the corresponding plaintexts, but rather just the
knowledge of whether or not these plaintexts corresponded to some block of data encrypted using
PKCS #1. Describing his attack, he wrote, “we expect that the attack needs roughly 220 chosen
ciphertexts to succeed.”11 This sounds daunting, but Bleichenbacher points out that the attack is
practical, because there are servers that will accept ciphertexts and return error messages when
11 Bleichenbacher, Daniel, “Chosen Ciphertext Attacks against Protocols Based on the RSA Encryption Standard
PKCS #1,” in Krawczyk, Hugo, editor, Advances in Cryptology – CRYPTO ’98 Proceedings, Lecture Notes in
Computer Science, Vol. 1462, Springer, Berlin, Germany, 1998, pp. 1–12.
they are not PKCS conforming. So, it’s not like he’s expecting Bob to respond to his million-plus
ciphertexts!
Reporting on experiments carried out for his new attack, Bleichenbacher wrote, “We tested the
algorithm with different 512-bit and 1024-bit keys. The algorithm needed between 300 thousand
and 2 million chosen ciphertexts to find the message.” But improved systems give new attacks a
tough race, and Bleichenbacher admitted that version 2 of PKCS #1, which made use of results
published in 1995, was not vulnerable to his attack.12
Why is this man smiling? Paul Kocher (Figure 15.3) made his mark on cryptology in 1995, while
still an undergraduate at Stanford, by discovering timing attacks. These are not limited to RSA
but may be applied to a variety of systems. Kocher was also responsible for leading the design of
the Electronic Frontier Foundation’s DES cracker (see Section 13.3). I look forward to seeing what
he’ll do next. The manner in which his timing attack applies to RSA follows.
For this attack to work, the attacker needs some access to the recipient’s machine. If the attacker
can obtain the ciphertexts, as well as the amount of time taken by the recipient’s machine to deci-
pher the messages, he can work backward to find what the decryption exponent must be. That is,
if decryption is done by, for instance, the repeated squaring method detailed in Section 14.3, then
12 The 1995 paper is Bellare, Mihir and Phillip Rogaway, “Optimal asymmetric encryption,” in De Santis,
Alfredo, editor, Advances in Cryptology – EUROCRYPT ‘94 Proceedings, Lecture Notes in Computer Science,
Vol. 950, Springer, Berlin, Germany, 1995, pp. 92–111.
13 Kocher, Paul, “Timing attacks on implementations of Diffie-Hellman, RSA, DSS, and other systems,” in
Koblitz, Neal, editor, Advances in Cryptology – CRYPTO ‘96 Proceedings, Lecture Notes in Computer Science,
Vol. 1109, Springer, Berlin, Germany, 1996, pp. 104–113.
444 ◾ Secret History
each multiplication will take a certain amount of time, dependent on the machine doing it. These
times then correspond to particular keys.
PATCH: In the paper that introduced his attack, Kocher also put himself in the defensive posi-
tion and suggested patches. He explained how random delays may be inserted into the decryption
algorithm to throw the timing off, but moving back to offense indicated that an attacker could still
break the system by collecting more data. Another patch he proposed is called blinding. Blinding
had previously been used in another context. Here it meant multiplying the received ciphertext by a
random, but invertible number r raised to the value of the enciphering exponent (modulo n), prior to
any deciphering. So, one is then left to decipher r^eC. This is done in the normal manner, by exponentiation
to the power d. We get (r^eC)^d = r^(ed)C^d = rM (mod n). The recipient must then multiply by the inverse
of r to recover the message. Using a different r each time will prevent an attacker from being able to
find a correlation between the original ciphertext block and the time needed to decipher.
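Blinding is easy to sketch. The key below is a toy, and in practice r would be generated inside hardened decryption code rather than in plain Python:

```python
import secrets
from math import gcd

# Toy RSA key (real keys are far larger).
p, q = 1009, 2003
n = p * q
e = 17
d = pow(e, -1, (p - 1) * (q - 1))

def blinded_decrypt(C):
    # Decrypt r^e * C instead of C, then strip the blinding factor r.
    while True:
        r = secrets.randbelow(n - 2) + 2
        if gcd(r, n) == 1:               # r must be invertible mod n
            break
    blinded = (pow(r, e, n) * C) % n     # r^e * C (mod n)
    rM = pow(blinded, d, n)              # (r^e C)^d = r^(ed) C^d = r*M (mod n)
    return (rM * pow(r, -1, n)) % n      # multiply by r^(-1) to recover M

M = 123456
assert blinded_decrypt(pow(M, e, n)) == M
```

Because a fresh r is drawn for every decryption, the running time of the exponentiation no longer correlates with any particular ciphertext, which is exactly the property the timing attack exploits.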
C/M2^e = M1^e (mod n)
For the moment, let's assume that M1 ≤ 2^m1 and M2 ≤ 2^m2 for some positive integers m1 and m2.
Now, if all we have is the intercepted ciphertext C, we don't know M and cannot attempt to
factor it directly, but the equality above allows us to determine the factorization. We begin by
building a table of M1^e (mod n) values for all M1 = 2, 3, …, 2^m1. Next, we start calculating C/M2^e
(mod n) for M2 = 2, 3, …, 2^m2. As each of these latter values is calculated, it is checked for in the
M1^e (mod n) table. When a match is found, we have the values of M1 and M2 satisfying C/M2^e =
M1^e (mod n). Multiplying the values M1 and M2 together gives us the message M.
This attack should remind you of the attack on double DES presented in Section 13.4. Both
are meet-in-the-middle attacks.
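A toy meet-in-the-middle run, with illustrative parameters: the unknown message is assumed, as above, to split as M = M1·M2 with both factors at most 2^10:

```python
# Toy key, with primes chosen larger than 2^10 so every table entry is
# invertible modulo n.
p, q = 1031, 2003
n = p * q
e = 17
M = 537 * 619                # a message that happens to have two small factors
C = pow(M, e, n)

m1 = m2 = 10
# Table of M1^e (mod n) for all M1 = 2, 3, ..., 2^m1.
table = {pow(M1, e, n): M1 for M1 in range(2, 2 ** m1 + 1)}

recovered = None
for M2 in range(2, 2 ** m2 + 1):
    candidate = (C * pow(pow(M2, e, n), -1, n)) % n   # C / M2^e (mod n)
    if candidate in table:                            # the two halves meet
        recovered = table[candidate] * M2             # M = M1 * M2
        break
assert recovered == M
```

The work is roughly 2^m1 + 2^m2 exponentiations rather than the 2^(m1+m2) a brute-force search over M would need, the same trade that breaks double DES.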
A similar attack works against Elgamal, a system detailed in Sections 16.8 and 17.2.4.
PATCH: Randomly pad the message prior to encryption, so that M is not small compared to n.
This is also a way to patch against attacks 7 and 8. It is very important to pad! The term “textbook
RSA” basically means RSA without padding, because that is how RSA is often presented in textbooks.
Factoring one 1024-bit RSA modulus would be historic. Factoring 12,720 such mod-
uli is a statistic. The former is still out of reach for the academic community (but
14 Boneh, Dan, Antoine Joux and Phong Q. Nguyen, “Why textbook ElGamal and RSA encryption are inse-
cure,” in Okamoto, Tatsuaki, editor, Advances in Cryptology – ASIACRYPT 2000 Proceedings, Lecture Notes
in Computer Science, Vol. 1976, Springer, Berlin, Germany, 2000, pp. 30–43.
15 Lenstra, Arjen K., James P. Hughes, Maxime Augier, Joppe W. Bos, Thorsten Kleinjung and Christophe Wachter,
“Ron was wrong, Whit is right,” February 2012, https://anonymous-proxy-servers.net/paper/064.pdf.
anticipated). The latter comes as an unwelcome warning that underscores the diffi-
culty of key generation in the real world.
The quote is a reworking of a remark commonly attributed to Joseph Stalin, “A single death is a
tragedy; a million deaths is a statistic.” Just as Stalin was responsible for millions of deaths, so the
team that reworked the quote was responsible for destroying thousands of RSA moduli, by factor-
ing them. Their attack worked on real-world public keys, yet if handed some randomly chosen
public key, they would be unlikely to be able to break it. This seeming paradox will soon become
clear. Their attack is extremely simple!
The researchers simply gathered millions of public keys, then took the moduli two at a time
and used the Euclidean algorithm to find the greatest common divisors. Nothing was achieved
when the gcd turned out to be 1, but in thousands of cases it was one of the prime factors of the
moduli. This happened because, in these cases, the two moduli were of the form n1 = pq and n2 =
pr, for some primes p, q, and r. Once the common factor p is found, both moduli could then be
very easily factored. Some of the moduli thus factored were 2048 bits long. Greater size would tend
to improve security for most attacks, but the vulnerability resulting from different users selecting
a common prime still exists.
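At toy scale, the idea fits in a dozen lines. The moduli below are hypothetical stand-ins for harvested public keys, two of which accidentally share the prime 1009:

```python
from itertools import combinations
from math import gcd

# Hypothetical harvested public moduli; two share a prime factor.
moduli = [1009 * 2003, 1009 * 3001, 1013 * 2011]

broken = {}
for n1, n2 in combinations(moduli, 2):
    shared = gcd(n1, n2)          # Euclidean algorithm: fast even at 2048 bits
    if shared > 1:                # a shared prime factors both moduli at once
        broken[n1] = (shared, n1 // shared)
        broken[n2] = (shared, n2 // shared)

print(broken)  # the two moduli containing 1009 fall; the third survives
```

The 2012 team did not literally test every pair; they used a batch-gcd computation over product trees to make millions of moduli feasible, but the pairwise loop above is the entire mathematical content of the attack.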
On average, 2 out of every 1,000 moduli were factored. So RSA, as implemented, is 99.8%
secure against this attack. The authors concluded that RSA is “significantly riskier” than systems
based on Diffie-Hellman.
In a classic case of understatement, the authors of the “Ron was wrong, Whit is right” paper
pointed out that their results “may indicate that proper seeding of random number generators is
still a problematic issue.” They went on to note
The lack of sophistication of our methods and findings make it hard for us to believe
that what we have presented is new, in particular to agencies and parties that are
known for their curiosity in such matters. It may shed new light on NIST’s 1991 deci-
sion to adopt DSA as digital signature standard as opposed to RSA, back then a public
controversy.
Indeed, the potential problem was noticed earlier by Don Johnson. In a 1999 paper, he considered
the effect of a random number generator that had been intentionally “chilled” so that it generated
primes from a smaller set than if it was functioning properly.16 However, the 2012 paper does not
assign blame to a malevolent insider, but rather to unintentionally poor algorithms being used to
generate the primes.
PATCH: Somehow make sure you are choosing the primes p and q as randomly as possible.
This is much harder than it sounds.
There are many other non-factoring attacks on RSA. Some of these may be found in the ref-
erences for this chapter and others are discussed in Section 17.2. Factoring attacks are presented
below; however, none of these (other than attack 12, above) represent any real threat to RSA,
provided that various special cases are avoided when selecting primes and exponents and when
implementing the system. Of course, the minimum size of the modulus required for decent secu-
rity increases constantly with computing speed and the occasional improved factoring algorithm.
16 Johnson, Don, ECC, Future Resiliency and High Security Systems, Certicom Whitepaper, March 30, 1999,
revised July 6, 1999, available online at http://web.archive.org/web/20040215121823/www.comms.engg.susx.
ac.uk/fft/crypto/ECCFut.pdf, pp. 12–14 are relevant here.
25195908475657893494027183240048398571429282126204
03202777713783604366202070759555626401852588078440
69182906412495150821892985591491761845028084891200
72844992687392807287776735971418347270261896375014
97182469116507761337985909570009733045974880842840
17974291006424586918171951187461215151726546322822
16869987549182422433637259085141865462043576798423
38718477444792073993423658482382428119816381501067
48104516603773060562016196762561338441436038339044
14952634432190114657544454178424020924616515723350
77870774981712577246796292638635637328991215483143
81678998850404453640235273819513786365643912120103
97122822120720357
This number has 617 digits in base 10, but it’s called RSA-2048, because it has that many bits in
base 2. There was once a $200,000 prize offered by RSA Security for anyone who could find the
two prime factors and explain how they did it. The RSA factoring challenge has expired, but they
did give away prizes for the factorization of other, smaller products. See Table 15.1 for sizes and
prizes.
RSA Security answers an obvious question in the FAQ section of their website:17
I think the challenge also served as a show of strength for the company. Not many businesses will
put their products to the test in such a straight-forward way. Attempts to factor the larger num-
bers continued even after the financial incentive went away. RSA-768 was factored in 2009 by an
international team.18 The smaller RSA-704 held out until 2012, when it was factored by Shi Bai,
Emmanuel Thomé, and Paul Zimmermann.19 The rest remain unsolved as of this writing.
17 http://www.rsa.com/rsalabs/node.asp?id=2094#WhyIs.
18 Kleinjung, Thorsten, Kazumaro Aoki, Jens Franke, Arjen K. Lenstra, Emmanuel Thomé, Joppe W. Bos, Pierrick
Gaudry, Alexander Kruppa, Peter L. Montgomery, Dag Arne Osvik, Herman te Riele, Andrey Timofeev, and
Paul Zimmermann, “Factorization of a 768-Bit RSA Modulus,” in Rabin, Tal, editor, Advances in Cryptology –
CRYPTO 2010 Proceedings, Lecture Notes in Computer Science, Vol. 6223, Springer, Berlin, Germany, 2010,
pp. 333–350, available online at https://link.springer.com/content/pdf/10.1007%2F978-3-642-14623-7_18.pdf.
19 Bai, Shi, Emmanuel Thomé, and Paul Zimmermann “Factorisation of RSA-704 with CADO-NFS,” 2012,
hal-00760322f, 4 pages, available online at https://hal.inria.fr/file/index/docid/760322/filename/369.pdf.
20 Gauss, Carl Friedrich, Disquisitiones Arithmeticae, Gerhard Fleischer, Leipzig, 1801, Article 329.
only test the numbers 2, 3, 5, 7, 11, 13, 17, and 19. You’ll notice that we stopped far short of 391
this time. We really only need to check up to the square root of the number we’re trying to factor.
If a number, n, is the product of two smaller numbers, both numbers cannot exceed the square
root of n or the product would exceed n.
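Trial division with the square-root cutoff is a one-screen Python function (math.isqrt, Python 3.8+, gives the exact integer square root):

```python
from math import isqrt

def smallest_factor(n):
    # Trial division: test candidates only up to the square root of n.
    for t in range(2, isqrt(n) + 1):
        if n % t == 0:
            return t
    return n          # no divisor at or below the square root, so n is prime

print(smallest_factor(391))  # 17, since 391 = 17 * 23
```

A refinement, not shown, is to test only 2 and the odd numbers (or better, only primes), but the square-root cutoff is what changes the running time from n steps to roughly √n.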
This sort of approach is the idea behind the sieve of Eratosthenes (Figure 15.4), the purpose of
which is to eliminate the composite numbers from a list of consecutive integers. To begin we write
out the first few natural numbers:
1 2 3 4 5 6 7 8 9 10
11 12 13 14 15 16 17 18 19 20
21 22 23 24 25 26 27 28 29 30
31 32 33 34 35 36 37 38 39 40
41 42 43 44 45 46 47 48 49 50
51 52 53 54 55 56 57 58 59 60
61 62 63 64 65 66 67 68 69 70
71 72 73 74 75 76 77 78 79 80
81 82 83 84 85 86 87 88 89 90
91 92 93 94 95 96 97 98 99 100
In order to be prime, a number must have exactly two distinct positive divisors. The first natural
number, 1, has only one positive divisor, so it is not prime and we cross it out:
      2   3   4   5   6   7   8   9  10
 11  12  13  14  15  16  17  18  19  20
 21  22  23  24  25  26  27  28  29  30
 31  32  33  34  35  36  37  38  39  40
 41  42  43  44  45  46  47  48  49  50
 51  52  53  54  55  56  57  58  59  60
 61  62  63  64  65  66  67  68  69  70
 71  72  73  74  75  76  77  78  79  80
 81  82  83  84  85  86  87  88  89  90
 91  92  93  94  95  96  97  98  99 100
Attacking RSA ◾ 449
Our next entry, 2, is prime. We now cross out all multiples of 2 (except 2 itself, which stays in the list):
      2   3       5       7       9
 11      13      15      17      19
 21      23      25      27      29
 31      33      35      37      39
 41      43      45      47      49
 51      53      55      57      59
 61      63      65      67      69
 71      73      75      77      79
 81      83      85      87      89
 91      93      95      97      99
After 2, the next remaining entry, 3, is prime. We now cross out all multiples of 3:
      2   3       5       7
 11      13              17      19
         23      25              29
 31              35      37
 41      43              47      49
         53      55              59
 61              65      67
 71      73              77      79
         83      85              89
 91              95      97
After 3, the next remaining entry, 5, is prime, so we cross out all multiples of 5:

      2   3       5       7
 11      13              17      19
         23                      29
 31                      37
 41      43              47      49
         53                      59
 61                      67
 71      73              77      79
         83                      89
 91                      97
The next remaining entry, 7, is also prime, and we cross out its multiples (49, 77, and 91):

      2   3       5       7
 11      13              17      19
         23                      29
 31                      37
 41      43              47
         53                      59
 61                      67
 71      73                      79
         83                      89
                         97
The next prime is 11. We could now cross out all multiples of 11, but this has already been
done, as these numbers, less than 100, all have smaller prime divisors. In fact, all of the numbers
that remain are prime. If one weren’t, it would have to factor into a pair of numbers, both of which
exceed 10. This is an impossibility, as the product would then exceed 100. As explained earlier, we
only need to search up to the square root of the number.
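In code, the sieve reads naturally. The sketch below (names and the limit of 100 are my choices) crosses out composites exactly as described, starting each round at p·p, since smaller multiples of p have already been removed:

```python
def sieve_of_eratosthenes(limit):
    """Return the list of primes up to limit."""
    is_prime = [True] * (limit + 1)
    is_prime[0] = is_prime[1] = False          # 0 and 1 are not prime
    for p in range(2, int(limit ** 0.5) + 1):  # only sieve up to sqrt(limit)
        if is_prime[p]:
            for multiple in range(p * p, limit + 1, p):
                is_prime[multiple] = False     # cross out multiples of p
    return [n for n in range(2, limit + 1) if is_prime[n]]

primes = sieve_of_eratosthenes(100)
print(len(primes))   # 25 primes up to 100
print(primes[-3:])   # [83, 89, 97]
```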
One of the students of the great poet Callimachus was Eratosthenes of Cyrene (c. 275–192
BCE), who became librarian in the Museum, the scientific institute of Alexandria. He
invented a new method to calculate prime numbers, drew a famous world map, cata-
logued several hundreds of stars, but became especially famous for his calculation of the
circumference of the earth, based on the angle of the shadow that the sun made over a
vertical pole at Alexandria at noon and the fact that at the same time, the sun light fell
straight into a well at Syene in southern Egypt. He concluded that the circumference
was 45,460 kilometers, which is pretty close to the real figure. He also wrote a treatise
on chronology and a book on musical theory, composed poems and comedies, and was
responsible for two dictionaries and a book on grammar. As an ethnologist, he sug-
gested that the common division between civilized people and barbarians was invalid.
Eratosthenes was nicknamed bêta or ‘number two’, because in no branch of science was
he ever the best, although he excelled in nearly every one of them.21
We now jump ahead to the 17th century and a factorization method due to Pierre de Fermat
(Figure 15.5). Let n be the number we are attempting to factor. If we can find a representation of n
as the difference of two squares (i.e., n = x^2 − y^2), then we can easily factor n as n = (x − y)(x + y).
How practical is this? The first important question to ask is whether or not every composite
number n can be represented in this manner. Consider n = ab, where a and b are both greater than
one and neither need be prime. Letting x = (a + b)/2 and y = (a − b)/2, we see
x^2 − y^2 = ((a + b)/2)^2 − ((a − b)/2)^2 = (a^2 + 2ab + b^2)/4 − (a^2 − 2ab + b^2)/4 = 4ab/4 = ab
Thus, such a representation always exists.
An example will make it clear how this observation may be applied. Consider n = 391. We
start with
x = ⌈√n⌉ = ⌈√391⌉ = 20.
The ⌈ ⌉ notation is referred to as the ceiling function. It indicates that the quantity within should be
rounded up to the closest integer. In Section 14.7, a related function called the floor function will also
be needed. It is notated with ⌊ ⌋ and means that the quantity within should be rounded down to the
closest integer. Most programming languages have commands built in to perform these operations.
Using x = 20, we continue like so: 20^2 − 391 = 400 − 391 = 9 = 3^2, a perfect square. So y = 3, and 391 = (20 − 3)(20 + 3) = (17)(23).
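The search Fermat's method performs, starting at ⌈√n⌉ and incrementing x until x^2 − n is a perfect square, can be sketched as follows (the function name is mine):

```python
import math

def fermat_factor(n):
    """Fermat's method: find x, y with n = x^2 - y^2 = (x - y)(x + y)."""
    x = math.isqrt(n)
    if x * x < n:
        x += 1                     # start at the ceiling of sqrt(n)
    while True:
        y_squared = x * x - n
        y = math.isqrt(y_squared)
        if y * y == y_squared:     # x^2 - n is a perfect square
            return x - y, x + y
        x += 1

print(fermat_factor(391))  # (17, 23): x = 20 works at once, 400 - 391 = 9 = 3^2
```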
I tell math majors that Leonhard Euler (Figure 15.6) should be mentioned at least once in every
mathematics course, and if he isn’t, they should go to the registrar’s office and ask for a tuition refund,
for something important was left out! We’ve already seen how Euler’s generalization of Fermat’s little
theorem paved the way for RSA encryption. We now examine Euler’s method for factoring.
Instead of looking at n as the difference of two squares, we may often view it as the sum of two
squares (in two different ways); for example 130 = 11^2 + 3^2 = 7^2 + 9^2. Let us assume that we can
do this for a number that we are attempting to factor. That is,
n = a^2 + b^2 = c^2 + d^2
⇒ a^2 − c^2 = d^2 − b^2
⇒ (a − c)(a + c) = (d − b)(d + b)
Letting g denote the greatest common divisor of a − c and d − b, we have a − c = gx and d − b =
gy for some relatively prime integers x and y. So, our previous line now gives us (by substitution)
( gx )( a + c ) = ( gy )( d + b )
⇒ ( x )( a + c ) = ( y )( d + b ) (15.1)
The right-hand side is divisible by y, so the left-hand side must also be divisible by y. However, y
doesn’t divide x, as x and y are relatively prime. Therefore y must divide a + c. That is, there exists
m such that ym = a + c. By substituting for (a + c) in equation 15.1, we get (y)(d + b) = (x)(ym). We
can then divide through by y to get d + b = xm.
Now consider the product of (g/2)^2 + (m/2)^2 and x^2 + y^2:

((g/2)^2 + (m/2)^2)(x^2 + y^2) = ((g^2 + m^2)/4)(x^2 + y^2)
= (1/4)(g^2 + m^2)(x^2 + y^2)
= (1/4)[(gx)^2 + (gy)^2 + (mx)^2 + (my)^2]
= (1/4)[(a − c)^2 + (d − b)^2 + (d + b)^2 + (a + c)^2]
= (1/4)[a^2 − 2ac + c^2 + d^2 − 2bd + b^2 + d^2 + 2bd + b^2 + a^2 + 2ac + c^2]
= (1/4)[2a^2 + 2b^2 + 2c^2 + 2d^2]
= (1/4)[2(a^2 + b^2) + 2(c^2 + d^2)]
= (1/4)(2n + 2n)
= (1/4)(4n)
= n
So, the product we were considering is a factorization of n, as desired. Thus, once the representa-
tion of n as a sum of two squares, in two ways, is obtained, one may calculate g, m, x, and y to
obtain a factorization. The problem is efficiently finding the necessary sums!
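Given the two representations, the recipe above is entirely mechanical. A sketch (the function name is mine) using the 130 = 11^2 + 3^2 = 7^2 + 9^2 example from the text:

```python
import math

def euler_factor(a, b, c, d):
    """Factor n = a^2 + b^2 = c^2 + d^2 by Euler's method."""
    n = a * a + b * b
    assert n == c * c + d * d, "need two representations of the same n"
    g = math.gcd(a - c, d - b)
    x = (a - c) // g
    y = (d - b) // g
    m = (a + c) // y             # y divides a + c, as shown above
    assert d + b == x * m        # and then d + b = xm
    # n = ((g/2)^2 + (m/2)^2)(x^2 + y^2); keep everything in integers
    return (g * g + m * m) // 4, x * x + y * y

print(euler_factor(11, 3, 7, 9))  # (10, 13), and (10)(13) = 130
```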
We now make another big leap in time to a living mathematician, John Pollard, who has sev-
eral factorization algorithms named after him.
15.6 Pollard’s p − 1 Algorithm
Recall Fermat’s Little Theorem:
m= ∏q
q ≤B
q prime.
Example 2
Using n = 713 as a small example to illustrate this method, we may take a = 2 and B = 5. Then
m = (2)(3)(5) = 30. It follows that gcd(am − 1, n) = gcd(230 − 1, 713) = 31. Division then reveals
that 713 = (31)(23).
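The whole method fits in a few lines; Python's built-in pow performs the modular exponentiation. A sketch reproducing the n = 713 example (the function name and defaults are my choices):

```python
import math

def pollard_p_minus_1(n, a=2, bound=5):
    """Pollard's p - 1 method: m is the product of the primes up to bound."""
    m = 1
    for q in range(2, bound + 1):
        if all(q % d != 0 for d in range(2, q)):  # q is prime
            m *= q
    g = math.gcd(pow(a, m, n) - 1, n)  # pow(a, m, n) keeps the numbers small
    return g if 1 < g < n else None    # None means this a and bound failed

print(pollard_p_minus_1(713))  # 31, since m = 30 and 31 - 1 = 30; 713 = (31)(23)
```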
This approach won’t work with n = 85, no matter how large we make B. The reason for this is
that the prime factors of 85 are p = 5 and q = 17, so when we look at p − 1 andq − 1, we get 4 and
16. No matter how many primes we multiply together, we’ll never get a multiple of either of these
numbers, unless we repeat the prime factor 2. In case p − 1 and q − 1 both have a repeated prime
factor, we must modify the way in which we defined m. Possible patches include
m = B ! and m =
q ≤B
∏q n
where the product runs over all prime factors q less than B, and n is some positive integer greater
than 1.
There are other twists we can put on this algorithm, but the above gives the general idea. The
time needed for this method is roughly proportional to the largest prime factor of p − 1. Hence, it
is efficient for p such that p − 1 is smooth. To resist this attack, choose primes of the form 2q + 1,
where q is prime. That way, p − 1 will have at least one large factor. There is also Pollard’s ρ (rho)
22 There are a few terms in mathematics that sound like the names of gangsta rappers: B-smooth, 2-pi, cube root,
etc.
Algorithm (1975) for factoring, but it is strongest when the number being attacked has small fac-
tors, which is certainly not the case for RSA, so it won’t be detailed here.23
John D. Dixon's (Figure 15.7) algorithm has its roots in Fermat's method of factorization. Rather than insist on finding x and y such that x^2 − y^2 = n, in order to factor n, we could content ourselves with an x and y such that x^2 − y^2 = kn. This may also be expressed as x^2 = y^2 (mod n). So, if we can find two squares, x^2 and y^2, that are equal modulo n, we may have a factorization given by (x − y)(x + y). It's only "may," because this broadening of Fermat's method allows the possibility that (x − y) = k and (x + y) = n. This idea goes back to Maurice Kraitchik in the 1920s.25
We can find potential x and y values quicker by not insisting that y be a perfect square.
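Hunting for such x values can be automated: for each x, reduce x^2 modulo n to a representative small in absolute value and try to factor it over a small factor base. A sketch for n = 391 (the base and search range here are my choices):

```python
def smooth_relations(n, factor_base, x_values):
    """Return (x, r, factors) where r = x^2 mod n (either sign) is smooth."""
    relations = []
    for x in x_values:
        r_positive = x * x % n
        for r in (r_positive, r_positive - n):  # both signed representatives
            value, factors = r, []
            if value < 0:
                factors.append(-1)
                value = -value
            for p in factor_base:
                while value % p == 0:
                    factors.append(p)
                    value //= p
            if value == 1:                      # r factored completely: smooth
                relations.append((x, r, factors))
    return relations

for x, r, factors in smooth_relations(391, [2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31],
                                      range(20, 24)):
    print(f"{x}^2 = {r} (mod 391), factors {factors}")
```

Running this recovers, among others, 21^2 = (2)(5)^2 and 21^2 = (−1)(11)(31) modulo 391.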
23 See Pollard, John M., “A Monte Carlo Method for Factorization,” BIT Numerical Mathematics, Vol. 15, No. 3,
September 1975, pp. 331–334.
24 Dixon, John D., “Asymptotically Fast Factorization of Integers,” Mathematics of Computation, Vol. 36, No.
153, January 1981, pp. 255–260.
25 Pomerance, Carl, “A Tale of Two Sieves,” Notices of the American Mathematical Society, Vol. 43, No. 12,
December 1996, pp. 1473–1485, p. 1474 cited here.
Piecing together 28^2 from the first column and 26^2 and 31^2 from the second column, we get

(28)^2 (26)^2 (31)^2 = (2)(−1)(2)(53)(−1)(2)^2 (53) (mod 391)
                     = (−1)^2 (2)^4 (53)^2
                     = ((2)^2 (53))^2 (mod 391)
That is,

((28)(26)(31))^2 = ((2)^2 (53))^2 (mod 391)

Since (28)(26)(31) = 281 (mod 391) and (2)^2 (53) = 212, this says 281^2 = 212^2 (mod 391), and gcd(281 − 212, 391) = gcd(69, 391) = 23 hands us a factor: 391 = (17)(23).

Relations of this kind come from factoring squares, reduced modulo n, over a factor base. Here the factor base was

{−1, 2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37, 41, 43, 47, 51, 53}

and some of the factorizations found, together with their exponent vectors modulo 2, were (all congruences modulo 391):

21^2 = (2)(5)^2        [0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
21^2 = (−1)(11)(31)    [1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0]
22^2 = (3)(31)         [0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0]
23^2 = (2)(3)(23)      [0, 1, 1, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0]
23^2 = (−1)(11)(23)    [1, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0]
If we stop our factor base at 53, we cannot include values such as our second factorization for 22^2, because it contained the factor 149. If we made our factor base larger, it could be accommodated.
Finding a potential solution is equivalent to selecting vectors whose sum modulo 2 is the zero
vector. To find such solutions efficiently we may construct a matrix M whose columns are the vectors above and then look for a solution to the matrix equation

MX = 0 (mod 2),

where X is the column vector X = (x_1, x_2, …, x_k)^T, for a factor base with k elements.
26 For factoring a record-breakingly large number, the factor base will now contain about a million values, according to Pomerance, Carl, "A Tale of Two Sieves," Notices of the American Mathematical Society, Vol. 43, No. 12, December 1996, pp. 1473–1485, p. 1483 cited here.
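For a toy case the matrix step can even be done by brute force over subsets of relations rather than by Gaussian elimination. The sketch below (all names mine) takes relations of the form x^2 = r (mod 391) of the kind discussed above, finds a subset whose exponent vectors cancel modulo 2, and extracts a factor:

```python
from itertools import combinations
from math import gcd, isqrt

def exponent_vector(r, base):
    """Exponents of r over the factor base, reduced modulo 2."""
    vec = [0] * len(base)
    if r < 0:
        vec[0], r = 1, -r        # base[0] is -1
    for i, p in enumerate(base[1:], start=1):
        while r % p == 0:
            vec[i] ^= 1
            r //= p
    return vec

def combine_relations(n, relations, base):
    """Find a subset of relations x^2 = r (mod n) whose vectors cancel mod 2."""
    for size in range(2, len(relations) + 1):
        for subset in combinations(relations, size):
            total = [0] * len(base)
            for _, r in subset:
                total = [s ^ t for s, t in zip(total, exponent_vector(r, base))]
            if any(total):
                continue             # exponents don't all pair up; keep looking
            x, y_squared = 1, 1
            for xi, r in subset:
                x = x * xi % n
                y_squared *= r       # a perfect square, by construction
            y = isqrt(y_squared) % n
            factor = gcd(x - y, n)
            if 1 < factor < n:
                return factor
    return None

relations = [(21, 50), (21, -341), (22, 93), (23, 138), (23, -253)]
base = [-1, 2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31]
print(combine_relations(391, relations, base))  # 23, and 391 = (23)(17)
```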
This approach was discovered by Michael Morrison and John Brillhart and published in 1975.27
The pair used it, with some help from continued fractions, to factor the seventh Fermat number
F_7 = 2^(2^7) + 1 = 2^128 + 1
Having a large factor base increases the chances of finding a solution after a given number of values
has been investigated, but it slows down the linear algebra step.
We look at this refinement with a second example. Suppose we wish to factor 5,141. We select
the base {−1, 2, 3, 5, 7, 11, 13} and make a table of values. Note that one representation was chosen
for each value. We could have listed two, but going with just the smaller of the two in absolute
value gives us a factorization more likely to be useful.
1 72 5,184 = 40 = (2)3(5)
27 Morrison, Michael A. and John Brillhart, “A Method of Factoring and the Factorization of F7,” Mathematics
of Computation, Vol. 29, No. 129, January 1975, pp. 183–205.
We now delete all factorizations in the right-most column that aren’t 13-smooth. We’re left
with:
1 72 5,184 = 40 = (2)3(5)
1 0 1 1 0 0 1     x1     0
0 1 0 0 1 1 0     x2     0
0 0 0 1 1 0 0     x3     0
0 1 0 1 0 1 0     x4  =  0
0 0 0 1 0 0 0     x5     0
0 0 0 0 0 0 0     x6     0
0 0 0 0 0 1 0     x7     0
A little bit of linear algebra leads us to the following solutions for the column vector of xs.
We may then divide 5,141/97 = 53 to get the complete factorization: 5,141 = (97)(53). If the gcd
had been 1, we would have gone on to try the second, and possibly the third solution.
Although the method detailed here is named after Dixon, Dixon himself humbly pointed out
that28
…the method goes back much further than my paper in 1981. References to my paper
sometimes do not seem to appreciate this fact. For example, the entry in Wikipedia
seems to credit the idea to me. The fact is the idea in one form or another had been
around for a much longer time, but no-one had been able to give a rigorous analysis
of the time complexity of the versions which were used. What I did in my paper was
to show that a randomized version of the method can be analyzed and that (at least
qualitatively and asymptotically) it is faster than other known methods (in particular,
subexponential in log N). As far as I know, except for improved constants, this is still
true…
I did not suggest that the randomized version which I described would be competitive
in practice with algorithms which were currently in use (but most of which still have
no rigorous analysis). I do not think anyone has seriously tried to factor a large number
using random squares.
Dixon didn’t know the whole history when he published his 1981 paper, but he did include it in a
later paper.29 In what seems to be a theme with important work in cryptology in recent decades,
Dixons’s 1981 paper was rejected by the first journal to which he submitted it.30
base, there is a 1 in every position of our list that contained a B-smooth number. The numbers
represented by other values are discarded. This is an oversimplification; there are several shortcuts
that make the process run much quicker, but it conveys the general idea. The interested reader can
pursue the references for further details.
When Martin Gardner provided the first published description of RSA, he gave his readers a
chance to cryptanalyse the new system:
As a challenge to Scientific American readers the M.I.T. group has encoded another
message, using the same public algorithm. The ciphertext is:
9686 9613 7546 2206
1477 1409 2225 4355
8829 0575 9991 1245
7431 9874 6951 2093
0816 2982 2514 5708
3569 3147 6622 8839
8962 8013 3919 9055
1829 9451 5781 5154
Its plaintext is an English sentence. It was first changed to a number by the standard
method explained above, then the entire number was raised to the 9,007th power
(modulo r) by the shortcut method given in the memorandum. To the first person who
decodes this message the M.I.T. group will give $100.31
The modulus, which Gardner labeled as r, would now be written as n. In any case, it was the
following number, which became known as RSA-129, as it is 129 digits long.
114381625757888867669235779976146612010218296721242362562561842935
706935245733897830597123563958705058989075147599290026879543541
Gardner didn’t expect a solution in his lifetime, but a combination of increased computing power,
improved factoring algorithms, and Gardner’s own longevity resulted in his seeing a solution on
April 26, 1994. The factors were
3490529510847650949147849619903898133417764638493387843990820577
and
32769132993266709549961988190834461413177642967992942539798288533.
They were determined using a quadratic sieve and the plaintext turned out to be32

THE MAGIC WORDS ARE SQUEAMISH OSSIFRAGE
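The factorization itself is easy to check with arbitrary-precision integer arithmetic; the values below are copied from the text:

```python
# The two factors of RSA-129 reported above; their product should reproduce
# the 129-digit challenge modulus.
p = 3490529510847650949147849619903898133417764638493387843990820577
q = 32769132993266709549961988190834461413177642967992942539798288533
rsa_129 = int(
    "114381625757888867669235779976146612010218296721242362562561842935"
    "706935245733897830597123563958705058989075147599290026879543541")

print(p * q == rsa_129)   # True
print(len(str(rsa_129)))  # 129
```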
For numbers up to about 110 digits (in base 10), the quadratic sieve is the best general method
for factoring presently available. For larger values, the number field sieve is superior.33
31 Gardner, Martin, “Mathematical Games, A New Kind of Cipher That Would Take Millions of Years to Break,”
Scientific American, Vol. 237, No. 2, August 1977, pp. 120–124, text from p. 123, ciphertext from p. 121.
32 Hayes, Brian P., “The Magic Words are Squeamish Ossifrage,” American Scientist, Vol. 82. No. 4, July–August
1994, pp. 312–316.
33 Stamp, Mark and Richard M. Low, Applied Cryptanalysis: Breaking Ciphers in the Real World, John Wiley &
Sons, Hoboken, New Jersey, 2007, p. 316.
This sensational achievement announced to the world that Pollard’s number field sieve
had arrived.
A description of the algorithm requires a background in modern algebra that is beyond the scope
of this text; however, there are some elements in common with simpler methods. For example,
sieving remains the most time-consuming step in this improved algorithm.
In 2003, Adi Shamir (the “S” in RSA), along with Eran Tromer, published designs for special-
ized hardware to perform factorizations based on the number field sieve.36 They named it TWIRL,
which is short for The Weizmann Institute Relation Locator, with the Weizmann Institute being
their employer. “Relation Locator” refers to finding factoring relations in a matrix, like the one
described in Section 15.7. The pair estimated that $10 million worth of hardware would be suf-
ficient for a machine of this design to complete the sieving step for a 1,024-bit RSA key in less
than a year.
RSA remains as secure against factoring attacks as it was when it was first created. Improved fac-
toring techniques and improved hardware have simply forced users to use longer keys. If TWIRL
concerns you, simply use a 2,048-bit key. Your biggest concern, as seen in Section 15.1.12, is mak-
ing sure the primes used were generated in as random a manner as possible!
34 Lenstra, Arjen K., Hendrik W. Lenstra, Jr., Mark S. Manasse, and John M. Pollard, “The Number Field Sieve,”
in Lenstra, Arjen K. and Hendrik W. Lenstra, Jr., editors, The Development of the Number Field Sieve, Lecture
Notes in Mathematics, Vol. 1554, Springer, Berlin, Germany, 1993, pp. 11–42.
35 Pomerance, Carl, “A Tale of Two Sieves,” Notices of the American Mathematical Society, Vol. 43, No. 12,
December 1996, pp. 1473–1485, p. 1480 cited here.
36 Shamir, Adi and Eran Tromer, “Factoring Large Numbers with the Twirl Device,” in Boneh, Dan, editor,
Advances in Cryptology – CRYPTO 2003 Proceedings, Lecture Notes in Computer Science, Vol. 2729, Springer,
Berlin, Germany, 2003, pp. 1–27.
37 Shor, Peter, “Algorithms for quantum computation: discrete logarithms and factoring,” in Goldwasser, Shafi,
editor, 35th Annual IEEE Symposium on Foundations of Computer Science (FOCS), IEEE Computer Society
Press, Los Alamitos, California, 1994, pp. 124–134.
Lenstra, Arjen K., James P. Hughes, Maxime Augier, Joppe W. Bos, Thorsten Kleinjung, and Christophe
Wachter, “Ron was wrong, Whit is right,” February 2012, https://anonymous-proxy-servers.net/
paper/064.pdf.
Levy, Steven, Crypto: How the Code Rebels Beat the Government, Saving Privacy in the Digital Age, Viking,
New York, 2001.
Rivest, Ronald L., Adi Shamir, and Leonard Adleman, On Digital Signatures and Public-key Cryptosystems
(There was soon a title change to A Method for Obtaining Digital Signatures and Public-key Cryptosystems.
The date is the same for both.), MIT Laboratory for Computer Science Report MIT/LCS/TM 82,
Cambridge, Massachusetts, April 1977. This report later appeared as cited in the reference below.
Rivest, Ronald L., Adi Shamir, and Leonard Adleman, “A Method for Obtaining Digital Signatures and
Public-key Cryptosystems,” Communications of the ACM, Vol. 21, No. 2, February 1978.
Robinson, Sara, “Still Guarding Secrets after Years of Attacks, RSA Earns Accolades for its Founders,”
SIAM News, Vol. 36, No. 5, June 2003, pp. 1–4.
Simmons, Gustavus J., “A “Weak” Privacy Protocol Using the RSA Crypto Algorithm,” Cryptologia, Vol. 7,
No. 2, April 1983, pp. 180–182.
Wiener, Michael J., “Cryptanalysis of Short RSA Secret Exponents,” IEEE Transactions on Information
Theory, Vol. 36, No. 3, May 1990, pp. 553–558.
On Factoring
Bach, E. and J. Shallit, “Factoring with Cyclotomic Polynomials,” Mathematics of Computation, Vol. 52, No.
185, January 1989, pp. 201–209.
Dixon, John D., “Asymptotically Fast Factorization of Integers,” Mathematics of Computation, Vol. 36, No.
153, January 1981, pp. 255–260.
Dixon, John D., “Factorization and Primality Tests,” American Mathematical Monthly, Vol. 91, No. 6, June–
July 1984, pp. 333–352. See Section 11 of this paper for the historical background.
Lenstra, Jr., Hendrik W., “Factoring Integers with Elliptic Curves,” Annals of Mathematics, Vol. 126, No. 3,
November 1987, pp. 649–673.
Lenstra, Arjen K., Hendrik W. Lenstra, Jr., Mark S. Manasse, and John M. Pollard, “The Number Field
Sieve,” in Lenstra, Arjen K. and Hendrik W. Lenstra, Jr., editors, The Development of the Number
Field Sieve, Lecture Notes in Mathematics, Vol. 1554, Springer, 1993, pp. 11–42.
Lenstra, Arjen K., “Factoring,” in Tel, Gerard and Paul Vitányi, editors, Distributed Algorithms, WDAG
1994, Lecture Notes in Computer Science, Vol. 857, Springer, Berlin, Germany, 1994, pp. 28–38.
Lenstra, Arjen K., Eran Tromer, Adi Shamir, Wil Kortsmit, Bruce Dodson, James Hughes, and Paul
Leyland, “Factoring Estimates for a 1024-Bit RSA Modulus,” in Laih, Chi Sung, editor, Advances in
Cryptology – ASIACRYPT 2003 Proceedings, Lecture Notes in Computer Science, Vol. 2894, Springer,
Berlin, Germany, 2003, pp. 55–74. This was a follow-up paper to the one above.
Montgomery, Peter L. and Robert D. Silverman, “An FFT Extension to the P – 1 Factoring Algorithm,”
Mathematics of Computation, Vol. 54, No. 190, April 1990, pp. 839–854.
Morrison, Michael A. and John Brillhart, “A Method of Factoring and the Factorization of F7,” Mathematics
of Computation, Vol. 29, No. 129, January 1975, pp. 183–205. This paper describes factoring with
continued fractions.
Pollard, John M., “Theorems on Factorization and Primality Testing,” Proceedings of the Cambridge
Philosophical Society, Vol. 76, No. 3, November 1974, pp. 521–528.
Pollard, John M., “A Monte-Carlo Method for Factorization,” Bit Numerical Mathematics, Vol. 15, No. 3,
1975, pp. 331–334.
Pollard, John M., Home Page, https://sites.google.com/site/jmptidcott2/.
Pomerance, Carl, and Samuel S. Wagstaff, Jr., “Implementation of the Continued Fraction Integer Factoring
Algorithm,” Congressus Numerantium, Vol. 37, 1983, pp. 99–118.
Pomerance, Carl, “The Quadratic Sieve Factoring Algorithm,” in Beth, Thomas, Norbert Cot, and Ingemar
Ingemarsson, editors, Advances in Cryptology, Proceedings of EUROCRYPT 84, Lecture Notes in
Computer Science, Vol. 209, Springer, Berlin, Germany, 1985, pp. 169–182, available online at www.
math.dartmouth.edu/∼carlp/PDF/paper52.pdf.
Pomerance, Carl, “A Tale of Two Sieves,” Notices of the American Mathematical Society, Vol. 43, No. 12,
December 1996, pp. 1473–1485.
Shamir, Adi and Eran Tromer, “Factoring Large Numbers with the Twirl Device,” in Boneh, Dan, editor,
Advances in Cryptology – CRYPTO 2003 Proceedings, Lecture Notes in Computer Science, Vol. 2729,
Springer, Berlin, Germany, 2003, pp. 1–27.
Shor, Peter, “Algorithms for Quantum Computation: Discrete Logarithms and Factoring,” in Goldwasser,
Shafi, editor, 35th Annual IEEE Symposium on Foundations of Computer Science (FOCS), IEEE
Computer Society Press, Los Alamitos, California, 1994, pp. 124–134.
Williams, Hugh C. and Jeffrey O. Shallit, “Factoring Integers before Computers,” in Gautschi, Walter,
editor, Mathematics of Computation 1943–1993: A Half-Century of Computational Mathematics,
Proceedings of Symposia in Applied Mathematics, Vol. 48, American Mathematical Society, Providence,
Rhode Island, 1994, pp. 481–531.
Chapter 16

Primality Testing and Complexity Theory
Thus, even starting with the most fundamental and ancient ideas concerning prime
numbers, one can quickly reach the fringe of modern research. Given the millennia
that people have contemplated prime numbers, our continuing ignorance concerning
the primes is stultifying.
—Richard Crandall and Carl Pomerance1
∑ 1/p_i (the sum of the reciprocals of the primes)
1 Crandall, Richard E. and Carl Pomerance, Prime Numbers: A Computational Perspective, Springer, New York,
2001, pp. 6–7.
2 See Ribenboim, Paulo, The New Book of Prime Number Records, Springer, New York, 1996, pp. 3–18.
diverges. If this series converged, the number of primes could be infinite or finite, but divergence
leaves only one option. If there were finitely many primes, the series would have to converge. So,
there cannot be a largest prime. There is, however, a largest known prime. Table 16.1 lists the top
10 largest known primes. The meanings of GIMPS, Seventeen or Bust, and Mersenne in this table are explained in Section 16.4.2.
Table 16.1 Top 10 Largest Known Primes (as of October 12, 2020)
Rank Prime Digits Discoverer Year Reference
1   2^82589933 − 1           24,862,048   GIMPS              2018   Mersenne 51?
2   2^77232917 − 1           23,249,425   GIMPS              2018   Mersenne 50?
3   2^74207281 − 1           22,338,618   GIMPS              2016   Mersenne 49?
4   2^57885161 − 1           17,425,170   GIMPS              2013   Mersenne 48?
5   2^43112609 − 1           12,978,189   GIMPS              2008   Mersenne 47
6   2^42643801 − 1           12,837,064   GIMPS              2009   Mersenne 46
7   2^37156667 − 1           11,185,272   GIMPS              2008   Mersenne 45
8   2^32582657 − 1            9,808,358   GIMPS              2006   Mersenne 44
9   10223 × 2^31172165 + 1    9,383,761   Seventeen or Bust  2016
10  2^30402457 − 1            9,152,052   GIMPS              2005   Mersenne 43
Source: http://primes.utm.edu/largest.html.
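The digit counts in Table 16.1 can be checked without ever writing the primes out: since 2^p is not a power of 10, 2^p − 1 has ⌊p log10 2⌋ + 1 decimal digits. A quick check of three of the Mersenne entries (the claimed counts are taken from the table):

```python
import math

for p, digits_claimed in [(82589933, 24862048),
                          (77232917, 23249425),
                          (57885161, 17425170)]:
    digits = math.floor(p * math.log10(2)) + 1  # digits of 2^p - 1
    print(p, digits, digits == digits_claimed)
```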
Even though there are infinitely many primes, there are still arbitrarily long sequences of inte-
gers (without skipping any) that do not contain any primes. If you want n integers in a row, all of
which are composite (nonprime), here you are!
(n + 1)! + 2, (n + 1)! + 3, (n + 1)! + 4, …, (n + 1)! + n, (n + 1)! + (n + 1)
The first is divisible by 2, the second by 3, and so on.
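A quick check of this construction for a small n (the function name is mine):

```python
import math

def composite_run(n):
    """Return the n consecutive composites (n+1)! + 2, ..., (n+1)! + (n+1)."""
    f = math.factorial(n + 1)
    return [f + i for i in range(2, n + 2)]

run = composite_run(5)  # 6! = 720
print(run)              # [722, 723, 724, 725, 726]
# (n+1)! + i is divisible by i for each i from 2 to n+1, so none are prime
```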
The first gap between primes that contains 1,000 composites occurs right after the prime
1,693,182,318,746,371. This prime is followed by 1,131 composites. This fact was discovered by
Bertil Nyman, a Swedish nuclear physicist.3 Even though, gaps aside, we have plenty of primes to
choose from (recall there are infinitely many), they become increasingly rare as a percent of the
total, as the length of the number we desire grows.
The function π(n) is defined to be the number of primes less than or equal to n. Some sample
values are provided in the table below:
n π(n)
10 4
100 25
1,000 168
3 Caldwell, Chris K. and G. L. Honaker, Jr., Prime Curios! The Dictionary of Prime Number Trivia, CreateSpace,
Seattle, Washington, 2009, p. 218.
Primality Testing and Complexity Theory ◾ 467
So, 40% of the first 10 integers are prime, but only 16.8% of the first 1,000 are prime. π(n) may
be calculated for any value of n by simply testing the primality of each positive integer less than
or equal to n. However, if n is large, this is a very time-consuming method. It would be nice if
there were some expression of π(n) that is easier to evaluate such as p(n) = ⌊1.591 + 0.242n − 0.0000752n^2⌋. Recall that ⌊n⌋ denotes the greatest integer less than or equal to n. By plugging in
values, we see that p(n) works great for 10, 100, and 1,000, but it becomes inaccurate for higher
values of n, as can be seen by expanding the table above. Also, it doesn’t do well for most values
under 1,000, other than the ones in the table!4
n π(n)
10 4
100 25
1,000 168
10,000 1,229
100,000 9,592
1,000,000 78,498
10,000,000 664,579
100,000,000 5,761,455
1,000,000,000 50,847,534
10,000,000,000 455,052,511
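The breakdown of the interpolating polynomial is easy to see by computing both functions side by side (a sketch; the helper names are my choices, and π(n) is computed by brute force):

```python
def prime_pi(n):
    """pi(n): count the primes up to n by direct testing (fine for small n)."""
    return sum(1 for m in range(2, n + 1)
               if all(m % d for d in range(2, int(m ** 0.5) + 1)))

def p(n):
    """The interpolating polynomial from the text (int() floors positives)."""
    return int(1.591 + 0.242 * n - 0.0000752 * n * n)

for n in [10, 100, 1000, 2000]:
    print(n, prime_pi(n), p(n))
# Agrees at 10, 100, and 1,000, but at 2,000: pi = 303 while p gives 184
```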
German mathematician Carl Friedrich Gauss (1777–1855) came up with the prime number
theorem, but was unable to prove it. It says:
π(n) ∼ n/ln(n)

We saw the ∼ notation previously in Section 1.6, where Stirling's formula was presented. Recall that it is pronounced "asymptotically approaches" and means that the ratio of the two quantities, π(n) and n/ln(n) in this case, approaches 1 as n approaches infinity.
Gauss came up with another estimate, the logarithmic integral of n, which is written li(n). It
also asymptotically approaches π(n) as n → ∞, and it gives more accurate estimates when n is small.
li(n) = ∫_2^n dx/ln(x)
That these functions converge to π(n) was proven in 1896, independently, by Jacques Hadamard
and C. J. de la Vallée-Poussin. Their proofs used the Riemann Zeta function.
4 The function p(n) was obtained by using Lagrangian interpolation on the values for which it was seen to work
perfectly.
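Both estimates are easy to examine numerically. The sketch below approximates the integral with a simple midpoint rule (the step count and names are my choices) and compares against π(10^6) = 78,498 from the table above:

```python
import math

def li(n, steps=1_000_000):
    """Approximate li(n) = integral from 2 to n of dx / ln(x) (midpoint rule)."""
    h = (n - 2) / steps
    return h * sum(1 / math.log(2 + (k + 0.5) * h) for k in range(steps))

n = 1_000_000
print(round(n / math.log(n)))  # 72382: noticeably below pi(n) = 78,498
print(li(n))                   # about 78,626: a much better estimate
```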
Testing values, it appears that li(n) > π(n) for all n; however, this is not true. Although it is
ridiculously large, there is a point at which li(n) switches from being an overestimate to being an
underestimate. Stanley Skewes, a South African mathematician, investigated where this change-
over occurs. He could not establish an exact value, but he did find a bound. The switch takes place
somewhere before e^(e^(e^79)). This bound assumes that the Riemann hypothesis holds.5 Without this assumption, Skewes found the larger bound 10^(10^(10^963)).6 It was learned, before Skewes made his calculations, that li(n) eventually switches back to being an overestimate again. In fact, it switches
between an overestimate and an underestimate infinitely many times!7 For many years, Skewes’s
numbers held the record for being the largest numbers that ever served a useful purpose (i.e., used
in a proof), but they have since been dwarfed by other values. They’ve also been diminished in
another way—much smaller bounds have been found for where li(n) first transitions to an
underestimate.
Still the question remains, how can we find or generate large prime numbers? Primality testing
is concerned with deciding whether or not a given number is prime. For the quickest tests, the
revelation that a number is not prime is made without revealing any of the factors. They’re like
existence theorems for nontrivial, proper factors.
We first look at some probabilistic tests. These tests can sometimes prove a number is com-
posite, but they can never quite prove primality. The best they can do is suggest primality, with
arbitrarily high probabilities.
5 Skewes, Stanley, “On the Difference π(x) − Li(x),” Journal of the London Mathematical Society, Vol. 8 (Series 1),
No. 4, 1933, pp. 277–283.
6 Skewes, Stanley, “On the Difference π(x) − Li(x) (II),” Proceedings of the London Mathematical Society, Vol. 5
(Series 3), No. 1, 1955, pp. 48–70.
7 Littlewood, John Edensor, “Sur la Distribution des Nombres Premiers,” Comptes Rendus, Vol. 158, 1914,
pp. 1869–1872.
8 McGregor-Dorsey, Zachary Strider, “Methods of primality testing,” MIT Undergraduate Journal of Mathematics,
Vol. 1, 1999, pp. 133–141.
Example 1 (n = 391)
To calculate 2^390 modulo 391, we use the repeated squaring technique introduced in Section 14.3. We first calculate 2 to various powers of 2 modulo 391:
2^2 = 4
2^4 = 16
2^8 = 256
2^16 = 65,536 = 239 (mod 391)
2^32 = 57,121 = 35 (mod 391)
2^64 = 1,225 = 52 (mod 391)
2^128 = 2,704 = 358 (mod 391)
2^256 = 128,164 = 307 (mod 391)
and then multiply appropriate values to get the desired power:
2^390 = (2^2)(2^4)(2^128)(2^256) = (4)(16)(358)(307) = 7,033,984 = 285 (mod 391).

Because 2^390 did not simplify to 1, we can conclude that 391 is not prime.
This test gives us no indication what the factors of 391 are. Also, this test doesn’t always work so
nicely! The base 2 is nice to use for testing purposes, but it won’t unmask all composites.
Example 2 (n = 341)
2^340 = 1 (mod 341), so we cannot draw an immediate conclusion. We then check 3^340 (mod 341)
and get 56. We may now conclude that 341 is composite. Because 341 was able to sneak by the
base 2 test, we call 341 a base 2 pseudoprime. Base 3 revealed the composite nature of 341, but
sometimes neither 2 nor 3 will reveal composites.
Example 3 (n = 1,729)
For this example, 2^1,728 = 1 (mod 1,729), so we cannot draw an immediate conclusion. We then check
3^1,728 (mod 1,729) and get 1 again. We still cannot draw a conclusion. Continuing on with other
bases, we always get 1, if the base is relatively prime to 1,729. It’s tempting to conclude that 1,729 is
prime. But, because Fermat’s Little Theorem isn’t “if and only if,” we haven’t proven anything. In fact
there is strong evidence that 1,729 is not prime, such as the fact that 1,729 = (7)(13)(19)!
Using a base that’s less than 1,729, but not relatively prime to it, will reveal 1,729 to be com-
posite, but such a number would be a factor of 1,729. It would be quicker to use trial division, if
we need to find a base that is a factor of a number to prove it is composite.
A composite number n for which every base relatively prime to n yields 1 is called a Carmichael
number after Robert Carmichael (Figure 16.1).
For years it was an open problem whether there are infinitely many Carmichael numbers, but in
1994 it was shown that there are infinitely many.9 Carmichael found the first of the numbers that
9 Alford, W. R., A. Granville and Carl Pomerance, “There are Infinitely Many Carmichael Numbers,” Annals of
Mathematics, Vol. 139, No. 3, May 1994, pp. 703–722.
470 ◾ Secret History
would be named after him in 1910. The first few Carmichael numbers10 are 561, 1105, 1729, 2465,
2821, 6601, 8911, 10585,…
There are only 20,138,200 Carmichael numbers less than 10^21, which is about
0.000000000002% of the total.12 Thus, the odds that a randomly chosen number will be a
Carmichael number are very small. Nevertheless, we’d like to have a primality test that isn’t
fooled by these numbers.
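A direct check of the defining property reproduces the start of the list above. It is far too slow for numbers of cryptographic size, but fine for a sketch (the function names are my own):

```python
from math import gcd

def is_prime_trial(n):
    """Trial division, adequate for the small numbers used here."""
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

def is_carmichael(n):
    """Composite n with a^(n-1) ≡ 1 (mod n) for every a coprime to n."""
    if is_prime_trial(n):
        return False
    # all() short-circuits: most composites fail at a very small base.
    return all(pow(a, n - 1, n) == 1
               for a in range(2, n) if gcd(a, n) == 1)

print([n for n in range(2, 3000) if is_carmichael(n)])  # → [561, 1105, 1729, 2465, 2821]
```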
10 “A002997, Carmichael numbers: composite numbers n such that a^(n − 1) == 1 (mod n) for every a coprime to
n.” The On-Line Encyclopedia of Integer Sequences, http://oeis.org/A002997.
11 This website is the result of following a link from https://web.archive.org/web/20120214210612/http://www.
websitehome.co.uk/rgep/p82.pdf.
13 Weisstein, Eric W., “Primality Test,” MathWorld, A Wolfram Web Resource, https://mathworld.wolfram.com/
PrimalityTest.html refers to this as the Rabin-Miller test, as does Bruce Schneier in the second edition of
Applied Cryptography, p. 259. Richard A. Mollin adds a name to get Miller–Rabin-Selfridge. He explains that
John Selfridge deserves this recognition, as he was using the test in 1974, before it was published by Miller.
See Mollin, Richard A., An Introduction to Cryptography, Chapman & Hall / CRC, Boca Raton, Florida,
2001, p. 191. If you want to avoid names altogether, it is also referred to as the strong pseudoprimality test.
Whatever you choose to call it, the primary reference is Rabin, Michael O., “Probabilistic Algorithm for
Testing Primality,” Journal of Number Theory, Vol. 12, No. 1, February 1980, pp. 128–138.
must be odd.) We then pick an integer a < n. If n is prime, and a is relatively prime to n, then one
of the following must hold:
1. a^d = 1 (mod n)
2. a^(2^s · d) = −1 (mod n) for some 0 ≤ s ≤ t − 1
Thus, if neither holds true, we know n cannot be prime.
It is known that for every n that is composite, at least 75% of the choices for a will reveal that
fact via the test above. Passing the test for a particular base doesn’t prove the number is prime, but
the test can be repeated with different values for a. Passing the test for a different value represents
an independent event, so the probability of passing after m bases have been investigated is less than
(1/4)m, if n is composite. Thus, we can test until the probability is vanishingly small.
Example 4
Because n = 1729 caused trouble earlier, we’ll investigate this value with our new test. We have
n − 1 = 1728, which is divisible by 2^6, but not by 2^7, so t = 6. 1728/(2^6) = 27, so we have d = 27 and
(2^6)(27) = n − 1. We’re now ready to investigate condition 1, above. Picking a = 2 and calculating
a^d (mod n), we get 2^27 = 645 (mod 1,729). Because this is not 1, condition 1 fails, and we move on
to condition 2, repeatedly squaring to obtain a^(2^s · d) (mod n) for each value of s:

s    a^(2^s · d) (mod n)
0    645 (this was already calculated in the step above)
1    1,065
2    1
3    1
4    1
5    1

None of the values for a^(2^s · d) (mod n) yield −1, so condition 2 is not satisfied. Because neither
condition 1 nor condition 2 holds, 1,729 fails the Miller–Rabin test and cannot be prime.
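The whole test takes only a few lines of code. The sketch below (the function name is my own) reproduces the verdict of Example 4 and, unlike the Fermat test of Example 2, unmasks 341 with base 2 alone:

```python
def miller_rabin(n, a):
    """One round of the Miller–Rabin test on odd n > 2 with base a.
    Returns False if a proves n composite, True if n passes for this base."""
    t, d = 0, n - 1
    while d % 2 == 0:          # factor n - 1 as 2^t * d with d odd
        d //= 2
        t += 1
    x = pow(a, d, n)
    if x == 1 or x == n - 1:   # condition 1, or condition 2 with s = 0
        return True
    for _ in range(t - 1):     # condition 2 with s = 1, ..., t - 1
        x = (x * x) % n
        if x == n - 1:
            return True
    return False

print(miller_rabin(1729, 2))   # → False: 1,729 is revealed to be composite
print(miller_rabin(341, 2))    # → False: the base 2 pseudoprime is caught too
print(miller_rabin(29, 2))     # → True
```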
Because there are guaranteed to be bases that reveal n to be composite, if it is composite, we
could make this a deterministic test by testing every base less than the number. However, that
would be silly, as the time required to do so would well exceed the time needed to test for primality
by trial division; hence, we refer to the Miller–Rabin test as a probabilistic test. Rabin and Miller
are pictured in Figures 16.2 and 16.3. Why is Rabin smiling? Perhaps he could see the future, as
revealed in the caption to Figure 16.2.
Naturally, some enjoy the challenge of finding numbers that fool primality tests. François
Arnault did so for the Miller–Rabin test, as implemented by the computer algebra system
Figure 16.2 Michael O. Rabin (1931–). Rabin split the $1 million Dan David Prize with two
others. (From SEAS, Michael O. Rabin Wins Dan David Prize: Computer Science Pioneer Shares
$1 Million Prize for Outstanding Achievements in the Field [press release], Harvard School of
Engineering and Applied Science, February 16, 2010.)
ScratchPad.14 Implementations such as this apply the test to a small set of bases. The composite
number that squeaked by all of these was 1195068768795265792518361315725116351898245581.
However, Arnault noted that after he found this exception ScratchPad was improved, and
renamed Axiom, with a new primality test that isn’t just Miller–Rabin, and which recognizes the
number above as composite.
14 Arnault, François, “Rabin-Miller Primality Test: Composite Numbers Which Pass It,” Mathematics of
Computation, Vol. 64, No. 209, January 1995, pp. 355–361.
It is an intriguing possibility that there may be a hybrid system that catches all composites.
In other words, we may have two tests, each of which misses certain numbers, but if there is no
overlap between these sets of missed numbers then passing both tests guarantees primality.
15 Waring, Edward, Meditationes Algebraicae, Cambridge University Press, Cambridge, UK, 1770.
16 In 1773, according to Weisstein, Eric W., “Wilson’s Theorem.” MathWorld, A Wolfram Web Resource, http://
mathworld.wolfram.com/WilsonsTheorem.html.
17 Adleman, Leonard, Carl Pomerance, and Robert S. Rumely, “On Distinguishing Prime Numbers From
Composite Numbers,” Annals of Mathematics, Vol. 117, No. 1, January 1983, pp. 173–206.
Institute of Technology in Kanpur.18 The students were in graduate school when the proof was
completed, but were undergraduates when most of the work was done.19
Just as the encryption scheme developed by Rivest, Shamir, and Adleman became known as
RSA, so the primality test of Agrawal, Kayal, and Saxena became known as AKS (Figure 16.5).
Figure 16.5 The AKS Algorithm (From Agrawal, Manindra, Neeraj Kayal, and Nitin Saxena,
“PRIMES Is in P,” Annals of Mathematics, Second Series, Vol. 160, No. 2, September 2004, pp.
781–793, p. 784 cited here. With permission.)
• O_r(n) is the order of n modulo r; that is, the smallest value k such that n^k = 1 (mod r).
• ϕ is the Euler phi function. See Section 14.3 of this book.
• log denotes a base 2 logarithm.
18 It was posted online in August 2002, and appeared in print over two years later, in September 2004. See
Agrawal, Manindra, Neeraj Kayal, and Nitin Saxena, “PRIMES is in P,” Annals of Mathematics, Second Series,
Vol. 160, No. 2, September 2004, pp. 781–793, available online at http://www.math.princeton.edu/∼annals/
issues/2004/Sept2004/Agrawal.pdf.
19 Aaronson, Scott, The Prime Facts: From Euclid to AKS, http://www.scottaaronson.com/writings/prime.pdf,
2003, p. 10.
• (mod X^r − 1, n) means divide by the polynomial X^r − 1 and take the remainder, and also
reduce all coefficients modulo n.
Step 5 in Figure 16.5 makes use of what is sometimes called “freshman exponentiation.” Usually a
student who expands out (X + a)^n as X^n + a^n will lose points, but the work is correct, if the
expansion is carried out modulo n, where n is a prime. In fact, we can simplify further, if a is
relatively prime to the modulus n. In that case, Fermat’s little theorem tells us a^n = a (mod n).
Thus, when n is prime, we have (X + a)^n = X^n + a. If equality doesn’t hold, n must be composite.
A close look at Step 5 reveals that we are not just moding out by n, but also by X^r − 1. This helps
to speed up what would otherwise be a time-consuming calculation, as we cannot apply the
shortcuts described above without knowing n is prime!
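The Step 5 congruence can be checked directly by doing polynomial arithmetic mod (X^r − 1, n). In the sketch below (the coefficient-list representation and function names are my own), a polynomial is a list of r coefficients, repeated squaring is done on polynomials, and the congruence is tested for the prime 29 with r = 24 (the value we pretend to use in the continuation of Example 6 below) and for the composite 77:

```python
def polymul(p, q, r, n):
    """Multiply polynomials (coefficient lists of length r),
    reducing exponents mod X^r - 1 and coefficients mod n."""
    out = [0] * r
    for i, pi in enumerate(p):
        if pi:
            for j, qj in enumerate(q):
                out[(i + j) % r] = (out[(i + j) % r] + pi * qj) % n
    return out

def step5_holds(n, r, a):
    """Does (X + a)^n ≡ X^n + a (mod X^r - 1, n)?  True whenever n is prime."""
    base = [0] * r
    base[0], base[1] = a % n, 1          # the polynomial X + a
    result = [0] * r
    result[0] = 1                        # the polynomial 1
    e = n
    while e:                             # repeated squaring, on polynomials
        if e & 1:
            result = polymul(result, base, r, n)
        base = polymul(base, base, r, n)
        e >>= 1
    target = [0] * r
    target[0] = a % n
    target[n % r] = (target[n % r] + 1) % n   # X^n reduces to X^(n mod r)
    return result == target

print(step5_holds(29, 24, 1))   # prime: the congruence holds
print(step5_holds(77, 41, 2))   # composite: the congruence fails
```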
Example 5 (n = 77)
1. We cannot express n in the form a^b, where a is a natural number and b is an integer greater than 1, so we
cannot draw a conclusion at this point.
2. Find the smallest r such that O_r(n) > log^2 n = (log 77)^2 ≈ (6.266787)^2 ≈ 39.27.
We need the order of 77 modulo r to be 40 or higher. The order of an element divides the
order of the group, so we need a group with 40 or more elements. We start with r = 41,
because the (multiplicative) group of integers modulo 41 has 40 elements, and find that it
works. The order of 77 (mod 41) is 40.
3. Now check if 1 < (a, 77) < 77 for some a ≤ r = 41.
This happens for a = 7 (and other values), so we stop and declare 77 to be composite. It may
seem like a silly way to test, because step 3, by itself, takes longer than trial division would,
but remember that this is just a small example to illustrate how the test works. For numbers
of the size we’re interested in, this test is quicker than trial division.
Example 6 (n = 29)
1. We cannot express n in the form a^b, where a is a natural number and b is an integer greater than 1, so we
cannot draw a conclusion at this point.
2. Find the smallest r such that O_r(n) > log^2 n = (log 29)^2 ≈ (4.85798)^2 ≈ 23.6.
We need the order of 29 modulo r to be 24 or higher. The order of an element divides the
order of the group, so we need a group with 24 or more elements. The first candidate, r = 31,
fails: its group has 30 elements, but the order of 29 (mod 31) is only 10. The smallest r
that works turns out to be r = 41. The (multiplicative) group of integers modulo 41 has 40
elements, and the order of 29 (mod 41) is 40.
3. Now check if 1 < (a, 29) < 29 for some a ≤ r = 41. There is no such a.
4. Because n < r, we conclude n is prime.
Rather than roll out a third example to show how step 5 may come to be used, we continue
with Example 6, pretending that r didn’t exceed n. To do so, we’ll pretend r was 24.20
20 We couldn’t use r = 24 in Example 6, because the (multiplicative) group of integers modulo 24 (after discarding
the values that don’t have inverses) consists of only eight elements.
16.4.2 GIMPS
Some numbers are easier to test for primality than others. Mersenne numbers, which have the
form 2^n − 1, are currently the easiest. For values of n, such as n = 2 or 3, that yield primes, we call
the numbers Mersenne primes, after the French mathematician Marin Mersenne (1588–1648).
The ease of testing such numbers is why the top 8 largest known primes are all Mersenne primes.
Another advantage numbers of this form have, when it comes to a chance of making it on the top
10 list, is that anyone with a computer may download a program that allows him or her to join
in the testing. The program, Great Internet Mersenne Prime Search (GIMPS, for short), allows
people to donate otherwise idle time on their personal computers to testing numbers of the form
2^n − 1 for primality. In the top 10 list, the number referenced as Mersenne 47 is known to be
the 47th Mersenne prime. By contrast, the number referenced as Mersenne 48? is known to be a
Mersenne prime, but it might not be the 48th such number. That is, mathematicians have yet to
prove that there is no Mersenne prime between Mersenne 47 and this number.
There is one prime on the current top 10 list that is not a Mersenne prime and wasn’t dis-
covered through GIMPS. It was credited to “Seventeen or Bust,” which is another program that
anyone may download to help search for primes taking a special form, in this case the form
k · 2^n + 1 for certain values of k.22
Primes having special forms should not be considered for cryptographic purposes. Also, bigger
isn’t always better. If your RSA modulus has 578,028,320,322,400 digits, how long do you think
it will take an attacker to figure out what two primes you multiplied together, when there are so
few primes known with over 20 million digits? Proving extremely large numbers to be prime is
not done for the sake of applications, but rather for love of the game. If you need to get something
useful out of the search for record-breakingly large primes, how about $100,000?
21 http://www.cse.iitk.ac.in/users/manindra/.
22 For details of Seventeen or Bust and why some mathematicians care, see https://en.wikipedia.org/wiki/
Seventeen_or_Bust, https://primes.utm.edu/bios/page.php?id=429, and http://www.prothsearch.com/sierp.
html.
Perhaps the most valuable prime ever found was 2^43,112,609 − 1. When Edson Smith
found this mathematical gem in 2008, it was the first one ever found with more
than ten million digits, so he won $100,000. He used software provided by the Great
Internet Mersenne Prime Search (GIMPS), so he will share the prize with them. And
what of his part? He will give that money to University of California at Los Angeles’
(UCLA’s) mathematics department. It was their computers he used to find the prime.23
23 Caldwell, Chris and G. L. Honaker, Jr., Prime Curios! The Dictionary of Prime Number Trivia, CreateSpace,
Seattle, Washington, 2009, p. 243.
24 Hartmanis, Juris and Richard E. Stearns. “On the Computational Complexity of Algorithms,” Transactions of
the American Mathematical Society, Vol. 117, No. 5, May 1965, pp. 285–306.
25 Karp, Richard M., “Reducibility Among Combinatorial Problems,” in Miller, Raymond E., James W. Thatcher,
and Jean D. Bohlinger, editors, Complexity of Computer Computations, Plenum, New York, 1972, pp. 85–103.
Another complexity class is labeled NP-hard. These problems are at least as hard as the
NP-complete problems, but there doesn’t need to be a way to verify solutions in polynomial time.
Thus, NP-hard properly contains NP-complete.
Thousands of problems are now known to be NP-complete. A few examples follow.
1. Traveling salesman problem — Suppose a traveling salesman wants to visit every state capital
in the United States by car. What path would be the shortest? We could solve this problem
by considering all of the possibilities. If we allow the salesman to start anywhere, there are
50! possible routes, so although the solution is among them, we’re not likely to find it.
2. Knapsack problem — The knapsack problem (aka the subset sum problem) consists of finding
a selection of numbers from a given set such that the sum matches some desired value. For
example, if our set is S = {4, 8, 13, 17, 21, 33, 95, 104, 243, 311, 400, 620, 698, 805, 818,
912} and we wish to find values that sum to 666, we may take 620 + 21 + 17 + 8. There may
be no solution. In our example, we cannot obtain a sum of 20. However, a solution may be
found, or its nonexistence demonstrated, simply by forming all possible subsets of S and
checking their sums. This is clearly not practical for large S, because the number of subsets
grows exponentially with the size of S (a set of n elements has 2^n subsets). The name of this
problem comes from imagining the desired value as the size of a knapsack that we wish to
completely fill, with the numbers representing the sizes of objects.
3. Hamiltonian graph problem — This is similar to the first example. Imagine the salesman is
restricted from traveling directly from some cities to other cities. Pretend, for example, the
highway from Harrisburg, PA to Annapolis, MD is one-way! If the salesman wants to go
from Annapolis to Harrisburg, he must first head to some other capital. We’ll investigate
this problem in greater detail in Section 21.3.
4. Decoding linear codes — Let M be an m × n matrix of 0s and 1s. Let y be a vector with n com-
ponents (i.e., an n-tuple), each of which is either 0 or 1. Finally, let k be a positive integer.
The time-consuming question is this: Is there a vector x with m components, each of which
is 0 or 1, but with no more than k 1s such that xM = y (mod 2)?26 Robert McEliece turned
this into a public key cryptosystem in 1978.27
5. Tetris — Yes, the addictive game is NP-complete. A team of three computer scientists proved
this in 2002.28
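The exhaustive search described for the knapsack problem is easy to write down, which makes the exponential blow-up easy to see. The sketch below uses the set S from example 2 above; note that the search may return a different valid selection than the one given in the text:

```python
from itertools import combinations

S = [4, 8, 13, 17, 21, 33, 95, 104, 243, 311, 400, 620, 698, 805, 818, 912]

def subset_sum(values, target):
    """Exhaustive search over all 2^n subsets, smallest subsets first."""
    for size in range(len(values) + 1):
        for combo in combinations(values, size):
            if sum(combo) == target:
                return combo
    return None

print(subset_sum(S, 666))   # → (13, 33, 620), another valid answer besides 8 + 17 + 21 + 620
print(subset_sum(S, 20))    # → None: no subset sums to 20
```

For 16 elements there are only 65,536 subsets, but each added element doubles the count, which is exactly the exponential growth the text describes.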
Some of the NP-complete problems have special cases that are easy to solve. This doesn’t matter.
Even if almost all instances of a problem can be solved rapidly, a problem could be classified as
NP-complete or NP-hard or EXP. Complexity theory considers worst cases and is not concerned
with the time required on average or in the best case.
It is possible that polynomial time solutions exist for all NP problems, and that mathemati-
cians have just not been clever enough to find them. The fact that deterministic primality testing
26 Talbot, John and Dominic Welsh, Complexity and Cryptography An Introduction, Cambridge University Press,
Cambridge, UK, 2006, pp. 162–163.
27 McEliece, Robert J., A Public-Key Cryptosystem Based on Algebraic Coding Theory, Deep Space Network Progress
Report 42-44, Jet Propulsion Laboratory, California Institute of Technology, January and February 1978, pp.
114–116.
28 Demaine, Erik D., Susan Hohenberger, and David Liben-Nowell, “Tetris is Hard, Even to Approximate,” in
Warnow, Tandy and Binhai Zhu, editors, Computing and Combinatorics, 9th Annual International Conference
(COCOON 2003), Lecture Notes in Computer Science Vol. 2697, 2003, Springer, Berlin, Germany, pp.
351–363.
resisted being placed in P until the 21st century can be seen as evidence of this; however, it is widely
believed that P ≠ NP.
A proof showing either P = NP or P ≠ NP is considered the Holy Grail of computer science.
This problem was one of the seven Millennium Prize Problems, so a proof that withstands peer
review will net the author a $1,000,000 prize from the Clay Mathematics Institute.29
Conferences are a great place to learn a bit of mathematics outside one’s own specialty. At
the 2008 Joint Mathematics Meetings in San Diego, I greatly enjoyed the AMS Josiah Willard
Gibbs lecture delivered by Avi Wigderson of the Institute for Advanced Study. It was titled
“Randomness—A Computational Complexity View.” The main result was likely well-known to
experts in complexity theory, but was new to me. It follows the conjectures below.
Conjecture 1: P ≠ NP. That is, some NP problems require exponential time/size. This seems
likely. As there are now thousands of NP-complete problems that have been heavily studied and
thus far resisted polynomial time solutions, it would be surprising if such solutions exist for them,
as well as all other NP problems.
Conjecture 2: There are problems that can be solved in polynomial time with probabilistic algo-
rithms, but not with deterministic algorithms. Again, this seems very reasonable. At first, the only poly-
nomial time algorithms for primality testing were probabilistic. Eventually a deterministic algorithm
was found, but it would be surprising if this could always be done. After all, the weaker probabilistic
tests don’t give as firm an answer, so one would expect them to be quicker in at least some cases.
And now for the shocker — there’s a theorem (it’s been proven!) that one of these conjectures
must be wrong.30 Unfortunately, the theorem doesn’t tell us which one. Because we can have at most
one of the above conjectures, I’d bet on P ≠ NP. If this is true, the conclusion we can draw from
the negation of conjecture 2 is that probabilistic algorithms aren’t as powerful as they seem.
There are many other complexity classes. The above is not intended to be a survey of the field,
but rather an introduction to some concepts that are especially relevant to cryptology.
XOR Lemma,” in Proceedings of the 29th annual ACM Symposium on Theory of Computing (STOC 1997), ACM
Press, New York, 1997, pp. 220–229, available online at https://dl.acm.org/doi/pdf/10.1145/258533.258590.
31 Gordon, John, “Terms Used in the Disciplines of Cryptography, IT Security and Risk Analysis,” Journal of
Craptology, Vol. 0, No. 0, December 1998. See http://www.anagram.com/jcrap/ for more information on this
humorous journal, including the complete contents.
Figure 16.6 Ralph Merkle, Martin Hellman, and Whitfield Diffie. (http://engineering.stanford.
edu/about/images/memories/pop_timemag.jpg Copyright Chuck Painter/Stanford News Service).
Merkle was enrolled in CS 244 Computer Security at the University of California at Berkeley
in the fall of 1974, his last semester as an undergraduate. This course required a project. Each
student had to submit two proposals and the professor would then use his broader experience to
help steer the students in the right direction. For Project 1, Merkle proposed a scheme for public
key cryptography, something that had not yet been done. The work of Diffie and Hellman was
discussed earlier in this text, but it came later, historically. Merkle was the first. The first page of
his proposal is reproduced in Figure 16.7. Be sure to read the professor’s comment at the top.32
This proposal continued almost to the end of a sixth page. Merkle’s second project proposal, by
contrast, weighed in at only 22 words, not counting the final sentence, “At this point,
I must confess, that I am not entirely thrilled by the prospect of engaging in this project, and will
expand upon it only if prodded.”
Following his professor’s negative reaction to his proposal, Merkle rewrote it, making it shorter
and simpler. He showed the rewrite to the professor, but still failed to convince him of its value.
Merkle then dropped the class, but didn’t give up on his idea. He showed it to another faculty
member who said, “Publish it, win fame and fortune!”33
So, in August of 1975, Merkle submitted a paper to Communications of the ACM, but a reviewer
wrote the following to the editor:34
I am sorry to have to inform you that the paper is not in the main stream of pres-
ent cryptography thinking and I would not recommend that it be published in the
Communications of the ACM.
32 The full proposal may be found at Merkle, Ralph C., Publishing a New Idea, http://www.merkle.com/1974/.
This is the source for the page reproduced here.
33 Merkle, Ralph C., Publishing a New Idea, http://www.merkle.com/1974/.
34 Merkle, Ralph C., Publishing a New Idea, http://www.merkle.com/1974/.
The editor, in her rejection letter to Merkle, added that she “was particularly bothered by the fact
that there are no references to the literature.”35 There couldn’t be any references to the literature,
because it was a brand-new idea! On a personal note, knowing that reviewers will sometimes look
at a paper’s reference list before reading it, to see if the author has “done his homework,” and then
factor that into their decision, I always make sure my papers have many references.
Merkle didn’t give up. He revised the paper and eventually (almost three years later!) it was
published in Communications of the ACM.36 By this time, Merkle was not the first to publish on
the topic of public key cryptography, although he was the first to conceive it, write it up, and sub-
mit it. What lessons can we learn from this? I think there are three.
1. Undergraduates can make important contributions. We’ve seen that repeatedly in the his-
tory of cryptology.
2. Be persistent. If Merkle had let the negative reactions of his professor, the reviewer, and the
editor discourage him, his discovery would never have been recognized.
3. Communication skills are important, even for math and computer science students. It
doesn’t do you any good to be the deepest thinker on a topic, if you cannot eventually get
your ideas across to duller minds. I think writing skills are important, even if the under-
graduates I know don’t all agree. Merkle’s professor eventually became aware that he had
rejected a good idea. He blamed his error on a combination of “Merkle’s abstruse writing
style and his own failings as a mathematician.”37
In Merkle’s system, two people wishing to communicate over an insecure channel who have not
agreed on a key ahead of time, may come to an agreement over the insecure channel. Merkle
referred to these people as X and Y (this was before Alice and Bob came on the scene). Person
X sends person Y “puzzles.” These puzzles can take the form of enciphered messages, using any
traditional symmetric system. Merkle’s example was Lucifer, a precursor to DES. The key can
be weakened by only using 30 bits of it, for example, and fixing the rest. It should be weakened
enough that the recipient can break it with some effort. It’s assumed that no method quicker than
brute-force is available for solving these puzzles.
Upon receiving N of these puzzles, person Y tries to break one. Whichever puzzle Y selects,
the plaintext will be a unique puzzle number and a unique key intended for use with a traditional
system. Y then sends the puzzle number back to X. Having kept a list of the puzzle numbers and
corresponding keys that he sent to Y, X now knows what key Y has revealed by solving that par-
ticular puzzle. Now that both X and Y have agreed on a key, they may use it to converse securely
over the insecure channel. The only information an eavesdropper will have picked up is what
puzzle number led to the key, not the key itself. The eavesdropper could solve all of the puzzles and
thereby uncover the key, but if N is large, this may not be practical. On average, an eavesdropper
would find the key after cracking half of the messages.
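The whole exchange can be sketched in a few lines. In the toy below, the cipher, the 12-bit puzzle-key size, the recognizable marker, and all the names are my own illustrative choices, standing in for the weakened Lucifer keys of Merkle’s example:

```python
import hashlib
import random

PUZZLE_KEY_BITS = 12  # deliberately tiny key space, so Y can brute-force one puzzle

def weak_encrypt(puzzle_key, data):
    """Toy cipher: XOR with a keystream derived from the small puzzle key."""
    stream = hashlib.sha256(puzzle_key.to_bytes(2, "big")).digest()
    return bytes(d ^ s for d, s in zip(data, stream))

weak_decrypt = weak_encrypt   # XOR is its own inverse

def make_puzzles(count):
    """X builds the puzzles, keeping the (puzzle number -> session key) table."""
    table, puzzles = {}, []
    for pid in range(count):
        session_key = random.getrandbits(32)
        table[pid] = session_key
        plaintext = pid.to_bytes(4, "big") + session_key.to_bytes(4, "big")
        puzzle_key = random.getrandbits(PUZZLE_KEY_BITS)
        puzzles.append(weak_encrypt(puzzle_key, b"PUZZLE" + plaintext))
    return table, puzzles

def solve_one(puzzle):
    """Y brute-forces the small key space until the marker appears."""
    for guess in range(2 ** PUZZLE_KEY_BITS):
        candidate = weak_decrypt(guess, puzzle)
        if candidate.startswith(b"PUZZLE"):
            pid = int.from_bytes(candidate[6:10], "big")
            session_key = int.from_bytes(candidate[10:14], "big")
            return pid, session_key
    raise ValueError("no key found")

table, puzzles = make_puzzles(100)
pid, session_key = solve_one(random.choice(puzzles))
assert table[pid] == session_key   # Y announces pid; X looks up the same key
```

Only pid crosses the channel in the clear, so an eavesdropper must expect to solve about half of the N puzzles to find the session key, while Y solves just one.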
One of the seven references in the published version of Merkle’s paper was “Hiding Information
and Receipts in Trap Door Knapsacks” by Merkle and Hellman, which had been accepted to
appear in IEEE Transactions on Information Theory. We’ll now examine the idea presented in that
paper.
S = {4, 8, 13, 17, 21, 33, 95, 104, 243, 311, 400, 620, 698, 805, 818, 912}
Given a selection of elements of S, their sum is easily computed, but the inverse problem
appears intractable (not in P), so we’ll use it for a one-way function.
Our toy example for S has 16 elements. We can represent any combination of them with a
16-bit string. For example 0000000000010100 represents 620 + 805 = 1425. Defining a function
f that takes such strings to the sums they represent, we can write f(0000000000010100) = 1425.
Now, if we wish to send the message HELP IS ON THE WAY, we may convert each of the let-
ters to their ASCII bit representations and encipher in 16-bit blocks.38 Of course, this is just a fancy
way of making a digraphic substitution (as each character is 8 bits) and it is not a public key system.
Also, we need to be careful! Is it possible that 1425 has more than one solution? If so, decipherment
will not be unique and the recipient will have to make choices. If S is chosen carefully, this problem
can be avoided. The trick is to make the elements of S such that, when ordered from least to greatest,
the value of every element exceeds the sum of all those that came before.39 However, this ruins our
one-way function. Although each number that can be obtained from summing elements of S now
has only one such solution, it is easy to find—we simply apply the greedy algorithm of taking the
largest element of S that doesn’t exceed our given number and, after subtracting it from our desired
total, repeat the process as many times as necessary until we get to zero.
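The greedy algorithm just described fits in a few lines. The sketch below (the function name and the small superincreasing set are my own, chosen just for illustration):

```python
def solve_superincreasing(knapsack, total):
    """The greedy algorithm from the text: repeatedly take the largest
    element that doesn't exceed what remains. Correct only when the
    knapsack is superincreasing."""
    chosen = []
    for value in sorted(knapsack, reverse=True):
        if value <= total:
            chosen.append(value)
            total -= value
    if total != 0:
        raise ValueError("no subset of the knapsack gives this sum")
    return sorted(chosen)

# A hypothetical superincreasing set: each element exceeds the sum of all
# the elements before it.
S = [2, 3, 7, 14, 30, 57, 120, 251]
print(solve_superincreasing(S, 159))   # → [2, 7, 30, 120]
```

One pass through the sorted set suffices, which is exactly why a superincreasing knapsack is useless as a one-way function on its own.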
Fortunately, there is a way to disguise our knapsack so that an attacker will not be able to take
advantage of this simple approach. We multiply every element in our knapsack by some number m
and then reduce the result modulo n. The modulus needs to exceed the sum of the elements of S,
and m should be relatively prime to n, to guarantee m−1 exists modulo n. The disguised knapsack
serves as the public key and is used as described above. The private portion of the key is m −1 and
n. It is also a good idea to keep m secret, but it isn’t needed for decryption. The recipient simply
multiplies each block by m −1 modulo n and uses this value to solve the knapsack problem using the
fast algorithm that goes with the original superincreasing knapsack. The answer he gets will also
work for the value sent using the disguised knapsack, and thus yield the desired plaintext.
Example 7
We start with the superincreasing knapsack
S = {5, 7, 15, 31, 72, 139, 274, 560, 1659, 3301, 6614, 13248, 26488, 53024, 106152, 225872}
The total of our knapsack elements is 437461. Because n must exceed this sum, we take n =
462193. Now we pick an m relatively prime to n, such as m = 316929. Multiplying every element
in our knapsack by m (mod n) gives the disguised knapsack
38 All you need to know to understand what follows is that ASCII assigns values to each character and those for
the capital letters are A = 65, B = 66, C = 67,…, Z = 90. These values may then be converted to binary (base 2).
39 The technical term for such a set is superincreasing.
mS = {198066, 369731, 132005, 118746, 171431, 144796, 408455, 460321,
271770, 239870, 123151, 114180, 3893, 430202, 80931, 10862}
We then randomly scramble the order of our knapsack to further disguise it, using the key
10, 7, 14, 2, 8, 5, 16, 11, 9, 6, 15, 12, 1, 3, 4, 13
to get:
mS = {239870, 408455, 430202, 369731, 460321, 171431, 10862, 123151,
271770, 144796, 80931, 114180, 198066, 132005, 118746, 3893}
We can now reveal the elements of mS and the value of the modulus n to the world and anyone
wanting to send a message can, but to decipher, we need m −1 (mod n). We may find this by using
the Euclidean algorithm, as demonstrated in Section 14.3. We get m −1 = 304178.
If someone wants to send the message WELL DONE, he or she would have to encipher the
letters in pairs. The first pair WE is represented in ASCII by 87 and 69. In binary this is 01010111
and 01000101. Running all 16 bits together we have 0101011101000101. The positions of the 1s
indicate that we should take the reordered mS knapsack values in positions 2, 4, 6, 7, 8, 10, 14,
and 16 and add them together. We get

408455 + 369731 + 171431 + 10862 + 123151 + 144796 + 132005 + 3893 = 1364324

Reducing this modulo n = 462193, we get 439938 as the final ciphertext, C. The rest of the
pairs of message letters may be enciphered in the same way.
Once the enciphered message is received, the recipient begins the decipherment process by
multiplying the first portion of the ciphertext by m −1 (mod n), where n = 462193. This is m −1C =
(304178)(439938) = 133819460964 = 259481 (mod n).
The recipient then turns to the original knapsack
S = {5, 7, 15, 31, 72, 139, 274, 560, 1659, 3301, 6614, 13248, 26488, 53024, 106152, 225872}
and finds values that sum to 259481 using the greedy algorithm of looking for the largest number
in the knapsack that doesn’t exceed 259481. Once it is found, that number is subtracted from
259481 and the search is repeated using the reduced number. The greedy algorithm selects 225872,
26488, 6614, 274, 139, 72, 15, and 7. Reordering S with the same key used to scramble the public
knapsack gives

S = {3301, 274, 53024, 7, 560, 72, 225872, 6614, 1659, 139, 106152, 13248, 5, 15, 31, 26488}

The knapsack values used to get the sum 259481 are now in positions 7, 16, 8, 2, 10, 6, 14, and
4. Ordering these, the recipient has 2, 4, 6, 7, 8, 10, 14, and 16 and constructs a 16-bit number
with 1s in those positions. This is 0101011101000101. Splitting into bytes, converting to base 10,
and finally to the letters those numbers represent in ASCII, the recipient gets 01010111 01000101
= 87 69 = WE.
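For readers who want to experiment, the entire worked example can be reproduced in a few lines of Python, a sketch using only the numbers given above; the multiplier m itself is recovered from its published inverse with Python's three-argument pow:

```python
# A sketch of the worked example above, using only numbers from the text.
# The secret multiplier m is recovered from its published inverse.
S = [5, 7, 15, 31, 72, 139, 274, 560, 1659, 3301, 6614, 13248,
     26488, 53024, 106152, 225872]        # private superincreasing knapsack
perm = [10, 7, 14, 2, 8, 5, 16, 11, 9, 6, 15, 12, 1, 3, 4, 13]
n, m_inv = 462193, 304178
m = pow(m_inv, -1, n)                     # the secret multiplier

# Public knapsack mS: multiply each element by m (mod n), then reorder
mS = [(m * S[p - 1]) % n for p in perm]

def encrypt_pair(pair):
    """Encipher two ASCII characters as a single knapsack sum (mod n)."""
    bits = ''.join(format(ord(c), '08b') for c in pair)
    return sum(v for b, v in zip(bits, mS) if b == '1') % n

def decrypt_block(C):
    """Recover two characters from one block, using the greedy algorithm."""
    target = (m_inv * C) % n
    reordered = [S[p - 1] for p in perm]  # S in the public ordering
    bits = ['0'] * 16
    for value in sorted(S, reverse=True): # largest value not exceeding target
        if value <= target:
            target -= value
            bits[reordered.index(value)] = '1'
    word = ''.join(bits)
    return chr(int(word[:8], 2)) + chr(int(word[8:], 2))

C = encrypt_pair("WE")                    # 439938, as in the text
M = decrypt_block(C)                      # "WE"
```

The greedy loop works precisely because the private knapsack is superincreasing; run on the public knapsack mS instead, it fails.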
So, we now have a public key cryptosystem based on an NP-complete problem, but you should
not be overconfident! As was mentioned before, a clever cryptanalyst might find a method of attack
Primality Testing and Complexity Theory ◾ 485
that avoids the intractable problem. Indeed, such was the case here. Obviously, if the attacker multiplies the disguised knapsack by m⁻¹ and mods out by n, he'll be able to read messages as easily as the intended recipient. That's why m⁻¹ and n are kept secret. However, m⁻¹ and n aren't the only
numbers that will work. It turns out that any multiplier and modulus that yield a superincreasing
set will serve to break the system. The attacker doesn’t even need any ciphertext to begin his work.
He may start the attack as soon as the public key is made public! Once he recovers a superincreas-
ing knapsack (not necessarily the one the user is keeping secret), he’s ready to read any message
using the original public key.
Even if the attack described above wasn’t possible, there would still be a serious flaw. An
attacker could simply encipher every 16-bit combination, recording the results, and then crack any
ciphertext by doing a reverse look-up in the table thus created. This could be averted by using a larger knapsack, and thus enciphering larger blocks, which would thwart the brute-force attack on the message space. However, the attack above still stands.
The moral of this story is that a system built on a secure foundation isn’t necessarily secure!
Over the years various revisions of the knapsack idea have been put forth and broken. This has
happened to other systems, such as matrix encryption. After several rounds of cracks and patches, most cryptographers come to consider a system too flawed to be workable and lose interest in further iterations.
There’s another story connected with knapsack encryption that has a more important moral.
At the Crypto '82 conference held at the University of California, Santa Barbara, Leonard Adleman
gave a dramatic talk. While he was describing an approach that could be used to break a vari-
ant of the Merkle-Hellman knapsack cipher, he was running a program on an Apple II personal
computer to actually demonstrate the break! I’ll let Hellman offer his perspective on this talk:40
He started his lecture by saying, “First the talk, then the public humiliation.” I was
livid! Cryptography is more an art than a science, and all of the top researchers had
come up with at least one system that had been broken. Why did he have to say he was
going to humiliate me?
Later, I realized he was talking about himself, not me. He was afraid the computer
would crash or some other problem would prevent him from proving that his approach
worked. This experience of mine is a great example of why Dorothie [his wife] is right
about giving everyone the benefit of the doubt, at least initially.
As it turned out, the computer did not crash. The reactions of some of the participants at the pro-
gram’s success can be seen in Figure 16.8.
Hellman’s moral to this story is “Get curious, not furious.” When you are upset by someone’s
words or actions, instead of becoming angry, become curious as to the person’s motivations. Perhaps
your first assumptions are wrong and the person didn’t mean any harm. The book by Hellman from
which I excerpted the paragraphs above explains how simple approaches like this can improve your
personal relationships, as well as international relationships. If such ideas would only catch on, they’d
have a much bigger impact on the world than Hellman’s work in cryptology. Spread the word!
40 Hellman, Dorothie and Martin Hellman, A New Map for Relationships: Creating True Love at Home & Peace
on the Planet, New Map Publishing, 2016, p. 138. The website https://anewmap.com/ includes a link for a free
download of the eBook version.
486 ◾ Secret History
Figure 16.8 Leonard Adleman (left), Adi Shamir (right), and Martin Hellman (by the overhead
projector) watch as knapsack encryption is broken. (Courtesy of Len Adleman.)
Figure 16.9 Taher Elgamal (1955–) (Creative Commons Attribution 3.0 Unported license, by
Wikipedia user Alexander Klink, https://commons.wikimedia.org/wiki/File:Taher_Elgamal_
it-sa_2010.jpg.).
The difficulty of solving the discrete log problem (see Section 14.2) was used by Taher Elgamal41
(Figure 16.9), an Egyptian-born American cryptographer, to create the Elgamal public key cryp-
tosystem in 1985.42 Alice and Bob will illustrate how Elgamal works.
41 You will see Elgamal spelled El Gamal and ElGamal in the literature. In this text, the name is spelled as
Elgamal spells it.
42 Elgamal, Taher, “A Public Key Cryptosystem and a Signature Scheme Based on Discrete Logarithms,” IEEE
Transactions on Information Theory, Vol. 31, No. 4, July 1985, pp. 469–472.
Alice begins by picking a large prime p and a generator g of the multiplicative group of
integers modulo p. Recall from Section 14.2 that taking consecutive powers of a generator
(mod p), until we reach 1, will result in the entire group being generated. Alice then selects
a private key a and computes A = g^a (mod p). She publishes g, p, and A, keeping only the value
a secret.
Bob wishes to send a message M, which he must put in the form of a number between 2 and
p. If the message is too long for this, it can be broken into pieces. Bob then selects a key k (mod p)
and computes C1 = g^k (mod p) and C2 = MA^k (mod p). He then sends Alice both C1 and C2. Thus,
this system has the disadvantage that the ciphertext is twice as long as the message.
To recover the message M, Alice computes x = C1^a (mod p). From this, she is then able to calculate x⁻¹ by using the Euclidean algorithm, as detailed in Section 14.3.
Finally, Alice computes x⁻¹C2 (mod p), which reveals the message, because

x⁻¹C2 = (C1^a)⁻¹C2 = (g^(ak))⁻¹MA^k = (g^(ak))⁻¹M(g^a)^k = M(g^(ak))⁻¹g^(ak) = M.
The security of this system relies on the inability of an attacker to find the value of a when he
is provided with g, p, and A = g^a (mod p). This is the discrete log problem.
Example 8
We may use the prime p = 2687 and the value g = 22 for illustrative purposes. If Alice's private
key is a = 17, she must calculate A = g^a (mod p). This is 22^17 = 824 (mod 2687). She reveals to
the world p, g, and A. If Bob wants to send the brief message HI, he must convert it to numbers
first. We'll simply replace the letters using the scheme A = 0, B = 1, …, Z = 25, but ASCII or
other methods may be used. We get 0708, or 708. For his key Bob randomly chooses k = 28.
He computes

C1 = g^k (mod p) = 22^28 (mod 2687) = 55

and

C2 = MA^k (mod p) = (708)(824^28) (mod 2687) = 1601
Bob sends Alice 55 and 1601. To read the message, Alice first computes x = C1^a (mod p) = 55^17
(mod 2687) = 841. Using the Euclidean algorithm, she finds the multiplicative inverse of 841
(mod 2687) is 2048. She then computes x⁻¹C2 (mod p) = (2048)(1601) (mod 2687) = 708, which,
under our encoding scheme, translates to HI.
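Example 8 can be checked with a short script; Python's built-in pow handles both the modular exponentiations and, with exponent −1, the Euclidean-algorithm inverses:

```python
# Example 8, step by step. pow(x, -1, p) computes a modular inverse via
# the extended Euclidean algorithm.
p, g = 2687, 22                  # public prime and generator
a = 17                           # Alice's private key
A = pow(g, a, p)                 # 824, published along with p and g

M, k = 708, 28                   # Bob's message "HI" and his random key
C1 = pow(g, k, p)                # 55
C2 = (M * pow(A, k, p)) % p      # 1601

x = pow(C1, a, p)                # Alice computes 841...
x_inv = pow(x, -1, p)            # ...inverts it to get 2048...
M_recovered = (x_inv * C2) % p   # ...and recovers 708, that is, HI
```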
Basing a cryptosystem on a “hard” problem is not an idea new to this chapter. We’ve already
examined a cipher based on the hardness of factoring, namely RSA. Recall that RSA is based on
the factoring problem, but it is not known to be equivalent to that problem. Nor do we know that
factoring is NP-complete. It might fall in P, like primality testing. The same is true for the discrete
log problem. We don’t know if it is in P or NP-complete. The Merkle-Hellman system and the
McEliece system (alluded to briefly) are the only systems in this chapter based on problems known
to be NP-complete.
See Figure 16.10 for a photograph of the key players in the world of public key cryptography.
Figure 16.10 The gang's all here! From left to right: Adi Shamir, Ron Rivest, Len Adleman,
Ralph Merkle, Martin Hellman, and Whit Diffie. Notice that none of these cryptologic all-stars
is wearing a tie. It’s what’s in your head that matters! (Picture courtesy of Eli Biham, taken at the
presentation on August 21 at Crypto 2000, an IACR conference. The 21 at the bottom right is
part of a date stamp that was mostly cropped out for reproduction here.)
McGregor-Dorsey, Zachary S., “Methods of Primality Testing”, MIT Undergraduate Journal of Mathematics,
Vol. 1, 1999, pp. 133–141. This very nice history is available online at http://www-math.mit.edu/
phase2/UJM/vol1/DORSEY-F.PDF.
Miller, Gary L., “Riemann’s Hypothesis and Tests for Primality,” Journal of Computer and System Sciences,
Vol. 13, No. 3, December 1976, pp. 300–317.
Rabin, Michael O., “Probabilistic Algorithm for Testing Primality,” Journal of Number Theory, Vol. 12,
No. 1, February 1980, pp. 128–138.
Ramachandran, R., “A Prime Solution,” Frontline, Vol. 19, No. 17, August 17–30, 2002, available online at
http://www.flonnet.com/fl1917/19171290.htm. This is a popular piece on AKS and the men behind it.
Ribenboim, Paulo, The Book of Prime Number Records, second edition, Springer, New York, 1989.
Ribenboim, Paulo, The Little Book of Big Primes, Springer, New York, 1991. This is a condensed version of
The Book of Prime Number Records.
Ribenboim, Paulo, The New Book of Prime Number Records, Springer, New York, 1996. This is an update of
The Book of Prime Number Records.
Ribenboim, Paulo, The Little Book of Bigger Primes, second edition, Springer, New York, 2004. This is a
condensed version of The New Book of Prime Number Records.
Robinson, Sara, “Researchers Develop Fast Deterministic Algorithm for Primality Testing,” SIAM News,
Vol. 35, No. 7, September 2002, pp. 1–2.
The following books each list the bracketed number as the first prime. Can anyone continue the sequence?
[1] The 1986 Information Please Almanac, 39th edition, Houghton Mifflin Company, Boston,
Massachusetts, p. 430.
[2] Ribenboim, Paulo, The New Book of Prime Number Records, Springer, New York, 1996, p. 513.
[3] Garrett, Paul, Making, Breaking Codes: An Introduction to Cryptology, Prentice Hall, Upper Saddle
River, New Jersey, 2001, p. 509.
More Primes? The following books indicate the bracketed number is prime.
[4] Posamentier, Alfred S., and Ingmar Lehmann, The (Fabulous) Fibonacci Numbers, Prometheus Books,
Amherst, New York, 2007, p. 333. Typos aside, this is a very good book.
[27] King, Stephen, Dreamcatcher, Scribner, New York, 2001, p. 211.
On Complexity Theory
Fortnow, Lance and Steve Homer, A Short History of Computational Complexity, November 14, 2002, http://
people.cs.uchicago.edu/∼fortnow/papers/history.pdf.
Garey, Michael R. and David S. Johnson, Computers and Intractability: A Guide to the Theory of
NP-Completeness, W. H. Freeman and Co., New York, 1979. This book lists over 300 NP-complete
problems.
Hartmanis, Juris and Richard E. Stearns. “On the Computational Complexity of Algorithms,” Transactions
of the American Mathematical Society, Vol. 117, No. 5, May 1965, pp. 285–306. Complexity Theory
begins here!
Talbot, John and Dominic Welsh, Complexity and Cryptography: An Introduction, Cambridge University
Press, Cambridge, UK, 2006.
On Tetris
Breukelaar, Ron, Erik D. Demaine, Susan Hohenberger, Hendrik Jan Hoogeboom, Walter A. Kosters, and
David Liben-Nowell, “Tetris is Hard, Even to Approximate,” International Journal of Computational
Geometry and Applications, Vol. 14, No. 1, April 2004, pp. 41–68. This paper is the merged ver-
sion of two previous papers: “Tetris is Hard, Even to Approximate” by Erik D. Demaine, Susan
Hohenberger, and David Liben-Nowell, and “Tetris is Hard, Made Easy” by Ron Breukelaar, Hendrik
Jan Hoogeboom, and Walter A. Kosters.
Breukelaar, Ron, Hendrik Jan Hoogeboom, and Walter A. Kosters, “Tetris is Hard, Made Easy, Technical
Report 2003-9, Leiden Institute of Advanced Computer Science, Universiteit Leiden, 2003.
Demaine, Erik D., Susan Hohenberger, and David Liben-Nowell, “Tetris is Hard, Even to Approximate,”
in Warnow, Tandy and Binhai Zhu, editors, Computing and Combinatorics, 9th Annual International
Conference (COCOON 2003), Lecture Notes in Computer Science Vol. 2697, 2003, Springer, Berlin,
Germany, 2003, pp. 351–363.
Peterson, Ivars, “Tetris Is Hard,” Ivars Peterson’s MathTrek, Mathematical Association of America, October
28, 2002, http://web.archive.org/web/20120120070205/http://www.maa.org/mathland/mathtrek_
10_28_02.html.
On McEliece’s System
Chabaud, Florent, “On the security of Some Cryptosystems Based on Error-Correcting Codes,” in De
Santis, Alfredo, editor, Advances in Cryptology – EUROCRYPT ’94 Proceedings, Lecture Notes in
Computer Science, Vol. 950, Springer, Berlin, Germany, 1995, pp. 131–139.
McEliece, Robert J., A Public-Key Cryptosystem Based on Algebraic Coding Theory, Deep Space Network
Progress Report 42-44, Jet Propulsion Laboratory, California Institute of Technology, January and
February 1978, pp. 114–116.
On Elgamal
Elgamal, Taher, “A Public Key Cryptosystem and a Signature Scheme Based on Discrete Logarithms,” in
Blakley, G. Robert and David Chaum, editors, Advances in Cryptology: Proceedings of CRYPTO 84,
Lecture Notes in Computer Science, Vol. 196, Springer, Berlin, Germany, 1985, pp. 10–18.
Elgamal, Taher, “A Public Key Cryptosystem and a Signature Scheme Based on Discrete Logarithms,” IEEE
Transactions on Information Theory, Vol. 31, No. 4, July 1985, pp. 469–472.
Niehues, Lucas Boppré, Joachim von zur Gathen, Lucas Pandolfo Perin, and Ana Zumalacárregui, “Sidon
Sets and Statistics of the ElGamal Function,” Cryptologia, Vol. 44, No. 5, September 2020, pp.
438–450.
On Improving Relationships
Hellman, Dorothie and Martin Hellman, A New Map for Relationships: Creating True Love at Home & Peace
on the Planet, New Map Publishing, 2016. The website https://anewmap.com/ includes a link for a
free download of the eBook version.
Chapter 17
Authenticity
There are many situations in which we would like to be sure that a message was actually composed
and sent by the person it appears to be from. This is referred to as the authenticity of the message
and it is not a new problem.
We thank you for the large deliveries of arms and ammunitions which you have been
kind enough to send us. We also appreciate the many tips you have given us regarding
your plans and intentions which we have carefully noted. In case you are concerned
about the health of some of the visitors you have sent us you may rest assured they will
be treated with the consideration they deserve.
In addition to the security checks mentioned above, there are other ways to verify a user’s identity.
Just as you can recognize the voice of a friend on the telephone, a telegraph operator has a style
that can be recognized by other operators. This is referred to as the fist of the sender and can be
depicted graphically (Figures 17.1 and 17.2).
1 Marks, Leo, Between Silk and Cyanide: a Codemaker’s War, 1941-1945. The Free Press, New York, 1998, p. 522.
Figure 17.1 Each line represents a distinct “fist” sending the same alphabet and numbers.
(From the David Kahn Collection, National Cryptologic Museum, Fort Meade, Maryland.)
Figure 17.2 The change in this agent’s fist was the result of his enlisting another operator’s
help. (From the David Kahn Collection, National Cryptologic Museum, Fort Meade, Maryland.)
This was known to the Nazis as well as the British. Just as a skilled impersonator might be able
to use his voice to fool you into thinking he’s someone else, one telegrapher can imitate the fist
of another. The Nazis had success in this arena, but fists were used against them to track U-boats
by tracking the recognizable style of each of their radio operators. In today’s world, the need for
certainty of a sender’s identity continues to be of great importance, even in times of peace. We’ll
now look at some modern attempts to address this problem.
(M^da (mod na))^eb (mod nb)
Now Bob has twice as much work to do to recover the message, but he is the only one who can
do so, and Alice is the only one who could have sent it. Bob first applies his private key, then he
applies Alice’s public key, and finally he reads the message.
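A sketch of this sign-then-encrypt flow, using toy RSA primes chosen here for illustration (they are not from the text):

```python
# Sign-then-encrypt with RSA: Alice applies her private key d_a (mod n_a),
# then Bob's public key e_b (mod n_b). The primes are toy values.
def make_key(p, q, e):
    phi = (p - 1) * (q - 1)
    return p * q, e, pow(e, -1, phi)      # modulus, public exp, private exp

n_a, e_a, d_a = make_key(61, 53, 17)      # Alice: n_a = 3233
n_b, e_b, d_b = make_key(89, 97, 5)       # Bob: n_b = 8633 > n_a, so blocks fit

M = 1234                                  # message, must be less than n_a
signed = pow(M, d_a, n_a)                 # Alice signs with her private key
C = pow(signed, e_b, n_b)                 # ...then enciphers for Bob

s = pow(C, d_b, n_b)                      # Bob applies his private key first
M_recovered = pow(s, e_a, n_a)            # ...then Alice's public key
```

Note that Bob's modulus was chosen larger than Alice's so that the signed value fits in a single block for the second encipherment.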
Atari® used RSA “as the basis of its protection scheme on video game cartridges. Only car-
tridges that have been signed by the company’s public key work in the company’s video game
machines.”2 The signing capability of RSA makes it even more useful, but it does allow attacks not
discussed in Chapter 15.
(C′)^d = (Cr^e)^d = (M^e r^e)^d = ((Mr)^e)^d = (Mr)^(ed) = (Mr)^1 = Mr
Seeing that it is a multiple of r, the value he randomly selected, he may multiply by the inverse
of r to get M.
But how could the attacker get the corresponding plaintext for the ciphertext C ′? Simple — he
sends it to the person who created C. It will look like any other legitimate enciphered message, so
the recipient will raise it to the power d to find the plaintext. It will be gibberish, but the recipi-
ent may simply assume it was garbled in transmission. The attacker then need only obtain this
“deciphered” message.
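The blinding trick can be sketched with toy RSA numbers (the key values below are illustrative, not from the text):

```python
# The blinding attack, with toy RSA values (d belongs to the victim).
n, e, d = 3233, 17, 2753
M = 1234
C = pow(M, e, n)                    # the ciphertext the attacker wants to read

r = 7                               # attacker's random value, coprime to n
C_blind = (C * pow(r, e, n)) % n    # looks like a legitimate ciphertext

Mr = pow(C_blind, d, n)             # the victim "deciphers" gibberish: Mr (mod n)
M_recovered = (Mr * pow(r, -1, n)) % n  # the attacker strips off r
```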
The first RSA attack examined in Chapter 15 was the common modulus attack. This can be
taken further, now that the signature aspect of RSA has been covered. John M. DeLaurentis did
2 Garfinkel, Simson, PGP: Pretty Good Privacy, O’Reilly & Associates, Sebastopol, California, 1995, p. 95.
so in 1984, in a paper in which he showed that an insider posed a much greater threat than Eve.
He could, like Eve, read M, but he could also break the system completely, and be able to view all
messages and sign with anybody’s key.3
(eM)(dM) − 1 = (2^k)(s)

where s is an integer that must be odd.
Mallet may then choose a random integer a, such that 1 < a < n − 1. If a and n have a common
divisor greater than 1, then the Euclidean algorithm can be used to find it, and it will be one of
the primes, p or q, used to generate n. Knowing the factorization of n would then allow him to
determine the private key of anyone in the system. Thus, we assume a and n are relatively prime.
If Mallet can find a value x ≠ ±1 such that x^2 = 1 (mod n), he can use x to factor n by writing
x^2 − 1 = 0 (mod n) and (x − 1)(x + 1) = 0 (mod n). A prime factor of n will then be provided by
either gcd(x − 1, n) or gcd(x + 1, n). We'll return to this idea, but first, we make a few observations.
Because (eM)(dM) = 1 (mod φ(n)), by definition, we have (eM)(dM) − 1 = 0 (mod φ(n)). Mallet
can substitute into this last equality using the identity (eM)(dM) − 1 = (2^k)(s) to get (2^k)(s) = 0 (mod
φ(n)). In other words, (2^k)(s) is a multiple of φ(n). Thus,

a^((2^k)s) = 1 (mod n).
It might be possible to replace k with a smaller nonnegative integer such that the equality still
holds. Let k′ > 0 denote the smallest such number. Then,

a^((2^k′)s) = 1 (mod n).
So,

(a^((2^(k′−1))s))^2 = 1 (mod n)

Thus, Mallet has the desired value x in the form

x = a^((2^(k′−1))s)
He could be unlucky and have

a^((2^(k′−1))s) = −1 (mod n)
in which case it is a square root of 1 (mod n), but not one that is useful for factoring n. DeLaurentis
showed that this happens at most half of the time. So, if Mallet is unlucky, he can simply randomly
3 DeLaurentis, John M., “A Further Weakness in the Common Modulus Protocol for the RSA Cryptosystem,”
Cryptologia, Vol. 8, No. 3, July 1984, pp. 253–259.
choose another base a and try again. Once n is factored, Mallet can easily generate all of the pri-
vate keys.
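Mallet's factoring procedure can be sketched as follows; the toy key pair below is illustrative, not from the text:

```python
import math
import random

# Mallet factors n from his own key pair (e_M, d_M), as described above.
def factor_with_key(n, e, d):
    t = e * d - 1                    # write e*d - 1 = (2^k)(s) with s odd
    k = 0
    while t % 2 == 0:
        t //= 2
        k += 1
    s = t
    while True:
        a = random.randrange(2, n - 1)
        g = math.gcd(a, n)
        if g > 1:
            return g                 # lucky: a already shares a factor with n
        x = pow(a, s, n)
        if x in (1, n - 1):
            continue                 # useless base, pick another a
        while x != n - 1:
            y = pow(x, 2, n)
            if y == 1:               # x is a square root of 1 other than ±1
                return math.gcd(x - 1, n)
            x = y
        # reached -1 (the unlucky case, at most half the time): try again

factor = factor_with_key(3233, 17, 2753)  # 3233 = 61 * 53
```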
Like the Miller–Rabin primality test, this is a probabilistic attack, but DeLaurentis went on to
show that assuming the Riemann hypothesis makes his attack deterministic. He also presented a
second attack that allows Mallet to read and sign messages without factoring n.
(eM)(dM) − 1 = kφ(n) for some positive integer k
1. f is a factor of kφ(n).
2. f is relatively prime to φ(n).
From this we may conclude that f is a factor of k. In other words, r is a multiple of φ(n). That
is, r = (k/f)φ(n).
Because r is relatively prime to φ(n), we may use the extended Euclidean algorithm to find
integers x and y such that xr + yeA = 1. In particular, we can find a pair x and y such that y is posi-
tive. Because r is a multiple of φ(n), when we reduce this last equality modulo φ(n), the xr term
drops out to leave yeA = 1 (mod φ(n)).
Thus, y fits the definition of a multiplicative inverse of eA (mod φ(n)). It might not be the
inverse that Alice actually uses, but it will work just as well. Mallet’s use of this value will result in
encipherments and signatures that are identical to those produced by Alice.
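A sketch of this conclusion, with an illustrative toy modulus; any multiple r of φ(n) yields a working substitute exponent y:

```python
# Any multiple r of phi(n) gives Mallet a working substitute for Alice's
# private exponent. Toy values: n = 61*53 = 3233, so phi(n) = 3120.
n, e_A = 3233, 17
r = 6240                        # 2*phi(n); Mallet derives such an r as above

# Extended Euclidean algorithm: y with y*e_A = 1 (mod r). Since phi(n)
# divides r, the same y satisfies y*e_A = 1 (mod phi(n)).
y = pow(e_A, -1, r)

M = 1234
C = pow(M, e_A, n)              # a message enciphered with Alice's public key
M_read = pow(C, y, n)           # Mallet deciphers it without knowing d_A
```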
Thus, like Bob’s Elgamal-enciphered message to Alice (Section 16.8), Alice’s signed message
consists of two values, S1 and S2. These would be sent along with the message.
Alice makes v, g, and p public. From these, her signature may be verified by computing
v^S1 · S1^S2 (mod p) and g^M (mod p). If the signature is valid, both values will be the same. A few
simple steps show why these two values must match:

v^S1 · S1^S2 = (g^a)^S1 (g^k)^S2 = g^(aS1 + kS2) = g^(aS1 + (M − aS1)) = g^M (mod p),

where the third equality holds because kS2 = M − aS1 (mod p − 1).
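Signing and verification can be sketched in Python, reusing the toy numbers of Example 8 and assuming the standard Elgamal signing equations S1 = g^k (mod p) and S2 = k⁻¹(M − aS1) (mod p − 1):

```python
# Toy numbers from Example 8, with the standard Elgamal signing equations
# (assumed here): S1 = g^k (mod p), S2 = k^(-1)(M - a*S1) (mod p - 1).
p, g, a = 2687, 22, 17
v = pow(g, a, p)                          # Alice's public value

M, k = 708, 5                             # message; k must be coprime to p - 1
S1 = pow(g, k, p)
S2 = (pow(k, -1, p - 1) * (M - a * S1)) % (p - 1)

# Verification: both quantities must be equal
left = (pow(v, S1, p) * pow(S1, S2, p)) % p
right = pow(g, M, p)
```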
1. The same message must always yield the same hash. That is, we have a hash function. This
will be important when it comes to verifying signatures based on the hash. We denote the
hash function by h(x).
2. Hashing should be fast. After all, speed is the drawback of using the methods previously
described.
3. It should be difficult (computationally infeasible) to find two messages that hash to the same
value. That is, we should not be able to find distinct x and x ′ such that h(x) = h(x ′ ). If this
happens, we say that a collision has been found. It would be very bad if a legitimately signed
document could have a portion replaced (with the new text having some other meaning)
and hash to the same value as the original. This would make the altered document appear to
be signed as well. However, any collision is seen as a serious threat, even if it is found using
two messages that look more like random text than meaningful messages. This is a very
interesting condition, as no hash function can be one to one—the whole idea is to create a
shorter message! Thus, collisions must exist, even for the best hash functions. Yet finding
them should be very difficult!
4. Computing any preimage should be difficult. That is, given the hash of a message, h(x), we
should not be able to recover the message or any other text y such that h(y) = h(x).
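The first three properties are easy to observe with an off-the-shelf hash such as SHA-256, chosen here purely for illustration:

```python
import hashlib

# SHA-256 is deterministic and fast, and even a one-letter change in the
# message yields a completely unrelated digest.
h1 = hashlib.sha256(b"WELL DONE").hexdigest()
h2 = hashlib.sha256(b"WELL DONE").hexdigest()   # same message, same hash
h3 = hashlib.sha256(b"WELL DONF").hexdigest()   # tiny change, new digest
```

Collisions must exist for any such fixed-length digest, but no one has exhibited one for SHA-256.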
Hash functions came about in the 1950s, but they were not then intended to serve crypto-
graphic purposes.4 The idea of these functions was simply to map values in a dataset to smaller
values. In this manner, comparisons could be made more easily and searches could be accelerated.
In 1974, researchers who may not have been aware of hash functions recognized that “hard-to-invert” functions offered a way to check if a message had been changed.5 They credited the
idea to Gus Simmons, who had presented it to them “in connection with monitoring the
4 Knuth, Donald, The Art of Computer Programming – Sorting and Searching, Vol. 3, Addison-Wesley, Reading,
Massachusetts, 1973. See pp. 506–549.
5 Gilbert, Edgar N., Florence J. MacWilliams and Neil J. A. Sloane, “Codes Which Detect Deception,” The Bell
System Technical Journal, Vol. 53, No. 3, March 1974, pp. 405–424.
production of certain materials in the interest of arms limitation.”6 The cryptographic connection
was described formally in 1981 by Mark N. Wegman and Larry Carter, a pair of researchers who
did non-cryptographic work with hash functions in the late 1970s.7
SHA-1 forms part of several widely used security applications and protocols, includ-
ing TLS and SSL, PGP, SSH, S/MIME, and IPsec. Those applications can also use
6 Gilbert, Edgar N., Florence J. MacWilliams, and Neil J. A. Sloane, “Codes Which Detect Deception,” The Bell
System Technical Journal, Vol. 53, No. 3, March 1974, pp. 405–424, p. 406 quoted from here.
7 Wegman, Mark N., and J. Lawrence Carter, “New Hash Functions and Their Use in Authentication and Set
Equality,” Journal of Computer and System Sciences, Vol. 22, No. 3, June 1981, pp. 265–279.
8 http://en.wikipedia.org/wiki/MD5 has a clear (but brief) explanation of the algorithm. A more detailed explanation is available at http://eprint.iacr.org/2009/223.pdf.
11 Stamp, Mark and Richard M. Low, Applied Cryptanalysis: Breaking Ciphers in the Real World, Wiley-Interscience, Hoboken, New Jersey, 2007.
12 Biham, Eli, Rafi Chen, Antoine Joux, Patrick Carribault, Christophe Lemuet, and William Jalby, “Collisions
of SHA-0 and Reduced SHA-1,” in Cramer, Ronald, editor, Advances in Cryptology – EUROCRYPT 2005
Proceedings, Lecture Notes in Computer Science, Vol. 3494, Springer, Berlin, Germany, 2005, pp. 36–57.
13 https://web.archive.org/web/20090117004931/http://www.nsa.gov/ia/programs/suiteb_cryptography/index.
shtml.
MD5 … SHA-1 hashing is also used in distributed revision control systems such as
Git, Mercurial, and Monotone to identify revisions, and to detect data corruption or
tampering. The algorithm has also been used on Nintendo console Wii for signature
verification during boot, but a significant implementation flaw allowed an attacker to
bypass the security scheme.14
Its structure and components are quite different from its predecessors, and at first
sight it seems like a complete break with the past. In this article, researchers show that
KECCAK is the endpoint of a long learning process involving many intermediate
designs, mostly gradual changes, but also some drastic changes of direction.16
Daemen remarked, “Personally I consider my greatest result Keccak and not Rijndael. In fact
its design contains the best elements of all cryptographic research I’ve done since I’ve started in
1989.”17
While mathematicians are working out security issues, world governments are racing ahead.
Bill Clinton was the first U.S. President to use digital signatures. These were applied, most suitably, to an e-commerce treaty with Ireland in September 1998 and, in 2000, to a bill that gave
digital signatures the same legal status as traditional signatures.
You may have noticed that I didn’t provide details of any of the algorithms mentioned above. It’s
not because these details are unimportant. It’s simply because this is one of the few areas in cryptol-
ogy that I could never get too excited about. I think part of the problem for me is that, unlike cipher
systems, there’s no message to uncover. One usually just looks for collisions. The References and
Further Reading list at the end of this chapter provides several books that devote serious space to the
topic of hash functions, as well as some important papers, if you’re interested. We now move on to
how hash functions can be used to protect passwords, a topic I find much more interesting.
A system can protect passwords by hashing them and comparing the result with a value that is stored. Because computing any preimage should be difficult for a
good hash function, someone gaining access to the hashed values shouldn’t be able to determine
the passwords.
In the first edition of this book, I wrote, “The important idea of storing the password in a
disguised form doesn’t seem to be strongly associated with anyone. Somebody had to be first to
hit on this idea, but I haven’t been able to learn who.” After quoting what was clearly an incorrect
account of this innovation from Wikipedia, I wrote, “Anyone who can provide a definitive first is
encouraged to contact me!” I soon heard from Steven Bellovin (see Section 2.12). He wrote:
While visiting Cambridge (my 1994 visit, I think), I was told by several people that
Roger Needham had invented the idea. I asked him why he had never claimed it
publicly; he said that it was invented at the Eagle Pub – then the after-work gather-
ing spot for the Computer Laboratory – and both he and the others present had had
sufficiently much India Pale Ale that he wasn’t sure how much was his and how much
was anyone else’s…18
Ross Anderson noted, “For some time Roger Needham and Mike Guy each credited the other
with the invention.”19 In any case, Maurice Wilkes was the first to publish the solution (in 1968),
giving Needham full credit.20 Needham’s good night in the pub was back in 1967.21 A decade
later, he continued to pursue his research in the same manner. Sape Mullender, a PhD student at
the time, recalled:
The first time I met Roger was in the late seventies and, at the end of the afternoon, a
chunk of the department, including Roger, went to the Eagle, the pub in Bene’t Street.
Until the Computer Laboratory moved to the outskirts of the city, that was the place
where the day’s events were discussed almost daily. It’s also the pub where Crick and
Watson contemplated DNA and came up with the double helix structure.22
I spent a sabbatical in Cambridge in 1987/1988 and daily trips to the Eagle were still
very much part of the culture. Roger would definitely join a few times a week. During
that time, the Eagle was renovated and lost some of its dingy charm, so excursions
further afield were also undertaken. The Bath and the Mill became frequent haunts.
I think the reason those places became so inspirational was that (1) students were
relaxed when there and that’s a condition for having brainwaves and (2) mingling took
place between different groups and this gave an opportunity for fresh ideas to come to
some of the problems.23
22 Email from Sape Mullender to the author, May 20, 2020.
23 Email from Sape Mullender to the author, May 20, 2020.
Cipher Deavours, perhaps unaware of Needham’s role in this story, wrote, “Non-classified work
on such functions dates back to the early 1970s.”24 One would assume that the government had
studied the problem before then. Deavours went on to reference a 1977 paper by Peter J. Downey
that describes a system in use in 1972.25 This system’s initial step was vaguely described as “some
preliminary reductions in [the password’s] length.” But after that, the next step is crystal clear. The
password, p, was enciphered as

(2^16)p (mod 10^19 − 1)

Downey broke this system, recovering passwords from their enciphered values. Doing so only
required computing the multiplicative inverse of 2^16 (mod 10^19 − 1). It's a wonder anyone thought
this could be secure!
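Downey's inversion can be sketched as follows, assuming the encipherment was multiplication by 2^16 modulo 10^19 − 1; the sample password value is hypothetical:

```python
# One modular inverse undoes every "enciphered" password in the system.
n = 10**19 - 1
inv = pow(2**16, -1, n)       # the inverse Downey computed

p = 31415926535               # a numeric password (hypothetical value)
c = (2**16 * p) % n           # the stored, "protected" form
p_recovered = (inv * c) % n   # instantly undone
```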
Hashes used in this manner needn’t decrease the size of the data being “hashed,” but they must
have the property that preimages are difficult to compute. Sometimes such hashes are referred to
as one-way functions; however, there is no mathematical proof that one-way functions even exist!
Some other references to password protection from the early 1970s are listed below:
Bartek, Douglas J., “Encryption for Data Security,” Honeywell Computer Journal,
Vol. 8, No. 2, September 1974, pp. 86–89.
Evans, Jr., Arthur, William Kantrowitz and Edwin Weiss, “A User Authentication
Scheme not Requiring Secrecy in the Computer,” Communications of the ACM,
Vol. 17, No. 8, August 1974, pp. 437–442.
Purdy, George B., “A High Security Log-in Procedure,” Communications of the ACM,
Vol. 17, No. 8, August 1974, pp. 442–445.
Wilkes, Maurice V., Time-Sharing Computer Systems, American Elsevier, New York,
1972. This is a second edition of the work that first presented Needham’s idea.
A 1979 paper by Robert Morris and Ken Thompson of Bell Labs is worth looking at in greater
detail.26 It gave the history of password security on the UNIX time-sharing system. At first, the pass-
words were enciphered by software simulating a cipher machine used by the United States during
WW II, namely the M-209 (see the last paragraph of Section 9.1). However, it turned out that the
machine wasn’t any better in this capacity than it was for keeping secrets long-term during the war.
It turned out that the M-209 program was usable, but with a given key, the ciphers
produced by this program are trivial to invert. It is a much more difficult matter to
find out the key given the cleartext input and the enciphered output of the program.
Therefore, the password was used not as the text to be encrypted but as the key, and
a constant was encrypted using this key. The encrypted result was entered into the
password file.27
24 Deavours, Cipher, “The Ithaca Connection: Computer Cryptography in the Making, A Special Status Report,”
Cryptologia, Vol. 1, No. 4, October, 1977, pp. 312–316, p. 313 cited here.
25 Downey, Peter J., Multics Security Evaluation: Password and File Encryption Techniques, ESD-TR-74-193,
Vol. III, Electronic Systems Division, Hanscom Air Force Base, Massachusetts, June 1977.
26 Morris, Robert and Ken Thompson, “Password Security: A Case History,” Communications of the ACM,
Vol. 22, No. 11, November 1979, pp. 594–597, quotation taken from online version at https://citeseerx.ist.psu.
edu/viewdoc/summary?doi=10.1.1.128.1635.
Authenticity ◾ 503
Morris and Thompson found that the M-209 simulation software had a fatal flaw — it was too
fast! This is not a normal complaint to have, but in this case it meant that a brute-force attack
could be run rapidly, testing a subset of possible keys that were more likely to be chosen by users.
These included short keys and words found in English dictionaries. To patch against this sort of
attack, a more up-to-date algorithm was modified (to become even slower) and applied.
The brute-force attack was now less of a threat, but an additional precaution was still taken —
users were urged to select “more obscure” passwords.29 Another improvement consisted of making
the password a “salted password.” This is an important feature that, like carefully choosing the
password, is still relevant today. Morris and Thompson explained:
The key search technique is still likely to turn up a few passwords when it is used on a
large collection of passwords, and it seemed wise to make this task as difficult as pos-
sible. To this end, when a password is first entered, the password program obtains a
12-bit random number (by reading the real-time clock) and appends this to the pass-
word typed in by the user. The concatenated string is encrypted and both the 12-bit
random quantity (called the salt) and the 64-bit result of the encryption are entered
into the password file.30
When the user later logs in to the system, the 12-bit quantity is extracted from the
password file and appended to the typed password. The encrypted result is required,
as before, to be the same as the remaining 64 bits in the password file. This modifi-
cation does not increase the task of finding any individual password, starting from
scratch, but now the work of testing a given character string against a large collection
of encrypted passwords has been multiplied by 4096 (2^12). The reason for this is that
there are 4096 encrypted versions of each password and one of them has been picked
more or less at random by the system.31
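The effect of the salt can be sketched in a few lines of Python. This is an illustration of the idea only (SHA-256 stands in for the modified DES routine UNIX actually used, and the function names are my own). The same password can be stored 4096 different ways, so a precomputed table of hashed dictionary words must grow by the same factor.

```python
import hashlib, secrets

def store(password: str) -> tuple[int, str]:
    """Hash a new password with a fresh 12-bit salt, as in UNIX circa 1979."""
    salt = secrets.randbelow(4096)    # the 12-bit random quantity
    digest = hashlib.sha256(f"{salt}:{password}".encode()).hexdigest()
    return salt, digest               # both go into the password file

def check(password: str, salt: int, digest: str) -> bool:
    """Re-append the stored salt to the typed password and compare."""
    return hashlib.sha256(f"{salt}:{password}".encode()).hexdigest() == digest

salt, digest = store("trombone")
print(check("trombone", salt, digest))    # True
print(check("sousaphone", salt, digest))  # False
```

An attacker who hashes the dictionary word "trombone" once can no longer compare that single value against every entry in the file; each of the 4096 possible salts yields a different stored digest.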
28 Morris, Robert and Ken Thompson, “Password Security: A Case History,” Communications of the ACM, Vol.
22, No. 11, November 1979, pp. 594–597, quotation taken from online version at https://citeseerx.ist.psu.edu/
viewdoc/summary?doi=10.1.1.128.1635.
29 Morris, Robert and Ken Thompson, “Password Security: A Case History,” Communications of the ACM, Vol.
22, No. 11, November 1979, pp. 594–597, quotation taken from online version at https://citeseerx.ist.psu.edu/
viewdoc/summary?doi=10.1.1.128.1635.
31 Morris, Robert and Ken Thompson, “Password Security: A Case History,” Communications of the ACM, Vol.
22, No. 11, November 1979, pp. 594–597, quotation taken from online version at https://citeseerx.ist.psu.edu/
viewdoc/summary?doi=10.1.1.128.1635.
While DES was slow in software, it was fast on a chip. To prevent someone from running the
attack through a commercial DES chip, the software used didn’t perfectly match the DES algo-
rithm. The expansion table, E, instead of being fixed, was made to depend on the 12-bit random
number.32
Robert Morris joined the National Security Agency in 1986.33
We now turn to a signature scheme that explicitly includes a hash function: the Digital Signature
Algorithm (DSA). The public values are a prime q, a much larger prime p such that q divides p − 1,
an element g of order q in the multiplicative group modulo p, and v = g^s (mod p), where s is the
signer’s secret. To sign a message M, the signer picks a random k between 1 and q − 1 and computes

S1 = (g^k (mod p)) (mod q)
S2 = k^(−1)(hash(M) + s·S1) (mod q)

The inverse of k, needed for S2, is calculated (mod q). The two values S1 and S2 constitute the
signature for message M, and are sent with it.
To verify that a signature is genuine, we compute the following:

U1 = S2^(−1) · hash(M) (mod q)
U2 = S1 · S2^(−1) (mod q)
V = (g^U1 · v^U2 (mod p)) (mod q)
32 Morris, Robert and Ken Thompson, “Password Security: A Case History,” Communications of the ACM, Vol.
22, No. 11, November 1979, pp. 594–597.
33 Markoff, John, “Robert Morris, Pioneer in Computer Security, Dies at 78,” The New York Times, June 29, 2011,
U1 requires calculating the inverse of S2. This is done modulo q. The signature, (S1, S2), is deemed
genuine if V = S1. DSA creates signatures at the same speed as RSA, but requires 10 to 40 times as
long to verify signatures.36 It is quicker than Elgamal, though.
Example 1
As a small example to illustrate DSA, we choose the prime q = 53. We then need a much larger
prime p, such that p − 1 is divisible by q. We try 10q + 1 = 531, but it is divisible by 3. We are
also able to factor 12q + 1 = 637. Finally, 14q + 1 gives us the prime 743. So we now have q = 53
and p = 743 as our two primes. We also need an element g of order q in the multiplicative group
modulo p. So, we compute g = h^((p − 1)/q) (mod p), where h is an element of maximal order modulo p.
We quickly find that h = 5 is suitable. We then have g = 5^((p − 1)/q) (mod p) = 5^14 (mod p) = 212. We
randomly set s = 31. Then v = g^s (mod p), so we get v = 212^31 (mod p) = 128. p, q, g, and v are made public,
but s is kept secret. We’re now ready to sign a message.
We’ll let the message be the single letter D, perhaps a grade someone will be receiving. We
represent it numerically as 3 (using our old scheme A = 0, B = 1,… Z = 25). Our signature requires
two numbers be calculated. These numbers require us to pick a random value k between 1 and
q − 1. We let k = 7. Then

S1 = (g^k (mod p)) (mod q)
S2 = k^(−1)(hash(M) + s·S1) (mod q)

becomes (ignoring the hash step and simply using the full message instead)

S1 = (212^7 (mod 743)) (mod 53) = 94 (mod 53) = 41
S2 = 7^(−1)(3 + 31·41) (mod 53) = 38·1274 (mod 53) = 23

So the signature for our message is the pair (41, 23).
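The example can be checked with a short script. This is textbook DSA only, with the hash step omitted and the example's toy primes; the function names `sign` and `verify` are mine, and real parameters would be hundreds of digits long.

```python
# Toy DSA with the chapter's example parameters (no hashing, tiny primes).
p, q, g = 743, 53, 212   # public primes and an element of order q mod p
s = 31                   # signer's secret
v = pow(g, s, p)         # public verification value: 212^31 mod 743 = 128

def sign(M, k):
    """Sign message M (a number mod q) using the ephemeral key k."""
    S1 = pow(g, k, p) % q
    S2 = pow(k, -1, q) * (M + s * S1) % q
    return S1, S2

def verify(M, S1, S2):
    """Check a signature via U1, U2, and V as in the text."""
    w = pow(S2, -1, q)          # inverse of S2, computed mod q
    U1 = w * M % q
    U2 = S1 * w % q
    V = pow(g, U1, p) * pow(v, U2, p) % p % q
    return V == S1

M = 3                    # the letter D under A = 0, B = 1, ...
S1, S2 = sign(M, k=7)
print(S1, S2, verify(M, S1, S2))   # 41 23 True
```

The three-argument `pow` handles the modular exponentiations, and `pow(x, -1, q)` (Python 3.8+) gives the modular inverses needed for S2 and U1.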
36 Schneier, Bruce, Applied Cryptography, second edition, John Wiley & Sons, New York, 1996, p. 485.
37 Schneier, Bruce, Applied Cryptography, second edition, John Wiley & Sons, New York, 1996, p. 486.
government agencies, if they didn’t lie to us. The patent for DSA lists David W. Kravitz as the inven-
tor.38 He holds an undergraduate degree in mathematics and a Ph.D. in Electrical Engineering. He
created DSA during an 11-year stint at the NSA.39
As with another system NSA had a hand in, experts in the open community felt DSA’s key
size was too small. Originally, the modulus was set at 512 bits, but because of these complaints, it was
adjusted so that it could range from 512 to 1024 bits, in 64-bit increments.40
Whoever thinks his problem can be solved using cryptography, doesn’t understand his problem
and doesn’t understand cryptography. — Attributed by Roger Needham and Butler Lampson
to each other
Biham, Eli, Rafi Chen, Antoine Joux, Patrick Carribault, William Jalby, and Christophe Lemuet. “Collisions
of SHA-0 and Reduced SHA-1,” in Cramer, Ronald, editor, Advances in Cryptology – EUROCRYPT
2005 Proceedings, Lecture Notes in Computer Science, Vol. 3494, Springer, Berlin, Germany, 2005,
pp. 36–57.
DeLaurentis, John M., “A Further Weakness in the Common Modulus Protocol for the RSA Cryptosystem,”
Cryptologia, Vol. 8, No. 3, July 1984, pp. 253–259.
Dobbertin, Hans, “Cryptanalysis of MD4,” Journal of Cryptology, Vol. 11, No. 4, Fall 1998, pp. 253–271.
Elgamal, Taher, “A Public Key Cryptosystem and a Signature Scheme Based on Discrete Logarithms,” in
Blakley, G. Robert and David Chaum, editors, Advances in Cryptology: Proceedings of CRYPTO 84,
Lecture Notes in Computer Science, Vol. 196, Springer, Berlin, Germany, 1985, pp. 10–18.
Elgamal, Taher, “A Public Key Cryptosystem and a Signature Scheme Based on Discrete Logarithms,” IEEE
Transactions on Information Theory, Vol. 31, No. 4, July 1985, pp. 469–472.
Gilbert, Edgar N., Florence J. MacWilliams, and Neil J. A. Sloane, “Codes Which Detect Deception,” The
Bell System Technical Journal, Vol. 53, No. 3, March 1974, pp. 405–424.
38 Kravitz, David W., Digital signature algorithm, United States Patent 5,231,668, July 27, 1993, available online
at https://patents.google.com/patent/US5231668.
39 “About Dr. David W. Kravitz,” TrustCentral, https://trustcentral.com/about/about-dr-david-w-kravitz/.
40 Schneier, Bruce, Applied Cryptography, second edition, John Wiley & Sons, New York, 1996, p. 486.
41 Anderson, Ross, Security Engineering—Third Edition, https://www.cl.cam.ac.uk/~rja14/book.html.
Holden, Joshua, “A Good Hash Function is Hard to Find, and Vice Versa,” Cryptologia, Vol. 37, No. 2,
April 2013, pp. 107–119.
Horng, Gwoboa, “Accelerating DSA Signature Generation,” Cryptologia, Vol. 39, No. 2, April 2015,
pp. 121–125.
Kishore, Neha and Priya Raina, “Parallel Cryptographic Hashing: Developments in the Last 25 Years,”
Cryptologia, Vol. 43, No. 6, November 2019, pp. 504–535.
Menezes, Alfred J., Paul C. van Oorschot, and Scott A. Vanstone, Handbook of Applied Cryptography, CRC
Press, Boca Raton, Florida, 1997. Chapter 9 (pp. 321–383) is focused on hash functions. This book
is freely available online in its entirety (780 pages) at http://labit501.upct.es/~fburrull/docencia/
SeguridadEnRedes/teoria/bibliography/HandbookOfAppliedCryptography_AMenezes.pdf.
Morris, Robert and Ken Thompson, “Password Security: A Case History,” Communications of the ACM,
Vol. 22, No. 11, November 1979, pp. 594–597, available online at https://citeseerx.ist.psu.edu/view-
doc/summary?doi=10.1.1.128.1635 and https://dl.acm.org/doi/pdf/10.1145/359168.359172.
National Institute of Standards and Technology (NIST), Announcing the Standard for Secure Hash
Standard, Federal Information Processing Standards Publication 180-1, April 17, 1995, available online
at http://web.archive.org/web/20120320233841/http://www.itl.nist.gov/fipspubs/fip180-1.htm. This
describes SHA-1.
National Institute of Standards and Technology (NIST), Announcing the Secure Hash Standard, Federal
Information Processing Standards Publication 180-2 (+ Change Notice to include SHA-224), August
1, 2002, available online at http://csrc.nist.gov/publications/fips/fips180-2/fips180-2withchangeno
tice.pdf. This describes SHA-2.
National Institute of Standards and Technology (NIST), Digital Signature Standard (DSS), FIPS Publication
186, May 19, 1994, available online at https://web.archive.org/web/20131213131144/http://www.itl.
nist.gov/fipspubs/fip186.htm. Some revisions have been made over the years. As of July 2013, we have
186-4, available online at https://nvlpubs.nist.gov/nistpubs/FIPS/NIST.FIPS.186-4.pdf.
National Institute of Standards and Technology (NIST), Hash Functions, SHA-3 Project, https://csrc.
nist.gov/projects/hash-functions/sha-3-project. This website has many useful links related to SHA-3,
including the FIPS documents.
Pfitzmann, Birgit, Digital Signature Schemes: General Framework and Fail-Stop Signatures, Lecture Notes in
Computer Science, Vol. 1100, Springer, New York, 1991.
Preneel, Bart, “Cryptographic Hash Functions,” European Transactions on Telecommunications, Vol. 5,
No. 4, 1994, pp. 431–448.
Preneel, Bart, Analysis and Design of Cryptographic Hash Functions, doctoral dissertation, Katholieke
Universiteit Leuven, Belgium, February 1993, available online at http://homes.esat.kuleuven.
be/~preneel/phd_preneel_feb1993.pdf.
Preneel, Bart, René Govaerts, and Joos Vandewalle, “Information Authentication: Hash Functions and
Digital Signatures,” in Preneel, Bart, René Govaerts, and Joos Vandewalle, editors, Computer Security
and Industrial Cryptography: State of the Art and Evolution, Lecture Notes in Computer Science,
Vol. 741, Springer, 1993, pp. 87–131.
Schofield, Jack, “Roger Needham,” The Guardian, March 10, 2003, available online at https://www.
theguardian.com/news/2003/mar/10/guardianobituaries.microsoft.
Stallings, William, “Digital Signature Algorithms,” Cryptologia, Vol. 37, No. 4, October 2013, pp. 311–327.
Stallings, William, Cryptography and Network Security: Principles and Practice, 8th edition, Pearson,
Edinburgh Gate, Harlow, Essex, UK, 2020. This is a comprehensive look at cryptography. No crypt-
analysis is present, but there is much material on hash functions.
Stamp, Mark, and Richard M. Low, Applied Cryptanalysis: Breaking Ciphers in the Real World, Wiley-
Interscience, Hoboken, New Jersey, 2007. Chapter 5 of this book (pp. 193–264) discusses cryptanaly-
sis of hash functions. The authors state in the conclusion for this chapter (p. 256), “For many years, it
seems that hash functions had been largely ignored by cryptographers. But with the successful attack
on MD5, and similar results for SHA-1 pending, hash functions have moved from a sleepy crypto-
graphic backwater to the forefront of research.”
Stevens, Marc, Elie Bursztein, Pierre Karpman, Ange Albertini, and Yarik Markov, Shattered,
https://shattered.io/. This website is devoted to the breaking of SHA-1. It includes a link to
the technical paper whose authors are listed here. Others contributed to the work detailed on
this page, as well.
Stevens, Marc, Pierre Karpman, and Thomas Peyrin, The SHAppening: freestart collisions for SHA-1, https://
sites.google.com/site/itstheshappening/.
Wegman, Mark N. and J. Lawrence Carter, “New Hash Functions and Their Use in Authentication and Set
Equality,” Journal of Computer and System Sciences, Vol. 22, No. 3, June 1981, pp. 265–279.
Wikipedia contributors, “Password,” Wikipedia, The Free Encyclopedia, https://en.wikipedia.org/wiki/
Password.
Wilkes, M. V., Time-Sharing Computer Systems, American Elsevier, New York, 1968.
Winternitz, Robert S., “Producing a One-way Hash Function from DES,” in Chaum, David, editor,
Advances in Cryptology, Proceedings of Crypto ‘83, Plenum Press, New York, 1984, pp. 203–207.
Xie, Tao and Dengguo Feng, How To Find Weak Input Differences For MD5 Collision Attacks, May 30,
2009, http://eprint.iacr.org/2009/223.pdf.
Chapter 18
Pretty Good Privacy and Bad Politics
Entire messages can be enciphered with RSA, but it’s a slow algorithm. The competition, in the
late 1970s, was the Data Encryption Standard (DES), which was a thousand times faster. Yet for
DES, a key had to be agreed on ahead of time. So, what’s the solution? Which of these two systems
should be used? The answer is both! Loren M. Kohnfelder suggested a hybrid system in his 1978
undergraduate thesis, written while he was studying electrical engineering at MIT.1 Whitfield
Diffie recalled ten years later how this was “hailed as a discovery in its own right.”2
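The hybrid idea can be sketched in Python. This is my own toy illustration, not Kohnfelder's construction: textbook RSA with tiny primes transports a randomly chosen session key, and a hash-based stream stands in for DES. A real implementation would use proper key sizes and would pad the RSA input before encryption.

```python
import secrets, hashlib

# Toy RSA key pair (far too small to be secure).
p, q = 1009, 1013
n, e = p * q, 65537
d = pow(e, -1, (p - 1) * (q - 1))

def keystream(key: int, length: int) -> bytes:
    """Cheap stand-in for DES: a hash-derived keystream."""
    out, counter = b"", 0
    while len(out) < length:
        out += hashlib.sha256(f"{key}:{counter}".encode()).digest()
        counter += 1
    return out[:length]

# Sender: slow public-key step only on the tiny session key,
# fast symmetric step on the whole message.
session_key = secrets.randbelow(n)
wrapped_key = pow(session_key, e, n)
message = b"Entire messages go through the fast symmetric cipher."
ciphertext = bytes(m ^ k for m, k in
                   zip(message, keystream(session_key, len(message))))

# Receiver: unwrap the session key with RSA, then strip the keystream.
recovered_key = pow(wrapped_key, d, n)
plaintext = bytes(c ^ k for c, k in
                  zip(ciphertext, keystream(recovered_key, len(ciphertext))))
print(plaintext == message)  # True
```

The design point is the division of labor: the expensive public-key operation touches only a few bytes, while the bulk of the traffic moves through the fast symmetric cipher.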
1 Kohnfelder, Loren M., Toward a Practical Public Key Encryption Algorithm, undergraduate thesis, Department
of Electrical Engineering, Massachusetts Institute of Technology, Cambridge, Massachusetts, May 1978, avail-
able online at https://dspace.mit.edu/bitstream/handle/1721.1/15993/07113748.pdf?sequence=1. Kohnfelder’s
thesis advisor was Len Adleman.
2 Diffie, Whitfield, “The First Ten Years of Public-Key Cryptography,” Proceedings of the IEEE, Vol. 76, No. 5,
May 1988, pp. 560–577, p. 566 cited here.
It should be noted that in having the best of both worlds by combining RSA and DES,
we run into the problem of a chain being only as strong as its weakest link. The RSA portion
may be ignored if DES can be broken and DES needn’t be attacked if the primes used for the
RSA portion are too small, or if another weakness is present in the RSA implementation. For
example, when using RSA on a short message, like a DES key, some padding (aka salt) needs to
be added to the message prior to encryption (see Section 15.1.11). Secure implementation of an
otherwise sound system is a highly non-trivial step. In addition to the normal security problems,
programmers in the 1980s also had to contend with machines much slower than what we take
for granted today.
3 It took 10 minutes to create a 256-bit key and 20-30 seconds to encipher a small file. See Garfinkel, Simson,
PGP: Pretty Good Privacy, O’Reilly & Associates, Sebastopol, California, 1995, p. 88.
4 Levy, Steven, Crypto: How the Code Rebels Beat the Government, Saving Privacy in the Digital Age, Viking, New
York, 2001, p. 190.
5 Levy, Steven, Crypto: How the Code Rebels Beat the Government, Saving Privacy in the Digital Age, Viking, New
York, 2001, p. 195.
Pretty Good Privacy and Bad Politics ◾ 511
shall ensure that communications systems permit the government to obtain the plaintext contents
of voice, data, and other communications when appropriately authorized by law.”6
There is no agreement on who initiated the particular provision of the bill quoted above. Some
say the Federal Bureau of Investigation was responsible, but according to the Electronic Frontier
Foundation (EFF), it was Joe Biden himself.7
Fearing that strong crypto was about to be outlawed, Zimmermann quickly finished his pro-
gram, relabeled it as freeware, and with the help of friends, began to distribute it on American
Internet sites in June 1991. He needn’t have rushed, because Biden reacted to anger over his pro-
posed anti-privacy legislation by withdrawing that section from the bill.8 Zimmermann still faced
legal troubles, though. Although he wasn’t directly responsible, PGP left the country within a day
of being posted online, in violation of encryption export laws.
Zimmermann’s cipher for his first hybrid system was based on work Merritt had done for the
Navy.9 After making his changes, he renamed it Bass-O-Matic, after a blender used in a Saturday
Night Live skit, in which Dan Aykroyd portrayed a salesman who used the machine to chop up a
fish.10 Bass-O-Matic proved to be the weak link in PGP, but Zimmermann had other problems.
RSA was patented and he did not have permission to make use of it.
Back in November 1986, Zimmermann had met with James Bidzos, the president of RSA
Data Security. It was mainly a meeting between Bidzos and Merritt, who did contract work for
RSA Data Security, but Zimmermann was there. Bidzos and Zimmermann didn’t get along at
all — they were complete political opposites. Zimmermann refused contract work from Bidzos
because his company had a military connection. As for Bidzos’s view of the military, he had joined
the U.S. Marines, even though he was a citizen of Greece.11
Despite these differences, Bidzos gave Zimmermann a copy of Mailsafe. What he didn’t give
him was permission to use RSA in his own encryption program. Bidzos remembered this when
PGP appeared. Simson Garfinkel observed that, “What followed could only be described as a low-
intensity war by Bidzos against PGP and Zimmermann.”12
Zimmermann wasn’t the only one to infringe upon the RSA patent. The following excerpt
from a question-and-answer session at a Computers, Freedom, & Privacy conference in 1992,
6 Quote reproduced here from Levy, Steven, Crypto: How the Code Rebels Beat the Government, Saving Privacy in
the Digital Age, Viking, New York, 2001, p. 195. Also, Garfinkel, Simson, PGP: Pretty Good Privacy, O’Reilly
& Associates, Sebastopol, California, 1995, p. 97.
7 Garfinkel, Simson, PGP: Pretty Good Privacy, O’Reilly & Associates, Sebastopol, California, 1995, p. 97.
businesses depend on strong encryption to protect themselves from spying competitors, so they found them-
selves on the same side as the privacy advocates.
9 Merritt had protested the war in Vietnam, but he didn’t see any conflict between that viewpoint and his
doing work for the Navy. See Garfinkel, Simson, PGP: Pretty Good Privacy, O’Reilly & Associates, Sebastopol,
California, 1995, p. 91.
10 Levy, Steven, Crypto: How the Code Rebels Beat the Government, Saving Privacy in the Digital Age, Viking, New
shows that Bidzos could have a sense of humor about it.13 The question, left out here for brevity,
was initially answered by John P. Barlow (cofounder of the Electronic Frontier Foundation):
Barlow: The problem with trying to regulate the flow of information, which is a little
like trying to regulate the flow of wind, is that it’s quite possible to keep it out of
the hands of individuals and small institutions. It’s very difficult to keep it out of
the hands of large institutions, so you have, in effect, the situation where the Soviets
are using RSA in their launch codes and have for a long time, and yet we can’t use it as
individuals in the United States, you know, and that’s just dumb.
Bidzos: My revenue forecasts are being revised downward.
Barlow: You weren’t getting any royalties on that anyway, were you Jim?
Bidzos: Maybe.
Apparently, PGP didn’t initially bother NSA. Zimmermann soon learned (from Eli Biham)
that his system was vulnerable to differential cryptanalysis.14 Other flaws were present, including
one that prevented the last bit of each byte from being properly encrypted.15 After the flaws were
pointed out, Zimmermann began working on version 2.0, but this time he got help from much
stronger cryptographers from around the world,16 who appreciated what he was trying to do. Bass-
O-Matic was replaced by a Swiss cipher, the International Data Encryption Algorithm (IDEA),
which offered a 128-bit key. Many other improvements were made and new features introduced.
PGP 2.0 was released from Amsterdam and Auckland (the hometowns of two of Zimmermann’s
new collaborators) in September 1992.17
In November 1993, following a deal Zimmermann made in August of that year, ViaCrypt
PGP Version 2.4 came out. ViaCrypt had a license for RSA, so their product was legal, but it was
indeed a product. Users had to pay $100 for it.18
In the meanwhile RSA Data Security had released a free version, RSAREF, for noncommercial
use. The first version had restrictions that prevented Zimmermann from making use of it within
PGP,19 but RSAREF 2.0 didn’t, and Zimmermann included it in PGP Version 2.5. Thus, this
version, while free, was only legal for noncommercial use. Soon thereafter, Version 2.6 appeared, an
update made to appease Bidzos who was furious. Version 2.6 was intentionally made incompatible
13 Cryptography and Control: National Security vs. Personal Privacy [VHS], CFP Video Library #210, Topanga,
California, March 1992. This 77-minute tape shows a panel session, with questions and answers, from The
Second Conference on Computers, Freedom, & Privacy (CFP). See http://www.cfp.org/ for what this group
has done over the years and http://www.forests.com/cfpvideo/ for their full video catalog.
14 Levy, Steven, Crypto: How the Code Rebels Beat the Government, Saving Privacy in the Digital Age, Viking, New
(Spain), Peter Simons (Germany). See Garfinkel, Simson, PGP: Pretty Good Privacy, O’Reilly & Associates,
Sebastopol, California, 1995, p. 103; Levy, Steven, Crypto: How the Code Rebels Beat the Government, Saving
Privacy in the Digital Age, Viking, New York, 2001, p. 200.
17 Levy, Steven, Crypto: How the Code Rebels Beat the Government, Saving Privacy in the Digital Age, Viking, New
upon the RSA patent. See Garfinkel, Simson, PGP: Pretty Good Privacy, O’Reilly & Associates, Sebastopol,
California, 1995, p. 105.
with previous versions of PGP, beginning on September 1, 1994, so as not to allow the patent
infringers to continue without the upgrade.20 The program quickly appeared in Europe, in viola-
tion of the export laws of the time. This was followed by the appearance of PGP 2.6ui, which was
an “unofficial international” version that updated an older version in a way that made it compat-
ible with 2.6, for private or commercial use.
So in 1993 and 1994, PGP was legitimized in both commercial and freeware versions, but in
the meantime, Zimmermann began to have legal troubles with the U.S. Government. Beginning
in February 1993, Zimmermann faced investigators from the U.S. Customs Service. They
were concerned about how PGP was exported. It seems that they were looking for someone small
to make an example of. It would be easier to gain a prosecution of Zimmermann based on the
export laws than to take on RSA Data Security (whose RSAREF had been exported) or Netcom
(whose FTP servers could be used to download PGP from abroad).21
This wasn’t Zimmermann’s first brush with the law. He had been arrested twice before at
nuclear freeze rallies in the 1980s. No charges were ever filed, though.22 Ultimately, the export
violation investigation ended the same way. In 1996, the Government closed the case without
charges being pressed. Still, it must have been a nerve-wracking experience, as he could have faced
serious jail time if convicted.
PGP 3 was not a minor upgrade. It used a new algorithm for encryption, CAST-128, and
replaced the RSA component with a choice of DSA or Elgamal. Also, for the first time, the program
had a nice interface. All previous versions were run from the command line. ViaCrypt was still
producing commercial versions. They used even numbers for their versions, while Zimmermann
used odd numbers for the free versions, but the commercial version 4 was ready before the free
version 3, so Zimmermann renamed the free version PGP 5, for its May 1997 release.23
The commercial production of PGP software has changed hands several times since 1997.
Most recently, in 2010, Symantec Corp. bought PGP for $300 million.24 The company would cer-
tainly be worth far less if the export laws hadn’t been changed in 2000, making all versions of PGP
legal for export. Was this a victory for privacy advocates? Zimmermann wrote:25
The law changed because the entire U.S. computer industry (which is the largest,
most powerful industry in the U.S.) was united in favor of lifting the export controls.
Money means political influence. After years of fighting it, the government finally had
to surrender. If the White House had not lifted the export controls, the Congress and
the Judicial system were preparing to intervene to do it for them.
It was described by some as a victory for civil libertarians, but they only won because of their
powerful allies in industry.
There is much more that can be said about PGP. Back in Section 1.17 we briefly digressed into
data compression. This is relevant to PGP, which compresses files prior to encryption. Because
20 Garfinkel, Simson, PGP: Pretty Good Privacy, O’Reilly & Associates, Sebastopol, California, 1995, p. 108.
21 Garfinkel, Simson, PGP: Pretty Good Privacy, O’Reilly & Associates, Sebastopol, California, 1995, p. 112. The
opinion concerning why Zimmermann was targeted is my own.
22 Levy, Steven, Crypto: How the Code Rebels Beat the Government, Saving Privacy in the Digital Age, Viking, New
at http://www.computerworld.com/s/article/9176121/Symantec_buys_encryption_specialist_PGP_for_300M.
25 http://www.philzimmermann.com/EN/faq/index.html.
compression reduces redundancy, and redundancy is of great value to the cryptanalyst, this step
is well worth the extra time it takes. It should also be pointed out that PGP isn’t just for email. It
includes a feature allowing the user to apply conventional cryptography (no RSA involved here)
to compress and encipher files for storage.26 The user only needs to come up with a random pass-
phrase to use as a key.
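The compress-then-encrypt order is easy to demonstrate with the standard library. This sketch is mine (zlib plus a toy passphrase-keyed stream, not PGP's actual ZIP and IDEA algorithms); the point is that compression must come first, since well-encrypted data has no redundancy left to squeeze out.

```python
import zlib, hashlib

def passphrase_stream(passphrase: str, length: int) -> bytes:
    """Toy keystream derived from a passphrase; PGP used IDEA, not this."""
    out, counter = b"", 0
    while len(out) < length:
        out += hashlib.sha256(f"{passphrase}:{counter}".encode()).digest()
        counter += 1
    return out[:length]

def protect(data: bytes, passphrase: str) -> bytes:
    squeezed = zlib.compress(data)        # remove redundancy first
    stream = passphrase_stream(passphrase, len(squeezed))
    return bytes(a ^ b for a, b in zip(squeezed, stream))

def recover(blob: bytes, passphrase: str) -> bytes:
    stream = passphrase_stream(passphrase, len(blob))
    return zlib.decompress(bytes(a ^ b for a, b in zip(blob, stream)))

text = b"redundant redundant redundant redundant text" * 20
sealed = protect(text, "correct horse")
print(len(sealed) < len(text), recover(sealed, "correct horse") == text)
# True True
```

Beyond denying the cryptanalyst redundancy to work with, the smaller ciphertext is also cheaper to store and transmit.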
It’s personal. It’s private. And it’s no one’s business but yours. You may be planning
a political campaign, discussing your taxes, or having a secret romance. Or you may
be communicating with a political dissident in a repressive country. Whatever it is,
you don’t want your private electronic mail (email) or confidential documents read by
anyone else. There’s nothing wrong with asserting your privacy. Privacy is as apple-pie
as the Constitution.
The right to privacy is spread implicitly throughout the Bill of Rights. But when
the United States Constitution was framed, the Founding Fathers saw no need to
explicitly spell out the right to a private conversation. That would have been silly.
Two hundred years ago, all conversations were private. If someone else was within
earshot, you could just go out behind the barn and have your conversation there. No
one could listen in without your knowledge. The right to a private conversation was a
natural right, not just in a philosophical sense, but in a law-of-physics sense, given the
technology of the time.
26 Zimmermann, Philip R., The Official PGP User’s Guide, The MIT Press, Cambridge, Massachusetts, 1995, p. 18.
27 Reproduced from http://web.mit.edu/prz/, Zimmermann’s homepage.
But with the coming of the information age, starting with the invention of the tele-
phone, all that has changed. Now most of our conversations are conducted electroni-
cally. This allows our most intimate conversations to be exposed without our knowledge.
Cellular phone calls may be monitored by anyone with a radio. Electronic mail, sent
across the Internet, is no more secure than cellular phone calls. Email is rapidly replacing
postal mail, becoming the norm for everyone, not the novelty it was in the past.
Until recently, if the government wanted to violate the privacy of ordinary citizens,
they had to expend a certain amount of expense and labor to intercept and steam open
and read paper mail. Or they had to listen to and possibly transcribe spoken telephone
conversation, at least before automatic voice recognition technology became available.
This kind of labor-intensive monitoring was not practical on a large scale. It was only
done in important cases when it seemed worthwhile. This is like catching one fish at a
time, with a hook and line. Today, email can be routinely and automatically scanned
for interesting keywords, on a vast scale, without detection. This is like driftnet fish-
ing. And exponential growth in computer power is making the same thing possible
with voice traffic.
Perhaps you think your email is legitimate enough that encryption is unwarranted.
If you really are a law-abiding citizen with nothing to hide, then why don’t you always
send your paper mail on postcards? Why not submit to drug testing on demand? Why
require a warrant for police searches of your house? Are you trying to hide something?
If you hide your mail inside envelopes, does that mean you must be a subversive or
a drug dealer, or maybe a paranoid nut? Do law-abiding citizens have any need to
encrypt their email?
What if everyone believed that law-abiding citizens should use postcards for their
mail? If a nonconformist tried to assert his privacy by using an envelope for his mail, it
would draw suspicion. Perhaps the authorities would open his mail to see what he’s hid-
ing. Fortunately, we don’t live in that kind of world, because everyone protects most of
their mail with envelopes. So no one draws suspicion by asserting their privacy with an
envelope. There’s safety in numbers. Analogously, it would be nice if everyone routinely
used encryption for all their email, innocent or not, so that no one drew suspicion by
asserting their email privacy with encryption. Think of it as a form of solidarity.
Senate Bill 266, a 1991 omnibus anticrime bill, had an unsettling measure bur-
ied in it. If this non-binding resolution had become real law, it would have forced
manufacturers of secure communications equipment to insert special “trap doors” in
their products, so that the government could read anyone’s encrypted messages. It
reads, “It is the sense of Congress that providers of electronic communications services
and manufacturers of electronic communications service equipment shall ensure that
communications systems permit the government to obtain the plain text contents of
voice, data, and other communications when appropriately authorized by law.” It was
this bill that led me to publish PGP electronically for free that year, shortly before the
measure was defeated after vigorous protest by civil libertarians and industry groups.
The 1994 Communications Assistance for Law Enforcement Act (CALEA) man-
dated that phone companies install remote wiretapping ports into their central office
digital switches, creating a new technology infrastructure for “point-and-click” wire-
tapping, so that federal agents no longer have to go out and attach alligator clips to
phone lines. Now they will be able to sit in their headquarters in Washington and lis-
ten in on your phone calls. Of course, the law still requires a court order for a wiretap.
516 ◾ Secret History
But while technology infrastructures can persist for generations, laws and policies can
change overnight. Once a communications infrastructure optimized for surveillance
becomes entrenched, a shift in political conditions may lead to abuse of this new-
found power. Political conditions may shift with the election of a new government, or
perhaps more abruptly from the bombing of a federal building.
A year after the CALEA passed, the FBI disclosed plans to require the phone
companies to build into their infrastructure the capacity to simultaneously wiretap
1 percent of all phone calls in all major U.S. cities. This would represent more than
a thousandfold increase over previous levels in the number of phones that could be
wiretapped. In previous years, there were only about a thousand court-ordered wire-
taps in the United States per year, at the federal, state, and local levels combined. It’s
hard to see how the government could even employ enough judges to sign enough
wiretap orders to wiretap 1 percent of all our phone calls, much less hire enough
federal agents to sit and listen to all that traffic in real time. The only plausible way
of processing that amount of traffic is a massive Orwellian application of automated
voice recognition technology to sift through it all, searching for interesting keywords
or searching for a particular speaker’s voice. If the government doesn’t find the target
in the first 1 percent sample, the wiretaps can be shifted over to a different 1 per-
cent until the target is found, or until everyone’s phone line has been checked for
subversive traffic. The FBI said they need this capacity to plan for the future. This
plan sparked such outrage that it was defeated in Congress. But the mere fact that
the FBI even asked for these broad powers is revealing of their agenda. Advances in
technology will not permit the maintenance of the status quo, as far as privacy is
concerned. The status quo is unstable. If we do nothing, new technologies will give
the government new automatic surveillance capabilities that Stalin could never have
dreamed of. The only way to hold the line on privacy in the information age is strong
cryptography.
You don’t have to distrust the government to want to use cryptography. Your busi-
ness can be wiretapped by business rivals, organized crime, or foreign governments.
Several foreign governments, for example, admit to using their signals intelligence
against companies from other countries to give their own corporations a competitive
edge. Ironically, the United States government’s restrictions on cryptography in the
1990s have weakened U.S. corporate defenses against foreign intelligence and orga-
nized crime.
The government knows what a pivotal role cryptography is destined to play in
the power relationship with its people. In April 1993, the Clinton administration
unveiled a bold new encryption policy initiative, which had been under develop-
ment at the National Security Agency (NSA) since the start of the Bush administra-
tion. The centerpiece of this initiative was a government-built encryption device,
called the Clipper chip, containing a new classified NSA encryption algorithm. The
government tried to encourage private industry to design it into all their secure
communication products, such as secure phones, secure faxes, and so on. AT&T
put Clipper into its secure voice products. The catch: At the time of manufacture,
each Clipper chip is loaded with its own unique key, and the government gets to
keep a copy, placed in escrow. Not to worry, though–the government promises that
they will use these keys to read your traffic only “when duly authorized by law.” Of
course, to make Clipper completely effective, the next logical step would be to out-
law other forms of cryptography.
The government initially claimed that using Clipper would be voluntary, that no
one would be forced to use it instead of other types of cryptography. But the public
reaction against the Clipper chip was strong, stronger than the government antici-
pated. The computer industry monolithically proclaimed its opposition to using
Clipper. FBI director Louis Freeh responded to a question in a press conference in
1994 by saying that if Clipper failed to gain public support, and FBI wiretaps were
shut out by non-government-controlled cryptography, his office would have no choice
but to seek legislative relief. Later, in the aftermath of the Oklahoma City tragedy, Mr.
Freeh testified before the Senate Judiciary Committee that public availability of strong
cryptography must be curtailed by the government (although no one had suggested
that cryptography was used by the bombers).
The government has a track record that does not inspire confidence that they will
never abuse our civil liberties. The FBI’s COINTELPRO program targeted groups
that opposed government policies. They spied on the antiwar movement and the civil
rights movement. They wiretapped the phone of Martin Luther King Jr. Nixon had
his enemies list. Then there was the Watergate mess. More recently, Congress has
either attempted to or succeeded in passing laws curtailing our civil liberties on the
Internet. Some elements of the Clinton White House collected confidential FBI files
on Republican civil servants, conceivably for political exploitation. And some over-
zealous prosecutors have shown a willingness to go to the ends of the Earth in pursuit
of exposing sexual indiscretions of political enemies. At no time in the past century
has public distrust of the government been so broadly distributed across the political
spectrum, as it is today.
Throughout the 1990s, I figured that if we want to resist this unsettling trend in the
government to outlaw cryptography, one measure we can apply is to use cryptography
as much as we can now while it’s still legal. When use of strong cryptography becomes
popular, it’s harder for the government to criminalize it. Therefore, using PGP is good
for preserving democracy. If privacy is outlawed, only outlaws will have privacy.
It appears that the deployment of PGP must have worked, along with years of
steady public outcry and industry pressure to relax the export controls. In the clos-
ing months of 1999, the Clinton administration announced a radical shift in export
policy for crypto technology. They essentially threw out the whole export control
regime. Now, we are finally able to export strong cryptography, with no upper lim-
its on strength. It has been a long struggle, but we have finally won, at least on the
export control front in the US. Now we must continue our efforts to deploy strong
crypto, to blunt the effects of increasing surveillance efforts on the Internet by various
governments. And we still need to entrench our right to use it domestically over the
objections of the FBI.
PGP empowers people to take their privacy into their own hands. There has been
a growing social need for it. That’s why I wrote it.
Philip R. Zimmermann
Boulder, Colorado
June 1991 (updated 1999)
28 http://www.philzimmermann.com/EN/letters/index.html.
29 Levy, Steven, Crypto: How the Code Rebels Beat the Government, Saving Privacy in the Digital Age, Viking, New
York, 2001, p. 289.
30 Wikipedia contributors, “Pretty Good Privacy,” Wikipedia, The Free Encyclopedia, https://en.wikipedia.org/
wiki/Pretty_Good_Privacy#Security_quality.
31 Chidi, George A., “Federal judge allows keyboard-stroke capture,” cnn.com, January 7, 2002, http://www.cnn.
com/2002/TECH/internet/01/07/fbi.surveillance.idg/index.html.
32 McCullagh, Declan, “Feds use keylogger to thwart PGP, Hushmail,” c|net, July 20, 2007, https://www.cnet.
com/news/feds-use-keylogger-to-thwart-pgp-hushmail/.
33 Poddebniak, Damian, Christian Dresen, Jens Müller, Fabian Ising, Sebastian Schinzel, Simon Friedberger,
Juraj Somorovsky, and Jörg Schwenk, “Efail: Breaking S/MIME and OpenPGP Email Encryption using
Exfiltration Channels,” 27th USENIX Security Symposium, Baltimore, Maryland, August 2018. See https://
efail.de/ for a link to this and much more.
Before I could answer, a coworker responded, “Why? Did you break up with him?” She immedi-
ately blushed. Her password was not well chosen.
The psychological approach was also applied by Richard Feynman at Los Alamos during the
Manhattan Project, when the secrets of the atomic bomb warranted the highest level of protection.
He turned to one of the filing cabinets that stored those secrets and guessed that the combination
lock that secured it might be set to an important mathematical constant. He tried π first, forward, backward, every way he could imagine. It didn't work, but e did. The combination was 27-18-28, and Feynman soon found he could open all five filing cabinets with that combination.34
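Feynman's guess amounts to a tiny dictionary attack on the combination space: rather than trying all 100³ dial settings, try the handful derived from famous constants first. A minimal sketch of that search (the digit layout and lock model are illustrative assumptions, not Feynman's actual procedure):

```python
# Sketch: a Feynman-style "psychological" attack on a 3-number combination
# lock. Instead of brute-forcing all 100**3 settings, try combinations built
# from the leading digits of famous constants, forwards and backwards.

def constant_to_combo(digits):
    """Turn a string of six digits into a 3-number combination, two digits each."""
    return [int(digits[i:i + 2]) for i in (0, 2, 4)]

CONSTANTS = {
    "pi": "314159",
    "e":  "271828",
}

def candidate_combos():
    for name, digits in CONSTANTS.items():
        combo = constant_to_combo(digits)
        yield name, combo                        # forwards
        yield name + " reversed", combo[::-1]    # backwards

# The filing cabinet's real combination (e = 2.7 18 28, as Feynman found):
SECRET = [27, 18, 28]

for name, combo in candidate_combos():
    if combo == SECRET:
        print(f"Opened with {name}: {combo}")  # → Opened with e: [27, 18, 28]
```

Four candidates instead of a million: that ratio is the whole point of the psychological approach.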
A study of 43,713 hacked MySpace account passwords revealed the following top 10 list, with
their frequencies in parentheses.35
1. password1 (99)
2. iloveyou1 (52)
3. abc123 (47)
4. myspace1 (39)
5. fuckyou1 (31)
6. summer07 (29)
7. iloveyou2 (28)
8. qwerty1 (26)
9. football1 (25)
10. 123abc (22)
MySpace forced users to include at least one non-alphabetic character in their passwords. Many users obviously just tacked a 1 onto their original choice. A study of 32 million passwords hacked from RockYou.com gives a different top 10 list.36
1. 123456
2. 12345
3. 123456789
4. Password
5. iloveyou
6. princess
7. rockyou
8. 1234567
9. 12345678
10. abc123
Passwords ought not be vulnerable to dictionary attacks, ought not consist solely of letters, and
ought to be long. Ideally, they look random. We remember many random-looking numbers (phone numbers, Social Security numbers, etc.), so it ought to be easy enough to remember one more with a mix of numbers and letters. But it really isn't just one more! We need passwords for bank cards and for every website we wish to use for online purchases, and they should all be different! Often this is not the case. Recall that the atomic secrets alluded to earlier were distributed over five filing cabinets, all of which had e as the combination. Inevitably, in many cases, passwords are written down and kept near the computer.
34 Feynman, Richard, "Surely You're Joking Mr. Feynman!", W. W. Norton & Company, New York, 1985, pp. 147–151.
35 "A brief analysis of 40,000 leaked MySpace passwords," November 1, 2007, http://www.the-interweb.com/serendipity/index.php?/archives/94-A-brief-analysis-of-40,000-leaked-MySpace-passwords.html.
36 Coursey, David, "Study: Hacking Passwords Easy As 123456," PCWorld, January 21, 2010, available online at http://www.pcworld.com/businesscenter/article/187354/study_hacking_passwords_easy_as_123456.html.
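The weakness of such passwords can be made concrete. Crackers run dictionary attacks: hash each candidate from a wordlist and compare against the leaked hashes. A minimal sketch, assuming (purely for illustration) a site that stored unsalted SHA-256 hashes; the wordlist here is just a few entries from the top-10 lists above:

```python
import hashlib

# Sketch of a dictionary attack: hash each candidate password and look it up
# in the set of leaked hashes. Real sites should use salted, slow hashes
# (bcrypt, scrypt, etc.), which is precisely what makes this attack expensive.

WORDLIST = ["123456", "password1", "iloveyou", "abc123", "qwerty1"]

def sha256_hex(pw):
    return hashlib.sha256(pw.encode()).hexdigest()

# Pretend these were leaked from a site storing unsalted SHA-256 hashes:
leaked = {sha256_hex("abc123"), sha256_hex("correct horse battery staple")}

cracked = [pw for pw in WORDLIST if sha256_hex(pw) in leaked]
print(cracked)  # → ['abc123']
```

The dictionary word falls instantly; the long random-looking passphrase survives because it is in no wordlist.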
Another potential place for implementation flaws is in generating the primes to be used for the
RSA portion, as we saw in Section 15.1.12. Back in the 1990s, PGP users were able to select the size
of these primes from the low-commercial, high-commercial or “military” grade — up to over 1,000
bits.37 Increased speed is the only reason to select the smaller sizes. Once a size is selected, the pro-
gram prompts the user to type some arbitrary text. The text itself is ignored, but the time interval
between keystrokes is used to generate random numbers that are then used to generate the primes.38
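The keystroke-timing idea can be sketched as follows. This is not PGP's actual code: it simply illustrates hashing inter-keystroke intervals into a seed and using the seeded generator to search for a probable prime (the timings here are hard-coded rather than measured from a keyboard):

```python
import hashlib
import random

# Sketch of the trick described above: the typed text is ignored, but the
# timing gaps between keystrokes are hashed into a seed for the prime search.

def seed_from_timings(intervals_ms):
    data = b"".join(t.to_bytes(4, "big") for t in intervals_ms)
    return int.from_bytes(hashlib.sha256(data).digest(), "big")

def is_probable_prime(n, rounds=20):
    """Miller-Rabin probabilistic primality test."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
        if n % p == 0:
            return n == p
    d, r = n - 1, 0
    while d % 2 == 0:
        d //= 2
        r += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False
    return True

def keystroke_prime(intervals_ms, bits=256):
    rng = random.Random(seed_from_timings(intervals_ms))
    while True:
        # Force the top bit (full size) and bottom bit (odd):
        candidate = rng.getrandbits(bits) | (1 << (bits - 1)) | 1
        if is_probable_prime(candidate):
            return candidate

# Simulated inter-keystroke gaps in milliseconds:
p = keystroke_prime([182, 97, 143, 201, 88, 119, 256, 74])
print(p.bit_length())  # → 256
```

The human-generated jitter is the entropy source; everything after the hash is deterministic, which is why poor entropy collection here would compromise the keys themselves.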
There are many other technical details that need to be addressed if one wants to implement a
hybrid system securely. Bearing this in mind, we can better understand why it took Zimmermann
so long to code up the first version of PGP.
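The overall shape of such a hybrid system can be sketched briefly: a random session key encrypts the bulk of the message with a fast symmetric cipher, and only that short key passes through the slow public-key (RSA) step. The sketch below uses textbook-toy RSA parameters and a homemade XOR keystream purely for illustration; it is not secure and is not PGP's actual construction:

```python
import hashlib
import os

# Toy hybrid cryptosystem in the spirit of PGP: symmetric cipher for the bulk
# data, RSA only for the short session key. Real systems use 2048-bit RSA (or
# ECC) and AES; the numbers below are the classic textbook example.

# Textbook RSA with tiny primes p=61, q=53: n=3233, e=17, d=2753.
N, E, D = 3233, 17, 2753

def stream_encrypt(key: bytes, msg: bytes) -> bytes:
    """Toy symmetric cipher: XOR with a SHA-256-derived keystream."""
    stream = b""
    counter = 0
    while len(stream) < len(msg):
        stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(m ^ s for m, s in zip(msg, stream))

def hybrid_encrypt(msg: bytes):
    session_key = os.urandom(16)
    ciphertext = stream_encrypt(session_key, msg)
    # RSA-wrap the session key byte by byte (toy: this modulus is far too
    # small to wrap the whole key at once).
    wrapped = [pow(b, E, N) for b in session_key]
    return wrapped, ciphertext

def hybrid_decrypt(wrapped, ciphertext):
    session_key = bytes(pow(c, D, N) for c in wrapped)
    return stream_encrypt(session_key, ciphertext)  # XOR is its own inverse

wrapped, ct = hybrid_encrypt(b"Attack at dawn")
print(hybrid_decrypt(wrapped, ct))  # → b'Attack at dawn'
```

Even this toy shows where the implementation pitfalls live: key generation, random session keys, and the wrapping step, none of which the high-level description makes obvious.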
Essentially, officials want Congress to require all services that enable communica-
tions—including encrypted e-mail transmitters like BlackBerry, social networking
Web sites like Facebook and software that allows direct “peer to peer” messaging like
Skype—to be technically capable of complying if served with a wiretap order. The
mandate would include being able to intercept and unscramble encrypted messages.39
They can promise strong encryption. They just need to figure out how they can pro-
vide us plain text.
—FBI General Counsel Valerie Caproni, September 27, 2010.40
37 Zimmermann, Philip R., The Official PGP User's Guide, The MIT Press, Cambridge, Massachusetts, 1995, p. 21.
38 Zimmermann, Philip R., The Official PGP User's Guide, The MIT Press, Cambridge, Massachusetts, 1995, p. 22.
39 Savage, Charlie, "U.S. Tries to Make It Easier to Wiretap the Internet," New York Times, September 27, 2010.
According to the New York Times article, the administration expected that the bill would be considered in 2011. However, the plan fizzled out. At a congressional hearing on February 17, 2011, the following exchange took place between Henry C. "Hank" Johnson, Jr., a Democratic Congressman from Georgia, and Caproni:
Mr. JOHNSON. What is it exactly that you would want Congress to do, or are you
asking Congress for anything?
Ms. CAPRONI. Not yet.
Mr. JOHNSON. Or did we just simply invite you here to tell us about this?
Ms. CAPRONI. You invited me, and we came. But we don’t have a specific request yet.
We are still—the Administration is considering—I am really here today to talk about
the problem. And I think if everyone understands that we have a problem, that is the
first step, and then figuring out how we fix it is the second step. The Administration
does not yet have a proposal. It is something that is actively being discussed within the
Administration, and I am optimistic that we will have a proposal in the near future.41
The FBI did not propose a bill in the near future. Apparently, they couldn’t figure out exactly
what it would be.
Earlier in the hearing, Susan Landau, of the Radcliffe Institute for Advanced Study, Harvard
University, gave testimony explaining NSA’s point of view.
I want to step back for a moment and talk about cryptography, a fight we had in
the 1990’s in which the NSA and the FBI opposed the deployment of cryptography
through the communications infrastructure. In 1999, the U.S. Government changed
its policy.
The NSA has been firmly behind the change of policy, and endorsed a full set of
unclassified algorithms to be used for securing the communications network. The
NSA obviously believes that in the conflict between communications surveillance and
communications security, we need to have communications security.42
41 Going Dark: Lawful Electronic Surveillance in the Face of New Technologies, Hearing before the Subcommittee
on Crime, Terrorism, and Homeland Security of the Committee on the Judiciary House of Representatives,
One Hundred Twelfth Congress, First Session, February 17, 2011, pp. 49–50, available online at https://www.
govinfo.gov/content/pkg/CHRG-112hhrg64581/pdf/CHRG-112hhrg64581.pdf.
42 Going Dark: Lawful Electronic Surveillance in the Face of New Technologies, Hearing before the Subcommittee
on Crime, Terrorism, and Homeland Security of the Committee on the Judiciary House of Representatives,
One Hundred Twelfth Congress, First Session, February 17, 2011, p. 24, available online at https://www.
govinfo.gov/content/pkg/CHRG-112hhrg64581/pdf/CHRG-112hhrg64581.pdf.
brute-force, but the phone was set to make its contents permanently inaccessible (by erasing the stored form of the AES encryption key) if the correct code wasn't entered by the tenth attempt.43
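The arithmetic behind that safeguard is worth making explicit: a four-digit passcode offers only 10,000 possibilities, so without the ten-attempt limit it falls to exhaustive search almost instantly. A toy model (the key check and wipe logic here are simplified stand-ins for what the phone actually does in hardware):

```python
import hashlib

# Why the 10-try wipe matters: a 4-digit PIN has only 10,000 possibilities.
# This toy device derives a key check from the PIN and erases its stored key
# material after the tenth wrong attempt.

class ToyDevice:
    MAX_ATTEMPTS = 10

    def __init__(self, pin: str):
        self._key_check = hashlib.sha256(pin.encode()).hexdigest()
        self._attempts = 0
        self._wiped = False

    def try_pin(self, guess: str) -> bool:
        if self._wiped:
            raise RuntimeError("key material erased; contents unrecoverable")
        self._attempts += 1
        if hashlib.sha256(guess.encode()).hexdigest() == self._key_check:
            return True
        if self._attempts >= self.MAX_ATTEMPTS:
            self._wiped = True
            self._key_check = None  # model erasing the stored AES key
        return False

# Without the limit, exhaustive search is trivial; with it, the attacker
# gets at most 10 of the 10,000 candidates before the wipe.
device = ToyDevice(pin="7235")
for candidate in (f"{i:04d}" for i in range(10000)):
    try:
        if device.try_pin(candidate):
            print("cracked:", candidate)
            break
    except RuntimeError:
        print("wiped after", device.MAX_ATTEMPTS, "attempts")
        break
```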
On February 9, 2016, FBI Director James Comey claimed that the Bureau couldn’t unlock the
iPhone.44 The FBI appealed to Apple to create software that would allow the iPhone’s contents to
be accessed, but Apple refused and the legal battle began. FBI leadership apparently thought that
the emotionally charged issue of terrorism was one that the Bureau could use to rally the American
public around their old (lost) cause, to force telecommunications providers to develop techniques
to provide the government with access to encrypted data on demand.
On February 16, in "A Message to our customers," the tech giant's CEO, Tim Cook, explained:45
The United States government has demanded that Apple take an unprecedented step
which threatens the security of our customers. We oppose this order, which has impli-
cations far beyond the legal case at hand.
Cook also explained why encryption is necessary:46
Smartphones, led by iPhone, have become an essential part of our lives. People use
them to store an incredible amount of personal information, from our private conver-
sations to our photos, our music, our notes, our calendars and contacts, our financial
information and health data, even where we have been and where we are going.
All that information needs to be protected from hackers and criminals who want
to access it, steal it, and use it without our knowledge or permission. Customers expect
Apple and other technology companies to do everything in our power to protect their
personal information, and at Apple we are deeply committed to safeguarding their
data.
Compromising the security of our personal information can ultimately put our
personal safety at risk. That is why encryption has become so important to all of us.
For many years, we have used encryption to protect our customers’ personal data
because we believe it’s the only way to keep their information safe. We have even put
that data out of our own reach, because we believe the contents of your iPhone are
none of our business.
Cook then noted that Apple had provided the FBI with all of the information that it could actually access and legally supply in connection with the San Bernardino case, and explained why creating the backdoor the FBI requested would be a dangerous move.47
But now the U.S. government has asked us for something we simply do not have, and
something we consider too dangerous to create. They have asked us to build a back-
door to the iPhone.
The government suggests this tool could only be used once, on one phone. But
that’s simply not true. Once created, the technique could be used over and over again,
on any number of devices. In the physical world, it would be the equivalent of a master
key, capable of opening hundreds of millions of locks — from restaurants and banks
to stores and homes. No reasonable person would find that acceptable.
The government is asking Apple to hack our own users and undermine decades
of security advancements that protect our customers — including tens of millions of
American citizens — from sophisticated hackers and cybercriminals.
Near the end of the letter, Cook wrote, “We are challenging the FBI’s demands with the deep-
est respect for American democracy and a love of our country.” As I see it, opposing the govern-
ment, when it is wrong, is a patriotic act. Cook apparently felt the same way, for he closed with
“And ultimately, we fear that this demand [from the FBI] would undermine the very freedoms and
liberty our government is meant to protect.”
The (online) letter included a link to “Answers to your questions about Apple and security.”
This Q and A offered a bit more detail to help people understand the implications of a backdoor,
in terms of legal precedent, and the possibility of abuse, noting, “The only way to guarantee that
such a powerful tool isn’t abused and doesn’t fall into the wrong hands is to never create it.”48
On February 18, John McAfee, who has been imprisoned in 11 countries and described as a
“cybersecurity legend and psychedelic drug enthusiast,”49 weighed in on the matter in an op-ed piece.
The fundamental question is this: Why can’t the FBI crack the encryption on its own?
It has the full resources of the best the US government can provide.
With all due respect to Tim Cook and Apple, I work with a team of the best hack-
ers on the planet. These hackers attend Defcon in Las Vegas, and they are legends in
their local hacking groups, such as HackMiami. They are all prodigies, with talents
that defy normal human comprehension. About 75% are social engineers. The remain-
der are hardcore coders. I would eat my shoe on the Neil Cavuto show if we could not
break the encryption on the San Bernardino phone. This is a pure and simple fact.
And why do the best hackers on the planet not work for the FBI? Because the FBI
will not hire anyone with a 24-inch purple mohawk, 10-gauge ear piercings, and a tat-
tooed face who demands to smoke weed while working and won’t work for less than
a half-million dollars a year. But you bet your ass that the Chinese and Russians are
hiring similar people with similar demands and have been for many years. It’s why we
are decades behind in the cyber race.50
So here is my offer to the FBI. I will, free of charge, decrypt the information on the
San Bernardino phone, with my team. We will primarily use social engineering, and
it will take us three weeks. If you accept my offer, then you will not need to ask Apple
to place a backdoor in its product, which will be the beginning of the end of America.
48 “Answers to your questions about Apple and security,” Apple, https://www.apple.com/customer-letter/answers/.
49 Hathaway, Jay, “Antivirus Wild Man John McAfee Offers to Solve FBI’s iPhone Problem So Apple Doesn’t
Have To,” February 19, 2016, Intelligencer, available online at https://nymag.com/intelligencer/2016/02/john-
mcafee-says-he-can-crack-that-iphone.html.
50 McAfee, John, “JOHN MCAFEE: I’ll decrypt the San Bernardino phone free of charge so Apple doesn’t
need to place a back door on its product,” Business Insider, February 18, 2016, available online at https://www.
businessinsider.com/john-mcafee-ill-decrypt-san-bernardino-phone-for-free-2016-2.
If you doubt my credentials, Google “cybersecurity legend” and see whose name
is the only name that appears in the first 10 results out of more than a quarter of a
million.51
While a social engineering attack (manipulating people to get them to provide access or infor-
mation) is often the easiest approach, it wouldn’t be useful in the case of the iPhone. Indeed,
McAfee later admitted that what he wrote here (and said elsewhere) wasn't exactly right.
I speak through the press, to the press, and to the general public. For example, last
night I was on RT, and I gave a vastly oversimplified explanation of how you would
hack into the iPhone.52 I can’t possibly go in and talk about the secure spaces on the
A7 chip. I mean, who’s going to understand that crap? Nobody. But you gotta believe
me: I understand it. And I do know what I’m doing, else I would not be where I am.
This is a fact. Someone who does not understand software cannot start a multibillion
dollar company. This is just a fact of life. So, if I look like an idiot, it is because I am
speaking to idiots.53
By doing so, I knew that I would get a shitload of public attention, which I did. That video,
on my YouTube account, it has 700,000 views. My point is to bring to the American pub-
lic the problem that the FBI is trying to [fool] the American public. How am I going to
do that, by just going off and saying it? No one is going to listen to that crap. So I come
up with something sensational. Now, what I did not lie about was my ability to crack the
iPhone. I can do it. It’s a piece of friggin’ cake. You could probably do it.54
Many others in the tech community found some of the FBI’s statements disingenuous. To
them it seemed like the FBI was more interested in setting a legal precedent than in gaining access
to the iPhone in question. There were certainly people the FBI could appeal to if all they truly
wanted was the contents of that particular phone. When the FBI revealed, on March 28, 2016, that they had found a third party who was able to access the iPhone without Apple's help,55 the reaction from many was, "Of course you did." Because the FBI backed down, the case ended without a ruling against them. If the Bureau had been confident it would win, would it even have appealed to the mysterious third party for help?
51 McAfee, John, “JOHN MCAFEE: I’ll decrypt the San Bernardino phone free of charge so Apple doesn’t
need to place a back door on its product,” Business Insider, February 18, 2016, available online at https://www.
businessinsider.com/john-mcafee-ill-decrypt-san-bernardino-phone-for-free-2016-2.
52 McAfee, John, “John McAfee Reveals To FBI, On National TV, How To Crack The iPhone (RT Interview),”
article/12277-john-mcafee-challenges-reddit.
54 Turton, William, daily dot, “John McAfee lied about San Bernardino shooter’s iPhone hack to ‘get a s**tload of
There’s an important detail to consider when it comes to assessing how honest the FBI has
been on this issue. FBI Director James Comey indicated that the Bureau paid over $1.3 million
to the third party for the hack.56 If so, the FBI was drastically overcharged, as you will soon see.
According to another source, the cost was only $15,000.57 If the latter is true, why did Comey lie?
On September 14, 2016, Sergei Skorobogatov, a computer scientist at the University of Cambridge, published a paper online giving full details of how to hack the iPhone 5C. Skorobogatov noted, "The process does not require any expensive and sophisticated equipment. All needed parts are low cost and were obtained from local electronics distributors."58 He also posted videos demonstrating the process on YouTube. The timing of this research project was not a coincidence. On
March 28, Comey had said that “NAND mirroring” wouldn’t be used to hack into the iPhone,
and that “It doesn’t work.”59 This was the approach that Skorobogatov successfully applied. John
Gruber noted, “When the FBI lies it’s a “fib”. When you lie to the FBI it’s a “felony”.”60
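NAND mirroring attacks the attempt counter rather than the cryptography: clone the flash chip that stores the counter, burn a few guesses, restore the clone, and repeat. A toy sketch of that loop (the device model is hypothetical and omits everything iPhone-specific):

```python
import copy

# Sketch of the NAND-mirroring idea Skorobogatov demonstrated: the passcode
# attempt counter lives in NAND flash, so an attacker who can clone the chip
# can try a few PINs, rewrite the saved image, and try again indefinitely.

class NandDevice:
    """Hypothetical device whose state (counter included) lives on the chip."""
    MAX_ATTEMPTS = 10

    def __init__(self, pin):
        self.nand = {"pin": pin, "attempts": 0, "wiped": False}

    def try_pin(self, guess):
        if self.nand["wiped"]:
            return False
        self.nand["attempts"] += 1
        if guess == self.nand["pin"]:
            return True
        if self.nand["attempts"] >= self.MAX_ATTEMPTS:
            self.nand["wiped"] = True
        return False

def mirror_attack(device, pin_space):
    backup = copy.deepcopy(device.nand)          # clone the NAND chip
    for i, guess in enumerate(pin_space):
        if device.try_pin(guess):
            return guess
        if (i + 1) % (device.MAX_ATTEMPTS - 1) == 0:
            device.nand = copy.deepcopy(backup)  # restore: counter resets
    return None

device = NandDevice(pin="9881")
found = mirror_attack(device, (f"{i:04d}" for i in range(10000)))
print(found)  # → 9881
```

By restoring the clone every nine guesses, the counter never reaches ten, and the full four-digit space can be searched despite the wipe policy.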
As of this writing (May 23, 2020), McAfee's interview on RT61 (inaccurately explaining an iPhone 5C hack) has 1,127,246 views, while Skorobogatov's video62 (actually demonstrating an iPhone 5C hack) has only 241,601 views. It looks like McAfee is pretty media savvy after all. The
conclusion that the FBI was playing the media was reached by some politicians, in addition to
technical people. For example, Democratic Senator Ron Wyden said, “There are real questions
about whether [the FBI] has been straight with the public on [the Apple case].”63
The San Bernardino shooter case wasn't the first time the FBI initiated such a legal challenge.
An earlier instance involved an iPhone 5S, but the crime involved drugs, not terrorism. The FBI
lost this case on February 29, 2016, when a federal judge rejected its request to order Apple to open
the iPhone.64 There were several other attempts involving iPhones and iPads, but the terrorist’s
iPhone got the most attention.
56 Lichtblau, Eric and Katie Benner, The New York Times, “F.B.I. Director Suggests Bill for iPhone Hacking
Topped $1.3 Million,” April 21, 2016, https://www.nytimes.com/2016/04/22/us/politics/fbi-director-suggests-
bill-for-iphone-hacking-was-1-3-million.html.
57 Fontana, John, ZDNet, “FBI’s strategy in Apple case caught in distortion field,” April 6, 2016, https://www.
zdnet.com/article/fbis-strategy-in-apple-case-caught-in-distortion-field/.
58 Skorobogatov, Sergei P., “The bumpy road towards iPhone 5c NAND mirroring,” https://arxiv.org/
2016, https://daringfireball.net/linked/2016/03/23/buzzfeed-apple-fbi.
61 McAfee, John, “John McAfee Reveals To FBI, On National TV, How To Crack The iPhone (RT Interview),”
watch?v=tM66GWrwbsY.
63 Fontana, John, ZDNet, “FBI’s strategy in Apple case caught in distortion field,” April 6, 2016, https://www.
zdnet.com/article/fbis-strategy-in-apple-case-caught-in-distortion-field/.
64 Ackerman, Spencer, Sam Thielman, and Danny Yadron, “Apple case: judge rejects FBI request for access to
drug dealer’s iPhone,” The Guardian, February 29, 2016, available online at https://www.theguardian.com/
technology/2016/feb/29/apple-fbi-case-drug-dealer-iphone-jun-feng-san-bernardino.
The FBI likely pushed harder in the terrorism case, believing that they would have great
support from the public. They miscalculated badly. The FBI was opposed, not only by Apple,
but also by Access Now, Amazon.com, the American Civil Liberties Union, Box, the Center
for Democracy and Technology, Cisco Systems, Dropbox, the Electronic Frontier Foundation,
Evernote, Facebook, Google, Lavabit, LinkedIn, Microsoft, Mozilla, Nest Labs, Pinterest, Slack
Technologies, Snapchat, Twitter, WhatsApp, and Yahoo!65
And it wasn’t just industry and liberal and libertarian organizations that opposed the FBI. The
Bureau even took knocks from an intelligence community giant. General Michael Hayden, who had
served as a director of the National Security Agency, as well as the Central Intelligence Agency, said,
You can argue this on constitutional grounds. Does the government have the right to
do this? Frankly, I think the government does have a right to do it. You can do balanc-
ing privacy and security… dead men don’t have a right to privacy. I don’t use those
lenses. My lens is the security lens, and frankly, I think it’s a close but clear call that
Apple’s right on just raw security grounds.66
Jim Clapper has said, he’s the Director of National Intelligence, that the greatest
threat to the United States is the cyber threat and I think Apple is technologically cor-
rect when they say doing what the FBI wants them to do in this case will make their
technology, their encryption, overall weaker than it would otherwise be. So I get why
the FBI wants to get into the phone, but we make tradeoffs like this all the time and
this may be a case where we’ve got to give up some things in law enforcement and even
counter terrorism in order to preserve this aspect, our cybersecurity.67
Any effort to legislate or to use a court to stop this broad technological trend just
isn’t going to work. We are going to a world of very high-end encryption that will be
used routinely by people around the planet. Now, from my own line of work, signals
intelligence, intercepting communications, that represents a challenge, but there are
also tools available that you can still get meaningful intelligence out of communica-
tions even though you might never read the content.68
As for the American public, a CBS poll showed they were split between siding with Apple and with the FBI.69 This is not surprising, given that it's an issue that takes a bit of time to understand. Likely only a tiny percentage actually has a good understanding of it.
66 Limitone, Julia, FOXBusiness, "Fmr. NSA, CIA Chief Hayden Sides with Apple Over Feds," March 7, 2016, https://www.foxbusiness.com/features/fmr-nsa-cia-chief-hayden-sides-with-apple-over-feds.
67 Limitone, Julia, FOXBusiness, “Fmr. NSA, CIA Chief Hayden Sides with Apple Over Feds,” March 7, 2016,
https://www.foxbusiness.com/features/fmr-nsa-cia-chief-hayden-sides-with-apple-over-feds.
68 Limitone, Julia, FOXBusiness, “Fmr. NSA, CIA Chief Hayden Sides with Apple Over Feds,” March 7, 2016,
https://www.foxbusiness.com/features/fmr-nsa-cia-chief-hayden-sides-with-apple-over-feds.
69 Anon., “CBS News poll: Americans split on unlocking San Bernardino shooter’s iPhone,” CBS News, March 18, 2016,
https://www.cbsnews.com/news/cbs-news-poll-americans-split-on-unlocking-san-bernardino-shooters-iphone/.
While the attention didn’t go the way the FBI wanted, they did succeed in generating a lot of
it. John Oliver devoted an installment of his HBO program Last Week Tonight with John Oliver
to the topic on March 14, 2016, before the FBI got into the iPhone.70 As of this writing (May
23, 2020), the episode, which is critical of the FBI’s position, has racked up 12,127,673 views on
YouTube. It included some quotes from Republican Senator Lindsey Graham. The first, given
below in greater context than by Oliver, is from a Republican presidential primary debate held on
December 15, 2015.
The bottom line is, we’re at war. They’re trying to come here to kill us all and it’s up
to the government to protect you within constitutional means. Any system that would
allow a terrorist to communicate with somebody in our country and we can’t find out
what they’re saying is stupid. If I’m president of the United States, and you join ISIL,
you are going to get killed or captured. And the last thing you are going to hear if I’m
president is, you’ve got a right to remain silent.71
Then-presidential candidate Donald Trump suggested a boycott of Apple until it complied
with the FBI.
Three months after Graham’s quote above, the Senator had the following exchange with
Attorney General Loretta E. Lynch.
Attorney General Loretta E. Lynch: I think for us the issue is about a criminal inves-
tigation into a terrorist act and the need to obtain evidence.
Graham: But it’s just not so simple and I’ll end with this. I thought it was that simple.
I was all with you until I actually started getting briefed by people in the intel commu-
nity and I will say I’m a person who’s been moved by the arguments of the precedent
we set and the damage we may be doing to our own national security.
This quote was also played by Oliver, who said it was a miracle that Graham had “met the
concept of nuance,” but did he really? Read on.
Despite the FBI having backed down, some US Senators still had their sights set on forcing
backdoors into communications devices. On April 7, 2016, a familiar bit of draft legislation was
leaked, followed by an official release on the 13th. As with the bill Biden had cosponsored in
1991, this new proposal, called the Compliance with Court Orders Act of 2016 (CCOA), would
require “any person who provides a product or method to facilitate a communication or the pro-
cessing or storage of data” to “be capable of complying” with court orders to turn over “data in an
intelligible format” even if the data was enciphered by the user. Senator Ron Wyden said, “This
flawed bill would leave Americans more vulnerable to stalkers, identity thieves, foreign hackers
70 Oliver, John, Encryption: Last Week Tonight with John Oliver (HBO), LastWeekTonight YouTube Channel,
March 14, 2016, https://www.youtube.com/watch?v=zsjZ2r9Ygzw.
71 Wofford, Taylor, Newsweek, “Full Transcript: CNN Republican Undercard Debate,” December 15, 2015,
https://www.newsweek.com/cnn-republican-undercard-debate-transcript-405767.
528 ◾ Secret History
and criminals.”72 This time, the authors of the bill were Republican Richard Burr and Democrat
Dianne Feinstein.73 The Senators were quoted as saying,
The underlying goal is simple: when there’s a court order to render technical assistance
to law enforcement or provide decrypted information, that court order is carried out.
No individual or company is above the law.74
Ultimately, this legislation went nowhere, just like the 2011 attempt.
Thanks to the great work of the FBI — and no thanks to Apple — we were able to
unlock Alshamrani’s phones.77
Apple’s decision has dangerous consequences for the public safety and the national
security and is, in my judgement, unacceptable. Apple’s desire to provide privacy
for its customers is understandable, but not at all costs. … There is no reason why
companies like Apple cannot design their consumer products and apps to allow for
72 Hosenball, Mark and Dustin Volz, Reuters, “U.S. Senate panel releases draft of controversial encryption
bill,” April 13, 2016, https://finance.yahoo.com/news/u-senate-panel-releases-draft-192224282.html and
Pfefferkorn, Riana, Just Security, “Here’s What the Burr-Feinstein Anti-Crypto Bill Gets Wrong,” April 15,
2016, https://www.justsecurity.org/30606/burr-feinstein-crypto-bill-terrible/.
73 Volz, Dustin and Mark Hosenball, Reuters, “Leak of Senate encryption bill prompts swift backlash,” April 8,
2016, https://www.reuters.com/article/us-apple-encryption-legislation-idUSKCN0X52CG.
74 Volz, Dustin and Mark Hosenball, Reuters, “Leak of Senate encryption bill prompts swift backlash,” April 8,
2016, https://www.reuters.com/article/us-apple-encryption-legislation-idUSKCN0X52CG.
75 Feiner, Lauren, “Senators threaten to regulate encryption if tech companies won’t do it themselves,” CNBC, December
FBI Director Christopher A. Wray also pushed hard against Apple, saying,
Public servants, already swamped with important things to do to protect the American
people — and toiling through a pandemic, with all the risk and hardship that
entails — had to spend all that time just to access evidence we got court-authorized
search warrants for months ago.79
Still, they got the data they wanted. And I continue to believe that they are making it sound
like a greater challenge than it actually was. One of the iPhones in this case had been shot and the
FBI still got into it!80
That same day, Apple responded:
The terrorist attack on members of the US armed services at the Naval Air Station in
Pensacola, Florida was a devastating and heinous act. Apple responded to the FBI’s
first requests for information just hours after the attack on December 6, 2019 and
continued to support law enforcement during their investigation. We provided every
piece of information available to us, including iCloud backups, account information
and transactional data for multiple accounts, and we lent continuous and ongoing
technical and investigative support to FBI offices in Jacksonville, Pensacola, and New
York over the months since.
On this and many thousands of other cases, we continue to work around-the-clock
with the FBI and other investigators who keep Americans safe and bring criminals
to justice. As a proud American company, we consider supporting law enforcement’s
important work our responsibility. The false claims made about our company are an
excuse to weaken encryption and other security measures that protect millions of users
and our national security.
It is because we take our responsibility to national security so seriously that we
do not believe in the creation of a backdoor — one which will make every device
vulnerable to bad actors who threaten our national security and the data security of
our customers. There is no such thing as a backdoor just for the good guys, and the
78 Welch, Chris, The Verge, “The FBI successfully broke into a gunman’s iPhone, but it’s still very angry at Apple,”
May 18, 2020, https://www.theverge.com/2020/5/18/21262347/attorney-general-barr-fbi-director-wray-apple-
encryption-pensacola and KTVN, “AG Barr: Apple’s decision has dangerous consequences for public safety,”
https://www.ktvn.com/clip/15067612/ag-barr-apples-decision-has-dangerous-consequences-for-public-safety
for a video clip of this quote.
79 Welch, Chris, The Verge, “The FBI successfully broke into a gunman’s iPhone, but it’s still very angry at Apple,”
American people do not have to choose between weakening encryption and effective
investigations.
Customers count on Apple to keep their information secure and one of the ways
in which we do so is by using strong encryption across our devices and servers. We sell
the same iPhone everywhere, we don’t store customers’ passcodes and we don’t have
the capacity to unlock passcode-protected devices. In data centers, we deploy strong
hardware and software security protections to keep information safe and to ensure
there are no backdoors into our systems. All of these practices apply equally to our
operations in every country in the world.81
I’m convinced that Apple will come out on top again, but there is yet another update to this
tale of history repeating itself that needs to be addressed before closing out this chapter.
81 Welch, Chris, The Verge, “The FBI successfully broke into a gunman’s iPhone, but it’s still very angry at Apple,”
May 18, 2020, https://www.theverge.com/2020/5/18/21262347/attorney-general-barr-fbi-director-wray-apple-
encryption-pensacola.
82 S.3398 — EARN IT Act of 2020, 116th Congress (2019–2020), congress.gov, https://www.congress.gov/
bill/116th-congress/senate-bill/3398/text.
83 Mullin, Joe, Electronic Frontier Foundation (EFF), “The EARN IT Bill Is the Government’s Plan to Scan Every
https://www.brookings.edu/techstream/the-earn-it-act-is-a-disaster-amid-the-covid-19-crisis/.
favor of the former, before the next edition, and that this section would truly become history, but
I’m not optimistic. It might take several more editions.
Pfefferkorn, Riana, TechCrunch, "The FBI is mad because it keeps getting into locked iPhones without
Apple’s help,” https://techcrunch.com/2020/05/22/the-fbi-is-mad-because-it-keeps-getting-into-
locked-iphones-without-apples-help/, May 22, 2020.
Poddebniak, Damian, Christian Dresen, Jens Müller, Fabian Ising, Sebastian Schinzel, Simon Friedberger,
Juraj Somorovsky, and Jörg Schwenk, “Efail: Breaking S/MIME and OpenPGP Email Encryption
using Exfiltration Channels,” presentation at 27th USENIX Security Symposium, Baltimore, Maryland,
August 2018. See https://efail.de/ for a link to this presentation and much more.
Savage, Charlie, “U.S. Tries to Make It Easier to Wiretap the Internet,” New York Times, September 27,
2010, p. A1, available online at http://www.nytimes.com/2010/09/27/us/27wiretap.html and http://
archive.nytimes.com/www.nytimes.com/2010/09/27/us/27wiretap.html.
Scahill, Jeremy and Josh Begley, The Intercept, “The CIA Campaign to Steal Apple’s Secrets,” https://
theintercept.com/2015/03/10/ispy-cia-campaign-steal-apples-secrets/, March 10, 2015.
Schneier, Bruce, E-Mail Security: How to Keep Your Electronic Messages Private, John Wiley & Sons, New
York, 1995. Schneier covered both PGP and PEM (Privacy Enhanced Mail), which used DES or triple
DES, with two keys, along with RSA.
Shwayder, Maya, Digital Trends, “The FBI broke Apple’s iPhone encryption. Here’s why you shouldn’t panic,”
https://www.digitaltrends.com/news/fbi-iphone-hack-encryption-pensacola-shooter-analysis/?itm_
source=35&itm_content=1x7&itm_term=2498265, May 18, 2020.
Smith, Ms., "NAND Mirroring Proof-of-Concept Shows that FBI Could Use It to Crack iPhone," CSO
Online, http://www.networkworld.com/article/3048488/security/nandmirroring-proof-of-concept-
show-that-fbi-could-use-it-to-crackiphone.html, March 28, 2016.
Stallings, William, Protect Your Privacy: the PGP user’s guide, Prentice Hall PTR, Englewood Cliffs, New
Jersey, 1995.
U.S. Senate Committee on the Judiciary, “Encryption and Lawful Access: Evaluating Benefits and Risks to
Public Safety and Privacy,” https://www.judiciary.senate.gov/meetings/encryption-and-lawful-access-
evaluating-benefits-and-risks-to-public-safety-and-privacy, December 10, 2019. This website has a
video of the hearing and transcripts.
Wikipedia contributors, “Crypto Wars,” Wikipedia, The Free Encyclopedia, https://en.wikipedia.org/wiki/
Crypto_Wars
Wikipedia contributors, “EFAIL,” Wikipedia, The Free Encyclopedia, https://en.wikipedia.org/wiki/EFAIL.
Wikipedia contributors, “FBI–Apple encryption dispute,” Wikipedia, The Free Encyclopedia, https://
en.wikipedia.org/wiki/FBI–Apple_encryption_dispute.
Wikipedia contributors, “List of the most common passwords,” Wikipedia, The Free Encyclopedia, https://
en.wikipedia.org/wiki/List_of_the_most_common_passwords.
Zimmermann, Philip R., The Official PGP User’s Guide, MIT Press, Cambridge, Massachusetts, 1995. Even
this book is available for free (in an ASCII version online). Zimmermann notes that it’s “out of date
with current PGP software, but still politically interesting.”86
Zimmermann, Philip R., PGP Source Code and Internals, MIT Press, Cambridge, Massachusetts, 1995.
The export laws were strange. Although PGP couldn’t legally be exported as software, the source
code could. And, of course, source code can be scanned and converted for use by a text editor; hence,
this book. On another note, the experts know that trying to keep an algorithm secret is a bad sign!
Revealing the details, if the software is any good, will ease concerns, not increase them.
Zimmermann, Philip R., Home Page, http://www.philzimmermann.com/EN/background/index.html.
Stream Ciphers
Anyone who considers arithmetical methods of producing random digits is, of course,
in a state of sin.
—John von Neumann (1951)1
The tape machine depicted in Section 2.9 can be seen as the beginning of a still ongoing area of
cryptographic research — stream ciphers. Such systems attempt to generate random numbers that
can be combined with the message in an approximation of the unbreakable one-time pad. The
problem is that deterministic machines cannot generate truly random numbers. Thus, we usually refer to such numerical
sequences as pseudorandom and the devices that create them as pseudorandom number generators
or PRNGs for short.
It should be noted that a much earlier origin for stream ciphers is offered by the autokey
ciphers of the 16th century, as discussed in Section 2.5. In any case, stream ciphers are especially
important when we want to encipher and decipher data in real time. Applications such as secure
cell phone conversations and encrypted streaming video provide examples. In these cases, the
pseudorandom sequences consist of 0s and 1s that are usually generated bit by bit or byte by byte.
We begin with an early attempt using modular arithmetic.
X_n = (aX_{n-1} + b) (mod m)
1 John von Neumann was on the National Security Agency Scientific Advisory Board (NSASAB). He’s quoted
here from Salomon, David, Data Privacy and Security, Springer, New York, 2003, p. 97.
2 Lehmer, Derrick Henry, “Mathematical Methods in Large-Scale Computing Units,” in Aiken, Howard H.,
Proceedings of a Second Symposium on Large-Scale Digital Calculating Machinery, Annals of the Computation
Laboratory of Harvard University, Vol. 26, Harvard University Press, Cambridge, Massachusetts, 1951, pp.
141‒146. The conference was held on September 1, 1949. This may have been the first attempt to generate
pseudorandom numbers with a linear congruential generator.
For example, if we take a = 3, b = 5, m = 26, and seed the generator with X0 = 2, we get:
X_0 = 2
X_1 = 3(2) + 5 = 11
X_2 = 3(11) + 5 = 12 (mod 26)
X_3 = 3(12) + 5 = 15 (mod 26)
X_4 = 3(15) + 5 = 24 (mod 26)
X_5 = 3(24) + 5 = 25 (mod 26)
X_6 = 3(25) + 5 = 2 (mod 26)
At this point, we’re back at our starting value. The output will now continue, as before, 11, 12,
15, 24, 25, … Clearly, this is not random! We get stuck in a cycle of period 6. Still, if this could be
modified to generate a cycle with a much longer period, longer than any message we might want
to encipher, it seems like it could be a reasonable way to generate a key that could then be paired
with the message, one letter at a time, modulo 26.
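For readers who would like to experiment, the cycle above can be reproduced in a few lines of Python (a sketch of ours, not a library routine):

```python
def lcg(a, b, m, seed, count):
    """Generate `count` further values of the linear congruential
    generator X_n = (a*X_{n-1} + b) mod m, starting from X_0 = seed."""
    values = [seed]
    for _ in range(count):
        values.append((a * values[-1] + b) % m)
    return values

print(lcg(3, 5, 26, 2, 6))  # [2, 11, 12, 15, 24, 25, 2] -- period 6
```

The returned list ends where it began, exhibiting the period-6 cycle computed above.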
Of course, the modern approach uses bits instead. This isn’t a problem, as we could convert
these values to bits and XOR them with our message (also expressed in bits). However, the effect is
the same as that of a binary Vigenère cipher, because the values to be XORed repeat.
If we select the values of a, b, and m more carefully, we can cycle through all values from 0
to m − 1, but we must then repeat the cycle, as in the example above. If m is large enough, this
might seem safe; for example, m may be larger than the length of the message. Nevertheless, this
technique is not secure. Jim Reeds was the first to publicly break such ciphers in 1977.3
An obvious next step for the cryptographer is to try higher-power congruential generators, such
as quadratics: X_n = (aX_{n-1}^2 + bX_{n-1} + c) (mod m). Notice that each term still depends only on the one that came before.
3 Reeds, James, "Cracking a Random Number Generator," Cryptologia, Vol. 1, No. 1, January 1977, pp. 20‒26.
A later paper on this topic is Plumstead, Joan B., “Inferring a Sequence Generated by a Linear Congruence,” in
Proceedings of the 23rd Annual Symposium on Foundations of Computer Science, IEEE Computer Society Press,
Los Alamitos, California, 1982, pp. 153‒159.
4 Boyar, Joan, “Inferring Sequences Produced by Pseudo-random Number Generators,” Journal of the ACM
(JACM), Vol. 36, No. 1, January 1989, pp. 129-141. Joan B. Plumstead’s later papers were published under the
name Joan Boyar.
5 Lagarias, Jeffrey C. and James Reeds, “Unique Extrapolation of Polynomial Recurrences,” SIAM Journal on
Computing, Vol. 17, No. 2, April 1988, pp. 342‒362.
6 Guinier, Daniel, "A Fast Uniform 'Astronomical' Random Number Generator," SIGSAC Review (ACM
Special Interest Group on Security Audit & Control), Vol. 7, No. 1, Spring 1989, pp. 1‒13.
X_n = (aX_{n-1} + bX_{n-2} + c) (mod m)
This way each value depends on the two previous values (hence, degree 2) and we can attain
longer periods. Nothing is squared. We would, of course, need two seed values X0 and X1. The
first number we generate would be X 2. This is the basic idea behind linear feedback shift registers
(LFSRs). They are very fast (in hardware) when working with bits modulo 2. We could indicate
mod 2 by setting m = 2, but as we’ve seen before, the convention is to replace + with ⊕ to represent
XOR, which is the same as addition modulo 2. LFSRs are usually represented diagrammatically
rather than algebraically (Figure 19.1).
Figure 19.1 A linear feedback shift register with cells b3, b2, and b1.
The figure is best explained with an example. We may seed the register (the values of the bs) with the
bits 101; that is b3 = 1, b2 = 0, and b1 = 1. The diagonal arrows indicate that we get our new bit by taking
the XOR of b3 and b1, which is 1 ⊕ 1 = 0. Notice that b2 is not used in this calculation. The bits that
are used, b3 and b1, are referred to as the taps. The new bit that is calculated, based on the taps, follows
the longest arrow and takes the place of b3, but b3 doesn’t just vanish. Instead, it advances to the right to
take the place of b2, which in turn advances to the right to replace b1. With nowhere left to go, b1 "falls
off the edge" (indicated by the shortest arrow) and is gone. These steps are then all repeated with the new
values. Starting with the seed, our register holds the following values, as we iterate:
101
010
001
100
110
111
011
101
which brings us back to the start. Notice that this register cycles through seven different sets of
values. We say that it has period 7. The rule depicted diagrammatically may also be represented
algebraically as
b_{n+3} = b_{n+2} ⊕ b_n, for n = 1, 2, …
To use a LFSR as a stream cipher, we simply take the bits that shift through the register. The
LFSR above gives the stream 1010011 (which then repeats). This is the third column in the list
of register values given above. XORing our plaintext with a string of only 7 bits, used repeatedly,
is not a very secure method of encryption! We’ll improve upon this, but what is to come will be
clearer if we start out slowly.
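The register of Figure 19.1 is equally easy to simulate. The Python sketch below (ours) steps the register and collects the output bits:

```python
def lfsr3(seed, steps):
    """Simulate the three-bit LFSR of Figure 19.1: the new bit is
    b3 XOR b1, bits advance one place to the right, and b1 is output
    as the keystream."""
    b3, b2, b1 = seed
    stream = []
    for _ in range(steps):
        stream.append(b1)             # the output bit (third column above)
        b3, b2, b1 = b3 ^ b1, b3, b2  # new bit enters as b3
    return stream

print(lfsr3((1, 0, 1), 7))  # [1, 0, 1, 0, 0, 1, 1] -> the stream 1010011
```

Running it for more than seven steps simply repeats the same seven bits, confirming the period.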
Observe that there’s a “bad seed” that’s useless for cryptographic purposes. If we start off with
b3 = 0, b2 = 0, and b1 = 0, we’ll never get any nonzero values. XORing the stream of bits that is
generated by this seed with the plaintext message will leave it unchanged.
In general, the longest period possible for a LFSR with n elements is 2^n − 1, so our original
example above was maximized in this respect. A different (nonzero) seed will simply cycle through
the same values beginning at a different point. We cannot get a longer period by XORing different
bits, although this can cause the states to be cycled through in a different order. So, if we want a
LFSR that generates a longer key, we must look at those with more elements, but more elements
don’t guarantee that longer keys will result.
We can investigate whether or not a LFSR produces a maximal period by examining a polyno-
mial associated with the register. For example, for the LFSR pictured in Figure 19.1, the polynomial
is p(x) = x^3 + x^1 + 1. The powers of x are taken to be the positions of the bits that are made use of in the
XOR, and + 1 is always tacked on to the end. This is called a tap polynomial or connection polynomial.
We first see whether or not the tap polynomial can be factored (mod 2). If it cannot, we say it
is irreducible. In this case, if the polynomial is of degree n, the period must divide 2^n − 1. We can
check the polynomial above for reducibility modulo 2, by plugging in 0 and 1 to get
p(0) = 0^3 + 0^1 + 1 = 1 (mod 2)
p(1) = 1^3 + 1^1 + 1 = 3 = 1 (mod 2)
Because neither 0 nor 1 is a root, neither x nor x + 1 can be a factor.7 A third-degree polynomial
must have a linear factor, if it is reducible, so the polynomial above must be irreducible. Higher
degree polynomials require other methods for checking for reducibility; for example, the fourth-degree polynomial f(x) = x^4 + x^2 + 1 has no roots modulo 2, but it is not irreducible. We have
x^4 + x^2 + 1 = (x^2 + x + 1)(x^2 + x + 1) (mod 2).
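For polynomials of modest degree, reducibility modulo 2 can also be settled by brute force: trial-divide by every polynomial of smaller positive degree. In the Python sketch below (ours, not a library routine), a polynomial is encoded as an integer whose bit k is the coefficient of x^k:

```python
def divides_mod2(d, f):
    """Return True if polynomial d divides polynomial f over GF(2).
    Long division: repeatedly cancel f's leading term with a shifted
    copy of d (XOR is addition and subtraction mod 2)."""
    while f != 0 and f.bit_length() >= d.bit_length():
        f ^= d << (f.bit_length() - d.bit_length())
    return f == 0

def irreducible_mod2(f):
    """Brute-force test: f is irreducible iff no polynomial of degree
    between 1 and deg(f) - 1 divides it."""
    return not any(divides_mod2(d, f)
                   for d in range(2, 1 << (f.bit_length() - 1)))

print(irreducible_mod2(0b1011))   # x^3 + x^1 + 1: True
print(irreducible_mod2(0b10101))  # x^4 + x^2 + 1: False (it factors)
```

This confirms both conclusions above: x^3 + x^1 + 1 is irreducible, while x^4 + x^2 + 1 is not.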
So, our third-degree example, being irreducible, must have a period that divides 2^3 − 1 = 7;
hence, the period is either 1 or 7. We attain period 1 with the seed consisting of all zeros, and
period 7 with any other seed. We’ll see another important cryptographic use of irreducible poly-
nomials in Section 20.3.
But what if 2^n − 1 is not prime? For example, for a LFSR with a register containing 4 bits, and
a tap polynomial that is irreducible, all we can conclude is that the period must divide 2^4 − 1. This
doesn't tell us the period is 15, because 3 and 5 are also factors of 2^4 − 1. Thus, the result above is
less useful when 2^n − 1 is composite. Fortunately, we have another test. In the definition that follows, we let l denote the length of the register.
An irreducible polynomial p(x) is said to be primitive if:
1. p(x) is a factor of x^(2^l − 1) − 1, and
2. p(x) is not a factor of x^k − 1 for any proper divisor k of 2^l − 1.
We then have the following result: A LFSR with a tap polynomial that is primitive will have a
maximal period. As an example of an LFSR with a long period, we have b_{n+31} = b_{n+3} ⊕ b_n for
7 This is not a typo. Modulo 2, x – 1 and x + 1 are the same, because –1 = 1 (mod 2).
n = 1, 2, … This LFSR requires a 31-bit seed, which will then generate a sequence of bits with a
period of 2^31 − 1 = 2,147,483,647.
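For registers small enough to enumerate, the period can simply be measured: step the register until the seed state reappears. The Python sketch below (ours) confirms both the period-7 example and that a 4-bit register with the primitive tap polynomial x^4 + x^1 + 1 attains the maximal period 15:

```python
def lfsr_period(taps, seed):
    """Step an LFSR until its state repeats and return the step count.
    `seed` is the register (b_n, ..., b_1); `taps` are the positions
    whose XOR forms the new bit, which enters on the left as the
    register shifts right."""
    n = len(seed)
    state = tuple(seed)
    start, count = state, 0
    while True:
        new = 0
        for t in taps:
            new ^= state[n - t]   # position t, counted from the right
        state = (new,) + state[:-1]
        count += 1
        if state == start:
            return count

print(lfsr_period((3, 1), (1, 0, 1)))     # 7, as in the example above
print(lfsr_period((4, 1), (0, 0, 0, 1)))  # 15, the maximum for 4 bits
```

Any nonzero seed gives the same period, since it merely starts the same cycle at a different point.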
where each of the ai is either 0 or 1. The string of known key bits, 10101100, labeled b1b2b3b4b5b6b7b8
for convenience, although they needn’t be from the start of the message, tells us
1 = a3·0 ⊕ a2·1 ⊕ a1·0 ⊕ a0·1
1 = a3·1 ⊕ a2·0 ⊕ a1·1 ⊕ a0·0
0 = a3·1 ⊕ a2·1 ⊕ a1·0 ⊕ a0·1
0 = a3·0 ⊕ a2·1 ⊕ a1·1 ⊕ a0·0
From this system of equations, we may solve for the ai. This may be done without the techniques of
linear algebra, but for larger examples we’d really want to use matrices, so we’ll use one here. We have
[1 0 1 0] [a0]   [1]
[0 1 0 1] [a1] = [1]
[1 0 1 1] [a2]   [0]
[0 1 1 0] [a3]   [0]
The four by four matrix has the inverse
[0 1 1 1]
[1 1 1 0]
[1 1 1 1]
[1 0 1 0]
So our solution is (a0, a1, a2, a3) = (1, 0, 0, 1).
Thus, the equation for the LFSR appears to be b_{n+4} = b_{n+3} ⊕ b_n. This equation may then be
used to generate all future key bits, as well as previous key bits, if the crib occurred in a position
other than the start of the message.
If the equation fails to yield meaningful text beyond the crib, we’d have to consider a five-
element LFSR, and if that doesn’t work out, then a six-element LFSR, etc. However, we’d need
more key bits to uniquely determine the coefficients for anything beyond 4 elements. In general,
we need 2n bits of key for an n-element LFSR. As n grows, 2n quickly becomes tiny, as a percentage, compared to the maximal period of 2^n − 1 for the n-element LFSR. So, although we needed
a little over half of the repeating key to uniquely solve the 4-element LFSR, the period 2^31 − 1 =
2,147,483,647 LFSR defined by b_{n+31} = b_{n+3} ⊕ b_n could be recovered from only 62 bits of key, a
tiny percentage of the whole (and less than 8 characters, as each keyboard character translates to 8
bits). This is not an unreasonable crib! As mentioned before, modern ciphers are expected to hold
strong against known-plaintext attacks, so the LFSRs described above are not useful for crypto-
graphic purposes. However, they are incorporated as components of stronger systems.
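The system above can also be solved mechanically. The following Python sketch (our own helper, not a library routine) performs Gauss-Jordan elimination modulo 2 on the augmented matrix built from the crib:

```python
def solve_mod2(A, b):
    """Solve A x = b over GF(2) by Gauss-Jordan elimination.
    A is a list of rows of 0s and 1s; b is the right-hand side."""
    n = len(A)
    M = [row[:] + [rhs] for row, rhs in zip(A, b)]  # augmented matrix
    for col in range(n):
        # Find a pivot row with a 1 in this column and move it up.
        pivot = next(r for r in range(col, n) if M[r][col] == 1)
        M[col], M[pivot] = M[pivot], M[col]
        # Clear the column everywhere else (XOR is subtraction mod 2).
        for r in range(n):
            if r != col and M[r][col] == 1:
                M[r] = [x ^ y for x, y in zip(M[r], M[col])]
    return [M[r][n] for r in range(n)]

A = [[1, 0, 1, 0],
     [0, 1, 0, 1],
     [1, 0, 1, 1],
     [0, 1, 1, 0]]
b = [1, 1, 0, 0]
print(solve_mod2(A, b))  # [1, 0, 0, 1]: a0 = 1, a1 = 0, a2 = 0, a3 = 1
```

The recovered coefficients give exactly the recurrence found above, with b_{n+4} equal to the XOR of b_{n+3} and b_n.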
Figure 19.2 The A5/1 cipher: three LFSRs of lengths 19, 22, and 23 bits, with bit positions running from 0 up to 18, 21, and 22, respectively. The taps are at positions 18, 17, 16, 13 (first register); 21, 20 (second); and 22, 21, 20, 7 (third), with clocking bits at positions 8, 10, and 10.
Figure 19.2 indicates that the A5/1 stream cipher consists of three linear feedback shift regis-
ters. The first XORs the bits in positions 13, 16, 17, and 18 to get a new bit, which is then placed
at the end, forcing all of the bits to shift one position to the left. The last bit, formerly in position
18, shifts off the register and is XORed with bits from the other two LFSRs to finally provide the
bit that is XORed with the message to yield a bit of ciphertext.
Because all three LFSRs must be seeded, the key is 19 + 22 + 23 = 64 bits long. Notice that
we count the bit in position 0 for each LFSR, along with the rest. Each of the three LFSRs has a
length that is relatively prime to the lengths of the others. This would generate a period that’s the
product of all three. However, there’s another feature that lengthens the period. Notice that the
diagram for A5/1 has bits labeled in positions 8, 10, and 10. These are called clocking bits. In each
cycle, the bits in the clocking positions are examined. Because there is an odd number of clocking
bits, there must be either more 1s than 0s or more 0s than 1s in these positions. The registers that
have the more popular bit in their clocking positions advance. If all three bits match, all of the
registers advance.
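The irregular clocking can be sketched in a few lines. The Python model below (ours) follows the description above; it omits how the key and frame number are loaded into the registers, and deployed A5/1 implementations differ in such details:

```python
TAPS = ([13, 16, 17, 18], [20, 21], [7, 20, 21, 22])  # per Figure 19.2
CLOCK = (8, 10, 10)  # clocking-bit positions

def step(regs):
    """One clock of the three registers: advance only those whose
    clocking bit agrees with the majority, then output the XOR of the
    high-order bits. Each register is a list indexed by bit position."""
    maj = 1 if sum(regs[i][CLOCK[i]] for i in range(3)) >= 2 else 0
    for i in range(3):
        if regs[i][CLOCK[i]] == maj:
            new = 0
            for t in TAPS[i]:
                new ^= regs[i][t]
            regs[i] = [new] + regs[i][:-1]  # shift; new bit enters
    return regs[0][-1] ^ regs[1][-1] ^ regs[2][-1]

# Example: with all-ones registers, every clocking bit is 1, so all
# three registers advance, and each feedback bit is an XOR of 1s.
regs = [[1] * 19, [1] * 22, [1] * 23]
print(step(regs))  # 1
```

Because the majority is taken over three bits, at least two registers advance on every clock, which is what makes the overall period hard to pin down analytically.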
In defiance of Kerckhoffs's rules, the algorithm described above was kept secret while it was
being placed in over 100 million cell phones. In compliance with Kerckhoffs's rules, the public
learned it anyway! It was part of the Global System for Mobile Communications (GSM) cellphone
standard. Various attacks have made it clear that the system is insecure. Details may be found in
the papers in the References and Further Reading list at the end of this chapter.
A5/2 made use of four LFSRs that advance in an irregular manner, like those of A5/1. Although
this might make A5/2 sound stronger than A5/1 (4 is bigger than 3, right?), it isn’t. It was purposely
made weaker, intended for use in certain countries, while Americans and Europeans used the
stronger A5/1. A5/2 was made public in August 1999, and before the month ended, Ian Goldberg,
David A. Wagner, and Lucky Green broke it.8 For details, see the References and Further Reading
section at the end of this chapter.
19.5 RC4
RC4 (Rivest Cipher 4), designed by Ron Rivest in 1987, was a very popular stream cipher. Again,
in defiance of Kerckhoffs's rules, the details of this cipher were kept secret and could only be obtained
by signing a nondisclosure agreement with RSA Data Security Inc. In September 1994, however,
the source code was anonymously posted to the Cypherpunks mailing list.9
The cipher starts off with a list of all 8-bit numbers, in order. These bytes are
S0 = 00000000
S1 = 00000001
S2 = 00000010
S3 = 00000011
S4 = 00000100
S5 = 00000101
…
S255 = 11111111.
8 Goldberg, Ian, David Wagner, and Lucky Green, “The (Real-Time) Cryptanalysis of A5/2,” paper presented at
the Rump Session of the CRYPTO ‘99 conference, Santa Barbara, California, August 15‒19, 1999.
9 Schneier, Bruce, Applied Cryptography, second edition, John Wiley & Sons, New York, 1996, p. 397.
These bytes are then shuffled so that their new order appears random. To do this, another set of
256 bytes is initialized using the key. The key may be any length up to 256 bytes. At the low end,
there are attacks that can break RC4 if the key is just 40 bits.
Whatever length key is selected, we simply split it into bytes and label them K0, K1, K2, K3, …,
K255. If we reach the end of our key before we fill 256 bytes, we continue filling bytes using our
key over again, from the start. For example, if our key was only 64 bytes long, we’d have to lay it
end to end four times in order to have enough bytes to fill K0 through K 255. The shuffling of the Si
is then carried out by the following loop:
j=0
for i = 0 to 255
j = (j + Si + Ki) (mod 256)
Swap Si and Sj
next i
After resetting the index variables i and j to 0, we’re ready to generate our key stream for enci-
phering. The key, K, actually used for encryption is then generated byte by byte using the follow-
ing equations, applied repeatedly:
i = (i + 1) (mod 256)
j = (j + Si) (mod 256)
Swap Si and Sj
t = (Si + Sj) (mod 256)
Ki = St
Ci = Mi ⊕ Ki
We apply the steps above to each byte, Mi, of the message, until they are all enciphered.
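The two loops translate almost directly into Python. In the sketch below, indexing into the key modulo its length is equivalent to laying the key end to end, as described above:

```python
def rc4(key, message):
    """RC4: key and message are byte strings. Enciphering and
    deciphering are the same operation."""
    # Key-scheduling: shuffle S under control of the key.
    S = list(range(256))
    j = 0
    for i in range(256):
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    # Keystream generation, XORed byte by byte with the message.
    i = j = 0
    out = bytearray()
    for m in message:
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(m ^ S[(S[i] + S[j]) % 256])
    return bytes(out)

print(rc4(b"Key", b"Plaintext").hex())        # bbf316e8d940af0ad3
print(rc4(b"Key", rc4(b"Key", b"Plaintext"))) # b'Plaintext'
```

The first output matches a widely published RC4 test vector, and the second line shows that applying the cipher twice with the same key recovers the plaintext.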
RC4 is a simple cipher and easy to program. It also marks a departure from the other methods
discussed in this section. The exact period of RC4 is not known, but analysis thus far indicates
that it is very likely in excess of 10^100.10 This lower bound is a familiar number. Mathematicians
were referring to 10^100 as a googol, long before an Internet search engine appropriated the name
in a misspelled form.
RC4 was used in the Secure Socket Layer (SSL) and Wired Equivalent Privacy (WEP), both
of which were found to be insecure. Because of its weaknesses, WEP is sometimes said to stand
for White Elephant Protection. WEP implemented RC4 in a manner similar to how Enigma was
used, as described in Chapter 7. A 24-bit initialization vector (IV) placed at the front of WEP
ciphertexts helped to generate a session key.
When it came time to fill the key bytes, the IV bits were used first. They were then followed
by a key that can be used many times, because the randomly generated IV results in the scrambled
key differing each time. However, given a sufficient depth of messages (just like the Poles needed
to recover Enigma keys), these initialization vectors allow WEP to be broken.11
Other software packages that made use of RC4 included Microsoft® Windows® and Lotus
Notes®. RC4 was actually the most popular stream cipher in software. Today, it is no longer
considered secure. Several papers describing attacks on it are listed in the References and Further
Reading section at the end of this chapter.
You may come upon references to RC5 and RC6. Although these sound like newer versions of
the system described above, they are not. Both RC5 and RC6 are block ciphers. The numbering
simply indicates the order in which Rivest developed these unrelated ciphers. It’s analogous to the
numbering of Beethoven’s symphonies.
Although RC4 is broken, no other stream cipher has yet been recognized as the new champion.
One contender that is gaining in popularity and may capture the title is ChaCha20, designed
by Daniel J. Bernstein in 2008.12 It has been adopted by Google as a replacement for RC4 in
Transport Layer Security (TLS), the successor to Secure Sockets Layer (SSL).
Caution! Even if a stream cipher is mathematically secure, it can be broken when misused.
For example, if the same initial state (seed) is used twice, the security is no better than a binary
running key cipher. Keys should never be reused in stream ciphers!
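To see why reuse is fatal, note that XORing two ciphertexts produced with the same keystream cancels the key entirely, leaving only the XOR of the two plaintexts, which cribs and frequency analysis can then attack. A tiny Python demonstration (the keystream bytes below are arbitrary stand-ins):

```python
def xor_bytes(a, b):
    """XOR two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

keystream = bytes([0x5A, 0x13, 0xC7, 0x88, 0x2E])  # same key used twice
c1 = xor_bytes(b"HELLO", keystream)
c2 = xor_bytes(b"WORLD", keystream)
# The key drops out entirely: c1 XOR c2 = m1 XOR m2.
print(xor_bytes(c1, c2) == xor_bytes(b"HELLO", b"WORLD"))  # True
```

An eavesdropper who guesses either plaintext can XOR it in and read the other immediately.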
11 For more details of this attack presented in a clear manner see Stamp, Mark and Richard M. Low, Applied
Cryptanalysis: Breaking Ciphers in the Real World, Wiley-Interscience, Hoboken, New Jersey, 2007, pp. 105‒110.
12 Bernstein, Daniel J., “ChaCha, a variant of Salsa20,” January 28, 2008, http://cr.yp.to/chacha/chacha-20080120.
pdf.
Plumstead, Joan B., “Inferring a Sequence Generated by a Linear Congruence,” in Proceedings of the 23rd
Annual Symposium on Foundations of Computer Science, IEEE Computer Society Press, Los Alamitos,
California, 1982, pp. 153‒159.
Reeds, James, "'Cracking' a Random Number Generator," Cryptologia, Vol. 1, No. 1, January 1977, pp.
20‒26. Reeds shows how a crib can be used to break a linear congruential random number generator.
Reeds, James, “Solution of a Challenge Cipher,” Cryptologia, Vol. 3, No. 2, April 1979, pp. 83‒95.
Reeds, James, “Cracking a Multiplicative Congruential Encryption Algorithm,” in Wang, Peter C. C.,
Arthur L. Schoenstadt, Bert I. Russak, and Craig Comstock, editors, Information Linkage Between
Applied Mathematics and Industry, Academic Press, New York, 1979, pp. 467‒472.
Vahle, Michael O. and Lawrence F. Tolendino, “Breaking a Pseudo Random Number Based Cryptographic
Algorithm,” Cryptologia, Vol. 6, No. 4, October 1982, pp. 319‒328.
Wichmann, Brian and David Hill, “Building a Random-Number Generator,” Byte, Vol. 12, No. 3, March
1987, pp. 127‒128.
On LFSRs
Barker, Wayne G., Cryptanalysis of Shift-Register Generated Stream Cipher Systems, Aegean Park Press,
Laguna Hills, California, 1984.
Golomb, Solomon, Shift Register Sequences, second edition, Aegean Park Press, Laguna Hills, California,
1982. This edition is a reprint of one from Holden-Day, San Francisco, California, 1967. Golomb
worked for the National Security Agency.
Goresky, Mark and Andrew Klapper, Algebraic Shift Register Sequences, Cambridge University Press,
Cambridge, UK, 2012.
Selmer, Ernst S., Linear Recurrence Relations Over Finite Fields, mimeographed lecture notes, 1966,
Department of Mathematics, University of Bergen, Norway. Selmer was the Norwegian government’s
chief cryptographer.
Zierler, Neal, “Linear Recurring Sequences,” Journal of the Society for Industrial and Applied Mathematics,
Vol. 7, No. 1, March 1959, pp. 31‒48.
On A5/1
Barkan, Elad and Eli Biham, “Conditional Estimators: An Effective Attack on A5/1,” in Preneel, Bart, and
Stafford Tavares, editors, Selected Areas in Cryptography 2005, Springer, Berlin, Germany, 2006, pp.
1‒19.
Barkan, Elad, Eli Biham, and Nathan Keller, “Instant Ciphertext-Only Cryptanalysis of GSM Encrypted
Communication,” in Boneh, Dan, editor, Advances in Cryptology — CRYPTO 2003 Proceedings,
Lecture Notes in Computer Science, Vol. 2729, Springer, Berlin, Germany, 2003, pp. 600‒616.
Barkan, Elad, Eli Biham, and Nathan Keller, “Instant Ciphertext-Only Cryptanalysis of GSM Encrypted
Communication,” Journal of Cryptology, Vol. 21, No. 3, July 2008, pp. 392‒429.
Biham, Eli and Orr Dunkelman, “Cryptanalysis of the A5/1 GSM Stream Cipher,” in Roy, Bimal and Eiji
Okamoto, editors, Progress in Cryptology: INDOCRYPT 2000, Lecture Notes in Computer Science,
Vol. 2247, Springer, Berlin, Germany, 2000, pp. 43‒51.
Biryukov, Alex, Adi Shamir, and David Wagner, “Real Time Cryptanalysis of A5/1 on a PC,” in Schneier,
Bruce, editor, Fast Software Encryption, 7th International Workshop, FSE 2000, Lecture Notes in
Computer Science, Vol. 1978, Springer, Berlin, Germany, 2001, pp. 1‒18.
Ekdahl, Patrik and Thomas Johansson, “Another attack on A5/1,” IEEE Transactions on Information Theory,
Vol. 49, No. 1, January 2003, pp. 284‒289, available online at http://www.it.lth.se/patrik/papers/
a5full.pdf.
Golic, Jovan Dj, “Cryptanalysis of Alleged A5 Stream Cipher,” in Fumy, Walter, editor, Advances in
Cryptology — EUROCRYPT ’97 Proceedings, Lecture Notes in Computer Science, Vol. 1233,
Springer, Berlin, Germany, 1997, pp. 239‒255, available online at https://link.springer.com/content/
pdf/10.1007/3-540-69053-0_17.pdf.
Stream Ciphers ◾ 543
Gueneysu, Tim, Timo Kasper, Martin Novotný, Christof Paar, and Andy Rupp, “Cryptanalysis
with COPACOBANA,” IEEE Transactions on Computers, Vol. 57, No. 11, November 2008, pp.
1498‒1513.
Maximov, Alexander, Thomas Johansson, and Steve Babbage, “An Improved Correlation Attack on A5/1,”
in Handschuh, Helena and M. Anwar Hasan, editors, Selected Areas in Cryptography 2004, Lecture
Notes in Computer Science, Vol. 3357, Springer, Berlin, Germany, 2004, pp. 1‒18.
Stamp, Mark, Information Security: Principles and Practice, Wiley-Interscience, Hoboken, New Jersey, 2006.
Several GSM security flaws are detailed in this book.
On RC4
AlFardan, Nadhem, Daniel J. Bernstein, Kenneth G. Paterson, Bertram Poettering, and Jacob C. N.
Schuldt, “On the Security of RC4 in TLS,” 22nd USENIX Security Symposium, August 2013,
https://www.usenix.org/conference/usenixsecurity13/technical-sessions/paper/alFardan.
Arbaugh, William A., Narendar Shankar, Y. C. Justin Wan, and Kan Zhang, “Your 802.11 Wireless
Network has No Clothes,” IEEE Wireless Communications, Vol. 9, No. 6, December 2002, pp.
44‒51. The publication date was given in this formal reference; the paper itself was dated March 30,
2001.
Borisov, Nikita, Ian Goldberg, and David Wagner, Security of the WEP Algorithm, ISAAC, Computer
Science Department, University of California, Berkeley, http://www.isaac.cs.berkeley.edu/isaac/wep-
faq.html. This page contains a summary of the findings of Borisov, Goldberg, and Wagner, as well as
links to their paper and slides from a pair of presentations.
Fluhrer, Scott, Itsik Mantin, and Adi Shamir, “Weaknesses in the Key Scheduling Algorithm of RC4,” in
Vaudenay, Serge and Amr M. Youssef, editors, Selected Areas in Cryptography 2001, Lecture Notes in
Computer Science, Vol. 2259, Springer, Berlin, Germany, 2002, pp. 1‒24.
Jindal, Poonam and Brahmjit Singh, “RC4 Encryption-A Literature Survey,” International Conference on
Information and Communication Technologies (ICICT 2014), Procedia Computer Science, Vol. 46, 2015,
pp. 697‒705.
Kundarewich, Paul D., Steven J. E. Wilton, and Alan J. Hu, “A CPLD-based RC4 Cracking System,” in
Meng, Max, editor, Engineering Solutions for the Next Millennium, 1999 IEEE Canadian Conference
on Electrical and Computer Engineering, Vol. 1, IEEE, Piscataway, New Jersey, 1999, pp. 397‒402.
Rivest, Ronald L. and Jacob Schuldt, “Spritz — a spongy RC4-like stream cipher and hash function,”
October 27, 2014, https://people.csail.mit.edu/rivest/pubs/RS14.pdf. Rivest confirmed the history of
RC4 and its code in this paper.
Mantin, Itsik, “Analysis of the Stream Cipher RC4,” master’s thesis under the supervision of Adi Shamir,
Weizmann Institute of Science, Rehovot, Israel, November 27, 2001, available online at https://
tinyurl.com/yc9upxmu. This thesis is sometimes referenced under the title “The Security of the
Stream Cipher RC4.” The website referenced also contains other papers on RC4 and WEP.
Mantin, Itsik and Adi Shamir, “A Practical Attack on Broadcast RC4,” in Matsui, Mitsuru, editor, Fast
Software Encryption, 8th International Workshop, FSE 2001, Lecture Notes in Computer Science, Vol.
2355, Springer, Berlin, Germany, 2002, pp. 152‒164.
Paul, Goutam and Subhamoy Maitra, “Permutation after RC4 Key Scheduling Reveals the Secret Key,”
in Adams, Carlisle, Ali Miri, and Michael Wiener, editors, Selected Areas of Cryptography, 14th
International Workshop, SAC 2007, Lecture Notes in Computer Science, Vol. 4876, Springer, Berlin,
Germany, 2007, pp. 360‒377.
Stubblefield, Adam, John Ioannidis, and Aviel D. Rubin, “Using the Fluhrer, Mantin and Shamir Attack
to Break WEP,” AT&T Labs Technical Report TD-4ZCPZZ, Revision 2, August 21, 2001, available
online at https://tinyurl.com/y8eunft9.
Walker, Jesse R., IEEE P802.11 Wireless LANs, Unsafe at any Key Size; an Analysis of the WEP Encapsulation,
IEEE Document 802.11-00/362, submitted October 27, 2000, available online at https://tinyurl.
com/y9b888vl.
General
Bernstein, Daniel J., “ChaCha, a variant of Salsa20,” January 28, 2008, http://cr.yp.to/chacha/chacha-20080120.pdf.
Cusick, Thomas W., Cunsheng Ding, and Ari Renvall, Stream Ciphers and Number Theory, revised edition,
North-Holland Mathematical Library, Vol. 66, Elsevier, New York, 2004. The original edition, pub-
lished in 1998, was Vol. 55 of the same series.
Pommerening, Klaus, “Cryptanalysis of nonlinear feedback shift registers,” Cryptologia, Vol. 40, No. 4, July
2016, pp. 303‒315.
Ritter, Terry, “The Efficient Generation of Cryptographic Confusion Sequences,” Cryptologia, Vol. 15, No.
2, April 1991, pp. 81–139. This survey paper includes a list of 213 references.
Robshaw, Matt J. B., Stream Ciphers Technical Report TR-701, Version 2.0, RSA Laboratories, Bedford,
Massachusetts, 1995.
Rubin, Frank, “Computer Methods for Decrypting Random Stream Ciphers,” Cryptologia, Vol. 2,
No. 3, July 1978, pp. 215‒231.
Rueppel, Rainer A., Analysis and Design of Stream Ciphers, Springer, New York, 1986.
van der Lubbe, Jan, Basic Methods of Cryptography, Cambridge University Press, Cambridge, UK, 1998.
Chapter 20
Suite B All-Stars
In 2005, the National Security Agency (NSA) made public a list of recommended cryptographic
algorithms and protocols. Known as “Suite B,” these are believed to be the best of the unclassified
schemes of that era. Two of them are covered in the present chapter.
1 Blake, Ian, Gadiel Seroussi, and Nigel Smart, Elliptic Curves in Cryptography, London Mathematical Society
Lecture Note Series, Vol. 265, Cambridge University Press, Cambridge, UK, 1999, p. 9.
2 Other “big names” who studied elliptic curves include Abel, Jacobi, Gauss, and Legendre.
3 Koblitz, Neal, Random Curves: Journeys of a Mathematician, Springer, Berlin, Germany, 2008, p. 313.
4 Koblitz, Neal, Random Curves: Journeys of a Mathematician, Springer, Berlin, Germany, 2008, p. 303.
The graphs of the solutions take two basic forms, depending on whether the elliptic curve has
three real roots, or just one.
We make the following work simpler by avoiding elliptic curves that have roots of multiplicity
higher than one.5 When looking at complex solutions (as opposed to the real solutions graphed in
Figure 20.4), the visualization takes the form of a torus (donut).
5 Another simplification involves not using fields of characteristic 2 or 3. The interested reader may consult the
references for the reasons for these simplifications. These simplifications are just that. We may proceed, with
greater difficulty, without them.
Figure 20.4 The elliptic curves y² = x³ − 4x = (x)(x − 2)(x + 2), with three real roots, and y² = x³ + 5, with just one.
Elliptic curves don’t resemble ellipses. The name arises from their connection with elliptic
integrals, which arose in the 19th century, as mathematicians attempted to find formulas for arc
lengths of ellipses. Examples are given by

∫_c^d dx/√(x³ + ax + b)   and   ∫_c^d x dx/√(x³ + ax + b).

Setting the denominator of either integrand above equal to y then gives us y² = x³ + ax + b, an
elliptic curve.
We define addition of points on an elliptic curve in a strange way. To add points P1 and P2, we
draw a line through them and observe that this line passes through a third point on the curve, I.
No, I is not the sum! I is merely an intermediate point. We reflect the point I about the x-axis to
get a new point P3. Then we say P1 + P2 = P3. This is illustrated in Figure 20.5.
Figure 20.5 Adding points P1 and P2 on an elliptic curve; reflecting the third intersection point about the x-axis gives P3 = P1 + P2.
If we wish to add a point to itself, we draw a tangent line to the curve at the given point, and
then continue as above.
There’s still one more case that needs to be considered. Adding two points that lie on a
vertical line (or adding a point at which the curve has a vertical tangent line to itself) provides
no other point of intersection with our curve. To “patch” this problem, we introduce an extra
point, ∞. This is a common trick in algebraic geometry. Reflecting ∞ about the x-axis still gives
∞; that is, we do not want a second –∞. With these definitions, the points on an elliptic curve
(together with ∞) form a commutative group. The identity element is ∞, because P + ∞ = P for
any point P.
The addition can be carried out algebraically, as follows. P1 = (x1, y1) and P2 = (x2, y2) implies
P1 + P2 = (m² − x1 − x2, m(2x1 − (m² − x2)) − y1), where

m = (y2 − y1)/(x2 − x1), if P1 ≠ P2

and

m = (3x1² + a)/(2y1), if P1 = P2, where a is the coefficient of x in the curve’s equation.

As an example, consider the elliptic curve

y² = x³ + 3x + 4 (mod 7)
By plugging in the values 0, 1, 2, 3, 4, 5, and 6 for x and seeing which are perfect squares
modulo 7, we’re able to quickly get the complete set of solutions:
(0,2), (0,5), (1,1), (1,6), (2,2), (2,5), (5,2), (5,5), (6,0), ∞

There is no point on the curve with an x value of 3, for example, because that would imply
y² = 5 (mod 7) for some y value in the set {0, 1, 2, 3, 4, 5, 6}, and this is not true.

Notice that, with the exception of 0, every perfect square that results from plugging in a value
for x leads to two values for y.
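This enumeration is easy to verify by brute force; a few lines of code confirm the nine affine points (the tenth point is ∞):

```python
# Enumerate all solutions of y^2 = x^3 + 3x + 4 (mod 7) by brute force.
p = 7
points = [(x, y) for x in range(p) for y in range(p)
          if (y * y) % p == (x ** 3 + 3 * x + 4) % p]

assert len(points) == 9                     # nine affine points; adding the point at infinity gives ten
assert not any(x == 3 for x, _ in points)   # no point has x = 3
assert (0, 2) in points and (0, 5) in points
```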
It’s interesting to note that if we’re looking for points (x, y) on an elliptic curve such that x and
y are both rational numbers, the number of solutions may be infinite or finite, but if there are only
finitely many, there will be no more than 16. On the other hand, we’re able to get arbitrarily large,
yet still finite, solution sets by working modulo a prime.
So, how many points will satisfy a given elliptic curve modulo p? If we plug in the values 0, 1, 2,
…, p − 1, we can expect the right-hand side to be a perfect square modulo p (and hence, to yield a
point on the curve) about half the time. This is because of a result from number theory that tells us
half of the nonzero integers are perfect squares modulo a prime. But each nonzero square has two
square roots, so there should be about p points. Adding in the point ∞, we are now up to p + 1 points,
but we don’t always get this exact value. Letting the correct value be denoted by N, our error will be
|N − (p + 1)|.
German mathematician Helmut Hasse found a bound on this error around 1930. He showed

|N − (p + 1)| ≤ 2√p.

For example, with p = 101, our estimate suggests 102 points, and Hasse’s theorem guarantees the
actual number is between 102 − 2√101 and 102 + 2√101. That is, rounding to the appropriate
integer, between 82 and 122.
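The bound is easy to test numerically. The curve used below is an arbitrary illustrative choice; any nonsingular curve modulo 101 behaves the same way:

```python
# Count the points on y^2 = x^3 + 3x + 4 (mod 101), including the point at
# infinity, and confirm the count lies within Hasse's interval.
import math

p = 101
N = 1                                   # start at 1 for the point at infinity
for x in range(p):
    rhs = (x ** 3 + 3 * x + 4) % p
    N += sum(1 for y in range(p) if (y * y) % p == rhs)

assert abs(N - (p + 1)) <= 2 * math.sqrt(p)   # Hasse: |N - (p + 1)| <= 2*sqrt(p)
assert 82 <= N <= 122
```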
We’re now ready to show how elliptic curves can be used to agree on a key over an insecure
channel. This is the elliptic curve cryptography (ECC) version of Diffie-Hellman key exchange.
Normally, Alice and Bob would appear at this point, but Neal Koblitz recalled:6
When I wrote my first book on cryptography I tried to change this anglocentric choice
of names to names like Alicia and Beatriz or Aniuta and Busiso. However, the Anglo-
American domination of cryptography is firmly entrenched — virtually all books and
journals are in English, for example. My valiant attempt to introduce a dollop of
multiculturalism into the writing conventions of cryptography went nowhere. Everyone
still says “Alice” and “Bob.”
As a tribute to Koblitz, the key exchange will be carried out by Aïda and Bernardo. They proceed
as follows.
1. Aïda and Bernardo agree on an elliptic curve E (mod p), for some prime p.
2. They agree on a point B on their curve E. (This is also done publicly.)
3. Aïda selects a random (secret) integer a and computes aB, which she sends to Bernardo.
4. Bernardo selects a random (secret) integer b and computes bB, which he sends to Aïda.
5. Both Aïda and Bernardo are now able to compute abB, the x coordinate of which can be
adapted to serve as their secret key for a symmetric system.
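The exchange above can be sketched in a few dozen lines. The curve, prime, base point, and secret integers below are illustrative toys, vastly smaller than the 256-bit (and larger) parameters used in practice:

```python
# A toy elliptic-curve Diffie-Hellman exchange on y^2 = x^3 + 3x + 4 (mod 101).
P_MOD, A = 101, 3
INF = None  # the point at infinity

def ec_add(P, Q):
    if P is INF: return Q
    if Q is INF: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return INF                                         # vertical line
    if P == Q:
        m = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_MOD) % P_MOD   # tangent slope
    else:
        m = (y2 - y1) * pow(x2 - x1, -1, P_MOD) % P_MOD          # chord slope
    x3 = (m * m - x1 - x2) % P_MOD
    return (x3, (m * (x1 - x3) - y1) % P_MOD)

def ec_mult(k, P):                                         # double-and-add
    R = INF
    while k:
        if k & 1:
            R = ec_add(R, P)
        P = ec_add(P, P)
        k >>= 1
    return R

B = (0, 2)                             # a point on the curve: 2^2 = 0 + 0 + 4
a, b = 17, 29                          # Aïda's and Bernardo's secret integers
shared_aida = ec_mult(a, ec_mult(b, B))
shared_bernardo = ec_mult(b, ec_mult(a, B))
assert shared_aida == shared_bernardo == ec_mult(a * b, B)
```

With real parameters, the curve and base point come from a published standard and the secrets are random integers hundreds of bits long.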
6 Koblitz, Neal, Random Curves: Journeys of a Mathematician, Springer, Berlin, Germany, 2008, p. 321.
Given points P and B on an elliptic curve, finding an integer x such that xB = P is the elliptic curve
analog of the discrete log problem. No efficient method is known for solving this when the values
are large. The letter B was chosen, as this point plays the role of the base in this version of the
discrete log problem. Although an eavesdropper on an exchange like the one detailed above will
know aB and bB, he cannot efficiently find a or b. This is good, because either one of these values
would allow recovery of the secret key generated by the exchange between Aïda and Bernardo.
When calculating a large multiple of a point, we’d like to avoid the tedium of adding the num-
ber to itself a large number of times. Happily, we can adapt the repeated squaring technique used
to raise a number to a large power modulo some integer n. This technique is most easily explained
with an example.
To find 100P, we express it as 2(2(P + 2(2(2(P + 2P))))). Thus, in place of 99 additions, we have
2 additions and 6 doublings.7 But how did we find this representation?
Another example will serve to illustrate the process. Suppose we wish to calculate 86P. Because
86 is divisible by 2, we start out with
2
Halving 86 gives 43, which is not divisible by 2, so we continue with a P, instead of another 2:
2(P +
When we do this, we subtract 1 from the number we are reducing. Now that we are down to
42, we halve it again, by placing a 2 in our representation. We get
2(P + 2
But half of 42 is 21, which is odd, so we continue with a P:
2(P + 2(P +
Subtracting 1 from 21, we get 20, which can be halved twice, so we append a pair of 2s:
2(P + 2(P + 2(2
After halving 20 twice, we’re down to 5, which is odd, so we use another P:
2(P + 2(P + 2(2(P +
Subtracting 1 from 5, leaves 4, which we can halve twice, so we append a pair of 2s:
2(P + 2(P + 2(2(P + 2(2
We are finally down to 1, so we end with a P, and close off all of the parentheses:

2(P + 2(P + 2(2(P + 2(2P)))))

Thus, in place of 85 additions, we have 3 additions and 6 doublings.
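The finished representation is 2(P + 2(P + 2(2(P + 2(2P))))). Using ordinary integers as a stand-in for the point P (point addition becomes integer addition, and doubling becomes multiplication by 2), this representation and the earlier one for 100P are easy to check:

```python
# With P = 1, each nested representation must evaluate to its scalar.
P = 1
assert 2 * (P + 2 * (P + 2 * (2 * (P + 2 * (2 * P))))) == 86    # 3 additions, 6 doublings
assert 2 * (2 * (P + 2 * (2 * (2 * (P + 2 * P))))) == 100       # 2 additions, 6 doublings
```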
7 This example was taken from Koblitz, Neal, A Course in Number Theory and Cryptography, second edition,
Springer, New York, 1994, p. 178. The example also appears on p. 162 of the first edition, but a pair of
typos present in that edition obscure this simple technique.
A still faster method takes advantage of the special form of a particular curve’s defining
equation and does not apply to elliptic curves in general.8 In this special case, the time required
is sublinear.
To use elliptic curves for enciphering a message, rather than just agreeing on a key, we must
be able to represent plaintext characters by points on the curve. Hasse’s theorem above shows that
for a sufficiently large modulus there will be enough such points, but there is not yet a fast (poly-
nomial time) deterministic way to assign characters to points!9 The problem is, for a given x value,
there may or may not be a point y such that (x, y) lies on the curve. Koblitz described a method of
pairing characters and points as follows.
Once the message has been converted to points on the curve, we’re ready to begin the enciphering.
1. Doubling — Because multiplying a point by 2 is the same as adding it to itself, we use the
formula (adapted from what was given here earlier):
2. Summing — We must now add the original B to our new doubled value:
8 Dimitrov, Vassil S., Kimmo U. Järvinen, Michael J. Jacobson, Jr., Wai Fong Chan, and Zhun Huang, “Provably
Sublinear Point Multiplication on Koblitz Curves and its Hardware Implementation,” IEEE Transactions on
Computers, Vol. 57, No. 11, November 2008, pp. 1469–1481.
9 However, there are probabilistic algorithms that may be applied to make the chance of failure arbitrarily small.
And a theorem from Section 16.5 leads me to believe that a deterministic algorithm in polynomial time does exist.
We calculate m = (y2 − y1)/(x2 − x1) = (14 − 21)/(14 − 8) = −7/6 = (−7)(5) = −35 = 23 (mod 29), and
then make use of that value in finding B + 2B = (23² − 8 − 14, 23(2(8) − (23² − 14)) − 21) =
(507, −11498) = (14, 15).
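This arithmetic can be double-checked with the addition formulas given earlier in the chapter (the doubling slope uses a = −2, the coefficient of x in the curve that follows):

```python
# Verify the doubling and addition above on y^2 = x^3 - 2x + 3 (mod 29).
p, a = 29, -2

def add(P, Q):
    (x1, y1), (x2, y2) = P, Q
    if P == Q:
        m = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p   # tangent slope
    else:
        m = (y2 - y1) * pow(x2 - x1, -1, p) % p          # chord slope
    x3 = (m * m - x1 - x2) % p
    return (x3, (m * (x1 - x3) - y1) % p)

B = (8, 21)
assert add(B, B) == (14, 14)           # 2B
assert add(B, add(B, B)) == (14, 15)   # B + 2B = 3B, the point computed above
```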
Aditsan now reveals his public key as follows. The elliptic curve y2 = x3 – 2x + 3 (mod 29) and
the points B = (8, 21) and sB = (14, 15). He keeps s secret. Recall that for larger values, knowing B
and sB does not allow us to efficiently find s.
Seeing that Aditsan has posted his public key, Bisahalani prepares his message. He does this
by converting it to an x value, adding bits at the end to guarantee that x³ − 2x + 3 will be a perfect
square modulo 29. To make things simpler, for illustrative purposes, we’ll just assume his message
is represented as 12. He ends up with the point on the curve M = (12, 5). He then selects a random
number k = 5 and computes
Scott Vanstone (of the University of Waterloo) was the first to commercialize elliptic curve
cryptography, through a company now called the Certicom Corporation.13 In March 1997,
he offered Koblitz $1,000 a month to serve as a consultant. Koblitz accepted and donated the
money, first to the University of Washington, but upon discovery of its misuse, redirected it to the
Kovalevskaia Fund.14 Certicom, a Canadian company, is a competitor of RSA and has NSA as its
largest customer: “In 2003 NSA paid Certicom a $25 million licensing fee for 26 patents related
to ECC.”15 As was mentioned at the start of this chapter, NSA also encouraged others to use the
system by including a key agreement and a signature scheme based on ECC in its “Suite B” list
of recommendations.16 To answer the obvious question, a quote from NSA is provided below:17
Another suite of NSA cryptography, Suite A, contains classified algorithms that will
not be released. Suite A will be used for the protection of some categories of especially
sensitive information.
The Suite B block cipher, AES, is detailed later in this chapter. ECC earned NSA’s endorsement by
standing the test of time, and massive peer review:18
[E]xcept for a relatively small set of elliptic curves that are easy to avoid, even at present
— more than twenty years after the invention of ECC — no algorithm is known that
finds discrete logs in fewer than 10^(n/2) operations, where n is the number of decimal
digits in the size of the elliptic curve group.
Neal Koblitz’s political convictions have also stood the test of time. Whereas many activists burn
out, Koblitz’s autobiography shows him sustaining his radicalism for decades on end. He’s been
arrested several times, including during his first year teaching at Harvard, but he never worried
about it affecting his employment.19
I had read about the history of mathematics and was aware of the long tradition of
tolerance of eccentricity and political dissidence among mathematicians.
In June 1997, Koblitz learned that the official RSA website put up a page filled with skeptical
remarks about ECC. This was part of the American company’s aggressive approach and it included
a comment from RSA co-creator Ron Rivest.20
But the security of a cryptosystem based on elliptic curves is not well understood, due in
large part to the abstruse nature of elliptic curves. Few cryptographers understand ellip-
tic curves, so… trying to get an evaluation of the security of an elliptic curve cryptosys-
tem is a bit like trying to get an evaluation of some recently discovered Chaldean poetry.
13 Koblitz, Neal, Random Curves: Journeys of a Mathematician, Springer, Berlin, Germany, 2008, p. 302–303.
14 Koblitz, Neal, Random Curves: Journeys of a Mathematician, Springer, Berlin, Germany, 2008, p. 314.
15 Koblitz, Neal, Random Curves: Journeys of a Mathematician, Springer, Berlin, Germany, 2008, p. 319.
16 NSA Suite B Cryptography, National Security Agency, January 15, 2009, https://web.archive.org/
web/20090117004931/http://www.nsa.gov/ia/programs/suiteb_cryptography/index.shtml.
17 NSA Suite B Cryptography, National Security Agency, January 15, 2009, https://web.archive.org/
web/20090117004931/http://www.nsa.gov/ia/programs/suiteb_cryptography/index.shtml.
18 Koblitz, Neal, Random Curves: Journeys of a Mathematician, Springer, Berlin, Germany, 2008, p. 311.
19 Koblitz, Neal, Random Curves: Journeys of a Mathematician, Springer, Berlin, Germany, 2008, p. 23.
20 Taken here from Koblitz, Neal, Random Curves: Journeys of a Mathematician, Springer, Berlin, Germany, 2008,
p. 313.
Koblitz’s reaction, after asking his wife who the Chaldeans were,21 was not to post a webpage
bashing RSA, but rather to have shirts made featuring an elliptic curve and the text “I Love
Chaldean Poetry.” He reports that they were a hit with the students, with the exception of those
hoping to intern at RSA.22
It is true that a mathematician who is not also something of a poet will never be a
perfect mathematician.
— Karl Weierstrass23
When Weierstrass made the above claim, it is doubtful that he had Chaldean poetry in mind!
It’s more likely that he was referring to the poet’s creativity and sense of beauty. Victor Miller has
engaged his artistic side as an actor/singer in 17 community theater productions.24 He’s also sung
in many choirs and was one of the winners of a vocal competition at Westminster Conservatory
in 2003.
For many years Miller bred and exhibited pedigreed cats. He is a former president of a
national breed club, and bred the U.S. national best of breed Colorpoint Shorthair in 1997. Miller
himself is a rare breed, being one of the few mathematicians to have a Bacon number. Not strictly
limited to community theater, he appeared as an extra in A Beautiful Mind, which featured Ed
Harris, who starred in Apollo 13 with Kevin Bacon. Hence, two links connect Miller to
Bacon. This connection game began long before Bacon, as mathematicians tried to find their
shortest path to Paul Erdös, a number theorist with over 1,500 papers and over 500 coauthors.
Miller’s Erdös number is also two. Successful at all of these outside pursuits, Miller once faced an
unfair rejection in his professional life.
Similar to Merkle (and Rene Schoof) my paper on the efficient calculation of the “Weil
Pairing” was rejected from the 1986 FOCS conference (Schoof’s paper on counting
points on elliptic curves over finite fields was rejected the previous year). This led
Hendrik Lenstra to remark that perhaps that this was a badge of honor.25
21 His wife knows some mathematics, as well, and has authored a biography of Sofia Kovalevskaia: Koblitz, Ann
Hibner, A Convergence of Lives: Sofia Kovalevskaia: Scientist, Writer, Revolutionary, Rutgers University Press,
New Brunswick, New Jersey, 1983.
22 Koblitz, Neal, Random Curves: Journeys of a Mathematician, Springer, Berlin, Germany, 2008, p. 314.
23 Quoted here from Bell, Eric Temple, Men of Mathematics, Dover Publications, New York, 1937, p. 432.
24 Theater is a passion Miller shares with his daughter, who is, as of this writing, a professional stage manager
working on Broadway.
25 Email from Victor S. Miller to the author, November 10, 2010.
open. It was this community that would be responsible for analyzing the security of the submit-
ted systems. Cryptanalysts could submit their findings to NIST’s AES website or present them at
AES conferences.
Acceptance of submissions ended on May 15, 1998. The 15 accepted submissions were pre-
sented at The First Advanced Encryption Standard Candidate Conference in Ventura, California,
August 20‒22, 1998.26 The second conference was held in Rome, Italy, March 22‒23, 1999. Five
candidates were eliminated at (or prior to) this conference due to various kinds of security prob-
lems that were identified.
From the remaining ten candidates, the finalists, announced by NIST a year later, in August
1999, were MARS,27 RC6, Rijndael, Serpent, and Twofish.
NIST justified their selections in a publication,28 but no controversy was expected, as their choices
matched the top 5, as voted on at the end of the second AES conference. Cost was a factor in
eliminating two candidates and slow runtime was responsible for the failure of another.
New York City hosted the third AES conference on April 13‒14, 2000. The attacks made on
the finalists proved to be only slightly faster than brute force. Again, attendees voted for their
favorites. NIST announced on October 2, 2000 that Rijndael was the winner. Again, controversy
was avoided as this matched the results of the attendees’ vote. NIST’s full justification was pro-
vided online once more.29 The name Rijndael is a combination of the last names of the Belgians
who collaborated on its design, Vincent Rijmen and Joan Daemen (Figure 20.6). Various pronun-
ciations have been offered. Many Americans pronounce it Rhine-Doll. At one point the creators
had a link on their website for those wishing to hear the authoritative pronunciation; the recording
said, “The correct pronunciation is… AES.”
The algorithm is royalty-free. This was required of all submissions to NIST, should the algo-
rithm be declared the winner. This fact, combined with endorsement by NIST and the worldwide
26 All 15 are listed in Daemen, Joan and Vincent Rijmen, The Design of Rijndael: AES—The Advanced Encryption
Standard, Springer, New York, 2002, p. 3.
27 John Kelsey and Bruce Schneier examined this system in a paper that bore the wonderful title “MARS Attacks!
Report on the First Round of the Development of the Advanced Encryption Standard,” Journal of Research of
the National Institute of Standards and Technology, Vol. 104, No. 5, September–October 1999, pp. 435–459,
available online at http://nvlpubs.nist.gov/nistpubs/jres/104/5/j45nec.pdf.
29 Nechvatal, James, Elaine Barker, Lawrence Bassham, William Burr, Morris Dworkin, James Foti, and Edward
Roback, “Report on the Development of the Advanced Encryption Standard (AES),” Journal of Research of the
National Institute of Standards and Technology, Vol. 106, No. 3, May–June 2001, pp. 511–577, available online
at https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4863838/.
Figure 20.6 Joan Daemen (1965–) and Vincent Rijmen (1970–). (Courtesy of Vincent Rijmen.)
community of cryptographers, prompted Ron Rivest to remark, somewhat obviously, “It is likely
that Rijndael will soon become the most widely-used cryptosystem in the world.”30
AES offers a choice of key sizes (128, 192, or 256 bits), but always acts on blocks of 128 bits
(16 characters). The number of rounds depends on the key size. The different key sizes are often
distinguished by referring to AES-128 (10 rounds), AES-192 (12 rounds), or AES-256 (14 rounds).
Like DES, AES is derived from earlier systems. Unlike DES, a large safety margin was built into
AES. The only attacks the creators could find that were better than brute force apply only to six or
fewer rounds (for the 128-bit version).31
To help drive home the size of 2¹²⁸ we look at it written out:
340,282,366,920,938,463,463,374,607,431,768,211,456.
AES is basically composed of four simple and fast operations that act on a 4 × 4 array of bytes,
referred to as “the state.” Each of the operations is detailed below.
20.3.1 SubBytes
As the name suggests, this operation makes substitutions for the bytes using the table below. It can
be illustrated like the S-boxes of DES, and is even known as the Rijndael S-box, but there is only
one (along with its inverse) and there are other ways to represent it. Below is Rijndael’s S-box:32
30 Daemen, Joan, and Vincent Rijmen, The Design of Rijndael, Springer, Berlin, Germany, 2002, p. vi. Although
it was behind the scenes, NSA also examined AES on NIST’s behalf. They didn’t want NIST to be surprised if
there was some flaw the open community couldn’t find.
31 Daemen, Joan, and Vincent Rijmen, The Design of Rijndael, Springer, Berlin, Germany, 2002, p. 41.
32 This substitution box is usually presented in hexadecimal. For this book, I thought base-10 would be clearer.
Finding such a table in Trappe, Wade and Lawrence C. Washington, Introduction to Cryptography with Coding
Theory, Prentice Hall, Upper Saddle River, New Jersey, 2002, saved me the trouble of conversion. This is one of
the most clearly written books covering modern cryptography.
99 124 119 123 242 107 111 197 48 1 103 43 254 215 171 118
202 130 201 125 250 89 71 240 173 212 162 175 156 164 114 192
183 253 147 38 54 63 247 204 52 165 229 241 113 216 49 21
4 199 35 195 24 150 5 154 7 18 128 226 235 39 178 117
9 131 44 26 27 110 90 160 82 59 214 179 41 227 47 132
83 209 0 237 32 252 177 91 106 203 190 57 74 76 88 207
208 239 170 251 67 77 51 133 69 249 2 127 80 60 159 168
81 163 64 143 146 157 56 245 188 182 218 33 16 255 243 210
205 12 19 236 95 151 68 23 196 167 126 61 100 93 25 115
96 129 79 220 34 42 144 136 70 238 184 20 222 94 11 219
224 50 58 10 73 6 36 92 194 211 172 98 145 149 228 121
231 200 55 109 141 213 78 169 108 86 244 234 101 122 174 8
186 120 37 46 28 166 180 198 232 221 116 31 75 189 139 138
112 62 181 102 72 3 246 14 97 53 87 185 134 193 29 158
225 248 152 17 105 217 142 148 155 30 135 233 206 85 40 223
140 161 137 13 191 230 66 104 65 153 45 15 176 84 187 22
The table above is meant to be read from left to right and from top to bottom, like normal
English text. Thus, in base 10, 0 goes to 99, 1 goes to 124, …, 255 goes to 22. These numbers are
only expressed in base-10 for convenience (and familiarity). In base-2, each is a byte. These 8 bits
may be viewed as the coefficients of a polynomial of degree at most 7. Viewed in this manner, the
table above has a much terser representation. It simply sends each polynomial to its inverse modulo
x⁸ + x⁴ + x³ + x + 1 (with the zero byte, which has no inverse, mapped to itself), followed by the affine transformation
1 1 1 1 1 0 0 0 d7 0
0 1 1 1 1 1 0 0 d6 1
0 0 1 1 1 1 1 0 d5 1
0 0 0 1 1 1 1 1 d4 0
× ⊕
1 0 0 0 1 1 1 1 d3 0
1 1 0 0 0 1 1 1 d2 0
1 1 1 0 0 0 1 1 d1 1
1 1 1 1 0 0 0 1 d0 1
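The inverse-plus-affine description is easy to check by machine. Below is a short Python sketch (my own illustration, not from Daemen and Rijmen; the function names are arbitrary) that rebuilds the entire substitution box from it.

```python
# Illustrative sketch: rebuild the Rijndael S-box from the GF(2^8)
# inverse followed by the affine transformation described above.

MOD = 0b100011011  # x^8 + x^4 + x^3 + x + 1

def gf_mul(a, b):
    """Multiply two bytes as polynomials modulo x^8 + x^4 + x^3 + x + 1."""
    result = 0
    while b:
        if b & 1:
            result ^= a
        a <<= 1
        if a & 0x100:      # degree reached 8, so reduce
            a ^= MOD
        b >>= 1
    return result

def gf_inverse(a):
    """Brute-force inverse in GF(2^8); 0 is mapped to 0 by convention."""
    if a == 0:
        return 0
    for b in range(1, 256):
        if gf_mul(a, b) == 1:
            return b

def affine(d):
    """Apply the 8x8 bit matrix above, then XOR with 01100011 (99)."""
    result = 0
    for i in range(8):
        bit = 0
        # output bit i is the XOR of bits i, i+4, i+5, i+6, i+7 (mod 8)
        for j in (0, 4, 5, 6, 7):
            bit ^= (d >> ((i + j) % 8)) & 1
        result |= bit << i
    return result ^ 0b01100011

sbox = [affine(gf_inverse(a)) for a in range(256)]
print(sbox[0], sbox[53], sbox[255])   # 99 150 22, matching the table
```

The first entry, 99, is just the affine constant: the inverse of 0 is taken to be 0, the matrix sends 0 to 0, and only the XOR remains.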
Example 1
The Rijndael S-box sends 53 to 150. We verify that the alternative method does the same.
Converting 53 to binary we get 00110101, which has the polynomial representation

x^5 + x^4 + x^2 + 1.

This gets sent to its inverse modulo x^8 + x^4 + x^3 + x + 1, which is x^5 + x^4 + x^3 + 1, or 00111001,
in binary. This inverse may be calculated using the extended Euclidean algorithm, as shown in
Section 14.3. There is no difference other than using polynomials instead of integers. Long division
with polynomials reveals

(x^8 + x^4 + x^3 + x + 1) = (x^3 + x^2 + x)(x^5 + x^4 + x^2 + 1) + (x^3 + x^2 + 1)
and
(x^5 + x^4 + x^2 + 1) = (x^2)(x^3 + x^2 + 1) + 1
The final remainder of 1 shows that the two polynomials are relatively prime, a necessary condition
for the inverse of one to exist modulo the other.
We now solve for the remainder in each of the two equalities above:

(x^3 + x^2 + 1) = (x^8 + x^4 + x^3 + x + 1) − (x^3 + x^2 + x)(x^5 + x^4 + x^2 + 1)

1 = (x^5 + x^4 + x^2 + 1) − (x^2)(x^3 + x^2 + 1)

Using the first equality to substitute for (x^3 + x^2 + 1) in the second gives

1 = (x^5 + x^4 + x^2 + 1) − (x^2)[(x^8 + x^4 + x^3 + x + 1) − (x^3 + x^2 + x)(x^5 + x^4 + x^2 + 1)]

Distributing the x^2 gives

1 = (x^5 + x^4 + x^2 + 1) − (x^2)(x^8 + x^4 + x^3 + x + 1) + (x^5 + x^4 + x^3)(x^5 + x^4 + x^2 + 1).

Combining the (x^5 + x^4 + x^2 + 1) terms gives

1 = −(x^2)(x^8 + x^4 + x^3 + x + 1) + (x^5 + x^4 + x^3 + 1)(x^5 + x^4 + x^2 + 1).

Reducing the equality above modulo (x^8 + x^4 + x^3 + x + 1) gives

1 = (x^5 + x^4 + x^3 + 1)(x^5 + x^4 + x^2 + 1)

So, we see that the inverse of x^5 + x^4 + x^2 + 1 (mod x^8 + x^4 + x^3 + x + 1) is x^5 + x^4 + x^3 + 1, as claimed
above. We write this as the binary vector 00111001 and plug it into the matrix equation:
[1 1 1 1 1 0 0 0]   [a7]   [0]
[0 1 1 1 1 1 0 0]   [a6]   [1]
[0 0 1 1 1 1 1 0]   [a5]   [1]
[0 0 0 1 1 1 1 1]   [a4]   [0]
[1 0 0 0 1 1 1 1] × [a3] ⊕ [0]
[1 1 0 0 0 1 1 1]   [a2]   [0]
[1 1 1 0 0 0 1 1]   [a1]   [1]
[1 1 1 1 0 0 0 1]   [a0]   [1]
to get
[1 1 1 1 1 0 0 0]   [0]   [0]   [1]   [0]   [1]
[0 1 1 1 1 1 0 0]   [0]   [1]   [1]   [1]   [0]
[0 0 1 1 1 1 1 0]   [1]   [1]   [1]   [1]   [0]
[0 0 0 1 1 1 1 1]   [1]   [0]   [1]   [0]   [1]
[1 0 0 0 1 1 1 1] × [1] ⊕ [0] = [0] ⊕ [0] = [0]
[1 1 0 0 0 1 1 1]   [0]   [0]   [1]   [0]   [1]
[1 1 1 0 0 0 1 1]   [0]   [1]   [0]   [1]   [1]
[1 1 1 1 0 0 0 1]   [1]   [1]   [1]   [1]   [0]
Converting the output, 10010110, to base-10 gives 150, which matches what our table provides.
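The computation in Example 1 can be automated. The following Python sketch (an illustration of the extended Euclidean algorithm for GF(2) polynomials, not code from the book) stores each polynomial as an integer whose bits are its coefficients and recovers the same inverse.

```python
# Illustrative sketch: extended Euclidean algorithm for polynomials
# over GF(2), with each polynomial packed into an integer.

def poly_divmod(a, b):
    """Long division of GF(2) polynomials; returns (quotient, remainder)."""
    q = 0
    while a.bit_length() >= b.bit_length():
        shift = a.bit_length() - b.bit_length()
        q ^= 1 << shift
        a ^= b << shift
    return q, a

def poly_mul(a, b):
    """Carry-less multiplication of GF(2) polynomials."""
    result = 0
    while b:
        if b & 1:
            result ^= a
        a <<= 1
        b >>= 1
    return result

def poly_inverse(a, mod):
    """Inverse of a modulo mod, assuming gcd(a, mod) = 1."""
    r0, r1 = mod, a
    s0, s1 = 0, 1            # s tracks the coefficient of a
    while r1 != 1:
        q, r = poly_divmod(r0, r1)
        r0, r1 = r1, r
        s0, s1 = s1, s0 ^ poly_mul(q, s1)
    return s1

m = 0b100011011              # x^8 + x^4 + x^3 + x + 1
a = 0b00110101               # x^5 + x^4 + x^2 + 1 (decimal 53)
print(bin(poly_inverse(a, m)))   # 0b111001, i.e., x^5 + x^4 + x^3 + 1
```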
To invert the affine transformation, we must perform the XOR first and follow it by multipli-
cation with the inverse of the 8 × 8 matrix above. We then have
[0 1 0 1 0 0 1 0]   ( [a7]   [0] )
[0 0 1 0 1 0 0 1]   ( [a6]   [1] )
[1 0 0 1 0 1 0 0]   ( [a5]   [1] )
[0 1 0 0 1 0 1 0] × ( [a4] ⊕ [0] )
[0 0 1 0 0 1 0 1]   ( [a3]   [0] )
[1 0 0 1 0 0 1 0]   ( [a2]   [0] )
[0 1 0 0 1 0 0 1]   ( [a1]   [1] )
[1 0 1 0 0 1 0 0]   ( [a0]   [1] )
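A quick machine check confirms that this matrix really does undo the forward transformation. In the Python sketch below (illustrative; the row masks simply restate the two matrices above, top row first, with mask bits as columns d7 through d0), each matrix row is packed into a byte.

```python
# Illustrative sketch: verify that the inverse affine step undoes the
# forward one for every byte value.

FWD = [0b11111000, 0b01111100, 0b00111110, 0b00011111,
       0b10001111, 0b11000111, 0b11100011, 0b11110001]
INV = [0b01010010, 0b00101001, 0b10010100, 0b01001010,
       0b00100101, 0b10010010, 0b01001001, 0b10100100]
C = 0b01100011                     # the constant column

def parity(x):
    return bin(x).count('1') & 1

def apply_matrix(rows, v):
    out = 0
    for i, row in enumerate(rows):   # row i produces output bit 7 - i
        out |= parity(row & v) << (7 - i)
    return out

def affine(d):                     # forward: M*d, then XOR with c
    return apply_matrix(FWD, d) ^ C

def inv_affine(b):                 # inverse: XOR with c first, then M^-1
    return apply_matrix(INV, b ^ C)

print(all(inv_affine(affine(d)) == d for d in range(256)))   # True
```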
The reference Daemen and Rijmen provided for the polynomial is [LiNi86] R. Lidl and H. Niederreiter,
Introduction to Finite Fields and Their Applications, Cambridge University Press, 1986.
Any irreducible polynomial of degree 8 could have been used, but by selecting the first from a
list provided in a popular (at least among algebraists) book, Rijmen and Daemen allayed suspicion
that there was something special about this particular polynomial that might provide a backdoor.
Again, the design process was made transparent.
20.3.2 ShiftRows
In this step, the first row is left unchanged, but the second, third, and fourth rows have their bytes
shifted left by one, two, and three bytes, respectively. The shifts are all cyclical. Denoting each
byte as ai,j for some 0 ≤ i, j ≤ 3, we get the results shown in Table 20.1 to represent the ShiftRows
operation.
33 Daemen, Joan, and Vincent Rijmen, AES Proposal: Rijndael, Document version 2, Date: 03/09/99, p. 25,
available online at https://www.cs.miami.edu/home/burt/learning/Csc688.012/rijndael/rijndael_doc_V2.pdf.
Thanks to Bill Stallings for providing this reference!
a0,0 a0,1 a0,2 a0,3 → a0,0 a0,1 a0,2 a0,3 (no change)
a1,0 a1,1 a1,2 a1,3 → a1,1 a1,2 a1,3 a1,0 (one byte left shift)
a2,0 a2,1 a2,2 a2,3 → a2,2 a2,3 a2,0 a2,1 (two byte left shift)
a3,0 a3,1 a3,2 a3,3 → a3,3 a3,0 a3,1 a3,2 (three byte left shift)
The inverse of this step is a cyclic right shift of the rows by the same amounts.
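Both directions fit in a couple of lines of code. A minimal Python sketch (illustrative), assuming the state is stored as a list of four rows of bytes:

```python
# Illustrative sketch of ShiftRows and its inverse on a 4x4 state.

def shift_rows(state):
    """Cyclically left-shift row i by i bytes."""
    return [row[i:] + row[:i] for i, row in enumerate(state)]

def inv_shift_rows(state):
    """Cyclically right-shift row i by i bytes."""
    return [row[-i:] + row[:-i] if i else row
            for i, row in enumerate(state)]

state = [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9, 10, 11], [12, 13, 14, 15]]
print(shift_rows(state)[3])                         # [15, 12, 13, 14]
print(inv_shift_rows(shift_rows(state)) == state)   # True
```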
20.3.3 MixColumns
In this step, each column of the state is viewed as a polynomial of degree 3 or less. For example,
the following column
a0
a1
a2
a3
is viewed as a(x) = a3x^3 + a2x^2 + a1x + a0. However, the coefficients, a3, a2, a1, and a0, are all bytes.
That is, the coefficients themselves form polynomials that may be added or multiplied modulo the
irreducible polynomial x^8 + x^4 + x^3 + x + 1 from the SubBytes step.
In the MixColumns step, each column, expressed as a polynomial, is multiplied by the polynomial
c(x) = 3x^3 + x^2 + x + 2. It is then reduced modulo x^4 + 1, so that it may still be expressed as
a column (i.e., a polynomial of degree 3 or smaller).
Working modulo x^4 + 1 is a bit different than working modulo x^8 + x^4 + x^3 + x + 1. First of all, x^4 + 1
is reducible! So a randomly chosen c(x) needn't be invertible. For this reason, c(x) had to be chosen
carefully, but how was x^4 + 1 chosen? It was picked so that products could be easily reduced.
Modding out by x^4 + 1 is the same as defining x^4 = −1, but −1 = 1 (mod 2), so we have x^4 = 1. This
allows us to very easily reduce powers of x. We have x^5 = x, x^6 = x^2, x^7 = x^3, and x^8 = x^0 = 1. In
general, x^n = x^(n mod 4). Thus, the product c(x)a(x) reduces to

(2a0 + 3a1 + a2 + a3) + (a0 + 2a1 + 3a2 + a3)x + (a0 + a1 + 2a2 + 3a3)x^2 + (3a0 + a1 + a2 + 2a3)x^3,

which may be written as the matrix product

[2 3 1 1]   [a0]
[1 2 3 1]   [a1]
[1 1 2 3] × [a2]
[3 1 1 2]   [a3]
This needs to be applied to every column. The multiplication is done for each pair of bytes
modulo x^8 + x^4 + x^3 + x + 1.
However, because the matrix consists only of 1, 2, and 3, which correspond to the polynomials
1, x, and x + 1, respectively, the byte multiplication is especially simple. Multiplying a byte by
1 modulo x^8 + x^4 + x^3 + x + 1 changes nothing. Multiplying by x amounts to a left shift of all of the
bits. Multiplying by x + 1 is just the shift described for x, followed by an XOR with the original
value. We need to be careful with the left shifts though! If the leftmost bit was already a 1, it will
be shifted out and we must then XOR our result with 00011011 to compensate. This is because the
x^7 bit was shifted to x^8, which cannot be represented by bits 0 through 7, but we have x^8 = x^4 + x^3
+ x + 1 (mod x^8 + x^4 + x^3 + x + 1), so the representation for x^4 + x^3 + x + 1 will serve.
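These shift-and-XOR rules can be sketched in Python as follows. (This is an illustration; xtime is a conventional name for multiplication by x, not one used in this chapter, and the test column in the print statement is a commonly published MixColumns example rather than one from the book.)

```python
# Illustrative sketch of the byte arithmetic used in MixColumns.

def xtime(b):
    """Multiply a byte by x (i.e., by 2) modulo x^8 + x^4 + x^3 + x + 1."""
    b <<= 1
    if b & 0x100:            # the old x^7 bit was shifted out...
        b ^= 0b100011011     # ...so reduce: x^8 = x^4 + x^3 + x + 1
    return b & 0xFF

def mix_column(col):
    """Apply the matrix [2 3 1 1; 1 2 3 1; 1 1 2 3; 3 1 1 2] to one column."""
    a0, a1, a2, a3 = col
    return [
        xtime(a0) ^ (xtime(a1) ^ a1) ^ a2 ^ a3,
        a0 ^ xtime(a1) ^ (xtime(a2) ^ a2) ^ a3,
        a0 ^ a1 ^ xtime(a2) ^ (xtime(a3) ^ a3),
        (xtime(a0) ^ a0) ^ a1 ^ a2 ^ xtime(a3),
    ]

print([hex(b) for b in mix_column([0xdb, 0x13, 0x53, 0x45])])
```

Multiplication by 3 appears as xtime(a) ^ a, that is, the shift followed by an XOR with the original value, exactly as described above.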
For deciphering, one must use the inverse of c(x) modulo x^4 + 1. This is given by d(x) = 11x^3 + 13x^2 + 9x + 14, whose coefficients are the bytes 0B, 0D, 09, and 0E in hexadecimal.
20.3.4 AddRoundKey
Finally, we involve the key! This is simply an XOR (self inverse) of each byte of the state with a
byte of the key for the relevant round. Each round uses a distinct key derived from the original
key. This is done as follows.
First, the original key is taken 32 bits at a time and placed at the beginning of what will become
the “expanded key.” This expanded key will eventually be divided into equal size pieces to provide
the round keys, in order. For AES-128, the original key will serve to initialize the expanded key
blocks k0, k1, k2, k3. For AES-192, k4 and k5 will also be filled at this point; for AES-256, k6 and k7
will be filled. Then, more 32 bit blocks are defined recursively. The formulas for each of the three
key sizes follow. They all involve a function, f, which will be detailed shortly.
For AES-128:

ki = ki−4 ⊕ f(ki−1), if i ≡ 0 (mod 4)
ki = ki−4 ⊕ ki−1, if i ≠ 0 (mod 4)

For AES-192:

ki = ki−6 ⊕ f(ki−1), if i ≡ 0 (mod 6)
ki = ki−6 ⊕ ki−1, if i ≠ 0 (mod 6)
where f consists of a circular left shift of 1 byte for the input, followed by a substitution using
Rijndael’s S-box, for each byte, and finally an XOR of this result with the appropriate round con-
stant, RC (to be discussed).
The 256-bit case uses f, but also requires us to introduce a second function f2:

ki = ki−8 ⊕ f(ki−1), if i ≡ 0 (mod 8)
ki = ki−8 ⊕ f2(ki−1), if i ≡ 4 (mod 8)
ki = ki−8 ⊕ ki−1, otherwise

The function f2 is simpler than f, as it only makes use of the S-box. The shift and XOR are omitted.
The round constants are defined as follows (for any size key). In general, RCi = x^(i−1), with the
powers of x reduced modulo x^8 + x^4 + x^3 + x + 1. Expressed in binary:

RC1 = x^0 = 00000001
RC2 = x^1 = 00000010
RC3 = x^2 = 00000100
RC4 = x^3 = 00001000
RC5 = x^4 = 00010000
RC6 = x^5 = 00100000
RC7 = x^6 = 01000000
RC8 = x^7 = 10000000
RC9 = x^8 = 00011011
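The doubling pattern in this list is easy to reproduce: each constant is the previous one multiplied by x, with a reduction whenever the x^7 bit is shifted out. A short illustrative Python sketch:

```python
# Illustrative sketch: generate the round constants RC1, RC2, ... as
# successive powers of x in GF(2^8).

def round_constants(n):
    rc, out = 1, []             # RC1 = x^0 = 00000001
    for _ in range(n):
        out.append(rc)
        rc <<= 1                # multiply by x
        if rc & 0x100:
            rc ^= 0b100011011   # reduce: x^8 = x^4 + x^3 + x + 1
    return out

print([format(rc, '08b') for rc in round_constants(9)])
```

The ninth value produced is 00011011, the reduction of x^8, matching RC9 above.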
What concerns us the most about AES is its simple algebraic structure. It is possible
to write AES encryption as a relatively simple closed algebraic formula over the finite
field with 256 elements. This is not an attack, just a representation, but if anyone can
ever solve those formulas, then AES will be broken. This opens up an entirely new
avenue of attack. No other block cipher we know of has such a simple algebraic repre-
sentation. We have no idea whether this leads to an attack or not, but not knowing is
reason enough to be skeptical about the use of AES.
Ferguson and Schneier referenced a paper by Ferguson and others, detailing the simple representation of AES.37
Nevertheless, in 2005, AES was publicly endorsed by the National Security Agency, which made
it part of their “Suite B” list of recommendations.
There is, as of this writing, no practical attack against properly implemented AES. There are
some theoretical attacks, but this is a completely different matter. If someone finds a way to break
a cipher in 2% of the time that a brute-force attack would require, it will be of tremendous inter-
est to cryptologists, but if that 2% still requires millions of years on the world’s fastest computer,
34 Schneier, Bruce, “AES Announced,” Crypto-Gram Newsletter, October 15, 2000, http://www.schneier.com/
crypto-gram-0010.html.
35 Ferguson, Niels and Bruce Schneier, Practical Cryptography, Wiley, Indianapolis, Indiana, 2003, p. 56.
36 Ferguson, Niels and Bruce Schneier, Practical Cryptography, Wiley, Indianapolis, Indiana, 2003, p. 57.
37 Ferguson, Niels, Richard Schroeppel, and Doug Whiting, “A Simple Algebraic Representation of Rijndael,”
in Vaudenay, Serge and Amr M. Youssef, editors, Selected Areas in Cryptography: 8th Annual International
Workshop, SAC 2001, Lecture Notes in Computer Science, Vol. 2259, Springer, New York, 2001, pp. 103–111.
it has no importance to someone simply wanting to keep his messages private. A few theoretical
attacks and attacks against reduced rounds versions of AES can be found in the papers referenced
at the end of this chapter.
Figure 20.7 Bruce Schneier (1963–), Master of Metaphors. (Photograph by Per Ervland; http://
www.schneier.com/photo/.)
One of the finalists for the AES competition mentioned above was Twofish, designed by a team
that included Bruce Schneier (Figure 20.7). Schneier was also the creator of Blowfish. In addition
to his technical skills, Bruce has one of the most entertaining writing styles of anyone working
in the field of computer security. You will see his works cited in footnotes throughout part two
of this book. In recent years, Bruce has focused on practicalities of security (e.g., implementation
issues, non-cryptanalytic attacks, etc.) and broad issues. His style is informal and contains numer-
ous metaphors and examples. His monthly Crypto-Gram email newsletter now also exists in blog
form.38 It’s packed with links to articles covering all aspects of security. Schneier was a strong critic
of policies implemented under George W. Bush following 9/11. A conclusion he and coauthor
Niels Ferguson drew from their years of experience is a bit unsettling:39
In all our years working in this field, we have yet to see an entire system that is secure.
That’s right. Every system we have analyzed has been broken in one way or another.
38 Schneier, Bruce, Schneier on Security, A blog covering security and security technology, http://www.schneier.
com/. There are links here to many of his published essays, as well as links for purchasing signed copies of his
books.
39 Schneier, Bruce and Niels Ferguson, Practical Cryptography, Wiley, Indianapolis, Indiana, 2003, p. 1.
40 Koblitz, Neal, Random Curves: Journeys of a Mathematician, Springer, Berlin, Germany, 2008, p. 313.
Koblitz, Neal, Alfred J. Menezes, and Scott Vanstone, “The State of Elliptic Curve Cryptography,” Designs,
Codes and Cryptography, Vol. 19, Nos. 2‒3, March 2000, pp. 173‒193.
Lenstra, Jr., Hendrik W., “Factoring Integers with Elliptic Curves,” Annals of Mathematics, second series,
Vol. 126, No. 3, November 1987, pp. 649‒673. Lenstra had his result in 1984, before elliptic curves
served any other cryptologic purpose. See Koblitz, Neal, Random Curves: Journeys of a Mathematician,
Springer, Berlin, Germany, 2008, p. 299.
Menezes, Alfred J. Elliptic Curve Public Key Cryptosystems, Kluwer Academic Publishers, Boston,
Massachusetts, 1993. This was the first book devoted completely to elliptic curve cryptography.
Miller, Victor S., “Use of Elliptic Curves in Cryptography,” in Williams, Hugh C., editor, Advances in
Cryptology — CRYPTO ‘85 Proceedings, Lecture Notes in Computer Science, Vol. 218, Springer,
Berlin, Germany, 1986, pp. 417‒426.
NSA/CSS, The Case for Elliptic Curve Cryptography, National Security Agency/Central Security Service,
Fort George G. Meade, Maryland, January 15, 2009, https://web.archive.org/web/20090117023500/
http://www.nsa.gov/business/programs/elliptic_curve.shtml. The reasons behind the National
Security Agency’s endorsement of elliptic curve cryptography are detailed in this essay. The conclud-
ing paragraph states,
Elliptic Curve Cryptography provides greater security and more efficient performance than the
first generation public key techniques (RSA and Diffie-Hellman) now in use. As vendors look
to upgrade their systems they should seriously consider the elliptic curve alternative for the
computational and bandwidth advantages they offer at comparable security.
Rosing, Michael, Implementing Elliptic Curve Cryptography, Manning Publications Co., Greenwich,
Connecticut, 1999.
Smart, Nigel P., “The Discrete Logarithm Problem on Elliptic Curves of Trace One,” Journal of Cryptology,
Vol. 12, No. 3, Summer 1999, pp. 193‒196. This paper presents an attack against a very rare sort of
elliptic curve that can easily be avoided.
Solinas, Jerome A., “An Improved Algorithm for Arithmetic on a Family of Elliptic Curves,” in Kaliski, Jr.,
Burton S., editor, Advances in Cryptology — CRYPTO ‘97 Proceedings, Lecture Notes in Computer
Science, Vol. 1294, Springer, Berlin, Germany, 1997, pp. 357‒371. At the Crypto ‘97 conference,
Solinas gave an analysis of an improved algorithm for computations on the curves Koblitz proposed.
This was the first paper presented publicly at a cryptography meeting by an NSA employee.41
Washington, Lawrence C., Elliptic Curves: Number Theory and Cryptography, CRC Press, Boca Raton,
Florida, 2003.
On AES
Biham, Eli and Nathan Keller, “Cryptanalysis of reduced variants of Rijndael,” Proceedings of the Third
Advanced Encryption Standard Conference, National Institute of Standards and Technology (NIST),
Washington, DC, 2000, pp. 230‒241.
Biryukov, Alex, Dmitry Khovratovich, and Ivica Nikolić, “Distinguisher and related-key attack on the full
AES-256,” in Halevi, Shai, editor, Advances in Cryptology —CRYPTO 2009 Proceedings, Lecture
Notes in Computer Science, Vol. 5677, Springer, Berlin, Germany, 2009, pp. 231‒249. The abstract
includes the following:
Finally we extend our results to find the first publicly known attack on the full 14-round AES-
256: a related-key distinguisher which works for one out of every 2^35 keys with 2^120 data and
time complexity and negligible memory. This distinguisher is translated into a key-recovery
attack with total complexity of 2^131 time and 2^65 memory.
41 Koblitz, Neal, Random Curves: Journeys of a Mathematician, Springer, Berlin, Germany, 2008, p. 312.
Biryukov, Alex, Orr Dunkelman, Nathan Keller, Dmitry Khovratovich, and Adi Shamir, “Key Recovery
Attacks of Practical Complexity on AES-256 Variants with up to 10 Rounds,” in Gilbert, Henri,
editor, Advances in Cryptology, EUROCRYPT 2010 Proceedings, Lecture Notes in Computer Science,
Vol. 6110, Springer, Berlin, Germany, 2010, pp. 299‒319. This paper presents a practical attack on
a 10-round version of AES-256. Because the full AES-256 has 14 rounds, the attack does not
threaten real-world AES.
Biryukov, Alex and Dmitry Khovratovich, “Related-key Cryptanalysis of the Full AES-192 and AES-256,”
in Matsui, Mitsuru, editor, Advances in Cryptology — ASIACRYPT 2009 Proceedings, Lecture Notes
in Computer Science, Vol. 5912, Springer, Berlin, Germany, 2009, pp. 1‒18.
Bogdanov, Andrey, Dmitry Khovratovich, and Christian Rechberger, “Biclique Cryptanalysis of the Full
AES,” in Lee, Dong Hoon and Xiaoyun Wang, editors, Advances in Cryptology — ASIACRYPT 2011
Proceedings, Lecture Notes in Computer Science, Vol. 7073, Springer, Heidelberg, Germany, 2011,
pp. 344‒371.
Courtois, Nicolas T. and Josef Pieprzyk, “Cryptanalysis of Block Ciphers with Overdefined Systems of
Equations,” in Zheng, Yuliang, editor, Advances in Cryptology — ASIACRYPT 2002 Proceedings,
Lecture Notes in Computer Science, Vol. 2501, Springer, Berlin, Germany, 2002, pp. 267‒287.
Daemen, Joan and Vincent Rijmen, “The Design of Rijndael: AES—The Advanced Encryption Standard”,
Springer, New York, 2002. This is full disclosure, a 238-page book by the creators of AES explaining
exactly how the cipher was constructed and tested. Something like this should have accompanied
DES.
Ferguson, Niels, John Kelsey, Stefan Lucks, Bruce Schneier, Mike Stay, David Wagner, and Doug Whiting,
“Improved Cryptanalysis of Rijndael,” in Schneier, Bruce, editor, Fast Software Encryption, 7th
International Workshop, FSE 2000, Lecture Notes in Computer Science, Vol. 1978, Springer, New
York, 2000, pp. 213‒230.
Ferguson, Niels, Richard Schroeppel, and Doug Whiting, “A Simple Algebraic Representation of Rijndael,”
in Vaudenay, Serge and Amr M. Youssef, editors, Selected Areas in Cryptography, 8th Annual
International Workshop, SAC 2001, Lecture Notes in Computer Science, Vol. 2259, Springer, New
York, 2001, pp. 103‒111.
Gilbert, Henri and Thomas Peyrin, “Super-Sbox Cryptanalysis: Improved Attacks for AES-like permutations,”
in Hong, Seokhie and Tetsu Iwata, editors, Fast Software Encryption, 17th International Workshop,
FSE 2010, Springer, Berlin, Germany, 2010, pp. 365‒383.
Moser, Jeff, “A Stick Figure Guide to the Advanced Encryption Standard (AES),” Moserware, http://www.
moserware.com/2009/09/stick-figure-guide-to-advanced.html, September 22, 2009.
Musa, Mohammad A, Edward F. Schaefer, and Stephen Wedig, “A Simplified AES Algorithm and its
Linear and Differential Cryptanalyses,” Cryptologia, Vol. 27, No. 2, April 2003, pp. 148–177. This is
useful for pedagogical purposes, but it must be stressed that attacks on the simplified version don’t
necessarily “scale up” to attacks on AES.
National Institute of Standards and Technology (NIST), Announcing the Advanced Encryption Standard
(AES), Federal Information Processing Standards Publication 197, November 26, 2001, available
online at http://csrc.nist.gov/publications/fips/fips197/fips-197.pdf.
Phan, Raphaël C.-W., “Impossible Differential Cryptanalysis of 7-Round Advanced Encryption Standard
(AES),” Information Processing Letters, Vol. 91, No. 1, July 16, 2004, pp. 33–38.
Rijmen, Vincent, “Practical-Titled Attack on AES-128 Using Chosen-Text Relations,” January 2010,
https://eprint.iacr.org/2010/337.pdf.
Tao, Biaoshuai and Hongjun Wu, “Improving the Biclique Cryptanalysis of AES,” in Foo, Ernest and
Douglas Stebila, editors, Information Security and Privacy, ACISP 2015 Proceedings, Lecture Notes in
Computer Science, Vol. 9144, Springer, Cham Switzerland, 2015, pp. 39–56.
Chapter 21
Toward Tomorrow
The history of cryptology can be broken down into eras, with the first being the paper and pencil
era. Eventually, the process of encryption was mechanized and we entered the electromechanical
machine era. To break these more difficult ciphers, computers were created. But the computers
weren’t limited to cryptanalysis; they were also used to encipher. Thus, we have the computer era.
In recent years, an increasing ability to manipulate quantum particles and DNA has led to new
cryptologic techniques and new kinds of computers. As a consequence, we have entered the era of
post-quantum cryptology. This chapter tells the story of the birth of this era (at least as far as is
publicly known) and speculates on what might come next.
If Alice wishes to establish a key with Bob for future messages, without meeting him in per-
son, she begins by generating a random string of 0s and 1s and a random string of +s and ×s.1 For
example, such a string might begin
0 1 1 0 1 0 1 1 1 0 0 1 0 0 1 0 1 1 0 1
+ × + + × × × + × + × + + × + + × + × ×
She will now polarize photons to represent each 0 and 1. The + signs indicate she polarizes those
particular photons in either the ∣ direction or the — direction. She will use ∣ to represent 1 and — to
represent 0. If a particular bit has an × under it, Alice will use \ to represent 1 and / to represent
0. Adding a third line to our previous list of bits and polarization schemes, we show what Alice
actually sends (photons with the indicated polarizations):
0 1 1 0 1 0 1 1 1 0 0 1 0 0 1 0 1 1 0 1
+ × + + × × × + × + × + + × + + × + × ×
— \ ∣ — \ / \ ∣ \ — / ∣ — / ∣ — \ ∣ / \
On the receiving end, Bob sets up a filter for each photon in an attempt to determine its polar-
ization. If he sets his filter up like so ∣, he’ll correctly interpret any photons sent using the +
scheme. The ∣ photons will come through and the — photons will reveal themselves by not com-
ing through! Similarly, setting his filter up as — will correctly identify all photons sent according
to the + scheme. However, photons sent using the × scheme will come through with probability
½. Once through, they will appear to have the orientation of the filter; thus, Bob only has a fifty
percent chance of guessing these correctly.
Similarly, if he uses one of the filters from the × scheme, he’ll correctly identify the polar-
izations of all photons sent according to the × scheme, but err on average on half of the rest.
Remember, Bob doesn’t know which scheme, + or ×, Alice used for any particular photon; he
must guess!
We now add a fourth line to our diagram showing what sort of filter Bob used for each photon.

0 1 1 0 1 0 1 1 1 0 0 1 0 0 1 0 1 1 0 1
+ × + + × × × + × + × + + × + + × + × ×
— \ ∣ — \ / \ ∣ \ — / ∣ — / ∣ — \ ∣ / \
× + + × × + × + × × + × + × × + + × + ×
Of course, Bob has no way of knowing which recovered bits are correct and which aren’t. He
calls Alice and tells her what scheme he used for each. In our example, she would tell him that he
guessed correctly for positions 3, 5, 7, 8, 9, 13, 14, 16, and 20. It’s okay if someone is listening in
at this stage. Alice and Bob then discard all positions for which Bob guessed incorrectly. The bits
that remain will serve as their key. Thus, their key begins 111110001. These digits should not be
revealed; Alice only confirms that Bob guessed the correct filtering scheme for certain positions.
This being the case, the pair knows that Bob has correctly recovered the bits in those positions. He
may have correctly recovered other bits by chance, but those are ignored. Now suppose someone,
say Eve, was eavesdropping on the first communication, when Alice sent the photons. To eaves-
drop, Eve would have had to set up filters to measure the photon polarizations. There is no other
way to obtain the information she seeks. But Eve’s filters will have changed the polarizations on
the photons for which she guessed the wrong scheme. These changes would carry through to Bob.
Thus, to make sure Eve wasn’t listening in, Alice and Bob spot check some positions. Alice might
ask, “For positions 7, 13, and 16 did you get 0, 1, and 0?” If Bob answers affirmatively, they gain
confidence that there was no eavesdropper, discard the bits revealed and use the rest as their key.
To make the example above fit on a single line, only 20 of the photons sent were detailed. In a
real implementation, Alice would have to send many more. If she hopes to establish a 128-bit key,
sending 256 photons wouldn’t be enough. Bob may guess the correct filtering scheme for half of
the photons he receives, but no room would be left to check for an eavesdropper. Alice would be
wiser to send 300 bits, allowing room for Bob to be an unlucky guesser, as well as allowing enough
randomly checked values to determine with a high degree of certainty that no one was listening in.
It may happen that Eve guessed correctly on the filter orientations for all of the bits Alice and
Bob used to check for her presence on the line, but the probability of her managing that for n bits
is only (1/2)^n; however, the probability of her eavesdropping going undetected is a bit higher. She'll
guess the correct orientation half the time, but even when she guesses wrong, she'll get the correct
value half the time by chance; hence, she draws the correct conclusion 3/4 of the time. Thus, by
using n check bits, Alice and Bob reduce the probability of Eve going unnoticed to (3/4)^n, which
can still be made as small as they desire.
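The whole procedure is easy to simulate classically. The Python sketch below is a hypothetical toy model (my own, not from Bennett and Brassard): it mimics the sifting step and shows why Eve's wrong-basis measurements leave detectable errors.

```python
# Illustrative toy simulation of quantum key distribution sifting.
import random

def bb84_sift(n, eve=False, rng=random):
    """Return (Alice's key, Bob's key) after basis reconciliation."""
    alice_bits = [rng.randrange(2) for _ in range(n)]
    alice_bases = [rng.choice('+x') for _ in range(n)]

    # The photon stream as it reaches Bob; when Eve measures in the
    # wrong basis she re-polarizes the photon and randomizes its bit.
    bits, bases = list(alice_bits), list(alice_bases)
    if eve:
        for i in range(n):
            eve_basis = rng.choice('+x')
            if eve_basis != bases[i]:
                bits[i] = rng.randrange(2)
                bases[i] = eve_basis

    # Bob guesses a basis per photon; a wrong guess yields a random bit.
    bob_bases = [rng.choice('+x') for _ in range(n)]
    bob_bits = [bits[i] if bob_bases[i] == bases[i] else rng.randrange(2)
                for i in range(n)]

    # Sifting: keep only positions where Bob's basis matched Alice's.
    keep = [i for i in range(n) if bob_bases[i] == alice_bases[i]]
    return ([alice_bits[i] for i in keep],
            [bob_bits[i] for i in keep])

a_key, b_key = bb84_sift(300)
print(len(a_key), a_key == b_key)   # roughly 150, and True without Eve
```

With eve=True, about a quarter of the sifted bits disagree, which is exactly the discrepancy the spot check is designed to catch.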
At first, we wanted the quantum signal to encode the transmitter’s confidential mes-
sage in such a way that the receiver could decode it if no eavesdropper were present,
but any attempt by the eavesdropper to intercept the message would spoil it without
revealing any information. Any such futile attempt at eavesdropping would be detected
by the legitimate receiver, alerting him to the presence of the eavesdropper. Since this
2 A minor theme in this book, an important paper that was initially rejected: Wiesner, Stephen, “Conjugate
Coding,” SIGACT News, Vol. 15, No. 1, Winter-Spring 1983, pp. 78–88.
3 Bennett, Charles H., Gilles Brassard, Seth Breidbart, and Stephen Wiesner, “Quantum Cryptography, or
Unforgeable Subway Tokens,” in Chaum, David, Ronald L. Rivest, and Alan T. Sherman, editors, Advances in
Cryptology, Proceedings of Crypto ’82, Plenum Press, New York, 1983, pp. 267–275. This is the first published
paper on Quantum Cryptography. Indeed, the first paper in which those words were even put together.
4 Brassard, Gilles, “Brief History of Quantum Cryptography: A Personal Perspective,” in Proceedings of IEEE
Information Theory Workshop on Theory and Practice in Information Theoretic Security, Awaji Island, Japan,
October 17, 2005, pp. 19–23. A longer (14 pages) version is available online at http://arxiv.org/pdf/quant-
ph/0604072v1.pdf. The passage here is taken from page 4 of the online version.
early scheme was unidirectional, it required the legitimate parties to share a secret
key, much as in a one-time pad encryption. The originality of our scheme was that
the same one-time pad could be reused safely over and over again if no eavesdropping
were detected. Thus, the title of our paper was “Quantum Cryptography II: How to
reuse a one-time pad safely even if P = NP.”5 We submitted this paper to major theo-
retical computer science conferences, such as STOC (The ACM Annual Symposium on
Theoretical Computer Science), but we failed to have it accepted. Contrary to Wiesner’s
“Conjugate Coding”, however, our “Quantum Cryptography II” paper has forever
remained unpublished (copies are available from the authors).
Undeterred by the rejection, Bennett and Brassard came up with a new way of doing things (the
scheme presented at the start of this chapter) and gave a long presentation at the 1983 IEEE
5 Bennett, Charles H., Gilles Brassard, and Seth Breidbart, “Quantum Cryptography II: How to reuse a one-
time pad safely even if P = NP,” paper rejected from 15th Annual ACM Symposium on Theory of Computing,
Boston, May 1983. Historical document dated “November 1982” available from the first two authors.
Symposium on Information Theory (ISIT), which was held in St-Jovite, Canada (near Brassard’s
hometown of Montréal). Brassard notes that the one-page abstract that was published6 provides
the official birth certificate for Quantum Key Distribution.7
Seeing print and making an impact can be two very different things. Although Wiesner,
Bennett, and Brassard were now getting their ideas published, hardly anyone was taking
notice. Well, even the best researchers can benefit from social networking. Brassard got an
assist when his good friend Vijay Bhargava invited him to give a talk on whatever he wanted (!)
at an IEEE conference to be held in Bangalore, India, in December 1984. Brassard accepted
the invitation and gave a talk on quantum cryptography. The associated five-page paper,
“Quantum Cryptography: Public key Distribution and Coin Tossing,” authored by Bennett
and Brassard, 8 has, as of this writing (October 13, 2020) earned 8,417 citations according to
Google Scholar.
As a side-note, when I ask students to write papers in my cryptology class, someone always asks
how long it has to be. Well, a five-page paper could be acceptable… The 1953 paper by Crick and
Watson9 that described the double helical structure of DNA for the first time was only two pages
long. So, I could be content with a two-page paper… Most students are used to writing to a given
length, rather than writing until the topic has been completely covered in as clear a manner as
possible. They usually keep asking after I make comments like those above.
Back to Bennett and Brassard! The 8,417 citations referred to above are strongly skewed
to more recent years. Brassard noted, “Throughout the 1980’s, very few people took quantum
cryptography seriously and most people simply ignored it.”10 In fact, in 1987, Doug
Wiedemann had a paper published in Sigact News that presented the exact same scheme as in
the 1984 paper by Bennett and Brassard. He even called it quantum cryptography!11 So, there
was someone deeply interested in the topic who didn’t know about Bennett and Brassard’s
work — nor, apparently, did the editor who published the reinvention! One wonders who the
reviewers were.
The two researchers decided that they needed to physically demonstrate their scheme to
gain some attention. They recruited some help and (without any special budget) had their secret
6 Bennett, Charles H. and Gilles Brassard, “Quantum Cryptography and its Application to Provably Secure
Key Expansion, Public-key Distribution, and Cointossing,” in Proceedings of IEEE International Symposium on
Information Theory (abstracts), St-Jovite, Quebec, Canada, IEEE, New York, 1983, p. 91.
7 Brassard, Gilles, “Brief History of Quantum Cryptography: A Personal Perspective,” in Proceedings of IEEE
Information Theory Workshop on Theory and Practice in Information Theoretic Security, Awaji Island, Japan,
October 17, 2005, pp. 19–23. A longer, 14-page version is available online at http://arxiv.org/pdf/quant-
ph/0604072v1.pdf.
8 Bennett, Charles H. and Gilles Brassard, “Quantum cryptography: Public key Distribution and coin tossing,”
in International Conference on Computers, Systems & Signal Processing, Vol. 1, Bangalore, India, pp. 175–179,
December 1984, available online at https://arxiv.org/ftp/arxiv/papers/2003/2003.06557.pdf.
9 Watson, James D. and Crick, Francis H. C., “A Structure for Deoxyribose Nucleic Acid,” Nature, Vol. 171,
No. 4356, April 25, 1953, pp. 737–738. This paper was really a single page. Acknowledgments and references
caused it to spill over a bit onto a second page.
10 Brassard, Gilles, “Brief History of Quantum Cryptography: A Personal Perspective,” in Proceedings of IEEE
Information Theory Workshop on Theory and Practice in Information Theoretic Security, Awaji Island, Japan,
October 17, 2005, pp. 19–23. A longer, 14-page version is available online at http://arxiv.org/pdf/quant-
ph/0604072v1.pdf. The passage here is taken from page 5 of the online version.
11 Wiedemann, Doug, “Quantum cryptography,” Sigact News, Vol. 18, No. 2, 1987, pp. 48–51.
quantum transmission system, over a distance of 32.5 centimeters, working in October 1989.
Brassard recalls,12
The funny thing is that, while our theory had been serious, our prototype was mostly a
joke. Indeed, the largest piece in the prototype was the power supply needed to feed in
the order of one thousand volts to Pockels cells, used to turn photon polarization. But
power supplies make noise, and not the same noise for the different voltages needed
for different polarizations. So, we could literally hear the photons as they flew, and
zeroes and ones made different noises. Thus, our prototype was unconditionally secure
against any eavesdropper who happened to be deaf! :-)
Despite the noise, this demonstration marked the turning point for Bennett and Brassard.
Physicists were now interested. Artur K. Ekert found a different way to accomplish quantum key
distribution. He used quantum entanglement, rather than polarization, and published his result
in a physics journal, which helped spread the idea of quantum cryptography more broadly.13 The
results thus far were even featured in the popular magazine Scientific American.14
Today, there is tremendous worldwide interest in the field, and experiments that transmit
actual quantum bits are constantly setting new records in terms of distance. As mentioned above,
Bennett and Brassard began in 1989 by measuring distances in centimeters, as photons were sent
from one machine to another right next to it, and progressed from there to distances of around
100 kilometers. Many experts thought this was about the limit, without the use of “quantum
repeaters” that could serve as relays, but how could a signal be repeated, if the act of reading it
changed it? In 2002, Brassard observed that,15
A repeater that doesn’t measure was thought to be impossible in the early 1980s, but
since then scientists have shown that it is feasible in principle. But we’re nowhere near
the technology to build one.
In late October 2010, researchers from Georgia Institute of Technology presented a breakthrough
at the annual meeting of the Optical Society of America (OSA). The team had built a quantum
repeater that could allow quantum bits to be sent distances of 1,000 kilometers or more.16
In the meanwhile, the record for open-air transmission of photons hit 144 kilometers, as a key
was sent from one of the Canary Islands to another. The team initially used bursts of photons but
succeeded with single photons in 2007.17 This is a very important distinction. Many experiments
12 Brassard, Gilles, “Brief History of Quantum Cryptography: A Personal Perspective,” in Proceedings of IEEE
Information Theory Workshop on Theory and Practice in Information Theoretic Security, Awaji Island, Japan,
October 17, 2005, pp. 19–23. A longer, 14-page version is available online at http://arxiv.org/pdf/quant-
ph/0604072v1.pdf. The passage here is taken from page 6 of the online version.
13 Ekert, Artur K., "Quantum Cryptography Based on Bell's Theorem," Physical Review Letters, Vol. 67, No. 6,
August 5, 1991, pp. 661–663.
17 www.nature.com/news/2007/070305/full/070305-12.html.
Toward Tomorrow ◾ 575
used groups of photons, all having the same polarizations, to represent the individual bits. This
makes the system more reliable, but defeats the security of the theoretical model.
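To see what is at stake in that distinction, recall the outline of the protocol Bennett and Brassard published in 1984. The toy simulation below (an illustration only: no noise, no eavesdropper, and each photon's polarization abstracted to a basis/bit pair) shows the sifting step that turns raw transmissions into shared key. The security argument assumes each bit rides on a single photon; a pulse of identically polarized photons would let an eavesdropper peel one photon off and measure it at leisure.

```python
import random

# Toy, noise-free BB84 key sifting (no eavesdropper modeled).
# A photon's polarization is abstracted to a (basis, bit) pair.
random.seed(1)

n = 32
alice_bits  = [random.randint(0, 1) for _ in range(n)]
alice_bases = [random.choice("+x") for _ in range(n)]   # rectilinear or diagonal
bob_bases   = [random.choice("+x") for _ in range(n)]

# Measuring in the correct basis recovers Alice's bit;
# measuring in the wrong basis gives a random result.
bob_bits = [bit if ab == bb else random.randint(0, 1)
            for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# Sifting: Alice and Bob publicly compare bases (not bits) and keep
# only the positions where the bases agreed -- about half of them.
sifted = [(a, b)
          for a, b, ab, bb in zip(alice_bits, bob_bits, alice_bases, bob_bases)
          if ab == bb]

assert all(a == b for a, b in sifted)   # no eavesdropper: keys agree
print(f"{len(sifted)} of {n} raw bits survive sifting")
```

An intercept-and-resend eavesdropper, guessing bases at random, would corrupt about a quarter of the sifted bits, which is what Alice and Bob check for by sacrificing a sample of their key.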
On October 21, 2009, quantum key distribution was used to transmit votes securely in a Swiss
election.18
At the risk of sounding like a travel log, I now present a result from Japan. In 2011, a paper
titled “Field test of quantum key distribution in the Tokyo QKD Network” with 46 authors
appeared in Optics Express.19 It described a QKD network that achieved quantum OTP (One-
Time Pad) encryption for distances up to 135 km at a fast enough rate to allow video conferenc-
ing and mobile telephony. The demonstration made in October 2010 included (intentionally) an
eavesdropper. The system detected the eavesdropper’s presence and rerouted to cut that person out,
with no noticeable interruption for the participants (a buffer stored enough key to cover the com-
munication until the rerouting was completed). The authors noted, “These demonstrations suggest
that practical applications of QKD in a metropolitan network may be just around the corner.”
They also noted how such networks could be compromised.
Many QKD protocols, such as the one-way BB84 protocol, have been proven to be
unconditionally secure, which means the protocol, which is based on mathematical
device model assumptions, cannot be “cracked” as long as the laws of physics remain
true. On the other hand real world implementations have unavoidable imperfections
and will therefore be susceptible to side-channel attacks.20
It’s very important not to underestimate the threat of side-channel attacks. They have been grow-
ing in importance for decades.
In 2016, China set up a 2,000 km long quantum channel linking Beijing and Shanghai. This
long run actually has 32 “trusted nodes” along the way to refresh the signal. These are potential
weak spots.21 The Chinese also launched a satellite in 2016 for the purpose of establishing a link
for quantum key distribution in orbit. By 2020, it was successfully exchanging key with a portable
station (weighing only 80 kg!) on the ground. This is not just a proof of concept. Industrial and
Commercial Bank of China (ICBC) and the People's Bank of China use the system, although with
heavier, but faster, ground stations.22
18 "Geneva is Counting on Quantum Cryptography as it Counts its Votes," October 11, 2007, https://cordis.
europa.eu/docs/projects/cnect/3/506813/080/publishing/readmore/SECOQC-pressrelease2.pdf.
19 Sasaki, M., M. Fujiwara, H. Ishizuka, W. Klaus, K. Wakui, M. Takeoka, S. Miki, T. Yamashita, Z. Wang, […],
"Field test of quantum key distribution in the Tokyo QKD Network," Optics Express, Vol. 19, No. 11, 2011,
pp. 10387–10409.
Quantum particles aren’t just for defense, they can also be used to attack ciphers when they are
doing their thing in a quantum computer.
Our Sycamore processor takes about 200 seconds to sample one instance of a quan-
tum circuit a million times—our benchmarks currently indicate that the equiva-
lent task for a state-of-the-art classical supercomputer would take approximately
10,000 years. This dramatic increase in speed compared to all known classical algorithms […]26
22 Lu, Donna, “China has developed the world’s first mobile quantum satellite station,” New Scientist, January 9,
2020, available online at https://www.newscientist.com/article/2229673-china-has-developed-the-worlds-first-
mobile-quantum-satellite-station/#.
23 Shor, P. W., “Algorithms for quantum computation: discrete logarithms and factoring,” in Goldwasser, Shafi,
editor, Proceedings 35th Annual Symposium on Foundations of Computer Science, IEEE Computer Society Press,
Los Alamitos, California, 1994, pp. 124–134.
24 Grover, Lov K., “A fast quantum mechanical algorithm for database search,” in Miller, Gary L., editor,
Proceedings of 28th ACM Symposium on Theory of Computing (STOC '96), ACM Press, New York, 1996, pp.
212–219.
25 Bennett, Charles H., Ethan Bernstein, Gilles Brassard, and Umesh Vazirani, “Strengths and Weaknesses of
Quantum Computing,” SIAM Journal on Computing, Vol. 26, No. 5, October 1997, pp. 1510–1523, available
online at https://arxiv.org/pdf/quant-ph/9701001.pdf.
IBM, a competitor in the quantum computer development race, objected to this claim, saying that
the time on a state-of-the-art classical supercomputer is 2.5 days, not 10,000 years.27 As of May
2020, IBM has 18 quantum computers, Honeywell has 6, and Google has 5.28
One way to protect communications against such new machines (as well as improved versions,
yet to be, that will make these look like toys) is by setting up a quantum key distribution network,
as described earlier in this chapter. Another is to replace current algorithms with ones believed to
be able to resist quantum computer attacks. The next two sections detail how NSA and NIST are
slowly prodding people in this direction.
Until this new [quantum resistant algorithms] suite is developed and products are
available implementing the quantum resistant suite, we will rely on current algo-
rithms. For those partners and vendors that have not yet made the transition to Suite
B elliptic curve algorithms, we recommend not making a significant expenditure to do
so at this point but instead to prepare for the upcoming quantum resistant algorithm
transition.30
The CNSA Suite did not contain any new algorithms. The list had the old popular schemes:
AES, elliptic curve schemes, SHA, Diffie-Hellman, and RSA. That is, RSA was placed in higher
esteem than in Suite B, and DSA was dropped. The main difference in the retained algorithms
was that the key sizes were much larger. For example, for Diffie-Hellman key exchange, the
requirement was "Minimum 3072-bit modulus to protect up to TOP SECRET."31
26 Arute, Frank, Kunal Arya, […], and John M. Martinis, "Quantum Supremacy Using a Programmable
Superconducting Processor," Nature, Vol. 574, No. 7779, October 24, 2019, pp. 505–510, available online at
https://www.nature.com/articles/s41586-019-1666-5. I hope the 74 authors I represented by […] will forgive
me.
27 Pednault, Edwin, John Gunnels, Dmitri Maslov, and Jay Gambetta, "On "Quantum Supremacy"," IBM Research.
28 https://www.cnet.com/news/ibm-now-has-18-quantum-computers-in-its-fleet-of-weird-machines/.
29 "Commercial National Security Algorithm Suite," National Security Agency | Central Security Service.
The other newsworthy update was expressed as follows:
Unfortunately, the growth of elliptic curve use has bumped up against the fact of con-
tinued progress in the research on quantum computing, which has made it clear that
elliptic curve cryptography is not the long term solution many once hoped it would be.
Thus, we have been obligated to update our strategy.32
These lines led to much speculation, a summary of which was presented in a paper by Neal Koblitz,
a co-discoverer of elliptic curve cryptography, and Alfred J. Menezes.33 In an email to me, Koblitz
noted, “It’s interesting that one of the leading contenders for “post-quantum cryptography” is
based on elliptic curves, but in a totally different way from ECC. This is the “isogeny-based”
approach of Jao and others.”34
We’re looking to replace three NIST cryptographic standards and guidelines that would
be the most vulnerable to quantum computers. They deal with encryption, key estab-
lishment and digital signatures, all of which use forms of public key cryptography.36
All of the submitters whose entries met NIST's acceptability requirements would be invited to
present their algorithms at a workshop in early 2018. NIST planned for this to be followed by an
evaluation phase that would “take an estimated three to five years” to complete.37 We are still in
this multi-round evaluation phase, as of this writing. Of the original 82 submissions received by
the November 30, 2017 deadline, 26 made it to round 2. These semi-finalists were announced
31 “Commercial National Security Algorithm Suite,” National Security Agency | Central Security Service,
August 19, 2015, https://apps.nsa.gov/iaarchive/programs/iad-initiatives/cnsa-suite.cfm.
32 "Commercial National Security Algorithm Suite," National Security Agency | Central Security Service,
August 19, 2015, https://apps.nsa.gov/iaarchive/programs/iad-initiatives/cnsa-suite.cfm.
36 NIST Asks Public to Help Future-Proof Electronic Information, NIST News, December 20, 2016, updated January 8,
2018, https://www.nist.gov/news-events/news/2016/12/nist-asks-public-help-future-proof-electronic-information.
37 NIST Asks Public to Help Future-Proof Electronic Information, NIST News, December 20, 2016, updated January 8,
2018, https://www.nist.gov/news-events/news/2016/12/nist-asks-public-help-future-proof-electronic-information.
on January 30, 2019.38 On July 22, 2020, NIST announced seven third-round finalists and eight
alternates.39 Which will win is far from obvious. Given this, and the quote from Neal Koblitz and
Alfred J. Menezes that follows, I think singling one out to detail here would be inappropriate.
Most quantum-resistant systems that have been proposed are complicated, have cri-
teria for parameter selection that are not completely clear, and in some cases (such as
NTRU) have a history of successful attacks on earlier versions.40
21.6 Predictions
Over the course of this text we’ve seen several inaccurate predictions made by very clever and successful
individuals (such as Alan Turing, Martin Gardner, and Gilles Brassard). Because turnabout is fair play,
I also made a prediction in the first edition. It was “By 2040 quantum computers will have become
a reality necessitating a complete rethinking of encryption.” This prediction included a footnote that
added “I actually think it will happen sooner. I picked a year far enough away to give me a wide safety
margin, yet still within my expected lifetime, so I can receive criticism in person, if I’m wrong.”
Looking back, this prediction doesn't seem very bold. Here's my new (bolder) prediction for the
second edition: By 2050 a computer that bends spacetime will be a reality (if it isn't already, somewhere).
In the meanwhile, let’s consider another new type of computer.
When I was an undergraduate in the ‘60s, I thought biology was stuff that smelled
funny in the refrigerator. Now, biology is finite strings over a four-letter alphabet and
functions performed by enzymes on these strings.
38 Alagic, Gorjan, Jacob Alperin-Sheriff, Daniel Apon, David Cooper, Quynh Dang, Carl Miller, Dustin Moody,
Rene Peralta, Ray Perlner, Angela Robinson, Daniel Smith-Tone, and Yi-Kai Liu, NISTIR 8240, Status Report
on the First Round of the NIST Post-Quantum Cryptography Standardization Process, NIST Information
Technology Laboratory, Computer Security Resource Center, Publications, January 2019, https://csrc.nist.
gov/publications/detail/nistir/8240/final.
39 PQC Standardization Process: NIST, Third Round Candidate Announcement, July 22, 2020, https://csrc.
nist.gov/News/2020/pqc-third-round-candidate-announcement.
40 Cook, John D., “Between now and quantum,” John D. Cook Consulting Blog, May 23, 2019, https://www.
His new insight for the 1990s was that DNA could take the place of traditional computing means.
It would be well suited to calculations that can be attacked with massively parallel processing.
Adleman demonstrated the idea of DNA computing by solving an instance of
the (noncryptographic) directed Hamiltonian path problem. It is illustrated in Figure 21.4 with
the graph he used.
[Figure 21.4: Adleman's directed graph on seven vertices, with starting vertex 0 and ending vertex 6.]
You may imagine the circles as locations on a map. They are called vertices (or nodes or points).
The lines connecting them may be thought of as roads connecting the points of interest. They are
usually called edges. Some of the edges are one-way, in the directions indicated by the arrows. In
some cases, there is a separate edge, connecting the same vertices, but in the opposite direction,
offering the traveler a simple way back. The challenge is, given a starting vertex vin and an end-
ing vertex vout, to find a path that passes through all of the other vertices exactly once. The path
needn’t make use of every edge. Such paths do not exist for every directed graph. If one is present,
it is called a Hamiltonian path after William Rowan Hamilton (1805–1865). Take a few moments
to find a Hamiltonian path for the graph in Figure 21.4 with vin = 0 and vout = 6. Once you have
found it, label the intermediate points 1, 2, 3, 4, 5, in the order your solution passes through them.
For this particular problem, the solution is unique. Some graphs have several distinct solutions for
a given starting point and ending point.
This simple-sounding problem is NP-complete. We can solve small examples by hand, but
there is no known polynomial-time algorithm for such problems in general.
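For a graph this small, a brute-force search settles the question instantly. The sketch below checks every ordering of the intermediate vertices; since Figure 21.4's exact edges are not reproduced here, the edge set is an illustrative stand-in with the same vertices (0 through 6) and the same unique solution path.

```python
from itertools import permutations

# Brute-force directed Hamiltonian path search. EDGES is an illustrative
# stand-in (not Figure 21.4's exact edges) on vertices 0..6, chosen so
# that 0 -> 1 -> 2 -> 3 -> 4 -> 5 -> 6 is the unique solution.
EDGES = {(0, 1), (0, 3), (0, 6), (1, 2), (2, 3), (3, 2),
         (3, 4), (4, 1), (4, 5), (5, 6)}

def hamiltonian_paths(vertices, edges, v_in, v_out):
    """Yield every ordering of the vertices that forms a path v_in -> v_out."""
    middle = [v for v in vertices if v not in (v_in, v_out)]
    for perm in permutations(middle):
        path = [v_in, *perm, v_out]
        if all(step in edges for step in zip(path, path[1:])):
            yield path

print(list(hamiltonian_paths(range(7), EDGES, 0, 6)))   # [[0, 1, 2, 3, 4, 5, 6]]
```

Checking all (n − 2)! orderings is fine for seven vertices but hopeless for large n, which is exactly the difficulty the NP-completeness result formalizes.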
To find the solution you discovered, Adleman began by randomly choosing bases to form
strands of DNA, labeled O0, O1, O2, O3, O4, O5, and O6. Known as oligonucleotides (hence, the O
used in our notation), these strands are only half of the DNA "ladder" we usually picture when
imagining DNA. A few sample values are given below:
O2=TATCGGATCGGTATATCCGA
O3=GCTATTCGAGCTTAAAGCTA
O4=GGCTAGGTACCAGCATGCTT
We may then form the Watson-Crick complement of each of these, using the fact that the
complement of A is T and the complement of G is C.
Ō2=ATAGCCTAGCCATATAGGCT
Ō3=CGATAAGCTCGAATTTCGAT
Ō4=CCGATCCATGGTCGTACGAA
There is nothing special about the choices used for the bases or their order. All that matters is
that we have as many strands as there are vertices in our graph. Random strings of A, C, G, and T
will suffice. However, there is no flexibility in forming the complements (above) or in selecting the
bases used to represent the edges (below).
O2→3=GTATATCCGAGCTATTCGAG
O3→4=CTTAAAGCTAGGCTAGGTAC
Edge Oi→j is created by taking the second half of Oi and appending to it the first half of Oj.
These edges are defined so that they may join (by bonding) the oligonucleotides representing the
vertices.
Thus, the vertices of our original graph are represented by O0, O1, O2, O3, O4, O5, and O6.
Notice that the edge O3→2 will not be the same as the edge O2→3. Starting off with oligonucle-
otides of length 20 ensures that we’ll have more than enough different possibilities to encode all
7 vertices, and from these get the representations of all 14 edges. When the strands representing
the vertices and edges are placed close enough to bond, the bonds will represent paths in the
graph.
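The two construction rules are mechanical enough to state in a few lines of code. This sketch reproduces the sample 20-mers from the text; `complement` and `edge` are hypothetical helper names, but their outputs match the strands displayed above.

```python
# The encoding rules just described, applied to the sample strands from
# the text. Only O2, O3, and O4 are given above, so only those appear.
O = {
    2: "TATCGGATCGGTATATCCGA",
    3: "GCTATTCGAGCTTAAAGCTA",
    4: "GGCTAGGTACCAGCATGCTT",
}

PAIR = {"A": "T", "T": "A", "C": "G", "G": "C"}

def complement(strand):
    """Watson-Crick complement, base by base."""
    return "".join(PAIR[base] for base in strand)

def edge(i, j):
    """Edge O_i->j: second half of O_i followed by first half of O_j."""
    return O[i][10:] + O[j][:10]

print(complement(O[2]))   # ATAGCCTAGCCATATAGGCT
print(edge(2, 3))         # GTATATCCGAGCTATTCGAG
print(edge(3, 4))         # CTTAAAGCTAGGCTAGGTAC
```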
Because the only possible bondings are between C and G or A and T, we can only have vertices
linked by edges, in the manner shown above, if the appropriate path is in fact present. A much
weaker sort of bonding, like the following, is possible:
(edge O2→3)
GTATATCCGAGCTATTCGAG
CGATAAGCTCGAATTTCGAT
(Ō3, the complement of vertex O3)
This doesn’t represent anything in terms of the problem we are investigating. Fortunately, such
weak bondings typically break apart and do not present any interference with the strong bonds
formed to represent partial paths.
When a DNA “soup” is prepared containing many copies of the oligonucleotides represent-
ing the vertices and edges of a graph, bonding very quickly forms potential solutions to the
Hamiltonian path problem. The investigator may then filter out a valid solution (if one exists for
the particular problem) by selecting DNA segments of the appropriate length.
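The whole procedure can be caricatured in ordinary code. In the sketch below (an illustration only: the edge set is a stand-in for Figure 21.4's, and random walks replace the actual chemistry), a huge number of random "molecules" is generated and the lab steps become filters:

```python
import random

# An in-silico caricature of Adleman's generate-and-filter procedure.
# Random walks stand in for the random bonding in the soup, and the lab
# steps become filters. EDGES is an illustrative stand-in on vertices
# 0..6 (v_in = 0, v_out = 6), not Figure 21.4's exact edges.
random.seed(0)

EDGES = [(0, 1), (0, 3), (0, 6), (1, 2), (2, 3), (3, 2),
         (3, 4), (4, 1), (4, 5), (5, 6)]
N, V_IN, V_OUT = 7, 0, 6

def random_molecule():
    """Chain randomly chosen compatible edges, as bonding in the soup does."""
    chain = [random.choice(EDGES)]
    while random.random() < 0.9:          # keep growing, usually
        nxt = [e for e in EDGES if e[0] == chain[-1][1]]
        if not nxt:
            break
        chain.append(random.choice(nxt))
    return [chain[0][0]] + [e[1] for e in chain]

soup = [random_molecule() for _ in range(100_000)]
paths = [p for p in soup if p[0] == V_IN and p[-1] == V_OUT]  # right endpoints
paths = [p for p in paths if len(p) == N]                     # right length
paths = [p for p in paths if set(p) == set(range(N))]         # every vertex once
print(sorted(set(map(tuple, paths))))   # [(0, 1, 2, 3, 4, 5, 6)]
```

With enough random molecules, many copies of the unique Hamiltonian path survive all three filters, just as many copies of the answer strand survived Adleman's length-selection step.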
Writing a program to look for a solution to such a problem, for a large graph, using a tradi-
tional computer would not take very long, but execution of the program would. DNA computing
is quite different in that the setup and final interpretation are the time-consuming parts. The
actual runtime of the DNA program is extremely short. Improved lab techniques should eventu-
ally cut down on the presently time-consuming portions of this approach, but it is too early to
predict how practical this method may become.
Adleman’s small example was intended merely as an illustration of his new approach to com-
puting. It is obviously easier to do this particular problem by hand. Even for this small problem,
the lab work required a full week. The value of the approach is, as Adleman points out, “that the
methods described here could be scaled-up to accommodate much larger graphs.”43
A key advantage for this approach to larger Hamiltonian path problems is that the number of
distinct oligonucleotides needed to represent the graph grows linearly with the size of the graph,
although many copies of each are necessary. Adleman used approximately 3 × 10¹³ copies of the
oligonucleotides representing each edge in his small example. This was far more than was neces-
sary and likely led to many copies of the solution being present. Adleman expects that the neces-
sary number of copies of each oligonucleotide grows exponentially with the number of vertices.
Adleman sums up the advantage of this manner of computation, with some plausible improve-
ments: “At this scale, the number of operations per second during the ligation step would exceed
that of current supercomputers by more than a thousand fold."44 Other advantages include
dramatically increased energy efficiency and decreased storage requirements.
It is important to note that DNA computing is not limited to a special class of problems. In the
reference section below you’ll see a paper by Boneh, Lipton, and Dunworth that shows how DES
can be broken by a DNA computer. This is done by using DNA to code every possible key and
then trying them all at once! Such are the possibilities of parallel processing to the degree DNA
computing allows.
Back in 1995, Lipton estimated that a DNA computer with trillions of parallel processors
could be made for $100,000.45 This price tag brings to mind the high cost that Diffie and Hellman
43 Adleman, Leonard M., “Molecular Computation of Solutions to Combinatorial Problems,” Science, Vol. 266,
No. 5187, Nov. 11, 1994, pp. 1021–1024, p. 1022 cited here.
44 Adleman, Leonard M., “Molecular Computation of Solutions to Combinatorial Problems,” Science, Vol. 266,
No. 5187, Nov. 11, 1994, pp. 1021–1024, p. 1023 cited here.
45 Bass, Thomas A., “Gene Genie,” Wired, Vol. 3, No. 8, August 1995, pp. 114–117, 164–168.
originally placed on their hypothetical DES cracker ($20 million). When that machine finally
appeared, thanks to the EFF, it cost less than $250,000. How much cheaper will DNA computers
be in the future? Will you end up owning one?
Just as traditional computers began as specialized machines for solving particular problems,
and only later became “universal” programmable machines, so the history of DNA computers
goes. A programmable DNA computer was put forth by Israeli researchers from the Weizmann
Institute of Science in 2002.46 It offered a tremendous advantage in terms of speed, efficiency,
and storage capacity over traditional machines, but couldn’t do everything they can. Who says
you can’t have it all? Not Melina Kramer et al! In 2008, they created a hybrid device, combining
biological components and traditional silicon-based chips.47 With such hybrids, the biological
components can take over where they offer an advantage, while we still have old-school silicon
technology to handle tasks for which it’s better suited.
It’s hard to predict where this technology will lead. Should we be surprised that this brand
new field was created by someone outside of biology, thinking about it in his spare time? Not at
all, according to Adleman. As more and more fields of inquiry are reduced to mathematics, the
ability of a single person to comprehend large portions of it becomes more plausible. Adleman
predicted that,48
The next generation could produce a scientist in the old sense, a real generalist, who
could learn the physics, chemistry, and biology, and be able to contribute to all three
disciplines at once.
For a hundred years, it has seemed that science has been growing ever more complicated, but
perhaps mathematics will serve as a simplifying force. Andrew Wiles, the man who finally proved
Fermat’s Last Theorem, has made a similar comment about mathematics itself:49
Mathematics does sometimes give the impression of being spread over such a large area
that even one mathematician can’t understand another, but if you think back to 18th
century mathematics, most modern mathematicians would understand it all and in a
much more unified way than the 18th century mathematicians. I think this dispersion
that one senses is really just because we don’t understand it well enough yet and over
the next 200 years all our current methods and proofs will be simplified and people
will see it as a whole and it will be much easier. I mean nowadays most high school
students will study calculus. That would have been unthinkable in the 17th century,
but now it’s routine and that will happen to current mathematics in 300 years’ time.
46 Benenson, Yaakov, Rivka Adar, Tamar Paz-Elizur, Zvi Livneh, and Ehud Shapiro, "DNA Molecule Provides
a Computing Machine with Both Data and Fuel,” Proceedings of the National Academy of Sciences, Vol. 100,
No. 5, March 4, 2003 (submitted in 2002), pp. 2191–2196. For a popular account see Lovgren, Stefan,
“Computer Made from DNA and Enzymes,” National Geographic News, February 24, 2003, http://news.
nationalgeographic.com/news/2003/02/0224_030224_DNAcomputer.html.
47 Kramer, Melina, Marcos Pita, Jian Zhou, Maryna Ornatska, Arshak Poghossian, Michael Schoning, and
Evgeny Katz, “Coupling of Biocomputing Systems with Electronic Chips: Electronic Interface for Transduction
of Biochemical Information,” Journal of Physical Chemistry, Vol. 113, No. 6, February 12, 2009, pp. 2573–
2579. The work was done in 2008 but published in 2009.
48 Bass, Thomas A., “Gene Genie,” Wired, Vol. 3, No. 8, August 1995, pp. 114–117, 164–168.
49 Fermat’s Last Tango, Clay Mathematics Institute, Cambridge, Massachusetts, 2001, bonus feature May 24,
quantum key distribution (QKD) implementations always rely on detectors to measure the
relevant quantum property of single photons. Here we demonstrate experimentally that the
detectors in two commercially available QKD systems can be fully remote-controlled using
specially tailored bright illumination. This makes it possible to tracelessly acquire the full
secret key; we propose an eavesdropping apparatus built from off-the-shelf components. The
loophole is likely to be present in most QKD systems using avalanche photodiodes to detect
single photons. We believe that our findings are crucial for strengthening the security of practi-
cal QKD, by identifying and patching technological deficiencies.
The authors noted, “It’s patchable of course… just a question of time.”50
Shor, Peter W., “Algorithms for Quantum Computation: Discrete Logarithms and Factoring,” in Goldwasser,
Shafi, editor, Proceedings of the 35th Annual Symposium on Foundations of Computer Science, IEEE
Computer Society Press, Los Alamitos, California, 1994, pp. 124–134. This is a preliminary version
of the following reference.
Shor, Peter W., “Polynomial-time Algorithms for Prime Factorization and Discrete Logarithms on a
Quantum Computer,” SIAM Journal on Computing, Vol. 26, No. 5, 1997, pp. 1484–1509.
Wiesner, Stephen, “Conjugate Coding,” SIGACT News, Vol. 15, No. 1, Winter 1983, pp. 78–88. This paper
was written circa 1970. Wiesner was too far ahead of his time and couldn’t get it published until 1983!
On Post-Quantum Cryptography
Researchers didn’t wait for quantum computers to come on the market before attempting to develop systems
that can resist such machines. It’s not true that quantum computers can break any cipher. McEliece’s sys-
tem, alluded to briefly in Section 16.5, and some lattice-based systems, such as recent versions of NTRU,51
are believed to be secure against such machines… so far. A few references follow.
Bernstein, Daniel J., Johannes Buchmann, and Erik Dahmen, editors, Post-Quantum Cryptography.
Springer, Berlin, Germany, 2009.
Buchmann, Johannes and Jintai Ding, editors, Post-Quantum Cryptography: Second International Workshop
(PQCrypto 2008), Lecture Notes in Computer Science, Vol. 5299, Springer, Berlin, Germany, 2008.
Ding, Jintai and Rainer Steinwandt, editors, Post-Quantum Cryptography: 10th International Workshop,
(PQCrypto 2019), Lecture Notes in Computer Science, Vol. 11505, Springer, Cham, Switzerland,
2019.
50 “Quantum Hacking,” NTNU [Norwegian University of Science and Technology] Department of Electronics and
Telecommunications, http://web.archive.org/web/20120113024714/http://www.iet.ntnu.no/groups/optics/qcr/.
51 Hoffstein, Jeffrey, Jill Pipher, and Joseph H. Silverman, An Introduction to Mathematical Cryptography,
Springer, New York, 2008. Chapter 6 of this introductory text is focused on lattice-based cryptosystems,
including NTRU, which was created by the authors. Recall, though, that there are attacks on early versions of
this cipher. See https://en.wikipedia.org/wiki/NTRUEncrypt for some updates.
Ding, Jintai and Jean-Pierre Tillich, editors, Post-Quantum Cryptography: 11th International Workshop,
(PQCrypto 2020), Lecture Notes in Computer Science, Vol. 12100, Springer, Cham, Switzerland,
2020.
Gaborit, Philippe, editor, Post-Quantum Cryptography: 5th International Workshop, (PQCrypto 2013),
Lecture Notes in Computer Science, Vol. 7932, Springer, Berlin, Germany, 2013.
Koblitz, Neal and Alfred J. Menezes, “A Riddle Wrapped in an Enigma,” IEEE Security & Privacy, Vol. 14,
No. 6, November-December 2016, pp. 34–42, available online at https://eprint.iacr.org/2015/1018.pdf.
Lange, Tanja and Rainer Steinwandt, editors, Post-Quantum Cryptography: 9th International Workshop,
(PQCrypto 2018), Lecture Notes in Computer Science, Vol. 10786, Springer, Cham, Switzerland,
2018.
Lange, Tanja and Tsuyoshi Takagi, editors, Post-Quantum Cryptography: 8th International Workshop,
(PQCrypto 2017), Lecture Notes in Computer Science, Vol. 10346, Springer, Cham, Switzerland,
2017.
Mosca, Michele, editor, Post-Quantum Cryptography: 6th International Workshop, (PQCrypto 2014), Lecture
Notes in Computer Science, Vol. 8772, Springer, Cham, Switzerland, 2014.
Post-quantum Cryptography: International Workshop (PQCrypto 2006). The proceedings for this first
PQCrypto conference were not published, but the papers are, with only three exceptions, available
online by following a link from the conference’s webpage, https://postquantum.cr.yp.to/.
Post-Quantum cryptography, http://pqcrypto.org/. This website provides a “one-minute” introduction and
useful links.
Sendrier, Nicolas, editor, Post-Quantum Cryptography: Third International Workshop (PQCrypto 2010),
Lecture Notes in Computer Science, Vol. 6061, Springer, Berlin, Germany, 2010.
Takagi, Tsuyoshi, editor, Post-Quantum Cryptography: 7th International Workshop, (PQCrypto 2016),
Lecture Notes in Computer Science, Vol. 9606, Springer, Cham, Switzerland, 2016.
Takagi, Tsuyoshi, Masato Wakayama, Keisuke Tanaka, Noboru Kunihiro, Kazufumi Kimoto, and Dung
Hoang Duong, editors, Mathematical Modelling for Next-Generation Cryptography, CREST Crypto-
Math Project, Springer, Singapore, 2018.
Yang, Bo-Yin, editor, Post-Quantum Cryptography: 4th International Workshop, (PQCrypto 2011), Lecture
Notes in Computer Science, Vol. 7071, Springer, Berlin, Germany, 2011.
On DNA Computing
Adleman, Leonard M., “Molecular Computation of Solutions to Combinatorial Problems,” Science, Vol.
266, No. 5187, November 11, 1994, pp. 1021–1024, available online at https://www2.cs.duke.edu/
courses/cps296.5/spring06/papers/Adleman94.pdf. This is where it all began.
Adleman, Leonard M., “On Constructing a Molecular Computer,” in Lipton, Richard J. and Eric B.
Baum, editors, DNA Based Computers: DIMACS workshop, April 4, 1995, DIMACS Series in Discrete
Mathematics and Computer Science, Vol. 27, American Mathematical Society, Providence, Rhode
Island, 1996, pp. 1–22. Other papers in this important volume are referenced below.
Adleman, Leonard M., “Computing with DNA,” Scientific American, Vol. 279, No. 2, August 1998,
pp. 54–61. Another article on the same topic, by the same author, but aimed at a wider audience.
Adleman, Leonard M., Paul W. K. Rothemund, Sam Roweis, and Erik Winfree, “On applying molec-
ular computation to the Data Encryption Standard,” in Landweber, Laura F. and Eric B. Baum,
editors, DNA Based Computers II: DIMACS workshop, June 10-12, 1996, DIMACS Series in
Discrete Mathematics and Theoretical Computer Science, Vol. 44, American Mathematical Society,
Providence, Rhode Island, 1998, pp. 31–44.
Amos, Martyn, Theoretical and Experimental DNA Computation, Springer, Berlin, Germany, 2005.
Bass, Thomas A., “Gene Genie,” Wired, Vol. 3, No. 8, August 1995, pp. 114–117, 164–168. This is a lively
account of DNA computing that also gives the reader a glimpse into Adleman’s personality.
Benenson, Yaakov, Rivka Adar, Tamar Paz-Elizur, Zvi Livneh, and Ehud Shapiro, “DNA Molecule Provides
a Computing Machine with both Data and Fuel,” Proceedings of the National Academy of Sciences,
Vol. 100, No. 5, March 4, 2003, pp. 2191–2196.
Boneh, Dan, Christopher Dunworth, Richard J. Lipton, and Jiri Sgall, “On the Computational Power
of DNA,” Discrete Applied Mathematics, Vol. 71, No. 1–3, December 5, 1996, pp. 79–94, available
online at http://www.dna.caltech.edu/courses/cs191/paperscs191/bonehetal.pdf.
Boneh, Dan, Richard J. Lipton, and Christopher Dunworth, “Breaking DES Using a Molecular Computer,”
in Lipton, Richard J. and Eric B. Baum, editors, DNA Based Computers: DIMACS workshop, April 4,
1995, DIMACS Series in Discrete Mathematics and Computer Science, Vol. 27, American Mathematical
Society, Providence, Rhode Island, 1996, pp. 37–66. Here is the abstract:
Recently Adleman has shown that a small traveling salesman problem can be solved by molec-
ular operations. In this paper we show how the same principles can be applied to breaking the
Data Encryption Standard (DES). Our method is based on an encoding technique presented
by Lipton. We describe in detail a library of operations which are useful when working with
a molecular computer. We estimate that given one arbitrary (plain-text, cipher-text) pair, one
can recover the DES key in about 4 months of work. Furthermore, if one is given ciphertext,
but the plaintext is only known to be one of several candidates then it is still possible to recover
the key in about 4 months of work. Finally, under chosen cipher-text attack it is possible to
recover the DES key in one day using some preprocessing.
Devlin, Keith, “Test Tube Computing with DNA,” Math Horizons, Vol. 2, No. 4, April 1995, pp. 14–21.
This Mathematical Association of America (MAA) journal consists of articles easily accessible to
undergraduates. The article cited here is especially nice and provides a more detailed description of
the biochemistry than is given in this book.
Ignatova, Zoja, Israel Martinez-Perez, and Karl-Heinz Zimmermann, DNA Computing Models, Springer,
Berlin, Germany, 2008.
Kari, Lila, Greg Gloor, and Sheng Yu, “Using DNA to Solve the Bounded Post Correspondence Problem,”
Theoretical Computer Science, Vol. 231, No. 2, January 2000, pp. 192–203.
Lipton, Richard J., “DNA Solution of Hard Computational Problems,” Science, Vol. 268, No. 5210, April
28, 1995, pp. 542–545.
Lipton, Richard J., “Speeding Up Computation via Molecular Biology,” in Lipton, Richard J. and Eric B.
Baum, editors, DNA Based Computers: DIMACS workshop, April 4, 1995, DIMACS Series in Discrete
Mathematics and Computer Science, Vol. 27, American Mathematical Society, Providence, Rhode
Island, 1996, pp. 67–74.
Lovgren, Stefan, “Computer Made from DNA and Enzymes,” National Geographic News, February 24,
2003, http://news.nationalgeographic.com/news/2003/02/0224_030224_DNAcomputer.html. This
is a popular account of Benenson, Yaakov, Rivka Adar, Tamar Paz-Elizur, Zvi Livneh, and Ehud
Shapiro, “DNA Molecule Provides a Computing Machine with both Data and Fuel,” Proceedings of
the National Academy of Sciences, Vol. 100, No. 5, March 4, 2003, pp. 2191–2196.
Păun, Gheorghe, Grzegorz Rozenberg, and Arto Salomaa, DNA Computing: New Computing Paradigms,
Springer, New York, 1998.
Pelletier, Olivier and André Weimerskirch, “Algorithmic Self-Assembly of DNA Tiles and its Application to
Cryptanalysis,” 2001, available online at https://arxiv.org/abs/cs/0110009. In this paper, the authors
took steps towards a DNA computer attack on NTRU. They noted,
Assuming that a brute force attack can be mounted to break a key security of 2^40 the described
meet-in-the-middle attack in DNA might break systems with a key security of 2^80. However,
many assumptions are very optimistic for the near future. Furthermore we understand that
using a higher security level, e.g., a key security of 2^285 as proposed in [5] [the 1998 paper
proposing NTRU] puts public-key systems like NTRU far out of range for a successful cryptanalysis
in DNA.
Index
A
Abel, Rudolf, 94
Abstract Algebra, 8, 103, 223, 230n. 16, 415–416
Abzug, Bella, 353–354
Academia, 355, 423–421
Access Now, 526
Ace pilots, 274
Acoustical attack, 348, 574; see also Side channel attacks
Adair, Gilbert, 18–19
Adams, Abigail, 414
Adams, John, 414
Adams, Mike, 208
ADFGVX, 163, 166–169, 173, 182, 335–336
  references, 195, 196
ADFGX, 163, 166–168, 335–336
  cryptanalysis of, 168–182
Aditsan and Bisahalani, 551–552
Adleman, Len
  DNA computing, 579–583, 586, 587
  in group photo, 488
  and Loren M. Kohnfelder, 509n. 1
  Mailsafe, 510
  Merkle-Hellman knapsack cipher, 485–486
  NSA, 429
  primality testing, 473
  RSA, 417, 423, 430, 432
  universal scientist, 583
Adolf and Bertholt, 432
ADONIS, 305
Advanced Encryption Standard, see AES
“The Adventure of the Dancing Men,” 14–16, 53–54
Advertisement, 147, 188, 261–263
Aegean Park Press, 187
AES, 397, 407, 553, 554–564, 569, 577
  AddRoundKey, 561–563
  attacks on, 563–564, 566–567
  conferences/competition, 554–555, 564
  history of, 554–556, 559, 563
  in iPhone, 522
  irreducible polynomial, 559–560
  key size choices, 556
  MixColumns, 560–561
  and NSA, 553, 556n. 30, 563, 577
  references, 566–567
  ShiftRows, 559–560
  SubBytes (Rijndael S-box), 556–559, 560
  workings of, 556–563
AF, 274
Affine cipher, 33–37
Africa, 251, 468
Agony columns, 147
Agrawal, Manindra, 473–474, 476
Agrippa, Cornelius, 62
Aïda and Bernardo, 549–550
Airbus, 373–374
Air Force Security Agency, 346
ALA, see American Library Association
Alba, Dennis, 518
Alberti, Leon Battista, 10, 61, 64
Albion College, 60
Aleutian Islands, 274
Alexander’s Weekly Messenger, 11, 59
Alexandria, 450
Algebraic cryptography, 21
Algiers, 320
Alice and Bob, 430–432, 482, 549
Alicia and Beatriz, 549
All-America Cable Company, 184
Allen, Jr., Lew, 354
Alphabet of the Magi, 42–43
Alshamrani, Mohammed Saeed, 528
Altland, Nicholas, 190
Amazon.com, 526
American Association for the Advancement of Science (AAAS), 429
American Black Chamber, The (aka the Cipher Bureau), 183–186, 193, 346n. 3
American Black Chamber, The, 186–189, 197
American Civil Liberties Union, 526
American Council on Education, 428–429
American Cryptogram Association, xxv, 4–5, 21, 55n. 58, 78, 431n. 42
American Library Association, 193
American Mathematical Monthly (appears 10 times per year), 201
American Revolution, 45–47, 57, 124, 413–414
Ames, Aldrich, 357n. 45
Amsterdam, 512
Anagrams, 117–119
Anarchists, 120
Anderson, Jay, xxv
Anderson, Ross, 501, 506, 555
Anglo-Saxon, 7
Angooki Taipu A, see Red
Angooki Taipu B, see Purple
Aniuta and Busiso, 549
Annapolis, Maryland, 478
Anti-draft pamphlets, 193
Anti-war, 249, 415, 421, 517
Anti-war Council, 249
Apollo 13, 554
Apple, poisoned, 256
Apple computers, 485, 521–530
Arabic cryptology, 19, 54–55
Arc lengths, 547
Argentis, 9n. 13
Aristagoras of Miletus, 7
Arizona, 163, 283
Arkansas, 274
Arlington National Cemetery, 192, 276
Armed Forces Security Agency (AFSA), 346, 347, 349
Arms limitation, 499
Army Navy Journal, 121–122
Army Security Agency, 192, 346, 376
Army Signal Corps, 105, 121–122, 186, 263, 346n. 3
Arnault, François, 471–472
Arnold, Benedict, 48
Ars Magna, 123
Artificial Intelligence, 255–256
Artificial language, 119
ASCII, 419, 483–484, 487, 532
Asimov, Isaac, 332, 339, 341–342
Askri, 394
Assarpour, Ali, 122
Assassination, 45, 194, 274
Aston, Philip, 116–117
Aston, Sir George, 152
Asymmetric keys, 417
Asymptotically approaches, 8–9, 459, 467
Atari, 495
Atbash, 19
Atheists, 258, 327
AT&T, 93, 102, 309, 310, 373, 516
A-3 Scrambler, 310, 311, 312
Atlantis, 130
Atomic bomb, see Nuclear weapons/annihilation
Auckland, 512
Australia, 306, 373
Australian coastwatcher, 151
Authenticity, 408–409, 493
Autokey, 64, 79–80, 401, 411, 533
Avalanche effect, 335
Axiom, 472
Axioms, 147, 249
Aykroyd, Dan, 511

B
Babbage, Charles, xix, 66, 213
Babbington, Anthony, 44–45
Babel, 19
Back door
  AES, 559
  alleged in Crypto AG machines, 356–358, 360
  DES, 392
  government attempt to require, 520, 527
  iPhones, 522–523, 527, 528, 529–530
  via side channel, 361
Bacon, Kevin, 554
Bacon, Roger, 42
Bacon, Sir Francis, 129–131, 133–134, 136, 158–159
Bacon number, 554
Bacon’s cipher, 130, 133, 158–159
Bad Religion, 337, 342
Baez, Joan, 353
Bai, Shi, 447
Balkans, 518
Baltic Sea, 166
Baltimore Sun, The, 358
Balzac, Honoré de, 14
Bamford, James, 345–346, 360–361, 363, 365, 377
  discovery of Yardley manuscript by, 190
  government censorship, 192, 194
Bangalore, India, 573
Barkan, Robert, 424
Barlow, John P., 512
Barnes, Bruce, 429
Barr, William, 528–529
Bartek, Douglas J., 502
Baseball, 285
Base 2 pseudoprime, see Pseudoprime
Bass-O-Matic, 511, 512
Bataan Death March, 284
Battle of Wits, 224, 271, 272
Battlestar Galactica, 313
Baudot, J. M. E., 100
Baudot code, 98, 100
Bauer, Craig P., 31, 79, 101n. 48, 103, 207–208, 579
Bauer, Friedrich L., 107
Bazeries, Major Etienne, 136
BBVA Foundation Frontiers of Knowledge Award, 585
Beauregard, General Pierre, 8
Beautiful Mind, A, 42n. 48, 554
Beer, 501
Beethoven, 38, 541
Beijing, 575
  variants, 401, 503–504
  weak keys, 395
  workings of, 380–390
  workshops, 391–392, 393, 410
Desalination plant, 274
Desch, Joe, 259–260
Determinant of a matrix, 203, 209
Deterministic test, 471, 473–476, 477–479, 489, 497, 551
Deutsch, Harold, 253–254
Dewey, Thomas E., 187, 286
Dice, 261–263
Dickson, Paul, 424
Dictionary attack, 55, 120, 503, 519
Differential cryptanalysis, 394–395, 432n. 44, 512
Diffie, Whitfield, 406, 414–415, 417, 430, 432, 433, 509
  NSA Hall of Honor, 434
  objections to DES, 390, 391, 403, 410, 582–583
  pictures of, 415, 480, 488
  reaction to attempted intimidation, 426, 428–429
Diffie–Hellman key exchange, 414–417, 422, 445
  elliptic curve cryptography (ECC) version of, 549, 566
  quantum computer threat to, 576, 577, 578
Diffusion, 335–336, 389
Digital Fountain, 406
Digital Signature Algorithm (DSA), 445, 504–506, 513, 577
Digital signatures, 500, 553, 577
  DSA, 445, 504–507, 513
  Elgamal, 497–498, 513, 552
  RSA, 423, 445, 495–497, 513
Digital Signature Standard (DSS), see Digital Signature Algorithm (DSA)
Digraph frequencies, 20, 22
Digraphic cipher, 64, 147–148; see also Playfair cipher
Diophantus, 545
Dimitrov, Vassil S., 550–551
Diplomatic ciphers, 96, 311; see also Red; Purple
Diplomatic codes, 94, 183, 188, 196
Discrete log problem, 416, 422, 439, 486–487, 550, 553, 576
Disparation, La, 18–19
Disquisitiones Arithmeticae, 447
Distinguishing languages, 74
Distributed.Net, 397
Dixon, John D., xxv, 454, 459
DNA, double helical structure, 501, 573
DNA computers, 569, 579–583
  background, 579–580
  DES attack, 582, 587
  example, 580–582
  history, 579–580, 582–583
  programmable, 583
  references, 586–587
Dodgson, Charles, 26, 74, 104
Donnelly, Ignatius, 130–131
Donut, 546
Dooley, John F., 10n. 18, 54, 57, 191–192
Dorabella cipher, 33; see also Elgar, Edward
Double transposition, 116, 119–120, 126, 127, 182
Downey, Peter J., 502
Doyle, Arthur Conan, 10, 14–15, 16, 48, 53–54
Dr. Dennis F. Casey Heritage Center on Joint Base San Antonio, 306
Dropbox, 526
Drug, see Venona
DSA, 445, 504–506, 513, 577
DSD-1, 380; see also DES
DSS, see Digital Signature Algorithm (DSA)
Dublin, 134
Duffy, Austen, 399
Dunin, Elonka, 104
Dunlap, Jack, 364
Dunworth, Christopher, 582, 587
d’Urfé, Madame, 65
Durrett, Deanne, 284

E
e, 9, 110, 468, 519, 520
EARN IT, see Eliminating Abusive and Rampant Neglect of Interactive Technologies Act of 2020 (EARN IT), 530
Earth, circumference, 450
Easter Island, 16, 17
ECB, see Electronic Code Book Mode (ECB)
Echelon, 373
von Eckhardt, Felix, 163
ECM (Electric Cipher Machine) II, see SIGABA
e-commerce treaty, 500
Ecstasy, 518
Eddington, Sir Arthur, 249, 336
Edge (of a graph), 580
Education of a Poker Player, The, 189
EFF, see Electronic Frontier Foundation
Egypt, 443
Egyptians, 115, 348, 450, 486
Einstein, 249
Eisenhower, Dwight D., 304–305, 322n. 22, 356, 363
Ekert, Artur K., 574
Election, 186, 516, 575
Electric Cipher Machine, see SIGABA
Electromagnetic emanations, see TEMPEST
Electronic Battlefield, The, 424
Electronic Code Book Mode (ECB), 403, 409
Electronic Frontier Foundation (EFF), xixn. 1, 410, 511, 512, 526
  DES Cracker, 395–397, 401, 443, 583
Elgamal, Taher, 486
Elgamal encryption, 486–487, 504–505, 513, 551–552
  attack on, 444
  references, 491, 506
  signatures, 497–498
Hoover, J. Edgar, 263, 363
Horner, Captain E. W., 276
Hot line, between Washington, DC and Moscow, 94–95
Hotmail, 370
House Permanent Select Committee on Intelligence, 366–369
Huang, Ming-Deh A., 473
Huffman, David A., 39–40, 42
Huffman coding, 39–42
Human rights activists, 518
Humor, 256n. 65, 257, 340, 401, 462, 479, 512
  Journal of Craptology, 401, 432n. 44
  xkcd, 285, 390
Hungarian code talkers, 276
Hunnicutt, Tom “Captain T,” 274
Hurt, John B., 263, 264
Hutton, Timothy, 364
Huygens, Christian, 118
Hybrid DNA/silicon computer, 583
Hybrid system, 305, 473, 509, 510, 511, 520

I
Iberian Peninsula, 275
IBM, 349, 379–380, 391–393, 404, 552, 555, 577
IDEA, 512
IEEE, 423–427, 572–573
IEEE Symposium on Information Theory, 572–573
IEEE Transactions on Aerospace and Electronics Systems, 424, 426
IEEE Transactions on Information Theory, 482
Illinois, 133
iloveyou, 519
Immortality, 74
Index of Coincidence (IC), 67–72, 74, 81, 96, 172, 211, 264
Index of prohibited books, 62, 194
India, 356, 473–474, 573, 576
Indian Institute of Technology in Kanpur, 473–474
India Pale Ale, 501
Indigo, 268
Indonesia, 373
Induction, 184
Industrial and Commercial Bank of China (ICBC), 575–576
Indus Valley script, 16–17
Infinity, 467, 545
Information Assurance Directorate (IAD), 347, 577
Information theory, 327, 336, 340, 423, 425, 482, 573
  Nyquist, Harry, and, 317, 329n. 3
  Pynchon, Thomas, and, 337
Inglis, Chris, 365–366, 370–372
Initialization Vector (IV), 403, 404, 405, 407, 540–541
Inman, Bobby Ray, 306, 429, 433
Inouye, Captain Kingo, 191
Inquisition, 123
Institute for Advanced Study, Princeton, 327, 479
Institute for Advanced Study, Radcliffe, 521
Institute of Electrical and Electronics Engineers, see IEEE
International Data Encryption Algorithm (IDEA), see IDEA
International Organization for Standardization (ISO), xxi
International Traffic in Arms Regulations, see ITAR
Internet, 340, 367, 515, 517, 540, 579
  distributed attack using the, 397
  Great Internet Mersenne Prime Search (GIMPS), 476–477
  PGP distributed on the, 511
  RSA, and, 430
  similarity of to print media, 242
Introduction to Finite Fields and their Applications, 559
Invertible matrix
  in AES, 557–559, 561
  in matrix encryption, 201, 203, 204, 207, 208, 209, 210
Invisible inks, see Secret inks
iPads, 525
iPhones, 521–523, 527, 528, 529–530
IPsec, 499
IRA, see Irish Republican Army
Iran, 357, 367
Irreducible polynomial, 407, 536, 559, 560
Ireland, 114–116, 165, 500
Irish Republican Army, 114–116
ISO, see International Organization for Standardization
Isograms, 23, 55, 56
Isomorphs, 264–265
Israel, 364, 373n. 99, 378, 418, 555, 583
  Crypto AG, and, 357, 359
  USS Liberty, and, 350, 377
Italian language, 74, 331
Italy, 280n. 2, 555
ITAR, 423, 425–427
Ithaca, New York, 424
IV, see Initialization Vector (IV)
Iwo Jima, 276, 278, 281, 285

J
Jade, 97, 276; see also Venona (Jade was used as a codename for both a Japanese cipher and the decipherment of Soviet OTPs)
Jao, David, 578
Japan; see also Red; Purple; World War II
  attack on Hong Kong, 116
  one-time pad use, 94–95
  QKD Network in, 575
Japanese auto manufacturer, 373
Japanese ciphers, see Coral; Green; Jade; Orange; Purple; Red
Japanese codes; see also JN-25
  broken by Yardley, 185–186, 263
Japanese Diplomatic Secrets, 187–188, 192
NIST, see National Institute of Standards and Technology (NIST)
Nixon, Richard, 517
NKVD, 364
Nobel Prize, 402
Nodes, 575, 580
Nomenclator, 44–48, 49, 57, 134–135, 196
Non-cryptanalytic attacks, 347–348, 361, 564, 574
Noninvertible matrices in matrix encryption, 214
Non-pattern words, 23, 55, 56
Nonsecret encryption, see Public Key cryptography
Normandy, 274, 282
Norris, Mike, 423, 427
North Carolina State University, 21, 91, 200, 215–216
Northern Kentucky University, 245
North Korea, 350–352, 367
Not of interest, 53, 110n. 3, 377
Novak, Kayla, 211
Noyes, Rear Admiral Leigh, 296
NP-complete, 477–479, 483–484, 487, 489, 581
NP-hard, 478–479
NP (nondeterministic polynomial time), 417, 477–479, 572
NSA, see National Security Agency
NSASAB, 336, 533n. 1
NSF, see National Science Foundation
NTRU, 579, 585, 587
Nuclear weapons/annihilation, 98, 166, 311, 356, 423–424, 519, 520
  protests against, 415, 421, 513
  RSA, and, 512
Nulls, 44, 104, 116, 121, 126, 147
Nyman, Bertil, 466
Nyquist, Harry, 317, 329n. 3

O
Oakland, 320
Obama, Barack, 373, 520
OCB, see Offset Codebook Mode (OCB)
Odensholm, 166
Odom, Lieutenant General William E., 349
OFB, see Output Feedback Mode (OFB)
Office of Naval Research, 423
Office of Strategic Services, 15n. 27, 94, 186n. 30
Official Secrets Act, 194
Offset Codebook Mode (OCB), 406–409
Ohaver, M. E., 199, 214
Ohio, 253, 259–260
Oklahoma City, 517
O’Leary, Jeremiah, 165
Oligonucleotides, 581–582
Oliver, John, 527
OL-31, 320
Omnibus Crime Control and Safe Streets Act, 355
“On Computable Numbers,” 249–250
“On Digital Signatures and Public-Key Cryptosystems,” 423, 434, 463
$100,000 prize, 477
One-time pad, 92–96, 101–103, 116, 120
  breakable if misused, 96–98
  discovery of, 93–94, 98–100, 102–103
  Guevara, Ché, 95–96
  German use of, 94, 96
  Japanese use of, 94–95
  19th century discovery of, 93n. 27, 102–103
  OSS use of, 94
  quantum, 572n. 5, 575
  references, 105–106
  Soviet use of, 94, 96–98
  as unbreakable cipher, 92–94, 327, 533
  for voice, 311, 313, 318
One-way function, 393, 483, 502
Open problems, 23, 192, 212–213, 416, 422, 446; see also Unsolved ciphers
Optical Society of America (OSA), 574
Optics Express, 575
OP-20-G, 346
Opus 100, 341
Orange, 267–268, 287
Orgies, 183
Orwell, George, 424, 516
Osbourne, Ozzy, 26, 42
Osmussaar, 166
OSS, see Office of Strategic Services
Ossifrage, squeamish, 460
OTP, see One-time pad
Our Fighting Navy, 249
Output Feedback Mode (OFB), 405–406
Oyobi, 267
Ozeki, Naoshi, 191

P
P (polynomial time), 461, 473, 477–479, 551, 576, 581
P = NP, 417, 478–479, 572
P = NP, proof for special case of, 479
P ≠ NP, 479
Padding, 94, 408, 442, 444, 503, 510
Painvin, Georges, 168–169, 172–173, 182, 196, 198
País, El, 370
Paracelsus, 62
Parallel processing, 236, 245, 580, 582
Paris, 168, 320
Parrish, Thomas, 255
Party line, 310n. 3
Passwords
  hacked, 519
  hash functions and, 500–504
  key encryption, and, 500–504, 518
  selection of, 518–520
Pattern words, 20–21, 23, 30, 55–56, 60; see also Non-pattern words
Preimage computation, 498, 501, 502
Pretty Good Privacy (PGP), see PGP
Primality testing
  AKS algorithm, 473–476
  definition, 468
  deterministic, 473–477
  elliptic curve, 473, 565
  Fermat test, 468–470
  Miller–Rabin–Selfridge test, 465–473
  Miller–Rabin test, 470–473
  Rabin–Miller test, 470–473
  references, 488–489, 565
  strong pseudoprimality test, 470–473
Prime numbers, 465–468; see also Factoring algorithms; Primality testing; RSA Factoring challenge
  arbitrarily large runs without any, 466
  in Diffie–Hellman key exchange, 415
  in DSA, 504, 505
  in ECC, 549
  in Elgamal, 487, 497
  in Fermat’s little theorem, 418
  generation of, 473
  humor, 462
  infinitely many (proof), 465–466
  Mersenne primes, 466, 476–477, 565
  in RSA, 419, 432
  in RSA attacks, 435, 439, 441, 445, 496, 497, 520
  references, 488–489
  top 10 largest known, 466, 476–477
Prime number theorem, 467
Primitive polynomial, 536–537
Primitive root (aka generator), 487, 504
Princeton’s Institute for Advanced Study, 327, 479
Princeton University, 201, 250, 285, 327
Printer Codes, xix
PRISM, 370
Prisoner of war, 38, 116–117
Privacy, see Laws (actual and proposed)
Privacy lock, 310
Prize money, 133, 397, 446, 472, 476–477, 479, 585
PRNG, see Pseudorandom number generator
Probable Word, see Cribs
Proceedings of the Engineers’ Club of Philadelphia, 75
Project X, 311; see also SIGSALY
Project X-61753, 311; see also SIGSALY
Propaganda, failed, 350–351
Proto, Richard “Rick”, 433, 434
Protocols, 430–431, 436, 499, 545, 575
Prozess, Der, 357
Psalm 46, 131–133
Pseudoprime, 469–470
Pseudorandom number generator, 98, 533; see also Stream cipher
Psychological method, 230, 234, 236, 519
PT Boat, 151–152
P-38 Lightnings, 274
Public Key cryptography; see also Diffie–Hellman key exchange; Elgamal encryption; Elliptic Curve Cryptography; RSA
  classified discovery at GCHQ, 432
  developers, group photo of, 488
  knapsack encryption, 482–486, 490, 499
  knapsack problem, 478
  linear codes, 478, 487, 490, 565, 566, 585
  McEliece system, 478, 487, 490, 585
  Merkle’s first scheme, 479–482, 490
  prehistory, 413–414
  puzzle scheme, 479–482, 490
  references, 433–434, 490–491
Public Key Cryptography Standard #1 (PKCS #1), 442–443
Punitive expedition, 165
Purdy, Anthony, 338–339
Purple, 106, 186, 268–276, 286–287, 311, 372–373
  analog, 271, 273, 372–373
  cryptanalysis of, 270–273
  fragment of, 275–276
  intelligence from, 273, 274–276
  keyspace, 269–270
  period of, 269
  references, 286–287
  schematics, 268
  workings of, 268–270
Puzzle Palace, The, 192
Pyle, Joseph Gilpin, 131
Pynchon, Thomas, 337–338, 341–342
Pyrenees, 275

Q
QL69.C9, 193
QKD Network, see Quantum Key Distribution Network (QKD Network)
Quadratic sieve, 459–460
Quagmire III, 78
Quaker, 45
Quantum computers, 105, 461, 576–579, 584–585
Quantum Cryptography
  background, 569
  devices, 573–576
  example, 569–571
  history, 571–576
Quantum Key Distribution Network (QKD Network), 575–576, 577, 584–585
Quantum repeater, 574
Queen Elizabeth, 44–45
Queneau, Raymond, 339

R
Rabin, Michael O., 471–472
Radcliffe, 296
Rader, Dennis, xix, 31
Torture, 16, 38, 256, 283, 284, 350, 352
Townsend, Robert, 45, 47, 57
Traicté des chiffres, 64–65, 411
Training exercise, 264–267
Traitors, 363–370
Transatlantic cable, 184
Transient Electromagnetic Pulse Emanation Standard, see TEMPEST
Transponders, 424, 426
Transport Layer Security (TLS), 499, 541
Transposition cipher, 78, 101, 122, 190, 389; see also Anagrams; Cardano grille; Skytale
  ADFGX and ADFGVX use of columnar, 166–168, 182, 335–336
  columnar, 110–117, 147
  double, 116, 119–120, 126–127, 182
  rail fence, 107–108
  rectangular, 108–110
  references, 126–127, 195, 196
  Rubik’s cube, 127
  word transposition, 120–121
Traveling salesman problem, 478, 587
Treatise on the Astrolabe, 42
Treaty, 355, 500
Trinity Churchyard, 26, 28
Triple DES, 397, 398, 401, 532
Trithemius, Johannes, 57, 62–64
Tromer, Eran, 461
Truman, Harry S., 276, 346, 347
Trump, Donald, 527
TSEC/KL-7 (ADONIS/POLLUX), 305
Tsosie, Harry, 283
Tuchman, Walter, 391, 392, 393, 410
Tuition refund, 451
Tunny, 252
Turing, Alan, 247–251, 255–259, 319–320, 324, 327, 579
Turing, John F., 256, 259
Turing, Sara, 256, 259
Turing, Sir Dermot, xxv, 256
Turing machine, 249–250
Turing test, 256
Turning grille, see Cardano grille
Tutti Frutti, 114
Twenties, 268, 269, 270
TWIRL, 461
Twitter, 526
Twofish, 555, 564
$200,000 prize, 446

U
UCLA, 477
UKUSA, 372–373
Ultra Americans, The, 255
Ultra Secret, 255, 286
Ultra Secret, The, 259
Ulysses, 194
Unabomber (Kaczynski, Theodore), 30, 31, 57, 116
Unconstitutional, 365, 426
Uncounterfeitable currency, 571
Undergraduate contributions
  AKS primality test, 474
  first hybrid system, 509
  first public key system, 414n. 4, 479–482
  matrix encryption attacks, 207, 211–212
  Poe challenge solved, 60
  reconstruction of Polish work on Enigma, 245, 258
  running key cipher attack, 91
  timing attack, 443
  Vigenère Cipher on a TI-83, 104
Unforgeable subway tokens, 571, 584
Unicity point, 92, 119, 148, 152, 327, 335, 340
Unicycle, 327
United Nations, xxii
Universal language, 119
Universal machine, 250, 583
Universal Product Code (UPC), xix
Universal scientist, 583
University of Alberta, 338
University of California, Berkeley, 480
University of California, Davis, 406
University of California, Los Angeles, 477
University of California, San Diego, 406
University of California, Santa Barbara, 405, 485
University of Cambridge, 501, 525
University of Nevada, 406
University of Virginia, 146
University of Warsaw, Poland, 337
University of Washington, 553
University of Waterloo, 53, 553
Unix, 502
Unsolved Ciphers, 31, 33, 75–79, 101, 251
UPC, see Universal Product Code
U.S. Customs Department, 513
USS Liberty, 349–350, 377
USS Pueblo, 349–352, 377, 378
U-2 spy plane, 94, 350n. 18

V
Vader, Darth (similar to the German Der Vater - a codename?), 125–126
de la Vallée-Poussin, C. J., 467
Valley of Fear, The, 48
Van Assche, Gilles, 500
Van Eck, Wim, 347
Van Eck phreaking, 347; see also TEMPEST
Vanstone, Scott, 507, 553
Vault, 263
Venona, 96–97, 105, 355
Ventura, California, 555
de Vere, Edward, 133–134, 158
Vernam, Gilbert, 93–94, 98–100, 102, 105
Vernam cipher, see One-time pad