Cryptography: the story on HearLore
Cryptography
The first known use of cryptography dates back to ancient Egypt, where carved ciphertext was discovered on stone, though it may have been created merely for the amusement of literate observers rather than as a functional tool for concealing information. This ancient practice laid the groundwork for a discipline that would eventually evolve from simple letter substitutions to the complex mathematical systems securing the modern internet. Before the modern era, cryptography focused almost exclusively on message confidentiality: converting readable information into unintelligible text that could be reversed only by those possessing the secret key. The sender of an encrypted message shared the decryption technique only with the intended recipient to preclude access by adversaries, a practice that remained largely unchanged for millennia until the advent of rotor cipher machines around World War I and the computer revolution of World War II. The cryptography literature conventionally names the sender Alice, the intended recipient Bob, and the eavesdropping adversary Eve, a standardized cast of characters for reasoning about secure communication. For most of its history the field was effectively synonymous with encryption: converting plaintext to ciphertext that could be read only by reversing the process, with security resting on the secrecy of the key rather than the secrecy of the algorithm.
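The Alice-and-Bob pattern described above can be sketched in a few lines of Python. This is an illustrative toy, not any historical cipher: it XORs the message with a random shared key of the same length (a one-time-pad-style construction), so Eve, who sees only the ciphertext, learns nothing without the key.

```python
# Illustrative sketch only: Alice and Bob share a secret key; Eve sees
# only the ciphertext. XOR with a random key as long as the message
# acts like a one-time pad.
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    # XOR each message byte with the corresponding key byte.
    return bytes(d ^ k for d, k in zip(data, key))

plaintext = b"MEET AT NOON"
key = secrets.token_bytes(len(plaintext))    # the shared secret
ciphertext = xor_bytes(plaintext, key)       # what Eve intercepts
recovered = xor_bytes(ciphertext, key)       # Bob applies the same key
assert recovered == plaintext
```

Because XOR is its own inverse, the same operation both encrypts and decrypts, which is why possession of the key is the only thing separating Bob from Eve.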
Classical Ciphers and Ancient Codes
The Greeks of Classical times are said to have known of ciphers such as the scytale transposition cipher, which was claimed to have been used by the Spartan military to rearrange the order of letters in a message. An early substitution cipher known as the Caesar cipher replaced each letter in the plaintext with a letter three positions further down the alphabet, a method Suetonius reports that Julius Caesar used to communicate with his generals. In India, the 2000-year-old Kama Sutra of Vātsyāyana speaks of two different kinds of ciphers called Kautiliyam and Mulavediya, where the cipher letter substitutions were based on phonetic relations or pairing letters to use reciprocal ones. The Arab mathematician and polymath Al-Kindi wrote a book on cryptography entitled Risalah fi Istikhraj al-Mu'amma, which described the first known use of frequency analysis cryptanalysis techniques, a method that would eventually render nearly all classical ciphers vulnerable to an informed attacker. David Kahn notes in The Codebreakers that modern cryptology originated among the Arabs, the first people to systematically document cryptanalytic methods, with Al-Khalil writing the Book of Cryptographic Messages in the 8th century. The development of the polyalphabetic cipher by Leon Battista Alberti around the year 1467 introduced the use of different ciphers for various parts of a message, a significant innovation that made frequency analysis much less effective. Despite these advancements, breaking a message without using frequency analysis essentially required knowledge of the cipher used and perhaps of the key involved, making espionage, bribery, and burglary more attractive approaches to the cryptanalytically uninformed.
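The Caesar cipher and the frequency analysis that defeats it are both simple enough to show directly. The sketch below (all names are illustrative) shifts each letter three positions down the alphabet, as Suetonius describes, and then counts ciphertext letters the way Al-Kindi's method would: in a longer English message, the most frequent ciphertext letter likely stands for a common plaintext letter such as E, T, or A.

```python
# A sketch of the Caesar cipher (shift of 3) and the letter counting
# behind frequency analysis; function names here are illustrative.
import string
from collections import Counter

def caesar(text: str, shift: int) -> str:
    # Map each uppercase letter to the one `shift` positions later.
    rotated = string.ascii_uppercase[shift:] + string.ascii_uppercase[:shift]
    return text.upper().translate(str.maketrans(string.ascii_uppercase, rotated))

ciphertext = caesar("ATTACK AT DAWN", 3)   # -> "DWWDFN DW GDZQ"
plaintext = caesar(ciphertext, -3)         # shifting back recovers the message

# Frequency analysis: tally ciphertext letters. Here "D" dominates
# because it encrypts the common plaintext letter "A".
counts = Counter(c for c in ciphertext if c.isalpha())
print(counts.most_common(3))
```

A shift cipher has only 25 usable keys, so even without frequency counts it falls to trying every shift, which is why such substitution schemes offered no security once the method was known.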
Common questions
When did the first known use of cryptography occur and where was it found?
The first known use of cryptography dates back to ancient Egypt, where carved ciphertext was discovered on stone. This ancient practice may have been created merely for the amusement of literate observers rather than as a functional tool for concealing information.
Who developed the Caesar cipher and how did it function?
Suetonius reports that Julius Caesar used the Caesar cipher to communicate with his generals. This early substitution cipher replaced each letter in the plaintext with a letter three positions further down the alphabet.
What was the significance of the Enigma machine during World War II?
The Enigma machine, used by the German government and military from the late 1920s and during World War II, implemented a complex electro-mechanical polyalphabetic cipher. Poland's Cipher Bureau broke and read Enigma traffic for seven years before the war, and the subsequent decryption effort at Bletchley Park was important to the Allied victory.
When was public-key cryptography proposed and who developed the RSA algorithm?
Whitfield Diffie and Martin Hellman proposed the notion of public-key cryptography in a groundbreaking 1976 paper. The race to find a practical public-key encryption system was finally won in 1978 by Ronald Rivest, Adi Shamir, and Len Adleman, whose solution has since become known as the RSA algorithm.
What legal consequences did the Digital Millennium Copyright Act have for cryptographers?
In 1998, U.S. President Bill Clinton signed the Digital Millennium Copyright Act, which criminalized all production, dissemination, and use of certain cryptanalytic techniques and technology. This led to cases like that of Dmitry Sklyarov, who was arrested during a visit to the US from Russia and jailed for five months pending trial for alleged violations of the DMCA.
How does quantum computing threaten current cryptographic standards?
Estimates suggest that a quantum computer could reduce the effort required to break today's strongest RSA or elliptic-curve keys from millennia to mere seconds. To mitigate this quantum threat, researchers are developing quantum-resistant algorithms whose security rests on problems believed to remain hard for both classical and quantum computers.
The Machine Age of World War II
The Enigma machine, used by the German government and military from the late 1920s and throughout World War II, implemented a complex electro-mechanical polyalphabetic cipher. Cryptanalysis of the new mechanical ciphering devices proved both difficult and laborious; in the United Kingdom, wartime efforts at Bletchley Park spurred the development of more efficient means of carrying out repetitive codebreaking tasks, culminating in the Colossus, the world's first fully electronic, digital, programmable computer, which assisted in the decryption of ciphers generated by the German Army's Lorenz SZ40/42 machine. The breaking and reading of the Enigma cipher at Poland's Cipher Bureau for seven years before the war, and the subsequent decryption at Bletchley Park, was important to the Allied victory, demonstrating the critical role of codebreaking in the outcome of global conflicts. The ciphers implemented by the better machine designs of the era brought about a substantial increase in cryptanalytic difficulty, yet the sheer volume of encrypted traffic demanded new methods to keep pace. The history of this era is marked by intense competition between those who built the machines and those who sought to break them, a struggle that would eventually lead to the birth of modern computer science and the mathematical foundations of cryptography.
The Mathematical Revolution of 1976
In a groundbreaking 1976 paper, Whitfield Diffie and Martin Hellman proposed the notion of public-key cryptography, in which two different but mathematically related keys are used: a public key and a private key. A public-key system is constructed so that calculating one key from the other is computationally infeasible, even though they are necessarily related, a concept that historian David Kahn described as the most revolutionary new idea in the field since polyalphabetic substitution emerged in the Renaissance. The race to find a practical public-key encryption system was won in 1978 by Ronald Rivest, Adi Shamir, and Len Adleman, whose solution, first circulated as an MIT technical memo in April 1977, has since become known as the RSA algorithm. Ralph Merkle was working on similar ideas at the time but encountered publication delays, and Hellman has suggested that the technique should be called Diffie-Hellman-Merkle asymmetric key cryptography. A document published in 1997 by the Government Communications Headquarters (GCHQ) revealed that its cryptographers had anticipated several of these academic developments: James H. Ellis conceived the principles of asymmetric-key cryptography around 1970, and Clifford Cocks invented a solution very similar in design rationale to RSA in 1973. This mathematical revolution transformed cryptography from an art based on linguistic patterns into a science grounded in abstract mathematics, including information theory, computational complexity, statistics, and number theory.
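The core idea of RSA fits in a few lines once the number theory is in place. The sketch below is textbook RSA with tiny, deliberately insecure primes, included purely to show how the public key (e, n) and private key (d, n) relate: encryption and decryption are both modular exponentiation, and d is the modular inverse of e.

```python
# Textbook RSA with tiny primes -- for illustration only. Real RSA
# uses primes hundreds of digits long plus randomized padding.
p, q = 61, 53            # two secret primes
n = p * q                # public modulus: 3233
phi = (p - 1) * (q - 1)  # Euler's totient of n: 3120
e = 17                   # public exponent, coprime to phi
d = pow(e, -1, phi)      # private exponent: inverse of e mod phi (Python 3.8+)

message = 65                       # any number smaller than n
ciphertext = pow(message, e, n)    # encrypt with the public key (e, n)
decrypted = pow(ciphertext, d, n)  # decrypt with the private key (d, n)
assert decrypted == message
```

Anyone can encrypt with the public pair (e, n), but recovering d requires knowing phi, which in turn requires factoring n into p and q, the hard problem on which RSA's security rests.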
The Shadow of the NSA and the Clipper Chip
The National Security Agency was involved with the design of the Data Encryption Standard (DES) during its development at IBM and its consideration by the National Bureau of Standards as a possible federal standard for cryptography. DES was designed to be resistant to differential cryptanalysis, a powerful and general cryptanalytic technique known to the NSA and IBM that became publicly known only when it was rediscovered in the late 1980s by Biham and Shamir. Another instance of the NSA's involvement was the 1993 Clipper chip affair, an encryption microchip intended to be part of the Capstone cryptography-control initiative, which was widely criticized by cryptographers for two reasons: the cipher algorithm, called Skipjack, was then classified, a violation of Kerckhoffs's principle that a system's security should not depend on the secrecy of its algorithm; and the scheme included a special escrow key held by the government for use by law enforcement, raising concerns that the NSA had deliberately weakened the cipher to assist its intelligence efforts. The affair illustrates the difficulty of determining what resources and knowledge an attacker might actually have, as well as the tension between national security and the privacy rights of citizens. The Clipper chip initiative lapsed long before Skipjack was declassified in 1998, but the controversy highlighted the growing power of cryptography and the government's desire to retain access to encrypted communications.
The Legal Battles Over Encryption
In 1998, U.S. President Bill Clinton signed the Digital Millennium Copyright Act (DMCA), which criminalized the production, dissemination, and use of certain cryptanalytic techniques and technology, specifically those that could be used to circumvent digital rights management schemes. This had a noticeable impact on the cryptography research community, since an argument can be made that almost any cryptanalytic research violated the DMCA. It led to cases like that of Dmitry Sklyarov, who was arrested during a visit to the US from Russia and jailed for five months pending trial for alleged DMCA violations arising from work he had done in Russia, where that work was legal. The 1995 case Bernstein v. United States ultimately resulted in a 1999 decision that printed source code for cryptographic algorithms and systems was protected as free speech by the United States Constitution, a landmark ruling that challenged the government's ability to regulate the export of cryptography. In the United Kingdom, the Regulation of Investigatory Powers Act gives police the power to force suspects to decrypt files or hand over passwords that protect encryption keys; failure to comply is an offense punishable on conviction by a two-year jail sentence, or up to five years in cases involving national security. The 2016 FBI-Apple encryption dispute concerned the ability of U.S. courts to compel manufacturers' assistance in unlocking cell phones whose contents are cryptographically protected, a conflict that continues to shape the legal landscape of digital privacy and security.
The Quantum Threat and Future Horizons
Estimates suggest that a quantum computer running Shor's algorithm could reduce the effort required to break today's strongest RSA or elliptic-curve keys from millennia to mere seconds, rendering insecure the versions of protocols such as TLS that rely on those keys. To mitigate this quantum threat, researchers are developing quantum-resistant algorithms whose security rests on problems believed to remain hard for both classical and quantum computers, a field known as post-quantum cryptography. A 2017 review in Nature surveys the leading post-quantum families, including lattice-based, code-based, multivariate-quadratic, and hash-based schemes, and stresses that standardization and deployment should proceed well before large-scale quantum machines become available. Some cryptographic system designers are already accounting for the potential impact of quantum computing, making the need for preemptive caution rather more than merely speculative. Lightweight cryptography has also emerged as a critical area of research, concerned with algorithms designed for strictly constrained environments such as the Internet of Things, where tight limits on power consumption and processing power must be met without sacrificing security. The field continues to evolve, with new algorithms such as Keccak, selected as the SHA-3 hash standard in 2012, and the ongoing development of secure communication protocols that can withstand the computational power of future quantum machines.