
Cryptography


Shannon published just one paper in cryptography, namely "Communication theory of secrecy systems" [Sha49b]. Its contents had appeared in a wartime classified Bell Laboratories document which was then declassified. The opening sentence is very revealing. It reads as follows:

The problems of cryptography and secrecy systems furnish an interesting application of communication theory.

Indeed, this is precisely the point of view which inspired the authors of this book! We believe it is unrealistic to separate the study of cryptography from the study of communication theory embodying error‐correction and information theory.

To illustrate this, Shannon points out that just as in error-correction, where the receiver tries to decode a message sent over a noisy channel, so also in cryptography a receiver (this time Eve, the eavesdropper) tries to decode the message over a noisy channel, the "noise" being the scrambling by the key, which obfuscates the plaintext into the ciphertext.

In this paper, Shannon discusses at length his two famous principles of confusion and diffusion, described in detail in Chapter 4. He also treats the Vernam cipher, which offers perfect security. We discuss perfect security in detail in Part II of the book, where it is shown that, under appropriate conditions, perfect security corresponds precisely to a Latin square. Shannon's paper makes it quite clear that he was aware of this phenomenon, though he did not explicitly state it.
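To make the Vernam cipher concrete, here is a minimal Python sketch (the function names are ours, chosen for illustration): each plaintext byte is XORed with a byte of a truly random key that is as long as the message and never reused. Restricted to single bits, the XOR table is a 2 × 2 Latin square, the simplest instance of the correspondence with perfect security noted above.

```python
import secrets

def vernam_encrypt(plaintext: bytes, key: bytes) -> bytes:
    # Perfect secrecy requires a truly random key, as long as the
    # message and never reused (a one-time pad).
    assert len(key) == len(plaintext)
    return bytes(p ^ k for p, k in zip(plaintext, key))

# Decryption is the same XOR, since (p ^ k) ^ k == p.
vernam_decrypt = vernam_encrypt

message = b"ATTACK AT DAWN"
key = secrets.token_bytes(len(message))   # fresh random pad
ciphertext = vernam_encrypt(message, key)
assert vernam_decrypt(ciphertext, key) == message
```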

In the paper, Shannon clearly differentiates between computational and unconditional security. Whether or not he "missed" public key cryptography is far from clear. However, in [Mas02] Massey points out that Hellman, of Diffie–Hellman fame, has credited the following words from Shannon's paper as the inspiration for their discovery:

The problem of good cipher design is essentially one of finding difficult problems. We may construct our cipher in such a way that breaking it is equivalent to the solution of some problem known to be laborious.

Of course, the jury is still out, as Massey [Mas02] points out, on whether one‐way functions, the foundations of public key cryptography, really exist. We refer to Chapters 3 and 4 on this point.

Shannon theory: information compression and communication

Shannon's revolutionary 1948 paper on information theory, "A mathematical theory of communication," electrified the scientific world and has dominated the area of communication theory for over 50 years. No other work of the twentieth century has had a greater impact on science and engineering.

First of all, Shannon unified what had been a diverse set of communication media – voice, data, telegraphy, and television. He quantified and explained exactly what information means. The unit of information is the Shannon bit. As Golomb et al. [GBC+02] so elegantly put it, this is the "amount of information gained (or entropy removed) upon learning the answer to a question whose two possible answers were equally likely, a priori."

In the above, we can think of entropy as "uncertainty," analogous to entropy in physics (which is the key idea in the second law of thermodynamics). An example would be the tossing of a fair coin and learning which turned up – heads or tails. If the coin were biased, so that the probability of a head was $p$ (and the probability of a tail was $1-p$) with $p \neq \frac{1}{2}$, the information gained, on learning the outcome of the toss, would be less than one. The exact amount of information gained would be

$$-p\log_2 p - (1-p)\log_2(1-p) \qquad (1.1)$$

Note that when $p = \frac{1}{2}$ and $1-p = \frac{1}{2}$, this works out to be 1. However if, for example, $p = \frac{1}{3}$, we gain only approximately 0.918 Shannon bits of information on learning the outcome of the coin toss.
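As a quick numerical check of Formula (1.1), here is a minimal Python sketch (the function name h2 is ours):

```python
from math import log2

def h2(p: float) -> float:
    """Binary entropy of Formula (1.1): Shannon bits gained on
    learning the outcome of a coin toss with head-probability p."""
    if p in (0.0, 1.0):  # a certain outcome carries no information
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

print(h2(1 / 2))  # 1.0      -- fair coin: exactly one Shannon bit
print(h2(1 / 3))  # 0.918... -- biased coin: less than one bit
```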

It can be mathematically proven that the only information function giving sensible results is the appropriate generalization of Formula (1.1) above to a probability distribution. Formula (1.1) ties in to the fundamental notion of entropy (or uncertainty). There are many examples of redundancy in the English language, i.e. the use of more letters, words, or phrases than are necessary to convey the information content being transmitted. As Shannon points out, it is the existence of redundancy in the language that makes crosswords possible.
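That generalization assigns to a distribution $p_1, \ldots, p_n$ the entropy $-\sum_i p_i \log_2 p_i$. A minimal sketch, with a toy four-symbol distribution of our own choosing to show how non-uniformity (one source of redundancy) lowers the entropy below the $\log_2 4 = 2$ bits of the uniform case:

```python
from math import log2

def entropy(dist: list[float]) -> float:
    """Shannon entropy, in bits, of a probability distribution."""
    assert abs(sum(dist) - 1.0) < 1e-9  # probabilities must sum to 1
    return -sum(p * log2(p) for p in dist if p > 0)

print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits: uniform, no redundancy
print(entropy([0.7, 0.1, 0.1, 0.1]))      # ~1.357 bits: skewed, compressible
```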

This redundancy can be reduced in various ways, for example by writing acronyms such as "U.S." for "United States." When information is to be electronically transmitted, we remove redundancy by data compression. Shannon's formula for data compression is intimately related to entropy, which is in turn related to the average number of yes–no questions needed to pin down a fact. Shannon showed that it is possible to obtain a bound for the maximum compression, and that this bound is the best possible. The actual technique for compressing to that ultimate degree is embodied in the construction of the so-called Huffman codes, well known to all computer science undergraduates. Later, other compression techniques followed, leading to modern technologies used in, for example, MP3 music compression. This part of Shannon's work is also connected to the later work of Kolmogorov on algorithmic complexity and the minimum length of a binary program needed for a Turing machine to print out a given sequence.
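To make the Huffman construction concrete, here is a minimal Python sketch (function and variable names are ours): repeatedly merge the two least-frequent subtrees, prefixing their codewords with 0 and 1, until one tree remains. The result is an optimal prefix code in which frequent symbols get short codewords.

```python
import heapq
from collections import Counter

def huffman_code(text: str) -> dict[str, str]:
    """Build an optimal prefix code for the symbol frequencies in text."""
    freq = Counter(text)
    if len(freq) == 1:  # degenerate one-symbol alphabet
        return {sym: "0" for sym in freq}
    # Heap entries: (weight, tiebreaker, {symbol: codeword-so-far}).
    heap = [(w, i, {sym: ""}) for i, (sym, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)  # two least-frequent subtrees
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + cw for s, cw in c1.items()}
        merged.update({s: "1" + cw for s, cw in c2.items()})
        heapq.heappush(heap, (w1 + w2, tie, merged))
        tie += 1
    return heap[0][2]

code = huffman_code("abracadabra")
print(code)  # 'a' (the most frequent symbol) gets the shortest codeword
print("".join(code[ch] for ch in "abracadabra"))  # the compressed bit string
```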

But this was only the beginning. Shannon then went on to prove his fundamental result on communication, based on entropy and the mathematical ideas delineated above. He showed that any given communications channel has a maximum capacity for reliably transmitting information, which he calculated. One can approach this maximum by certain coding techniques – random coding, and now turbo coding – but one can never quite reach it. To put it succinctly: capacity is the bound to error-free coding. Thus, for the last 50 years, the study of error-correction has boiled down to attempts to devise encoding techniques that come close to the Shannon capacity. We will have much to say about this bound in Parts II and III of this book.
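For the standard textbook example of a binary symmetric channel, which flips each transmitted bit independently with probability $p$, the capacity works out to $C = 1 - H(p)$, where $H$ is the binary entropy of Formula (1.1). A minimal Python sketch:

```python
from math import log2

def bsc_capacity(p: float) -> float:
    """Capacity, in bits per channel use, of a binary symmetric
    channel that flips each bit independently with probability p."""
    if p in (0.0, 1.0):  # deterministic channel: nothing is lost
        return 1.0
    h = -p * log2(p) - (1 - p) * log2(1 - p)  # binary entropy H(p)
    return 1.0 - h

print(bsc_capacity(0.0))   # 1.0: a noiseless channel carries one bit per use
print(bsc_capacity(0.11))  # ~0.5: about half a bit per use survives the noise
print(bsc_capacity(0.5))   # 0.0: output independent of input
```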

Shannon's work, theoretical and practical, still dominates the field and the landscape. To quote Cover in [Cov02]:

This ability to create new fields and develop their form and depth surely places Shannon in the top handful of creative minds of the century.

Few can disagree with this assessment. Indeed, in Part III of this book, we describe protocols in cryptography and error‐correction based squarely on C.E. Shannon's work in information theory.
