Digital Communications 1 - Safwan El Assad

2.3. Uncertainty, amount of information and entropy (Shannon’s 1948 theorem)


The realization of an event x of probability p(x) conveys a quantity of information h(x) related to its uncertainty. h(x) is an increasing function of its improbability 1/p(x):

h(x) = f(1/p(x))
If an event x is certain, then p(x) = 1, the uncertainty is zero and therefore the quantity of information h(x) brought by its realization is null.

Moreover, let us consider the realization of a pair of independent events x and y. The amount of information brought by their joint realization is the sum of the quantities of information brought by each of them:

h(x, y) = h(x) + h(y) [2.11]

but h(x, y) can also be written:

h(x, y) = f(1/p(x, y)) = f(1/(p(x) p(y))) [2.12]

So, from the relationships [2.11] and [2.12], f must satisfy f(uv) = f(u) + f(v): the function f is of logarithmic type, and the base a of the logarithm can be arbitrary:

h(x) = λ loga(1/p(x)) = -λ loga p(x) [2.13]

By convention, we select, as the unit of measure of information, the information obtained by the random selection of a single event out of two equally probable events, pi = pj = 1/2. In this case, we can write:

h(xi) = h(xj) = λ loga(2)
If we choose the logarithm in base 2, λ becomes equal to unity and therefore h(xi) = h(xj) = log2(2) = 1 Shannon (Sh) or 1 bit of information, not to be confused with the digital bit (binary digit) which represents one of the binary digits 0 or 1.

Finally, we can then write:

h(x) = log2(1/p(x)) = -log2 p(x) Sh [2.14]
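As a quick numerical illustration (not from the book), relation [2.14] and the additivity property [2.11] can be checked with a short Python sketch; the function name self_information is an arbitrary choice:

```python
import math

def self_information(p, base=2):
    """Amount of information h(x) = log_base(1/p(x)) carried by an
    event of probability p; base 2 gives Shannons (bits of information)."""
    if not 0 < p <= 1:
        raise ValueError("p must lie in (0, 1]")
    return math.log(1.0 / p, base)

# A certain event (p = 1) carries no information.
print(self_information(1.0))   # 0.0
# One event selected out of two equally probable events: 1 Sh.
print(self_information(0.5))   # 1.0
# Additivity for independent events: h(x, y) = h(x) + h(y).
p_x, p_y = 0.5, 0.25
print(math.isclose(self_information(p_x * p_y),
                   self_information(p_x) + self_information(p_y)))  # True
```

The rarer the event, the larger h(x): an event of probability 1/8 carries 3 Sh, three times the information of an event of probability 1/2.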

It is sometimes convenient to work with logarithms in base e or with logarithms in base 10. In these cases, the units will be:

loge(e) = 1 natural unit = 1 nat (one event chosen among e equiprobable events)

log10(10) = 1 decimal unit = 1 dit (one event chosen among 10 equiprobable events)

Knowing that:

logb(x) = loga(x)/loga(b)
the relationships between the three units are:

 – natural unit: 1 nat = log2(e) = 1/loge(2) ≈ 1.44 bits of information;

 – decimal unit: 1 dit = log2(10) = 1/log10(2) ≈ 3.32 bits of information.

They are pseudo-units without dimension.
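These conversion factors follow directly from the change-of-base formula, as a minimal Python sketch (not from the book) confirms; the constant names are arbitrary:

```python
import math

# Conversion factors between the three pseudo-units (change of base):
NAT_TO_SH = math.log2(math.e)   # 1 nat = log2(e)  ≈ 1.44 Sh
DIT_TO_SH = math.log2(10)       # 1 dit = log2(10) ≈ 3.32 Sh

print(round(NAT_TO_SH, 2))      # 1.44
print(round(DIT_TO_SH, 2))      # 3.32

# The same quantity of information expressed in the three units
# for an event of probability p = 0.2:
p = 0.2
h_sh = -math.log2(p)            # in Shannons
h_nat = -math.log(p)            # in nats
h_dit = -math.log10(p)          # in dits
assert math.isclose(h_nat * NAT_TO_SH, h_sh)
assert math.isclose(h_dit * DIT_TO_SH, h_sh)
```

Whatever the base chosen for the computation, multiplying by the corresponding factor recovers the value in Shannons, so the choice of unit is purely a matter of convenience.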

