
2.3.4.2. Entropy of an alphabetic source with (26 + 1) characters


 – For a uniform law: H = log2(27) = 4.75 bits of information per character

 – In the French language (according to a statistical study): H = 3.98 bits of information per character

Thus, a text of 100 characters provides 100 × 3.98 = 398 bits of information.

The non-uniformity of the character probabilities therefore causes a loss of 475 − 398 = 77 bits of information over such a text, compared with the uniform case.
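As a quick check, here is a minimal Python sketch reproducing these figures. The uniform entropy is computed exactly; the value 3.98 bits/character for French is taken from the statistical study quoted above, not recomputed here.

import math

# Entropy of a 27-symbol source (26 letters + space) under a uniform law
H_uniform = math.log2(27)            # ≈ 4.75 bits per character

# Entropy of French text, as quoted in the section (statistical study)
H_french = 3.98                      # bits per character (book's value)

n = 100                              # length of the text in characters
info = n * H_french                  # information in the text: 398 bits
loss = n * H_uniform - info          # loss due to unequal probabilities

print(f"H_uniform = {H_uniform:.2f} bits/character")
print(f"Information in {n} French characters: {info:.0f} bits")
print(f"Loss from non-uniform probabilities: {loss:.0f} bits")

Running this prints 4.75 bits/character, 398 bits, and a loss of about 77 bits, matching the arithmetic above.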
