
2.3.1. Entropy of a source


Let a stationary memoryless source S produce independent random events (symbols) si belonging to a predetermined set [S] = [s1, s2, ..., sN]. Each event (symbol) si occurs with a given probability pi, with:

0 \le p_i \le 1, \qquad \sum_{i=1}^{N} p_i = 1
The source S is then characterized by the set of probabilities [P] = [p1, p2, ..., pN]. We are now interested in the average amount of information delivered by this source, that is, the information resulting from the whole set of events (symbols) it can produce, each taken into account with its probability of occurrence. This average amount of information is called the "entropy H(S) of the source".

It is therefore defined by:

H(S) = -\sum_{i=1}^{N} p_i \log_2 p_i \quad \text{(bits of information/symbol)}   [2.15]
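
To make the definition concrete, here is a minimal numerical sketch (not from the book; the function name entropy and the example distribution are illustrative) that evaluates equation [2.15] for a small source:

import math

def entropy(probabilities):
    """Entropy H(S) of a memoryless source, in bits/symbol (equation [2.15])."""
    # Terms with p_i = 0 contribute nothing, since p * log2(p) -> 0 as p -> 0.
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Illustrative four-symbol source with probabilities [1/2, 1/4, 1/8, 1/8]
probs = [0.5, 0.25, 0.125, 0.125]
assert abs(sum(probs) - 1.0) < 1e-12  # probabilities must sum to 1
print(entropy(probs))  # 1.75 bits/symbol

For a uniform source (pi = 1/N for all i), the same formula gives H(S) = log2(N), which is the maximum entropy achievable with N symbols.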
