Digital Communications 1 - Safwan El Assad
2.5. Discrete channels and entropies
Between the source of information and the destination, there is the medium through which information is transmitted. This medium, including the equipment necessary for transmission, is called the transmission channel (or simply the channel).
Let us consider a discrete, stationary and memoryless channel (discrete meaning that the alphabet of the symbols at the input and the alphabet of the symbols at the output are both discrete).
Figure 2.2. Basic transmission system based on a discrete channel. For a color version of this figure, see www.iste.co.uk/assad/digital1.zip
We denote:
– [X] = [x1, x2, ..., xn]: the set of all the symbols at the input of the channel;
– [Y] = [y1, y2, ..., ym]: the set of all the symbols at the output of the channel;
– [P(X)] = [p(x1), p(x2), ..., p(xn)]: the vector of probabilities of the symbols at the input of the channel;
– [P(Y)] = [p(y1), p(y2), ..., p(ym)]: the vector of probabilities of the symbols at the output of the channel.
Because of the perturbations, the space [Y] can be different from the space [X], and the probabilities P(Y) can be different from the probabilities P(X).
We define a product space [X ⋅ Y] and we introduce the matrix of the probabilities of the joint symbols, input-output [P(X, Y)]:
[P(X, Y)] = [p(xi, yj)], i = 1, ..., n; j = 1, ..., m [2.20]
that is, an n × m matrix whose entry (i, j) is the joint probability of the input symbol xi and the output symbol yj.
We deduce, from this matrix of probabilities:
p(xi) = Σj p(xi, yj), j = 1, ..., m (row sums) [2.21]
p(yj) = Σi p(xi, yj), i = 1, ..., n (column sums) [2.22]
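As a minimal numerical sketch of equations [2.21] and [2.22], the marginal probability vectors [P(X)] and [P(Y)] can be obtained as the row and column sums of the joint matrix. The joint matrix below is a hypothetical example (n = 2 input symbols, m = 3 output symbols), not taken from the book:

```python
import numpy as np

# Hypothetical joint probability matrix [P(X, Y)]:
# rows index the input symbols x_i, columns the output symbols y_j.
P_XY = np.array([
    [0.30, 0.10, 0.10],
    [0.05, 0.25, 0.20],
])

# Equation [2.21]: p(x_i) = sum over j of p(x_i, y_j)  -> row sums
P_X = P_XY.sum(axis=1)

# Equation [2.22]: p(y_j) = sum over i of p(x_i, y_j)  -> column sums
P_Y = P_XY.sum(axis=0)

print(P_X)  # [0.5 0.5]
print(P_Y)  # [0.35 0.35 0.3 ]
```

Both marginal vectors necessarily sum to 1, since the joint matrix itself sums to 1.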
We then define the following entropies:
– the entropy of the source:
H(X) = −Σi p(xi) log2 p(xi) [2.23]
– the entropy of variable Y at the output of the transmission channel:
H(Y) = −Σj p(yj) log2 p(yj) [2.24]
– the entropy of the two joint variables (X, Y), input-output:
H(X, Y) = −Σi Σj p(xi, yj) log2 p(xi, yj) [2.25]
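The three entropies [2.23]–[2.25] can all be computed from the joint matrix alone, since the marginals are its row and column sums. Below is a small sketch; the joint matrix is a hypothetical example, not a value from the book:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability array.

    Zero-probability entries are skipped, using the convention
    0 * log2(0) = 0.
    """
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Hypothetical joint probability matrix [P(X, Y)].
P_XY = np.array([
    [0.30, 0.10, 0.10],
    [0.05, 0.25, 0.20],
])

H_X = entropy(P_XY.sum(axis=1))   # [2.23] entropy of the source
H_Y = entropy(P_XY.sum(axis=0))   # [2.24] entropy at the channel output
H_XY = entropy(P_XY)              # [2.25] joint entropy of (X, Y)

# The joint entropy never exceeds the sum of the marginal entropies,
# with equality only when X and Y are independent.
assert H_XY <= H_X + H_Y + 1e-12
```

The final assertion illustrates the general inequality H(X, Y) ≤ H(X) + H(Y); with this example matrix the inequality is strict, since the channel output clearly depends on the input.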