Digital Communications 1 - Safwan El Assad
2.5.1. Conditional entropies
Because of the disturbances in the transmission channel, if the symbol y_j appears at the output, there is an uncertainty about which symbol x_i, i = 1, ..., n, has been sent.
Figure 2.3. Ambiguity on the symbol at the input when y_j is received
The average value of this uncertainty, or the entropy associated with the reception of the symbol y_j, is:
H(X/y_j) = -\sum_{i=1}^{n} p(x_i/y_j) \log_2 p(x_i/y_j)    [2.26]
The mean value of this entropy over all the possible received symbols y_j is:
H(X/Y) = \sum_{j=1}^{m} p(y_j) H(X/y_j)    [2.27]
which can be written as:
H(X/Y) = -\sum_{j=1}^{m} p(y_j) \sum_{i=1}^{n} p(x_i/y_j) \log_2 p(x_i/y_j)    [2.28]
or:
H(X/Y) = -\sum_{i=1}^{n} \sum_{j=1}^{m} p(x_i, y_j) \log_2 p(x_i/y_j)    [2.29]
The entropy H(X/Y) is called the equivocation (ambiguity); it corresponds to the loss of information due to the disturbances (since I(X, Y) = H(X) - H(X/Y)). This will be made precise a little further on.
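The equivocation and the relation I(X, Y) = H(X) - H(X/Y) can be checked numerically. The sketch below uses an illustrative example not taken from the text: a binary symmetric channel with crossover probability 0.1 and a uniform binary source.

```python
import math

# Illustrative assumption: binary symmetric channel (BSC), crossover
# probability eps = 0.1, uniform input p(x_0) = p(x_1) = 0.5.
eps = 0.1
p_x = [0.5, 0.5]
p_y_given_x = [[1 - eps, eps],
               [eps, 1 - eps]]  # channel noise matrix P(Y/X)

# Joint probabilities p(x_i, y_j) and output marginals p(y_j)
p_xy = [[p_x[i] * p_y_given_x[i][j] for j in range(2)] for i in range(2)]
p_y = [sum(p_xy[i][j] for i in range(2)) for j in range(2)]

def entropy(dist):
    """Entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

# Equivocation H(X/Y) per [2.29]: -sum_ij p(x_i, y_j) log2 p(x_i/y_j),
# with p(x_i/y_j) = p(x_i, y_j) / p(y_j)
H_X_given_Y = -sum(
    p_xy[i][j] * math.log2(p_xy[i][j] / p_y[j])
    for i in range(2) for j in range(2) if p_xy[i][j] > 0
)

H_X = entropy(p_x)            # source entropy H(X) = 1 bit here
I_XY = H_X - H_X_given_Y      # information actually transferred
print(round(H_X_given_Y, 4), round(I_XY, 4))
```

For this symmetric channel with uniform input, the equivocation reduces to the binary entropy of the crossover probability, about 0.469 bit, so roughly half a bit of information is lost per symbol.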
Conversely, because of the disturbances, if the symbol x_i is emitted, there is uncertainty about the received symbol y_j, j = 1, ..., m.
Figure 2.4. Uncertainty on the output when we know the input
The entropy of the random variable Y at the output, knowing the random variable X at the input, is:
H(Y/X) = -\sum_{i=1}^{n} \sum_{j=1}^{m} p(x_i, y_j) \log_2 p(y_j/x_i)    [2.30]
This entropy measures the uncertainty about the output variable when the input variable is known.
The matrix P(Y/X) is called the channel noise matrix:
P(Y/X) = \begin{pmatrix} p(y_1/x_1) & p(y_2/x_1) & \cdots & p(y_m/x_1) \\ p(y_1/x_2) & p(y_2/x_2) & \cdots & p(y_m/x_2) \\ \vdots & \vdots & & \vdots \\ p(y_1/x_n) & p(y_2/x_n) & \cdots & p(y_m/x_n) \end{pmatrix}    [2.31]
A fundamental property of this matrix is:
\sum_{j=1}^{m} p(y_j/x_i) = 1, \quad i = 1, \ldots, n    [2.32]
where p(y_j/x_i) is the probability of receiving the symbol y_j when the symbol x_i has been emitted.
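The row property [2.32] and the computation of H(Y/X) in [2.30] can be checked together. The sketch below uses a hypothetical 2-input, 3-output noise matrix whose values are illustrative only, not from the text.

```python
import math

# Illustrative assumption: 2 input symbols, 3 output symbols,
# with a made-up noise matrix P(Y/X) and input distribution p(x_i).
p_x = [0.6, 0.4]
p_y_given_x = [[0.80, 0.15, 0.05],
               [0.05, 0.15, 0.80]]

# Fundamental property [2.32]: each row of P(Y/X) sums to 1,
# since some symbol y_j must be received for every emitted x_i.
row_sums = [sum(row) for row in p_y_given_x]

# H(Y/X) per [2.30]: -sum_ij p(x_i, y_j) log2 p(y_j/x_i),
# using p(x_i, y_j) = p(x_i) p(y_j/x_i)
H_Y_given_X = -sum(
    p_x[i] * p_y_given_x[i][j] * math.log2(p_y_given_x[i][j])
    for i in range(2) for j in range(3)
    if p_y_given_x[i][j] > 0
)
print(row_sums, round(H_Y_given_X, 4))
```

Note that H(Y/X) is simply the average, weighted by p(x_i), of the entropy of each row of the noise matrix.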
In addition, one has:
p(x_i, y_j) = p(x_i) p(y_j/x_i) = p(y_j) p(x_i/y_j)    [2.33]
with:
p(y_j) = \sum_{i=1}^{n} p(x_i) p(y_j/x_i)    [2.34]
p(y_j) is the probability of receiving the symbol y_j, whatever the symbol x_i emitted, and:
p(x_i/y_j) = \frac{p(x_i) p(y_j/x_i)}{p(y_j)}    [2.35]
p(x_i/y_j) is the probability that the symbol x_i was emitted when the symbol y_j is received.
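Relations [2.33]-[2.35] chain together: the total-probability formula [2.34] gives the output marginals, and Bayes' rule [2.35] then gives the posterior probabilities of the input. A minimal sketch, with a made-up asymmetric binary channel (the numbers are illustrative, not from the text):

```python
# Illustrative assumption: binary channel with unequal input
# probabilities and an asymmetric noise matrix P(Y/X).
p_x = [0.7, 0.3]
p_y_given_x = [[0.9, 0.1],
               [0.2, 0.8]]

# [2.34]: p(y_j) = sum_i p(x_i) p(y_j/x_i)  (total probability)
p_y = [sum(p_x[i] * p_y_given_x[i][j] for i in range(2)) for j in range(2)]

# [2.35]: p(x_i/y_j) = p(x_i) p(y_j/x_i) / p(y_j)  (Bayes' rule)
p_x_given_y = [[p_x[i] * p_y_given_x[i][j] / p_y[j] for j in range(2)]
               for i in range(2)]
print(p_y, p_x_given_y)
```

Each column of the posterior matrix sums to 1: once y_j is received, some x_i was certainly emitted. These posteriors p(x_i/y_j) are exactly the probabilities that enter the equivocation H(X/Y) of [2.26]-[2.29].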