2.7.1. Shannon’s theorem: capacity of a communication system

Shannon also expressed the capacity of a communication system by the following relation:

C = B log2(1 + Ps/PN) = B log2(1 + Ps/(N0B)) bit/s [2.59]

where:

 – B: is the channel bandwidth, in hertz;

 – Ps: is the signal power, in watts;

 – N0: is the power spectral density of the (supposed) Gaussian and white noise in its frequency band B;

 – PN = N0B: is the noise power, in watts.
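
As a quick numerical illustration of relation [2.59], the short Python sketch below evaluates the capacity for an assumed set of values (a 3,100 Hz telephone-type channel with a 30 dB signal-to-noise ratio); these figures are illustrative assumptions, not values taken from the text.

import math

def shannon_capacity(b_hz, ps_w, n0_w_per_hz):
    """Capacity C = B*log2(1 + Ps/(N0*B)) of a band-limited AWGN channel, in bit/s."""
    pn_w = n0_w_per_hz * b_hz                 # noise power PN = N0*B
    return b_hz * math.log2(1 + ps_w / pn_w)

# Assumed example values: B = 3,100 Hz and Ps/PN = 30 dB (i.e. Ps/PN = 1,000).
B = 3100.0
Ps = 1.0
N0 = Ps / (B * 10 ** (30 / 10))               # choose N0 so that Ps/(N0*B) = 1,000
print(round(shannon_capacity(B, Ps, N0)))     # about 30,898 bit/s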

EXAMPLE.– Binary symmetric channel (BSC).

Any binary channel will be characterized by the noise matrix:

[P(Y/X)] = | p(y1/x1)  p(y2/x1) |
           | p(y1/x2)  p(y2/x2) |

If the binary channel is symmetric, then one has:

p(y1/x2) = p(y2/x1) = p

p(y1/x1) = p(y2/x2) = 1 − p


Figure 2.5. Binary symmetric channel
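
To make the channel model concrete, here is a small Python sketch (an illustration added for this discussion, not part of the original example) that passes a random bit stream through a BSC with error probability p and estimates the error rate empirically; the function name bsc is arbitrary.

import random

def bsc(bits, p, rng=random):
    """Binary symmetric channel: each bit is flipped independently with probability p."""
    return [b ^ 1 if rng.random() < p else b for b in bits]

random.seed(0)
p = 0.1
tx = [random.randint(0, 1) for _ in range(100_000)]  # transmitted bits
rx = bsc(tx, p)                                       # received bits
errors = sum(t != r for t, r in zip(tx, rx))
print(errors / len(tx))                               # empirical error rate, close to p = 0.1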

The channel capacity is:

C = max I(X, Y) = max [H(Y) − H(Y/X)]

where the maximization is carried out over the input probabilities p(xi).

The conditional entropy H(Y/X) is:

H(Y/X) = −Σi p(xi) Σj p(yj/xi) log2 p(yj/xi) = −p log2 p − (1 − p) log2(1 − p)

Hence:

C = max H(Y) + p log2 p + (1 − p) log2(1 − p)

But max H(Y) = 1 for p(y1) = p(y2). It follows from the symmetry of the channel that if p(y1) = p(y2), then p(x1) = p(x2) = 1/2, and C will be given by:

C = 1 + p log2 p + (1 − p) log2(1 − p) bit/symbol

Figure 2.6. Variation of the capacity of a BSC as a function of p
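
The curve of Figure 2.6 is easy to reproduce numerically. The sketch below (illustrative, using the formula just derived) evaluates C(p) = 1 + p log2 p + (1 − p) log2(1 − p) and shows that the capacity equals 1 bit/symbol for p = 0 or p = 1 and falls to 0 for p = 0.5.

import math

def bsc_capacity(p):
    """Capacity of a binary symmetric channel, in bit/symbol."""
    def plog2p(x):
        return 0.0 if x == 0.0 else x * math.log2(x)  # x*log2(x) -> 0 as x -> 0
    return 1.0 + plog2p(p) + plog2p(1.0 - p)

for p in (0.0, 0.1, 0.25, 0.5, 0.75, 0.9, 1.0):
    print(f"p = {p:.2f}  C = {bsc_capacity(p):.3f} bit/symbol")
# Maximum capacity (1 bit/symbol) at p = 0 and p = 1; zero capacity at p = 0.5.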
