Spatial Multidimensional Cooperative Transmission Theories And Key Technologies - Lin Bai - Page 26

2.1.1.1 Combining methods for known channels


Based on the models expressed in Eqs. (2.2) and (2.3), when the channel state information h is known, combining methods can be designed according to different optimality criteria for the linear combining vector w. In the following, received-signal combining methods based on the minimum mean square error, the maximum SNR, maximum likelihood estimation, the maximum ratio, and generalized selection diversity are introduced.

(1) Minimum mean square error combining

The minimum mean square error (MMSE) algorithm determines the combining vector w by minimizing the mean square error between the transmitted signal s and the output of the linear combiner.

If the signal and noise both obey a Gaussian distribution, then the optimal MMSE combiner is also a linear MMSE combiner. In order to implement the linear MMSE combining algorithm, the statistical properties of h, s, and n need to be known first. It is generally assumed that E(|s|^2) = σ_s^2 and E(nn^H) = R_n, where E(·) represents the mathematical expectation of a random variable. In this section, we assume that the covariance matrix R_n is full rank.

The MSE between the signal s and its estimate at the output of the linear combiner is

MSE(w) = E(|s − w^H r|^2).    (2.4)
It is easy to see that the MSE is a function of the vector w. According to the orthogonality principle, the optimal combining vector is

w_opt = R_r^{-1} c,    (2.5)
where R_r = E(rr^H) is the covariance matrix of the received signal vector r and c = E(rs*) is the correlation vector of r and s*. If s and n are uncorrelated, then R_r = σ_s^2 hh^H + R_n and c = σ_s^2 h. By substituting these into Eq. (2.5), the optimal MMSE combining vector can be obtained as

w_MMSE = σ_s^2 (σ_s^2 hh^H + R_n)^{-1} h = (σ_s^2 / (1 + γ)) R_n^{-1} h,    (2.6)

where γ = σ_s^2 h^H R_n^{-1} h (the second equality follows from the matrix inversion lemma).

Therefore, the MMSE is

MMSE = σ_s^2 / (1 + γ) = σ_s^2 / (1 + σ_s^2 h^H R_n^{-1} h).    (2.7)

It can be seen from Eq. (2.7) that the value of the MMSE depends on the signal power σ_s^2, the channel vector h, and the noise covariance matrix R_n. Taking the extreme case as an example, when σ_s^2 → ∞, Eq. (2.7) simplifies to MMSE ≈ 1 / (h^H R_n^{-1} h). In this case, although σ_s^2 → ∞, the MMSE never reaches zero because h^H R_n^{-1} h is finite.

Substituting Eq. (2.7) into Eq. (2.6), we get

w_MMSE = MMSE · R_n^{-1} h.    (2.8)
Substituting Eq. (2.8) into Eq. (2.3), we can obtain the estimated signal based on MMSE:

ŝ_MMSE = w_MMSE^H r = MMSE · h^H R_n^{-1} r,    (2.9)

where MMSE is given by Eq. (2.7). The SNR of the combined signal can then be defined as

SNR = σ_s^2 h^H R_n^{-1} h = γ.    (2.10)
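As a numerical sketch of the MMSE combiner above (the antenna count, signal power σ_s^2, channel h, and noise covariance R_n are all illustrative assumptions, not values from the text):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 4                    # number of receive branches (assumed)
sigma_s2 = 2.0           # signal power E(|s|^2) (assumed)
h = rng.normal(size=N) + 1j * rng.normal(size=N)   # channel vector
Rn = 0.5 * np.eye(N)     # full-rank noise covariance (white here, for simplicity)

# gamma = sigma_s^2 h^H Rn^{-1} h, the combined SNR of Eq. (2.10)
gamma = sigma_s2 * np.real(h.conj() @ np.linalg.solve(Rn, h))

# MMSE combining vector, first form of Eq. (2.6):
# w = sigma_s^2 (sigma_s^2 h h^H + Rn)^{-1} h
Rr = sigma_s2 * np.outer(h, h.conj()) + Rn
w_mmse = sigma_s2 * np.linalg.solve(Rr, h)

# minimum mean square error, Eq. (2.7): sigma_s^2 / (1 + gamma)
mmse = sigma_s2 / (1.0 + gamma)
```

By the matrix inversion lemma, `w_mmse` coincides with the second form of Eq. (2.6), (σ_s^2 / (1 + γ)) R_n^{-1} h, which can be checked numerically.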
(2) Maximum SNR combining

If the criterion of maximizing the SNR is used for the linear combiner, then for a combining vector w, the combined signal can be written as

z = w^H r = w^H h s + w^H n,    (2.11)
where the first and second terms on the right-hand side are the signal and noise components, respectively. Thus, the SNR is given as follows:

SNR = E(|w^H h s|^2) / E(|w^H n|^2)
    = σ_s^2 |w^H h|^2 / (w^H R_n w)
    = σ_s^2 |(R_n^{1/2} w)^H (R_n^{-1/2} h)|^2 / ||R_n^{1/2} w||^2
    ≤ σ_s^2 h^H R_n^{-1} h,    (2.12)

where the full-rank covariance matrix R_n must be positive definite (the decomposition R_n = R_n^{1/2} R_n^{1/2} is used in the derivation from row 2 to row 3 of Eq. (2.12)). In addition, by the Cauchy–Schwarz inequality, equality holds from row 3 to row 4 of Eq. (2.12) if and only if R_n^{1/2} w = α R_n^{-1/2} h, where α is a non-zero constant. In order for the inequality to attain its maximum, one should therefore set

R_n^{1/2} w = α R_n^{-1/2} h,    (2.13)
and then the combining vector with the maximum signal-to-noise ratio (MSNR) can be expressed as

w_MSNR = α R_n^{-1} h.    (2.14)
Comparing Eq. (2.14) with Eq. (2.6), we see that the MMSE combination is essentially an MSNR combination.
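A small sketch of the MSNR result, under assumed values for σ_s^2, h, and a colored (but positive definite) noise covariance: the combiner w = R_n^{-1} h attains the upper bound of Eq. (2.12), any non-zero scaling α leaves the SNR unchanged, and an arbitrary combiner does no better.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 4
sigma_s2 = 1.0                          # assumed signal power
h = rng.normal(size=N) + 1j * rng.normal(size=N)
A = rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N))
Rn = A @ A.conj().T + N * np.eye(N)     # full-rank, positive definite noise covariance

def snr(w):
    """Output SNR of a linear combiner: sigma_s^2 |w^H h|^2 / (w^H Rn w)."""
    return sigma_s2 * abs(w.conj() @ h) ** 2 / np.real(w.conj() @ Rn @ w)

w_msnr = np.linalg.solve(Rn, h)         # Eq. (2.14) with alpha = 1
snr_max = sigma_s2 * np.real(h.conj() @ np.linalg.solve(Rn, h))  # bound in Eq. (2.12)
```

Scaling invariance is why α in Eq. (2.14) is arbitrary: both numerator and denominator of the SNR scale by |α|^2.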

(3) Maximum likelihood combining

If the transmitted signal s is considered as a parameter to be estimated, the signal s can be estimated by the maximum likelihood (ML) estimation algorithm. Assuming that the vector n in Eq. (2.2) is zero-mean CSCG noise, namely n ∼ CN(0, R_n), then for the vector r given in Eq. (2.2), the probability density conditioned on the signal s is

f(r|s) = (1 / (π^N det(R_n))) exp(−(r − hs)^H R_n^{-1} (r − hs)).    (2.15)
Therefore, the maximum likelihood estimate of the signal s is

ŝ_ML = argmax_s f(r|s) = (h^H R_n^{-1} h)^{-1} h^H R_n^{-1} r = w_ML^H r,    (2.16)
where the maximum likelihood combining vector is

w_ML = R_n^{-1} h (h^H R_n^{-1} h)^{-1}.    (2.17)
It can be seen that the maximum likelihood estimate is actually obtained through a linear combining operation with the weight vector w_ML, which we call maximum likelihood combining.

Comparing Eq. (2.17) with Eq. (2.14), the maximum likelihood combining is essentially an MSNR combination.
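A sketch of the ML combiner in Eq. (2.17), under an assumed channel and diagonal noise covariance: the normalization (h^H R_n^{-1} h)^{-1} makes w_ML^H h = 1, so in the noiseless case the estimate recovers the transmitted symbol exactly.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 4
s = 1.0 - 0.5j                          # a hypothetical transmitted symbol
h = rng.normal(size=N) + 1j * rng.normal(size=N)
Rn = np.diag([0.5, 1.0, 1.5, 2.0])      # assumed full-rank CSCG noise covariance

# ML combining vector, Eq. (2.17): w_ML = Rn^{-1} h (h^H Rn^{-1} h)^{-1}
Rn_inv_h = np.linalg.solve(Rn, h)
w_ml = Rn_inv_h / (h.conj() @ Rn_inv_h)

r = h * s                               # noiseless received vector
s_hat = w_ml.conj() @ r                 # ML estimate w_ML^H r, Eq. (2.16)
```

The unit-gain property w_ML^H h = 1 distinguishes ML combining from MSNR combining only by a scalar factor, consistent with the comparison of Eqs. (2.17) and (2.14).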

(4) Maximum ratio combining

Maximum ratio combining (MRC) is a linear combining technique often used in fading channel environments, where it can effectively improve system performance. In fact, MRC can be seen as a special case of MSNR combining.

In order to derive the MRC algorithm, we assume that the noise terms in Eq. (2.1) are uncorrelated and have equal variances, namely E(nn^H) = N_0 I. In this case, the SNR can be obtained according to Eq. (2.12) as

SNR = σ_s^2 |w^H h|^2 / (N_0 ||w||^2).    (2.18)

According to the Cauchy–Schwarz inequality, it is easy to prove that the combining vector that maximizes this SNR is w = αh, which is also a special case of the MSNR combining vector in Eq. (2.14). The linear combiner with combining vector w = αh is called MRC.

When the MRC algorithm is applied, the resulting SNR is

SNR_MRC = σ_s^2 ||h||^2 / N_0 = Σ_{k=1}^{N} σ_s^2 |h_k|^2 / N_0.    (2.19)
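A short numerical check of Eq. (2.19), with assumed values for σ_s^2, N_0, and h: the MRC output SNR equals the sum of the per-branch SNRs.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 4
sigma_s2, N0 = 1.0, 0.25               # assumed signal power and noise level
h = rng.normal(size=N) + 1j * rng.normal(size=N)

w_mrc = h                              # MRC combining vector, w = alpha*h with alpha = 1
snr_mrc = sigma_s2 * np.linalg.norm(h) ** 2 / N0   # sigma_s^2 ||h||^2 / N0

branch_snrs = sigma_s2 * np.abs(h) ** 2 / N0       # per-branch SNRs
```

This additivity is the reason MRC outperforms using any single branch alone.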
(5) Generalized selection diversity combining

In wireless communication systems, selection diversity (SD) is another common spatial diversity technique. Different from the MRC algorithm, which combines all the received signals to maximize the SNR, the SD algorithm picks out only the strongest of the N received signals for processing, which makes it easy to implement.

The SNR of the SD algorithm is

SNR_SD = max_{1≤k≤N} SNR_k,    (2.20)

where SNR_k = σ_s^2 |h_k|^2 / N_0 is the instantaneous SNR of the kth received signal.
In order to improve the performance of SD, a generalized SD combining (GSDC) algorithm has been proposed in the literature. GSDC selects the M strongest of the N received signals. When M = N, GSDC is equivalent to the optimal MRC combining; when M = 1, GSDC reduces to the ordinary SD algorithm. It is easy to see that the GSDC algorithm strikes a good balance between performance and computational complexity.

If the M signals are selected under the MSNR criterion, the resulting SNR is

SNR_GSDC = Σ_{k=1}^{M} SNR_(k),    (2.21)

where SNR_(k) denotes the kth largest among SNR_1, SNR_2, . . . , SNR_N.
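The SD/GSDC/MRC family can be sketched in a few lines, with assumed branch counts and noise level: sorting the per-branch SNRs and summing the top M interpolates between selection diversity (M = 1) and MRC (M = N).

```python
import numpy as np

rng = np.random.default_rng(4)
N, M = 6, 3                            # keep the M strongest of N branches (assumed sizes)
sigma_s2, N0 = 1.0, 0.5                # assumed signal power and noise level
h = rng.normal(size=N) + 1j * rng.normal(size=N)

branch_snrs = sigma_s2 * np.abs(h) ** 2 / N0
ordered = np.sort(branch_snrs)[::-1]   # SNR_(1) >= SNR_(2) >= ... >= SNR_(N)

snr_sd = ordered[0]                    # M = 1: ordinary selection diversity, Eq. (2.20)
snr_gsdc = ordered[:M].sum()           # 1 <= M <= N: generalized SD combining, Eq. (2.21)
snr_mrc = ordered.sum()                # M = N: reduces to MRC, Eq. (2.19)
```

The ordering SNR_SD ≤ SNR_GSDC ≤ SNR_MRC reflects the performance/complexity trade-off described above.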

