36.3.2.1 Multiple Model Adaptive Estimation

One implementation of the Gaussian sum filtering approach is known as multiple model adaptive estimation (MMAE). The MMAE filter uses a weighted Gaussian sum to address the situation where unknown or uncertain parameters exist within the system model. Some examples of these types of situations include modeling discrete failure modes, unknown structural parameters, or processes with multiple discrete modes of operation (e.g. “jump” processes).


Figure 36.1 Gaussian sum illustration. The random variable x_sum is represented by a weighted sum of three individual Gaussian densities. In this example, x_sum = 0.25 x_1 + 0.5 x_2 + 0.25 x_3.
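To make the weighted-sum construction concrete, the short Python sketch below evaluates such a three-component Gaussian mixture density using the weights from the caption; the component means and standard deviations are assumed values chosen purely for illustration and are not taken from the figure.

    import numpy as np
    from scipy.stats import norm

    weights = np.array([0.25, 0.50, 0.25])   # mixture weights from the caption (sum to one)
    means   = np.array([-2.0,  0.0,  2.0])   # assumed component means (illustrative only)
    sigmas  = np.array([ 1.0,  1.0,  1.0])   # assumed component standard deviations

    x = np.linspace(-6.0, 6.0, 601)
    # Weighted sum of the individual Gaussian densities: p(x) = sum_j w_j * N(x; m_j, s_j^2)
    p_sum = sum(w * norm.pdf(x, loc=m, scale=s) for w, m, s in zip(weights, means, sigmas))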

Consider our standard linear Gaussian process and observation models, repeated from Eqs. 36.17 and 36.18 for clarity:

(36.32)  x_k = \Phi_{k-1}\, x_{k-1} + w_{k-1}, \qquad w_{k-1} \sim \mathcal{N}(0,\, Q_{k-1})

(36.33)  z_k = H_k\, x_k + v_k, \qquad v_k \sim \mathcal{N}(0,\, R_k)

In the previous development, it was assumed that the system model parameters (i.e. the matrices defining the process and observation models above) were known. Let us now consider the situation where some of the system model parameters are unknown.

To address this situation, we can define a vector of the unknown system parameters, a, and jointly estimate these parameters along with the state vector. In other words, we must now solve for the following density:

(36.34)  p\left(x_k, a \mid \mathbb{Z}_k\right)

which, after applying Bayes’ rule, can be expressed as

(36.35)  p\left(x_k, a \mid \mathbb{Z}_k\right) = p\left(x_k \mid a, \mathbb{Z}_k\right)\, p\left(a \mid \mathbb{Z}_k\right)

It is important to note that this expression is the product of the “known‐system model” pdf, p(x_k | a, ℤ_k), and a new density function, p(a | ℤ_k), which is the pdf of the unknown system parameters, conditioned on the observation set. Assuming a ∈ ℝⁿ, the parameter density can be written as

(36.36)  p\left(a \mid \mathbb{Z}_k\right) = p\left(a \mid z_k, \mathbb{Z}_{k-1}\right)

Applying Bayes’ rule yields

(36.37)  p\left(a \mid \mathbb{Z}_k\right) = \frac{p\left(z_k \mid a, \mathbb{Z}_{k-1}\right)\, p\left(a \mid \mathbb{Z}_{k-1}\right)}{p\left(z_k \mid \mathbb{Z}_{k-1}\right)}

Marginalizing the denominator over the parameter vector results in a more familiar form:

(36.38)  p\left(a \mid \mathbb{Z}_k\right) = \frac{p\left(z_k \mid a, \mathbb{Z}_{k-1}\right)\, p\left(a \mid \mathbb{Z}_{k-1}\right)}{\int p\left(z_k \mid a, \mathbb{Z}_{k-1}\right)\, p\left(a \mid \mathbb{Z}_{k-1}\right)\, da}

where p(z_k | a, ℤ_{k−1}) is the measurement prediction density, which, given our linear observation model, is expressed as the following normal distribution:

(36.39)  p\left(z_k \mid a, \mathbb{Z}_{k-1}\right) = \mathcal{N}\!\left(z_k;\; H_k\, \hat{x}_k^{-}(a),\; H_k\, P_k^{-}(a)\, H_k^{T} + R_k\right)

Unfortunately, the integral in the denominator is intractable in general, so an additional constraint is needed. If the system parameters can be chosen from a finite set (e.g. a ∈ {a^{[1]}, a^{[2]}, ⋯, a^{[J]}}), the parameter density can be expressed as the sum of the individual probabilities of the members of that set. This results in a system parameter pdf defined as

(36.40)  p\left(a \mid \mathbb{Z}_{k-1}\right) = \sum_{j} w_{k-1}^{[j]}\, \delta\!\left(a - a^{[j]}\right)

where w_{k−1}^{[j]} is the probability of the j‐th parameter vector at time k−1, and δ(·) is the delta function. It can be observed that the sum of the weights must be unity in order to represent a probability density. Substituting Eq. 36.40 into Eq. 36.38:

(36.41)  p\left(a \mid \mathbb{Z}_k\right) = \frac{p\left(z_k \mid a, \mathbb{Z}_{k-1}\right) \sum_{j} w_{k-1}^{[j]}\, \delta\!\left(a - a^{[j]}\right)}{\int p\left(z_k \mid a, \mathbb{Z}_{k-1}\right) \sum_{j} w_{k-1}^{[j]}\, \delta\!\left(a - a^{[j]}\right)\, da}

Moving the summation operators and parameter weights to the front of the numerator, and outside of the integral in the denominator, gives:

(36.42)  p\left(a \mid \mathbb{Z}_k\right) = \frac{\sum_{j} w_{k-1}^{[j]}\, p\left(z_k \mid a, \mathbb{Z}_{k-1}\right)\, \delta\!\left(a - a^{[j]}\right)}{\sum_{j} w_{k-1}^{[j]} \int p\left(z_k \mid a, \mathbb{Z}_{k-1}\right)\, \delta\!\left(a - a^{[j]}\right)\, da}

The properties of the delta function can be exploited to rewrite the numerator and eliminate the integral from the denominator:

(36.43)  p\left(a \mid \mathbb{Z}_k\right) = \frac{\sum_{j} w_{k-1}^{[j]}\, p\left(z_k \mid a^{[j]}, \mathbb{Z}_{k-1}\right)\, \delta\!\left(a - a^{[j]}\right)}{\sum_{i} w_{k-1}^{[i]}\, p\left(z_k \mid a^{[i]}, \mathbb{Z}_{k-1}\right)}

At this point, we have established the posterior pdf of the parameter vector as a finite weighted set. Revisiting our system parameter pdf, now defined at time k

(36.44)  p\left(a \mid \mathbb{Z}_k\right) = \sum_{j} w_{k}^{[j]}\, \delta\!\left(a - a^{[j]}\right)

and substituting into Eq. 36.43 yields the parameter density update relationship

(36.45)  w_{k}^{[j]} = \frac{w_{k-1}^{[j]}\, p\left(z_k \mid a^{[j]}, \mathbb{Z}_{k-1}\right)}{\sum_{i} w_{k-1}^{[i]}\, p\left(z_k \mid a^{[i]}, \mathbb{Z}_{k-1}\right)}

In the above equation, the predicted measurement pdf, p(z_k | a^{[j]}, ℤ_{k−1}), is evaluated at the measurement realization at time k, which yields the likelihood of realizing the current measurement, conditioned on the parameter set j. As mentioned previously, these likelihood values are based on the following evaluation of a normal density function:

(36.46)  p\left(z_k \mid a^{[j]}, \mathbb{Z}_{k-1}\right) = \mathcal{N}\!\left(z_k;\; H_k\, \hat{x}_k^{-}(a^{[j]}),\; H_k\, P_k^{-}(a^{[j]})\, H_k^{T} + R_k\right)

where z_k is the measurement realization at time k. This likelihood is equivalent to the likelihood of the residual from a Kalman filter tuned to the j‐th parameter vector, a^{[j]}.
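The sketch below shows one way this likelihood evaluation might be implemented for a single model in the bank. The function name and its arguments (x_pred and P_pred for the j‐th filter's predicted state and covariance, H and R for the observation matrix and measurement noise covariance) are illustrative placeholders, not notation from the text.

    import numpy as np
    from scipy.stats import multivariate_normal

    def measurement_likelihood(z, x_pred, P_pred, H, R):
        """Evaluate Eq. 36.46: p(z_k | a[j], Z_{k-1}) = N(z_k; H x_pred, H P_pred H^T + R)."""
        z_hat = H @ x_pred            # predicted measurement for this model
        S = H @ P_pred @ H.T + R      # innovation (residual) covariance
        return multivariate_normal.pdf(z, mean=z_hat, cov=S)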

Practically speaking, the parameter pdf consists of the discrete (fixed) parameter set and the associated weights (probabilities) at each epoch. The parameter density update in Eq. 36.45 describes the evolution of each parameter weight as a function of time, which can be rewritten as

(36.47)  w_{k}^{[j]} = \frac{w_{k-1}^{[j]}\, \mathcal{N}\!\left(z_k;\; H_k\, \hat{x}_k^{-}(a^{[j]}),\; H_k\, P_k^{-}(a^{[j]})\, H_k^{T} + R_k\right)}{\sum_{i} w_{k-1}^{[i]}\, \mathcal{N}\!\left(z_k;\; H_k\, \hat{x}_k^{-}(a^{[i]}),\; H_k\, P_k^{-}(a^{[i]})\, H_k^{T} + R_k\right)}
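A minimal sketch of this weight propagation follows, assuming the per‐model likelihoods have already been evaluated as above; the lower bound on the weights is a common practical safeguard against any model being permanently locked out and is not part of Eq. 36.47.

    import numpy as np

    def update_weights(prior_weights, likelihoods, floor=1e-12):
        """Propagate the parameter weights per Eqs. 36.45/36.47 and renormalize to sum to one."""
        w = np.asarray(prior_weights) * np.asarray(likelihoods)
        w = np.maximum(w, floor)   # practical floor, not part of the derivation
        return w / w.sum()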

Our final task is to determine the overall posterior joint pdf of the system. Substituting Eq. 36.44 into Eq. 36.35, we obtain

(36.48)  p\left(x_k, a \mid \mathbb{Z}_k\right) = p\left(x_k \mid a, \mathbb{Z}_k\right) \sum_{j} w_{k}^{[j]}\, \delta\!\left(a - a^{[j]}\right)

which, after exploiting the sifting property of the delta function and a straightforward rearrangement of terms, produces the joint posterior density function

(36.49)  p\left(x_k, a \mid \mathbb{Z}_k\right) = \sum_{j} w_{k}^{[j]}\, \mathcal{N}\!\left(x_k;\; \hat{x}_k^{[j]},\; P_k^{[j]}\right)\, \delta\!\left(a - a^{[j]}\right)

This pdf is clearly a weighted sum of Gaussian densities, each density corresponding to the posterior state estimate, \hat{x}_k^{[j]}, and covariance, P_k^{[j]}, of an individual Kalman filter tuned to the parameter vector a^{[j]}. The blended posterior state estimate and covariance are given by

(36.50)  \hat{x}_k = \sum_{j} w_{k}^{[j]}\, \hat{x}_k^{[j]}

(36.51)  P_k = \sum_{j} w_{k}^{[j]} \left[ P_k^{[j]} + \left(\hat{x}_k^{[j]} - \hat{x}_k\right)\left(\hat{x}_k^{[j]} - \hat{x}_k\right)^{T} \right]
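The blending step might be implemented as follows; x_hats and Ps are illustrative names for the per‐model posterior estimates and covariances, and the outer‐product term implements the spread‐of‐means contribution in Eq. 36.51.

    import numpy as np

    def blend_estimates(weights, x_hats, Ps):
        """Blend per-model posteriors per Eqs. 36.50-36.51."""
        x_blend = sum(w * x for w, x in zip(weights, x_hats))
        P_blend = sum(w * (P + np.outer(x - x_blend, x - x_blend))
                      for w, x, P in zip(weights, x_hats, Ps))
        return x_blend, P_blend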

The MMAE filter can be visualized in block diagram form in Figure 36.2.
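Tying the pieces together, the following sketch outlines one MMAE cycle in the spirit of Figure 36.2. The bank object, its predict()/update() methods, and the x_pred, P_pred, x_post, and P_post attributes are hypothetical interface names for a bank of independently tuned Kalman filters; the helper functions are those sketched earlier in this section.

    def mmae_step(bank, weights, z):
        """One MMAE cycle over a bank of Kalman filters, one per candidate parameter vector a[j]."""
        likelihoods = []
        for kf in bank:
            kf.predict()                                   # time update of the j-th filter
            likelihoods.append(
                measurement_likelihood(z, kf.x_pred, kf.P_pred, kf.H, kf.R))  # Eq. 36.46
            kf.update(z)                                   # measurement update of the j-th filter
        weights = update_weights(weights, likelihoods)     # Eq. 36.47
        x_blend, P_blend = blend_estimates(weights,
                                           [kf.x_post for kf in bank],
                                           [kf.P_post for kf in bank])        # Eqs. 36.50-36.51
        return weights, x_blend, P_blend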

Additional forms that are conceptually very similar to the MMAE filter include the interacting multiple model (IMM) estimator [8] and the Rao‐Blackwellized particle filter (RB‐PF) [9, 10], to name a few.

In the next section, we present a simple example that illustrates a potential application of the Gaussian sum filters derived in this section.
