Machine Learning for Tomographic Imaging, by Professor Ge Wang
2.3.2.1 Methodology
Recall the objective function for statistical reconstruction, equation (2.24), in which the regularization term ψ(x) represents prior information on the reconstructed image. Utilizing a sparsity constraint in terms of a learned dictionary as the regularizer, the objective function can be rewritten as

\[
\min_{\mathbf{x},\boldsymbol{\alpha}} \sum_{i=1}^{I} \frac{w_i}{2}\bigl([A\mathbf{x}]_i - b_i\bigr)^2 + \lambda\Bigl(\sum_s \bigl\| E_s\mathbf{x} - D\boldsymbol{\alpha}_s \bigr\|_2^2 + \sum_s \nu_s \|\boldsymbol{\alpha}_s\|_0\Bigr),
\tag{2.25}
\]
where \(E_s = (e_{nj}^s) \in \mathbb{R}^{N \times J}\) is the operator that extracts the sth image patch from the image, and \(b_i = \ln(I_0/I_i)\) denotes a line integral. It is worth mentioning that Xu et al (2012) proposed two strategies for dictionary learning: a global dictionary learned before statistical iterative reconstruction (GDSIR) and an adaptive dictionary learned during statistical iterative reconstruction (ADSIR). The former uses a predetermined dictionary to sparsely represent an image, while the latter learns a dictionary from the intermediate images obtained during the iterative reconstruction process and uses it only to sparse-code each intermediate image. In this subsection, let us look at the update processes for GDSIR and ADSIR, respectively.
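To make these two ingredients concrete, here is a minimal sketch of the patch-extraction operator \(E_s\) and the line-integral conversion \(b_i = \ln(I_0/I_i)\). The function names and the row-major flattening convention are our own assumptions, not from the text:

```python
import numpy as np

def line_integrals(I0, I):
    """Convert raw detector readings I_i into line integrals b_i = ln(I_0 / I_i)."""
    return np.log(I0 / np.asarray(I, dtype=float))

def extract_patch(x, top, left, size):
    """Apply E_s: cut a size-by-size patch out of image x and flatten it
    to a vector (row-major), i.e. compute E_s x for one patch position s."""
    return x[top:top + size, left:left + size].reshape(-1)
```

In practice \(E_s\) is never formed as an explicit \(N \times J\) matrix; slicing like this realizes its action implicitly.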
For GDSIR, a redundant dictionary was trained on a chest region of a baseline image, as shown in figure 2.5. The image reconstruction process is then equivalent to solving the following optimization problem, which contains the two variables x and α while the dictionary D is kept fixed:

\[
\min_{\mathbf{x},\boldsymbol{\alpha}} \sum_{i=1}^{I} \frac{w_i}{2}\bigl([A\mathbf{x}]_i - b_i\bigr)^2 + \lambda\Bigl(\sum_s \bigl\| E_s\mathbf{x} - D\boldsymbol{\alpha}_s \bigr\|_2^2 + \sum_s \nu_s \|\boldsymbol{\alpha}_s\|_0\Bigr).
\tag{2.26}
\]
Minimization is performed with respect to x and α alternately. First, the sparse code α̃ is fixed and the current image is updated. At this point, the objective function becomes
\[
\min_{\mathbf{x}} \sum_{i=1}^{I} \frac{w_i}{2}\bigl([A\mathbf{x}]_i - b_i\bigr)^2 + \lambda \sum_s \bigl\| E_s\mathbf{x} - D\tilde{\boldsymbol{\alpha}}_s \bigr\|_2^2.
\tag{2.27}
\]
Figure 2.5. Global dictionary learned using the online dictionary learning method (Mairal et al 2009). Reprinted with permission from Xu et al (2012). Copyright 2012 IEEE.
Using the separable paraboloidal surrogates method (Elbakri and Fessler 2002), the objective function is minimized iteratively:
\[
x_j^t = x_j^{t-1} - \frac{\displaystyle \sum_{i=1}^{I} a_{ij} w_i \bigl([A\mathbf{x}^{t-1}]_i - b_i\bigr) + 2\lambda \sum_s \sum_{n=1}^{N} e_{nj}^s \bigl([E_s\mathbf{x}^{t-1}]_n - [D\tilde{\boldsymbol{\alpha}}_s]_n\bigr)}{\displaystyle \sum_{i=1}^{I} a_{ij} w_i \sum_{k=1}^{J} a_{ik} + 2\lambda \sum_s \sum_{n=1}^{N} e_{nj}^s \sum_{k=1}^{J} e_{nk}^s}, \quad j = 1, 2, \ldots, J,
\tag{2.28}
\]
where t=1,2,…,T indexes the iterations.
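As a sketch of how the update (2.28) can be computed with array operations, the following toy implementation treats each \(E_s\) as an explicit binary matrix. In a real reconstructor the extractions would be implicit and the system matrix sparse; all names here are illustrative assumptions:

```python
import numpy as np

def sps_update(x, A, w, b, E_list, D, alpha_list, lam):
    """One step of equation (2.28): x_j^t = x_j^{t-1} - numerator_j / denominator_j.
    A: system matrix (I x J), w: statistical weights, b: line integrals,
    E_list: binary patch-extraction matrices E_s, D: dictionary,
    alpha_list: fixed sparse codes alpha~_s, lam: regularization weight."""
    r = A @ x - b                                   # [A x]_i - b_i
    num = A.T @ (w * r)                             # sum_i a_ij w_i ([Ax]_i - b_i)
    den = (A.T * (w * A.sum(axis=1))).sum(axis=1)   # sum_i a_ij w_i sum_k a_ik
    for E, a in zip(E_list, alpha_list):
        res = E @ x - D @ a                         # [E_s x]_n - [D alpha~_s]_n
        num += 2.0 * lam * (E.T @ res)
        den += 2.0 * lam * (E.T @ E.sum(axis=1))
    return x - num / den
```

Because both the data term and the patch penalty contribute positive entries to the denominator, the step is well defined whenever every pixel is seen by at least one ray or one patch.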
After an intermediate image \( \mathbf{x}^t \) is obtained, it is re-coded for a sparse representation. The objective function is changed to

\[
\min_{\boldsymbol{\alpha}} \sum_s \bigl\| E_s\mathbf{x}^t - D\boldsymbol{\alpha}_s \bigr\|_2^2 + \sum_s \nu_s \|\boldsymbol{\alpha}_s\|_0.
\tag{2.29}
\]
This equation represents a sparse coding problem, which can be solved using the OMP method described in subsection 2.2.1.
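A bare-bones version of that greedy pursuit can look as follows. This is a simplified sketch assuming unit-norm dictionary atoms, not the exact routine of subsection 2.2.1:

```python
import numpy as np

def omp(D, y, sparsity):
    """Orthogonal matching pursuit: greedily pick the atom most correlated
    with the residual, then least-squares refit on the selected support."""
    residual = y.astype(float)
    support = []
    coef = np.zeros(D.shape[1])
    for _ in range(sparsity):
        j = int(np.argmax(np.abs(D.T @ residual)))   # best-matching atom
        if j not in support:
            support.append(j)
        sol, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        coef[:] = 0.0
        coef[support] = sol                          # refit on the support
        residual = y - D @ coef
    return coef
```

The refit over the whole support (rather than keeping earlier coefficients) is what makes the pursuit "orthogonal".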
For ADSIR, the image reconstruction process is equivalent to solving the following optimization problem:

\[
\min_{\mathbf{x}, D, \boldsymbol{\alpha}} \sum_{i=1}^{I} \frac{w_i}{2}\bigl([A\mathbf{x}]_i - b_i\bigr)^2 + \lambda\Bigl(\sum_s \bigl\| E_s\mathbf{x} - D\boldsymbol{\alpha}_s \bigr\|_2^2 + \sum_s \nu_s \|\boldsymbol{\alpha}_s\|_0\Bigr).
\tag{2.30}
\]
In ADSIR, the dictionary D is treated as a variable alongside α and x. The objective function is minimized in an alternating fashion: one variable is updated at a time while the others are kept fixed, gradually converging to a satisfactory sparse representation α, a final image x, and the associated dictionary D. As mentioned above, dictionary-learning-based reconstruction consists of two steps: sparse coding and image updating. Note that the update of an intermediate image x with a fixed sparse representation α̃ and the current dictionary D̃ is exactly the same as for GDSIR, expressed in equation (2.28). In the sparse coding step, D and α are estimated with a fixed intermediate image \( \mathbf{x}^t \), and the objective function is thus

\[
\min_{D, \boldsymbol{\alpha}} \sum_s \bigl\| E_s\mathbf{x}^t - D\boldsymbol{\alpha}_s \bigr\|_2^2 + \sum_s \nu_s \|\boldsymbol{\alpha}_s\|_0.
\tag{2.31}
\]
This equation is the generic dictionary learning and sparse representation problem, which can be solved with respect to α and D in an alternating manner using the K-SVD algorithm described in subsection 2.2.2.
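The heart of K-SVD is the atom-by-atom rank-1 update. A compact sketch follows; it is our own simplified rendering of one dictionary-update sweep, not the full algorithm of subsection 2.2.2, which also re-runs sparse coding between sweeps:

```python
import numpy as np

def ksvd_dictionary_update(D, Y, X):
    """One K-SVD sweep: for each atom d_k, strip its contribution from the
    residual of the patches that use it, then refit d_k and those codes by
    the leading singular pair of that restricted residual."""
    D, X = D.copy(), X.copy()
    for k in range(D.shape[1]):
        users = np.nonzero(X[k])[0]           # patches whose code uses atom k
        if users.size == 0:
            continue                          # unused atom: leave it alone
        E = Y[:, users] - D @ X[:, users] + np.outer(D[:, k], X[k, users])
        U, S, Vt = np.linalg.svd(E, full_matrices=False)
        D[:, k] = U[:, 0]                     # new unit-norm atom
        X[k, users] = S[0] * Vt[0]            # matching coefficients
    return D, X
```

Updating only the columns that actually use atom k preserves the sparsity pattern of X, which is why each sweep cannot increase the fitting error.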
The GDSIR and ADSIR workflows are summarized in algorithms 2.2 and 2.3, respectively. To improve the convergence speed, a fast online algorithm is used to train the dictionary from patches extracted from an intermediate image.
Algorithm 2.2. Workflow for the GDSIR algorithm. Reprinted with permission from Xu et al (2012). Copyright 2012 IEEE.
Global dictionary learning
1: Choose parameters for dictionary learning.
2: Extract patches to form a training set.
3: Construct a global dictionary.
Image reconstruction
4: Initialize x0, α0, and t = 0.
5: Set the parameters λ, ε, and L0s.
6: while a stopping criterion is not satisfied do
7: Update xt−1 to xt using equation (2.28).
8: Represent xt with a sparse code αt using OMP.
9: end while
10: Output the final reconstruction.
Algorithm 2.3. Workflow for the ADSIR algorithm. Reprinted with permission from Xu et al (2012). Copyright 2012 IEEE.
1: Choose λ, ε, L0s, and other parameters.
2: Initialize x0, D0, α0, and t = 0.
3: while a stopping criterion is not satisfied do
4: Update xt−1 to xt using equation (2.28).
5: Extract patches from xt to form the training set.
6: Construct a dictionary Dt from the training set.
7: Represent xt with a sparse code αt using OMP.
8: end while
9: Output the final reconstruction.
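To show how the pieces of algorithms 2.2 and 2.3 interlock, here is a toy end-to-end loop. It deliberately simplifies: a plain gradient step stands in for the update of equation (2.28), hard thresholding stands in for OMP, the whole image is treated as a single "patch", and the adaptive branch merely resets a trivial dictionary. Every name and parameter is an illustrative assumption:

```python
import numpy as np

def reconstruct(A, w, b, D, lam=0.1, step=0.1, sparsity=1, iters=200, adaptive=False):
    """Alternate an image update with a sparse-coding step, mirroring the
    control flow of algorithm 2.2 (adaptive=False) and 2.3 (adaptive=True)."""
    x = np.zeros(A.shape[1])
    alpha = np.zeros(D.shape[1])
    for _ in range(iters):
        # image update: gradient of the weighted data term plus the patch penalty
        grad = A.T @ (w * (A @ x - b)) + 2.0 * lam * (x - D @ alpha)
        x = x - step * grad
        if adaptive:
            D = np.eye(len(x))  # ADSIR flavour: (trivially) re-learn the dictionary
        # sparse coding: keep only the `sparsity` largest coefficients
        c = D.T @ x
        alpha = np.zeros_like(c)
        keep = np.argsort(np.abs(c))[-sparsity:]
        alpha[keep] = c[keep]
    return x
```

The point of the sketch is the alternation itself: each pass tightens the image against the data while the sparse code keeps pulling patches toward the dictionary's span.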