Machine Learning for Tomographic Imaging
Ge Wang, Yi Zhang, Xiaojing Ye and Xuanqin Mou
Contents
Foreword
Preface
Acknowledgments
Author biographies
Introduction
0.1 Artificial intelligence/machine learning/deep learning
0.2 Image analysis versus image reconstruction
0.3 Analytic/iterative/deep learning algorithms for tomographic reconstruction
0.4 The field of deep reconstruction and the need for this book
0.5 The organization of this book
0.6 More to learn and what to expect next
References
Chapter 1 Background knowledge
1.1 Imaging principles and a priori information
1.1.1 Overview
1.1.2 Radon transform and non-ideality in data acquisition
1.1.3 Bayesian reconstruction
1.1.4 The human vision system
1.1.5 Data decorrelation and whitening
1.1.6 Sparse coding
References
Chapter 2 Tomographic reconstruction based on a learned dictionary
2.1 Prior information guided reconstruction
2.2 Single-layer neural network
2.2.1 Matching pursuit algorithm
2.2.2 The K-SVD algorithm
2.3 CT reconstruction via dictionary learning
2.3.1 Statistical iterative reconstruction (SIR) framework
2.3.2 Dictionary-based low-dose CT reconstruction
2.3.2.1 Methodology
2.3.2.2 Experimental results
2.4 Final remarks
References
Chapter 3 Artificial neural networks
3.1 Basic concepts
3.1.1 Biological neural network
3.1.2 Neuron models
3.1.3 Activation function
Sigmoid
Tanh
ReLU
Leaky ReLU
ELU
3.1.4 Discrete convolution and weights
A special convolution: 1 × 1 convolution
Transposed convolution
3.1.5 Pooling strategy
3.1.6 Loss function
Mean squared error/L2
Mean absolute error/L1
Mean absolute percentage error
Cross entropy
Poisson
Cosine proximity
3.1.7 Backpropagation algorithm
3.1.8 Convolutional neural network
History
Architecture
Distinguishing features
A CNN example: LeNet-5
3.2 Training, validation, and testing of an artificial neural network
3.2.1 Training, validation, and testing datasets
3.2.2 Training, validation, and testing processes