1.4.1.1 Training of a Neural Network
There are two techniques that adjust the weights and give an ANN its distinctive learning ability: forward propagation and back propagation. In forward propagation, the inputs at every node are multiplied by the sample weights, and the resulting outputs are recorded at the output layer. In back propagation, as the name suggests, the computation proceeds in reverse, from the output layer back to the input layer. In this process, the error at every layer is computed and the weights are adjusted so as to minimize that error. The error value is examined after every pass, and this guides how the weights at the nodes are changed. At every hidden node, functions called activation functions are also applied. Some of them are as follows. Please see Figure 1.2.
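To make the two passes concrete, the following is a minimal sketch of forward and back propagation for a tiny fully connected network. It assumes one hidden layer, a sigmoid activation, a squared-error loss, and toy XOR-style data; the layer sizes, learning rate, and data are illustrative choices, not taken from the chapter.

# Minimal sketch of forward and back propagation (illustrative,
# not the chapter's reference implementation).
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy data: 4 samples with 2 features each, XOR-style targets.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Randomly initialised weights: input->hidden and hidden->output.
W1 = rng.normal(scale=0.5, size=(2, 3))
W2 = rng.normal(scale=0.5, size=(3, 1))
lr = 0.5  # learning rate (illustrative value)

for epoch in range(5000):
    # Forward propagation: inputs are multiplied by the weights at
    # each layer and passed through the activation function.
    h = sigmoid(X @ W1)    # hidden-layer outputs
    out = sigmoid(h @ W2)  # outputs recorded at the output layer

    # Error margin at the output layer (squared-error gradient).
    err = out - y

    # Back propagation: the error is pushed from the output layer
    # back toward the input layer, giving a gradient per weight.
    d_out = err * out * (1 - out)        # sigmoid derivative
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Adjust the weights so the error shrinks on the next pass.
    W2 -= lr * (h.T @ d_out)
    W1 -= lr * (X.T @ d_h)

print("final outputs:", out.ravel().round(2))

After enough passes the printed outputs approach the targets 0, 1, 1, 0, showing how repeatedly examining the error and updating the weights drives the network toward a minimum-error solution.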