NN Learning Notes


The training algorithms in the Neural Network Toolbox software that use gradient- or Jacobian-based methods are listed in the following table; a short usage sketch follows the table.


Function     Algorithm
trainlm      Levenberg-Marquardt
trainbr      Bayesian Regularization
trainbfg     BFGS Quasi-Newton
trainrp      Resilient Backpropagation
trainscg     Scaled Conjugate Gradient
traincgb     Conjugate Gradient with Powell/Beale Restarts
traincgf     Fletcher-Powell Conjugate Gradient
traincgp     Polak-Ribiére Conjugate Gradient
trainoss     One Step Secant
traingdx     Variable Learning Rate Gradient Descent
traingdm     Gradient Descent with Momentum
traingd      Gradient Descent
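The algorithm is selected by its function name. A minimal sketch, assuming MATLAB with the Neural Network Toolbox installed and using synthetic data (the data, layer size, and variable names here are illustrative, not from the original note):

    % Create and train a small fitting network with a chosen training algorithm.
    x = linspace(-1, 1, 200);                  % synthetic inputs
    t = sin(2*pi*x) + 0.05*randn(size(x));     % noisy synthetic targets

    net = fitnet(10, 'trainlm');               % 10 hidden neurons, Levenberg-Marquardt
    net.trainFcn = 'trainscg';                 % or switch to scaled conjugate gradient afterwards

    [net, tr] = train(net, x, t);              % tr is the training record
    y    = net(x);                             % network outputs
    perf = perform(net, t, y);                 % performance value (mean squared error by default)

Any of the function names in the table above can be assigned to net.trainFcn in the same way.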



Note that:


During training, the progress is constantly updated in the training window. Of most interest are the performance, the magnitude of the performance gradient, and the number of validation checks. The gradient magnitude and the number of validation checks are used to terminate training. The gradient becomes very small as training approaches a minimum of the performance.


If the magnitude of the gradient is less than 1e-5, training stops. This limit can be adjusted by setting the parameter net.trainParam.min_grad.
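As a small sketch (continuing the training setup above), the threshold is changed before calling train:

    net.trainParam.min_grad = 1e-6;    % stop only when the gradient magnitude drops below 1e-6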


The number of validation checks represents the number of successive iterations in which the validation performance fails to decrease. If this number reaches 6 (the default value), training stops. You can change this criterion by setting the parameter net.trainParam.max_fail. (Note that results may vary between runs because of the random initialization of the weights and biases.)
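Similarly, a sketch of relaxing the validation-stop criterion:

    net.trainParam.max_fail = 10;      % allow 10 successive validation increases before stopping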


There are other criteria that can be used to stop network training. They are listed in the following table.

Parameter    Stopping criterion
min_grad     Minimum gradient magnitude
max_fail     Maximum number of validation increases
time         Maximum training time
goal         Minimum performance value
epochs       Maximum number of training epochs (iterations)
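A hedged sketch of setting these criteria before training; the tr.stop field of the training record, assumed here to report why training ended, can then be inspected:

    net.trainParam.min_grad = 1e-5;    % minimum gradient magnitude
    net.trainParam.max_fail = 6;       % maximum number of validation increases
    net.trainParam.time     = 60;      % maximum training time, in seconds
    net.trainParam.goal     = 1e-4;    % minimum (target) performance value
    net.trainParam.epochs   = 1000;    % maximum number of training epochs

    [net, tr] = train(net, x, t);
    disp(tr.stop)                      % which stopping criterion was triggered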

From the training window, you can access four plots: performance, training state, error histogram, and regression. 

The performance plot shows the value of the performance function versus the iteration number. It plots training, validation, and test performances. The training state plot shows the progress of other training variables, such as the gradient magnitude, the number of validation checks, etc.

The error histogram plot shows the distribution of the network errors.

The regression plot shows a regression between network outputs and network targets.


You can use the histogram and regression plots to validate network performance; the corresponding plotting commands are sketched below.
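The same four plots can also be generated from the command line with the toolbox plotting functions; a minimal sketch using the trained network and training record from the earlier snippets:

    y = net(x);                        % network outputs
    e = t - y;                         % network errors

    plotperform(tr);                   % performance vs. epoch for training/validation/test
    plottrainstate(tr);                % gradient magnitude, validation checks, etc.
    ploterrhist(e);                    % error histogram
    plotregression(t, y);              % regression of outputs against targets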



[1] http://cn.mathworks.com/help/nnet/ug/train-and-apply-multilayer-neural-networks.html
