NN Study Notes
A list of the training algorithms that are available in the Neural Network Toolbox software and that use gradient- or Jacobian-based methods is shown in the following table.
Function      Algorithm
trainlm       Levenberg-Marquardt
trainbr       Bayesian Regularization
trainbfg      BFGS Quasi-Newton
trainrp       Resilient Backpropagation
trainscg      Scaled Conjugate Gradient
traincgb      Conjugate Gradient with Powell/Beale Restarts
traincgf      Fletcher-Powell Conjugate Gradient
traincgp      Polak-Ribière Conjugate Gradient
trainoss      One Step Secant
traingdx      Variable Learning Rate Gradient Descent
traingdm      Gradient Descent with Momentum
traingd       Gradient Descent
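As a minimal sketch of how one of these training functions is selected (assuming the toolbox's fitnet network constructor and the built-in simplefit_dataset demo data; any function from the table can be substituted):

[x, t] = simplefit_dataset;    % sample inputs x and targets t shipped with the toolbox
net = fitnet(10);              % function-fitting network with 10 hidden neurons
net.trainFcn = 'trainlm';      % pick a training algorithm from the table above
[net, tr] = train(net, x, t);  % tr is the training record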
Note that:
During training, progress is constantly updated in the training window. Of most interest are the performance, the magnitude of the performance gradient, and the number of validation checks. The magnitude of the gradient and the number of validation checks are used to terminate the training. The gradient becomes very small as the training approaches a minimum of the performance.
If the magnitude of the gradient is less than 1e-5, the training will stop. This limit can be adjusted by setting the parameter net.trainParam.min_grad.
The number of validation checks represents the number of successive iterations for which the validation performance fails to decrease. If this number reaches 6 (the default value), the training will stop. In this run, the training did stop because of the number of validation checks. You can change this criterion by setting the parameter net.trainParam.max_fail. (Note that your results may differ from run to run, because of the random initialization of the weights and biases.)
There are other criteria that can be used to stop network training, such as reaching the maximum number of epochs (net.trainParam.epochs), meeting the performance goal (net.trainParam.goal), or exceeding the maximum training time (net.trainParam.time).
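A hedged sketch of how these stopping criteria can be adjusted before calling train; the field names are standard trainParam parameters of the gradient-based training functions, but the values here are arbitrary examples, not defaults:

net.trainParam.min_grad = 1e-6;  % minimum performance gradient
net.trainParam.max_fail = 10;    % maximum validation failures (default 6)
net.trainParam.epochs   = 500;   % maximum number of epochs
net.trainParam.goal     = 1e-4;  % performance goal
net.trainParam.time     = 60;    % maximum training time in seconds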
From the training window, you can access four plots: performance, training state, error histogram, and regression.
- The performance plot shows the value of the performance function versus the iteration number. It plots training, validation, and test performances.
- The training state plot shows the progress of other training variables, such as the gradient magnitude and the number of validation checks.
- The error histogram plot shows the distribution of the network errors.
- The regression plot shows a regression between network outputs and network targets.
You can use the histogram and regression plots to validate network performance.
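The same four plots can also be produced from the command line after training; a sketch assuming the net, tr, x, and t variables from the earlier snippet:

y = net(x);            % network outputs
e = gsubtract(t, y);   % network errors (targets minus outputs)
plotperform(tr)        % performance plot
plottrainstate(tr)     % training state plot
ploterrhist(e)         % error histogram plot
plotregression(t, y)   % regression plot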
[1] http://cn.mathworks.com/help/nnet/ug/train-and-apply-multilayer-neural-networks.html