[Deep Learning Paper Notes][Weight Initialization] Delving deep into rectifiers: Surpassing human-level performance
He, Kaiming, et al. “Delving deep into rectifiers: Surpassing human-level performance on imagenet classification.” Proceedings of the IEEE International Conference on Computer Vision. 2015. [Citations: 477].
1 PReLU
[PReLU] f(y_i) = max(0, y_i) + α_i · min(0, y_i), i.e., f(y_i) = y_i if y_i > 0, and α_i y_i otherwise.
• α is a learnable parameter.
• If α is a fixed small number, PReLU becomes Leaky ReLU (LReLU), but LReLU has negligible impact on accuracy compared with ReLU.
• We allow α to vary across channels, i.e., one learnable α_d per channel (channel-wise PReLU).
[Backprop] ∂ε/∂α_i = Σ_{y_i} (∂ε/∂f(y_i)) · ∂f(y_i)/∂α_i, where ∂f(y_i)/∂α_i = 0 if y_i > 0, and y_i otherwise.
[Optimization] Do not use weight decay (l_2 regularization) for α_d .
• Weight decay tends to push α_d to zero, thus biasing PReLU towards ReLU.
• We use α_d = 0.25 as the initialization.
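A minimal numpy sketch of channel-wise PReLU and its gradients, matching the forward and backprop rules above (function and variable names are ours, not the paper's):

```python
import numpy as np

def prelu_forward(y, alpha):
    # f(y_i) = max(0, y_i) + alpha_i * min(0, y_i); alpha is one slope per channel
    return np.maximum(y, 0.0) + alpha * np.minimum(y, 0.0)

def prelu_backward(y, alpha, grad_out):
    # dL/dy_i     = grad_out            if y_i > 0, else alpha_i * grad_out
    # dL/dalpha_i = sum over the batch of y_i * grad_out, taken where y_i <= 0
    grad_y = np.where(y > 0, grad_out, alpha * grad_out)
    grad_alpha = np.where(y > 0, 0.0, y * grad_out).sum(axis=0)
    return grad_y, grad_alpha

alpha = np.full(2, 0.25)                 # initialized to 0.25 as in the paper
y = np.array([[1.0, -2.0],
              [3.0, -4.0]])
out = prelu_forward(y, alpha)            # [[1.0, -0.5], [3.0, -1.0]]
gy, ga = prelu_backward(y, alpha, np.ones_like(y))
# gy = [[1.0, 0.25], [1.0, 0.25]]; ga = [0.0, -6.0]
```

Note that the gradient w.r.t. α_d only collects contributions from the negative side, which is why weight decay on α_d (discouraged above) would steadily shrink it toward ReLU.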
[Experiment] The learned coefficients of conv1 (0.681 and 0.596) are significantly greater than 0.
• Filters of conv1 are mostly Gabor-like filters such as edge or texture detectors.
• The learned results show that both positive and negative responses of the filters are respected.
The deeper conv layers in general have smaller coefficients.
• Activations gradually become “more nonlinear” at increasing depths.
• I.e., the learned model tends to keep more information in earlier stages and becomes more discriminative in deeper stages.
2 Weight Initialization
[Forward Case] Consider the ReLU activation function. For a conv/fc layer, y_l = W_l x_l + b_l with x_l = max(0, y_{l−1}) and n_l connections per response. With zero-mean i.i.d. weights, Var[y_l] = n_l Var[w_l] E[x_l²].
Note if x_l had zero mean, then E[x_l²] = Var[x_l]; but ReLU outputs do not have zero mean. Instead we assume y_{l−1} has zero mean and a symmetric distribution, so E[x_l²] = (1/2) Var[y_{l−1}], giving Var[y_l] = (1/2) n_l Var[w_l] Var[y_{l−1}].
We want the variance preserved across all layers, i.e., (1/2) n_l Var[w_l] = 1 for all l,
then w_l is drawn from a zero-mean Gaussian with std sqrt(2 / n_l).
[Backward Case] Propagating gradients backward, Δx_l = Ŵ_l Δy_l, where n̂_l = k_l² d_l is the fan-out; by the same argument, Var[Δx_l] = (1/2) n̂_l Var[w_l] Var[Δx_{l+1}].
We want (1/2) n̂_l Var[w_l] = 1 for all l,
then w_l is drawn from a zero-mean Gaussian with std sqrt(2 / n̂_l). Satisfying either the forward or the backward condition alone is sufficient in practice.
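The derivation can be checked empirically. The sketch below (our own illustration, with arbitrary layer widths) draws weights with std sqrt(2/n) using fan-in and verifies that the pre-activation variance neither explodes nor vanishes through a deep ReLU stack:

```python
import numpy as np

def he_std(fan):
    # std = sqrt(2 / n); use fan-in n_l (forward case) or fan-out n_hat_l (backward case)
    return np.sqrt(2.0 / fan)

rng = np.random.default_rng(0)
x = rng.standard_normal((2000, 256))     # zero-mean inputs, unit variance
for _ in range(20):
    W = rng.standard_normal((256, 256)) * he_std(256)  # zero-mean Gaussian weights
    y = x @ W                            # pre-activation; Var[y] stays ~2 at every depth
    x = np.maximum(y, 0.0)               # ReLU halves the second moment: E[x^2] = Var[y] / 2
```

With a plain std-0.01 Gaussian instead of he_std, the same 20-layer loop drives Var[y] toward zero, which is exactly the failure mode this initialization removes.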
[Issue] When the input signal is not normalized (e.g., raw pixel values in [−128, 128]):
• Since the variance of the input signal is roughly preserved from the first layer to the last, the magnitude of the final responses stays on the order of the input's.
• That magnitude can be so large that the softmax operator overflows.
[Solution] Normalize the input signal, but this may impact other hyper-parameters. Alternatively, include a small scaling factor on the weights of all or some layers, e.g., use a std of 0.01 for the first two fc layers and 0.001 for the last.
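To make the overflow concrete: in float32 (the usual training dtype), exp saturates above roughly 88, so logits on the order of an unnormalized input's magnitude break a naive softmax. The max-subtraction shown below is the standard numerical fix, not the paper's remedy (the paper instead normalizes the input or shrinks the weights):

```python
import numpy as np

logits = np.array([128.0, 0.0], dtype=np.float32)  # magnitudes like raw pixel inputs
with np.errstate(over="ignore"):
    naive = np.exp(logits)               # exp(128) overflows float32 -> inf
stable = np.exp(logits - logits.max())   # shift by the max: all exponents are <= 0
probs = stable / stable.sum()            # a valid distribution, sums to 1
```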