CNN Basics (2)


Neural Network Architectures

[Figure: a mathematical model of a neuron]
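
The neuron in the figure computes a weighted sum of its inputs plus a bias, then passes the result through an activation function. A minimal NumPy sketch of that forward pass (the function and variable names are illustrative, not from the original):

```python
import numpy as np

def neuron_forward(x, w, b, activation=np.tanh):
    # Single neuron: weighted sum of inputs plus bias, passed through a nonlinearity f(w.x + b).
    return activation(np.dot(w, x) + b)

# Example: a neuron with three inputs.
x = np.array([0.5, -1.0, 2.0])   # inputs
w = np.array([0.1, 0.4, -0.2])   # weights (synaptic strengths)
b = 0.3                          # bias
print(neuron_forward(x, w, b))   # tanh(w.x + b) = tanh(-0.45) ≈ -0.422
```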


Activation Functions

  • Sigmoid

    σ(x) = 1/(1 + e^(-x))

    [Figure: plot of the sigmoid function]

    • Sigmoids saturate and kill gradients
    • Sigmoid outputs are not zero-centered
  • tanh

    tanh(x) = 2σ(2x) - 1

    [Figure: plot of the tanh function]

    • In practice the tanh non-linearity is always preferred to the sigmoid non-linearity
  • ReLU (Rectified Linear Unit)

    f(x)=max(0,x)

    [Figures: plots for the ReLU activation]

    • (+) It was found to greatly accelerate the convergence of stochastic gradient descent compared to the sigmoid/tanh functions
    • (+) ReLU can be implemented by simply thresholding a matrix of activations at zero
    • (-) Unfortunately, ReLU units can be fragile during training and can “die”
  • Leaky ReLU

    f(x) = max(αx, x), where α is a small constant (e.g. 0.01); the small negative slope is one attempt to fix the “dying ReLU” problem
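
As a quick reference for the list above, here is a minimal NumPy sketch of the four activations (the function names and the default α = 0.01 are illustrative choices, not from the original):

```python
import numpy as np

def sigmoid(x):
    # σ(x) = 1 / (1 + e^(-x)); saturates for large |x| and its outputs are not zero-centered.
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # tanh(x) = 2σ(2x) - 1; zero-centered, but still saturates.
    return 2.0 * sigmoid(2.0 * x) - 1.0

def relu(x):
    # f(x) = max(0, x): simple thresholding of the activations at zero.
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # f(x) = max(αx, x): a small slope α for x < 0 keeps some gradient flowing.
    return np.maximum(alpha * x, x)

x = np.linspace(-3.0, 3.0, 7)
for f in (sigmoid, tanh, relu, leaky_relu):
    print(f.__name__, np.round(f(x), 3))
```

The “saturate and kill gradients” point can be read off the sigmoid’s derivative, σ'(x) = σ(x)(1 - σ(x)), which is at most 0.25 and approaches 0 as |x| grows, so gradients backpropagated through saturated sigmoid units become very small.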