Neural Networks and Deep Learning Study Notes (1)


1. Why, after writing $w \cdot x \equiv \sum_j w_j x_j$, do $w$ and $x$ become vectors?

The first change is to write $\sum_j w_j x_j$ as a dot product, $w \cdot x \equiv \sum_j w_j x_j$, where $w$ and $x$ are vectors whose components are the weights and inputs, respectively.

The dot product of two vectors is a scalar. For two vectors of the same dimension, the dot product is exactly the sum of the products of their corresponding components, so the rewrite above can be expressed as

$(w_1, w_2, w_3, \ldots, w_j) \cdot (x_1, x_2, x_3, \ldots, x_j) = w_1 x_1 + w_2 x_2 + w_3 x_3 + \cdots + w_j x_j = \sum_j w_j x_j$
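A minimal sketch of this equivalence (my own illustration, not from the book; the 3-dimensional values are arbitrary):

```python
import numpy as np

w = np.array([0.5, -1.0, 2.0])  # weights
x = np.array([1.0, 3.0, 0.5])   # inputs

# Sum form: w1*x1 + w2*x2 + w3*x3
sum_form = sum(w_j * x_j for w_j, x_j in zip(w, x))

# Dot-product form: w . x, a single scalar
dot_form = np.dot(w, x)

print(sum_form, dot_form)  # both print -1.5
```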

2. Implementing NAND, AND, and OR gates with a perceptron.

we have a perceptron with two inputs, each with weight −2, and an overall bias of 3. Then we see that input 00 produces output 1, since (−2)∗0+(−2)∗0+3=3 is positive. Here, I've introduced the ∗ symbol to make the multiplications explicit. Similar calculations show that the inputs 01 and 10 produce output 1. But the input 11 produces output 0, since (−2)∗1+(−2)∗1+3=−1 is negative. And so our perceptron implements a NAND gate!

The above is the NAND gate implemented in the book. By analogy, an AND gate can be built by setting the bias to −3 and the weights of x1 and x2 to 2. Then input 00 gives 2×0+2×0+(−3)=−3, negative. Input 01 gives 2×0+2×1+(−3)=−1, negative. Input 11 gives 2×1+2×1+(−3)=1, positive. This implements an AND gate.

For an OR gate, set the bias to −1 and both weights to 2. Then input 00 gives a negative sum (output 0), input 01 a positive sum (output 1), and input 11 a positive sum (output 1).
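All three gates can be checked with a short sketch (my own illustration; the function name `perceptron` is an assumption, but the weights and biases are exactly the values worked out above):

```python
def perceptron(x1, x2, w1, w2, b):
    """Output 1 if w1*x1 + w2*x2 + b > 0, otherwise 0."""
    return 1 if w1 * x1 + w2 * x2 + b > 0 else 0

# NAND uses the book's values; AND and OR use the values derived above.
for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x1, x2,
          "NAND:", perceptron(x1, x2, -2, -2, 3),
          "AND:",  perceptron(x1, x2, 2, 2, -3),
          "OR:",   perceptron(x1, x2, 2, 2, -1))
```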

3. What is exp?

exp is the exponential function with base e, i.e. $\exp(x) = e^x$. Its defining feature is that the derivative of $e^x$ is still $e^x$.
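A quick numerical check of this property (my own sketch, approximating the derivative with a central finite difference):

```python
import math

x = 1.5
h = 1e-6  # small step for the finite difference

# Numerical derivative of exp at x: (exp(x+h) - exp(x-h)) / (2h)
numeric = (math.exp(x + h) - math.exp(x - h)) / (2 * h)

print(numeric)      # ~4.481689...
print(math.exp(x))  # 4.481689..., the same: (e^x)' = e^x
```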

4. How is this partial-derivative formula obtained (unresolved)?

$\Delta \mathrm{output} \approx \sum_j \frac{\partial\, \mathrm{output}}{\partial w_j} \Delta w_j + \frac{\partial\, \mathrm{output}}{\partial b} \Delta b$
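A note of my own (not from the book): this appears to be the first-order Taylor expansion, i.e. the total differential, of output viewed as a function of all the weights and the bias. For any differentiable function $f(v_1, \ldots, v_n)$, small changes $\Delta v_i$ give

$\Delta f \approx \sum_i \frac{\partial f}{\partial v_i} \Delta v_i$

Taking the variables to be $w_1, \ldots, w_j$ and $b$, with $f = \mathrm{output}$, reproduces the formula above term by term.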
