Stanford Machine Learning, Week 4 (Neural Networks and Their Applications)
1. Why introduce neural networks (Neural Networks)
In one sentence: nonlinear hypotheses built from polynomial features become intractable when the number of features n is large. For example, with n = 100 original features, just the second-order terms (x1², x1x2, …) already number about n²/2 ≈ 5000, and the third-order terms grow as O(n³). Running logistic regression over all of these features is computationally very expensive, which motivates neural networks as a way to learn complex nonlinear hypotheses directly.
2. The Neural Network Model
Let's examine how we will represent a hypothesis function using neural networks. At a very simple level, neurons are basically computational units that take inputs (dendrites) as electrical signals (called "spikes") that are channeled to outputs (axons). In our model, our dendrites are like the input features x1, …, xn, and the output is the result of our hypothesis function. In this model our x0 input node is sometimes called the "bias unit"; it is always equal to 1. In neural networks, we use the same logistic function as in classification, g(z) = 1 / (1 + e^(−z)) with z = θᵀx, yet we sometimes call it a sigmoid (logistic) activation function. In this situation, our "theta" parameters are sometimes called "weights".
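As a minimal sketch of the single-neuron hypothesis above, the following computes hθ(x) = g(θᵀx) for one (hypothetical) weight vector and input, with x0 = 1 as the bias unit:

```python
import math

def sigmoid(z):
    """Sigmoid (logistic) activation: g(z) = 1 / (1 + e^(-z))."""
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical weights and input; theta[0] multiplies the bias unit x0 = 1.
theta = [-1.0, 2.0]
x = [1.0, 0.5]

# h_theta(x) = g(theta^T x) = g(-1*1 + 2*0.5) = g(0) = 0.5
h = sigmoid(sum(t * xi for t, xi in zip(theta, x)))
```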
(Figure: a model containing a single neuron; the yellow circle is the cell body.) A real neural network is built by combining several such neurons into layers, as in the second figure, where
a_i^(j) = "activation" of unit i in layer j,
Θ^(j) = matrix of weights controlling the function mapping from layer j to layer j+1.
3. Mathematical Definition of a Neural Network
For example, take a network with one hidden layer: three input units x1, x2, x3 (plus the bias x0), three hidden activation units a1^(2), a2^(2), a3^(2), and a single output hΘ(x).
The values for each of the "activation" nodes are obtained as follows:
a1^(2) = g(Θ10^(1) x0 + Θ11^(1) x1 + Θ12^(1) x2 + Θ13^(1) x3)
a2^(2) = g(Θ20^(1) x0 + Θ21^(1) x1 + Θ22^(1) x2 + Θ23^(1) x3)
a3^(2) = g(Θ30^(1) x0 + Θ31^(1) x1 + Θ32^(1) x2 + Θ33^(1) x3)
hΘ(x) = a1^(3) = g(Θ10^(2) a0^(2) + Θ11^(2) a1^(2) + Θ12^(2) a2^(2) + Θ13^(2) a3^(2))
This is saying that we compute our activation nodes by using a 3×4 matrix of parameters Θ^(1). We apply each row of the parameters to our inputs to obtain the value for one activation node. Our hypothesis output is the logistic function applied to the sum of the values of our activation nodes, which have been multiplied by yet another parameter matrix Θ^(2) containing the weights for our second layer of nodes.
Thus we can see that every activation unit takes the previous layer's values as its input, uses the corresponding row of the previous layer's weight matrix as its parameters, and passes the weighted sum through the sigmoid function g.
Each layer gets its own matrix of weights, Θ^(j). The dimensions of these weight matrices are determined as follows: if the network has s_j units in layer j and s_(j+1) units in layer j+1, then Θ^(j) will be of dimension s_(j+1) × (s_j + 1). For example, if layer 1 has 2 input units and layer 2 has 4 activation units, Θ^(1) is 4 × 3; the "+1" column comes from the bias unit x0.
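The dimension rule above is mechanical enough to express as a one-line helper (a hypothetical function name, just to make the formula concrete):

```python
def theta_shape(s_j, s_j_plus_1):
    """Dimension of Theta^(j) mapping layer j to layer j+1:
    s_(j+1) rows, and (s_j + 1) columns (the +1 is the bias unit)."""
    return (s_j_plus_1, s_j + 1)

# e.g. 2 units in layer 1, 4 units in layer 2 -> Theta^(1) is 4 x 3
shape = theta_shape(2, 4)
```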
4. Vectorized Implementation
First make the following definition: for layer j and node k, let
z_k^(j) = Θ_k0^(j−1) x0 + Θ_k1^(j−1) x1 + ⋯ + Θ_kn^(j−1) xn,
so that a_k^(j) = g(z_k^(j)). Setting x = a^(1), the layer-to-layer computation can be written in vector form:
(1) z^(j) = Θ^(j−1) a^(j−1)
(2) a^(j) = g(z^(j)), then add the bias unit a0^(j) = 1
To summarize: repeat (1) and (2) layer by layer, and the activation unit of the final layer is exactly the hypothesis, hΘ(x) = a^(L) = g(z^(L)). This procedure is known as forward propagation.
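The repeated steps (1) and (2) can be sketched in NumPy as follows; the weight values here are hypothetical, chosen only to exercise a 3-input, 3-hidden-unit, 1-output network:

```python
import numpy as np

def sigmoid(z):
    """Elementwise sigmoid g(z) = 1 / (1 + e^(-z))."""
    return 1.0 / (1.0 + np.exp(-z))

def forward_propagate(x, thetas):
    """Repeat steps (1) and (2): with a^(1) = x, compute for each layer
    z^(j) = Theta^(j-1) a^(j-1) and a^(j) = g(z^(j)),
    prepending the bias unit a0 = 1 before each matrix product."""
    a = np.asarray(x, dtype=float)
    for theta in thetas:
        a = np.insert(a, 0, 1.0)   # add bias unit a0^(j) = 1
        a = sigmoid(theta @ a)     # (1) z = Theta a, then (2) a = g(z)
    return a                       # activations of the output layer

# Hypothetical weights: Theta^(1) is 3 x 4, Theta^(2) is 1 x 4,
# matching the dimension rule s_(j+1) x (s_j + 1).
theta1 = np.array([[-1.0,  0.5,  0.5,  0.5],
                   [ 0.0,  1.0, -1.0,  0.0],
                   [ 2.0, -0.5,  0.5, -1.0]])
theta2 = np.array([[-0.5,  1.0,  1.0,  1.0]])

h = forward_propagate([1.0, 0.0, 1.0], [theta1, theta2])
```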
5. Examples and Intuitions
Implementing logic gates with a neural network
Example 1. The AND gate
Remember that x0 is our bias variable and is always 1.
Let's set our first theta matrix as:
Θ^(1) = [−30  20  20]
This will cause the output of our hypothesis to be positive only if both x1 and x2 are 1. In other words:
hΘ(x) = g(−30 + 20·x1 + 20·x2)
x1 = 0, x2 = 0 → g(−30) ≈ 0
x1 = 0, x2 = 1 → g(−10) ≈ 0
x1 = 1, x2 = 0 → g(−10) ≈ 0
x1 = 1, x2 = 1 → g(10) ≈ 1
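The AND truth table above can be checked directly with the single-neuron weights Θ = [−30, 20, 20]:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def and_gate(x1, x2):
    """AND gate as one sigmoid neuron with weights Theta = [-30, 20, 20]."""
    return sigmoid(-30.0 + 20.0 * x1 + 20.0 * x2)

# Rounding recovers the boolean truth table: only (1, 1) fires.
table = [round(and_gate(x1, x2)) for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]]
```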
Example 2. The XNOR gate
The XNOR function (true exactly when x1 and x2 are equal) cannot be computed by a single neuron; it needs a hidden layer combining AND, (NOT x1) AND (NOT x2), and OR:
Layer 2: a1^(2) = g(−30 + 20·x1 + 20·x2)   (x1 AND x2)
         a2^(2) = g(10 − 20·x1 − 20·x2)   ((NOT x1) AND (NOT x2))
Layer 3: hΘ(x) = g(−10 + 20·a1^(2) + 20·a2^(2))   (a1 OR a2)
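Composing the Layer 2 and Layer 3 weight vectors gives the full XNOR network, which can be verified against its truth table:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def xnor_gate(x1, x2):
    """Two-layer network for XNOR, composed from AND, NOR and OR neurons."""
    a1 = sigmoid(-30.0 + 20.0 * x1 + 20.0 * x2)    # x1 AND x2
    a2 = sigmoid( 10.0 - 20.0 * x1 - 20.0 * x2)    # (NOT x1) AND (NOT x2)
    return sigmoid(-10.0 + 20.0 * a1 + 20.0 * a2)  # a1 OR a2

# XNOR is 1 exactly when the inputs agree.
table = [round(xnor_gate(x1, x2)) for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]]
```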