[Study Notes] [Coursera] [Machine Learning] Neural Networks
课程地址:https://www.coursera.org/learn/machine-learning/home/week/4
Representation
Scenario
- deal with non-linear classification hypotheses that have very many features (hundreds of thousands), where logistic regression with polynomial terms becomes infeasible
- this is a classification problem
Model Representation
- Neuron model: logistic unit (no hidden layer)
  - input vector: $x = \begin{bmatrix} x_0 \\ x_1 \\ x_2 \\ x_3 \end{bmatrix}$, weights/parameters: $\theta = \begin{bmatrix} \theta_0 \\ \theta_1 \\ \theta_2 \\ \theta_3 \end{bmatrix}$
  - bias unit: $x_0 = 1$
  - $h_\theta(x) = \frac{1}{1 + e^{-z}}$, $z = \theta^T x$: the sigmoid (logistic) activation function
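The single logistic unit above can be sketched directly in numpy; the feature values and weights below are illustrative placeholders, not values from the course.

```python
import numpy as np

def sigmoid(z):
    """Sigmoid (logistic) activation: g(z) = 1 / (1 + e^(-z))."""
    return 1.0 / (1.0 + np.exp(-z))

def logistic_unit(x, theta):
    """One neuron: h(x) = g(theta^T x), where x already includes x0 = 1."""
    return sigmoid(theta @ x)

# Illustrative example: 3 features with the bias unit x0 = 1 prepended
x = np.array([1.0, 2.0, 0.5, -1.0])      # x0 = 1
theta = np.array([-1.0, 0.5, 1.0, 2.0])  # theta_0 multiplies the bias unit
h = logistic_unit(x, theta)              # here z = -1.5, so h = g(-1.5)
```

Because $g$ maps any real $z$ into $(0, 1)$, the output can be read as the probability that $y = 1$.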
- Neural network (layer 1: input layer; layer 2: hidden layer; layer 3: output layer)
  - $a_i^{(l)}$ = "activation" of unit $i$ in layer $l$
  - $L$ = total number of layers in the network
  - $s_l$ = number of units (not counting the bias unit) in layer $l$
  - bias units: $x_0 = 1$, $a_0^{(2)} = 1$ (not drawn in the picture)
  - $a_1^{(2)} = g(\Theta_{10}^{(1)} x_0 + \Theta_{11}^{(1)} x_1 + \Theta_{12}^{(1)} x_2 + \Theta_{13}^{(1)} x_3)$
  - $h_\Theta(x) = a_1^{(3)} = g(\Theta_{10}^{(2)} a_0^{(2)} + \Theta_{11}^{(2)} a_1^{(2)} + \Theta_{12}^{(2)} a_2^{(2)} + \Theta_{13}^{(2)} a_3^{(2)})$
  - $\Theta^{(l)}$ = matrix of weights controlling the function mapping from layer $l$ to layer $l+1$; it has dimensions $s_{l+1} \times (s_l + 1)$,
    e.g. $\Theta^{(1)} = \begin{bmatrix} \Theta_{10}^{(1)} & \Theta_{11}^{(1)} & \Theta_{12}^{(1)} & \Theta_{13}^{(1)} \\ \Theta_{20}^{(1)} & \Theta_{21}^{(1)} & \Theta_{22}^{(1)} & \Theta_{23}^{(1)} \\ \Theta_{30}^{(1)} & \Theta_{31}^{(1)} & \Theta_{32}^{(1)} & \Theta_{33}^{(1)} \end{bmatrix}$, size $3 \times 4$
  - $\{x^{(i)}, y^{(i)}\}$ = the $i$-th training example
- in multi-class classification ($K$ classes, $K \geq 3$):
  - $y \in \mathbb{R}^K$, $h_\Theta(x) \in \mathbb{R}^K$, $s_L = K$
  - $y_k^{(i)}$ = $k$-th value of the $i$-th target vector; $(h_\Theta(x^{(i)}))_k$ = $k$-th value of the $i$-th output vector
  - e.g. $y^{(1)} = \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix}$, $y^{(2)} = \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix}$, $y^{(3)} = \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix}$, so $y_1^{(1)} = 1$
- in binary classification (two classes):
  - $y \in \{0, 1\}$, $h_\Theta(x) \in \mathbb{R}$, $s_L = 1$
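The one-hot target vectors $y^{(i)}$ for the multi-class case can be built with a small helper; `one_hot` is a hypothetical name for illustration, assuming 1-based class labels as in the course.

```python
import numpy as np

def one_hot(labels, K):
    """Encode 1-based integer class labels as K-dimensional target vectors y^(i)."""
    labels = np.asarray(labels)
    Y = np.zeros((labels.size, K))
    Y[np.arange(labels.size), labels - 1] = 1.0  # set the k-th entry of each row
    return Y

# Row i of Y is y^(i); labels 1, 2, 3 give the three example vectors above
Y = one_hot([1, 2, 3], K=3)
```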
Vectorization
- with $a^{(1)} = x$, forward propagation proceeds layer by layer as $z^{(l+1)} = \Theta^{(l)} a^{(l)}$ and $a^{(l+1)} = g(z^{(l+1)})$ (prepending the bias unit $a_0^{(l)} = 1$ before each multiplication), so $h_\Theta(x) = a^{(L)}$
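Vectorized forward propagation for the 3-layer network above can be sketched as follows; the random weight matrices are placeholders standing in for learned parameters.

```python
import numpy as np

def sigmoid(z):
    """Sigmoid activation g(z), applied element-wise."""
    return 1.0 / (1.0 + np.exp(-z))

def forward_propagate(x, Thetas):
    """Compute h_Theta(x) layer by layer: z = Theta a, a = g(z).

    Thetas is a list of weight matrices; Thetas[l] has shape
    (s_{l+1}, s_l + 1), matching the dimensions stated in the notes.
    """
    a = x
    for Theta in Thetas:
        a = np.concatenate(([1.0], a))  # prepend the bias unit a_0 = 1
        a = sigmoid(Theta @ a)          # z^(l+1) = Theta^(l) a^(l); a^(l+1) = g(z)
    return a

# Illustrative network: 3 inputs, 3 hidden units, 1 output unit
rng = np.random.default_rng(0)
Theta1 = rng.standard_normal((3, 4))  # layer 1 -> 2, size s_2 x (s_1 + 1) = 3 x 4
Theta2 = rng.standard_normal((1, 4))  # layer 2 -> 3, size s_3 x (s_2 + 1) = 1 x 4
h = forward_propagate(np.array([2.0, 0.5, -1.0]), [Theta1, Theta2])
```

The loop body is exactly the two vectorized equations, so the same function handles any number of layers.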
Cost Function
- take each pair of corresponding elements of the output vector and the target vector ($(h_\Theta(x^{(i)}))_k$ and $y_k^{(i)}$) and evaluate
  $C = y_k^{(i)} \log (h_\Theta(x^{(i)}))_k + (1 - y_k^{(i)}) \log \left(1 - (h_\Theta(x^{(i)}))_k\right)$
- sum this term over all elements of all examples to obtain the cost:
  $J(\Theta) = -\frac{1}{m} \sum_{i=1}^{m} \sum_{k=1}^{K} C$
- add the regularization term: the sum of squares of all entries of the $\Theta$ matrices, multiplied by the penalty rate $\lambda$ (the $\Theta_{j0}$ entries correspond to the bias terms and are conventionally excluded):
  $+ \frac{\lambda}{2m} \sum_{l=1}^{L-1} \sum_{i=1}^{s_l} \sum_{j=1}^{s_{l+1}} \left(\Theta_{ji}^{(l)}\right)^2$
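The three steps above translate directly into a vectorized cost computation; `nn_cost` and the tiny 2-example inputs below are illustrative, not part of the course materials.

```python
import numpy as np

def nn_cost(H, Y, Thetas, lam):
    """Regularized neural-network cost J(Theta).

    H: (m, K) matrix whose rows are the outputs h_Theta(x^(i));
    Y: (m, K) matrix of one-hot target vectors y^(i);
    Thetas: list of weight matrices, first column = bias weights Theta_{j0}.
    """
    m = Y.shape[0]
    # Cross-entropy term C, summed over all examples i and output units k
    cost = -np.sum(Y * np.log(H) + (1 - Y) * np.log(1 - H)) / m
    # Regularization: squared weights, skipping each bias column Theta_{j0}
    reg = sum(np.sum(Theta[:, 1:] ** 2) for Theta in Thetas)
    return cost + lam / (2 * m) * reg

# Tiny example: 2 training examples, 2 output units, no weights to regularize
Y = np.array([[1.0, 0.0], [0.0, 1.0]])
H = np.array([[0.9, 0.1], [0.2, 0.8]])
J = nn_cost(H, Y, Thetas=[], lam=0.0)
```

Slicing off the first column of each $\Theta^{(l)}$ is what implements the convention that the bias weights are not penalized.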