[Coursera][Stanford] Machine Learning Week 1 2
Many things fade from memory without review, so these notes serve as a record of my learning.
Start date: August 2014
1. Introduction
A definition of Machine Learning: "Field of study that gives computers the ability to learn without being explicitly programmed."
Algorithms the course covers: ---Supervised learning
---Unsupervised learning
Others: reinforcement learning, recommender systems
Also talks about: practical advice for applying learning algorithms.
2. Linear Regression with one Variable
Introduces linear regression with one variable, and the gradient descent algorithm for computing theta.
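The update rule is theta_j := theta_j - alpha * (1/m) * sum((h(x) - y) * x_j), repeated until convergence. A minimal sketch in plain Python (the course uses Octave; the sample data here is made up for illustration):

```python
# Gradient descent for y ≈ theta0 + theta1 * x on hypothetical toy data.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.0, 5.0, 7.0, 9.0]   # generated from y = 1 + 2x
m = len(xs)
theta0, theta1 = 0.0, 0.0
alpha = 0.05                 # learning rate

for _ in range(5000):
    # prediction error h(x) - y for every training example
    errors = [theta0 + theta1 * x - y for x, y in zip(xs, ys)]
    grad0 = sum(errors) / m                              # partial derivative wrt theta0
    grad1 = sum(e * x for e, x in zip(errors, xs)) / m   # partial derivative wrt theta1
    # simultaneous update of both parameters
    theta0 -= alpha * grad0
    theta1 -= alpha * grad1

print(round(theta0, 3), round(theta1, 3))
```

Both parameters must be updated simultaneously (compute all gradients before changing theta), which is what the intermediate `grad0`/`grad1` variables ensure here.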
3. Multiple features (variables)
The case of multiple variables; introduces feature scaling and the learning rate for gradient descent.
4. Normal equation
A method to solve for theta analytically: theta = (X'X)^(-1) X'y.
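For one feature plus an intercept, X'X is a 2x2 matrix, so the normal equation can be worked out by hand. A sketch in plain Python (the course solves this in Octave with `pinv`; the toy data is assumed for illustration):

```python
# Normal equation theta = (X'X)^(-1) X'y, with X = [ones, x], solved via
# an explicit 2x2 inverse on hypothetical toy data.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.0, 5.0, 7.0, 9.0]   # exactly y = 1 + 2x
m = len(xs)

# entries of X'X
a = m
b = sum(xs)
c = sum(x * x for x in xs)
# entries of X'y
p = sum(ys)
q = sum(x * y for x, y in zip(xs, ys))

det = a * c - b * b
theta0 = (c * p - b * q) / det
theta1 = (a * q - b * p) / det
print(theta0, theta1)  # → 1.0 2.0
```

Because the data lie exactly on a line, the closed-form solution recovers the parameters in one step, with no learning rate and no iterations.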
5. Comparison of Gradient Descent and the Normal Equation: gradient descent needs a learning rate and many iterations but scales to large numbers of features; the normal equation needs neither, but computing (X'X)^(-1) becomes slow when the number of features is large.
6. An introduction to basic Octave functionality
7. vectorized implementation
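The point of vectorization is that one matrix expression, X' * (X*theta - y), computes every partial derivative at once, replacing a loop over parameters. A sketch in Python with NumPy (assumed here for illustration; the course uses the equivalent Octave expression, and the toy data is made up):

```python
import numpy as np

# Hypothetical toy data; first column of X is ones for the intercept term.
X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([2.0, 4.0, 6.0])
theta = np.array([0.5, 1.5])
m = len(y)

# Unvectorized: one explicit sum per parameter j
grad_loop = np.array([
    sum((X[i] @ theta - y[i]) * X[i, j] for i in range(m)) / m
    for j in range(2)
])

# Vectorized: a single matrix expression, as in the programming exercise
grad_vec = (X.T @ (X @ theta - y)) / m

print(np.allclose(grad_loop, grad_vec))  # True
```

The two forms compute the same gradient; the vectorized one is shorter and, in Octave or NumPy, much faster because the loop runs inside optimized linear-algebra routines.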
Programming Exercise 1
2 Linear regression with one variable
2.2 Gradient Descent
2.2.3 Computing the cost J(θ)
J = (1 / (2 * m)) * sum((X * theta - y).^2);
2.2.4 Gradient descent
% vectorized update
theta = theta - alpha * (1 / m) * (X' * (X * theta - y));
% equivalent element-wise update (X(:,1) is all ones, so sum(X * theta - y) suffices for theta(1))
%temp1 = theta(1) - alpha * (1 / m) * sum(X * theta - y);
%temp2 = theta(2) - alpha * (1 / m) * sum(X(:,2) .* (X * theta - y));
%theta = [temp1; temp2];
3 Linear regression with multiple variables
3.1 Feature Normalization
mu = mean(X);
sigma = std(X);
[m, n] = size(X);
X_norm = zeros(m, n);   % preallocate
% normalize each feature column to zero mean and unit standard deviation
for i = 1:n
    X_norm(:,i) = (X(:,i) - mu(i)) / sigma(i);
end
3.2 Gradient Descent
J = (1 / (2 * m)) * sum(((X * theta - y).^2));
theta = theta - alpha * (1 / m) * (X' * (X * theta - y));
3.3 Normal Equations
theta = pinv(X' * X) * X' * y;