linear regression(3) - Gradient Descent in Practice I/II (Feature Scaling / Learning Rate)
Gradient Descent in Practice I - Feature Scaling
Goal: speed up gradient descent by bringing each input feature into roughly the same range.

x_i := (x_i − μ_i) / s_i

where μ_i is the average of all the values for feature i, and s_i is the range of values (max − min) or, alternatively, the standard deviation.
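The scaling rule above can be sketched in a few lines of NumPy. The sample feature matrix (house sizes and bedroom counts) is made up for illustration; the point is that the two columns start on very different scales and end up in roughly the same range:

```python
import numpy as np

def scale_features(X):
    """Mean-normalize each feature column: x := (x - mu) / s,
    where mu is the column mean and s is the range (max - min)."""
    mu = X.mean(axis=0)
    s = X.max(axis=0) - X.min(axis=0)  # could also use X.std(axis=0)
    return (X - mu) / s, mu, s

# hypothetical data: sizes in sq ft and bedroom counts, very different scales
X = np.array([[2104.0, 3.0],
              [1600.0, 3.0],
              [2400.0, 4.0],
              [1416.0, 2.0]])
X_scaled, mu, s = scale_features(X)
print(X_scaled)  # every column now lies within [-1, 1], centered at 0
```

Remember to keep `mu` and `s` around: any new example must be scaled with the same parameters before making a prediction.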
Gradient Descent in Practice II - Learning Rate
Goal: choose a learning rate α such that J(θ) decreases on every iteration.
summary:
If α is too small: slow convergence.
If α is too large: J(θ) may not decrease on every iteration and thus may not converge.
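The two failure modes can be seen by running batch gradient descent on a tiny made-up dataset (y = 1 + 2x) and tracking J(θ) per iteration; the specific α values here are illustrative, not prescriptive:

```python
import numpy as np

def gradient_descent(X, y, alpha, iters):
    """Batch gradient descent for linear regression; returns theta and
    the cost J(theta) after each iteration so convergence can be checked."""
    m, n = X.shape
    theta = np.zeros(n)
    history = []
    for _ in range(iters):
        grad = X.T @ (X @ theta - y) / m       # gradient of J(theta)
        theta -= alpha * grad
        history.append(((X @ theta - y) ** 2).sum() / (2 * m))
    return theta, history

X = np.c_[np.ones(5), np.arange(5.0)]          # bias column + one feature
y = np.array([1.0, 3.0, 5.0, 7.0, 9.0])        # exactly y = 1 + 2x

_, good = gradient_descent(X, y, alpha=0.1, iters=50)
_, bad = gradient_descent(X, y, alpha=1.1, iters=50)
print(good[-1] < good[0])   # True: J decreases with a well-chosen alpha
print(bad[-1] > bad[0])     # True: J blows up when alpha is too large
```

A practical recipe from the lecture: plot J(θ) against the iteration number, and try α values spaced roughly by factors of 3 (0.001, 0.003, 0.01, ...), picking the largest one for which J(θ) still decreases on every iteration.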
Polynomial Regression
Goal: improve the form of our hypothesis function, for example by combining multiple features into one or by creating new features from existing ones.
For example, if our hypothesis function is hθ(x) = θ₀ + θ₁x₁,
then we can create additional features based on x₁ to get the quadratic function hθ(x) = θ₀ + θ₁x₁ + θ₂x₁²,
or the cubic function hθ(x) = θ₀ + θ₁x₁ + θ₂x₁² + θ₃x₁³.
In the cubic version, we have created new features x₂ and x₃, where x₂ = x₁² and x₃ = x₁³.
To make it a square root function, we could do: hθ(x) = θ₀ + θ₁√x₁.
One important thing to keep in mind: if you choose your features this way, then feature scaling becomes very important.
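A quick sketch of why scaling matters here: building the cubic features x₁, x₁², x₁³ from a single (hypothetical) input immediately produces columns whose ranges differ by orders of magnitude, which the same mean normalization fixes:

```python
import numpy as np

# Hypothetical input: if x1 ranges 1..1000, then x1^2 ranges 1..10^6
# and x1^3 ranges 1..10^9 -- wildly different scales.
x1 = np.linspace(1.0, 1000.0, 20)
X_poly = np.column_stack([x1, x1**2, x1**3])

# Mean-normalize each polynomial feature column before gradient descent.
mu = X_poly.mean(axis=0)
s = X_poly.max(axis=0) - X_poly.min(axis=0)
X_scaled = (X_poly - mu) / s
print(X_scaled.max(axis=0))  # each column's max is now below 1
```

Without this step, the θ for x₁³ would need to move on a vastly smaller scale than the θ for x₁, and a single learning rate cannot serve both well.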