Machine Learning Notes: Features and Polynomial Regression


Features and Polynomial Regression

We can improve our features and the form of our hypothesis function in a couple of different ways.

We can combine multiple features into one. For example, we can combine $x_1$ and $x_2$ into a new feature $x_3$ by taking $x_1 \cdot x_2$.
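As a minimal sketch of this idea (the frontage/depth housing data below is assumed for illustration, not from the notes), combining two features into their product takes one line with NumPy:

```python
import numpy as np

# Hypothetical lot dimensions: frontage (x1) and depth (x2) of three lots.
x1 = np.array([50.0, 30.0, 40.0])   # frontage
x2 = np.array([100.0, 80.0, 60.0])  # depth

# Combined feature x3 = x1 * x2 (the lot area), replacing the two originals.
x3 = x1 * x2

print(x3)  # [5000. 2400. 2400.]
```

The model then uses the single feature $x_3$ in place of $x_1$ and $x_2$.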

Polynomial Regression

Our hypothesis function need not be linear (a straight line) if that does not fit the data well.

We can change the behavior or curve of our hypothesis function by making it a quadratic, cubic or square root function (or any other form).

For example, if our hypothesis function is $h_\theta(x) = \theta_0 + \theta_1 x_1$, then we can create additional features based on $x_1$ to get the quadratic function $h_\theta(x) = \theta_0 + \theta_1 x_1 + \theta_2 x_1^2$ or the cubic function $h_\theta(x) = \theta_0 + \theta_1 x_1 + \theta_2 x_1^2 + \theta_3 x_1^3$.

In the cubic version, we have created new features $x_2$ and $x_3$, where $x_2 = x_1^2$ and $x_3 = x_1^3$.

To make it a square root function, we could do: $h_\theta(x) = \theta_0 + \theta_1 x_1 + \theta_2 \sqrt{x_1}$
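The cubic version can be sketched end to end: build the new features $x_2 = x_1^2$ and $x_3 = x_1^3$ as extra columns of the design matrix, then fit $\theta$. The synthetic data and the least-squares solver are assumptions for the sketch (the notes themselves would use gradient descent).

```python
import numpy as np

# Synthetic data generated from a known cubic, so we can check the fit.
x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = 2.0 + 0.5 * x1 - 0.1 * x1**2 + 0.02 * x1**3

# Design matrix with columns [1, x1, x1^2, x1^3]: the new features
# x2 = x1^2 and x3 = x1^3 are just extra columns.
X = np.column_stack([np.ones_like(x1), x1, x1**2, x1**3])

# Solve for theta by least squares (gradient descent would converge
# to the same parameters on this convex problem).
theta, *_ = np.linalg.lstsq(X, y, rcond=None)

print(np.round(theta, 3))  # ≈ [2.0, 0.5, -0.1, 0.02]
```

A square-root hypothesis works the same way: replace the `x1**2`/`x1**3` columns with `np.sqrt(x1)`.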

One important thing to keep in mind is: if you choose your features this way, then feature scaling becomes very important.

e.g. if $x_1$ has range 1–1000, then the range of $x_1^2$ becomes 1–1,000,000 and that of $x_1^3$ becomes 1–1,000,000,000.
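That blow-up, and the standard fix, can be seen directly (mean normalization is one common scaling choice; the sample values are assumed for illustration):

```python
import numpy as np

# x1 in [1, 1000]: the polynomial features span wildly different ranges.
x1 = np.linspace(1.0, 1000.0, 5)
X = np.column_stack([x1, x1**2, x1**3])
print(X.max(axis=0))  # maxima on the order of 1e3, 1e6, 1e9

# Mean normalization: (x - mean) / std, per column, so every feature
# ends up on a comparable scale before running gradient descent.
X_scaled = (X - X.mean(axis=0)) / X.std(axis=0)
print(X_scaled.std(axis=0))  # each column now has unit spread
```

Without this step, gradient descent takes tiny steps along the $x_1^3$ direction and oscillates along $x_1$, converging very slowly.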


 
