Career-Change Programmer 2: Machine Learning, Linear Regression II (purely to push myself to keep studying)


This post mainly follows Andrew Ng's open course Machine Learning, the Linear Regression II lesson. The exercise page is here:

http://openclassroom.stanford.edu/MainFolder/DocumentPage.php?course=MachineLearning&doc=exercises/ex3/ex3.html


theta2, obtained from the normal equation after scaling, is numerically the same as the theta obtained by gradient descent, and both differ from theta1, the normal-equation solution on the unscaled features. The difference is only one of units: the scaled parameters are expressed in standardized features, so both parameterizations still make the same predictions (see the verification snippet after the script below).
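For reference, the quantities the script below computes are the least-squares cost, its batch gradient-descent update, and the closed-form normal-equation solution:

\[ J(\theta) = \frac{1}{2m}\,(X\theta - y)^\top (X\theta - y) \]
\[ \theta \leftarrow \theta - \frac{\alpha}{m}\,X^\top (X\theta - y) \]
\[ \theta = (X^\top X)^{-1}\,X^\top y \]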


The code I wrote is as follows:

%%%%%%%%%%%%%%%%%%%%%%%% Linear Regression II

clear all; close all; clc

x = load('ex3x.dat');
y = load('ex3y.dat');
m = length(y); % number of training examples
x = [ones(m, 1), x]; % Add a column of ones to x
%%%%%%%%%%%%%%%%%%%%%%%%% normal equation (unscaled features)
theta1 = (x'*x) \ (x'*y); % closed-form solution on the raw features
% Feature scaling: standardize each feature to zero mean, unit variance
sigma = std(x); % column 1 is the all-ones intercept, so only columns 2-3 are used
mu = mean(x);
x(:,2) = (x(:,2) - mu(2)) ./ sigma(2);
x(:,3) = (x(:,3) - mu(3)) ./ sigma(3);

theta = zeros(3, 1);  % initialize parameters to zero
alpha = 1;            % learning rate (converges quickly on standardized features)
J = zeros(50, 1);     % cost history
for iter = 1:50       % 50 iterations of batch gradient descent
    h_theta = x*theta;                                % hypothesis h(x) = X*theta
    J(iter) = 1/(2*m) * (h_theta-y)' * (h_theta-y);   % squared-error cost
    theta = theta - alpha/m * x' * (h_theta-y);       % gradient step
end
% now plot J
% technically, the first J corresponds to the zeroth iteration
% but Matlab/Octave doesn't have a zero index
figure;
plot(0:49, J(1:50), '-')
xlabel('Number of iterations')
ylabel('Cost J')
%%%%%%%%%%%%%%%%%%%%%%%%% normal equation (scaled features)
theta2 = (x'*x) \ (x'*y); % closed-form solution on the standardized features
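
As a sanity check that the two parameterizations agree, here is a small snippet of my own (not part of the exercise script) that predicts the price of a 1650-square-foot, 3-bedroom house both ways; it assumes theta1, theta, mu, and sigma from the script above are still in the workspace. The two prices should match up to gradient-descent convergence error.

% Verify: unscaled theta1 and scaled theta give the same prediction
x_query = [1, 1650, 3];                    % [intercept, living area, bedrooms]
price_unscaled = x_query * theta1;         % raw features with theta1
x_scaled = [1, (1650 - mu(2))/sigma(2), (3 - mu(3))/sigma(3)];
price_scaled = x_scaled * theta;           % standardized features with GD theta
fprintf('unscaled: %.2f  scaled: %.2f\n', price_unscaled, price_scaled);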

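The exercise also has you compare convergence across learning rates. Below is a minimal sketch of that comparison (my addition; the alpha grid, spaced by factors of roughly 3, is an assumption, and the snippet expects the scaled x, y, and m from the script above):

% Overlay cost curves for several learning rates on the scaled features
alphas = [0.01, 0.03, 0.1, 0.3, 1, 1.3];
figure; hold on;
for k = 1:length(alphas)
    theta_k = zeros(3, 1);
    Jk = zeros(50, 1);
    for iter = 1:50
        h = x * theta_k;
        Jk(iter) = 1/(2*m) * (h - y)' * (h - y);
        theta_k = theta_k - alphas(k)/m * x' * (h - y);
    end
    plot(0:49, Jk, '-');
end
legend(arrayfun(@(a) sprintf('\\alpha = %g', a), alphas, 'UniformOutput', false));
xlabel('Number of iterations'); ylabel('Cost J');
hold off;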
