Career Change to Programming, Part 1: Machine Learning, Linear Regression (purely to push myself to keep studying)


I have mainly been studying the Linear Regression part of Andrew Ng's open course Machine Learning; the exercise page is here:

http://openclassroom.stanford.edu/MainFolder/DocumentPage.php?course=MachineLearning&doc=exercises/ex2/ex2.html
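
As a quick note to myself (my own summary, not taken from the exercise page): with the hypothesis $h_\theta(x) = \theta^\top x$ and $m$ training examples, batch gradient descent repeats the update

$$\theta := \theta - \frac{\alpha}{m} X^\top (X\theta - y)$$

which is exactly the one-line update inside the loop below.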


The code I wrote is as follows:


%%%%%%%%%%%%%%%%%%%%%%%%Linear regression

clear all; close all; clc

x = load('ex2x.dat');
y = load('ex2y.dat');

figure % open a new figure window
plot(x, y, 'o');
ylabel('Height in meters')
xlabel('Age in years')
m = length(y); % store the number of training examples
x = [ones(m, 1), x]; % Add a column of ones to x  

theta = zeros(2, 1); % initialize fitting parameters
alpha = 0.07;        % learning rate

for iter = 1:1500    % number of gradient descent iterations
    h_theta = x * theta;                            % hypothesis for all examples
    theta = theta - alpha/m * x' * (h_theta - y);   % batch gradient descent update
end
hold on % Plot new data without clearing old plot
plot(x(:,2), x*theta, '-') % remember that x is now a matrix with 2 columns
                           % and the second column contains the age info
legend('Training data', 'Linear regression')
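
If I remember the ex2 write-up correctly, it also asks for height predictions for boys of age 3.5 and age 7. A minimal sketch using the learned theta (don't forget the leading 1 for the intercept term):

% Predict height at the two ages named in the write-up.
predict1 = [1, 3.5] * theta;   % predicted height in meters at age 3.5
predict2 = [1, 7]   * theta;   % predicted height in meters at age 7
fprintf('Age 3.5 -> %.4f m, Age 7 -> %.4f m\n', predict1, predict2);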

%%%%%%%%%%%%%%%%%%%%%%%%  Understanding J(theta)
J_vals = zeros(100, 100);   % initialize J_vals to a 100x100 matrix of 0's
theta0_vals = linspace(-3, 3, 100);
theta1_vals = linspace(-1, 1, 100);
for i = 1:length(theta0_vals)
    for j = 1:length(theta1_vals)
        t = [theta0_vals(i); theta1_vals(j)];
        J_vals(i,j) = 1/(2*m) * sum((x*t - y).^2);  % squared-error cost J(theta) at this grid point
    end
end
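
As an aside, the double loop can also be vectorized. A sketch of my own (computes the same values; not part of the exercise, and it needs implicit expansion, i.e. MATLAB R2016b+ or Octave; older MATLAB would use bsxfun):

[T0, T1] = ndgrid(theta0_vals, theta1_vals); % T0(i,j) = theta0_vals(i), T1(i,j) = theta1_vals(j)
Theta = [T0(:)'; T1(:)'];                    % 2 x 10000 matrix, one candidate theta per column
J_all = sum((x * Theta - y).^2, 1) / (2*m);  % cost for every candidate at once
J_vals_vec = reshape(J_all, size(T0));       % same values as the looped J_vals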

% Plot the surface plot
% Because of the way meshgrids work in the surf command, we need to
% transpose J_vals before calling surf, or else the axes will be flipped
J_vals = J_vals';
figure;
surf(theta0_vals, theta1_vals, J_vals)
xlabel('\theta_0'); ylabel('\theta_1')
% Plot the cost function with 15 contours spaced logarithmically
% between 0.01 and 100
figure;
contour(theta0_vals, theta1_vals, J_vals, logspace(-2, 2, 15))
xlabel('\theta_0'); ylabel('\theta_1')
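
To check the result visually, the theta found by gradient descent can be overlaid on the contour plot (my own addition, not part of the exercise code); it should land at the bottom of the bowl:

hold on
plot(theta(1), theta(2), 'rx', 'MarkerSize', 10, 'LineWidth', 2) % gradient descent result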
