Stanford Machine Learning Course: Linear Regression Programming Assignment (2)

Source: Internet    Editor: 程序博客网    Date: 2024/05/17 09:09

Continuing from the previous post, the code for the computeCost function is:

function J = computeCost(X, y, theta)
%COMPUTECOST Compute cost for linear regression
%   J = COMPUTECOST(X, y, theta) computes the cost of using theta as the
%   parameter for linear regression to fit the data points in X and y

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly
J = 0;

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta
%               You should set J to the cost.

J = sum((X * theta - y).^2) / (2*m);  % X is m-by-2, theta is 2-by-1

% =========================================================================

end
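As a cross-check, the same vectorized cost J(theta) = sum((X*theta - y).^2) / (2m) can be sketched in NumPy (a minimal illustrative equivalent, not part of the assignment code; the toy data below is made up):

```python
import numpy as np

def compute_cost(X, y, theta):
    """Vectorized linear regression cost: J = sum((X@theta - y)^2) / (2m)."""
    m = len(y)
    residuals = X @ theta - y           # predictions minus targets
    return float(residuals @ residuals) / (2 * m)

# Toy data generated by y = 1 + 2x, with a column of ones for the intercept
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])
y = np.array([1.0, 3.0, 5.0])

print(compute_cost(X, y, np.array([1.0, 2.0])))  # perfect fit -> 0.0
print(compute_cost(X, y, np.zeros(2)))           # all-zero theta -> 35/6
```

Writing the cost as a dot product of the residual vector with itself avoids the explicit `sum` and squaring, but is otherwise identical to the MATLAB expression.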

The code for the gradientDescent function is:

function [theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters)
%GRADIENTDESCENT Performs gradient descent to learn theta
%   theta = GRADIENTDESCENT(X, y, theta, alpha, num_iters) updates theta by
%   taking num_iters gradient steps with learning rate alpha

% Initialize some useful values
m = length(y); % number of training examples
J_history = zeros(num_iters, 1);
theta_s = theta;

for iter = 1:num_iters

    % ====================== YOUR CODE HERE ======================
    % Instructions: Perform a single gradient step on the parameter vector
    %               theta.
    %
    % Hint: While debugging, it can be useful to print out the values
    %       of the cost function (computeCost) and gradient here.
    %

    % Update theta(1) and theta(2) simultaneously: theta_s holds the
    % values from before this step, so both updates use the old theta
    theta(1) = theta(1) - alpha / m * sum(X * theta_s - y);
    theta(2) = theta(2) - alpha / m * sum((X * theta_s - y) .* X(:,2));
    theta_s = theta;

    % ============================================================

    % Save the cost J in every iteration
    J_history(iter) = computeCost(X, y, theta);

end

end
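The same loop can be sketched in NumPy. Writing the update as `theta -= alpha/m * X.T @ error` handles all parameters at once, so the simultaneous-update bookkeeping with `theta_s` disappears (again an illustrative sketch on made-up data, not the assignment's code):

```python
import numpy as np

def gradient_descent(X, y, theta, alpha, num_iters):
    """Batch gradient descent; returns learned theta and per-iteration costs."""
    m = len(y)
    J_history = np.zeros(num_iters)
    for it in range(num_iters):
        error = X @ theta - y
        # One simultaneous update of every parameter:
        # theta_j -= alpha/m * sum(error * x_j), vectorized as X.T @ error
        theta = theta - (alpha / m) * (X.T @ error)
        r = X @ theta - y
        J_history[it] = float(r @ r) / (2 * m)  # cost after the step
    return theta, J_history

# Toy data from y = 1 + 2x; theta should converge to roughly [1, 2]
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])
y = np.array([1.0, 3.0, 5.0])
theta, J_history = gradient_descent(X, y, np.zeros(2), alpha=0.1, num_iters=2000)
print(np.round(theta, 3))
```

The cost history should be non-increasing; if it grows, the learning rate `alpha` is too large.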
The fitted linear model is shown in the figure below:


4. Visualizing the relationship between the cost function and the parameters

%% ============= Part 4: Visualizing J(theta_0, theta_1) =============
fprintf('Visualizing J(theta_0, theta_1) ...\n')

% Grid over which we will calculate J
theta0_vals = linspace(-10, 10, 100);
theta1_vals = linspace(-1, 4, 100);

% initialize J_vals to a matrix of 0's
J_vals = zeros(length(theta0_vals), length(theta1_vals));

% Fill out J_vals
for i = 1:length(theta0_vals)
    for j = 1:length(theta1_vals)
        t = [theta0_vals(i); theta1_vals(j)];
        J_vals(i,j) = computeCost(X, y, t);
    end
end

% Because of the way meshgrids work in the surf command, we need to
% transpose J_vals before calling surf, or else the axes will be flipped
J_vals = J_vals';

% Surface plot
figure;
surf(theta0_vals, theta1_vals, J_vals)
xlabel('\theta_0'); ylabel('\theta_1');

% Contour plot
figure;
% Plot J_vals as 20 contours spaced logarithmically between 0.01 and 1000
contour(theta0_vals, theta1_vals, J_vals, logspace(-2, 3, 20))
xlabel('\theta_0'); ylabel('\theta_1');
hold on;
plot(theta(1), theta(2), 'rx', 'MarkerSize', 10, 'LineWidth', 2);

The resulting plots are shown below:


The multivariate linear regression model and its code will be covered in the next post.

