Andrew Ng Machine Learning Notes, ex1: Linear Regression


Computing the cost function
computeCost.m

function J = computeCost(X, y, theta)
%COMPUTECOST Compute cost for linear regression
%   J = COMPUTECOST(X, y, theta) computes the cost of using theta as the
%   parameter for linear regression to fit the data points in X and y

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly
J = 0;

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta
%               You should set J to the cost.

sqr = (X * theta - y) .^ 2;  % squared error of each prediction
J = sum(sqr) / (2 * m);      % sum of squared errors, scaled by 1/(2m)

% =========================================================================

end
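In vectorized form this computes J(theta) = (1/(2m)) * sum((X*theta - y).^2), where each row of X is one training example with a leading column of ones for the intercept term. As a quick sanity check, here is a minimal call on a made-up toy dataset (the numbers are illustrative, not the exercise's data):

X = [1 1; 1 2; 1 3];        % m = 3 examples; first column is the intercept term
y = [1; 2; 3];
computeCost(X, y, [0; 1])   % theta = [0; 1] fits this data exactly, so J = 0
computeCost(X, y, [0; 0])   % errors are 1, 2, 3; J = (1+4+9)/(2*3) = 2.3333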

Gradient descent
gradientDescent.m

function [theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters)
%GRADIENTDESCENT Performs gradient descent to learn theta
%   theta = GRADIENTDESCENT(X, y, theta, alpha, num_iters) updates theta by
%   taking num_iters gradient steps with learning rate alpha

% Initialize some useful values
m = length(y); % number of training examples
J_history = zeros(num_iters, 1);

for iter = 1:num_iters

    % ====================== YOUR CODE HERE ======================
    % Instructions: Perform a single gradient step on the parameter vector
    %               theta.
    %
    % Hint: While debugging, it can be useful to print out the values
    %       of the cost function (computeCost) and gradient here.
    %

    sums = X' * (X * theta - y);       % gradient summed over all examples
    theta = theta - alpha / m * sums;  % simultaneous update of every theta(j)

    % ============================================================

    % Save the cost J in every iteration
    J_history(iter) = computeCost(X, y, theta);

end

end
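Each pass through the loop applies the batch update theta := theta - (alpha/m) * X' * (X*theta - y), which adjusts all components of theta simultaneously; saving J in every iteration lets you check that the cost actually decreases. A minimal sketch of running the two functions together on the toy data above (alpha and num_iters are assumed values for illustration, not the exercise's settings):

X = [1 1; 1 2; 1 3];
y = [1; 2; 3];
theta = zeros(2, 1);            % start from the origin
alpha = 0.1;                    % learning rate (assumed for illustration)
num_iters = 1500;               % number of steps (assumed for illustration)
[theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters);
plot(1:num_iters, J_history);   % the cost curve should decrease monotonically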