Stanford Machine Learning Week 2 Assignment: Linear Regression


Plotting the Data

data = load('ex1data1.txt');        % read comma-separated data
X = data(:, 1); y = data(:, 2);
m = length(y);                      % number of training examples
plot(X, y, 'rx', 'MarkerSize', 10); % plot the data; 'rx' draws red crosses, 'MarkerSize' = 10 sets their size
ylabel('Profit in $10,000s');       % set the y-axis label
xlabel('Population of City in 10,000s'); % set the x-axis label
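Once theta has been learned (see the gradient descent sections below), the fit can be overlaid on the same scatter plot. A minimal sketch, assuming X has by then been augmented with a leading column of ones and theta holds the learned parameters:

hold on;                          % keep the scatter plot of training data
plot(X(:,2), X * theta, 'b-');    % predicted profit h_theta(x) at each training point
legend('Training data', 'Linear regression');
hold off;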

Computing the cost J(θ)

m = length(y);                          % number of training examples (length(X) is fragile for a matrix)
T = 1 / (2 * m) * (X * theta - y) .^ 2; % per-example squared error, scaled by 1/(2m)
J = sum(T);
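For reference, this computes the least-squares cost over the m training examples:

J(\theta) = \frac{1}{2m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right)^2,
\qquad h_\theta(x) = \theta^T x

The (X * theta - y) .^ 2 term is the vector of squared per-example errors, and sum collapses it to the scalar J.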

Gradient descent

t1 = theta(1) - alpha / m * sum(X * theta - y);             % x_0 = 1, so no elementwise product needed
t2 = theta(2) - alpha / m * sum((X * theta - y) .* X(:,2));
theta = [t1; t2];                                           % assign both at once: simultaneous update
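Each line implements one component of the batch gradient descent update

\theta_j := \theta_j - \alpha \, \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right) x_j^{(i)}

The t1 term needs no elementwise product because x_0^{(i)} = 1 for every example, and computing t1 and t2 before writing theta keeps the two updates simultaneous.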

Feature Normalization

n = size(X_norm, 2);                 % number of features
for i = 1:n
    mu(i) = mean(X_norm(:,i));       % mean of feature i
    sigma(i) = std(X_norm(:,i));     % standard deviation of feature i
    X_norm(:,i) = (X_norm(:,i) - mu(i)) / sigma(i);
end
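The loop can also be written without explicit indexing, since mean and std already operate column-wise. A sketch assuming Octave or MATLAB R2016b+ (implicit broadcasting; older MATLAB would need bsxfun):

mu = mean(X_norm);                 % 1 x n row vector of column means
sigma = std(X_norm);               % 1 x n row vector of column standard deviations
X_norm = (X_norm - mu) ./ sigma;   % broadcast the shift and scale across all rows

Either way, mu and sigma must be kept and reused to normalize any new input before making a prediction with the learned theta.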

Gradient Descent (multiple variables)

m = length(y);                   % number of training examples
J_history = zeros(num_iters, 1);
n = size(X, 2);
tmp = zeros(n, 1);
for iter = 1:num_iters
    for i = 1:n
        tmp(i) = theta(i) - alpha / m * (X * theta - y)' * X(:,i);
    end
    theta = tmp;                 % simultaneous update of all parameters
    J_history(iter) = computeCostMulti(X, y, theta);
end
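The inner loop over features is unnecessary: the whole parameter update collapses into one matrix expression. A vectorized sketch of the same iteration:

for iter = 1:num_iters
    theta = theta - alpha / m * X' * (X * theta - y);  % update every theta(i) at once
    J_history(iter) = computeCostMulti(X, y, theta);   % record cost to check convergence
end

Here X' * (X * theta - y) is exactly the vector whose i-th entry is (X * theta - y)' * X(:,i), so the result is identical.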

Normal Equations

theta = pinv(X'*X)*X'*y
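This is the closed-form least-squares solution

\theta = (X^T X)^{-1} X^T y

pinv is used instead of inv so the expression still behaves reasonably when X'*X is singular (for example, with redundant features). The normal equation needs no learning rate, no iterations, and no feature scaling, but the matrix inversion costs roughly O(n^3), so gradient descent is preferred when the number of features is very large.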