Stanford Machine Learning Open Course: Programming Exercise 1

Source: Internet · Editor: 程序博客网 · Date: 2024/06/06 07:38

The first programming exercise is fairly easy if you follow the handout: most of the code is already provided, and you only need to fill in a few lines of algorithm, which really amounts to translating a few math formulas into code. The code is attached below.
PS: Even in the one-variable part I vectorized the per-parameter computations, so the cost function and gradient descent below carry over unchanged to the multi-feature case.
Cost function:

function J = computeCost(X, y, theta)
%COMPUTECOST Compute cost for linear regression
%   J = COMPUTECOST(X, y, theta) computes the cost of using theta as the
%   parameter for linear regression to fit the data points in X and y

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly
J = 0;

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta
%               You should set J to the cost.

s = 0;
for iter = 1:m
    h = theta' * X(iter,:)';   % hypothesis for example iter
    s = s + (h - y(iter))^2;   % accumulate squared error
end
J = s / (2*m);

% =========================================================================

end
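In symbols, the loop above accumulates the usual squared-error cost from the lecture notes:

```latex
J(\theta) = \frac{1}{2m}\sum_{i=1}^{m}\bigl(h_\theta(x^{(i)}) - y^{(i)}\bigr)^2,
\qquad h_\theta(x) = \theta^{T}x
```

Because `h = theta' * X(iter,:)'` is an inner product over all features, this works for any number of features, not just one.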

Gradient descent:

function [theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters)
%GRADIENTDESCENT Performs gradient descent to learn theta
%   theta = GRADIENTDESCENT(X, y, theta, alpha, num_iters) updates theta by
%   taking num_iters gradient steps with learning rate alpha

% Initialize some useful values
m = length(y); % number of training examples
J_history = zeros(num_iters, 1);

for iter = 1:num_iters

    % ====================== YOUR CODE HERE ======================
    % Instructions: Perform a single gradient step on the parameter vector
    %               theta.
    %
    % Hint: While debugging, it can be useful to print out the values
    %       of the cost function (computeCost) and gradient here.
    %

    s = zeros(size(theta));
    for j = 1:m
        h = theta' * X(j,:)';           % hypothesis for example j
        s = s + (h - y(j)) * X(j,:)';   % accumulate gradient contribution
    end
    theta = theta - alpha * s / m;      % simultaneous update of all parameters

    % ============================================================

    % Save the cost J in every iteration
    J_history(iter) = computeCost(X, y, theta);

end

end
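The update rule implemented by the inner loop is, for every parameter simultaneously:

```latex
\theta_j := \theta_j - \alpha\,\frac{1}{m}\sum_{i=1}^{m}\bigl(h_\theta(x^{(i)}) - y^{(i)}\bigr)\,x_j^{(i)}
\quad\Longleftrightarrow\quad
\theta := \theta - \frac{\alpha}{m}\,X^{T}(X\theta - y)
```

The vectorized form on the right is what makes the same function usable for the multi-variable exercise.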

Normal Equations:

function [theta] = normalEqn(X, y)
%NORMALEQN Computes the closed-form solution to linear regression
%   NORMALEQN(X,y) computes the closed-form solution to linear
%   regression using the normal equations.

theta = zeros(size(X, 2), 1);

% ====================== YOUR CODE HERE ======================
% Instructions: Complete the code to compute the closed form solution
%               to linear regression and put the result in theta.
%

% ---------------------- Sample Solution ----------------------

theta = pinv(X' * X) * X' * y;  % pinv is safer than inv if X'*X is (near-)singular

% -------------------------------------------------------------

% ============================================================

end
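This is the closed-form normal equation from the notes, which needs no feature scaling and no iteration:

```latex
\theta = (X^{T}X)^{-1}X^{T}y
```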

ex1_multi.m:

%% Machine Learning Online Class
%  Exercise 1: Linear regression with multiple variables
%
%  Instructions
%  ------------
%
%  This file contains code that helps you get started on the
%  linear regression exercise.
%
%  You will need to complete the following functions in this
%  exercise:
%
%     warmUpExercise.m
%     plotData.m
%     gradientDescent.m
%     computeCost.m
%     gradientDescentMulti.m
%     computeCostMulti.m
%     featureNormalize.m
%     normalEqn.m
%
%  For this part of the exercise, you will need to change some
%  parts of the code below for various experiments (e.g., changing
%  learning rates).
%

%% Initialization

%% ================ Part 1: Feature Normalization ================

%% Clear and Close Figures
clear ; close all; clc

fprintf('Loading data ...\n');

%% Load Data
data = load('ex1data2.txt');
X = data(:, 1:2);
y = data(:, 3);
m = length(y);

% Print out some data points
fprintf('First 10 examples from the dataset: \n');
fprintf(' x = [%.0f %.0f], y = %.0f \n', [X(1:10,:) y(1:10,:)]');

fprintf('Program paused. Press enter to continue.\n');
pause;

% Scale features and set them to zero mean
fprintf('Normalizing Features ...\n');

[X mu sigma] = featureNormalize(X);

% Add intercept term to X
X = [ones(m, 1) X];

%% ================ Part 2: Gradient Descent ================

% ====================== YOUR CODE HERE ======================
% Instructions: We have provided you with the following starter
%               code that runs gradient descent with a particular
%               learning rate (alpha).
%
%               Your task is to first make sure that your functions -
%               computeCost and gradientDescent already work with
%               this starter code and support multiple variables.
%
%               After that, try running gradient descent with
%               different values of alpha and see which one gives
%               you the best result.
%
%               Finally, you should complete the code at the end
%               to predict the price of a 1650 sq-ft, 3 br house.
%
% Hint: By using the 'hold on' command, you can plot multiple
%       graphs on the same figure.
%
% Hint: At prediction, make sure you do the same feature normalization.
%

fprintf('Running gradient descent ...\n');

% Choose some alpha value
alpha = 0.01;
num_iters = 400;

% Init Theta and Run Gradient Descent
theta = zeros(3, 1);
[theta, J_history] = gradientDescentMulti(X, y, theta, alpha, num_iters);

% Plot the convergence graph
figure;
plot(1:numel(J_history), J_history, '-b', 'LineWidth', 2);
xlabel('Number of iterations');
ylabel('Cost J');

% Display gradient descent's result
fprintf('Theta computed from gradient descent: \n');
fprintf(' %f \n', theta);
fprintf('\n');

% Estimate the price of a 1650 sq-ft, 3 br house
% ====================== YOUR CODE HERE ======================
% Recall that the first column of X is all-ones. Thus, it does
% not need to be normalized, but BOTH features must be normalized
% with the training-set mu and sigma.
price = [1, ([1650 3] - mu) ./ sigma] * theta;

% ============================================================

fprintf(['Predicted price of a 1650 sq-ft, 3 br house ' ...
         '(using gradient descent):\n $%f\n'], price);

fprintf('Program paused. Press enter to continue.\n');
pause;

%% ================ Part 3: Normal Equations ================

fprintf('Solving with normal equations...\n');

% ====================== YOUR CODE HERE ======================
% Instructions: The following code computes the closed form
%               solution for linear regression using the normal
%               equations. You should complete the code in
%               normalEqn.m
%
%               After doing so, you should complete this code
%               to predict the price of a 1650 sq-ft, 3 br house.
%

%% Load Data
data = csvread('ex1data2.txt');
X = data(:, 1:2);
y = data(:, 3);
m = length(y);

% Add intercept term to X
X = [ones(m, 1) X];

% Calculate the parameters from the normal equation
theta = normalEqn(X, y);

% Display normal equation's result
fprintf('Theta computed from the normal equations: \n');
fprintf(' %f \n', theta);
fprintf('\n');

% Estimate the price of a 1650 sq-ft, 3 br house
% ====================== YOUR CODE HERE ======================
price = theta' * [1; 1650; 3];  % no normalization needed for the normal equation

% ============================================================

fprintf(['Predicted price of a 1650 sq-ft, 3 br house ' ...
         '(using normal equations):\n $%f\n'], price);
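The exercise itself is in Octave/MATLAB, but the two methods are easy to cross-check. Below is a NumPy sketch (using a few made-up rows in the same format as ex1data2.txt, not the real dataset) that runs vectorized gradient descent on normalized features and verifies it converges to the normal-equation solution:

```python
import numpy as np

# Hypothetical stand-in for ex1data2.txt rows: [size (sq ft), bedrooms, price]
data = np.array([
    [2104.0, 3.0, 399900.0],
    [1600.0, 3.0, 329900.0],
    [2400.0, 3.0, 369000.0],
    [1416.0, 2.0, 232000.0],
    [3000.0, 4.0, 539900.0],
    [1985.0, 4.0, 299900.0],
])
X_raw, y = data[:, :2], data[:, 2]
m = len(y)

# Feature normalization (zero mean, sample std), as in featureNormalize.m
mu = X_raw.mean(axis=0)
sigma = X_raw.std(axis=0, ddof=1)
X = np.column_stack([np.ones(m), (X_raw - mu) / sigma])  # add intercept column

# Batch gradient descent: theta := theta - (alpha/m) * X'(X theta - y)
alpha, num_iters = 0.1, 2000
theta = np.zeros(3)
for _ in range(num_iters):
    theta -= (alpha / m) * (X.T @ (X @ theta - y))

# Normal equation on the same design matrix
theta_ne = np.linalg.pinv(X.T @ X) @ X.T @ y

# Prediction for a 1650 sq-ft, 3-bedroom house (normalize with training mu/sigma)
price = np.concatenate(([1.0], (np.array([1650.0, 3.0]) - mu) / sigma)) @ theta

print("gradient descent:", theta)
print("normal equation: ", theta_ne)
print("predicted price: ", price)
```

With a well-chosen learning rate and enough iterations, the two parameter vectors agree to within numerical noise, which is a handy sanity check on both implementations.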