Stanford Machine Learning Course — Linear Regression Programming Exercise 3 (Multiple Variables)

Source: Internet · Editor: 程序博客网 · Date: 2024/06/06

The multivariate linear regression exercise uses house-price prediction as its example. The basic procedure is similar to the single-variable case.

1. Feature Normalization

%% ================ Part 1: Feature Normalization ================

%% Clear and Close Figures
clear ; close all; clc

fprintf('Loading data ...\n');

%% Load Data
data = load('ex1data2.txt');
X = data(:, 1:2);
y = data(:, 3);
m = length(y);

% Print out some data points
fprintf('First 10 examples from the dataset: \n');
fprintf(' x = [%.0f %.0f], y = %.0f \n', [X(1:10,:) y(1:10,:)]');

fprintf('Program paused. Press enter to continue.\n');
pause;

% Scale features and set them to zero mean
fprintf('Normalizing Features ...\n');

[X mu sigma] = featureNormalize(X);

% Add intercept term to X
X = [ones(m, 1) X];

Code for the featureNormalize.m function:

function [X_norm, mu, sigma] = featureNormalize(X)
%FEATURENORMALIZE Normalizes the features in X
%   FEATURENORMALIZE(X) returns a normalized version of X where
%   the mean value of each feature is 0 and the standard deviation
%   is 1. This is often a good preprocessing step to do when
%   working with learning algorithms.

% You need to set these values correctly
X_norm = X;
mu = zeros(1, size(X, 2));
sigma = zeros(1, size(X, 2));

% ====================== YOUR CODE HERE ======================
% Instructions: First, for each feature dimension, compute the mean
%               of the feature and subtract it from the dataset,
%               storing the mean value in mu. Next, compute the
%               standard deviation of each feature and divide
%               each feature by its standard deviation, storing
%               the standard deviation in sigma.
%
%               Note that X is a matrix where each column is a
%               feature and each row is an example. You need
%               to perform the normalization separately for
%               each feature.
%
% Hint: You might find the 'mean' and 'std' functions useful.
%

mu = mean(X);      % mean of each feature
sigma = std(X);    % standard deviation of each feature
X_norm = (X - repmat(mu, size(X, 1), 1)) ./ repmat(sigma, size(X, 1), 1);

% ============================================================

end
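For readers not following along in Octave, the same z-score normalization can be sketched in pure Python. This is only an illustrative translation (the function name and list-of-rows representation are my own, not from the assignment); note Octave's `std` uses the sample standard deviation (divisor m-1), which is matched here.

```python
def feature_normalize(X):
    """Normalize each column (feature) of X to zero mean and unit std.

    X is a list of rows (examples). Returns (X_norm, mu, sigma), where
    mu and sigma are per-feature lists, so the same transform can be
    re-applied to new query points later.
    Uses the sample standard deviation (divisor m - 1), like Octave's std.
    """
    m = len(X)
    n = len(X[0])
    # Per-feature mean
    mu = [sum(row[j] for row in X) / m for j in range(n)]
    # Per-feature sample standard deviation
    sigma = [
        (sum((row[j] - mu[j]) ** 2 for row in X) / (m - 1)) ** 0.5
        for j in range(n)
    ]
    # Subtract the mean and divide by the std, feature by feature
    X_norm = [
        [(row[j] - mu[j]) / sigma[j] for j in range(n)]
        for row in X
    ]
    return X_norm, mu, sigma
```

Returning `mu` and `sigma` matters: the prediction step later must normalize the query point with these same statistics, not recompute them.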

2. Gradient Descent

%% ================ Part 2: Gradient Descent ================

% ====================== YOUR CODE HERE ======================
% Instructions: We have provided you with the following starter
%               code that runs gradient descent with a particular
%               learning rate (alpha).
%
%               Your task is to first make sure that your functions -
%               computeCost and gradientDescent already work with
%               this starter code and support multiple variables.
%
%               After that, try running gradient descent with
%               different values of alpha and see which one gives
%               you the best result.
%
%               Finally, you should complete the code at the end
%               to predict the price of a 1650 sq-ft, 3 br house.
%
% Hint: By using the 'hold on' command, you can plot multiple
%       graphs on the same figure.
%
% Hint: At prediction, make sure you do the same feature normalization.
%

fprintf('Running gradient descent ...\n');

% Choose some alpha value
alpha = 0.01;
num_iters = 400;

% Init Theta and Run Gradient Descent
theta = zeros(3, 1);
[theta, J_history] = gradientDescentMulti(X, y, theta, alpha, num_iters);

% Plot the convergence graph
figure;
plot(1:numel(J_history), J_history, '-b', 'LineWidth', 2);
xlabel('Number of iterations');
ylabel('Cost J');

% Display gradient descent's result
fprintf('Theta computed from gradient descent: \n');
fprintf(' %f \n', theta);
fprintf('\n');

% Estimate the price of a 1650 sq-ft, 3 br house
% ====================== YOUR CODE HERE ======================
% Recall that the first column of X is all-ones. Thus, it does
% not need to be normalized. Normalize the query point with the
% mu and sigma returned by featureNormalize, then prepend the
% intercept term.
price = [1, ([1650 3] - mu) ./ sigma] * theta;
% ============================================================

fprintf(['Predicted price of a 1650 sq-ft, 3 br house ' ...
         '(using gradient descent):\n $%f\n'], price);

fprintf('Program paused. Press enter to continue.\n');
pause;
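As a cross-check on what `gradientDescentMulti` has to do, here is a minimal pure-Python sketch of batch gradient descent for linear regression with multiple variables (names and data representation are my own; `X` is assumed to already include the intercept column of ones, as in the script above):

```python
def gradient_descent_multi(X, y, theta, alpha, num_iters):
    """Batch gradient descent for linear regression.

    X: list of rows, each row starting with the intercept term 1.
    Every iteration updates all theta_j simultaneously:
        theta_j -= (alpha / m) * sum_i (h(x_i) - y_i) * x_ij
    Returns (theta, J_history), where J_history records the cost
    J = 1/(2m) * sum_i (h(x_i) - y_i)^2 at each iteration.
    """
    m = len(y)
    n = len(theta)
    J_history = []
    for _ in range(num_iters):
        # Prediction error h(x_i) - y_i for every example, h = theta^T x
        errors = [sum(theta[j] * X[i][j] for j in range(n)) - y[i]
                  for i in range(m)]
        # Gradient of the cost with respect to each theta_j
        grad = [sum(errors[i] * X[i][j] for i in range(m)) / m
                for j in range(n)]
        # Simultaneous update of all parameters
        theta = [theta[j] - alpha * grad[j] for j in range(n)]
        J_history.append(sum(e * e for e in errors) / (2 * m))
    return theta, J_history
```

Plotting `J_history` is exactly the convergence graph the script draws: with a well-chosen alpha the cost should decrease monotonically; if it grows or oscillates, alpha is too large.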
It keeps crashing on me, so the rest will have to go in the next post.




