Stanford Machine Learning Course: Linear Regression Programming Assignment 3 (Multiple Variables)
This multivariate linear regression exercise uses house-price prediction as the example; the overall procedure is similar to the single-variable case.
1. Feature Normalization
%% ================ Part 1: Feature Normalization ================

%% Clear and Close Figures
clear ; close all; clc

fprintf('Loading data ...\n');

%% Load Data
data = load('ex1data2.txt');
X = data(:, 1:2);
y = data(:, 3);
m = length(y);

% Print out some data points
fprintf('First 10 examples from the dataset: \n');
fprintf(' x = [%.0f %.0f], y = %.0f \n', [X(1:10,:) y(1:10,:)]');

fprintf('Program paused. Press enter to continue.\n');
pause;

% Scale features and set them to zero mean
fprintf('Normalizing Features ...\n');

[X, mu, sigma] = featureNormalize(X);

% Add intercept term to X
X = [ones(m, 1) X];
Code for featureNormalize.m:
function [X_norm, mu, sigma] = featureNormalize(X)
%FEATURENORMALIZE Normalizes the features in X
%   FEATURENORMALIZE(X) returns a normalized version of X where
%   the mean value of each feature is 0 and the standard deviation
%   is 1. This is often a good preprocessing step to do when
%   working with learning algorithms.

% You need to set these values correctly
X_norm = X;
mu = zeros(1, size(X, 2));
sigma = zeros(1, size(X, 2));

% ====================== YOUR CODE HERE ======================
% Instructions: First, for each feature dimension, compute the mean
%               of the feature and subtract it from the dataset,
%               storing the mean value in mu. Next, compute the
%               standard deviation of each feature and divide
%               each feature by its standard deviation, storing
%               the standard deviation in sigma.
%
%               Note that X is a matrix where each column is a
%               feature and each row is an example. You need
%               to perform the normalization separately for
%               each feature.
%
% Hint: You might find the 'mean' and 'std' functions useful.
%

mu = mean(X);     % mean of each feature (column-wise)
sigma = std(X);   % standard deviation of each feature
X_norm = (X - repmat(mu, size(X, 1), 1)) ./ repmat(sigma, size(X, 1), 1);

% ============================================================
end
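As a cross-check of the normalization logic above, here is a minimal pure-Python sketch (hypothetical helper name `feature_normalize`; the assignment itself uses Octave) that subtracts each column's mean and divides by its sample standard deviation, matching the defaults of Octave's `mean` and `std`:

```python
def feature_normalize(X):
    """Column-wise zero-mean, unit-std normalization of X (a list of rows).

    Mirrors Octave's mean()/std(): std() divides by (m - 1), i.e. the
    sample standard deviation. Returns (X_norm, mu, sigma).
    """
    m = len(X)
    n = len(X[0])
    # Column means
    mu = [sum(row[j] for row in X) / m for j in range(n)]
    # Sample standard deviation per column (divide by m - 1, as Octave's std does)
    sigma = [(sum((row[j] - mu[j]) ** 2 for row in X) / (m - 1)) ** 0.5
             for j in range(n)]
    # Subtract the mean and divide by the std, feature by feature
    X_norm = [[(row[j] - mu[j]) / sigma[j] for j in range(n)] for row in X]
    return X_norm, mu, sigma
```

Note that `mu` and `sigma` must be kept around: any new example (such as the 1650 sq-ft, 3-bedroom house predicted later) has to be normalized with the same training-set statistics.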
2. Gradient Descent
%% ================ Part 2: Gradient Descent ================

% ====================== YOUR CODE HERE ======================
% Instructions: We have provided you with the following starter
%               code that runs gradient descent with a particular
%               learning rate (alpha).
%
%               Your task is to first make sure that your functions -
%               computeCost and gradientDescent already work with
%               this starter code and support multiple variables.
%
%               After that, try running gradient descent with
%               different values of alpha and see which one gives
%               you the best result.
%
%               Finally, you should complete the code at the end
%               to predict the price of a 1650 sq-ft, 3 br house.
%
% Hint: By using the 'hold on' command, you can plot multiple
%       graphs on the same figure.
%
% Hint: At prediction, make sure you do the same feature normalization.
%

fprintf('Running gradient descent ...\n');

% Choose some alpha value
alpha = 0.01;
num_iters = 400;

% Init Theta and Run Gradient Descent
theta = zeros(3, 1);
[theta, J_history] = gradientDescentMulti(X, y, theta, alpha, num_iters);

% Plot the convergence graph
figure;
plot(1:numel(J_history), J_history, '-b', 'LineWidth', 2);
xlabel('Number of iterations');
ylabel('Cost J');

% Display gradient descent's result
fprintf('Theta computed from gradient descent: \n');
fprintf(' %f \n', theta);
fprintf('\n');

% Estimate the price of a 1650 sq-ft, 3 br house
% ====================== YOUR CODE HERE ======================
% Recall that the first column of X is all-ones. Thus, it does
% not need to be normalized.
% Normalize the new example with the training mu and sigma,
% then prepend the intercept term:
price = [1, ([1650 3] - mu) ./ sigma] * theta;
% ============================================================

fprintf(['Predicted price of a 1650 sq-ft, 3 br house ' ...
         '(using gradient descent):\n $%f\n'], price);

fprintf('Program paused. Press enter to continue.\n');
pause;

The blog editor keeps crashing, so the rest will have to go in the next post.
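The script above calls gradientDescentMulti, whose implementation the post defers to the next installment. As a rough sketch of what that function computes, here is a pure-Python version (hypothetical names `compute_cost` and `gradient_descent_multi`; not the author's Octave code) of the batch update theta := theta - (alpha/m) * X' * (X*theta - y):

```python
def compute_cost(X, y, theta):
    """Cost J(theta) = (1/2m) * sum((h(x) - y)^2), where h(x) = x . theta.

    X is a list of rows, each row already including the leading 1
    for the intercept term.
    """
    m = len(y)
    total = 0.0
    for row, target in zip(X, y):
        h = sum(xj * tj for xj, tj in zip(row, theta))  # hypothesis for one example
        total += (h - target) ** 2
    return total / (2 * m)

def gradient_descent_multi(X, y, theta, alpha, num_iters):
    """Batch gradient descent; returns the final theta and the cost history."""
    m = len(y)
    n = len(theta)
    J_history = []
    for _ in range(num_iters):
        # Prediction error for every training example
        errors = [sum(xj * tj for xj, tj in zip(row, theta)) - target
                  for row, target in zip(X, y)]
        # Gradient of J with respect to each theta_j: (1/m) * sum(error_i * x_ij)
        grad = [sum(errors[i] * X[i][j] for i in range(m)) / m for j in range(n)]
        # Simultaneous update of all parameters
        theta = [theta[j] - alpha * grad[j] for j in range(n)]
        J_history.append(compute_cost(X, y, theta))
    return theta, J_history
```

Plotting `J_history` against the iteration number, as the Octave script does, is the standard way to confirm that the chosen alpha makes the cost decrease on every iteration.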