Stanford Machine Learning Programming Assignment 1
This post is just a personal record of my own learning.
1. computeCost.m
function J = computeCost(X, y, theta)
%COMPUTECOST Compute cost for linear regression
%   J = COMPUTECOST(X, y, theta) computes the cost of using theta as the
%   parameter for linear regression to fit the data points in X and y

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly
J = 0;

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta
%               You should set J to the cost.

J = sum((X*theta - y).^2) / (2*m);

% =========================================================================

end
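As a quick sanity check, here is a minimal usage sketch with made-up numbers (the values below are illustrative, not from the assignment's data files):

% Toy data: a single feature plus an intercept column of ones
X = [ones(4, 1), (1:4)'];    % 4 examples, 2 columns
y = [2; 4; 6; 8];            % targets lie exactly on y = 2*x
theta = zeros(2, 1);
J = computeCost(X, y, theta)
% With theta = [0; 0], J = (4 + 16 + 36 + 64) / (2*4) = 15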
2. gradientDescent.m
function [theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters)
%GRADIENTDESCENT Performs gradient descent to learn theta
%   theta = GRADIENTDESCENT(X, y, theta, alpha, num_iters) updates theta by
%   taking num_iters gradient steps with learning rate alpha

% Initialize some useful values
m = length(y); % number of training examples
J_history = zeros(num_iters, 1);

for iter = 1:num_iters

    % ====================== YOUR CODE HERE ======================
    % Instructions: Perform a single gradient step on the parameter vector
    %               theta.
    %
    % Hint: While debugging, it can be useful to print out the values
    %       of the cost function (computeCost) and gradient here.
    %

    theta = theta - alpha/m * X' * (X*theta - y);

    % ============================================================

    % Save the cost J in every iteration
    J_history(iter) = computeCost(X, y, theta);

end

end
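A hedged usage sketch, reusing the toy data above (the learning rate 0.1 and 1500 iterations are arbitrary illustrative choices):

% Fit the toy data; y = 2*x is exactly representable, so J should approach 0
X = [ones(4, 1), (1:4)'];
y = [2; 4; 6; 8];
[theta, J_history] = gradientDescent(X, y, zeros(2, 1), 0.1, 1500);
theta            % should approach [0; 2]
J_history(end)   % should be close to 0, and J_history non-increasing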
3. featureNormalize.m
function [X_norm, mu, sigma] = featureNormalize(X)
%FEATURENORMALIZE Normalizes the features in X
%   FEATURENORMALIZE(X) returns a normalized version of X where
%   the mean value of each feature is 0 and the standard deviation
%   is 1. This is often a good preprocessing step to do when
%   working with learning algorithms.

% You need to set these values correctly
X_norm = X;
mu = zeros(1, size(X, 2));
sigma = zeros(1, size(X, 2));

% ====================== YOUR CODE HERE ======================
% Instructions: First, for each feature dimension, compute the mean
%               of the feature and subtract it from the dataset,
%               storing the mean value in mu. Next, compute the
%               standard deviation of each feature and divide
%               each feature by its standard deviation, storing
%               the standard deviation in sigma.
%
%               Note that X is a matrix where each column is a
%               feature and each row is an example. You need
%               to perform the normalization separately for
%               each feature.
%
% Hint: You might find the 'mean' and 'std' functions useful.
%

mu = mean(X);                   % 1 x n row vector of per-feature means
sigma = std(X);                 % 1 x n row vector of per-feature standard deviations
X_norm = (X - mu) ./ sigma;     % implicit broadcasting (Octave / MATLAB R2016b+)

% ============================================================

end
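A quick check that the function behaves as documented (illustrative numbers; note that mu and sigma must be kept and reused on any new inputs at prediction time):

% Each column of X_norm should end up with mean ~0 and std ~1
X = [2104 3; 1600 3; 2400 3; 1416 2];
[X_norm, mu, sigma] = featureNormalize(X);
mean(X_norm)   % approximately [0 0]
std(X_norm)    % approximately [1 1]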
4. computeCostMulti.m
function J = computeCostMulti(X, y, theta)
%COMPUTECOSTMULTI Compute cost for linear regression with multiple variables
%   J = COMPUTECOSTMULTI(X, y, theta) computes the cost of using theta as the
%   parameter for linear regression to fit the data points in X and y

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly
J = 0;

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta
%               You should set J to the cost.

J = sum((X*theta - y).^2) / (2*m);

% =========================================================================

end
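Since sum(v.^2) equals v'*v for any column vector v, an equivalent vectorized form of the same cost (a stylistic alternative, not a required change) is:

J = (X*theta - y)' * (X*theta - y) / (2*m);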
5. gradientDescentMulti.m

function [theta, J_history] = gradientDescentMulti(X, y, theta, alpha, num_iters)
%GRADIENTDESCENTMULTI Performs gradient descent to learn theta
%   theta = GRADIENTDESCENTMULTI(X, y, theta, alpha, num_iters) updates theta by
%   taking num_iters gradient steps with learning rate alpha

% Initialize some useful values
m = length(y); % number of training examples
J_history = zeros(num_iters, 1);

for iter = 1:num_iters

    % ====================== YOUR CODE HERE ======================
    % Instructions: Perform a single gradient step on the parameter vector
    %               theta.
    %
    % Hint: While debugging, it can be useful to print out the values
    %       of the cost function (computeCostMulti) and gradient here.
    %

    theta = theta - alpha/m * (X' * (X*theta - y));

    % ============================================================

    % Save the cost J in every iteration
    J_history(iter) = computeCostMulti(X, y, theta);

end

end
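Finally, a hedged end-to-end sketch showing how the multivariate pieces fit together (the data rows, alpha = 0.1, and 400 iterations are illustrative choices in the spirit of the assignment's housing data, not mandated values):

% Features: house size (sq ft) and number of bedrooms; target: price
data = [2104 3 399900; 1600 3 329900; 2400 3 369000; 1416 2 232000];
X = data(:, 1:2);
y = data(:, 3);
m = length(y);

[X, mu, sigma] = featureNormalize(X);   % normalize first...
X = [ones(m, 1), X];                    % ...then add the intercept column

[theta, J_history] = gradientDescentMulti(X, y, zeros(3, 1), 0.1, 400);

% Predict for a hypothetical 1650 sq-ft, 3-bedroom house: apply the stored
% mu and sigma before multiplying by theta
price = [1, ([1650 3] - mu) ./ sigma] * theta;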