Machine Learning -- ex1 Assignment Analysis
Source: Internet · Editor: 程序博客网 · Date: 2024/06/06 00:54
First, let's look at the assignment requirements:
The first four functions are required.
warmUpExercise.m -- the warm-up function shown in the lecture videos; it needs no further explanation.
plotData.m -- the requirements are as follows:
The task is to load the x and y values from ex1data1.txt and plot them.
Copy the code from the PDF, with minor modifications:
function plotData(x, y)
%PLOTDATA Plots the data points x and y into a new figure
%   PLOTDATA(x,y) plots the data points and gives the figure axes labels of
%   population and profit.

figure; % open a new figure window

% ====================== YOUR CODE HERE ======================
% Instructions: Plot the training data into a figure using the
%               "figure" and "plot" commands. Set the axes labels using
%               the "xlabel" and "ylabel" commands. Assume the
%               population and revenue data have been passed in
%               as the x and y arguments of this function.
%
% Hint: You can use the 'rx' option with plot to have the markers
%       appear as red crosses. Furthermore, you can make the
%       markers larger by using plot(..., 'rx', 'MarkerSize', 10);

plot(x, y, 'rx', 'MarkerSize', 10);
xlabel('Population of City in 10,000s');
ylabel('Profit in $10,000s');

% ============================================================
end
Running it produces a scatter plot of the training data (figure not reproduced here).
The second function:
computeCost.m // computes the cost J
Its signature:
function J = computeCost(X, y, theta)
So what is X?
x is originally an m x 1 column vector of input values; a column of ones is prepended to it, giving X = [ones(m, 1), x]. Multiplying by theta = [theta0; theta1] then handles the intercept term automatically.
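To see why prepending the column of ones works, here is a minimal NumPy sketch (the course itself uses Octave; the toy values below are made up for illustration):

```python
import numpy as np

# Hypothetical toy data standing in for the x column of ex1data1.txt.
x = np.array([1.0, 2.0, 3.0])
m = x.shape[0]

# Prepend a column of ones so the intercept theta0 is absorbed into the product.
X = np.column_stack([np.ones(m), x])   # shape (m, 2)
theta = np.array([0.5, 2.0])           # [theta0, theta1]

h = X @ theta                          # theta0 + theta1 * x for every example
print(h)                               # [2.5 4.5 6.5]
```

Each row of X @ theta is exactly theta0 * 1 + theta1 * x, i.e. the hypothesis evaluated at one training example.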
The PDF gives the cost function, J(theta) = (1/(2m)) * sum_i (h_theta(x^(i)) - y^(i))^2 with h_theta(x) = theta0 + theta1 * x, so we can write the code directly from it:
J = sum(((X * theta) - y).^2) / (2*m);
Explanation of the line above:
X * theta: X is an m x 2 matrix and theta is a 2 x 1 vector, so the product is an m x 1 vector of predictions. Subtracting y gives the residuals, .^2 squares them elementwise, sum adds them up, and dividing by 2m yields J.
The complete code is as follows:
function J = computeCost(X, y, theta)
%COMPUTECOST Compute cost for linear regression
%   J = COMPUTECOST(X, y, theta) computes the cost of using theta as the
%   parameter for linear regression to fit the data points in X and y

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly
J = 0;

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta
%               You should set J to the cost.

J = sum(((X * theta) - y).^2) / (2*m);

% =========================================================================
end
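The same computation can be sanity-checked with a NumPy equivalent (a sketch, not the course's grader; the data below is a made-up line y = 1 + 2x, for which the true theta must give zero cost):

```python
import numpy as np

def compute_cost(X, y, theta):
    """NumPy equivalent of the Octave computeCost: J = sum((X*theta - y).^2) / (2m)."""
    m = y.shape[0]
    residuals = X @ theta - y
    return np.sum(residuals ** 2) / (2 * m)

# Hypothetical data: y = 1 + 2x exactly, so theta = [1, 2] fits perfectly.
x = np.array([1.0, 2.0, 3.0])
y = 1.0 + 2.0 * x
X = np.column_stack([np.ones(3), x])

print(compute_cost(X, y, np.array([1.0, 2.0])))   # 0.0
print(compute_cost(X, y, np.array([0.0, 0.0])))   # sum(y.^2)/(2m) = 83/6
```

With theta = [0, 0] the predictions are all zero, so the cost reduces to sum(y^2)/(2m) = (9 + 25 + 49)/6, which is easy to verify by hand.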
gradientDescent.m
That is, it repeatedly updates theta and records the cost J at every step:
function [theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters)
%GRADIENTDESCENT Performs gradient descent to learn theta
%   theta = GRADIENTDESCENT(X, y, theta, alpha, num_iters) updates theta by
%   taking num_iters gradient steps with learning rate alpha

% Initialize some useful values
m = length(y); % number of training examples
J_history = zeros(num_iters, 1); % a (num_iters x 1) vector of zeros

for iter = 1:num_iters

    % ====================== YOUR CODE HERE ======================
    % Instructions: Perform a single gradient step on the parameter vector theta.
    %
    % Hint: While debugging, it can be useful to print out the values
    %       of the cost function (computeCost) and gradient here.

    temp1 = theta(1) - alpha * (1/m) * sum((X * theta) - y);
    temp2 = theta(2) - alpha * (1/m) * sum(((X * theta) - y) .* X(:,2)); % X(:,2) is the second column of X, i.e. x
    theta(1) = temp1;
    theta(2) = temp2;

    % ============================================================

    % Save the cost J in every iteration
    J_history(iter) = computeCost(X, y, theta);

end

end
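The two temp1/temp2 updates above can be collapsed into one vectorized step, theta := theta - (alpha/m) * X' * (X*theta - y). A NumPy sketch of this (toy data y = 1 + 2x is hypothetical; the course's own ex1data1.txt is not used here), showing that theta converges to the true [1, 2]:

```python
import numpy as np

def gradient_descent(X, y, theta, alpha, num_iters):
    """Vectorized form of the per-component temp1/temp2 updates:
    theta := theta - (alpha/m) * X' * (X*theta - y), repeated num_iters times."""
    m = y.shape[0]
    J_history = np.zeros(num_iters)
    for it in range(num_iters):
        theta = theta - (alpha / m) * (X.T @ (X @ theta - y))
        J_history[it] = np.sum((X @ theta - y) ** 2) / (2 * m)
    return theta, J_history

# Hypothetical data: y = 1 + 2x, so theta should approach [1, 2].
x = np.array([0.0, 1.0, 2.0, 3.0])
y = 1.0 + 2.0 * x
X = np.column_stack([np.ones(4), x])

theta, J_hist = gradient_descent(X, y, np.zeros(2), alpha=0.1, num_iters=2000)
print(np.round(theta, 3))   # [1. 2.]
```

Using temporaries (or the single vectorized expression) matters: both components of theta must be updated simultaneously from the *old* theta, not one after the other.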
OK. The remaining functions will be covered later.