Coursera "Machine Learning" Programming Assignment 1: Linear Regression
Linear Regression with One Variable

In this exercise, you are the CEO of a restaurant franchise and have collected a dataset of <city population, restaurant profit> pairs. You will use single-variable linear regression to predict which city would be a good place to open a new outlet.
Sample data: the first column is the city population; the second column is the profit

6.1101,17.592
5.5277,9.1302
8.5186,13.662
...
Data visualization: first load the data, storing the first column in X and the second column in y.
%% ======================= Part 2: Plotting =======================
fprintf('Plotting Data ...\n')
data = load('ex1data1.txt');
X = data(:, 1); y = data(:, 2);
m = length(y); % number of training examples

% Plot Data
% Note: You have to complete the code in plotData.m
plotData(X, y);

fprintf('Program paused. Press enter to continue.\n');
pause;
The plotData function: plot draws the data points with the specified marker color, shape, and size; xlabel and ylabel add the axis titles.
function plotData(x, y)
%PLOTDATA Plots the data points x and y into a new figure
%   PLOTDATA(x,y) plots the data points and gives the figure axes labels of
%   population and profit.

% ====================== YOUR CODE HERE ======================
% Instructions: Plot the training data into a figure using the
%               "figure" and "plot" commands. Set the axes labels using
%               the "xlabel" and "ylabel" commands. Assume the
%               population and revenue data have been passed in
%               as the x and y arguments of this function.
%
% Hint: You can use the 'rx' option with plot to have the markers
%       appear as red crosses. Furthermore, you can make the
%       markers larger by using plot(..., 'rx', 'MarkerSize', 10);

figure; % open a new figure window
plot(x, y, 'rx', 'MarkerSize', 10);
ylabel('Profit in $10,000s');
xlabel('Population of City in 10,000s');

% ============================================================
end
The resulting figure is a scatter plot of profit against city population.
Gradient Descent

1. Hypothesis Function

Given a set of sample data (the training set), we feed the data to a learning algorithm, which trains on it and learns a hypothesis function h. To predict the outcome for a new example, we pass the new data as input to the hypothesis function; the value it computes is the prediction.
The general form of the hypothesis function is

h_θ(x) = θ0·x0 + θ1·x1 + ... + θn·xn = θᵀx (with x0 = 1)

where θ are the model weights and x is the input feature vector.

Each input example can be viewed as a vector with n + 1 features (including the constant feature x0 = 1). In this exercise there is a single feature, the city's population; in a house-price example, the input x could include a whole series of features such as the floor area and the number of bedrooms. Once the weights θ are determined, the hypothesis function h can predict the output y for any new example.
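As a quick numeric illustration of the hypothesis (a Python/NumPy sketch, not part of the MATLAB assignment; the θ values here are made up for the example), evaluating h_θ(x) is just a dot product after prepending the constant feature x0 = 1:

```python
import numpy as np

# Hypothetical weights theta = [theta0, theta1] and a city with
# population 6.1101 (in units of 10,000 people, as in the dataset)
theta = np.array([-3.0, 1.2])
x = np.array([1.0, 6.1101])  # prepend the bias feature x0 = 1

h = theta @ x  # h_theta(x) = theta0*x0 + theta1*x1
print(h)       # predicted profit in units of $10,000
```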
2. Cost Function

The cost function is defined as

J(θ) = (1 / (2m)) · Σ_{i=1}^{m} (h_θ(x^(i)) − y^(i))²

where m is the number of training examples (the size of the training set), x^(i) is the feature vector of the i-th example, and y^(i) is the actual output of the i-th example.

Our task now is to choose the best parameters θ for our model, i.e., minimize the cost function so that J(θ) attains its smallest value.
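To make the formula concrete, here is a small NumPy sketch (illustrative only, not the assignment's MATLAB code) that evaluates J(θ) on the three sample rows shown earlier:

```python
import numpy as np

# The first three rows of ex1data1.txt: population, profit
data = np.array([[6.1101, 17.592],
                 [5.5277, 9.1302],
                 [8.5186, 13.662]])
X = np.column_stack([np.ones(len(data)), data[:, 0]])  # add x0 = 1
y = data[:, 1]
m = len(y)

def compute_cost(X, y, theta):
    """J(theta) = 1/(2m) * sum((X @ theta - y)^2)"""
    residual = X @ theta - y
    return residual @ residual / (2 * m)

# With theta = [0, 0] every prediction is 0, so J = sum(y^2) / (2m)
print(compute_cost(X, y, np.zeros(2)))
```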
3. Gradient Descent

Gradient descent is an algorithm for finding the minimum of a function; we will use it to minimize the cost function J(θ).

Gradient descent finds the parameters with the update rule

θ_j := θ_j − α · ∂J(θ)/∂θ_j (simultaneously for every j)

where the subscript j indexes the j-th parameter and α is the learning rate.

Taking the partial derivative of the cost function gives

∂J(θ)/∂θ_j = (1/m) · Σ_{i=1}^{m} (h_θ(x^(i)) − y^(i)) · x_j^(i)

so the update becomes θ_j := θ_j − (α/m) · Σ_{i=1}^{m} (h_θ(x^(i)) − y^(i)) · x_j^(i).

When every step is trained on all m examples in a batch, the algorithm is called batch gradient descent.
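The update rule above can be sanity-checked with a tiny NumPy sketch (the synthetic data and hyperparameters here are made up for the illustration; the assignment itself uses MATLAB):

```python
import numpy as np

# Synthetic data generated from y = 2 + 3*x, so batch gradient
# descent should drive theta toward [2, 3]
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=50)
y = 2 + 3 * x
X = np.column_stack([np.ones_like(x), x])  # add x0 = 1
m = len(y)

theta = np.zeros(2)
alpha = 0.01
for _ in range(5000):
    error = X @ theta - y   # h_theta(x^(i)) - y^(i) for all i at once
    grad = X.T @ error / m  # (1/m) * sum over i of error_i * x_j^(i)
    theta -= alpha * grad   # simultaneous update of all theta_j

print(theta)  # close to [2, 3]
```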
4. MATLAB Implementation

Computing the cost function:
function J = computeCost(X, y, theta)
%COMPUTECOST Compute cost for linear regression
%   J = COMPUTECOST(X, y, theta) computes the cost of using theta as the
%   parameter for linear regression to fit the data points in X and y

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly
J = 0;

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta
%               You should set J to the cost.

prediction = X * theta;
sqr = (prediction - y).^2;
J = 1 / (2 * m) * sum(sqr);

% =========================================================================
end
The gradient descent algorithm:
function [theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters)
%GRADIENTDESCENT Performs gradient descent to learn theta
%   theta = GRADIENTDESCENT(X, y, theta, alpha, num_iters) updates theta by
%   taking num_iters gradient steps with learning rate alpha

% Initialize some useful values
m = length(y); % number of training examples
J_history = zeros(num_iters, 1);

for iter = 1:num_iters
    % ====================== YOUR CODE HERE ======================
    % Instructions: Perform a single gradient step on the parameter vector
    %               theta.
    %
    % Hint: While debugging, it can be useful to print out the values
    %       of the cost function (computeCost) and gradient here.
    %
    prediction = X * theta;
    error = prediction - y;
    sums = X' * error;
    delta = 1 / m * sums;
    theta = theta - alpha * delta;
    % ============================================================
    % Save the cost J in every iteration
    J_history(iter) = computeCost(X, y, theta);
end
end
Visualizing the fitted regression line (the line X·θ drawn over the scatter plot):
Visualizing the cost function: the surface of J(θ0, θ1). With only one feature (population), the model has two parameters: θ0 corresponds to x0 and θ1 corresponds to x1.
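A small NumPy sketch of the same idea (illustrative only; the data here is synthetic): evaluate J over a grid of (θ0, θ1) values and check that the grid minimum sits near the true parameters. In MATLAB the resulting matrix would then be drawn with surf and contour.

```python
import numpy as np

# Data generated from y = 1 + 2*x, so J(theta0, theta1) is minimized
# near (theta0, theta1) = (1, 2)
x = np.linspace(0, 5, 20)
y = 1 + 2 * x
X = np.column_stack([np.ones_like(x), x])
m = len(y)

# Grid of parameter values, as in the assignment's surface plot
theta0_vals = np.linspace(-10, 10, 101)
theta1_vals = np.linspace(-1, 4, 101)
J = np.zeros((len(theta0_vals), len(theta1_vals)))
for i, t0 in enumerate(theta0_vals):
    for j, t1 in enumerate(theta1_vals):
        r = X @ np.array([t0, t1]) - y
        J[i, j] = r @ r / (2 * m)

# Locate the grid point with the smallest cost
i, j = np.unravel_index(np.argmin(J), J.shape)
print(theta0_vals[i], theta1_vals[j])  # grid point closest to (1, 2)
```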