Stanford Machine Learning Course: Linear Regression Programming Assignment (Part 2)
Source: Internet · Editor: 程序博客网 · Date: 2024/05/17 09:09
Continuing from the previous post, the code for the computeCost function is:
```matlab
function J = computeCost(X, y, theta)
%COMPUTECOST Compute cost for linear regression
%   J = COMPUTECOST(X, y, theta) computes the cost of using theta as the
%   parameter for linear regression to fit the data points in X and y

% Initialize some useful values
m = length(y); % number of training examples

% Compute the cost of this choice of theta:
% the sum of squared residuals divided by 2m
J = sum((X * theta - y).^2) / (2*m); % X is m-by-2, theta is 2-by-1

end
```
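The same vectorized cost can be cross-checked in Python with NumPy (a sketch, not part of the original assignment; the toy data below is made up for illustration):

```python
import numpy as np

def compute_cost(X, y, theta):
    """Squared-error cost J(theta) = sum((X*theta - y)^2) / (2m),
    mirroring the MATLAB computeCost above."""
    m = len(y)
    residuals = X @ theta - y
    return np.sum(residuals ** 2) / (2 * m)

# Toy data exactly on the line y = 1 + 2*x, so the cost at theta = [1, 2] is 0
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])  # first column is the intercept term
y = np.array([1.0, 3.0, 5.0])
print(compute_cost(X, y, np.array([1.0, 2.0])))  # 0.0
print(compute_cost(X, y, np.array([0.0, 0.0])))  # (1 + 9 + 25) / 6 = 5.8333...
```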
The code for the gradientDescent function is:
```matlab
function [theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters)
%GRADIENTDESCENT Performs gradient descent to learn theta
%   theta = GRADIENTDESCENT(X, y, theta, alpha, num_iters) updates theta by
%   taking num_iters gradient steps with learning rate alpha

% Initialize some useful values
m = length(y); % number of training examples
J_history = zeros(num_iters, 1);
theta_s = theta; % copy of theta, so both components are updated simultaneously

for iter = 1:num_iters
    % Update theta(1) and theta(2) simultaneously, using the old values in theta_s
    theta(1) = theta(1) - alpha / m * sum(X * theta_s - y);
    theta(2) = theta(2) - alpha / m * sum((X * theta_s - y) .* X(:,2));
    theta_s = theta;

    % Save the cost J in every iteration
    J_history(iter) = computeCost(X, y, theta);
end

end
```

The fitted linear model is shown in the figure below:
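The simultaneous-update logic above (note the theta_s copy, which makes both parameter updates use the same old theta) can be sketched in NumPy as follows; the learning rate and toy data are illustrative assumptions, not values from the assignment:

```python
import numpy as np

def gradient_descent(X, y, theta, alpha, num_iters):
    """Batch gradient descent. Both components of theta are updated from
    the same residual vector, mirroring the theta_s copy in the MATLAB code."""
    m = len(y)
    J_history = np.zeros(num_iters)
    for it in range(num_iters):
        residuals = X @ theta - y              # computed once, from the *old* theta
        theta = theta - (alpha / m) * (X.T @ residuals)
        J_history[it] = np.sum((X @ theta - y) ** 2) / (2 * m)
    return theta, J_history

# Toy data drawn from y = 1 + 2*x with no noise, so theta should approach [1, 2]
x = np.linspace(0, 1, 20)
X = np.column_stack([np.ones_like(x), x])
y = 1 + 2 * x
theta, J_history = gradient_descent(X, y, np.zeros(2), alpha=0.5, num_iters=2000)
print(theta)  # close to [1. 2.]
```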
4. Visualizing the relationship between the cost function and the parameters
```matlab
%% ============= Part 4: Visualizing J(theta_0, theta_1) =============
fprintf('Visualizing J(theta_0, theta_1) ...\n')

% Grid over which we will calculate J
theta0_vals = linspace(-10, 10, 100);
theta1_vals = linspace(-1, 4, 100);

% Initialize J_vals to a matrix of 0's
J_vals = zeros(length(theta0_vals), length(theta1_vals));

% Fill out J_vals
for i = 1:length(theta0_vals)
    for j = 1:length(theta1_vals)
        t = [theta0_vals(i); theta1_vals(j)];
        J_vals(i,j) = computeCost(X, y, t);
    end
end

% Because of the way meshgrids work in the surf command, we need to
% transpose J_vals before calling surf, or else the axes will be flipped
J_vals = J_vals';

% Surface plot
figure;
surf(theta0_vals, theta1_vals, J_vals)
xlabel('\theta_0'); ylabel('\theta_1');

% Contour plot
figure;
% Plot J_vals as 20 contours spaced logarithmically between 0.01 and 1000
contour(theta0_vals, theta1_vals, J_vals, logspace(-2, 3, 20))
xlabel('\theta_0'); ylabel('\theta_1');
hold on;
plot(theta(1), theta(2), 'rx', 'MarkerSize', 10, 'LineWidth', 2);
```
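The grid evaluation can be reproduced in NumPy to confirm that the cost surface bottoms out near the fitted parameters (a sketch with made-up data; the grid ranges match the MATLAB script above):

```python
import numpy as np

def compute_cost(X, y, theta):
    m = len(y)
    return np.sum((X @ theta - y) ** 2) / (2 * m)

# Toy data from y = 1 + 2*x, so the minimum of J should sit near (1, 2)
x = np.linspace(0, 1, 20)
X = np.column_stack([np.ones_like(x), x])
y = 1 + 2 * x

# Same grid as the MATLAB script
theta0_vals = np.linspace(-10, 10, 100)
theta1_vals = np.linspace(-1, 4, 100)
J_vals = np.zeros((len(theta0_vals), len(theta1_vals)))
for i, t0 in enumerate(theta0_vals):
    for j, t1 in enumerate(theta1_vals):
        J_vals[i, j] = compute_cost(X, y, np.array([t0, t1]))

# The argmin of the grid should land near the true parameters (1, 2)
i_min, j_min = np.unravel_index(np.argmin(J_vals), J_vals.shape)
print(theta0_vals[i_min], theta1_vals[j_min])
```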
The resulting figures are shown below:
The multivariable linear regression model and program will be covered in the next post.