Stanford Machine Learning Course: Linear Regression Programming Assignment 2 (Multivariate, Part 2)
The code for the gradientDescentMulti function is:
function [theta, J_history] = gradientDescentMulti(X, y, theta, alpha, num_iters)
%GRADIENTDESCENTMULTI Performs gradient descent to learn theta
%   theta = GRADIENTDESCENTMULTI(x, y, theta, alpha, num_iters) updates theta by
%   taking num_iters gradient steps with learning rate alpha

% Initialize some useful values
m = length(y); % number of training examples
J_history = zeros(num_iters, 1);

for iter = 1:num_iters

    % ====================== YOUR CODE HERE ======================
    % Instructions: Perform a single gradient step on the parameter vector
    %               theta.
    %
    % Hint: While debugging, it can be useful to print out the values
    %       of the cost function (computeCostMulti) and gradient here.
    %

    theta = theta - alpha / m * X' * (X * theta - y);

    % ============================================================

    % Save the cost J in every iteration
    J_history(iter) = computeCostMulti(X, y, theta);

end

end
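The same vectorized update can be sketched in Python with NumPy (a minimal equivalent of the MATLAB function above, not part of the original assignment; the inlined cost formula stands in for computeCostMulti):

```python
import numpy as np

def gradient_descent_multi(X, y, theta, alpha, num_iters):
    """Vectorized batch gradient descent, mirroring gradientDescentMulti.m."""
    m = len(y)
    J_history = np.zeros(num_iters)
    for i in range(num_iters):
        # theta := theta - (alpha/m) * X' * (X*theta - y)
        theta = theta - (alpha / m) * X.T @ (X @ theta - y)
        # cost J = (1/2m) * sum((X*theta - y).^2), as computeCostMulti would return
        J_history[i] = (1.0 / (2 * m)) * np.sum((X @ theta - y) ** 2)
    return theta, J_history
```

Because the update multiplies the full X' and residual vector at once, no inner loop over features is needed, so the same code works for any number of variables.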
The resulting plot (cost J against the number of iterations) is:
4. Normal Equations
%% ================ Part 3: Normal Equations ================

fprintf('Solving with normal equations...\n');

% ====================== YOUR CODE HERE ======================
% Instructions: The following code computes the closed form
%               solution for linear regression using the normal
%               equations. You should complete the code in
%               normalEqn.m
%
%               After doing so, you should complete this code
%               to predict the price of a 1650 sq-ft, 3 br house.
%

%% Load Data
data = csvread('ex1data2.txt');
X = data(:, 1:2);
y = data(:, 3);
m = length(y);

% Add intercept term to X
X = [ones(m, 1) X];

% Calculate the parameters from the normal equation
theta = normalEqn(X, y);

% Display normal equation's result
fprintf('Theta computed from the normal equations: \n');
fprintf(' %f \n', theta);
fprintf('\n');

% Estimate the price of a 1650 sq-ft, 3 br house
% ====================== YOUR CODE HERE ======================
price = 0; % You should change this

% ============================================================
fprintf(['Predicted price of a 1650 sq-ft, 3 br house ' ...
         '(using normal equations):\n $%f\n'], price);
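The script above leaves `price = 0` as a placeholder. The missing step is to form a feature row with the intercept term and multiply by theta; since the normal equations were used, no feature scaling is applied, so the raw values 1650 and 3 are used directly. A NumPy sketch (the function name `predict_price` is my own, not from the assignment):

```python
import numpy as np

def predict_price(theta, sqft, bedrooms):
    """Predict a house price from normal-equation theta.

    The feature row must match the columns of X: [intercept, sq-ft, bedrooms].
    """
    x = np.array([1.0, float(sqft), float(bedrooms)])
    return x @ theta
```

In the MATLAB script this corresponds to replacing the placeholder with `price = [1 1650 3] * theta;`.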
The normalEqn code is:
function [theta] = normalEqn(X, y)
%NORMALEQN Computes the closed-form solution to linear regression
%   NORMALEQN(X,y) computes the closed-form solution to linear
%   regression using the normal equations.

theta = zeros(size(X, 2), 1);

% ====================== YOUR CODE HERE ======================
% Instructions: Complete the code to compute the closed form solution
%               to linear regression and put the result in theta.
%

% ---------------------- Sample Solution ----------------------

theta = pinv(X' * X) * X' * y;

% -------------------------------------------------------------

% ============================================================

end
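The closed-form solution theta = pinv(X'X) X'y translates line for line to NumPy (a sketch of the same computation, not part of the original exercise files; `np.linalg.pinv` plays the role of MATLAB's `pinv` and handles a singular X'X):

```python
import numpy as np

def normal_eqn(X, y):
    """Closed-form least squares, as in normalEqn.m: theta = pinv(X'X) X'y."""
    return np.linalg.pinv(X.T @ X) @ X.T @ y
```

Unlike gradient descent, this solves for theta in one step with no learning rate and no iteration, which is why the script's prediction needs no feature scaling.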