Programming Exercise 2: Logistic Regression Machine Learning
Source: Internet · Editor: 程序博客网 · Date: 2024/06/05 23:45
Hello everyone. In this post I work through the second programming assignment (logistic regression) from Andrew Ng's Machine Learning course on Coursera.
(1) sigmoid.m
function g = sigmoid(z)
%SIGMOID Compute sigmoid function
%   g = SIGMOID(z) computes the sigmoid of z.

% You need to return the following variables correctly
g = zeros(size(z));

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the sigmoid of each value of z (z can be a matrix,
%               vector or scalar).

g = 1 ./ (1 + exp(-z));   % element-wise division handles scalars, vectors, and matrices

% =============================================================
end
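For readers following along in Python, the same element-wise sigmoid can be sketched with NumPy. The function name here mirrors the assignment's, but this snippet is my own illustration, not part of the course files:

```python
import numpy as np

def sigmoid(z):
    """Element-wise sigmoid; z may be a scalar, vector, or matrix."""
    return 1.0 / (1.0 + np.exp(-z))

# sigmoid(0) is exactly 0.5; large positive inputs approach 1,
# large negative inputs approach 0.
print(sigmoid(0))                                # 0.5
print(sigmoid(np.array([-10.0, 0.0, 10.0])))
```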
(2) costFunction.m
function [J, grad] = costFunction(theta, X, y)
%COSTFUNCTION Compute cost and gradient for logistic regression
%   J = COSTFUNCTION(theta, X, y) computes the cost of using theta as the
%   parameter for logistic regression and the gradient of the cost
%   w.r.t. the parameters.

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly
J = 0;
grad = zeros(size(theta));

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta.
%               You should set J to the cost.
%               Compute the partial derivatives and set grad to the partial
%               derivatives of the cost w.r.t. each parameter in theta
%
% Note: grad should have the same dimensions as theta

h = sigmoid(X * theta);
J = -1/m * (log(h)' * y + log(1 - h)' * (1 - y));   % both inner products are scalars
grad = 1/m * (X' * (h - y));

% =============================================================
end
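A NumPy sketch of the same vectorized cost and gradient follows; the toy data and function name are mine, chosen only to illustrate the formulas. A useful sanity check from the course: with theta all zeros, every prediction is 0.5, so the cost must equal log(2) ≈ 0.693:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost_function(theta, X, y):
    """Unregularized logistic-regression cost J and its gradient."""
    m = len(y)
    h = sigmoid(X @ theta)
    J = -(y @ np.log(h) + (1 - y) @ np.log(1 - h)) / m
    grad = X.T @ (h - y) / m
    return J, grad

# Four examples, intercept column plus one feature (illustrative data).
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([0.0, 0.0, 1.0, 1.0])
J, grad = cost_function(np.zeros(2), X, y)
print(J)  # ≈ 0.6931 (= log 2, since every h is 0.5)
```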
(3) predict.m
function p = predict(theta, X)
%PREDICT Predict whether the label is 0 or 1 using learned logistic
%   regression parameters theta
%   p = PREDICT(theta, X) computes the predictions for X using a
%   threshold at 0.5 (i.e., if sigmoid(theta'*x) >= 0.5, predict 1)

m = size(X, 1); % Number of training examples

% You need to return the following variables correctly
p = zeros(m, 1);

% ====================== YOUR CODE HERE ======================
% Instructions: Complete the following code to make predictions using
%               your learned logistic regression parameters.
%               You should set p to a vector of 0's and 1's

p(sigmoid(X * theta) >= 0.5) = 1;   % logical indexing: mark every example past the threshold

% =========================================================================
end
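The thresholding step translates directly to a boolean comparison in NumPy. The parameters below are an illustrative example of mine (chosen so the decision boundary sits at x = 2), not values from the assignment:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def predict(theta, X):
    """Threshold the hypothesis at 0.5 to get 0/1 labels."""
    return (sigmoid(X @ theta) >= 0.5).astype(int)

theta = np.array([-4.0, 2.0])                       # boundary where -4 + 2x = 0, i.e. x = 2
X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])  # intercept column plus one feature
print(predict(theta, X))  # [0 1 1]  (x = 2 gives sigmoid(0) = 0.5, which counts as 1)
```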
(4) costFunctionReg.m
function [J, grad] = costFunctionReg(theta, X, y, lambda)
%COSTFUNCTIONREG Compute cost and gradient for logistic regression with regularization
%   J = COSTFUNCTIONREG(theta, X, y, lambda) computes the cost of using
%   theta as the parameter for regularized logistic regression and the
%   gradient of the cost w.r.t. the parameters.

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly
J = 0;
grad = zeros(size(theta));

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta.
%               You should set J to the cost.
%               Compute the partial derivatives and set grad to the partial
%               derivatives of the cost w.r.t. each parameter in theta

n = length(theta);
h = sigmoid(X * theta);
theta2 = theta(2:n);   % theta(1) multiplies the intercept and is not regularized
J = -1/m * (log(h)' * y + log(1 - h)' * (1 - y)) + lambda/(2*m) * sum(theta2.^2);
grad_orig = 1/m * (X' * (h - y));
grad = grad_orig + lambda/m * theta;
grad(1) = grad_orig(1);   % restore the unregularized gradient for the intercept term

% =============================================================
end
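The key detail above is that the intercept parameter is excluded from the penalty. A NumPy sketch makes that explicit by slicing off index 0; the data and lambda value are illustrative choices of mine, and the check at the end confirms that only the regularization term separates the lambda = 2 and lambda = 0 costs:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost_function_reg(theta, X, y, lam):
    """Regularized cost/gradient; theta[0] (the intercept) is not penalized."""
    m = len(y)
    h = sigmoid(X @ theta)
    J = -(y @ np.log(h) + (1 - y) @ np.log(1 - h)) / m
    J += lam / (2 * m) * np.sum(theta[1:] ** 2)      # skip theta[0]
    grad = X.T @ (h - y) / m
    grad[1:] += lam / m * theta[1:]                  # intercept gradient left untouched
    return J, grad

# Two examples, m = 2; with theta = [0, 2] and lam = 2 the penalty is
# lam/(2m) * theta[1]^2 = 2/4 * 4 = 2, so the costs should differ by exactly 2.
X = np.array([[1.0, 0.5], [1.0, -0.5]])
y = np.array([1.0, 0.0])
theta = np.array([0.0, 2.0])
J_reg, g_reg = cost_function_reg(theta, X, y, 2.0)
J_un,  g_un  = cost_function_reg(theta, X, y, 0.0)
print(J_reg - J_un)  # 2.0
```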