Deep Learning UFLDL Latest Tutorial Study Notes 2: Logistic Regression
1 Logistic Regression Overview
Linear regression models a continuous target, whereas logistic regression deals with a discrete one: put simply, it decides whether a training example belongs to class 1 or class 0. This naturally suggests working with probabilities. We compute the probability that an example belongs to class 1 (and, complementarily, to class 0) and classify the example accordingly; using probabilities also turns the discrete problem into a continuous one.
Specifically, we will try to learn a function of the form:

$$P(y=1 \mid x) = h_\theta(x) = \frac{1}{1+\exp(-\theta^\top x)} \equiv \sigma(\theta^\top x),$$
$$P(y=0 \mid x) = 1 - P(y=1 \mid x) = 1 - h_\theta(x).$$

The function $\sigma(z) = \frac{1}{1+\exp(-z)}$ is often called the "sigmoid" or "logistic" function; it squashes $\theta^\top x$ into the range $[0,1]$, so $h_\theta(x)$ can be interpreted as a probability.
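The exercise code below calls a sigmoid helper. The starter code is expected to put one on the path, but if yours does not, it is a one-liner; a minimal sketch (only the element-wise logistic function is assumed):

function h = sigmoid(z)
  % Element-wise logistic function; works for scalars, vectors, and matrices.
  h = 1 ./ (1 + exp(-z));
end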
We only need to compute the probability that y = 1. The cost function is:
$$J(\theta) = -\sum_i \left( y^{(i)} \log\big(h_\theta(x^{(i)})\big) + (1-y^{(i)}) \log\big(1-h_\theta(x^{(i)})\big) \right).$$
Apart from the objective function being different, the computation is exactly the same as for linear regression; the gradient is given below.
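Since the exercise code also has to return the gradient, note that differentiating $J(\theta)$ gives

$$\frac{\partial J(\theta)}{\partial \theta_j} = \sum_i x^{(i)}_j \left( h_\theta(x^{(i)}) - y^{(i)} \right),$$

which has the same form as the linear regression gradient, with $h_\theta(x) = \sigma(\theta^\top x)$ in place of $\theta^\top x$.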
OK, next let's look at how to do the exercise.
2 Exercise 1B Solution
This exercise uses the MNIST data to classify the handwritten digits 0 and 1.
I'll paste the code directly:
ex1b_regression.m (no changes needed)
addpath ../common
addpath ../common/minFunc_2012/minFunc
addpath ../common/minFunc_2012/minFunc/compiled

% Load the MNIST data for this exercise.
% train.X and test.X will contain the training and testing images.
%   Each matrix has size [n,m] where:
%      m is the number of examples.
%      n is the number of pixels in each image.
% train.y and test.y will contain the corresponding labels (0 or 1).
binary_digits = true;
[train,test] = ex1_load_mnist(binary_digits);

% Add row of 1s to the dataset to act as an intercept term.
train.X = [ones(1,size(train.X,2)); train.X];
test.X = [ones(1,size(test.X,2)); test.X];

% Training set dimensions
m=size(train.X,2);
n=size(train.X,1);

% Train logistic regression classifier using minFunc
options = struct('MaxIter', 100);

% First, we initialize theta to some small random values.
theta = rand(n,1)*0.001;

% Call minFunc with the logistic_regression.m file as the objective function.
%
% TODO:  Implement batch logistic regression in the logistic_regression.m file!
%
%tic;
%theta=minFunc(@logistic_regression, theta, options, train.X, train.y);
%fprintf('Optimization took %f seconds.\n', toc);

% Now, call minFunc again with logistic_regression_vec.m as objective.
%
% TODO:  Implement batch logistic regression in logistic_regression_vec.m using
% MATLAB's vectorization features to speed up your code.  Compare the running
% time for your logistic_regression.m and logistic_regression_vec.m implementations.
%
% Uncomment the lines below to run your vectorized code.
%theta = rand(n,1)*0.001;
tic;
theta=minFunc(@logistic_regression_vec, theta, options, train.X, train.y);
fprintf('Optimization took %f seconds.\n', toc);

% Print out training accuracy.
tic;
accuracy = binary_classifier_accuracy(theta,train.X,train.y);
fprintf('Training accuracy: %2.1f%%\n', 100*accuracy);

% Print out accuracy on the test set.
accuracy = binary_classifier_accuracy(theta,test.X,test.y);
fprintf('Test accuracy: %2.1f%%\n', 100*accuracy);
logistic_regression.m
function [f,g] = logistic_regression(theta, X, y)
  %
  % Arguments:
  %   theta - A column vector containing the parameter values to optimize.
  %   X - The examples stored in a matrix.
  %       X(i,j) is the i'th coordinate of the j'th example.
  %   y - The label for each example.  y(j) is the j'th example's label.
  %

  m=size(X,2);
  n=size(X,1);

  % initialize objective value and gradient.
  f = 0;
  g = zeros(size(theta));

  %
  % TODO:  Compute the objective function by looping over the dataset and summing
  %        up the objective values for each example.  Store the result in 'f'.
  %
  % TODO:  Compute the gradient of the objective by looping over the dataset and summing
  %        up the gradients (df/dtheta) for each example. Store the result in 'g'.
  %
%%% YOUR CODE HERE %%%

  % Step 1: compute the cost function (negative log-likelihood).
  for i = 1:m
    f = f - (y(i)*log(sigmoid(theta' * X(:,i))) + ...
             (1-y(i))*log(1 - sigmoid(theta' * X(:,i))));
  end

  % Step 2: compute the gradient, one component at a time.
  for j = 1:n
    for i = 1:m
      g(j) = g(j) + X(j,i)*(sigmoid(theta' * X(:,i)) - y(i));
    end
  end
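ex1b_regression.m actually optimizes logistic_regression_vec.m, which the tutorial asks you to write yourself using MATLAB's vectorization features; the loops above collapse into a few matrix operations. A minimal sketch, assuming the same sigmoid helper and argument convention as above:

function [f,g] = logistic_regression_vec(theta, X, y)
  % Vectorized objective and gradient for binary logistic regression.
  % X is n-by-m (one example per column); y is 1-by-m with entries 0 or 1.
  h = sigmoid(theta' * X);                      % 1-by-m row of predictions h_theta(x^(i))
  f = -(log(h) * y' + log(1 - h) * (1 - y)');   % negative log-likelihood
  g = X * (h - y)';                             % n-by-1 gradient
end

On this exercise the vectorized version is far faster than the double loop, which is exactly the comparison the commented-out code in ex1b_regression.m is there to demonstrate.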
ex1_load_mnist.m (no changes needed)
function [train, test] = ex1_load_mnist(binary_digits)

  % Load the training data
  X=loadMNISTImages('train-images-idx3-ubyte');  % 784x60000: 60000 images of 28x28 pixels
  y=loadMNISTLabels('train-labels-idx1-ubyte')'; % 1x60000

  if (binary_digits)
    % Take only the 0 and 1 digits
    X = [ X(:,y==0), X(:,y==1) ]; % y==0 and y==1 directly give the indices of the 0 and 1 examples
    y = [ y(y==0), y(y==1) ];
  end

  % Randomly shuffle the data
  I = randperm(length(y));
  y=y(I); % labels in range 1 to 10
  X=X(:,I);

  % We standardize the data so that each pixel will have roughly zero mean and unit variance.
  s=std(X,[],2); % per-pixel standard deviation across examples
  m=mean(X,2);
  X=bsxfun(@minus, X, m);
  X=bsxfun(@rdivide, X, s+.1); % computes (x-m)/s; adding 0.1 keeps the denominator from being 0

  % Place these in the training set
  train.X = X;
  train.y = y;

  % Load the testing data
  X=loadMNISTImages('t10k-images-idx3-ubyte');
  y=loadMNISTLabels('t10k-labels-idx1-ubyte')';

  if (binary_digits)
    % Take only the 0 and 1 digits
    X = [ X(:,y==0), X(:,y==1) ];
    y = [ y(y==0), y(y==1) ];
  end

  % Randomly shuffle the data
  I = randperm(length(y));
  y=y(I); % labels in range 1 to 10
  X=X(:,I);

  % Standardize using the same mean and scale as the training data.
  X=bsxfun(@minus, X, m);
  X=bsxfun(@rdivide, X, s+.1);

  % Place these in the testing set
  test.X=X;
  test.y=y;
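Finally, ex1b_regression.m reports accuracy through binary_classifier_accuracy, which ships with the starter code in common/ and is not listed in this post; a minimal version consistent with how it is called above would be:

function accuracy = binary_classifier_accuracy(theta, X, y)
  % Predict 1 whenever P(y=1|x) = sigmoid(theta'*x) > 0.5, then report
  % the fraction of predictions that agree with the labels.
  correct = sum(y == (sigmoid(theta' * X) > 0.5));
  accuracy = correct / length(y);
end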
[Note: this is an original article; when reposting, please credit the source: blog.csdn.net/songrotek. QQ for discussion: 363523441]