Andrew Ng UFLDL Lecture Notes Study Code: Logistic Regression
A quick review:
In the video and the lecture notes, the explanation of why linear regression uses mean squared error as its loss function is quite interesting: if the errors are assumed to be Gaussian, maximizing the likelihood turns out to be equivalent to minimizing the squared error. It is a nice proof technique, converting the problem into an equivalent one, and is worth thinking through carefully.
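The equivalence described above can be written out explicitly. Assume each target is a linear function plus Gaussian noise, $y^{(i)} = w^{\top}x^{(i)} + \epsilon^{(i)}$ with $\epsilon^{(i)} \sim \mathcal{N}(0, \sigma^{2})$; the log-likelihood of the training set is then

```latex
\ell(w) = \log \prod_{i=1}^{m} \frac{1}{\sqrt{2\pi}\,\sigma}
          \exp\!\Big(-\frac{\big(y^{(i)} - w^{\top}x^{(i)}\big)^{2}}{2\sigma^{2}}\Big)
        = m \log \frac{1}{\sqrt{2\pi}\,\sigma}
          - \frac{1}{2\sigma^{2}} \sum_{i=1}^{m} \big(y^{(i)} - w^{\top}x^{(i)}\big)^{2}
```

Since the first term and $\sigma$ do not depend on $w$, maximizing $\ell(w)$ is exactly minimizing $\sum_{i}(y^{(i)} - w^{\top}x^{(i)})^{2}$, i.e. the squared-error objective.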
For logistic regression, the objective function follows directly from the maximum-likelihood model and is easy to derive. The dataset used here is the MNIST handwritten-digit database; this experiment classifies the digits 0 and 1. I have also uploaded the dataset here.
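For reference, the maximum-likelihood objective used here: with the hypothesis $h_w(x) = 1/(1 + e^{-w^{\top}x})$ and labels $y \in \{0, 1\}$, the log-likelihood and its gradient are

```latex
\ell(w) = \sum_{i=1}^{m} \Big[ y^{(i)} \log h_w(x^{(i)})
          + \big(1 - y^{(i)}\big) \log\big(1 - h_w(x^{(i)})\big) \Big],
\qquad
\frac{\partial \ell(w)}{\partial w}
  = \sum_{i=1}^{m} \big(y^{(i)} - h_w(x^{(i)})\big)\, x^{(i)}
```

Gradient ascent on $\ell(w)$ gives the update $w \leftarrow w + \alpha \sum_{i} (y^{(i)} - h_w(x^{(i)}))\, x^{(i)}$, which is exactly the batch update used in the code below.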
clc
clear all

X = load('mnist_all');
mytrain0 = X.train0;
mytrain1 = X.train1;
mytest0 = X.test0;
mytest1 = X.test1;

% Build the test set. Applying ismember(.,0) twice binarizes the pixels:
% the result is 1 where the original pixel is nonzero, 0 elsewhere.
test = [mytest0', mytest1'];
test = ismember(test, 0);
test = ismember(test, 0);
test = double([ones(1, size(test, 2)); test]);   % prepend the bias row
testy = [zeros(size(mytest0, 1), 1); ones(size(mytest1, 1), 1)];

% Build the training set the same way, then shuffle it.
num0 = size(mytrain0, 1);
num1 = size(mytrain1, 1);
train = [mytrain0', mytrain1'];
train = ismember(train, 0);
train = ismember(train, 0);
numall = size(train, 2);
index = randperm(numall);
y = [zeros(num0, 1); ones(num1, 1)];
train = train(:, index);
y = y(index);
train = double([ones(1, numall); train]);

learnRate = 0.1;
w = rand(size(train, 1), 1);
% my_sigmod is a helper defined in my_sigmod.m: my_sigmod(z) = 1./(1+exp(-z))

%% stochastic gradient descent
iter = 100;
% for j = 1:iter
%     for i = 1:numall
%         dis = my_sigmod(w'*train(:,i)) - y(i);
%         luckyrate = train(:,i)*dis;
%         w = w - learnRate*luckyrate;
%     end
%     w_f = w;
% end

%% batch gradient descent
for f = 1:100
    % gradient of the log-likelihood: sum_i (y_i - h(x_i)) * x_i
    temp = repmat(y' - my_sigmod(w'*train), size(train, 1), 1);
    dis = temp.*train;
    sum_ = sum(dis, 2);
    w = w + learnRate*sum_;
end
w_f = w;

% Evaluate on the test set.
num_correct = 0;
num_error = 0;
num_all = size(test, 2);
for i = 1:num_all
    if my_sigmod(w_f'*test(:,i)) > 0.5
        preValue = 1;
    else
        preValue = 0;
    end
    if preValue == testy(i)
        num_correct = num_correct + 1;
    else
        num_error = num_error + 1;
    end
end
fprintf('error predict: %d ; ', num_error);
fprintf('correct predict: %d ; ', num_correct);
fprintf('correct Rate: %f ;', num_correct/num_all);
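For readers without MATLAB, the same batch gradient-ascent update can be sketched in NumPy. Since mnist_all.mat is a MATLAB file, this sketch runs on a small synthetic two-class problem standing in for the binarized 0/1 digit pixels; the function names (`sigmoid`, `train_logreg`) and the synthetic data are my own, not part of the original program.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_logreg(X, y, learn_rate=0.1, iters=100):
    """Batch gradient ascent on the log-likelihood.
    X: (d, m) with a bias row of ones already prepended; y: (m,) in {0, 1}."""
    rng = np.random.default_rng(0)
    w = rng.random(X.shape[0])
    for _ in range(iters):
        # gradient of the log-likelihood: sum_i (y_i - h(x_i)) * x_i
        grad = X @ (y - sigmoid(w @ X))
        w = w + learn_rate * grad
    return w

# Synthetic two-class data (stand-in for the binarized MNIST 0/1 images).
rng = np.random.default_rng(1)
m = 200
x0 = rng.normal(-2.0, 1.0, m // 2)
x1 = rng.normal(+2.0, 1.0, m // 2)
X = np.vstack([np.ones(m), np.concatenate([x0, x1])])  # bias row + feature row
y = np.concatenate([np.zeros(m // 2), np.ones(m // 2)])

w = train_logreg(X, y)
pred = (sigmoid(w @ X) > 0.5).astype(float)
acc = (pred == y).mean()
print(f"accuracy: {acc:.3f}")
```

The update `w + learn_rate * grad` mirrors the MATLAB line `w = w + learnRate*sum_`; the commented-out stochastic version would instead apply the per-sample gradient inside a loop over examples.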
The program implements both stochastic gradient descent and batch gradient descent, and both achieve good results: as shown in the figure, the test accuracy reaches 99.9%.