Machine Learning Experiment 2 -- Logistic Regression
Logistic Regression
Machine Learning Experiment (1) -- Linear Regression
Algorithm flow
Lab report and code for Experiment 2
PCA algorithm flow: for the mathematical principles behind PCA, see the article PCA的数学原理
MATLAB code for PCA and Logistic Regression
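The PCA algorithm flow (center the data, form the covariance matrix, eigendecompose, project) can be sketched in NumPy. This is an illustrative translation, not MATLAB's actual `pca` implementation; the function and variable names are chosen to mirror MATLAB's outputs:

```python
import numpy as np

def pca_fit(X):
    """PCA via eigendecomposition of the sample covariance matrix.

    Returns (coeff, score, latent), mirroring MATLAB's pca() outputs:
    coeff  - principal component directions (one per column),
    score  - the data projected onto those directions,
    latent - the variance captured by each component.
    """
    Xc = X - X.mean(axis=0)                # 1. center the data
    C = Xc.T @ Xc / (X.shape[0] - 1)       # 2. sample covariance matrix
    latent, coeff = np.linalg.eigh(C)      # 3. eigendecomposition (symmetric C)
    order = np.argsort(latent)[::-1]       # 4. sort by decreasing variance
    latent, coeff = latent[order], coeff[:, order]
    score = Xc @ coeff                     # 5. project onto the components
    return coeff, score, latent

# Reducing to k dimensions = keeping the first k columns of score
X = np.random.default_rng(0).normal(size=(100, 3))
coeff, score, latent = pca_fit(X)
X_reduced = score[:, :1]                   # k = 1, as in the experiment
```

Because `coeff` is orthogonal, `score @ coeff.T` reconstructs the centered data exactly when all components are kept.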
```matlab
function test_log_regression()
% Load the data
X = load('logistic_x.txt');
Y = load('logistic_y.txt');
% X = load('data\fourclass.txt');
% Y = load('data\fourclasslabel.txt');

tic;
[coeff, score, latent, tsquared, explained] = pca(X);  % PCA step
toc;
X = score(:, 1);  % to reduce to k dimensions, keep the first k columns of score
X = [ones(size(X, 1), 1) X];  % prepend the intercept column

% Compute theta with Newton's method
[theta, ll] = log_regression(X, Y);

% Plot the results
m = size(X, 1);
figure; hold on;
% Alternative: plot the two classes and the decision boundary in 2-D
% plot(X(Y < 0, 2), X(Y < 0, 3), 'rx', 'linewidth', 2);
% plot(X(Y > 0, 2), X(Y > 0, 3), 'go', 'linewidth', 2);
% x1 = min(X(:,2)):.1:max(X(:,2));
% x2 = -(theta(1) / theta(3)) - (theta(2) / theta(3)) * x1;
% plot(x1, x2, 'linewidth', 2);
% xlabel('x1'); ylabel('x2');
% Alternative for 1/2-valued labels (use with Y = Y - 1.5 below):
% plot(X(Y < 1.5, 2), X(Y < 1.5, 3), 'rx', 'linewidth', 2);
% plot(X(Y > 1.5, 2), X(Y > 1.5, 3), 'go', 'linewidth', 2);
% x1 = min(X(:,2)):.1:max(X(:,2));
% x2 = -(theta(1) / theta(3)) - (theta(2) / theta(3)) * x1 + log(3)/theta(3);
% plot(x1, x2, 'linewidth', 2);
% xlabel('x1'); ylabel('x2');

plot(X(Y < 0, 2), Y(Y < 0, 1), 'rx', 'linewidth', 2);
plot(X(Y > 0, 2), Y(Y > 0, 1), 'go', 'linewidth', 2);
x1 = min(X(:,2)):.01:max(X(:,2));
x2 = theta(1) + theta(2) * x1;
y = 2 * (1 ./ (1 + exp(-x2))) - 1;  % sigmoid rescaled to (-1, 1) to match the labels
plot(x1, y, 'linewidth', 2);
xlabel('x1');
ylabel('Y');

predict = X * theta;
err = sum(((predict > 0) * 2 - 1) ~= Y) / m;  % ~= Y counts mistakes: this is the error rate, not accuracy
disp(err);
end

function [theta, ll] = log_regression(X, Y)
% rows of X are training samples
% rows of Y are the corresponding -1/1 labels
% Newton-Raphson: theta = theta - inv(H) * grad
% with H = Hessian, grad = gradient
max_iters = 40;
% Y = Y - 1.5;  % uncomment for 1/2-valued labels
mm = size(X, 1);
nn = size(X, 2);
theta = zeros(nn, 1);
ll = zeros(max_iters, 1);
for ii = 1:max_iters
    % --- in-class exercise: the Newton update ---
    margins = Y .* (X * theta);
    ll(ii) = (1/mm) * sum(log(1 + exp(-margins)));   % average log-loss
    probs = 1 ./ (1 + exp(margins));                 % P(misclassify sample i) under current theta
    grad = -(1/mm) * (X' * (probs .* Y));
    H = (1/mm) * (X' * diag(probs .* (1 - probs)) * X);
    theta = theta - H \ grad;
end
a = X * theta;
predict = 1 ./ (1 + exp(-a));
% Threshold at 0.5: the original compared against 1.5, which a sigmoid never exceeds
acc = sum(((predict > 0.5) * 2 - 1) == Y) / mm;
disp(theta);
end
```
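The Newton-Raphson loop in `log_regression` translates almost line-for-line to NumPy. The sketch below uses synthetic ±1-labeled data (illustrative, not the experiment's `logistic_x.txt`/`logistic_y.txt` files):

```python
import numpy as np

def log_regression(X, Y, max_iters=40):
    """Newton's method for logistic regression with -1/+1 labels,
    mirroring the MATLAB log_regression above."""
    mm, nn = X.shape
    theta = np.zeros(nn)
    ll = np.zeros(max_iters)
    for ii in range(max_iters):
        margins = Y * (X @ theta)
        ll[ii] = np.mean(np.log1p(np.exp(-margins)))  # average log-loss
        probs = 1.0 / (1.0 + np.exp(margins))         # P(misclassify) under current theta
        grad = -(X.T @ (probs * Y)) / mm              # gradient of the loss
        H = (X.T * (probs * (1 - probs))) @ X / mm    # Hessian (weighted X'X)
        theta -= np.linalg.solve(H, grad)             # theta = theta - H \ grad
    return theta, ll

# Synthetic overlapping two-class data with an intercept column
rng = np.random.default_rng(1)
X = np.hstack([np.ones((200, 1)), rng.normal(size=(200, 2))])
Y = np.where(X[:, 1] + 0.5 * X[:, 2] + rng.normal(size=200) > 0, 1.0, -1.0)
theta, ll = log_regression(X, Y)
acc = np.mean(np.sign(X @ theta) == Y)  # training accuracy at the 0 decision threshold
```

Note that with ±1 labels the prediction threshold is 0 on the linear score `X @ theta`, equivalently 0.5 on the sigmoid.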
The PCA step itself is a single MATLAB call:
```matlab
[coeff, score, latent, tsquared, explained] = pca(X);  % PCA step
X = score(:, 1);  % to reduce to k dimensions, keep the first k columns of score
```
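An equivalent snippet in NumPy uses the SVD of the centered data (an illustrative translation; MATLAB's `pca` also centers internally and its SVD-based computation matches these outputs up to component signs):

```python
import numpy as np

X = np.random.default_rng(0).normal(size=(50, 4))
Xc = X - X.mean(axis=0)                     # pca() centers the data internally
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
coeff = Vt.T                                # MATLAB's coeff (principal directions)
score = Xc @ coeff                          # MATLAB's score (projected data)
latent = S**2 / (X.shape[0] - 1)            # MATLAB's latent (component variances)
X_reduced = score[:, :1]                    # keep the first k columns to reduce to k dims
```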