Machine Learning Lab 2 -- Logistic Regression


Logistic Regression

Machine Learning Lab (1) -- Linear Regression
Algorithm workflow
Lab 2 report and code

The PCA algorithm workflow: for the mathematics behind PCA, see the article "PCA的数学原理" (The Mathematical Principles of PCA).
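The workflow above (center the data, form the covariance matrix, eigendecompose, project onto the top components) can be sketched as follows. This is a Python/NumPy sketch on hypothetical random data standing in for logistic_x.txt, not the experiment's MATLAB code:

```python
import numpy as np

# Hypothetical toy data standing in for logistic_x.txt (n samples x d features).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))

# 1. Center the data: PCA works on deviations from the mean.
Xc = X - X.mean(axis=0)

# 2. Covariance matrix of the centered data.
C = np.cov(Xc, rowvar=False)

# 3. Eigendecomposition; eigh is appropriate because C is symmetric.
eigvals, eigvecs = np.linalg.eigh(C)

# 4. Sort components by decreasing eigenvalue (explained variance).
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# 5. Project onto the top-k principal components (k = 1, as in the experiment).
k = 1
score = Xc @ eigvecs[:, :k]
```

The variance of the data along each principal axis equals the corresponding eigenvalue, which is why sorting by eigenvalue orders components by importance.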

MATLAB code for PCA and Logistic Regression

function test_log_regression()
% Load the data
X = load('logistic_x.txt');
Y = load('logistic_y.txt');
% X = load('data\\fourclass.txt');
% Y = load('data\\fourclasslabel.txt');

tic;
[coeff, score, latent, tsquared, explained] = pca(X);  % PCA step
toc;
X = score(:, 1);  % to reduce to k dimensions, keep the first k columns of score
X = [ones(size(X, 1), 1) X];  % prepend the intercept column

% Compute theta with Newton's method
[theta, ll] = log_regression(X, Y);

% Plot the data and the fitted curve
m = size(X, 1);
figure; hold on;
% Decision boundary in the original 2-D feature space:
% plot(X(Y < 0, 2), X(Y < 0, 3), 'rx', 'linewidth', 2);
% plot(X(Y > 0, 2), X(Y > 0, 3), 'go', 'linewidth', 2);
% x1 = min(X(:,2)):.1:max(X(:,2));
% x2 = -(theta(1) / theta(3)) - (theta(2) / theta(3)) * x1;
% plot(x1, x2, 'linewidth', 2);
% xlabel('x1'); ylabel('x2');
%----------------------------------------------------------
% Variant with the boundary shifted for 1/2-valued labels:
% plot(X(Y < 1.5, 2), X(Y < 1.5, 3), 'rx', 'linewidth', 2);
% plot(X(Y > 1.5, 2), X(Y > 1.5, 3), 'go', 'linewidth', 2);
% x1 = min(X(:,2)):.1:max(X(:,2));
% x2 = -(theta(1) / theta(3)) - (theta(2) / theta(3)) * x1 + log(3) / theta(3);
% plot(x1, x2, 'linewidth', 2);
% xlabel('x1'); ylabel('x2');
%----------------------------------------------------------
plot(X(Y < 0, 2), Y(Y < 0, 1), 'rx', 'linewidth', 2);
plot(X(Y > 0, 2), Y(Y > 0, 1), 'go', 'linewidth', 2);
x1 = min(X(:,2)):.01:max(X(:,2));
x2 = theta(1) + theta(2) * x1;
y = 2 * (1 ./ (1 + exp(-x2))) - 1;  % sigmoid rescaled from (0,1) to (-1,1)
plot(x1, y, 'linewidth', 2);
xlabel('x1'); ylabel('Y');

predict = X * theta;
err = sum(((predict > 0) * 2 - 1) ~= Y) / m;  % misclassification rate
disp(err);
end

function [theta, ll] = log_regression(X, Y)
% Rows of X are training samples; Y holds the corresponding -1/+1 labels.
% Newton-Raphson: theta = theta - inv(H) * grad,
% with H the Hessian and grad the gradient of the average logistic loss.
max_iters = 40;
mm = size(X, 1);
nn = size(X, 2);
theta = zeros(nn, 1);
ll = zeros(max_iters, 1);
for ii = 1:max_iters
    % In-class exercise
    margins = Y .* (X * theta);
    ll(ii) = (1/mm) * sum(log(1 + exp(-margins)));
    probs = 1 ./ (1 + exp(margins));
    grad = -(1/mm) * (X' * (probs .* Y));
    H = (1/mm) * (X' * diag(probs .* (1 - probs)) * X);
    theta = theta - H \ grad;
end
a = X * theta;
predict = 1 ./ (1 + exp(-a));
% Threshold at 0.5: the original code used 1.5, which a sigmoid in (0,1) can never exceed.
acc = sum(((predict > 0.5) * 2 - 1) == Y) / mm;
disp(theta);
end
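The Newton-Raphson loop in log_regression above can be sketched in Python/NumPy. This mirrors the MATLAB step for step (average logistic loss for -1/+1 labels, gradient, Hessian, Newton step); the toy two-cluster data below is hypothetical, standing in for logistic_x.txt and logistic_y.txt:

```python
import numpy as np

def log_regression(X, Y, max_iters=40):
    """Newton-Raphson for logistic regression with labels Y in {-1, +1}.

    Minimizes the average loss (1/m) * sum(log(1 + exp(-y * x.theta))).
    """
    m, n = X.shape
    theta = np.zeros(n)
    ll = np.zeros(max_iters)
    for i in range(max_iters):
        margins = Y * (X @ theta)
        ll[i] = np.mean(np.log1p(np.exp(-margins)))   # loss at current theta
        probs = 1.0 / (1.0 + np.exp(margins))         # sigmoid(-margin)
        grad = -(X.T @ (probs * Y)) / m               # gradient of the loss
        H = (X.T * (probs * (1 - probs))) @ X / m     # Hessian: X' diag(w) X / m
        theta = theta - np.linalg.solve(H, grad)      # Newton step (H \ grad)
    return theta, ll

# Hypothetical toy data: two overlapping 1-D clusters with an intercept column.
rng = np.random.default_rng(1)
n = 200
x_neg = rng.normal(loc=-1.0, size=(n // 2, 1))
x_pos = rng.normal(loc=+1.0, size=(n // 2, 1))
X = np.hstack([np.ones((n, 1)), np.vstack([x_neg, x_pos])])
Y = np.concatenate([-np.ones(n // 2), np.ones(n // 2)])

theta, ll = log_regression(X, Y)
pred = np.where(X @ theta > 0, 1.0, -1.0)
acc = np.mean(pred == Y)
```

Newton's method converges in very few iterations here because the logistic loss is convex and the Hessian is cheap to form in two dimensions; the loss ll should decrease monotonically from its initial value log(2).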

The MATLAB call for PCA itself is quite simple:

[coeff, score, latent, tsquared, explained] = pca(X);  % PCA step
X = score(:, 1);  % to reduce to k dimensions, keep the first k columns of score
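The relationship among MATLAB's pca outputs is worth spelling out: pca centers X, coeff holds the principal axes as columns, score is the centered data projected onto those axes, latent is the variance along each axis, and explained is that variance as a percentage. A NumPy sketch of the same relationships via SVD, on hypothetical random data:

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(50, 4))   # hypothetical data matrix

# MATLAB's pca(X) centers the data first.
Xc = X - X.mean(axis=0)

# SVD of the centered data yields the same axes as eigendecomposing cov(X).
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
coeff = Vt.T                               # principal axes, like MATLAB's coeff
score = Xc @ coeff                         # projected data, like MATLAB's score
latent = S**2 / (X.shape[0] - 1)           # variance along each axis
explained = 100 * latent / latent.sum()    # percent of variance explained

X_reduced = score[:, :1]                   # keep the first k columns for k dims
```

Keeping the first k columns of score is exactly the dimensionality reduction used in the experiment (k = 1).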