Stanford University - Logistic Regression Exercise Code


Logistic Regression and Newton's Method

http://openclassroom.stanford.edu/MainFolder/DocumentPage.php?course=MachineLearning&doc=exercises/ex4/ex4.html

The complete script follows. It loads the two exam scores and the admission label for each applicant, fits a logistic regression classifier with Newton's method, and then plots the decision boundary and the cost J at each iteration.

clear all; close all; clc

x = load('ex4x.dat');
y = load('ex4y.dat');

[m, n] = size(x);

% Add intercept term to x
x = [ones(m, 1), x];

% Plot the training data
% Use different markers for positives and negatives
figure
pos = find(y); neg = find(y == 0);  % find returns the indices at which the condition in parentheses is true
plot(x(pos, 2), x(pos, 3), '+')
hold on
plot(x(neg, 2), x(neg, 3), 'o')
xlabel('Exam 1 score')
ylabel('Exam 2 score')

% Initialize fitting parameters
theta = zeros(n+1, 1);

% Define the sigmoid function
g = inline('1.0 ./ (1.0 + exp(-z))');

% Newton's method
MAX_ITR = 7;
J = zeros(MAX_ITR, 1);

for i = 1:MAX_ITR
    % Calculate the hypothesis function
    z = x * theta;
    h = g(z);  % map through the logistic (sigmoid) function

    % Calculate gradient and hessian.
    % The formulas below are equivalent to the summation formulas
    % given in the lecture videos.
    grad = (1/m) .* x' * (h - y);               % vectorized form of the gradient
    H = (1/m) .* x' * diag(h) * diag(1-h) * x;  % vectorized form of the Hessian matrix

    % Calculate J (for testing convergence)
    J(i) = (1/m) * sum(-y .* log(h) - (1-y) .* log(1-h));  % vectorized form of the cost function

    theta = theta - H \ grad;
end

% Display theta
theta

% Calculate the probability that a student with
% score 20 on exam 1 and score 80 on exam 2
% will not be admitted
prob = 1 - g([1, 20, 80] * theta)

% Plot the decision boundary
% Plot Newton's method result
% Only need 2 points to define a line, so choose two endpoints
plot_x = [min(x(:,2))-2, max(x(:,2))+2];

% Calculate the decision boundary line
plot_y = (-1./theta(3)) .* (theta(2).*plot_x + theta(1));
plot(plot_x, plot_y)
legend('Admitted', 'Not admitted', 'Decision Boundary')
hold off

% Plot J
figure
plot(0:MAX_ITR-1, J, 'o--', 'MarkerFaceColor', 'r', 'MarkerSize', 8)
xlabel('Iteration'); ylabel('J')

% Display J
J
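For reference, the quantities computed inside the Newton's-method loop are the standard ingredients for logistic regression; the formulas below simply restate what the vectorized code computes, with $\odot$ denoting element-wise multiplication:

\[ h = g(X\theta), \qquad g(z) = \frac{1}{1 + e^{-z}} \]
\[ J(\theta) = \frac{1}{m}\sum_{i=1}^{m}\Bigl[-y^{(i)}\log h^{(i)} - \bigl(1 - y^{(i)}\bigr)\log\bigl(1 - h^{(i)}\bigr)\Bigr] \]
\[ \nabla_\theta J = \frac{1}{m}\,X^{\top}(h - y), \qquad H = \frac{1}{m}\,X^{\top}\,\mathrm{diag}\bigl(h \odot (1 - h)\bigr)\,X, \qquad \theta \leftarrow \theta - H^{-1}\,\nabla_\theta J \]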

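The comment inside the loop notes that these vectorized expressions are equivalent to the summation formulas from the lecture videos. A minimal sketch of the loop-based equivalents, assuming x (with intercept column), y, m, n, theta, and g as defined in the script above (grad_sum and H_sum are names introduced here for illustration):

% Summation-form gradient and Hessian, equivalent to the vectorized versions
grad_sum = zeros(n+1, 1);
H_sum = zeros(n+1, n+1);
for k = 1:m
    xk = x(k, :)';         % k-th training example as a column vector
    hk = g(xk' * theta);   % scalar hypothesis value for that example
    grad_sum = grad_sum + (hk - y(k)) * xk;
    H_sum = H_sum + hk * (1 - hk) * (xk * xk');
end
grad_sum = grad_sum / m;   % matches (1/m) .* x' * (h - y)
H_sum = H_sum / m;         % matches (1/m) .* x' * diag(h) * diag(1-h) * x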

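The two-point line drawn at the end comes from the fact that the decision boundary is the set of inputs where $\theta^{\top}x = 0$, i.e. where $g(\theta^{\top}x) = 0.5$. Solving for the Exam 2 score $x_2$ in terms of the Exam 1 score $x_1$ (with the intercept in theta(1), per MATLAB's 1-based indexing) gives the expression used for plot_y:

\[ \theta_1 + \theta_2 x_1 + \theta_3 x_2 = 0 \quad\Longrightarrow\quad x_2 = -\frac{1}{\theta_3}\bigl(\theta_2 x_1 + \theta_1\bigr) \]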
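One portability note: MATLAB has long marked inline as deprecated in favor of anonymous functions. A drop-in replacement for the sigmoid definition above, with identical behavior:

g = @(z) 1.0 ./ (1.0 + exp(-z));   % anonymous-function form of the same sigmoid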