Stanford Machine Learning - Regularization (Week 3) - Code


Andrew Ng's machine learning open course

7-Regularization

All of the code lives in a single .m file: the main script first, followed by the local functions.

%% Main program
clear; close all; clc

data = load('ex2data1.txt');
X = data(:, [1, 2]); y = data(:, 3);

plotData(X, y);
hold on;
xlabel('Exam 1 score');
ylabel('Exam 2 score');
% legend adds a key to the plot
legend('Admitted', 'Not admitted');
hold off;

% Compute the logistic regression cost function and gradient
[m, n] = size(X);
X = [ones(m, 1) X];               % add the intercept column
initial_theta = zeros(n + 1, 1);
[cost, grad] = costFunction(initial_theta, X, y);

% Optimize with fminunc: set the options first, then call the solver
options = optimset('GradObj', 'on', 'MaxIter', 400);
[theta, cost] = fminunc(@(t)(costFunction(t, X, y)), initial_theta, options);

plotDecisionBoundary(theta, X, y);
hold on;
xlabel('Exam 1 score');
ylabel('Exam 2 score');
legend('Admitted', 'Not admitted');
hold off;

prob = sigmoid([1, 45, 85] * theta);
fprintf('For a student with scores 45 and 85, the admission probability is %f\n', prob);

% The test set here is the training set itself, so this measures
% accuracy on the training data
p = predict(theta, X);
fprintf('Train Accuracy: %f\n', mean(double(p == y)) * 100);

%% Local functions

function p = predict(theta, X)
    % Predict 1 when the estimated probability is at least 0.5
    p = sigmoid(X * theta) >= 0.5;
end

function plotData(X, y)
    % Create a new figure
    figure; hold on;
    % pos holds the row indices where y = 1, neg where y = 0
    pos = find(y == 1); neg = find(y == 0);
    % X(pos, 1) holds the first feature of every example with y = 1
    plot(X(pos, 1), X(pos, 2), 'k+', 'LineWidth', 2, 'MarkerSize', 7);
    plot(X(neg, 1), X(neg, 2), 'ko', 'MarkerFaceColor', 'y', 'MarkerSize', 7);
    hold off;
end

% The sigmoid function
function g = sigmoid(z)
    g = 1 ./ (1 + exp(-z));
end

% Cost function and gradient for logistic regression
function [J, grad] = costFunction(theta, X, y)
    m = length(y);
    hx = sigmoid(X * theta);
    J = -1/m * (y' * log(hx) + (1 - y)' * log(1 - hx));
    grad = 1/m * X' * (hx - y);
end

function out = mapFeature(X1, X2)
    % Return a feature matrix with more features:
    % map the two input features to all polynomial terms up to degree 6
    degree = 6;
    out = ones(size(X1(:, 1)));
    for i = 1:degree
        for j = 0:i
            out(:, end + 1) = (X1 .^ (i - j)) .* (X2 .^ j);
        end
    end
end

function plotDecisionBoundary(theta, X, y)
    plotData(X(:, 2:3), y);
    hold on;
    if size(X, 2) <= 3
        % Two x-coordinates spanning the data; solve theta' * x = 0
        % for the corresponding y-coordinates of the boundary line
        plot_x = [min(X(:, 2)) - 2, max(X(:, 2)) + 2];
        plot_y = (-1 ./ theta(3)) .* (theta(2) .* plot_x + theta(1));
        % Draw the line through the two points
        plot(plot_x, plot_y);
        legend('Admitted', 'Not admitted', 'Decision Boundary');
        axis([30, 100, 30, 100]);
    else
        % Evaluate theta' * mapFeature(u, v) on a grid and draw
        % the zero contour as the boundary
        u = linspace(-1, 1.5, 50);
        v = linspace(-1, 1.5, 50);
        z = zeros(length(u), length(v));
        for i = 1:length(u)
            for j = 1:length(v)
                z(i, j) = mapFeature(u(i), v(j)) * theta;
            end
        end
        z = z';
        contour(u, v, z, [0, 0], 'LineWidth', 2);
    end
    hold off;
end
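A quick way to sanity-check costFunction: with theta = 0, sigmoid(X * theta) is 0.5 for every example, so the cost reduces to -1/m * sum(log(0.5)) = log(2) ≈ 0.6931 regardless of the data. A minimal sketch of that check (it assumes ex2data1.txt is on the path, as in the script above):

data = load('ex2data1.txt');
X = [ones(size(data, 1), 1) data(:, 1:2)];
y = data(:, 3);
J0 = costFunction(zeros(3, 1), X, y);
% With theta = 0 the cost must equal log(2), whatever the data is
fprintf('Cost at theta = 0: %f (expected %f)\n', J0, log(2));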
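Note also what mapFeature produces: with degree = 6 it turns the two raw features into 28 columns, the constant term plus (i + 1) terms for each degree i = 1..6, i.e. 1 + 2 + 3 + ... + 7 = 28. A one-line check on a single point:

size(mapFeature(0.5, 1))    % ans = 1 28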
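The costFunction above has no penalty term, while this week's topic is regularization. The usual regularized variant adds lambda/(2m) * sum of theta(j)^2 for j >= 2 to the cost, leaving the intercept theta(1) unpenalized. A minimal sketch of that variant (the name costFunctionReg and the parameter lambda are my choices, not part of the posted code):

function [J, grad] = costFunctionReg(theta, X, y, lambda)
    % Same cost and gradient as costFunction, plus the L2 penalty
    % lambda/(2m) * sum(theta(2:end).^2); theta(1) is not penalized.
    m = length(y);
    hx = sigmoid(X * theta);
    J = -1/m * (y' * log(hx) + (1 - y)' * log(1 - hx)) ...
        + lambda / (2 * m) * sum(theta(2:end) .^ 2);
    grad = 1/m * X' * (hx - y);
    grad(2:end) = grad(2:end) + lambda / m * theta(2:end);
end

Setting lambda = 0 recovers the unregularized costFunction above.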