UFLDL Exercise: Softmax Regression


This is the UFLDL exercise on softmax regression.

The exercise mainly involves writing two files: softmaxCost.m and softmaxPredict.m.

softmaxCost.m

function [cost, grad] = softmaxCost(theta, numClasses, inputSize, lambda, data, labels)
% numClasses - the number of classes
% inputSize  - the size N of the input vector
% lambda     - weight decay parameter
% data       - the N x M input matrix, where each column data(:, i) corresponds to
%              a single test set
% labels     - an M x 1 matrix containing the labels corresponding for the input data

% Unroll the parameters from theta
theta = reshape(theta, numClasses, inputSize);
numCases = size(data, 2);

% groundTruth(c, i) = 1 if example i has label c, 0 otherwise
groundTruth = full(sparse(labels, 1:numCases, 1));
% cost = 0;
% thetagrad = zeros(numClasses, inputSize);

%% ---------- YOUR CODE HERE --------------------------------------
%  Instructions: Compute the cost and gradient for softmax regression.
%                You need to compute thetagrad and cost.
%                The groundTruth matrix might come in handy.

M = theta * data;
M = bsxfun(@minus, M, max(M, [], 1));   % subtract each column's max for numerical stability
M = exp(M);
M = bsxfun(@rdivide, M, sum(M));        % column-wise softmax probabilities

thetagrad = (groundTruth - M) * data' / (-numCases) + lambda * theta;
cost = groundTruth(:)' * log(M(:)) / (-numCases) + sum(theta(:).^2) * lambda / 2;

% ------------------------------------------------------------------
% Unroll the gradient matrices into a vector for minFunc
grad = [thetagrad(:)];
end
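For reference, the two quantities computed above are the standard softmax regression cost and gradient (notation as in the UFLDL wiki, with m examples, k classes, and weight decay lambda):

J(\theta) = -\frac{1}{m}\sum_{i=1}^{m}\sum_{j=1}^{k} 1\{y^{(i)}=j\}\log\frac{e^{\theta_j^T x^{(i)}}}{\sum_{l=1}^{k} e^{\theta_l^T x^{(i)}}} + \frac{\lambda}{2}\sum_{i,j}\theta_{ij}^2

\nabla_{\theta_j} J(\theta) = -\frac{1}{m}\sum_{i=1}^{m}\left[x^{(i)}\left(1\{y^{(i)}=j\} - P(y^{(i)}=j\,|\,x^{(i)};\theta)\right)\right] + \lambda\theta_j

In the vectorized code, groundTruth holds the indicator terms 1{y=j}, M holds the probabilities P(y=j|x), and dividing by -numCases supplies the -1/m factor.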

softmaxPredict.m

function [pred] = softmaxPredict(softmaxModel, data)
% softmaxModel - model trained using softmaxTrain
% data - the N x M input matrix, where each column data(:, i) corresponds to
%        a single test set
%
% Your code should produce the prediction matrix
% pred, where pred(i) is argmax_c P(y(c) | x(i)).

% Unroll the parameters from theta
theta = softmaxModel.optTheta;  % this provides a numClasses x inputSize matrix
pred = zeros(1, size(data, 2));

%% ---------- YOUR CODE HERE --------------------------------------
%  Instructions: Compute pred using theta assuming that the labels start
%                from 1.

% numClasses = softmaxModel.numClasses;
% inputSize = softmaxModel.inputSize;
% theta = reshape(theta, numClasses, inputSize);

M = theta * data;
M = bsxfun(@minus, M, max(M, [], 1));   % stability shift; does not change the argmax
M = exp(M);
M = bsxfun(@rdivide, M, sum(M));
[maxv, pred] = max(M);                  % pred(i) is the row (class) index of the largest probability

% ---------------------------------------------------------------------
end
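The max-subtraction trick used in both files can be sketched in NumPy (a standalone illustration, not part of the exercise code; `softmax_columns` is a hypothetical helper name): subtracting each column's maximum before exponentiating leaves the probabilities, and hence the argmax, unchanged while preventing overflow for large scores.

```python
import numpy as np

def softmax_columns(scores):
    """Column-wise softmax with the max-subtraction stability trick,
    mirroring the bsxfun pattern in the MATLAB code above."""
    shifted = scores - scores.max(axis=0, keepdims=True)  # stability shift
    e = np.exp(shifted)
    return e / e.sum(axis=0, keepdims=True)

# Each column holds one example's class scores (numClasses x numCases).
# The first column would overflow exp() without the shift.
scores = np.array([[1000.0, 1.0],
                   [1001.0, 3.0],
                   [ 999.0, 2.0]])
probs = softmax_columns(scores)
pred = probs.argmax(axis=0) + 1  # +1 to match MATLAB labels starting from 1
```

Without the shift, `np.exp(1000.0)` overflows to inf and the probabilities become NaN; with it, both columns yield valid distributions and `pred` picks class 2 for each.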

After 100 iterations, Accuracy: 92.640%.

References:

[1]http://deeplearning.stanford.edu/wiki/index.php/Softmax回归

[2]http://deeplearning.stanford.edu/wiki/index.php/Exercise:Softmax_Regression

[3] http://www.cnblogs.com/tornadomeet/archive/2013/03/23/2977621.html [parts of the implementation were revised following this article, which sped up training]
