UFLDL Exercise:Softmax Regression
Source: Internet · Editor: 程序博客网 · Posted: 2024/06/05 17:07
Exercise page: http://deeplearning.stanford.edu/wiki/index.php/Exercise:Softmax_Regression
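For reference, the cost function and gradient that the code below implements are the ones given in the UFLDL tutorial, for k classes, m training examples, and weight-decay parameter λ:

```latex
J(\theta) = -\frac{1}{m}\left[\sum_{i=1}^{m}\sum_{j=1}^{k}
  1\{y^{(i)}=j\}\,\log\frac{e^{\theta_j^{\top}x^{(i)}}}
  {\sum_{l=1}^{k}e^{\theta_l^{\top}x^{(i)}}}\right]
  + \frac{\lambda}{2}\sum_{i,j}\theta_{ij}^{2}

\nabla_{\theta_j} J(\theta) = -\frac{1}{m}\sum_{i=1}^{m}
  \left[x^{(i)}\left(1\{y^{(i)}=j\} - p\!\left(y^{(i)}=j \mid x^{(i)};\theta\right)\right)\right]
  + \lambda\,\theta_j
```

The indicator 1{y⁽ⁱ⁾ = j} corresponds to the `groundTruth` one-hot matrix in the code, and the class probabilities p correspond to the matrix `p` computed with the max-subtraction trick.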
softmaxCost.m
function [cost, grad] = softmaxCost(theta, numClasses, inputSize, lambda, data, labels)

% numClasses - the number of classes
% inputSize  - the size N of the input vector
% lambda     - weight decay parameter
% data       - the N x M input matrix, where each column data(:, i) corresponds to
%              a single test set
% labels     - an M x 1 matrix containing the labels corresponding to the input data

% Unroll the parameters from theta
theta = reshape(theta, numClasses, inputSize);
numCases = size(data, 2);

groundTruth = full(sparse(labels, 1:numCases, 1));
cost = 0;
thetagrad = zeros(numClasses, inputSize);

%% ---------- YOUR CODE HERE --------------------------------------
%  Instructions: Compute the cost and gradient for softmax regression.
%                You need to compute thetagrad and cost.
%                The groundTruth matrix might come in handy.

M = theta * data;
% Subtract the column-wise max before exponentiating, for numerical stability;
% the shift cancels in the normalized probabilities.
M = bsxfun(@minus, M, max(M, [], 1));
Exp_M = exp(M);
p = bsxfun(@rdivide, Exp_M, sum(Exp_M));

% Average cross-entropy cost
% (equivalent form: cost = cost - 1/numCases * sum(sum(groundTruth .* log(p)));)
cost = cost - 1/numCases * groundTruth(:)' * log(p(:));

% Add weight decay
cost = cost + lambda/2 * sum(theta(:).^2);

% Compute the gradient
thetagrad = -1/numCases * (groundTruth - p) * data' + lambda * theta;

% ------------------------------------------------------------------
% Unroll the gradient matrices into a vector for minFunc
grad = thetagrad(:);

end
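As a cross-check of the vectorized computation above, here is a minimal NumPy sketch of the same cost and gradient. The function name `softmax_cost` and its signature are my own for illustration; they simply mirror `softmaxCost.m` (one example per column, MATLAB-style 1-based labels).

```python
import numpy as np

def softmax_cost(theta, num_classes, input_size, lam, data, labels):
    """Softmax cost and gradient, mirroring softmaxCost.m (illustrative sketch).

    theta:  flat vector of length num_classes * input_size
    data:   (input_size, m) array, one example per column
    labels: (m,) integer labels in 1..num_classes (MATLAB-style)
    """
    theta = theta.reshape(num_classes, input_size)
    m = data.shape[1]

    # One-hot groundTruth matrix, num_classes x m
    ground_truth = np.zeros((num_classes, m))
    ground_truth[labels - 1, np.arange(m)] = 1.0

    scores = theta @ data                        # num_classes x m
    scores -= scores.max(axis=0, keepdims=True)  # stability: shift by column max
    p = np.exp(scores)
    p /= p.sum(axis=0, keepdims=True)

    # Average cross-entropy plus weight decay
    cost = -np.sum(ground_truth * np.log(p)) / m + lam / 2 * np.sum(theta ** 2)
    grad = -(ground_truth - p) @ data.T / m + lam * theta
    return cost, grad.ravel()
```

A quick sanity check: at theta = 0 the probabilities are uniform, so the unregularized cost should equal log(numClasses), and the analytic gradient should agree with central finite differences.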
softmaxPredict.m
function [pred] = softmaxPredict(softmaxModel, data)

% softmaxModel - model trained using softmaxTrain
% data         - the N x M input matrix, where each column data(:, i) corresponds to
%                a single test set
%
% Your code should produce the prediction matrix
% pred, where pred(i) is argmax_c P(y(c) | x(i)).

% Unroll the parameters from theta
theta = softmaxModel.optTheta;  % this provides a numClasses x inputSize matrix
pred = zeros(1, size(data, 2));

%% ---------- YOUR CODE HERE --------------------------------------
%  Instructions: Compute pred using theta assuming that the labels start
%                from 1.

M = theta * data;
M = bsxfun(@minus, M, max(M));
Exp_M = exp(M);
p = bsxfun(@rdivide, Exp_M, sum(Exp_M));
[~, pred] = max(p);

% ---------------------------------------------------------------------

end
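Note that the softmax is monotone in the scores, so normalizing is not actually needed for prediction: taking the argmax of theta * data column-wise gives the same labels. A minimal NumPy sketch (the name `softmax_predict` is my own, mirroring `softmaxPredict.m`):

```python
import numpy as np

def softmax_predict(opt_theta, data):
    """Return 1-based class predictions, mirroring softmaxPredict.m (sketch).

    opt_theta: (num_classes, input_size) weight matrix
    data:      (input_size, m) array, one example per column
    """
    scores = opt_theta @ data
    # softmax is monotone in the scores, so the argmax over rows already
    # picks the most probable class; no exponentiation/normalization needed.
    return np.argmax(scores, axis=0) + 1  # +1 for MATLAB-style labels
```

For example, with an identity weight matrix and identity-matrix inputs, column i scores highest on class i, so the predictions are 1, 2, 3.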