Exercise: Implement deep networks for digit classification (code example)


A reference solution for the UFLDL exercise "Implement deep networks for digit classification".


This exercise builds a four-layer deep neural network. The first layer is the data input layer; the second and third layers are sparse autoencoder layers, taking the hidden layers of two sparse autoencoders as layers two and three respectively; the fourth layer is a softmax classifier that labels the handwritten digits 0 through 9. After the softmax classifier is trained, the whole network is fine-tuned. During fine-tuning, layers L2 through L4 are treated as a single model and adjusted together with one call to minFunc, which returns the updated parameters of all three layers. Testing shows that fine-tuning gives a large improvement in accuracy on the test data.
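For reference, a typical parameter setup for this exercise looks like the sketch below. These values follow the usual UFLDL starter-code defaults and are an assumption, since the post does not list them:

% Assumed setup, matching the usual UFLDL defaults for this exercise
inputSize = 28 * 28;    % MNIST digits are 28x28 grayscale images
numClasses = 10;        % classes 0..9
hiddenSizeL1 = 200;     % hidden units in the first sparse autoencoder (layer L2)
hiddenSizeL2 = 200;     % hidden units in the second sparse autoencoder (layer L3)
sparsityParam = 0.1;    % desired average activation of the hidden units
lambda = 3e-3;          % weight decay parameter
beta = 3;               % weight of the sparsity penalty term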


Fine-tuning uses the backpropagation algorithm to compute the gradients of these three layers:
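Concretely, with sigmoid activations (so the derivative of the activation is the elementwise product a(1 - a)), the gradients implemented in stackedAECost.m below can be sketched in the tutorial's notation, where T is the ground-truth indicator matrix, P the softmax probabilities, and m the number of examples:

\[
\delta^{(n_l)} = -\left(\theta_{\mathrm{softmax}}^{T}\,(T - P)\right) \odot a^{(n_l)} \odot \bigl(1 - a^{(n_l)}\bigr)
\]
\[
\delta^{(l)} = \left((W^{(l+1)})^{T}\,\delta^{(l+1)}\right) \odot a^{(l)} \odot \bigl(1 - a^{(l)}\bigr)
\]
\[
\nabla_{W^{(l)}} J = \frac{1}{m}\,\delta^{(l)}\,\bigl(a^{(l-1)}\bigr)^{T},
\qquad
\nabla_{b^{(l)}} J = \frac{1}{m}\sum_{i=1}^{m} \delta^{(l)}_{(i)}
\]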

STEP 2: Train the first sparse autoencoder

addpath minFunc/
options.Method = 'lbfgs';
options.maxIter = 400;
options.display = 'on';

[sae1OptTheta, cost] = minFunc( @(p) sparseAutoencoderCost(p, ...
                                   inputSize, hiddenSizeL1, ...
                                   lambda, sparsityParam, ...
                                   beta, trainData), ...
                                sae1Theta, options);
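The next step trains on sae1Features, which the post never computes explicitly. In the UFLDL starter code this comes from feedForwardAutoencoder (written in the earlier self-taught learning exercise); a sketch of the assumed step:

% Extract the first-layer features by forwarding the raw data
% through the first autoencoder's hidden layer
[sae1Features] = feedForwardAutoencoder(sae1OptTheta, hiddenSizeL1, ...
                                        inputSize, trainData);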

STEP 2: Train the second sparse autoencoder

[sae2OptTheta, cost] = minFunc( @(p) sparseAutoencoderCost(p, ...
                                   hiddenSizeL1, hiddenSizeL2, ...
                                   lambda, sparsityParam, ...
                                   beta, sae1Features), ...
                                sae2Theta, options);
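Similarly, sae2Features, used to train the softmax classifier below, would be obtained by forwarding sae1Features through the second autoencoder (again an assumed step, following the starter code):

[sae2Features] = feedForwardAutoencoder(sae2OptTheta, hiddenSizeL2, ...
                                        hiddenSizeL1, sae1Features);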

STEP 3: Train the softmax classifier

options.maxIter = 100;
softmaxModel = softmaxTrain(hiddenSizeL2, numClasses, lambda, ...
                            sae2Features, trainLabels, options);
saeSoftmaxOptTheta = softmaxModel.optTheta(:);
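Fine-tuning in Step 5 optimizes stackedAETheta, which has to be assembled first: the encoding weights of both autoencoders are packed into a stack structure, flattened with stack2params, and concatenated with the softmax parameters. The post skips this step; the sketch below follows my recollection of the UFLDL starter script, so treat the exact indexing as an assumption:

% Assemble the encoding layers of both autoencoders into the fine-tuning stack
% (indexing assumes the usual [W1(:); W2(:); b1(:); b2(:)] parameter layout)
stack = cell(2,1);
stack{1}.w = reshape(sae1OptTheta(1:hiddenSizeL1*inputSize), ...
                     hiddenSizeL1, inputSize);
stack{1}.b = sae1OptTheta(2*hiddenSizeL1*inputSize+1 : ...
                          2*hiddenSizeL1*inputSize+hiddenSizeL1);
stack{2}.w = reshape(sae2OptTheta(1:hiddenSizeL2*hiddenSizeL1), ...
                     hiddenSizeL2, hiddenSizeL1);
stack{2}.b = sae2OptTheta(2*hiddenSizeL2*hiddenSizeL1+1 : ...
                          2*hiddenSizeL2*hiddenSizeL1+hiddenSizeL2);

% Flatten the stack and prepend the softmax parameters
[stackparams, netconfig] = stack2params(stack);
stackedAETheta = [saeSoftmaxOptTheta; stackparams];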

STEP 4: Implement fine-tuning

stackedAECost.m

% M and groundTruth are defined by the provided skeleton code above this point

% Forward pass: propagate the data through the stacked autoencoder layers
stack{1}.i = data;
for d = 1:numel(stack)-1
    stack{d}.o = sigmoid(stack{d}.w * stack{d}.i + repmat(stack{d}.b, 1, M));
    stack{d+1}.i = stack{d}.o;
end
stack{end}.o = sigmoid(stack{end}.w * stack{end}.i + repmat(stack{end}.b, 1, M));

% Softmax layer: class probabilities (shifted by the max for numerical
% stability), cost with weight decay, and the softmax gradient
Mat = softmaxTheta * stack{end}.o;
Mat = exp(bsxfun(@minus, Mat, max(Mat, [], 1)));
P = bsxfun(@rdivide, Mat, sum(Mat));
Mat = log(P);
WD = lambda / 2 * sum(sum(softmaxTheta.^2));
cost = -sum(sum(groundTruth .* Mat)) / M + WD;
softmaxThetaGrad = -(groundTruth - P) * stack{end}.o' ./ M + lambda .* softmaxTheta;

% Backpropagation: the error from the softmax layer, then down the stack
stack{end}.delta = -softmaxTheta' * (groundTruth - P) .* stack{end}.o .* (1 - stack{end}.o);
for d = numel(stack)-1:-1:1
    stack{d}.delta = stack{d+1}.w' * stack{d+1}.delta .* stack{d}.o .* (1 - stack{d}.o);
end

% Gradients for each layer in the stack
for d = 1:numel(stack)
    stackgrad{d}.w = stack{d}.delta * stack{d}.i' / M;
    stackgrad{d}.b = mean(stack{d}.delta, 2);
end
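Before fine-tuning the full model, it is prudent to verify the analytic gradient numerically. A minimal sketch, assuming the computeNumericalGradient.m from the earlier UFLDL exercises is on the path; the tiny random dataset here is made up, and on a full-size network this check is slow, so a much smaller test network is advisable:

% Hypothetical gradient check on a few random examples
checkData = randn(inputSize, 10);            % 10 fake input vectors
checkLabels = randi(numClasses, 10, 1);      % 10 fake labels in 1..numClasses
costFun = @(p) stackedAECost(p, inputSize, hiddenSizeL2, numClasses, ...
                             netconfig, lambda, checkData, checkLabels);
[~, grad] = costFun(stackedAETheta);         % analytic gradient
numGrad = computeNumericalGradient(costFun, stackedAETheta);
disp(norm(numGrad - grad) / norm(numGrad + grad));   % should be ~1e-9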

STEP 5: Finetune softmax model

options.Method = 'lbfgs';
options.maxIter = 400;
options.display = 'on';

[stackedAEOptTheta, cost] = minFunc( @(p) stackedAECost(p, inputSize, hiddenSizeL2, ...
                                         numClasses, netconfig, ...
                                         lambda, trainData, trainLabels), ...
                                     stackedAETheta, options);

STEP 6: Test

stackedAEPredict.m

M = size(data, 2);

% Forward pass, identical to the one in stackedAECost.m
stack{1}.i = data;
for d = 1:numel(stack)-1
    stack{d}.o = sigmoid(stack{d}.w * stack{d}.i + repmat(stack{d}.b, 1, M));
    stack{d+1}.i = stack{d}.o;
end
stack{end}.o = sigmoid(stack{end}.w * stack{end}.i + repmat(stack{end}.b, 1, M));

% Predict the class with the highest (unnormalized) softmax score
Mat = softmaxTheta * stack{end}.o;
[~, pred] = max(Mat);
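To reproduce the exercise's accuracy comparison, the prediction function is called on the test set with the parameters before and after fine-tuning. This usage sketch follows the UFLDL starter script, with testData and testLabels assumed to be loaded as in the exercise:

[pred] = stackedAEPredict(stackedAETheta, inputSize, hiddenSizeL2, ...
                          numClasses, netconfig, testData);
acc = mean(testLabels(:) == pred(:));
fprintf('Before Finetuning Test Accuracy: %0.3f%%\n', acc * 100);

[pred] = stackedAEPredict(stackedAEOptTheta, inputSize, hiddenSizeL2, ...
                          numClasses, netconfig, testData);
acc = mean(testLabels(:) == pred(:));
fprintf('After Finetuning Test Accuracy: %0.3f%%\n', acc * 100);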

