torch7 nn module: ClassNLLCriterion

One of the loss functions available in the nn module is nn.ClassNLLCriterion, i.e. the negative log likelihood criterion.

Note that in the expression criterion = nn.ClassNLLCriterion([weights]), the optional weights argument is a 1D Tensor used to weight the classes.
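As a minimal sketch (the two classes and the weight values below are assumptions, not taken from the module docs), a weighted criterion for an unbalanced 2-class problem could be constructed like this:

require 'nn'

-- hypothetical 2-class setup: class 2 is rare, so it gets a larger weight
local weights = torch.Tensor{0.3, 0.7}          -- one weight per class (assumed values)
local criterion = nn.ClassNLLCriterion(weights) -- weighted negative log likelihood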

============ Module documentation ======================

ClassNLLCriterion

 criterion = nn.ClassNLLCriterion([weights]) 

The negative log likelihood criterion. It is useful to train a classification problem with n classes.
If provided, the optional argument weights should be a 1D Tensor assigning weight to each of the classes.
This is particularly useful when you have an unbalanced training set.

The input given through a forward() is expected to contain log-probabilities of each class: input has to be a 1D Tensor of size n .
Obtaining log-probabilities in a neural network is easily achieved by adding a  LogSoftMax  layer as the last layer of your neural network.
You may use  CrossEntropyCriterion  instead, if you prefer not to add an extra layer to your network.
This criterion expects a class index (1 to the number of classes) as target when calling  forward(input, target)  and  backward(input, target) .
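For example, a network ending in LogSoftMax (a minimal sketch with an assumed 10-dimensional input and n = 2 classes, not part of the original docs) produces exactly the 1D Tensor of log-probabilities this criterion expects:

require 'nn'

local mlp = nn.Sequential()
mlp:add(nn.Linear(10, 2))        -- assumed: 10 input features, 2 classes
mlp:add(nn.LogSoftMax())         -- turns scores into log-probabilities

local x = torch.randn(10)        -- a random input vector
local logProbs = mlp:forward(x)  -- 1D Tensor of size 2, suitable for ClassNLLCriterion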

The loss can be described as:
 loss(x, class) = -x[class] 

or in the case of the weights argument being specified:
 loss(x, class) = -weights[class] * x[class] 
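As a quick check (a minimal sketch with made-up log-probabilities for n = 3 classes), forward() on the unweighted criterion reproduces -x[class] directly:

require 'nn'

local criterion = nn.ClassNLLCriterion()
local x = torch.log(torch.Tensor{0.2, 0.5, 0.3}) -- hand-made log-probabilities
local class = 2

local loss = criterion:forward(x, class)
print(loss, -x[class])  -- both print roughly 0.6931, i.e. -log(0.5)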

The following is a code fragment showing how to make a gradient step given an input x, a desired output y (an integer from 1 to n, in this case n = 2 classes), a network mlp and a learning rate learningRate:
function gradUpdate(mlp, x, y, learningRate)
   local criterion = nn.ClassNLLCriterion()
   local pred = mlp:forward(x)                       -- forward pass: log-probabilities
   local err = criterion:forward(pred, y)            -- negative log likelihood loss
   mlp:zeroGradParameters()                          -- clear previously accumulated gradients
   local gradCriterion = criterion:backward(pred, y) -- gradient of the loss w.r.t. pred
   mlp:backward(x, gradCriterion)                    -- backpropagate through the network
   mlp:updateParameters(learningRate)                -- vanilla SGD parameter update
   return err
end
==============================================
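To show how this fragment might be used (a minimal sketch; the network shape, data and learning rate below are assumptions, not from the original docs), one could call gradUpdate repeatedly on a single example:

require 'nn'

local mlp = nn.Sequential()
mlp:add(nn.Linear(5, 2))        -- assumed: 5 input features, n = 2 classes
mlp:add(nn.LogSoftMax())        -- last layer must output log-probabilities

local x = torch.randn(5)        -- a random training input
local y = 1                     -- its (assumed) target class index

for i = 1, 100 do
   gradUpdate(mlp, x, y, 0.01)  -- assumed learning rate of 0.01
end
print(mlp:forward(x))           -- the log-probability of class 1 should move toward 0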
