[torch] remove layers from the model

Source: Internet · Editor: 程序博客网 · Posted: 2024/05/21 11:33

https://github.com/torch/nn/blob/master/doc/containers.md

model = nn.Sequential()
model:add(nn.Linear(10, 20))
model:add(nn.Linear(20, 20))
model:add(nn.Linear(20, 30))
model:remove(2)
> model
nn.Sequential {
  [input -> (1) -> (2) -> output]
  (1): nn.Linear(10 -> 20)
  (2): nn.Linear(20 -> 30)
}
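For completeness, `nn.Sequential` also supports `insert`, the counterpart of `remove`. A minimal sketch, assuming the same torch/nn container API as above (indices shift when layers are inserted or removed):

```lua
require 'nn'

local model = nn.Sequential()
model:add(nn.Linear(10, 20))
model:add(nn.Linear(20, 30))

-- insert a layer at index 2; the old index-2 layer moves to index 3
model:insert(nn.Linear(20, 20), 2)

-- remove(index) deletes that layer; called with no argument it removes the last one
model:remove(2)

print(model)  -- back to Linear(10 -> 20) followed by Linear(20 -> 30)
```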

https://groups.google.com/forum/#!topic/torch7/W1af2omm18s

require 'nn'
require 'rnn'
require 'os'
require 'cunn'

featdim = 10
hiddenSize = 5
temperature = 2
numTargetClasses = 21
batch = 3
seq = 4

model = nn.Sequencer(
        nn.Sequential()
                :add(nn.FastLSTM(featdim, hiddenSize):maskZero(1))
                :add(nn.MaskZero(nn.Linear(hiddenSize, numTargetClasses), 1))
                :add(nn.MaskZero(nn.MulConstant(1/temperature), 1))
                --:add(nn.MaskZero(nn.LogSoftMax(), 1))
)
print(model)

input = {}
for i = 1, seq do
    table.insert(input, torch.rand(batch, featdim))
end
out1 = model:forward(input)

-- replace the MulConstant layer with Identity after the first forward
local m = model.modules
--m[1].module.modules[3] = nil
m[1].module.modules[3] = nn.Identity()
print(model)
out2 = model:forward(input)

-- temperature = 2, so out1*2 should undo the 1/temperature scaling
-- and the difference should be all zeros at every timestep
for i = 1, seq do
    print(out1[i]*2 - out2[i])
end

output

nn.Sequencer @ nn.Recursor @ nn.Sequential {
  [input -> (1) -> (2) -> (3) -> output]
  (1): nn.FastLSTM(10 -> 5)
  (2): nn.MaskZero @ nn.Linear(5 -> 21)
  (3): nn.MaskZero @ nn.MulConstant
}
nn.Sequencer @ nn.Recursor @ nn.Sequential {
  [input -> (1) -> (2) -> (3) -> output]
  (1): nn.FastLSTM(10 -> 5)
  (2): nn.MaskZero @ nn.Linear(5 -> 21)
  (3): nn.Identity
}
 0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0
 0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0
 0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0
[torch.DoubleTensor of size 3x21]

Columns 1 to 10
 0.1501 -0.0584 -0.1265  0.1189 -0.0639  0.1172 -0.0891  0.1156 -0.0944  0.2143
 0.1466 -0.0396 -0.1348  0.1289 -0.0446  0.1270 -0.0739  0.1079 -0.0979  0.2021
 0.1440 -0.0401 -0.1134  0.1281 -0.0608  0.1034 -0.0582  0.0809 -0.1187  0.2039
Columns 11 to 20
 0.1220 -0.0880  0.1872  0.1126  0.1963  0.0020  0.0096 -0.1716  0.1589 -0.2559
 0.1368 -0.0833  0.1888  0.1275  0.1928  0.0062  0.0279 -0.1468  0.1529 -0.2321
 0.1367 -0.1016  0.2025  0.1276  0.1887  0.0198  0.0378 -0.1519  0.1527 -0.2348
Columns 21 to 21
 0.0528
 0.0514
 0.0532
[torch.DoubleTensor of size 3x21]

Columns 1 to 10
 0.1626 -0.0541 -0.1265  0.1148 -0.0606  0.1298 -0.0956  0.1248 -0.0810  0.2055
 0.1406 -0.0517 -0.1321  0.1217 -0.0596  0.1040 -0.0814  0.1172 -0.1044  0.2122
 0.1515 -0.0570 -0.1184  0.1281 -0.0723  0.0901 -0.0719  0.0834 -0.1072  0.2081
Columns 11 to 20
 0.1230 -0.0808  0.2008  0.1037  0.2006 -0.0072  0.0060 -0.1646  0.1551 -0.2592
 0.1364 -0.0880  0.1557  0.1126  0.2007  0.0040  0.0115 -0.1521  0.1662 -0.2402
 0.1402 -0.0989  0.1979  0.1155  0.1838  0.0034  0.0276 -0.1636  0.1467 -0.2400
Columns 21 to 21
 0.0455
 0.0398
 0.0434
[torch.DoubleTensor of size 3x21]

Columns 1 to 10
 0.1576 -0.0524 -0.1315  0.1203 -0.0593  0.1163 -0.0893  0.1187 -0.0870  0.2037
 0.1274 -0.0541 -0.1538  0.1317 -0.0486  0.0992 -0.0824  0.1237 -0.1050  0.2171
 0.1407 -0.0447 -0.1118  0.1192 -0.0642  0.1090 -0.0678  0.0982 -0.1176  0.2117
Columns 11 to 20
 0.1362 -0.0816  0.1843  0.1056  0.1979 -0.0084  0.0111 -0.1536  0.1543 -0.2453
 0.1477 -0.0807  0.1195  0.1227  0.1965  0.0000  0.0120 -0.1453  0.1678 -0.2224
 0.1234 -0.1008  0.1901  0.1213  0.1986  0.0223  0.0233 -0.1609  0.1651 -0.2495
Columns 21 to 21
 0.0364
 0.0373
 0.0573
[torch.DoubleTensor of size 3x21]

Huh???
That's not right… only the first timestep is zero. Why?

A plausible explanation (my reading of the rnn package, not confirmed in the original thread): nn.Sequencer wraps its argument in nn.Recursor, which during the first :forward creates parameter-sharing clones of the inner nn.Sequential, one per timestep. Replacing modules[3] afterwards only modifies the module used for the first timestep; the clones used for timesteps 2..4 still contain the MulConstant, so the difference cancels only at step 1.
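If the goal is to drop the MulConstant for every timestep, one workaround is to make the replacement before the first forward, or to rebuild the Sequencer afterwards so no stale per-timestep clones survive. A hedged sketch under that assumption about rnn's cloning behavior (same variable names as the script above):

```lua
require 'nn'
require 'rnn'

-- Option 1: replace the layer *before* the first forward, so any
-- per-timestep clones nn.Recursor creates are made from the already
-- modified nn.Sequential.
local inner = model.modules[1].module   -- the wrapped nn.Sequential
inner.modules[3] = nn.Identity()

-- Option 2: if the model has already been run, rewrap the modified
-- Sequential in a fresh Sequencer so old shared clones are discarded.
model = nn.Sequencer(inner)
```

Either way, the `out1[i]*2 - out2[i]` check from the script above should then be zero at every timestep, not just the first.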
