Check failed: registry.count(type) == 1 (0 vs. 1) Unknown layer type: ImageData
Note: this post is just a record of problems I ran into during development. Code, screenshots, and error messages may be pasted in haphazardly; I make no guarantee that it solves the problem or that the formatting is pretty. I will tidy it up later when I have time.
Error message
```
I0705 16:19:07.375211 15866 layer_factory.hpp:77] Creating layer data
F0705 16:19:07.375231 15866 layer_factory.hpp:81] Check failed: registry.count(type) == 1 (0 vs. 1) Unknown layer type: ImageData (known types: AbsVal, Accuracy, ArgMax, BNLL, BatchNorm, BatchReindex, Bias, Concat, ContrastiveLoss, Convolution, Crop, Data, Deconvolution, Dropout, DummyData, ELU, Eltwise, Embed, EuclideanLoss, Exp, Filter, Flatten, HDF5Data, HDF5Output, HingeLoss, Im2col, InfogainLoss, InnerProduct, Input, LRN, LSTM, LSTMUnit, Log, MVN, MemoryData, MultinomialLogisticLoss, PReLU, Parameter, Pooling, Power, RNN, ReLU, Reduction, Reshape, SPP, Scale, Sigmoid, SigmoidCrossEntropyLoss, Silence, Slice, Softmax, SoftmaxWithLoss, Split, TanH, Threshold, Tile)
*** Check failure stack trace: ***
Process finished with exit code 134 (interrupted by signal 6: SIGABRT)
```
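What the fatal `F0705` line means: Caffe's layer factory keeps a registry mapping each layer type name to a creator function, and `CHECK_EQ(registry.count(type), 1)` aborts the process when the requested name was never registered, printing the names that *were* registered. The sketch below is a hypothetical re-implementation of that lookup (not actual Caffe code; the registry contents are illustrative), just to show why the first net (`DummyData`) builds while the second (`ImageData`) aborts:

```python
# Hypothetical sketch of the lookup in Caffe's layer_factory.hpp.
# Each layer type registers a creator keyed by its type name; creating
# a net looks every layer's "type" string up in this registry.
registry = {
    "DummyData": lambda: "DummyDataLayer",
    "Data": lambda: "DataLayer",
    "Convolution": lambda: "ConvolutionLayer",
    # "ImageData" is absent, as in the failing build's known-types list
}

def create_layer(layer_type):
    # Mirrors CHECK_EQ(registry.count(type), 1) << "Unknown layer type: ..."
    if layer_type not in registry:
        raise KeyError("Unknown layer type: %s (known types: %s)"
                       % (layer_type, ", ".join(sorted(registry))))
    return registry[layer_type]()

print(create_layer("DummyData"))  # the first net in the log builds fine
# create_layer("ImageData")       # would raise, like the F0705 fatal check
```

In real Caffe the failed `CHECK` calls `abort()`, which is why the process dies with SIGABRT (exit code 134) instead of raising a Python exception.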
Full console output
```
Loaded ImageNet labels:
n01440764 tench, Tinca tinca
n01443537 goldfish, Carassius auratus
n01484850 great white shark, white shark, man-eater, man-eating shark, Carcharodon carcharias
n01491361 tiger shark, Galeocerdo cuvieri
n01494475 hammerhead, hammerhead shark
n01496331 electric ray, crampfish, numbfish, torpedo
n01498041 stingray
n01514668 cock
n01514859 hen
n01518878 ostrich, Struthio camelus
...
Loaded style labels:
Detailed, Pastel, Melancholy, Noir, HDR
WARNING: Logging before InitGoogleLogging() is written to STDERR
W0705 16:19:05.940695 15866 _caffe.cpp:139] DEPRECATION WARNING - deprecated use of Python interface
W0705 16:19:05.940731 15866 _caffe.cpp:140] Use this instead (with the named "weights" parameter):
W0705 16:19:05.940734 15866 _caffe.cpp:142] Net('/tmp/tmpHpeL3b', 1, weights='/home/pikachu/bvlc/caffe/models/bvlc_reference_caffenet/bvlc_reference_caffenet.caffemodel')
I0705 16:19:05.942809 15866 net.cpp:51] Initializing net from parameters:
state { phase: TEST level: 0 }
layer { name: "data" type: "DummyData" top: "data" dummy_data_param { shape { dim: 1 dim: 3 dim: 227 dim: 227 } } }
layer { name: "conv1" type: "Convolution" bottom: "data" top: "conv1" param { lr_mult: 0 } param { lr_mult: 0 } convolution_param { num_output: 96 pad: 0 kernel_size: 11 group: 1 stride: 4 weight_filler { type: "gaussian" std: 0.01 } bias_filler { type: "constant" value: 0.1 } } }
layer { name: "relu1" type: "ReLU" bottom: "conv1" top: "conv1" }
layer { name: "pool1" type: "Pooling" bottom: "conv1" top: "pool1" pooling_param { pool: MAX kernel_size: 3 stride: 2 } }
layer { name: "norm1" type: "LRN" bottom: "pool1" top: "norm1" lrn_param { local_size: 5 alpha: 0.0001 beta: 0.75 } }
layer { name: "conv2" type: "Convolution" bottom: "norm1" top: "conv2" param { lr_mult: 0 } param { lr_mult: 0 } convolution_param { num_output: 256 pad: 2 kernel_size: 5 group: 2 stride: 1 weight_filler { type: "gaussian" std: 0.01 } bias_filler { type: "constant" value: 0.1 } } }
layer { name: "relu2" type: "ReLU" bottom: "conv2" top: "conv2" }
layer { name: "pool2" type: "Pooling" bottom: "conv2" top: "pool2" pooling_param { pool: MAX kernel_size: 3 stride: 2 } }
layer { name: "norm2" type: "LRN" bottom: "pool2" top: "norm2" lrn_param { local_size: 5 alpha: 0.0001 beta: 0.75 } }
layer { name: "conv3" type: "Convolution" bottom: "norm2" top: "conv3" param { lr_mult: 0 } param { lr_mult: 0 } convolution_param { num_output: 384 pad: 1 kernel_size: 3 group: 1 stride: 1 weight_filler { type: "gaussian" std: 0.01 } bias_filler { type: "constant" value: 0.1 } } }
layer { name: "relu3" type: "ReLU" bottom: "conv3" top: "conv3" }
layer { name: "conv4" type: "Convolution" bottom: "conv3" top: "conv4" param { lr_mult: 0 } param { lr_mult: 0 } convolution_param { num_output: 384 pad: 1 kernel_size: 3 group: 2 stride: 1 weight_filler { type: "gaussian" std: 0.01 } bias_filler { type: "constant" value: 0.1 } } }
layer { name: "relu4" type: "ReLU" bottom: "conv4" top: "conv4" }
layer { name: "conv5" type: "Convolution" bottom: "conv4" top: "conv5" param { lr_mult: 0 } param { lr_mult: 0 } convolution_param { num_output: 256 pad: 1 kernel_size: 3 group: 2 stride: 1 weight_filler { type: "gaussian" std: 0.01 } bias_filler { type: "constant" value: 0.1 } } }
layer { name: "relu5" type: "ReLU" bottom: "conv5" top: "conv5" }
layer { name: "pool5" type: "Pooling" bottom: "conv5" top: "pool5" pooling_param { pool: MAX kernel_size: 3 stride: 2 } }
layer { name: "fc6" type: "InnerProduct" bottom: "pool5" top: "fc6" param { lr_mult: 0 } param { lr_mult: 0 } inner_product_param { num_output: 4096 weight_filler { type: "gaussian" std: 0.005 } bias_filler { type: "constant" value: 0.1 } } }
layer { name: "relu6" type: "ReLU" bottom: "fc6" top: "fc6" }
layer { name: "fc7" type: "InnerProduct" bottom: "fc6" top: "fc7" param { lr_mult: 0 } param { lr_mult: 0 } inner_product_param { num_output: 4096 weight_filler { type: "gaussian" std: 0.005 } bias_filler { type: "constant" value: 0.1 } } }
layer { name: "relu7" type: "ReLU" bottom: "fc7" top: "fc7" }
layer { name: "fc8" type: "InnerProduct" bottom: "fc7" top: "fc8" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } inner_product_param { num_output: 1000 } }
layer { name: "probs" type: "Softmax" bottom: "fc8" top: "probs" }
I0705 16:19:05.942888 15866 layer_factory.hpp:77] Creating layer data
I0705 16:19:05.942898 15866 net.cpp:84] Creating Layer data
I0705 16:19:05.942904 15866 net.cpp:380] data -> data
I0705 16:19:05.955576 15866 net.cpp:122] Setting up data
I0705 16:19:05.955610 15866 net.cpp:129] Top shape: 1 3 227 227 (154587)
I0705 16:19:05.955615 15866 net.cpp:137] Memory required for data: 618348
I0705 16:19:05.955623 15866 layer_factory.hpp:77] Creating layer conv1
I0705 16:19:05.955646 15866 net.cpp:84] Creating Layer conv1
I0705 16:19:05.955651 15866 net.cpp:406] conv1 <- data
I0705 16:19:05.955660 15866 net.cpp:380] conv1 -> conv1
I0705 16:19:06.482457 15866 net.cpp:122] Setting up conv1
I0705 16:19:06.482484 15866 net.cpp:129] Top shape: 1 96 55 55 (290400)
I0705 16:19:06.482488 15866 net.cpp:137] Memory required for data: 1779948
I0705 16:19:06.482504 15866 layer_factory.hpp:77] Creating layer relu1
I0705 16:19:06.482517 15866 net.cpp:84] Creating Layer relu1
I0705 16:19:06.482520 15866 net.cpp:406] relu1 <- conv1
I0705 16:19:06.482525 15866 net.cpp:367] relu1 -> conv1 (in-place)
I0705 16:19:06.482942 15866 net.cpp:122] Setting up relu1
I0705 16:19:06.482954 15866 net.cpp:129] Top shape: 1 96 55 55 (290400)
I0705 16:19:06.482956 15866 net.cpp:137] Memory required for data: 2941548
I0705 16:19:06.482959 15866 layer_factory.hpp:77] Creating layer pool1
I0705 16:19:06.482967 15866 net.cpp:84] Creating Layer pool1
I0705 16:19:06.482970 15866 net.cpp:406] pool1 <- conv1
I0705 16:19:06.482975 15866 net.cpp:380] pool1 -> pool1
I0705 16:19:06.483016 15866 net.cpp:122] Setting up pool1
I0705 16:19:06.483021 15866 net.cpp:129] Top shape: 1 96 27 27 (69984)
I0705 16:19:06.483023 15866 net.cpp:137] Memory required for data: 3221484
I0705 16:19:06.483026 15866 layer_factory.hpp:77] Creating layer norm1
I0705 16:19:06.483032 15866 net.cpp:84] Creating Layer norm1
I0705 16:19:06.483036 15866 net.cpp:406] norm1 <- pool1
I0705 16:19:06.483038 15866 net.cpp:380] norm1 -> norm1
I0705 16:19:06.483219 15866 net.cpp:122] Setting up norm1
I0705 16:19:06.483227 15866 net.cpp:129] Top shape: 1 96 27 27 (69984)
I0705 16:19:06.483228 15866 net.cpp:137] Memory required for data: 3501420
I0705 16:19:06.483232 15866 layer_factory.hpp:77] Creating layer conv2
I0705 16:19:06.483242 15866 net.cpp:84] Creating Layer conv2
I0705 16:19:06.483244 15866 net.cpp:406] conv2 <- norm1
I0705 16:19:06.483248 15866 net.cpp:380] conv2 -> conv2
I0705 16:19:06.489406 15866 net.cpp:122] Setting up conv2
I0705 16:19:06.489430 15866 net.cpp:129] Top shape: 1 256 27 27 (186624)
I0705 16:19:06.489434 15866 net.cpp:137] Memory required for data: 4247916
I0705 16:19:06.489445 15866 layer_factory.hpp:77] Creating layer relu2
I0705 16:19:06.489454 15866 net.cpp:84] Creating Layer relu2
I0705 16:19:06.489459 15866 net.cpp:406] relu2 <- conv2
I0705 16:19:06.489464 15866 net.cpp:367] relu2 -> conv2 (in-place)
I0705 16:19:06.489872 15866 net.cpp:122] Setting up relu2
I0705 16:19:06.489881 15866 net.cpp:129] Top shape: 1 256 27 27 (186624)
I0705 16:19:06.489882 15866 net.cpp:137] Memory required for data: 4994412
I0705 16:19:06.489886 15866 layer_factory.hpp:77] Creating layer pool2
I0705 16:19:06.489892 15866 net.cpp:84] Creating Layer pool2
I0705 16:19:06.489893 15866 net.cpp:406] pool2 <- conv2
I0705 16:19:06.489897 15866 net.cpp:380] pool2 -> pool2
I0705 16:19:06.489941 15866 net.cpp:122] Setting up pool2
I0705 16:19:06.489945 15866 net.cpp:129] Top shape: 1 256 13 13 (43264)
I0705 16:19:06.489948 15866 net.cpp:137] Memory required for data: 5167468
I0705 16:19:06.489949 15866 layer_factory.hpp:77] Creating layer norm2
I0705 16:19:06.489958 15866 net.cpp:84] Creating Layer norm2
I0705 16:19:06.489959 15866 net.cpp:406] norm2 <- pool2
I0705 16:19:06.489962 15866 net.cpp:380] norm2 -> norm2
I0705 16:19:06.490140 15866 net.cpp:122] Setting up norm2
I0705 16:19:06.490146 15866 net.cpp:129] Top shape: 1 256 13 13 (43264)
I0705 16:19:06.490149 15866 net.cpp:137] Memory required for data: 5340524
I0705 16:19:06.490150 15866 layer_factory.hpp:77] Creating layer conv3
I0705 16:19:06.490159 15866 net.cpp:84] Creating Layer conv3
I0705 16:19:06.490161 15866 net.cpp:406] conv3 <- norm2
I0705 16:19:06.490165 15866 net.cpp:380] conv3 -> conv3
I0705 16:19:06.501384 15866 net.cpp:122] Setting up conv3
I0705 16:19:06.501408 15866 net.cpp:129] Top shape: 1 384 13 13 (64896)
I0705 16:19:06.501412 15866 net.cpp:137] Memory required for data: 5600108
I0705 16:19:06.501425 15866 layer_factory.hpp:77] Creating layer relu3
I0705 16:19:06.501435 15866 net.cpp:84] Creating Layer relu3
I0705 16:19:06.501440 15866 net.cpp:406] relu3 <- conv3
I0705 16:19:06.501446 15866 net.cpp:367] relu3 -> conv3 (in-place)
I0705 16:19:06.501895 15866 net.cpp:122] Setting up relu3
I0705 16:19:06.501907 15866 net.cpp:129] Top shape: 1 384 13 13 (64896)
I0705 16:19:06.501909 15866 net.cpp:137] Memory required for data: 5859692
I0705 16:19:06.501914 15866 layer_factory.hpp:77] Creating layer conv4
I0705 16:19:06.501925 15866 net.cpp:84] Creating Layer conv4
I0705 16:19:06.501929 15866 net.cpp:406] conv4 <- conv3
I0705 16:19:06.501935 15866 net.cpp:380] conv4 -> conv4
I0705 16:19:06.511385 15866 net.cpp:122] Setting up conv4
I0705 16:19:06.511409 15866 net.cpp:129] Top shape: 1 384 13 13 (64896)
I0705 16:19:06.511412 15866 net.cpp:137] Memory required for data: 6119276
I0705 16:19:06.511420 15866 layer_factory.hpp:77] Creating layer relu4
I0705 16:19:06.511430 15866 net.cpp:84] Creating Layer relu4
I0705 16:19:06.511432 15866 net.cpp:406] relu4 <- conv4
I0705 16:19:06.511437 15866 net.cpp:367] relu4 -> conv4 (in-place)
I0705 16:19:06.511828 15866 net.cpp:122] Setting up relu4
I0705 16:19:06.511837 15866 net.cpp:129] Top shape: 1 384 13 13 (64896)
I0705 16:19:06.511839 15866 net.cpp:137] Memory required for data: 6378860
I0705 16:19:06.511842 15866 layer_factory.hpp:77] Creating layer conv5
I0705 16:19:06.511852 15866 net.cpp:84] Creating Layer conv5
I0705 16:19:06.511853 15866 net.cpp:406] conv5 <- conv4
I0705 16:19:06.511858 15866 net.cpp:380] conv5 -> conv5
I0705 16:19:06.519031 15866 net.cpp:122] Setting up conv5
I0705 16:19:06.519052 15866 net.cpp:129] Top shape: 1 256 13 13 (43264)
I0705 16:19:06.519055 15866 net.cpp:137] Memory required for data: 6551916
I0705 16:19:06.519067 15866 layer_factory.hpp:77] Creating layer relu5
I0705 16:19:06.519074 15866 net.cpp:84] Creating Layer relu5
I0705 16:19:06.519078 15866 net.cpp:406] relu5 <- conv5
I0705 16:19:06.519083 15866 net.cpp:367] relu5 -> conv5 (in-place)
I0705 16:19:06.519243 15866 net.cpp:122] Setting up relu5
I0705 16:19:06.519249 15866 net.cpp:129] Top shape: 1 256 13 13 (43264)
I0705 16:19:06.519251 15866 net.cpp:137] Memory required for data: 6724972
I0705 16:19:06.519253 15866 layer_factory.hpp:77] Creating layer pool5
I0705 16:19:06.519258 15866 net.cpp:84] Creating Layer pool5
I0705 16:19:06.519260 15866 net.cpp:406] pool5 <- conv5
I0705 16:19:06.519264 15866 net.cpp:380] pool5 -> pool5
I0705 16:19:06.519304 15866 net.cpp:122] Setting up pool5
I0705 16:19:06.519307 15866 net.cpp:129] Top shape: 1 256 6 6 (9216)
I0705 16:19:06.519309 15866 net.cpp:137] Memory required for data: 6761836
I0705 16:19:06.519311 15866 layer_factory.hpp:77] Creating layer fc6
I0705 16:19:06.519318 15866 net.cpp:84] Creating Layer fc6
I0705 16:19:06.519320 15866 net.cpp:406] fc6 <- pool5
I0705 16:19:06.519325 15866 net.cpp:380] fc6 -> fc6
I0705 16:19:06.907001 15866 net.cpp:122] Setting up fc6
I0705 16:19:06.907024 15866 net.cpp:129] Top shape: 1 4096 (4096)
I0705 16:19:06.907028 15866 net.cpp:137] Memory required for data: 6778220
I0705 16:19:06.907038 15866 layer_factory.hpp:77] Creating layer relu6
I0705 16:19:06.907047 15866 net.cpp:84] Creating Layer relu6
I0705 16:19:06.907052 15866 net.cpp:406] relu6 <- fc6
I0705 16:19:06.907058 15866 net.cpp:367] relu6 -> fc6 (in-place)
I0705 16:19:06.907871 15866 net.cpp:122] Setting up relu6
I0705 16:19:06.907883 15866 net.cpp:129] Top shape: 1 4096 (4096)
I0705 16:19:06.907887 15866 net.cpp:137] Memory required for data: 6794604
I0705 16:19:06.907891 15866 layer_factory.hpp:77] Creating layer fc7
I0705 16:19:06.907902 15866 net.cpp:84] Creating Layer fc7
I0705 16:19:06.907905 15866 net.cpp:406] fc7 <- fc6
I0705 16:19:06.907913 15866 net.cpp:380] fc7 -> fc7
I0705 16:19:07.075175 15866 net.cpp:122] Setting up fc7
I0705 16:19:07.075198 15866 net.cpp:129] Top shape: 1 4096 (4096)
I0705 16:19:07.075201 15866 net.cpp:137] Memory required for data: 6810988
I0705 16:19:07.075209 15866 layer_factory.hpp:77] Creating layer relu7
I0705 16:19:07.075217 15866 net.cpp:84] Creating Layer relu7
I0705 16:19:07.075220 15866 net.cpp:406] relu7 <- fc7
I0705 16:19:07.075225 15866 net.cpp:367] relu7 -> fc7 (in-place)
I0705 16:19:07.075441 15866 net.cpp:122] Setting up relu7
I0705 16:19:07.075448 15866 net.cpp:129] Top shape: 1 4096 (4096)
I0705 16:19:07.075448 15866 net.cpp:137] Memory required for data: 6827372
I0705 16:19:07.075451 15866 layer_factory.hpp:77] Creating layer fc8
I0705 16:19:07.075456 15866 net.cpp:84] Creating Layer fc8
I0705 16:19:07.075459 15866 net.cpp:406] fc8 <- fc7
I0705 16:19:07.075462 15866 net.cpp:380] fc8 -> fc8
I0705 16:19:07.082876 15866 net.cpp:122] Setting up fc8
I0705 16:19:07.082902 15866 net.cpp:129] Top shape: 1 1000 (1000)
I0705 16:19:07.082906 15866 net.cpp:137] Memory required for data: 6831372
I0705 16:19:07.082914 15866 layer_factory.hpp:77] Creating layer probs
I0705 16:19:07.082923 15866 net.cpp:84] Creating Layer probs
I0705 16:19:07.082926 15866 net.cpp:406] probs <- fc8
I0705 16:19:07.082931 15866 net.cpp:380] probs -> probs
I0705 16:19:07.083515 15866 net.cpp:122] Setting up probs
I0705 16:19:07.083523 15866 net.cpp:129] Top shape: 1 1000 (1000)
I0705 16:19:07.083526 15866 net.cpp:137] Memory required for data: 6835372
I0705 16:19:07.083528 15866 net.cpp:200] probs does not need backward computation.
I0705 16:19:07.083531 15866 net.cpp:200] fc8 does not need backward computation.
I0705 16:19:07.083534 15866 net.cpp:200] relu7 does not need backward computation.
I0705 16:19:07.083535 15866 net.cpp:200] fc7 does not need backward computation.
I0705 16:19:07.083537 15866 net.cpp:200] relu6 does not need backward computation.
I0705 16:19:07.083539 15866 net.cpp:200] fc6 does not need backward computation.
I0705 16:19:07.083542 15866 net.cpp:200] pool5 does not need backward computation.
I0705 16:19:07.083544 15866 net.cpp:200] relu5 does not need backward computation.
I0705 16:19:07.083546 15866 net.cpp:200] conv5 does not need backward computation.
I0705 16:19:07.083549 15866 net.cpp:200] relu4 does not need backward computation.
I0705 16:19:07.083551 15866 net.cpp:200] conv4 does not need backward computation.
I0705 16:19:07.083554 15866 net.cpp:200] relu3 does not need backward computation.
I0705 16:19:07.083555 15866 net.cpp:200] conv3 does not need backward computation.
I0705 16:19:07.083559 15866 net.cpp:200] norm2 does not need backward computation.
I0705 16:19:07.083560 15866 net.cpp:200] pool2 does not need backward computation.
I0705 16:19:07.083564 15866 net.cpp:200] relu2 does not need backward computation.
I0705 16:19:07.083565 15866 net.cpp:200] conv2 does not need backward computation.
I0705 16:19:07.083567 15866 net.cpp:200] norm1 does not need backward computation.
I0705 16:19:07.083570 15866 net.cpp:200] pool1 does not need backward computation.
I0705 16:19:07.083572 15866 net.cpp:200] relu1 does not need backward computation.
I0705 16:19:07.083575 15866 net.cpp:200] conv1 does not need backward computation.
I0705 16:19:07.083576 15866 net.cpp:200] data does not need backward computation.
I0705 16:19:07.083578 15866 net.cpp:242] This network produces output probs
I0705 16:19:07.083587 15866 net.cpp:255] Network initialization done.
I0705 16:19:07.182833 15866 upgrade_proto.cpp:44] Attempting to upgrade input file specified using deprecated transformation parameters: /home/pikachu/bvlc/caffe/models/bvlc_reference_caffenet/bvlc_reference_caffenet.caffemodel
I0705 16:19:07.182853 15866 upgrade_proto.cpp:47] Successfully upgraded file specified using deprecated data transformation parameters.
W0705 16:19:07.182857 15866 upgrade_proto.cpp:49] Note that future Caffe releases will only support transform_param messages for transformation fields.
I0705 16:19:07.182858 15866 upgrade_proto.cpp:53] Attempting to upgrade input file specified using deprecated V1LayerParameter: /home/pikachu/bvlc/caffe/models/bvlc_reference_caffenet/bvlc_reference_caffenet.caffemodel
I0705 16:19:07.318497 15866 upgrade_proto.cpp:61] Successfully upgraded file specified using deprecated V1LayerParameter
I0705 16:19:07.352970 15866 net.cpp:744] Ignoring source layer drop6
I0705 16:19:07.366560 15866 net.cpp:744] Ignoring source layer drop7
I0705 16:19:07.369844 15866 net.cpp:744] Ignoring source layer loss
W0705 16:19:07.374799 15866 _caffe.cpp:139] DEPRECATION WARNING - deprecated use of Python interface
W0705 16:19:07.374815 15866 _caffe.cpp:140] Use this instead (with the named "weights" parameter):
W0705 16:19:07.374817 15866 _caffe.cpp:142] Net('/tmp/tmpcTvnDX', 1, weights='/home/pikachu/bvlc/caffe/models/bvlc_reference_caffenet/bvlc_reference_caffenet.caffemodel')
I0705 16:19:07.375140 15866 net.cpp:51] Initializing net from parameters:
state { phase: TEST level: 0 }
layer { name: "data" type: "ImageData" top: "data" top: "label" transform_param { mirror: false crop_size: 227 mean_file: "/home/pikachu/bvlc/caffe/data/ilsvrc12/imagenet_mean.binaryproto" } image_data_param { source: "/home/pikachu/bvlc/caffe/data/flickr_style/train.txt" batch_size: 50 new_height: 256 new_width: 256 } }
layer { name: "conv1" type: "Convolution" bottom: "data" top: "conv1" param { lr_mult: 0 } param { lr_mult: 0 } convolution_param { num_output: 96 pad: 0 kernel_size: 11 group: 1 stride: 4 weight_filler { type: "gaussian" std: 0.01 } bias_filler { type: "constant" value: 0.1 } } }
layer { name: "relu1" type: "ReLU" bottom: "conv1" top: "conv1" }
layer { name: "pool1" type: "Pooling" bottom: "conv1" top: "pool1" pooling_param { pool: MAX kernel_size: 3 stride: 2 } }
layer { name: "norm1" type: "LRN" bottom: "pool1" top: "norm1" lrn_param { local_size: 5 alpha: 0.0001 beta: 0.75 } }
layer { name: "conv2" type: "Convolution" bottom: "norm1" top: "conv2" param { lr_mult: 0 } param { lr_mult: 0 } convolution_param { num_output: 256 pad: 2 kernel_size: 5 group: 2 stride: 1 weight_filler { type: "gaussian" std: 0.01 } bias_filler { type: "constant" value: 0.1 } } }
layer { name: "relu2" type: "ReLU" bottom: "conv2" top: "conv2" }
layer { name: "pool2" type: "Pooling" bottom: "conv2" top: "pool2" pooling_param { pool: MAX kernel_size: 3 stride: 2 } }
layer { name: "norm2" type: "LRN" bottom: "pool2" top: "norm2" lrn_param { local_size: 5 alpha: 0.0001 beta: 0.75 } }
layer { name: "conv3" type: "Convolution" bottom: "norm2" top: "conv3" param { lr_mult: 0 } param { lr_mult: 0 } convolution_param { num_output: 384 pad: 1 kernel_size: 3 group: 1 stride: 1 weight_filler { type: "gaussian" std: 0.01 } bias_filler { type: "constant" value: 0.1 } } }
layer { name: "relu3" type: "ReLU" bottom: "conv3" top: "conv3" }
layer { name: "conv4" type: "Convolution" bottom: "conv3" top: "conv4" param { lr_mult: 0 } param { lr_mult: 0 } convolution_param { num_output: 384 pad: 1 kernel_size: 3 group: 2 stride: 1 weight_filler { type: "gaussian" std: 0.01 } bias_filler { type: "constant" value: 0.1 } } }
layer { name: "relu4" type: "ReLU" bottom: "conv4" top: "conv4" }
layer { name: "conv5" type: "Convolution" bottom: "conv4" top: "conv5" param { lr_mult: 0 } param { lr_mult: 0 } convolution_param { num_output: 256 pad: 1 kernel_size: 3 group: 2 stride: 1 weight_filler { type: "gaussian" std: 0.01 } bias_filler { type: "constant" value: 0.1 } } }
layer { name: "relu5" type: "ReLU" bottom: "conv5" top: "conv5" }
layer { name: "pool5" type: "Pooling" bottom: "conv5" top: "pool5" pooling_param { pool: MAX kernel_size: 3 stride: 2 } }
layer { name: "fc6" type: "InnerProduct" bottom: "pool5" top: "fc6" param { lr_mult: 0 } param { lr_mult: 0 } inner_product_param { num_output: 4096 weight_filler { type: "gaussian" std: 0.005 } bias_filler { type: "constant" value: 0.1 } } }
layer { name: "relu6" type: "ReLU" bottom: "fc6" top: "fc6" }
layer { name: "fc7" type: "InnerProduct" bottom: "fc6" top: "fc7" param { lr_mult: 0 } param { lr_mult: 0 } inner_product_param { num_output: 4096 weight_filler { type: "gaussian" std: 0.005 } bias_filler { type: "constant" value: 0.1 } } }
layer { name: "relu7" type: "ReLU" bottom: "fc7" top: "fc7" }
layer { name: "fc8_flickr" type: "InnerProduct" bottom: "fc7" top: "fc8_flickr" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } inner_product_param { num_output: 5 } }
layer { name: "probs" type: "Softmax" bottom: "fc8_flickr" top: "probs" }
layer { name: "loss" type: "SoftmaxWithLoss" bottom: "fc8_flickr" bottom: "label" top: "loss" }
layer { name: "acc" type: "Accuracy" bottom: "fc8_flickr" bottom: "label" top: "acc" }
I0705 16:19:07.375211 15866 layer_factory.hpp:77] Creating layer data
F0705 16:19:07.375231 15866 layer_factory.hpp:81] Check failed: registry.count(type) == 1 (0 vs. 1) Unknown layer type: ImageData (known types: AbsVal, Accuracy, ArgMax, BNLL, BatchNorm, BatchReindex, Bias, Concat, ContrastiveLoss, Convolution, Crop, Data, Deconvolution, Dropout, DummyData, ELU, Eltwise, Embed, EuclideanLoss, Exp, Filter, Flatten, HDF5Data, HDF5Output, HingeLoss, Im2col, InfogainLoss, InnerProduct, Input, LRN, LSTM, LSTMUnit, Log, MVN, MemoryData, MultinomialLogisticLoss, PReLU, Parameter, Pooling, Power, RNN, ReLU, Reduction, Reshape, SPP, Scale, Sigmoid, SigmoidCrossEntropyLoss, Silence, Slice, Softmax, SoftmaxWithLoss, Split, TanH, Threshold, Tile)
*** Check failure stack trace: ***
Process finished with exit code 134 (interrupted by signal 6: SIGABRT)
```
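One observation worth recording: the known-types list in the fatal check is missing ImageData (and WindowData), both of which depend on OpenCV in Caffe's sources. That pattern is commonly caused by building Caffe with OpenCV disabled, which compiles out those layers entirely. A likely fix, assuming that is the cause here (I have not verified it against this exact build), is to re-enable OpenCV in Makefile.config and rebuild:

```makefile
# Makefile.config in the Caffe checkout (assumption: OpenCV was disabled).
# If a line "USE_OPENCV := 0" is present, comment it out or change it to:
USE_OPENCV := 1   # ImageData/WindowData layers are only compiled with OpenCV
```

Then rebuild both the library and the Python bindings so the layer gets registered again, e.g. `make clean && make all && make pycaffe`. Another cause seen in similar reports is pycaffe linking against a different, older libcaffe than the one just built, so it may also be worth checking which `_caffe.so`/`libcaffe.so` the Python process actually loads.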