...C:\Users\vlj\Documents\GitHub\caffe\python\caffe\test\test_coord_map.py:46: DeprecationWarning: Please use assertEqual instead. self.assertEquals(ax, 1) ........E........WARNING: Logging before InitGoogleLogging() is written to STDERR I0122 21:16:09.047402 24972 common.cpp:36] System entropy source not available, using fallback algorithm to generate seed instead. I0122 21:16:09.047402 24972 net.cpp:294] The NetState phase (1) differed from the phase (0) specified by a rule in layer train_data I0122 21:16:09.047402 24972 net.cpp:330] The NetState did not contain stage 'val' specified by a rule in layer val_data I0122 21:16:09.047402 24972 net.cpp:294] The NetState phase (1) differed from the phase (0) specified by a rule in layer loss I0122 21:16:09.047402 24972 net.cpp:330] The NetState did not contain stage 'val' specified by a rule in layer loss I0122 21:16:09.047402 24972 net.cpp:51] Initializing net from parameters: state { phase: TEST level: 0 stage: "deploy" } layer { name: "deploy_data" type: "Input" top: "data" include { phase: TEST stage: "deploy" } input_param { shape { dim: 1 dim: 1 dim: 10 dim: 10 } } } layer { name: "ip" type: "InnerProduct" bottom: "data" top: "ip" inner_product_param { num_output: 2 } } layer { name: "pred" type: "Softmax" bottom: "ip" top: "pred" include { phase: TEST stage: "deploy" } } I0122 21:16:09.047402 24972 layer_factory.hpp:77] Creating layer deploy_data I0122 21:16:09.047402 24972 net.cpp:84] Creating Layer deploy_data I0122 21:16:09.047402 24972 net.cpp:380] deploy_data -> data I0122 21:16:09.047402 24972 net.cpp:122] Setting up deploy_data I0122 21:16:09.047402 24972 net.cpp:129] Top shape: 1 1 10 10 (100) I0122 21:16:09.047402 24972 net.cpp:137] Memory required for data: 400 I0122 21:16:09.047402 24972 layer_factory.hpp:77] Creating layer ip I0122 21:16:09.047402 24972 net.cpp:84] Creating Layer ip I0122 21:16:09.047402 24972 net.cpp:406] ip <- data I0122 21:16:09.047402 24972 net.cpp:380] ip -> ip I0122 
21:16:09.047402 24972 net.cpp:122] Setting up ip I0122 21:16:09.047402 24972 net.cpp:129] Top shape: 1 2 (2) I0122 21:16:09.047402 24972 net.cpp:137] Memory required for data: 408 I0122 21:16:09.047402 24972 layer_factory.hpp:77] Creating layer pred I0122 21:16:09.047402 24972 net.cpp:84] Creating Layer pred I0122 21:16:09.047402 24972 net.cpp:406] pred <- ip I0122 21:16:09.047402 24972 net.cpp:380] pred -> pred I0122 21:16:09.047402 24972 net.cpp:122] Setting up pred I0122 21:16:09.047402 24972 net.cpp:129] Top shape: 1 2 (2) I0122 21:16:09.047402 24972 net.cpp:137] Memory required for data: 416 I0122 21:16:09.047402 24972 net.cpp:200] pred does not need backward computation. I0122 21:16:09.047402 24972 net.cpp:200] ip does not need backward computation. I0122 21:16:09.047402 24972 net.cpp:200] deploy_data does not need backward computation. I0122 21:16:09.047402 24972 net.cpp:242] This network produces output pred I0122 21:16:09.047402 24972 net.cpp:255] Network initialization done. I0122 21:16:09.054404 24972 net.cpp:294] The NetState phase (0) differed from the phase (1) specified by a rule in layer val_data I0122 21:16:09.054404 24972 net.cpp:294] The NetState phase (0) differed from the phase (1) specified by a rule in layer deploy_data I0122 21:16:09.054404 24972 net.cpp:294] The NetState phase (0) differed from the phase (1) specified by a rule in layer pred I0122 21:16:09.054404 24972 net.cpp:51] Initializing net from parameters: state { phase: TRAIN level: 0 stage: "train" } layer { name: "train_data" type: "DummyData" top: "data" top: "label" include { phase: TRAIN stage: "train" } dummy_data_param { shape { dim: 1 dim: 1 dim: 10 dim: 10 } shape { dim: 1 dim: 1 dim: 1 dim: 1 } } } layer { name: "ip" type: "InnerProduct" bottom: "data" top: "ip" inner_product_param { num_output: 2 } } layer { name: "loss" type: "SoftmaxWithLoss" bottom: "ip" bottom: "label" top: "loss" include { phase: TRAIN stage: "train" } include { phase: TEST stage: "val" } } I0122 
21:16:09.054404 24972 layer_factory.hpp:77] Creating layer train_data I0122 21:16:09.055407 24972 net.cpp:84] Creating Layer train_data I0122 21:16:09.055407 24972 net.cpp:380] train_data -> data I0122 21:16:09.055407 24972 net.cpp:380] train_data -> label I0122 21:16:09.055407 24972 net.cpp:122] Setting up train_data I0122 21:16:09.055407 24972 net.cpp:129] Top shape: 1 1 10 10 (100) I0122 21:16:09.055407 24972 net.cpp:129] Top shape: 1 1 1 1 (1) I0122 21:16:09.055407 24972 net.cpp:137] Memory required for data: 404 I0122 21:16:09.055407 24972 layer_factory.hpp:77] Creating layer ip I0122 21:16:09.055407 24972 net.cpp:84] Creating Layer ip I0122 21:16:09.055407 24972 net.cpp:406] ip <- data I0122 21:16:09.055407 24972 net.cpp:380] ip -> ip I0122 21:16:09.055407 24972 net.cpp:122] Setting up ip I0122 21:16:09.055407 24972 net.cpp:129] Top shape: 1 2 (2) I0122 21:16:09.055407 24972 net.cpp:137] Memory required for data: 412 I0122 21:16:09.055407 24972 layer_factory.hpp:77] Creating layer loss I0122 21:16:09.055407 24972 net.cpp:84] Creating Layer loss I0122 21:16:09.055407 24972 net.cpp:406] loss <- ip I0122 21:16:09.055407 24972 net.cpp:406] loss <- label I0122 21:16:09.055407 24972 net.cpp:380] loss -> loss I0122 21:16:09.055407 24972 layer_factory.hpp:77] Creating layer loss I0122 21:16:09.055407 24972 net.cpp:122] Setting up loss I0122 21:16:09.055407 24972 net.cpp:129] Top shape: (1) I0122 21:16:09.055407 24972 net.cpp:132] with loss weight 1 I0122 21:16:09.055407 24972 net.cpp:137] Memory required for data: 416 I0122 21:16:09.055407 24972 net.cpp:198] loss needs backward computation. I0122 21:16:09.055407 24972 net.cpp:198] ip needs backward computation. I0122 21:16:09.055407 24972 net.cpp:200] train_data does not need backward computation. I0122 21:16:09.055407 24972 net.cpp:242] This network produces output loss I0122 21:16:09.055407 24972 net.cpp:255] Network initialization done. 
I0122 21:16:09.057404 24972 net.cpp:294] The NetState phase (1) differed from the phase (0) specified by a rule in layer train_data I0122 21:16:09.057404 24972 net.cpp:330] The NetState did not contain stage 'deploy' specified by a rule in layer deploy_data I0122 21:16:09.057404 24972 net.cpp:294] The NetState phase (1) differed from the phase (0) specified by a rule in layer loss I0122 21:16:09.057404 24972 net.cpp:330] The NetState did not contain stage 'deploy' specified by a rule in layer pred I0122 21:16:09.057404 24972 net.cpp:51] Initializing net from parameters: state { phase: TEST level: 0 stage: "val" } layer { name: "val_data" type: "DummyData" top: "data" top: "label" include { phase: TEST stage: "val" } dummy_data_param { shape { dim: 1 dim: 1 dim: 10 dim: 10 } shape { dim: 1 dim: 1 dim: 1 dim: 1 } } } layer { name: "ip" type: "InnerProduct" bottom: "data" top: "ip" inner_product_param { num_output: 2 } } layer { name: "loss" type: "SoftmaxWithLoss" bottom: "ip" bottom: "label" top: "loss" include { phase: TRAIN stage: "train" } include { phase: TEST stage: "val" } } I0122 21:16:09.057404 24972 layer_factory.hpp:77] Creating layer val_data I0122 21:16:09.057404 24972 net.cpp:84] Creating Layer val_data I0122 21:16:09.057404 24972 net.cpp:380] val_data -> data I0122 21:16:09.057404 24972 net.cpp:380] val_data -> label I0122 21:16:09.057404 24972 net.cpp:122] Setting up val_data I0122 21:16:09.057404 24972 net.cpp:129] Top shape: 1 1 10 10 (100) I0122 21:16:09.057404 24972 net.cpp:129] Top shape: 1 1 1 1 (1) I0122 21:16:09.057404 24972 net.cpp:137] Memory required for data: 404 I0122 21:16:09.057404 24972 layer_factory.hpp:77] Creating layer ip I0122 21:16:09.057404 24972 net.cpp:84] Creating Layer ip I0122 21:16:09.057404 24972 net.cpp:406] ip <- data I0122 21:16:09.057404 24972 net.cpp:380] ip -> ip I0122 21:16:09.057404 24972 net.cpp:122] Setting up ip I0122 21:16:09.057404 24972 net.cpp:129] Top shape: 1 2 (2) I0122 21:16:09.057404 24972 net.cpp:137] 
Memory required for data: 412 I0122 21:16:09.057404 24972 layer_factory.hpp:77] Creating layer loss I0122 21:16:09.058401 24972 net.cpp:84] Creating Layer loss I0122 21:16:09.058401 24972 net.cpp:406] loss <- ip I0122 21:16:09.058401 24972 net.cpp:406] loss <- label I0122 21:16:09.058401 24972 net.cpp:380] loss -> loss I0122 21:16:09.058401 24972 layer_factory.hpp:77] Creating layer loss I0122 21:16:09.058401 24972 net.cpp:122] Setting up loss I0122 21:16:09.058401 24972 net.cpp:129] Top shape: (1) I0122 21:16:09.058401 24972 net.cpp:132] with loss weight 1 I0122 21:16:09.058401 24972 net.cpp:137] Memory required for data: 416 I0122 21:16:09.058401 24972 net.cpp:198] loss needs backward computation. I0122 21:16:09.058401 24972 net.cpp:198] ip needs backward computation. I0122 21:16:09.058401 24972 net.cpp:200] val_data does not need backward computation. I0122 21:16:09.058401 24972 net.cpp:242] This network produces output loss I0122 21:16:09.058401 24972 net.cpp:255] Network initialization done. 
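The `net.cpp:294` ("phase differed") and `net.cpp:330` ("did not contain stage") messages above show Caffe filtering each layer's `include` rule against the `NetState` before building the net. A minimal Python sketch of that phase/stage check, for illustration only (the function and argument names are ours, not Caffe's API):

```python
# Sketch of Caffe's NetState phase/stage rule filtering, as described by the
# net.cpp:294 and net.cpp:330 log messages. Names are illustrative.
TRAIN, TEST = 0, 1

def rule_matches(state_phase, state_stages, rule_phase=None, rule_stage=None):
    if rule_phase is not None and rule_phase != state_phase:
        return False  # net.cpp:294: NetState phase differs from the rule's phase
    if rule_stage is not None and rule_stage not in state_stages:
        return False  # net.cpp:330: NetState does not contain the required stage
    return True

# The deploy-time state from the log: phase TEST, stage "deploy".
keep_train_data = rule_matches(TEST, {"deploy"}, rule_phase=TRAIN, rule_stage="train")
keep_deploy_data = rule_matches(TEST, {"deploy"}, rule_phase=TEST, rule_stage="deploy")
```

With that state, `train_data` is filtered out and `deploy_data` is kept, matching the three-layer deploy net the log goes on to initialize.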
I0122 21:16:09.059403 24972 net.cpp:304] The NetState level (0) is above the min_level (1) specified by a rule in layer Level1Only I0122 21:16:09.059403 24972 net.cpp:304] The NetState level (0) is above the min_level (1) specified by a rule in layer Level>=1 I0122 21:16:09.059403 24972 net.cpp:51] Initializing net from parameters: state { phase: TEST level: 0 } layer { name: "data" type: "DummyData" top: "data" dummy_data_param { shape { dim: 1 dim: 1 dim: 10 dim: 10 } } } layer { name: "NoLevel" type: "InnerProduct" bottom: "data" top: "NoLevel" inner_product_param { num_output: 1 } } layer { name: "Level0Only" type: "InnerProduct" bottom: "data" top: "Level0Only" include { min_level: 0 max_level: 0 } inner_product_param { num_output: 1 } } layer { name: "Level>=0" type: "InnerProduct" bottom: "data" top: "Level>=0" include { min_level: 0 } inner_product_param { num_output: 1 } } I0122 21:16:09.059403 24972 layer_factory.hpp:77] Creating layer data I0122 21:16:09.059403 24972 net.cpp:84] Creating Layer data I0122 21:16:09.059403 24972 net.cpp:380] data -> data I0122 21:16:09.059403 24972 net.cpp:122] Setting up data I0122 21:16:09.059403 24972 net.cpp:129] Top shape: 1 1 10 10 (100) I0122 21:16:09.059403 24972 net.cpp:137] Memory required for data: 400 I0122 21:16:09.059403 24972 layer_factory.hpp:77] Creating layer data_data_0_split I0122 21:16:09.059403 24972 net.cpp:84] Creating Layer data_data_0_split I0122 21:16:09.059403 24972 net.cpp:406] data_data_0_split <- data I0122 21:16:09.059403 24972 net.cpp:380] data_data_0_split -> data_data_0_split_0 I0122 21:16:09.059403 24972 net.cpp:380] data_data_0_split -> data_data_0_split_1 I0122 21:16:09.059403 24972 net.cpp:380] data_data_0_split -> data_data_0_split_2 I0122 21:16:09.059403 24972 net.cpp:122] Setting up data_data_0_split I0122 21:16:09.059403 24972 net.cpp:129] Top shape: 1 1 10 10 (100) I0122 21:16:09.059403 24972 net.cpp:129] Top shape: 1 1 10 10 (100) I0122 21:16:09.059403 24972 net.cpp:129] Top 
shape: 1 1 10 10 (100) I0122 21:16:09.059403 24972 net.cpp:137] Memory required for data: 1600 I0122 21:16:09.059403 24972 layer_factory.hpp:77] Creating layer NoLevel I0122 21:16:09.059403 24972 net.cpp:84] Creating Layer NoLevel I0122 21:16:09.059403 24972 net.cpp:406] NoLevel <- data_data_0_split_0 I0122 21:16:09.059403 24972 net.cpp:380] NoLevel -> NoLevel I0122 21:16:09.059403 24972 net.cpp:122] Setting up NoLevel I0122 21:16:09.059403 24972 net.cpp:129] Top shape: 1 1 (1) I0122 21:16:09.059403 24972 net.cpp:137] Memory required for data: 1604 I0122 21:16:09.059403 24972 layer_factory.hpp:77] Creating layer Level0Only I0122 21:16:09.059403 24972 net.cpp:84] Creating Layer Level0Only I0122 21:16:09.059403 24972 net.cpp:406] Level0Only <- data_data_0_split_1 I0122 21:16:09.059403 24972 net.cpp:380] Level0Only -> Level0Only I0122 21:16:09.059403 24972 net.cpp:122] Setting up Level0Only I0122 21:16:09.059403 24972 net.cpp:129] Top shape: 1 1 (1) I0122 21:16:09.059403 24972 net.cpp:137] Memory required for data: 1608 I0122 21:16:09.059403 24972 layer_factory.hpp:77] Creating layer Level>=0 I0122 21:16:09.059403 24972 net.cpp:84] Creating Layer Level>=0 I0122 21:16:09.059403 24972 net.cpp:406] Level>=0 <- data_data_0_split_2 I0122 21:16:09.059403 24972 net.cpp:380] Level>=0 -> Level>=0 I0122 21:16:09.059403 24972 net.cpp:122] Setting up Level>=0 I0122 21:16:09.059403 24972 net.cpp:129] Top shape: 1 1 (1) I0122 21:16:09.059403 24972 net.cpp:137] Memory required for data: 1612 I0122 21:16:09.059403 24972 net.cpp:200] Level>=0 does not need backward computation. I0122 21:16:09.059403 24972 net.cpp:200] Level0Only does not need backward computation. I0122 21:16:09.059403 24972 net.cpp:200] NoLevel does not need backward computation. I0122 21:16:09.059403 24972 net.cpp:200] data_data_0_split does not need backward computation. I0122 21:16:09.059403 24972 net.cpp:200] data does not need backward computation. 
I0122 21:16:09.059403 24972 net.cpp:242] This network produces output Level0Only I0122 21:16:09.059403 24972 net.cpp:242] This network produces output Level>=0 I0122 21:16:09.060402 24972 net.cpp:242] This network produces output NoLevel I0122 21:16:09.060402 24972 net.cpp:255] Network initialization done. I0122 21:16:09.061404 24972 net.cpp:314] The NetState level (1) is above the max_level (0) specified by a rule in layer Level0Only I0122 21:16:09.061404 24972 net.cpp:51] Initializing net from parameters: state { phase: TEST level: 1 } layer { name: "data" type: "DummyData" top: "data" dummy_data_param { shape { dim: 1 dim: 1 dim: 10 dim: 10 } } } layer { name: "NoLevel" type: "InnerProduct" bottom: "data" top: "NoLevel" inner_product_param { num_output: 1 } } layer { name: "Level1Only" type: "InnerProduct" bottom: "data" top: "Level1Only" include { min_level: 1 max_level: 1 } inner_product_param { num_output: 1 } } layer { name: "Level>=0" type: "InnerProduct" bottom: "data" top: "Level>=0" include { min_level: 0 } inner_product_param { num_output: 1 } } layer { name: "Level>=1" type: "InnerProduct" bottom: "data" top: "Level>=1" include { min_level: 1 } inner_product_param { num_output: 1 } } I0122 21:16:09.061404 24972 layer_factory.hpp:77] Creating layer data I0122 21:16:09.061404 24972 net.cpp:84] Creating Layer data I0122 21:16:09.061404 24972 net.cpp:380] data -> data I0122 21:16:09.061404 24972 net.cpp:122] Setting up data I0122 21:16:09.061404 24972 net.cpp:129] Top shape: 1 1 10 10 (100) I0122 21:16:09.061404 24972 net.cpp:137] Memory required for data: 400 I0122 21:16:09.061404 24972 layer_factory.hpp:77] Creating layer data_data_0_split I0122 21:16:09.061404 24972 net.cpp:84] Creating Layer data_data_0_split I0122 21:16:09.061404 24972 net.cpp:406] data_data_0_split <- data I0122 21:16:09.061404 24972 net.cpp:380] data_data_0_split -> data_data_0_split_0 I0122 21:16:09.061404 24972 net.cpp:380] data_data_0_split -> data_data_0_split_1 I0122 
21:16:09.061404 24972 net.cpp:380] data_data_0_split -> data_data_0_split_2 I0122 21:16:09.061404 24972 net.cpp:380] data_data_0_split -> data_data_0_split_3 I0122 21:16:09.061404 24972 net.cpp:122] Setting up data_data_0_split I0122 21:16:09.061404 24972 net.cpp:129] Top shape: 1 1 10 10 (100) I0122 21:16:09.061404 24972 net.cpp:129] Top shape: 1 1 10 10 (100) I0122 21:16:09.061404 24972 net.cpp:129] Top shape: 1 1 10 10 (100) I0122 21:16:09.061404 24972 net.cpp:129] Top shape: 1 1 10 10 (100) I0122 21:16:09.061404 24972 net.cpp:137] Memory required for data: 2000 I0122 21:16:09.061404 24972 layer_factory.hpp:77] Creating layer NoLevel I0122 21:16:09.062405 24972 net.cpp:84] Creating Layer NoLevel I0122 21:16:09.062405 24972 net.cpp:406] NoLevel <- data_data_0_split_0 I0122 21:16:09.062405 24972 net.cpp:380] NoLevel -> NoLevel I0122 21:16:09.062405 24972 net.cpp:122] Setting up NoLevel I0122 21:16:09.062405 24972 net.cpp:129] Top shape: 1 1 (1) I0122 21:16:09.062405 24972 net.cpp:137] Memory required for data: 2004 I0122 21:16:09.062405 24972 layer_factory.hpp:77] Creating layer Level1Only I0122 21:16:09.062405 24972 net.cpp:84] Creating Layer Level1Only I0122 21:16:09.062405 24972 net.cpp:406] Level1Only <- data_data_0_split_1 I0122 21:16:09.062405 24972 net.cpp:380] Level1Only -> Level1Only I0122 21:16:09.062405 24972 net.cpp:122] Setting up Level1Only I0122 21:16:09.062405 24972 net.cpp:129] Top shape: 1 1 (1) I0122 21:16:09.062405 24972 net.cpp:137] Memory required for data: 2008 I0122 21:16:09.062405 24972 layer_factory.hpp:77] Creating layer Level>=0 I0122 21:16:09.062405 24972 net.cpp:84] Creating Layer Level>=0 I0122 21:16:09.062405 24972 net.cpp:406] Level>=0 <- data_data_0_split_2 I0122 21:16:09.062405 24972 net.cpp:380] Level>=0 -> Level>=0 I0122 21:16:09.062405 24972 net.cpp:122] Setting up Level>=0 I0122 21:16:09.062405 24972 net.cpp:129] Top shape: 1 1 (1) I0122 21:16:09.062405 24972 net.cpp:137] Memory required for data: 2012 I0122 21:16:09.062405 
24972 layer_factory.hpp:77] Creating layer Level>=1 I0122 21:16:09.062405 24972 net.cpp:84] Creating Layer Level>=1 I0122 21:16:09.062405 24972 net.cpp:406] Level>=1 <- data_data_0_split_3 I0122 21:16:09.062405 24972 net.cpp:380] Level>=1 -> Level>=1 I0122 21:16:09.062405 24972 net.cpp:122] Setting up Level>=1 I0122 21:16:09.062405 24972 net.cpp:129] Top shape: 1 1 (1) I0122 21:16:09.062405 24972 net.cpp:137] Memory required for data: 2016 I0122 21:16:09.062405 24972 net.cpp:200] Level>=1 does not need backward computation. I0122 21:16:09.062405 24972 net.cpp:200] Level>=0 does not need backward computation. I0122 21:16:09.062405 24972 net.cpp:200] Level1Only does not need backward computation. I0122 21:16:09.062405 24972 net.cpp:200] NoLevel does not need backward computation. I0122 21:16:09.062405 24972 net.cpp:200] data_data_0_split does not need backward computation. I0122 21:16:09.062405 24972 net.cpp:200] data does not need backward computation. I0122 21:16:09.062405 24972 net.cpp:242] This network produces output Level1Only I0122 21:16:09.062405 24972 net.cpp:242] This network produces output Level>=0 I0122 21:16:09.062405 24972 net.cpp:242] This network produces output Level>=1 I0122 21:16:09.062405 24972 net.cpp:242] This network produces output NoLevel I0122 21:16:09.062405 24972 net.cpp:255] Network initialization done. 
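The `net.cpp:304`/`net.cpp:314` messages describe the level-based half of the same filtering: a layer survives only when `min_level <= state.level <= max_level` for every bound the rule sets (note that glog words the first case as "level is above the min_level" even though the state level is below it). A rough sketch of that check, with an illustrative helper name of our own:

```python
def level_matches(state_level, min_level=None, max_level=None):
    # net.cpp:304 case: state level below the rule's min_level -> layer excluded
    if min_level is not None and state_level < min_level:
        return False
    # net.cpp:314 case: state level above the rule's max_level -> layer excluded
    if max_level is not None and state_level > max_level:
        return False
    return True
```

Applied to the two nets in the log: at level 0, `Level1Only` (min/max 1) and `Level>=1` are dropped; at level 1, `Level0Only` (min/max 0) is dropped; `NoLevel`, which sets no bounds, is always kept.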
I0122 21:16:09.063402 24972 net.cpp:51] Initializing net from parameters: name: "testnet" force_backward: true state { phase: TRAIN level: 0 } layer { name: "data" type: "DummyData" top: "data" top: "label" dummy_data_param { data_filler { type: "gaussian" std: 1 } data_filler { type: "constant" } num: 5 num: 5 channels: 2 channels: 1 height: 3 height: 1 width: 4 width: 1 } } layer { name: "conv" type: "Convolution" bottom: "data" top: "conv" param { decay_mult: 1 } param { decay_mult: 0 } convolution_param { num_output: 11 pad: 3 kernel_size: 2 weight_filler { type: "gaussian" std: 1 } bias_filler { type: "constant" value: 2 } } } layer { name: "ip" type: "InnerProduct" bottom: "conv" top: "ip_blob" inner_product_param { num_output: 13 weight_filler { type: "gaussian" std: 2.5 } bias_filler { type: "constant" value: -3 } } } layer { name: "loss" type: "SoftmaxWithLoss" bottom: "ip_blob" bottom: "label" top: "loss" } I0122 21:16:09.063402 24972 layer_factory.hpp:77] Creating layer data I0122 21:16:09.063402 24972 net.cpp:84] Creating Layer data I0122 21:16:09.063402 24972 net.cpp:380] data -> data I0122 21:16:09.063402 24972 net.cpp:380] data -> label I0122 21:16:09.063402 24972 net.cpp:122] Setting up data I0122 21:16:09.063402 24972 net.cpp:129] Top shape: 5 2 3 4 (120) I0122 21:16:09.063402 24972 net.cpp:129] Top shape: 5 1 1 1 (5) I0122 21:16:09.063402 24972 net.cpp:137] Memory required for data: 500 I0122 21:16:09.063402 24972 layer_factory.hpp:77] Creating layer conv I0122 21:16:09.063402 24972 net.cpp:84] Creating Layer conv I0122 21:16:09.063402 24972 net.cpp:406] conv <- data I0122 21:16:09.064405 24972 net.cpp:380] conv -> conv I0122 21:16:09.064405 24972 common.cpp:36] System entropy source not available, using fallback algorithm to generate seed instead. 
I0122 21:16:09.064405 24972 net.cpp:122] Setting up conv I0122 21:16:09.064405 24972 net.cpp:129] Top shape: 5 11 8 9 (3960) I0122 21:16:09.064405 24972 net.cpp:137] Memory required for data: 16340 I0122 21:16:09.064405 24972 layer_factory.hpp:77] Creating layer ip I0122 21:16:09.064405 24972 net.cpp:84] Creating Layer ip I0122 21:16:09.064405 24972 net.cpp:406] ip <- conv I0122 21:16:09.064405 24972 net.cpp:380] ip -> ip_blob I0122 21:16:09.064405 24972 net.cpp:122] Setting up ip I0122 21:16:09.064405 24972 net.cpp:129] Top shape: 5 13 (65) I0122 21:16:09.064405 24972 net.cpp:137] Memory required for data: 16600 I0122 21:16:09.064405 24972 layer_factory.hpp:77] Creating layer loss I0122 21:16:09.064405 24972 net.cpp:84] Creating Layer loss I0122 21:16:09.064405 24972 net.cpp:406] loss <- ip_blob I0122 21:16:09.064405 24972 net.cpp:406] loss <- label I0122 21:16:09.064405 24972 net.cpp:380] loss -> loss I0122 21:16:09.064405 24972 layer_factory.hpp:77] Creating layer loss I0122 21:16:09.064405 24972 net.cpp:122] Setting up loss I0122 21:16:09.064405 24972 net.cpp:129] Top shape: (1) I0122 21:16:09.064405 24972 net.cpp:132] with loss weight 1 I0122 21:16:09.064405 24972 net.cpp:137] Memory required for data: 16604 I0122 21:16:09.064405 24972 net.cpp:198] loss needs backward computation. I0122 21:16:09.064405 24972 net.cpp:198] ip needs backward computation. I0122 21:16:09.064405 24972 net.cpp:198] conv needs backward computation. I0122 21:16:09.064405 24972 net.cpp:200] data does not need backward computation. I0122 21:16:09.064405 24972 net.cpp:242] This network produces output loss I0122 21:16:09.064405 24972 net.cpp:255] Network initialization done. 
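The "Top shape" and "Memory required for data" numbers in this testnet init follow directly from the standard convolution output-size formula and 4 bytes per float element. A quick check of the log's arithmetic (helper names are ours, for illustration):

```python
def conv_out(in_size, kernel, pad, stride=1):
    # Caffe's convolution output-size formula: (in + 2*pad - kernel) / stride + 1
    return (in_size + 2 * pad - kernel) // stride + 1

def nelems(*dims):
    n = 1
    for d in dims:
        n *= d
    return n

BYTES = 4  # float32
mem = BYTES * (nelems(5, 2, 3, 4) + nelems(5, 1, 1, 1))  # data + label tops: 500
h = conv_out(3, kernel=2, pad=3)    # input height 3 -> 8
w = conv_out(4, kernel=2, pad=3)    # input width 4 -> 9
mem += BYTES * nelems(5, 11, h, w)  # conv top "5 11 8 9": running total 16340
mem += BYTES * nelems(5, 13)        # ip top "5 13": running total 16600
mem += BYTES * 1                    # scalar loss top: final total 16604
```

This reproduces every running total the log reports (500, 16340, 16600, 16604), confirming that the counter is simply the sum of all top-blob sizes.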
I0122 21:16:09.067404 24972 net.cpp:51] Initializing net from parameters: name: "testnet" force_backward: true state { phase: TRAIN level: 0 } layer { name: "data" type: "DummyData" top: "data" top: "label" dummy_data_param { data_filler { type: "gaussian" std: 1 } data_filler { type: "constant" } num: 5 num: 5 channels: 2 channels: 1 height: 3 height: 1 width: 4 width: 1 } } layer { name: "conv" type: "Convolution" bottom: "data" top: "conv" param { decay_mult: 1 } param { decay_mult: 0 } convolution_param { num_output: 11 pad: 3 kernel_size: 2 weight_filler { type: "gaussian" std: 1 } bias_filler { type: "constant" value: 2 } } } layer { name: "ip" type: "InnerProduct" bottom: "conv" top: "ip_blob" inner_product_param { num_output: 13 weight_filler { type: "gaussian" std: 2.5 } bias_filler { type: "constant" value: -3 } } } layer { name: "loss" type: "SoftmaxWithLoss" bottom: "ip_blob" bottom: "label" top: "loss" } I0122 21:16:09.067404 24972 layer_factory.hpp:77] Creating layer data I0122 21:16:09.067404 24972 net.cpp:84] Creating Layer data I0122 21:16:09.067404 24972 net.cpp:380] data -> data I0122 21:16:09.067404 24972 net.cpp:380] data -> label I0122 21:16:09.067404 24972 net.cpp:122] Setting up data I0122 21:16:09.067404 24972 net.cpp:129] Top shape: 5 2 3 4 (120) I0122 21:16:09.067404 24972 net.cpp:129] Top shape: 5 1 1 1 (5) I0122 21:16:09.067404 24972 net.cpp:137] Memory required for data: 500 I0122 21:16:09.067404 24972 layer_factory.hpp:77] Creating layer conv I0122 21:16:09.067404 24972 net.cpp:84] Creating Layer conv I0122 21:16:09.067404 24972 net.cpp:406] conv <- data I0122 21:16:09.067404 24972 net.cpp:380] conv -> conv I0122 21:16:09.067404 24972 net.cpp:122] Setting up conv I0122 21:16:09.067404 24972 net.cpp:129] Top shape: 5 11 8 9 (3960) I0122 21:16:09.067404 24972 net.cpp:137] Memory required for data: 16340 I0122 21:16:09.067404 24972 layer_factory.hpp:77] Creating layer ip I0122 21:16:09.067404 24972 net.cpp:84] Creating Layer ip I0122 
21:16:09.067404 24972 net.cpp:406] ip <- conv I0122 21:16:09.067404 24972 net.cpp:380] ip -> ip_blob I0122 21:16:09.067404 24972 net.cpp:122] Setting up ip I0122 21:16:09.067404 24972 net.cpp:129] Top shape: 5 13 (65) I0122 21:16:09.067404 24972 net.cpp:137] Memory required for data: 16600 I0122 21:16:09.067404 24972 layer_factory.hpp:77] Creating layer loss I0122 21:16:09.067404 24972 net.cpp:84] Creating Layer loss I0122 21:16:09.067404 24972 net.cpp:406] loss <- ip_blob I0122 21:16:09.067404 24972 net.cpp:406] loss <- label I0122 21:16:09.067404 24972 net.cpp:380] loss -> loss I0122 21:16:09.067404 24972 layer_factory.hpp:77] Creating layer loss I0122 21:16:09.067404 24972 net.cpp:122] Setting up loss I0122 21:16:09.067404 24972 net.cpp:129] Top shape: (1) I0122 21:16:09.067404 24972 net.cpp:132] with loss weight 1 I0122 21:16:09.067404 24972 net.cpp:137] Memory required for data: 16604 I0122 21:16:09.067404 24972 net.cpp:198] loss needs backward computation. I0122 21:16:09.067404 24972 net.cpp:198] ip needs backward computation. I0122 21:16:09.067404 24972 net.cpp:198] conv needs backward computation. I0122 21:16:09.067404 24972 net.cpp:200] data does not need backward computation. I0122 21:16:09.067404 24972 net.cpp:242] This network produces output loss I0122 21:16:09.067404 24972 net.cpp:255] Network initialization done. 
I0122 21:16:09.069402 24972 net.cpp:51] Initializing net from parameters: name: "testnet" force_backward: true state { phase: TRAIN level: 0 } layer { name: "data" type: "DummyData" top: "data" top: "label" dummy_data_param { data_filler { type: "gaussian" std: 1 } data_filler { type: "constant" } num: 5 num: 5 channels: 2 channels: 1 height: 3 height: 1 width: 4 width: 1 } } layer { name: "conv" type: "Convolution" bottom: "data" top: "conv" param { decay_mult: 1 } param { decay_mult: 0 } convolution_param { num_output: 11 pad: 3 kernel_size: 2 weight_filler { type: "gaussian" std: 1 } bias_filler { type: "constant" value: 2 } } } layer { name: "ip" type: "InnerProduct" bottom: "conv" top: "ip_blob" inner_product_param { num_output: 13 weight_filler { type: "gaussian" std: 2.5 } bias_filler { type: "constant" value: -3 } } } layer { name: "loss" type: "SoftmaxWithLoss" bottom: "ip_blob" bottom: "label" top: "loss" } I0122 21:16:09.069402 24972 layer_factory.hpp:77] Creating layer data I0122 21:16:09.069402 24972 net.cpp:84] Creating Layer data I0122 21:16:09.069402 24972 net.cpp:380] data -> data I0122 21:16:09.069402 24972 net.cpp:380] data -> label I0122 21:16:09.069402 24972 net.cpp:122] Setting up data I0122 21:16:09.069402 24972 net.cpp:129] Top shape: 5 2 3 4 (120) I0122 21:16:09.069402 24972 net.cpp:129] Top shape: 5 1 1 1 (5) I0122 21:16:09.069402 24972 net.cpp:137] Memory required for data: 500 I0122 21:16:09.069402 24972 layer_factory.hpp:77] Creating layer conv I0122 21:16:09.069402 24972 net.cpp:84] Creating Layer conv I0122 21:16:09.069402 24972 net.cpp:406] conv <- data I0122 21:16:09.069402 24972 net.cpp:380] conv -> conv I0122 21:16:09.069402 24972 net.cpp:122] Setting up conv I0122 21:16:09.069402 24972 net.cpp:129] Top shape: 5 11 8 9 (3960) I0122 21:16:09.069402 24972 net.cpp:137] Memory required for data: 16340 I0122 21:16:09.069402 24972 layer_factory.hpp:77] Creating layer ip I0122 21:16:09.069402 24972 net.cpp:84] Creating Layer ip I0122 
21:16:09.069402 24972 net.cpp:406] ip <- conv I0122 21:16:09.069402 24972 net.cpp:380] ip -> ip_blob I0122 21:16:09.069402 24972 net.cpp:122] Setting up ip I0122 21:16:09.069402 24972 net.cpp:129] Top shape: 5 13 (65) I0122 21:16:09.069402 24972 net.cpp:137] Memory required for data: 16600 I0122 21:16:09.069402 24972 layer_factory.hpp:77] Creating layer loss I0122 21:16:09.069402 24972 net.cpp:84] Creating Layer loss I0122 21:16:09.069402 24972 net.cpp:406] loss <- ip_blob I0122 21:16:09.069402 24972 net.cpp:406] loss <- label I0122 21:16:09.069402 24972 net.cpp:380] loss -> loss I0122 21:16:09.069402 24972 layer_factory.hpp:77] Creating layer loss I0122 21:16:09.069402 24972 net.cpp:122] Setting up loss I0122 21:16:09.069402 24972 net.cpp:129] Top shape: (1) I0122 21:16:09.069402 24972 net.cpp:132] with loss weight 1 I0122 21:16:09.069402 24972 net.cpp:137] Memory required for data: 16604 I0122 21:16:09.069402 24972 net.cpp:198] loss needs backward computation. I0122 21:16:09.069402 24972 net.cpp:198] ip needs backward computation. I0122 21:16:09.069402 24972 net.cpp:198] conv needs backward computation. I0122 21:16:09.069402 24972 net.cpp:200] data does not need backward computation. I0122 21:16:09.069402 24972 net.cpp:242] This network produces output loss I0122 21:16:09.069402 24972 net.cpp:255] Network initialization done. 
I0122 21:16:09.071403 24972 net.cpp:51] Initializing net from parameters: name: "testnet" force_backward: true state { phase: TRAIN level: 0 } layer { name: "data" type: "DummyData" top: "data" top: "label" dummy_data_param { data_filler { type: "gaussian" std: 1 } data_filler { type: "constant" } num: 5 num: 5 channels: 2 channels: 1 height: 3 height: 1 width: 4 width: 1 } } layer { name: "conv" type: "Convolution" bottom: "data" top: "conv" param { decay_mult: 1 } param { decay_mult: 0 } convolution_param { num_output: 11 pad: 3 kernel_size: 2 weight_filler { type: "gaussian" std: 1 } bias_filler { type: "constant" value: 2 } } } layer { name: "ip" type: "InnerProduct" bottom: "conv" top: "ip_blob" inner_product_param { num_output: 13 weight_filler { type: "gaussian" std: 2.5 } bias_filler { type: "constant" value: -3 } } } layer { name: "loss" type: "SoftmaxWithLoss" bottom: "ip_blob" bottom: "label" top: "loss" } I0122 21:16:09.071403 24972 layer_factory.hpp:77] Creating layer data I0122 21:16:09.071403 24972 net.cpp:84] Creating Layer data I0122 21:16:09.071403 24972 net.cpp:380] data -> data I0122 21:16:09.071403 24972 net.cpp:380] data -> label I0122 21:16:09.071403 24972 net.cpp:122] Setting up data I0122 21:16:09.071403 24972 net.cpp:129] Top shape: 5 2 3 4 (120) I0122 21:16:09.071403 24972 net.cpp:129] Top shape: 5 1 1 1 (5) I0122 21:16:09.071403 24972 net.cpp:137] Memory required for data: 500 I0122 21:16:09.071403 24972 layer_factory.hpp:77] Creating layer conv I0122 21:16:09.071403 24972 net.cpp:84] Creating Layer conv I0122 21:16:09.071403 24972 net.cpp:406] conv <- data I0122 21:16:09.071403 24972 net.cpp:380] conv -> conv I0122 21:16:09.071403 24972 net.cpp:122] Setting up conv I0122 21:16:09.071403 24972 net.cpp:129] Top shape: 5 11 8 9 (3960) I0122 21:16:09.071403 24972 net.cpp:137] Memory required for data: 16340 I0122 21:16:09.071403 24972 layer_factory.hpp:77] Creating layer ip I0122 21:16:09.071403 24972 net.cpp:84] Creating Layer ip I0122 
21:16:09.071403 24972 net.cpp:406] ip <- conv I0122 21:16:09.071403 24972 net.cpp:380] ip -> ip_blob I0122 21:16:09.071403 24972 net.cpp:122] Setting up ip I0122 21:16:09.071403 24972 net.cpp:129] Top shape: 5 13 (65) I0122 21:16:09.071403 24972 net.cpp:137] Memory required for data: 16600 I0122 21:16:09.071403 24972 layer_factory.hpp:77] Creating layer loss I0122 21:16:09.071403 24972 net.cpp:84] Creating Layer loss I0122 21:16:09.071403 24972 net.cpp:406] loss <- ip_blob I0122 21:16:09.071403 24972 net.cpp:406] loss <- label I0122 21:16:09.071403 24972 net.cpp:380] loss -> loss I0122 21:16:09.071403 24972 layer_factory.hpp:77] Creating layer loss I0122 21:16:09.071403 24972 net.cpp:122] Setting up loss I0122 21:16:09.071403 24972 net.cpp:129] Top shape: (1) I0122 21:16:09.072402 24972 net.cpp:132] with loss weight 1 I0122 21:16:09.072402 24972 net.cpp:137] Memory required for data: 16604 I0122 21:16:09.072402 24972 net.cpp:198] loss needs backward computation. I0122 21:16:09.072402 24972 net.cpp:198] ip needs backward computation. I0122 21:16:09.072402 24972 net.cpp:198] conv needs backward computation. I0122 21:16:09.072402 24972 net.cpp:200] data does not need backward computation. I0122 21:16:09.072402 24972 net.cpp:242] This network produces output loss I0122 21:16:09.072402 24972 net.cpp:255] Network initialization done. 
I0122 21:16:09.074409 24972 net.cpp:51] Initializing net from parameters: name: "testnet" force_backward: true state { phase: TRAIN level: 0 } layer { name: "data" type: "DummyData" top: "data" top: "label" dummy_data_param { data_filler { type: "gaussian" std: 1 } data_filler { type: "constant" } num: 5 num: 5 channels: 2 channels: 1 height: 3 height: 1 width: 4 width: 1 } } layer { name: "conv" type: "Convolution" bottom: "data" top: "conv" param { decay_mult: 1 } param { decay_mult: 0 } convolution_param { num_output: 11 pad: 3 kernel_size: 2 weight_filler { type: "gaussian" std: 1 } bias_filler { type: "constant" value: 2 } } } layer { name: "ip" type: "InnerProduct" bottom: "conv" top: "ip_blob" inner_product_param { num_output: 13 weight_filler { type: "gaussian" std: 2.5 } bias_filler { type: "constant" value: -3 } } } layer { name: "loss" type: "SoftmaxWithLoss" bottom: "ip_blob" bottom: "label" top: "loss" } I0122 21:16:09.074409 24972 layer_factory.hpp:77] Creating layer data I0122 21:16:09.074409 24972 net.cpp:84] Creating Layer data I0122 21:16:09.074409 24972 net.cpp:380] data -> data I0122 21:16:09.074409 24972 net.cpp:380] data -> label I0122 21:16:09.074409 24972 net.cpp:122] Setting up data I0122 21:16:09.074409 24972 net.cpp:129] Top shape: 5 2 3 4 (120) I0122 21:16:09.074409 24972 net.cpp:129] Top shape: 5 1 1 1 (5) I0122 21:16:09.074409 24972 net.cpp:137] Memory required for data: 500 I0122 21:16:09.074409 24972 layer_factory.hpp:77] Creating layer conv I0122 21:16:09.074409 24972 net.cpp:84] Creating Layer conv I0122 21:16:09.074409 24972 net.cpp:406] conv <- data I0122 21:16:09.074409 24972 net.cpp:380] conv -> conv I0122 21:16:09.074409 24972 net.cpp:122] Setting up conv I0122 21:16:09.074409 24972 net.cpp:129] Top shape: 5 11 8 9 (3960) I0122 21:16:09.074409 24972 net.cpp:137] Memory required for data: 16340 I0122 21:16:09.074409 24972 layer_factory.hpp:77] Creating layer ip I0122 21:16:09.074409 24972 net.cpp:84] Creating Layer ip I0122 
21:16:09.074409 24972 net.cpp:406] ip <- conv I0122 21:16:09.074409 24972 net.cpp:380] ip -> ip_blob I0122 21:16:09.074409 24972 net.cpp:122] Setting up ip I0122 21:16:09.074409 24972 net.cpp:129] Top shape: 5 13 (65) I0122 21:16:09.074409 24972 net.cpp:137] Memory required for data: 16600 I0122 21:16:09.074409 24972 layer_factory.hpp:77] Creating layer loss I0122 21:16:09.074409 24972 net.cpp:84] Creating Layer loss I0122 21:16:09.074409 24972 net.cpp:406] loss <- ip_blob I0122 21:16:09.074409 24972 net.cpp:406] loss <- label I0122 21:16:09.074409 24972 net.cpp:380] loss -> loss I0122 21:16:09.074409 24972 layer_factory.hpp:77] Creating layer loss I0122 21:16:09.074409 24972 net.cpp:122] Setting up loss I0122 21:16:09.074409 24972 net.cpp:129] Top shape: (1) I0122 21:16:09.074409 24972 net.cpp:132] with loss weight 1 I0122 21:16:09.074409 24972 net.cpp:137] Memory required for data: 16604 I0122 21:16:09.074409 24972 net.cpp:198] loss needs backward computation. I0122 21:16:09.074409 24972 net.cpp:198] ip needs backward computation. I0122 21:16:09.074409 24972 net.cpp:198] conv needs backward computation. I0122 21:16:09.074409 24972 net.cpp:200] data does not need backward computation. I0122 21:16:09.074409 24972 net.cpp:242] This network produces output loss I0122 21:16:09.074409 24972 net.cpp:255] Network initialization done. 
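The "Top shape" and "Memory required for data" lines in the dump above follow from plain shape arithmetic. A minimal sketch of that arithmetic (the helper name `conv_out_dim` is illustrative, not part of pycaffe; pad=3, kernel_size=2, stride=1 are taken from the logged net, and blobs are assumed float32 at 4 bytes per element):

```python
# Shape arithmetic behind the "Top shape" / "Memory required" lines above.

def conv_out_dim(in_dim, kernel, pad, stride=1):
    """Convolution output size: (in + 2*pad - kernel) // stride + 1."""
    return (in_dim + 2 * pad - kernel) // stride + 1

# data top: 5 x 2 x 3 x 4 (120), label top: 5 x 1 x 1 x 1 (5)
data_count, label_count = 5 * 2 * 3 * 4, 5 * 1 * 1 * 1
h = conv_out_dim(3, kernel=2, pad=3)   # height 3 -> 8
w = conv_out_dim(4, kernel=2, pad=3)   # width 4 -> 9
conv_count = 5 * 11 * h * w            # "Top shape: 5 11 8 9 (3960)"
ip_count, loss_count = 5 * 13, 1       # "5 13 (65)" and "(1)"

# Running float32 byte totals, matching the log: 500, 16340, 16600, 16604
totals, running = [], 0
for count in (data_count + label_count, conv_count, ip_count, loss_count):
    running += 4 * count
    totals.append(running)
print(totals)  # [500, 16340, 16600, 16604]
```

The running total explains why the last increment is only 4 bytes: the scalar loss blob adds a single float32 element.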
W0122 21:16:09.081403 24972 _caffe.cpp:141] DEPRECATION WARNING - deprecated use of Python interface W0122 21:16:09.081403 24972 _caffe.cpp:142] Use this instead (with the named "weights" parameter): W0122 21:16:09.081403 24972 _caffe.cpp:144] Net('C:\Users\vlj\AppData\Local\Temp\tmpnym432ck', 0, weights='C:\Users\vlj\AppData\Local\Temp\tmp93bss3xr') I0122 21:16:09.082403 24972 net.cpp:51] Initializing net from parameters: name: "testnet" force_backward: true state { phase: TRAIN level: 0 } layer { name: "data" type: "DummyData" top: "data" top: "label" dummy_data_param { data_filler { type: "gaussian" std: 1 } data_filler { type: "constant" } num: 5 num: 5 channels: 2 channels: 1 height: 3 height: 1 width: 4 width: 1 } } layer { name: "conv" type: "Convolution" bottom: "data" top: "conv" param { decay_mult: 1 } param { decay_mult: 0 } convolution_param { num_output: 11 pad: 3 kernel_size: 2 weight_filler { type: "gaussian" std: 1 } bias_filler { type: "constant" value: 2 } } } layer { name: "ip" type: "InnerProduct" bottom: "conv" top: "ip_blob" inner_product_param { num_output: 13 weight_filler { type: "gaussian" std: 2.5 } bias_filler { type: "constant" value: -3 } } } layer { name: "loss" type: "SoftmaxWithLoss" bottom: "ip_blob" bottom: "label" top: "loss" } I0122 21:16:09.082403 24972 layer_factory.hpp:77] Creating layer data I0122 21:16:09.082403 24972 net.cpp:84] Creating Layer data I0122 21:16:09.082403 24972 net.cpp:380] data -> data I0122 21:16:09.082403 24972 n*** Check failure stack trace: ***
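The DEPRECATION WARNING above is pycaffe's nudge away from passing the caffemodel as a second positional argument. A minimal sketch of the two call styles, assuming a working pycaffe build; the file names are placeholders and the `load_net` helper is illustrative, not part of the library:

```python
# Sketch of the Net() call styles behind the DEPRECATION WARNING above.
def load_net(prototxt, caffemodel):
    import caffe  # deferred so the sketch imports without pycaffe installed
    # Deprecated form (triggers the warning in _caffe.cpp):
    #   caffe.Net(prototxt, caffemodel, caffe.TRAIN)
    # Preferred form, mirroring the warning's suggested call
    # Net('...', 0, weights='...'), where 0 is the TRAIN phase:
    return caffe.Net(prototxt, caffe.TRAIN, weights=caffemodel)
```

In the suggested call printed by the warning, the second argument is the phase enum (0 = TRAIN, 1 = TEST) and the weights file moves to the named `weights` keyword.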