
How can I train the net? Caffe crashes when I train it: I get the error below and don't know how to trace the training. It looks as if an image cannot be found, even though the images were being found in the initial stage.

~/caffe# ./build/tools/caffe train -solver models/caltech101/caltech101_solver.prototxt -weights models/bvlc_reference_caffenet/bvlc_reference_caffenet.caffemodel 

snapshot_prefix: "models/caltech101/caltech101" 
solver_mode: CPU 
net: "models/caltech101/caltech101_train.prototxt" 
I0702 16:19:43.065757 20618 solver.cpp:91] Creating training net from net file: models/caltech101/caltech101_train.prototxt 
I0702 16:19:43.066241 20618 net.cpp:313] The NetState phase (0) differed from the phase (1) specified by a rule in layer data 
I0702 16:19:43.066275 20618 net.cpp:313] The NetState phase (0) differed from the phase (1) specified by a rule in layer accuracy 
I0702 16:19:43.066431 20618 net.cpp:49] Initializing net from parameters: 
name: "CaffeNet" 
state { 
    phase: TRAIN 
} 
layer { 
    name: "data" 
    type: "ImageData" 
    top: "data" 
    top: "label" 
    include { 
    phase: TRAIN 
    } 
    transform_param { 
    mirror: true 
    crop_size: 227 
    mean_file: "data/ilsvrc12/imagenet_mean.binaryproto" 
    } 
    image_data_param { 
    source: "data/caltech101/caltech101_train.txt" 
    batch_size: 50 
    new_height: 256 
    new_width: 256 
    } 
} 
layer { 
    name: "conv1" 
    type: "Convolution" 
    bottom: "data" 
    top: "conv1" 
    param { 
    lr_mult: 1 
    decay_mult: 1 
    } 
    param { 
    lr_mult: 2 
    decay_mult: 0 
    } 
    convolution_param { 
    num_output: 96 
    kernel_size: 11 
    stride: 4 
    weight_filler { 
     type: "gaussian" 
     std: 0.01 
    } 
    bias_filler { 
     type: "constant" 
     value: 0 
    } 
    } 
} 
layer { 
    name: "relu1" 
    type: "ReLU" 
    bottom: "conv1" 
    top: "conv1" 
} 
layer { 
    name: "pool1" 
    type: "Pooling" 
    bottom: "conv1" 
    top: "pool1" 
    pooling_param { 
    pool: MAX 
    kernel_size: 3 
    stride: 2 
    } 
} 
layer { 
    name: "norm1" 
    type: "LRN" 
    bottom: "pool1" 
    top: "norm1" 
    lrn_param { 
    local_size: 5 
    alpha: 0.0001 
    beta: 0.75 
    } 
} 
layer { 
    name: "conv2" 
    type: "Convolution" 
    bottom: "norm1" 
    top: "conv2" 
    param { 
    lr_mult: 1 
    decay_mult: 1 
    } 
    param { 
    lr_mult: 2 
    decay_mult: 0 
    } 
    convolution_param { 
    num_output: 256 
    pad: 2 
    kernel_size: 5 
    group: 2 
    weight_filler { 
     type: "gaussian" 
     std: 0.01 
    } 
    bias_filler { 
     type: "constant" 
     value: 1 
    } 
    } 
} 
layer { 
    name: "relu2" 
    type: "ReLU" 
    bottom: "conv2" 
    top: "conv2" 
} 
layer { 
    name: "pool2" 
    type: "Pooling" 
    bottom: "conv2" 
    top: "pool2" 
    pooling_param { 
    pool: MAX 
    kernel_size: 3 
    stride: 2 
    } 
} 
layer { 
    name: "norm2" 
    type: "LRN" 
    bottom: "pool2" 
    top: "norm2" 
    lrn_param { 
    local_size: 5 
    alpha: 0.0001 
    beta: 0.75 
    } 
} 
layer { 
    name: "conv3" 
    type: "Convolution" 
    bottom: "norm2" 
    top: "conv3" 
    param { 
    lr_mult: 1 
    decay_mult: 1 
    } 
    param { 
    lr_mult: 2 
    decay_mult: 0 
    } 
    convolution_param { 
    num_output: 384 
    pad: 1 
    kernel_size: 3 
    weight_filler { 
     type: "gaussian" 
     std: 0.01 
    } 
    bias_filler { 
     type: "constant" 
     value: 0 
    } 
    } 
} 
layer { 
    name: "relu3" 
    type: "ReLU" 
    bottom: "conv3" 
    top: "conv3" 
} 
layer { 
    name: "conv4" 
    type: "Convolution" 
    bottom: "conv3" 
    top: "conv4" 
    param { 
    lr_mult: 1 
    decay_mult: 1 
    } 
    param { 
    lr_mult: 2 
    decay_mult: 0 
    } 
    convolution_param { 
    num_output: 384 
    pad: 1 
    kernel_size: 3 
    group: 2 
    weight_filler { 
     type: "gaussian" 
     std: 0.01 
    } 
    bias_filler { 
     type: "constant" 
     value: 1 
    } 
    } 
} 
layer { 
    name: "relu4" 
    type: "ReLU" 
    bottom: "conv4" 
    top: "conv4" 
} 
layer { 
    name: "conv5" 
    type: "Convolution" 
    bottom: "conv4" 
    top: "conv5" 
    param { 
    lr_mult: 1 
    decay_mult: 1 
    } 
    param { 
    lr_mult: 2 
    decay_mult: 0 
    } 
    convolution_param { 
    num_output: 256 
    pad: 1 
    kernel_size: 3 
    group: 2 
    weight_filler { 
     type: "gaussian" 
     std: 0.01 
    } 
    bias_filler { 
     type: "constant" 
     value: 1 
    } 
    } 
} 
layer { 
    name: "relu5" 
    type: "ReLU" 
    bottom: "conv5" 
    top: "conv5" 
} 
layer { 
    name: "pool5" 
    type: "Pooling" 
    bottom: "conv5" 
    top: "pool5" 
    pooling_param { 
    pool: MAX 
    kernel_size: 3 
    stride: 2 
    } 
} 
layer { 
    name: "fc6" 
    type: "InnerProduct" 
    bottom: "pool5" 
    top: "fc6" 
    param { 
    lr_mult: 1 
    decay_mult: 1 
    } 
    param { 
    lr_mult: 2 
    decay_mult: 0 
    } 
    inner_product_param { 
    num_output: 4096 
    weight_filler { 
     type: "gaussian" 
     std: 0.005 
    } 
    bias_filler { 
     type: "constant" 
     value: 1 
    } 
    } 
} 
layer { 
    name: "relu6" 
    type: "ReLU" 
    bottom: "fc6" 
    top: "fc6" 
} 
layer { 
    name: "drop6" 
    type: "Dropout" 
    bottom: "fc6" 
    top: "fc6" 
    dropout_param { 
    dropout_ratio: 0.5 
    } 
} 
layer { 
    name: "fc7" 
    type: "InnerProduct" 
    bottom: "fc6" 
    top: "fc7" 
    param { 
    lr_mult: 1 
    decay_mult: 1 
    } 
    param { 
    lr_mult: 2 
    decay_mult: 0 
    } 
    inner_product_param { 
    num_output: 4096 
    weight_filler { 
     type: "gaussian" 
     std: 0.005 
    } 
    bias_filler { 
     type: "constant" 
     value: 1 
    } 
    } 
} 
layer { 
    name: "relu7" 
    type: "ReLU" 
    bottom: "fc7" 
    top: "fc7" 
} 
layer { 
    name: "drop7" 
    type: "Dropout" 
    bottom: "fc7" 
    top: "fc7" 
    dropout_param { 
    dropout_ratio: 0.5 
    } 
} 
layer { 
    name: "fc8_caltech101" 
    type: "InnerProduct" 
    bottom: "fc7" 
    top: "fc8_caltech101" 
    param { 
    lr_mult: 10 
    decay_mult: 1 
    } 
    param { 
    lr_mult: 20 
    decay_mult: 0 
    } 
    inner_product_param { 
    num_output: 20 
    weight_filler { 
     type: "gaussian" 
     std: 0.01 
    } 
    bias_filler { 
     type: "constant" 
     value: 0 
    } 
    } 
} 
layer { 
    name: "loss" 
    type: "SoftmaxWithLoss" 
    bottom: "fc8_caltech101" 
    bottom: "label" 
    top: "loss" 
} 
I0702 16:19:43.067663 20618 layer_factory.hpp:77] Creating layer data 
I0702 16:19:43.067703 20618 net.cpp:91] Creating Layer data 
I0702 16:19:43.067718 20618 net.cpp:399] data -> data 
I0702 16:19:43.067746 20618 net.cpp:399] data -> label 
I0702 16:19:43.067770 20618 data_transformer.cpp:25] Loading mean file from: data/ilsvrc12/imagenet_mean.binaryproto 
I0702 16:19:43.069584 20618 image_data_layer.cpp:38] Opening file data/caltech101/caltech101_train.txt 
I0702 16:19:43.069648 20618 image_data_layer.cpp:58] A total of 84 images. 
I0702 16:19:43.071579 20618 image_data_layer.cpp:85] output data size: 50,3,227,227 
I0702 16:19:43.081862 20618 net.cpp:141] Setting up data 
I0702 16:19:43.081907 20618 net.cpp:148] Top shape: 50 3 227 227 (7729350) 
I0702 16:19:43.081920 20618 net.cpp:148] Top shape: 50 (50) 
I0702 16:19:43.081928 20618 net.cpp:156] Memory required for data: 30917600 
I0702 16:19:43.081943 20618 layer_factory.hpp:77] Creating layer conv1 
I0702 16:19:43.081975 20618 net.cpp:91] Creating Layer conv1 
I0702 16:19:43.081989 20618 net.cpp:425] conv1 <- data 
I0702 16:19:43.082007 20618 net.cpp:399] conv1 -> conv1 
I0702 16:19:43.083432 20618 net.cpp:141] Setting up conv1 
I0702 16:19:43.083454 20618 net.cpp:148] Top shape: 50 96 55 55 (14520000) 
I0702 16:19:43.083464 20618 net.cpp:156] Memory required for data: 88997600 
I0702 16:19:43.083483 20618 layer_factory.hpp:77] Creating layer relu1 
I0702 16:19:43.083499 20618 net.cpp:91] Creating Layer relu1 
I0702 16:19:43.083508 20618 net.cpp:425] relu1 <- conv1 
I0702 16:19:43.083519 20618 net.cpp:386] relu1 -> conv1 (in-place) 
I0702 16:19:43.083537 20618 net.cpp:141] Setting up relu1 
I0702 16:19:43.083549 20618 net.cpp:148] Top shape: 50 96 55 55 (14520000) 
I0702 16:19:43.083559 20618 net.cpp:156] Memory required for data: 147077600 
I0702 16:19:43.083566 20618 layer_factory.hpp:77] Creating layer pool1 
I0702 16:19:43.083578 20618 net.cpp:91] Creating Layer pool1 
I0702 16:19:43.083587 20618 net.cpp:425] pool1 <- conv1 
I0702 16:19:43.083598 20618 net.cpp:399] pool1 -> pool1 
I0702 16:19:43.083622 20618 net.cpp:141] Setting up pool1 
I0702 16:19:43.083636 20618 net.cpp:148] Top shape: 50 96 27 27 (3499200) 
I0702 16:19:43.083645 20618 net.cpp:156] Memory required for data: 161074400 
I0702 16:19:43.083654 20618 layer_factory.hpp:77] Creating layer norm1 
I0702 16:19:43.083668 20618 net.cpp:91] Creating Layer norm1 
I0702 16:19:43.083678 20618 net.cpp:425] norm1 <- pool1 
I0702 16:19:43.083703 20618 net.cpp:399] norm1 -> norm1 
I0702 16:19:43.083721 20618 net.cpp:141] Setting up norm1 
I0702 16:19:43.083734 20618 net.cpp:148] Top shape: 50 96 27 27 (3499200) 
I0702 16:19:43.083744 20618 net.cpp:156] Memory required for data: 175071200 
I0702 16:19:43.083752 20618 layer_factory.hpp:77] Creating layer conv2 
I0702 16:19:43.083768 20618 net.cpp:91] Creating Layer conv2 
I0702 16:19:43.083777 20618 net.cpp:425] conv2 <- norm1 
I0702 16:19:43.083789 20618 net.cpp:399] conv2 -> conv2 
I0702 16:19:43.093122 20618 net.cpp:141] Setting up conv2 
I0702 16:19:43.093155 20618 net.cpp:148] Top shape: 50 256 27 27 (9331200) 
I0702 16:19:43.093164 20618 net.cpp:156] Memory required for data: 212396000 
I0702 16:19:43.093183 20618 layer_factory.hpp:77] Creating layer relu2 
I0702 16:19:43.093199 20618 net.cpp:91] Creating Layer relu2 
I0702 16:19:43.093209 20618 net.cpp:425] relu2 <- conv2 
I0702 16:19:43.093222 20618 net.cpp:386] relu2 -> conv2 (in-place) 
I0702 16:19:43.093240 20618 net.cpp:141] Setting up relu2 
I0702 16:19:43.093255 20618 net.cpp:148] Top shape: 50 256 27 27 (9331200) 
I0702 16:19:43.093266 20618 net.cpp:156] Memory required for data: 249720800 
I0702 16:19:43.093274 20618 layer_factory.hpp:77] Creating layer pool2 
I0702 16:19:43.093287 20618 net.cpp:91] Creating Layer pool2 
I0702 16:19:43.093297 20618 net.cpp:425] pool2 <- conv2 
I0702 16:19:43.093308 20618 net.cpp:399] pool2 -> pool2 
I0702 16:19:43.093325 20618 net.cpp:141] Setting up pool2 
I0702 16:19:43.093338 20618 net.cpp:148] Top shape: 50 256 13 13 (2163200) 
I0702 16:19:43.093345 20618 net.cpp:156] Memory required for data: 258373600 
I0702 16:19:43.093354 20618 layer_factory.hpp:77] Creating layer norm2 
I0702 16:19:43.093370 20618 net.cpp:91] Creating Layer norm2 
I0702 16:19:43.093385 20618 net.cpp:425] norm2 <- pool2 
I0702 16:19:43.093397 20618 net.cpp:399] norm2 -> norm2 
I0702 16:19:43.093412 20618 net.cpp:141] Setting up norm2 
I0702 16:19:43.093425 20618 net.cpp:148] Top shape: 50 256 13 13 (2163200) 
I0702 16:19:43.093433 20618 net.cpp:156] Memory required for data: 267026400 
I0702 16:19:43.093442 20618 layer_factory.hpp:77] Creating layer conv3 
I0702 16:19:43.093458 20618 net.cpp:91] Creating Layer conv3 
I0702 16:19:43.093468 20618 net.cpp:425] conv3 <- norm2 
I0702 16:19:43.093480 20618 net.cpp:399] conv3 -> conv3 
I0702 16:19:43.119555 20618 net.cpp:141] Setting up conv3 
I0702 16:19:43.119588 20618 net.cpp:148] Top shape: 50 384 13 13 (3244800) 
I0702 16:19:43.119598 20618 net.cpp:156] Memory required for data: 280005600 
I0702 16:19:43.119616 20618 layer_factory.hpp:77] Creating layer relu3 
I0702 16:19:43.119632 20618 net.cpp:91] Creating Layer relu3 
I0702 16:19:43.119642 20618 net.cpp:425] relu3 <- conv3 
I0702 16:19:43.119655 20618 net.cpp:386] relu3 -> conv3 (in-place) 
I0702 16:19:43.119671 20618 net.cpp:141] Setting up relu3 
I0702 16:19:43.119683 20618 net.cpp:148] Top shape: 50 384 13 13 (3244800) 
I0702 16:19:43.119693 20618 net.cpp:156] Memory required for data: 292984800 
I0702 16:19:43.119701 20618 layer_factory.hpp:77] Creating layer conv4 
I0702 16:19:43.119719 20618 net.cpp:91] Creating Layer conv4 
I0702 16:19:43.119735 20618 net.cpp:425] conv4 <- conv3 
I0702 16:19:43.119750 20618 net.cpp:399] conv4 -> conv4 
I0702 16:19:43.139026 20618 net.cpp:141] Setting up conv4 
I0702 16:19:43.139058 20618 net.cpp:148] Top shape: 50 384 13 13 (3244800) 
I0702 16:19:43.139066 20618 net.cpp:156] Memory required for data: 305964000 
I0702 16:19:43.139080 20618 layer_factory.hpp:77] Creating layer relu4 
I0702 16:19:43.139094 20618 net.cpp:91] Creating Layer relu4 
I0702 16:19:43.139104 20618 net.cpp:425] relu4 <- conv4 
I0702 16:19:43.139117 20618 net.cpp:386] relu4 -> conv4 (in-place) 
I0702 16:19:43.139132 20618 net.cpp:141] Setting up relu4 
I0702 16:19:43.139152 20618 net.cpp:148] Top shape: 50 384 13 13 (3244800) 
I0702 16:19:43.139161 20618 net.cpp:156] Memory required for data: 318943200 
I0702 16:19:43.139170 20618 layer_factory.hpp:77] Creating layer conv5 
I0702 16:19:43.139188 20618 net.cpp:91] Creating Layer conv5 
I0702 16:19:43.139217 20618 net.cpp:425] conv5 <- conv4 
I0702 16:19:43.139231 20618 net.cpp:399] conv5 -> conv5 
I0702 16:19:43.152601 20618 net.cpp:141] Setting up conv5 
I0702 16:19:43.152634 20618 net.cpp:148] Top shape: 50 256 13 13 (2163200) 
I0702 16:19:43.152643 20618 net.cpp:156] Memory required for data: 327596000 
I0702 16:19:43.152662 20618 layer_factory.hpp:77] Creating layer relu5 
I0702 16:19:43.152678 20618 net.cpp:91] Creating Layer relu5 
I0702 16:19:43.152688 20618 net.cpp:425] relu5 <- conv5 
I0702 16:19:43.152701 20618 net.cpp:386] relu5 -> conv5 (in-place) 
I0702 16:19:43.152719 20618 net.cpp:141] Setting up relu5 
I0702 16:19:43.152730 20618 net.cpp:148] Top shape: 50 256 13 13 (2163200) 
I0702 16:19:43.152740 20618 net.cpp:156] Memory required for data: 336248800 
I0702 16:19:43.152750 20618 layer_factory.hpp:77] Creating layer pool5 
I0702 16:19:43.152761 20618 net.cpp:91] Creating Layer pool5 
I0702 16:19:43.152770 20618 net.cpp:425] pool5 <- conv5 
I0702 16:19:43.152782 20618 net.cpp:399] pool5 -> pool5 
I0702 16:19:43.152801 20618 net.cpp:141] Setting up pool5 
I0702 16:19:43.152817 20618 net.cpp:148] Top shape: 50 256 6 6 (460800) 
I0702 16:19:43.152827 20618 net.cpp:156] Memory required for data: 338092000 
I0702 16:19:43.152835 20618 layer_factory.hpp:77] Creating layer fc6 
I0702 16:19:43.152858 20618 net.cpp:91] Creating Layer fc6 
I0702 16:19:43.152869 20618 net.cpp:425] fc6 <- pool5 
I0702 16:19:43.152881 20618 net.cpp:399] fc6 -> fc6 
E0702 16:19:43.215560 20620 io.cpp:80] Could not open or find file 
F0702 16:19:43.215747 20620 image_data_layer.cpp:143] Check failed: cv_img.data Could not load 
*** Check failure stack trace: *** 
    @  0x7fb695883daa (unknown) 
    @  0x7fb695883ce4 (unknown) 
    @  0x7fb6958836e6 (unknown) 
    @  0x7fb695886687 (unknown) 
    @  0x7fb695d1f8ec caffe::ImageDataLayer<>::load_batch() 
    @  0x7fb695d2a048 caffe::BasePrefetchingDataLayer<>::InternalThreadEntry() 
    @  0x7fb693024a4a (unknown) 
    @  0x7fb6928dc182 start_thread 
    @  0x7fb694c6a47d (unknown) 
    @    (nil) (unknown) 
Aborted (core dumped) 

Thank you for your help.


Are the file names in 'data/caltech101/caltech101_train.txt' relative or absolute paths? In your case they must either be absolute or relative to '~/caffe' – Shai


Thank you for your reply. This is the content of the file: – user3549723

Answer


First, look at the file $CAFFEROOT/data/caltech101/caltech101_train.txt (see line 16 of caltech101_train.prototxt). Are the images listed with proper paths (as seen from $CAFFEROOT), and are they readable? To verify this, run 'ls -l' on those paths from the directory where you execute the build/tools/caffe command ($CAFFEROOT), copy-and-pasting the paths from the *_train.txt file. If you do not see proper read permissions, adjust the permissions or the paths accordingly.
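
For example, here is a minimal shell sketch of that check, assuming the list file uses the usual Caffe "path label" format, one entry per line, with paths relative to the directory caffe is launched from:

while read -r path label; do
    # -r tests that the file exists and is readable by the current user
    [ -r "$path" ] || echo "missing or unreadable: $path"
done < data/caltech101/caltech101_train.txt

Any line this prints is an entry the ImageData layer will eventually fail on, which matches the "Could not open or find file" error in the log above.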

Once the access problems are resolved: do the files contain content of an appropriate size? If not, adjust the input dimensions.
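
To inspect the source image sizes, one option is ImageMagick's identify, assuming it is installed (note that new_height/new_width in image_data_param already resize every image to 256x256, so this mainly catches files that cannot be decoded at all):

awk '{print $1}' data/caltech101/caltech101_train.txt | while read -r path; do
    # prints "filename: widthxheight"; falls through for undecodable files
    identify -format "%i: %wx%h\n" "$path" || echo "cannot decode: $path"
done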

Is the data in image_data format? If not, switch from image_data_param to data_param, as sketched below.
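
As a rough sketch of that switch, the data layer becomes a "Data" layer reading from a database; the LMDB path below is a placeholder you would first create yourself, e.g. with Caffe's convert_imageset tool:

layer {
    name: "data"
    type: "Data"
    top: "data"
    top: "label"
    include {
    phase: TRAIN
    }
    transform_param {
    mirror: true
    crop_size: 227
    mean_file: "data/ilsvrc12/imagenet_mean.binaryproto"
    }
    data_param {
    source: "data/caltech101/caltech101_train_lmdb"   # placeholder path
    batch_size: 50
    backend: LMDB
    }
}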
