I am trying to serve a retrained Inception model with TensorFlow Serving by following this guide (the model was retrained by following this guide: TensorFlow's Inception retraining tutorial). I modified retrain.py to export the model as follows:
... # Same as in the original script:
# Set up the pre-trained graph.
maybe_download_and_extract()
graph, bottleneck_tensor, jpeg_data_tensor, resized_image_tensor = (
    create_inception_graph())

... # Same as in the original script:
# Add the new layer that we'll be training.
(train_step, cross_entropy, bottleneck_input, ground_truth_input,
 final_tensor) = add_final_training_ops(len(image_lists.keys()),
                                        FLAGS.final_tensor_name,
                                        bottleneck_tensor)

... # Added at the end of the original script:
# Export model
with graph.as_default():
  export_path = sys.argv[-1]
  print('Exporting trained model to', export_path)
  saver = tf.train.Saver(sharded=True)
  model_exporter = exporter.Exporter(saver)
  signature = exporter.classification_signature(
      input_tensor=jpeg_data_tensor, scores_tensor=final_tensor)
  model_exporter.init(sess.graph.as_graph_def(),
                      default_graph_signature=signature)
  model_exporter.export(export_path, tf.constant(FLAGS.export_version), sess)
  print('Done exporting!')

if __name__ == '__main__':
  tf.app.run()
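(As an aside, here is a minimal sketch for sanity-checking the export before pointing the server at it. It assumes the Python loader in tensorflow.contrib.session_bundle is available in your TensorFlow build, and the directory path is the versioned export directory reported in the server log below.)

# Sketch only: load the exported bundle the way the server does and list a
# few node names to confirm the graph was written as expected.
from tensorflow.contrib.session_bundle import session_bundle

export_version_dir = '/tf_files/scope/export/00000001'  # versioned export dir
sess, meta_graph_def = session_bundle.load_session_bundle_from_path(export_version_dir)
for op in sess.graph.get_operations()[:10]:
  print(op.name)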
After exporting the model, I start the server:

/serving/bazel-bin/tensorflow_serving/example/inception_inference --port=9000 EXPORT_DIR &> inception_log &

The server log file (inception_log) contains:
I tensorflow_serving/core/basic_manager.cc:190] Using InlineExecutor for BasicManager.
I tensorflow_serving/example/inception_inference.cc:384] Waiting for models to be loaded...
I tensorflow_serving/sources/storage_path/file_system_storage_path_source.cc:147] File-system polling found servable version {name: default version: 1} at path /tf_files/scope/export/00000001
I external/org_tensorflow/tensorflow/contrib/session_bundle/session_bundle.cc:129] Attempting to load a SessionBundle from: /tf_files/scope/export/00000001
I tensorflow_serving/example/inception_inference.cc:384] Waiting for models to be loaded...
I tensorflow_serving/sources/storage_path/file_system_storage_path_source.cc:147] File-system polling found servable version {name: default version: 1} at path /tf_files/scope/export/00000001
I external/org_tensorflow/tensorflow/contrib/session_bundle/session_bundle.cc:106] Running restore op for SessionBundle
I external/org_tensorflow/tensorflow/contrib/session_bundle/session_bundle.cc:203] Done loading SessionBundle
I tensorflow_serving/example/inception_inference.cc:350] Running...
I tensorflow_serving/sources/storage_path/file_system_storage_path_source.cc:147] File-system polling found servable version {name: default version: 1} at path /tf_files/scope/export/00000001
I tensorflow_serving/sources/storage_path/file_system_storage_path_source.cc:147] File-system polling found servable version {name: default version: 1} at path /tf_files/scope/export/00000001
I tensorflow_serving/sources/storage_path/file_system_storage_path_source.cc:147] File-system polling found servable version {name: default version: 1} at path /tf_files/scope/export/00000001
...
Finally, when I run the client, I get the following error:
/serving/bazel-bin/tensorflow_serving/example/inception_client --server=localhost:9000 --image=TEST_IMG
D0805 09:10:46.208704633 200 ev_posix.c:101] Using polling engine: poll
Traceback (most recent call last):
File "/serving/bazel-bin/tensorflow_serving/example/inception_client.runfiles/tensorflow_serving/example/inception_client.py", line 53, in <module>
tf.app.run()
File "/serving/bazel-bin/tensorflow_serving/example/inception_client.runfiles/external/org_tensorflow/tensorflow/python/platform/app.py", line 30, in run
sys.exit(main(sys.argv))
File "/serving/bazel-bin/tensorflow_serving/example/inception_client.runfiles/tensorflow_serving/example/inception_client.py", line 48, in main
result = stub.Classify(request, 10.0) # 10 secs timeout
File "/usr/local/lib/python2.7/dist-packages/grpc/beta/_client_adaptations.py", line 300, in __call__
self._request_serializer, self._response_deserializer)
File "/usr/local/lib/python2.7/dist-packages/grpc/beta/_client_adaptations.py", line 198, in _blocking_unary_unary
raise _abortion_error(rpc_error_call)
grpc.framework.interfaces.face.face.AbortionError: AbortionError(code=StatusCode.INTERNAL, details="FetchOutputs node : not found")
E0805 09:10:47.129263239 200 chttp2_transport.c:1810] close_transport: {"created":"@1470388247.129230608","description":"FD shutdown","file":"src/core/lib/iomgr/ev_poll_posix.c","file_line":427}
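(The details string "FetchOutputs node : not found" reads as if the server cannot resolve one of the tensor names stored in the exported signature. A minimal sketch for checking this offline is below; it assumes the contrib session_bundle Python loader and retrain.py's default tensor names, 'DecodeJpeg/contents:0' for the JPEG input and 'final_result' for the new final layer, which may differ in your run.)

# Sketch only: reload the export and try to resolve the tensor names that
# the signature is expected to reference.
from tensorflow.contrib.session_bundle import session_bundle

export_version_dir = '/tf_files/scope/export/00000001'  # path from the server log
sess, _ = session_bundle.load_session_bundle_from_path(export_version_dir)
for name in ['DecodeJpeg/contents:0', 'final_result:0']:
  try:
    print(name, '->', sess.graph.get_tensor_by_name(name))
  except (KeyError, ValueError) as e:
    print(name, 'could not be resolved in the exported graph:', e)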
Any advice or guidance on this would be greatly appreciated.
Any luck here? Facing a similar issue – kampta
Same issue here! Any feedback/guidance would be highly appreciated! –