2017-03-17 9 views

Running the MNIST example on TensorFlowOnSpark fails with "ERROR SparkContext: Error initializing SparkContext"

Spark: 2.1.0

Scala: 2.11.8

TensorFlow (on Spark): 0.12.1

TensorFlowOnSpark: latest

Python: Anaconda (Python 3.5)

-----------------------------

I don't know why this error occurs:

(tf012-p35) [email protected]:~/TFSpark/TensorFlowOnSpark$ ${SPARK_HOME}/bin/spark-submit \ 
>  --master ${MASTER} \ 
>  --py-files /home/superstar/TFSpark/TensorFlowOnSpark/tfspark.zip,/home/superstar/TFSpark/TensorFlowOnSpark/examples/mnist/spark/mnist_dist.py \ 
>  --conf spark.cores.max=${TOTAL_CORES} \ 
>  --conf spark.task.cpus=${CORES_PER_WORKER} \ 
>  --conf spark.executorEnv.JAVA_HOME="$JAVA_HOME" \ 
>  /home/superstar/TFSpark/TensorFlowOnSpark/examples/mnist/spark/mnist_spark.py \ 
>  --cluster_size ${SPARK_WORKER_INSTANCES} \ 
>  --images examples/mnist/csv/train/images \ 
>  --labels examples/mnist/csv/train/labels \ 
>  --format csv \ 
>  --mode train \ 
>  --model mnist_model 

I tensorflow/stream_executor/dso_loader.cc:128] successfully opened CUDA library libcublas.so locally 
I tensorflow/stream_executor/dso_loader.cc:128] successfully opened CUDA library libcudnn.so locally 
I tensorflow/stream_executor/dso_loader.cc:128] successfully opened CUDA library libcufft.so locally 
I tensorflow/stream_executor/dso_loader.cc:128] successfully opened CUDA library libcuda.so.1 locally 
I tensorflow/stream_executor/dso_loader.cc:128] successfully opened CUDA library libcurand.so locally 
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties 
17/03/17 18:08:56 INFO SparkContext: Running Spark version 2.1.0 
17/03/17 18:08:56 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 
17/03/17 18:08:56 INFO SecurityManager: Changing view acls to: superstar 
17/03/17 18:08:56 INFO SecurityManager: Changing modify acls to: superstar 
17/03/17 18:08:56 INFO SecurityManager: Changing view acls groups to: 
17/03/17 18:08:56 INFO SecurityManager: Changing modify acls groups to: 
17/03/17 18:08:56 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(superstar); groups with view permissions: Set(); users with modify permissions: Set(superstar); groups with modify permissions: Set() 
17/03/17 18:08:56 INFO Utils: Successfully started service 'sparkDriver' on port 33685. 
17/03/17 18:08:56 INFO SparkEnv: Registering MapOutputTracker 
17/03/17 18:08:56 INFO SparkEnv: Registering BlockManagerMaster 
17/03/17 18:08:56 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information 
17/03/17 18:08:56 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up 
17/03/17 18:08:56 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-27ad28b7-ac07-4d51-86fd-1576e429faf0 
17/03/17 18:08:56 INFO MemoryStore: MemoryStore started with capacity 366.3 MB 
17/03/17 18:08:56 INFO SparkEnv: Registering OutputCommitCoordinator 
17/03/17 18:08:56 INFO Utils: Successfully started service 'SparkUI' on port 4040. 
17/03/17 18:08:56 INFO SparkUI: Bound SparkUI to 192.168.50.200, and started at http://192.168.50.200:4040 
17/03/17 18:08:56 INFO SparkContext: Added file file:/home/superstar/TFSpark/TensorFlowOnSpark/examples/mnist/spark/mnist_spark.py at spark://192.168.50.200:33685/files/mnist_spark.py with timestamp 1489745336876 
17/03/17 18:08:56 INFO Utils: Copying /home/superstar/TFSpark/TensorFlowOnSpark/examples/mnist/spark/mnist_spark.py to /tmp/spark-8e6197a5-5749-4f36-9f80-034986a6a03c/userFiles-71b4732f-9be7-4fb0-9e2a-ccd877ace88d/mnist_spark.py 
17/03/17 18:08:56 INFO SparkContext: Added file file:/home/superstar/TFSpark/TensorFlowOnSpark/tfspark.zip at spark://192.168.50.200:33685/files/tfspark.zip with timestamp 1489745336881 
17/03/17 18:08:56 INFO Utils: Copying /home/superstar/TFSpark/TensorFlowOnSpark/tfspark.zip to /tmp/spark-8e6197a5-5749-4f36-9f80-034986a6a03c/userFiles-71b4732f-9be7-4fb0-9e2a-ccd877ace88d/tfspark.zip 
17/03/17 18:08:56 INFO SparkContext: Added file file:/home/superstar/TFSpark/TensorFlowOnSpark/examples/mnist/spark/mnist_dist.py at spark://192.168.50.200:33685/files/mnist_dist.py with timestamp 1489745336883 
17/03/17 18:08:56 INFO Utils: Copying /home/superstar/TFSpark/TensorFlowOnSpark/examples/mnist/spark/mnist_dist.py to /tmp/spark-8e6197a5-5749-4f36-9f80-034986a6a03c/userFiles-71b4732f-9be7-4fb0-9e2a-ccd877ace88d/mnist_dist.py 
17/03/17 18:08:56 ERROR SparkContext: Error initializing SparkContext. 
java.lang.NumberFormatException: For input string: "" 
    at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65) 
    at java.lang.Integer.parseInt(Integer.java:592) 
    at java.lang.Integer.parseInt(Integer.java:615) 
    at scala.collection.immutable.StringLike$class.toInt(StringLike.scala:272) 
    at scala.collection.immutable.StringOps.toInt(StringOps.scala:29) 
    at org.apache.spark.SparkConf$$anonfun$getInt$2.apply(SparkConf.scala:392) 
    at org.apache.spark.SparkConf$$anonfun$getInt$2.apply(SparkConf.scala:392) 
    at scala.Option.map(Option.scala:146) 
    at org.apache.spark.SparkConf.getInt(SparkConf.scala:392) 
    at org.apache.spark.scheduler.TaskSchedulerImpl.<init>(TaskSchedulerImpl.scala:79) 
    at org.apache.spark.scheduler.TaskSchedulerImpl.<init>(TaskSchedulerImpl.scala:60) 
    at org.apache.spark.SparkContext$.org$apache$spark$SparkContext$$createTaskScheduler(SparkContext.scala:2521) 
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:501) 
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58) 
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) 
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) 
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) 
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423) 
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247) 
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357) 
    at py4j.Gateway.invoke(Gateway.java:236) 
    at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80) 
    at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69) 
    at py4j.GatewayConnection.run(GatewayConnection.java:214) 
    at java.lang.Thread.run(Thread.java:745) 
17/03/17 18:08:56 INFO SparkUI: Stopped Spark web UI at http://192.168.50.200:4040 

--------------------------

The full error context is shown above.

Is my Spark configuration wrong? Or do some of the software versions not match each other?

Can anyone help?

Have you checked that your Spark configs are correct and that the input arguments, matched in the order of the arguments, are correct and valid? – FaigB

Yes, the input arguments (TOTAL_CORES/CORES_PER_WORKER/SPARK_WORKER_INSTANCES) were empty, so I set them. Thank you very much! – Sidedger

Can you echo $JAVA_HOME? – BDR

Answer

I think some of the input arguments passed to the Spark configs are empty or mismatched; check them in the order of the arguments.
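Concretely, the NumberFormatException: For input string: "" suggests that ${TOTAL_CORES} and ${CORES_PER_WORKER} expanded to nothing, so --conf spark.task.cpus= was set to an empty string, which SparkConf.getInt cannot parse. A minimal sketch of the fix, with hypothetical values (2 workers, 1 core each; adjust to your cluster), run before spark-submit:

```shell
#!/usr/bin/env bash
# Hypothetical sizes for illustration; adjust to your hardware.
export SPARK_WORKER_INSTANCES=2
export CORES_PER_WORKER=1
export TOTAL_CORES=$((CORES_PER_WORKER * SPARK_WORKER_INSTANCES))
export MASTER="spark://$(hostname):7077"

# Fail fast if any variable is still empty, instead of letting
# spark-submit pass "" into spark.task.cpus or spark.cores.max.
for v in SPARK_WORKER_INSTANCES CORES_PER_WORKER TOTAL_CORES MASTER; do
  if [ -z "${!v}" ]; then
    echo "ERROR: $v is not set" >&2
    exit 1
  fi
done
```

The guard loop is optional, but it turns a cryptic Scala stack trace into an immediate, readable shell error when a variable is missing.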
