
I'm new to Spark and am trying to run my first Java Spark job against a standalone local master. The master comes up and one worker registers with it, but when I run the Spark program below I get org.apache.spark.SparkException: Exception thrown in awaitResult. The program runs fine when the master is set to local. Why can't my Java job connect to the standalone local master?

My Spark code:

public static void main(String[] args) {

    // Set up the configuration
    String appName = "My Very First Spark Job";
    //String sparkMaster = "local[2]";
    String sparkMaster = "spark://10.0.0.116:7077";

    JavaSparkContext spContext = null;

    SparkConf conf = new SparkConf()
            .setAppName(appName)
            .setMaster(sparkMaster);

    // Create the Spark context from the configuration; this is where the exception is thrown
    spContext = new JavaSparkContext(conf);

    // ... (rest of the job omitted)
}

Driver log:

17/11/28 21:22:23 INFO StandaloneAppClient$ClientEndpoint: Connecting to master spark://10.0.0.116:7077... 
    17/11/28 21:22:23 INFO TransportClientFactory: Successfully created connection to /10.0.0.116:7077 after 30 ms (0 ms spent in bootstraps) 
    17/11/28 21:22:23 WARN StandaloneAppClient$ClientEndpoint: Failed to connect to master 10.0.0.116:7077 
    org.apache.spark.SparkException: Exception thrown in awaitResult 
     at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:77) 
     at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:75) 
     at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36) 
     at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59) 
     at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59) 
     at scala.PartialFunction$OrElse.apply(PartialFunction.scala:167) 
     at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:83) 
     at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:88) 
     at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:96) 
     at org.apache.spark.deploy.client.StandaloneAppClient$ClientEndpoint$$anonfun$tryRegisterAllMasters$1$$anon$1.run(StandaloneAppClient.scala:109) 
     at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) 
     at java.util.concurrent.FutureTask.run(FutureTask.java:266) 
     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) 
     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) 
     at java.lang.Thread.run(Thread.java:748) 
    Caused by: java.lang.RuntimeException: java.io.EOFException 
     at java.io.DataInputStream.readFully(DataInputStream.java:197) 
     at java.io.DataInputStream.readUTF(DataInputStream.java:609) 
     at java.io.DataInputStream.readUTF(DataInputStream.java:564) 
     at org.apache.spark.rpc.netty.RequestMessage$.readRpcAddress(NettyRpcEnv.scala:582) 
     at org.apache.spark.rpc.netty.RequestMessage$.apply(NettyRpcEnv.scala:592) 
     at org.apache.spark.rpc.netty.NettyRpcHandler.internalReceive(NettyRpcEnv.scala:651) 
     at org.apache.spark.rpc.netty.NettyRpcHandler.receive(NettyRpcEnv.scala:636) 
     at org.apache.spark.network.server.TransportRequestHandler.processRpcRequest(TransportRequestHandler.java:157) 
     at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:105) 
     at org.apache.spark.network.server.TransportChannelHandler.channelRead(TransportChannelHandler.java:118) 
     at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357) 
     at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343) 
     at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336) 
     at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:287) 
     at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357) 
     at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343) 
     at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336) 
     at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102) 
     at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357) 
     at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343) 
     at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336) 
     at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:85) 
     at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357) 
     at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343) 
     at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336) 
     at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1294) 
     at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357) 
     at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343) 
     at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:911) 
     at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131) 
     at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:643) 
     at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:566) 
     at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:480) 
     at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:442) 
     at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131) 
     at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144) 
     at java.lang.Thread.run(Thread.java:748) 

     at org.apache.spark.network.client.TransportResponseHandler.handle(TransportResponseHandler.java:190) 
     at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:121) 
     at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:51) 
     at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105) 
     at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308) 
     at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294) 
     at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:266) 
     at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308) 
     at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294) 
     at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103) 
     at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308) 
     at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294) 
     at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:85) 
     at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308) 
     at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294) 
     at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:846) 
     at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131) 
     at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511) 
     at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468) 
     at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382) 
     at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354) 
     at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111) 
     ... 1 more 

Spark master log:

Jings-MBP-6:bin jingzhou$ ./spark-class org.apache.spark.deploy.master.Master 
    Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties 
    17/11/28 20:55:11 INFO Master: Started daemon with process name: [email protected] 
    17/11/28 20:55:11 INFO SignalUtils: Registered signal handler for TERM 
    17/11/28 20:55:11 INFO SignalUtils: Registered signal handler for HUP 
    17/11/28 20:55:11 INFO SignalUtils: Registered signal handler for INT 
    17/11/28 20:55:11 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 
    17/11/28 20:55:11 INFO SecurityManager: Changing view acls to: jingzhou 
    17/11/28 20:55:11 INFO SecurityManager: Changing modify acls to: jingzhou 
    17/11/28 20:55:11 INFO SecurityManager: Changing view acls groups to: 
    17/11/28 20:55:11 INFO SecurityManager: Changing modify acls groups to: 
    17/11/28 20:55:11 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(jingzhou); groups with view permissions: Set(); users with modify permissions: Set(jingzhou); groups with modify permissions: Set() 
    17/11/28 20:55:12 INFO Utils: Successfully started service 'sparkMaster' on port 7077. 
    17/11/28 20:55:12 INFO Master: Starting Spark master at spark://10.0.0.116:7077 
    17/11/28 20:55:12 INFO Master: Running Spark version 2.2.0 
    17/11/28 20:55:12 INFO Utils: Successfully started service 'MasterUI' on port 8080. 
    17/11/28 20:55:12 INFO MasterWebUI: Bound MasterWebUI to 0.0.0.0, and started at http://10.0.0.116:8080 
    17/11/28 20:55:12 INFO Utils: Successfully started service on port 6066. 
    17/11/28 20:55:12 INFO StandaloneRestServer: Started REST server for submitting applications on port 6066 
    17/11/28 20:55:12 INFO Master: I have been elected leader! New state: ALIVE 
    17/11/28 20:59:27 INFO Master: Registering worker 10.0.0.116:64461 with 8 cores, 15.0 GB RAM 
    17/11/28 21:03:42 ERROR TransportRequestHandler: Error while invoking RpcHandler#receive() on RPC id 4722074090999773956 
    java.io.EOFException 
     at java.io.DataInputStream.readFully(DataInputStream.java:197) 
     at java.io.DataInputStream.readUTF(DataInputStream.java:609) 
     at java.io.DataInputStream.readUTF(DataInputStream.java:564) 
     at org.apache.spark.rpc.netty.RequestMessage$.readRpcAddress(NettyRpcEnv.scala:582) 
     at org.apache.spark.rpc.netty.RequestMessage$.apply(NettyRpcEnv.scala:592) 
     at org.apache.spark.rpc.netty.NettyRpcHandler.internalReceive(NettyRpcEnv.scala:651) 
     at org.apache.spark.rpc.netty.NettyRpcHandler.receive(NettyRpcEnv.scala:636) 
     at org.apache.spark.network.server.TransportRequestHandler.processRpcRequest(TransportRequestHandler.java:157) 
     at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:105) 
     at org.apache.spark.network.server.TransportChannelHandler.channelRead(TransportChannelHandler.java:118) 
     at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357) 
     at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343) 
     at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336) 
     at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:287) 
     at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357) 
     at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343) 
     at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336) 
     at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102) 
     at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357) 
     at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343) 
     at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336) 
     at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:85) 
     at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357) 
     at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343) 
     at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336) 
     at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1294) 
     at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357) 
     at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343) 
     at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:911) 
     at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131) 
     at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:643) 
     at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:566) 
     at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:480) 
     at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:442) 
     at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131) 
     at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144) 
     at java.lang.Thread.run(Thread.java:748) 

Spark worker log:

Jings-MBP-6:bin jingzhou$ ./spark-class org.apache.spark.deploy.worker.Worker spark://10.0.0.116:7077 
    Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties 
    17/11/28 20:59:26 INFO Worker: Started daemon with process name: [email protected] 
    17/11/28 20:59:26 INFO SignalUtils: Registered signal handler for TERM 
    17/11/28 20:59:26 INFO SignalUtils: Registered signal handler for HUP 
    17/11/28 20:59:26 INFO SignalUtils: Registered signal handler for INT 
    17/11/28 20:59:26 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 
    17/11/28 20:59:26 INFO SecurityManager: Changing view acls to: jingzhou 
    17/11/28 20:59:26 INFO SecurityManager: Changing modify acls to: jingzhou 
    17/11/28 20:59:26 INFO SecurityManager: Changing view acls groups to: 
    17/11/28 20:59:26 INFO SecurityManager: Changing modify acls groups to: 
    17/11/28 20:59:26 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(jingzhou); groups with view permissions: Set(); users with modify permissions: Set(jingzhou); groups with modify permissions: Set() 
    17/11/28 20:59:27 INFO Utils: Successfully started service 'sparkWorker' on port 64461. 
    17/11/28 20:59:27 INFO Worker: Starting Spark worker 10.0.0.116:64461 with 8 cores, 15.0 GB RAM 
    17/11/28 20:59:27 INFO Worker: Running Spark version 2.2.0 
    17/11/28 20:59:27 INFO Worker: Spark home: /Users/jingzhou/Desktop/hadoop/spark/spark-2.2.0-bin-hadoop2.7 
    17/11/28 20:59:27 INFO Utils: Successfully started service 'WorkerUI' on port 8081. 
    17/11/28 20:59:27 INFO WorkerWebUI: Bound WorkerWebUI to 0.0.0.0, and started at http://10.0.0.116:8081 
    17/11/28 20:59:27 INFO Worker: Connecting to master 10.0.0.116:7077... 
    17/11/28 20:59:27 INFO TransportClientFactory: Successfully created connection to /10.0.0.116:7077 after 26 ms (0 ms spent in bootstraps) 
    17/11/28 20:59:27 INFO Worker: Successfully registered with master spark://10.0.0.116:7077 

Which Spark version are you using? –


This might be the answer to your problem: https://stackoverflow.com/questions/42126186/spark-standalone-transportrequesthandler-error-while-invoking-rpchandler-whe – pgras


You may be using different Spark binaries for the master and your client. Have you checked that they are the same on both sides? –

Answer


I ran into the same problem when trying to connect to a local cluster from my Eclipse development environment.

In my case it was a version mismatch between the master and the Maven dependencies used in my development environment.

The cluster was 2.2.1; my dev environment was 2.1.0.
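
One quick way to spot this kind of mismatch is to compare the "Running Spark version ..." line in the master log with the version your client's classpath actually provides. A minimal sketch (my own, not from the original post; the VersionCheck class name is just a placeholder):

import org.apache.spark.api.java.JavaSparkContext;

public class VersionCheck {
    public static void main(String[] args) {
        // Run against a local master so the check does not depend on the cluster being reachable
        JavaSparkContext localContext = new JavaSparkContext("local[1]", "version-check");
        // Prints the Spark version resolved from the driver's classpath (i.e. the Maven dependency)
        System.out.println("Client-side Spark version: " + localContext.version());
        localContext.stop();
    }
}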

Once I aligned the versions, the error was resolved.
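
Concretely, that amounts to pinning the Spark dependency in the client's pom.xml to the same version the cluster reports; a sketch assuming a Scala 2.11 build of Spark 2.2.1:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.2.1</version>
</dependency>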
