
I downloaded a fresh copy of the pre-built version of Spark 1.6 for Hadoop 2.6 and later to my desktop on Ubuntu 14.04, but "./bin/spark-shell" does not work.

I navigated to the Spark shell and I am getting the following error:

./bin/spark-shell 

I started Spark as per the Quick Start link. There was a similar question for Mac OS X here.

[email protected]:~/Desktop/spark-1.6.0-bin-hadoop2.6$ ./bin/spark-shell 
log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory). 
log4j:WARN Please initialize the log4j system properly. 
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info. 
Using Spark's repl log4j profile: org/apache/spark/log4j-defaults-repl.properties 
To adjust logging level use sc.setLogLevel("INFO") 
Welcome to 
      ____              __ 
     / __/__  ___ _____/ /__ 
    _\ \/ _ \/ _ `/ __/ '_/ 
   /___/ .__/\_,_/_/ /_/\_\   version 1.6.0 
      /_/ 

Using Scala version 2.10.5 (OpenJDK 64-Bit Server VM, Java 1.7.0_91) 
Type in expressions to have them evaluated. 
Type :help for more information. 
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1. 
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1. 
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1. 
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1. 
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1. 
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1. 
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1. 
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1. 
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1. 
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1. 
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1. 
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1. 
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1. 
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1. 
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1. 
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1. 
16/01/05 12:36:25 ERROR SparkContext: Error initializing SparkContext. 
java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries! 
    at sun.nio.ch.Net.bind0(Native Method) 
    at sun.nio.ch.Net.bind(Net.java:463) 
    at sun.nio.ch.Net.bind(Net.java:455) 
    at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223) 
    at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74) 
    at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:125) 
    at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:485) 
    at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1089) 
    at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:430) 
    at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:415) 
    at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:903) 
    at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:198) 
    at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:348) 
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357) 
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357) 
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111) 
    at java.lang.Thread.run(Thread.java:745) 
java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries! 
    at sun.nio.ch.Net.bind0(Native Method) 
    at sun.nio.ch.Net.bind(Net.java:463) 
    at sun.nio.ch.Net.bind(Net.java:455) 
    at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223) 
    at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74) 
    at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:125) 
    at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:485) 
    at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1089) 
    at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:430) 
    at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:415) 
    at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:903) 
    at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:198) 
    at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:348) 
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357) 
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357) 
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111) 
    at java.lang.Thread.run(Thread.java:745) 

java.lang.NullPointerException 
    at org.apache.spark.sql.SQLContext$.createListenerAndUI(SQLContext.scala:1367) 
    at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:101) 
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) 
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57) 
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) 
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526) 
    at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1028) 
    at $iwC$$iwC.<init>(<console>:15) 
    at $iwC.<init>(<console>:24) 
    at <init>(<console>:26) 
    at .<init>(<console>:30) 
    at .<clinit>(<console>) 
    at .<init>(<console>:7) 
    at .<clinit>(<console>) 
    at $print(<console>) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:606) 
    at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065) 
    at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346) 
    at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840) 
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871) 
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819) 
    at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857) 
    at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902) 
    at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814) 
    at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:132) 
    at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124) 
    at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324) 
    at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124) 
    at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64) 
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974) 
    at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159) 
    at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64) 
    at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108) 
    at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64) 
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991) 
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945) 
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945) 
    at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135) 
    at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945) 
    at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059) 
    at org.apache.spark.repl.Main$.main(Main.scala:31) 
    at org.apache.spark.repl.Main.main(Main.scala) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:606) 
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731) 
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181) 
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206) 
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121) 
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) 

<console>:16: error: not found: value sqlContext 
     import sqlContext.implicits._ 
       ^
<console>:16: error: not found: value sqlContext 
     import sqlContext.sql 
       ^

Any help would be appreciated.

Have you tried running it with sudo? Maybe it needs root privileges. – kometen

@kometen Neither with sudo nor without it helps; same error. –

http://stackoverflow.com/a/40523061/2777965 – 030

Answers

I had a similar problem: when starting a cluster, the master would not start because of this same exception.

To fix it, I changed a property I had been setting in the $SPARK_HOME/conf/spark-env.sh file.

Previously, I had set SPARK_MASTER_IP to the IP address of my master node. Changing it to the box's public DNS name appeared to resolve the problem.
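
For illustration, a minimal sketch of that change in $SPARK_HOME/conf/spark-env.sh (both values below are placeholders; substitute your own master's address and DNS name):

    # Before: binding to a raw IP address triggered the BindException
    # export SPARK_MASTER_IP=10.0.0.5            # placeholder IP

    # After: use the box's resolvable public DNS name instead
    export SPARK_MASTER_IP=master.example.com    # placeholder hostname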

Thanks for the suggestion, Thomas. I'm having the same problem, and this did not solve it for me. – ammills01

A possible cause of the problem is that Spark is trying to bind to the wrong IP address. $SPARK_HOME/conf/spark-env.sh has a variable named SPARK_LOCAL_IP. If it is set, make sure it actually refers to the machine you are running the Spark shell on; otherwise, comment it out. If it is not set, try setting it to 127.0.0.1, for example.
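
A minimal sketch of both options in spark-env.sh (the stale address below is a placeholder):

    # Option 1: comment out a value that no longer matches this machine
    # export SPARK_LOCAL_IP=192.168.1.50   # placeholder for a stale address

    # Option 2: pin the driver to the loopback address, fine for a local spark-shell
    export SPARK_LOCAL_IP=127.0.0.1

The same setting can also be tried for a single session, without editing the file:

    SPARK_LOCAL_IP=127.0.0.1 ./bin/spark-shell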

The problem still occurs for me. – 030
