2017-07-27

SBT dependency for Spark SQL

I have started learning Spark SQL. I am using the following dependencies in sbt, and I am getting an error.

name := "sparkLearning" 

version := "1.0" 

scalaVersion := "2.11.8" 

val sparkVersion = "1.6.1" 
val sqlVersion = "1.3.1" 

libraryDependencies ++= Seq(
    "org.apache.spark" %% "spark-core" % sparkVersion, 
    "org.apache.spark" % "spark-sql" % sqlVersion 
) 

This is the error I am getting:

Error: Error while importing SBT project:
...
[info] Resolving com.thoughtworks.paranamer#paranamer;2.6 ... 
[info] Resolving org.scala-sbt#completion;0.13.15 ... 
[info] Resolving org.scala-sbt#control;0.13.15 ... 
[info] Resolving org.scala-sbt#sbt;0.13.15 ... 
[info] Resolving org.scala-sbt#run;0.13.15 ... 
[info] Resolving org.scala-sbt.ivy#ivy;2.3.0-sbt-48dd0744422128446aee9ac31aa356ee203cc9f4 ... 
[info] Resolving org.scala-sbt#test-interface;1.0 ... 
[info] Resolving com.jcraft#jsch;0.1.50 ... 
[info] Resolving org.scala-lang#scala-compiler;2.10.6 ... 
[info] Resolving jline#jline;2.14.3 ... 
[info] Resolving org.scala-sbt#compiler-ivy-integration;0.13.15 ... 
[info] Resolving org.scala-sbt#incremental-compiler;0.13.15 ... 
[info] Resolving org.scala-sbt#logic;0.13.15 ... 
[info] Resolving org.scala-sbt#main-settings;0.13.15 ... 
[trace] Stack trace suppressed: run 'last *:update' for the full output. 
[trace] Stack trace suppressed: run 'last *:ssExtractDependencies' for the full output. 
[error] (*:update) sbt.ResolveException: unresolved dependency: org.apache.spark#spark-sql;1.3.1: not found 
[error] (*:ssExtractDependencies) sbt.ResolveException: unresolved dependency: org.apache.spark#spark-sql;1.3.1: not found 
[error] Total time: 15 s, completed 27-Jul-2017 15:29:52 
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=384M; support was removed in 8.0 

Could you please tell me how to resolve this in my sbt file? Thanks for your help.

Answer

4

The correct form is:

name := "sparkLearning" 

version := "1.0" 

scalaVersion := "2.11.8" 

val sparkVersion = "1.6.1" 

libraryDependencies ++= Seq(
    "org.apache.spark" % "spark-core_2.10" % sparkVersion, 
    "org.apache.spark" % "spark-sql_2.10" % sparkVersion 
) 

I also suggest you use the latest Spark version that is compatible with Scala 2.11.8:

name := "sparkLearning" 

version := "1.0" 

scalaVersion := "2.11.8" 

val sparkVersion = "2.2.0" 

libraryDependencies ++= Seq(
    "org.apache.spark" %% "spark-core" % sparkVersion, 
    "org.apache.spark" %% "spark-sql" % sparkVersion 
) 
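For reference, the `%%` operator tells sbt to append the project's Scala binary version to the artifact name, which is why it avoids the mismatch in the question. A minimal sketch, assuming `scalaVersion := "2.11.8"`, of the two equivalent forms:

```scala
// With scalaVersion := "2.11.8", "%%" appends the Scala binary suffix automatically:
"org.apache.spark" %% "spark-sql" % sparkVersion
// ...which resolves the same artifact as the explicit form:
"org.apache.spark" % "spark-sql_2.11" % sparkVersion
```

The `%%` form is preferred because the suffix stays in sync if `scalaVersion` changes.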

Thanks, I will use the latest Spark version that is compatible with Scala. – Srinivas


Thanks for accepting :) You can also upvote if it helped. –


If you run your Spark program with spark-submit, you should add the Spark dependencies with the Provided scope: `val sparkVersion = "2.2.0"` ... `"org.apache.spark" %% "spark-core" % sparkVersion % Provided`, `"org.apache.spark" %% "spark-sql" % sparkVersion % Provided`, ... –
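The advice in the comment above can be written out as a build.sbt sketch (an illustration, assuming a fat jar is built with a plugin such as sbt-assembly; the `"provided"` configuration compiles against Spark but keeps its jars out of the assembly, since spark-submit supplies them at runtime):

```scala
// build.sbt sketch: Spark marked "provided" for spark-submit deployments
name := "sparkLearning"

version := "1.0"

scalaVersion := "2.11.8"

val sparkVersion = "2.2.0"

// Compile against Spark, but exclude it from the assembled jar;
// the cluster's spark-submit runtime provides these classes.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion % "provided",
  "org.apache.spark" %% "spark-sql"  % sparkVersion % "provided"
)
```

Note that with `"provided"` the program can no longer be launched directly with `sbt run`, since Spark's classes are absent from the runtime classpath.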
