2016-04-12

Does anyone know how to get around the following fatal file error in my Spark app? [error] (*:assembly) deduplicate: different file contents found in the following:

I have a Spark application that was working in IntelliJ Community Edition until last night, and now, for no apparent reason, two jars from esotericsoftware are causing errors. When I run sbt assembly, I get the following error message:

[warn] Merging 'META-INF\taglib.tld' with strategy 'discard' 
java.lang.RuntimeException: deduplicate: different file contents found in the following: 
C:\Users\osadmin\.ivy2\cache\com.esotericsoftware.kryo\kryo\bundles\kryo-2.21.jar:com/esotericsoftware/minlog/Log$Logger.class 
C:\Users\osadmin\.ivy2\cache\com.esotericsoftware.minlog\minlog\jars\minlog-1.2.jar:com/esotericsoftware/minlog/Log$Logger.class 
     at sbtassembly.Plugin$Assembly$.sbtassembly$Plugin$Assembly$$applyStrategy$1(Plugin.scala:253) 
     at sbtassembly.Plugin$Assembly$$anonfun$15.apply(Plugin.scala:270) 
     at sbtassembly.Plugin$Assembly$$anonfun$15.apply(Plugin.scala:267) 
     at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:251) 
     at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:251) 
     at scala.collection.Iterator$class.foreach(Iterator.scala:727) 
     at scala.collection.AbstractIterator.foreach(Iterator.scala:1157) 
     at scala.collection.IterableLike$class.foreach(IterableLike.scala:72) 
     at scala.collection.AbstractIterable.foreach(Iterable.scala:54) 
     at scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:251) 
     at scala.collection.AbstractTraversable.flatMap(Traversable.scala:105) 
     at sbtassembly.Plugin$Assembly$.applyStrategies(Plugin.scala:272) 
     at sbtassembly.Plugin$Assembly$.x$4$lzycompute$1(Plugin.scala:172) 
     at sbtassembly.Plugin$Assembly$.x$4$1(Plugin.scala:170) 
     at sbtassembly.Plugin$Assembly$.stratMapping$lzycompute$1(Plugin.scala:170) 
     at sbtassembly.Plugin$Assembly$.stratMapping$1(Plugin.scala:170) 
     at sbtassembly.Plugin$Assembly$.inputs$lzycompute$1(Plugin.scala:214) 
     at sbtassembly.Plugin$Assembly$.inputs$1(Plugin.scala:204) 
     at sbtassembly.Plugin$Assembly$.apply(Plugin.scala:230) 
     at sbtassembly.Plugin$Assembly$$anonfun$assemblyTask$1.apply(Plugin.scala:373) 
     at sbtassembly.Plugin$Assembly$$anonfun$assemblyTask$1.apply(Plugin.scala:370) 
     at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47) 
     at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:40) 
     at sbt.std.Transform$$anon$4.work(System.scala:63) 
     at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226) 
     at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226) 
     at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17) 
     at sbt.Execute.work(Execute.scala:235) 
     at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226) 
     at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226) 
     at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159) 
     at sbt.CompletionService$$anon$2.call(CompletionService.scala:28) 
     at java.util.concurrent.FutureTask.run(Unknown Source) 
     at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source) 
     at java.util.concurrent.FutureTask.run(Unknown Source) 
     at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source) 
     at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source) 
     at java.lang.Thread.run(Unknown Source) 
[error] (*:assembly) deduplicate: different file contents found in the following: 
[error] C:\Users\osadmin\.ivy2\cache\com.esotericsoftware.kryo\kryo\bundles\kryo-2.21.jar:com/esotericsoftware/minlog/Log$Logger.class 
[error] C:\Users\osadmin\.ivy2\cache\com.esotericsoftware.minlog\minlog\jars\minlog-1.2.jar:com/esotericsoftware/minlog/Log$Logger.class 

I am using sbt and have the following mergeStrategy:

libraryDependencies ++= Seq("org.apache.spark" % "spark-core_2.10" % "1.5.2" % "provided" exclude("org.apache.hadoop", "hadoop-client"), 
    ("org.mongodb" % "mongo-java-driver" % "3.2.2") 
    .exclude("commons-beanutils", "commons-beanutils-core") 
    .exclude("commons-beanutils", "commons-beanutils") , 
    ("org.mongodb.mongo-hadoop" % "mongo-hadoop-core" % "1.4.2") 
    .exclude("commons-beanutils", "commons-beanutils-core") 
    .exclude("commons-beanutils", "commons-beanutils"), 
    ("com.stratio.datasource" % "spark-mongodb_2.10" % "0.11.0"), 
    ("org.apache.spark" % "spark-sql_2.10" % "1.5.2" % "provided"), 
    ("org.apache.spark" % "spark-hive_2.10" % "1.5.2" % "provided"), 
    ("org.apache.spark" % "spark-streaming_2.10" % "1.5.2") 
) 

assemblySettings 

jarName in assembly := "mongotest.jar" 

val meta = """META.INF(.)*""".r 

mergeStrategy in assembly <<= (mergeStrategy in assembly) { (old) => 
{ 
    case PathList("org", "apache", xs @ _*) => MergeStrategy.last 
    case PathList("plugin.properties") => MergeStrategy.last 
    case meta(_) => MergeStrategy.discard 
    case x => old(x) 
} 
} 


assemblyOption in assembly := (assemblyOption in assembly).value.copy(includeScala = false) 
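(Editor's note: conflicts like the kryo/minlog one in the error above, where kryo-2.21 bundles its own copy of the com/esotericsoftware/minlog classes while minlog-1.2 ships them separately, can often be silenced with an explicit merge rule for the conflicting path. A minimal sketch against the sbt-assembly 0.11.x API already used here; this PathList case is an assumption inferred from the error message, not part of the asker's build:)

```scala
// Hedged sketch, not the original build.sbt: pick one copy of the
// duplicated com/esotericsoftware/minlog classes instead of failing
// with a deduplicate error. Add this case before the catch-all.
case PathList("com", "esotericsoftware", "minlog", xs @ _*) => MergeStrategy.first
```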

Answers


After a lot of searching today, I found the answer on Stack Overflow. The problem was the missing "provided" scope on the Spark Streaming dependency. Here is the URL of the clear answer from 4e6 on another question: error while running sbt assembly : sbt deduplication error
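(Editor's note: applied to the libraryDependencies block in the question, the fix amounts to marking spark-streaming "provided" like the other Spark modules, so Spark's transitive kryo/minlog jars are excluded from the fat jar. A sketch of the corrected line, assuming the same versions as above:)

```scala
// Hedged sketch of the fix: add the "provided" configuration that the
// other Spark dependencies already carry, so spark-streaming (and its
// transitive kryo/minlog jars) is supplied by the cluster at runtime
// rather than bundled by sbt assembly.
("org.apache.spark" % "spark-streaming_2.10" % "1.5.2" % "provided")
```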
