16/04/26 16:58:46 DEBUG ProtobufRpcEngine: Call: complete took 3ms 
Exception in thread "main" java.lang.NoClassDefFoundError: com/datastax/spark/connector/japi/CassandraJavaUtil 
     at com.baitic.mcava.lecturahdfssaveincassandra.TratamientoCSV.main(TratamientoCSV.java:123) 
     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
     at java.lang.reflect.Method.invoke(Method.java:498) 
     at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731) 
     at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181) 
     at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206) 
     at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121) 
     at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) 
Caused by: java.lang.ClassNotFoundException: com.datastax.spark.connector.japi.CassandraJavaUtil 
     at java.net.URLClassLoader.findClass(URLClassLoader.java:381) 
     at java.lang.ClassLoader.loadClass(ClassLoader.java:424) 
     at java.lang.ClassLoader.loadClass(ClassLoader.java:357) 
     ... 10 more 
16/04/26 16:58:46 INFO SparkContext: Invoking stop() from shutdown hook 
16/04/26 16:58:46 INFO SparkUI: Stopped Spark web UI at http://10.128.0.5:4040 
16/04/26 16:58:46 INFO SparkDeploySchedulerBackend: Shutting down all executors 
16/04/26 16:58:46 INFO SparkDeploySchedulerBackend: Asking each executor to shut down 
16/04/26 16:58:46 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped! 
16/04/26 16:58:46 INFO MemoryStore: MemoryStore cleared 
16/04/26 16:58:46 INFO BlockManager: BlockManager stopped 
16/04/26 16:58:46 INFO BlockManagerMaster: BlockManagerMaster stopped 
16/04/26 16:58:46 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped! 
16/04/26 16:58:46 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon. 
16/04/26 16:58:46 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports. 
16/04/26 16:58:46 INFO SparkContext: Successfully stopped SparkContext 
16/04/26 16:58:46 INFO ShutdownHookManager: Shutdown hook called 
16/04/26 16:58:46 INFO ShutdownHookManager: Deleting directory /srv/spark/tmp/spark-2bf57fa2-a2d5-4f8a-980c-994e56b61c44 
16/04/26 16:58:46 DEBUG Client: stopping client from cache: [email protected] 
16/04/26 16:58:46 DEBUG Client: removing client from cache: [email protected] 
16/04/26 16:58:46 DEBUG Client: stopping actual client because no more references remain: [email protected] 
16/04/26 16:58:46 DEBUG Client: Stopping client 
16/04/26 16:58:46 DEBUG Client: IPC Client (2107841088) connection to mcava-master/10.128.0.5:54310 from baiticpruebas2: closed 
16/04/26 16:58:46 DEBUG Client: IPC Client (2107841088) connection to mcava-master/10.128.0.5:54310 from baiticpruebas2: stopped, remaining connections 0 
16/04/26 16:58:46 INFO RemoteActorRefProvider$RemotingTerminator: Remoting shut down. 

I wrote this simple Spark-Cassandra job, and it always fails with the java.lang.NoClassDefFoundError: com/datastax/spark/connector/japi/CassandraJavaUtil shown above:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

    // Input data and the jars that must ship with the job (all on HDFS).
    String pathDatosTratados = "hdfs://mcava-master:54310/srv/hadoop/data/spark/DatosApp/medidasSensorTratadas.txt";
    String jarPath = "hdfs://mcava-master:54310/srv/hadoop/data/spark/original-LecturaHDFSsaveInCassandra-1.0-SNAPSHOT.jar";
    String jar = "hdfs://mcava-master:54310/srv/hadoop/data/spark/spark-cassandra-connector-assembly-1.6.0-M1-4-g6f01cfe.jar";
    String jar2 = "hdfs://mcava-master:54310/srv/hadoop/data/spark/spark-cassandra-connector-java-assembly-1.6.0-M1-4-g6f01cfe.jar";
    String[] jars = { jarPath, jar2, jar };

    SparkConf conf = new SparkConf()
            .setAppName("TratamientoCSV")
            .setJars(jars);
    conf.set("spark.cassandra.connection.host", "10.128.0.5");
    conf.set("spark.kryoserializer.buffer.max", "512");
    conf.set("spark.kryoserializer.buffer", "256");

    JavaSparkContext sc = new JavaSparkContext(conf);
    JavaRDD<String> input = sc.textFile(pathDatosTratados);
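For context, the class the JVM cannot find, com.datastax.spark.connector.japi.CassandraJavaUtil, is the entry point of the connector's Java API. A minimal sketch of how it is typically used to write parsed lines to Cassandra follows; the SensorReading bean, the CSV layout, and the keyspace/table names are all hypothetical:

    import static com.datastax.spark.connector.japi.CassandraJavaUtil.javaFunctions;
    import static com.datastax.spark.connector.japi.CassandraJavaUtil.mapToRow;
    import java.io.Serializable;

    // Hypothetical bean mapping one CSV line to a Cassandra row.
    public class SensorReading implements Serializable {
        private String sensorId;
        private double valor;
        public SensorReading() {}
        public SensorReading(String sensorId, double valor) {
            this.sensorId = sensorId;
            this.valor = valor;
        }
        public String getSensorId() { return sensorId; }
        public void setSensorId(String sensorId) { this.sensorId = sensorId; }
        public double getValor() { return valor; }
        public void setValor(double valor) { this.valor = valor; }
    }

    // The writerBuilder call below is what loads CassandraJavaUtil, so the
    // connector jar has to be visible to both the driver and the executors.
    JavaRDD<SensorReading> readings = input.map(line -> {
        String[] f = line.split(",");
        return new SensorReading(f[0], Double.parseDouble(f[1]));
    });
    javaFunctions(readings)
        .writerBuilder("mcava_ks", "medidas", mapToRow(SensorReading.class))
        .saveToCassandra();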

I also put the path to the Cassandra connector jar in spark-defaults.conf:

spark.driver.extraClassPath  hdfs://mcava-master:54310/srv/hadoop/data/spark/spark-cassandra-connector-java-assembly-1.6.0-M1-4-g6f01cfe.jar 
spark.executor.extraClassPath hdfs://mcava-master:54310/srv/hadoop/data/spark/spark-cassandra-connector-java-assembly-1.6.0-M1-4-g6f01cfe.jar 
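One thing worth noting: extraClassPath entries are handed straight to the JVM classpath, which does not resolve hdfs:// URLs, so these entries normally have to be local filesystem paths that exist on every node. A hypothetical local-path variant (the /srv/spark/lib directory is an assumption):

    spark.driver.extraClassPath   /srv/spark/lib/spark-cassandra-connector-java-assembly-1.6.0-M1-4-g6f01cfe.jar
    spark.executor.extraClassPath /srv/spark/lib/spark-cassandra-connector-java-assembly-1.6.0-M1-4-g6f01cfe.jar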

I also pass the connector jar path with the --jars flag, but I always get the same error, and I don't understand why.

I am running this on Google Compute Engine.

Answers


Add the connector package when you submit the app:

$SPARK_HOME/bin/spark-submit --packages datastax:spark-cassandra-connector:1.6.0-M2-s_2.11 .... 
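Spelled out against this question's own main class and application jar (both taken from the log above), the full command would look roughly like the sketch below; note that the Scala suffix of the package must match the Scala build of your Spark installation (s_2.10 for the setup described in the comments that follow):

    $SPARK_HOME/bin/spark-submit \
      --class com.baitic.mcava.lecturahdfssaveincassandra.TratamientoCSV \
      --packages datastax:spark-cassandra-connector:1.6.0-M2-s_2.10 \
      hdfs://mcava-master:54310/srv/hadoop/data/spark/original-LecturaHDFSsaveInCassandra-1.0-SNAPSHOT.jar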

The problem I'm having is exactly when I launch the app... – Miren


Which Scala version are you using? Is it 2.11? – Hlib


No, the connector is 2.10... I am using the latest release of Spark, 1.6.1. – Miren


I solved the problem... I built a fat jar containing all the required dependencies and now reference only that fat jar; the separate reference to the Cassandra connector is gone.
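The original- prefix on the application jar in the question is the telltale renaming done by the Maven Shade plugin, so the build is likely already half-way there. A minimal configuration that folds all compile-scope dependencies (the connector included) into the application jar might look like this sketch (the plugin version is an assumption):

    <!-- pom.xml: bundle all compile-scope dependencies into the main jar -->
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>2.4.3</version>
      <executions>
        <execution>
          <phase>package</phase>
          <goals>
            <goal>shade</goal>
          </goals>
        </execution>
      </executions>
    </plugin>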


To solve this problem, add this argument: --packages datastax:spark-cassandra-connector:1.6.0-M2-s_2.10.