I am trying to submit a jar application by running /usr/local/spark# ./bin/spark-submit --class "DataframeExample" --master local[2] ~/new/hbfinance-module-1.0-SNAPSHOT.jar/. I am using Apache Spark 1.5.0. The console shows the application loading, but it then fails with a java.lang.NoClassDefFoundError:
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
16/10/12 12:42:08 INFO SparkContext: Running Spark version 1.5.0
16/10/12 12:42:09 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/10/12 12:42:09 INFO SecurityManager: Changing view acls to: root
16/10/12 12:42:09 INFO SecurityManager: Changing modify acls to: root
16/10/12 12:42:09 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
16/10/12 12:42:10 INFO Slf4jLogger: Slf4jLogger started
16/10/12 12:42:10 INFO Remoting: Starting remoting
16/10/12 12:42:10 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://[email protected]:44153]
16/10/12 12:42:10 INFO Utils: Successfully started service 'sparkDriver' on port 44153.
16/10/12 12:42:10 INFO SparkEnv: Registering MapOutputTracker
16/10/12 12:42:10 INFO SparkEnv: Registering BlockManagerMaster
16/10/12 12:42:10 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-2416dd91-f441-4618-8141-e97257cdd17b
16/10/12 12:42:10 INFO MemoryStore: MemoryStore started with capacity 530.0 MB
16/10/12 12:42:10 INFO HttpFileServer: HTTP File server directory is /tmp/spark-f80de7bc-6613-4125-b047-eae7574df54e/httpd-a9849575-8777-4e8d-a6df-fdb2a08a0b87
16/10/12 12:42:10 INFO HttpServer: Starting HTTP Server
16/10/12 12:42:11 INFO Utils: Successfully started service 'HTTP file server' on port 44964.
16/10/12 12:42:11 INFO SparkEnv: Registering OutputCommitCoordinator
16/10/12 12:42:11 INFO Utils: Successfully started service 'SparkUI' on port 4040.
16/10/12 12:42:11 INFO SparkUI: Started SparkUI at http://178.62.18.22:4040
16/10/12 12:42:11 INFO SparkContext: Added JAR file:/root/new/hbfinance-module-1.0-SNAPSHOT.jar at http://178.62.18.22:44964/jars/hbfinance-module-1.0-SNAPSHOT.jar with timestamp 1476290531299
16/10/12 12:42:11 WARN MetricsSystem: Using default name DAGScheduler for source because spark.app.id is not set.
16/10/12 12:42:11 INFO Executor: Starting executor ID driver on host localhost
16/10/12 12:42:11 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 46409.
16/10/12 12:42:11 INFO NettyBlockTransferService: Server created on 46409
16/10/12 12:42:11 INFO BlockManagerMaster: Trying to register BlockManager
16/10/12 12:42:11 INFO BlockManagerMasterEndpoint: Registering block manager localhost:46409 with 530.0 MB RAM, BlockManagerId(driver, localhost, 46409)
16/10/12 12:42:11 INFO BlockManagerMaster: Registered BlockManager
Exception in thread "main" java.lang.NoClassDefFoundError: com/mongodb/hadoop/MongoInputFormat
at com.hbfinance.DataframeExample.run(DataframeExample.java:47)
at com.hbfinance.DataframeExample.main(DataframeExample.java:88)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: com.mongodb.hadoop.MongoInputFormat
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 11 more
16/10/12 12:42:11 INFO SparkContext: Invoking stop() from shutdown hook
16/10/12 12:42:11 INFO SparkUI: Stopped Spark web UI at http://178.62.18.22:4040
16/10/12 12:42:11 INFO DAGScheduler: Stopping DAGScheduler
16/10/12 12:42:12 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
16/10/12 12:42:12 INFO MemoryStore: MemoryStore cleared
16/10/12 12:42:12 INFO BlockManager: BlockManager stopped
16/10/12 12:42:12 INFO BlockManagerMaster: BlockManagerMaster stopped
16/10/12 12:42:12 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
16/10/12 12:42:12 INFO SparkContext: Successfully stopped SparkContext
16/10/12 12:42:12 INFO ShutdownHookManager: Shutdown hook called
16/10/12 12:42:12 INFO ShutdownHookManager: Deleting directory /tmp/spark-f80de7bc-6613-4125-b047-eae7574df54e
I honestly have no idea what I did wrong.
Here is my pom file, though I guess I should add the code as well.
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.hbfinance</groupId>
  <artifactId>hbfinance-module</artifactId>
  <packaging>jar</packaging>
  <version>1.0-SNAPSHOT</version>
  <name>hbfinance-module</name>
  <url>http://maven.apache.org</url>
  <dependencies>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>3.8.1</version>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-sql_2.11</artifactId>
      <version>1.5.1</version>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.11</artifactId>
      <version>1.5.1</version>
    </dependency>
    <dependency>
      <groupId>log4j</groupId>
      <artifactId>log4j</artifactId>
      <version>1.2.14</version>
    </dependency>
    <dependency>
      <groupId>org.mongodb.mongo-hadoop</groupId>
      <artifactId>mongo-hadoop-core</artifactId>
      <version>1.4.1</version>
    </dependency>
  </dependencies>
  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-compiler-plugin</artifactId>
        <version>3.1</version>
        <configuration>
          <source>1.6</source>
          <target>1.6</target>
        </configuration>
      </plugin>
    </plugins>
  </build>
</project>
package com.hbfinance;

import org.apache.hadoop.conf.Configuration;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.Function;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.SQLContext;
import org.bson.BSONObject;

import com.mongodb.hadoop.MongoInputFormat;

import scala.Tuple2;

public class DataframeExample {

  public void run() {
    JavaSparkContext sc = new JavaSparkContext(new SparkConf());

    // Set configuration options for the MongoDB Hadoop Connector.
    Configuration mongodbConfig = new Configuration();
    // MongoInputFormat allows us to read from a live MongoDB instance.
    // We could also use BSONFileInputFormat to read BSON snapshots.
    mongodbConfig.set("mongo.job.input.format", "com.mongodb.hadoop.MongoInputFormat");

    // MongoDB connection string naming a collection to use.
    // If using BSON, use "mapred.input.dir" to configure the directory
    // where BSON files are located instead.
    mongodbConfig.set("mongo.input.uri",
        "mongodb://hadoopUser:[email protected]:27017/hbdata.ppt_logs");
    // mongodbConfig.set("mongo.input.uri",
    //     "mongodb://hadoopUser:[email protected]:27017/hbdata.ppa_logs");
    // mongodbConfig.set("mongo.input.uri",
    //     "mongodb://hadoopUser:[email protected]:27017/hbdata.dd_logs");
    // mongodbConfig.set("mongo.input.uri",
    //     "mongodb://hadoopUser:[email protected]:27017/hbdata.fav_logs");
    // mongodbConfig.set("mongo.input.uri",
    //     "mongodb://hadoopUser:[email protected]:27017/hbdata.pps_logs");

    // Create an RDD backed by the MongoDB collection.
    JavaPairRDD<Object, BSONObject> documents = sc.newAPIHadoopRDD(
        mongodbConfig,          // Configuration
        MongoInputFormat.class, // InputFormat: read from a live cluster.
        Object.class,           // Key class
        BSONObject.class        // Value class
    );

    JavaRDD<AppLog> logs = documents.map(
        new Function<Tuple2<Object, BSONObject>, AppLog>() {
          public AppLog call(final Tuple2<Object, BSONObject> tuple) {
            AppLog log = new AppLog();
            BSONObject header =
                (BSONObject) tuple._2().get("headers");
            log.setTarget((String) header.get("target"));
            log.setAction((String) header.get("action"));
            return log;
          }
        }
    );

    SQLContext sqlContext = new org.apache.spark.sql.SQLContext(sc);
    DataFrame logsSchema = sqlContext.createDataFrame(logs, AppLog.class);
    logsSchema.registerTempTable("logs");
    DataFrame groupedMessages = sqlContext.sql(
        "select target, action, Count(*) from logs group by target, action");
    // "SELECT to, body FROM messages WHERE to = \"[email protected]\"");
    groupedMessages.show();
    logsSchema.printSchema();
  }

  public static void main(final String[] args) {
    new DataframeExample().run();
  }
}
Your application is not able to load MongoInputFormat.class... check this... https://docs.oracle.com/javase/7/docs/api/java/lang/NoClassDefFoundError.html –
Ah, so that means it is missing from the jar –
Yes, or the jar is not being loaded correctly –
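For what it's worth, the pom above only configures maven-compiler-plugin, so the jar produced by mvn package contains just the module's own classes; mongo-hadoop-core (and therefore com.mongodb.hadoop.MongoInputFormat) is never packaged into hbfinance-module-1.0-SNAPSHOT.jar and is not on the driver classpath when spark-submit runs it. Below is a minimal sketch of a maven-shade-plugin section that could be added next to the compiler plugin to build a fat jar; the plugin version shown is an assumption, not taken from the original pom.

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <!-- version is an assumption; any recent release should work -->
  <version>2.4.1</version>
  <executions>
    <execution>
      <!-- bind the shade goal to package so "mvn package" emits a fat jar -->
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
    </execution>
  </executions>
</plugin>

After rebuilding, submitting the shaded jar should make the connector classes visible to the driver. An alternative that avoids changing the build is to pass the mongo-hadoop-core jar (plus its own dependencies, such as the MongoDB Java driver) to spark-submit via --jars, so Spark places them on the driver and executor classpaths.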