2017-11-28

PySpark: ERROR SparkContext: Error initializing SparkContext. java.nio.file.AccessDeniedException

New to Spark. I installed it standalone on a Linux machine to run a POC. Everything was looking OK and our code was working fine, until we started getting the following intermittent error:

ERROR SparkContext: Error initializing SparkContext. 
java.nio.file.AccessDeniedException: /tmp/spark-8bcc1872-653b-419f-9d46-b3e449b3c223/userFiles-6ac82898-4415-4a93-9075-b50ace65ddc4/myscript.py 

I tried moving the code, renaming the code, and simplifying the code; none of it helped. After a few seconds the error would just stop on its own, with nothing changed anywhere.
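For what it's worth, a quick permission check like the following (a sketch only; the /tmp path pattern is taken from the error above) shows what mode the scratch directories actually get:

import glob
import os
import stat

# Diagnostic sketch: print the process umask and the modes of any Spark
# scratch directories left under /tmp (path pattern from the error above).
umask = os.umask(0)
os.umask(umask)  # setting is the only way to read the umask; restore it
print("umask = %s" % oct(umask))
for d in glob.glob("/tmp/spark-*/userFiles-*"):
    print(d, oct(stat.S_IMODE(os.stat(d).st_mode)))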

My Spark version is spark-2.2.0. Strangely, once that error occurs, typing pyspark also fails to start, producing a very long error trace (last part below):

Caused by: java.sql.SQLException: Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------ 
java.sql.SQLException: Failed to create database 'metastore_db', see the next exception for details. 
     at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source) 
     at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source) 
     at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source) 
     at org.apache.derby.impl.jdbc.EmbedConnection.createDatabase(Unknown Source) 
     at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source) 
     at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source) 
     at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source) 
     at java.security.AccessController.doPrivileged(Native Method) 
     at org.apache.derby.jdbc.InternalDriver.getNewEmbedConnection(Unknown Source) 
     at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source) 
     at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source) 
     at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source) 
     at java.sql.DriverManager.getConnection(DriverManager.java:664) 
     at java.sql.DriverManager.getConnection(DriverManager.java:208) 
     at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361) 
     at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416) 
     at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120) 
     at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501) 
     at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298) 
     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) 
     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) 
     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) 
     at java.lang.reflect.Constructor.newInstance(Constructor.java:423) 
     at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631) 
     at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301) 
     at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187) 
     at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356) 
     at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775) 
     at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333) 
     at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202) 
     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
     at java.lang.reflect.Method.invoke(Method.java:498) 
     at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965) 
     at java.security.AccessController.doPrivileged(Native Method) 
     at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960) 
     at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166) 
     at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808) 
     at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701) 
     at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365) 
     at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394) 
     at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291) 
     at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258) 
     at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76) 
     at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136) 
     at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57) 
     at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66) 
     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593) 
     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571) 
     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:624) 
     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461) 
     at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66) 
     at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72) 
     at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762) 
     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199) 
     at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74) 
     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) 
     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) 
     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) 
     at java.lang.reflect.Constructor.newInstance(Constructor.java:423) 
     at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521) 
     at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86) 
     at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132) 
     at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104) 
     at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005) 
     at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024) 
     at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503) 
     at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:191) 
     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) 
     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) 
     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) 
     at java.lang.reflect.Constructor.newInstance(Constructor.java:423) 
     at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264) 
     at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:362) 
     at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:266) 
     at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66) 
     at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65) 
     at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:194) 
     at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:194) 
     at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:194) 
     at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97) 
     at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:193) 
     at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:105) 
     at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:93) 
     at org.apache.spark.sql.hive.HiveSessionStateBuilder.externalCatalog(HiveSessionStateBuilder.scala:39) 
     at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog$lzycompute(HiveSessionStateBuilder.scala:54) 
     at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:52) 
     at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:35) 
     at org.apache.spark.sql.internal.BaseSessionStateBuilder.build(BaseSessionStateBuilder.scala:289) 
     at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$instantiateSessionState(SparkSession.scala:1050) 
     at org.apache.spark.sql.SparkSession$$anonfun$sessionState$2.apply(SparkSession.scala:130) 
     at org.apache.spark.sql.SparkSession$$anonfun$sessionState$2.apply(SparkSession.scala:130) 
     at scala.Option.getOrElse(Option.scala:121) 
     at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:129) 
     at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:126) 
     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
     at java.lang.reflect.Method.invoke(Method.java:498) 
     at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244) 
     at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357) 
     at py4j.Gateway.invoke(Gateway.java:280) 
     at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132) 
     at py4j.commands.CallCommand.execute(CallCommand.java:79) 
     at py4j.GatewayConnection.run(GatewayConnection.java:214) 
     at java.lang.Thread.run(Thread.java:748) 
Caused by: ERROR XJ041: Failed to create database 'metastore_db', see the next exception for details. 

Any help is appreciated.


That stack trace is massive. Surely you can trim it down? –


The actual stack trace is five times what I posted. The problem is with metastore_db – fromSAS2Spark


Please add some code. How do you instantiate the session? – mkaran

Answer


The problem has been resolved: the umask setting was producing incorrect permissions on the metastore_db folder. What is the correct, "clean" way to submit pyspark without creating these folders anywhere?
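As for that follow-up question, one possible approach (a sketch under assumptions, not a verified recipe; the /data/spark/... paths are placeholders for directories owned by the submitting user) is to point the embedded Derby metastore and the warehouse at explicit locations:

from pyspark.sql import SparkSession

# A sketch, not a verified recipe: relocate the embedded Derby metastore and
# the warehouse so that metastore_db is not created in whatever directory
# pyspark happens to be started from. Substitute paths your user owns.
spark = (
    SparkSession.builder
    .appName("poc")
    .config(
        "spark.hadoop.javax.jdo.option.ConnectionURL",
        "jdbc:derby:;databaseName=/data/spark/metastore_db;create=true",
    )
    .config("spark.sql.warehouse.dir", "/data/spark/warehouse")
    .getOrCreate()
)

These properties are more commonly set in hive-site.xml, and they can also be passed on the pyspark/spark-submit command line with --conf; either way, they must be in place before the first SparkSession is created.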
