
Can't start the Hive metastore server. I'm starting the Hive metastore server with the following command:

hive --service metastore & 

But it's not working, and I'm getting the error below. This worked fine before, and I can't understand why it stopped working... The only thing I did was run some queries against Hive using Scala.

This is the error I see:

Caused by: MetaException(message:Could not connect to meta store using any of the URIs provided. Most recent failure: org.apache.thrift.transport.TTransportException: java.net.ConnectException: Connection refused 
     at org.apache.thrift.transport.TSocket.open(TSocket.java:187) 


16/04/04 23:32:39 [main]: INFO hive.metastore: Waiting 1 seconds before next connection attempt. 
16/04/04 23:32:40 [main]: INFO hive.metastore: Trying to connect to metastore with URI thrift://localhost:9084 
16/04/04 23:32:40 [main]: WARN hive.metastore: Failed to connect to the MetaStore Server... 
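
A note on this first error: "Connection refused" means nothing was accepting connections on the metastore port at all, so the client-side retries are a symptom rather than the cause. A minimal diagnostic sketch, assuming port 9084 as shown in the log and standard Linux tools:

# Is anything listening on the metastore port?
ss -tlnp | grep 9084          # or: netstat -tlnp | grep 9084

# Is a metastore JVM running at all? It is launched via RunJar, so match
# the command line, not the JVM main class ([H] keeps grep from matching itself).
ps aux | grep '[H]iveMetaStore'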

And also this:

[[email protected] ~]$ hive --service metastore & 
[1] 3773 
[[email protected] ~]$ SLF4J: Class path contains multiple SLF4J bindings. 
SLF4J: Found binding in [jar:file:/usr/local/hadoop-2.7.1/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class] 
SLF4J: Found binding in [jar:file:/usr/local/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class] 
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation. 
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory] 
Starting Hive Metastore Server 
SLF4J: Class path contains multiple SLF4J bindings. 
SLF4J: Found binding in [jar:file:/usr/local/hadoop-2.7.1/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class] 
SLF4J: Found binding in [jar:file:/usr/local/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class] 
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation. 
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory] 
javax.jdo.JDODataStoreException: Exception thrown obtaining schema column information from datastore 
    at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:451) 
    at org.datanucleus.api.jdo.JDOPersistenceManager.jdoMakePersistent(JDOPersistenceManager.java:732) 
    at org.datanucleus.api.jdo.JDOPersistenceManager.makePersistent(JDOPersistenceManager.java:752) 
    at org.apache.hadoop.hive.metastore.ObjectStore.addRole(ObjectStore.java:3228) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:498) 
    at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:114) 
    at com.sun.proxy.$Proxy2.addRole(Unknown Source) 
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultRoles_core(HiveMetaStore.java:656) 
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultRoles(HiveMetaStore.java:648) 
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:462) 
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66) 
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72) 
    at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5756) 
    at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5751) 
    at org.apache.hadoop.hive.metastore.HiveMetaStore.startMetaStore(HiveMetaStore.java:5984) 
    at org.apache.hadoop.hive.metastore.HiveMetaStore.main(HiveMetaStore.java:5909) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:498) 
    at org.apache.hadoop.util.RunJar.run(RunJar.java:221) 
    at org.apache.hadoop.util.RunJar.main(RunJar.java:136) 
NestedThrowablesStackTrace: 
com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Table 'hive.PARTITION_KEY_VALS' doesn't exist 
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) 
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) 
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) 
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423) 
    at com.mysql.jdbc.Util.handleNewInstance(Util.java:406) 
    at com.mysql.jdbc.Util.getInstance(Util.java:381) 
    at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1030) 
    at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:956) 
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3515) 
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3447) 
    at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1951) 
    at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2101) 
    at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2548) 
    at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2477) 
    at com.mysql.jdbc.StatementImpl.executeQuery(StatementImpl.java:1422) 
    at com.mysql.jdbc.DatabaseMetaData$2.forEach(DatabaseMetaData.java:2487) 
    at com.mysql.jdbc.IterateBlock.doForAll(IterateBlock.java:50) 
    at com.mysql.jdbc.DatabaseMetaData.getColumns(DatabaseMetaData.java:2361) 
    at org.datanucleus.store.rdbms.adapter.BaseDatastoreAdapter.getColumns(BaseDatastoreAdapter.java:1532) 
    at org.datanucleus.store.rdbms.schema.RDBMSSchemaHandler.refreshTableData(RDBMSSchemaHandler.java:921) 
    at org.datanucleus.store.rdbms.schema.RDBMSSchemaHandler.getRDBMSTableInfoForTable(RDBMSSchemaHandler.java:821) 
    at org.datanucleus.store.rdbms.schema.RDBMSSchemaHandler.getRDBMSTableInfoForTable(RDBMSSchemaHandler.java:770) 
    at org.datanucleus.store.rdbms.schema.RDBMSSchemaHandler.getSchemaData(RDBMSSchemaHandler.java:206) 
    at org.datanucleus.store.rdbms.RDBMSStoreManager.getColumnInfoForTable(RDBMSStoreManager.java:2378) 
    at org.datanucleus.store.rdbms.table.TableImpl.validateColumns(TableImpl.java:215) 
    at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.performTablesValidation(RDBMSStoreManager.java:3393) 
    at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.addClassTablesAndValidate(RDBMSStoreManager.java:3190) 
    at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.run(RDBMSStoreManager.java:2841) 
    at org.datanucleus.store.rdbms.AbstractSchemaTransaction.execute(AbstractSchemaTransaction.java:122) 
    at org.datanucleus.store.rdbms.RDBMSStoreManager.addClasses(RDBMSStoreManager.java:1605) 
    at org.datanucleus.store.AbstractStoreManager.addClass(AbstractStoreManager.java:954) 
    at org.datanucleus.store.rdbms.RDBMSStoreManager.getDatastoreClass(RDBMSStoreManager.java:679) 
    at org.datanucleus.store.rdbms.RDBMSStoreManager.getPropertiesForGenerator(RDBMSStoreManager.java:2045) 
    at org.datanucleus.store.AbstractStoreManager.getStrategyValue(AbstractStoreManager.java:1365) 
    at org.datanucleus.ExecutionContextImpl.newObjectId(ExecutionContextImpl.java:3827) 
    at org.datanucleus.state.JDOStateManager.setIdentity(JDOStateManager.java:2571) 
    at org.datanucleus.state.JDOStateManager.initialiseForPersistentNew(JDOStateManager.java:513) 
    at org.datanucleus.state.ObjectProviderFactoryImpl.newForPersistentNew(ObjectProviderFactoryImpl.java:232) 
    at org.datanucleus.ExecutionContextImpl.newObjectProviderForPersistentNew(ExecutionContextImpl.java:1414) 
    at org.datanucleus.ExecutionContextImpl.persistObjectInternal(ExecutionContextImpl.java:2218) 
    at org.datanucleus.ExecutionContextImpl.persistObjectWork(ExecutionContextImpl.java:2065) 
    at org.datanucleus.ExecutionContextImpl.persistObject(ExecutionContextImpl.java:1913) 
    at org.datanucleus.ExecutionContextThreadedImpl.persistObject(ExecutionContextThreadedImpl.java:217) 
    at org.datanucleus.api.jdo.JDOPersistenceManager.jdoMakePersistent(JDOPersistenceManager.java:727) 
    at org.datanucleus.api.jdo.JDOPersistenceManager.makePersistent(JDOPersistenceManager.java:752) 
    at org.apache.hadoop.hive.metastore.ObjectStore.addRole(ObjectStore.java:3228) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:498) 
    at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:114) 
    at com.sun.proxy.$Proxy2.addRole(Unknown Source) 
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultRoles_core(HiveMetaStore.java:656) 
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultRoles(HiveMetaStore.java:648) 
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:462) 
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66) 
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72) 
    at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5756) 
    at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5751) 
    at org.apache.hadoop.hive.metastore.HiveMetaStore.startMetaStore(HiveMetaStore.java:5984) 
    at org.apache.hadoop.hive.metastore.HiveMetaStore.main(HiveMetaStore.java:5909) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:498) 
    at org.apache.hadoop.util.RunJar.run(RunJar.java:221) 
    at org.apache.hadoop.util.RunJar.main(RunJar.java:136) 
Exception in thread "main" javax.jdo.JDODataStoreException: Exception thrown obtaining schema column information from datastore 
    at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:451) 
    at org.datanucleus.api.jdo.JDOPersistenceManager.jdoMakePersistent(JDOPersistenceManager.java:732) 
    at org.datanucleus.api.jdo.JDOPersistenceManager.makePersistent(JDOPersistenceManager.java:752) 
    at org.apache.hadoop.hive.metastore.ObjectStore.addRole(ObjectStore.java:3228) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:498) 
    at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:114) 
    at com.sun.proxy.$Proxy2.addRole(Unknown Source) 
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultRoles_core(HiveMetaStore.java:656) 
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultRoles(HiveMetaStore.java:648) 
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:462) 
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66) 
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72) 
    at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5756) 
    at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5751) 
    at org.apache.hadoop.hive.metastore.HiveMetaStore.startMetaStore(HiveMetaStore.java:5984) 
    at org.apache.hadoop.hive.metastore.HiveMetaStore.main(HiveMetaStore.java:5909) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:498) 
    at org.apache.hadoop.util.RunJar.run(RunJar.java:221) 
    at org.apache.hadoop.util.RunJar.main(RunJar.java:136) 
NestedThrowablesStackTrace: 
com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Table 'hive.PARTITION_KEY_VALS' doesn't exist 
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) 
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) 
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) 
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423) 
    at com.mysql.jdbc.Util.handleNewInstance(Util.java:406) 
    at com.mysql.jdbc.Util.getInstance(Util.java:381) 
    at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1030) 
    at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:956) 
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3515) 
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3447) 
    at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1951) 
    at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2101) 
    at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2548) 
    at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2477) 
    at com.mysql.jdbc.StatementImpl.executeQuery(StatementImpl.java:1422) 
    at com.mysql.jdbc.DatabaseMetaData$2.forEach(DatabaseMetaData.java:2487) 
    at com.mysql.jdbc.IterateBlock.doForAll(IterateBlock.java:50) 
    at com.mysql.jdbc.DatabaseMetaData.getColumns(DatabaseMetaData.java:2361) 
    at org.datanucleus.store.rdbms.adapter.BaseDatastoreAdapter.getColumns(BaseDatastoreAdapter.java:1532) 
    at org.datanucleus.store.rdbms.schema.RDBMSSchemaHandler.refreshTableData(RDBMSSchemaHandler.java:921) 
    at org.datanucleus.store.rdbms.schema.RDBMSSchemaHandler.getRDBMSTableInfoForTable(RDBMSSchemaHandler.java:821) 
    at org.datanucleus.store.rdbms.schema.RDBMSSchemaHandler.getRDBMSTableInfoForTable(RDBMSSchemaHandler.java:770) 
    at org.datanucleus.store.rdbms.schema.RDBMSSchemaHandler.getSchemaData(RDBMSSchemaHandler.java:206) 
    at org.datanucleus.store.rdbms.RDBMSStoreManager.getColumnInfoForTable(RDBMSStoreManager.java:2378) 
    at org.datanucleus.store.rdbms.table.TableImpl.validateColumns(TableImpl.java:215) 
    at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.performTablesValidation(RDBMSStoreManager.java:3393) 
    at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.addClassTablesAndValidate(RDBMSStoreManager.java:3190) 
    at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.run(RDBMSStoreManager.java:2841) 
    at org.datanucleus.store.rdbms.AbstractSchemaTransaction.execute(AbstractSchemaTransaction.java:122) 
    at org.datanucleus.store.rdbms.RDBMSStoreManager.addClasses(RDBMSStoreManager.java:1605) 
    at org.datanucleus.store.AbstractStoreManager.addClass(AbstractStoreManager.java:954) 
    at org.datanucleus.store.rdbms.RDBMSStoreManager.getDatastoreClass(RDBMSStoreManager.java:679) 
    at org.datanucleus.store.rdbms.RDBMSStoreManager.getPropertiesForGenerator(RDBMSStoreManager.java:2045) 
    at org.datanucleus.store.AbstractStoreManager.getStrategyValue(AbstractStoreManager.java:1365) 
    at org.datanucleus.ExecutionContextImpl.newObjectId(ExecutionContextImpl.java:3827) 
    at org.datanucleus.state.JDOStateManager.setIdentity(JDOStateManager.java:2571) 
    at org.datanucleus.state.JDOStateManager.initialiseForPersistentNew(JDOStateManager.java:513) 
    at org.datanucleus.state.ObjectProviderFactoryImpl.newForPersistentNew(ObjectProviderFactoryImpl.java:232) 
    at org.datanucleus.ExecutionContextImpl.newObjectProviderForPersistentNew(ExecutionContextImpl.java:1414) 
    at org.datanucleus.ExecutionContextImpl.persistObjectInternal(ExecutionContextImpl.java:2218) 
    at org.datanucleus.ExecutionContextImpl.persistObjectWork(ExecutionContextImpl.java:2065) 
    at org.datanucleus.ExecutionContextImpl.persistObject(ExecutionContextImpl.java:1913) 
    at org.datanucleus.ExecutionContextThreadedImpl.persistObject(ExecutionContextThreadedImpl.java:217) 
    at org.datanucleus.api.jdo.JDOPersistenceManager.jdoMakePersistent(JDOPersistenceManager.java:727) 
    at org.datanucleus.api.jdo.JDOPersistenceManager.makePersistent(JDOPersistenceManager.java:752) 
    at org.apache.hadoop.hive.metastore.ObjectStore.addRole(ObjectStore.java:3228) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:498) 
    at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:114) 
    at com.sun.proxy.$Proxy2.addRole(Unknown Source) 
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultRoles_core(HiveMetaStore.java:656) 
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultRoles(HiveMetaStore.java:648) 
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:462) 
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66) 
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72) 
    at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5756) 
    at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5751) 
    at org.apache.hadoop.hive.metastore.HiveMetaStore.startMetaStore(HiveMetaStore.java:5984) 
    at org.apache.hadoop.hive.metastore.HiveMetaStore.main(HiveMetaStore.java:5909) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:498) 
    at org.apache.hadoop.util.RunJar.run(RunJar.java:221) 
    at org.apache.hadoop.util.RunJar.main(RunJar.java:136) 
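
The root cause in this trace is the nested error: Table 'hive.PARTITION_KEY_VALS' doesn't exist. The metastore schema inside MySQL is incomplete, so the server dies while validating its tables at startup. A minimal sketch of rebuilding the schema with Hive's schematool, assuming this Hive version bundles schematool, the javax.jdo.option.* JDBC settings are already in hive-site.xml, and losing the existing metadata is acceptable:

# WARNING: this destroys all existing Hive metadata stored in MySQL.
mysql -u root -p -e "DROP DATABASE IF EXISTS hive; CREATE DATABASE hive;"

# Lay out a fresh metastore schema (connection details come from hive-site.xml)
$HIVE_HOME/bin/schematool -dbType mysql -initSchema

# Verify the schema version afterwards
$HIVE_HOME/bin/schematool -dbType mysql -info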

So I dropped the hive database in MySQL and created it again. And now, when I try to start the Hive metastore with:

hive --service metastore & 

I get this:

[[email protected] ~]$ hive --service metastore & 
[1] 4102 
[[email protected] ~]$ SLF4J: Class path contains multiple SLF4J bindings. 
SLF4J: Found binding in [jar:file:/usr/local/hadoop-2.7.1/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class] 
SLF4J: Found binding in [jar:file:/usr/local/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class] 
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation. 
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory] 
Starting Hive Metastore Server 
SLF4J: Class path contains multiple SLF4J bindings. 
SLF4J: Found binding in [jar:file:/usr/local/hadoop-2.7.1/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class] 
SLF4J: Found binding in [jar:file:/usr/local/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class] 
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation. 
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory] 
org.apache.thrift.transport.TTransportException: Could not create ServerSocket on address 0.0.0.0/0.0.0.0:9084. 
    at org.apache.thrift.transport.TServerSocket.<init>(TServerSocket.java:109) 
    at org.apache.thrift.transport.TServerSocket.<init>(TServerSocket.java:91) 
    at org.apache.thrift.transport.TServerSocket.<init>(TServerSocket.java:83) 
    at org.apache.hadoop.hive.metastore.TServerSocketKeepAlive.<init>(TServerSocketKeepAlive.java:34) 
    at org.apache.hadoop.hive.metastore.HiveMetaStore.startMetaStore(HiveMetaStore.java:5968) 
    at org.apache.hadoop.hive.metastore.HiveMetaStore.main(HiveMetaStore.java:5909) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:498) 
    at org.apache.hadoop.util.RunJar.run(RunJar.java:221) 
    at org.apache.hadoop.util.RunJar.main(RunJar.java:136) 
Exception in thread "main" org.apache.thrift.transport.TTransportException: Could not create ServerSocket on address 0.0.0.0/0.0.0.0:9084. 
    at org.apache.thrift.transport.TServerSocket.<init>(TServerSocket.java:109) 
    at org.apache.thrift.transport.TServerSocket.<init>(TServerSocket.java:91) 
    at org.apache.thrift.transport.TServerSocket.<init>(TServerSocket.java:83) 
    at org.apache.hadoop.hive.metastore.TServerSocketKeepAlive.<init>(TServerSocketKeepAlive.java:34) 
    at org.apache.hadoop.hive.metastore.HiveMetaStore.startMetaStore(HiveMetaStore.java:5968) 
    at org.apache.hadoop.hive.metastore.HiveMetaStore.main(HiveMetaStore.java:5909) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:498) 
    at org.apache.hadoop.util.RunJar.run(RunJar.java:221) 
    at org.apache.hadoop.util.RunJar.main(RunJar.java:136) 
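
This second failure is different from the first: "Could not create ServerSocket on address 0.0.0.0/0.0.0.0:9084" means the port is already taken, quite possibly by a half-alive metastore left over from an earlier attempt (note the backgrounded job [1] 3773 above). A short sketch for finding and stopping whatever owns the port, assuming lsof is installed:

# Which process owns the metastore port?
lsof -i :9084                 # or: ss -tlnp | grep 9084

# If it is a stale metastore, stop it (use the PID from the lsof output)
kill <pid>

# ...then try a fresh start
hive --service metastore &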

'MySQLSyntaxErrorException: Table 'hive.PARTITION_KEY_VALS' doesn't exist'. It looks like an issue with your MySQL tables. –

When I go into the mysql prompt the table shows up, but when I do a select (*) from that table I get this: ERROR 1146 (42S02): Table 'hive.PARTITION_KEY_VALS' doesn't exist. But I don't understand why, since I only ran some queries from Spark and created some views. Is there a way to fix this, or should I uninstall and reinstall MySQL to see if that solves the problem? – codin
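
The "shows up in SHOW TABLES but is not selectable" symptom described in this comment is typical of an orphaned InnoDB table, where the .frm definition file survives while the tablespace behind it is gone. A quick way to confirm, assuming the metastore database and the MySQL user are both named hive:

# CHECK TABLE separates a listed-but-broken table from a healthy one;
# an orphaned definition appears in SHOW TABLES yet fails the check.
mysql -u hive -p hive -e "SHOW TABLES LIKE 'PARTITION%'; CHECK TABLE PARTITION_KEY_VALS;"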

Somehow your Hive metastore setup in MySQL got corrupted. I think the right approach is to set it back up again. –

Answer

Go to the Hive lib folder and delete the jar file containing the slf4j-log4j binding. The same file also exists in the hadoop folder, which is what causes this problem.
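
If you go this route, here is a sketch for locating the duplicate bindings first; the paths come from the SLF4J warnings above, and $HIVE_HOME is an assumption about where Hive is installed:

# Find loose copies of the slf4j-log4j binding jar
find /usr/local/hadoop-2.7.1 "$HIVE_HOME/lib" -name 'slf4j-log4j12*.jar' 2>/dev/null

# The Spark assembly embeds its binding inside the jar, so list it instead
unzip -l /usr/local/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar | grep StaticLoggerBinder

Note that SLF4J itself treats multiple bindings as a warning rather than an error, so this cleanup may quiet the log noise without touching the MySQL schema or port problems shown above.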
