2017-06-08

Hadoop error when starting ResourceManager and NodeManager

I am setting up Hadoop 3.0.0-alpha3 as a single-node cluster (pseudo-distributed) following the Apache guide. Whenever I try to run the MapReduce examples, the run fails with "connection refused". After running sbin/start-all.sh, I see the following exceptions in the ResourceManager log (and similar ones in the NodeManager log):

xxxx-xx-xx xx:xx:xx,xxx INFO org.apache.commons.beanutils.FluentPropertyBeanIntrospector: Error when creating PropertyDescriptor for public final void org.apache.commons.configuration2.AbstractConfiguration.setProperty(java.lang.String,java.lang.Object)! Ignoring this property. 
xxxx-xx-xx xx:xx:xx,xxx DEBUG org.apache.commons.beanutils.FluentPropertyBeanIntrospector: Exception is: 
java.beans.IntrospectionException: bad write method arg count: public final void org.apache.commons.configuration2.AbstractConfiguration.setProperty(java.lang.String,java.lang.Object) 
    at java.desktop/java.beans.PropertyDescriptor.findPropertyType(PropertyDescriptor.java:696) 
    at java.desktop/java.beans.PropertyDescriptor.setWriteMethod(PropertyDescriptor.java:356) 
    at java.desktop/java.beans.PropertyDescriptor.<init>(PropertyDescriptor.java:142) 
    at org.apache.commons.beanutils.FluentPropertyBeanIntrospector.createFluentPropertyDescritor(FluentPropertyBeanIntrospector.java:178) 
    at org.apache.commons.beanutils.FluentPropertyBeanIntrospector.introspect(FluentPropertyBeanIntrospector.java:141) 
    at org.apache.commons.beanutils.PropertyUtilsBean.fetchIntrospectionData(PropertyUtilsBean.java:2245) 
    at org.apache.commons.beanutils.PropertyUtilsBean.getIntrospectionData(PropertyUtilsBean.java:2226) 
    at org.apache.commons.beanutils.PropertyUtilsBean.getPropertyDescriptor(PropertyUtilsBean.java:954) 
    at org.apache.commons.beanutils.PropertyUtilsBean.isWriteable(PropertyUtilsBean.java:1478) 
    at org.apache.commons.configuration2.beanutils.BeanHelper.isPropertyWriteable(BeanHelper.java:521) 
    at org.apache.commons.configuration2.beanutils.BeanHelper.initProperty(BeanHelper.java:357) 
    at org.apache.commons.configuration2.beanutils.BeanHelper.initBeanProperties(BeanHelper.java:273) 
    at org.apache.commons.configuration2.beanutils.BeanHelper.initBean(BeanHelper.java:192) 
    at org.apache.commons.configuration2.beanutils.BeanHelper$BeanCreationContextImpl.initBean(BeanHelper.java:669) 
    at org.apache.commons.configuration2.beanutils.DefaultBeanFactory.initBeanInstance(DefaultBeanFactory.java:162) 
    at org.apache.commons.configuration2.beanutils.DefaultBeanFactory.createBean(DefaultBeanFactory.java:116) 
    at org.apache.commons.configuration2.beanutils.BeanHelper.createBean(BeanHelper.java:459) 
    at org.apache.commons.configuration2.beanutils.BeanHelper.createBean(BeanHelper.java:479) 
    at org.apache.commons.configuration2.beanutils.BeanHelper.createBean(BeanHelper.java:492) 
    at org.apache.commons.configuration2.builder.BasicConfigurationBuilder.createResultInstance(BasicConfigurationBuilder.java:447) 
    at org.apache.commons.configuration2.builder.BasicConfigurationBuilder.createResult(BasicConfigurationBuilder.java:417) 
    at org.apache.commons.configuration2.builder.BasicConfigurationBuilder.getConfiguration(BasicConfigurationBuilder.java:285) 
    at org.apache.hadoop.metrics2.impl.MetricsConfig.loadFirst(MetricsConfig.java:119) 
    at org.apache.hadoop.metrics2.impl.MetricsConfig.create(MetricsConfig.java:98) 
    at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.configure(MetricsSystemImpl.java:478) 
    at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.start(MetricsSystemImpl.java:188) 
    at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.init(MetricsSystemImpl.java:163) 
    at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.init(DefaultMetricsSystem.java:62) 
    at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.initialize(DefaultMetricsSystem.java:58) 
    at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$RMActiveServices.serviceInit(ResourceManager.java:678) 
    at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163) 
    at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.createAndInitActiveServices(ResourceManager.java:1129) 
    at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.serviceInit(ResourceManager.java:315) 
    at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163) 
    at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.main(ResourceManager.java:1407) 

And later in the same log file:

xxxx-xx-xx xx:xx:xx,xxx FATAL org.apache.hadoop.yarn.server.resourcemanager.ResourceManager: Error starting ResourceManager 
java.lang.ExceptionInInitializerError 
    at com.google.inject.internal.cglib.reflect.$FastClassEmitter.<init>(FastClassEmitter.java:67) 
    at com.google.inject.internal.cglib.reflect.$FastClass$Generator.generateClass(FastClass.java:72) 
    at com.google.inject.internal.cglib.core.$DefaultGeneratorStrategy.generate(DefaultGeneratorStrategy.java:25) 
    at com.google.inject.internal.cglib.core.$AbstractClassGenerator.create(AbstractClassGenerator.java:216) 
    at com.google.inject.internal.cglib.reflect.$FastClass$Generator.create(FastClass.java:64) 
    at com.google.inject.internal.BytecodeGen.newFastClass(BytecodeGen.java:204) 
    at com.google.inject.internal.ProviderMethod$FastClassProviderMethod.<init>(ProviderMethod.java:256) 
    at com.google.inject.internal.ProviderMethod.create(ProviderMethod.java:71) 
    at com.google.inject.internal.ProviderMethodsModule.createProviderMethod(ProviderMethodsModule.java:275) 
    at com.google.inject.internal.ProviderMethodsModule.getProviderMethods(ProviderMethodsModule.java:144) 
    at com.google.inject.internal.ProviderMethodsModule.configure(ProviderMethodsModule.java:123) 
    at com.google.inject.spi.Elements$RecordingBinder.install(Elements.java:340) 
    at com.google.inject.spi.Elements$RecordingBinder.install(Elements.java:349) 
    at com.google.inject.AbstractModule.install(AbstractModule.java:122) 
    at com.google.inject.servlet.ServletModule.configure(ServletModule.java:52) 
    at com.google.inject.AbstractModule.configure(AbstractModule.java:62) 
    at com.google.inject.spi.Elements$RecordingBinder.install(Elements.java:340) 
    at com.google.inject.spi.Elements.getElements(Elements.java:110) 
    at com.google.inject.internal.InjectorShell$Builder.build(InjectorShell.java:138) 
    at com.google.inject.internal.InternalInjectorCreator.build(InternalInjectorCreator.java:104) 
    at com.google.inject.Guice.createInjector(Guice.java:96) 
    at com.google.inject.Guice.createInjector(Guice.java:73) 
    at com.google.inject.Guice.createInjector(Guice.java:62) 
    at org.apache.hadoop.yarn.webapp.WebApps$Builder.build(WebApps.java:332) 
    at org.apache.hadoop.yarn.webapp.WebApps$Builder.start(WebApps.java:377) 
    at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.startWepApp(ResourceManager.java:1116) 
    at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.serviceStart(ResourceManager.java:1218) 
    at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193) 
    at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.main(ResourceManager.java:1408) 
Caused by: java.lang.reflect.InaccessibleObjectException: Unable to make protected final java.lang.Class java.lang.ClassLoader.defineClass(java.lang.String,byte[],int,int,java.security.ProtectionDomain) throws java.lang.ClassFormatError accessible: module java.base does not "opens java.lang" to unnamed module @173f73e7 
    at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:337) 
    at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:281) 
    at java.base/java.lang.reflect.Method.checkCanSetAccessible(Method.java:197) 
    at java.base/java.lang.reflect.Method.setAccessible(Method.java:191) 
    at com.google.inject.internal.cglib.core.$ReflectUtils$2.run(ReflectUtils.java:56) 
    at java.base/java.security.AccessController.doPrivileged(Native Method) 
    at com.google.inject.internal.cglib.core.$ReflectUtils.<clinit>(ReflectUtils.java:46) 
    ... 29 more 

For reference, my core-site.xml:

<configuration> 
    <property> 
     <name>fs.default.name</name> 
     <value>hdfs://localhost:9000</value> 
    </property> 
</configuration> 
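One note on the file above: `fs.default.name` is the deprecated name for this setting; since Hadoop 2 the preferred key is `fs.defaultFS` (the old key still works, but newer docs and tooling use the new one). An equivalent core-site.xml using the current key would be:

```xml
<configuration>
    <property>
        <!-- Current name for the default filesystem URI; fs.default.name is its deprecated alias -->
        <name>fs.defaultFS</name>
        <value>hdfs://localhost:9000</value>
    </property>
</configuration>
```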

hdfs-site.xml:

<configuration> 
    <property> 
     <name>dfs.replication</name> 
     <value>1</value> 
    </property> 
</configuration> 

mapred-site.xml:

<configuration> 
    <property> 
     <name>mapreduce.framework.name</name> 
     <value>yarn</value> 
    </property> 
</configuration> 

and yarn-site.xml:

<configuration> 
    <property> 
     <name>yarn.nodemanager.aux-services</name> 
     <value>mapreduce_shuffle</value> 
    </property> 
    <property> 
     <name>yarn.nodemanager.env-whitelist</name> 
     <value>JAVA_HOME,HADOOP_COMMON_HOME,HADOOP_HDFS_HOME,HADOOP_CONF_DIR,CLASSPATH_PREPEND_DISTCACHE,HADOOP_YARN_HOME,HADOOP_MAPRED_HOME</value> 
    </property> 
</configuration> 

I have no idea what is causing these exceptions; any help with them would be appreciated.

Edit: adding hadoop-env.sh, as @tk421 mentioned in the comments:

export JAVA_HOME=/usr/local/jdk-9 
export HADOOP_HOME=/usr/local/hadoop 
export HADOOP_OS_TYPE=${HADOOP_OS_TYPE:-$(uname -s)} 
case ${HADOOP_OS_TYPE} in 
    Darwin*) 
    export HADOOP_OPTS="${HADOOP_OPTS} -Djava.security.krb5.realm= " 
    export HADOOP_OPTS="${HADOOP_OPTS} -Djava.security.krb5.kdc= " 
    export HADOOP_OPTS="${HADOOP_OPTS} -Djava.security.krb5.conf= " 
    ;; 
esac 
export HADOOP_ROOT_LOGGER=DEBUG,console 
export HADOOP_DAEMON_ROOT_LOGGER=DEBUG,RFA 
What Java version and operating system are you using? –

Lubuntu 17.04, Java build 9-ea+171 – user2361174

Please also provide the contents of hadoop-env.sh –

Answer

Java 9 is not compatible with Hadoop 3 (or with any Hadoop version at this point):

https://issues.apache.org/jira/browse/HADOOP-11123
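A minimal fix, then, is to point Hadoop at a Java 8 install instead of the jdk-9 path in hadoop-env.sh above. A sketch of the change, assuming an OpenJDK 8 package path (adjust to wherever your JDK 8 actually lives, e.g. the output of `update-alternatives --list java` with the trailing `bin/java` stripped):

```shell
# In hadoop-env.sh: replace the Java 9 home with a Java 8 one.
# The path below is an assumption for Debian/Ubuntu-style OpenJDK packages.
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
```

After changing it, stop and restart the daemons (sbin/stop-all.sh, then sbin/start-all.sh) so they pick up the new JVM.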

I changed to Java 8.181 and both are now starting up:

hadoop@hadoop:/usr/local/hadoop$ sbin/start-all.sh 
WARNING: Attempting to start all Apache Hadoop daemons as hadoop in 10 seconds. 
WARNING: This is not a recommended production deployment configuration. 
WARNING: Use CTRL-C to abort. 
Starting namenodes on [localhost] 
Starting datanodes 
Starting secondary namenodes [hadoop] 
Starting resourcemanager 
Starting nodemanagers 
hadoop@hadoop:/usr/local/hadoop$ jps 
8756 SecondaryNameNode 
8389 NameNode 
9173 NodeManager 
9030 ResourceManager 
8535 DataNode 
9515 Jps 
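Once all six daemons appear in `jps` like this, a quick smoke test for the original "connection refused" problem is running one of the bundled MapReduce examples. The jar version suffix below is an assumption; match it to the jar actually shipped in your share/hadoop/mapreduce directory:

```shell
# Submit the bundled pi estimator (2 maps, 10 samples each) to YARN.
# Adjust the version suffix to the examples jar present in your install.
bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-3.0.0-alpha3.jar pi 2 10
```

If this job completes and prints an estimate of pi, the ResourceManager, NodeManager, and HDFS daemons are all reachable.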