
Stack: HDP-2.3.2.0-2950 installed using Ambari 2.1. Sqoop permission exception

The installation was automated, since the machines (9 nodes in total) have internet connectivity, and it was carried out using root credentials.

Output of an ls command, for reference (the sqoop user is missing):

[[email protected] ~]# hadoop fs -ls /user 
Found 7 items 
drwx------ - accumulo hdfs   0 2015-11-05 14:03 /user/accumulo 
drwxrwx--- - ambari-qa hdfs   0 2015-10-30 16:08 /user/ambari-qa 
drwxr-xr-x - hcat  hdfs   0 2015-10-30 16:17 /user/hcat 
drwxr-xr-x - hdfs  hdfs   0 2015-11-11 10:09 /user/hdfs 
drwx------ - hive  hdfs   0 2015-11-06 09:42 /user/hive 
drwxrwxr-x - oozie  hdfs   0 2015-11-05 12:53 /user/oozie 
drwxrwxr-x - spark  hdfs   0 2015-11-05 13:59 /user/spark 
[[email protected] ~]# 
[[email protected] ~]# 

Another thing that bothers me is the output when I look at the user groups (the sqoop user is missing here as well):

cat /etc/group 
root:x:0: 
bin:x:1:bin,daemon 
daemon:x:2:bin,daemon 
sys:x:3:bin,adm 
adm:x:4:adm,daemon 
tty:x:5: 
disk:x:6: 
lp:x:7:daemon 
mem:x:8: 
kmem:x:9: 
wheel:x:10: 
mail:x:12:mail 
uucp:x:14: 
man:x:15: 
games:x:20: 
gopher:x:30: 
video:x:39: 
dip:x:40: 
ftp:x:50: 
lock:x:54: 
audio:x:63: 
nobody:x:99: 
users:x:100:oozie,ambari-qa,tez,falcon 
dbus:x:81: 
utmp:x:22: 
utempter:x:35: 
floppy:x:19: 
vcsa:x:69: 
cdrom:x:11: 
tape:x:33: 
dialout:x:18: 
haldaemon:x:68:haldaemon 
ntp:x:38: 
saslauth:x:76: 
mailnull:x:47: 
smmsp:x:51: 
stapusr:x:156: 
stapsys:x:157: 
stapdev:x:158: 
sshd:x:74: 
tcpdump:x:72: 
slocate:x:21: 
ovirtagent:x:175: 
rpc:x:32: 
rpcuser:x:29: 
nfsnobody:x:65534: 
munin:x:499: 
screen:x:84: 
scotty:x:999: 
tquest:x:6382: 
fuse:x:497: 
httpfs:x:496:httpfs 
knox:x:6383: 
spark:x:6384: 
hdfs:x:6385:hdfs 
accumulo:x:495: 
falcon:x:494: 
flume:x:493: 
hbase:x:492: 
hive:x:491: 
oozie:x:490: 
storm:x:489: 

When importing a table from SQL Server into HDFS using Sqoop (as the 'root' Linux user):

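For orientation, a SQL Server import of this kind is typically launched with a command along these lines (a hypothetical sketch; the JDBC URL, credentials and options are placeholders, and only the table name DimSampleDesc is taken from the log output further down):

sqoop import \
  --connect "jdbc:sqlserver://<sqlserver-host>:1433;databaseName=<database>" \
  --username <sql-user> --password <sql-password> \
  --table DimSampleDesc
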
ERROR tool.ImportTool: Encountered IOException running import job: org.apache.hadoop.security.AccessControlException: Permission denied: user=root, access=WRITE, inode="/user/root/.staging":hdfs:hdfs:drwxr-xr-x 
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319) 
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:292) 
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:213) 
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190) 
     at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1771) 
     at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1755) 
     at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1738) 
     at org.apache.hadoop.hdfs.server.namenode.FSDirMkdirOp.mkdirs(FSDirMkdirOp.java:71) 
     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:3896) 
     at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:984) 
     at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:622) 
     at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java) 
     at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616) 
     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969) 
     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2137) 
     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2133) 
     at java.security.AccessController.doPrivileged(Native Method) 
     at javax.security.auth.Subject.doAs(Subject.java:415) 
     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657) 
     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2131) 
     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) 
     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57) 
     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) 
     at java.lang.reflect.Constructor.newInstance(Constructor.java:526) 
     at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106) 
     at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73) 
     at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:3010) 
     at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2978) 
     at org.apache.hadoop.hdfs.DistributedFileSystem$21.doCall(DistributedFileSystem.java:1047) 
     at org.apache.hadoop.hdfs.DistributedFileSystem$21.doCall(DistributedFileSystem.java:1043) 
     at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81) 
     at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1043) 
     at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1036) 
     at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:133) 
     at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:144) 
     at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290) 
     at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287) 
     at java.security.AccessController.doPrivileged(Native Method) 
     at javax.security.auth.Subject.doAs(Subject.java:415) 
     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657) 
     at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287) 
     at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1308) 
     at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:196) 
     at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:169) 
     at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:266) 
     at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:673) 
     at org.apache.sqoop.manager.SQLServerManager.importTable(SQLServerManager.java:163) 
     at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497) 
     at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605) 
     at org.apache.sqoop.Sqoop.run(Sqoop.java:148) 
     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70) 
     at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:184) 
     at org.apache.sqoop.Sqoop.runTool(Sqoop.java:226) 
     at org.apache.sqoop.Sqoop.runTool(Sqoop.java:235) 
     at org.apache.sqoop.Sqoop.main(Sqoop.java:244) 
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=root, access=WRITE, inode="/user/root/.staging":hdfs:hdfs:drwxr-xr-x 
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319) 
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:292) 
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:213) 
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190) 
     at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1771) 
     at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1755) 
     at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1738) 
     at org.apache.hadoop.hdfs.server.namenode.FSDirMkdirOp.mkdirs(FSDirMkdirOp.java:71) 
     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:3896) 
     at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:984) 
     at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:622) 
     at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java) 
     at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616) 
     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969) 
     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2137) 
     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2133) 
     at java.security.AccessController.doPrivileged(Native Method) 
     at javax.security.auth.Subject.doAs(Subject.java:415) 
     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657) 
     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2131) 
     at org.apache.hadoop.ipc.Client.call(Client.java:1427) 
     at org.apache.hadoop.ipc.Client.call(Client.java:1358) 
     at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) 
     at com.sun.proxy.$Proxy14.mkdirs(Unknown Source) 
     at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:558) 
     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) 
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
     at java.lang.reflect.Method.invoke(Method.java:606) 
     at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) 
     at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) 
     at com.sun.proxy.$Proxy15.mkdirs(Unknown Source) 
     at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:3008) 
     ... 28 more 

While importing a table from SQL Server into HDFS using Sqoop (as the 'sqoop' Linux user):

ERROR tool.ImportTool: Encountered IOException running import job: org.apache.hadoop.security.AccessControlException: Permission denied: user=sqoop, access=WRITE, inode="/user/sqoop/.staging":hdfs:hdfs:drwxr-xr-x 
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319) 
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:292) 
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:213) 
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190) 
     at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1771) 
     at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1755) 
     at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1738) 
     at org.apache.hadoop.hdfs.server.namenode.FSDirMkdirOp.mkdirs(FSDirMkdirOp.java:71) 
     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:3896) 
     at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:984) 
     at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:622) 
     at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java) 
     at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616) 
     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969) 
     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2137) 
     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2133) 
     at java.security.AccessController.doPrivileged(Native Method) 
     at javax.security.auth.Subject.doAs(Subject.java:415) 
     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657) 
     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2131) 
     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) 
     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57) 
     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) 
     at java.lang.reflect.Constructor.newInstance(Constructor.java:526) 
     at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106) 
     at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73) 
     at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:3010) 
     at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2978) 
     at org.apache.hadoop.hdfs.DistributedFileSystem$21.doCall(DistributedFileSystem.java:1047) 
     at org.apache.hadoop.hdfs.DistributedFileSystem$21.doCall(DistributedFileSystem.java:1043) 
     at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81) 
     at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1043) 
     at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1036) 
     at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:133) 
     at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:144) 
     at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290) 
     at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287) 
     at java.security.AccessController.doPrivileged(Native Method) 
     at javax.security.auth.Subject.doAs(Subject.java:415) 
     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657) 
     at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287) 
     at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1308) 
     at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:196) 
     at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:169) 
     at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:266) 
     at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:673) 
     at org.apache.sqoop.manager.SQLServerManager.importTable(SQLServerManager.java:163) 
     at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497) 
     at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605) 
     at org.apache.sqoop.Sqoop.run(Sqoop.java:148) 
     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70) 
     at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:184) 
     at org.apache.sqoop.Sqoop.runTool(Sqoop.java:226) 
     at org.apache.sqoop.Sqoop.runTool(Sqoop.java:235) 
     at org.apache.sqoop.Sqoop.main(Sqoop.java:244) 
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=sqoop, access=WRITE, inode="/user/sqoop/.staging":hdfs:hdfs:drwxr-xr-x 
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319) 
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:292) 
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:213) 
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190) 
     at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1771) 
     at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1755) 
     at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1738) 
     at org.apache.hadoop.hdfs.server.namenode.FSDirMkdirOp.mkdirs(FSDirMkdirOp.java:71) 
     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:3896) 
     at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:984) 
     at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:622) 
     at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java) 
     at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616) 
     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969) 
     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2137) 
     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2133) 
     at java.security.AccessController.doPrivileged(Native Method) 
     at javax.security.auth.Subject.doAs(Subject.java:415) 
     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657) 
     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2131) 
     at org.apache.hadoop.ipc.Client.call(Client.java:1427) 
     at org.apache.hadoop.ipc.Client.call(Client.java:1358) 
     at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) 
     at com.sun.proxy.$Proxy14.mkdirs(Unknown Source) 
     at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:558) 
     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) 
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
     at java.lang.reflect.Method.invoke(Method.java:606) 
     at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) 
     at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) 
     at com.sun.proxy.$Proxy15.mkdirs(Unknown Source) 
     at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:3008) 
     ... 28 more 

When run as the 'hdfs' Linux user it works, but with ONE error log statement:

INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.3.2.0-2950 
16/05/04 16:34:13 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead. 
16/05/04 16:34:14 INFO manager.SqlManager: Using default fetchSize of 1000 
16/05/04 16:34:14 INFO tool.CodeGenTool: Beginning code generation 
16/05/04 16:34:14 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM [DimSampleDesc] AS t WHERE 1=0 
16/05/04 16:34:15 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/hdp/2.3.2.0-2950/hadoop-mapreduce 
Note: /tmp/sqoop-hdfs/compile/6f239d67662b5e2a3462b51268033d6e/DimSampleDesc.java uses or overrides a deprecated API. 
Note: Recompile with -Xlint:deprecation for details. 
16/05/04 16:34:17 ERROR orm.CompilationManager: Could not make directory: /root/. 

I have the following questions:

  1. Why does this error occur even though the installation was automated, i.e. I did not skip any of the services/configurations?
  2. What is the ideal way to run a Sqoop import or an MR job (does it mean that every user who starts the command needs a home directory in HDFS)?

Answers


You need to create a home directory in HDFS for the user that starts the command.

When you launch the sqoop command, Hadoop maps the local user to an HDFS user and tries to find that user's home directory, which is /user/${user.name}.
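
A quick way to check which HDFS home directory a given local account maps to (a minimal sketch; nothing here is specific to this cluster):

# run these as the Linux user in question
id -un                             # the local user name, e.g. sqoop
hadoop fs -ls                      # with no path, lists that user's HDFS home and fails if it does not exist
hadoop fs -ls -d /user/$(whoami)   # shows the home directory entry itself, if present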

Hadoop's superuser is hdfs, so you need to run:

$ su - hdfs -c 'hadoop fs -mkdir /user/sqoop'
$ su - hdfs -c 'hadoop fs -chown sqoop:hdfs /user/sqoop'
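
To verify (a hedged example; the expected listing is based on the /user output at the top of the question plus the newly created directory):

su - hdfs -c 'hadoop fs -ls /user'
# the listing should now also contain a line similar to:
# drwxr-xr-x   - sqoop hdfs          0 <date> /user/sqoop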

and then start sqoop as the user sqoop.
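
For example (a sketch, assuming the same hypothetical import command shown earlier in the question):

su - sqoop
# ... then re-run the same 'sqoop import' command; the staging directory
# /user/sqoop/.staging can now be created under the new home directory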

Another alternative is to change the Hive staging directory to some other HDFS location that every user has write access to (such as /tmp),

in hive-site.xml:

<property> 
    <name>hive.exec.stagingdir</name> 
    <value>/tmp</value> 
</property> 

I have edited my question. Could you check whether my understanding is correct? –


Yes, exactly! When you launch the command as the 'hdfs' user, hadoop (or hdfs) executes it as hdfs, and since the 'hdfs' user already has a home directory in HDFS, the request succeeds. With the user sqoop, however, it tries to use the home /user/sqoop, cannot find it, and when it then tries to create it, sqoop has no write access to /user (which is owned by the superuser 'hdfs'). – user1314742
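
This can be checked directly (a minimal sketch; the expected ownership is the hdfs:hdfs / drwxr-xr-x that the AccessControlException messages above report for the ancestor directory):

hadoop fs -ls -d /user
# drwxr-xr-x   - hdfs hdfs          0 <date> /user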


I expect this will work. I have a couple of doubts now and one more issue, but I will post a separate question. –
