
[Kerberos] Accessing a Kerberos-secured cluster from a Java client

Posted: 2015-12-29 22:26:09

To access a Kerberos-secured cluster from a Java client, the most important first step is to obtain a usable keytab file from the cluster admin, which is used for authentication. After that it is a matter of adjusting the connection configuration. The following uses an HDFS connection as the example.

Requesting a usable keytab file

A keytab file stores the keys of one or more principals. The principals are created on the KDC side, and their keys can then be exported into a keytab file.

Connection test

You can test whether the connection works by running a Hadoop command.

1. Authenticate with kinit

kinit -kt path-to-keytab principalName

This first verifies that principalName is valid. If it is, the KDC returns an initial TGT, which is typically valid for a few hours.

2. Run a Hadoop command

hadoop fs -ls hdfs://namenode1:8020

Running this command will likely throw various exceptions. Follow the hints in each exception and add configuration step by step. The likely settings are:

1) Enable Kerberos authentication

    hadoop.security.authentication: kerberos

2) Specify the server principal

    dfs.namenode.kerberos.principal

3) Run the Hadoop command again

    Normally, once the first two settings are in place, there are no major problems and the Hadoop command returns results as expected. If it keeps failing with "Server has invalid Kerberos principal", check the following three things:

  1. Whether the server principal is configured correctly; normally setting it to the same value as in the namenode configuration is enough.
  2. Whether DNS resolution is consistent. The HDFS client initiates an RPC call to the namenode to get the HDFS service principal, then compares the hostname in that service principal with the canonical name of the namenode hostname. The error shows up when the namenode's canonical name resolved on the client machine differs from what is in DNS; see the sketch below for a quick check.
  3. RPC data-transfer encryption. Try adding dfs.encrypt.data.transfer, dfs.encrypt.data.transfer.algorithm, dfs.trustedchannel.resolver.class, and dfs.datatransfer.client.encrypt.
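
The DNS check in item 2 can be done with a few lines of Java on the client machine. This is a minimal sketch, assuming the hostname "hadoopnamenode01" below is a placeholder for your own namenode (the HadoopDNSVerifier tool in the references performs a similar check):

import java.net.InetAddress;
import java.net.UnknownHostException;

// Minimal sketch: resolve the namenode hostname on the client machine and print
// its canonical name. If the canonical name differs from the hostname used in
// the HDFS service principal, the "Server has invalid Kerberos principal"
// error can occur.
public class DnsCanonicalNameCheck {
    public static void main(String[] args) throws UnknownHostException {
        String namenodeHost = args.length > 0 ? args[0] : "hadoopnamenode01"; // placeholder
        InetAddress addr = InetAddress.getByName(namenodeHost);
        System.out.println("configured hostname : " + namenodeHost);
        System.out.println("resolved address    : " + addr.getHostAddress());
        System.out.println("canonical hostname  : " + addr.getCanonicalHostName());
    }
}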

Java code for Kerberos authentication

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.UserGroupInformation;

public class HadoopSecurityUtil {

    // Configuration keys that point to the keytab file and the Kerberos principal.
    public static final String EAGLE_KEYTAB_FILE_KEY = "eagle.keytab.file";
    public static final String EAGLE_USER_NAME_KEY = "eagle.kerberos.principal";

    public static void login(Configuration kConfig) throws IOException {
        // Skip the Kerberos login if no keytab or principal is configured.
        if (kConfig.get(EAGLE_KEYTAB_FILE_KEY) == null || kConfig.get(EAGLE_USER_NAME_KEY) == null) return;

        // Switch Hadoop security to Kerberos and log in from the keytab.
        kConfig.setBoolean("hadoop.security.authorization", true);
        kConfig.set("hadoop.security.authentication", "kerberos");
        UserGroupInformation.setConfiguration(kConfig);
        UserGroupInformation.loginUserFromKeytab(kConfig.get(EAGLE_USER_NAME_KEY), kConfig.get(EAGLE_KEYTAB_FILE_KEY));
    }
}
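
A minimal usage sketch for the helper above; the keytab path and principals are placeholders, and the namenode address is the one from the connection test earlier. Adjust them, or load the HDFS properties from the next section into the same Configuration:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsKerberosExample {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Namenode address from the connection test above; adjust for your cluster.
        conf.set("fs.defaultFS", "hdfs://namenode1:8020");
        // Placeholder keytab path and principals; use the values from your admin.
        conf.set("eagle.keytab.file", "/EAGLE-HOME/.keytab/eagle.keytab");
        conf.set("eagle.kerberos.principal", "eagle@EXAMPLE.COM");
        conf.set("dfs.namenode.kerberos.principal", "hadoop/_HOST@EXAMPLE.COM");

        // Log in from the keytab before touching HDFS.
        HadoopSecurityUtil.login(conf);

        // After a successful login, normal HDFS client calls work as usual.
        FileSystem fs = FileSystem.get(conf);
        for (FileStatus status : fs.listStatus(new Path("/"))) {
            System.out.println(status.getPath());
        }
    }
}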

HDFS & HBase configuration

  • HDFS
{
 "fs.defaultFS":"hdfs://nameservice1",
 "dfs.nameservices": "nameservice1",
 "dfs.ha.namenodes.nameservice1":"namenode1,namenode2",
 "dfs.namenode.rpc-address.nameservice1.namenode1": "hadoopnamenode01:8020",
 "dfs.namenode.rpc-address.nameservice1.namenode2": "hadoopnamenode02:8020",
 "dfs.client.failover.proxy.provider.apollo-phx-nn-ha": "org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider",
 "eagle.keytab.file":"/EAGLE-HOME/.keytab/b_eagle.keytab_apd",
 "eagle.kerberos.principal":"eagle@APD.EBAY.COM"
}
  • HBase 
 {
  "hbase.zookeeper.property.clientPort":"2181",
  "hbase.zookeeper.quorum":"localhost",
  "hbase.security.authentication":"kerberos",
  "hbase.master.kerberos.principal":"hadoop/_HOST@EXAMPLE.COM",
  "zookeeper.znode.parent":"/hbase",
  "eagle.keytab.file":"/EAGLE-HOME/.keytab/eagle.keytab",
  "eagle.kerberos.principal":"eagle@EXAMPLE.COM"
}
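
A minimal sketch of using the HBase properties above from a Java client, assuming an HBase 1.x+ client API; the quorum, principals, and keytab path are the placeholder values from the configuration block:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Admin;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;

public class HBaseKerberosExample {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        // Values from the HBase configuration block above; adjust for your cluster.
        conf.set("hbase.zookeeper.quorum", "localhost");
        conf.set("hbase.zookeeper.property.clientPort", "2181");
        conf.set("hbase.security.authentication", "kerberos");
        conf.set("hbase.master.kerberos.principal", "hadoop/_HOST@EXAMPLE.COM");
        conf.set("zookeeper.znode.parent", "/hbase");
        conf.set("eagle.keytab.file", "/EAGLE-HOME/.keytab/eagle.keytab");
        conf.set("eagle.kerberos.principal", "eagle@EXAMPLE.COM");

        // Reuse the login helper above to authenticate from the keytab.
        HadoopSecurityUtil.login(conf);

        try (Connection connection = ConnectionFactory.createConnection(conf);
             Admin admin = connection.getAdmin()) {
            for (TableName table : admin.listTableNames()) {
                System.out.println(table.getNameAsString());
            }
        }
    }
}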

References

  • https://github.com/randomtask1155/HadoopDNSVerifier
  • https://support.pivotal.io/hc/en-us/articles/204391288-hdfs-ls-command-fails-with-Server-has-invalid-Kerberos-principal




   

 


Original: http://www.cnblogs.com/qingwen/p/5087196.html
