Preface
Environment Dependencies
Hadoop Kerberos Configuration
kadmin.local -q "addprinc -randkey hadoop/bigdata-03@HADOOP.COM"
kadmin.local -q "addprinc -randkey hadoop/bigdata-05@HADOOP.COM"
kadmin.local -q "xst -k /root/keytabs/kerberos/hadoop.keytab hadoop/bigdata-03@HADOOP.COM"
kadmin.local -q "xst -k /root/keytabs/kerberos/hadoop.keytab hadoop/bigdata-05@HADOOP.COM"
klist -kt /root/keytabs/kerberos/hadoop.keytab
klist -kt /home/gpadmin/hadoop.keytab
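The keytab is exported to /root/keytabs/kerberos/hadoop.keytab, but every later reference uses /home/gpadmin/hadoop.keytab, so a copy step to each node's gpadmin home is implied. The copy itself (e.g. scp) is environment-specific; what matters is that the file ends up owned by the service user with owner-only permissions, since a keytab holds long-term keys. A small sketch of the permission step, using a temporary file as a stand-in for the real keytab:

```shell
# Keytabs contain long-term secret keys, so restrict them to the owner.
# A temp file stands in here; the real path is /home/gpadmin/hadoop.keytab.
keytab=$(mktemp)
chmod 400 "$keytab"
stat -c '%a' "$keytab"   # prints 400 on Linux
rm -f "$keytab"
```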
<property>
  <name>hadoop.security.authentication</name>
  <value>kerberos</value>
</property>
<property>
  <name>hadoop.security.authorization</name>
  <value>true</value>
</property>
<property>
  <name>dfs.block.access.token.enable</name>
  <value>true</value>
</property>
<property>
  <name>dfs.permissions.enabled</name>
  <value>false</value>
</property>
<property>
  <name>dfs.namenode.kerberos.principal</name>
  <value>hadoop/_HOST@HADOOP.COM</value>
</property>
<property>
  <name>dfs.namenode.keytab.file</name>
  <value>/home/gpadmin/hadoop.keytab</value>
</property>
<property>
  <name>dfs.secondary.namenode.kerberos.principal</name>
  <value>hadoop/_HOST@HADOOP.COM</value>
</property>
<property>
  <name>dfs.secondary.namenode.keytab.file</name>
  <value>/home/gpadmin/hadoop.keytab</value>
</property>
<property>
  <name>dfs.web.authentication.kerberos.principal</name>
  <value>hadoop/_HOST@HADOOP.COM</value>
</property>
<property>
  <name>dfs.web.authentication.kerberos.keytab</name>
  <value>/home/gpadmin/hadoop.keytab</value>
</property>
<property>
  <name>dfs.datanode.kerberos.principal</name>
  <value>hadoop/_HOST@HADOOP.COM</value>
</property>
<property>
  <name>dfs.datanode.keytab.file</name>
  <value>/home/gpadmin/hadoop.keytab</value>
</property>
<property>
  <name>dfs.data.transfer.protection</name>
  <value>authentication</value>
</property>
<property>
  <name>dfs.http.policy</name>
  <value>HTTPS_ONLY</value>
  <description>All enabled web UIs are served over HTTPS; the details are configured in the ssl-server and ssl-client configuration files.</description>
</property>
<property>
  <name>yarn.resourcemanager.principal</name>
  <value>hadoop/_HOST@HADOOP.COM</value>
</property>
<property>
  <name>yarn.resourcemanager.keytab</name>
  <value>/home/gpadmin/hadoop.keytab</value>
</property>
<property>
  <name>yarn.nodemanager.principal</name>
  <value>hadoop/_HOST@HADOOP.COM</value>
</property>
<property>
  <name>yarn.nodemanager.keytab</name>
  <value>/home/gpadmin/hadoop.keytab</value>
</property>
<property>
  <name>mapreduce.jobhistory.principal</name>
  <value>hadoop/_HOST@HADOOP.COM</value>
</property>
<property>
  <name>mapreduce.jobhistory.keytab</name>
  <value>/home/gpadmin/hadoop.keytab</value>
</property>
<property>
  <name>ssl.server.truststore.location</name>
  <value>/home/gpadmin/kerberos_https/keystore</value>
  <description>Truststore to be used by NN and DN. Must be specified.</description>
</property>
<property>
  <name>ssl.server.truststore.password</name>
  <value>password</value>
  <description>Optional. Default value is "".</description>
</property>
<property>
  <name>ssl.server.truststore.type</name>
  <value>jks</value>
  <description>Optional. The keystore file format, default value is "jks".</description>
</property>
<property>
  <name>ssl.server.truststore.reload.interval</name>
  <value>10000</value>
  <description>Truststore reload check interval, in milliseconds. Default value is 10000 (10 seconds).</description>
</property>
<property>
  <name>ssl.server.keystore.location</name>
  <value>/home/gpadmin/kerberos_https/keystore</value>
  <description>Keystore to be used by NN and DN. Must be specified.</description>
</property>
<property>
  <name>ssl.server.keystore.password</name>
  <value>password</value>
  <description>Must be specified.</description>
</property>
<property>
  <name>ssl.server.keystore.keypassword</name>
  <value>password</value>
  <description>Must be specified.</description>
</property>
<property>
  <name>ssl.server.keystore.type</name>
  <value>jks</value>
  <description>Optional. The keystore file format, default value is "jks".</description>
</property>
<property>
  <name>ssl.server.exclude.cipher.list</name>
  <value>TLS_ECDHE_RSA_WITH_RC4_128_SHA,SSL_DHE_RSA_EXPORT_WITH_DES40_CBC_SHA,
  SSL_RSA_WITH_DES_CBC_SHA,SSL_DHE_RSA_WITH_DES_CBC_SHA,
  SSL_RSA_EXPORT_WITH_RC4_40_MD5,SSL_RSA_EXPORT_WITH_DES40_CBC_SHA,
  SSL_RSA_WITH_RC4_128_MD5</value>
  <description>Optional. The weak security cipher suites that you want excluded from SSL communication.</description>
</property>
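The dfs.http.policy description refers to both the ssl-server and ssl-client configuration files, but only the ssl-server side is shown above. A minimal ssl-client.xml sketch, assuming the same truststore path and password as the server side (adjust to your environment):

```xml
<property>
  <name>ssl.client.truststore.location</name>
  <value>/home/gpadmin/kerberos_https/keystore</value>
</property>
<property>
  <name>ssl.client.truststore.password</name>
  <value>password</value>
</property>
<property>
  <name>ssl.client.truststore.type</name>
  <value>jks</value>
</property>
```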
keytool -genkey -alias hadoop -validity 365000 \
  -keystore /home/gpadmin/kerberos_https/keystore \
  -keyalg RSA -keysize 2048 \
  -dname "CN=hadoop, OU=shsnc, O=snc, L=hunan, ST=changsha, C=CN"
kinit -kt /home/gpadmin/hadoop.keytab hadoop/bigdata-05@HADOOP.COM
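Note that kinit uses the concrete principal hadoop/bigdata-05@HADOOP.COM, while the configuration files use the `_HOST` placeholder. At service startup Hadoop expands `_HOST` to each node's own canonical hostname, which is how one shared config file serves both bigdata-03 and bigdata-05. The substitution is equivalent to (hostname hard-coded here purely for illustration):

```shell
# Hadoop expands _HOST to the local node's canonical hostname at startup.
principal="hadoop/_HOST@HADOOP.COM"
host="bigdata-05"                 # on a real node this would be $(hostname -f)
echo "${principal/_HOST/$host}"   # prints hadoop/bigdata-05@HADOOP.COM
```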
Flink Kerberos Configuration
<property>
  <name>dfs.permissions.enabled</name>
  <value>true</value>
</property>
klist -kt /root/keytabs/kerberos/hadoop.keytab
security.kerberos.login.use-ticket-cache: true
security.kerberos.login.keytab: /home/gpadmin/hadoop.keytab
security.kerberos.login.principal: gpadmin@HADOOP.COM
security.kerberos.login.contexts: Client
flink run -m yarn-cluster \
  -p 1 \
  -yjm 1024 \
  -ytm 1024 \
  -ynm amp_zabbix \
  -c com.shsnc.fk.task.tokafka.ExtratMessage2KafkaTask \
  -yt /home/gpadmin/jar_repo/config/krb5.conf \
  -yD env.java.opts.jobmanager=-Djava.security.krb5.conf=krb5.conf \
  -yD env.java.opts.taskmanager=-Djava.security.krb5.conf=krb5.conf \
  -yD security.kerberos.login.keytab=/home/gpadmin/hadoop.keytab \
  -yD security.kerberos.login.principal=gpadmin@HADOOP.COM \
  $jarname
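Of the flags above, the Kerberos-relevant ones are `-yt` (ships krb5.conf into the YARN containers), the two `env.java.opts` settings (point each JVM at that file), and the two `security.kerberos.login.*` overrides. A pre-flight sketch that assembles the command as a string and confirms the login flags are present before launching ($jarname stays a placeholder):

```shell
# Build the submit command as a string and sanity-check the Kerberos flags
# before running it; my-job.jar is a placeholder for the actual job jar.
jarname=my-job.jar
cmd="flink run -m yarn-cluster -yt /home/gpadmin/jar_repo/config/krb5.conf -yD security.kerberos.login.keytab=/home/gpadmin/hadoop.keytab -yD security.kerberos.login.principal=gpadmin@HADOOP.COM $jarname"
case "$cmd" in
  *"security.kerberos.login.principal="*) echo "kerberos login flags present" ;;
  *) echo "kerberos login flags missing" >&2; exit 1 ;;
esac
```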
Copyright belongs to the author; please do not reproduce without permission. If this article violates any rules, you may contact the administrator to have it removed.
When reposting, please cite the source: https://www.ucloud.cn/yun/129142.html