Connecting to Hive and HDFS with Kerberos authentication in Python 3.6.5
1. Kerberos is a computer network authentication protocol used to verify identities securely when communicating over an insecure network. See the official documentation for details.
2. Packages to install (on CentOS)
yum install libsasl2-dev  # note: libsasl2-dev is the Debian/Ubuntu name; on CentOS the equivalent is cyrus-sasl-devel, installed below
yum install gcc-c++ python-devel.x86_64 cyrus-sasl-devel.x86_64
yum install python-devel
yum install krb5-devel
yum install python-krbV
pip install krbcontext==0.9
pip install thrift==0.9.3
pip install thrift-sasl==0.2.1
pip install impyla==0.14.1
pip install hdfs[kerberos]
pip install pykerberos==1.2.1
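As a quick sanity check after the installs, the main Python modules should import cleanly; a minimal sketch:
# these imports should all succeed once the pip packages above are installed
from impala.dbapi import connect
from hdfs.ext.kerberos import KerberosClient
from krbcontext import krbcontext
print('impyla, hdfs[kerberos] and krbcontext imported successfully')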
3. Configure /etc/krb5.conf; this is where you define the realm your servers belong to (a sample configuration is shown further below).
4. Configure /etc/hosts, mapping the cluster machines and the host that serves the realm.
5. Generate a ccache_file or keytab_file via kinit.
6. Code to connect to Hive:
import os
from impala.dbapi import connect
from krbcontext import krbcontext

keytab_path = os.path.split(os.path.realpath(__file__))[0] + '/xxx.keytab'
principal = 'xxx'
ip = 'xxx'  # HiveServer2 host
with krbcontext(using_keytab=True, principal=principal, keytab_file=keytab_path):
    conn = connect(host=ip, port=10000, auth_mechanism='GSSAPI', kerberos_service_name='hive')
    cursor = conn.cursor()
    cursor.execute('SHOW DATABASES')  # a query must be executed before iterating the cursor; this one is illustrative
    for row in cursor:
        print(row)
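Step 5 mentions that a ccache_file can be used instead of a keytab_file. A minimal sketch of the ccache variant, assuming krbcontext accepts a ccache_file keyword as its documentation describes (principal, host and cache path are placeholders):
from impala.dbapi import connect
from krbcontext import krbcontext

# authenticate from an existing ticket cache (e.g. produced by an earlier kinit) instead of a keytab
with krbcontext(using_keytab=False, principal='xxx', ccache_file='/tmp/krb5cc_1000'):
    conn = connect(host='xxx', port=10000, auth_mechanism='GSSAPI', kerberos_service_name='hive')
    cursor = conn.cursor()
    cursor.execute('SHOW DATABASES')
    print(cursor.fetchall())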
7. Code to connect to HDFS:
from hdfs.ext.kerberos import KerberosClient
from krbcontext import krbcontext

# snippet from inside a class method; self.keytab_file and self.user are set elsewhere
hdfs_url = 'http://' + host + ':' + port  # WebHDFS address; the http:// prefix is required, see the note below
data = self._get_keytab(sso_ticket)
self._save_keytab(data)
with krbcontext(using_keytab=True, keytab_file=self.keytab_file, principal=self.user):
    self.client = KerberosClient(hdfs_url)
self.client._list_status(path).json()['FileStatuses']['FileStatus']  # list files and directories under path
8. Note: the krbcontext package officially claims to support only Python 2, but it also works on Python 3.
The hdfs_url must include the "http://" prefix, otherwise an error is raised.
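The _list_status call used above is a private helper of the hdfs package; the same information is available through its public list() method. A small sketch, assuming self.client was created as in step 7 (the path is illustrative):
# public alternative to _list_status: plain names, or (name, status-dict) pairs
names = self.client.list('/tmp')
for name, status in self.client.list('/tmp', status=True):
    print(name, status['type'], status['length'])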
9. I have also added some configuration-file setup; the detailed steps are as follows.
Connecting to HDFS and Hive with Kerberos authentication in Python 3.6.5 (including base environment configuration)
1. Environment to prepare
yum packages (install the yum packages first, then the Python packages, otherwise you will run into problems)
yum install openldap-clients -y
yum install krb5-workstation krb5-libs -y
yum install gcc-c++ python-devel.x86_64 cyrus-sasl-devel.x86_64
yum install python-devel
yum install krb5-devel
yum install python-krbV
yum install cyrus-sasl-plain cyrus-sasl-devel cyrus-sasl-gssapi
Python package installation (use pip or pip3 depending on your environment)
pip install krbcontext==0.9
pip install thrift==0.9.3
pip install thrift-sasl==0.2.1
pip install impyla==0.14.1
pip install hdfs[kerberos]
pip install pykerberos==1.2.1
Configure the /etc/hosts file (the big-data platform machines need to be mapped to their hostnames here), for example as sketched below.
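A hypothetical example, assuming the KDC/admin host from the realm configuration below is named st1 (all IP addresses and hostnames are placeholders):
192.168.1.10   st1.panel.com          st1           # Kerberos KDC / admin server
192.168.1.11   namenode1.panel.com    namenode1     # HDFS NameNode
192.168.1.12   hiveserver1.panel.com  hiveserver1   # HiveServer2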
Configure /etc/krb5.conf (check your Kerberos service configuration for the exact values).
Reference configuration (for reference only; adjust it to your actual setup):
[libdefaults]
renew_lifetime = 9d
forwardable = true
default_realm = PANEL.COM
ticket_lifetime = 24h
dns_lookup_realm = false
dns_lookup_kdc = false
default_ccache_name = /tmp/krb5cc_%{uid}
[logging]
default = FILE:/var/log/krb5kdc.log
admin_server = FILE:/var/log/kadmind1.log
kdc = FILE:/var/log/krb5kdc1.log
[realms]
PANEL.COM = {
admin_server = st1
kdc = st1
}
Connection code:
hdfs:
import json, os
from hdfs.ext.kerberos import KerberosClient
from krbcontext import krbcontext

def _connect(self, host, port, sso_ticket=None):
    try:
        hdfs_url = 'http://' + host + ':' + port
        active_str = 'kinit -kt {0} {1}'.format(self.keytab_file, self.user)
        # activate Kerberos authentication for the current user; because of the caching
        # mechanism, switching users does not refresh the cache automatically, so it has
        # to be handled manually
        os.system(active_str)
        with krbcontext(using_keytab=True, keytab_file=self.keytab_file, principal=self.user):
            self.client = KerberosClient(hdfs_url)
    except Exception as e:
        raise e
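Once _connect has run, self.client is an ordinary hdfs client, so the usual read/write calls are available; a short usage sketch (all paths are placeholders):
# assuming _connect has populated self.client
self.client.makedirs('/tmp/demo')
self.client.upload('/tmp/demo/data.txt', '/local/path/data.txt')
with self.client.read('/tmp/demo/data.txt') as reader:
    print(reader.read())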
hive:
import os
from krbcontext import krbcontext
from impala.dbapi import connect
from auto_model_platform.settings import config

def _connect(self, host, port, sso_ticket=None):
    try:
        active_str = 'kinit -kt {0} {1}'.format(self.keytab_file, self.user)
        # same as for HDFS: refresh the ticket cache manually before connecting
        os.system(active_str)
        with krbcontext(using_keytab=True, principal=self.user, keytab_file=self.keytab_file):
            self.conn = connect(host=host, port=port, auth_mechanism='GSSAPI', kerberos_service_name='hive')
            self.cursor = self.conn.cursor()
    except Exception as e:
        raise e
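After _connect succeeds, self.cursor behaves like a normal DB-API cursor; a short usage sketch (the queries are illustrative):
self.cursor.execute('SHOW DATABASES')
print(self.cursor.fetchall())
self.cursor.execute('SELECT * FROM some_db.some_table LIMIT 10')
for row in self.cursor:
    print(row)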
Summary
I ran into quite a few pitfalls while doing this. The key is really to understand the underlying principles, for example how Kerberos works and the corresponding commands.
If you are building a shared platform that has to switch between multiple users, I recommend not using Python: it is not friendly at all and the official packages have many problems. I ended up switching to Java JDBC to work with HDFS and Hive.
If you are just testing on your own or doing algorithm research, it is still fine to use, because the code is simple and easy to get working.
Supplement
kinit commands
kinit -kt xxxx.keytab xxxx  # activate the current ticket cache for user xxxx (keytab followed by the principal, as in the code above)
klist  # view the currently cached credentials
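The os.system(active_str) calls in the connection code above can also be written with subprocess, which raises an exception when kinit fails instead of failing silently; a minimal sketch (the keytab path and principal are placeholders):
import subprocess

# refresh the ticket cache from a keytab; check=True raises CalledProcessError if kinit fails
subprocess.run(['kinit', '-kt', '/path/to/xxx.keytab', 'xxx@PANEL.COM'], check=True)
# inspect the resulting cache
subprocess.run(['klist'], check=True)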
That is all I have to share about connecting to Hive and HDFS with Kerberos authentication in Python 3.6.5. I hope it gives you a useful reference, and thank you for your support.
