Collecting Kerberos-authenticated Kafka data into Kerberos-authenticated HDFS with Logstash

1. Kafka input plugin

input {
  kafka {
    bootstrap_servers => "test1:9092,test2:9092,test3:9092" # broker cluster addresses
    topics => ["test"] # topics; more than one may be listed
    security_protocol => "SASL_PLAINTEXT"
    sasl_kerberos_service_name => "kafka"
    group_id => "test"
    codec => "json"
    jaas_path => "/etc/logstash-6.5.4/config/jaas.conf" # path to the JAAS file
    kerberos_config => "/etc/krb5.conf"
  }
}
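
The `kerberos_config` option above points at a standard krb5.conf. A minimal sketch of what that file might contain, assuming a hypothetical realm EXAMPLE.COM and KDC host kdc.example.com (substitute your own environment's realm and KDC):

```ini
[libdefaults]
  default_realm = EXAMPLE.COM

[realms]
  EXAMPLE.COM = {
    kdc = kdc.example.com
    admin_server = kdc.example.com
  }
```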

Create a file named jaas.conf:

KafkaClient {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  storeKey=false
  keyTab="/etc/keytabs/kafka.test.keytab"
  principal="XXXXX";
};

Here `keyTab` is the path to your keytab file. The `principal` value (left as XXXXX above) can be looked up with `klist -k kafka.test.keytab` and differs between environments. Note that JAAS files do not accept `#` comments, so do not annotate the file inline.
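
The steps above can be sketched as a quick check on the Logstash host. The keytab path is assumed from the jaas.conf example; the `klist` tool ships with the MIT Kerberos client packages:

```shell
# Hypothetical check (keytab path assumed from the jaas.conf above):
# list the principals stored in the keytab so the `principal` field
# can be copied from the output.
keytab=/etc/keytabs/kafka.test.keytab
if command -v klist >/dev/null 2>&1 && [ -f "$keytab" ]; then
  klist -k "$keytab"   # prints one line per principal stored in the keytab
else
  echo "klist not found or keytab missing; run this on the Logstash host"
fi
```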


2. HDFS output plugin

output {
  webhdfs {
    host => "test-hadoop" # namenode host
    port => "50070"
    user => "hdfs" # the system user that runs HDFS
    path => "/applog/logstash-%{+YYYY}-%{+MM}-%{+dd}.log" # one log file per day
    use_kerberos_auth => "true" # enable Kerberos authentication
    kerberos_keytab => "/etc/keytabs/hdfs.test.keytab" # keytab path
    codec => "json"
  }
  stdout {
    codec => rubydebug # also print each event to the terminal
  }
}
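
The `%{+YYYY}-%{+MM}-%{+dd}` tokens in `path` are Logstash date patterns expanded from each event's `@timestamp`, which is what rolls the output into one file per day. A sketch of the file name that pattern yields for today (illustrative only; the real substitution happens inside Logstash, not the shell):

```shell
# Illustrative only: mimic the daily file name produced by the webhdfs
# `path` pattern above. Logstash expands %{+YYYY}-%{+MM}-%{+dd} itself
# from each event's @timestamp; here we approximate it with `date`.
today_path="/applog/logstash-$(date +%Y-%m-%d).log"
echo "$today_path"
```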

Running Logstash may then fail with the following error:

java.lang.IllegalStateException: Logstash stopped processing because of an error: (LoadError) no such file to load -- gssapi

The fix is to install the gssapi plugin:

Option 1: online installation:

./bin/logstash-plugin install --no-verify gssapi

Option 2: offline installation:

This follows the same procedure as installing webhdfs offline.

On a server where the plugin is already installed, build an offline pack by running the following from the Logstash directory:

./logstash-plugin prepare-offline-pack --overwrite --output gssapi.zip gssapi

Copy the offline pack to the target server, then run the following from the Logstash directory there:

./bin/logstash-plugin install file:///home/cd-jsjwhzx/logstash-6.5.4/gssapi.zip
