Collecting Kerberos-authenticated Kafka data into ES with Logstash

1. Kafka input plugin

input {
  kafka {
    bootstrap_servers => "test1:9092,test2:9092,test3:9092" # cluster addresses
    topics => ["test"] # topics; more than one may be listed
    security_protocol => "SASL_PLAINTEXT"
    sasl_kerberos_service_name => "kafka"
    group_id => "test"
    codec => "json"
    jaas_path => "/etc/logstash-6.5.4/config/jaas.conf" # path to the JAAS file
    kerberos_config => "/etc/krb5.conf"
  }
}

Create a new file named jaas.conf:

KafkaClient {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  storeKey=false
  keyTab="/etc/keytabs/kafka.test.keytab"  // keytab path (JAAS files use //, not #, for comments)
  principal="XXXXX";  // read the real value with: klist -k kafka.test.keytab; it differs per environment
};
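Since the principal differs per environment, one option is to template jaas.conf instead of editing it by hand. A minimal sketch, assuming the keytab path from above and a hypothetical principal `kafka/test1@EXAMPLE.COM` (the real one comes from `klist -k`):

```shell
# Hypothetical helper: generate jaas.conf from variables so the same
# pipeline works across environments. KEYTAB matches the path above;
# PRINCIPAL is an example value, not taken from the original post --
# read yours with: klist -k /etc/keytabs/kafka.test.keytab
KEYTAB=/etc/keytabs/kafka.test.keytab
PRINCIPAL='kafka/test1@EXAMPLE.COM'

cat > /tmp/jaas.conf <<EOF
KafkaClient {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  storeKey=false
  keyTab="$KEYTAB"
  principal="$PRINCIPAL";
};
EOF
cat /tmp/jaas.conf
```

Point `jaas_path` in the input block at the generated file; only the two variables change between clusters.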

2. ES output plugin

output {
  elasticsearch {
    hosts => [ "192.168.0.111:9202", "192.168.0.112:9202", "192.168.0.113:9202" ] # target cluster
    index => "test" # index name
    user => "test"
    password => "test"
  }
}

Start Logstash:

./bin/logstash -f <path-to-config-file>
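Before a real run it is worth checking the pipeline syntax with `--config.test_and_exit`, which parses the config and exits. A safe dry-run sketch that only prints the commands (the config path is an assumption for illustration):

```shell
# Dry-run helper: print the exact commands instead of executing them,
# so this is safe to run anywhere. CONF is an assumed path; substitute
# wherever you saved the input/output blocks above.
CONF=/etc/logstash-6.5.4/config/kafka-to-es.conf
echo "./bin/logstash -f $CONF --config.test_and_exit"  # syntax check only
echo "./bin/logstash -f $CONF"                         # actual run
```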
