Thursday, July 7, 2016

Transfer logs from Kafka to Elasticsearch via Logstash

You can transfer logs from Kafka to Elasticsearch via Logstash with the following configuration:

input {
  kafka {
    topic_id => 'some_log'    # Kafka topic to consume from; connection settings are left at their defaults
  }
}

filter {
  grok {
    patterns_dir => ["./patterns"]    # directory holding any custom grok patterns
    match => { "message" => "%{INT:log_version}\t%{INT:some_id}\t%{DATA:some_field}\t%{GREEDYDATA:last_field}" }
  }

  # Keep only events whose some_id is 1, 2, or 3.
  if [some_id] not in ["1", "2", "3"] {
    drop { }
  }
}

output {
  elasticsearch {
    hosts => [ "1.2.3.4:9200" ]
  }

  # Uncomment to also print each event to stdout while debugging:
  # stdout {
  #   codec => rubydebug   # or: codec => json
  # }
}
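
For concreteness, here is how a hypothetical tab-separated line (values made up for illustration) would come out of the grok filter above:

# input line ("\t" marks a literal tab character):
1\t2\tcheckout\tuser clicked buy

# resulting event fields:
log_version => "1"
some_id     => "2"            # in ["1", "2", "3"], so the event is kept
some_field  => "checkout"
last_field  => "user clicked buy"

To try it end to end, save the config to a file and start Logstash with `bin/logstash -f <config file>`; on the 2.x releases current at the time, adding `--configtest` checks the syntax without starting the pipeline.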

Note that the last field can't be `DATA`. `DATA` compiles to the non-greedy regex `.*?`, so at the end of a pattern, with nothing after it to force a longer match, it matches the empty string and the last field comes out blank. `GREEDYDATA` (`.*`) is greedy and consumes the rest of the line.
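
To see why, here is a quick Python sketch of the underlying regular expressions. Grok's `DATA` expands to the non-greedy `.*?` and `GREEDYDATA` to the greedy `.*`; the pattern below is a simplified stand-in for the grok expression above (`\d+` in place of the real `INT` pattern), applied to a made-up input line:

import re

line = "1\t42\tfoo\tbar baz"

# DATA at the end: ".*?" lazily matches zero characters and the
# overall match still succeeds, so the last field comes out empty.
m = re.match(r"(\d+)\t(\d+)\t(.*?)\t(.*?)", line)
print(m.group(4))   # => ""

# GREEDYDATA at the end: ".*" consumes the rest of the line.
m = re.match(r"(\d+)\t(\d+)\t(.*?)\t(.*)", line)
print(m.group(4))   # => "bar baz"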

Reference:
http://stackoverflow.com/questions/38240392/logstash-grok-filter-doesnt-work-for-the-last-field
