php - How to solve data loss when Logstash writes data to Elasticsearch
欧阳克 2017-07-01 09:11:55

My log file contains more than 600 entries, but only a little over 300 of them end up in Elasticsearch.
Does anyone know what could cause this?
This is my configuration:
input {
    file {
        path => ["/usr/local/20170730.log"]
        type => "log_test_events"
        tags => ["log_tes_events"]
        start_position => "beginning"
        sincedb_path => "/data/logstash/sincedb/test.sincedb"
        codec => "json"
        close_older => "86400"   # 1 day
        ignore_older => "86400"
    }
    beats { port => 5044 }
}
filter {
    urldecode {
        all_fields => true
    }
}

output {
    elasticsearch {
        hosts => "localhost:9200"
        index => "logstash_%{event_date}"
    }
    stdout { codec => json }
}


Replies (1)
过去多啦不再A梦

This happens because Elasticsearch builds the index mapping dynamically: when the first document arrives, each field is assigned a type based on the value it contains. If a field (say, field a) holds an integer in some events and a string in others, and the first document indexed has a number, the field is mapped as an integer; every later event where that field is a string is rejected by the mapping and never reaches the index, so events appear to be lost.
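
For example (a minimal, made-up illustration; the field name event_id is taken from the template below, the values are invented), two log lines like these will conflict:

{"event_id": 1001, "event_date": "20170730"}
{"event_id": "1001-abc", "event_date": "20170730"}

If the first line is indexed first, event_id is mapped as a numeric field, and the second line fails because "1001-abc" cannot be parsed as a number. Every rejected document is one of the entries that never shows up in Elasticsearch.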

Modify the configuration to manage the index template and define the mapping explicitly:
output {
    elasticsearch {
        hosts => "localhost:9200"
        index => "test1"
        manage_template => true
        template_overwrite => true
        template => "/usr/local/logstash/templates/stat_day.json"
    }
}

The stat_day.json template looks like this:

{
    "order" : 1,
    "template" : "test1",
    "mappings" : {
        "log_test": {
            "properties" : {
                "event_id": { "type": "string" }
            }
        }
    }
}
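
One thing to watch: this answer writes to a fixed index named test1, while the question's output uses logstash_%{event_date}. If you keep the date-based index name, the template pattern needs a wildcard so it matches every daily index. A minimal sketch of that adaptation (the pattern logstash_* and the mapping type log_test_events are my assumptions, taken from the question's config rather than from the answer):

{
    "order" : 1,
    "template" : "logstash_*",
    "mappings" : {
        "log_test_events": {
            "properties" : {
                "event_id": { "type": "string" }
            }
        }
    }
}

The fields listed under "properties" should cover whichever fields actually mix numbers and strings in your log.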
