
I have a log file that looks like this:

116.50.181.5 - - [18/May/2015:19:05:32 +0000] "GET /images/web/2009/banner.png HTTP/1.1" 200 52315 "http://www.semicomplete.com/style2.css" "Mozilla/5.0 (X11; Linux x86_64; rv:26.0) Gecko/20100101 Firefox/26.0"

My Logstash configuration is as below:

input {
  file {
    path => "C:\Users\PC\Documents\elk\Input\listening.txt"
    start_position => "beginning"
  }
}
filter {
  grok {
    match => {
      "message" => '%{IPORHOST:clientip} %{USER:ident} %{USER:auth} \[%{HTTPDATE:timestamp}\] "%{WORD:verb} %{DATA:request} HTTP/%{NUMBER:httpversion}" %{NUMBER:response:int} (?:-|%{NUMBER:bytes:int}) %{QS:referrer} %{QS:agent}'
    }
  }

  date {
    match => [ "timestamp", "dd/MMM/YYYY:HH:mm:ss Z" ]
    locale => "en"
  }

  geoip {
    source => "clientip"
  }

  useragent {
    source => "agent"
    target => "useragent"
  }
}

output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "log"
  }
}

Everything works just fine and I have no errors in Logstash, but the data doesn't appear in Elasticsearch as expected.

C:\elk\logstash-7.1.1\bin>logstash -f logstashETL.conf
Sending Logstash logs to C:/elk/logstash-7.1.1/logs which is now configured via log4j2.properties
[2019-06-12T16:02:27,371][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-06-12T16:02:27,405][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.1.1"}
[2019-06-12T16:02:36,087][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2019-06-12T16:02:36,344][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2019-06-12T16:02:36,428][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>7}
[2019-06-12T16:02:36,428][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2019-06-12T16:02:36,469][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://localhost:9200"]}
[2019-06-12T16:02:36,493][INFO ][logstash.outputs.elasticsearch] Using default mapping template
[2019-06-12T16:02:36,513][INFO ][logstash.javapipeline    ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, :thread=>"#<Thread:0x75642d2 run>"}
[2019-06-12T16:02:36,753][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2019-06-12T16:02:37,814][INFO ][logstash.inputs.file     ] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"C:/elk/logstash-7.1.1/data/plugins/inputs/file/.sincedb_636c54fa423804cc695f80e1cb9d6ccd", :path=>["C:\\Users\\PC\\Documents\\elk\\Input\\listening.txt"]}
[2019-06-12T16:02:37,878][INFO ][logstash.javapipeline    ] Pipeline started {"pipeline.id"=>"main"}
[2019-06-12T16:02:37,988][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-06-12T16:02:38,008][INFO ][filewatch.observingtail  ] START, creating Discoverer, Watch with file and sincedb collections
[2019-06-12T16:02:38,773][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}

Maybe there is something wrong or missing in my code.

  • Have you tried adding an additional output (e.g. stdout -> https://www.elastic.co/guide/en/logstash/current/plugins-outputs-stdout.html ) to verify that messages are being processed? – Gonzalo Matheu Jun 12 '19 at 14:25
  • Can you add the debug flag when running your Logstash script and paste the logs here, if there are any? Or add `output { stdout { codec => rubydebug } }` – dejanmarich Jun 12 '19 at 14:28
  • I added stdout in the config file and nothing changed: `C:\elk\logstash-7.1.1\bin>logstash -f logstashETL.conf --config.test_and_exit Sending Logstash logs to C:/elk/logstash-7.1.1/logs which is now configured via log4j2.properties [2019-06-12T16:58:37,557][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified Configuration OK [2019-06-12T16:58:48,746][INFO ][logstash.runner ] Using config.test_and_exit mode. Config Validation Result: OK. Exiting Logstash` – moez skanjii Jun 12 '19 at 15:00
  • Logstash remembers where it stopped when reading a file, so if Logstash has already read a file and no more lines have been appended, Logstash won't do anything. See https://stackoverflow.com/a/25119894/6113627 for a solution. – baudsp Jun 12 '19 at 15:05
  • I added `sincedb="NUL"` in my config file; same thing, it didn't work. I have no errors, but the index has not been created in Elasticsearch. – moez skanjii Jun 12 '19 at 15:29
  • Try changing your path to use forward slashes: `path => "C:/Users/PC/Documents/elk/Input/listening.txt"` – leandrojmp Jun 12 '19 at 16:21
  • What about the cluster status? Is it yellow? If it is, I found the solution. – Orkun Jan 18 '21 at 12:14
  • Did you check my answer? – Orkun Mar 26 '21 at 13:56

1 Answer


Add the two settings below to the file input in your configuration:

start_position => "beginning"
sincedb_path => "/dev/null"

`sincedb_path => "/dev/null"` means Logstash does not store sincedb files. These files keep the byte offset of where Logstash left off in the file, so without them the file is re-read from the beginning on every run.
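One caveat: the question is running on Windows, where `/dev/null` does not exist; the Windows null device is `NUL` (as already tried in the comments). A minimal sketch of the input block with that adjustment, reusing the path from the question but with forward slashes, which Logstash accepts on Windows:

```conf
input {
  file {
    # Forward slashes avoid backslash-escaping issues on Windows
    path => "C:/Users/PC/Documents/elk/Input/listening.txt"
    start_position => "beginning"
    # "NUL" is the Windows equivalent of /dev/null: no sincedb is persisted,
    # so the file is re-read from the beginning on every run
    sincedb_path => "NUL"
  }
}
```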

Then go to the logstash/data/plugins/inputs/file directory and run the command below there (on Windows, delete the `.sincedb*` files from that directory instead):

rm -r .sincedb*

Finally, run your Logstash pipeline again. It should work.
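To confirm whether events are flowing at all, as suggested in the comments, you can also add a stdout output alongside the elasticsearch one; a sketch, reusing the hosts and index from the question:

```conf
output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "log"
  }
  # Print each processed event to the console while debugging
  stdout { codec => rubydebug }
}
```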

Orkun