I am using Filebeat to send multiple files to Logstash, but I am not able to map which file has which status. What are the possible ways to track the mapped data for each log file?
- What do you mean by `I am not able to map which file has which status`? Are you indexing multiple files into a single Elasticsearch index and type? – Roopendra Dec 01 '16 at 10:14
- Yes, the index is the same for all files. – as1992 Dec 01 '16 at 10:15
- For each file there is an id, name, and status, which will be used as a filter. – as1992 Dec 01 '16 at 10:16
- I think you should index each log file in a separate type. – Roopendra Dec 01 '16 at 10:18
- Can you share your Logstash and Filebeat conf files? Please refer to this [link](http://stackoverflow.com/questions/18330541/how-to-handle-multiple-heterogeneous-inputs-with-logstash); it will give you an idea of how to handle multiple files. Although the solution there is for the file input, Filebeat has a similar option (see the sketch after these comments). – Roopendra Dec 01 '16 at 10:19
- But I need to make final charts in Kibana based on data from all the log files. – as1992 Dec 01 '16 at 10:21
- All log files will be stored in the same index but in different types. I don't think you will face any problem while creating the dashboard. – Roopendra Dec 01 '16 at 10:24
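As a minimal sketch of the option mentioned in the comments (assuming Filebeat 5.x syntax; the paths and the `log_source` values are made up for illustration), each prospector can attach a custom field that identifies the originating file:

```yaml
filebeat.prospectors:
  - input_type: log
    paths:
      - /var/log/app1.log        # hypothetical path
    fields:
      log_source: app1           # custom field identifying this file
  - input_type: log
    paths:
      - /var/log/app2.log        # hypothetical path
    fields:
      log_source: app2

output.logstash:
  hosts: ["localhost:5044"]      # adjust to your Logstash host
```

By default the custom field arrives nested as `fields.log_source`, which you can then use in Logstash conditionals or as a Kibana filter to tell the files apart.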
1 Answer
You can use the `source` field, which comes from Filebeat, to filter your logs. Please check the documentation for more information:
> The file from which the line was read. This field contains the full path to the file. For example: `/var/log/system.log`.
You can give this field the `not_analyzed` property to filter on it more effectively.
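For illustration, a minimal index template sketch along those lines (assuming an Elasticsearch 2.x-style string mapping, where `not_analyzed` applies; the template name and index pattern are made up) could look like:

```json
PUT _template/filebeat_source_raw
{
  "template": "filebeat-*",
  "mappings": {
    "_default_": {
      "properties": {
        "source": {
          "type": "string",
          "index": "not_analyzed"
        }
      }
    }
  }
}
```

With a mapping like this, the full file path stays intact as a single term, so you can use it in a Kibana terms filter or in a Logstash conditional such as `if [source] == "/var/log/system.log" { ... }`.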

hkulekci