Using the ELK stack, I have to parse some log files, but they are remote. My solution:
- rsync over ssh to fetch the remote files locally
My concern is that my Elasticsearch index is growing far faster than expected (more than 130 MB already) whereas the log files are only 25 MB. Is it possible that each rsync cron run (every 5 minutes) leads Logstash to read the whole file again from the start, ignoring its sincedb state?
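If that hypothesis is right, a likely cause is that by default rsync writes the transfer to a temporary file and renames it into place, so the local copy gets a new inode on every sync; Logstash's file input keys its sincedb entries on the inode, so each sync looks like a brand-new file. A minimal local sketch of the mechanism and the usual workaround (`--inplace`), assuming GNU rsync is installed; the file names and directories are made up for the demo:

```shell
#!/bin/sh
# Sketch: show that rsync --inplace preserves the destination inode,
# which is what Logstash's sincedb uses to recognise a file it has
# already read. Over ssh the flag is the same, e.g.:
#   rsync -az --inplace -e ssh user@host:/path/to/logs/ /local/logs/
# (user@host and the paths above are placeholders, not from the post)

dir=$(mktemp -d)
echo "line 1" > "$dir/remote.log"

# initial sync creates the local copy
rsync "$dir/remote.log" "$dir/local.log"
inode_before=$(ls -i "$dir/local.log" | awk '{print $1}')

# the remote file grows; re-sync in place so the inode is kept
echo "line 2" >> "$dir/remote.log"
rsync --inplace "$dir/remote.log" "$dir/local.log"
inode_after=$(ls -i "$dir/local.log" | awk '{print $1}')

[ "$inode_before" = "$inode_after" ] && echo "inode preserved"
rm -rf "$dir"
```

Without `--inplace`, the second sync would typically replace `local.log` with a freshly created file (new inode), and sincedb would treat it as unseen content.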
Thanks for your help :)
For context, I'm using Acquia as the host for a Drupal site, so I do not have control over how I can access the log files.
Guillaume Renard