
I'm in the process of setting up Elasticsearch and Kibana as a centralized logging platform in our office.

We have a number of custom utilities and plug-ins whose usage I would like to track, along with any errors users encounter. There are also servers and scheduled jobs I would like to keep track of.

So if I have a number of different sources of log data all going to the same Elasticsearch cluster, what are the conventions or best practices for organizing them into indexes and document types?

The default index value used by Logstash is "logstash-%{+YYYY.MM.dd}". So it seems like it's best to suffix any index names with the current date, as this makes it easy to purge old data.
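To make the purge idea concrete, here is a small sketch of the date arithmetic involved. The helper `expired_indexes` is hypothetical (not part of any Elasticsearch client); in practice you would feed it the index names from the cluster and delete whatever it returns:

```python
from datetime import date, timedelta

def expired_indexes(index_names, prefix, keep_days, today=None):
    """Return the dated indexes (e.g. 'logstash-2017.05.01') older than keep_days."""
    today = today or date.today()
    cutoff = today - timedelta(days=keep_days)
    expired = []
    for name in index_names:
        if not name.startswith(prefix + "-"):
            continue
        suffix = name[len(prefix) + 1:]
        try:
            d = date(*map(int, suffix.split(".")))
        except (ValueError, TypeError):
            continue  # suffix is not a yyyy.MM.dd date; leave the index alone
        if d < cutoff:
            expired.append(name)
    return expired

# Example: keep 7 days of logstash-* indexes
names = ["logstash-2017.05.01", "logstash-2017.05.14", "kibana-settings"]
print(expired_indexes(names, "logstash", 7, today=date(2017, 5, 15)))
# -> ['logstash-2017.05.01']
```

With undated index names you would instead have to delete individual documents by a timestamp query, which is far more expensive than dropping a whole index.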

However, Kibana allows adding multiple "index patterns" that can be selected in the UI, yet every tutorial I've read only mentions creating a single pattern like logstash-*.

How are multiple index patterns used in practice? Would I just create one index per data source? Such as:

BackupUtility-%{+YYYY.MM.dd}
UserTracker-%{+YYYY.MM.dd}
ApacheServer-%{+YYYY.MM.dd}
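A per-source scheme like the one above is typically wired up with conditionals in the Logstash output stage. The following is a hedged sketch, assuming each input tags its events with a `type` field (the field name and values are placeholders):

```
# Hypothetical pipeline: route events to per-source dated indexes
# based on a "type" field assumed to be set on each input.
output {
  if [type] == "backup" {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "backuputility-%{+YYYY.MM.dd}"
    }
  } else if [type] == "apache" {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "apacheserver-%{+YYYY.MM.dd}"
    }
  } else {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "logstash-%{+YYYY.MM.dd}"
    }
  }
}
```

Note that Elasticsearch requires index names to be lowercase, so mixed-case names like `BackupUtility-…` would be rejected at index time.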

I'm using NLog in a number of my tools, and it has an Elasticsearch target. The convention for NLog and other similar logging frameworks is to have a "logger" for each class in the source code. Should these loggers translate to indexes in Elasticsearch?

MyCompany.CustomTool.FooClass-%{+YYYY.MM.dd}
MyCompany.CustomTool.BarClass-%{+YYYY.MM.dd}
MyCompany.OtherTool.BazClass-%{+YYYY.MM.dd}

Or is this too granular for Elasticsearch index names, and would it be better to stick to a single dated index per application?

CustomTool-%{+YYYY.MM.dd}
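For the single-index-per-application option, a sketch of an NLog configuration might look like the following. This assumes the community NLog.Targets.ElasticSearch package (attribute names can vary by version, so treat this as illustrative rather than exact); the point is that the logger name travels as a field on each document, so you can still filter by class in Kibana without a per-class index:

```
<!-- Hypothetical NLog.config sketch: one dated index per application. -->
<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <extensions>
    <add assembly="NLog.Targets.ElasticSearch" />
  </extensions>
  <targets>
    <target name="elastic" xsi:type="ElasticSearch"
            uri="http://localhost:9200"
            index="customtool-${date:format=yyyy.MM.dd}" />
  </targets>
  <rules>
    <!-- All class loggers write to the same application index. -->
    <logger name="*" minlevel="Info" writeTo="elastic" />
  </rules>
</nlog>
```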
Eric Anastas
  • what index have you used in the end? I have tried using `app-name-yyyy.mm.dd` but this made Elasticsearch and searching in Kibana really slow, see https://discuss.elastic.co/t/using-elstic-kibana-is-very-slow/85244 — do you use daily/monthly indexes? – dina May 15 '17 at 08:30
  • "So it seems like it's best to suffix any index names with the current date, as this makes it easy to purge old data." [date math](https://www.elastic.co/guide/en/elasticsearch/reference/current/date-math-index-names.html) in index names means that querying could be faster too, agree? Still your question is a good one.... – Nate Anderson May 23 '17 at 22:20

3 Answers


In my environment we're working through a similar question. We have a mix of system logs, metric alerts from Prometheus, and application logs from both client and server applications. In addition, we have some shared variables between the client and server apps that let us correlate the two (e.g., we know what server logs match some operation on the client that made requests to said server). We're experimenting with the following scheme to help Kibana answer questions for us:

logs-system-{date}
logs-iis-{date}
logs-prometheus-{date}
logs-app-{applicationName}-{date}

Where:

  • {applicationName} is the unique name of some application we wrote (these could be client or server side)
  • {date} is whatever date-based scheme you use for indexes

This way we can set up Kibana searches against logs-app-* and quickly search for logs among any of our applications. This is still new for us, but we started without this type of scheme and are already regretting it. It makes searching for correlated logs across applications much harder than it should be.
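The correlation search the scheme enables can be sketched as a single wildcard query in Kibana's Dev Tools (or via curl). Here `correlationId` is a hypothetical name for the shared variable mentioned above; substitute whatever field your client and server apps both emit:

```
# One search across every application index; matches documents
# from any logs-app-{applicationName}-{date} index.
GET logs-app-*/_search
{
  "query": {
    "term": { "correlationId": "abc-123" }
  }
}
```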

Sam Storie

In my company we have worked a lot on this topic. We agreed on the following convention:

  • {customer}-{product}-{application}-{date}

In any case, it is necessary to review both how the data is organized and how the data is queried inside the organization.

Kind Regards

Dario Rodriguez


I am not aware of such conventions, but in my environment we create two different types of indexes, logstash-* and logstash-shortlived-*, depending on the severity level. In my case I create the index pattern logstash-* in Kibana, as it matches both kinds of indices.

Since these indices are stored in Elasticsearch and Kibana reads from them, it should let you create index patterns for different naming schemes — give it a try on your local machine. You could try something like logstash-XYZ if you want more granularity, or you can always create indices with your own custom names.

vvs14