3

We would like to use ELK for monitoring our IBM Integration Bus.

We would like to perform 2 things:

  1. Get the IIB log (the default broker log) from several Linux servers into Logstash (is there any tutorial for that? A grok pattern?)
  2. Write messages that pass through IIB to Logstash and then view them in Kibana (any grok?)

Grok patterns and how-to explanations would be much appreciated.

Sufiyan Ghori
octo-developer

2 Answers

3

This tutorial from DigitalOcean would be helpful:

https://www.digitalocean.com/community/tutorials/how-to-install-elasticsearch-logstash-and-kibana-elk-stack-on-ubuntu-14-04

  • install Logstash and Kibana on a separate server
  • install and configure Filebeat agents on the IIB servers to ship logs
  • monitor the logs and define filters in Kibana
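To make the Filebeat step concrete, here is a minimal `filebeat.yml` sketch (Filebeat 7.x syntax; the log path and Logstash host are placeholders you would adapt to your environment):

```yaml
filebeat.inputs:
  - type: log
    paths:
      # the default broker log location on Linux (placeholder; adjust to
      # wherever your IIB syslog entries land, e.g. /var/log/messages)
      - /var/log/syslog

output.logstash:
  # hostname/port of the Logstash server's beats input (placeholder)
  hosts: ["logstash.example.com:5044"]
```

On the Logstash side this pairs with a `beats` input listening on port 5044.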
Mojtaba
0

Below is what I think might help you:

You'll have to use Filebeat to ship logs from /var/log/messages to ELK. But that file contains general system log entries alongside the IIB-related ones. A more effective approach would be to create a centralized logging framework (it could be an IIB flow) that logs whatever interests you as messages are processed through your IIB flows and writes it to a local file on the server. Then use Filebeat to ship this IIB-specific log file to ELK.
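If you do ship /var/log/messages as-is, you can separate the IIB entries in Logstash with a grok filter built from the standard syslog patterns. A sketch (the exact line format depends on your syslog configuration, so treat the pattern and field names as a starting point):

```
filter {
  grok {
    # typical syslog line: "<timestamp> <host> <program>[<pid>]: <text>"
    match => {
      "message" => "%{SYSLOGTIMESTAMP:log_timestamp} %{SYSLOGHOST:log_host} %{DATA:program}(?:\[%{POSINT:pid}\])?: %{GREEDYDATA:log_message}"
    }
  }
  # keep only entries written by the broker (program name is an assumption;
  # check what your IIB installation actually logs under)
  if [program] !~ /IIB|IntegrationBus/ {
    drop { }
  }
}
```

IIB messages also carry BIP message codes (e.g. `BIP2152I`), which you could extract with a further `%{WORD:bip_code}` pattern on `log_message` for filtering in Kibana.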

Alternatively, you can enable terminal events on a flow, emit those events with the message payload to an MQ queue or Kafka, and then have Logstash read from IBM MQ or Kafka and load the events into Elasticsearch.
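For the Kafka variant, a minimal Logstash pipeline sketch using the stock `kafka` input and `elasticsearch` output plugins (broker addresses, topic name, and index name are placeholders):

```
input {
  kafka {
    # Kafka broker(s) your IIB flow publishes events to (placeholder)
    bootstrap_servers => "kafka1:9092"
    topics => ["iib-events"]
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    # daily index so old events can be aged out easily
    index => "iib-events-%{+YYYY.MM.dd}"
  }
}
```

Reading directly from IBM MQ is less turnkey: there is no bundled MQ input plugin, so people typically use the JMS input plugin with the MQ client JARs, or bridge MQ to Kafka first.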

Rohan