I am trying to feed some NetFlow data into Kafka. I have some netflow.pcap
files which I read with
tcpdump -r netflow.pcap
and get output like this:
14:48:40.823468 IP abts-kk-static-242.4.166.122.airtelbroadband.in.35467 > abts-kk-static-126.96.166.122.airtelbroadband.in.9500: UDP, length 1416
14:48:40.824216 IP abts-kk-static-242.4.166.122.airtelbroadband.in.35467 > abts-kk-static-126.96.166.122.airtelbroadband.in.9500: UDP, length 1416
. . . .
In the official docs they describe the traditional way: start a Kafka producer, start a Kafka consumer, and type some data into the producer terminal, which then shows up in the consumer. Good. Working.
Here they show how to pipe a file into a Kafka producer. Mind you, just one single file, not multiple files.
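For reference, the single-file version is just shell redirection into the console producer; something like this (the topic name test and the localhost:9092 broker address are from the quickstart, and netflow.txt is a made-up file name):

```shell
# Feed one file's lines into a Kafka topic via the console producer.
# Assumes Kafka is running locally and the topic "test" already exists.
bin/kafka-console-producer.sh --broker-list localhost:9092 --topic test < netflow.txt
```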
Question is:
How can I feed the output of a shell script into a Kafka broker?
For example, the shell script is:
#!/bin/bash
FILES=/path/to/*
for f in $FILES
do
    tcpdump -r "$f"
done
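Since the console producer reads from stdin, my guess is that the whole loop's output can be piped into it in one go; a sketch along these lines (the broker address and the topic name netflow are my assumptions, not something from the docs):

```shell
#!/bin/bash
# Decode every capture file and stream all the resulting text lines
# into a single console-producer process over one pipe.
# Assumes Kafka is running on localhost:9092 and the topic "netflow" exists.
for f in /path/to/*.pcap
do
    tcpdump -r "$f"
done | bin/kafka-console-producer.sh --broker-list localhost:9092 --topic netflow
```

Piping the loop as a whole, rather than starting one producer per file, would keep a single producer connection open for all the files.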
I can't find any documentation or article that mentions how to do this. Any idea? Thanks!