I'm new to Apache Kafka and would like to know how big a message can be in Kafka. Is it still efficient to use Kafka if the messages become quite big, say hundreds of MB?
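For context, these are the size-related settings I've found so far; if I understand correctly, the default broker limit is around 1 MB, and all of these would need to be raised together to allow larger messages (please correct me if I have this wrong):

```properties
# Broker (server.properties): max size of a message batch the broker accepts
message.max.bytes=1048576

# Topic-level override of the broker setting
max.message.bytes=1048576

# Producer: max size of a single request sent to the broker
max.request.size=1048576

# Consumer: max bytes fetched per request / per partition
fetch.max.bytes=52428800
max.partition.fetch.bytes=1048576
```

The values above are just the (approximate) defaults, not a recommendation for large files.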
I have a scenario in which I need to copy files to HDFS for use by a Hadoop job; the same files are also used by other processes. I was thinking of producing the files to Apache Kafka first, so that one consumer copies them to HDFS while the other consumers read them directly from Kafka. Is this a good approach?