
I'm using spark-sql 2.4.1, kafka-client 10.x, with Java 8. I need to broadcast some configuration/metadata information in my Spark application.

I'm currently using Spark Structured Streaming. In my application, the metadata changes every few days, i.e. new records are added or existing records are modified.

As it is a streaming application, I start the application/Spark job only once; at that time I load all the metadata and broadcast it, roughly like the sketch below.
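
Simplified sketch of the current setup (the `loadMetaData()` helper is just a placeholder for my actual lookup code):

```java
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.broadcast.Broadcast;
import org.apache.spark.sql.SparkSession;

import java.util.HashMap;
import java.util.Map;

public class StreamingJob {

    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("metadata-broadcast-job")
                .getOrCreate();

        JavaSparkContext jsc = new JavaSparkContext(spark.sparkContext());

        // Metadata is loaded once at startup and broadcast to all executors
        Map<String, String> metaData = loadMetaData();
        Broadcast<Map<String, String>> metaBroadcast = jsc.broadcast(metaData);

        // ... build the structured streaming query here; the executors read
        // metaBroadcast.value() inside map functions / UDFs ...
    }

    // Placeholder for the actual metadata lookup (DB, file, etc.)
    private static Map<String, String> loadMetaData() {
        return new HashMap<>();
    }
}
```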

After a few days, when something in the metadata changes, how do I load and propagate those changes to every executor, i.e. how do I refresh the metadata on each executor?
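
Conceptually I imagine something like the wrapper below, where the driver re-broadcasts fresh metadata and unpersists the stale copy (the class and method names here are placeholders I made up, not an existing Spark API), but I'm not sure whether this is the right approach with Structured Streaming, or where the refresh should be triggered:

```java
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.broadcast.Broadcast;

import java.util.Map;

// Driver-side holder: the current Broadcast reference would be captured by
// the closures built for each micro-batch (e.g. inside foreachBatch).
public class RefreshableMetaData {

    private final JavaSparkContext jsc;
    private volatile Broadcast<Map<String, String>> current;

    public RefreshableMetaData(JavaSparkContext jsc, Map<String, String> initial) {
        this.jsc = jsc;
        this.current = jsc.broadcast(initial);
    }

    // Called on the driver when the metadata has changed:
    // ship a new copy to the executors and drop the stale one.
    public void refresh(Map<String, String> fresh) {
        Broadcast<Map<String, String>> old = current;
        current = jsc.broadcast(fresh);
        old.unpersist();
    }

    public Map<String, String> get() {
        return current.value();
    }
}
```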

