
My use case:
IoT Hub -> Event Hub -> ASA -> Azure SQL (staging table)

The problem is that I get duplicate data in my staging table.
For testing purposes I sent exactly 10,000 JSON messages to IoT Hub, but my staging table then contains far more rows, over 40,000.

Is there something I have to adjust in Event Hub or in ASA? Is it normal for ASA to process duplicate messages?

  • Was this on a single run, or did you restart the job multiple times? If it’s the first case, please share your query. If you did restart the job, every restart may have created duplicate data if it re-ingested the same time span without you cleaning the destination table yourself. – Florian Eiden Apr 27 '22 at 15:06
  • And no, it’s not normal: ASA provides exactly-once processing. But to get exactly-once delivery, you need to think about your pipeline end to end (how you restart your job, etc.) – Florian Eiden Apr 27 '22 at 15:08
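One way to guard against the restart scenario the comments describe is to make the staging insert itself idempotent, so that re-delivering the same events cannot create extra rows. Below is a minimal sketch using Python's built-in sqlite3 as a stand-in for Azure SQL; the column names (`event_id`, `device_id`, `temperature`) are hypothetical and assume each message carries a unique event id. In T-SQL the same effect is usually achieved with `MERGE` or `INSERT ... WHERE NOT EXISTS` keyed on that id.

```python
import sqlite3

# sqlite3 stands in for Azure SQL; schema and column names are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE staging (
        event_id    TEXT PRIMARY KEY,  -- natural de-duplication key
        device_id   TEXT,
        temperature REAL
    )
""")

# Three deliveries, one of which is a duplicate (e.g. after a job restart
# re-ingested the same time span).
messages = [
    ("evt-1", "dev-A", 21.5),
    ("evt-2", "dev-A", 21.7),
    ("evt-1", "dev-A", 21.5),  # duplicate delivery
]

for msg in messages:
    # INSERT OR IGNORE silently drops rows whose event_id already exists,
    # making the write idempotent under at-least-once delivery.
    conn.execute("INSERT OR IGNORE INTO staging VALUES (?, ?, ?)", msg)

count = conn.execute("SELECT COUNT(*) FROM staging").fetchone()[0]
print(count)  # 2 rows despite 3 deliveries
```

With a keyed insert like this, restarting the job over an already-ingested time span leaves the staging table unchanged instead of multiplying its row count.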
