I am exploring ways to incrementally copy S3 objects to Azure Blob Storage. I tried Azure Data Factory, but I was not able to find this option.

The S3 bucket has millions of objects, and without an incremental option a full copy takes hours to complete.

I am open to exploring other tools/options.

Dan

1 Answer


If you need to do a daily/hourly incremental copy, a schedule or tumbling window trigger should be the option. Here is an example to reference. ADF v2 also supports compression in copy: you can specify the type and level of compression for the S3 objects; click here for more information.
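To illustrate the tumbling window approach, here is a minimal sketch of a trigger definition. The trigger, pipeline, and parameter names (`IncrementalCopyTrigger`, `CopyS3ToBlob`, `windowStart`, `windowEnd`) are hypothetical placeholders, not from the question; the `@trigger().outputs.windowStartTime` / `windowEndTime` expressions are the standard way a tumbling window trigger passes its window bounds to a pipeline, where they can then bound a modified-time filter on the S3 source.

```json
{
  "name": "IncrementalCopyTrigger",
  "properties": {
    "type": "TumblingWindowTrigger",
    "typeProperties": {
      "frequency": "Hour",
      "interval": 1,
      "startTime": "2018-11-01T00:00:00Z",
      "maxConcurrency": 1
    },
    "pipeline": {
      "pipelineReference": {
        "referenceName": "CopyS3ToBlob",
        "type": "PipelineReference"
      },
      "parameters": {
        "windowStart": "@trigger().outputs.windowStartTime",
        "windowEnd": "@trigger().outputs.windowEndTime"
      }
    }
  }
}
```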

Wang Zhang
  • I am still new to Data Factory. It looks to me like that example works well if the files are well structured in the S3 bucket. In our scenario, we don't know which folder or which file will have updates. What I am trying to achieve is: after the initial load, when we run Data Factory on a schedule, only the changed files should be moved from AWS to Azure. – Dan Nov 02 '18 at 19:09
  • Did you find a solution to this? – Markive Nov 20 '18 at 11:23
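The behaviour Dan asks for in the comments (copy only files changed since the last run, without knowing which folders changed) boils down to a last-modified watermark filter over a bucket listing. A minimal sketch of that logic, with the actual S3 listing and Azure upload omitted; `select_changed` and the dict shape (`Key`, `LastModified`, matching what `list_objects_v2` returns) are illustrative assumptions, not an ADF or boto3 API:

```python
from datetime import datetime, timezone

def select_changed(objects, watermark):
    """Return (keys modified after `watermark`, new watermark).

    `objects` is a list of dicts shaped like entries from an S3
    list_objects_v2 response: {"Key": str, "LastModified": datetime}.
    The new watermark is the latest LastModified seen, so the next
    scheduled run only picks up objects modified after this run.
    """
    changed = [o["Key"] for o in objects if o["LastModified"] > watermark]
    latest = max((o["LastModified"] for o in objects), default=watermark)
    return changed, latest

# Simulated listing: one object older than the watermark, one newer.
listing = [
    {"Key": "a.csv", "LastModified": datetime(2018, 11, 1, tzinfo=timezone.utc)},
    {"Key": "b.csv", "LastModified": datetime(2018, 11, 3, tzinfo=timezone.utc)},
]
watermark = datetime(2018, 11, 2, tzinfo=timezone.utc)

changed, new_wm = select_changed(listing, watermark)
# changed == ["b.csv"]; new_wm is b.csv's LastModified
```

The watermark (persisted between runs, e.g. in a small blob or table) is what makes the copy incremental; listing millions of objects still takes time, but only the changed ones are transferred.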