I'm still new to Azure Data Factory and am trying to move files that are dumped into my S3 bucket daily over to Azure Blob Storage. I have already created the datasets (for source and sink) and the linked services in Data Factory.
But since my S3 bucket receives a new file every day, I'm wondering how to move the latest file that was dropped into S3 (say at 5am EST) on a daily basis. I have looked through most of the answers online, like this, this, this and this. But none of them explains how to figure out which is the latest file in S3 (maybe based on last-modified date/time, or by matching a file name pattern like 'my_report_YYYYMMDD.csv.gz') and copy only that file to the destination blob.
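Since the file names follow a date pattern, I was thinking I could parameterize the source dataset's file name and build it dynamically in the pipeline. Something like this expression (just a sketch based on the ADF expression-language docs; I'm converting UTC to Eastern time since the drop happens at 5am EST, but I'm not sure this is the right approach):

```
@concat(
    'my_report_',
    formatDateTime(
        convertTimeZone(utcnow(), 'UTC', 'Eastern Standard Time'),
        'yyyyMMdd'
    ),
    '.csv.gz'
)
```

Would passing something like this into the Copy activity's source dataset work, or is filtering on the last-modified timestamp the more reliable way to go?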
Thank you in advance for your help!