
I'm uploading rather a lot of data (30 GB+) across thousands of files. The whole process takes a while, but I've consistently found that exactly 15 minutes into the transfers, the upload process fails and I get errors for each file currently in flight (the uploads are multithreaded, so several run at once). The error I'm getting is: "error: Amazon.S3.AmazonS3Exception: The difference between the request time and the current time is too large. ---> Amazon.Runtime.Internal.HttpErrorResponseException: The remote server returned an error: (403) Forbidden. ---> System.Net.WebException: The remote server returned an error: (403) Forbidden."

Seeing as it's exactly 15 minutes from the start of the whole process that this thing crashes, I think the client may be timing out. However, I've set my client's timeout to 45 minutes, I think:

```
 {
     var client = new AmazonS3Client(new AmazonS3Config()
     {
         RegionEndpoint = RegionEndpoint.EUWest2,
         UseAccelerateEndpoint = true,
         Timeout = TimeSpan.FromMinutes(45),
         ReadWriteTimeout = TimeSpan.FromMinutes(45),
         RetryMode = RequestRetryMode.Standard,
         MaxErrorRetry = 10
     });

     Parallel.ForEach(srcObjList, async srcObj =>
     {
         try
         {
             var putObjectRequest = new PutObjectRequest();
             putObjectRequest.BucketName = destBucket;
             putObjectRequest.Key = srcObj.Key;
             putObjectRequest.FilePath = filePathString;
             putObjectRequest.CannedACL = S3CannedACL.PublicRead;

             var uploadTask = client.PutObjectAsync(putObjectRequest);

             lock (threadLock)
             {
                 syncTasks.Add(uploadTask);
             }

             await uploadTask;
         }
         catch (Exception e)
         {
             Debug.LogError($"Copy task ({srcObj.Key}) failed with error: {e}");
             throw;
         }
     });

     try
     {
         await Task.WhenAll(syncTasks.Where(x => x != null).ToArray());
     }
     catch (Exception e)
     {
         Debug.LogError($"Upload encountered an issue: {e}");
     }
 });

await transferOperations;

Debug.Log("Done!");
```

1 Answer


The documentation doesn't specify a maximum timeout value, but given that you're seeing failures at exactly 15 minutes, it stands to reason there is some upper limit on the timeout, either a hard limit in the SDK or service, or something in the S3 bucket's settings.

This answer suggests a clock-synchronization difference might also be the cause, but then I'd wonder why the transfer starts at all.
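If clock skew does turn out to be the cause, the SDK has a built-in mitigation: `CorrectForClockSkew` on the shared `ClientConfig` that `AmazonS3Config` inherits from, which makes the SDK retry skew-rejected requests with an adjusted timestamp. A minimal sketch, assuming a reasonably recent AWSSDK.S3 version (the region and accelerate values here just mirror your existing config):

```
using Amazon;
using Amazon.S3;

// Sketch only: CorrectForClockSkew tells the SDK to detect a
// skew-related rejection and retry with a corrected request time,
// rather than surfacing the 403 to the caller.
var config = new AmazonS3Config
{
    RegionEndpoint = RegionEndpoint.EUWest2,
    UseAccelerateEndpoint = true,
    CorrectForClockSkew = true
};
var client = new AmazonS3Client(config);
```

Note this only helps if the 403 really is a skew error; if the machine's clock drifts continuously, fixing NTP synchronization on the host is the more durable solution.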
