
I have a WCF service with a very time-consuming method that uploads large data files to Azure table storage.

I set my timeouts at runtime on the client side as follows:

binding = new BasicHttpBinding();
binding.CloseTimeout = TimeSpan.FromMilliseconds(2147483647.0);
binding.OpenTimeout = TimeSpan.FromMilliseconds(2147483647.0);
binding.ReceiveTimeout = TimeSpan.FromMilliseconds(2147483647.0);
binding.SendTimeout = TimeSpan.FromMilliseconds(2147483647.0);

and my web.config has the timeouts set as follows:

<bindings>
  <basicHttpBinding>
    <binding maxBufferPoolSize="2147483647" maxReceivedMessageSize="2147483647"
             maxBufferSize="2147483647" sendTimeout="22:30:00" receiveTimeout="22:30:00"
             openTimeout="22:30:00" closeTimeout="22:30:00">
      <readerQuotas maxArrayLength="2147483647" maxBytesPerRead="2147483647"
                    maxDepth="2147483647" maxNameTableCharCount="2147483647"
                    maxStringContentLength="2147483647" />
    </binding>
  </basicHttpBinding>
</bindings>

I'm running my code in VS 2012, and the problem I am seeing is that the file upload method crashes after 60 minutes with an unhandled CommunicationException: "The remote server returned an error: NotFound." If I press F5, the upload continues and completes. The crash appears in Reference.cs at this point:

public void EndFileUploadMethod(System.IAsyncResult result) {
    object[] _args = new object[0];
    base.EndInvoke("FileUploadMethod", _args, result);
}

2 Answers


For file uploads, there are a few other configuration settings that need to be in place. First, how large are the files you are trying to upload? If they are larger than about 50 MB, you may have to chunk them into smaller pieces and send the pieces over (a rough sketch of that follows the config below).

Before that, try adding the settings below to your config. Don't worry about the <authentication> and <compilation> tags. It is the <httpRuntime> and <requestLimits> tags that are of interest.

The maxAllowedContentLength attribute value below is arbitrary, so set it to whatever you need; it is measured in bytes.

  <system.web>
    <compilation debug="true" targetFramework="4.0"/>
    <authentication mode="Windows" />
    <httpRuntime maxRequestLength="2147483647" />
  </system.web>

  <system.webServer>
    <security>
      <requestFiltering>
        <requestLimits maxAllowedContentLength="2000000000" />
      </requestFiltering>
    </security>
  </system.webServer>
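
If you do end up needing to chunk the upload, here is a rough client-side sketch of the idea. It is only an illustration under a couple of assumptions: MyServiceClient stands in for your generated proxy, and UploadChunk(fileName, data, offset) is a hypothetical operation you would add to your service contract.

    using System;
    using System.IO;

    public static class ChunkedUploader
    {
        // 4 MB per call keeps each message comfortably under the limits configured above.
        private const int ChunkSize = 4 * 1024 * 1024;

        // MyServiceClient is the generated WCF proxy; UploadChunk(fileName, data, offset)
        // is a hypothetical operation you would add to the service contract.
        public static void Upload(MyServiceClient client, string path)
        {
            using (FileStream stream = File.OpenRead(path))
            {
                byte[] buffer = new byte[ChunkSize];
                long offset = 0;
                int read;
                while ((read = stream.Read(buffer, 0, buffer.Length)) > 0)
                {
                    // Send only the bytes actually read on this pass.
                    byte[] chunk = new byte[read];
                    Array.Copy(buffer, chunk, read);
                    client.UploadChunk(Path.GetFileName(path), chunk, offset);
                    offset += read;
                }
            }
        }
    }

Each individual request then stays small and quick, so no single call ever gets anywhere near the one-hour mark.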
  • Thanks for the info. Are these attributes related to limits on the size of the file being transferred to the server side? If so, I don't think they are the problem; the file I'm testing with is 4.5 MB. That is the zipped size; I'm unzipping and processing on the server, which is the time-consuming part. – ChrisW Feb 27 '13 at 16:21
  • Well, it depends where the error is being thrown. Do you know if your upload request actually makes it to the service? In my non-Azure WCF applications, when I get the error mentioned on file upload, it is because the default content length of the web server is too small. Using the configuration above on the server end, I was able to upload files. Even if you only have a few MB, the default size is usually too small. This is a separate config from the WCF message size. – Jordan Parmer Feb 27 '13 at 16:55
  • Here is another question that explains it: http://stackoverflow.com/questions/6327452/which-gets-priority-maxrequestlength-or-maxallowedcontentlength. Are you using IIS at all or posting directly to Azure? – Jordan Parmer Feb 27 '13 at 16:57
  • The service is definitely being reached, as it gets about 35% of the way through the upload of the file. When I last ran the upload I had maxRequestLength="2147483647" and maxAllowedContentLength="4294967295" (previously, maxAllowedContentLength was 2000000000), and I experienced the same crash after one hour. I am not using IIS; I'm posting straight to Azure. – ChrisW Mar 04 '13 at 11:03

I use a similar Azure blob container to store my content. My code doesn't have any timeout setting. Try this and let me know.

    public static CloudBlobContainer Container
    {
        get
        {
            CloudStorageAccount.SetConfigurationSettingPublisher((configName, configSetter) =>
                {
                    // Provide the configSetter with the initial value
                    configSetter(RoleEnvironment.GetConfigurationSettingValue(configName));
                    RoleEnvironment.Changed += (sender, arg) =>
                    {
                        if (arg.Changes.OfType<RoleEnvironmentConfigurationSettingChange>().Any((change) =>
                        (change.ConfigurationSettingName == configName)))
                        {
                            // The corresponding configuration setting has changed, so propagate the value
                            if (!configSetter(RoleEnvironment.GetConfigurationSettingValue(configName)))
                            {
                                // In this case, the change to the storage account credentials in the
                                // service configuration is significant enough that the role needs to be
                                // recycled in order to use the latest settings (for example, the 
                                // endpoint may have changed)
                                RoleEnvironment.RequestRecycle();
                            }
                        }
                    };
                });
            CloudStorageAccount acc = CloudStorageAccount.FromConfigurationSetting("RecordingsStorageAccount");
            CloudBlobClient bc = acc.CreateCloudBlobClient();
            CloudBlobContainer c = bc.GetContainerReference(RoleEnvironment.GetConfigurationSettingValue("RecordingsContainer"));
            return c;
        }
    }
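
For what it's worth, a minimal usage sketch of that property would look something like the following (this assumes the 1.x storage client library used above; the blob name and local file path are made up):

    // Upload a local file into the container exposed by the Container property.
    CloudBlobContainer container = Container;
    container.CreateIfNotExist(); // no-op if the container already exists

    CloudBlockBlob blob = container.GetBlockBlobReference("upload.zip");
    using (FileStream fs = File.OpenRead(@"C:\temp\upload.zip"))
    {
        blob.UploadFromStream(fs); // streams the file contents up to blob storage
    }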
  • I need to do quite a bit of processing before transferring the file content to the Azure table storage, so I don't think this applies – ChrisW Feb 27 '13 at 16:23