
I'm using the solution from this post and it works perfectly for all image sizes when testing locally. However, when I deploy to an Azure Cloud Service I can only handle files up to about 50 KB; anything larger and the request hangs.

I've attempted to debug using the remote Azure debugger, and the code hangs on the following line:

var provider = await Request.Content.ReadAsMultipartAsync<InMemoryMultipartFormDataStreamProvider>(new InMemoryMultipartFormDataStreamProvider());
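For context, the provider the linked post describes buffers every multipart part into memory instead of writing it to disk (which is what the stock MultipartFormDataStreamProvider does). A minimal sketch, assuming the class from that post looks roughly like this (the class name comes from the line above; the body is my reconstruction, not the post's exact code):

```csharp
using System.IO;
using System.Net.Http;
using System.Net.Http.Headers;

public class InMemoryMultipartFormDataStreamProvider : MultipartStreamProvider
{
    // Called once per multipart part (form field or file); returning a
    // MemoryStream keeps the whole part in memory rather than on disk.
    public override Stream GetStream(HttpContent parent, HttpContentHeaders headers)
    {
        return new MemoryStream();
    }
}
```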

I've found that an exception is thrown: A first chance exception of type 'System.AccessViolationException' occurred in System.Net.Http.Formatting.dll

As I said, this works fine locally in the emulator but not on the Azure platform.

Can anyone provide any insight into this?

I've also tried adding the following to my web.config (note that maxRequestLength is measured in kilobytes, so 32768 here allows 32 MB, while maxAllowedContentLength is in bytes), but no luck:

<system.web>
  <httpRuntime targetFramework="4.5" maxRequestLength="32768" />
</system.web>

<system.webServer>
  <security>
    <requestFiltering>
      <requestLimits maxAllowedContentLength="4294967295" />
    </requestFiltering>
  </security>
</system.webServer>
asked by NeilT
  • Well, I think I'm in the same boat as you. Did you find a solution? – phpmeh Apr 02 '15 at 22:04
  • I found the issue was with the multipart parser. I implemented this code to parse the data: https://github.com/Vodurden/Http-Multipart-Data-Parser – NeilT Apr 16 '15 at 15:27
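The workaround mentioned in the comment above can be sketched as follows, using the HttpMultipartParser library from that repository to read the raw request stream directly instead of ReadAsMultipartAsync. This is an illustrative sketch only; exact member names may differ between versions of the library:

```csharp
// Read the raw request body and hand it to the third-party parser
// (bypassing System.Net.Http.Formatting's multipart reader entirely).
var stream = await Request.Content.ReadAsStreamAsync();
var parser = new MultipartFormDataParser(stream);

foreach (var file in parser.Files)
{
    // file.Data is a Stream over the uploaded content.
    using (var ms = new MemoryStream())
    {
        file.Data.CopyTo(ms);
        byte[] bytes = ms.ToArray();
        // ... save or process the upload ...
    }
}
```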

0 Answers