6

I'm trying to upload files from a local command-line client to Azure Storage through a Web API hosted on an Azure Web Site. Working with the client is not a problem, and I've got everything working fine locally. Here is the Web API code:

    public async Task<HttpResponseMessage> PostUpload()
    {
        // need a local resource to store uploaded files temporarily
        LocalResource localResource = null;
        try
        {
            // Azure web-site fails here
            localResource = RoleEnvironment.GetLocalResource("TempStorage");
        }
        catch (Exception e)
        {
            return Request.CreateErrorResponse(HttpStatusCode.InternalServerError, "Unable to get access to local resources");
        }

        var provider = new MultipartFormDataStreamProvider(localResource.RootPath);

        // Read the form data.
        await Request.Content.ReadAsMultipartAsync(provider);

        // snipped validation code
        var container = // code to get container

        foreach (var fileData in provider.FileData)
        {
            var filename = GetBlobName(fileData);
            var blob = container.GetBlockBlobReference(filename);
            using (var filestream = File.OpenRead(fileData.LocalFileName))
            {
                blob.UploadFromStream(filestream);
            }
            File.Delete(fileData.LocalFileName);
        }

        return Request.CreateResponse(HttpStatusCode.OK);
    }

Everything works fine when I run it locally, but as soon as I deploy the web site to Azure I can't upload, because Azure Web Sites don't have access to LocalResource. I would need to switch to an Azure Web Role. I can switch, but having to go through the local file system at all is what bothers me.

A LocalResource is also required to construct the MultipartFormDataStreamProvider, and I have not found an alternative way to accept file uploads in Web API. My plan was to channel the upload straight through to Azure without storing anything on a local disk.

Is there any other way to upload files?

p.s. I have seen usages of Shared Access Signatures, where I could give the client application a URL with a signature and let the client upload directly to Azure Blob storage. But I'm not sure how secure that is going to be, and I'm not really comfortable (yet) with passing signatures down to the client. At the moment I presume the client will run in a very hostile environment and nothing coming back from the client can be trusted.

UPD My final solution involved a write-only Shared Access Signature issued on the server and passed down to the client; the client then uploads the files directly to Azure. This way I save a lot of hassle with managing uploaded files. Here is a more detailed description of my solution.
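
In short, the flow looks roughly like this (a sketch only; the container name, action name, "StorageConnectionString" setting and 15-minute expiry are illustrative placeholders, not my actual values):

    // Server side (Web API): issue a short-lived, write-only SAS scoped to a single blob.
    public HttpResponseMessage GetUploadUrl(string filename)
    {
        var account = CloudStorageAccount.Parse(
            CloudConfigurationManager.GetSetting("StorageConnectionString"));
        var container = account.CreateCloudBlobClient().GetContainerReference("uploads");
        var blob = container.GetBlockBlobReference(filename);

        var sas = blob.GetSharedAccessSignature(new SharedAccessBlobPolicy
        {
            Permissions = SharedAccessBlobPermissions.Write,
            SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(15)
        });

        // The client receives the blob URI with the SAS token appended.
        return Request.CreateResponse(HttpStatusCode.OK, blob.Uri + sas);
    }

    // Client side: upload directly to blob storage using the SAS URL from the server.
    var blob = new CloudBlockBlob(new Uri(sasUrl));
    using (var filestream = File.OpenRead(localPath))
    {
        blob.UploadFromStream(filestream);
    }

Because the signature is write-only, short-lived and scoped to a single blob, a hostile client cannot use it to read or list anything else in the storage account.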

trailmax
  • What exception is being thrown by the following code: localResource = RoleEnvironment.GetLocalResource("TempStorage")? – Brian Dishaw Jun 03 '13 at 14:08
  • are you comfortable using REST API? – dev2d Jun 03 '13 at 14:15
  • @BrianDishaw The exception is something along the lines of "Unable to acquire LocalResource, System.Runtime.InteropServices.SEHException (0x80004005): External component has thrown an exception.". Also this page (http://msdn.microsoft.com/en-us/library/windowsazure/ee758708.aspx) suggests that local resources are only for Web- and Worker-roles, not for web-sites. – trailmax Jun 03 '13 at 14:16
  • @VJD yeah, kind of. I guess I will have to bite the bullet, generate a Shared Access Signature and pass it down to the client, and then the client uploads files directly to Azure via the signature. – trailmax Jun 03 '13 at 14:18
  • I think using REST will allow you to do it easily. Please refer to the following and check Put Blob: http://convective.wordpress.com/2010/08/18/examples-of-the-windows-azure-storage-services-rest-api/ – dev2d Jun 03 '13 at 14:21
  • @VJD thanks for the link, but I have seen this one. It requires handing the storage account key over to the client, and that is a NO-NO in my scenario. Also, I think there are more up-to-date APIs to do the same things with Azure. – trailmax Jun 03 '13 at 14:32
  • You could probably take a look at my answer in this post: http://stackoverflow.com/questions/15842496/is-it-possible-to-override-multipartformdatastreamprovider-so-that-is-doesnt-sa/15843410#15843410 ...I don't specifically talk about uploading to Azure there, but it should give you a good idea. – Kiran Jun 03 '13 at 15:02
  • @KiranChalla now that looks promising! thanks! – trailmax Jun 03 '13 at 15:06

3 Answers

12

This isn't exactly the answer you are looking for, but you can use local storage with Azure Web Sites by combining MultipartFileStreamProvider with Path.GetTempPath(). The code would look something like this:

    public async Task<HttpResponseMessage> PostUpload()
    {
        var provider = new MultipartFileStreamProvider(Path.GetTempPath());

        // Read the form data.
        await Request.Content.ReadAsMultipartAsync(provider);

        // do the rest the same
    }
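
The omitted part is the same loop as in the question; one thing to keep in mind is that MultipartFileStreamProvider saves each part under a generated BodyPart_* file name in the temp path, so the client's original file name (if you want it for the blob) has to come from the part's Content-Disposition header. A sketch, with the container lookup assumed:

    foreach (var fileData in provider.FileData)
    {
        // LocalFileName points at a generated temp file (BodyPart_<guid>);
        // the original file name, if present, is in the Content-Disposition header.
        var disposition = fileData.Headers.ContentDisposition;
        var blobName = disposition != null && !string.IsNullOrEmpty(disposition.FileName)
            ? disposition.FileName.Trim('"')
            : Path.GetFileName(fileData.LocalFileName);

        var blob = container.GetBlockBlobReference(blobName);
        using (var filestream = File.OpenRead(fileData.LocalFileName))
        {
            blob.UploadFromStream(filestream);
        }
        File.Delete(fileData.LocalFileName);
    }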
Daniel Auger
  • Now I wonder how Azure is managing the TempPath() in web-sites and how reliable that is. Could it not be wiped half-way through big file upload? – trailmax Jun 03 '13 at 16:31
  • I've removed the dead link that indicated that the temp directory isn't transient. To my knowledge this is still true. – Daniel Auger Sep 05 '15 at 05:54
2

I found this StackOverflow question where the answer overrides the MultipartFormDataStreamProvider so that files are not stored locally first, but written directly to an Azure stream. See: Is it possible to override MultipartFormDataStreamProvider so that is doesn't save uploads to the file system?
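
Roughly, the idea adapted to Azure blob storage looks like this (a sketch only; the provider class name is made up, the container is assumed to be resolved elsewhere, and error handling is omitted):

    // A MultipartStreamProvider that returns a blob write stream for each part,
    // so uploads go straight to storage and never touch the local file system.
    public class AzureBlobMultipartProvider : MultipartStreamProvider
    {
        private readonly CloudBlobContainer _container;

        public AzureBlobMultipartProvider(CloudBlobContainer container)
        {
            _container = container;
        }

        public override Stream GetStream(HttpContent parent, HttpContentHeaders headers)
        {
            // Use the client-supplied file name when present, otherwise a GUID.
            var disposition = headers.ContentDisposition;
            var name = disposition != null && !string.IsNullOrEmpty(disposition.FileName)
                ? disposition.FileName.Trim('"')
                : Guid.NewGuid().ToString();

            // OpenWrite returns a stream that writes directly to the block blob.
            return _container.GetBlockBlobReference(name).OpenWrite();
        }
    }

The controller would then call Request.Content.ReadAsMultipartAsync(new AzureBlobMultipartProvider(container)) instead of using a file-based provider.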

But I have to say I also like the solution of trailmax.

woutercx
1

One possible solution that would let your code work as-is with a LocalResource is to host it inside a worker role that self-hosts Web API via OWIN.

You can find a simple walkthrough at: http://www.asp.net/web-api/overview/hosting-aspnet-web-api/host-aspnet-web-api-in-an-azure-worker-role

You just need to start the OWIN-hosted API within the OnStart() method of the RoleEntryPoint. Keep in mind that you can also return HTML from a Web API response, so a worker role can be a very flexible base project.

Here's a quick snippet showing how to set up the Owin host from the link above:

    // In the worker role's RoleEntryPoint class
    // (the walkthrough uses the Microsoft.AspNet.WebApi.OwinSelfHost package).
    private IDisposable _webApp = null;

    public override bool OnStart() {
        ServicePointManager.DefaultConnectionLimit = 5;

        // "DefaultEndpoint" must match an endpoint declared in the service definition.
        var endpoint = RoleEnvironment.CurrentRoleInstance.InstanceEndpoints["DefaultEndpoint"];
        var baseUri = string.Format("{0}://{1}", endpoint.Protocol, endpoint.IPEndpoint);

        _webApp = WebApp.Start<Startup>(new StartOptions(baseUri));

        return base.OnStart();
    }

    public override void OnStop() {
        if (_webApp != null) {
            _webApp.Dispose();
        }
        base.OnStop();
    }

    ...

    using System.Web.Http;
    using Owin;

    class Startup {
        public void Configuration(IAppBuilder app) {
            var config = new HttpConfiguration();

            config.Routes.MapHttpRoute("Default", "{controller}/{id}",
                                       new { id = RouteParameter.Optional });

            app.UseWebApi(config);
        }
    }
Matthew
  • I could've just deployed the site into a web role and had a LocalResource available. Deploying just the Web API into a worker role is overkill. Anyway, I solved the problem by issuing a write-only Shared Access Signature for blob storage and passing that down to the client via the Web API; the client then uploads the files directly to Azure. – trailmax Jul 19 '13 at 21:15