7

I want users to be able to download blobs from my website. I want the fastest/cheapest/best way to do this.

Here's what I came up with:

        CloudBlobContainer blobContainer = CloudStorageServices.GetCloudBlobsContainer();

        CloudBlockBlob blob = blobContainer.GetBlockBlobReference(blobName);
        MemoryStream memStream = new MemoryStream();
        blob.DownloadToStream(memStream);

        Response.ContentType = blob.Properties.ContentType;
        Response.AddHeader("Content-Disposition", "Attachment; filename=" + fileName + fileExtension);
        Response.AddHeader("Content-Length", (blob.Properties.Length).ToString());
        Response.BinaryWrite(memStream.ToArray());
        Response.End();

I'm using MemoryStream now, but I'm guessing I should go with FileStream because the blobs are, in some cases, large. Right?

I tried it with FileStream but failed miserably. Could you give me some code for FileStream?

Reft
  • 2,333
  • 5
  • 36
  • 64

2 Answers

17

IMHO, the cheapest and fastest solution would be to download directly from blob storage. Currently your code first downloads the blob to your server and then streams it from there. Instead, you could create a Shared Access Signature with Read permission and the Content-Disposition header set, build the blob URL from it, and use that URL. That way, the blob contents are streamed directly from storage to the client browser.

For example look at the code below:

    public ActionResult Download()
    {
        CloudStorageAccount account = new CloudStorageAccount(new StorageCredentials("accountname", "accountkey"), true);
        var blobClient = account.CreateCloudBlobClient();
        var container = blobClient.GetContainerReference("container-name");
        var blob = container.GetBlockBlobReference("file-name");
        var sasToken = blob.GetSharedAccessSignature(new SharedAccessBlobPolicy()
        {
            Permissions = SharedAccessBlobPermissions.Read,
            SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(10), // assuming the blob can be downloaded in 10 minutes
        }, new SharedAccessBlobHeaders()
        {
            ContentDisposition = "attachment; filename=file-name"
        });
        var blobUrl = string.Format("{0}{1}", blob.Uri, sasToken);
        return Redirect(blobUrl);
    }
Gaurav Mantri
  • 128,066
  • 12
  • 206
  • 241
  • Hi! Would I need to use a Shared Access Signature even if I've already programmed who is allowed to download what? – Reft May 22 '14 at 13:58
  • Yes, if your container is private. If your container is public, then you could just set the blob's content-disposition property and then you don't have to set it in SAS and you could use blob's URL. – Gaurav Mantri May 22 '14 at 14:00
  • Thanks! I'm using LINQ to validate the download (users can only download blobs on their own account). I've never used SAS before and I feel completely lost with it. Is there any possibility you could give me some SAS code that goes hand in hand with the example you gave me? I really appreciate your help. Thanks again. – Reft May 22 '14 at 14:23
  • I wrote a blog post about SAS sometime ago which you can read here: http://gauravmantri.com/2013/02/13/revisiting-windows-azure-shared-access-signature/. Hopefully this should give you some ideas. Feel free to ask any specific questions that you may have on SAS (though as a separate question and not in comments here :)). It's quite awesome and very powerful. – Gaurav Mantri May 22 '14 at 14:36
  • Ah awesome:) Would I need to use SAS on my upload function as well (normal table operations and such)? If I understand it correctly, SAS not only gives users access but also gives you better performance and a cheaper bill for the host? – Reft May 22 '14 at 15:29
  • SAS essentially gives your users temporary access to storage account. You can certainly use SAS for upload as well. That way the users would directly upload blobs into your storage account instead of first uploading on your server and then you transferring them to blob storage. But for this CORS must be enabled on your storage account. Check out this post on how you can accomplish browser based uploads directly to your storage account: http://gauravmantri.com/2013/12/01/windows-azure-storage-and-cors-lets-have-some-fun/. Contd. – Gaurav Mantri May 22 '14 at 15:34
  • Since your server is not coming in between, you will definitely see some performance improvement but you have to be careful regarding the validity duration of your SAS token. You should keep it long enough for the user to perform operation. – Gaurav Mantri May 22 '14 at 15:35
  • Sweet! But how much difference would there be between uploading directly and uploading to the server and then transferring the files to blob storage (cost/performance)? Would it, for example, double the performance or halve the price, or is it not that noticeable? (Is SAS worth it?) Thanks:) – Reft May 22 '14 at 15:53
  • Think of it this way ... assuming you have 1000 users of your application and each user is uploading 1 MB file. If the traffic is routed via your server, your server is handling/processing 1 GB of data thus the performance is bound to get impacted. If they are uploading directly into blob storage, that saves your server this much data processing. HTH. – Gaurav Mantri May 22 '14 at 16:23
  • http://stackoverflow.com/questions/23818137/azure-sas-download-blob I'm afraid I'm lost again. Would you mind helping me haha:(? – Reft May 22 '14 at 22:32
  • This approach is pretty awesome. SharedAccessExpiryTime can have longer time spans (1day, 10days etc) if you need. – Dhanuka777 Nov 17 '15 at 04:42
  • For the best possible download speeds, throw a CDN in front of the generated blob url (e.g. CloudFront). I've used this approach along with SAS tokens to allow users to download videos securely and it works perfectly. No need to hit the server and downloading securely from an edge location. – GFoley83 Dec 15 '15 at 22:17
  • @GFoley83 This won't work if server uses auth will it? Client browser will contact `/download` with `Authorization` header, response will be `30X`, client calls up `Location` header and also tacks on all other headers from initial request incl. `Authorization`, this results in `InvalidAuthenticationInfo` from azure. Any ideas? – Mardoxx Mar 15 '18 at 13:45
  • Sorry @Mardoxx I haven't worked on that product in over 2 years so hard to say. Are you sure your SAS token is reaching Azure? Is it definitely the most recent SAS token for the latest request? CDNs usually ignore query string params `?somevar=value` which is where your SAS token would be. Caching + Auth is a tricky bag. – GFoley83 Mar 15 '18 at 19:44
  • @GFoley83 SAS reaches azure but because Authorization header is present in the request, it tries to use its value over the sas token and fails! Difficult indeed. – Mardoxx Mar 16 '18 at 08:13
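As a sketch of the public-container option mentioned in the comments (no SAS needed): you can persist the `Content-Disposition` header on the blob itself and hand out the plain blob URL. The account, container, and file names are placeholders, and this assumes the same Microsoft.WindowsAzure.Storage SDK as the answer above:

```csharp
// Assumes: using Microsoft.WindowsAzure.Storage;
//          using Microsoft.WindowsAzure.Storage.Auth;
//          using Microsoft.WindowsAzure.Storage.Blob;
CloudStorageAccount account = new CloudStorageAccount(
    new StorageCredentials("accountname", "accountkey"), true);
var container = account.CreateCloudBlobClient().GetContainerReference("container-name");
var blob = container.GetBlockBlobReference("file-name");

blob.FetchAttributes(); // load current properties so we don't overwrite them
blob.Properties.ContentDisposition = "attachment; filename=file-name";
blob.SetProperties(); // persist the header back to storage

// With a public container, clients can now download directly:
string downloadUrl = blob.Uri.AbsoluteUri;
```

This is a one-time setup per blob; browsers hitting `downloadUrl` will then get the attachment behavior without any SAS token in the query string.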
1

Your code is almost right. Try this:

    public virtual ActionResult DownloadFile(string name)
    {
        Response.AddHeader("Content-Disposition", "attachment; filename=" + name); // force download
        CloudBlobContainer blobContainer = CloudStorageServices.GetCloudBlobsContainer();
        CloudBlockBlob blob = blobContainer.GetBlockBlobReference(name);
        blob.DownloadToStream(Response.OutputStream);
        return new EmptyResult();
    }

Hope this helps.

lopezbertoni
  • 3,551
  • 3
  • 37
  • 53
  • Hey! Sorry for being unclear. My download function is working, but it's using MemoryStream. Shouldn't I go with FileStream? I'd also like to know if this is the cheapest and best way to do it. Thanks! – Reft May 22 '14 at 13:20
  • Check out that answer: http://stackoverflow.com/questions/5828315/write-pdf-stream-to-response-stream – lopezbertoni May 22 '14 at 13:35
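On the MemoryStream-vs-FileStream question from the comments: if you do stream through the server, neither is needed. A variation of the answer above (a sketch only; `CloudStorageServices.GetCloudBlobsContainer` is the asker's own helper) writes the blob straight into the response with output buffering disabled, so even large blobs are never held in server memory:

```csharp
public virtual ActionResult DownloadFile(string name)
{
    CloudBlobContainer blobContainer = CloudStorageServices.GetCloudBlobsContainer();
    CloudBlockBlob blob = blobContainer.GetBlockBlobReference(name);

    blob.FetchAttributes(); // populate ContentType/Length for the headers below
    Response.BufferOutput = false; // send chunks to the client as they arrive
    Response.ContentType = blob.Properties.ContentType;
    Response.AddHeader("Content-Disposition", "attachment; filename=" + name);
    Response.AddHeader("Content-Length", blob.Properties.Length.ToString());

    blob.DownloadToStream(Response.OutputStream); // no intermediate MemoryStream
    return new EmptyResult();
}
```

Setting `Response.BufferOutput = false` trades a small amount of per-write overhead for constant memory use, which is the right trade-off for large files.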