
Goal: POST a PDF file, store its blob content in Azure Storage, and then get the content right back to display it in the browser.

What works: I have the following code, which successfully calls the controller's POST method with PDF file content and returns a response whose content Angular displays in the browser.

angular/html:

//html
<object ng-show="content" data="{{content}}" type="application/pdf" style="width: 100%; height: 400px;"></object>

//angular controller
...
.success(function (data) {
   console.log(data);
   var file = new Blob([data], { type: 'application/pdf' });
   var fileURL = URL.createObjectURL(file);
   $scope.content = $sce.trustAsResourceUrl(fileURL);
});

WebAPI controller:

    // POST api/<something>/Upload
    [Authorize]
    [HttpPost]
    [Route("Upload")]
    public async Task<HttpResponseMessage> Post()
    {
        try
        {
            HttpRequestMessage request = this.Request;
            if (!request.Content.IsMimeMultipartContent())
            {
                return new HttpResponseMessage(HttpStatusCode.UnsupportedMediaType);
                //return StatusCode(HttpStatusCode.UnsupportedMediaType);
            }

            var customMultipartFormDataProvider = new CustomMultipartFormDataProvider();

            var provider = await request.Content.ReadAsMultipartAsync<CustomMultipartFormDataProvider>(customMultipartFormDataProvider);
            // Contents[1] and Contents[2] were the StreamContent parts of the FormData
            var fileContent = provider.Contents[2];
            var formData = provider.FormData;

            //can successfully write to a SQL database here without fail

            HttpResponseMessage response = new HttpResponseMessage(HttpStatusCode.OK);
            response.Content = fileContent;
            return response;
        }
        catch (Exception)
        {
            return new HttpResponseMessage(HttpStatusCode.InternalServerError);
        }
    }

  public class CustomMultipartFormDataProvider : MultipartFormDataRemoteStreamProvider
  {
    public override RemoteStreamInfo GetRemoteStream(HttpContent parent, HttpContentHeaders headers)
    {
        return new RemoteStreamInfo(
            remoteStream: new MemoryStream(),
            location: string.Empty,
            fileName: string.Empty);
    }
  }

The problem: However, if I add the following lines to upload this content to my Azure Storage, it no longer works:

  string blobStorageConnectionString = ConfigurationManager.ConnectionStrings["AzureStorageAccount"].ConnectionString;
  CloudStorageAccount blobStorageAccount = CloudStorageAccount.Parse(blobStorageConnectionString);
  CloudBlobClient blobClient = blobStorageAccount.CreateCloudBlobClient();
  CloudBlobContainer container = blobClient.GetContainerReference(<containerName>);
  container.CreateIfNotExists();
  CloudBlockBlob block = container.GetBlockBlobReference(<keyname>);
  block.UploadFromStream(await fileContent.ReadAsStreamAsync());

The problem is that it successfully uploads to storage, and control flow even reaches the return statement in the Web API controller, but it's almost as if it never returns.

console.log(data) in my success handler is never called. The return statement seemingly does not execute, even though it appears to.

Pipeline
  • Are you sure the file content stream can be reused? At the very least, I think you'd have to reset Position to 0, and more likely you'd need to read into a MemoryStream. – Stephen Cleary Apr 12 '16 at 02:52
  • Can you clarify what you mean by "reused"? Is it because I read the fileContent when I upload it to storage that something with the "position" gets changed? Where can I find documentation on position? Sorry if noob question – Pipeline Apr 12 '16 at 02:53
  • It's a stream. Every stream has a position. You can't reuse streams in general. – Stephen Cleary Apr 12 '16 at 08:55
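The point in these comments can be shown with a minimal, self-contained sketch (plain .NET, no Web API or Azure involved, and the sample string is made up for illustration): a stream that has been read to the end yields nothing on a second read until its Position is reset.

```csharp
using System;
using System.IO;
using System.Text;

class StreamReuseDemo
{
    static void Main()
    {
        var ms = new MemoryStream(Encoding.UTF8.GetBytes("pdf bytes"));

        // First read consumes the stream: Position advances to the end.
        var first = new StreamReader(ms).ReadToEnd();
        Console.WriteLine(first.Length);  // 9

        // A second read from the same stream yields nothing...
        var second = new StreamReader(ms).ReadToEnd();
        Console.WriteLine(second.Length); // 0

        // ...unless the stream is seekable and Position is reset first.
        ms.Position = 0;
        var third = new StreamReader(ms).ReadToEnd();
        Console.WriteLine(third.Length);  // 9
    }
}
```

This is exactly what happens in the controller: UploadFromStream reads the request stream to its end, so the HttpResponseMessage has nothing left to send back.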

1 Answer


As mentioned in the comments, streams (in general) can't be read more than once (they are, after all, an abstraction over a stream of data). What you can do is first copy your content into a MemoryStream, and then use that stream (which can be repositioned) for both the blob upload and the response. A small conceptual example:

//copy stream
var ms = new MemoryStream();
await fileContent.CopyToAsync(ms);

//other stuff

//upload to blob storage
ms.Position = 0;
block.UploadFromStream(ms);

//other stuff

//set response content
ms.Position = 0;
response.Content = new StreamContent(ms);

However, I suggest an alternative approach: instead of returning the content of the blob directly, why not return the URI of the uploaded blob to the client? That way the client decides whether to download the file or simply ignore it, and you offload traffic and bandwidth from your application to the Azure Blob service. You can grant access to the blob for a limited amount of time using a token generated with SAS (Shared Access Signature):

SharedAccessBlobPolicy sasConstraints = new SharedAccessBlobPolicy();
sasConstraints.SharedAccessStartTime = DateTime.UtcNow.AddMinutes(-5);
sasConstraints.SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(2); //the access token expires two minutes from now
sasConstraints.Permissions = SharedAccessBlobPermissions.Read;

string sasBlobToken = block.GetSharedAccessSignature(sasConstraints);

var blobFinalUri = block.Uri + sasBlobToken;
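Putting the two pieces together inside the controller might look roughly like this sketch (under the question's assumptions; <containerName> and <keyname> are the placeholders from the question, and UploadFromStreamAsync is the async counterpart of UploadFromStream):

```csharp
// Sketch: buffer once into a MemoryStream, upload it, then return a
// time-limited SAS URI instead of streaming the file bytes back.
CloudBlockBlob block = container.GetBlockBlobReference(<keyname>);

using (var ms = new MemoryStream())
{
    await fileContent.CopyToAsync(ms);
    ms.Position = 0; // rewind before the upload reads it
    await block.UploadFromStreamAsync(ms);
}

var sasConstraints = new SharedAccessBlobPolicy
{
    SharedAccessStartTime = DateTime.UtcNow.AddMinutes(-5),
    SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(2),
    Permissions = SharedAccessBlobPermissions.Read
};

// The client can load this URI directly into the <object> tag.
var response = new HttpResponseMessage(HttpStatusCode.OK)
{
    Content = new StringContent(block.Uri + block.GetSharedAccessSignature(sasConstraints))
};
return response;
```

With this shape the response body is just a short URI string, so the stream-reuse problem disappears entirely.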

References:

How do I copy the contents of one stream to another?

Shared Access Signatures, Part 2: Create and use a SAS with Blob storage

Federico Dipuma