Although this is a bit outside the box, I would advise you to do the same, as this is the most scalable solution when developing in the .NET environment.
Use Azure Storage! Or any other similar online cloud storage solution.
- It keeps your web app separate from your files, so you don't have to worry about moving the application to a different web environment.
- Web storage is usually more expensive than Azure Storage (1 GB with about 3,000 operations (read/write/list) costs about $0.03 in total).
- When you scale your application to a point where downtime is more critical, point 1 also applies if you use a swapping/staging technique.
- Azure Storage takes care of the expiry of so-called Shared Access Signatures (SAS).
For the sake of simplicity I will just include my code here, so you don't have to google the rest.
So what I do in my case: all my files are saved as Attachments within the database (not the actual file, of course). When someone requests an attachment, I do a quick check to see if the expiry date has passed, and if so, we generate a new URL.
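For context, a minimal sketch of what such an `AttachmentModel` could look like; the property names are the ones my code below uses, but their types and the rest of the model are up to you:

    // Sketch of the persisted attachment record; only the properties used below are shown.
    public class AttachmentModel
    {
        public int AttachmentId { get; set; }              // key type is an assumption
        public string Filename { get; set; }               // blob name within the container
        public string AzureContainer { get; set; }         // which container the blob lives in
        public string FileUrl { get; set; }                // last generated SAS URL
        public DateTimeOffset LinkExpireDate { get; set; } // when that URL expires
    }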
    // wherever you want this to happen; in the controller before returning to the client, for example
    private async Task CheckSasExpire(IEnumerable<AttachmentModel> attachments)
    {
        foreach (AttachmentModel attachment in attachments)
        {
            await CheckSasExpire(attachment);
        }
    }
    private async Task CheckSasExpire(AttachmentModel attachment)
    {
        if (attachment != null && attachment.LinkExpireDate < DateTimeOffset.UtcNow && !string.IsNullOrWhiteSpace(attachment.AzureContainer))
        {
            // if parsing fails, 'container' falls back to the enum's default value
            Enum.TryParse(attachment.AzureContainer, out AzureStorage.ContainerEnum container);
            string url = await _azureStorage.GetFileSasLocator(attachment.Filename, container);
            attachment.FileUrl = url;
            // keep the stored expiry in sync with the one-hour SAS lifetime used below
            attachment.LinkExpireDate = DateTimeOffset.UtcNow.AddHours(1);
            await _attachmentRepository.UpdateAsync(attachment.AttachmentId, attachment);
        }
    }
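To sketch where this hooks in: a hypothetical controller action could look like the following (the route, `_attachmentRepository` and `GetByInvoiceAsync` are made-up names for illustration, not from my actual code):

    // Hypothetical controller action showing where CheckSasExpire fits in.
    [HttpGet("invoices/{invoiceId}/attachments")]
    public async Task<IActionResult> GetAttachments(int invoiceId)
    {
        IEnumerable<AttachmentModel> attachments = await _attachmentRepository.GetByInvoiceAsync(invoiceId);
        // refresh any expired SAS URLs before sending the models to the client
        await CheckSasExpire(attachments);
        return Ok(attachments);
    }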
AzureStorage.ContainerEnum is just an internal enum to easily track which container certain files are stored in, but these can of course be plain strings.
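For illustration, it might look something like this (the member names are made up; it lives inside the AzureStorage class, hence the AzureStorage.ContainerEnum qualifier above):

    // Hypothetical container enum; ToString().ToLower() of a member is used as the container name.
    public enum ContainerEnum
    {
        Invoices,
        Attachments
    }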
And my AzureStorage class:
    using System;
    using System.Threading.Tasks;
    using Microsoft.Extensions.Configuration; // assuming _config is an injected IConfiguration
    using Microsoft.WindowsAzure.Storage;
    using Microsoft.WindowsAzure.Storage.Blob;

    public class AzureStorage
    {
        // ContainerEnum (sketched above) is declared inside this class.
        private readonly IConfiguration _config;

        public AzureStorage(IConfiguration config)
        {
            _config = config;
        }

        public async Task<string> GetFileSasLocator(string filename, ContainerEnum container, DateTimeOffset expire = default(DateTimeOffset))
        {
            var cont = await GetContainer(container);
            CloudBlockBlob blockBlob = cont.GetBlockBlobReference(filename);
            DateTimeOffset expireDate = DateTimeOffset.UtcNow.AddHours(1); // default lifetime
            if (expire != default(DateTimeOffset) && expire > expireDate)
            {
                expireDate = expire.ToUniversalTime();
            }
            SharedAccessBlobPermissions permission = SharedAccessBlobPermissions.Read;
            var sasConstraints = new SharedAccessBlobPolicy
            {
                // start slightly in the past to tolerate clock skew between servers
                SharedAccessStartTime = DateTime.UtcNow.AddMinutes(-30),
                SharedAccessExpiryTime = expireDate,
                Permissions = permission
            };
            var sasToken = blockBlob.GetSharedAccessSignature(sasConstraints);
            return blockBlob.Uri + sasToken;
        }

        private async Task<CloudBlobContainer> GetContainer(ContainerEnum container)
        {
            // CloudConfigurationManager.GetSetting("StorageConnectionString")
            CloudStorageAccount storageAccount = CloudStorageAccount.Parse(_config["StorageConnectionString"]);
            CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
            string containerName = container.ToString().ToLower();
            CloudBlobContainer cloudContainer = blobClient.GetContainerReference(containerName);
            await cloudContainer.CreateIfNotExistsAsync();
            return cloudContainer;
        }
    }
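As a side note: the Microsoft.WindowsAzure.Storage package used above has since been deprecated in favour of Azure.Storage.Blobs. A rough sketch of the same SAS generation on the newer SDK, assuming the BlobServiceClient is created from a connection string that includes the account key:

    using Azure.Storage.Blobs;
    using Azure.Storage.Sas;

    BlobClient blobClient = new BlobServiceClient(connectionString)
        .GetBlobContainerClient("invoices")
        .GetBlobClient(filename);

    var sasBuilder = new BlobSasBuilder
    {
        BlobContainerName = blobClient.BlobContainerName,
        BlobName = blobClient.Name,
        Resource = "b",                                   // "b" = blob
        StartsOn = DateTimeOffset.UtcNow.AddMinutes(-30), // tolerate clock skew
        ExpiresOn = DateTimeOffset.UtcNow.AddHours(1)
    };
    sasBuilder.SetPermissions(BlobSasPermissions.Read);

    Uri sasUri = blobClient.GenerateSasUri(sasBuilder);   // full URL including the SAS token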
So this will produce URLs like: http://127.0.0.1:10000/devstoreaccount1/invoices/NL3_2002%20-%202019-04-12.pdf?sv=2018-03-28&sr=b&sig=gSiohA%2BGwHj09S45j2Deh%2B1UYP1RW1Fx5VGeseNZmek%3D&st=2019-04-18T14%3A16%3A55Z&se=2019-04-18T15%3A46%3A55Z&sp=r (st/se are the start and expiry timestamps, and sp=r means read-only).
Of course you have to apply your own authentication logic when retrieving the attachments, to check whether the user is allowed to view the file or not. But that can all be done with the JWT token, in the controller or the repository. I wouldn't worry about the URL being public; if someone is mighty enough to get hold of that URL... within one hour... well, then reduce the expiry time :D