I am currently working on a project to build an integration between an existing ASP.NET MVC website and the file hosting service my company uses. The typical use case is:
- A user requests one or more files
- The controller makes one call per file to the file host API
- The file host returns the file data to the controller
- The controller returns a file result
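For a single file, that flow might look roughly like the sketch below. The `FileHostResult` members (`Content`, `ContentType`, `FileName`) are assumptions for illustration; I don't control the real type.

```csharp
// Illustrative sketch only: assumes FileHostResult exposes Content,
// ContentType, and FileName; the actual library types may differ.
public ActionResult DownloadDocument(string externalIdentifier)
{
    // One call to the file host API per requested file.
    FileHostResult result = GetFileHostResultFromFileHost(externalIdentifier);

    // Return the file data to the user as a FileResult.
    return File(result.Content, result.ContentType, result.FileName);
}
```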
The hosting service can handle concurrent calls, and I've found that executing each API call inside a task (see the example below) yields a significant performance improvement.
private void RetrieveDocuments(DocumentIdentifier[] identifiers, List<FileHostResult> results)
{
    var tasks = identifiers.Select(x => RetrieveDocument(results, x)).ToArray();
    Task.WaitAll(tasks);
}

private Task RetrieveDocument(List<FileHostResult> results, DocumentIdentifier x)
{
    return Task.Run(() =>
    {
        var result = GetFileHostResultFromFileHost(x.ExternalIdentifier);

        // Lock the shared list because List<T> is not thread-safe.
        lock (results)
        {
            results.Add(result);
        }
    });
}
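One variant I've considered is having each task return its own result, which removes the shared list and the lock entirely. This is just a sketch of the alternative, still wrapping the synchronous library call in `Task.Run`:

```csharp
// Sketch of an alternative: each task carries its own result, so no
// shared collection or lock is required. GetFileHostResultFromFileHost
// is the existing (unchangeable) library call.
private FileHostResult[] RetrieveDocuments(DocumentIdentifier[] identifiers)
{
    var tasks = identifiers
        .Select(x => Task.Run(() => GetFileHostResultFromFileHost(x.ExternalIdentifier)))
        .ToArray();

    // Still blocks the request thread until every call completes.
    Task.WaitAll(tasks);

    return tasks.Select(t => t.Result).ToArray();
}
```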
My question is whether there is a better way of doing this, and whether there are any potential pitfalls I might run into (e.g. locking server resources).
EDIT 1: I didn't post the code for GetFileHostResultFromFileHost because I don't have access to change it. It's essentially a method call implemented in a library I can't modify.
EDIT 2: To clarify: my main concern is avoiding any harm to the current user experience on the site. To that end, I want to make sure that running tasks concurrently out of an ASP.NET MVC application isn't going to lock up the site.