This question is a bit academic, inspired by a misunderstanding of how an API actually works, but I'm curious how I'd be able to resolve the issue in a good way if I had understood the API correctly.
Let's say we need to integrate with a service with an OAuth-like method of authenticating. You call one server to get a token. You then use this token to request data from other endpoints. Here is the kicker: Each token can only be used once and will be cached by the remote server until it either expires or is consumed. If you request a token again and there is a cached token, you will receive the same token.
Now, let's say you have multiple processes that may run concurrently, each needing to integrate with this service. They each request a token for use with the other endpoints. It won't matter whether you've cached this token or not; they will all get the same token. Now you have a race condition where only one process will succeed and the others will fail, since the token is good for a single use.
A naïve solution would be for each process to retry until it succeeds, but this would be inefficient, and in a worst-case scenario, an "unlucky" process might take forever because it always loses the race.
I'm thinking a more efficient approach would be to submit a function to a service responsible for requesting the token, handling each request in some sort of queue, and then passing the token back to the consumer, which could then await the response.
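As a rough sketch of what I mean (the names `IAuthorizationClient`, `AuthorizeAsync`, and `Token` are placeholders for the real API), the service could run a single consumer loop over a `System.Threading.Channels` queue, so that each fetched token is handed to exactly one piece of queued work before the next token is requested:

```csharp
using System;
using System.Threading.Channels;
using System.Threading.Tasks;

// Hypothetical sketch: all token traffic funnels through one queue
// with a single consumer, so no two callers ever see the same token.
public sealed class TokenQueue
{
    private readonly Channel<Func<Token, Task>> _work =
        Channel.CreateUnbounded<Func<Token, Task>>();
    private readonly IAuthorizationClient _authorizationClient;

    public TokenQueue(IAuthorizationClient authorizationClient)
    {
        _authorizationClient = authorizationClient;
        _ = Task.Run(ProcessQueueAsync); // single consumer loop
    }

    // Callers enqueue their work and await its completion.
    public async Task<T> EnqueueAsync<T>(Func<Token, Task<T>> doSomething)
    {
        var tcs = new TaskCompletionSource<T>(
            TaskCreationOptions.RunContinuationsAsynchronously);
        await _work.Writer.WriteAsync(async token =>
        {
            try { tcs.SetResult(await doSomething(token)); }
            catch (Exception ex) { tcs.SetException(ex); }
        }).ConfigureAwait(false);
        return await tcs.Task.ConfigureAwait(false);
    }

    // One item at a time: fetch a fresh token, let exactly one
    // caller consume it, then move on to the next item.
    private async Task ProcessQueueAsync()
    {
        await foreach (var workItem in _work.Reader.ReadAllAsync())
        {
            Token token = await _authorizationClient.AuthorizeAsync().ConfigureAwait(false);
            await workItem(token).ConfigureAwait(false);
        }
    }
}
```

This only serializes access within one process, of course; it doesn't help if separate processes (or machines) are hitting the token endpoint independently.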
I can imagine the function might look something like this:
public async Task<T> DoWithToken<T>(Func<Token, T> doSomething)
{
    TryAsync<Token> tokenAttempt = _authorizationClient.TryAuthorize();
    T? result = default;
    _ = await tokenAttempt.Match(
        Succ: token => result = doSomething(token),
        Fail: exception =>
        {
            Console.Write($"Something went wrong: {exception.Message}");
            return result; // stays default on failure
        }).ConfigureAwait(false);
    return result!;
}
... but this does nothing to stop concurrent callers from fetching and consuming the same token at the same time.
How could I do that?
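For reference, the only thing I've come up with so far is serializing the fetch-and-use pair behind a process-wide `SemaphoreSlim` (wrapping `DoWithToken` from above), but that throttles all work to one request at a time and, again, only helps within a single process:

```csharp
// Crude sketch: hold a gate across both the token fetch and its
// single use, so no other caller can grab the cached token in between.
private static readonly SemaphoreSlim TokenGate = new(1, 1);

public async Task<T> DoWithTokenSerialized<T>(Func<Token, T> doSomething)
{
    await TokenGate.WaitAsync().ConfigureAwait(false);
    try
    {
        return await DoWithToken(doSomething).ConfigureAwait(false);
    }
    finally
    {
        TokenGate.Release();
    }
}
```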