Two such requests in the .NET code will cause two separate HTTP requests, which can be checked quite easily by building something that does so, running it, and then observing what arrives at the server.
This is appropriate, because the two requests may receive different responses; in particular, one of them might suffer an error that the other doesn't. There are other reasons too (e.g. the server may send a response that differs every time, along with instructions that it shouldn't be cached).
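As a minimal sketch of that check (the URL is a placeholder for a server whose access log you can watch, and WebClient is just one convenient way to issue the requests):

    using System;
    using System.Net;
    using System.Threading.Tasks;

    class TwoRequestsDemo
    {
        static string Download(string url)
        {
            // Each WebClient call here issues its own HTTP request.
            using (var client = new WebClient())
                return client.DownloadString(url);
        }

        static void Main()
        {
            const string url = "http://example.com/some-page"; // hypothetical URL

            // Fire the two requests roughly simultaneously.
            Task<string> first = Task.Run(() => Download(url));
            Task<string> second = Task.Run(() => Download(url));

            Task.WaitAll(first, second);
            Console.WriteLine("Both downloads finished; the server log should show two GETs.");
        }
    }
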
However, there can be an exception. There is a default limit on the number of requests that will be sent simultaneously to the same host, which is configurable but defaults to two (this is often complained about because it is inappropriate for some use-cases; however, two concurrent requests per server gives the highest total throughput in most cases, so it's there for a good reason).
Because of this, it's quite possible that one of the two requests will be delayed as it is queued behind the other due to this rule. Even if the default limit is increased, that higher limit could still be exceeded (e.g. by other requests already in flight), with the same result.
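If that limit is the bottleneck, it can be raised, either process-wide or for a single host; a sketch of both (the URL is again hypothetical, and the change must happen before the first request to that host):

    using System;
    using System.Net;

    class ConnectionLimitSetup
    {
        static void Main()
        {
            // Raise the process-wide default before any requests are made.
            ServicePointManager.DefaultConnectionLimit = 10;

            // Or raise it only for one host.
            ServicePoint sp = ServicePointManager.FindServicePoint(new Uri("http://example.com/"));
            sp.ConnectionLimit = 10;
        }
    }

The same setting is also available declaratively through the connectionManagement section of the application's configuration file.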
Now, as Marc notes, a response can be cached once its response stream has been read to the end*, and this may already have happened by the time the second request begins, in which case the second request would use the cached response if applicable (the response was cacheable and there were no errors in downloading it).
So, on balance, we would expect two separate downloads, and we should be glad of that (in case one of them has an error), but there are conditions under which there will be only one download, because the two "simultaneous" requests are forced not to be simultaneous at all.
*Actually, while the documentation says the stream has to be read to the end, it has to be read to the end and closed, whether manually, by disposing it (e.g. via a using block), or by its finaliser being executed.
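To illustrate the footnote, here is a sketch (hypothetical URL again) that opts a request into the standard HTTP cache rules, which are not applied by default, and makes sure the response stream is both read to the end and disposed, which is what allows a later request to be satisfied from the cache:

    using System;
    using System.IO;
    using System.Net;
    using System.Net.Cache;

    class CachedDownload
    {
        static string Download(string url)
        {
            var request = (HttpWebRequest)WebRequest.Create(url);

            // Opt in to the standard HTTP caching rules (the default policy bypasses the cache).
            request.CachePolicy = new HttpRequestCachePolicy(HttpRequestCacheLevel.Default);

            // The using blocks dispose (and therefore close) the response and its stream;
            // together with ReadToEnd, that is what lets the response be stored in the cache.
            using (var response = request.GetResponse())
            using (var stream = response.GetResponseStream())
            using (var reader = new StreamReader(stream))
            {
                return reader.ReadToEnd();
            }
        }

        static void Main()
        {
            // If the response is cacheable, the second call may be served from the cache
            // rather than causing another download.
            Console.WriteLine(Download("http://example.com/some-page").Length);
            Console.WriteLine(Download("http://example.com/some-page").Length);
        }
    }
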