I'm storing some large files in an Azure Blob Storage container.
Later, a WebJob (also running in Azure) reads each of these blobs using CloudBlockBlob.OpenRead(), which returns a Stream.
I open the stream and read from it. The problem is that when the file is larger than roughly 25 MB, the read works fine for a while and then throws this exception mid-read:
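For context, my reading code looks roughly like this (simplified; the container reference, blob name, and ProcessChunk are placeholders for my actual code):

```csharp
// 'container' is an already-initialized CloudBlobContainer;
// blob name and ProcessChunk are placeholders.
CloudBlockBlob blob = container.GetBlockBlobReference("large-file.dat");

using (Stream stream = blob.OpenRead())
{
    byte[] buffer = new byte[64 * 1024];
    int bytesRead;

    // Processing each chunk takes some time, so reading a
    // large blob end-to-end can last several minutes.
    while ((bytesRead = stream.Read(buffer, 0, buffer.Length)) > 0)
    {
        ProcessChunk(buffer, bytesRead); // placeholder for the actual work
    }
}
```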
    Unhandled Exception: System.Net.Http.HttpRequestException: Error while copying content to a stream.
     ---> System.ObjectDisposedException: Cannot access a closed Stream.
       at System.IO.__Error.StreamIsClosed()
       at System.IO.MemoryStream.get_Position()
       at System.Net.Http.StreamToStreamCopy.StartAsync()
It seems the stream is being closed on the server side!
Why does this happen? Is there a timeout on the blob read? How can I handle or avoid this situation?