
I need a method that reads a file into a byte array asynchronously, but I don't know what size the file will be (it can be a few KB or a good few MB).

I've tried using FileStream to get the length and then BeginRead, but the problem is that Length is a long while BeginRead only accepts an int, so if the file is too big it will probably overflow. Another approach I considered was reading it in smaller chunks, but every time I read a new chunk of bytes I'd have to create a new, bigger array (which is exactly what I wanted to avoid).

I am open to better or simpler ways; otherwise I'll go with reading in smaller chunks.

halfer
Hugo Alves
  • 1
    It really depends on what you need to do with the data after you read it. Do you want to load all of it into memory - which I doubt? Or process it as it is read into a buffer of, say, 4K? Please clarify. – Aliostad Feb 01 '11 at 13:45
  • 1
    Note that the count parameter represents the maximum number of bytes to read. Also, you would need to call the EndRead method to determine how many bytes were actually read. http://msdn.microsoft.com/en-us/library/system.io.stream.beginread.aspx – RQDQ Feb 01 '11 at 13:49

4 Answers

2

You can chunk it into a MemoryStream (the MemoryStream will manage appending the binary information in memory) and at the end you can just call memoryStream.ToArray().

Also, here is a way to copy between two stream instances (from your file stream to your MemoryStream):

How do I copy the contents of one stream to another?
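As a sketch of this approach (assuming a runtime with async/await and Stream.ReadAsync available; the helper name ReadFileAsync is made up for illustration - on .NET 4.0 the same loop can be written with BeginRead/EndRead):

```csharp
using System.IO;
using System.Threading.Tasks;

class FileReader
{
    // Illustrative helper: reads a file of unknown size into a byte array
    // by copying fixed-size chunks into a MemoryStream, which handles
    // the growing buffer internally.
    public static async Task<byte[]> ReadFileAsync(string path)
    {
        var buffer = new byte[4096];              // fixed chunk size
        using (var file = new FileStream(path, FileMode.Open, FileAccess.Read,
                                         FileShare.Read, buffer.Length, useAsync: true))
        using (var memory = new MemoryStream())
        {
            int read;
            while ((read = await file.ReadAsync(buffer, 0, buffer.Length)) > 0)
                memory.Write(buffer, 0, read);    // append this chunk
            return memory.ToArray();              // single array at the end
        }
    }
}
```

Only one small chunk array is ever allocated; the MemoryStream takes care of assembling the final result.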

RQDQ
1

You'll have to read it in chunks anyway, since .NET doesn't support objects larger than 2 GB.
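For illustration, a minimal guard along those lines (CheckedLength is a made-up helper name; the figure it checks against is the CLR's roughly 2 GB per-object limit, which also caps a single byte[]):

```csharp
using System.IO;

class LengthCheck
{
    // Returns the file length as an int, or throws if the file is too
    // large to fit in a single byte array on this runtime.
    public static int CheckedLength(string path)
    {
        long length = new FileInfo(path).Length;   // long, as in the question
        if (length > int.MaxValue)
            throw new IOException(
                "File too large for a single array; read it in chunks instead.");
        return (int)length;
    }
}
```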

Konstantin Oznobihin
  • 2
    It looks like the OP doesn't know the *exact* size, but "a few KB or a good few MB" sounds like it'll fall under the 2GB limit. – Jon Skeet Feb 01 '11 at 14:01
  • "but the problem is length is a long and BeginRead only accepts int, if the file is to big it'll probably overflow" - this makes me think the OP could get pretty large files, and if he needs a generic solution, putting everything in one big array might not always work. – Konstantin Oznobihin Feb 01 '11 at 14:19
0

You can specify an offset into your array, which allows you to allocate a single big array up front.

public IAsyncResult BeginRead(byte[] buffer, int offset, int count, AsyncCallback callback, Object state)

stream.BeginRead(buffer, totalRead, chunkSize, ...);

Then, in the EndRead callback, add the number of bytes actually read to totalRead.
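A sketch of that pattern (assuming the file fits in an int-sized array, as the answer implies; error handling and cancellation are omitted, and the blocking wait is only there to keep the example self-contained):

```csharp
using System;
using System.IO;
using System.Threading;

class OffsetReader
{
    // Reads a whole file into one pre-allocated array by issuing
    // successive BeginRead calls at increasing offsets.
    public static byte[] ReadAll(string path)
    {
        using (var stream = new FileStream(path, FileMode.Open, FileAccess.Read,
                                           FileShare.Read, 4096, useAsync: true))
        {
            var buffer = new byte[stream.Length];  // single big array
            int totalRead = 0;
            var done = new ManualResetEvent(false);

            AsyncCallback callback = null;
            callback = ar =>
            {
                int read = stream.EndRead(ar);     // bytes actually read this pass
                totalRead += read;
                if (read > 0 && totalRead < buffer.Length)
                    stream.BeginRead(buffer, totalRead,          // offset = totalRead
                                     buffer.Length - totalRead, callback, null);
                else
                    done.Set();                    // finished (or hit EOF)
            };

            stream.BeginRead(buffer, 0, buffer.Length, callback, null);
            done.WaitOne();
            return buffer;
        }
    }
}
```

Because each BeginRead may return fewer bytes than requested, the loop keeps issuing reads at the updated offset until the array is full.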

Nekresh
0

Another way to do it would be to spin up a thread and call System.IO.File.ReadAllBytes(string). It doesn't sound like there's any advantage to chunking (since you're going to bring the whole thing into memory anyway), so this would be pretty straightforward.

With some help from this sample:

    // Requires: using System.IO; and using System.Runtime.Remoting.Messaging; (for AsyncResult)
    // Delegate used to invoke the read asynchronously:
    public delegate byte[] FileFetcher(string filename);

    private void GetTheFile()
    {
        FileFetcher fileFetcher = new FileFetcher(Fetch);

        // BeginInvoke runs Fetch on a thread-pool thread
        fileFetcher.BeginInvoke(@"c:\test.yap", new AsyncCallback(AfterFetch), null);
    }

    private void AfterFetch(IAsyncResult result)
    {
        AsyncResult async = (AsyncResult)result;

        FileFetcher fetcher = (FileFetcher)async.AsyncDelegate;

        byte[] file = fetcher.EndInvoke(result);

        // Do whatever you want with the file
        MessageBox.Show(file.Length.ToString());
    }

    public byte[] Fetch(string filename)
    {
        return File.ReadAllBytes(filename);
    }
RQDQ