
I am using the UploadCsvAsync method of the BigQueryClient class. This method accepts a Stream object. Can I change the encoding of my file and pass the file to a stream without converting the text file to a byte array? I'm working with a large file, and converting it to a byte array throws an out of memory exception.

Task<BigQueryJob> UploadCsvAsync(string datasetId, string tableId, TableSchema schema, Stream input, UploadCsvOptions options = null, CancellationToken cancellationToken = default(CancellationToken));
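For context, a minimal sketch of how the call might look when the file is passed as a stream rather than a byte array, assuming the Google.Cloud.BigQuery.V2 package; the project, dataset, table and file names are placeholders, not from the original post:

    using System.IO;
    using System.Threading.Tasks;
    using Google.Apis.Bigquery.v2.Data;
    using Google.Cloud.BigQuery.V2;

    public static class CsvUploadSketch
    {
        public static async Task UploadLargeCsvAsync()
        {
            // Placeholder identifiers -- replace with your own project, dataset and table.
            BigQueryClient client = await BigQueryClient.CreateAsync("my-project-id");

            TableSchema schema = new TableSchemaBuilder
            {
                { "name", BigQueryDbType.String },
                { "score", BigQueryDbType.Int64 }
            }.Build();

            // File.OpenRead returns a FileStream, which is a Stream; the client reads
            // from it incrementally, so the whole file is never held in a byte array.
            using (Stream input = File.OpenRead(@"C:\data\large-file.csv"))
            {
                BigQueryJob job = await client.UploadCsvAsync("my_dataset", "my_table", schema, input);
                await job.PollUntilCompletedAsync();
            }
        }
    }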
Roman Kalinchuk

1 Answer


It seems that it is not possible to change the encoding of your input. This is because UploadCsvAsync requires the input to be a System.IO.Stream.

This class is the abstract base class of all streams in C#. A stream, on the other hand, is a sequence of bytes, such as a file, an input/output device, an inter-process communication pipe or a TCP/IP socket. If you changed the type or encoding of the input, it might no longer match what the System.IO.Stream documentation describes.

This covers the main question, "Is it possible to change the encoding?": the answer is no.

Now, the next question you may ask is: "If it is not possible, then what do we need to do?", and that's a great question!

For that particular matter, the OutOfMemoryException can be addressed by changing the way the data is read. Reading the entire file into a single array is not the best option because of the memory it allocates all at once.

Instead, use a buffer to read the data in chunks, as they did on this other question, or use smaller arrays or jagged arrays as done on this question. A sketch of the buffered approach is shown below.
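As an illustration of the buffered approach, here is a minimal sketch that reads a large file in fixed-size chunks instead of materialising one byte array the size of the whole file; the path and buffer size are arbitrary examples, not from the linked questions:

    using System.IO;

    public static class ChunkedReader
    {
        public static void ReadInChunks(string path)
        {
            // 64 KB buffer: only this much is held in memory at a time.
            byte[] buffer = new byte[64 * 1024];

            using (FileStream stream = File.OpenRead(path))
            {
                int bytesRead;
                while ((bytesRead = stream.Read(buffer, 0, buffer.Length)) > 0)
                {
                    // Process buffer[0..bytesRead) here, e.g. write it to another
                    // stream or feed it to a parser, then let the loop refill it.
                }
            }
        }
    }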

Hope this is helpful! :)

Kevin Quinzel
  • If you are still having issues with the _OutOfMemoryException_, I would recommend creating a new question to focus on it – Kevin Quinzel Jun 25 '20 at 14:52