I'm trying to get the data from an API URL that returns at least 3 GB of data (this is enterprise-level volume) and insert it into a table in SQL Server. I'm getting the error "Insufficient memory to continue the execution of the program". I'm aware that a string can hold at most about 2 GB of data, which is why neither of the following will work:
string data = client.DownloadString(siteUrl);
string data = readStream.ReadToEnd();
We came up with the idea of adding a query parameter that filters the data by region, but this will also fail if a single region exceeds 2 GB of data. So is there any other way I can get the whole chunk of data from the API URL?
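For illustration, the region idea would look roughly like this (the `region` parameter name is only a hypothetical example, not the real API contract); each call is smaller, but every region still has to fit into a single string:

```csharp
// Hypothetical sketch of the per-region idea; "region" is an assumed parameter name.
using (var client = new WebClient())
{
    // Still fails with the same memory error if this one region alone exceeds ~2 GB.
    string regionData = client.DownloadString(siteUrl + "?region=EMEA");
}
```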
Edit: What we currently have stores the API response into a string object in C#:
string data = client.DownloadString(siteUrl);
Then I deserialize the data (into a separate variable, so it does not clash with the raw string) so I can filter out the element I need in the next step:
var parsed = JsonConvert.DeserializeObject<Dictionary<string, object>>(data);
Since the API does not return just a plain list of records, I need to define a startElement that tells where in the response the data should be taken from:
var arrayOfData = ((Newtonsoft.Json.Linq.JArray)parsed[startElement]).ToObject<List<Dictionary<string, string>>>();
I then iterate over arrayOfData to match each SQL column with its value and store the result in a List<> that is finally inserted into the database by a SQL provider; a rough sketch of that step follows the sample data structure below (the actual data is confidential):
{"href" : "someUrl","limit" : "1","records" : [{"columnA":"1","columnB":"2","columnC":"3"}]}
The problem is that I cannot do this processing, because 3 GB of data is too much for a single string to hold. So I'm asking whether there is any other way to store the data (I know I can stream it to a file, as sketched below, but the shape of the data makes it difficult to insert as is).
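For reference, this is the kind of file-based approach I mean (the output path is just a placeholder): the download itself no longer needs a 2 GB string, but the saved JSON still has to be parsed and mapped before it can be inserted:

```csharp
// Sketch of storing the response in a file first; the path is a placeholder.
// DownloadFile streams to disk, so it avoids the 2 GB string limit, but the
// saved JSON still has to be parsed before the column mapping and insert.
using (var client = new WebClient())
{
    client.DownloadFile(siteUrl, @"C:\temp\api-response.json");
}
```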