I searched around and found pieces of what I want to do and how to do it, but I have a feeling that combining them all is the wrong approach and that I should write it a different way.
Currently I have a small application that uses an MSSQL library in Node to query a SQL Server with a SQL command, get the results, and store them as an object. I then use Express and some JavaScript to decipher or modify that object before requesting it with an AJAX call and responding with a proper JSON object.
SQLDB -> NodeJs -> API -> localhost
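The pipeline above can be sketched roughly like this, assuming the `mssql` and `express` npm packages; the connection config, table, and column names are placeholders, not the actual app:

```javascript
// Shape one raw recordset row for the front end -- a pure function,
// standing in for the "decipher or modify" step.
function shapeRow(row) {
  return { id: row.Id, name: row.Name };
}

async function startServer() {
  // required inside the function so the sketch stays self-contained
  const express = require('express');
  const sql = require('mssql');

  const pool = await sql.connect({
    server: 'localhost',
    database: 'mydb',          // placeholder
    user: 'app',               // placeholder
    password: 'secret',        // placeholder
    options: { trustServerCertificate: true },
  });

  const app = express();
  app.get('/data', async (req, res) => {
    const result = await pool.request().query('SELECT Id, Name FROM BigTable');
    // the entire recordset sits in RAM before it is serialized
    res.json(result.recordset.map(shapeRow));
  });
  app.listen(3000);
}

// startServer();  // uncomment to run against a real SQL Server
```

The memory problem lives in that `result.recordset` array: every row is materialized as a JavaScript object before anything is sent to the client.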
My problem now is that I want to repurpose this and expand it. Storing the SQL DB responses as objects inside an array is becoming a huge memory problem: some of these requests can return hundreds of thousands of rows with hundreds of columns, so the Node process begins eating up outrageous amounts of RAM.
I then thought maybe I could simply take the object when it comes through in the result and write it to a file. Then, when AJAX calls come in to Express, it could read from the file and res.json from there.
Will this work if, say, 50-200 people request data at the same time? Or should I look for another method?