
I searched around and found pieces of what I want to do and how to do it, but I have a feeling that combining them all is a process I shouldn't attempt, and that I should write it a different way.

Currently I have a small application that uses an MSSQL library in Node to query a SQL Server with a SQL command, get the results, and store them as an object. I then use Express and some JavaScript to decipher or modify the results before they are requested with an AJAX call, and respond with a proper JSON object.

SQLDB -> NodeJs -> API -> localhost

My problem now is that I want to repurpose and expand this. Storing the SQL DB responses as objects inside an array is becoming a huge memory problem. Considering some of these requests can return hundreds of thousands of rows with hundreds of columns, the Node process begins eating up outrageous amounts of RAM.

I then thought that maybe I could simply take the object when it comes through in the result and write it to a file. Then, when AJAX calls come in to Express, it can read from the file and respond with res.json from there.

Will this work if, say, 50-200 people request data at the same time? Or should I look for another method?

user2457035
  • You need to provide a bit more info. It seems like you're running this query on startup and then respond with that (possibly stale) data on demand? Saving to a file is definitely an option. So if you have parallel requests, just stream the file contents back to the (ajax) client and there are no memory issues for you. If you want to get results directly from the db though, that might be an issue. How much RAM are you talking about? – Zlatko Jul 30 '14 at 11:29
  • I decided on the first option actually @Zlatko. Stale data isn't an issue, so snapshot responses like this work in this case. I am now using fs.writeFile and fs.readFile, but I am running into performance issues as some of these JSON files became very large. I'm reading a bunch of things about write streams, but I can't seem to find a tutorial that explains it in the manner I want to use it for (writing a large object to a file), which leads me to think I need to redo a lot of things... Reference: http://stackoverflow.com/questions/2496710/writing-files-in-nodejs/10368255#10368255 I am out of my depth – user2457035 Jul 31 '14 at 19:08
  • All right, just try to use streams as much as you can and large files should not be a problem. I still think the biggest thing will be to get that SQL result to come in as a stream, rather than a blob. And ask questions when you get stuck with details! – Zlatko Aug 01 '14 at 08:27
  • Yeah, I asked a specific question about the syntax of the streams in my case. I tried a few things and I think I am misunderstanding something. Callbacks are still bizarre to me; no matter how many tutorials I read or try, even with successfully written callback functions, it still does not click with me like everything else. When a callback works for me it's magic to me. I have no idea how I managed to get it to work :/ – user2457035 Aug 01 '14 at 12:54
  • It is not about callbacks, it is about events. Once that one clicks, it is easier to understand and follow. Plus, streams are also about events, so you get that one too. – Zlatko Aug 01 '14 at 21:52

0 Answers