
I have a view with a for loop that inserts rows into a table. The table is very big and already consists of a couple of thousand rows.

When I run it, the server throws an out-of-memory exception.

I would like to add an infinite scrolling feature so I won't have to load all the data at once.

Right now the data is sent with a regular `res.render('index.ejs', data)` call (data is JSON).

I can figure out the infinite scrolling part, but how do I get the JSON data in chunks from the server?

I am using node.js with express and ejs as the template engine. I am open to using any framework that will aid me through the process (I was particularly checking out Angular.js).

Thanks

Michael
  • Where exactly is the data stored? An SQL database? Mongo? If it's Mongo, check http://stackoverflow.com/questions/5049992/mongodb-paging – Murukesh Nov 05 '13 at 15:33
  • @Murukesh The data is taken from MongoDB, but I don't use paging and it is all loaded in memory. The problem arises when I try to render the view with the data; that's when it throws out of memory. I am guessing that `render` is a memory-expensive operation and therefore it crashes. Is there a way to stream the data as JSON to the browser in chunks? I don't want to change the implementation too much and add MongoDB paging – Michael Nov 05 '13 at 15:44
  • 1
    IMHO, there is no point in using infinite scroll on the client if you don't use paging on the server. You have to work on that first. – Kos Prov Nov 05 '13 at 15:56
  • @KosProv I want to use paging on the server, I'm just wondering if there's a way to do the paging in the http level rather than doing it from mongodb – Michael Nov 05 '13 at 16:03
  • I think that you would be wasting resources (datacenter network / CPU / memory / IO) that way. Plus, this will not scale well. What if your dataset grows to millions of rows? It will not fit into your web tier's memory. And, bottom line, your user will only see a small portion of all this work. – Kos Prov Nov 05 '13 at 16:08

1 Answer


Firstly, there is an Angular component for infinite scroll: http://ngmodules.org/modules/ngInfiniteScroll

Then, you have to change your back-end query to look something like:

http://my.domain.com/items?before=1392382383&count=50

This essentially tells your server to fetch items created/published/changed/whatever before the given timestamp, and to return only 50 of them. That means your back-end entities (be they blog entries, products, etc.) need to have some sort of natural ordering in a continuous space (a publication-date timestamp is almost continuous). This is very important, because even if you use a timestamp, you may end up with extreme heisenbugs where items are rendered twice (if you use <=, that's practically certain), entries are lost (if you use < and two entries on the edges of your result sets share the same timestamp), or the same items are loaded again and again (more than 50 items on the same timestamp). You have to take care of such cases by filtering duplicates.
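A minimal sketch of that client-side duplicate filtering, assuming each item carries a unique `id` field (the item shape here is an assumption, not from the question):

```javascript
// Append a newly fetched page to the already loaded items,
// skipping any entries whose id has already been rendered.
// This guards against the <= edge case above, where two items
// on the same timestamp show up in consecutive result sets.
function appendPage(loadedItems, newPage) {
  var seen = {};
  loadedItems.forEach(function (item) { seen[item.id] = true; });
  return loadedItems.concat(newPage.filter(function (item) {
    return !seen[item.id];
  }));
}
```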

Your server-side code translates this into a query like (DB2 SQL in this example):

SELECT * FROM ITEMS
WHERE PUBLICATION_DATE <= 1392382383
ORDER BY PUBLICATION_DATE DESC
FETCH FIRST 50 ROWS ONLY
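Since the question mentions MongoDB, here is a hedged sketch of building the equivalent Mongo query spec from the request parameters (the `publicationDate` field name and the page-size cap are assumptions):

```javascript
// Build a MongoDB find/sort/limit spec from ?before=...&count=...
// Equivalent in spirit to the DB2 query above.
function buildPageQuery(params) {
  var before = parseInt(params.before, 10);
  var count = Math.min(parseInt(params.count, 10) || 50, 100); // cap page size
  return {
    filter: { publicationDate: { $lte: before } },
    sort: { publicationDate: -1 },
    limit: count
  };
}
```

In an Express route you would then feed this into the driver, e.g. `db.collection('items').find(q.filter).sort(q.sort).limit(q.limit)`.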

When infinite scroll reaches the end of the page and calls your registered callback, you create this `$http.get` request using the last item of your already loaded items. For the first query, you can use the current timestamp.
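A minimal sketch of computing that cursor on the client, assuming (as above) that each item exposes a `publicationDate` timestamp:

```javascript
// Compute the "before" cursor for the next page request:
// the publication date of the last loaded item, or the current
// time (in seconds) for the very first request.
function nextBefore(items, nowSeconds) {
  if (items.length === 0) return nowSeconds;
  return items[items.length - 1].publicationDate;
}
```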

Another approach is to simply send the id of the last item, like:

http://my.domain.com/items?after_item=1232&count=50

and let the server decide what to do. I suppose you can use NoSQL storage like Redis to answer this kind of query very fast and without side-effects.
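To illustrate what the server does with `?after_item=<id>&count=<n>`, here is a toy in-memory version of that keyset-style lookup (a sketch only; a real implementation would query the data store, not an array):

```javascript
// Return the next `count` items that come after the item with the
// given id in a stably ordered list. If the id is not found,
// findIndex returns -1 and we serve the first page.
function pageAfter(orderedItems, afterId, count) {
  var idx = orderedItems.findIndex(function (it) { return it.id === afterId; });
  return orderedItems.slice(idx + 1, idx + 1 + count);
}
```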

That's the general idea. I hope it helps.

Kos Prov
  • Thanks for that. I was trying to avoid changing my DB access layer and somehow do the paging in the `res.send()` phase in Express. But since the calls are async, I would have to keep an open connection between the client and the server, and I don't know if that's possible. – Michael Nov 05 '13 at 16:08
  • Why do you need an open connection? – Kos Prov Nov 05 '13 at 16:10
  • Lets say I have the big JSON data object right before I want to send it to my view. I want to be able to make calls from the view to `getMoreData()` that will get another chunk of the data. How would I do that ? – Michael Nov 05 '13 at 16:23
  • You don't need an open connection. Angular asks for the first batch with a single HTTP call. That call ends and the connection eventually closes. When the user scrolls to the end of the page, a second, totally independent HTTP call asks for the second batch. You would need an open connection if you had server-side push etc., but I think that's not the case here. – Kos Prov Nov 05 '13 at 16:36