
I'm building a UI for a data importer using AngularJS. The Angular app is going to be crunching through the input data source (a spreadsheet, or whatever) and sending GETs/POSTs to an API to create/update records on the server and retrieve changes, etc.

If a user is importing thousands of records, I probably don't want to open thousands of AJAX calls at once (not that Angular would be able to get all of the requests sent out before the first finished). My thought was to add some sort of connection pool so that it could be throttled to just 10 or 50 or so concurrent AJAX calls.

Does Angular already have a built-in means of throttling AJAX calls? I know I could build one without too much trouble, but I don't want to reinvent the wheel if there's already something slick out there. Can anyone recommend any tools/plugins for that? I know there are a few for jQuery, but I'm hoping to avoid jQuery as much as possible for this project.

Troy
  • Something like this: https://github.com/mikepugh/angular-http-throttler – nathancahill Oct 27 '14 at 21:14
  • Looks promising... I'll check it out. – Troy Oct 27 '14 at 21:21
  • Or you could just implement transactions with your datastore of choice, coupled with promises (JavaScript). Beyond that, it appears you are reaching the level of threading, in which case I would build an API to hand off write operations to a low-level handler like D (dlang) or C and send "write completed" messages back through a socket via Redis or RabbitMQ. – Coldstar Oct 28 '14 at 08:06

1 Answer


In investigating this further, I found that, according to this question and this page, browsers automatically throttle HTTP requests to the same server to somewhere between 6 and 13 concurrent requests, and they queue up additional requests until earlier ones complete. So, apparently, no action is needed, and I can just let my code fly.
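
For what it's worth, here is roughly what "letting my code fly" looks like in practice. This is a minimal sketch; the module name and the /api/records endpoint are made up:

```javascript
// Hypothetical importer service: fire every POST immediately and let the
// browser's per-host connection limit (roughly 6-13) queue the excess.
var app = angular.module('importer', []);

app.factory('importService', ['$http', '$q', function ($http, $q) {
  return {
    importAll: function (records) {
      // Kick off every request at once; the browser queues the overflow.
      var requests = records.map(function (record) {
        return $http.post('/api/records', record);
      });
      // Resolves once every request has completed
      // (or rejects on the first failure).
      return $q.all(requests);
    }
  };
}]);
```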

I looked at angular-http-throttler (suggested by nathancahill), but (1) it appears to duplicate what browsers are already doing by themselves, and (2) it doesn't (currently) appear to have a mechanism for handling error cases: if the server doesn't respond, or if the request was bad, it never decrements the request count, permanently clogging up its queue.
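
For anyone hitting that issue, the gap could be closed with a standard $http interceptor that frees the slot on both success and failure. This is only a sketch of the idea (the module and counter names are mine), not the library's actual code:

```javascript
// Assumes the 'importer' module from the earlier sketch.
var app = angular.module('importer');

// Tracks in-flight requests and frees the slot on BOTH success and
// failure, so errors can't clog the queue.
app.factory('requestCounter', ['$q', function ($q) {
  var active = 0;
  return {
    request: function (config) {
      active++;                  // a request is going out
      return config;
    },
    response: function (response) {
      active--;                  // completed successfully
      return response;
    },
    responseError: function (rejection) {
      active--;                  // failed or timed out: still free the slot
      return $q.reject(rejection);
    },
    activeCount: function () { return active; }
  };
}]);

app.config(['$httpProvider', function ($httpProvider) {
  $httpProvider.interceptors.push('requestCounter');
}]);
```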

Letting the browser queue up too many requests at once can still cause memory/performance issues when dealing with very large amounts of data. I looked at various techniques for creating a fixed-length queue in JavaScript, where it can fire off a callback when a queue spot becomes available, but the non-blocking nature of JavaScript makes such a queue tricky and fragile. Hopefully I won't need to go that direction (I'm not dealing with THAT much data), and the browser's throttling will be sufficient for me.
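
If I do end up needing it, a promise-based pool seems less fragile than raw callbacks, since each finished request simply chains into starting the next one. Here is one possible sketch (the limit of 10, the module name, and the config shape are my own assumptions):

```javascript
// Assumes the 'importer' module from the earlier sketch.
var app = angular.module('importer');

// Fixed-concurrency pool: at most `limit` requests run at once;
// the rest wait in a FIFO queue until a slot frees up.
app.factory('requestPool', ['$http', '$q', function ($http, $q) {
  var limit = 10;   // assumed cap -- tune to taste (10-50 in my case)
  var running = 0;
  var waiting = []; // queued {config, deferred} pairs

  function next() {
    if (running >= limit || waiting.length === 0) { return; }
    running++;
    var task = waiting.shift();
    $http(task.config)
      .then(task.deferred.resolve, task.deferred.reject)
      .finally(function () {
        running--;
        next();     // a slot opened up: start the next queued request
      });
  }

  return {
    // Returns a promise, just like $http(config) would.
    enqueue: function (config) {
      var deferred = $q.defer();
      waiting.push({ config: config, deferred: deferred });
      next();
      return deferred.promise;
    }
  };
}]);
```

Usage would be requestPool.enqueue({ method: 'POST', url: '/api/records', data: record }) in place of a direct $http.post, with the same promise semantics.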

Troy
  • I am looking to do the same thing right now. I am curious where you ended up with this. I found the HTTP calls completed in 10 minutes, but the page took 9 hours to render itself. – Jason Wicker Nov 11 '14 at 16:14
  • So far, the automatic browser throttling has been sufficient for my needs. I queue up all of the AJAX requests as fast as my JS code will run, and then let the browser execute those requests as fast as it sees fit. If your page is taking 9 hours to render itself, then it sounds like either you have way more data than the browser can handle, or there is something really inefficient in your code. The "Timeline" and "Profiles" tabs of the Chrome debugger should make it apparent which one it is. – Troy Nov 11 '14 at 16:34
  • Thanks! Yeah, I got it down to 20 minutes. (I am using IndexedDB as a work queue -- I made the mistake of queuing the AJAX calls from the cursor.) Curious, what kind of time do you get for 10K records? – Jason Wicker Nov 11 '14 at 20:55