
I'm working with code that handles all AJAX requests using Web Workers (when available). These workers do almost nothing more than handle the XMLHttpRequest object (no extra computation). All requests created by the workers are asynchronous (request.open("get", url, true)).
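The pattern looks roughly like this; a minimal sketch, where the worker file name and the message shape are my own invention for illustration, not taken from the actual code:

```javascript
// ajax-worker.js – a sketch of a worker that only wraps XHR.
// Guarded so the snippet is inert outside a worker context.
if (typeof self !== 'undefined' && typeof XMLHttpRequest !== 'undefined') {
  self.onmessage = function (e) {
    var request = new XMLHttpRequest();
    request.open('get', e.data.url, true); // async, as in the question
    request.onload = function () {
      // Post the raw response back to the page; this payload is copied.
      self.postMessage({ url: e.data.url, status: request.status, body: request.responseText });
    };
    request.send();
  };
}

// Main page side – builds the message and hands the URL to the worker.
function makeAjaxMessage(url) {
  return { url: url };
}

if (typeof Worker !== 'undefined' && typeof window !== 'undefined') {
  var ajaxWorker = new Worker('ajax-worker.js'); // hypothetical file name
  ajaxWorker.onmessage = function (e) {
    console.log(e.data.status, e.data.body);
  };
  ajaxWorker.postMessage(makeAjaxMessage('/api/data'));
}
```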

Recently, I ran into a couple of issues with this code and started to wonder whether I should spend time fixing it or just dump the whole solution.

My research so far suggests that this code may actually be hurting performance. However, I haven't been able to find any credible source supporting this. My only two findings are:

  • a two-year-old jQuery feature suggestion to use web workers for AJAX calls
  • this SO question, which seems to ask about something slightly different (using synchronous requests in web workers vs. AJAX calls)

Can someone point me to a reliable source discussing this issue? Or, are there any benchmarks that may dispel my doubts?

[EDIT] This question gets a little more interesting when the WebWorker is also responsible for parsing the result (JSON.parse). Does asynchronous parsing improve performance?

Konrad Dzwinel
  • uffffffffffff, I'm just waiting for a good answer. – ZaoTaoBao Sep 12 '13 at 15:38
    You can build your own test here: http://jsperf.com/webworker-vs-single-thread/11 – Diodeus - James MacFarlane Sep 12 '13 at 16:08
    Webworkers can be helpful when doing computationally expensive work. AJAX requests are primarily IO bound. If you aren't doing anything with the results other than passing them back to the main app, it is unlikely you would see any performance benefit and very possible you would have a slight degradation in performance as there is some overhead associated with passing the results back. – dc5 Sep 12 '13 at 17:09
    if they're causing you problems and they don't solve any... :) – dtudury Sep 20 '13 at 08:12
    AJAX calls are async by nature so other than making a call to the JavaScript engine they will not take up much computing time. WebWorkers are great if you are parsing the results in the web worker code but if you are handling the results of the requests in the main thread that is where 99% of the computational time will be. This means it will optimize that 1% which is probably not helping much. – sharpper Oct 07 '13 at 18:36
  • As @dc5 explained webworkers are generally intended for computational use. That being said generating a vast number of xhr request can be quite a big overhead too, and in this case it makes sense to have a webworker acting as an ajax service (for keeping the UI fluid for example). I don't think starting a webworker for each ajax call is a viable option though. – Renaud Oct 11 '13 at 08:48

5 Answers


I have created a proper benchmark for this on jsperf. Depending on the browser, the WebWorker approach is 85-95% slower than a raw AJAX call.


Notes:

  • Since network response time can differ for each request, I'm testing only new XMLHttpRequest() and JSON.parse(jsonString); no real AJAX calls are being made.
  • WebWorker setup and teardown operations are not measured.
  • Note that I'm testing a single request; results for the WebWorker approach may be better for multiple simultaneous requests.
  • Calvin Metcalf explained to me that comparing sync and async code on jsperf won't give accurate results, and he created another benchmark that eliminates the async overhead. The results still show that the WebWorker approach is significantly slower.
  • From the Reddit discussion I learned that data passed between the main page and a WebWorker is copied and has to be serialized in the process. Therefore, using a WebWorker for parsing alone doesn't make much sense; the data will have to be serialized and deserialized anyway before you can use it on the main page.
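As a later comment points out, modern browsers can transfer certain objects instead of structured-cloning them. A sketch of the difference, with the worker file name being hypothetical:

```javascript
// Structured clone copies data between the page and a worker; a transfer
// moves it. Strings are always cloned, but an ArrayBuffer can be handed
// over without a copy by listing it in postMessage's transfer list.
var payload = new TextEncoder().encode('{"ok":true}'); // Uint8Array, 11 bytes
var buf = payload.buffer;

if (typeof Worker !== 'undefined') {
  var worker = new Worker('parser-worker.js'); // hypothetical worker file
  // The second argument is the transfer list: ownership of buf moves to
  // the worker, and buf is neutered (byteLength becomes 0) on this side.
  worker.postMessage(buf, [buf]);
}
```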
Konrad Dzwinel
    +1 - first rule of optimization is Don't. Second rule is really, Don't. Third rule is if you have to optimize make sure you benchmark it! – slebetman Oct 07 '14 at 08:16
  • @Konrad, benchmark with [transferring](https://developer.mozilla.org/en-US/docs/Web/API/MessagePort/postMessage#Parameters) instead of copying; serializing is a sure loss, it isn't even fair to the worker case. – Pacerier Sep 28 '17 at 21:39
  • @Pacerier right, now that we have ability to transfer the data w/o serializing this test is worth redoing. Feel free to take a shot at it, let me know about the results and I'll update the answer. – Konrad Dzwinel Sep 29 '17 at 06:58
    It matters when the time is being spent. For example, the user may hit an animated toggle switch that triggers the AJAX call, I have found that JS activity in the main thread can cause frame drops in CSS transitions. On a mobile device, setting up an XmlHttpRequest can take around 1ms; I am sure that posting a message to a web worker will take less than that. I am less concerned about transferring the data back, which will happen at an arbitrary time that could easily not be during an animation. – Adam Leggett Nov 05 '18 at 19:01
  • Since jsperf is down for years, if anyone is curious about how that test was working, you can check a snapshot of the test via https://web.archive.org/web/20160403082959/http://jsperf.com/web-workers-handling-ajax-calls-optimisation-overkill – Murat Çorlu Jul 05 '22 at 08:40

The first thing to remember is that web workers rarely make things faster in the sense of taking less time; they make things faster in the sense that they offload computation to a background thread, so that processing related to user interaction is not blocked. For instance, once you take into account transferring the data, a huge calculation might take 8 seconds instead of 4. But if it were done on the main thread, the entire page would be frozen for 4 seconds, which is likely unacceptable.

With that in mind, moving just the AJAX calls off the main thread won't gain you anything, since AJAX calls are non-blocking. But if you have to parse JSON or, even better, extract a small subset out of a large response, then a web worker can help you out.
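For example, a worker that parses a large payload and sends back only a small subset might look like this; a sketch, with the field names and worker wiring invented for illustration:

```javascript
// parser-worker.js – hypothetical worker that does the heavy lifting:
// parsing a large JSON payload and keeping only what the UI needs.
function extractSummary(jsonString) {
  var data = JSON.parse(jsonString); // the expensive, blocking step
  // Keep only a small subset (illustrative field names).
  return data.items.map(function (item) {
    return { id: item.id, name: item.name };
  });
}

// Guarded so the snippet is inert outside a worker context.
if (typeof self !== 'undefined' && typeof document === 'undefined') {
  self.onmessage = function (e) {
    // Only the small extracted result is cloned back to the main page.
    self.postMessage(extractSummary(e.data));
  };
}
```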

A caveat I've heard but not confirmed is that workers use a different cache than the main page, so if the same resources are being loaded in both the main thread and the worker, it could cause a large duplication of effort.

Calvin
    caching is done on the HTTP side.. ain't nothing to do with webworkers. https://www.mnot.net/cache_docs/ and https://www.mnot.net/blog/2017/03/16/browser-caching – Pacerier Sep 28 '17 at 21:42

You are optimizing your code in the wrong place.

AJAX requests already run in a separate thread and return to the main event loop once they are fulfilled (calling the defined callback function).

Web workers are an interface to threads, meant for computationally expensive operations, just as in classic desktop applications when you don't want to block the interface with computations that take a long time.

Radu Potop

Asynchronous IO is an important concept in JavaScript.

First, your request is already asynchronous: the IO is non-blocking, and while the request is in flight you can run any other JavaScript code. Offloading the callback to a worker is much more interesting than offloading the request itself.

Second, JavaScript engines execute all code in a single thread; if you create new threads, you need to handle data communication through the worker message API (see semaphores).

In conclusion, the asynchronous, single-threaded nature of JavaScript is powerful; use it as much as possible, and create workers only if you really need them, for example for a long-running JavaScript process.

jsan
  • Doing a request in a webworker seemed like overkill from the beginning. The real question was whether doing a request *and* `JSON.parse` in a webworker makes sense. My benchmark shows that it does not, at least for average-sized JSON strings. – Konrad Dzwinel Oct 07 '14 at 11:35
  • The relevant part of the web worker is the JSON.parse, and it should be used especially for big JSON content. – jsan Oct 07 '14 at 15:55
  • I get the response as xml. How can I pass the document object received by jquery to the worker? It is not a transferable object – ccsakuweb Jan 07 '15 at 13:58

From my experience, Web Workers should not be used for the AJAX calls themselves. First of all, those calls are already asynchronous, meaning other code will still run while you're waiting for the response to come back.

Now, handling the response is definitely something you could use a Web Worker for. Some examples:

  • Parsing the response to build a large model
  • Computing large amounts of data from the response
  • Using a Shared Web Worker with a template engine, in conjunction with the AJAX response, to build the HTML that will then be returned for appending to the DOM.

Edit: another good read would be: Opinion about synchronous requests in web workers

transformerTroy
  • I want to parse the response. But when I send the data to parse in the worker, I get an error because the datatype is xml and it is not a transferable object. So I have to parse the data in the main thread to be able to post it in the message, and it will be cloned. – ccsakuweb Jan 07 '15 at 15:23
  • Handling of AJAX results is still synchronous work, so doing it in workers can still be beneficial. And of course, to handle them in the worker, you've got to do the XHR in the worker in the first place – Pacerier Sep 28 '17 at 21:44