
Since many public APIs, such as the GitHub public API, have request limits, it makes sense for us to implement some caching mechanism to avoid unnecessary request calls. However, I discovered that this might introduce a race condition.

I coded up an example to demonstrate the situation: https://codesandbox.io/s/race-condition-9kynm?file=/src/index.js

Here I first implement a cachedFetch:

const cachedFetch = (url, options) => {
  // Use the URL as the cache key to sessionStorage
  let cacheKey = url;
  let cached = sessionStorage.getItem(cacheKey);
  if (cached !== null) {
    console.log("reading from cache....");
    let response = new Response(new Blob([cached]));
    return Promise.resolve(response);
  }

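  // Not cached: fetch from the network and store the body for next time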
  return fetch(url, options).then(async response => {
    if (response.status === 200) {
      let ct = response.headers.get("Content-Type");
      if (ct && (ct.includes("application/json") || ct.includes("text"))) {
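        // Clone so reading the body here doesn't consume it for the caller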
        response
          .clone()
          .text()
          .then(content => {
            sessionStorage.setItem(cacheKey, content);
          });
      }
    }
    return response;
  });
};

It uses sessionStorage to cache the results.

And I am making the requests to the GitHub API. The idea is simple: there is an input and a p tag. The input has an event listener that listens for input changes and uses the input value to fetch the GitHub user's profile, and the p renders the user's name on the page.
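
For context, the input handler in the sandbox looks roughly like this; it is a sketch reconstructed from the answer below (inputEl, resultContainer, and endpoint are the names used there), and the exact GitHub URL is my assumption:

const endpoint = "https://api.github.com/users/"; // assumed GitHub users endpoint

inputEl.addEventListener("input", e => {
  const { value } = e.target;
  if (value === "") return;

  // Naive version: whichever response resolves last "wins" and overwrites
  // the result container, which is what makes the race below possible.
  cachedFetch(endpoint + value)
    .then(response => response.json())
    .then(result => {
      resultContainer.innerHTML = result.name;
    });
});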

The race condition might occur in the following situation:

  1. The user types jack in the input field. Since this is the first time the user has typed jack, the result is not cached, so a request is made to fetch jack's GitHub profile.
  2. Then the user types david in the input field. Since this is also the first time the user has typed david, the result is not cached, so a request is made to fetch david's GitHub profile.
  3. Finally, the user types jack in the input field for the second time. Since the result is already in the cache, no request is made; we read jack's profile from sessionStorage and render it immediately.

You can imagine that if the second request, i.e. the request for david's profile, takes too long, the user will see david end up as the final result rendered on the page even though their last search was for jack. This is because jack's result gets overridden by david's result, which takes much longer to come back.

In my example, I used this function to simulate the user typing:

async function userTyping() {
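  // Clear the cache so each name triggers a real network request the first time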
  sessionStorage.clear();
  inputEl.value = "jack";
  inputEl.dispatchEvent(new Event("input"));

  await sleep(100);
  inputEl.value = "david";
  inputEl.dispatchEvent(new Event("input"));

  await sleep(100);
  inputEl.value = "jack";
  inputEl.dispatchEvent(new Event("input"));
}

The sleep function is defined as:

const sleep = ms => new Promise(resolve => setTimeout(resolve, ms));

Right now what I can think of is using debounce to avoid the situation where the user types too fast. However, it doesn't solve the problem at a fundamental level; see the sketch below.
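
A minimal debounce helper might look like this (a generic sketch, not code from the sandbox; handleInput is a hypothetical handler name):

const debounce = (fn, delay) => {
  let timerId;
  return (...args) => {
    clearTimeout(timerId);          // reset the timer on every keystroke
    timerId = setTimeout(() => fn(...args), delay);
  };
};

// Usage: inputEl.addEventListener("input", debounce(handleInput, 300));
// This only reduces how many requests are fired; a slow response can still
// arrive after a faster, later one and overwrite it.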

Alternatively, we could use some global variable to keep track of the latest input value and use it to check whether the result we are about to render came from the latest input, along the lines of the sketch below. Somehow I just don't think this is an elegant solution to the problem.
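
A rough sketch of that idea, reusing the names from above (latestQuery is a made-up name):

let latestQuery = "";

inputEl.addEventListener("input", e => {
  const { value } = e.target;
  if (value === "") return;
  latestQuery = value; // remember the most recent thing the user asked for

  cachedFetch(endpoint + value)
    .then(response => response.json())
    .then(result => {
      // Drop the result if a newer input has been made since this request started
      if (latestQuery === value) {
        resultContainer.innerHTML = result.name;
      }
    });
});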

Any suggestions are appreciated.

Joji
  • Return the query along with the response. Then check that the query matches the current input value, and only render the response if they match. – Barmar Apr 10 '20 at 23:29
  • @Barmar Could you elaborate more on the solution by coding up the solution or some part of the solution? Not sure if I understand what you meant by "Return the query along with the response" – Joji Apr 11 '20 at 00:12
  • I think CertainPerformance's answer is along the same lines as my idea. – Barmar Apr 11 '20 at 00:12

2 Answers


You can save the current e.target.value in a variable inside the input handler. Then, once the cachedFetch response comes back, check whether that same value is still in the input field, and only render the result if the values match.

(If the values don't match, for example if the input is a, then b, then a, and the b request takes longer to finish, then b will still be stored in the cache, but it won't be displayed to the user.)

Also, make sure to only display the result to the user when an error does not occur:

inputEl.addEventListener("input", e => {
  const { value } = e.target;
  if (value === "") {
    return;
  }
  const url = endpoint + value;
  cachedFetch(url)
    .then(response => response.json())
    .then((result) => {
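      // Render only if the input still holds the value this request was made for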
      if (e.target.value === value) {
        resultContainer.innerHTML = result.name;
      }
    })
    .catch(errorHandler);
});
CertainPerformance
  • so the `e.target.value` will always reflect the latest input value, we can use that to check if the current input value is equal to the input value when the request is made? Cool. I thought about something similar, using some sort of global variable to keep track of the latest input value. But I just don't think that solution is elegant enough. – Joji Apr 10 '20 at 23:45
  • I was not suggesting your solution is not elegant. I was saying using a global variable wasn't elegant enough imo. Thanks for the reply tho – Joji Apr 10 '20 at 23:50

You might be able to use the AbortController. It's experimental and not yet supported in all browsers (it's missing in IE).

https://developer.mozilla.org/en-US/docs/Web/API/AbortController

Create an AbortController instance.

const controller = new AbortController();
const signal = controller.signal;

And connect it to your fetch.

return fetch(url, { ...options, signal }).then(async response => ...

And then cancel the request when you return something from the cache.

if (cached !== null) {
  controller.abort();
  ...
}
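
Putting the fragments together, a sketch of how this could fit into cachedFetch. Note that an aborted controller can't be reused, so this sketch creates a fresh controller for each network request and keeps a reference to the latest one; that detail is my assumption, not part of the snippets above. An aborted fetch rejects with an AbortError, which the caller's catch should be prepared to ignore.

let controller = null; // the controller for the most recent network request

const cachedFetch = (url, options) => {
  const cacheKey = url;
  const cached = sessionStorage.getItem(cacheKey);
  if (cached !== null) {
    // A newer query hit the cache: abort any older request still in flight
    // so its late response can no longer overwrite this result.
    if (controller) controller.abort();
    return Promise.resolve(new Response(new Blob([cached])));
  }

  controller = new AbortController();
  const { signal } = controller;

  return fetch(url, { ...options, signal }).then(async response => {
    if (response.status === 200) {
      const ct = response.headers.get("Content-Type");
      if (ct && (ct.includes("application/json") || ct.includes("text"))) {
        const content = await response.clone().text();
        sessionStorage.setItem(cacheKey, content);
      }
    }
    return response;
  });
};
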
  • Thanks for the reply. Interestingly, I was reading an article about aborting fetch requests and came across this `AbortController`. It seems really obscure; I asked some of my local friends who've been coding as front end engineers for years and they haven't seen or used it before – Joji Apr 10 '20 at 23:57
  • I haven't used it myself. But it seems like you can connect it to multiple in progress fetches https://stackoverflow.com/questions/48998013/multiple-fetch-with-one-signal-in-order-to-abort-them-all . So it might be a neat solution for your case. – Kristoffer Karlsson Apr 11 '20 at 00:01
  • Have you ever encountered this type of race condition in your dev career? How did you solve it without using `AbortController`? – Joji Apr 11 '20 at 00:10
  • I've made debounce functions like you suggested previously, in order to not perform the action until you've stopped typing for a while. As you say, it's not a very nice solution; it would be cool to be able to cancel all started requests when something is returned from the cache. – Kristoffer Karlsson Apr 11 '20 at 00:18