I'm attempting to stop multiple requests from hitting the disk at once by caching requests and storing their promises in an array. When the initial request finishes, it should resolve all of the promises. Here's what I have, but unfortunately it doesn't look like new Promise() can be used this way, and deferred is no longer part of the spec. Note: some ES6 syntax, such as const and arrow functions, is present in this example.

This is a NodeJS application, and I would prefer not to bring in any external libraries; however, I will if necessary.

var observers = {}

function resolveObservers(link, value) {
    for(var i = observers[link].length - 1; i >= 0; i--) {
        if(observers[link][i] != null) {
            observers[link][i].resolve(value)
            observers[link].splice(i, 1)
        }
    }
}

function get(link) {
    const b64link = base64.encode(link)
    const promise = new Promise() // TypeError: Promise resolver undefined is not a function
    var handle = false
    if(observers[b64link] == null) {
        observers[b64link] = []
        handle = true
    } else if(observers[b64link].length == 0) {
        handle = true
    }
    observers[b64link].push(promise)
    if(handle) {
        doAsyncOne.then(() => {
            doAsyncTwo.then(() => {
                doAsyncThree.then(data => {
                    resolveObservers(b64link, data)
                })
            })
        })
    }
}

The idea is that the async code will only execute once, and once it finishes, all promises created by parallel requests will be resolved.

EDIT: I'm aware of how Promises in JS are normally used; I guess I'm looking for how Promises are used in other languages, where this is usually called a deferred.
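
For reference, a rough sketch of the deferred pattern I mean (my own naming, not from any library): expose the promise's resolve/reject functions so it can be settled from outside the constructor:

// Minimal deferred sketch: the executor runs synchronously, so the
// callbacks are captured before defer() returns.
function defer() {
    let resolve, reject
    const promise = new Promise((res, rej) => {
        resolve = res
        reject = rej
    })
    return { promise, resolve, reject }
}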

EDIT2: You should be able to chain off this call, for example:

get('...').then(data => {
   // ...
})

1 Answer

You can still use the new Promise constructor in that way, even if you don't have deferreds any more:

var observers = {}

function get(link) {
    const b64link = base64.encode(link)
    return new Promise(resolve => {
        if (observers[b64link] == undefined) {
            observers[b64link] = [];
        }
        observers[b64link].push(resolve);

        if (observers[b64link].length == 1) {
            doAsyncOne
            .then(() => doAsyncTwo)
            .then(() => doAsyncThree)
            .then(data => {
                for (const res of observers[b64link])
                    res(data)
            }, err => {
                // resolving with a rejected promise passes the rejection through
                const rejection = Promise.reject(err)
                for (const res of observers[b64link])
                    res(rejection)
            })
        }
    });
}

But as you can see, error handling is not especially pretty (you had even forgotten it completely); this is basically the deferred antipattern. There's a much simpler solution: just cache the promise objects themselves. They're values like any other and can be memoised! You don't even need to construct a new promise on every call:

var promises = {}

function get(link) {
    const b64link = base64.encode(link)
    if (promises[b64link] == undefined) {
        promises[b64link] = doAsyncOne
        .then(() => doAsyncTwo)
        .then(() => doAsyncThree);
    }
    return promises[b64link];
}

That's it!
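
For example (a quick sketch, assuming the memoised get above; the URL is a placeholder):

// Both calls receive the same cached promise, so the async chain runs once.
get('http://example.com/file').then(data => console.log('caller 1:', data))
get('http://example.com/file').then(data => console.log('caller 2:', data))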

  • That one at the bottom is clean and makes me want to wrap all of these callbacks from this library in Promises just for the syntax. I'm aware that I didn't do error handling; it was simply to get the point across in as little code as possible. I guess I made a mistake in my question by demonstrating that doAsyncX was a promise, when it's actually a function with a callback. Anyway, thanks, this helped me figure out what I needed. – Hobbyist Sep 21 '16 at 19:27
  • Yes, you should generally [wrap all callback functions in promises](http://stackoverflow.com/q/22519784/1048572) *immediately*, just to make sure to get all error handling correct and to get the benefits of promises as soon as possible. – Bergi Sep 21 '16 at 19:32
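
As an illustration of that last comment (a sketch under the assumption that doAsyncOne takes a Node-style (err, data) callback, which the question doesn't show):

// Hypothetical wrapper: converts an assumed Node-style callback API
// into a promise once, at the boundary, so everything downstream
// deals only in promises.
function doAsyncOnePromised() {
    return new Promise((resolve, reject) => {
        doAsyncOne((err, data) => {
            if (err) reject(err)
            else resolve(data)
        })
    })
}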