
This is more of an opinion-based question. I do have a working solution, but I'm not 100% comfortable with it, as it has its flaws. Maybe someone can help me improve it.

Goal: I have an external API that only allows 4 calls to be made against it concurrently (per user). Our app can impersonate multiple users at once.

The issue arises when more than 4 calls are made against the API simultaneously (sometimes more than 20 calls are made).

Using a batched approach with Promise.all and chunking would be very inefficient, as each call has a different runtime.
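For illustration, the chunked approach I'm ruling out would look roughly like this (just a sketch; the tasks are stand-ins for the real API calls):

// Sketch of the chunked Promise.all approach.
// Every chunk of 4 waits for its slowest call before the next chunk starts,
// so fast calls leave their slots idle behind slow ones.
async function runChunked(tasks: Array<() => Promise<unknown>>, chunkSize = 4) {
    const results: unknown[] = [];
    for (let i = 0; i < tasks.length; i += chunkSize) {
        const chunk = tasks.slice(i, i + chunkSize);
        // the whole chunk is awaited together, even if 3 of the 4 calls finished long ago
        results.push(...(await Promise.all(chunk.map(task => task()))));
    }
    return results;
}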

Ideally, the queue would work FIFO: as soon as one call finishes, the next call is started. As it stands, I created 4 separate FIFO queues built on something of a Promise chain, and I fill them evenly (when all are running).

The problem is that I do not know in advance how long a request will run, so choosing one of the queues can lead to a longer overall runtime than necessary. Calls are automatically rejected by the external API after 30s, so there is no deadlock here.

Another problem is that I have to return the data provided by the API to a dependency. The queue is called/filled from within a callback function. That callback is itself wrapped in a Promise, so we can wait as long as we want, but storing the values in a cache for later retrieval is not an option either.
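Roughly, the behaviour I'm aiming for is something like this (a minimal sketch, not my actual code; one limiter per cacheKey, and every caller still gets its own result back):

// Minimal sketch of the desired behaviour: FIFO per cacheKey, at most 4 tasks
// running at once, and the next task starts as soon as a slot frees up.
class SimpleLimiter {
    private running = 0;
    private pending: Array<() => void> = [];

    constructor(private maxConcurrency = 4) {}

    async run<T>(task: () => Promise<T>): Promise<T> {
        if (this.running >= this.maxConcurrency) {
            // wait in FIFO order until a finishing task hands its slot to us
            await new Promise<void>(resolve => this.pending.push(resolve));
        } else {
            this.running++;
        }
        try {
            return await task();
        } finally {
            const next = this.pending.shift();
            if (next) {
                next();          // hand the slot directly to the next waiting task
            } else {
                this.running--;  // otherwise free the slot
            }
        }
    }
}

// one limiter per cacheKey (user token), e.g.:
// const limiters: Record<string, SimpleLimiter> = {};
// const result = await (limiters[cacheKey] ??= new SimpleLimiter(4)).run(() => callApi());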

Long story short, here is the code:

class QueuingService {
    /** ~FIFO queue */
    private static queue: Record<string, { promise: Promise<Function>, uuid: string }[]> = {};
    private static MAX_CONCURRENCY = 4;

    /** cacheKey is used as the key for grouping calls in the queue */
    public static async addToQueueAndExecute(func: Function, cacheKey: string) {
        let resolver: Function | null = null;
        let promise = new Promise<Function>(resolve => {
            resolver = resolve;
        });
        //in the real world, this is a real key created by nanoId
        let uuid = `${Math.random()}`;


        Array.isArray(this.queue[cacheKey])
            ? this.queue[cacheKey].push({ promise: promise, uuid: uuid })
            : this.queue[cacheKey] = [{ promise: promise, uuid: uuid }];

        //queue in all calls until MAX_CONCURRENCY is reached. After that, shift the first entry and await its promise
        if (this.queue[cacheKey].length > this.MAX_CONCURRENCY) {
            let queuePromise = this.queue[cacheKey].shift();
            if (queuePromise){
                await queuePromise.promise;
            }
        }
        //console.log("elements in queue:", this.queue[cacheKey].length, cacheKey)

        //technically this wrapping is not necessary, but it makes the code more readable imho
        let result = async () => {
            let res = await func();
            if (resolver) {
                //resolve and clean up
                resolver();
                this.queue[cacheKey] = this.queue[cacheKey].filter(elem => elem.uuid !== uuid);
            }
            //console.log(res, cacheKey, "finished after", new Date().getTime() - enteredAt.getTime(), "ms", "entered at:", enteredAt)

            return res;
        }

        return await result();
    }
}

async function sleep(ms: number) {
    return await new Promise(resolve => window.setTimeout(resolve, ms));
}

async function caller(){
    /* //just for testing with fewer entries
    for(let i=0;i<3;i++){
        let groupKey = Math.random() <0.5? "foo":"foo"
        QueuingService.addToQueueAndExecute(async ()=>{
            await sleep(Math.floor(4000-Math.random()*2000));
            console.log(i, new Date().getSeconds(), new Date().getMilliseconds(),groupKey)
            return Math.random()
        },groupKey)
    }
    */
    
    for (let i = 0; i < 20; i++) {
        let groupKey = Math.random() < 0.5 ? "foo" : "foo";
        let startedAt = new Date();
        QueuingService.addToQueueAndExecute(async () => {
            await sleep(Math.floor(4000 - Math.random() * 2000));
            console.log(i, new Date().getTime() - startedAt.getTime(), groupKey);
            return Math.random();
        }, groupKey);
    }
    
}
caller()

Also, here is a playground with the code to play around with: https://www.typescriptlang.org/play?#code/MYGwhgzhAECKCuBTeBLAdgcwMqIE4DcVhFoBvAWACgBIAegCp7oA-AMQElWB5aARySTR6tKtQAOuFPjAAXEhBmyifAYgBc0AEqJgAe1wATADwLJmADRloE3QFsUEddAAKuOw8RHW8NMBkpdNAA+S3hUAw1TdAxoAF8AbQBdIOgAXjJYgG5RCSlZeUV-YGgAWQBBAA0AfQBhLgA5GoBVTU0AUUaATTToABYqUQYmYDBgAAtEAGlEAE9oB2h4RwNoSGgR8cQAa1noADN9aAw3eDFo+bRoGQmVZBJhHPgAIxBlBSViyBnfVYMDABVdAg7mU0AY2gAPHTwOQACj2PmAGm8vn8gUsGwm0xmkRkZgwAEoyKJqCBEDJoLhEBBdCB8HhkYi0ZcAD7QNDwEAgHocrnZGik8nWNz2Rw8xAAdxcIo8XiZAWCsKpNLpJFSKQoAuoytp9NwPR1qv51GosQJ-OglqtltotHQVxuVLA3Il+hABks1wWCzAlMQzugOzmwCdchWTzmaDAaF07AMJLJFLCKBW6QABgASUglWRjAB0uGjBjssIJsTT-IGArKuELMzzDhrddhXogef4d3imKms0SBJJ1AA-A6HO3VF3Rlje3mxEsxrDSML3I4NDZRYhQuENMmVmaBxpW2PO93sYkevFF2uPKuZY5Nynt+E4olKwLbR3BPbndyRlyIKE0H8blymqOpGhadounmGAnU2Aw82gMo9jkfVrlkSwIFeYgHRIPYUFwBRoEQQDcDmItVglMAUApa4SCvRwSRQPZoBbMZRw-RAJ02U88zJTBrmgFJDxA2oGmaVoOhqToiU1E1BQpDjXGXNURzbDiuKnGZEjzCA2OQ0tjRNJiWMU29EAJWS5LASjqNuJAlPXGczIta1XMtWISQ8yg3JtWg9DQFVEF43QMFhAAiRAyVsYiZBge0OLUMLPTYtTxxPac+Iwa4MUnHsZn7SgSVtORxjQIhvzmVtoAlQsxDOTBoPZXQKTQHRqQgMBSMsJ4YXmClbDAHYYBkXR1l0AwSFsfQSCdAwwBeEgUFsMZdATIVlU5Cl0i+H5SzSDUB0TP0YG2myKQRXwDIHYylWpXU8Bkgc6FoQ16VWMF1jJaNFjEJ7XrwK6tWoQ91PSrSehBtLcp4vCQBQ2FIsQWx9qIqK8x3aAAEJUnSHdzQHLyfNc21-MC4LQuVHLuNmSwwrwgKJhWMBkLwJL2UlaAABF8lLPMMHJf4lsQPaAFoiMAvBEAMMoZD5gWhdLcwwtsCA2YiiWqSZmREssGLJelmQCoHKkZHgXBLmVQyvJJE2zcuayqIpDa4cB00qGtygduKC6-AVaBMMQRAxFhFW1A5Wwnge2TbfNijHfZqUHI8W6VXpdUJXQYsJR0+Xot0GEU-u8wVYJAqPa9-Z5UCdZvwBiyqGtBhoFtAArJZzsOOQFHODOBL2SU8HFvEUGpBurQOXBYSOlBUgABkyFAjAAZgXgBqVf6+8nyjuOfOxGxHoc2uAsixLIkjFnvMAFZhzp3RdDCxKDgfse3OBVBMBwAgiCCsA-kBd+iBQTgihMAGEwsK6lnVJqIm1oHa2QDkHWER98x7BAPfSevRZ7YJFigk+YIz70AAEzYNnqXFysCxoBVpEFdBoUUCWFalKbmcICRyxkDgfyBgICKwTlzHmbD+YyBKCgLkHguE8IJOYXepxsQFUoZaGOlw8GFgIbYUsr9XKxGkScfesx5FWkJlaB4W9LSaInlPIUM956LxIWvDeMDt5ChkXouY6QVGn3UefS+N9oB3wfk-e+YUKGuSOu8XAYYZbimYQIkJ1p37RC-oQYgeY-4AiBKoYBkJoRwkgQSaBmiibwIpIg4OeC0EYNhFgnBHi1GlmIaQ8hhSfKkxoeTWEDC+EsOFoI3OPSRbhMibLIRgtoqKxcXI5pbklGlFzPg4sXiplxB0XvSZpi4iaPdlQX8ZJJ4EiAA

user13897241

  • Hmm, so this looks like "this code works but I wonder if it could be better" and you have multiple concerns. I wonder if this question belongs more on the codereview stack exchange site (not going to link there directly, and they have their own rules there as for what belongs and doesn't belong so you should check those rules out if you decide to go there). If you want it to stay here you should figure out a primary question to ask (not multiple). – jcalz Oct 18 '22 at 20:30
  • Assuming this stays here: why do you need more than one queue (per whatever)? I'd expect a single queue that only allows four things to be running at once. Maybe like [this](https://tsplay.dev/w2z9Yw) although I'm no expert in Promise queues. Maybe this is a duplicate of [this question](https://stackoverflow.com/questions/40639432/what-is-the-best-way-to-limit-concurrency-when-using-es6s-promise-all) and you should consider using an existing well-tested library like p-limit instead of rolling your own. Thoughts? How should we proceed here? – jcalz Oct 18 '22 at 20:34
  • @jcalz thanks for pointing out the Code Review Stack Exchange. Didn't know it exists ;) I'll move my question there. As for the second comment: depending on the browser/Node environment, more than 4 requests can be made concurrently; Chrome can make 6 if I recall correctly. The problem I have is that, depending on the resource I'm calling, I have a limit of 4 concurrent calls against an endpoint with the same user token. Other tokens also get 4 concurrent requests each. In total I want to make as many calls as possible. Thanks a lot for your help. Moving the question, and I'll close this soon. – user13897241 Oct 19 '22 at 13:02
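
For reference, the p-limit approach mentioned in the comments would look roughly like this (a sketch, not a tested solution; one limit per cacheKey/user token, callWithLimit is a hypothetical helper name):

import pLimit from 'p-limit';

// one limiter per cacheKey / user token, each allowing 4 concurrent calls
const limiters: Record<string, ReturnType<typeof pLimit>> = {};

function limitFor(cacheKey: string) {
    return (limiters[cacheKey] ??= pLimit(4));
}

// calls are queued FIFO, each one starts as soon as one of the 4 slots frees up,
// and the caller receives the API result directly
async function callWithLimit<T>(cacheKey: string, func: () => Promise<T>): Promise<T> {
    return limitFor(cacheKey)(func);
}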

0 Answers