
I use async and await very often, and I get this error:

RangeError: Value undefined out of range for undefined options property undefined
    at Set.add (<anonymous>)
    at AsyncHook.init (internal/inspector_async_hook.js:19:25)
    at PromiseWrap.emitInitNative (internal/async_hooks.js:134:43)

I don't know how to fix this. My code is written entirely in TypeScript, and I never created any file named 'async_hooks'.

I don't run more than 10 async functions at once, and I use await very often, so the calls shouldn't stack up; but JavaScript does not seem to reuse the asyncId, so it reaches the numeric limit very quickly.

I tried to use async/await less. That didn't fix the problem, but the error message appeared later. If I use very little async/await, I can keep the error from appearing before the function finishes its job. (I use Electron 7.)

Electron seems to have a very small async pool, but the problem can be reproduced with plain TypeScript code:

class Test {
    private async testCompare(a: number, b: number): Promise<boolean> {
        return a == b;
    }

    public async testRun(): Promise<void> {
        for (let index = 0; index < 999999999; index++) {
            for (let index2 = 0; index2 < 999999999; index2++) {
                await this.testCompare(index, index2);
            }
        }
    }
}
new Test().testRun();

This code uses a huge amount of RAM, and I think my program has the same problem: the async pool fills up until it reaches its limit.

Dani GTA
  • This is not related to async/await. It's an array range error. Can you share a part of your code where this issue occurs? – MEDZ Nov 02 '19 at 19:45
  • We need to see relevant code in order to help you. As MED said, this has nothing to do with async/await. It looks like you're trying to reference some object property where the object turns out to be undefined. – jfriend00 Nov 02 '19 at 21:10
  • I can't share a "part" of my code because the RangeError occurs on every async call. Every time a function is called asynchronously, the error points at inspector_async_hook.js:19:25, and when I debug it, the asyncId gets added to an array, but the asyncId is too large for the array and throws this error. – Dani GTA Nov 02 '19 at 23:26
  • It takes a while until this error appears. I tried to remove async and await as much as I could, and the error didn't come back; if I re-add async and await to the functions, the error returns once it runs out of asyncIds. – Dani GTA Nov 02 '19 at 23:40
  • So, the code you show is creating 999999999 * 999999999 number of promises and adding them all to the event queue. My guess would be that this exhausts some sort of internal resource. That's ~10^17 which is 100 quadrillion. That's a lot of objects and a lot of items in the event queue. If you're just trying to exhaust internal resources, congrats you've done it. Not sure what real world problem this represents. If you have real world code doing this, you need to fix your design to limit the number of asynchronous operations in flight to a reasonable amount. – jfriend00 Nov 04 '19 at 17:31
  • Oh, and the example you show is not a "very low async pool". That's 10^17 items. – jfriend00 Nov 04 '19 at 17:32
  • @jfriend00 yeah, I expected that after the await the event would be removed from the event pool, but this didn't happen. – Dani GTA Nov 05 '19 at 18:40
  • Even an immediately resolved promise calls its `.then()` handler or resolves the `await` on the next tick of the event loop. So, your double `for` loop runs to completion before ANY promises are resolved. – jfriend00 Nov 05 '19 at 18:46
  • 1
    So, the first thing your code does is create 10^17 promise objects and since they all immediately resolve, it then adds 10^17 events to the event queue for their resolve handlers to get called. – jfriend00 Nov 05 '19 at 19:22
  • @jfriend00 I'm working on a data collector in NodeJS, which easily generates hundreds of thousands of async Promises. Occasionally I run into this problem. Is there a way to get a warning before the crash happens? Can the limit of 2^24 somehow be adapted? – Jonas Sourlier Jul 06 '20 at 07:56
  • @cheesus - No, there is no way to get a warning before you run out of memory in nodejs. Seriously though, you need to write your code in a way that manages its resources more efficiently. Hundreds of thousands of simultaneous promises is just not a resource efficient way to write code. If you want help with improving how efficient your code is, please write your own question and show your existing code. – jfriend00 Jul 06 '20 at 21:00
  • @jfriend00 Well, I see your point. We can mitigate this by replacing `Promise.all(...)` with something that executes the promises more sequentially (instead of all at a time). This seems to work, but it is sometimes remarkably slower. We have 128 GB of RAM, because this really is a use case that involves a lot of data, but most of this RAM stays empty. I just hoped there would be a way to accelerate the app by really using those 128 GB. – Jonas Sourlier Jul 07 '20 at 08:16
  • @cheesus - Probably what you want to do is to execute your operations N at a time where N is a carefully chosen value that uses an appropriate amount of memory while getting as much parallelism in your operations as is useful. How to make this go faster depends upon a lot of details of the actual operation and whether you benefit most from parallelism in networking, parallelism in CPU processing or some combination. There are dozens of implementations of functions that help you run N promise-based operations at a time. – jfriend00 Jul 07 '20 at 15:28
  • @jfriend00 Yes, that's what we are doing. To me, it just seems illogical to have an absolutely fixed limit in a runtime. I understand that there are limits, but why hard limits? max-old-space-size is a limit of NodeJS too, but it's configurable. Thanks anyway. – Jonas Sourlier Jul 07 '20 at 19:51
  • What "fixed limit" are you talking about? – jfriend00 Jul 08 '20 at 01:02
  • Limits are not illogical. They are typically derived from practical implementation tradeoffs, such as doubling the storage of something by using a 64-bit value instead of a 32-bit value; the implementer needs to decide what the best overall tradeoff in their design is that will benefit the most projects. Or sometimes it's more efficient to borrow a few bits from a value to store other state information than to create whole new boolean properties, or to reuse existing floats. I have no idea what exact limits are at play here, but there are often design tradeoffs involved. – jfriend00 Jul 08 '20 at 01:40
  • @jfriend00 I'm talking about the fixed limit of 2^24 = 16'777'216 asyncIds. – Jonas Sourlier Jul 10 '20 at 06:56
  • Well, why don't you open your own question where you can ask a clear question with all the appropriate info? I was looking at the question here and didn't see any reference to that; only now do I see that you're trying to hijack this question to ask something different. Please post your own question. – jfriend00 Jul 10 '20 at 07:02
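The "N at a time" approach suggested in the comments can be sketched as a minimal concurrency limiter (the name runLimited and the default limit of 10 are just illustrative):

```javascript
// Minimal concurrency limiter: runs worker() over items, at most `limit` at a time,
// so only a bounded number of promises are in flight at any moment.
async function runLimited(items, worker, limit = 10) {
    const results = new Array(items.length);
    let next = 0;
    async function lane() {
        // Each lane claims the next unprocessed index until the list is drained.
        while (next < items.length) {
            const i = next++;
            results[i] = await worker(items[i], i);
        }
    }
    const lanes = Array.from({ length: Math.min(limit, items.length) }, lane);
    await Promise.all(lanes);
    return results;
}

// Example: double 1..4 with at most 2 operations in flight.
runLimited([1, 2, 3, 4], async (n) => n * 2, 2)
    .then((r) => console.log(r)); // [ 2, 4, 6, 8 ]
```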

4 Answers


I got the same error for Set.add as you did once my set's size reached 16777216 (2^24). I'm unable to find this limit documented anywhere, but I'll assume a Set is limited to 16777216 unique values.

Easily tested with a simple for loop.

This will throw the exact same error:

let s = new Set();
for (let i = 0; i <= 16777216; i++) s.add(i);

This will run successfully:

let s = new Set();
for (let i = 0; i < 16777216; i++) s.add(i);

Do note that this will eat up some ~5GB of memory, so increase your heap limit accordingly if it's crashing due to memory restrictions.

MunyuShizumi

Just encountered this using a Set; I used the following as a quick workaround:

// Drop-in replacement for the parts of Set used here (add/has). Values are
// keyed by their JSON representation, so the only bound is available memory.
class Set {
  hash = {}

  add (v) {
    this.hash[JSON.stringify(v)] = true
    return this
  }

  has (v) {
    return this.hash[JSON.stringify(v)] === true
  }
}

This has no hard element limit; it is bounded only by your system's memory.

Biereagu Sochima
RangeError: Value undefined out of range for undefined options property undefined

The above error occurs when a Set hits its maximum number of elements. See reference: Maximum number of entries in Node.js Map?

If you still run into the problem, you can try large-set, a package that lets you store and handle a large number of elements without worrying about the 16,777,216 (2^24) limit of a Set. It partitions the data into smaller sets whenever the limit is reached, enabling it to store and access more elements than the built-in Set.

But if you don't want to use any package, you can try this small workaround:

class LargeSet {
    constructor(limit = 16777216) {
        this.limit = limit;
        this.sets = [new Set()];
    }

    has(value) {
        return this.sets.some(set => set.has(value));
    }

    add(value) {
        if (this.sets[this.sets.length - 1].size >= this.limit) {
            this.sets.push(new Set());
        }
        if (this.has(value)) return this;
        this.sets[this.sets.length - 1].add(value);
        return this;
    }

    delete(value) {
        for (const set of this.sets) {
          if (set.delete(value)) return true;
        }
        return false;
    }
    
    clear() {
        this.sets = [new Set()];
    }
}

Then test the code:

const largeSet = new LargeSet();
for (let i = 0; i <= 16777216; i++) {
    largeSet.add(i); // No errors will be thrown
}

const set = new Set();
for (let i = 0; i <= 16777216; i++) {
    set.add(i); // Throws a 'RangeError: Value undefined out of range for undefined options property undefined'
}

A workaround that is about as fast as a Set and keeps values unique: use a plain object, which is not subject to the 2^24 limit.

const setA = {};
try {
    // go well past the built-in Set's limit of 16777216 entries
    for (let i = 0; i < 16777216 + 500; i++) setA[i] = null;
} catch (err) {
    // plain objects have no .size, so count the keys instead
    console.log('Died at', Object.keys(setA).length, 'because of');
    console.error(err);
}

console.log('lived even after', Object.keys(setA).length);
ezio4df