
I have invoked node with up to 2048 MB of old-space without any success, so at this point I don't think it makes sense to continue raising the memory limit, especially if my code is inefficient to begin with. Raising the limit is the accepted answer to "FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory in ionic 3", though.

ref: node --max-old-space-size=2048

For completeness, here is the full error output:

<--- Last few GCs --->
io[5481:0x5693440]   286694 ms: Mark-sweep 2048.0 (2051.1) -> 2047.3 (2051.3) MB, 1268.3 / 0.0 ms  (+ 0.0 ms in 13 steps since start of marking, biggest step 0.0 ms, walltime since start of marking 1272 ms) (average mu = 0.087, current mu = 0.003) allocatio[5481:0x
5693440]   290098 ms: Mark-sweep 2048.2 (2051.3) -> 2047.5 (2051.3) MB, 3398.6 / 0.0 ms  (+ 0.0 ms in 13 steps since start of marking, biggest step 0.0 ms, walltime since start of marking 3404 ms) (average mu = 0.026, current mu = 0.002) allocatio

<--- JS stacktrace --->

FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory
 1: 0xa2b020 node::Abort() [node]
 2: 0x97a467 node::FatalError(char const*, char const*) [node]
 3: 0xb9e0ee v8::Utils::ReportOOMFailure(v8::internal::Isolate*, char const*, bool) [node]
 4: 0xb9e467 v8::internal::V8::FatalProcessOutOfMemory(v8::internal::Isolate*, char const*, bool) [node]
 5: 0xd3e875  [node]
 6: 0xd3f21b v8::internal::Heap::RecomputeLimits(v8::internal::GarbageCollector) [node]
 7: 0xd4d012 v8::internal::Heap::PerformGarbageCollection(v8::internal::GarbageCollector, v8::GCCallbackFlags) [node]
 8: 0xd4de65 v8::internal::Heap::CollectGarbage(v8::internal::AllocationSpace, v8::internal::GarbageCollectionReason, v8::GCCallbackFlags) [node]
 9: 0xd5082c v8::internal::Heap::AllocateRawWithRetryOrFailSlowPath(int, v8::internal::AllocationType, v8::internal::AllocationOrigin, v8::internal::AllocationAlignment) [node]
10: 0xd1fecb v8::internal::Factory::NewFillerObject(int, bool, v8::internal::AllocationType, v8::internal::AllocationOrigin) [node]
11: 0x10501ef v8::internal::Runtime_AllocateInYoungGeneration(int, unsigned long*, v8::internal::Isolate*) [node]
12: 0x13a9ed9  [node]
Aborted
npm ERR! code ELIFECYCLE

Here is my problematic function:

    public async upsertToDb(courses: Record<string, Record<string, any>>): Promise<string> {
        this.courseTransformationUtility.transformToFlatStructure(courses);
        const flatCourses: Array<Course> = this.courseTransformationUtility.getFlatCourses();
        const flatClasses: Array<Class> = this.courseTransformationUtility.getFlatClasses();

        console.info(`Courses exploded into ${flatCourses.length} rows.`)
        console.info(`Classes exploded into ${flatClasses.length} rows.`)

        await this._deleteTable("Course");
        await this._deleteTable("Class");

        for(let i = 0; i < flatCourses.length; i+=100) {
            Backendless.Data.of(Course).bulkCreate(flatCourses.slice(i, i + 100))
                .then(() => {
                    process.stdout.write(".");
                })
                .catch((e: Error) => console.info(e));
        }


        for(let i = 0; flatClasses.length; i+=100) {
            Backendless.Data.of(Class).bulkCreate(flatClasses.slice(i, i + 100))
                .then(() => {
                    process.stdout.write(".");
                })
                .catch((e: Error) => console.info(e));
        }

        return "";
    }

If I comment out the second loop, node runs this on default memory settings without any issue. If I had to guess, the issue has something to do with the asynchronous calls, but I really can't be sure. If that is the issue, can we make these calls synchronous at all? Sorry, there are a lot of questions here that I lack the depth to answer.

Edit: code update

Promise.all is not the solution, at least for my case.

        for(let i = 0; i < flatCourses.length; i+=100) {
            let promise: Promise<Array<string>> = Backendless.Data.of(Course).bulkCreate(flatCourses.slice(i, i + 100));
            promise.then(() => process.stdout.write('.'));
            coursePromises.push(promise);
        }

        await Promise.all(coursePromises.slice(0, Math.floor(coursePromises.length/2))).then(() => console.info("1/4"));
        await Promise.all(coursePromises.slice(Math.floor(coursePromises.length/2))).then(() => console.info("2/4"));

        for(let i = 0; flatClasses.length; i+=100) {
            let promise: Promise<Array<string>> = Backendless.Data.of(Class).bulkCreate(flatClasses.slice(i, i + 100));
            promise.then(() => process.stdout.write('.'));
            classPromises.push(promise);
        }

        await Promise.all(classPromises.slice(0, Math.floor(classPromises.length/2))).then(() => console.info("3/4"));
        await Promise.all(classPromises.slice(Math.floor(classPromises.length/2))).then(() => console.info("4/4"));

Logging:

Using term 202008...
Publishing 2113 courses...
Courses exploded into 11630 rows.
Classes exploded into 10986 rows.
....................................................................................................................1/4
.2/4

<--- Last few GCs --->
io[5628:0x5a47390]   310095 ms: Mark-sweep 2047.4 (2050.7) -> 2046.7 (2051.0) MB, 1935.6 / 0.0 ms  (+ 0.0 ms in 14 steps since start of marking, biggest step 0.0 ms, walltime since start of marking 1941 ms) (average mu = 0.117, current mu = 0.003) allocatio[5628:0x
5a47390]   311960 ms: Mark-sweep 2047.6 (2051.0) -> 2046.9 (2051.2) MB, 1860.4 / 0.0 ms  (+ 0.0 ms in 13 steps since start of marking, biggest step 0.0 ms, walltime since start of marking 1865 ms) (average mu = 0.063, current mu = 0.003) allocatio

<--- JS stacktrace --->
notacorn
    Because neither of your `for` loops uses `await`, every single call to `.bulkCreate()` in both loops will be in flight at the same time, all consuming memory and all attacking the database at the same time. The solution is to only have N requests in flight at the same time where you tune N to a balance of performance and memory usage. With databases, you often don't really gain much by doing more than 10 at a time. – jfriend00 Jul 25 '20 at 23:49
    Various discussion of this type of issue here with solutions: [Consumes all my ram](https://stackoverflow.com/questions/46654265/promise-all-consumes-all-my-ram/46654592#46654592), [Properly batch promises](https://stackoverflow.com/questions/59976352/properly-batch-nested-promises-in-node/59976509#59976509) and [Loop through API on multiple requests](https://stackoverflow.com/questions/48842555/loop-through-an-api-get-request-with-variable-url/48844820#48844820). – jfriend00 Jul 25 '20 at 23:51
  • And, one more [API that can handle 20 at a time](https://stackoverflow.com/questions/33378923/make-several-requests-to-an-api-that-can-only-handle-20-request-a-minute/33379149#33379149). – jfriend00 Jul 25 '20 at 23:54
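The N-in-flight pattern jfriend00 describes can be sketched with a small generic helper (hypothetical, not part of Backendless; `limit` would be tuned to something like 10):

```typescript
// Hypothetical helper: run async tasks with at most `limit` in flight.
// `tasks` holds functions, so nothing starts until a worker claims it.
async function runWithLimit<T>(
    tasks: Array<() => Promise<T>>,
    limit: number
): Promise<T[]> {
    const results: T[] = new Array(tasks.length);
    let next = 0;

    // Each worker repeatedly claims the next unstarted task.
    async function worker(): Promise<void> {
        while (next < tasks.length) {
            const i = next++;
            results[i] = await tasks[i]();
        }
    }

    const workerCount = Math.min(limit, tasks.length);
    await Promise.all(Array.from({ length: workerCount }, () => worker()));
    return results;
}
```

With this, the upload loops would push `() => Backendless.Data.of(Course).bulkCreate(chunk)` closures into `tasks` instead of calling `bulkCreate` directly, so only `limit` requests ever run at once.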

2 Answers

2

You can either do every async operation one at a time, or do the courses and the classes one group at a time.

Everything one at a time:

for (let i = 0; i < flatCourses.length; i+=100) {
    try {
        await Backendless.Data.of(Course).bulkCreate(flatCourses.slice(i, i + 100))
        process.stdout.write(".");
    } catch (e) {
        console.info(e)
    }
}
// Do the same for flatClasses

Courses and classes one group at a time:

const promises = []

for (let i = 0; i < flatCourses.length; i+=100) {
    const promise = Backendless.Data.of(Course).bulkCreate(flatCourses.slice(i, i + 100))
    promise.then(() => {
        process.stdout.write(".");
    })
    .catch((e: Error) => console.info(e));

    promises.push(promise)
}

await Promise.all(promises)
// Do the same for flatClasses

More complex approaches involve doing N operations at a time, but I wouldn't go that far if these simple approaches solve your issue.
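For reference, that N-operations-at-a-time approach could look roughly like this (a sketch; `upload` stands in for the `bulkCreate` call, and the chunk/batch sizes are illustrative):

```typescript
// Sketch: split `rows` into chunks, but await every `batchSize` chunks
// before starting more, so only `batchSize` requests are in flight at once.
async function uploadInBatches<T>(
    rows: T[],
    upload: (chunk: T[]) => Promise<unknown>,
    chunkSize = 100,
    batchSize = 10
): Promise<void> {
    const pending: Promise<unknown>[] = [];
    for (let i = 0; i < rows.length; i += chunkSize) {
        pending.push(upload(rows.slice(i, i + chunkSize)));
        if (pending.length >= batchSize) {
            await Promise.all(pending); // let this batch finish
            pending.length = 0;         // before queueing more
        }
    }
    await Promise.all(pending); // flush the final partial batch
}
```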

Eric Guan
  • Looks promising, I'll give it a go! – notacorn Jul 25 '20 at 23:53
  • Edited the post with the changes you suggested - I think there really is some issue w.r.t. memory, because I can only get one `Promise.all` to go through. One at a time only let the first one through, and going halfsies only let the first half go through. Has to be some kind of issue with maybe running out of memory for the response??? – notacorn Jul 26 '20 at 00:30
  • Or maybe there's an issue with the stdout. – notacorn Jul 26 '20 at 00:36
  • Try reducing the chunk size from 100 to 50, 25, 10, etc. – Eric Guan Jul 26 '20 at 01:17
  • So you don't think it's possible the problem could lie with the data from the responses? – notacorn Jul 26 '20 at 01:19
  • FWIW I broke it down to chunks of 10 and it's breaking on the very first `Promise.all`. I can make a new question explaining everything. – notacorn Jul 26 '20 at 01:33
  • `await Promise.all(promises)` is not flushing the memory. – Karunakaran May 12 '22 at 05:50
0

You aren't blocking on any of your async code, so all of the operations get started at once and kill your memory. A temporary solution is to use await instead of then. A more comprehensive solution would involve batching calls and using Promise.all to resolve the batches.
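The await-instead-of-then fix gives you back-pressure: each chunk must finish before the next request starts. A minimal sketch (with `upload` standing in for the `bulkCreate` call):

```typescript
// With `then`, the loop queues every request at once; with `await`,
// only one request is in flight at any moment.
async function sequentialUpload<T>(
    chunks: T[][],
    upload: (chunk: T[]) => Promise<unknown>
): Promise<void> {
    for (const chunk of chunks) {
        await upload(chunk); // back-pressure: one request at a time
    }
}
```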

Deadron
  • could you show how I would use `Promise.all` in my case? – notacorn Jul 25 '20 at 23:51
  • Promise.all is your problem. You're creating a ton of async operations all running at the same time. Start with awaiting each one and processing them one at a time. – Jason Goemaat Jul 27 '20 at 03:07