I have a huge JSON file to process, which holds around 1,500,000 JSON objects. I am performing a search operation where I use two nested for loops to compare object values.
Below is an example:
const data = [
  {
    "slug": "vertical-lift-module-market",
    "id": 68055,
    "related_reports_updated": {
      "sub_categories": [
        {
          "slug": "audience-analytics-market",
          "id": 66684,
          "short_title": "Audience Analytics Market"
        },
        {
          "slug": "mobile-wallet-market",
          "id": 68830,
          "short_title": "Mobile Wallet Market"
        }
      ]
    }
  },
  {
    "slug": "united-states-real-estate-services---growth-trends-and-forecast-2022---2027",
    "id": 68056,
    "related_reports_updated": {
      "sub_categories": [
        {
          "slug": "canada-real-estate-services-market---growth-trends-and-forecast-2020---2025",
          "id": 68051,
          "short_title": "Canada Real Estate Services Market"
        },
        {
          "slug": "germany-real-estate-services-market--growth-trends-and-forecast-2020---2025",
          "id": 68054,
          "short_title": "Germany Real Estate Services Market"
        }
      ]
    }
  },
  {
    ...
  }
]
// This array holds around 1,500,000 objects like these
What I am trying to do is compare the slug of one object with the slugs in the sub_categories array of every other object. If a match is found, I create an object, push it into the result array, and finally send that result array.
const result = [];
for (let i = 0; i < data.length; i++) {
  for (let j = 0; j < data.length; j++) {
    // Comparing operation
  }
}
console.log(result);
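For clarity, the comparing operation inside the inner loop is roughly the sketch below, filling the result array declared above. The exact shape of the object I push is simplified here; parent_id and parent_slug are just illustrative field names, not necessarily the real output:

for (let i = 0; i < data.length; i++) {
  for (let j = 0; j < data.length; j++) {
    if (i === j) continue; // skip comparing an object with itself

    const subCategories =
      (data[j].related_reports_updated && data[j].related_reports_updated.sub_categories) || [];

    for (const sub of subCategories) {
      // If the slug of data[i] appears as a sub-category of data[j],
      // build a small object and collect it in the result array.
      if (sub.slug === data[i].slug) {
        result.push({
          slug: data[i].slug,
          id: data[i].id,
          parent_id: data[j].id,       // illustrative field names
          parent_slug: data[j].slug
        });
      }
    }
  }
}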
But after it runs for some time, it gives me this error:
[41955:0x523ce90]   162238 ms: Mark-sweep (reduce) 4096.9 (4102.7) -> 4096.9 (4104.7) MB, 3481.7 / 0.4 ms  (average mu = 0.092, current mu = 0.000) allocation failure scavenge might not succeed
<--- JS stacktrace --->
FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory
 1: 0xa3ac10 node::Abort() [node]
 2: 0x970199 node::FatalError(char const*, char const*) [node]
 3: 0xbba58e v8::Utils::ReportOOMFailure(v8::internal::Isolate*, char const*, bool) [node]
 4: 0xbba907 v8::internal::V8::FatalProcessOutOfMemory(v8::internal::Isolate*, char const*, bool) [node]
 5: 0xd76b25 [node]
 6: 0xd776af [node]
 7: 0xd854eb v8::internal::Heap::CollectGarbage(v8::internal::AllocationSpace, v8::internal::GarbageCollectionReason, v8::GCCallbackFlags) [node]
 8: 0xd890ac v8::internal::Heap::AllocateRawWithRetryOrFailSlowPath(int, v8::internal::AllocationType, v8::internal::AllocationOrigin, v8::internal::AllocationAlignment) [node]
 9: 0xd5778b v8::internal::Factory::NewFillerObject(int, bool, v8::internal::AllocationType, v8::internal::AllocationOrigin) [node]
10: 0x109fd4f v8::internal::Runtime_AllocateInYoungGeneration(int, unsigned long*, v8::internal::Isolate*) [node]
11: 0x1448f59 [node]
Aborted (core dumped)
To get rid of this error I even tried running node --max-old-space-size=4096 index.js to increase the memory available to the Node process, but I am still getting the same issue. Is there another way to resolve this and get the desired result?