
I am trying to read a file and parse it into a different file type, with some processing on it, but I am not able to get anything from the file: it only returns a 'Garbage Collector' error.

I have tried the Property.create method too, but it is of no use to me. I have even tried increasing the heap/memory of the Node process, but to no avail.
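For reference, the usual way to raise the V8 heap limit mentioned above is Node's `--max-old-space-size` flag, given in megabytes. The entry file name here is a placeholder:

```shell
# Run the script with a 4 GB old-space heap instead of the default limit.
# "index.js" is a placeholder for the actual entry file.
node --max-old-space-size=4096 index.js
```

Note that for a file this large this only postpones the failure, since the whole document (plus the parsed object) still has to fit in memory at once.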

const fs = require('fs');
const parser = require('fast-xml-parser');

fs.readFile('1401_complete.xml', 'utf-8', (err, xmlData) => {
    if (err) throw err;
    if (parser.validate(xmlData) === true) { // optional (validate returns an error object when the XML is invalid)
        var jsonObj = parser.parse(xmlData, options);
        console.log(jsonObj);
    }
});

I want the file's data to be printed to the console. Later I may also process it and write the processed data to another file, but that can wait until reading the file works.

The error I am facing is:

<--- Last few GCs --->

[14488:000002CAF943F8A0] 32978 ms: Scavenge 1394.7 (1405.1) -> 1393.9 (1406.1) MB, 1.6 / 0.0 ms (average mu = 0.229, current mu = 0.219) allocation failure
[14488:000002CAF943F8A0] 32982 ms: Scavenge 1394.8 (1406.1) -> 1394.0 (1406.6) MB, 1.2 / 0.0 ms (average mu = 0.229, current mu = 0.219) allocation failure
[14488:000002CAF943F8A0] 32985 ms: Scavenge 1394.9 (1406.6) -> 1394.1 (1407.1) MB, 1.1 / 0.0 ms (average mu = 0.229, current mu = 0.219) allocation failure

<--- JS stacktrace --->

==== JS stack trace =========================================

0: ExitFrame [pc: 000003DE48550361]
1: StubFrame [pc: 000003DE4854737B]
Security context: 0x01fac3d9d969 <JSObject>
2: replace [000001FAC3D8F509](this=0x03afb37195c9 <String[18]: Container capacity>,0x00d138f05e21 <JSRegExp <Very long string[16829]>>,0x03afb37197e1 )
3: processTagValue(aka processTagValue) [00000115E29F4771] [D:\POC\Xml-Parser\node_modules\fast-xml-parser\src\xmlstr2xml...

FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory
 1: 00007FF6CD10D945
 2: 00007FF6CD0E5EE6
 3: 00007FF6CD0E68D0
 4: 00007FF6CD52677E
 5: 00007FF6CD5266B7
 6: 00007FF6CD7FC724
 7: 00007FF6CD7F23CE
 8: 00007FF6CD7F0AC8
 9: 00007FF6CD7FA3A7
10: 00007FF6CD7FA426
11: 00007FF6CD3BE951
12: 00007FF6CDB0EF92
13: 000003DE48550361

Aayushi Gupta
  • How big is `1401_complete.xml` ? – Anand Undavia May 02 '19 at 11:27
  • 60 lakhs (6 million) of lines :( – Aayushi Gupta May 02 '19 at 11:28
  • Well, there is the problem. Node would not be able to load all the file in memory at once. -- even if node can, you should not do it. Try reading the file in processable chunks. [this](https://stackoverflow.com/questions/25110983/node-reading-file-in-specified-chunk-size) might be helpful – Anand Undavia May 02 '19 at 11:31
  • sure I would try that thanks for the help. – Aayushi Gupta May 02 '19 at 11:33
  • hey @AnandUndavia, I thought about what you suggested about reading the file in chunks, but that would increase my problem. I am parsing the file, and with chunks I might not get a full slot of the data I am parsing. There is a chance that a slot would end up split across 2 different chunks. – Aayushi Gupta May 03 '19 at 03:05

0 Answers