
I have written a JavaScript script that parses log files and extracts relevant information. It works as expected for smaller log files, but on files larger than 1 GB it fails with the error: `Cannot create a string longer than 0x1fffffe8 characters`.

I would appreciate any recommendations for optimizing my code so that it can handle larger log files.

For context, my code begins by reading the entire log file into memory and splitting its contents into a `lines` array, as shown below:

```javascript
const fs = require("fs");

// Read the entire log file into memory, then split it into lines.
fs.readFile("./my_file.log", "utf8", (err, data) => {
  if (err) throw err;
  const lines = data.split("\n");
  console.log(lines);
});
```
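
From what I can tell, the limit in the error message is V8's maximum string length, which Node exposes as `buffer.constants.MAX_STRING_LENGTH`; a quick check (assuming a recent 64-bit Node build):

```javascript
const { constants } = require("buffer");

// On recent 64-bit Node builds this prints "1fffffe8" (536870888),
// matching the limit in the error message (~512 MiB of string data).
console.log(constants.MAX_STRING_LENGTH.toString(16));
```

So any approach that materializes the whole file as a single string will hit this cap well before 1 GB.
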
  • @Yogi, I acknowledge that this solution has been previously discussed; however, I am working with a .log or .txt file, and the solutions provided are specifically geared towards .json files. – Omkar Jadhav Feb 02 '23 at 17:19
  • The solution depends on how the log files are read, but those details haven't been shared. Please update the question to show the code that reads the log file. – Yogi Feb 02 '23 at 18:31
  • That's how I am reading and accessing my file - ```fs.readFile( "c:/Cerence Work/Task 0/Script/my_file.log", "utf8", (err, data) => { if (err) throw err; const lines = data.split("\n"); // split the log file into an array of lines console.log(lines); ``` – Omkar Jadhav Feb 05 '23 at 08:10
  • Does this answer your question? [Parsing huge logfiles in Node.js - read in line-by-line](https://stackoverflow.com/q/16010915/943435) And this blog provides more detail: [A Memory-Friendly Way of Reading Files in Node.js](https://betterprogramming.pub/a-memory-friendly-way-of-reading-files-in-node-js-a45ad0cc7bb6) – Yogi Feb 05 '23 at 17:22
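
For reference, here is a minimal sketch of the line-by-line approach suggested in the linked question, using Node's built-in `readline` module over a read stream so the full file never has to exist as one string (the path and the `ERROR` filter are placeholders):

```javascript
const fs = require("fs");
const readline = require("readline");

// Stream the log file and process it one line at a time instead of
// loading the entire contents into a single string.
const rl = readline.createInterface({
  input: fs.createReadStream("./my_file.log", { encoding: "utf8" }),
  crlfDelay: Infinity, // treat \r\n as a single line break
});

rl.on("line", (line) => {
  // Placeholder filter: extract whatever counts as "relevant" here.
  if (line.includes("ERROR")) {
    console.log(line);
  }
});

rl.on("close", () => {
  console.log("Done parsing.");
});
```

Because `fs.createReadStream` reads the file in chunks, memory use stays roughly constant regardless of file size.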

0 Answers