What is the most efficient way to read large files (>~10 GB) in Node.js (LTS)?
Essentially, I need to read the file content, parse each line into a known data structure, perform certain validations, and push the data structure into a database (SQL Server). Today I do this in C# using memory-mapped files, and it works well because I can read the file in chunks, in parallel.
I am planning to migrate the solution to Node.js (and MongoDB) for a business use case.
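For reference, the kind of streaming approach I am considering in Node.js looks roughly like the sketch below: read the file line by line with `fs.createReadStream` and `readline` so only a small buffer is held in memory, validate each record, and insert into MongoDB in batches. This is only a rough sketch; `parseLine`, `validateRecord`, the file path, and the database/collection names are placeholders for my actual logic.

```javascript
const fs = require('fs');
const readline = require('readline');
const { MongoClient } = require('mongodb');

// Placeholder parse: split each line into the known structure
// (my real format and fields differ).
function parseLine(line) {
  const [id, name, value] = line.split('\t');
  return { id, name, value: Number(value) };
}

// Placeholder validation rules.
function validateRecord(record) {
  return Boolean(record.id) && !Number.isNaN(record.value);
}

async function importFile(filePath, mongoUrl) {
  const client = new MongoClient(mongoUrl);
  await client.connect();
  const collection = client.db('imports').collection('records');

  const rl = readline.createInterface({
    input: fs.createReadStream(filePath), // streams the file in small chunks
    crlfDelay: Infinity,                  // treat \r\n as a single line break
  });

  let batch = [];
  for await (const line of rl) {
    const record = parseLine(line);
    if (!validateRecord(record)) continue;
    batch.push(record);
    if (batch.length >= 1000) {
      await collection.insertMany(batch); // bulk insert to limit round trips
      batch = [];
    }
  }
  if (batch.length > 0) {
    await collection.insertMany(batch);
  }
  await client.close();
}

// Example usage with a placeholder path and connection string.
importFile('C:\\data\\bigfile.txt', 'mongodb://localhost:27017')
  .catch((err) => console.error(err));
```

My main doubt is whether this single-stream approach is the right fit for files of this size, or whether I should be reading the file in parallel chunks, as I do with memory-mapped files in C#.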
Any leads/suggestions?
Environment:
I am using a 64-bit Windows OS, an x64-based processor, and 8 GB of RAM.