
What is the most efficient way to read large files (>~10 GB) using Node.js (LTS)?

Essentially, I need to read the file content, parse each line into a known data structure, perform certain validations, and push the data structure into a database (SQL Server). I currently do this in C# using memory-mapped files, which works well because I can read the file in chunks (in parallel).

I am planning to migrate the solution to Node (and MongoDB) for a business use case.

Any leads/suggestions?

Environment:

64-bit Windows, x64-based processor, 8 GB of RAM.

Hanuma

1 Answer


What you're looking for is usually referred to as streams in Node.js.

You can read or write very large files with streams by processing them in small portions instead of loading the whole file into memory.
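For example, here is a minimal sketch (not from the original answer) of line-by-line processing with the built-in fs and readline modules, assuming a reasonably recent Node LTS; the file path and the per-line parsing/validation step are placeholders:

```js
const fs = require('fs');
const readline = require('readline');

async function processLargeFile(filePath) {
  // createReadStream reads the file in chunks, so memory use stays small
  // even for files far larger than the available RAM.
  const rl = readline.createInterface({
    input: fs.createReadStream(filePath),
    crlfDelay: Infinity // treat \r\n as a single line break
  });

  for await (const line of rl) {
    // Parse, validate and persist each line here (placeholder logic),
    // e.g. build a record from the line and insert it into MongoDB.
  }
}

processLargeFile('huge-input.log').catch(console.error);
```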

Here are a few links to help you get started.

Parsing huge logfiles in Node.js - read in line-by-line

Using Node.js to Read Really, Really Large Datasets & Files

Read large text files in nodejs

Dinesh Pandiyan
    Additionally... there are often standard filter streams and stream parsers for handling common file types. – Brad Nov 25 '18 at 05:07
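As a rough illustration of that comment: Node's built-in stream.Transform lets you write such a filter yourself and wire it into a pipeline (a sketch with placeholder file names; real parsers for CSV, NDJSON, etc. expose the same stream interface):

```js
const fs = require('fs');
const { Transform, pipeline } = require('stream');

// A trivial filter stream: upper-cases every chunk that flows through it.
// A real parser would buffer partial lines/records and emit parsed objects.
const upperCase = new Transform({
  transform(chunk, encoding, callback) {
    callback(null, chunk.toString().toUpperCase());
  }
});

pipeline(
  fs.createReadStream('huge-input.log'),    // source: read in chunks
  upperCase,                                // filter: transform each chunk
  fs.createWriteStream('huge-output.log'),  // sink: write as data arrives
  (err) => {
    if (err) console.error('Pipeline failed:', err);
    else console.log('Done.');
  }
);
```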