You really don't want to "spend time" splitting large strings in Node.
If you have to use vanilla JavaScript
This is entirely possible in JavaScript (and you're pretty close), though it's more elegant without regular expressions, using a generator:
function* chunk(str, size = 3) {
  for (let i = 0; i < str.length; i += size) yield str.slice(i, i + size);
}
[...chunk('hello world')]; // ["hel", "lo ", "wor", "ld"]
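For comparison, here's a sketch of the regex route the question was presumably heading toward. It works, but the chunk size is baked into the pattern, and `.` won't match newlines unless you add the `s` flag:

'hello world'.match(/.{1,3}/g); // ["hel", "lo ", "wor", "ld"]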
If you can use Node.js
I'd read the file you want to split with a createReadStream and write the chunks out to a new file each time the current one reaches the limit. This is much more efficient, since you don't create lots of small strings or keep all the data in memory:
const fs = require('fs');
const util = require('util');

(async () => {
  let currentFileIndex = 0, currentBytes = 0;
  let currentFile = fs.createWriteStream(`${currentFileIndex}.csv`);
  for await (const chunk of fs.createReadStream('input.csv')) {
    currentBytes += chunk.length;
    if (currentBytes > 32000) { // or whatever limit you want
      currentFile.end(); // probably wait for the callback here
      currentBytes = chunk.length; // this chunk goes into the new file
      currentFile = fs.createWriteStream(`${++currentFileIndex}.csv`);
    }
    // promisify a single write so each chunk is flushed before the next
    await util.promisify(cb => currentFile.write(chunk, cb))();
  }
  currentFile.end(); // close the last file
})();
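If you do want to wait for the old file to finish flushing before moving on (the "probably wait for the callback here" comment above), a minimal sketch with events.once, which resolves when the stream emits the given event:

const { once } = require('events');

currentFile.end();
await once(currentFile, 'finish'); // all pending writes are flushed

Note that this splits on raw chunk boundaries, so a CSV row can end up straddling two files; if that matters to whatever consumes the output, you'd need to scan each chunk for the last newline before rotating files.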