I am trying to upload large files to a Google Cloud Storage bucket from Node.js. Uploading any file up to around the 200 MB mark works perfectly fine; anything larger than that fails with the error
Cannot create a string longer than 0x1fffffe8 characters
Having a file that big, I have found out that Node does have a limit on how big a single string (and therefore a blob/file read into memory as one string) can be: 0x1fffffe8 characters (536,870,888, roughly 512 MiB) is V8's maximum string length. Here are the two code snippets that both throw the same error.
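For what it's worth, the limit itself is easy to reproduce in isolation; this little snippet (my own, not from the upload code) throws the exact same error:

const big = Buffer.alloc(0x1fffffe8 + 1); // one byte over V8's max string length
big.toString("latin1"); // throws: Cannot create a string longer than 0x1fffffe8 characters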
This one uses a streaming upload:
const fs = require("fs");
// upload here comes from the gcs-resumable-upload package, which matches this call signature
const { upload } = require("gcs-resumable-upload");

// file is an object describing the local file: { path, name, size },
// and all of this runs inside a new Promise((resolve, reject) => { ... }) wrapper
const fileSize = file.size;

fs.createReadStream(file.path)
  .pipe(
    upload({
      bucket: BUCKET,
      file: file.name, // destination object name in the bucket
    })
  )
  .on("progress", (progress) => {
    console.log("Progress event:");
    console.log("\t bytes: ", progress.bytesWritten);
    const pct = Math.round((progress.bytesWritten / fileSize) * 100);
    console.log(`\t ${pct}%`);
  })
  .on("finish", () => {
    // the "finish" event carries no payload, so there is nothing to log here
    console.log("Upload complete!");
    resolve();
  })
  .on("error", (err) => {
    console.error("There was a problem uploading the file");
    reject(err);
  });
and of course the plain bucket upload via @google-cloud/storage:
const { Storage } = require("@google-cloud/storage");
const storage = new Storage();

await storage.bucket(BUCKET).upload(file.path, {
  destination: file.name,
});
I have come to terms with the idea that the only solution may be to chunk the file, upload it in chunks, and rejoin the chunks in the bucket. The problem is that I don't know how to do that, and I can't find any documentation on Google or GitHub for this use case.
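This is as far as I have gotten sketching that approach. It is completely untested, and the pieces are my own guesses: uploadInChunks is a helper name I made up, the 100 MB CHUNK_SIZE is arbitrary, and I am assuming bucket.combine() (the Compose API, which as far as I know accepts at most 32 source objects per call) is the right way to rejoin the parts server-side:

const fs = require("fs");
const { Storage } = require("@google-cloud/storage");

const storage = new Storage();
const CHUNK_SIZE = 100 * 1024 * 1024; // 100 MB, picked arbitrarily

// Hypothetical helper: upload localPath to bucketName as destName in chunks,
// then compose the parts back into a single object.
async function uploadInChunks(bucketName, localPath, destName, fileSize) {
  const bucket = storage.bucket(bucketName);
  const partNames = [];

  // Upload each byte range of the local file as its own object.
  for (let start = 0, i = 0; start < fileSize; start += CHUNK_SIZE, i++) {
    const end = Math.min(start + CHUNK_SIZE, fileSize) - 1; // end is inclusive
    const partName = `${destName}.part-${i}`;
    partNames.push(partName);

    await new Promise((resolve, reject) => {
      fs.createReadStream(localPath, { start, end })
        .pipe(bucket.file(partName).createWriteStream({ resumable: false }))
        .on("finish", resolve)
        .on("error", reject);
    });
  }

  // Rejoin the parts server-side; Compose allows at most 32 sources per call,
  // so this only works as written for files up to 32 * CHUNK_SIZE.
  await bucket.combine(partNames, destName);

  // Delete the intermediate part objects.
  await Promise.all(partNames.map((name) => bucket.file(name).delete()));
}

Even if something like this works, I don't know whether it is the intended way to get past the size limit, which is really what I am asking.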