I decided to try it out. The precision of normal JS numbers stops being sufficient at around a 3-4 PB file size, so in Firefox JS integers appear to have roughly 55 bits of precision. I guess this might differ between browsers, but I need this for Node.js.
console.log("start");
var B = 8;
var KB = 1024*B;
var MB = 1024*KB;
var GB = 1024*MB;
var TB = 1024*GB;
var PB = 1024*TB;
var EB = 1024*PB;
var maxBits = 4 * PB;
var bitSteps = 100 * TB;
var maxChunkSize = 333 * B;
for (var bits = 0; bits <= maxBits; bits += bitSteps)
for (var chunkSize = 1; chunkSize <= maxChunkSize; ++chunkSize) {
var calculatedChunkCount = Math.ceil(bits/chunkSize);
var calculatedBits = calculatedChunkCount * chunkSize;
var difference = calculatedBits - bits;
var error = difference>= chunkSize || difference<0;
if (error)
console.log({
chunkSize: chunkSize,
calculatedChunkCount: calculatedChunkCount,
bits: bits,
calculatedBits: calculatedBits,
difference: difference
});
}
console.log("end");
output:
{ chunkSize: 97, calculatedChunkCount: 290180388361501, bits: 28147497671065600, calculatedBits: 28147497671065596, difference: -4 }
{ chunkSize: 1579, calculatedChunkCount: 20611490932343, bits: 32545544182169600, calculatedBits: 32545544182169596, difference: -4 }
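For illustration, the first reported case can be checked in isolation (a quick standalone check, not part of the script above). The floating-point division lands just below the true quotient, so Math.ceil has nothing left to round up, and the back-multiplication rounds as well because the exact product is above 2^53, where doubles are spaced 4 apart:

var bits = 28147497671065600;   // 25 * 2^50, still exactly representable as a double
var chunkSize = 97;

// Exact quotient is 290180388361501 + 3/97, so the true ceiling is 290180388361502,
// but the double division rounds down to exactly 290180388361501.
console.log(bits / chunkSize);             // 290180388361501
console.log(Math.ceil(bits / chunkSize));  // 290180388361501 (should be ...502)

// The exact product 28147497671065597 is not representable either:
console.log(290180388361501 * 97);         // 28147497671065596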
I'll start a parallel Node.js script running overnight just to be sure, but I think the precision is sufficient in the 0 - 100 GB range.
According to another answer (https://stackoverflow.com/a/2803010/607033), JS integers are accurate up to 53 bits, so this division limit is probably close to that value, maybe a few bits lower.
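If exact results were needed beyond that, one possible workaround is to do the ceiling division with BigInt, which is exact for arbitrarily large integers. This is only a minimal sketch, assuming Node.js 10.4+ where BigInt is available; ceilDiv is just an illustrative helper name, and the size should be passed as a string so it is never forced through a double first:

function ceilDiv(bits, chunkSize) {
    var b = BigInt(bits);       // pass bits as a string to keep it exact
    var c = BigInt(chunkSize);
    return (b + c - 1n) / c;    // BigInt division truncates, so this is ceil()
}

console.log(Number.MAX_SAFE_INTEGER);            // 9007199254740991 (2^53 - 1)
console.log(ceilDiv("28147497671065600", 97));   // 290180388361502n (exact)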