I need to copy big files (several GB) into another file (the container), and I was wondering about performance and memory use.
Reading the entire source file like this:
RandomAccessFile f = new RandomAccessFile(origin, "r");
// note: (int) f.length() overflows for files larger than 2 GB
byte[] originalBytes = new byte[(int) f.length()];
f.readFully(originalBytes);
And later on, copy everything into the container like this:
RandomAccessFile f2 = new RandomAccessFile(dest, "rw");
f2.seek(offset);
f2.write(originalBytes, 0, originalBytes.length);
does everything in memory, correct? So copying big files can have an impact on memory and can result in an OutOfMemoryError?
Is it better to read the original file in chunks instead of entirely? In that case, how should I proceed? Thanks in advance.
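For context, a chunked copy with a fixed-size buffer keeps memory use constant no matter how large the source file is. Here is a minimal sketch (the 32 KB buffer size, class name, and file names are arbitrary choices, not from any particular library):

```java
import java.io.*;

public class ChunkedCopy {

    // Copies src into dest starting at the given offset, reading a
    // fixed 32 KB buffer at a time so the whole file is never in memory.
    public static void copyInto(File src, File dest, long offset) throws IOException {
        try (InputStream in = new BufferedInputStream(new FileInputStream(src));
             RandomAccessFile out = new RandomAccessFile(dest, "rw")) {
            out.seek(offset);
            byte[] buffer = new byte[32 * 1024];
            int read;
            while ((read = in.read(buffer)) != -1) {
                out.write(buffer, 0, read);
            }
        }
    }

    public static void main(String[] args) throws IOException {
        // tiny demo: write 11 bytes into an empty container at offset 5
        File src = File.createTempFile("src", ".bin");
        File dest = File.createTempFile("dest", ".bin");
        try (FileOutputStream fos = new FileOutputStream(src)) {
            fos.write("hello world".getBytes());
        }
        copyInto(src, dest, 5);
        System.out.println(dest.length()); // 5 (offset) + 11 (payload) = 16
    }
}
```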
EDIT:
Following mehdi maick's answer I finally found a solution: I can use RandomAccessFile as the destination as I wanted, because RandomAccessFile has a getChannel() method that returns a FileChannel. I can pass that channel to the following method, which copies the file 32 KB at a time into the position of the destination I want:
public static void copyFile(File sourceFile, FileChannel destination, long position) throws IOException {
    FileChannel source = null;
    try {
        source = new FileInputStream(sourceFile).getChannel();
        destination.position(position);
        long currentPosition = 0; // long: transferTo returns a long count
        while (currentPosition < sourceFile.length()) {
            currentPosition += source.transferTo(currentPosition, 32768, destination);
        }
    } finally {
        if (source != null) {
            source.close();
        }
    }
}
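For completeness, a self-contained version of the same transferTo approach, including a call site that goes through RandomAccessFile.getChannel() (the class name and temp-file setup are illustrative, not part of the original answer):

```java
import java.io.*;
import java.nio.channels.FileChannel;

public class ContainerCopy {

    // Same technique as above: stream the source file into the
    // destination channel 32 KB at a time via FileChannel.transferTo.
    public static void copyFile(File sourceFile, FileChannel destination, long position) throws IOException {
        try (FileChannel source = new FileInputStream(sourceFile).getChannel()) {
            destination.position(position);
            long transferred = 0;
            while (transferred < sourceFile.length()) {
                transferred += source.transferTo(transferred, 32768, destination);
            }
        }
    }

    public static void main(String[] args) throws IOException {
        // demo: place a 7-byte payload at offset 3 inside an empty container
        File src = File.createTempFile("src", ".bin");
        File container = File.createTempFile("container", ".bin");
        try (FileOutputStream fos = new FileOutputStream(src)) {
            fos.write("payload".getBytes());
        }
        try (RandomAccessFile raf = new RandomAccessFile(container, "rw")) {
            copyFile(src, raf.getChannel(), 3);
        }
        System.out.println(container.length()); // 3 (offset) + 7 (payload) = 10
    }
}
```

Note that transferTo writes at the target channel's current position and advances it, which is why setting destination.position(position) once before the loop is enough.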