I have a large file in windows XP - its 38GB. (a VM image)

I cannot seem to copy it.

Dragging on the desktop gives the error "Insufficient system resources exist to complete the requested service"

Using Java - FileChannel.transferTo(0, fileSize, dest) fails for all files > 2GB

Using Java - FileChannel.transferTo() in chunks of 100MB fails after ~18GB

java.io.IOException: Insufficient system resources exist to complete the requested service
at sun.nio.ch.FileDispatcher.write0(Native Method)
at sun.nio.ch.FileDispatcher.write(FileDispatcher.java:44)
at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:72)
at sun.nio.ch.IOUtil.write(IOUtil.java:28)
at sun.nio.ch.FileChannelImpl.write(FileChannelImpl.java:198)
at sun.nio.ch.FileChannelImpl.transferToTrustedChannel(FileChannelImpl.java:439)
at sun.nio.ch.FileChannelImpl.transferTo(FileChannelImpl.java:510)

I mean - the computer has 3GB of RAM. A 100MB buffer should be enough!?!?

Apparently the DOS commands "copy" and "xcopy" also fail.

(edit) I've tried COPY and XCOPY; both fail with the same error. XCOPY also takes a really, really long time before failing.

I've heard of Robocopy, but it doesn't copy single files?

I'm really feeling that Windows is for the lose right now. Surely Microsoft has heard of files larger than a few GB?

Thanks!

time4tea
  • Isn't it easy enough to try the COPY command? Then if you get an error, report that? I think I have used it to copy files in the few GB range, but I could be misremembering. – MJB Dec 14 '10 at 21:07
  • its been a very frustrating day. i'm trying to back up a 1TB raid drive. everything is not working! – time4tea Dec 14 '10 at 21:16
  • 1) the jmicron esata driver is dodgy. 2) rsync is very very very slow for local file copies (it manages 2-10MB/s), so moved to programmatic version 3) having problems with these very large files - maybe rsync could manage, but it would take hours probably. – time4tea Dec 14 '10 at 21:19
  • F:\>copy f:\vmware\Desktop\Desktop.vmdk h:\vmware\Desktop\Desktop.vmdk Overwrite h:\vmware\Desktop\Desktop.vmdk? (Yes/No/All): y Insufficient system resources exist to complete the requested service. 0 file(s) copied. F:\> – time4tea Dec 14 '10 at 21:27
  • @time4tea Have you tried xcopy /z? – Aaron McIver Dec 14 '10 at 21:51
  • the /z switch is for resuming after network failures. curious what effect that would have? – time4tea Dec 14 '10 at 22:29
  • i'm in the process of trying that. – time4tea Dec 14 '10 at 22:44
  • after about a million years it came back! – time4tea Dec 14 '10 at 23:23
  • oh - but it failed. F:\>xcopy /z f:\vmware\Desktop\Desktop.vmdk h:\vmware\Desktop\Desktop.vmdk Does H:\vmware\Desktop\Desktop.vmdk specify a file name or directory name on the target (F = file, D = directory)? f F:\vmware\Desktop\Desktop.vmdk File creation error - Insufficient system resources exist to complete the requested service. – time4tea Dec 14 '10 at 23:24

8 Answers


In Java, don't try to copy the whole file in a single operation. The transferTo() method works on one chunk of a file at a time; it wasn't intended as a high-level file-copy method. Invoke transferTo() in a loop, and assume that up to count bytes may be buffered in memory at once, so keep that parameter small enough to fit comfortably in RAM.

FileChannel src = ...
FileChannel dst = ...
final long CHUNK = 16 * 1024 * 1024; // 16 MB per call
for (long pos = 0; pos < fileSize; ) {
  // advance by the number of bytes actually transferred
  pos += src.transferTo(pos, CHUNK, dst);
}

The comment in the transferTo() JavaDoc about it being "more efficient than a simple loop" refers to the fact that channel-to-channel communication can be optimized more than channel-to-user-space-to-channel. It doesn't mean that all looping can be avoided.
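A fuller sketch of that loop, assuming Java 7+ for try-with-resources (the class and file names here are made up for illustration). It advances by the number of bytes transferTo() actually moved, rather than assuming the whole chunk was copied:

```java
import java.io.IOException;
import java.nio.channels.FileChannel;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;

public class ChunkedCopy {
    // 16 MB per transferTo() call: small enough to avoid exhausting
    // kernel resources, large enough to keep throughput reasonable.
    private static final long CHUNK = 16L * 1024 * 1024;

    public static void copy(String from, String to) throws IOException {
        try (FileChannel src = FileChannel.open(Paths.get(from),
                     StandardOpenOption.READ);
             FileChannel dst = FileChannel.open(Paths.get(to),
                     StandardOpenOption.CREATE, StandardOpenOption.WRITE)) {
            long size = src.size();
            long pos = 0;
            while (pos < size) {
                // transferTo() may move fewer bytes than requested;
                // advance by what it actually transferred.
                pos += src.transferTo(pos, Math.min(CHUNK, size - pos), dst);
            }
        }
    }
}
```

Checking the return value matters because transferTo() makes no promise to move the full count in one call.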

erickson
  • @time4tea - Sorry, I totally skipped that. Perhaps each invocation is creating a buffer at the kernel level and they are not getting cleaned up in time. Just speculating, but maybe it will help you locate a bug report. – erickson Dec 14 '10 at 21:18
  • @time4tea - Googling around I see a few suggestions that it works if the "chunk" is small enough. See http://bugs.sun.com/bugdatabase/view_bug.do?bug_id=4938442 – erickson Dec 14 '10 at 21:23

I am a VMware ESX user; I have 30 production VMs, with the largest being 232GB. I back up my VM instances onto an internal SATA drive and then copy these off once a week to an external eSATA drive. I use TeraCopy (free); it runs at an average of 45MB/s on an XP machine with 3GB of RAM.

Hope that helps, Sailen

Sailen

Well - I've not managed to find a way that works.

None of the packaged tools in Windows will copy the file: drag and drop, COPY, XCOPY, and Java all fail.

The reason I wanted to copy the file was for a backup before doing an OS upgrade.

In the end I booted into Knoppix and copied it.

time4tea
  • Man this makes me laugh :) I hope you upgraded to Ubuntu or another Linux; not that newer versions of Windows or even XP aren't good, but still. – Adrian Aug 09 '18 at 19:16

Take a look at this hotfix; it's worth a try, as everything I have seen points to it as a cure for your issue.

EDIT: You can also try XCOPY /Z as pointed out here.

Aaron McIver
  • Thanks - not sure. This is just for normal, large files. not offline files - which I think is something to do with the volume shadow copy service? – time4tea Dec 14 '10 at 21:22
  • It seems they mentioned offline as the example scenario; so in theory it may work as the issue lies in the fact of copying the large amounts of data. If it were me I would give it a whirl...up to you...I have never gotten bit by applying an "un-needed" Hotfix...not to say people haven't though. – Aaron McIver Dec 14 '10 at 21:25
  • Well - the description of the problem there is totally different, so it's likely that the actual issue is totally different too. They talk about machines hanging and offline files. This involves no hanging and no offline files. Thanks though. – time4tea Dec 14 '10 at 21:46

There may be a hardware issue as well. I suspect you don't have much time, but you could try a dumber stream-based solution, and don't use large buffers (8-16 MB should be enough):

import java.io.*;

public static void copy(InputStream input, OutputStream output) throws IOException {
    byte[] buffer = new byte[1024 * 1024 * 8]; // 8 MB
    int n;
    while (-1 != (n = input.read(buffer))) {
        output.write(buffer, 0, n);
    }
}

public static void main(String[] args) {

    if (args.length != 2) {
        System.err.println("wrong argument count");
        System.exit(1);
    }

    FileInputStream in = null;
    FileOutputStream out = null;

    try {
        in = new FileInputStream(new File(args[0]));
        out = new FileOutputStream(new File(args[1]));
        copy(in, out);
    } catch (IOException e) {
        e.printStackTrace();
    } finally {
        // close in a finally block so the streams are released even on failure
        if (in != null) { try { in.close(); } catch (IOException ignored) {} }
        if (out != null) { try { out.close(); } catch (IOException ignored) {} }
    }
}
barti_ddu

Are you sure the filesystem can actually cope with such big files (FAT32 cannot, for example)? Take a look at this link for details: http://www.ntfs.com/ntfs_vs_fat.htm

Is the system 32-bit or 64-bit? On 32-bit you may have problems copying files larger than 2-4 GB.

Also, you said that rsync sucks for you. I've had a very good experience with it, copying between two hard drives at near-native speed, but I had lots of small files; you seem to have one big blob instead.

You may also try splitting the big blob into smaller blobs :)
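One way to do that split in Java, as a rough sketch (the class name, part-file naming scheme, and sizes are made up for illustration). Each part may overshoot the requested size by up to one buffer length, which keeps the loop simple:

```java
import java.io.*;

public class Split {
    /** Splits the file at inPath into numbered ".partN" files of
     *  roughly partSize bytes each; returns the number of parts. */
    public static int split(String inPath, long partSize) throws IOException {
        byte[] buf = new byte[8 * 1024 * 1024]; // 8 MB read buffer
        int part = 0;
        try (InputStream in = new BufferedInputStream(new FileInputStream(inPath))) {
            int n = in.read(buf);
            while (n != -1) {
                // open a new part file and fill it until partSize is reached
                try (OutputStream out = new FileOutputStream(inPath + ".part" + part)) {
                    long written = 0;
                    while (n != -1 && written < partSize) {
                        out.write(buf, 0, n);
                        written += n;
                        n = in.read(buf);
                    }
                }
                part++;
            }
        }
        return part;
    }
}
```

The parts can then be copied individually and reassembled with a plain concatenation on the other side.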

Quamis
  • Thanks, but this isn't an answer. You made this up with no facts at all. The fact that it's a 32-bit system is totally irrelevant - if you can create a file of a certain size you should be able to copy it. – time4tea Dec 16 '10 at 12:32
  • I created a file by mistake on a FAT32 fs that I cannot delete now, as it has a funny (UTF-8) character at the end. Windows simply says that the file doesn't exist, so yeah, the filesystem type does matter... sorry if it's not your answer though, hope you eventually find the fix :) – Quamis Dec 16 '10 at 14:28
final long CHUNK = 16 * 1024 * 1024; // 16 MB
for (long pos = 0; pos < fileSize; ) {
     pos += src.transferTo(pos, CHUNK, dst);
}

This does work! Just make sure your src and dst are FileChannel objects (input and output, respectively).

    Although it looks like it works, it really doesn't. It gives up after about 20 or so GB, failing with the "Insufficient Resources" error. :-( – time4tea Jan 07 '11 at 12:06

Another possible answer is Files.copy (Java NIO.2), e.g.:

import java.io.IOException;
import java.nio.file.*;

Path sourcePath      = Paths.get("big-file.dat");
Path destinationPath = Paths.get("big-file-copy.dat");

try {
    Files.copy(sourcePath, destinationPath,
            StandardCopyOption.REPLACE_EXISTING);
} catch (IOException e) {
    // the copy failed; inspect the exception for the cause
    e.printStackTrace();
}
Adrian