
I need to upload a very large file (a few GB) from my machine to a server. I tried the approach below, but I keep getting:

 Caused by: java.lang.OutOfMemoryError: Java heap space
    at java.util.Arrays.copyOf(Arrays.java:3236)

I can increase the heap size, but I don't want to do that because I'm not sure where my code will run. I want to read a few MB/KB, send them to the server, release the memory, and repeat. I tried other approaches such as the Files utilities and IOUtils.copyLarge, but I get the same problem.

URL serverUrl = new URL(url);
HttpURLConnection urlConnection = (HttpURLConnection) serverUrl.openConnection();

urlConnection.setConnectTimeout(Configs.TIMEOUT);
urlConnection.setReadTimeout(Configs.TIMEOUT);

File fileToUpload = new File(file);

urlConnection.setDoOutput(true);
urlConnection.setRequestMethod("POST");
urlConnection.addRequestProperty("Content-Type", "application/octet-stream");

urlConnection.connect();

OutputStream output = urlConnection.getOutputStream();
FileInputStream input = new FileInputStream(fileToUpload);
upload(input, output);
// ..close streams



private static long upload(InputStream input, OutputStream output) throws IOException {
    try (
            ReadableByteChannel inputChannel = Channels.newChannel(input);
            WritableByteChannel outputChannel = Channels.newChannel(output)
    ) {
        ByteBuffer buffer = ByteBuffer.allocateDirect(10240);
        long size = 0;

        while (inputChannel.read(buffer) != -1) {
            buffer.flip();
            size += outputChannel.write(buffer);
            buffer.clear();
        }

        return size;
    }
}

I think it has something to do with this, but I can't figure out what I am doing wrong.

Another approach was the following, but I get the same issue:

private static long copy(InputStream source, OutputStream sink) throws IOException {
    long nread = 0L;
    byte[] buf = new byte[10240];
    int n;
    int i = 0;
    while ((n = source.read(buf)) > 0) {
        sink.write(buf, 0, n);
        nread += n;
        i++;
        if (i % 10 == 0) {
            log.info("flush");
            sink.flush();
        }
    }
    return nread;
}
user1995187
    I suspect that URLConnection is buffering everything in memory in order to figure out the Content-Length header. Try using a more complete HTTP client library. They most likely have functions to deal with sending files directly, so you don't have to do any of this copying yourself. – Thilo Jun 10 '19 at 07:18
  • @Thilo isn't the OutputStream supposed to handle that? – user1995187 Jun 10 '19 at 08:36
  • 3
    Duplicate: https://stackoverflow.com/questions/2082057/outputstream-outofmemoryerror-when-sending-http – Denis Tulskiy Jun 10 '19 at 08:48
  • @DenisTulskiy thank you; looks like that is the case – user1995187 Jun 10 '19 at 09:53

1 Answer


Use setFixedLengthStreamingMode, as per this answer on the duplicate question Denis Tulskiy linked to:

conn.setFixedLengthStreamingMode(fileToUpload.length());

(Prefer the long overload, available since Java 7; casting a multi-GB length to int would overflow.)

From the docs:

This method is used to enable streaming of a HTTP request body without internal buffering, when the content length is known in advance.

At the moment, your code buffers the entire file in Java's heap memory in order to compute the Content-Length header of the HTTP request, which is what exhausts the heap regardless of how small your copy buffer is.
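Putting it together, here is a minimal sketch of the whole upload with streaming mode enabled. The class and method names (`LargeFileUpload`, `upload`) are illustrative, not from your code; the key line is `setFixedLengthStreamingMode`, which must be called before the connection is opened for output:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class LargeFileUpload {

    // Copy in small chunks so heap usage stays bounded regardless of file size.
    static long copy(InputStream in, OutputStream out) throws IOException {
        byte[] buf = new byte[8192];
        long total = 0;
        int n;
        while ((n = in.read(buf)) > 0) {
            out.write(buf, 0, n);
            total += n;
        }
        return total;
    }

    static long upload(File file, String url) throws IOException {
        HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
        conn.setDoOutput(true);
        conn.setRequestMethod("POST");
        conn.addRequestProperty("Content-Type", "application/octet-stream");
        // The long overload (Java 7+) avoids int overflow for files > 2 GB.
        // With streaming mode set, the connection no longer buffers the body.
        conn.setFixedLengthStreamingMode(file.length());
        try (InputStream in = new FileInputStream(file);
             OutputStream out = conn.getOutputStream()) {
            long sent = copy(in, out);
            conn.getResponseCode(); // force the request to complete
            return sent;
        }
    }
}
```

If the length is not known in advance, `setChunkedStreamingMode(int)` achieves the same effect using chunked transfer encoding instead of a Content-Length header.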

Rich