3

I'm writing a program to download very large files (~2GB) from a server, and I've written it to be able to resume partially finished downloads.

In order to simulate a bad internet connection, I've been pulling my ethernet cord out of my router mid-download. Unfortunately, this causes my program to hang on the following call: while((bytesRead = in.read(data)) > 0)

(Where bytesRead is an int, in is a BufferedInputStream built from an HttpURLConnection, and data is a byte array).
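
For context, a stripped-down version of such a download loop might look something like this (the URL, file name, and buffer size are placeholders, and the Range request header is just one common way to implement resuming, not necessarily the exact code in question):

import java.io.BufferedInputStream;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.URL;

public class ResumableDownload {
    public static void main(String[] args) throws IOException {
        // Placeholder URL and destination file
        File target = new File("download.part");
        long alreadyDownloaded = target.exists() ? target.length() : 0;

        HttpURLConnection conn =
            (HttpURLConnection) new URL("http://example.com/big.bin").openConnection();
        // Ask the server for only the bytes that are still missing
        conn.setRequestProperty("Range", "bytes=" + alreadyDownloaded + "-");

        BufferedInputStream in = new BufferedInputStream(conn.getInputStream());
        FileOutputStream out = new FileOutputStream(target, true); // append mode
        byte[] data = new byte[8192];
        int bytesRead;
        try {
            while ((bytesRead = in.read(data)) > 0) { // hangs here when the network drops
                out.write(data, 0, bytesRead);
            }
        } finally {
            out.close();
            in.close();
        }
    }
}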

I've tried to "interrupt" the call by calling in.close() on another thread, but it has no effect until the internet connection is restored (at which time an exception is thrown).

Is there any way I can prevent a severed internet connection from freezing my program?

Samusaaron3
  • There are similar answered questions: http://stackoverflow.com/questions/804951/is-it-possible-to-read-from-a-java-inputstream-with-a-timeout – Joop Eggen Jan 27 '12 at 23:45

3 Answers

2

The only reliable way I found is to obtain the Socket from an InterruptibleChannel (i.e. open a SocketChannel and take its socket), and then interrupt the stuck I/O thread. (BTW, you don't have to use asynchronous NIO calls with interruptible channels; blocking I/O works fine, and you get a really nice, uniform way of kicking stuck exchanges loose.)

Though, it looks like URLConnection does not allow you to hook up a custom Socket factory.

Maybe you should investigate HttpClient from Apache.

EDIT

Here is how you create an interruptible Socket.

import java.net.InetSocketAddress;
import java.net.Socket;
import java.net.SocketAddress;
import java.nio.channels.SocketChannel;

// serverAddress and servicePort identify the remote server
final SocketAddress remoteAddr =
    new InetSocketAddress( serverAddress, servicePort );

// A SocketChannel is an InterruptibleChannel: a thread blocked on its I/O
// can be unblocked by interrupting it
final SocketChannel socketChannel = SocketChannel.open( );

socketChannel.connect( remoteAddr );

// Here java.io.Socket is obtained
Socket socket = socketChannel.socket( );

I don't have an HttpClient sample, but I know that you can customize its socket initialization.
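
As a rough sketch of the interrupt mechanism itself (host, port, and timeout below are placeholders): a thread blocked reading from the channel can be kicked from another thread with Thread.interrupt(), which closes the channel and raises ClosedByInterruptException in the reader:

import java.io.IOException;
import java.net.InetSocketAddress;
import java.nio.ByteBuffer;
import java.nio.channels.ClosedByInterruptException;
import java.nio.channels.SocketChannel;

public class InterruptibleRead {
    public static void main(String[] args) throws Exception {
        // Placeholder host and port
        final SocketChannel channel =
            SocketChannel.open(new InetSocketAddress("example.com", 80));

        Thread reader = new Thread(new Runnable() {
            public void run() {
                ByteBuffer buffer = ByteBuffer.allocate(8192);
                try {
                    // Blocking read on an InterruptibleChannel
                    while (channel.read(buffer) != -1) {
                        buffer.clear();
                        // ... hand the bytes off to the file writer here ...
                    }
                } catch (ClosedByInterruptException e) {
                    // Another thread interrupted us mid-read; the channel is
                    // closed automatically and we fall out of the loop
                    System.out.println("read interrupted");
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        });
        reader.start();

        // Watchdog: if the transfer hasn't finished after 30 seconds, kick the reader
        Thread.sleep(30000);
        reader.interrupt();
    }
}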

Alexander Pogrebnyak
  • Do you have any examples of this? I've been searching for quite a while, and I've been having trouble finding anything useful. Essentially what I need is a reliable, interruptible, resumable way to download very large files (~2gb). I can't find any equivalent to HttpURLConnection.setRequestProperty()... – Samusaaron3 Jan 28 '12 at 07:02
  • @Samusaaron3. I've added an example of how you create an interruptible socket. – Alexander Pogrebnyak Jan 28 '12 at 12:38
1

Have you tried .setReadTimeout(int timeout) on your URLConnection?

-- EDIT

See answer from @DNA for a neat solution:

In short, you can spawn a parallel thread that .disconnect()s the URLConnection (after letting that second thread sleep for timeout milliseconds), thus triggering an IOException that gets you out of the stalled read.
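
A rough sketch of that watchdog approach (the URL and timeout values are placeholders, and, as the comments on this question note, whether disconnect() reliably unblocks a stalled read is debatable):

import java.io.BufferedInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class WatchdogDownload {
    public static void main(String[] args) throws IOException {
        // Placeholder URL
        final HttpURLConnection conn =
            (HttpURLConnection) new URL("http://example.com/big.bin").openConnection();
        conn.setConnectTimeout(10000); // fail fast if the server is unreachable
        conn.setReadTimeout(30000);    // give up if no bytes arrive for 30 s

        // Watchdog thread: after the overall timeout, disconnect() the connection,
        // which should make a blocked read fail with an IOException
        Thread watchdog = new Thread(new Runnable() {
            public void run() {
                try {
                    Thread.sleep(60000);
                } catch (InterruptedException e) {
                    return; // download finished in time; nothing to do
                }
                conn.disconnect();
            }
        });
        watchdog.start();

        InputStream in = new BufferedInputStream(conn.getInputStream());
        byte[] data = new byte[8192];
        int bytesRead;
        try {
            while ((bytesRead = in.read(data)) > 0) {
                // ... write data[0 .. bytesRead) to the file ...
            }
        } finally {
            watchdog.interrupt(); // stop the watchdog once the read loop ends
            in.close();
        }
    }
}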

Unai Vivi
  • This is good practice, but won't help if the server returns some data and _then_ gets stuck - it only works in the case when the server returns no data at all. – DNA Jan 27 '12 at 23:44
1

See http://thushw.blogspot.com/2010/10/java-urlconnection-provides-no-fail.html for code to handle this situation.

Edited: actually, setting a Socket timeout (in milliseconds) using setSoTimeout (as suggested in Joop Eggen's comment on the question) is probably better.
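
For completeness, a minimal sketch of the setSoTimeout approach (host and port are placeholders; it assumes you manage the Socket yourself, since, as noted in the comments below, HttpURLConnection doesn't expose its underlying Socket):

import java.io.InputStream;
import java.net.InetSocketAddress;
import java.net.Socket;
import java.net.SocketTimeoutException;

public class SoTimeoutRead {
    public static void main(String[] args) throws Exception {
        // Placeholder host and port
        Socket socket = new Socket();
        socket.connect(new InetSocketAddress("example.com", 80), 10000);
        // read() will throw SocketTimeoutException after 30 s of silence,
        // instead of blocking forever on a dead connection
        socket.setSoTimeout(30000);

        InputStream in = socket.getInputStream();
        byte[] data = new byte[8192];
        int bytesRead;
        try {
            while ((bytesRead = in.read(data)) > 0) {
                // ... consume the bytes ...
            }
        } catch (SocketTimeoutException e) {
            // Connection went quiet; close it and resume the download later
            System.out.println("read timed out");
        } finally {
            socket.close();
        }
    }
}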

DNA
  • I've tried calling .disconnect() from a separate thread, but even after the call, no IOException is thrown, so I'm still stuck... – Samusaaron3 Jan 28 '12 at 00:03
  • @DNA In order to use setSoTimeout you need a Socket, but HttpURLConnection doesn't allow you to retrieve its underlying Socket. (I'm assuming the OP needs to rely on an HttpURLConnection) – Unai Vivi Jan 28 '12 at 00:18