
I have a loop where I download images. I need to load, say, 10 images and merge them into one image, and I care that all of the images actually get loaded. This is how I do it: I have an executor to limit the thread count, and a CountDownLatch barrier that waits until all images have been loaded.

CountDownLatch barrier = new CountDownLatch(images.size());
private static ExecutorService executorService = Executors.newFixedThreadPool(MAX_THREAD_POOL);

for (Image image : images) {
    executorService.execute(new ImageRunnable(image, barrier));
}
barrier.await();

In ImageRunnable I download the image like this (from Google Static Maps).

String url = "my url";
try {
    URL target = new URL(url);
    ImageIO.read(target);
    barrier.countDown();
    //exaggerated logic
} catch (IOException e) {
    System.out.println("Can not load image, " + e);
}

Other people have told me that I can hit a case where all the threads in the executor are busy and my algorithm never finishes, because the main thread will wait at barrier.await() forever (a deadlock). They say this will happen when ImageIO.read(target) is called and the connection is established, but the HTTP session is never closed (the response from the server never comes back). Can this happen? I thought that in this case I would get some exception and the bad thread would be interrupted. That is exactly what happens when I run my loop and cut the internet connection with a firewall on the third image: in the output I get a broken image, as if the network was closed and the image was not loaded to the end. Am I wrong?

Mrusful

  • This sounds more appropriate for an `ExecutorCompletionService` that you poll for all `images.size()` completed runnables, rather than adding the `CountDownLatch` to the picture. – Louis Wasserman Aug 17 '12 at 18:14
  • Hmm, I'm a newbie with the concurrent package. The name of the class speaks for itself; I'll look into it. Thanks for the advice. – Mrusful Aug 17 '12 at 18:21
  • If you're just worried about a deadlock, you could just use the await method that takes a timeout value: await(long timeout, TimeUnit unit) – Alper Akture Aug 18 '12 at 16:34

3 Answers


The concern is you may throw an exception and never count down your latch.

I would consider doing this:

String url = "my url";
try {
    URL target = new URL(url);
    ImageIO.read(target);
} catch (IOException e) {
    System.out.println("Can not load image, " + e);
    // Runnable.run() can't throw a checked exception, so rethrow it wrapped
    throw new RuntimeException(e);
} finally {
    barrier.countDown();
}

Throw the exception to let the world know you've run into a problem and may not be able to complete (you know you can't recover from it), but at the very least let the barrier get lowered. I'd rather have to deal with an exception than a deadlock.
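
For reference, here is a minimal sketch of how the whole ImageRunnable could look with this pattern. The class body, field names, and the placeholder URL are assumptions based on how the question uses the class, not the original code:

import java.awt.Image;
import java.io.IOException;
import java.net.URL;
import java.util.concurrent.CountDownLatch;

import javax.imageio.ImageIO;

class ImageRunnable implements Runnable {
    private final Image image;            // whatever identifies the tile to download (assumed)
    private final CountDownLatch barrier;

    ImageRunnable(Image image, CountDownLatch barrier) {
        this.image = image;
        this.barrier = barrier;
    }

    @Override
    public void run() {
        String url = "my url"; // built from 'image' in the real code
        try {
            URL target = new URL(url);
            ImageIO.read(target);
            // ... the "exaggerated logic" from the question goes here ...
        } catch (IOException e) {
            System.out.println("Can not load image, " + e);
            throw new RuntimeException(e); // surface the failure to the executor
        } finally {
            barrier.countDown(); // always lower the latch, success or failure
        }
    }
}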

corsiKa
  • Yes, finally is good, but people say the `IOException` will never happen, because the HTTP session will open and never be closed. I don't know why that could happen. – Mrusful Aug 17 '12 at 18:31
  • 1
    Use a [ScheduledExecutorService](http://java.sun.com/j2se/1.5.0/docs/api/java/util/concurrent/ScheduledExecutorService.html) to apply a timeout as described in [this SO post](http://stackoverflow.com/questions/2758612/executorservice-that-interrupts-tasks-after-a-timeout) and put your `countDown()` call in the timeout handling. – corsiKa Aug 17 '12 at 18:53
  • Thanks, I'll look into this too, but it seems `ExecutorCompletionService` is the right fit for me. – Mrusful Aug 17 '12 at 19:35
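
For completeness, a rough sketch of the timeout idea from corsiKa's comment above, reusing the executorService, image, and barrier from the question. The single-thread scheduler, the 30-second value, and the choice to rely on the run() method's finally block for the countDown() are assumptions, not code from the linked post:

// One extra thread is enough just to fire cancellations.
ScheduledExecutorService canceller = Executors.newSingleThreadScheduledExecutor();

final Future<?> download = executorService.submit(new ImageRunnable(image, barrier));
canceller.schedule(new Runnable() {
    @Override
    public void run() {
        // Interrupt the download if it is still running after 30 seconds.
        // ImageRunnable's finally block still performs the countDown(), so the
        // latch cannot be left hanging. Note that a blocked socket read does not
        // always respond to interruption, so a read timeout on the connection
        // remains the more reliable safeguard.
        download.cancel(true);
    }
}, 30, TimeUnit.SECONDS);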

Just to flesh out my comment:

CompletionService<Image> service = new ExecutorCompletionService<Image>(
  Executors.newFixedThreadPool(nThreads));
for (Image image : images) {
  service.submit(new ImageRunnable(image), image);
}
try {
  for (int i = 0; i < images.size(); i++) {
    service.take();
  }
} catch (InterruptedException e) {
  // someone wants this thread to cancel peacefully; either exit the thread
  // or at a bare minimum do this to pass the interruption up
  Thread.currentThread().interrupt();
}

There. That's it.
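
As a side note, if ImageRunnable is changed to let its failure propagate (e.g. wrapped in a RuntimeException, as in the answer above) instead of swallowing the IOException, each failure surfaces from the corresponding Future. Here is a sketch of how the waiting loop could then look; the exception handling below is an illustration, not part of this answer:

try {
  for (int i = 0; i < images.size(); i++) {
    try {
      // get() returns the image that was passed to submit(runnable, image)
      // once that download has finished.
      Image done = service.take().get();
      // ... merge 'done' into the combined picture ...
    } catch (ExecutionException e) {
      // One download failed; the cause is whatever ImageRunnable threw.
      System.out.println("Can not load image, " + e.getCause());
    }
  }
} catch (InterruptedException e) {
  Thread.currentThread().interrupt();
}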

If you're concerned about enforcing timeouts on the HTTP connection, my quick and dirty research suggests something like...

URL target = // whatever;
URLConnection connection = target.openConnection();
connection.setReadTimeout(timeoutInMilliseconds);
InputStream stream = null;
try {
  stream = connection.getInputStream();
  return ImageIO.read(stream);
} finally {
  if (stream != null) { stream.close(); }
}
Louis Wasserman
  • Doesn't this have the same pitfalls as the original? There appears to be no timeout on a hung HTTP connection. If you use the timeout option on `poll(long,TimeUnit)` you have to either run them sequentially, meaning you take `N*timeout` time instead of simply `timeout` time, or you have to spawn an additional `N` threads to `poll` them in parallel. I'm not entirely sure this solves the problem at hand. – corsiKa Aug 17 '12 at 19:41
  • I'm not convinced that that represents the common case. Any successful attempt takes as long as it would "normally" take. Mind you, what I would do is do the download myself in the `ImageRunnable` -- implementing the timeout myself -- and then pass the completed download to `ImageIO.read(InputStream)` to get an image out of it. – Louis Wasserman Aug 17 '12 at 19:46
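
A rough sketch of that last suggestion: download the bytes with explicit timeouts first, then hand the completed download to ImageIO.read(InputStream). The timeout values and buffer size are arbitrary choices for illustration:

URLConnection connection = target.openConnection();
connection.setConnectTimeout(10000); // fail if the server never accepts the connection
connection.setReadTimeout(10000);    // fail if the response stalls mid-download

// Pull the raw bytes down first, so a hung connection can only stall here...
ByteArrayOutputStream buffer = new ByteArrayOutputStream();
InputStream in = connection.getInputStream();
try {
  byte[] chunk = new byte[8192];
  int read;
  while ((read = in.read(chunk)) != -1) {
    buffer.write(chunk, 0, read);
  }
} finally {
  in.close();
}

// ...then decode the fully downloaded bytes in memory.
BufferedImage result = ImageIO.read(new ByteArrayInputStream(buffer.toByteArray()));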

Apart from moving barrier.countDown() to a finally block as suggested by @corsiKa, make sure your code always finishes. Set a timeout on reading the URL and on await():

barrier.await(1, TimeUnit.MINUTES);
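
Note that the timed await returns a boolean indicating whether the latch actually reached zero, so the result is worth checking. A small sketch; the handling of the timeout case below is just an illustration:

// await(timeout, unit) returns false if the count did not reach zero in time.
boolean allLoaded = barrier.await(1, TimeUnit.MINUTES);
if (!allLoaded) {
  System.out.println("Timed out: some images are still downloading");
  // e.g. merge what has loaded so far, or give up
}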
Tomasz Nurkiewicz