
I've got an application that downloads several large binary files and saves them to disk. On some machines it works fine; on other machines, every once in a while a download will proceed to 99.9% complete and the URLStream object will never fire Event.COMPLETE.

This is almost identical to the issue that appears here:

Why does URLStream complete event get dispatched when the file is not finished loading?

I've tried using the 'Cache Bust' method described in one of the answers but still no dice.

Any help would be appreciated.

Here is some sample code to help illustrate what I am trying to do:

var contentURL:String = "http://some-large-binary-file-in-amazon-s3.tar";

var stream:URLStream = new URLStream();
stream.addEventListener(Event.COMPLETE, function(e:Event):void{
    //This should fire when the file is done downloading
    //On some systems this fails to fire once in a while
    //On other systems it never fails to fire               
});
stream.addEventListener(ProgressEvent.PROGRESS, function(pe:ProgressEvent):void{
    //Write the bytes available in the stream and save them to disk
    //Note that a download will reach 100% complete in terms of total progress but the 'complete' event might still not fire.
});

var urlRequest:URLRequest = new URLRequest(contentURL);
//Here we might add some headers to the URL Request for resuming a file
//but they don't matter; the 'Event.COMPLETE' will fail to fire with or without
//these headers
addCustomHeaders( urlRequest );

stream.load( urlRequest );
  • I've updated the post to contain a code sample. – Phil Apr 14 '16 at 13:06
  • *"On some systems this fails to fire once in a while... On others it never fails..."* I would begin by checking those failing systems. Is it an internet connection issue (i.e. do they have something cutting off large downloads)? It seems like a hardware/network issue rather than anything wrong with your code (otherwise it would never work on any system, right?). – VC.One Apr 15 '16 at 01:38
  • Also, is this an AIR application? If yes, maybe check the total bytes on disk at regular intervals, and your own code can decide whether the download is complete (got all bytes) and act accordingly... If 99.9% means a missing chunk, just re-download the final range of bytes and append it to the downloaded data to get a complete file. In fact: why not try getting the file in smaller chunks, stitching up the parts (`file.append`) to make the final large file on disk... – VC.One Apr 15 '16 at 01:41
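The chunked-download idea suggested in the comments could be sketched like this in AIR. This is a minimal illustration, not tested code: the function name `downloadRange` and the retry strategy are mine, and it assumes the server (S3 does) honors HTTP `Range` requests:

```actionscript
import flash.events.Event;
import flash.events.IOErrorEvent;
import flash.filesystem.File;
import flash.filesystem.FileMode;
import flash.filesystem.FileStream;
import flash.net.URLRequest;
import flash.net.URLRequestHeader;
import flash.net.URLStream;
import flash.utils.ByteArray;

// Hypothetical sketch: fetch bytes [start, end] of the remote file and
// append them to a local file, so a missing final chunk can be re-fetched
// on its own instead of restarting the whole download.
function downloadRange(url:String, start:Number, end:Number, target:File):void {
    var request:URLRequest = new URLRequest(url);
    // Ask the server for just this byte range.
    request.requestHeaders.push(
        new URLRequestHeader("Range", "bytes=" + start + "-" + end));

    var stream:URLStream = new URLStream();

    function onComplete(e:Event):void {
        var bytes:ByteArray = new ByteArray();
        stream.readBytes(bytes, 0, stream.bytesAvailable);
        var fs:FileStream = new FileStream();
        fs.open(target, FileMode.APPEND); // stitch chunks together on disk
        fs.writeBytes(bytes);
        fs.close();
        stream.removeEventListener(Event.COMPLETE, onComplete);
        stream.removeEventListener(IOErrorEvent.IO_ERROR, onError);
    }
    function onError(e:IOErrorEvent):void {
        trace("Range request failed: " + e.text); // caller could retry this chunk
    }

    stream.addEventListener(Event.COMPLETE, onComplete);
    stream.addEventListener(IOErrorEvent.IO_ERROR, onError);
    stream.load(request);
}
```

With an approach like this, the application tracks how many bytes are on disk itself and simply re-requests the last range if the final chunk goes missing.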

1 Answer


IMO this is code meant to fail: you purposely give up any control over what is going on and just assume that everything will work by itself. I have never had any problems whatsoever with the URLStream class, but here's basically what I never do:

  1. I always register all the different error events available (you don't register any).

  2. I never use anonymous listeners. Even though they are supposed not to be garbage-collected until the download is complete, this is IMO an unnecessary, unsafe bet, especially since it's not rare for the URLStream to idle a little while loading the last bits. I would not be surprised if removing those anonymous listeners actually fixed the problem.
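Applying both points to the question's loader might look like the sketch below. The handler names and the cleanup helper are mine; the events registered are the standard set URLStream dispatches:

```actionscript
import flash.events.Event;
import flash.events.HTTPStatusEvent;
import flash.events.IOErrorEvent;
import flash.events.ProgressEvent;
import flash.events.SecurityErrorEvent;
import flash.net.URLRequest;
import flash.net.URLStream;

var stream:URLStream = new URLStream();

// Named handlers hold strong references until we explicitly remove them.
function onProgress(pe:ProgressEvent):void {
    // read stream.bytesAvailable and write the bytes to disk here
}
function onComplete(e:Event):void {
    cleanup();
    // finalize the file on disk
}
function onIOError(e:IOErrorEvent):void {
    cleanup();
    trace("IO error: " + e.text);
}
function onSecurityError(e:SecurityErrorEvent):void {
    cleanup();
    trace("Security error: " + e.text);
}
function onHTTPStatus(e:HTTPStatusEvent):void {
    trace("HTTP status: " + e.status);
}
function cleanup():void {
    stream.removeEventListener(ProgressEvent.PROGRESS, onProgress);
    stream.removeEventListener(Event.COMPLETE, onComplete);
    stream.removeEventListener(IOErrorEvent.IO_ERROR, onIOError);
    stream.removeEventListener(SecurityErrorEvent.SECURITY_ERROR, onSecurityError);
    stream.removeEventListener(HTTPStatusEvent.HTTP_STATUS, onHTTPStatus);
}

stream.addEventListener(ProgressEvent.PROGRESS, onProgress);
stream.addEventListener(Event.COMPLETE, onComplete);
stream.addEventListener(IOErrorEvent.IO_ERROR, onIOError);
stream.addEventListener(SecurityErrorEvent.SECURITY_ERROR, onSecurityError);
stream.addEventListener(HTTPStatusEvent.HTTP_STATUS, onHTTPStatus);

stream.load(new URLRequest("http://some-large-binary-file-in-amazon-s3.tar"));
```

At minimum, the error handlers will tell you whether a stalled 99.9% download is dying silently with an IO error rather than simply never completing.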

  • The code in the example is meant for illustration purposes, it's not production code. In my actual application I don't use anonymous functions although I don't agree at all that they would matter. In the production code I do listen for all errors. If you take a look at the post I link to, it's the exact same problem. – Phil Apr 15 '16 at 17:49
  • It's pointless to post code that has nothing to do with the actual code where the problem exists. – BotMaster Apr 15 '16 at 20:57