
I have an FTP server (FileZilla Server), and I would like to download a large file from it using C#.

 WebClient client = new WebClient();
 client.Credentials = new NetworkCredential(userName, password);
 client.DownloadFile(new Uri("ftp://XXX.XXX.XXX.XXX/" + fileName), destinationFileFullPath);

or

 FtpWebRequest request = (FtpWebRequest)WebRequest.Create("ftp://XXX.XXX.XXX.XXX/" + fileName);
 request.Method = WebRequestMethods.Ftp.DownloadFile;
 request.KeepAlive = true; // I tried both
 request.UseBinary = true; // I tried both
 request.UsePassive = true; 
 request.Credentials = new NetworkCredential(userName, password);
 FtpWebResponse response = (FtpWebResponse)request.GetResponse();
 Stream responseStream = response.GetResponseStream();

 // Writing bytes to local files
 using (var outputStream = File.OpenWrite(destinationFileFullPath))
 {
     byte[] chunk = new byte[2048];
     int bytesRead;
     do
     {
         bytesRead = responseStream.Read(chunk, 0, chunk.Length);
         outputStream.Write(chunk, 0, bytesRead);
     } while (bytesRead > 0);

     outputStream.Flush();
 }

responseStream.Close();
response.Close();

For small files it works well, but for large files (by large I mean around ~300 MB and more) the transfer itself succeeds (I can see on the FileZilla Server side that the transfer is reported as a success), yet at the end, for an unknown reason, my app just hangs on the line

 client.DownloadFile(new Uri("ftp://XXX.XXX.XXX.XXX/" + fileName), destinationFileFullPath);

or

bytesRead = responseStream.Read(chunk, 0, chunk.Length); 

until I get a timeout exception. If I force a break (pause) in the debugger, I can see that it really is stuck on this line. It is as if FileZilla closes the connection and C# just keeps waiting for more data to read... Have you experienced this issue before?
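One workaround sketch (an assumption on my part, not a confirmed fix for the root cause): when the server reports the file size, `FtpWebResponse.ContentLength` is non-negative, and the loop can stop as soon as that many bytes have arrived instead of issuing a final `Read` that blocks waiting for the server to close the data connection. A minimal bounded-copy helper (the class and method names are mine):

```csharp
using System;
using System.IO;

static class BoundedCopy
{
    // Copies at most 'length' bytes from source to destination and
    // returns the number of bytes actually copied. Because it never
    // asks for more bytes than expected, it avoids the final blocking
    // Read that waits for the remote side to close the connection.
    public static long Copy(Stream source, Stream destination, long length, int bufferSize = 8192)
    {
        var buffer = new byte[bufferSize];
        long total = 0;
        while (total < length)
        {
            int toRead = (int)Math.Min(buffer.Length, length - total);
            int read = source.Read(buffer, 0, toRead);
            if (read == 0) break; // remote side closed early
            destination.Write(buffer, 0, read);
            total += read;
        }
        return total;
    }

    static void Main()
    {
        // Demo with in-memory streams; with FTP you would pass
        // responseStream, the output file stream, and response.ContentLength.
        var src = new MemoryStream(new byte[300]);
        var dst = new MemoryStream();
        Console.WriteLine(Copy(src, dst, src.Length)); // 300
    }
}
```

When `ContentLength` is -1 (the server did not report a size), you would have to fall back to the read-until-zero loop from the question.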

Thanks for your help.

Dev_MC
    Have you compared size received to size sent? – BugFinder Feb 12 '18 at 10:59
  • Can you download the big file using any standalone FTP client running on the same machine as your C# code? – Martin Prikryl Feb 12 '18 at 11:11
  • When TCP (FTP uses TCP in the transport layer) keep-alive is running you get data with zero bytes in the message. So I expect that your client is getting out of the for loop before all the data is read. – jdweng Feb 12 '18 at 11:39
  • @jdweng OP claims that the code hangs in `responseStream.Read`. Also `responseStream.Read` does not return 0 until the end of the download is reached. Even the `WebClient.DownloadFile` internally relies on this. – Martin Prikryl Feb 12 '18 at 14:20
  • Op said following line hangs : client.DownloadFile(new Uri("ftp://XXX.XXX.XXX.XXX/" + fileName), destinationFileFullPath); Can't find this in code. – jdweng Feb 12 '18 at 17:22
  • @BugFinder: I tried with the second piece of code. It's not the same size, since the flush was after my do/while; however, as a test I put it just after the write, and then it is the same size. That means I always receive all the bytes; the code just keeps trying to read more. Martin: I used the FileZilla client and had no problem downloading the file. jdweng: please re-read my entire post; I posted two different pieces of code that produce the same result: VS always gets stuck in the read part of both. All: I can see on my FTP server the message "226 Successfully transferred", but my app always gets stuck on Read. – Dev_MC Feb 13 '18 at 02:24
  • I just noticed that https://stackoverflow.com/questions/11950211/downloading-large-files150mb-from-ftp-server-hangs could be related. It seems to be more of an FTP server issue; I have to understand why exactly... – Dev_MC Feb 13 '18 at 02:30
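While investigating whether this is a server-side issue (as the linked question suggests), bounding the timeouts at least makes the stall fail fast instead of blocking for minutes. A sketch with a hypothetical host and file name; `FtpWebRequest.ReadWriteTimeout` governs how long a single `responseStream.Read` may block and defaults to 300 000 ms:

```csharp
using System;
using System.Net;

class FtpTimeouts
{
    static void Main()
    {
        // Hypothetical URL; creating the request does not open a connection.
        var request = (FtpWebRequest)WebRequest.Create("ftp://example.com/bigfile.bin");
        request.Method = WebRequestMethods.Ftp.DownloadFile;
        request.Timeout = 30000;          // whole-request timeout, ms (default 100 000)
        request.ReadWriteTimeout = 15000; // per-Read/Write timeout on the stream, ms (default 300 000)
        Console.WriteLine(request.ReadWriteTimeout); // 15000
    }
}
```

This does not fix the hang, but it turns a multi-minute freeze into a prompt `WebException`, which makes the problem easier to reproduce and log.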

0 Answers