I have an FTP server (FileZilla Server), and I would like to download a large file from it using C#.
WebClient client = new WebClient();
client.Credentials = new NetworkCredential(userName, password);
client.DownloadFile(new Uri("ftp://XXX.XXX.XXX.XXX/" + fileName), destinationFileFullPath);
or
FtpWebRequest request = (FtpWebRequest)WebRequest.Create("ftp://XXX.XXX.XXX.XXX/" + fileName);
request.Method = WebRequestMethods.Ftp.DownloadFile;
request.KeepAlive = true; // I tried both
request.UseBinary = true; // I tried both
request.UsePassive = true;
request.Credentials = new NetworkCredential(userName, password);
FtpWebResponse response = (FtpWebResponse)request.GetResponse();
// Write the response stream to the local file
using (Stream responseStream = response.GetResponseStream())
using (var outputStream = File.OpenWrite(destinationFileFullPath))
{
    byte[] chunk = new byte[2048];
    int bytesRead;
    while ((bytesRead = responseStream.Read(chunk, 0, chunk.Length)) > 0)
    {
        outputStream.Write(chunk, 0, bytesRead);
    }
}
response.Close();
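Since the transfer itself completes on the server side, a variant I may try (just a sketch; it assumes FileZilla reports the file size, so that response.ContentLength is usable, and the timeout value is arbitrary) is to stop reading once the announced number of bytes has arrived instead of waiting for Read to return 0, and to shorten ReadWriteTimeout so a stalled read throws quickly instead of hanging:

```csharp
FtpWebRequest request = (FtpWebRequest)WebRequest.Create("ftp://XXX.XXX.XXX.XXX/" + fileName);
request.Method = WebRequestMethods.Ftp.DownloadFile;
request.UseBinary = true;
request.UsePassive = true;
request.Credentials = new NetworkCredential(userName, password);
// Fail fast instead of waiting the default 5 minutes on a stalled stream read.
request.ReadWriteTimeout = 30000; // 30 s, arbitrary

using (FtpWebResponse response = (FtpWebResponse)request.GetResponse())
using (Stream responseStream = response.GetResponseStream())
using (var outputStream = File.OpenWrite(destinationFileFullPath))
{
    // -1 if the server does not report a size; in that case fall back
    // to reading until the stream ends.
    long remaining = response.ContentLength;

    byte[] chunk = new byte[2048];
    int bytesRead;
    while (remaining != 0 &&
           (bytesRead = responseStream.Read(chunk, 0, chunk.Length)) > 0)
    {
        outputStream.Write(chunk, 0, bytesRead);
        if (remaining > 0) remaining -= bytesRead;
    }
}
```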
For small files it works well, but for large files (by large I mean around ~300 MB and more) the transfer itself completes (on the FileZilla Server side I can see the transfer succeeded), yet at the end my app just hangs on the line
client.DownloadFile(new Uri("ftp://XXX.XXX.XXX.XXX/" + fileName), destinationFileFullPath);
or
bytesRead = responseStream.Read(chunk, 0, chunk.Length);
until I get a timeout exception. If I force a break (pause), I can see it really is stuck on that line. It is as if FileZilla closes the connection and C# just keeps waiting for more data to read... Have you experienced this issue before?
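In case it is relevant: WebClient itself exposes no timeout property, so with the first approach the only way I can see to make the hang surface sooner is to subclass it and set a timeout on the underlying request (a sketch, the value is arbitrary):

```csharp
using System;
using System.Net;

// WebClient has no Timeout property of its own; overriding GetWebRequest
// lets us shorten the timeout of the request it creates internally, so a
// hung transfer throws instead of waiting for the default timeout.
class TimeoutWebClient : WebClient
{
    protected override WebRequest GetWebRequest(Uri address)
    {
        WebRequest request = base.GetWebRequest(address);
        request.Timeout = 30000; // 30 s, arbitrary
        return request;
    }
}
```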
Thanks for your help.