
I have code inside of an httphandler which I am using to serve files on a website. Basically this is used as a dynamic replacement instead of linking directly to a file. It takes an ID as input, checks the database and permissions, then responds with the associated file. The file itself is stored off-site at a different location. I have been instructed to use FTP to bring the file back to our server. Our server environment is using .NET 3.5.

The code I have works without error when only a single FTP action is happening at a time.

However, when the httphandler is called multiple times simultaneously (e.g. if 4 people are using it at once, or one person executes it 4 times at once), some of the executions fail with the error: "The remote server returned an error: 150 Opening data channel for file download from server of XXX: Authentication failed because the remote party has closed the transport stream."

I was originally using the FluentFTP third-party library when I encountered this problem. I thought perhaps the third-party library was causing it, so while debugging I moved to plain FtpWebRequest in the code below, but the same error occurs.

I have been reading into possible fixes for the past couple of days. A lot of commonly suggested fixes have been tried and have not worked. These include:

Setting the ServicePointManager.SecurityProtocol

Setting a high connection limit

Setting a connection group for each execution.

None of those changes have had any effect on how the code executes.

A summary of the code is noted below

try
{
    //Hook a callback to verify the remote certificate
    ServicePointManager.ServerCertificateValidationCallback = new RemoteCertificateValidationCallback(MyCertValidationCb);
    // Numeric casts because .NET 3.5 only defines the Ssl3 and Tls enum names:
    // 48 = Ssl3, 192 = Tls, 768 = Tls11, 3072 = Tls12
    ServicePointManager.SecurityProtocol = (SecurityProtocolType)48 | (SecurityProtocolType)192 | (SecurityProtocolType)768 | (SecurityProtocolType)3072;

    FtpWebRequest reqFTP = (FtpWebRequest)WebRequest.Create(new Uri("ftp://" + hostAddress + downloadByFtpSource));
    reqFTP.Credentials = new NetworkCredential(ftpUSername, ftpPassword);
    reqFTP.EnableSsl = true;
    reqFTP.UsePassive = true;
    reqFTP.Method = WebRequestMethods.Ftp.DownloadFile;
    reqFTP.UseBinary = true;

    reqFTP.KeepAlive = false;
    reqFTP.ServicePoint.ConnectionLimit = 100;
    Random r = new Random();
    var debugTestingRandomNumber = (-1 * r.Next(1000000));
    string debugTestingGroupName = "MyGroupName" + debugTestingRandomNumber.ToString();
    reqFTP.ConnectionGroupName = debugTestingGroupName;

    using (FtpWebResponse response = (FtpWebResponse)reqFTP.GetResponse())
    using (Stream ftpStream = response.GetResponseStream())
    using (Stream fileStream = File.Create(downloadByFtpDestination))
    {
        CopyTo(ftpStream, fileStream);
        var debuggingTest = response.StatusCode.ToString();

        result = true;
    }
    reqFTP.ServicePoint.CloseConnectionGroup(debugTestingGroupName);               
}
catch (Exception ex)
{
    result = false;
}

Edited: added the line `reqFTP.UseBinary = true;` as suggested in the comments.
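As a further diagnostic, I am considering serializing the transfers behind a single static lock: if the failures disappear when only one FTPS transfer can run at a time, that would point at a connection-level rather than file-level problem. A minimal sketch (hypothetical helper, not the handler code above; the manual copy loop is because .NET 3.5 has no Stream.CopyTo):

```csharp
// Diagnostic only: force FTPS downloads to run one at a time.
private static readonly object FtpLock = new object();

private static bool DownloadSerialized(string hostAddress, string source,
                                       string destination, string user, string password)
{
    lock (FtpLock) // only one FTPS transfer at any moment
    {
        FtpWebRequest req = (FtpWebRequest)WebRequest.Create(
            new Uri("ftp://" + hostAddress + source));
        req.Credentials = new NetworkCredential(user, password);
        req.EnableSsl = true;
        req.UsePassive = true;
        req.UseBinary = true;
        req.KeepAlive = false;
        req.Method = WebRequestMethods.Ftp.DownloadFile;

        using (FtpWebResponse resp = (FtpWebResponse)req.GetResponse())
        using (Stream ftpStream = resp.GetResponseStream())
        using (Stream fileStream = File.Create(destination))
        {
            byte[] buffer = new byte[8192];
            int read;
            while ((read = ftpStream.Read(buffer, 0, buffer.Length)) > 0)
                fileStream.Write(buffer, 0, read);
        }
        return true;
    }
}
```

This would obviously not be acceptable for production (it throttles every user to one download at a time), but it would narrow down whether the concurrency itself is what breaks the TLS data channel.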

  • The files may be binary, so use: reqFTP.UseBinary = true; – jdweng Sep 14 '18 at 10:31
  • Unfortunately adding UseBinary = true; did not have any effect. It continues to work 100% of the time on a single action, and fails sporadically (more often than not) when executed multiple times at once. – Sullivanets1 Sep 14 '18 at 10:38
  • Usually with FTP issues it is one of three things: 1) binary mode, 2) a maximum transfer size being exceeded (the server has settings for this), or 3) the connection closing after being idle for a long time. You can use a sniffer like Wireshark or Fiddler. FTP uses TCP as its transport layer, so watch for the FIN message when the connection closes and whether the client or server closes it. Does it fail on a single transfer if the file is very large? From what you say it doesn't seem like data is getting corrupted or ACKs are failing, since one file works. You can also open a CMD.EXE window and try >netstat -a to get the status of the connection after it fails. – jdweng Sep 14 '18 at 10:54
  • I don't think it is file-size related or a timeout: it can work successfully, taking a long time, on files over 100 megabytes, uploading half a gig linearly without error over minutes of time. It would then fail immediately, within seconds, when trying to simultaneously download files that are in the KB range. – Sullivanets1 Sep 14 '18 at 11:02
  • Unfortunately I've not been able to see any meaningful information after running netstat -a: a lot of information and a lot of established connections, but I'm not actually seeing the FTP server's IP address on the list. – Sullivanets1 Sep 14 '18 at 11:14
  • Then the connection was closed, which is good. Sometimes connections are left in a half-open state. If an FTP transfer is taking a few minutes you should see the connection while the transfer is being made. You may see the computer name instead of the IP. – jdweng Sep 14 '18 at 13:01
  • One additional note: I changed it to use just basic FTP instead of FTPS, and everything works without problem; large numbers of simultaneous connections proceeded without error. – Sullivanets1 Sep 14 '18 at 14:40

0 Answers